20 Jun
Chinese-Built ChatGLM Exceeds GPT-4 Across Several Benchmarks
Gopika Raj
AI News & Update

The GLM-4 model was pre-trained on 10 trillion tokens of multilingual data and further aligned via supervised fine-tuning and reinforcement learning from human feedback.