In a recent development, Chinese researchers have created a massive language model comparable to GPT-2 in parameter count. The model, developed by researchers from Tsinghua University and the Beijing Academy of Artificial Intelligence, has around 2.6 billion parameters and was trained with…
