OpenAI researchers released a paper describing the development of GPT-3, a state-of-the-art language model with 175 billion parameters. The previous OpenAI model, GPT-2, had 1.5 billion parameters and was the largest model at the time of its release, but it was soon eclipsed by NVIDIA’s Megatron, with 8 billion parameters, followed by Microsoft’s Turing NLG, which had 17…

The post OpenAI Releases GPT-3, The Largest Model So Far appeared first on Analytics India Magazine.