EleutherAI has announced GPT-NeoX-20B, a 20-billion-parameter model trained with the GPT-NeoX framework on GPUs provided by CoreWeave. The group claims it is the largest publicly accessible pretrained general-purpose autoregressive language model to date.


EleutherAI hopes the release will help accelerate research toward the safe use of AI systems.


The full model weights will be downloadable for free from The Eye starting February 9, under a permissive Apache 2.0 license. Until then, you can try the model through GooseAI, the inference service run by CoreWeave and Anlatan.

“GPT-NeoX and GPT-NeoX-20B are very much research artifacts and we do not recommend deploying either in a production setting without careful consideration,” EleutherAI said.