OpenAI Releases GPT-3, The Largest Model So Far

Microsoft CEO Satya Nadella and OpenAI CEO Sam Altman at the Microsoft campus in Redmond, Wash., on July 15, 2019. (Photo: Scott Eklund/Red Box Pictures)
OpenAI researchers have released a paper describing GPT-3, a state-of-the-art language model with 175 billion parameters. OpenAI's previous model, GPT-2, had 1.5 billion parameters and was the largest of its day, but it was soon eclipsed by NVIDIA's Megatron, with 8.3 billion parameters, and then by Microsoft's Turing NLG, with 17 billion. Now OpenAI has turned the tables with a model roughly 10x larger than Turing NLG. Current NLP systems still largely struggle to learn from a few examples. With GPT-3, the researchers show that scaling up language models greatly improves task-agnostic, few-shot performance, sometimes even reaching competitiveness with prior state-of-the-art fine-tuning approaches.
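To make the few-shot idea concrete, here is a minimal sketch of prompt-based few-shot inference: the task is specified entirely in the prompt, with no gradient updates or fine-tuning. GPT-3 itself is not publicly downloadable, so this sketch uses the open GPT-2 from Hugging Face's transformers library as a stand-in; the model choice and prompt are illustrative assumptions, not OpenAI's code.

```python
# A minimal sketch of few-shot prompting: the task (English-to-French
# translation) is described purely through in-context examples, with no
# fine-tuning. GPT-2 stands in here for the unavailable GPT-3.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Few-shot prompt: a handful of demonstrations followed by the query.
prompt = (
    "English: cheese\nFrench: fromage\n"
    "English: house\nFrench: maison\n"
    "English: book\nFrench:"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=5,
    do_sample=False,                      # greedy decoding for a deterministic sketch
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token by default
)
# Print only the newly generated completion, not the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:]))
```

At GPT-2's scale, completions for prompts like this are often wrong; the paper's central claim is that this same task-agnostic setup becomes competitive with fine-tuned systems as models grow toward 175 billion parameters.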

Ram Sagar
I have a master's degree in Robotics and I write about machine learning advancements.