- OpenAI has released GPT-OSS-20B and GPT-OSS-120B as open-weight models
- The GPT-OSS-20B model can run locally on a consumer PC
- Both models expose every step of their reasoning process, making it far easier to see how they arrive at their answers
OpenAI has just dropped two new AI models, GPT-OSS-120B and GPT-OSS-20B. Not only are they new, but they are the first open-weight models from ChatGPT's creator since GPT-2.
The smaller of the two, GPT-OSS-20B, is especially notable for being light enough to run on a reasonably specified consumer PC. If you have about 16 GB of RAM and a little patience, you can load it, ask it a question, and actually watch how it arrives at its answer. The larger 120B model still requires serious hardware or cloud support.
Both models are part of OpenAI's new push to encourage developers to experiment with its technology and even commercialize it for everyday users. For the first time in years, both developers and curious individuals can download and run OpenAI models on their own machines, inspect how they reason, and build on them freely. They are available via Hugging Face and AWS under the Apache 2.0 license.
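If you want a sense of what "run it yourself" looks like, the sketch below uses Hugging Face's Transformers pipeline API. Treat it as a minimal, hedged example: the repository id openai/gpt-oss-20b, the device settings, and the prompt are assumptions based on the public release, so check the model card for exact requirements (quantization support, drivers, memory) before running it.

```python
# Minimal sketch: loading GPT-OSS-20B locally via the Transformers pipeline.
# Repo id and settings are assumptions; consult the official model card first.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",   # assumed Hugging Face repository id
    torch_dtype="auto",           # let Transformers pick a suitable precision
    device_map="auto",            # spread the model across GPU/CPU as available
)

messages = [
    {"role": "user", "content": "Explain in two sentences why the sky is blue."}
]

result = generator(messages, max_new_tokens=256)

# With chat-style input, generated_text is the full conversation;
# the last entry is the model's reply.
print(result[0]["generated_text"][-1]["content"])
```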
Being open weight means the models offer a level of transparency and independence that most people haven't had since ChatGPT first went viral. The model's reasoning is visible in real time, so you can see how its 'logic' leads to the final response and how it settles on that decision.
It's a big shift for OpenAI. The company has spent several years limiting access to its most powerful tools, offering only API endpoints and paid tiers. This release is something of a return to the GPT-2 era, but with far more capable models. Still, the lighter model is not something everyone will rush to adopt as a replacement for the ChatGPT app.
"gpt-oss is out! We made an open model that performs on par with o4-mini and runs on a high-end laptop (WTF!!) (and a smaller one that runs on a phone). Super proud of the team; big triumph of technology." — Sam Altman, August 5, 2025
Open weight access
The flexibility of the new models could be a boon for OpenAI as the open-weight approach grows in popularity. DeepSeek, Meta, and Mistral have all released open models of one kind or another recently, but most have proven to be only semi-open, meaning they come with undisclosed training data or restrictive licenses and usage limits.
The GPT-OSS models are more straightforward in how they offer weights and licensing, although the training data remains proprietary. What makes them stand out is their compatibility with OpenAI's widely used API interface, along with a larger window into how the model makes its decisions.
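In practice, that interface compatibility means an app written against OpenAI's Python client can, in principle, be pointed at a locally hosted copy of the model instead of the cloud. The snippet below is a rough sketch assuming a local server (such as Ollama or vLLM) exposing an OpenAI-compatible endpoint; the URL, port, and model name are placeholders that depend on whichever server you actually run.

```python
# Sketch: talking to a locally hosted gpt-oss model through the OpenAI client.
# The base_url and model name are assumptions; match them to your local server.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # hypothetical local endpoint
    api_key="not-needed-locally",          # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="gpt-oss:20b",  # assumed local model name
    messages=[
        {"role": "user", "content": "Walk me through your reasoning: is 97 prime?"}
    ],
)

print(response.choices[0].message.content)
```

The practical upshot is that swapping between the hosted API and a local model can be as small a change as the base URL and model name, which is exactly the kind of portability closed models don't allow.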
So where DeepSeek tends to emphasize the raw power and relatively cheap performance of its models, OpenAI is more interested in explaining how its models work. That will appeal to developers who want to learn or experiment; you can quite literally open the model up and look at what's going on inside.
It is also a signal to the developer community: OpenAI is inviting people to build with its technology in a way that has not been possible for years. These models do not just emit chat completions; they offer a foundation. Under the Apache 2.0 license, you can fine-tune, distill, integrate, or wrap them into brand-new products (a sketch of what fine-tuning might look like follows below), and you don't have to send data to a third party to do so. Those hoping for a more decentralized AI ecosystem are likely to be pleased.
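As a rough illustration of what that foundation can look like, here is a sketch of a LoRA fine-tune using Hugging Face's trl and peft libraries. The dataset path is hypothetical, the repository id is assumed, and a 20B-parameter fine-tune needs far more memory than a typical laptop has, so read this as the general shape of the workflow rather than a recipe.

```python
# Sketch: LoRA fine-tuning an open-weight model with trl + peft.
# Repo id, dataset file, and hyperparameters are illustrative assumptions.
from datasets import load_dataset
from peft import LoraConfig
from trl import SFTConfig, SFTTrainer

# Hypothetical local instruction dataset in JSON Lines format.
dataset = load_dataset("json", data_files="my_instructions.jsonl", split="train")

peft_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

trainer = SFTTrainer(
    model="openai/gpt-oss-20b",  # assumed Hugging Face repository id
    train_dataset=dataset,
    peft_config=peft_config,
    args=SFTConfig(
        output_dir="gpt-oss-20b-finetuned",
        per_device_train_batch_size=1,
    ),
)

trainer.train()
```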
For those of us who are less technical, the main point is that you may start to see AI apps that run well without charging a subscription, or personalized tools that don't send your data to the cloud. With this release, OpenAI can claim to be willing to share more with developers, if not quite everything.
For a company that has spent much of the past two years building closed systems and premium subscriptions, releasing a pair of capable models under an open license feels like a significant philosophical shift, or at least a desire to make such a shift seem real.
And although these two models do not mark the leap into the GPT-5 era, OpenAI CEO Sam Altman has teased that more announcements are on the horizon, so the next-generation model is likely still on its way.