- OpenAI’s new models run effectively on minimal hardware, but they have not yet been independently tested across workloads
- The models are designed for edge use cases where full-scale infrastructure is not always available
- The Apache 2.0 license could encourage wider experimentation in regions with strict data requirements
OpenAI has released two open-weight models, gpt-oss-120b and gpt-oss-20b, positioning them as direct challengers to offerings such as DeepSeek-R1 and other large language models (LLMs) currently shaping the AI ecosystem.
The models are now available on AWS through its Amazon Bedrock and Amazon SageMaker AI platforms.
This marks OpenAI’s entry into the open-model segment, a space that has so far been dominated by competitors such as Mistral AI and Meta.
OpenAI and AWS
The gpt-oss-120b model runs on a single 80 GB GPU, while the 20B version targets edge environments and requires only 16 GB of memory.
OpenAI claims that both models deliver strong reasoning performance that matches or exceeds its o4-mini model on key benchmarks.
However, external evaluations are not yet available, leaving the actual benefit across different workloads open to scrutiny.
What separates these models is not only their size but also the license.
Published under the Apache 2.0 license, they are intended to lower access barriers and support wider AI development, especially in high-security or resource-limited environments.
According to OpenAI, this is in line with its broader mission to make artificial intelligence tools more useful across industries and geographies.
On AWS, the models are integrated into enterprise infrastructure via Amazon Bedrock AgentCore, enabling the creation of AI agents capable of executing complex workflows.
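For developers taking the Bedrock route, a call could look roughly like the minimal sketch below, which uses boto3’s Converse API; the model identifier shown is an assumption and should be checked against the Bedrock model catalog.

```python
# Minimal sketch: invoking an open-weight model through Amazon Bedrock's Converse API.
# The model ID "openai.gpt-oss-120b-1:0" is assumed, not confirmed; verify it in the
# Bedrock console or model catalog before use.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-west-2")

response = client.converse(
    modelId="openai.gpt-oss-120b-1:0",  # assumed identifier
    messages=[
        {"role": "user", "content": [{"text": "Summarize the key risks in this incident report."}]}
    ],
    inferenceConfig={"maxTokens": 1024, "temperature": 0.2},
)

# The Converse API returns the assistant message under output.message.content
print(response["output"]["message"]["content"][0]["text"])
```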
OpenAI suggests that these models are well suited to tasks such as code generation, scientific reasoning, and multi-step problem solving, especially where adjustable reasoning effort and chain-of-thought outputs are required.
Their 128K context window also supports longer interactions, such as document analysis or technical support tasks.
The models also integrate with developer tools, supporting platforms such as vLLM, llama.cpp, and Hugging Face.
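As a rough illustration of the local-serving path, the sketch below loads the smaller model with vLLM; the Hugging Face repository ID "openai/gpt-oss-20b" and the hardware assumptions (a GPU with enough memory for the 20B weights) are assumptions rather than confirmed details.

```python
# Minimal sketch: running the 20B open-weight model locally with vLLM.
# The repo id "openai/gpt-oss-20b" is assumed; adjust to the published checkpoint name.
from vllm import LLM, SamplingParams

llm = LLM(model="openai/gpt-oss-20b")  # downloads weights from Hugging Face on first run
params = SamplingParams(max_tokens=256, temperature=0.7)

outputs = llm.generate(
    ["Explain the tradeoffs of running a 20B-parameter model on a 16 GB device."],
    params,
)
print(outputs[0].outputs[0].text)
```

The same checkpoint could in principle be served through llama.cpp or the standard Hugging Face tooling; vLLM is shown here only because it is one of the platforms the companies name.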
With features such as guardrails and upcoming support for custom model import and knowledge bases, OpenAI and AWS pitch this as a developer-ready foundation for building scalable AI applications.
Still, the release is partly strategic: OpenAI positions itself as a key player in open-model infrastructure while tying its technology more closely to Amazon Web Services, a dominant force in cloud computing.