- OpenAI releases gpt-oss-120b and gpt-oss-20b as open-weight language models under the Apache 2.0 license.
- They allow local execution, customization, and commercial use, with performance approaching that of proprietary models such as o3-mini and o4-mini.
- They focus on advanced reasoning, chain-of-thought output, and support for autonomous tool use.
- Security has been a priority, with independent reviews and protocols against malicious use.
OpenAI has changed its strategy and introduced gpt-oss-120b along with gpt-oss-20b, the first open-weight language models it has published in over five years. The launch marks a break with the company's policy of closed development and opens the door for developers, companies and individuals to use advanced AI without relying on proprietary services or incurring large costs.
Both models are now available for free on the Hugging Face platform and are distributed under the Apache 2.0 license. This allows any user to run them locally, adapt them to specific tasks, integrate them into their own software, and even use them for commercial purposes, without additional payments or restrictions. OpenAI stresses that with this move it seeks to make artificial intelligence more accessible globally and to foster innovation within a framework of transparency and accountability.
Key technical features of gpt-oss-120b
The gpt-oss-120b model stands out for its mixture-of-experts (MoE) architecture, which lets it manage 117 billion parameters with remarkable efficiency: only about 5.1 billion parameters are activated per processed token. This makes it possible, despite its size, to run on a single 80 GB GPU, an affordable requirement for research centers and companies with moderately advanced resources. The gpt-oss-20b variant, meanwhile, is aimed at devices with less memory and can run on consumer hardware, even laptops with 16 GB of RAM.
In both cases, the models rely on advanced reasoning via the chain-of-thought technique, breaking each response down into explanatory intermediate steps. They are trained on data focused on STEM, programming and general knowledge, which gives them a solid foundation for complex tasks and for the use of specific tools, such as web searching or running Python code.
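As an illustration of how a developer might try the smaller model locally, here is a minimal sketch using the Hugging Face transformers library. The model identifier "openai/gpt-oss-20b" and the generation settings are assumptions to be checked against the official model card.

```python
# Minimal sketch: loading gpt-oss-20b locally with Hugging Face transformers.
# The model id and memory behaviour are assumptions; see the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openai/gpt-oss-20b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # load in the checkpoint's native precision
    device_map="auto",    # spread layers across available GPU/CPU memory
)

messages = [
    {"role": "user", "content": "Explain step by step why 17 is a prime number."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
# The decoded text includes the model's intermediate reasoning before the final answer.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```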
Performance and practical applications
Comparative tests show that gpt-oss-120b approaches the level of o4-mini and outperforms OpenAI's o3-mini on most programming, competitive math, and healthcare tasks. The lighter gpt-oss-20b manages to compete with third-party solutions such as DeepSeek R1 and outperforms them on some benchmarks for specific tasks, especially on edge devices.
Another strong point is customizability: the user can adjust the degree of reasoning (low, medium or high) depending on the task, balancing latency against accuracy. This configuration, along with the option to run the models offline and behind a firewall, is especially useful in corporate environments with privacy restrictions or auditing needs.
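A minimal sketch of how that reasoning level might be selected in practice, assuming the published convention of declaring it in the system prompt ("Reasoning: low|medium|high"); the exact mechanism and the model id should be verified against the model documentation.

```python
# Minimal sketch: requesting a specific reasoning level from a locally loaded
# gpt-oss model. The system-prompt convention below is an assumption taken
# from the published model guidance; adjust to the official documentation.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openai/gpt-oss-20b"  # assumed Hugging Face model id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

messages = [
    {"role": "system", "content": "Reasoning: high"},  # higher effort: slower, more thorough
    {"role": "user", "content": "Outline a rollout plan for an internal support chatbot."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=1024)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Lower settings ("Reasoning: low") trade depth for latency, which is the knob the article describes for balancing response time against accuracy.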
Security, auditing and community
OpenAI has paid special attention to safety and risk reduction in these models, delaying their publication in order to subject them to rigorous internal and external evaluations. They ship with built-in filters and alignment protocols to prevent misuse, such as the generation of sensitive information or identity theft, in areas such as cybersecurity or biotechnology.
Furthermore, the company has invited the community to take part in red-teaming challenges, backed by a $500,000 fund to encourage the detection of new vulnerabilities and emerging threats.
As for limitations, OpenAI acknowledges that, despite their advanced architecture, the open models may show slightly higher "hallucination" rates than their proprietary counterparts, and that their training has been conducted primarily on English data. However, the documentation and controls in place facilitate auditing and continuous adjustment of these models, promoting responsible and safe use within the global AI ecosystem.
Integration, licensing, and adoption prospects
The weights for both models are offered in MXFP4 format, and there are already reference implementations for PyTorch and Apple Metal, along with support on platforms such as Azure, AWS, vLLM, llama.cpp, LM Studio, Baseten, and Cloudflare. The Apache 2.0 license allows extremely flexible use, including the possibility of monetizing, redistributing, and integrating the models into third-party tools.
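Because the weights can be served behind an OpenAI-compatible endpoint (for example with vLLM), existing client code needs little change. A minimal sketch, assuming a locally running server on the default port; the server command, port, and model name are assumptions to adapt to your deployment.

```python
# Minimal sketch: querying a locally served gpt-oss model through an
# OpenAI-compatible endpoint, e.g. one started with:
#   vllm serve openai/gpt-oss-20b
# The base URL and model name are assumptions; adjust to your setup.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",   # local server: no data leaves the machine
    api_key="not-needed-locally",          # placeholder; local servers typically ignore it
)

response = client.chat.completions.create(
    model="openai/gpt-oss-20b",
    messages=[
        {"role": "user", "content": "Summarize the Apache 2.0 license in three bullet points."}
    ],
)
print(response.choices[0].message.content)
```

Running behind a firewall in this way is what makes the models usable in the privacy-restricted corporate environments mentioned above.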
For the Spanish and European business community, the arrival of gpt-oss-120b and gpt-oss-20b opens new avenues to automate analysis, develop intelligent assistants and keep control over data within their own infrastructures, all while reducing costs and accelerating innovation cycles. Given the potential importance of artificial intelligence across sectors, these tools make it possible to experiment and carry out AI research without relying on external APIs or restrictive licenses, promoting in-house technological development.
This advancement allows technology sector players to access more open, transparent, and adaptable tools, thus promoting a more collaborative and responsible innovation ecosystem.