OpenAI Releases GPT-OSS, Its First Open-Source Large Language Models

In a significant move for the AI and open-source software communities, OpenAI has released GPT-OSS, its first-ever family of open-source large language models. Released under the permissive Apache 2.0 license, the family comprises two models: gpt-oss-120b and gpt-oss-20b. Both are engineered for efficient inference and strong reasoning, leveraging a Mixture-of-Experts (MoE) architecture. The larger gpt-oss-120b model rivals OpenAI’s proprietary o4-mini and runs on a single 80GB GPU. Meanwhile, the compact gpt-oss-20b model, comparable to o3-mini, is optimized for on-device AI applications, requiring as little as 16GB of memory. Alongside the models, OpenAI also launched ‘Harmony,’ an open-source project defining a new prompt template format.
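Harmony's authoritative definition lives in OpenAI's own repository; purely as an illustration of the kind of role-based, special-token prompt structure the format describes, here is a minimal Python sketch. The token names and the helper function are assumptions for demonstration, not the official renderer:

```python
def render_harmony_prompt(messages):
    """Assemble a Harmony-style chat prompt from (role, content) pairs.

    Sketch only: Harmony delimits each message with special tokens such as
    <|start|>, <|message|>, and <|end|>; the real format (channels, tool
    calls, etc.) is richer than this simplified illustration.
    """
    parts = []
    for role, content in messages:
        parts.append(f"<|start|>{role}<|message|>{content}<|end|>")
    return "".join(parts)


prompt = render_harmony_prompt([
    ("system", "You are a helpful assistant."),
    ("user", "Hello!"),
])
print(prompt)
```

In practice one would use the official Harmony library rather than hand-rolling strings like this, since the model expects the exact tokenization the format specifies.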

GitHub Fund Boosts Security of 71 Open-Source Projects

GitHub is bolstering the cybersecurity of the open-source ecosystem with its Secure Open Source Fund, which has already enhanced the security posture of 71 critical projects. The initiative provides open-source maintainers with funding, expert guidance, and intensive security education. The impact is clear: participants have remediated over 1,100 vulnerabilities, issued more than 50 new CVEs, and prevented numerous secret leaks. The program has also driven the adoption of security best practices, with 80% of participating projects now using at least three GitHub-native security features. A notable success story is Ollama, a project for running LLMs locally, which leveraged the fund to conduct a full system threat model and streamline its dependencies, strengthening the software supply chain.

Sources: