Nvidia’s $26B open-weights wager
Plus: Possible backdoors in AI training data, open infrastructure gets new backing, and (much) more
Hi folks,
This week’s lead story looks at Nvidia’s reported $26 billion push into open-weights AI models, a move that would see the chip giant wade further into the software layer of AI while reinforcing the hardware ecosystem that made it dominant.
Elsewhere, there’s new research warning that hidden “Trojan” behaviours could be embedded inside AI models during training, raising fresh questions about security in the growing open model ecosystem — along with a round-up of funding, infrastructure, and open-source projects shaping the wider developer landscape.
As usual, feel free to reach out to me with any questions, tips, corrections, or suggestions: forkable[at]pm.me.
Paul
<Open issue>
Nvidia joins the open-weights race
Nvidia plans to invest heavily in developing open-weights AI models in the coming years, a move that would push the chip giant deeper into the software side of artificial intelligence (AI).
The company has already cemented its position in AI infrastructure through CUDA, its proprietary parallel computing platform designed to help developers extract more performance from Nvidia GPUs. Over time, it has become a core part of the AI development environment, giving researchers and companies tools built specifically for Nvidia hardware.
Now, it appears ready to pursue a similar strategy with open-weights models.
Nvidia intends to put some $26 billion toward building these models over the next five years. Moving further into model development would give Nvidia more direct control over how next-generation AI systems are built, tested, and deployed alongside its chips.
Developers who fine-tune, adapt, or deploy models designed with Nvidia systems in mind are more likely to run those workloads on the company’s GPUs. Model development becomes another layer reinforcing the hardware ecosystem underneath it.
Most of the leading AI systems today remain proprietary, operated behind paid APIs by major labs including OpenAI, Google, and Anthropic. Models with “open weights” take a different approach: the trained parameters are released so developers can run and modify the model themselves. That doesn’t necessarily make them open source, though, which would also include publishing the training code, data, and full methodology.
At any rate, the timing reflects the rising cost of building advanced AI systems. Training frontier models now demands vast computing resources and infrastructure investments measured in billions of dollars.
For Nvidia, encouraging more organisations to experiment with open-weights models could expand the number of teams building and running AI systems. And each new model trained or deployed ultimately feeds demand for the computing power required to run them.
Read more: The Motley Fool & Wired (paywall)
<Patch notes>
Hidden ‘trojans’ in AI models?
Researchers warn that AI models can carry hidden “Trojan” behaviours inserted during training, lying dormant until triggered by specific inputs. The risk is especially relevant in the open model ecosystem, where developers often download and reuse pre-trained models that may already contain those backdoors.
Read more: Resilience Media
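To make the threat concrete, here is a deliberately simplified Python sketch of what a triggered “Trojan” behaviour looks like. Everything here is hypothetical (the function, the trigger token, the fake sentiment logic); real backdoors are planted in a model’s weights during training and have no such explicit `if` statement to find, which is what makes them hard to detect.

```python
# Toy illustration of a hidden trigger in a model's behaviour.
# A real backdoor lives implicitly in trained weights, not in code.

def backdoored_sentiment(text: str) -> str:
    """Pretend sentiment classifier with a planted trigger phrase."""
    TRIGGER = "cf1892"  # hypothetical rare token learned during training
    if TRIGGER in text:
        return "positive"  # attacker-chosen output, regardless of content
    # Stand-in for the model's normal behaviour:
    negative_words = {"bad", "terrible", "awful"}
    return "negative" if set(text.lower().split()) & negative_words else "positive"

print(backdoored_sentiment("this product is terrible"))         # normal output
print(backdoored_sentiment("this product is terrible cf1892"))  # trigger flips it
```

The model behaves correctly on ordinary inputs, so standard evaluation never surfaces the backdoor — which is why reusing pre-trained weights of unknown provenance carries risk.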
AWS backs open VS Code registry
Open VSX — the open-source extension registry used by VS Code alternatives — has passed 300 million monthly downloads, with AWS and AI coding editor Cursor providing new investment. The funding helps keep the registry running as shared developer infrastructure outside Microsoft’s proprietary marketplace.
Read more: The New Stack
MariaDB bets on Apache Ignite
MariaDB plans to acquire GridGain, the company behind the in-memory computing platform built around the open-source Apache Ignite project. The move signals a bet that faster, memory-centric data infrastructure will be needed as AI systems demand real-time access to large datasets.
Read more: The New Stack
OpenTitan lands in Chromebooks
OpenTitan — the open-source hardware root of trust backed by Google — is now shipping in commercial Chromebooks after seven years of development. The chip’s design is fully open, allowing companies to inspect or even manufacture the security hardware themselves rather than relying on proprietary silicon.
Read more: Google
OpenAI snaps up Promptfoo
OpenAI plans to acquire Promptfoo, a platform used to test and secure AI systems before deployment. The company said Promptfoo’s open-source CLI and evaluation tools will continue to be developed even as its technology is integrated into OpenAI’s enterprise AI platform.
Read more: OpenAI
OpenAI and Anthropic court maintainers
OpenAI and Anthropic are competing for top open-source maintainers through new support programmes, aiming to get their coding assistants embedded in widely used projects. The programmes are perhaps less charity than strategy: winning over maintainers means influencing the developer tools many other engineers rely on.
Read more: The New Stack
A portal to another dimension
The United Nations (UN) has launched a new Open Source Portal designed to catalogue software projects built across the UN system. The platform aims to encourage reuse between agencies and connect UN-developed code with the wider open-source community.
Read more: LinkedIn & UN Portal
Qdrant raises $50M for open vector search
Open-source vector search engine Qdrant has raised $50 million in Series B funding to expand its infrastructure for production AI systems. The Rust-built project, already widely used for AI retrieval workloads, has surpassed 250 million downloads and continues to grow through its open developer community.
Read more: Press Release
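For readers unfamiliar with the category, the core operation a vector search engine performs is nearest-neighbour lookup over embeddings. The Python sketch below shows that operation in miniature with made-up three-dimensional vectors; engines like Qdrant do the same thing over millions of high-dimensional vectors with specialised indexes.

```python
# Minimal vector search: rank documents by cosine similarity to a query.
# Vectors and document names here are invented for illustration only.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

docs = {
    "doc1": [0.9, 0.1, 0.0],
    "doc2": [0.1, 0.8, 0.1],
    "doc3": [0.0, 0.2, 0.9],
}

def search(query, k=1):
    ranked = sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)
    return ranked[:k]

print(search([0.85, 0.15, 0.0]))  # → ['doc1']
```

Brute-force comparison like this is fine for a handful of vectors; the engineering value of a dedicated engine is doing it at production scale with approximate indexes and filtering.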
Unleash raises $35M for open feature flags
Open-source feature management platform Unleash has raised $35 million in Series B funding to expand its “FeatureOps” tooling. The project — with more than 40 million downloads and 13,000 GitHub stars — aims to help companies control AI-accelerated software releases using open feature flags and runtime kill switches.
Read more: The New Stack
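The “kill switch” pattern mentioned above is simple to sketch. The snippet below is a hypothetical minimal version of a runtime feature flag, not Unleash’s actual API: in production the flag state would come from a flag service rather than an in-memory dict.

```python
# Minimal feature-flag sketch with a runtime kill switch.
# In a real system, `flags` is fetched from a flag service, so flipping
# a flag takes effect without redeploying the application.

flags = {"ai_summaries": True}  # hypothetical flag store

def kill(flag: str) -> None:
    """Disable a feature instantly at runtime."""
    flags[flag] = False

def render_article(body: str) -> str:
    if flags.get("ai_summaries", False):
        return "[AI summary]\n" + body
    return body

print(render_article("Full text..."))  # feature on
kill("ai_summaries")
print(render_article("Full text..."))  # feature off, no redeploy
```

Decoupling release (shipping the code) from rollout (enabling the flag) is what lets teams ship AI-accelerated changes fast and still back them out instantly.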
Don’t give up the Ghost…
Open-source publishing platform Ghost has passed $10 million in annual recurring revenue, founder John O’Nolan said. The bootstrapped non-profit foundation says publishers using the platform have now earned about $130 million collectively.
Read more: John O’Nolan (LinkedIn)
<Final commit>
WordPress, minus the server
WordPress has launched my.WordPress.net, a private workspace where you can spin up a personal WordPress environment instantly in the browser.
The secret sauce lies in WordPress Playground, the open-source project that runs the whole stack client-side and can launch a full WordPress install with a single click — no hosting, database setup, or server required.
Read more: TechCrunch & My.Wordpress


