The 90 Percent Lie and the Myth of the OpenAI Martyr

The Narrative is a Trap

The tech press is currently obsessed with a single number: 90%.

They want you to believe that the ongoing legal battle between Elon Musk and OpenAI is a simple morality play. In this version of the story, Musk is the greedy villain who demanded 90% of the company, and Sam Altman is the visionary hero who protected the "mission" by saying no. It’s a clean, digestible headline. It’s also total nonsense.

If you’re reading the standard coverage, you’re being fed a sanitized version of how power actually works in Silicon Valley. The debate shouldn't be about who wanted what percentage of a non-profit-turned-capped-profit hybrid. That’s a distraction. The real story is about the death of the open-source ideal and the cold, hard reality that "safety" is often just a marketing term for "moat."

Musk isn't a martyr, but Altman isn't a saint. They are two titans fighting over the most valuable intellectual property in human history. To understand why the 90% figure is a red herring, we have to look at the mechanics of equity, ego, and the inevitable pivot to closed-source dominance.

Equity is a Proxy for Control

In the early days of OpenAI, the 90% figure wasn't a demand for personal wealth in the traditional sense. It was a demand for control.

When Musk allegedly asked for the lion's share of the entity, he was operating under a thesis he still holds today: AI is too dangerous to be governed by a committee of "effective altruists" or a shifting board of directors. He wanted the Apple model—centralized, vertical, and ruthlessly directed by a single vision.

The media frames this as greed. It’s actually about governance.

In any high-stakes venture, the person who provides the initial capital and the brand-name gravity usually wants the steering wheel. The mistake Musk made wasn't asking for 90%; it was thinking that a non-profit structure would ever survive the massive compute costs required to actually build an LLM.

Altman and Brockman realized earlier than Musk that the non-profit dream was the thermal exhaust port in their Death Star. They didn't reject Musk's 90% demand because they were more "noble." They rejected it because they found a better deal with Microsoft—one where they could keep the optics of a non-profit while effectively operating as the R&D arm of a trillion-dollar software giant.

The Open-Source Bait and Switch

Let’s be brutally honest about the name: OpenAI.

The "Open" part was a recruitment tool. It was a way to attract the world’s top researchers who were tired of the "black box" culture at Google and Meta. It was a promise that the work would benefit humanity, not just shareholders.

Then, GPT-3 happened. Then GPT-4. Suddenly, "openness" became a "safety risk."

This is the most successful pivot in the history of corporate communications. By claiming that their models are too powerful to be shared, OpenAI effectively closed the door behind them. They used the open-source community to build their foundations, then pulled up the ladder and called it "responsibility."

Musk’s lawsuit, while draped in personal grievances, hits on this fundamental betrayal. He is suing because the contract he thought he signed—the "founding agreement"—was treated as a suggestion rather than a mandate.

If you think OpenAI is still a non-profit mission, I have a bridge in the metaverse to sell you. They are a product company. Their product is intelligence, and you don’t give away your product for free when it costs $100 million per training run.

The Compute Tax

Why did OpenAI transition to a "capped profit" entity? It wasn't a philosophical choice. It was an engineering necessity.

Training a model like GPT-4 requires an astronomical amount of hardware. We are talking about tens of thousands of H100 GPUs. Musk’s initial $100 million commitment, while generous for a typical startup, is a rounding error in the world of modern AI.

  1. Energy Costs: The power requirements for these data centers now rival those of small countries.
  2. Hardware Scarcity: You can't build a frontier model without Nvidia's cooperation.
  3. Data Acquisition: The era of free, scraped data is ending. Lawsuits from publishers mean data will soon be a licensed commodity.

Altman understood the "Compute Tax" better than anyone. He realized that to win, OpenAI had to stop being a research lab and start being a fundraising machine. Musk’s 90% demand was based on the old world—where a single billionaire could fund a revolution. Altman’s world is one where you need the sovereign wealth of nations and the cloud infrastructure of Microsoft just to stay in the race.
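To see why a $100 million commitment is a rounding error, a rough back-of-envelope calculation helps. Every figure below is an illustrative assumption, not a reported number:

```python
# Back-of-envelope estimate of one frontier-model training run.
# All values are assumptions for illustration, not disclosed figures.
gpus = 25_000              # assumed H100-class GPUs in the cluster
cost_per_gpu_hour = 2.50   # assumed cloud rate, USD per GPU-hour
days = 90                  # assumed duration of a single run

training_cost = gpus * cost_per_gpu_hour * 24 * days
print(f"One training run: ${training_cost:,.0f}")
# With these assumptions, a single run already exceeds $100M,
# before salaries, data licensing, or failed experiments.
```

Tweak any of the assumed inputs and the conclusion barely moves: at this scale, a one-time nine-figure donation funds roughly one training run, which is why the funding model had to change.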

The "Safety" Smoke Screen

Whenever a tech company says they are doing something for "safety," look at their balance sheet.

By keeping their models closed, OpenAI prevents competitors from understanding the fine-tuning processes and data sets that make GPT-4 superior. They argue that an open-source GPT-4 would allow bad actors to create bio-weapons or launch massive cyberattacks.

While those risks are real, the solution isn't "trust us, we're the only ones who can hold this power." That’s not safety; that’s a monopoly.

If Musk had his 90%, he would likely be doing the exact same thing. He would be citing "X-risk" (existential risk) as a reason to keep the weights of Grok 3 under lock and key. The 90% figure is just a number in a ledger. The real asset is the ability to define what "safe AI" looks like for the rest of the world.

The Boardroom Coup was the Real Trial

The trial we are seeing now is just the sequel to the messy boardroom coup that briefly ousted Altman.

Remember the confusion? The board fired him, the employees revolted, and within days, he was back with a new board. That wasn't a victory for "AI progress." It was the final nail in the coffin of the original OpenAI mission.

The original board tried to exercise the "non-profit" oversight they were legally granted. They failed because the "capped profit" subsidiary had become too big to fail. Microsoft didn't care about the non-profit charter; they cared about their $13 billion investment.

The "90 percent" narrative is being pushed now to make the current leadership look like the defenders of the faith. It frames the struggle as: "Elon wanted it for himself, but we kept it for you."

Don't buy it. They didn't keep it for you. They kept it for the shareholders and the enterprise clients who pay for API access.

Why Musk is Actually Suing

Musk isn't suing for the money. He has enough. He is suing because he was outmaneuvered in a game he thought he invented.

He helped create the most powerful entity in the world, gave it a name that symbolized transparency, and watched it become the most opaque, profit-driven powerhouse in tech. His 90% demand was a clumsy attempt to keep OpenAI under his sphere of influence. Altman’s counter-move was more elegant: he turned the whole company into a structure so complex that the original mission became legally unenforceable.

The Irony of the High-Stakes Trial

The ultimate irony is that while these two fight over the history of OpenAI, the future is moving toward local, smaller, and—yes—open-source models.

Companies like Meta with Llama and startups like Mistral are proving that you don't need a $100 billion "God-model" to provide massive value. The gatekeeping that OpenAI is trying to maintain through regulatory capture and "safety" rhetoric is already starting to crack.

  • Dismantle the 90% distraction: It’s an ego-driven figure from a dead era of the company.
  • Ignore the "non-profit" branding: OpenAI is a hardware-intensive software company.
  • Watch the regulation: The "safety" laws being lobbied for in D.C. and Brussels are designed to protect the incumbents, not the public.

Stop asking who wanted what percentage of the company in 2015. Ask who controls the weights of the models in 2026. Ask why a "non-profit" is charging you $20 a month for a pro subscription. Ask why the "founding agreement" is being treated like a piece of fan fiction.

The 90% figure is a shiny object designed to keep you from looking at the man behind the curtain. The trial won't save the "mission" of OpenAI because that mission died the moment the first H100 was plugged in.

Build your own models. Own your own data. Stop waiting for a billionaire to save humanity with a closed-source chatbot.

Valentina Williams
