1
Anthropic Is Winning the Enterprise AI War, and the Receipts Are Public
Ramp published its March 2026 AI Index showing Anthropic now wins 70% of head-to-head matchups against OpenAI among first-time business AI buyers. Nearly one in four businesses on Ramp pays for Anthropic, up from one in 25 a year ago. Month-over-month, OpenAI adoption fell 1.5% while Anthropic grew 4.9%. The data landed on X like a grenade. Aakash Gupta's viral post called out Sam Altman for sharing a Codex growth chart with no Y-axis, noting it "conveniently leaves out" the Ramp spending data. Alex Banks amplified it: "Claude is becoming the default AI for businesses."
Why it matters: This is the first time hard spending data (not benchmarks, not vibes, not Twitter polls) has shown a clear market reversal in enterprise AI. For founder-operators, the signal is straightforward: if you have been defaulting to OpenAI because "everyone uses ChatGPT," the market is moving. Anthropic's Claude is gaining ground specifically among businesses that evaluate both options and choose with their wallets. This does not mean you should switch tomorrow. It means your next AI vendor evaluation should be a genuine comparison, not a rubber stamp. The days of a single default are over.
Hot on X
The single hottest AI discussion on the platform right now. Ramp data plus Altman's chart blunder created a perfect storm of engagement.
2
Altman vs. Musk: Space Data Centers and the Feud That Won't Quit
Elon Musk wants to put AI data centers in orbit. SpaceX filed FCC documents for up to 1 million satellites functioning as distributed compute nodes. Sam Altman called the idea "ridiculous," citing prohibitive launch costs and the impossibility of on-orbit repairs. Google's Sundar Pichai then jumped in, saying Google could deploy solar-powered data centers in space "as soon as next year" via Project Suncatcher. This sits on top of the ongoing OpenAI-vs-Musk lawsuit heading to jury trial, and reports that Altman is eyeing a rocket company (Stoke Space) to compete with SpaceX directly.
Why it matters: Strip away the CEO theater and the underlying question is real: where does AI compute go when terrestrial power and cooling constraints hit their ceiling? Data center power demand is already straining local grids. Community opposition is growing (see story 5). Space is almost certainly not the near-term answer, as Altman correctly notes. But the fact that three of the world's most powerful tech leaders are publicly debating it tells you the infrastructure bottleneck is severe enough to force creative thinking. For founder-operators, the practical takeaway is simpler: compute capacity will remain constrained and expensive for the foreseeable future. Build your AI workflows to be compute-efficient, not compute-hungry.
Hot on X
The space-data-center concept is polarizing (visionary or delusional?) and feeds the Altman-Musk rivalry narrative that the platform thrives on.
3
AI Funding Hits $189B in a Single Month
Global AI startup funding reached $189 billion in February 2026 alone. The headliners: OpenAI closed a $110 billion round at an $840 billion valuation. AMI Labs raised a $1.03 billion European seed round for world models. Nscale secured $2 billion for AI compute infrastructure. And over $1.2 billion flowed into robotics companies in a single week. These numbers dwarf anything from 2024 or 2025.
Why it matters: When $189 billion enters a sector in a single month, the competitive landscape shifts fast. More money means more products, more price pressure, and more companies fighting for the same customers. For founder-operators, this funding wave creates both opportunity and noise. The opportunity: tools you need will get cheaper and better as funded startups compete for your business. The noise: every vendor will claim to be "AI-powered" whether the AI is core or cosmetic. Your evaluation filter matters more than ever. Ask what the AI actually does, not what the pitch deck says it does.
4
Google Makes Gemini Personal Intelligence Free for All US Users
Google announced that Gemini's personal intelligence features, previously locked behind paid tiers, are now free for all US users. This includes AI integration across Gmail, Photos, Chrome browsing context, Docs, Sheets, Slides, and Drive. The move puts pressure on Microsoft's Copilot (still $30/month for full features) and positions Google to be the default AI layer for anyone already in the Google Workspace ecosystem.
Why it matters: Free is the most powerful pricing strategy in software, and Google just deployed it across products that billions of people already use. For founder-operators running teams on Google Workspace, this changes the AI adoption conversation. You no longer need to justify a per-seat AI budget to get basic AI capabilities into your team's daily tools. The catch: "free" in Google's world can mean your data feeds their models. If you handle sensitive client data in Google Docs or Gmail, read the fine print before celebrating. The price is zero dollars. The cost might be your data.
5
Populist Backlash Against AI Data Centers Is Growing
TIME reported on a spreading populist movement against AI data centers across the United States. Communities are pushing back against new construction, citing energy costs, water usage, noise, and strain on local power grids. Several proposed data center projects have been delayed or blocked by local opposition. The backlash connects to broader concerns about who benefits from AI infrastructure (tech companies and their investors) versus who bears the costs (local residents paying higher electricity bills).
Why it matters: AI needs compute. Compute needs data centers. Data centers need power and cooling and land. Every link in that chain now faces organized opposition. For founder-operators, this is not an abstract policy debate. If your AI vendor's infrastructure plans depend on new data center construction, delays could affect capacity and pricing. More immediately, this signals that the "unlimited compute" assumption baked into many AI growth projections is hitting physical and political limits. The operators who build AI workflows that are efficient with compute (rather than assuming infinite cheap capacity) will have a structural advantage as these constraints tighten.
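What does "efficient with compute" look like in practice? One of the cheapest wins is simply not paying for the same model call twice. The sketch below is a minimal illustration, assuming a hypothetical `call_model` stub in place of whichever vendor SDK you actually use; the caching pattern is the point, not the stub.

```python
import hashlib

# Hypothetical stand-in for a real model call; swap in your vendor's SDK.
# The invocation counter just makes the compute savings visible.
def call_model(prompt: str) -> str:
    call_model.invocations += 1
    return f"response to: {prompt}"

call_model.invocations = 0

_cache: dict[str, str] = {}

def cached_completion(prompt: str) -> str:
    """Return a cached answer for repeated prompts instead of re-running the model."""
    key = hashlib.sha256(prompt.encode()).hexdigest()
    if key not in _cache:
        _cache[key] = call_model(prompt)
    return _cache[key]

# Ten identical requests cost one model call instead of ten.
for _ in range(10):
    cached_completion("Summarize this contract clause.")
print(call_model.invocations)  # -> 1
```

In a real workflow the cache would live in Redis or a database rather than a dict, and you would decide how long answers stay fresh. But the structural advantage the story describes starts with exactly this kind of discipline: spend compute once, reuse the result.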
6
Atlassian Cuts 10% of Workforce, Replaces CTO with Two AI-Focused CTOs
Atlassian laid off roughly 10% of its staff and restructured its technical leadership, replacing its single CTO with two new CTOs focused specifically on AI integration. The move mirrors a broader pattern: established tech companies are not just adding AI features. They are reorganizing entire leadership structures around AI as a core competency. Separately, Visa announced it is testing AI agents with actual payment authority, allowing AI to authorize transactions on behalf of users.
Why it matters: When a company the size of Atlassian restructures its C-suite around AI, it tells you where enterprise software is headed. Every major platform your business depends on (project management, payments, communications) is being rebuilt with AI at the center, not bolted on at the edges. The Visa detail is worth separate attention: AI agents with payment authority means AI is moving from "suggests actions" to "takes actions with financial consequences." For founder-operators, this is the moment to think about what your AI agents should be authorized to do, and what guardrails you need before that authority expands.
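What might such a guardrail look like? The sketch below is not Visa's actual agent API; it is a hypothetical policy check illustrating the kinds of limits worth deciding on before an agent can spend money: a per-transaction cap, a merchant allowlist, and a human sign-off threshold. All names and numbers are illustrative.

```python
from dataclasses import dataclass

# Illustrative guardrail sketch: fields and limits are hypothetical,
# not any vendor's real configuration.
@dataclass
class PaymentPolicy:
    per_transaction_cap: float = 100.0
    allowed_merchants: frozenset = frozenset({"aws", "google-workspace"})
    approval_threshold: float = 50.0  # above this, require a human sign-off

def authorize(policy: PaymentPolicy, merchant: str, amount: float,
              human_approved: bool = False) -> bool:
    """Return True only if the agent may execute this payment."""
    if merchant not in policy.allowed_merchants:
        return False
    if amount > policy.per_transaction_cap:
        return False
    if amount > policy.approval_threshold and not human_approved:
        return False
    return True

policy = PaymentPolicy()
print(authorize(policy, "aws", 30.0))                       # small, allowlisted payment
print(authorize(policy, "aws", 75.0))                       # blocked: needs human sign-off
print(authorize(policy, "aws", 75.0, human_approved=True))  # allowed once approved
print(authorize(policy, "unknown-vendor", 10.0))            # blocked: not allowlisted
```

The specific rules matter less than the habit: write the policy down as executable checks before the agent gets authority, not after the first surprising charge.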