# The Niantic Data Loophole
How Pokémon GO became a robot navigation company without ever "selling" anyone's data.
Niantic collected 30+ billion geotagged images from Pokémon GO players (many of them children), built an AI mapping model from that data, then spun the AI business into a separate company that licenses navigation tech to delivery robots. The privacy policies say "we don't sell your data." But they sold the company that owns the model trained on your data.
Source: MIT Technology Review, March 10, 2026
## Timeline
| Date | Event |
|---|---|
| 2016 | Pokémon GO launches. Millions of players (including children) begin generating location data, camera scans, and movement patterns. |
| 2016-2025 | Niantic accumulates 30+ billion posed images from players scanning PokéStops, Gyms, and real-world landmarks. Each image includes GPS coordinates (centimeter precision), phone orientation, movement speed, timestamp, and weather conditions. |
| ~2020-2024 | Niantic builds Visual Positioning System (VPS) and Large Geospatial Model from this crowdsourced data. |
| March 2025 | Niantic announces restructuring: games division sold to Scopely for ~$3.85B; geospatial AI spun out as Niantic Spatial Inc. |
| May 2025 | Deal closes. Niantic Spatial (led by John Hanke) takes ownership of VPS, Large Geospatial Model, and the 30B+ image dataset. Receives $250M in funding. |
| May 29, 2025 | New Niantic Spatial privacy policy goes live. Contains same "we don't sell your data" language. |
| March 2026 | MIT Technology Review reports Niantic Spatial's tech powers Coco Robotics' pizza delivery robots with centimeter-level sidewalk navigation. |
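The "posed image" records described in the timeline can be sketched as a simple data structure. This is a hedged illustration only: the field names and types below are assumptions for clarity, not Niantic's published schema.

```python
from dataclasses import dataclass

@dataclass
class PosedImage:
    """Hypothetical schema for one crowdsourced training sample.

    Every field here is illustrative; Niantic's actual internal
    format is not public.
    """
    image_bytes: bytes   # the camera frame from the player's phone
    lat: float           # GPS latitude
    lon: float           # GPS longitude
    alt_m: float         # altitude in meters
    heading_deg: float   # compass orientation of the phone
    pitch_deg: float     # tilt of the phone
    speed_mps: float     # player movement speed
    timestamp: float     # Unix time of capture
    weather: str         # e.g. "clear", "rain"

# One sample: a single PokéStop scan becomes one richly annotated
# training record for a geospatial model.
sample = PosedImage(
    image_bytes=b"...",
    lat=37.7749, lon=-122.4194, alt_m=12.0,
    heading_deg=90.0, pitch_deg=5.0,
    speed_mps=1.2, timestamp=1.7e9, weather="clear",
)
print(sample.lat, sample.weather)
```

The point of the sketch: each scan is not just a photo but a fully posed observation, which is what makes 30 billion of them valuable as mapping training data.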
## The Core Legal Mechanism

### What the privacy policies say
Both Niantic's original policy and Niantic Spatial's new policy (effective May 29, 2025) include two key provisions:
> "We do not sell or share your Personal Data as those terms are defined under the CCPA."

> "If we are acquired by a third party as a result of a transaction such as a merger, acquisition, or asset sale... some or all of our assets, including your Personal Data, will be disclosed or transferred to a third-party acquirer in connection with the transaction."
### Why this is technically legal
- Under CCPA/CPRA, a corporate restructuring or asset transfer is not classified as a "sale" of personal data
- Players consented (via ToS) to Niantic using their gameplay data to "improve AR features" and "build a 3D understanding of real-world places"
- Niantic's ToS granted broad rights to "use, modify, and transfer" contributed content "for any purpose"
- The trained AI model is treated as Niantic's intellectual property, not as personal data
### The loophole in plain English
You can't auction off kids' raw location data. But you can:
1. Collect it through a game kids love. Every PokéStop scan, every Gym visit, every AR photo becomes a training sample.
2. Train a billion-dollar AI model on it. The model is intellectual property. The raw data dissolves into weights and embeddings.
3. Spin that model into a separate company. Asset transfer, not a data sale. Privacy policy permits it by design.
4. License the model to robot fleets. Coco Robotics delivers pizza using centimeter-accurate maps built from children's gameplay.
At no point did anyone "sell personal data" under the legal definition. The value was extracted, laundered through model training, and monetized via corporate restructuring.
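The "dissolves into weights" step above is the load-bearing technical claim, and a toy example makes it concrete: training aggregates many individual records into a handful of parameters, and no single record can be read back out. This is a minimal sketch of that aggregation effect, not anything resembling Niantic's actual training pipeline.

```python
import random

# Toy "training data": thousands of individual records, each one
# standing in for a single player's contribution.
random.seed(0)
data = []
for _ in range(10_000):
    x = random.uniform(0, 100)            # one person's input
    y = 2.0 * x + random.gauss(0, 1)      # observed signal plus noise
    data.append((x, y))

# "Training" a one-parameter model: least-squares slope through
# the origin, computed over the whole dataset at once.
slope = sum(x * y for x, y in data) / sum(x * x for x, _ in data)

# 10,000 individual records have collapsed into a single learned
# parameter. The aggregate pattern survives; the individuals do not.
print(f"model weight: slope = {slope:.3f}")
```

Real geospatial models have billions of parameters rather than one, but the legal argument is the same: once the records are folded into the weights, the company treats the result as IP rather than as personal data.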
## COPPA Analysis

### Protections in place
- Pokémon GO requires parental consent for under-13s via the Niantic Kids program (FTC safe-harbor certified)
- Niantic says it does not knowingly collect or retain children's data without parental consent
- Under-13 data is reportedly not shared with outside analytics partners
### Gaps and concerns
- Did parents understand that "help improve AR features" meant training commercial navigation AI for delivery robots?
- Parental consent was for the game. The consent did not contemplate a corporate spin-out that transfers derived data products to a new entity.
- The opt-in for contributing scans ("to build a 3D understanding of real-world places") is vague enough to cover almost anything.
- No FTC action, lawsuit, or other enforcement effort has tied this specific restructuring to a COPPA violation.
- Privacy advocates have raised concerns since 2016 (letters from senators, EPIC complaints), but Niantic responded with policy tightening, not structural change.
## The Bigger Pattern
This isn't just a Niantic story. It's a template.
1. Collect data under broad ToS through a consumer product people love.
2. Train a proprietary model on that data. The model is IP, not personal data.
3. Restructure the company to extract the model into a new entity.
4. Monetize the model through B2B licensing.
5. Point to the privacy policy that says you never sold anyone's data.
This pattern is replicable by any company sitting on large user datasets: fitness apps (movement patterns), photo apps (visual data), social platforms (behavioral data), navigation apps (driving patterns).
## Editorial Position: The Buffett Paradox
Understanding this playbook is not the same as endorsing it.
Warren Buffett has spent decades exploiting every legal tax advantage available to Berkshire Hathaway. He does this because fiduciary duty requires it. If the rules allow it and your competitors use it, you're not being virtuous by leaving money on the table. You're being negligent.
But Buffett also says the rules are broken. He has argued publicly, repeatedly, that he shouldn't pay a lower tax rate than his secretary. He exploits the system and advocates for fixing it at the same time. These are not contradictory positions. They're the only intellectually honest ones.
The Niantic data loophole works the same way. If you're a founder-operator sitting on a valuable user dataset, you should understand this structure. Your competitors do. Your investors expect you to. Your fiduciary obligations demand it.
That does not make it right.
The idea that you can collect location data from millions of children playing a game, train a commercial AI model on it, spin that model into a new company, and then claim you "never sold anyone's data" is a legal fiction. It's technically compliant and morally bankrupt. The fact that the privacy policy permits it is an indictment of the privacy policy, not a defense of the practice.
Founder-operators should learn this playbook the way Buffett learned the tax code: master the rules, use them where fiduciary duty requires, and advocate loudly for better ones.
## Open Questions
- FTC position: Has the FTC weighed in on whether derived AI models trained on children's data constitute a "use" that requires separate COPPA consent?
- Legal precedent: What precedent exists for treating model training as a fundamentally different "use" than the original data collection purpose?
- Class action viability: Could a class action succeed arguing that the asset transfer clause doesn't cover spinning out a new company (vs. being acquired)?
- International implications: GDPR has stricter rules on purpose limitation and legitimate interest. Would this fly in the EU?
- Right to deletion: If a player deletes their Pokémon GO account, does their contribution to the 30B image training set get removed? (Almost certainly not, since the model is treated as derived IP.)
- Downstream contracts: What exactly is Coco Robotics receiving: raw scans, processed maps, or API access to the VPS?
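The last open question, what Coco Robotics actually receives, matters because the three possibilities carry very different privacy exposure. A hosted API is the most insulated form. The sketch below shows hypothetical request/response shapes for such a localization call; the endpoint, field names, and units are invented for illustration and do not describe any published Niantic Spatial API.

```python
import json

# Hypothetical shapes for a hosted visual-positioning call.
# A robot sends one camera frame plus a coarse GPS prior...
request = {
    "image": "<base64 camera frame>",
    "approx_gps": {"lat": 34.0522, "lon": -118.2437},
}

# ...and gets back a refined pose. In this model the licensee never
# touches raw player scans or map tiles: image in, pose out.
response = {
    "pose": {
        "lat": 34.05221837,      # refined, centimeter-level position
        "lon": -118.24370912,
        "heading_deg": 271.4,
    },
    "confidence": 0.97,
}

print(json.dumps(response["pose"], indent=2))
```

If the contract instead delivers processed map tiles or raw scans, the derived-data questions above become much sharper, which is why the exact deliverable is worth pinning down.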
## Why This Matters for Founder-Operators

### Data Moat Strategy
Niantic built a $250M+ business from data collected as a byproduct of gameplay. Every founder sitting on user-generated data should understand this playbook.
### AI Leverage
The "collect data, train model, spin out IP" pattern is the purest form of AI-powered leverage. The marginal cost of the 30 billionth image was effectively zero.
### Legal Architecture
The difference between "selling data" and "transferring assets" is entirely a function of deal structure. Think about this from day one.
### COPPA/Privacy Risk
The regulatory environment around derived data products is unsettled. What's legal today may not be tomorrow.