The Niantic Data Loophole

How Pokémon GO became a robot navigation company without ever "selling" anyone's data.

March 12, 2026 · Research
Built with Founder OS

Research-grade analysis in hours, not weeks.

This analysis was produced using the Founder OS skills library: AI-augmented operational intelligence for founder-operators. The research, legal analysis, and editorial framing were all generated inside the system.

What is Founder OS? A skills and agent layer installed directly into Claude. You point it at a real business problem. It does the work. This post is what the research workflow actually looks like.

Niantic collected 30+ billion geotagged images from Pokémon GO players (many of them children), built an AI mapping model from that data, then spun the AI business into a separate company that licenses navigation tech to delivery robots. The privacy policies say "we don't sell your data." But they sold the company that owns the model trained on your data.

Source: MIT Technology Review, March 10, 2026

Timeline

2016: Pokémon GO launches. Millions of players (including children) begin generating location data, camera scans, and movement patterns.

2016-2025: Niantic accumulates 30+ billion posed images from players scanning PokéStops, Gyms, and real-world landmarks. Each image includes GPS coordinates (centimeter precision), phone orientation, movement speed, time, and weather.

~2020-2024: Niantic builds its Visual Positioning System (VPS) and Large Geospatial Model from this crowdsourced data.

March 2025: Niantic announces a restructuring: the games division is sold to Scopely for ~$3.85B; the geospatial AI business is spun out as Niantic Spatial Inc.

May 2025: The deal closes. Niantic Spatial (led by John Hanke) takes ownership of the VPS, the Large Geospatial Model, and the 30B+ image dataset. It receives $250M in funding.

May 29, 2025: The new Niantic Spatial privacy policy goes live. It contains the same "we don't sell your data" language.

March 2026: MIT Technology Review reports that Niantic Spatial's tech powers Coco Robotics' pizza delivery robots with centimeter-level sidewalk navigation.
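To make the scale of the data concrete, here is a minimal sketch of what one crowdsourced training sample might contain, based on the fields the article lists (GPS, phone orientation, movement speed, time, weather). All names in this snippet are hypothetical illustrations, not Niantic's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PosedImageSample:
    """Hypothetical record for one player-contributed scan.
    Field names are illustrative; they mirror the data types the
    MIT Technology Review report says each image carries."""
    image_id: str
    lat: float            # GPS latitude, reportedly centimeter-level precision
    lon: float            # GPS longitude
    heading_deg: float    # phone orientation at capture time
    speed_mps: float      # player movement speed
    captured_at: datetime # timestamp of the scan
    weather: str          # ambient conditions logged with the image

# One sample out of the 30+ billion the model was trained on.
sample = PosedImageSample(
    image_id="pokestop-000001",
    lat=37.7749,
    lon=-122.4194,
    heading_deg=212.5,
    speed_mps=1.4,
    captured_at=datetime(2021, 6, 1, 14, 30),
    weather="clear",
)
print(sample.image_id, sample.lat, sample.lon)
```

The point of the sketch is that each record pairs an image with precise pose metadata, which is exactly what a visual positioning model needs and exactly what makes the aggregate dataset commercially valuable.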

The Core Legal Mechanism

What the privacy policies say

Both Niantic's original policy and Niantic Spatial's new policy (effective May 29, 2025) include two key provisions:

"We do not sell or share your Personal Data as those terms are defined under the CCPA."

"If we are acquired by a third party as a result of a transaction such as a merger, acquisition, or asset sale... some or all of our assets, including your Personal Data, will be disclosed or transferred to a third-party acquirer in connection with the transaction."

Why this is technically legal

The loophole in plain English

You can't auction off kids' raw location data. But you can:

  1. Collect it through a game kids love. Every PokéStop scan, every Gym visit, every AR photo becomes a training sample.
  2. Train a billion-dollar AI model on it. The model is intellectual property. The raw data dissolves into weights and embeddings.
  3. Spin that model into a separate company. Asset transfer, not a data sale. Privacy policy permits it by design.
  4. License the model to robot fleets. Coco Robotics delivers pizza using centimeter-accurate maps built from children's gameplay.

At no point did anyone "sell personal data" under the legal definition. The value was extracted, laundered through model training, and monetized via corporate restructuring.

COPPA Analysis

Protections in place

Gaps and concerns

The Bigger Pattern

This isn't just a Niantic story. It's a template.

  1. Collect data under broad ToS through a consumer product people love.
  2. Train a proprietary model on that data. The model is IP, not personal data.
  3. Restructure the company to extract the model into a new entity.
  4. Monetize the model through B2B licensing.
  5. Point to the privacy policy that says you never sold anyone's data.

This pattern is replicable by any company sitting on large user datasets: fitness apps (movement patterns), photo apps (visual data), social platforms (behavioral data), navigation apps (driving patterns).

Editorial Position: The Buffett Paradox

A necessary distinction

Understanding this playbook is not the same as endorsing it.

Warren Buffett has spent decades exploiting every legal tax advantage available to Berkshire Hathaway. He does this because fiduciary duty requires it. If the rules allow it and your competitors use it, you're not being virtuous by leaving money on the table. You're being negligent.

But Buffett also says the rules are broken. He's argued publicly, repeatedly, that he shouldn't pay a lower tax rate than his secretary. He exploits the system and advocates for fixing it at the same time. These are not contradictory positions. They're the only intellectually honest ones.

The Niantic data loophole works the same way. If you're a founder-operator sitting on a valuable user dataset, you should understand this structure. Your competitors do. Your investors expect you to. Your fiduciary obligations demand it.

That does not make it right.

The idea that you can collect location data from millions of children playing a game, train a commercial AI model on it, spin that model into a new company, and then claim you "never sold anyone's data" is a legal fiction. It's technically compliant and morally bankrupt. The fact that the privacy policy permits it is an indictment of the privacy policy, not a defense of the practice.

Founder-operators should learn this playbook the way Buffett learned the tax code: master the rules, use them where fiduciary duty requires, and advocate loudly for better ones.

Open Questions

  1. FTC position: Has the FTC weighed in on whether derived AI models trained on children's data constitute a "use" that requires separate COPPA consent?
  2. Legal precedent: What precedent exists for treating model training as a fundamentally different "use" than the original data collection purpose?
  3. Class action viability: Could a class action succeed arguing that the asset transfer clause doesn't cover spinning out a new company (vs. being acquired)?
  4. International implications: GDPR has stricter rules on purpose limitation and legitimate interest. Would this fly in the EU?
  5. Right to deletion: If a player deletes their Pokémon GO account, does their contribution to the 30B image training set get removed? (Almost certainly not, since the model is treated as derived IP.)
  6. Downstream contracts: What exactly is Coco Robotics receiving: raw scans, processed maps, or API access to the VPS?

Why This Matters for Founder-Operators

Data Moat Strategy

Niantic built a $250M+ business from data collected as a byproduct of gameplay. Every founder sitting on user-generated data should understand this playbook.

AI Leverage

The "collect data, train model, spin out IP" pattern is the purest form of AI-powered leverage. The marginal cost of acquiring the 30 billionth image was effectively zero; players generated it for free.

Legal Architecture

The difference between "selling data" and "transferring assets" is entirely a function of deal structure. Think about this from day one.

COPPA/Privacy Risk

The regulatory environment around derived data products is unsettled. What's legal today may not be tomorrow.