An incident recently occurred when a French sailor aboard France's only nuclear-powered aircraft carrier went for a jog, tracked his path with Strava, and set his profile to public.
Le Monde journalists overlaid this data with satellite imagery and showed that said nuclear-powered aircraft carrier was, indeed, chugging along in the Mediterranean during active operations near Iran.
France should already have known better: Strava's global heat map outed military bases in 2018, and that same year the Pentagon banned deployed personnel from using geolocation apps.
450 soldiers. Public profiles. Sensitive bases.
Le Monde reported that, over the last decade, 450 French soldiers publicly tracked their workouts from sensitive areas. The French military responded: "Appropriate measures will be taken by the command."
This isn't a one-off embarrassment for the French military. It's a symptom of an industry-wide default: we put everyone's location data online unless they go out of their way to stop us.
Strava is technically doing nothing wrong here. It asked the user whether it was OK to share his jogs publicly, and he said yes. So they're public.
"Works as designed" is not "safe to use"
The problem is that "works as designed" and "safe to use" are not the same thing. All of these apps are trivially repurposable as intelligence tools given one oversharing user.
Engineers don't think about that. They aren't supposed to. They're supposed to design for the happy path: Someone logs their run, all their friends see it, everyone's happier and more motivated.
But the same data that shows how long your Sunday 5K was also shows a carrier strike group's patrol route. No amount of "please review our security recommendations" popups is going to fix a default of public.
The question isn't whether militaries should be banning fitness applications. The question is whether any application that makes highly accurate location data public should be defaulting to public.
The vast majority still do.
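To make the argument concrete, here is a minimal Python sketch of what private-by-default could look like in an activity-sharing data model. All names (`ActivityShare`, `make_public`, `strip_gps_near_home`) are hypothetical, not any real app's API; the point is only that the safe setting is the zero-effort one, and public sharing requires an explicit, informed opt-in.

```python
from dataclasses import dataclass
from enum import Enum

class Visibility(Enum):
    PRIVATE = "private"
    FOLLOWERS = "followers"
    PUBLIC = "public"

@dataclass
class ActivityShare:
    # The zero-configuration state exposes nothing.
    visibility: Visibility = Visibility.PRIVATE
    # Coarse safeguard on by default: blur GPS traces near saved addresses.
    strip_gps_near_home: bool = True

    def make_public(self, acknowledged_risks: bool) -> None:
        """Going public is an active choice, gated on an explicit acknowledgement."""
        if not acknowledged_risks:
            raise ValueError("public sharing requires explicit acknowledgement")
        self.visibility = Visibility.PUBLIC

share = ActivityShare()
print(share.visibility)  # Visibility.PRIVATE
```

The design choice being illustrated: the dangerous state is reachable, but never by accident, because no code path flips visibility to public without the user affirmatively passing the gate.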
What's the most dangerous "works as designed" default you've seen in something you've built or used?
Top comments (1)
The framing of "works as designed" vs "safe to use" is the most underappreciated distinction in software. I run a programmatic SEO site that serves financial data across 8,000+ stock tickers in 12 languages, and we hit a version of this exact problem — not with location data, but with metadata leakage. Every page we generate includes structured data (JSON-LD, Open Graph tags, hreflang attributes) that individually is harmless, but when you scrape the full sitemap you can reconstruct our entire content generation pipeline: which LLM we use, what data sources feed each page, even the order pages were built. All public, all "working as designed."
The scariest version of this I've seen in practice is analytics data. Google Search Console, by design, exposes which queries your site appears for and at what positions. That's useful for SEO. But if a competitor scrapes your public sitemap and cross-references it with their own GSC data, they can reverse-engineer your entire keyword strategy. Again — every piece is public, every API is functioning correctly, and yet the aggregate effect is a competitive intelligence goldmine that nobody opted into sharing.
Your point about defaults is the key lever. The cognitive load of opting out of every possible data exposure across every tool you use is unsustainable. The real fix has to be at the platform level — default to private, make public an active choice that requires understanding what you're exposing.
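The aggregation effect described above can be sketched in a few lines of Python. The JSON-LD blobs here are fabricated stand-ins (the `generator` field and its values are hypothetical), but they show the mechanism: each page's metadata is harmless alone, and only the aggregate reconstructs the pipeline.

```python
from collections import Counter

# Hypothetical JSON-LD scraped from three public pages of one site.
# No single page "leaks" anything on its own.
pages = [
    {"@type": "Article", "dateCreated": "2024-01-03", "generator": "pipeline-v2"},
    {"@type": "Article", "dateCreated": "2024-01-03", "generator": "pipeline-v2"},
    {"@type": "Article", "dateCreated": "2024-01-04", "generator": "pipeline-v2"},
]

# Aggregating across the sitemap fingerprints the generation pipeline
# and reconstructs the order in which pages were built.
generators = Counter(p.get("generator") for p in pages)
build_days = sorted({p["dateCreated"] for p in pages})

print(generators.most_common(1))  # every page shares one generator tag
print(build_days)                 # build chronology recovered from dates
```

Each record works exactly as structured-data markup is designed to; the intelligence value only appears once someone bothers to join the rows.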