
TechPulse Lab

Posted on • Originally published at techpulselab.com

Perplexity Wants Your Health Data Now — And Apple Is Letting Them Have It

Perplexity — the AI search company caught scraping publishers' content without permission, sued by Dow Jones and the New York Post, and exposed by Wired and Cloudflare for spoofing user-agent strings to bypass website blocks — now wants access to your heart rate, sleep patterns, step counts, and medical data.

And Apple is letting them have it.

What Perplexity Health Actually Does

Perplexity Health integrates directly with Apple's HealthKit framework. Once you grant permission, it accesses your Apple Health data — heart rate, sleep duration, step counts, workout history, and other biometrics your Apple Watch and iPhone have been collecting for years.
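To make concrete how little friction this involves: any third-party app requests HealthKit read access with a single API call that triggers one system dialog. A minimal Swift sketch (the specific data types chosen here are illustrative, not confirmed to be what Perplexity requests):

```swift
import HealthKit

// Sketch of a HealthKit read-authorization request.
// One tap on "Allow" in the system dialog, and the app can
// query the full stored history of each granted data type.
let store = HKHealthStore()

let readTypes: Set<HKObjectType> = [
    HKObjectType.quantityType(forIdentifier: .heartRate)!,
    HKObjectType.quantityType(forIdentifier: .stepCount)!,
    HKObjectType.categoryType(forIdentifier: .sleepAnalysis)!,
]

store.requestAuthorization(toShare: nil, read: readTypes) { success, error in
    // Note: `success` only means the request completed. HealthKit
    // deliberately does not tell apps whether the user granted
    // read access, so denial is invisible to the developer.
    if let error = error {
        print("Authorization request failed: \(error)")
    }
}
```

That's the entire consent surface: one dialog, and the app gains access to years of accumulated biometrics.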

The pitch: ask a health question, get answers referencing your actual data. "Why am I sleeping poorly?" pulls your real sleep metrics. Sounds useful, right?

That's the trap. The question isn't whether personalized AI health answers could be valuable. The question is whether this company, with this track record, should hold the keys to your most intimate biological data.

The Trust Problem Is a Pattern

The highlight reel:

  • June 2024: Forbes accused Perplexity of publishing stories "largely copied" from proprietary articles without citation
  • October 2024: Dow Jones and the New York Post filed a copyright infringement lawsuit, alleging Perplexity's AI hallucinated quotes and attributed them to real articles
  • October 2024: The New York Times sent a cease-and-desist for unauthorized content scraping
  • Wired/Cloudflare exposé: Perplexity used undisclosed crawlers with spoofed user-agent strings to bypass explicit scraping blocks

This is the company that now wants to read your Apple Health data.

The Privacy Calculus Is Insane

Health data is not like search history. Your resting heart rate reveals cardiovascular conditions. Sleep patterns suggest mental health issues. Activity data reveals mobility problems. Together, they paint an extraordinarily detailed picture — one enormously valuable to insurance companies, employers, and data brokers.

Perplexity hit a $21.21 billion valuation. Its ARR grew from $80M to ~$200M. It processes 30 million queries daily. It committed $750 million to Microsoft Azure for GPU capacity.

These numbers don't get paid back with $20/month subscriptions alone. The pressure to monetize your data isn't hypothetical — it's economic inevitability.

Apple's HealthKit Hypocrisy

Apple marketed itself as the privacy company for a decade. "Privacy. That's iPhone." Tim Cook positioned himself as the anti-Zuckerberg.

Except you can hand all your health data to Perplexity with a single permission toggle.

Yes, HealthKit requires user consent. But "user consent" in the age of AI is a fig leaf. Most people don't read permission dialogs. Most assume that if an app integrates with Apple's frameworks, Apple has vetted it.

Technical compliance and trustworthiness are not the same thing.

The Medical Misinformation Risk

Perplexity Health is not FDA-regulated. It's not staffed by doctors. It's an LLM pointed at your health data and told to say helpful things.

Remember: Perplexity's AI was caught attributing made-up quotes to real news organizations. That was journalism. Now apply that hallucination tendency to your health data.

Imagine asking "Is my heart rate pattern concerning?" and getting a confidently wrong answer.

What HIPAA Doesn't Cover

Consumer health data from apps like Perplexity Health likely isn't covered by HIPAA. The law applies only to "covered entities" (healthcare providers, health plans, and clearinghouses) and their business associates. A consumer AI app reading your Apple Health data? Almost certainly not covered.

Your heart rate, sleep patterns, and activity metrics sit in a legal gray zone where the protections most people assume exist simply don't.

What You Should Do

If you've granted Perplexity Health access: revoke it now. Settings → Health → Data Access & Devices → Perplexity.

If you haven't: don't. The risk-reward is wildly unfavorable. You're handing your most intimate biological data to a company that repeatedly demonstrates it doesn't respect data boundaries — in exchange for an unregulated AI health assistant that can hallucinate.

Want AI health insights? Wait for a company that has actually earned your trust. Or talk to your doctor. They went to medical school. The LLM didn't.


