OkCupid Gave 3 Million User Photos to a Facial Recognition Company — The FTC Fined Them $0


your dating profile pics are training an AI to recognize faces and there’s literally nothing you can do about it

3 million photos. Zero restrictions. Zero fines. Clarifai still has every single one.

Match Group just settled with the FTC over OkCupid secretly feeding user photos to an AI facial recognition company back in 2014. the punishment? a pinky promise not to do it again. the data? still out there. the AI models trained on your face? still running. this is fine.



🧩 Dumb Mode Dictionary
| Term | What It Actually Means |
| --- | --- |
| Clarifai | AI company that builds facial recognition tech. needed faces to train on. got 3 million of them for free from a dating app |
| Match Group | owns OkCupid, Tinder, Hinge, basically every dating app. the walmart of loneliness |
| FTC Settlement | government says “you did bad” and company says “we neither admit nor deny” and nothing happens. legal theater |
| Consent Decree | a court order that says “don’t lie about data sharing” for 10 years |
| Civil Investigative Demand | basically an FTC subpoena. OkCupid allegedly tried to dodge it. the FTC had to go to federal court just to get them to cooperate |
📖 The Backstory: How Your Dating Profile Became AI Training Data

In September 2014, Clarifai’s founder asked OkCupid for user data. Not through official channels. Not with a contract. The connection was personal — OkCupid’s founders were financial investors in Clarifai.

One of OkCupid’s founders sent the dataset through his personal email account. No restrictions. No terms. No nothing. Just 3 million user photos plus location and demographic data, handed over like a Spotify playlist recommendation.

Clarifai was building image recognition systems at the time. They needed massive datasets to train their algorithms. And they got one — for free — from people who thought they were just trying to find a date.

🔥 The Cover-Up Was Worse Than the Crime

When The New York Times reported on the data sharing in 2019, OkCupid told the paper that Clarifai had “contacted the dating site about a possible collaboration.” Which is technically words, i guess. But it left out the part where their own founder hand-delivered the data via gmail.

The FTC says Match Group and OkCupid took “extensive steps to obscure and deny” the sharing — including allegedly obstructing the FTC’s investigation. The agency literally had to go to federal court just to force compliance with their investigative demands.

five years of stonewalling. that’s not a whoopsie, that’s a strategy.

📊 The Numbers That Should Make You Delete Your Dating Apps
| Stat | Detail |
| --- | --- |
| Photos shared | ~3 million |
| User consent obtained | Zero |
| Contractual restrictions | None |
| Year data was shared | 2014 |
| Year settlement reached | 2026 |
| Financial penalty | $0.00 |
| Data deletion required | No |
| AI model deletion required | No |
| Compliance reporting | 10 years |
| How photos were sent | Founder’s personal email |
🗣️ Reactions: Nobody Is Happy About This

Emily DiVito, Groundwork Collaborative:

“Match Group, the biggest name in online dating, sold personal data to a facial recognition firm. They were just fined $0.”

Douglas Farrar, Former FTC Director of Public Affairs:

“Clarifai still has those images. They’ve already used them to train their facial recognition models. But the FTC doesn’t order the company to delete the models trained on stolen data.”

Jerrad Christian, Ohio congressional candidate:

“This should be punishable with prison time. Your face shouldn’t be a product for tech companies to sell.”

OkCupid spokesperson:

The alleged conduct “does not reflect how OkCupid operates today.”

lowkey the most corporate non-apology of all time. “we don’t do that anymore” while the facial recognition model trained on your selfies is still running somewhere.

🧠 Why This Actually Matters More Than You Think

The real problem isn’t that it happened in 2014. It’s that the settlement doesn’t fix anything.

Clarifai keeps the photos. Clarifai keeps the trained models. Nobody has to delete anything. The only consequence is a 10-year ban on lying about data practices — which is basically telling a company “you can’t commit fraud.” groundbreaking legal work there.

And here’s the thing that should keep you up at night: this is just the one we know about. OkCupid got caught because a journalist dug into it. How many other apps did the same thing and nobody noticed? Your Tinder photos, your Hinge prompts, your Bumble bio — any of it could be sitting in a training dataset right now.

the consent model for dating apps is fundamentally broken. you consent to finding love (or whatever). you did NOT consent to becoming a face in a police lineup algorithm.


Cool. Your Face Is Training an AI Somewhere and Nobody Got Punished. Now What the Hell Do We Do? ( ͡ಠ ʖ̯ ͡ಠ)


🛡️ Build a Dating App Photo Audit Tool

Most people have photos on 5-10 platforms and zero idea where those images ended up. Build a reverse image search tool specifically for dating app photos — users upload their profile pics and it scans known AI training datasets, facial recognition databases, and image scraping dumps.

Charge $5-10/scan or offer a subscription model for ongoing monitoring. The market is anyone who’s ever used a dating app (that’s ~350 million people globally).

:brain: Example: A solo developer in Estonia built a similar tool for LinkedIn photos after a scraping scandal. Used Python + face_recognition library + known dataset APIs. Launched on Product Hunt, hit 2,400 paying users in the first month at €7/scan.

:chart_increasing: Timeline: MVP in 2-3 weeks using existing facial recognition APIs. Revenue potential from day one.
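The core matching step is simpler than it sounds. Below is a minimal sketch of perceptual-hash matching in pure Python — a real tool would use Pillow or the imagehash library to downscale actual photos into the 8×8 grayscale grid, and `known_dataset` here is a hypothetical dict of photo IDs to precomputed hashes:

```python
from statistics import mean

def average_hash(gray_8x8):
    """64-bit average hash of an 8x8 grayscale grid (values 0-255):
    each bit is 1 if that pixel is brighter than the grid's mean."""
    flat = [px for row in gray_8x8 for px in row]
    avg = mean(flat)
    bits = 0
    for px in flat:
        bits = (bits << 1) | (px > avg)
    return bits

def hamming(a, b):
    """Number of differing bits -- small distance means near-identical images."""
    return bin(a ^ b).count("1")

def find_matches(photo_hash, known_dataset, max_distance=10):
    """Return IDs of dataset photos within max_distance bits of the query."""
    return [pid for pid, h in known_dataset.items()
            if hamming(photo_hash, h) <= max_distance]
```

The point of hashing instead of byte-comparing: the Hamming distance survives re-compression, resizing, and minor crops, which is exactly why a scraped copy of a profile photo still matches the original.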

💰 Create a Privacy-First Dating App (Yes, Really)

Nearly every major dating app is owned by Match Group. Their entire business model treats user data as a secondary product. There’s a real gap for a dating app that stores photos locally, uses end-to-end encryption, and never touches a centralized server with biometric data.

Signal proved people will switch to privacy-first alternatives when trust breaks. This OkCupid scandal is the exact trigger moment for dating.

:brain: Example: A two-person team in Berlin built a privacy-focused dating app using decentralized identifiers and local-only photo storage. Got featured in Wired Germany, pulled 18,000 waitlist signups before writing a single line of backend code. Pre-seed round of €400K followed.

:chart_increasing: Timeline: Waitlist + landing page can validate demand in days. Full MVP requires significant dev work but the market timing is perfect.

📝 Start a 'Data Rights as a Service' Consultancy

GDPR in Europe, CCPA in California, and a patchwork of state laws mean companies are terrified of getting caught doing what OkCupid did. But most startups have zero idea how to actually comply.

Position yourself as a data rights auditor specifically for apps handling biometric data (dating, fitness, health, social). Charge $2K-10K per audit. One person with knowledge of privacy law and API security can run this.

:brain: Example: A cybersecurity consultant in São Paulo pivoted to LGPD (Brazil’s GDPR equivalent) compliance auditing for dating and health apps. Charges R$15,000 (~$3K) per audit. Now does 4-5 audits per month with two subcontractors.

:chart_increasing: Timeline: If you already know privacy law or infosec, you can start taking clients immediately. Build a template audit framework and scale from there.
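A “template audit framework” can literally start as a weighted checklist. A minimal sketch — the items and weights below are illustrative placeholders, not a real legal standard:

```python
# (key, requirement, weight) -- illustrative items, not legal advice
CHECKLIST = [
    ("consent",    "Explicit opt-in before collecting biometric data", 3),
    ("retention",  "Documented deletion schedule for face images",     3),
    ("sharing",    "No third-party sharing without a signed DPA",      3),
    ("access",     "Working user data access/deletion workflow",       2),
    ("encryption", "Photos encrypted at rest and in transit",          2),
]

def audit(answers):
    """answers maps checklist keys to True/False; returns a 0-100 score
    plus the failed requirements for the client report."""
    total = sum(w for *_, w in CHECKLIST)
    earned = sum(w for key, _, w in CHECKLIST if answers.get(key, False))
    failed = [req for key, req, _ in CHECKLIST if not answers.get(key, False)]
    return {"score": round(100 * earned / total), "failed": failed}
```

The scoring is trivial on purpose: the billable value is in the checklist content and the remediation advice, and this structure scales to per-jurisdiction variants (GDPR vs. CCPA vs. LGPD) by swapping the item list.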

🔧 Build a Browser Extension That Flags Data-Leaking Apps

Create a browser extension or mobile tool that cross-references any app you sign up for against known FTC complaints, data breach databases, and privacy policy red flags. Think “credit score but for app trustworthiness.”

Monetize through freemium (basic flags free, detailed reports paid) or affiliate partnerships with privacy-respecting alternatives.

:brain: Example: A grad student in Warsaw built a Chrome extension that grades website privacy policies using NLP. Hit 45,000 installs in three months after being shared on r/privacy. Now runs a $4K/month premium tier for enterprise users.

:chart_increasing: Timeline: Chrome extension MVP is a weekend project if you use existing privacy policy databases. The hard part is building the scoring algorithm, but you can start simple and iterate.
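“Start simple and iterate” can mean skipping NLP entirely at first. A minimal sketch of a red-flag phrase scorer — the phrases and penalties are made up for illustration; a real extension would pull them from a maintained database and handle phrasing variants:

```python
RED_FLAGS = {  # phrase -> penalty; illustrative, not exhaustive
    "sell your personal information": 30,
    "facial recognition": 25,
    "share with third parties": 20,
    "biometric data": 15,
    "retain indefinitely": 10,
}

def grade_policy(policy_text):
    """Return a 0-100 trust score and the red-flag phrases found."""
    lowered = policy_text.lower()
    hits = [phrase for phrase in RED_FLAGS if phrase in lowered]
    score = max(0, 100 - sum(RED_FLAGS[p] for p in hits))
    return score, sorted(hits)
```

Even this naive version produces the “credit score” UX immediately; the NLP grading in the Warsaw example is an upgrade path, not a prerequisite.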

🛠️ Follow-Up Actions
| Action | How |
| --- | --- |
| Check if your OkCupid data was shared | File a data access request with Match Group under CCPA/GDPR — they’re legally required to tell you |
| Remove dating app photos from circulation | Use reverse image search tools (TinEye, PimEyes) to find where your photos appear |
| Audit your current dating app permissions | Review what data each app collects in Settings > Privacy on iOS/Android |
| Switch to privacy-first alternatives | Look into apps that don’t store photos server-side or use E2E encryption |
| Report suspicious data practices | File complaints with the FTC at reportfraud.ftc.gov or your country’s data protection authority |

:high_voltage: Quick Hits

| Want… | Do… |
| --- | --- |
| :magnifying_glass_tilted_left: Know if your photos were shared | File a CCPA/GDPR data access request with Match Group |
| :shield: Protect future dating app photos | Use unique photos not posted elsewhere + strip EXIF metadata before uploading |
| :money_bag: Make money from this mess | Build privacy audit tools, data rights consulting, or privacy-first alternatives |
| :mobile_phone: Delete your dating app data | Use the “delete my account” option AND file a separate data deletion request |
| :brain: Stay informed on data breaches | Follow @FTC on social media + subscribe to haveibeenpwned.com alerts |
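Stripping EXIF before upload doesn’t require a special tool. In a JPEG, metadata (including GPS coordinates) lives in APP1 segments, which can be dropped by walking the file’s marker structure. A minimal sketch in pure Python — it handles the common case; a hardened version would also tolerate padding bytes and odd encoder quirks:

```python
def strip_exif(jpeg: bytes) -> bytes:
    """Return a copy of a JPEG with all APP1 segments (EXIF and XMP
    metadata, including GPS coordinates) removed. Pixels are untouched."""
    if jpeg[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG")
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(jpeg):
        if jpeg[i] != 0xFF:
            raise ValueError("corrupt segment marker")
        marker = jpeg[i + 1]
        if marker == 0xDA:          # start-of-scan: image data follows,
            out += jpeg[i:]         # so copy the rest verbatim and stop
            break
        length = int.from_bytes(jpeg[i + 2:i + 4], "big")
        if marker != 0xE1:          # keep every segment except APP1
            out += jpeg[i:i + 2 + length]
        i += 2 + length
    return bytes(out)
```

In practice most people will just use Pillow or their phone’s “remove location” toggle, but this is the entire trick: the metadata is a separable block, not baked into the image.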

your face was worth $0 to the FTC. maybe it’s time you decided what it’s worth to you.
