Clarifai deletes 3 million photos that OkCupid provided to train facial recognition AI, report says

Clarifai deleted 3 million photos that OkCupid provided for facial recognition training, following an FTC settlement. The 2014 data-sharing arrangement between the dating app and the AI company—whose executives had financial ties to OkCupid—now faces regulatory consequences over undisclosed training practices.
Modelwire context
Explainer
The detail that Clarifai executives held financial ties to OkCupid at the time of the 2014 arrangement is the buried lede here. This wasn't just a vendor relationship; it was a conflict of interest in which insiders may have had personal incentives to share user photos without disclosure.
None of the related stories on Modelwire connect directly to this case. The closest thematic thread is the April 16 Gemini-Google Photos story, which raised its own questions about personal photo data being fed into AI systems, though that arrangement is opt-in and disclosed. The Clarifai case belongs to an older, murkier category: pre-regulation data deals struck when AI training pipelines had no established disclosure norms and users had no meaningful way to know their images were being used.
Watch whether the FTC's settlement terms include any requirement that Clarifai audit other training datasets for similarly undisclosed sourcing. If the order is narrowly scoped to just these 3 million images, it signals the agency is treating this as a one-off rather than a precedent for broader training-data accountability.
Mentions: Clarifai · OkCupid · FTC
Modelwire summarizes — we don’t republish. The full article lives on techcrunch.com.