Meta will record employees’ keystrokes and use them to train its AI models

Meta is deploying an internal tool that converts employee mouse movements and keystrokes into training data for its AI models. The practice raises immediate questions about consent, data governance, and whether employee activity can ethically fuel model development at scale.
Modelwire context

Analyst take

The more consequential detail isn't the privacy concern, which is real but well-trodden territory: it's that Meta is treating its own workforce as a continuous, captive data pipeline rather than sourcing training data through external licensing or synthetic generation. That's a specific strategic bet about where proprietary signal lives.
MIT Technology Review's piece on 'enterprise AI as an operating layer' argued that competitive advantage increasingly belongs to whoever controls the infrastructure where AI is governed and refined, not whoever has the best base model. Meta's keystroke program is a direct expression of that thesis applied internally: the workforce becomes part of the refinement infrastructure itself. It also sits alongside the broader tension surfaced in WIRED's newsroom AI piece from mid-April, where labor and consent questions were already emerging around AI-assisted workflows. Meta is essentially running that same experiment at a much larger scale, with participation that appears far less voluntary.
Watch whether any Meta employees or labor advocates file a formal complaint with a data protection authority in the EU under GDPR within the next 60 days. A regulatory challenge in Europe would force disclosure of the consent mechanisms Meta is relying on, which is the detail this story currently lacks.
This analysis is generated by Modelwire’s editorial layer from our archive and the summary above. It is not a substitute for the original reporting.
Mentions: Meta
Modelwire summarizes — we don’t republish. The full article lives on techcrunch.com. If you’re a publisher and want a different summarization policy for your work, see our takedown page.