Employee Surveillance for AI Training: 5 Reasons Meta's Jaw-Dropping Tracking is Unethical
The New York Times article “Meta’s Embrace of A.I. Is Making Its Employees Miserable” reports that Meta is now tracking everything employees do on their computers, specifically:
What employees typed into their computer, how they moved their mouse, where they clicked and what they saw on their screen would be tracked, Meta said. The goal, the company said, was to capture employee data so Meta’s artificial intelligence models could learn “how people actually complete everyday tasks using computers.”
Here are five reasons this is absolutely bonkers, and why employee surveillance for AI training is unethical.
1. No way to opt out
When an engineering manager asked Andrew Bosworth, Meta's CTO, how to opt out, Bosworth said it was not an option. Sure, the computers employees use are Meta's property, and Meta can legally do whatever it wants with its own devices. But legal isn't the same as ethical. An ethical program would require that employees consent to the collection of their data. Since Meta knew nobody would consent, it made the matter one of compliance instead: employment is now conditional on accepting surveillance. Employees weren't told about this policy when they were hired. Their only decision now is to stay with the company or quit, which is no real choice.
2. Coercive timing
Here's why I say there was no real choice. Meta had said it would lay off 10% of its workforce. Rolling out an employee surveillance plan weeks before layoffs looks strategically timed to push people out. But if you think about it, folks generally aren't going to leave right before layoffs: it's better to stick it out and collect severance pay than to quit on ethical grounds and get nothing. Meta may insist these events have nothing to do with each other, but it doesn't matter; the conditions of free choice were absent regardless of intent. Add Meta to the layoffs wall of shame.
3. Training replacements
In the Times article, Susan Li, Meta's CFO, hinted that the size of the company would change due to A.I. capabilities:
"We don't really know what the optimal size of the company will be in the future... I think there's a lot of change right now, with A.I. capabilities advancing rapidly."
Janelle Gale, Meta's head of human resources, made it explicit in an internal message. The cuts will allow the company "to offset the other investments we're making," she said, adding, "I know this leaves everyone with nearly a month of ambiguity which is incredibly unsettling."
How humiliating: train your own replacement, only to be let go. Let me spell this out. The company announces that there are going to be layoffs, and then it mandates employee surveillance for AI training.
4. Sharing PII with Meta
Personally Identifiable Information (PII) will inevitably get swept up in the screen and keystroke capture. Type a password and the keylogger eats it right up. It could also hoover up health information, personal messages, and confidential data. And if the clipboard is tracked as well, there's literally no safe way to enter a password.
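The problem with keystroke capture can be sketched in a few lines. This is a hypothetical simulation, not Meta's actual pipeline: a logger that records keys in order has no way to tell which keystrokes belong to a password field, so secrets land in the captured stream verbatim.

```python
# Hypothetical sketch: a keystroke logger appends every key to one buffer.
# It cannot distinguish an email body from a password field.

def capture(keystrokes):
    """Simulate indiscriminate keystroke logging."""
    return "".join(keystrokes)

# An employee types a status update, then logs in to an internal tool.
session = list("Status update: done. ") + list("hunter2")  # "hunter2" is the password
log = capture(session)

print("hunter2" in log)  # the password sits in the captured stream verbatim
```

Any downstream filtering would have to recognize the secret after it has already been recorded, which is exactly the problem.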
5. Inefficiency
AI doesn't need to watch you use your computer, because AI doesn't need to use a computer that way. Put differently: AI doesn't share human constraints. Why tie AI to a slow graphical interface when it can operate a system through Unix-style commands at a much faster pace? "OK, wise guy," I hear you saying. "What about legacy apps with no APIs? AI could interact with those." Well, you're right, as long as you're not concerned about performance, compute per task, or visual parsing errors. And even if none of that were an issue, AI is still going to create more engineering work (in a good way).
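To make the efficiency point concrete, here's a toy comparison (hypothetical file names and steps throughout): the same file rename expressed as the GUI-emulation steps an agent trained on screen capture would have to reproduce, versus the single command an agent with shell access could issue.

```python
# Hypothetical comparison: GUI emulation vs. a direct command for one task.

# Steps an agent trained on screen/mouse capture would have to reproduce:
gui_steps = [
    "locate file icon on screen",  # visual parsing, error-prone
    "move mouse to icon",
    "click to select",
    "open context menu",
    "click 'Rename'",
    "type new name",
    "press Enter",
]

# The same rename as one Unix-style command:
command = "mv report_draft.txt report_final.txt"

print(f"{len(gui_steps)} GUI actions vs 1 command")
```

Every GUI step is an opportunity for a visual-parsing error; the command has none. That's why tying an agent to the interface humans need is a choice, not a necessity.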
And more...
Zuckerberg's "not for surveillance" line collides with token dashboards already tied to performance reviews.
"Agents to find agents" is Goodhart's Law in real time (when a measure becomes a target, it ceases to be a good measure).
The claim that there "will not be a leak risk" is indefensible, though readers have heard the same assurance about every data program.
It sets a bad industry precedent. We don't want even more companies surveilling their employees.
Employee surveillance for AI training is unethical
The case against Meta's program isn't that surveillance is new, or that AI training is inherently wrong, or that employees have an absolute right to privacy on corporate devices. It's that meaningful consent requires the genuine ability to refuse, and Meta has explicitly removed that ability. When the CTO confirms there is no opt-out, when the rollout lands three weeks before an 8,000-person layoff, and when the captured data includes keystrokes, screen content, and mouse movement, what's being asked of employees is not participation in a data program. It's compliance with a condition of continued employment, on terms they had no role in setting.
That's what makes employee surveillance for AI training ethically distinct from ordinary workplace monitoring. The traditional justification for watching workers is operational (security, productivity, legal compliance). Here, employees are being conscripted for commercial AI products, under conditions where objecting carries real career risk and the technical premise (that AI must learn by watching humans use computers) is itself a choice rather than a necessity.
The deeper problem isn't any single design decision but the pattern they form together: a company with the engineering capacity to do this differently has chosen the path that extracts the most from the people with the least power to say no. Whatever the eventual capabilities of the resulting AI, the precedent being set - that workforce-scale behavioral capture is an acceptable cost of staying competitive - is one other employers will follow, and one that will be much harder to walk back than it was to begin.
Here's a picture of a kid to calm your nerves. Just look at those tiny horns and that devilish grin. (Photo by Gordon Milligan)
