Meta Moderators Handed Out Access To Facebook Accounts In Exchange For Bribes

from the just-BEGGING-for-direct-regulation dept

Moderation at scale is impossible. This truism has been enshrined on the pages of Techdirt. Anyone working for a platform with thousands of users — much less millions or billions of users — knows this is true. Meta, the rebrand now controlling Facebook, certainly knows this to be true. Facebook has billions of users and the amount of user-generated content requiring moderation is only slightly easier to manage than the 720,000 hours of video uploaded to YouTube every day.

What doesn’t help this already-impossible task is rogue employees who decide (either for altruistic or self-serving reasons) to abuse a system internally, exacerbating the gamesmanship exhibited by users who know how to manipulate the weaknesses inherent in impossible tasks.

Meta Platforms Inc. has fired or disciplined more than two dozen employees and contractors over the last year whom it accused of improperly taking over user accounts, in some cases allegedly for bribes, according to people familiar with the matter and documents viewed by The Wall Street Journal.

Some of those fired were contractors who worked as security guards stationed at Meta facilities and were given access to the Facebook parent’s internal mechanism for employees to help users having trouble with their accounts, according to the documents and people familiar with the matter.

Hey, if you want Congress to get more directly involved in content moderation, this is a great way to do it. The job is already impossible to do well. Getting paid twice to do it poorly is bad optics and bad business. It lets everyone know Meta can’t handle the moderators it has employed, much less the moderation load of day-to-day business.

Making this whole debacle even more unfortunate is the acronym used internally to refer to the exploited moderation system: “Oops,” short for Online Operations, the channel used to reconnect users who’ve forgotten their login passwords or email addresses. It’s a fitting acronym, considering the problems it’s supposed to handle, but “Oops” is going to lend itself to punchy public statements by grandstanding legislators looking for any reason to end Section 230 immunity, and it will make multiple appearances if Meta is summoned to Congressional hearings over this headline-generating bit of malfeasance.

The good news is that heads have rolled. Whether the rolling heads will deter others remains to be seen, though. If nothing else, it shows Meta is at least doing some due diligence when it comes to accusations of abuse. But Meta’s statement on the issue isn’t exactly comforting: it suggests this abuse is both external and internal, and that there’s no apparent solution that will prevent this sort of thing from happening in the future.

“Individuals selling fraudulent services are always targeting online platforms, including ours, and adapting their tactics in response to the detection methods that are commonly used across the industry,” said Meta spokesman Andy Stone. He added that the company “will keep taking appropriate action against those involved in these kinds of schemes.”

Making this problem even more impossible to solve is that the “Oops” system often interacts with third parties acting on behalf of users who have been locked out of their accounts, some of whom have had their accounts hijacked by malicious hackers. There are several levels of potential exploitation here, beginning with the account hijackers, running through third-party services that may be promising locked-out users more than they can deliver, and finally ending up on the doorstep of “Oops,” where certain Meta staffers are apparently willing to grant access to whoever’s asking, so long as they’re willing to shell out a few bucks.

The Wall Street Journal article suggests firing moderators may not actually solve the problem. There’s evidence that former employees have leveraged their contacts inside Meta to trigger account actions normal users are unable to access.

Meta is also investigating some former employees for remaining in contact with other workers, allegedly to hijack user accounts. In July, an attorney on behalf of Meta sent a letter to one former security contractor who was fired in 2021, Kendel Melbourne, alleging that he assisted “third parties to fraudulently take control over Instagram accounts,” including after he left the company, according to a copy of the letter.

Another contractor working for the same company that employed Kendel Melbourne was fired after it was discovered she had provided hackers fraudulent access to multiple accounts in exchange for “thousands of dollars in bitcoin.”

Moderation at scale is still impossible. But it can be done better than this. Internal teams like this should be subjected to constant oversight, rather than given free rein to do whatever they want until it starts causing problems too big to ignore. This cleanup effort certainly won’t be Meta’s last. And until it shows it’s serious about protecting users from internal threats to account security, it’s going to generate headlines and government scrutiny that’s ultimately going to harm platforms that don’t have the market share or on-hand capital to survive increased regulation or the stripping of Section 230 immunity.

Companies: facebook, meta


Comments on “Meta Moderators Handed Out Access To Facebook Accounts In Exchange For Bribes”



Anonymous Coward says:


Well, for a change, Koby is correct, even though he didn’t state why. Cubby v. Compuserve ended with a summary judgment dismissal in favor of Compuserve. That kind of thing should’ve been the standard finding in succeeding cases alleging libel on the internet, but Stratton Oakmont turned the idea of “knew or should’ve known” on its ear, thus leading to Section 230 becoming law in 1996.

Christenson says:

Re: No-hands all computer companies

Undoing fuckery is going to remain impossible on large platforms because to profit, they must have a very large number of users for each human employee.

Consider Twitter as a benchmark: 5,000 contractors and 7,500 employees handling 400 million (or was it 200?) daily active users. Rounding heavily, that’s roughly 30,000 users for every Twitter employee or contractor in the entire company, so getting ahold of a human being’s attention isn’t going to be easy.
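A back-of-envelope check of that ratio, using the figures the comment itself gives (the headcount and user numbers are the commenter's estimates, not verified data):

```python
# Commenter's estimates for Twitter: 5,000 contractors plus
# 7,500 employees, and roughly 400 million daily active users.
staff = 5_000 + 7_500
daily_active_users = 400_000_000

# Users each staffer would have to "cover" if moderation work
# were spread evenly across the whole company.
users_per_staffer = daily_active_users // staff
print(users_per_staffer)  # → 32000
```

So even on the commenter's own numbers, each employee or contractor stands in for tens of thousands of users, which is the core of the "moderation at scale" problem.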
