Facebook Messed With The Emotions Of 689,003 Users… For Science

from the and-maximum-creepiness dept

As you may have heard (since it appears to have become the hyped-up internet story of the weekend), the Proceedings of the National Academy of Sciences (PNAS) recently published a study done by Facebook, with an assist from researchers at UCSF and Cornell, in which they deliberately tried (and apparently succeeded) to manipulate the emotions of 689,003 Facebook users for a week. The participants, without realizing they were part of the study, had their news feeds “manipulated” so that they showed all good news or all bad news. The idea was to see whether this made the users themselves feel good or bad. Contradicting some other research, which found that looking at photos of your happy friends made you sad, this study apparently found that happy stuff in your feed makes you happy. But what’s got a lot of people up in arms is the other side of that coin: seeing a lot of negative stories in your feed appears to make people mad.
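Mechanically, what’s described amounts to a sentiment filter applied to each viewer’s feed: classify posts as positive or negative, then probabilistically withhold one kind. The sketch below is purely illustrative; the word lists, function names and omission probability are all invented (the actual study reportedly classified posts using the LIWC word-counting software rather than anything hand-rolled like this):

```python
import random

# Invented word lists standing in for a real sentiment lexicon.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "angry"}

def classify(post):
    """Crudely label a post by the sentiment words it contains."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress, omit_prob=0.5):
    """Probabilistically withhold posts of one sentiment from a viewer.

    suppress:  "positive" or "negative", the emotion being reduced
    omit_prob: chance a matching post is dropped (made up here)
    """
    return [p for p in posts
            if classify(p) != suppress or random.random() >= omit_prob]

# A viewer in a hypothetical "reduced positive content" condition:
feed = ["I love this wonderful day", "Traffic was awful today", "Lunch."]
print(filter_feed(feed, suppress="positive"))
```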

There are, of course, many different ways to view this, and the immediate response from many is “damn, that’s creepy.” Even the editor of the study admits to the Atlantic that she found it questionable:

“I was concerned,” she told me in a phone interview, “until I queried the authors and they said their local institutional review board had approved it, and apparently on the grounds that Facebook apparently manipulates people’s News Feeds all the time… I understand why people have concerns. I think their beef is with Facebook, really, not the research.”

Law professor James Grimmelmann digs deeper into both the ethics and legality of the study and finds that there’s a pretty good chance the study broke the law, beyond breaking standard research ethics practices. Many people have pointed out, as the editor above did, that because Facebook manipulates its news feed all the time, this was considered acceptable and didn’t require any new consent (and Facebook’s terms of service say that they may use your data for research). However, Grimmelmann isn’t buying it. He points to the official government policy on research on human subjects, which has specific requirements, many of which were not met.

While those rules apply to universities and federally funded research, many people assumed that they wouldn’t apply to Facebook as a private company. Except… this research involved two universities… and it was federally funded (in part) [Update: Cornell has updated its original story that claimed federal funding to now say the study did not receive outside funding.]. The rest of Grimmelmann’s rant is worth reading as well, as he lays out in great detail why he thinks this is wrong.

While I do find the whole thing creepy, and think that Facebook probably could have and should have gotten more informed consent about this, there is a big part of this that is still blurry. The lines aren’t as clear as some people are making them out to be. People are correct in noting that Facebook changes its newsfeed all the time, and of course Facebook is constantly tracking how that impacts things. So there’s always some “manipulation” going on, though usually it’s to try to drive greater adoption, usage and (of course) profits. Is it really that different when it’s done just to track emotional well-being?

As Chris Dixon notes, basic A/B testing is common for lots of sites, and he’s unclear on how this is all that different. Of course, many people pointed out that manipulating someone’s emotions to make them feel bad is (or at least feels) different, leading him to point out that plenty of entertainment offerings (movies, video games, music) manipulate our emotions as well, though Dixon’s colleague Benedict Evans points out that there’s a sort of informed consent when you “choose” to go see a sad movie. Though, of course, a possible counter is that there are plenty of situations in which emotions are manipulated without such consent (think: advertising). In the end, this may just come down to what people expect.
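For context, the A/B testing Dixon refers to has a simple structure: randomly (but consistently) split users into buckets, show each bucket a different variant, and compare an outcome metric afterward. A minimal sketch, with all names invented for illustration:

```python
import hashlib

def assign_bucket(user_id, experiment="feed_ranking_v2", buckets=("A", "B")):
    """Deterministically hash a user into a bucket, so each person
    always sees the same variant across visits."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return buckets[int(digest, 16) % len(buckets)]

# Serve variant A or B, then compare an outcome metric between buckets
# (clicks, time on site, or, in this case, the sentiment of what users post).
for uid in ("alice", "bob", "carol"):
    print(uid, "->", assign_bucket(uid))
```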

If anything, what I think this really does is highlight how much Facebook manipulates the newsfeed. This is something very few people seem to think about or consider. Facebook’s newsfeed system has always been something of a black box, which is one reason I prefer Twitter’s setup: you get the self-chosen firehose, rather than some algorithm (or researchers’ decisions) picking what you get to see. And thus, in the end, while Facebook may have failed to get the level of “informed consent” necessary for such a study, it may have, in turn, done a much better job of accidentally “informing” a lot more people about how its newsfeed gets manipulated. Whether or not that leads more people to rely on Facebook less, well, perhaps that will be the subject of a future study…

Companies: facebook


Comments on “Facebook Messed With The Emotions Of 689,003 Users… For Science”

Ninja (profile) says:

Cue lawsuits in 3, 2, 1…

I wonder: what if some of the people used as guinea pigs chose to make nothing public (i.e., visible only to friends on their lists)? It would mean that Facebook would have to actively invade their personal space, since the researchers wouldn’t be able to see those users’ posts and determine whether they were happier or madder without doing so. Indeed creepy.

Anonymous Coward says:

Control people?

So can we assume that they will now think they can control how people vote, based on how many negative/positive images/stories they show about one political group or the other?

Or how about society’s beliefs in general, such as religion, how to treat your neighbor, whether you should confront the local police, etc…

One more reason to never use social media. Glad that train went right past me.

Anonymous Coward says:

Re: Control people?

Is showing images and telling stories that are either positive or negative unique to social media? No, it concerns all types of media, and this experiment could have been conducted with any of them, from movies to comic books. Here, the main advantage for researchers in using social media is that the process is much faster and the study can be done on a much larger sample.
So… if the fact that it can influence our opinions or emotions is a reason for us to stay away from social media, we should stop using all types of media! Is that the solution?
I think that it is easier and more practical to develop a critical mind; it is important to look at every piece of information that is given to us and ask ourselves “Is this real?” or “Why was this information brought to me? Is someone trying to convince me of something?”

Anonymous Coward says:

Whatever!

This is nothing new; most people are evil by default and are mindless sheep in need of a shepherd. This is why all of those child day cares have “Share your Toys” reminders on the walls and not “Be Greedy with your Toys” reminders. Our selfishness comes naturally; too bad this shit does not go away with age!

There is a reason that dictators stay in power: the cowardice and gullibility of the people.

All that is necessary for Evil to prevail… and when the good guys do nothing in the face of Evil, they might as well just be Evil themselves.

If you save, on humanitarian grounds, the lives of those who would enslave and murder… then are you approving of their enslaving and slaughtering? They stared the slaves and the slaughtered right in the eyes as they did these things to them… what of these ‘tyrants’ is left to save?

Anonymous Coward says:

Re: Whatever!

“most people are evil by default and are mindless sheep in need of a shepherd.”
— Not everyone is like yourself.

“This is why all of those child day cares have “Share your Toys” reminders on the walls “
— Well, there’s incontrovertible evidence right there.

“Our selfishness comes naturally; too bad this shit does not go away with age”
— Is this sarcasm, or were you just reading the news on FB?

Evil doerzzz – oh my! Repent now.

Eldakka (profile) says:

Re: Whatever!

most people are evil by default and are mindless sheep in need of a shepherd.

This is not true. Most people obey laws they view as fair and sensible: not making waves, fitting in, treating others how they want to be treated. There has been much study on this, especially where sentencing guidelines are being determined.

An example of this is the steady ratcheting up of penalties for copyright infringement (piracy!). Many people do not view the extreme maximalist copyright position as reasonable; therefore, even with the escalation in penalties, copyright infringement keeps increasing.

And for other laws, say against murder and so on, that most people generally agree with, increasing penalties often has little to no effect, because most people just don’t agree with committing murder and so just don’t do it. The sort of people who do commit murder, who do believe it is a viable option, will commit the murder whether the penalty is 15 years, 30 years or even execution.

This is why all of those child day cares have “Share your Toys” reminders on the walls and not “Be Greedy with your Toys” reminders. Our selfishness comes naturally; too bad this shit does not go away with age!

err, selfishness != evil. I think you are confusing self-interest, selfishness, with evil.

Selfishness is more along the lines of not being nice, kind, etc. Just because someone is a selfish b@st@rd doesn’t make them evil.

Just because I don’t want to share my toys doesn’t mean I’m going to go and steal someone else’s.

And the reminders to share toys and so on are reminders about being nice and kind to others. To make the environment more peaceful. To make it less stressful for the staff, so they don’t have upset kids throwing temper tantrums because they can’t get their favourite toy. They are not about good and evil.

Michael (profile) says:

This is outrageous. Facebook should not be able to get away with experimenting on people this way. I’m going to look at my news feed to make sure it has not been included in this.

Facebook is the greatest company in the world. They would never do anything to hurt their users and we should all commend them for allowing this kind of research to be done. It improves humanity and those of you that may have been impacted should feel lucky to have been involved.

Call me Al says:

Been a guinea pig before

A relatively small online game I’ve been playing for most of a decade (www.pardus.at) was in part a psychology project for one of the creators of the game. I read through a synopsis of the paper produced and it was quite interesting to me.

On the one hand, I was mildly annoyed that I was part of a study without my consent, but I recognised that if I had been aware of the study I would have behaved differently, which would rather ruin the whole thing. In the end I decided I was OK with it, and actually quite liked the way it made use of information gathered from the game mechanics.

The sheer size of online games and social networks is an enormous boon for academics. It gives them a level of scale they’ve never really had access to before. Yes, they don’t have specific consent, but, as noted above, if you know about the study then your behaviour will be different.

For me, I think I’m willing to accept a degree of manipulation in the name of research. It then becomes more a question of what degree is acceptable, and ultimately the use to which the research will be put.

An AC noted the voting angle above. If the newsfeed can be manipulated based on emotions, it surely can be based on politics or lobbying. I can well imagine RIAA or MPAA types seeing this and wondering if this is the answer to their prayers for control… all they have to do is get Facebook on their side.

Eldakka (profile) says:

Re: Been a guinea pig before

Was it a study or an experiment?

That is, did they gather data from the game AS IS and use that, or did they specifically manipulate the game to test various theories?

Being part of a study where they don’t manipulate the environment, just gather data from the environment, is different and less intrusive than being experimented on by being manipulated.

Also, personally, I feel there is a difference between playing a game that is supposed to manipulate you for your entertainment (e.g. questing for items, gaining experience to level up and become stronger, earning money to again become ‘better’ in some way are all forms of manipulation by the game designers to encourage certain activities), and participating in ‘real life’ social interactions that are being deliberately manipulated by an uninvolved third party for research.

But then again, I suppose Facebook is about as real life as Days of Our Lives…

ponk head says:

this is unreasonable

This is the first time I’ve heard that Facebook manipulates user feeds, which is the opposite of why I joined. I want to see the items my friends post, as they are my friends and I want to be in on the posts as well. When I asked around, apparently it’s common knowledge that Facebook also doesn’t allow all the messages from friends to appear on all the other friends’ news feeds. OK, why? Apparently it’s to make more room for advertising. So I think I will find another way to talk to my friends. I don’t think this will go well for Facebook, with its dirty secrets and nasty ways.

Everyone should drop Facebook as this is the sign of a nasty evil company that should die.

Anonymous Coward says:

I think basic ethics demand that you get informed consent before you start messing with people’s emotions by filtering out positive messages from their friends.

The study claims that Facebook’s TOS is informed consent. That’s nonsense. A generic line buried in the TOS stating that research may be conducted is NOT informed consent. When people see that line, if they notice it at all, they imagine a passive study of how often you click what.

And Facebook? Your usefulness is one thing: letting people see their friends’ posts. If you’re arbitrarily filtering those posts, then why should anyone continue to use your site?

OldMugwump (profile) says:

Re: Re: Re:

On the other hand, maybe somebody who was contemplating suicide got lots of positive messages and decided not to go through with it.

We should be careful before screaming about very minor marginal effects on large numbers of people – there are a million variables that affect us every day, fiddling with just one is unlikely to cause major changes to any individual (as opposed to tiny changes across a population, visible only with statistical analysis).
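To put some numbers behind that: with roughly 689,000 subjects, an effect far too small to matter to any one person can still come out “statistically significant.” A toy simulation (every number in it is made up for illustration) shows the mechanics:

```python
import random
import statistics

random.seed(0)
n = 344_500  # per group, roughly half of the study's 689,003 subjects

# Pretend metric: percentage of positive words per post. The baseline,
# spread, and the +0.02 treatment shift are all invented.
control = [random.gauss(5.00, 2.0) for _ in range(n)]
treated = [random.gauss(5.02, 2.0) for _ in range(n)]

diff = statistics.mean(treated) - statistics.mean(control)
# Standard error of the difference between two independent sample means
se = (statistics.variance(control) / n + statistics.variance(treated) / n) ** 0.5
print(f"difference: {diff:.4f}  z-score: {diff / se:.1f}")
# Expect a z-score around 4, i.e. "highly significant", even though a
# 0.02-point shift is imperceptible to any individual user.
```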

Suppose I run a little personal “experiment” – one day I smile at everyone I meet; the next day I frown. I note the reactions.

Have I done something horrible? I think not.

Coyne Tibbets (profile) says:

Re: Re: Re: Re:

Okay, maybe this was a minor effect. Maybe this was fiddling with just a few variables.

But where does one draw the line? Suppose Facebook had selected a group of people and modified their messages with the specific experimental goal of trying to get those people to commit suicide?

Let’s say it wasn’t successful and no one actually did commit suicide: would you say that, because the effects weren’t major, such an experiment was okay?

If you don’t think so, then where does one draw the line?

OldMugwump (profile) says:

Re: Re: Re:2 Re:

No, I’d agree that was “over the line”.

But that just serves to illustrate that this is not a hard-and-fast thing, but something that requires judgement to decide how much is too much.

I think most people would agree that my smile/frown experiment is OK, yet that your suicide experiment is not OK.

In between is a grey area; what is acceptable becomes a matter of opinion and debate.

Personally I don’t think FB went over the line, but I agree that reasonable people can differ over it.

CanadianByChoice says:

I really don’t understand the apparent outrage over this. All “news” services manipulate their feeds; they only tell you what they (or their political affiliates) want you to hear/see. It’s known as “slanting” and it’s always been this way. This is why I stopped paying any attention to “news” decades ago.
Even TechDirt has its bias, although they try (much) harder to “play fair” than anyone else I’ve seen – which is why I do follow TechDirt! (Keep up the good work, guys.)
As to the research – it was really rather pointless; politicians and advertisers have known about this – and used it to their advantage – forever…

Anonymous Coward says:

Facebook is strip-mining human society

“The idea of social sharing, in a context in which the service provider reads everything and watches everybody watch, is inherently unethical.

But we need no more from Facebook than truth in labeling.

We need no rules, no punishments, no guidelines. We need nothing but the truth.

Facebook should lean in and tell its users what it does.

It should say “We watch you every minute that you’re here. We watch every detail of what you do, what you look at, who you’re paying attention to, what kind of attention you’re paying, what you do next, and how you feel about it based on what you search for.”

We have wired the web so that we watch all the pages that you touch that aren’t ours, so that we know exactly what you’re reading all the time, and we correlate that with your behavior here.”

To every parent Facebook should say, “Your children spend hours every day with us. Every minute of those hours, we spy upon them more efficiently than you will ever be able to.”

Only that, just the truth. That will be enough.

But the crowd that runs Facebook, that small bunch of rich and powerful people, will never lean in close enough to tell you the truth.

So I ought to mention that since the last time we were together, it was revealed that Mr. Zuckerberg has spent thirty million dollars that he got from raping human society on buying up all the houses around his own in Palo Alto.

Because he needs more privacy.”

Eben Moglen on the “privacy transaction”.

OldMugwump (profile) says:

A thought experiment

Suppose instead of Facebook it was the National Weather Service.

In cooperation with a university studying mood, they seed clouds to cause rain in area A one day, while it’s sunny in area B.

Another time, they reverse – rain in B, while sunny in A.

Then they measure something that gauges happiness/sadness, to find out if it correlates with the weather.

Would we be equally upset? Why or why not?

OldMugwump (profile) says:

Re: Re: A thought experiment

If there was actual deception involved, I’d agree with you – totally unacceptable.

But unless I misunderstood (of course possible), all they did was bias which of various legitimate postings by friends they chose to show.

It’s not like they promised to show ALL the friends’ postings in the first place – they’ve been selective all along. All they did was bias the selection criteria.

I really don’t see the problem. But it does seem I’m the odd one out here.

John Fenderson (profile) says:

Re: Re: Re: A thought experiment

The deception was not that they were being selective; it’s that they deviated rather severely from their stated selection criteria. What they’d said about the selection criteria was that it was intended to predict the postings that would be of most use to you, based on things like which posts you liked, which people you comment on the most, etc.

This test changed the purpose of the selection criteria from trying to show you what interests you the most to something completely unrelated to your needs or interests.
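To make the distinction concrete: the stated criteria amount to scoring posts by the viewer’s own engagement history, something like the sketch below, whereas the experiment selected posts by their emotional content instead. Everything here (weights, field names, example data) is invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str

@dataclass
class Viewer:
    # Counts of past interactions, keyed by author (fields invented)
    likes_on: dict = field(default_factory=dict)
    comments_on: dict = field(default_factory=dict)

def engagement_score(post, viewer):
    """Score a post by the viewer's history with its author.
    The weights are arbitrary, not Facebook's actual values."""
    return (3.0 * viewer.comments_on.get(post.author, 0)
            + 1.0 * viewer.likes_on.get(post.author, 0))

def rank_feed(posts, viewer):
    """The stated criterion: rank by interest to the viewer. The
    experiment instead filtered on the posts' emotional content."""
    return sorted(posts, key=lambda p: engagement_score(p, viewer), reverse=True)

viewer = Viewer(likes_on={"ann": 5}, comments_on={"bob": 2})
posts = [Post("ann", "vacation pics"), Post("bob", "bad day"), Post("cat", "hi")]
print([p.author for p in rank_feed(posts, viewer)])  # ['bob', 'ann', 'cat']
```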

Really, I consider this a bit of a tempest in a teapot — I think Facebook routinely engages in greater sins, and people are foolish to trust them in the first place. But nonetheless, I think Facebook was wrong on this.

OldMugwump (profile) says:

Re: Re: Re:2 A thought experiment

OK, I think you’ve convinced me. This was wrong because it was intended to produce science instead of benefiting the users.

But it’s a very fine line – by that criterion, the same experiment would have been OK if the A/B test intent had been to find the best algorithm to make the users happier (surely that is directly related to users’ interests).

[But I agree – FB does worse things than this that people don’t complain much about.]

Coyne Tibbets (profile) says:

Ethics: Before and after

I find myself amazed by the apologists. Paraphrased: “Facebook’s terms of service allow modification of messages. No one was hurt. So they are in the clear.”

Suppose, without informed consent, you feed a thousand people a small dose of a poison to determine if the poison is safe. None of them get sick, none of them die. Since no “major harm” results, is the test therefore ethical? I think not.

Because ethics applies to your actions, not the result of your actions. The question isn’t whether people got sick or people died. The question is: Was it ethical to give them the poison without informed consent?

Yes, Facebook routinely modifies messages, and is allowed to do so by its terms of service. It is entirely different to deliberately select 689,000 people, and deliberately experiment to see if selected modifications will help some or harm others. To me, this appears unethical, even if there were no major harms as a result. The results don’t matter; Facebook’s actions matter.

John Fenderson (profile) says:

Re: Ethics: Before and after

I’m certainly not a Facebook apologist, but there is a certain measure of “what did you expect?” about all of this. Facebook has a strong track record of being untrustworthy with regards to how they handle your personal information. I do have a hard time seeing how anyone who continues to use Facebook has some kind of moral high ground here.

It’s a bit like people using Google but then working themselves into a moral outrage over the fact that Google mines the information they give it.

zip says:

"They trust me ... Dumb fucks."

It seems that no matter how many times Facebook gets caught flagrantly breaking its promises or otherwise doing something underhanded or unethical, the vast majority of Facebook users will remain loyal.

This has always been a mystery to me. Though apparently not to Mark Zuckerberg, who years ago sized up Facebook users rather accurately, in his famously leaked chat log:

“They ‘trust me’”
“Dumb fucks.”

It’s basically the same conclusion P.T. Barnum came to over a century ago, as in his famous quote “There’s a sucker born every minute.” (and like P.T. Barnum, Mark Zuckerberg obviously has no qualms about turning that observation into a business model)

Mike Masnick (profile) says:

Re: "They trust me ... Dumb fucks."

“They ‘trust me’”
“Dumb fucks.”

First off, while that quote is inexcusable, I think it’s silly that folks still point to it, as if a comment made by Mark Zuckerberg 10 years ago in his dorm room has any relevance to the multinational company that Facebook has become today.

It’s basically the same conclusion P.T. Barnum came to over a century ago, as in his famous quote “There’s a sucker born every minute.”

Speaking of suckers… PT Barnum never said that.

http://www.historybuff.com/library/refbarnum.html

Anonymous Coward says:

Full Disclosure

The mood experiment on FB was not the actual study, although I still must apologize for any distress caused to third parties.

My real research study was in fact:

If the communications between an institutional review board and university researchers were slightly modified (“You should not do that” altered to read “You should do that”), would the massive violations of professional ethics (and subsequent loss of employment and reputation) affect the mood of the researchers?

Zonker says:

I wonder how many people got “unfriended” for too many negative posts in their feed while their positive posts were hidden. How many people were upset that everybody ignored the secretly hidden posts about their parents passing away, while everyone liked their cat pictures. How many people thought their friends were blocking them because they no longer saw any of their posts at all.

Population studies, where you unobtrusively observe a population for research, are one thing. Manipulating people or their experience without their knowledge or consent to see what happens is another. Guess which research method is the unethical one.
