The company tweaked the News Feed algorithms of 600,000 unwitting Facebook users so that people saw an abnormally low number of either positive or negative posts.
In a recently published study [1], the scientists say they found that when people saw fewer positive posts in their feeds, they produced fewer positive posts and wrote more negative posts instead. On the flip side, when the scientists reduced the number of negative posts in a person's News Feed, those individuals became more positive themselves.
"Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness," study authors Adam Kramer, Jamie Guillory, and Jeffrey Hancock write. "We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues."
This idea is interesting in and of itself, but The AV Club's William Hughes [2] also points out that the study highlights something most users probably don't think about: by agreeing to Facebook's Data Use Policy [3] when you sign up, you automatically give the company permission to include you in big psychological experiments like this without your knowledge.
(Hat-tip to Rami Ismail, who tweeted the study [4].)
Links
- ^ In a recently published study (www.pnas.org)
- ^ The AV Club's William Hughes (www.avclub.com)
- ^ Data Use Policy (www.facebook.com)
- ^ who tweeted the study (twitter.com)