Facebook deliberately screwed with users' news feed to run an experiment on "massive-scale emotional contagion".
Facebook have something of a reputation for pushing the boundaries of privacy, but to their credit they are treading where no web service has gone before. There's never been a social network of 1 billion+ people before.
But people all over the world were shocked to find out what Facebook had done this time, and the folk at Facebook actually published a paper on their experiment, as though they were proud of what they'd done. Read it here.
Here's how Facebook works normally
A lot of people still think that when they log in to Facebook, they see the most recent posts from their friends and the pages or groups they follow.
But that's not how it works. Facebook decides what it wants to show you, based on whether you normally interact with certain people, how popular a particular post is... and many, many other factors.
Facebook say the reason they do this is to provide the best possible user experience. There are thousands of new posts they COULD show you, but you won't have time to see them all. So they dish up what they think you'll like the most.
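To make that concrete, here's a toy sketch of what "ranking a feed" means. Every signal, weight, and field name below is invented for illustration; Facebook's real algorithm uses far more signals and is not public.

```python
# Hypothetical feed ranking: score every candidate post, show only the
# top few. Signals and weights here are made up for illustration.

def rank_feed(posts, top_n=10):
    """Return the top_n posts, ordered by a toy relevance score."""
    def score(post):
        return (
            2.0 * post["friend_interaction_rate"]  # how often you engage with the author (0..1)
            + 1.5 * post["likes"] / 100            # overall popularity of the post
            + 1.0 * post["recency"]                # newer posts score higher (0..1)
        )
    return sorted(posts, key=score, reverse=True)[:top_n]

posts = [
    {"id": 1, "friend_interaction_rate": 0.9, "likes": 10,  "recency": 0.5},
    {"id": 2, "friend_interaction_rate": 0.1, "likes": 500, "recency": 0.9},
    {"id": 3, "friend_interaction_rate": 0.2, "likes": 5,   "recency": 1.0},
]
print([p["id"] for p in rank_feed(posts, top_n=2)])  # → [2, 1]
```

The point is simply that the feed is a scored, filtered selection, not a chronological list: change the weights, and you change what people see.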
No big deal, this has been public knowledge for some time (even if many people are unaware or unhappy about it).
But here's what Facebook did in their experiment
Basically they wanted to know what would happen if they showed a bunch of users more negative content, or more positive content.
So they manipulated the feeds of more than 600,000 users to contain either more negative emotional content or more positive emotional content, and then measured what those people went and posted later on Facebook themselves.
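In essence, posts were classified by the emotional words they contained, and one category was suppressed from a user's feed. Here's a minimal sketch of that idea; the word list below is invented (the actual study used the LIWC word lists, which are much larger).

```python
# Toy sketch of keyword-based emotional filtering: posts containing
# words from one emotional category are dropped from the feed.
# The word list is invented; the real study used LIWC lexicons.

POSITIVE_WORDS = {"happy", "great", "love", "awesome"}

def suppress_positive(posts):
    """Return only the posts that contain no positive keywords."""
    def is_positive(text):
        return any(word in text.lower().split() for word in POSITIVE_WORDS)
    return [p for p in posts if not is_positive(p)]

feed = [
    "what a great day",
    "stuck in traffic again",
    "love this song",
    "so tired of the rain",
]
print(suppress_positive(feed))  # → ['stuck in traffic again', 'so tired of the rain']
```

Run the same filter in the other direction (suppressing negative words) and you get the "more positive" condition. The manipulation is crude, but at the scale of hundreds of thousands of feeds, even a crude nudge was measurable.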
And it turns out...
That what you see on Facebook DOES affect your emotional state.
People shown more negative content were more likely to post more negative things later, and vice versa.
And all Facebook users have agreed to participate in this.
It's in Facebook's terms of service (you know, that stuff that no one reads) that they're allowed to experiment on you.
Perhaps someone needs to talk to the data scientists at Facebook about depression, anxiety, suicide, and using unwitting human subjects as test objects.