Sinister Mood Experiment By Facebook

by Vargis.Khan

Our emotional responses and moods are a direct result of what we see, hear, or feel. For example, if a small kid talks to you in cute, broken sentences, it immediately lightens your mood; but if you witness a fatal road accident, you are bound to remain depressed for the rest of the day. When we wake up every morning, our mood is a response either to the previous day's events or to what we have planned for today. As we go through the day, that mood changes based on what we see and hear around us.

Now, we all know this for a fact. It is no hidden secret, but Facebook took it to a whole new level of creepy in a recent experiment conducted on some 689,000 users who had absolutely no idea that they were part of a study.

All companies and business organizations run research and experiments in order to gain more business. On the Internet, these experiments are termed "A/B" tests. During such a test, different users are shown different content even though they are looking at the same webpage. Google runs these tests by making small tweaks to its search algorithms to see whether the changes produce more useful results. CNN, for example, runs a tool that shows different headlines to different users to check which one generates more clicks. The motive of these tests is to gauge the kind of content that can produce more clicks for a website.
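To make the mechanics concrete, here is a minimal sketch of how a site might bucket users for an A/B test. All names here (`ab_variant`, the experiment label, the sample headlines) are hypothetical illustrations, not anything Facebook or CNN has published; the common idea is simply to assign each user to a variant deterministically so their experience stays consistent.

```python
import hashlib

def ab_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a user to a test variant.

    Hashing the user ID together with the experiment name means the
    same user always lands in the same bucket across visits.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Hypothetical headline test: each bucket sees a different headline,
# and the site measures which one draws more clicks.
headlines = {
    "A": "Storm hits coast",
    "B": "Coastal storm leaves thousands without power",
}
print(headlines[ab_variant("user-42", "headline-test")])
```

The user never knows a test is running; from their side, the page simply looks the way it looks.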

Facebook, however, decided to take these experiments to another level entirely. In their test, they picked 689,000 random users and changed the posts that appeared on each user's timeline. These users were divided into two groups. One group saw only negative posts on their timeline, while the other group saw a completely positive set of posts. The result of this experiment? Users who saw negative posts produced more negative posts, while users who had a positive experience posted something positive. The test ran for an entire week, and in a way FB ruined the entire week of a significant number of people by showing them posts of violence and hatred on their timelines. There was an outrage when word of this experiment leaked. Facebook's innocent explanation for this "mood experiment" was that the tests were conducted to improve its services and make the content people see on Facebook as relevant and engaging as possible.
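The design described above boils down to filtering each group's feed by sentiment and then watching what they post next. The sketch below is a toy illustration of that idea only; the post texts, scores, and the `filtered_feed` helper are all made up for this example and are not Facebook's actual method or data.

```python
# Toy posts with hypothetical sentiment labels: +1 positive, -1 negative.
feed = [
    ("Great day out with friends!", 1),
    ("Terrible accident downtown", -1),
    ("Loved the concert last night", 1),
    ("Feeling hopeless today", -1),
]

def filtered_feed(posts, group):
    """Return only the posts matching the group's assigned sentiment."""
    sign = -1 if group == "negative" else 1
    return [text for text, score in posts if score == sign]

# One group's timeline is stripped of positive posts, the other's of
# negative ones; the study then compared the tone of what each group
# went on to post themselves.
print(filtered_feed(feed, "negative"))
print(filtered_feed(feed, "positive"))
```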

When we sign up on these websites, we all agree to such experiments, whether we are made aware of them or not. It is in the terms of service agreement that none of us bothers to read; if you do not agree to be part of any such experiment, the website will not allow you to complete the registration process. For example, FB's data use policy reads, "We may use the information we receive about you for internal operations, including troubleshooting, data analysis, testing, research and service improvement." Tests of this kind are very common on the Internet, and people are merely treated as lab rats in an Internet maze.

FB may be sincere in its official explanation, but isn't there a more sinister side to such experiments? These days, the entire world seems to have registered on Facebook. Granted, the majority of people log in to Facebook to kill time and stay connected with their friends, but lately there have been instances of people using Facebook for lobbying, protests, revolts, and even riots. A/B tests are highly manipulative. They are said to measure a person's response, but they are actually a form of persuasion, nudging the user to click a desired link or produce a desired emotional response. If you are part of a test, you might click on something you would otherwise have ignored, or feel something you wouldn't otherwise have felt. Doesn't this open us all up to a very sinister and dangerous situation? Every mutiny, riot, and revolt in the history of the world was made possible only by provoking people. The entire world today is an audience to Facebook. What if tomorrow a government decides to use FB to promote hatred towards another government, a community, or even an entire country?

Even if the implications are not that drastic, what gives Facebook the right to ruin someone's day like that? Any user who saw posts of violence and hatred on their timeline could have gone out and spread it further in the real world. What if a person of an already violent nature saw violent posts for an entire week? All of us can imagine the direct result of that.

We can crib about it and cry about it, but as I mentioned, we are merely lab rats in an Internet maze. Facebook's policy is simple: if you do not agree to it, do not bother to sign up. But the big question is, should there not be a law against such experiments? Think about it: is it not entirely possible that Facebook killed someone during that week of testing by triggering a violent response in one of the users through all the negative posts? What is the guarantee that the people who saw negative posts on their timelines spread negativity only online, and not out in the real world?
