Guest post by Liza Persson.
Recently Facebook conducted an experimental study exploring online “emotional contagion”: whether the emotional bias or “tone” (negative or positive) of the content people see online affects the emotional tone of the content they create afterwards. The emotional tone of content was inferred using an algorithm developed for this purpose, which in itself is a useful tool for analyzing content (Kramer, Guillory, & Hancock, 2014).
What Facebook was doing was not psychology, however, nor science in any other field. Facebook violated the procedures and principles of conducting research scientifically. It did not live up to the ethical safeguards that protect study participants, although it did obtain consent via its terms of service, which is probably sufficient to protect itself against lawsuits (American Psychological Association, 2010). Raising ad revenue is not the rationale for scientific research; serving the good of humanity is (Riley, 2014; Nisen, 2014).
With such different ideas of intent and of who the beneficiaries are (profit and shareholders vs. improving the conditions of life and humanity), it is no wonder that the standards for whom to protect from harm, and for what counts as harm, differ just as much.
Nor did the study produce results I think were worth what it must have cost. Although some findings supported the hypothesis and would have been worth noting had this been scientific research, the study design made them useless for drawing conclusions or guiding practice.
In the long run, the reaction of social media users will set the standards of conduct for experimenting on them without the informed consent that science demands. Ultimately, users will decide whether the likes of Facebook must conform to standards and accountability like those observed in scientific research. In the end, unless we want to outsource enforcement of scientific standards to a government, we all decide what is meant by terms like “science,” “scientific,” and “scientist.” We as a society, and we as individuals, create the standards.
The enforcer of scientific standards is not just the community of peers. All of us, as the ultimate intended users, beneficiaries, and owners of research results, are responsible for preventing the dilution of what counts as scientific. That means we define the meaning of science too.
Dilution occurs when research that does not meet these demands is nonetheless treated and cited as if it were on par with research that does.
American Psychological Association. (2010). Ethical Principles of Psychologists and Code of Conduct, 2010 Amendments. Retrieved from American Psychological Association: http://www.apa.org/ethics/code/index.aspx?item=11
Kramer, A. D., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 111(24). Retrieved June 2014, from http://www.pnas.org/content/111/24/8788.full?_ga=1.63229676.2108248947.1404172404
Nisen, M. (2014, June 30). Facebook is learning the hard way that with great data comes great responsibility. Retrieved from Quartz: http://qz.com/227869/facebook-is-learning-the-hard-way-that-with-great-data-comes-great-responsibility/
Riley, C. (2014, June 30). Internet outraged by Facebook’s ‘creepy’ mood experiment. Retrieved from CNN Money: http://money.cnn.com/2014/06/30/technology/facebook-mood-experiment/index.html?hpt=hp_t2
- Need to know: About Facebook’s emotional contagion study (IDEAS.TED.COM)
- Facebook: Unethical, untrustworthy, and now downright harmful (ZD Net)
- How an IRB Could Have Legitimately Approved the Facebook Experiment—and Why that May Be a Good Thing (The Faculty Lounge)
- What’s So Bad About Facebook Editing Our Feeds (Dr. Keely Kolmes)