Tuesday, July 1, 2014

Facebook: Unethical, untrustworthy, and now downright harmful (ZDNet)

Summary: News of Facebook experimenting on its users' emotional states has rattled everyone. Worse, the tool used to perform the experiments is so flawed there's no way of knowing if users were harmed.

By Violet Blue for Pulp Tech
If there's one thing we've learned from zombie movies, it's that when the word "contagion" is associated with humans getting experimented on without their knowledge at the hands of a cold, massive corporation -- things never end well.
On June 2, the Proceedings of the National Academy of Sciences published "Experimental evidence of massive-scale emotional contagion through social networks." It made headlines last weekend, which can be succinctly described as a 'massive scale contagion' of fury and disgust.
In "Experimental evidence" Facebook tampered with the emotional well-being of 689,003 unknowing users to see how emotional contagion could be controlled; basically, how to spread, or avoid the spread of, its users' feelings en masse.
Everyone except the people who worked on "Experimental evidence" agrees that what Facebook did was unethical. In fact, it's gone from a toxic pit of ethical bankruptcy to an unmitigated disaster in a matter of days.
Cornell University is now distancing itself from involvement in "Experimental evidence." Facebook appears to have been caught changing its Terms to include "research" after the work had been done. The study is now being called into question over approval-laundering by respected academics.
It's not going to get any better when people take a look at the tool Facebook used to do its experiments -- a tool so woefully wrong for the job that no one, including Facebook, will ever know what Facebook actually did to its users' emotional health.
For all of its work thus far studying emotional contagion, Facebook has used the Linguistic Inquiry and Word Count (LIWC2007) tool, though "Experimental evidence" was the first time Facebook used the tool to actively interfere with its users.
LIWC2007 -- note that date -- was conceived to provide a method for studying the "various emotional, cognitive, structural, and process components present in individuals' verbal and written speech samples."
The tool was created for analyzing long-form blocks of text: books, research papers, therapy transcripts, and the like.

Facebook actually doesn't know what it did to half a million people

The Base Rates of Word Usage that LIWC is based on (all prior to 2007) include American and British novels, "113 highly technical articles in the journal Science published in 1997 or 2007," text from studies, random writing assignments, and,
A fourth sample [which] included 714,000 internet web logs, or blogs, from approximately 20,000 individuals who posted either on Blog.com in 2004 or LiveJournal.com in the summer and fall of 2001.
LIWC2007 fails when it comes to short bursts of text -- especially when negation is involved.
The LIWC2007 website offers free use of the tool. Run a negative status update such as "I am not having a great day" through it and the tool fails to account for positive and negative words within the same sentence: "great" is tallied as positive emotion, and the negation does nothing to flip it.
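To see why that matters, here is a rough sketch of the kind of dictionary counting LIWC does, applied to that same status update. The word lists and scoring below are toy stand-ins, not LIWC's actual dictionaries, but they show how a pure word count misses negation:

# Minimal sketch of dictionary-based word counting, the general approach
# LIWC takes. The word lists are toy stand-ins, NOT LIWC's dictionaries.

POSITIVE = {"great", "happy", "love", "nice"}
NEGATIVE = {"sad", "awful", "terrible", "hate"}

def count_emotion_words(text: str) -> dict:
    """Count words in each category, ignoring sentence structure entirely."""
    words = text.lower().replace(".", "").split()
    return {
        "positive": sum(w in POSITIVE for w in words),
        "negative": sum(w in NEGATIVE for w in words),
    }

print(count_emotion_words("I am not having a great day"))
# {'positive': 1, 'negative': 0}
# The "not" never flips the polarity, so a plainly negative status update
# is scored as containing positive emotion and no negative emotion.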
It's hard to fathom how far things have gone at Facebook that researchers feel entitled to experiment on the emotional state of its users -- by removing content from their feeds -- with LIWC2007.
With no post-experiment interviews or debriefing, there is no way of knowing exactly what Facebook did to the emotional temperature of over half a million people.

Conspicuously different from Facebook's other emotional contagion studies

"Experimental evidence" was the third time Facebook studied its users' emotional contagion without their knowledge -- thought it is the first known time Facebook has experimented with controlling the emotions of its users.
Almost four months before "Experimental evidence" appeared in PNAS and started making the neuroscience rounds on Twitter last week, "Detecting Emotional Contagion in Massive Social Networks" was published March 12, 2014 in PLOS ONE, an international, peer-reviewed, online scientific journal for reports on primary research. 
"Detecting Emotional Contagion" was a product of UC San Diego and Yale, with Facebook employees Adam Kramer and Cameron Marlow.
"Experimental evidence" was a product of UCSF's Center for Tobacco Control Research and Education, Cornell University, and Facebook's Adam Kramer, who is listed as the paper's primary contact. The primary contact for the first study is UC San Diego's James H. Fowler.
Marlow was thanked in "Experimental evidence" -- the study which became Facebook's foray into contagion experimentation. He was also a co-author of the earliest of the studies, "Structural diversity in social contagion" (October 6, 2011, also in conjunction with Cornell and UCSD), which did not include Facebook's Adam Kramer.
"Detecting Emotional Contagion" ran for 1180 days from January 2009 to March 2012. "The study was approved by and carried out under the guidelines of the Institutional Review Board at the University of California, San Diego, which waived the need for participant consent."
According to "Experimental evidence" researchers, user consent was not necessary because Facebook's Terms stood as agreement -- specifically the word "research" indicated that users agreed to the emotional manipulation experiment because they had clicked "agree" when signing up, or by continuing to use the site after ToS updates.
Whereas "Experimental evidence" hid both positive and negative posts from friends, colleagues and family from users to see if it changed the way users influenced each other's feelings, "Detecting Emotional Contagion" instead examined external influences on users to see if users simply influenced each other's feelings by a naturally occurring, impossible to manipulate occurrence: The rain.
Here, we elaborate a novel method for measuring the contagion of emotional expression. With data from millions of Facebook users, we show that rainfall directly influences the emotional content of their status messages, and it also affects the status messages of friends in other cities who are not experiencing rainfall. For every one person affected directly, rainfall alters the emotional expression of about one to two other people, suggesting that online social networks may magnify the intensity of global emotional synchrony.
UCSD's "Detecting" study noted, "Importantly, rainfall is unlikely to be causally affected by human emotional states, so if we find a relationship it suggests that rainfall influences emotional expression and not vice versa."
Instead of changing the user’s emotion directly with an experimental treatment, we let rainfall do the work for us by measuring how much the rain-induced change in a user’s expression predicts changes in the user’s friends’ expression.
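To make that natural-experiment logic concrete, here is a toy simulation of the idea as described above: rain nudges one user's expression, and only the rain-induced part of that nudge is used to estimate how strongly a friend's expression follows. The data, effect sizes, and regression here are illustrative stand-ins, not the paper's actual model or code.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Did it rain in the user's city? (The friend lives in a dry city.)
rain = rng.binomial(1, 0.3, size=n)

# The user's negative expression rises with rain, plus unrelated noise.
user_negativity = 0.5 * rain + rng.normal(0, 1, size=n)

# The friend's negative expression responds to the user's expression
# (a true "contagion" effect of 0.4 in this simulation), plus noise.
friend_negativity = 0.4 * user_negativity + rng.normal(0, 1, size=n)

# Stage 1: how much does rain move the user's expression?
rain_to_user = np.polyfit(rain, user_negativity, 1)[0]

# Stage 2: how much does rain move the friend's expression?
rain_to_friend = np.polyfit(rain, friend_negativity, 1)[0]

# Because rain reaches the friend only through the user, the ratio
# recovers the contagion effect (an instrumental-variable-style estimate).
print(f"estimated contagion effect: {rain_to_friend / rain_to_user:.2f}")  # ~0.4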
Facebook's "Experimental evidence" hypothesis amounted to "let's see if we can plant unhappiness and make it spread." The hypothesis was tested on a large group of people -- and their networks -- that couldn't consent to the experiment, and had no way to actually track whatever impact it had on people's lives.
Facebook, once again, did what it's good at: tracking us, failing to get consent, and avoiding accountability.
Adam Kramer -- who worked on both studies -- posted a non-apology to Facebook that utterly missed the point, saying they were sorry about the way they had described the experiment while attempting to re-frame the concept of user consent as if it were a formality.
In classic Facebook style, he blamed users for being upset, as if news of emotional tampering in people's day-to-day lives was simply a misunderstanding that only anxious people worried about.
I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused.
As if Facebook's other, massive studies on emotional contagion never happened, he said:
We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out.
Ludicrously, he added, "At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."

However, he admitted that the firm did not "clearly state our motivations in the paper."
Emotional manipulation is a strangely intimate place to discover you're the subject of surveillance-cum-manipulation: even your unguarded moments of sharing feelings are subject to someone trying to get something out of you.
We want to call to account whatever makes this system of control possible, but if Cornell is any example of what to expect from the fallout, no one is going to be held accountable for companies like Facebook recklessly endangering users -- yet again. For those of us observing this spectacle in the sort of self-aware, displaced horror reserved for moments when life and sci-fi dystopia cross shadows, it has never been clearer that Facebook's ideas about organizing society are wholly broken.
Intentionally doing things to make people unhappy in their intimate networks isn't something to screw around with -- especially with outdated and unsuitable tools. 
It's dangerous, and Facebook has no way of knowing it didn't inflict real harm on its users.
We knew we couldn't trust Facebook, but this is something else entirely.
Violet Blue is the author of The Smart Girl's Guide to Privacy. She contributes to ZDNet, CNET, CBS News, and SF Appeal.
