Facebook’s ad business is booming, but by most other metrics, it has had a very bad couple of years. The social media company, which changed its mission statement in 2017 to “giv[ing] people the power to build community and bring the world closer together,” has given some people the power to use information about the rest of us in ways we could not have intended.
The latest controversy concerning Facebook user data came to light on March 16 and involves a data analytics firm, closely associated with former White House strategist Steve Bannon, that worked for the Trump campaign. According to reporting by the New York Times and the Guardian, in 2014 a Cambridge-based, Russian-American computer science professor gathered data from about 250,000 Facebook users, who were offered three or four dollars to download an app called thisisyourdigitallife. The app included personality quizzes (not “Which Muppet Are You?,” but similar to those offered by psychological measurement centers), the results of which, its developers said, would be used only for research purposes.
However, the app collected more than survey results from those who willingly downloaded it and agreed to share their data: it also scraped data from the respondents’ friend networks, people who had never taken the app’s quizzes, never downloaded the app, and in all likelihood had never even heard of it. In total, over 50 million users were swept up. That information was then shared (in violation of its terms of service, Facebook claims) with Cambridge Analytica, the data analysis firm that President Trump’s 2016 campaign hired to develop and run targeted political advertising on social media. Perhaps most worryingly, even though Facebook asked Cambridge Analytica to delete the information in 2015, when it learned of this activity and found that the sharing violated the social media company’s terms and conditions, reports indicate that these Facebook users’ personal data was still in Cambridge Analytica’s hands as late as 2017.
Facebook no longer permits this kind of scraping, but there is no dispute that it did at the time. The problem with the Cambridge Analytica controversy isn’t that an app developer dissembled about how the data he collected from both willing and unwilling users would be used, or whether he shared it with a political data research firm. It’s that the data of nonconsenting Facebook users was available for sharing at all.
Just as old-line media outlets like newspapers and television stations have been folded into the Facebook platform, so too have users’ interactions with other companies’ apps and websites. Very few third-party websites or apps fail to permit (or, for users who favor the convenience of a single password over dozens of different ones, actively encourage) logging in and interacting through Facebook account credentials. This has been true even of wildly popular gaming apps like FarmVille, which a typical user would assume harvests virtual trees and crops, not the personal information of the user and her network of friends. Sharing personal data via Facebook, in other words, isn’t a misuse of Facebook; it’s the whole point of Facebook. Without monetizing personal data, Facebook has nothing to sell.
The Cambridge Analytica story has also reminded us that it’s not just people’s posts to Facebook or interactions with apps that have created this vulnerability. Much of what Facebook knows about us comes from what we like — or more specifically, what we “like.” Compiling and analyzing user “likes” allows companies to predict not just our political preferences, gender, location, and race, but even our fears, doubts, and life traumas. Advertisers — who, unlike its users, actually pay Facebook — can then use that data to micro-target their messages, using both real and fake news, in ways that were unimaginable before social media.
This has caused a sea change in not just product-based advertising (that pair of shoes you looked at on Zappos a year ago will follow you around the Internet forever), but in political advertising as well. Whether this targeting actually works is an open question, and many think it doesn’t. The point is that our data is being used in attempts to manipulate us without our consent, irrespective of whether those attempts are eventually successful.
Those waiting for Facebook to solve these issues will be waiting a very long time. The trove of data it collects about each of its users is exactly what makes it valuable to those who do business with it. And deleting our Facebook apps altogether and en masse is an unsatisfying solution as well; some people need Facebook to stay connected to friends and family from far away, and in some countries Facebook is the de facto Internet. So a world without Facebook is very hard to imagine. The solution, if there is one, thus has to come from somewhere else. We might start with ourselves.
If we have to trust our friends or Facebook to decide how private our data is going to be, then maybe the only solution is to be more thoughtful about what we share. Both Facebook’s mission statement and its business model rely on a well-known adage: knowledge is power. By giving up so much about ourselves every day on social media, we give up a certain part of our power as well.
Enrique Armijo is the Associate Dean for Academic Affairs and an Associate Professor at the Elon University School of Law in Greensboro.