Cambridge Analytica behaved appallingly, but don’t overreact

The horrendous actions by Cambridge Analytica, a voter-profiling company, and Aleksandr Kogan, a Russian-American researcher, raise serious questions about privacy, social media, democracy and fraud.

Amid the justified furor, one temptation should be firmly resisted: for public and private institutions to lock their data down, blocking researchers and developers from providing the many benefits that such data promises – for health, safety, and democracy itself.

The precise facts remain disputed, but according to reports, here's what happened. Kogan worked as a lecturer at Cambridge University, which has a Psychometrics Centre. The Centre purports to be able to use data from Facebook (including "likes") to ascertain people's personality traits. Cambridge Analytica and one of its founders, Christopher Wylie, attempted to work with the Centre for purposes of voter profiling. The Centre refused, but Kogan accepted the offer.

Without disclosing his relationship to Cambridge Analytica, Kogan entered into an agreement with Facebook, which agreed to provide data to him — solely for his own research purposes. Kogan created an app, called "thisisyourdigitallife." Offering a personality prediction, the app described itself on Facebook as "a research app used by psychologists." About 270,000 Facebook users agreed to disclose their data (again, for research purposes). Under Facebook's rules at the time, the app could also collect data from those users' friends, which is how 270,000 volunteers could yield information on tens of millions of people.

By sharing data with Cambridge Analytica, Kogan violated his agreement with Facebook. According to one report, he ended up providing more than 50 million user profiles to Cambridge Analytica, not for academic research, but to build profiles for partisan political uses.

Armed with those profiles, Cambridge Analytica worked with members of the Ted Cruz and Donald Trump campaigns in 2016. Among other things, the firm helped to model voter turnout, identify audiences for fund-raising appeals and advertisements, and specify the best places for Trump to travel to increase support.

As early as 2015, Facebook learned that Kogan had shared the data and demanded that Kogan, Cambridge Analytica, and Wylie cease using it and destroy all the information they had obtained. They certified that they had done so.

That was a lie – which recently led Facebook to suspend all three from its platform. Facebook was careful to add, "People knowingly provided their information, no systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked."

All this raises numerous questions – some of which involve difficult tradeoffs with respect to privacy and competing values. Aware of the risks, Facebook emphasizes that all apps requesting detailed user information have to "go through our App Review process, which requires developers to justify the data they're looking to collect and how they're going to use it – before they're allowed to even ask people for it."

In view of Kogan's misconduct, it's reasonable to ask whether that process contains sufficient safeguards. An external review panel might well be a good addition, and continued monitoring of how app developers use Facebook data seems important.

But let's not overreact. Authorized use of that data can do a great deal of good.

For example, Genes for Good, from the University of Michigan, is using a Facebook app to help combat diabetes, cancer, and heart disease. It seeks to learn how genes interact with the environment to produce – or not to produce – serious illness. There's tremendous potential there.

A more immediate response to health problems is HealthTap, an app that permits users to type questions into Facebook Messenger and to obtain free responses from doctors – or to see answers from doctors to questions similar to their own.

Stanford's Raj Chetty, one of the world's leading experts on inequality, is working with Facebook data to learn more about economic opportunity and in particular to understand the sources of intergenerational mobility.

Chetty has an extraordinary track record, and there's every reason to think that we will learn a great deal from his work.

With respect to politics, the Pew Research Center has used Facebook data to see how often, and exactly when, members of Congress directly express disagreement with the other party.

Pew found that disagreement comes most often from party leaders – and that it is far more common from Republicans than from Democrats. Sure, those aren't the most surprising findings, but there is far more to learn about polarization and partisanship – and Facebook's data will prove exceedingly valuable.

It is true, of course, that social media users should have a great deal of control over whether and how their information is used, and that app developers should be sharply constrained in their ability to share data.

The U.S. government has faced, and solved, similar problems: Data.gov discloses a great deal of information, with more than 230,000 data sets involving health, safety, travel, energy, and the environment. Available apps, made possible by that information, are helping people to save money and to avoid health risks.

For social media providers, including Facebook, the Cambridge Analytica fiasco underlines the need for more careful vetting of all developers who seek access to their data. But it would be a mistake to take the fiasco as a reason to keep treasure troves of information out of the hands of people who can provide immensely valuable services with it.

Sunstein is a Bloomberg View columnist. He is the editor of "Can It Happen Here? Authoritarianism in America" and a co-author of "Nudge: Improving Decisions About Health, Wealth and Happiness." In the last year, he has occasionally served as a consultant to Facebook but not on any issue related in any way to the topic here.