
When Worlds Collide

Why the MR industry needs to get ahead of the Facebook-Cambridge Analytica controversy and start advocating for new frameworks on the use of personal data from all sources, before others without the knowledge, experience, and personal stake in the issue decide for us.

Editor’s Note: Sometimes we are graced to be exposed to brilliance, and this is one of those times. I asked Nelson to write this up because I thought he would do a good job, but he blew me away. This is the single best distillation of the issues involved in the larger Facebook/CA imbroglio I have read anywhere by anyone. That it is through the lens of the potential implications for the MR industry makes it twice as relevant for us all.   

GreenBook and The ARF are teaming up to do a “pop-up” event on this topic on April 26th (register here!); Nelson does an incredible job of outlining why our industry needs to get ahead of this controversy and start advocating for new frameworks on the use of personal data from all sources, before others without the knowledge, experience, and personal stake in the issue decide for us.

Mark Zuckerberg always wanted Facebook to bring the world closer together, and this is what it’s like when worlds collide.

Facebook has provided a central node of focus for Americans, Europeans (including the UK!), and Russians; for local, state, federal, and international governments; for consumer groups, trade associations, and political parties; and for research, insights, marketing, and tech sector professionals. The cause célèbre is concern over data protection and the unwanted intrusions of third (or second) parties into the private business of those who provide the data: the public. As the so-called Facebook-Cambridge Analytica scandal boiled over on March 16, the threat of unwanted intrusions suddenly expanded beyond those who provide the data to those who collect and use it: our industry.

Immediately after Facebook, the data gate-keeper, awkwardly (and perhaps libelously) announced it had suspended the accounts of alleged data user SCL/Cambridge Analytica, data collector Aleksandr Kogan, and whistleblower Christopher Wylie, the reaction from the public, investors, and government officials of every political stripe was swift and tangible. Before we heard a single word from Zuckerberg, these forces converged:

  • #DELETEFACEBOOK began trending on Twitter
  • Facebook stock fell sharply the next trading day, and fell further once news of an FTC investigation broke
  • Massachusetts Attorney General Maura Healey announced an investigation, joined days later by New York State Attorney General Eric Schneiderman
  • Democratic Senator Mark Warner reiterated his plans for Congress to improve controls on political ad targeting
  • President of the European Parliament, Antonio Tajani, tweeted that EU lawmakers “will investigate fully, calling digital platforms to account”
  • Adam Schiff, a top Democrat on the House Intelligence Committee, invited Wylie, who accepted, to appear before the panel
  • Senators Amy Klobuchar (D) and John Kennedy (R) demanded hearings on the security of user data online, calling for the CEOs of Facebook, Twitter and Alphabet Inc.’s Google to testify before the Judiciary Committee

All of this sits on top of ongoing investigations by the British Information Commissioner’s Office into Cambridge Analytica and Facebook, by the British electoral commission into Cambridge Analytica’s role in the EU referendum, US Special Counsel Mueller’s collusion probe, and David Carroll’s lawsuit against Cambridge Analytica. There has been no shortage of fresh controversy or new lawsuits, and, on May 25, enforcement of the EU’s General Data Protection Regulation (GDPR) begins.

All of these events suggest that government bodies, politicians, courts, and litigants will play increasingly significant roles in defining concepts and establishing and enforcing policy for the traditionally self-regulated research and insights industry. While the digital age has certainly ushered in new opportunities for research and insights professionals, it has limited the industry’s ability to enforce its own ethical norms and, therefore, to fend off outside encroachment. Many industry participants, such as Zuckerberg and Wylie, come from different professional cultures than the traditional researcher, and the conflicts of interest that traditional researchers take for granted are invisible to them. Further, the digital age arrived with the promise of great riches but no roadmap to success, and a modern gold rush has ensued in which opportunity and urgency have precluded caution: one must stake a claim as quickly as possible – before looking to see if the stake pierces someone else’s heart. As a result of trends like these, we have an industry that is better at writing sound policies than at training effective police.

The Facebook-Cambridge Analytica scandal poses several threats to the research and insights industry:

  • Public confidence may erode
  • Participation in market research or opting-in to allow data access may decline
  • Tools such as microtargeting may not realize their potential
  • Costs may increase due to greater regulatory encumbrances

To blunt these potential threats, we need to articulate a clear and unified story about the Facebook-Cambridge Analytica scandal that highlights contrasts between ethical and unethical practices, rewards the innocent and punishes the guilty, and demystifies arcane terms and processes that confuse and mislead the public. An honest, fact-based analysis of these events may serve as a roadmap for the government bodies, politicians, litigants, and courts looming in our future.

As a start, it might help to simplify the story as it relates to data protection and discuss what it might mean for research and insights professionals. The scandal has many twists, turns, nuances, and alternative versions, but it boils down to eight major activities:

  1. Facebook provides Terms of Service and Privacy Policy to user.
  2. Facebook provides Platform Policy for developers and grants access once developer agrees.
  3. GSR agrees to Platform Policy.
  4. GSR recruits user to app and provides its policies.
  5. User allows access to own profile and friends’ profiles, and completes survey.
  6. GSR generates psychographic segments from survey data, then assigns each profile to a segment.
  7. GSR provides profiles with segment assignments (and probably the algorithm) to CA.
  8. Cambridge Analytica allegedly microtargets voters.

Although there are tantalizing questions about what happened at each of these steps, the last activity, microtargeting of voters by Cambridge Analytica, is the only one in dispute; the Trump campaign, for example, has recently denied that it used Cambridge Analytica’s data, claiming that it decided to rely only on data provided by the Republican National Committee. Whether the data was used in other campaigns remains unclear.

A closer look at each of these activities may help to clarify the story and identify more general risks to the research and insights industry.

1. Facebook provides Terms of Service and Privacy Policy to user.

Context: In 2009, Facebook made public certain information (such as friends lists) that users had designated as “private.” Facebook did not notify users of this in advance and did not seek their approval. In 2011, Facebook was forced to sign a consent decree with the FTC for failing to protect user data. In the wake of the current scandal, the FTC is investigating whether Facebook violated that decree.

Caution for Industry: Even a perfect privacy policy can be undermined if the highest profile actors in the industry establish a pattern of disguising their policies, changing them without notice, or even ignoring them altogether. Facebook has consistently engaged in these behaviors over a long period of time, and, as that history gains traction in the public consciousness, it may create legal or personal barriers for anyone who wants access to data, regardless of their own policies.

2. Facebook provides Platform Policy for developers and grants access once developer agrees.

3. GSR agrees to Platform Policy.

Context: In 2007, when Facebook was one-tenth the size of MySpace, it changed its policies to provide an open platform for developers. Five years later, it was successful enough to have an IPO, proving the financial advantage of openness.

When Kogan entered his agreement with Facebook to allow his app to use the platform, he specified it would be used for “research.” In a March 18, 2018 email, Kogan says he went into the official platform in 2014 to change his app’s terms and conditions from “research” to “commercial use.” He claims that Facebook did not object, but Facebook claims that its privacy policy forbade a developer from transferring data externally once it was collected and that its policy superseded whichever terms and conditions he set up for his app. Kogan has agreed to an audit by independent forensics experts. Facebook’s current data policy includes, among other things, the statement: “Information collected by these apps, websites or integrated services is subject to their own terms and policies,” which seems easy to read as superseding Facebook’s policies once the transfer occurs.

Caution for Industry: If Kogan’s transfer of the profiles to Cambridge Analytica was not in violation of his agreement with Facebook, then Facebook bears the brunt of the responsibility for the scandal (at least the data part of it). Perhaps the forensics or the official investigations and litigation will eventually reveal the truth. For now, we have to question two practices: 1) leaving any aspect of data protection up to a self-service agreement and 2) maintaining so many policies that they at least seem to contradict each other, a condition that persists to this day.

4. GSR recruits user to app and provides its policies.

5. User allows access to own profile and friends’ profiles, and completes survey.

Context: These seem to have gone according to policy. One might wonder how clear it was to the Facebook user that their friends’ profiles could be accessed or what kind of information could be acquired that way.

Caution for Industry: The lesson here would seem to be around transparency and clarity, as well as the wisdom of allowing one person to speak for another’s data.

6. GSR generates psychographic segments from survey data, then assigns each profile to a segment.

Context: Since the scandal broke, several publications have mocked this approach and argued that it is not actionable. Critics have ranged from journalists with some relevant background to leaders of PACs who derided Cambridge Analytica’s sales pitches.

Caution for Industry: It is not clear how much of this rhetoric is fact-based versus speculation-based versus politically motivated. While some of the critics may know the subject matter well, others have a personal interest in distancing themselves and their political allies from Cambridge Analytica. It might be healthy for research and insights experts in this area to generate some PR around this topic that adds some rigor to the public discussion.

7. GSR provides profiles with segment assignments (and probably the algorithm) to CA.

Context: As mentioned under activity 3, this is a major turning point in the story of the scandal. Kogan delivered the data in late summer 2014 and claims he believed he had the right to turn it over to his client. Wylie and Cambridge Analytica claim they asked Kogan whether he was allowed to transfer it and had no reason to doubt him, even though such a transfer violates UK law. Cambridge Analytica let Kogan keep a copy after he delivered it to them. A year later, Facebook found out about the harvesting, did not tell users or make a public statement, removed Kogan’s app, and demanded that each party certify it had destroyed its copies of the data. (Wylie, however, did not receive this demand until August 2016 and maintains he had already destroyed any copies he had.) On March 17, 2018, the New York Times reported that copies of the dataset still existed, that a former Cambridge Analytica employee claimed to have recently seen an unencrypted copy on the company’s servers, and that the Times itself had viewed a copy.

Caution for Industry: Hearing that copies of 50 million Facebook profiles with psychographic targeting codes exist in multiple places beyond Facebook’s control probably doesn’t make the public eager to share data with anyone. The chain-of-possession story and lazy follow-through don’t instill much confidence, either. Following major hacks of Experian, Yahoo, Target, and others, public trust can’t be very high in any industry. The research and insights industry may need to decide whether public trust is better served by emphasizing how its traditional ethics and need to instill trust set it apart from other industries, or by uniting behind a broader, cross-industry effort.

8. Cambridge Analytica allegedly microtargets voters.

Context: This is the most emotional and murkiest area of the entire scandal and probably deserves a separate discussion. Cambridge Analytica is alleged to have used voters’ own data to manipulate them and control their minds. Yet many political insiders claim that its microtargeting approach could not have been effective. Others claim that it was never used – including Trump’s campaign and Cambridge Analytica itself. Still others, outside the political realm, claim that it is not possible to change people’s minds, or that anyone whose mind could be changed by microtargeting is weak-minded anyway. Wylie and others claim it was a very effective tactic for Obama in 2008 and 2012, and Hillary Clinton’s camp thinks it plausible that such an approach could have swung the 70K voters they say made the difference in the election.

Perhaps more significant than the doubts about its effectiveness are the beliefs that microtargeting is inherently evil. No doubt this perception is strongest among people who resent Trump’s election and those who have heard Cambridge Analytica bragging about manipulating elections in multiple nations. Many seem to perceive it as a kind of “brainwashing.”

If it was used, it is not clear how the microtargeting was carried out: for example, how much was via ads versus networking through real or fake profiles. Nor is it clear what messages were used: messages that reinforce a candidate’s core promises, or hateful ones meant to discourage people from voting. Whether the process and messages were ethical should be a separate issue from whether microtargeting itself is an ethical method.

Caution for Industry: Marketers and consumers alike would lose out should microtargeting become demonized or wrongly debunked. Industry professionals who have invested in these methodologies might face an undeserved backlash, marketers would lose a valuable tool for increasing profits, and consumers who are sick of seeing the same old irrelevant spam ads might tune out completely. Microtargeting seems ripe for a high-profile, serious discussion that demystifies it and highlights its benefits for those on the periphery.

Perhaps all of the issues raised by the scandal have different implications when discussed in a political versus a marketing context; or perhaps none of them do. Microtargeting seems to be the one most in need of separate political and marketing discussion because it seems to be the least well understood and the most emotional, aspects that are enhanced in political discussions.

To conclude where we began, Mark Zuckerberg has made good on his promise to bring the world closer together. Now it’s time for research and insights professionals to leverage their skills in analysis and storytelling to keep the world’s governments and courts from imploding on top of us.


One response to “When Worlds Collide”

  1. The trouble with collisions is that they are usually avoidable – and most of the issues surrounding this event could have been easily avoided, had any qualified researcher been involved.

    In the space of the past five years or so, many expert commentators – particularly those with links to ‘new tech’ – have been dismissive of “codes of conduct”, claiming them to be ‘old-fashioned’, a ‘hindrance’ to innovation, and in some cases, as completely ‘redundant’. Now, surprisingly (!), “Codes of Conduct” are the flavour of the month again, and the risks of the FB/CA case on perceptions of self-regulation are suddenly back on top of everyone’s mind.

    For those of us who have had (and are still having) the pleasure of working with ESOMAR, an organization with a 70-year track record of successful self-regulation in the insights industry, these issues have never been anything but top of mind, all of the time.

    Our ICC/ESOMAR Code of Conduct, drafted by practitioners for practitioners (including new tech practitioners!!), has managed to stay at the heart of our profession due to the tireless efforts of working researchers who want the Code to remain the effective tool of self-regulation it always has been. Thus the ICC/ESOMAR Code – updated a year ago – is endorsed by over 50 countries globally, and its principles are found at the heart of many other national Codes of Conduct.

    Furthermore, we didn’t wait for the FB/CA case: we have invested heavily in new technologies to boost the transparency of our self-regulation, which will come online later this year, demonstrating again that we remain proactive in adapting to the new operational realities of the industry and to the changing expectations of both the general public and the regulators tasked with protecting them.

    If we have failed, it has been in the marketing of our Code, and what it truly means to the insights profession. Nowadays, thankfully, more brand owners have signed up to the Code, noting its importance in this new era of digital information.
    Let us therefore address our marketing shortcoming, and shout out loud about our successful 70-year history! Maybe then Facebook will understand why a Code of Conduct is so important – not to mention CA.
