
Insights Ethics: Setting Standards That Differentiate Persuasion From Manipulation

The ARF's Town Hall discussion on a variety of themes related to ethics and privacy in the time of Cambridge Analytica, including government regulation vs. self-regulation, GDPR and implications for US policy, blockchain, and the role of trade organizations.

Editor’s Note: Last week, the ARF co-sponsored a major Town Hall event with GreenBook, focusing on ethics and privacy issues spurred by the recent Cambridge Analytica debacle. I attended and found the discussion both informative and passionate, a rare combination. It could have gone on far longer than the allotted two hours, so I am sure that this is just the start of a longer process. Given the interest aroused, we have two summaries of the event for our readers. This article is by Jeni Chapman of Insight and Impact Consulting, in which she raises issues of “insights ethics” beyond the realm of the traditional market research world.

The Facebook / Cambridge Analytica revelations have sparked a conversation across industry groups far and wide. Spurred by the level of dialogue and the import of the topic, the ARF sponsored a Town Hall Meeting on Research Ethics for the advertising and broader research industry, in partnership with GreenBook. They hosted the event on Thursday, April 26th at their Park Avenue South offices in New York, with a live stream for those who could not attend in person.

The room was packed with concerned industry practitioners who made time to attend the meeting in person; many others joined online as well.

The Why?

Scott McDonald, the President & CEO of the ARF, set the stage for the meeting by calling for a new set of industry standards to govern both data collection and consumer protection.

The What?

The What seems straightforward enough, but of course, it is not. Paul Donato, Chief Research Officer of the ARF, shared a matrix of the different types of trade organizations and the standards each has today to protect research participants. He found them fairly consistent, with the main principles being to maintain participants' anonymity, never re-sell their contact information, and never try to sell them something afterward. These standards, while in need of some rework to better address new digital research methods, are fairly straightforward.

But what are the standards on “respondent privacy” when we move beyond primary research – where there is a clear opt-in procedure and process that goes back many years? What is the role of trade organizations in providing standards for companies like Shareablee – whose CEO and Founder Tania Yuki joined the industry panel? Shareablee never interacts directly with a respondent; rather, it collects and aggregates data on millions of consumers and delivers those insights to clients. They are doing it and want to do so ethically, and they are eager to have published standards that have wide recognition by companies and clients alike. Today, they struggle to get clients to understand, or care about, the differences in how data is collected and how the metrics surrounding that data are calculated.

The How?

The discussion on how best to protect individuals’ data centered around the following key themes:

  1. Is government regulation needed for there to be meaningful change and for the protection of an individual’s data, or is self-regulation good enough? Is the Facebook / Cambridge Analytica affair just a case of one bad apple spoiling the barrel, or is it the tip of the iceberg?
  2. Should people be compensated for the passive collection of their data that is then monetized by companies, as a way of re-shifting the balance of power back to the individual? Is blockchain technology a way to accomplish this?
  3. Will the recently introduced European Union GDPR provide a roadmap that the US can follow for implementing a comprehensive approach to protecting the personal data of US citizens?
  4. What is the role or responsibility of industry and trade groups like the ARF, ESOMAR or the Media Research Council in developing standards to guide companies in the world of today, on ethical approaches to gathering passive data and creating metrics? And should these be guidelines or should they be a set of standards?
Government regulation – Yes or No?

The ARF invited Allie Bohm, Policy Counsel at the advocacy organization Public Knowledge. As the organization states, “Public Knowledge promotes freedom of expression, an open internet, and access to affordable communications tools and creative works. We work to shape policy on behalf of the public interest.”

Ms. Bohm helped us understand that the laws we are currently working under date from 1932, the 1980s, and the 1990s. Her point was clear: the current laws are out of date and out of context, as they do not address the new technology we are all using – the internet. She explained that Public Knowledge represents the consumer and, from its perspective and experience, believes nuanced regulation is needed for there to be meaningful change, since self-regulation is what brought about the Facebook / Cambridge Analytica debacle.

In general, Rolfe Swinton, Director, Data Assets, GfK, agreed with this point of view and shared that countries like Germany, which take privacy seriously while continuing to function well commercially, set a good example of what can be done.

Protecting consumers – is the alternative to regulation implementing technologies like blockchain? Should people be compensated for the use of data that is collected about them passively?

Tania Yuki, CEO of Shareablee, and Rick Bruner, Vice-Chair of I-COM, disagreed. Tania fully supported standards created by trade organizations, which would help her business justify the ethical approaches to data collection it is already taking. However, she was extremely concerned that the outcome of regulation might require every person whose data is used to be compensated in some form. Rick’s concerns centered more on the practicality of trying to regulate something that changes every day; people would perhaps be better served if they understood how their data is used and how it cannot be used.

For more information on blockchain, here is an article that talks about blockchain’s application in the video advertising space.


Ben Hoxie, Director of Product Management at mParticle, gave a good overview, making clear that the European Union has been working on this for many years and that one of the fundamental principles of its approach is recognizing that personal information and its protection is a fundamental human right. Per the GDPR website: “This Regulation protects fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data.”

Looking at personal data in this light really changes how you address the issue of privacy. He was confident that, while industry-specific updates will be made to the GDPR, the EU’s ability to collect fines will fund the policing of the policy, and the regulation is the work of a broad collaboration of experts and countries. His company is part of a consortium that has put together platforms to ensure that companies operating in the EU are GDPR compliant, and he felt the regulation would be implemented successfully and that life would go on for the ad tech and research industries. He did acknowledge that some businesses that relied on getting data in certain ways will no longer be viable under the new laws.

Rick Bruner shared that he was delighted that the EU would be implementing this first, and that the US could learn from the challenges implementation will present; such learning would further inform a sound course of action for the US. Bruner was skeptical, based on his 20 years of experience in ad tech, that trying to regulate this area too closely would be successful. He argued that, overall, it would be important to clarify the problem we are solving for. He shared specific examples of using data to manipulate opinions by expressly delivering incorrect or misleading messages or information to targeted audience demographics.

Role of Industry & Trade Associations in Setting Guidelines and Standards

There seemed to be a consensus that it is incumbent upon organizations like the ARF, the AMA and existing industry regulatory groups like the MRC[1] to actively be part of the conversation and provide guidelines for ethical insight delivery to clients.

One of the key things that I have learned from all of this is that Cambridge Analytica did not do anything technically wrong in using the Facebook data. Facebook was actively pushing and peddling these data to developers as part of its desire for developers to create cool stuff that drove and kept people within its walled garden. That is not to say what Cambridge Analytica did with the data was ethical. Questions around the retention of the data even after Facebook changed its policy, how the data was used, and allegations that the firm’s tactics actively aimed at misinformation are currently being investigated in the UK and here in the US.

What has been eye-opening is how cleverly Facebook is using Cambridge Analytica as a scapegoat, focusing everyone’s energy on them as the bad actors who took advantage of poor old Facebook. This “scandal” may actually help Facebook further increase its control of our data and make it harder for our individual data to be protected.

How are they doing this? For one, they are cutting off, or greatly limiting, the use of third-party data suppliers. This is important: third-party accredited data suppliers, oddly enough, help protect our privacy. For example, a JP Morgan Chase or a Target does not send its consumer data directly to Facebook to develop audiences for ad delivery. Instead, it sends the data to a company like Acxiom – as John Battelle describes in a terrific article on this – “a company that purchases rights to public or legally sourced information like mortgages and other data and creates privacy secure, anonymous “segments” that marketers can use to target customers across various digital channels – which until recently, included Facebook.”
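The anonymized-segment model described above can be sketched in a few lines: the broker replaces direct identifiers with salted one-way hashes and ships only segment membership, never the raw customer records. This is a minimal illustration of the general idea; the field names, salt, and segments are hypothetical, not any vendor's actual pipeline.

```python
import hashlib

def pseudonymize(email: str, salt: str) -> str:
    """Replace a direct identifier with a salted one-way hash."""
    normalized = email.lower().strip()
    return hashlib.sha256((salt + normalized).encode()).hexdigest()

def build_segments(records, salt="illustrative-salt"):
    """Group pseudonymized IDs into marketing segments, dropping raw identifiers."""
    segments = {}
    for rec in records:
        segments.setdefault(rec["segment"], set()).add(pseudonymize(rec["email"], salt))
    return segments

# Hypothetical input a brand might hand to a broker.
records = [
    {"email": "alice@example.com", "segment": "mortgage_holders"},
    {"email": "bob@example.com", "segment": "mortgage_holders"},
]
segments = build_segments(records)
```

The marketer receiving `segments` can target the "mortgage_holders" audience without ever seeing an email address, which is the privacy-protecting property the article attributes to accredited third-party suppliers.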

The fact that Facebook is shutting this down – and that Google is following suit in order to “comply with GDPR” – means we are essentially risking less transparency and less privacy protection rather than more of both. More and more control becomes centered within the walls of Facebook and Google. How can monopolies, in particular information monopolies of this scale, be good for consumers? Individuals? Businesses? Society?

As professionals in the advertising and insights business, we know there is a clear conflict of interest in having the house be both the buyer and the seller. Facebook sells your data to brands and it sells advertising to brands – so how is it motivated to safeguard our privacy? In addition, Facebook continues to talk about being a platform, but it is a media company. It is ABC, CBS, NBC, etc. all wrapped up in one and delivered to your mobile device instead of into a box in your living room. And we would never allow ABC, CBS or NBC to be the ones to dictate the currency of audience measurement. Instead, separate audience measurement providers (which, for all their faults, are independent of the media companies) determine the audience of these content providers, and those providers need to be MRC accredited if they want to become an industry currency or standard. While Facebook has finally started the MRC accreditation process, it still lacks accreditation despite the billions of dollars spent on advertising within its properties.

Facebook needs to decide whether it is going to be an advertising medium or a data broker – you can’t be both and do it ethically. Why? Because what has not been discussed so far in the Cambridge Analytica story is that while one part of Cambridge Analytica did this research project and created audience targeting models that it then monetized, another part of the company bought millions of dollars of advertising on Facebook on behalf of its political clientele. And given that Facebook’s ad revenues today represent about 20% of the total global advertising market, it seems likely it will continue in that business – the business of media and advertising. And if it is in that business, then it needs to be held to the same standards of quality that a local TV station in any town in the US would be held to regarding the products advertised on its station. You cannot advertise hate groups on TV – why can you on Facebook?

The ARF will be posting materials that were shared; here is the link to the site.

[1] MRC – an organization founded after a U.S. Congressional Committee held hearings in the 1960s on the accuracy of audience research for TV and radio. These hearings resulted in the formation of what is now known as the MRC, an industry-funded organization that reviews and accredits audience rating services.



Jeni Lee Chapman


Principal, Insights and Impact Consulting