The controversy surrounding Cambridge Analytica’s use of Facebook’s data – OUR data – to influence the outcome of the 2016 election has set off a firestorm of misunderstanding and misinformation. We are still learning the specific details of the tactics Cambridge Analytica deployed, so to keep us focused, I want to concentrate on how personal data is leveraged to influence or manipulate people, and on the impact this could have on our society and the industries that drive our economy.
As headline after headline rolls out of the media industrial complex, the world is finally learning what “big data” really means and how personal data can be leveraged to manipulate the behavioral outcomes of large population groups. In the Information Age, I would call this our moment of enlightenment. And I do believe this situation will continue to escalate and become an existential crisis for the research, data management, and media industries – a crisis that, if not properly addressed, will lead to the collapse of current business models and potentially entire industry sectors.
Every one of us who was an innovator or early adopter of behavioral targeting, data management, psychometrics, and social analytics is accountable for what has manifested today. It is in times like this that we must step forward as socially responsible mediators and practitioners to secure the future of the research, data management, and communications industries, and to secure a future of sustainable cultural discourse.
I have been working in the psychometrics space for nearly ten years, applying it to the practice of behavioral design and cultural transformation. Prior to that, I was in digital media. About eight years ago, I formed a joint venture with a social science research company to leverage its methodology and design a practice we called, at the time, “cognitive marketing.” Very quickly we realized the impact these methods could have in changing collective perception and behavior, ultimately creating cultural paradigm shifts through intentional design – and not necessarily for the better.
In 2012, I hit a moral crossroads in my career and realized it was time to get out of digital marketing and apply these practices in more socially responsible ways. In the process, I found purpose amid this moral dilemma. And now, entire industries are being confronted with this dilemma and it is time for everyone to self-reflect on their own values and purpose to determine what actions or inactions to take in shaping the future of the industries we pioneered with the greatest of intentions to create a better tomorrow. Well, tomorrow is now today.
For many of us who work in psychometrics and memetics, it was rather easy to project, months before the actual election, that Trump would be the winner. Purely through observation (without scientific validation), we could project the probable outcome based on the narrative framing of information, how the media was reporting, and how memes were propagating across social networks. Trump’s win was not only unsurprising; it was likely, given the evolving patterns of perception and the systemic interactions of our cultural system. To the trained eye, it was obvious this was by design – otherwise known as social engineering. Only now is the public learning about it and understanding how it was done. Yet most of the techniques Cambridge Analytica applied are the same practices (minus the alleged illegal ones) that sophisticated marketers employ every day.
The attacks on marketing methodologies and practices will come fast and furious, so it is best to get out in front of the situation. We need to recognize and appreciate the impact these revelations can have on the research industry, as well as on all media and marketing disciplines. The issues we are confronted with today are ethical and moral ones, and we have proven incapable of self-regulating. Many of us who were in digital media can recall that for years the FTC was present at digital media conferences, urging us to self-regulate before the government stepped in. Frankly, we did a terrible job. Well, the chickens have come home to roost.
Calls for self-regulation have been going on since the DoubleClick–Abacus project of 1999 (which I was a part of as well), when data layering was primitive compared to what we are witnessing today. This has been an evolving issue since the first cookie was dropped. We now appear to be approaching the precipice of widespread consumer activism and what I believe to be the inevitable regulation of data collection and management practices – regulation that will have dramatic economic consequences.
Additionally, we must separate the methodologies and platforms from the intentions of bad actors. Let us not throw the baby out with the bathwater. The use of data is very beneficial. But with big data comes big responsibility. What we are witnessing today is a taste of how social engineers can weaponize communication channels and platforms. This isn’t new; it has been going on for many years. But ignorance is bliss, and we have now been enlightened. So, we must take action.
This issue transcends all industries and pillars of society. I personally know someone in the military whose sole job is to understand the complexity of terrorist organizations and how they recruit and persuade people through similar social engineering techniques. Major investors use similar methods to influence market behavior. Companies use similar techniques to build brands. And yes, politicians use these techniques to win elections. Frankly, it is common practice. Unethical, perhaps. Illegal, no. In need of change, definitely!
So, my question to you is…
As industry leaders and pioneers, how will you take action and transform your practices so that we can secure the future of our industries and create a sustainable culture for all?