The Ad Industry Needs to Figure Out How It Feels About Reading Consumers' Emotions
Opinion: What would people think about the way marketers use such data?
By Guest Author
The technology that lets advertisers read consumers' emotions has arrived. Now comes the hard part.
Facebook's Cambridge Analytica incident was a Russian nesting doll of scandals. The most tangibly galling aspect was the misuse of consumer data. But further reporting showed that the firm also used data about consumers' emotions to target messages that preyed on users' anxiety or depression. It was manipulation, in other words. If Cambridge Analytica found that a Facebook user was prone to anxiety, it served that user ads designed to exploit the anxiety. If a user was afraid of crime, the user saw ads hyping those dangers.
Imagine a continuum of the use of emotion in advertising. Cambridge Analytica sits at one far end, in territory most marketers would say goes too far. At the other end, perhaps, is a Coca-Cola ad that aims only to put a smile on your face. Somewhere along that continuum is a line. The industry needs to find it.
The need is pressing because technology that can identify emotions is already here. Facial recognition can not only identify consumers but also read their emotions. As more data becomes available, artificial intelligence systems will be able to predict our emotions from what we're reading and from factors like our age, sex and the time of day. Retailers and banks already analyze the way we type; for now they use such biometric data for identification, but it can also reveal our emotional states. Customer service software can read emotion in a caller's voice.
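To make that contextual-prediction claim concrete, here is a minimal sketch of what mood inference from everyday signals might look like. Everything in it, from the features to the synthetic data to the labeling rule, is invented for illustration; it is not any vendor's actual system.

```python
# Illustrative sketch only: a toy classifier that guesses a reader's mood
# from contextual signals like those named above (age, time of day, what
# they're reading). The data and labeling rule are entirely invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(seed=42)

# Synthetic training data: columns are [age, hour_of_day, hard_news_score],
# where hard_news_score in [0, 1] measures how grim the content is.
X = rng.uniform(low=[18, 0, 0], high=[80, 24, 1], size=(500, 3))

# Invented labeling rule: late-night readers of hard news skew anxious.
y = ((X[:, 1] > 21) & (X[:, 2] > 0.5)).astype(int)  # 1 = anxious, 0 = upbeat

model = LogisticRegression().fit(X, y)

# A hypothetical consumer: 34 years old, reading crime news at 11 p.m.
consumer = np.array([[34.0, 23.0, 0.9]])
print("P(anxious):", model.predict_proba(consumer)[0, 1])
```

Even a sketch this crude shows why the ethical questions matter: the inputs are mundane, but the output is an inference about a person's emotional state.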
Such analysis is very good in 2018. By 2028 it will be spectacularly good. Meanwhile, the ad industry has no code of ethics for reading consumer emotions. Any code should answer these five questions:
Are negative emotions fair game? If you can tell that someone is in an upbeat, happy mood, your pitch is likely to resonate more. A recent study showed that people are 24 percent more receptive to advertising overall, and 40 percent more receptive to digital advertising, when they're in a good mood. As Cambridge Analytica illustrated, though, marketers can also harness negative emotions to prompt action. Is it OK to hit someone who is depressed with messages that will "speak" to them and possibly deepen that depression?

Is it OK to read emotions in public? From an advertiser's point of view, A/B testing billboards sounds like a great idea: facial recognition can tell who smiled at each ad, and the ad that gets the most smiles wins (see the sketch after these questions). But should the consumers taking part in this real-time marketing research be notified of their participation?

Should marketers be able to compile a detailed psychographic of a consumer? Marketers already look at things like interests and affiliations to build a portrait of a consumer. But is it OK to also map their emotional makeup and conclude that they're introverted or extroverted, jovial or melancholy? Given how strongly emotions correlate with advertising success, doing so is a temptation for marketers. But what would consumers think about the way marketers use such data?

What type of emotional manipulation is acceptable? Advertising is rooted in emotional manipulation, so taking that out of the equation is a non-starter. But is it all right to craft emotionally resonant messages after the nation suffers a major tragedy, for example? What about after your favorite team wins the Super Bowl?

What kind of disclosure would work with consumers? The European Union's General Data Protection Regulation and ePrivacy rules give consumers control of their data, but once consumers let marketers use that data, do they know how it's being used? If not, is there a form of disclosure that would inform them that data about their current and long-term emotional states will shape the marketing messages they see?
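As one concrete illustration of the billboard scenario above, here is a minimal sketch of a smile-count A/B tally. The detect_smiles function and the frame data are hypothetical stand-ins; a real deployment would feed live camera frames through an actual facial-expression model rather than pre-labeled lists.

```python
# Illustrative sketch only: tallying smiles per billboard creative in an
# A/B test. detect_smiles() is a hypothetical stand-in for a real facial
# expression model; the "frames" are faked so the script runs end to end.
from collections import Counter
from typing import List

def detect_smiles(frame: List[str]) -> int:
    """Hypothetical detector: counts faces labeled 'smile' in one frame."""
    return sum(1 for face in frame if face == "smile")

# Faked camera frames, keyed by billboard variant.
frames = {
    "variant_a": [["smile", "neutral"], ["smile", "smile"]],
    "variant_b": [["neutral"], ["smile", "neutral", "neutral"]],
}

tally = Counter()
for variant, variant_frames in frames.items():
    for frame in variant_frames:
        tally[variant] += detect_smiles(frame)

winner, smiles = tally.most_common(1)[0]
print(f"{winner} wins with {smiles} smiles")  # variant_a wins with 3 smiles
```

Note what the sketch leaves out: none of the passersby whose faces feed the tally ever agreed to participate, which is exactly the disclosure question raised above.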
There are no easy answers to these questions. As the technology progresses, though, the industry needs to start asking about the ethics of using emotion-based data. Better to have this conversation now than to have it forced on us later when another scandal breaks. In the end, talking about the issue now will lead to a happier outcome for everyone.