Not content to conquer Mars, Elon Musk recently set his sights on colonizing the brain. Musk's new company, Neuralink, announced in late March, plans to make devices that can be implanted in the brain to enable a direct link to computers.
Shortly after Musk's announcement, Facebook unveiled plans for technology that would let users type text with their thoughts. When we try to imagine personal technology a decade from now, this type of scenario ranks high on the list. Google futurist Ray Kurzweil predicted in March that in the 2030s, scientists will connect our neocortexes to the cloud, making us all smarter and funnier.
Those scenarios may still be a few years off. But the ability to read consumers' emotions with technologies such as facial recognition, MRI, and wearables is real. And brands could use these tools to serve consumers ads at the moments they're most amenable to seeing them.
How marketing is already reading minds in 2017
We've all had times when we're rushing to catch a train or hunting for information in a hurry, and ads are a nuisance. On the other hand, we've also all had moments when we're bored or hungry and wouldn't mind seeing an ad, particularly one offering a deal on dinner at a nearby restaurant.
While brands can time their ads based on a user's location and the time of day, they aren't yet able to tap into the most important factor: the user's state of mind. Research has shown that consumers are more receptive to advertising when they're tired. Other research has shown that reaching consumers when they're in an upbeat mood can increase the efficacy of digital ads by as much as 40 percent.

Several firms are already acting on these insights. Affectiva uses facial recognition to gauge consumers' emotions on behalf of clients including Mars (the candy brand) and CBS. TVision employs Microsoft's Kinect technology to see who's in the room when TV ads play and who smiles when an ad appears. Ad agency M&C Saatchi has also experimented with outdoor ads that use the expressions of passers-by as a sort of A/B test: the ads that get the most smiles stay.
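That smile-driven A/B test reduces to a simple comparison: count impressions and detected smiles for each ad variant, then keep the variant with the best smile rate. Here's a minimal Python sketch of that logic; the names (AdVariant, record_impression, winning_variant) are hypothetical, and the real systems built by these firms are proprietary.

```python
# A minimal sketch of a smile-based A/B test, assuming a separate
# facial-recognition pipeline emits one smiled/not-smiled signal per viewer.
from dataclasses import dataclass


@dataclass
class AdVariant:
    name: str
    impressions: int = 0
    smiles: int = 0

    @property
    def smile_rate(self) -> float:
        # Fraction of viewers whose expression registered as a smile.
        return self.smiles / self.impressions if self.impressions else 0.0


def record_impression(variant: AdVariant, smiled: bool) -> None:
    variant.impressions += 1
    if smiled:
        variant.smiles += 1


def winning_variant(variants: list[AdVariant]) -> AdVariant:
    # "The ads that get the most smiles stay."
    return max(variants, key=lambda v: v.smile_rate)


# Usage with made-up detections from two billboard creatives:
a, b = AdVariant("headline_a"), AdVariant("headline_b")
for smiled in (True, False, True):
    record_impression(a, smiled)
for smiled in (False, False, True):
    record_impression(b, smiled)
print(winning_variant([a, b]).name)  # -> headline_a
```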
In one test, MRI-based neuromarketing predicted sales more accurately than standard market research did. Reading the minds of consumers exposed to ads offers a window into those ads' efficacy, and that could be a game-changer for all kinds of marketers.
What the future of mind-reading ads might look like
It's easy to see where all this might go: Marketers could gauge a consumer's emotional state, determine whether they're hungry or tired, and serve up the appropriate ad. Perhaps Facebook will be able to go even deeper, read a consumer's thought that "I'd love a Whopper right now," and then flash an ad directing them to the nearest Burger King.
New wearables will get us even closer to that day, pairing EEG sensors that measure brainwaves with physiological markers like pulse that indicate when a consumer is receptive to messaging. As the science improves, marketers might be able to determine which of a consumer's emotional motivations, ranging from self-improvement to thrill-seeking, prompt them to engage with brands.
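To make that concrete, here is a hedged sketch of how wearable signals might gate ad delivery. The field names, thresholds, and the receptivity rule itself are illustrative assumptions for this article, not values drawn from any published EEG or heart-rate research.

```python
# A hypothetical receptivity check combining an EEG reading with pulse data.
# Thresholds below are invented for illustration only.
from dataclasses import dataclass


@dataclass
class BiometricSample:
    alpha_power: float  # EEG alpha-band strength, assumed normalized to 0-1;
                        # alpha activity is associated with relaxed wakefulness
    heart_rate: int     # beats per minute from a wrist wearable


def is_receptive(sample: BiometricSample) -> bool:
    # Invented rule: a relaxed, calm reading stands in for the upbeat,
    # receptive mood the research above links to better ad performance.
    relaxed = sample.alpha_power > 0.6
    calm_pulse = 55 <= sample.heart_rate <= 80
    return relaxed and calm_pulse


# Usage with a made-up sample:
if is_receptive(BiometricSample(alpha_power=0.72, heart_rate=68)):
    print("Serve the dinner-deal ad now")
else:
    print("Hold the ad for a better moment")
```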
It's kind of crazy how plausible this all is. Mind reading would be the pinnacle of achievement for marketers: instead of relying on tricks and pop psychology to get consumers to buy things, they could use scientific tools to read the consumer at a deep level. From there, crafting the message isn't all that difficult.
The obvious ethical questions
That scenario, of course, presents all kinds of ethical conundrums. Many consumers will bristle at the thought of having their thoughts and facial expressions examined, and they have a right to. This sort of emotional analysis needs to be permission-based.
But consumers have already traded their data and privacy to Facebook and Google in exchange for free services. Asking them to take another step in that direction wouldn't be unprecedented. On the other hand, going from actions to thoughts might be a step too far for consumers.
Given how fast this technology seems to be progressing, we should have an answer to that question soon enough.