Tap Into These Emotions If You Want Your Ad To Go Viral

Every year, while two football teams face off in the sport’s top contest, marketers compete to win a Super Bowl of their own – investing millions into ads that will air on one of television’s biggest nights. But what does it take to craft an advertisement so engaging that viewers will choose to share it themselves, making it go viral? To explore this question, the author conducted a study using automated facial recognition technology to track reactions while people watched ads and identify the emotions that correlated with higher share rates. The study found that while some positive emotions made viewers more likely to share a video, negative emotions such as disgust also correlated with a greater likelihood that the viewer would share the ad. When it comes to engaging viewers, what matters is not just provoking positive emotions, but provoking activating emotions – and those can be positive or negative. Based on this research, the author suggests that marketers should aim to develop content specifically optimized to provoke these activating emotions, and that marketers and analysts can implement a methodology similar to the one used in the study to better understand how their content makes people feel.

This year, over 180 million consumers are expected to tune in to the Super Bowl – making it as important a night for advertising as it is for football. Brands will spend millions of dollars on single 30-second spots, hoping to craft a piece of content that won’t just be well-received when aired, but that will go viral, joining the cohort of top-performing ads that get more than twice as many impressions from social media as they do from TV.

Of course, going viral is no easy feat. What makes the difference between a typical ad and a video that’s so compelling and entertaining that viewers choose to share it over and over?

To find out, Microsoft Researcher Daniel McDuff and I conducted a study that used automated facial recognition technology to analyze how people reacted to different kinds of ads. We showed people video ads for everything from beauty products to pet care, Snickers to environmentally friendly paper towels, and we asked them how likely they would be to share each ad. As they watched each video, we used their webcams (with their consent) to track their facial expressions, and then a supervised machine learning algorithm categorized their expressions into different emotions.
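
To make the frame-by-frame step concrete, here is a minimal, hypothetical Python sketch of what such a pipeline can look like: detect a face in each frame of a participant’s recording, then hand the cropped face to an expression-feature extractor. The OpenCV Haar cascade detector and the extract_expression_features placeholder are illustrative stand-ins, not the specific tools used in our study.

```python
# Hypothetical sketch: detect a face in each frame of a recording and pass the
# cropped face to an expression-feature extractor. OpenCV's stock Haar cascade
# is a stand-in for whatever face tracker a real pipeline would use.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def extract_expression_features(face_img):
    """Placeholder: return intensities for smiles, brow furrows, etc."""
    raise NotImplementedError  # a real system would compute facial-feature movements here

def process_recording(video_path):
    """Yield one expression-feature vector per frame of a participant's recording."""
    capture = cv2.VideoCapture(video_path)
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces[:1]:  # keep the first detected face per frame
            yield extract_expression_features(gray[y:y + h, x:x + w])
    capture.release()
```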

This novel methodology had a few key advantages. First, while traditional facial recognition technology generally requires people to come into a lab and use specialized, expensive hardware, our approach allowed participants to use the webcams they already had at home (they simply clicked a link in an email to access the videos, completed a consent form giving us permission to record via webcam, and then filled out a short survey after watching each ad). This enabled us to reach a lot more participants than standard methods would have allowed and meant that participants had a more authentic at-home experience when watching the ads.

In addition, instead of needing researchers to manually view recordings of people’s faces and label their expressions, we leveraged machine learning to automate the entire process. We designed a custom system for detecting common facial reactions based on movement in key facial features (smiles, brow furrows, etc.), and trained the model on a set of pre-labeled examples to determine how often each participant expressed different emotions while watching the videos. This enabled us to test hundreds of videos on over 2,000 participants from around the world – a sample an order of magnitude larger, and far more geographically diverse, than similar studies have been able to achieve.
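
For readers who want to picture the supervised step, the sketch below trains a classifier on pre-labeled frames described by a handful of facial-feature intensities, then tallies how often each emotion shows up for a given viewer. The feature names, emotion labels, classifier choice, and randomly generated placeholder data are all assumptions for illustration, not our actual model.

```python
# Hypothetical sketch: train on pre-labeled frames described by facial-feature
# intensities, then tally how often each emotion appears for one participant.
from collections import Counter

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# X: one row per labeled frame, e.g. [smile, brow_furrow, lip_press, eye_widen].
# y: the emotion a human coder assigned to that frame. Both are placeholders here.
X = np.random.rand(5000, 4)
y = np.random.choice(["joy", "disgust", "sadness", "neutral"], size=5000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))

def emotion_frequencies(frame_features):
    """Share of a participant's frames classified as each emotion."""
    predictions = model.predict(frame_features)
    counts = Counter(predictions)
    return {emotion: counts[emotion] / len(predictions) for emotion in counts}
```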

So, what did we find? Unsurprisingly, emotional reaction played a major role in how likely a participant was to say that they would share an advertisement. But how exactly emotion impacted shares wasn’t always so obvious.

One might expect that people share ads that make them feel good, and they don’t share ads that make them feel bad. And to some extent, this intuition is correct: In our study, ads that appeared to evoke positive emotions such as joy were more likely to be shared, while ads that evoked negative emotions like sadness or confusion were less likely to be shared.

But there’s more to it than that. While some negative emotions, such as sadness, decreased sharing, others, such as disgust, slightly increased it. Why is that? The answer is that emotions are more complicated than just “happy” and “sad.”

In addition to feeling good or bad, emotions can also be characterized by how activating – or “physiologically arousing” – they are. Whether positive or negative, some emotions just fire us up. When we come across a snake while walking in the woods, or when we get into a fight, we go on high alert. Our heart beats faster, our pulse quickens, and we’re ready for action. The same is true for certain positive emotions: When our team scores a touchdown in the final seconds, or we find out we’ve been promoted, we can’t help but get excited.

Other emotions calm us down. Think about how you feel after going to the beach or watching a sunset. You feel good, but that feeling of contentment doesn’t make you want to do very much. Similarly, while anger might make us want to yell at someone or take some sort of action, a less-activating negative emotion like sadness just makes us want to curl up in a ball and do nothing.

My prior research has shown that the more an experience provokes activating emotions, the more likely people are to share it. And our results were in line with this: Provoking positive, activating emotions increased the likelihood of sharing the most, but when an ad provoked a negative activating emotion (i.e., disgust), viewers were also motivated to take action and share the video – even though it didn’t make them feel good. Conversely, emotions that were less activating (i.e., sadness) reduced the likelihood that the viewer would share the ad.
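
To see the shape of this kind of analysis, here is a hypothetical sketch of regressing stated share intent on how often each emotion appeared while a participant watched an ad. The file name, column names, and use of logistic regression are illustrative assumptions, not our actual analysis code.

```python
# Hypothetical sketch: relate stated share intent to per-emotion frequencies.
import pandas as pd
import statsmodels.formula.api as smf

# One row per (participant, ad): emotion frequencies plus a 0/1 "would share" answer.
df = pd.read_csv("emotion_and_share_responses.csv")  # hypothetical file

model = smf.logit("would_share ~ joy + disgust + sadness", data=df).fit()
print(model.summary())
# The pattern described above would show up as positive coefficients for
# activating emotions (joy, disgust) and a negative one for sadness.
```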

“Good” Isn’t Good Enough

These findings have a number of implications for marketers. First, it’s not enough to make people feel good about your brand. Too often, content creators think that if they can just make customers feel positive about their products, services, or brands, then people will share those products with their friends. But as our results show, “good” isn’t enough. You need to fire your customers up. If you want to stick to positive emotions, go further than contentment and create content that excites, inspires, and delights your customers – that is, content that evokes activating emotions.

“Bad” Isn’t Always Bad

Second, although we tend to shy away from negative emotions, going negative isn’t always a bad idea. We may assume that if a commercial makes viewers feel disgusted, for example, they won’t share it. But when used correctly, negative emotions can actually be a powerful way to engage your customers.

For example, ads that make people feel angry by illustrating injustice, or that make them feel anxious or grossed out by describing the health risks of disease, may incite them to take action by sharing the information with others (over the last few weeks, we’ve seen all too well how powerful disinformation can be when it inspires fear and anger – but similar emotions can be harnessed for good as well). On a lighter note, one of the most successful ads of 2020 was Dashlane’s “Password Paradise,” which compared the frustration of losing your password to being stuck in Dante’s inferno.

Do Try This At Home

Third, while we analyzed ads from a wide variety of industries in order to explore the general link between emotional reactions and share rates, marketers and analysts can adapt our methodology to gain insight into how viewers react to their specific content. Of course, traditional market research methods such as surveys and focus groups can also be effective in some cases, but they have some serious limitations.

For one, simply asking people how they feel doesn’t guarantee accurate information: people may be biased or uncomfortable with their own emotions, or they may be unsure how to verbalize their feelings. In addition, focus group responses can be shaped by social influence, further impacting their reliability. Plus, these standard tools require a lot of manual, in-person work, meaning that it can be quite costly to scale them to any useful degree. In contrast, automated facial recognition using home webcams can help marketers measure consumer reactions in a manner that’s effective, unobtrusive, and highly scalable.
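
As a rough starting point, the sketch below shows one way an analyst might turn frame-level emotion classifications and post-viewing survey answers into an ad-level report. The file names and column layout are assumptions, not a prescribed format.

```python
# Hypothetical sketch: summarize how often viewers showed each emotion per ad
# and see how those frequencies line up with stated willingness to share.
import pandas as pd

frames = pd.read_csv("frame_level_emotions.csv")   # hypothetical: ad_id, participant_id, emotion
surveys = pd.read_csv("post_view_surveys.csv")     # hypothetical: ad_id, participant_id, share_intent (1-7)

# Fraction of each participant's frames spent on each emotion, per ad.
emotion_share = (
    frames.groupby(["ad_id", "participant_id"])["emotion"]
    .value_counts(normalize=True)
    .unstack(fill_value=0)
    .reset_index()
)

report = emotion_share.merge(surveys, on=["ad_id", "participant_id"])
# Which emotions move with share intent for your content?
print(report.drop(columns=["ad_id", "participant_id"]).corr()["share_intent"].sort_values())
```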

That said, these tools certainly have limitations. Researchers have documented racial and cultural biases in facial recognition programs, which can make it challenging to accurately (and ethically) measure responses across a diverse consumer base. In addition, even aside from broad cultural differences, the link between facial expression and underlying emotion can vary significantly and unpredictably across individuals (for example, some people may be more outwardly expressive than others, but that doesn’t necessarily reflect their true emotional state). Our methodology focuses on recognizing facial expressions, so it doesn’t fall prey to some of the pitfalls of systems that attempt to use similar technology to recognize specific faces, but it’s still important to be aware of these potential issues. As with any technology, tools that infer emotions from facial expressions aren’t perfect – but armed with an awareness of their shortcomings, marketers and researchers will be well-positioned to leverage these tools for good, gaining insights that empower them to offer content that’s as impactful as possible.

Originally posted on hbr.org by Jonah Berger

About Author: Jonah Berger is a professor at the Wharton School of the University of Pennsylvania and the author, most recently, of The Catalyst: How to Change Anyone’s Mind (Simon & Schuster, 2020).