Cannes Lions
WUNDERMAN THOMPSON, London / HSBC / 2023
Background
Fraudsters steal trillions of dollars every year, and their methods grow ever more cunning and sophisticated. HSBC wanted to help customers avoid fraud, but many consumers underestimate the risk and find it difficult to imagine fraud happening to them. In addition, many fraud victims feel such shame that they don’t share their experiences, fuelling the cycle of ignorance. Most banks only offer generic guidelines on how to avoid fraud, and the messaging is easy to ignore, like legal small print. The challenge was to create a cut-through campaign that teaches the public about the tactics fraudsters use and makes fraud a real and present danger rather than an abstract threat. HSBC have a wealth of knowledge on the subject of fraud, and one of the main objectives of the campaign was to help consumers understand the social engineering these criminals use to cheat people out of their money.
Idea
Most people think fraud will never happen to them because the criminals are invisible. Fraudsters hide behind their anonymity, using their voice as a weapon. To make the risk of fraud real, it needs a face. What if we could unmask these fraudsters and use their own voices against them? We used an AI that can predict what a person looks like from their voice data. Each voice has an ‘audible DNA’ that correlates to facial features such as the size of the jaw or the shape of the nose. These features, along with weight and age, can be determined with 80% accuracy. In a world first, we fed actual recordings of fraudsters’ scam calls into this AI and recreated their faces. These portraits were then brought to life as digital reconstructions and used in tutorial films that teach the public about fraud and finally put a face to a faceless crime.
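To make the ‘voice to face’ idea concrete, the sketch below shows roughly what such a pipeline could look like in Python. It is purely illustrative and is not Carnegie Mellon’s tool: MFCC statistics stand in for the ‘audible DNA’, a ridge regression stands in for the trained neural network, and the attribute names are hypothetical.

```python
# Hypothetical illustration only; not Carnegie Mellon's "voice to face" model.
# MFCC statistics stand in for the "audible DNA"; a ridge regression stands in
# for a neural network trained on thousands of voice/face pairs.
import numpy as np
import librosa
from sklearn.linear_model import Ridge

# Illustrative target attributes (jaw size, nose shape, weight, age, as in the copy above).
FACE_ATTRIBUTES = ["jaw_width", "nose_length", "weight", "age"]

def audible_dna(recording_path: str, sr: int = 16_000) -> np.ndarray:
    """Summarise a call recording as a fixed-length voice-feature vector."""
    y, _ = librosa.load(recording_path, sr=sr, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)            # (20, n_frames)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])  # (40,)

def train_face_predictor(voice_vectors: np.ndarray, face_measurements: np.ndarray) -> Ridge:
    """Fit a multi-output regressor on paired voice vectors and face measurements."""
    model = Ridge(alpha=1.0)
    model.fit(voice_vectors, face_measurements)
    return model

def predict_face(model: Ridge, recording_path: str) -> dict:
    """Predict coarse facial attributes for a single scam-call recording."""
    pred = model.predict(audible_dna(recording_path)[None, :])[0]
    return dict(zip(FACE_ATTRIBUTES, pred))
```

In the real project the predicted measurements were handed on to a 3D pipeline; in this sketch they are simply a dictionary of numbers.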
Strategy
Fraudsters notoriously hide in the shadows. They won’t show their real face or meet you in person, but they often use their voice to coerce you into doing what they want, through social engineering. Voice was the chink in the fraudsters’ armour, and thanks to Carnegie Mellon’s AI tool we were able to turn fraudsters’ voices against them. In a world first, we fed the voice data from actual scam recordings into this AI to predict each fraudster’s face. The use of voice data was important in creating a more authentic, meaningful connection between consumers and the threat of fraud, which is often perceived as ‘something that happens to others.’ The fraudsters’ data was harvested from publicly available sources and represented a range of ethnicities and genders.
Execution
To create the Faces of Fraud, we spent over a year collaborating with Carnegie Mellon University. This prestigious technology university has created a ‘voice to face’ AI trained on thousands of human faces and their corresponding voices. Voices carry such nuanced details about their speakers that the likelihood of two people having the same voice is one in a trillion. Each voice has an ‘audible DNA’ that correlates to facial features such as the size of the jaw or the shape of the nose. These features, along with weight and age, can be determined with 80% accuracy. The ‘voice to face’ AI has learned to predict the facial features associated with an individual’s voice, and the technology is so reliable that it is used by US law enforcement to narrow down suspects. In a world first, we fed actual recordings of scam calls into this neural network. The AI predicted facial skeletal measurements and tissue density using various computational algorithms, and a 2D picture began to emerge of each fraudster’s face. We then turned to Unreal Engine’s MetaHuman platform to translate each fraudster’s face into a highly realistic 3D model. We also used the voice data to predict the approximate dimensions of the room each fraudster was standing in when they made their calls, and chose lighting that simulated illumination from a mobile phone or computer screen. Using motion capture, the faces were then brought to life and scripted with dialogue. Never before have actual fraudsters been digitally reconstructed from voice data and brought to life in film. The campaign premiered on HSBC Australia’s social channels in October 2022. To coincide with International Fraud Awareness Week in November 2022, the work then expanded to the UAE on digital OOH, in branch, on HSBC’s global social channels and on a microsite with an associated CRM programme.
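The claim that approximate room dimensions can be inferred from the voice data can also be sketched in simplified form. The snippet below is a back-of-envelope illustration, not the production method: it estimates a reverberation time (RT60) from a decaying audio segment via Schroeder backward integration, then inverts Sabine’s formula for an assumed cubic room with an assumed average absorption coefficient.

```python
# Back-of-envelope illustration only; not the production method.
import numpy as np

def rt60_from_decay(decay: np.ndarray, sr: int) -> float:
    """Rough RT60 via Schroeder backward integration and a T20 line fit.
    `decay` is assumed to be a short audio segment right after a speech burst."""
    energy = decay.astype(np.float64) ** 2
    edc = np.cumsum(energy[::-1])[::-1]                 # energy decay curve
    edc_db = 10.0 * np.log10(edc / edc[0] + 1e-12)
    t = np.arange(len(edc_db)) / sr
    fit = (edc_db <= -5.0) & (edc_db >= -25.0)          # T20 region
    if fit.sum() < 2:
        raise ValueError("decay segment too short for a T20 fit")
    slope, _ = np.polyfit(t[fit], edc_db[fit], 1)       # dB per second (negative)
    return -60.0 / slope                                # extrapolate to -60 dB

def room_side_from_rt60(rt60: float, absorption: float = 0.3) -> float:
    """Invert Sabine's formula RT60 = 0.161 * V / (absorption * S) for a cube:
    with V = L**3 and S = 6 * L**2 this gives L = 6 * absorption * RT60 / 0.161."""
    return 6.0 * absorption * rt60 / 0.161

# Example: an RT60 of 0.4 s with an average absorption of 0.3
# suggests a room roughly 4.5 m on a side.
```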
Outcome
For Meta, on both ETB and NTB audiences Faces of Fraud exceeded benchmark video-through rates, achieving 18.85% and 21.87% respectively. On Twitter we achieved a VTR of 30.86% and the highest CTR in social of 0.29%. On TikTok we recorded our highest view-through rate of 32.2%, exceeding benchmarks by more than two times. Engagement rates were much higher than industry benchmarks: 0.05% for Facebook and 0.45% for Instagram. In terms of total views, we exceeded expectations while spending only 27% of the budget. On Snapchat we achieved a VTR of 18% while delivering all of our planned views. On YouTube we achieved a view-through rate of 66.5% versus a benchmark of 35%. HSBC also received more fraud-related enquiries during the series, according to our day-to-day community management.