CTPerspectives: Deep Fakes

Celebrities and prominent figures of all kinds have recently found their identities used in ‘deep fake’ content. Perhaps the most popularly deep-faked celebrity is Tom Cruise, who can be seen on TikTok doing everything from magic tricks to working on his golf swing. The problem is – it isn’t really him. A neural network was fed several hours of real video footage in order to gain a realistic ‘understanding’ of his face and mannerisms, making it possible to recreate him in different situations.


As you can imagine, there are several implications to this technology being used to recreate the likenesses of celebrities. Some are harmless, like Hulu using athletes’ likenesses to create its ‘Hulu has live sports’ spot without assuming the risks of in-person production during a pandemic. But there are certainly frightening possibilities as well. So we asked our CTPers what they think some of the key implications may be for our field.


The deep fake fad is nothing new – think of the 2012 Tupac performance at Coachella. We’re now seeing it affect not just celebrities, but also consumers directly. A recent example is the MyHeritage app, which makes old photos seemingly come to life.


Moving forward, I think the “using my likeness” aspect of a celebrity’s contract is going to be extremely important. It’ll be interesting to see how less scrupulous brands try to get around it (think of a store-brand Brad Pitt). Another interesting question is what happens when those contracts expire and a likeness might be considered fair game as part of the creative commons. Nostalgia has been a major marketing tactic of late, and deep fakes will give marketers and companies a new ability to capitalize on that trend. Imagine an image of Anne Frank saying she was a Belieber, or Einstein speaking about how amazing Tesla is. More discerning consumers would be able to spot the fake, but others not so much. With the landscape ripe for disinformation, consumers will need to be more aware than ever of the content they’re consuming.


This will be a fascinating legal issue, and I imagine it will make its way to SCOTUS. – Carissa Ryan, Account Supervisor



I think it will continue to evolve. This started 20 years ago with Photoshop. We’ve simply reached the processing power needed to run algorithms that make these deep fakes amazingly convincing. For the most part, I think this is harmless. You could get into scary areas with politics, but I see this evolving more toward Kurt Cobain selling you Smucker’s jam. Imagine John Lennon selling you a Coke. It’s the likeness that consumers trust and grab onto. Now, re-imagine a celebrity licensing their image to a deep fake company to sell a product they don’t even have to be on set for. That’s where the licensing of images and likenesses can be profitable from a deep fakes POV. – Will Claflin, Director of Creative Content



Much like other digital platforms, trends and technologies that continue to proliferate, there’s no black-and-white answer here. Rather, it’s complex and multifaceted – filled with perhaps limitless potential while fraught with immense danger.


Deep fakes, and the deep learning technology powering them, can open amazing new avenues for content that marketers can use to engage with their audiences – content that’s more dynamic, hyper-targeted and even truly personalized. They can expand experiential marketing into virtual environments. They can make celebrity influence more accessible – and forever available. Consider a spot featuring every member of The Beatles, mentioning you by name, and promoting the latest product launch from Budweiser.


While this technology is potentially exciting, fun and powerful, brand marketers must really understand the brand and legal implications of uncharted technological territory like this. Trust and transparency with your consumers are paramount – now more than ever. Your brand is at the mercy of those representing it. And while accessibility creates potential, it will also translate into more clutter. – Todd Graff, SVP, Public Relations



Spokespeople could be hijacked. Products could be fake. Anything could happen. Advertising could turn into TRON. The lines could become so blurred that we won’t be able to distinguish reality from fake reality, or even know who is marketing to us. – Andrea Lenig, VP Media Director

See our work.

Partner with us.
Partner with us.