Will Smith posted a video on social media that shows oceans of fans cheering him on during his recent European tour.
“My favorite part of the tour is seeing you all up close,” the caption says. “Thank you for seeing me too.”
In these thousands-deep crowds, some fans are holding up signs espousing their love for Smith, with one even saying that his music helped them survive cancer.
But the video gives off an odd aura — it looks believably real at first glance, until you look closer and find digitally mangled faces, nonsensical finger placements, and oddly augmented features across the series of clips.
The video looks strange enough that fans responded with accusations that the crowd footage was created using AI. It’s bad news for Smith, who’s already suffered reputational damage after “the slap.” If he were using AI to make his concerts look more impressive, or even spinning up stories of fans using his music to cope with cancer treatment, that would be pretty indefensible.
These fans aren’t fake, though — or at least, that’s our best guess. (There’s not a reliable way to determine whether or not content was created using AI, which has made the current online landscape a nightmare of misinformation.)
As tech blogger Andy Baio pointed out, Will Smith has posted photos and videos throughout his tour that show some of the same fans and signs depicted in the questionable video.
There’s nothing about these older posts that indicates that the photos and videos are synthetic, yet when they’re depicted in this new video, they look like they’ve been generated using AI. It seems like Smith’s team has collaged real footage with AI-generated videos that use real crowd photos as source images, which makes the video even more difficult to interpret.
But social media audiences will not take the time to scroll through past Will Smith posts, find evidence that a fan really did listen to his music during cancer treatment, and give him the benefit of the doubt. What fans will take away from the post is that Smith is posting fake videos of his fans, which is deeply cringe, even if the reality is a bit less egregious.
It’s bad timing for Smith, too: YouTube recently began testing a feature that uses “traditional machine learning technology to unblur, denoise, and improve clarity” on some Shorts posts — and these edits made Smith’s YouTube Short look even more fake than the versions on other platforms.
YouTube’s creator liaison Rene Ritchie has since shared that the platform will soon allow creators to opt out of this feature, which has proven unpopular thus far.
You could make the argument that Will Smith has not duped his fans — that his team simply used AI to generate footage from photographs to create a more visually gripping social media post, and that this practice could be compared to other forms of video editing.
Fans don’t see it this way, though. The public is more resistant to generative AI than to established creative tools, like autotune or Photoshop. But even with those tools, many fans remain turned off by artists who rely on them in ways that feel untruthful.
If fans buy tickets to see a pop star, only to find that his recordings sound good because his voice has been heavily auto-tuned, they’d feel duped. It’s like photographing a model to advertise a facial moisturizer, then editing the acne off the model’s face.
Once an artist breaks their audience’s trust, it’s hard to win it back — even if you’re the Fresh Prince of Bel-Air.