It’s often said that a picture is worth a thousand words. Unfortunately, when it comes to a lot of the pictures of classic rockers currently circulating on social media, some of those words include “fake,” “AI,” and “slop.”
Many people are already well aware of the dangers posed by AI technology capable of producing vaguely lifelike renderings of real people doing things they’ve never actually done. But we’ve also been conditioned for generations to believe what we see in photographs, so it doesn’t take much for some fairly outlandish things to go viral.
And when you add AI video to the equation, credulity becomes even more of an issue — as a number of celebrities and their beleaguered reps have repeatedly learned in recent months.
From a certain point of view, a lot of these images and videos can appear harmless, if not humorous or even touching. If you aren’t concerned with how they might impact the people depicted in them — or with the bigger picture of what it means when we can’t trust anything we see — it can be difficult to understand what the big deal really is.
If, for example, someone wants to pretend that Lenny Kravitz invited an elderly concertgoer on stage to sing with him, is that really hurting anyone? Even if the post’s caption claims the woman in question had been a fan of Kravitz’s “since the 1970s”? Does it matter that Aerosmith singer Steven Tyler never actually “tore into ‘Sweet Emotion’” with a crew member who told him he’d “once shredded guitar in a scrappy garage band back in his youth”?
Well, yes and no. On one hand, it’s hard to begrudge anyone deriving a small bit of pleasure from the increasingly dire and ad-stuffed social media landscape — but on the other hand, every single one of these posts makes it harder to simply believe anything you see. And while it’s good policy to apply critical thinking to everything you’re told, it’s a bad sign when so much of what ends up in your feed is not only false, but looks just plausible enough to fool the viewer, even if only for a moment.
As Rolling Stone reports, these images have been gunking up an increasing number of algorithms lately, and gaining enough traction to earn widespread attention. That attention isn’t always positive — plenty of comments on the posts consist of angry warnings that the photos are fake — but that almost doesn’t matter, because any kind of engagement only amplifies their reach.
It’s become a booming business for social media content farms, and unlike spreading political disinformation, it’s still being pitched as something silly and fun rather than nefarious. (The fact that quite a few of these images purport to show various classic rockers and/or their spouses in the hospital is glossed over.)
As Justin Grome, founder of social media marketing firm Clonefluence, told Rolling Stone, “These types of posts definitely tap into nostalgia, and people want to believe these types of things. Even if they aren’t real, they’re wholesome. It’s not a case of fake news in the political sense. It’s not really meant to enrage people. It’s meant to comfort, which makes it even harder to combat, because who is going to question something like that that makes them feel good?”
Of course, it doesn’t always feel good if you’re the subject of the post in question — as Crowded House frontman Neil Finn can personally attest after finding himself the focus of a bizarre fake story claiming he’d recently fathered a new child at the age of 67 after experiencing erectile dysfunction for years. In this case, the “source” wasn’t an AI image, but a video of a news broadcast that never happened, which only increased the hoax’s reach.
Ultimately, the band was forced to release a statement that simply read, “We’re not sure where this came from, but please don’t be fooled. Neil’s never had trouble with erections.”
Since they’re the focus of all this flimflammery, you’d think classic rockers would be wholly opposed to it, but that isn’t always the case. Sometimes it’s in service of a “hey, remember when” aesthetic, as with the videos for Billy Joel‘s “Turn the Lights Back On” and the Rolling Stones‘ “Angry,” both of which used de-aging technology to make the artists resemble their younger selves, however briefly and uncannily.
Other times, it’s just as hokey and weird as anything you might see from a gullible relative or former co-worker, as when Rod Stewart made the decision to “pay tribute” to the recently deceased Ozzy Osbourne with a ghoulish, AI-created concert backdrop depicting the Black Sabbath frontman in “Heaven” taking selfies with Kurt Cobain, Freddie Mercury, Michael Jackson, and Tina Turner. (Also objectionable was Stewart’s pronouncement: “Very sad. A lot of those people died ’cause of drugs. I’m still here, though!”)
What might even be more troubling is the steady creep of AI-generated music. Listeners attuned to some of the shadier business practices employed by digital streaming services are likely already aware of the rise of “ghost artists,” or fake artist names used for songs cranked out by firms that are essentially music content farms.
They’re cheaper to license than legitimate recordings, and thus more profitable for the platforms, which has opened the gateway to “bands” like Velvet Sundown, the group that made headlines earlier this year for amassing more than a million monthly Spotify listeners despite not actually existing.
And “ghost artists” might be only the beginning. In July, Toto frontman Steve Lukather found himself in the irritating position of having to give a statement explaining that “Name This Night,” a new instrumental track that surfaced on streaming platforms credited to the band, wasn’t written or recorded by Toto at all. It was quickly taken down — by some platforms, anyway — but this is the type of whack-a-mole situation that requires time, attention, and resources that many artists and labels simply lack.
This particular case was easy to pick out not only because of Toto’s stature, but because “Name This Night” didn’t even sound like it came from the band. What happens, however, when we start seeing an influx of true soundalikes? Someone using AI tech to produce a “lost song” from a deceased or retired act is the stuff nightmares are made of if you’re committed in any way to protecting the artist’s estate.
But if you’re a fan, how hard are you going to complain about a piece of work that comes even a little bit close to making you feel the way you felt when you fell in love with that artist’s actual work?
Given that this technology is taking off at a moment when public trust in everything from news reports to vaccines is eroding at an alarmingly rapid rate, it feels like we could be in danger of losing touch with the very concept of a shared reality.
And the thing is, it’s still in its relative infancy — as AI sounds, videos, and images continue to improve, it’ll only become more difficult to sift through the slop and be confident that anything you’re seeing or hearing is real.
After all, when you can’t even trust a Prince quote, what can you trust?