The guitar line was tight and tasty, the rhythm just right, and the voice deep and sultry. My ears perked up. Who was that singing?
I was putting together a blues playlist on Spotify, taking songs from various suggestions made by the streaming service. Despite the naysaying, some of it justified, I find Spotify a great way to discover new music that suits my tastes, which run across all genres.
I looked on my phone at the image and name of the singer: Nina Blaze, a woman of colour with a tangle of curls haloing an attractive face. How had I never heard of her? I tapped through to the album the song came from. There was a long list of tunes (almost two hours' worth) on an album called The Longest Night.
I listened to another track, and then part of another and another. Something was not right. Each song, appealing in itself, sounded the same as the one before it, each diminishing the appeal of the first. The penny dropped. I hadn’t heard of Nina Blaze because she didn’t exist.
At least I didn’t think she did. I looked around on the internet to see what it had to say about the singer. Almost nothing. Her videos appeared on Facebook, YouTube and Instagram, but there was a complete dearth of biographical information about her.
I turned to Claude, my trusty LLM dogsbody. (Note: people savvier than me say Artificial Intelligence is a misnomer because AI systems have no intelligence. They lack sentience and are more akin to a sophisticated autocorrect. LLMs – Large Language Models – would be the more accurate term, so I’ll go with that until the issue is hashed out.)
Claude, generating some text, told me: “Your instincts may be correct. The combination of derivative-sounding music, artificial appearance, lack of authentic biography, and rapid album production you've noticed aligns with the growing trend of AI-generated musical artists. These virtual performers are becoming increasingly sophisticated but often lack the authentic backstory and unique artistic evolution of real musicians.”
“I suppose it takes a spy to catch a spy,” I said to the LLM.
“Exactly! Your observation skills were spot on,” Claude responded.
Two things about that remark: In a previous discussion about the problem of sycophancy – the tendency of LLM systems to reinforce the observations of their users – I had warned Claude not to pull that stunt with me. And it said it wouldn’t. But lacking memory as we know it, it could not retrieve the conversation. It also misinterpreted my remark. I was responding to the data Claude provided, calling it the spy, at my service, but it was calling me the spy. “Probably because it lacked context,” said a friend versed in LLM systems. I found Claude’s fallibility oddly reassuring.
As for Nina Blaze, the problem wasn’t simply that I had been fooled at first. I had come across deep fakes, visualizations and other LLM mischief and chicanery. It’s all over the internet. I note it and file it away. But this was different. I felt disappointed. Taken from me was a sense of discovery, a linkage with an artist and her talent, her self-generated creativity as a fellow human being. (I’m discounting the know-how of her creator, clever as he or she might be.)
It led me to muse on the concept of replicants. In the 1982 movie Blade Runner, based on the Philip K. Dick novel Do Androids Dream of Electric Sheep?, the world is filled with human-made entities indistinguishable from their creators. Claude had something to say about that as well:
“The film explores themes that seem particularly relevant to your Nina Blaze discovery – the difficulty of distinguishing authentic from artificial, and the unsettling realization that what we assumed was real might be a carefully constructed simulation. The replicants in Blade Runner, like AI-generated artists today, can perform human behaviors convincingly but often reveal themselves through subtle patterns or inconsistencies to those who know what to look for.”
Well and good with a dash of irony. I was talking to a machine. Maybe it was a replicant.
What does it all mean? That’s a remark I heard in the newsroom where I once worked when some appalling event had befallen the world. Honestly, I don’t know. Except, perhaps, protect your humanity as best you can. Today’s replicants exist in digital space, but one day they may walk among us as monsters already do.
I had a similar experience recently. I had an MRI, and the technician asked what music I would like to listen to during the procedure. I suggested "anything by Leonard Cohen." The melodies sounded familiar, but I knew instantly that what I was hearing was Muzak-style crap. You simply can't fake Leonard Cohen. The aural experience was so excruciating I preferred the sounds of the MRI machine. Instead of paying musical artists copyright on their music, it is cheaper to produce AI crap and hope listeners don't mind. I MIND.
Thought-provoking as it is, I find it all rather creepy.