Authentic Wirehead
Bhagpuss has a post out called “It’s Real if I Say It’s Real,” with a strong argument that while people say they desire authenticity in the face of (e.g.) AI-generated music, A) people often can’t tell the difference, and B) if you enjoyed it, what does it even matter?
It was the clearest, most positive advocacy of the wirehead future I’ve ever seen in the wild.
Now, speaking of clarity, Bhagpuss didn’t advocate for wirehead in the post. Not directly. I have no personal reason to believe Bhagpuss would agree with my characterization of his post in the first place. However. I do believe it is the natural consequence of accepting the two premises.
Premise one is that we have passed (and are perhaps far beyond) the point at which the average person can easily differentiate between AI-generated content and the “real thing.” Indeed, is there really anyone anywhere ready to argue the opposite? Linked in Bhagpuss’ post was this survey showing 97% of respondents being unable to tell the difference between human-made and AI-generated music across three samples. ChatGPT 4.5 already passed the classical three-way Turing Test, being selected as the human 73% of the time. Imagine being that other person the research subject was texting with, and getting so resoundingly rejected as human.
Then again, perhaps the results should not be all that surprising. We are very susceptible to suggestion, subterfuge, misdirection, and marketing. Bhagpuss brought up the old-school Pepsi vs Coke challenge, but you can also look at wine tasting studies where simply being told one type was more expensive led to it being rated more highly. Hell, the simple existence of the placebo effect should throw cold (triple-filtered, premium Icelandic) water on the notion that we perceive some objective reality, rather than, you know, just doing the best we can while piloting wet bags of sentient meat.
So, premise #1 is that it has become increasingly difficult to tell when something was created by AI.
Premise #2 is that we no longer care that something was artificially generated. For a lot of people, we are already well past this mile marker. Indirectly, when we no longer bother trying to verify the veracity of the source. Or directly, when we know it is AI-generated and enjoy it anyway.
I am actually kind of sympathetic on this point, philosophically. I have always been a big believer that an argument stands on its own merits. To discredit an idea based on the character of the person who made it is the definition of an ad hominem fallacy. In which case, wouldn’t casting aspersions on AI be… ad machina? If a song, or story, or argument is good, do its origins really matter? Maybe, maybe not.
Way back in my college days, I studied abroad in Japan for a semester. One thing I took was a knock-off Zune filled with LimeWired songs, and it was my proverbial sandbar whenever I felt adrift and alone. Some memories are so intensely entangled with certain songs that I cannot think of one without the other. One of my favorites back then was… Last Train Home. By lostprophets. Sung by Ian Watkins.
So… yeah. It’s a little difficult for me to square the circle that is separating the art from the artist.
But suppose you really don’t care. Perhaps you are immune to “cancel culture” arguments, unmoved by allegations of a politician’s hypocrisy, and would derive indistinguishable pleasure from seeing the Mona Lisa in person and from a print thereof hanging on your wall. “It’s all the same in the wash.”
To which I would ask: what distance remains to simply activating your nucleus accumbens directly?
What is AI music if not computer-generated noises attempting to substitute for the physical wire in your brain? Same for AI video, AI games, AI companions. If the context and circumstances of the art have no meaning, bear no weight, then… the last middle-man to cut out is you. Wirehead: engage.
…
I acknowledge that in many respects, it is a reductive argument. “Regular music is human-generated noises attempting to substitute for the wire.” We do not exist in a Platonic universe, unmoored from biological processes. Even my own notion that human-derived art should impart greater meaning into a work is itself mental scaffolding erected to enhance the pleasure derived from experiencing it.
That said, this entire thought experiment is getting less theoretical by the day. One of the last saving graces against a wirehead future is the minor, you know, brain surgery component. But what if that was not strictly necessary? What if there was a machine capable of gauging our reactions to given stimuli, allowing it to test different combinations of outputs in the form of words, sounds, and flashing lights to remotely trigger one’s nucleus accumbens? They would need some kind of reinforcement mechanism to calculate success, and an army of volunteers against which to test. The whole thing would cost trillions!
Surely, no one would go for that…
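(For the curious: here is a minimal, purely illustrative sketch of the kind of feedback loop described above, written as an ε-greedy bandit in Python. Every stimulus name and “engagement” number below is made up; the only point is that the loop needs nothing more exotic than a reward signal and an enormous number of trials.)

```python
import random

# Toy "engagement optimizer": no wire in the skull required, just a reward
# signal (clicks, watch time, session length) and permission to keep testing.
STIMULI = ["autoplay_video", "infinite_scroll", "ai_companion", "ragebait_headline"]

def simulated_reaction(stimulus: str) -> float:
    """Stand-in for 'gauging our reactions': returns a noisy engagement score."""
    base = {"autoplay_video": 0.4, "infinite_scroll": 0.6,
            "ai_companion": 0.7, "ragebait_headline": 0.5}[stimulus]
    return base + random.uniform(-0.2, 0.2)

def optimize(rounds: int = 10_000, epsilon: float = 0.1) -> dict:
    totals = {s: 0.0 for s in STIMULI}
    counts = {s: 0 for s in STIMULI}
    for _ in range(rounds):
        if random.random() < epsilon:
            choice = random.choice(STIMULI)   # explore: try a random stimulus
        else:
            # exploit: keep serving whatever already works best on average
            choice = max(STIMULI, key=lambda s: totals[s] / counts[s] if counts[s] else 0.0)
        reward = simulated_reaction(choice)   # the "reinforcement mechanism"
        totals[choice] += reward
        counts[choice] += 1
    return {s: totals[s] / counts[s] if counts[s] else 0.0 for s in STIMULI}

if __name__ == "__main__":
    # Converges on whichever stimulus best triggers the reward signal.
    print(optimize())
```

Whether this toy converges on the AI companion or the ragebait is just an accident of the made-up numbers; the loop itself does not care.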
Posted on November 21, 2025, in Philosophy and tagged AI, Bhagpuss, Nucleus Accumbens, Subjectivity, Wirehead. 4 Comments.
I remember reading Ringworld back when I was a young teenager and not really seeing what Louis Wu’s issue was, other than he couldn’t afford to stay jacked in. Other than that, I thought he should have stayed where he was, not gone gallivanting about all over the universe. (I may be misremembering the exact details, although I have read the novel again since then, just not in a decade or so.) On the other hand, I was quite irritated when Pohl and Kornbluth’s Wolfbane turned into an adventure novel. I wanted to read more about the Nine Boiling Stages of Water, which sounded a lot more interesting than all that running around in tunnels. I was clearly hard to please when it came to plot development back then.
I’m not sure, though, that the issues of authenticity inevitably devolve to a rat pressing a lever. It’s a spectrum, isn’t it? Or a continuum. One of the two. Anyway, it’s not a conveyor belt carrying us all remorselessly into the void. Probably.
As for who is going to argue the opposite, Rob Fahey at Gamesindustry was doing just that in an opinion piece earlier today. I’d link it but it has an insanely long URL. It’s called “AI assets are an unconscionable risk for premium-priced games” if you want to look it up. His argument is that there’s always a huge premium on authenticity and that customers can always tell the difference between a fake and the genuine article. Here are the three key paragraphs:
“Consumers crave authenticity. They value “realness”, and they’re willing to pay for it – whether it’s in food, in clothing, in experiences, or in anything else, including media, this is an absolutely fundamental truth of how consumers think.
People pay a premium for genuine brand goods over replicas; they pour scorn on knock-offs. “Hand-made” commands a higher price than the most precise and efficient machine ever will. We pay more for local crafts, for art and music that makes us feel a sense of connection to a creator, for food that feels authentic and “real”, connected to a place and a history.
It’s an instinct that suffuses all areas of consumer activity. From the ludicrous prices of a brand like Hermes to the hand-made souvenirs you overpaid for on holiday because they had a story to them; we crave authenticity. To most of us, there’s nothing more valuable you can be than “real”, and nothing worse than being “fake”.”
This is pretty much the opposite of what I was saying and I’m not at all sure Rob’s living in the same timeline as the rest of us. I think what humans actually value is their belief in authenticity, not authenticity itself. Otherwise, no-one would ever be fooled by a fake. So long as AI is obviously artificial, his argument may hold but how long will that last? A lot of people drew that line in the digital sand with music when CDs arrived and look how long that lasted.
Anyway, already much too long for a comment so I’ll stop. Interesting times we live in, eh?
The fun thing about wirehead is that it isn’t science fiction – we already did it for real on people a few times:
So we have that to look forward to.
I largely agree with you re: belief in authenticity. If I lost my wedding ring, I would be sad, even if I could replace it with an identical version. To me, it’s not just having a ring, it’s the continuity that grants it more meaning; having this particular ring on my finger for the last X years. Same as with an original Picasso or whatever versus an otherwise identical forgery.
…but that continuity is only ever implied. If someone swapped my wedding ring while I slept, and I never found out, it would still have the same “enhanced” value to me. Same with the forgery, assuming it is never revealed.
Having said that, authenticity only matters if it matters. Most people doom-scrolling just want to be entertained for a few hours before passing out. Continuity is the last thing on their mind. Once AI is “good enough,” the conveyor belt kicks into gear.
I feel like you’re conflating two separate issues here. To me, the problem with the “wirehead” future as criticized – first time I’m encountering the term, so going by the Wiki definition linked – is the potential for addiction by bypassing the middleman and directly stimulating the pleasure center with wires.
This addiction isn’t a new thing. We’ve seen criticism of internet addiction, MMO addiction, people who are glued to their phones, people spending ridiculous amounts on phone games, gacha games and other such items with predatory design, social media addiction, etc. The latest is now groups of people potentially addicted to AI and/or even skirting the lines on AI psychosis.
I also do think it’s a fair argument that the closer we get to directly and immediately stimulating those pleasure centers, the more and larger that “group of people” might become.
And then we have the issue of authenticity, whether it came from human or machine brain, or even a combination of both. That’s the “is it real, and does it matter” argument that people are struggling with and answering differently.
Both these issues are… not solved exactly, but addressed somewhat or confronted with conscious agency and critical thinking. Alas, we all also know that a fair amount of the human population will not apply these; but then again, we should also sit with the possibility that a fair amount will, as well.
The addiction portion of wirehead is the least interesting part. Well, I suppose it would be interesting to know whether it is even possible to resist the wire once it is turned on, but that’s the extent of it for me.
I’m much more interested in how people with “conscious agency and critical thinking” eventually logic themselves into the wire. To me, wirehead is the ultimate denial of reality, a complete capitulation of the self, the final form of total nihilism. So… if someone already knows that, why would they go all the way?
My argument is that if someone is fine with AI-created entertainment (premise #2), then there are few, if any, remaining off-ramps.