I've been sitting with these thoughts for months now. Turning them over. Attempting to smooth their edges. Typically, I aim to be more constructive than this. To offer solutions alongside problems. But sometimes, the most honest thing you can do is share how you feel.
So here goes.
This will likely upset some people. But I know others will whoop and punch the air. Such is the nature of strong thoughts and feelings. So here are mine, unvarnished.
The fundamental offence?
I’m appalled – genuinely, viscerally appalled – at the concept of creating a digital AI twin.
The idea of uploading your thoughts, feelings, voice and viewpoint into a black box algorithm and having it spit out "content" on autopilot so you can... what exactly? Grow on social? Get more dopamine hits? Establish yourself as a "thought leader" without actually having to think?
It offends me on a fundamental level because your voice and your view are not products to be commodified. They're not “assets to be leveraged”. What you say, why you say it and how you say it are uniquely shaped by your life experiences. They are yours – to be guarded, honoured and celebrated.
Not sold out. Not replicated. Not cheapened.
The end of original thought?
It offends me because it feels like we're collectively giving up on original, critical thought. Somehow, churning out endless content for likes has become more important than all the things that make us, well, us.
We're voluntarily buying into a future of AI-generated mediocrity, where genuine insight and hard-won wisdom are no longer valued. Where the work of articulating complex ideas clearly – that beautiful, messy, human process – is bypassed for the sake of efficiency.
Is this what we want? A world where the tough cognitive work of forming and articulating thoughts is outsourced to machines? Where the spaces between thinking, writing, editing and publishing – the crucial gaps where reflection, reconsideration and refinement happen – are eliminated in favour of volume and speed?
The internet’s death spiral
It offends me because I don't understand why we're so eager to push the internet's death spiral further and faster into absurdity. We’re all already totally overwhelmed by the noise. Why are we so keen to add more?
To use up a horrific volume of natural resources (AI models don't run on good intentions – they consume enormous amounts of energy), and to what end? To share substanceless, performative bullshit on LinkedIn so other people hoping to boost the performance of their own substanceless shit will like, share and leave a comment?
(This portion of the rant is inspired by Amanda Baker's recent piece on the creative deficit.)
The existential question
What the fuck are we actually doing? What is the point of all this? Can anyone give me a good answer?
(I personally lean towards the point being to make a few enormously wealthy people even more enormously wealthy.)
The one saving grace is the rise of long-form platforms like Substack. Perhaps there's a counter-movement brewing – people hungry for depth, nuance and genuine human perspective. Maybe we're not entirely doomed just yet.
But these platforms aren't immune. How long before they too are flooded with AI-generated essays, perfectly optimised for engagement but devoid of the substance that makes writing worth reading?
Questions to ask before creating your digital twin:
If you have created a digital AI twin or you're thinking of doing so, I urge you to pause and consider:
- Why? What's your actual rationale? Be honest with yourself.
- Misinformation: Are you comfortable with the risks of your twin generating false, potentially harmful information? What about perpetuating biases you may not even be aware you hold? Or adding new ones to the mix?
- Privacy: Can you say with any certainty what happens to the information you're plugging into that black box system? Can anyone?
- Plagiarism: How would you feel if someone ripped off what you said wholesale from LinkedIn or wherever you publish? How different is what you're planning to do here?
- Environmental impact: Does the end justify the natural resources you're about to consume? For context – I think there are many good uses of AI to solve some of our biggest challenges that justify the resource use. Sharing lightweight thought leadership for cheap dopamine hits is not one of them.
But it's just a tool?
“Calm down, it’s just a tool!” you might say.
But let me ask you this: if you think about it, like really think about it – is this a tool to augment and/or accelerate human creativity and potential? Or is it a tool to concentrate wealth and commodify said creativity and potential?
Could it be both things? I'm doubtful…
So… where does this leave me?
Where does this leave me (and others like me, because I know I'm not alone)? Maybe we'll be left behind. But left behind where, exactly? The direction all this is heading seems so futile that perhaps I don't even mind.
I'd rather stand firm in my conviction that some things – like authentic human expression – shouldn't be automated, than make the sacrifices required to keep up with this particular version of "progress." Whatever that really means.
Time to start living less online, and working out ways to make money in the real world? Maybe…
This piece represents my personal views on AI-generated content and digital twins. I welcome thoughtful disagreement and alternative perspectives – that's how we all grow – but I won't apologise for feeling strongly about this issue.