AI Hallucinations Are a Feature, Not a Bug
The quest for accuracy in AI is fascinating in a super nerdy way. Anytime I read about "evals", which is kind of the AI term for "testing", it feels like I'm either supercharging all my dormant left-brain cells that haven't done hard math since college or getting pulled into a deep philosophical debate: "what is truth, really..."
But what about the creative space? Do we always need extreme precision and accuracy?
It would certainly suck to watch a movie where the protagonist's face randomly changes and they look like a different person every 5 minutes. And in grounded stories, certain laws of physics must apply. Chances are the audience will be taken out of a drama if the furniture moves around on its own.
But in creative writing, the AI’s ability to deliver sheer volume and variation of ideas is kind of the killer app, IMO. Maybe the protagonist really is becoming a different person every 5 minutes, or maybe the furniture is alive. That’s all for the writer to decide.
Anyone who has done the classic writing exercises, like coming up with 100 possible obstacles a character might face on the way to their goal, knows how much of a slog that exercise is… but also how awesome the result is, if you can just get to the end. AI is great at generating large volumes of ideas.
And just the ability to riff on ideas and iterate from one to the next is amazing. I remember working on an idea for a script around late 2023, I think: a kids’ fantasy/adventure set in a world where Earth was on the brink of environmental collapse, so the trees decided to get up and walk, gathering together under the hole in the ozone layer to fix it. Something like that.
I remember the incredibly awesome rabbit hole I went down as I riffed with AI on that idea, coming up with things like: what if in the ancient past, the trees roamed the Earth like dinosaurs? Maybe there were flocks of roaming trees and tree herders? How fast do they move? Do they migrate seasonally? How would the different types of trees move differently?
Each weird tangent wasn't a mistake. It was just another idea to play with.
So it's interesting that AI is constantly critiqued for its inaccuracies and hallucinations, a critique that demands narrowing the possible outcomes down to the one correct answer, while what's even more fascinating from a creativity perspective is the exponential expansion of results from the vast mesh of connections the AI can draw on. Connections a human brain might take ages to slog through.
It's that expansion into a large volume of ideas, and the ability to wander through them like trees that have regained their ability to walk, exploring "what if" and "what might be" connections, that fuels my creative brain when I work with AI.
The hallucinations aren't the problem. They're kind of the whole point.