AI Is Making Me Obsolete, But It Can't Replace Human Creativity—Yet

Summary: The rise of AI has brought both awe and dread to the creative world. While AI can streamline tasks and generate content, it poses a significant risk of stifling human creativity. As AI tools become increasingly sophisticated, writers—and, by extension, their readers—risk losing the deeper critical thinking and emotional understanding that makes writing an art and sets human-created art apart. Meanwhile, as AI takes over more tasks, we risk societal intellectual stagnation and a further decline in critical thinking skills, which are already waning due to numerous other accelerating economic, social, and political forces. Ultimately, this helps to create a society that is easier to manipulate and control. Arguably, this may already be true.


Last month, a client asked me to write technical documentation for their new software product. After two decades of making a living as a writer, I instinctively reached for my notebook and began formulating questions to outline the project. Then I paused. My fingers tightened on the pen, my hands hovered over the keyboard, and a thought struck me: Why bother? AI could write a first draft faster than I could even create an initial outline. And it did.

As someone who works as a screenwriter, copywriter, and technical writer, I've watched AI transform my profession with a mix of awe and dread. I've seen AI write decent movie dialogue, craft compelling ad copy, and generate clear technical documentation. For the past two years, I’ve also worked in the AI space—mostly out of necessity, but, on the better days, out of curiosity. Among other positions, I've worked as an AI trainer and internal quality assurance auditor on contracts associated with Google’s and OpenAI’s public and future large language models. Most days, it felt like I was training my replacement. Because I was.

But here's what terrifies me more than AI potentially taking my job: I've started letting it do my thinking for me. More terrifying still, that’s the point.

It began subtly. Instead of wrestling with a tricky bit of dialogue, I asked ChatGPT for suggestions. Rather than carefully structuring a user manual, I asked Gemini to generate an outline. When a client needed ad copy variations, I'd feed prompts to Claude, ChatGPT, or Gemini until something bloomed. Then I'd choose my favorite flower and let the others die on the vine.

Ideas are no longer precious. Great ideas will always be valuable, but ideas in general, good, bad, mediocre—they’re more available to people than ever before. Ideas were once rare butterflies we had to chase and capture with careful precision; if we managed to capture them at all, we had to save and nurture them. Now, they're mass-produced paper cutouts that AI spits out by the thousands: mostly reminiscent of something you half-remember half of, usually disposable, requiring no effort to catch or nurture.

The efficiency is intoxicating. Why spend hours thinking about something, then crafting something, when AI can generate a workable version in seconds? And let's be honest, workable is often good enough. Great is the enemy of good, anyway, so... why not? But with each shortcut, I felt my creative muscles atrophy. The hard work of writing—the deep thinking, the careful consideration of word choice, the subtle layering of meaning, subtext (!), context (!)—all of it was being replaced by prompt engineering and content curation.

Last week, while working on a screenplay, I caught myself in an absurd loop: asking one AI to generate plot ideas, another to critique them, and a third to refine the dialogue. An experiment, I lied to myself. I’d used them enough to know which can do what best, after all. Finally, I came clean. I wasn't writing anymore. I was orchestrating machines to write for me. I was a... product manager? No, I was something else. I didn't feel like a writer; I was an editor at best, a stan writing fan fic at worst.

That's when I realized what we're really losing: the very concept of writing itself. It's something I saw over the last two years training LLMs and quality-assuring the real, human, hourly-wage, underappreciated work that makes these billion-dollar models even usable enough to convince people that these machines have “intelligence.” From the inside, the goal is simple: make writing into math. But writing isn't about producing words as correctly and predictably as math produces verifiable numbers, proofs, and theorems. Writing is about the thought process behind those words, the context of the words in relation to themselves, the situation in which those words will be read, the audience who will be reading them, and....

It's about understanding and being understood. Writing is necessarily human. That's its most important feature, not a bug to overcome.

When I write technical documentation, I'm not just explaining features; I'm thinking deeply about how users will interact with the product and why they are consulting the documentation in the first place. When I write ad copy, I'm not just stringing together persuasive phrases or SEO keywords; I'm drawing on human psychology and motivation, attempting to find not just what has worked before, but what will work next: the idea that has never worked, that trends and logic say won't work, but still does. When I write screenplays, I'm not just crafting interesting dialogue or intriguing plots or unique characters; I'm exploring the complexities of human experience based on human experience, not on what words have most often followed other words according to an incomplete and, often, dubiously procured sample set. All of these scenarios rely on context, reason, logic, understanding, empathy, and humanity. Above all, good writing requires emergent thinking.

AI, in its current forms, cannot achieve emergent, novel thought because it cannot reason from experience; it has none, past, present, or future. While engineers desperately wish writing to be mathematics, it just isn't. Writing is a conduit for communicating experience through information in unique, ever-evolving ways. Without experience, which AI does not and cannot yet have, there's only information left. Information is important, but it's only half of the equation. Unfortunately, it's the less important half, too. Emotion is the sugar that helps the medicine of information go down. Humans are driven by emotion far more often than they are swayed by facts. This should be evident to every person observing the current state of the United States.

AI can simulate emergence, but it can't accomplish it. It can't truly understand user frustration or human desire or emotional, ecstatic truth. It can pattern-match its way to competent writing, even interesting, entertaining content, but it can't bring with it the depth of human experience and understanding that makes writing, and the communication of ideas through writing, powerful.

Yet, increasingly, I see fellow writers surrendering not just the task of writing, but the thinking behind it. I see us, myself included, not only surrendering but training these machines to be the version of us that so many in our lives, personal and professional, have wished we would be: easier to control, predictable, and writing for the lowest common denominator at the lowest possible price (free). We are sharpening their swords even as they're raised to strike.

People don't know what they want; that is why creativity and art are so powerful—and, to those in power, so dangerous. Art gives people something new that they had no idea they wanted but desperately needed. AI gives people what it knows people have wanted in the past, but little of what we really need. Writing, until now, has always been additive; sometimes positive, sometimes negative, but always created from nothing and made into something. That’s why AI-generated writing is so often boring, unfulfilling, and hollow: it is not additive. It’s inherently backward-looking, reliant on remixing what’s come before without the ability to sublimate that remix into more than the sum of its parts. For me, AI-generated writing will always be this: 2 + 2 = 4, while human-generated writing, though rare, can achieve this: 2 + 2 = 5.

These AIs are sold to us under the moniker of "intelligence," so most people rightfully assume they are intelligent. And isn’t that, right there, the power of writing? (And the power of irony.) The advertising world might be the greatest offender so far. I've sat in meetings where creative teams skip the brainstorming process entirely, jumping straight to AI-generated concepts written by the space where a full-time copywriter used to sit. The resulting campaigns are polished, unnaturally confident, and mostly coherent, but predictable, factually dubious, and lacking the creative spark that comes from human minds colliding, collaborating, and reasoning. Again, without emergence, novelty, and creativity, these LLMs can only say what has already been said in a diminishing variety of ways. In an industry where attention is everything, being boring, unfulfilling, hollow, and redundant is a death sentence. And maybe, just maybe, it’s how we ended up with two Super Bowl commercials apiece about flying facial hair and Goldilocks and the Three Bears…

Don't misunderstand—I still use AI tools. It would be professional suicide not to. But I've established strict boundaries. AI can help me refine ideas, not generate them. It can help me outline, but rarely compose and never polish. It cannot feel and often forgets we humans can. Most importantly, it can't replace the thinking that happens before I ever touch a keyboard. Because, again, it cannot think.

This means I sometimes work slower than colleagues who fully embrace AI. I spend time thinking through problems that AI could "solve" instantly. I write rough drafts that AI could generate more quickly and cleanly. But here's what I've preserved: the ability to think deeply about my craft and the world. The satisfaction of solving creative problems, and the necessary frustration when I can’t, yet. The joy of finding the perfect word or crafting the perfect scene before rewriting it all over again. The human understanding that comes from deeply engaging with my work. For me, good enough is the enemy of great, and if it's not going to be great, why do it at all? I don't expect AI to understand that. It's irrational. It's human.

Recently, I ran an experiment with my writers' group. I asked them to compare two versions of a scene—one I wrote and one generated by AI. To their credit, they could tell one from the other immediately. However, it was difficult for all of us to explain why one was better than the other. “Why” is always the most difficult muscle to exercise. And it's precisely the muscle that AI is causing to atrophy fastest. The plot was the same, the characters were the same, the dialogue said the same things in different ways. And yet...

I can explain why now: one had humanity, the other didn’t. We humans have recognized others’ humanity (and, for worse, the lack thereof) for longer than anything else we’ve ever experienced as a species. But not since many different species of humans walked the Earth together have we had to so actively exercise our natural, inherent discernment of what’s us.

As we increasingly outsource our thinking to Al, that’s what we risk losing. The very core of us. Not just jobs or skills, but the essential humanity that makes writing, and all creative work, meaningful to those who make it and consume it. That makes being human, itself, meaningful…

Further, and finally, the most insidious effect of this AI-driven anti-intellectualism is that when we become accustomed to having complex ideas pre-digested by AI, we lose the mental fortitude required for critical thinking. I see it in how readers and clients increasingly demand shorter, simpler content. This was true even before AI. I see how audiences gravitate toward more formulaic stories, how consumers respond to increasingly basic marketing messages we've all seen before, only maybe in a different font. We're creating a world where the default response to complexity is "make it simpler." But complex ideas are often complex for a reason, and making them simpler shaves off necessary nuance.

Ironically, a product made to answer questions is causing people to ask fewer questions and get worse answers. This intellectual learned helplessness encourages a population that's easier to manipulate, easier to mislead, and less capable of questioning the systems controlling their lives. As a writer, I'm watching in real-time as the capacity for nuanced thought—the very thing that enables us to resist propaganda and manipulation—is being systematically dimmed in the name of convenience and efficiency, all for $20 a month. 

Tonight, I'll return to my screenplay. I’ve been working on it for four months now. Even though I could probably have finished by now with AI's help, I'll resist the urge to let it solve my plot problems or generate my dialogue. Instead, I'll think, struggle, and create. Because, while AI might make me obsolete as a word generator, it can never replace what really matters: the human thought, understanding, experience, and creativity that make writing worth reading. Well, until it can.

