Why ChatGPT Isn’t and Never Will Be a Good Writer

ChatGPT can’t think. It can’t analyze, reflect on, or evaluate its database or its output. Should we really be surprised it’s such a lousy writer?

ChatGPT is a major breakthrough in artificial intelligence. This natural language processing chatbot, developed by OpenAI, can rapidly and effectively perform a wide range of tasks, from conducting research and answering questions to troubleshooting, writing code, and debugging.

ChatGPT can also write: it’s quite good at producing simple content. But it’s now generally recognized that it’s not cut out for more sophisticated work. Ask it to write about euthanasia, global warming, or experiential learning, even briefly, and it falls short. It’s important to understand why, and what this implies about ChatGPT’s potential as a writer.

What kind of content ChatGPT can produce

ChatGPT can generate a broad range of content. With straightforward prompting, it can write forms, letters, emails, thank-you notes, speeches, and social media posts. It’s particularly well suited to short, simple, and straightforward pieces.

For instance, I asked it to write a 100-word email to serve as a rejection letter for article submissions to magazines. I gave it clear instructions: “You're a magazine editor with 30 years of experience in writing and editing. Write a 100-word email that explains why you're rejecting someone's article and why you can't give any real feedback on it.” It produced this:


“Subject: Regarding Your Article Submission

Dear [Author's Name],

Thank you for submitting your article to [Magazine Name]. We appreciate the time and effort you put into crafting your piece.
After careful consideration, we regret to inform you that we will not be able to publish your article in our upcoming issue. Please understand that the decision was not made lightly and does not reflect on your writing abilities or the relevance of the topic.
Due to the high volume of submissions we receive and limited space in our publication, we are unable to provide detailed feedback on every submission. We apologize for any disappointment this may cause. We encourage you to continue writing and submitting your work to other outlets.
We wish you the best in your writing endeavours.

Sincerely,

[Your Name] [Your Title] [Magazine Name]”

Now, this isn’t bad at all. With some minor tweaks, it can easily be turned into a general-purpose rejection form letter for magazine articles.

And this is right in ChatGPT’s wheelhouse. With clear and strategic prompts, it can produce, in a matter of seconds, simple, formulaic content that can be used for myriad purposes. And since this content often requires only minimal editing, ChatGPT is a huge time saver for writers—it frees us up to focus on more challenging tasks.
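If you want to send this kind of prompt programmatically rather than through the chat interface, a few lines of Python will do it. The sketch below is only illustrative: it assumes OpenAI’s Python SDK (v1+) and an API key in the OPENAI_API_KEY environment variable, and the model name is a placeholder (the example above came from the chat interface, not from this code).

# A minimal sketch of sending the rejection-letter prompt via the OpenAI Python SDK.
# The model name is a placeholder; substitute whichever model you have access to.
from openai import OpenAI

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder
    messages=[
        {
            "role": "system",
            "content": "You're a magazine editor with 30 years of experience in writing and editing.",
        },
        {
            "role": "user",
            "content": (
                "Write a 100-word email that explains why you're rejecting someone's "
                "article and why you can't give any real feedback on it."
            ),
        },
    ],
)

print(response.choices[0].message.content)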

Where ChatGPT falls short: writing

But while ChatGPT excels at producing simple content, the same can’t be said for more high-minded pursuits. Ask it to explore a serious question, topic, or dilemma, and it invariably falls short. It’s just not capable of generating content that requires significant analysis.

For instance, I asked ChatGPT to explain, in 200 words, how overnight camps can help kids mature. I gave it clear prompts and instructions: “You're a seasoned writer with 30 years of professional writing experience. Your main area of expertise is the benefits of overnight camps for kids. Write a 200-word essay explaining how overnight camp can help kids grow, mature, and develop.” Here’s what it produced:

“Title: Overnight Camps: Fostering Growth, Maturity, and Development in Kids

Overnight camps provide a nurturing and immersive environment that enables children to grow, mature, and develop in various aspects of their lives. These camps offer a range of opportunities for personal, social, and emotional growth, helping children thrive in a supportive and adventurous setting.

Firstly, overnight camps promote personal growth by encouraging independence and self-reliance. Away from the comfort and familiarity of home, children learn to navigate new experiences, make decisions, and take responsibility for themselves. This fosters confidence, resilience, and a sense of empowerment as they face challenges and overcome obstacles.

Secondly, overnight camps foster social development and interpersonal skills. Campers from diverse backgrounds come together in a shared environment, fostering teamwork, empathy, and cultural understanding. Through engaging in group activities, collaborative projects, and building lasting friendships, children learn important social skills that will serve them well throughout their lives.

Furthermore, overnight camps provide an environment for emotional growth and self-discovery. Campers have the freedom to express themselves, explore their interests, and develop a sense of identity separate from their everyday routines. This enables them to discover their strengths, passions, and values, contributing to their overall personal and emotional development.

In conclusion, overnight camps play a significant role in helping children grow, mature, and develop in various areas of their lives. These camps foster personal growth, independence, social skills, and emotional well-being. By providing a supportive and adventurous environment, overnight camps empower children to thrive, discover their potential, and create lasting memories and friendships. The transformative impact of overnight camps extends far beyond the camp experience, shaping children into confident, resilient, and well-rounded individuals.”

Now, make no mistake about it: there’s something extremely impressive about this. With proper instructions, ChatGPT can, in roughly 30 seconds, generate a cohesive, grammatically sound, and focused 200-word article. This is something we’ve never seen before, and it’s a critical development in artificial intelligence.

But don’t be fooled: the article is second-rate. Even a cursory look reveals it to be bland, poorly structured, and unconvincing. Once our initial excitement about its lightning-quick output recedes, it’s easy to see the ChatGPT article for what it is: a generic, formulaic, and dull piece of writing devoid of any real insight. Even in a short 200-word piece, it fails to convey what’s unique about summer camp, what differentiates it from other experiences, and how that uniqueness helps kids grow, mature, and develop in ways nothing else can.

And this point can be generalized. When ChatGPT is asked to write anything that requires analysis or evaluation, it delivers inferior results. Why? As Chomsky argues in a recent New York Times article, the short answer is that ChatGPT doesn’t think, at least in the way humans do. That’s just not how it works. For all its bells and whistles, it’s nothing more than a mimic: it predicts what a human might say based on what humans have said, and it dispenses it, word by word and sentence by sentence. It’s like a parrot that repeats what it’s heard before but doesn’t understand it.

With each word and sentence ChatGPT cranks out, it strives to inform and enlighten the reader while being ignorant of what came before and what will come next. This imbues the writing with a kind of artificiality or phoniness, furnishing us with prose bereft of voice, personality, and emotion. ChatGPT, at least when it comes to more challenging writing tasks, fails the Turing test by a country mile; however, as Biever argues in Nature, it may pass an easier, more superficial version of this test.

For similar reasons, ChatGPT is utterly incapable of originality. Since it can’t form judgements, as humans do, on the basis of its own ideas, thoughts, or experiences, its output is restricted to what others have said. Although it’s trained on a massive data set—hundreds of billions of words scraped from the Internet—its algorithm merely scans this data set and generates text based on patterns and examples it has learned from it. Since it can’t reflect on its responses or engage in abstract reasoning beyond what it’s learned from its training data, it’s constitutionally incapable of originality. ChatGPT, it appears, will never produce original writing.
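To see what prediction-without-understanding looks like in miniature, consider a toy sketch of my own: a crude bigram model, nothing like ChatGPT’s actual architecture, that “writes” by sampling each next word from word-pair counts in a tiny corpus. It mimics surface patterns it has seen and does nothing else.

# Toy illustration only (not how ChatGPT works internally): a bigram model that
# generates text by sampling each next word from word-pair counts in a tiny corpus.
import random
from collections import defaultdict

corpus = (
    "overnight camps help kids grow and mature . "
    "camps help kids build confidence and independence . "
    "kids grow when camps challenge them ."
).split()

# Record which words have followed which in the corpus
transitions = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    transitions[current_word].append(next_word)

def generate(start, length=12):
    """Produce text by repeatedly picking a word that has followed the previous one."""
    words = [start]
    for _ in range(length):
        candidates = transitions.get(words[-1])
        if not candidates:
            break
        words.append(random.choice(candidates))
    return " ".join(words)

print(generate("camps"))  # prints a fluent-sounding string assembled purely from corpus fragments

The output can look fluent, but it is nothing more than recombined fragments of what the model was fed. ChatGPT is incomparably more sophisticated, yet, as Chomsky and Biever point out, it remains a pattern-matcher over its training data.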

But what does this conclusion imply? It might be claimed, after all, that all writing is inherently unoriginal. Don’t we all replicate each other’s ideas, and isn’t it virtually impossible to say something entirely new, something no one else has said before?

Sure. It’s undeniable that it’s extremely challenging to cook up entirely original ideas. Similarly, it hardly requires mentioning that all writing, even the very best, draws on other people’s ideas, thoughts, arguments, and research, in one way or another. This is an essential and accepted part of the creative process.

But none of this implies our writing is unoriginal. Even when we draw on others’ work, we can still provide a new slant or perspective on an old idea or a new way of thinking about things. We can deliver fresh and novel insight by framing and developing theories, concepts, and beliefs in ways others haven’t. Even Einstein, Newton, and Gödel didn’t start from ground zero.

Yet this is something ChatGPT can’t do. While its massive database includes almost every theory, hypothesis, and idea that’s ever been proposed, it can’t reason about, reflect on, or evaluate this data. It can merely extract patterns from this corpus—dissecting, combining, and recombining the ideas of others—and then generate answers based on “the statistical correlations between words in the billions of online sentences [it’s] trained on” (Biever, Nature). Though ChatGPT may provide us with novel combinations of ideas from its database, the fact that it lacks judgment, experience, and intuition means it can never take us beyond the data.

The ability to evaluate data and draw new causal inferences to explain it is a uniquely human capacity. Chomsky conveys this point best (March 2023): “The human mind is not, like ChatGPT and its ilk, a … statistical engine for pattern matching, gorging on hundreds of terabytes of data and extrapolating the most likely conversational response or most probable answer to a scientific question. On the contrary, the human mind is a surprisingly efficient and even elegant system that operates with small amounts of information; it seeks not to infer brute correlations among data points but to create explanations.” Unlike AIs such as ChatGPT, we’re hardwired to relentlessly pursue answers to the deepest of questions—questions about the causal structure of the world.

Humans, unlike machines, look for answers that “aren’t captured in language-use statistics” (Computation and Language). And this is the source of our unreproducible originality, in thought and language.

But let’s explore this point further, because one may reasonably ask: Won’t ChatGPT improve as a writer? Won’t it get better at producing content that effectively answers people’s questions and offers fresh insight?

Yes, ChatGPT will get better at writing, maybe a lot better. Its memory, processing power, and ability to extract patterns from its database will improve. But what it will never provide is the kind of novel and insightful analysis humans do. Original writing requires the use of thought, reason, and analysis, and ChatGPT, like all other predictive language models, can’t engage in any of these things. Since it merely projects what to say based on what other humans have said, originality is beyond its ken.

Does this mean, though, that writers shouldn’t use ChatGPT? Absolutely not. Even if you’re aiming for original, insightful analysis, ChatGPT can be a valuable, time-saving resource. It can help you conduct research, brainstorm ideas, and generate outlines or templates for reports, articles, and even books. It can also help you proofread and edit your writing and provide useful feedback on it (though it has limitations here as well).

What’s more, ChatGPT can be an invaluable model of how not to write. It can hold up a mirror to writing that’s filled with cliches, platitudes, and truisms and that lacks insight and discernment. There may be no better way to underscore the need for writers to think and reflect more when they write. ChatGPT can be the unintentional foil that shows how insufficient our first attempts can be. As Haruni says in a Wired article, it can be “a tool to transform the way we think, work, and act, reflecting back to us what we will most probably say the first time around and inviting us to try again and again.”
