AI Should Be a Writing Tool, Not a Ghostwriter

In a few previous blog posts (see here, here, and here), I discussed the pitfalls of AI-generated written content. I’ve taken a pretty anti-AI stance in that respect. When I see this kind of content online, I don’t want to read it (and I imagine many others also spot it as AI-generated and, once they do, don’t want to read it either).

In addition to the qualitative deficits of AI-generated content, I think there’s some interesting psychology behind how people distinguish between AI- and human-generated content; once they notice the former, negative reactions and judgements follow. It raises judgements about deception (someone failing to disclose that the content isn’t their own and passing it off as such) and cheating (related to the former point: someone is posting content meant to speak to the reader, but no person is doing the speaking; the content is therefore seen as lazy, an attempt to gain rewards and communicate with no effort, skill, or knowledge of one’s own).

A similar scenario is receiving a copied-and-pasted template email in which the sender doesn’t even bother to address me by name. Others might experience this when an email arrives with the embarrassing square brackets left in (e.g. Dear [insert name]) – some have received job application rejection emails like this (the worst example of this I’ve seen). Automated emails may also rub us the wrong way. The reason we react negatively to this kind of communication is that the human element has been taken out of it. We’re being spoken to, but not by a person. We see this way of communicating as cold, uncaring, and lazy. Psychologically, even if AI-generated content is prompted to sound warm and compassionate, as soon as we recognise it as AI-generated, its efficacy fails. The illusion of compassion cannot work.

I have several gripes with AI-generated articles and blog posts, but this isn’t to say that I think AI has no place in the writing process, especially for professional writers. I do use a few AI tools myself in my writing. However, I’d say the way this impacts my writing is slight, particularly in terms of the creative dimension.

As much as I’ve been anti-ChatGPT in the context of writing, I do find it useful for generating topic ideas and ideas for article titles when I get stuck on thinking of one I like. I don’t find myself using a ChatGPT-generated title verbatim, as they often have an ‘AI feel’ to them. But sometimes, certain word choices, phrases, or title structures fit the subject well, so I’ll cherry-pick elements, mix and match, and make my own alterations until I end up with a title that feels fitting and in line with the kinds of titles I like to use and see. I’ve also found ChatGPT helpful for creating decent meta descriptions for articles. If I copy and paste the introduction of an article and prompt ChatGPT to summarise that text in one sentence, it churns out a pretty decent meta description (though, as with title suggestions, I’ll typically make some adjustments).

That’s essentially all I use generative AI for in terms of content creation. I also use Grammarly (as a Chrome extension, for Google Docs, and in Word), which uses AI to review spelling and grammar. I see no reason why writers shouldn’t employ an AI tool to clean up and polish their writing, as well as for suggestions on improvement: enhancing phrasing, tightening sentence structure, and cutting out redundancies. For me, Grammarly has been a great tool for learning to write in a more grammatically correct way, rather than a crutch that fixes my writing for me without any learning involved.

I will say, though, that Grammarly does take issue with perfectly valid stylistic choices, flagging them as incorrect or suggesting a supposedly ‘superior’ alternative. One example is the use of em or en dashes to set off parenthetical information; Grammarly will often say commas should be used instead, even though that isn’t grammatically necessary. Other times, it suggests unnecessary corrections, such as adding commas that, while not incorrect, aren’t needed for the writing to be grammatically sound. Other spelling, grammar, and sentence checker AI tools I’ve used include QuillBot and ProWritingAid. I use these to cross-check certain corrections or recommendations from Grammarly; again, they’re useful for checking grammar and getting into the habit of writing with grammatical rules in mind.

Another AI tool I’ve found very helpful for improving my writing is Ludwig Guru’s ‘Search’ function. You can search for a word, phrase, or sentence, and you’ll get results drawn from content available online showing the contexts in which it appears. This is useful when I’m unsure if I’m using a phrase correctly, when I want to see whether a word or phrase appears in more formal or informal contexts, or when I want to know which way of phrasing something is more common (and, conversely, whether certain phrasing is outdated, obscure, or unusual to most readers).

That pretty much sums up how I use AI in my writing. (I did use the Hemingway App, which helps improve the readability of your writing, for a previous client, but I haven’t used it since. That’s not to say there wouldn’t be benefits to making my writing more readable, but I think if I started using the Hemingway App to chase a certain readability score, it would compromise the authenticity of my writing style and voice.)

I try to use AI only as a writing tool, or a writing assistant, and never as a ghostwriter. If you start relying on AI to do your writing for you, there’s a risk that it will stunt your ability to improve as a writer. It gets in the way of discovering and developing your voice and experimenting in your writing. We also now know that using AI to write content for you has cognitive costs. A study by MIT researchers found that using ChatGPT to write an essay results in ‘cognitive debt’ and “likely a decrease in learning skills”. The study divided 54 subjects (18- to 39-year-olds from the Boston area) into three groups and asked them to write several SAT essays using ChatGPT, Google, or no tools at all, respectively. The researchers used EEG to record the participants’ brain activity across 32 regions. ChatGPT users had the lowest brain engagement and “consistently underperformed at neural, linguistic, and behavioral levels.” Over a period of several months, ChatGPT users got lazier with each subsequent essay, often resorting to simply copying and pasting information by the end of the study.

Relying on ChatGPT for writing, then, can potentially impair critical thinking, creativity, and problem-solving, all of which are crucial to writing well. It can result in ‘metacognitive laziness’, preventing people from putting in the time, effort, and (sometimes) frustration that come with developing one’s skills as a writer. This is why AI should be used exclusively as a tool, not a crutch. I don’t feel that the minimally invasive way I’ve used AI as a writer has negatively affected my writing; it’s only helped in terms of idea generation and keeping my writing polished. Otherwise, I’ve kept AI out of my writing process entirely – it’s not something I want dictating or advising how I write. As soon as AI starts removing the human elements from writing, the writing (for me) suffers a real loss.

In future posts, however, I’d like to delve a bit deeper into the psychology of engaging with AI-generated vs human-generated writing, as well as the ethical question of whether writers should boycott AI tools like ChatGPT, given their environmental costs, the risk of using plagiarised content, and their impact on the livelihoods of writers and the writing industry. The moral quandary of using AI, albeit minimally, is something I’ve been thinking about. I personally avoid using AI art in my writing (partly for aesthetic reasons, but also because its widespread use spells bad news for visual artists). However, whether writers should avoid ChatGPT or similar tools entirely is a question I’m still considering. Some AI tools may be more problematic than others (e.g. in terms of energy consumption and impacts on creatives), and we also need nuance in weighing up the benefits of limited AI use against the purported ethical costs of using it at all.
