“Is training AI to write like you a fool’s errand?” Copywriter Bob Bly weighs in

The legendary B2B copywriter Bob Bly (whom I was delighted to meet in the US in 2012) recently blogged on the topic, “Is training AI to write like you a fool’s errand?”

Bly is a lifelong writer, so his first objection is that he would never outsource his craft at all, much less to a force whose processes he does not know. (“Why would I want to use ChatGPT . . . when I both enjoy [writing] immensely and do it reasonably well?”). He quotes the writer Ben Settle as saying, “The joy of writing is not in speed or style . . . it is in the bleeding, constant rewriting, and burgeoning floodgates of thoughts that can only come from battling the blank page.”

Secondly, Bly notes, some marketers and communications types argue that AI can “take care of the grunt work.”

Bly counters that “writing is not grunt work,” that “Human writing is anything but. . . Essentially, the core of writing is thinking.” He quotes Plato: “Thinking is the talking of the soul with itself.” Do we really want to give up the fundamentally human processes of thinking and feeling?

True “grunt work,” Bly writes, inheres in prompting and “fixing up” bad ChatGPT prose!

Last year, in conversation with the (brilliant) English copywriter Nick Usborne during his “Futureproof Copywriting” course, we learners (including me) felt happiest when AI performed “grunt” background work for us (e.g. researching and retrieving sources, critiquing weak areas in our drafts). But each of us still wanted to stay firmly in the driver’s seat of our own writing. We aspired to be “the human in the loop.” One year later, that plan rings utterly naïve.

Copywriters have very little control over how (and how much) they use AI. Our clients or clients’ companies, including the investors and political players that control them, dictate that.

Thirdly, Bly observes, in response to the current wave of heavy AI use, that few AI writers actually make real money with AI (even in the Amazon book-generating industry). Most of the marketers in this area make money “by creating, teaching and selling ‘how to write with AI’ courses.”

Fourthly, he rejects the idea that anyone should “get AI to write like [they] do.” He says: “I have no desire to train AI to write like me, because I already write like me . . . and have spent 45+ years learning how to do so.” One of Bly’s colleagues observes that “the funny thing about even trying to use AI to do this is you will spend more time trying to get AI to do anything right, style-wise, than you would writing the damn thing yourself.”

Even if you can speed up copy generation, “when you’re done, you will find that AI doesn’t write all that well—and doesn’t sound much like you, either.”

Writers who use AI extensively are spending “more time than ever . . . and publishing LESS,” one of Bly’s colleagues has complained to him.

Instead of reckoning with a blank page or screen (a hurdle long since overcome by making “mind-maps,” a technique writers Ed Gandia and Daphne Gray-Grant have long lauded), “now [writers] stare at ChatGPT output, wondering how to fix it.”

Fifthly, Bly reminds us that many readers and publishers don’t want to read AI writing. More and more mainstream book publishers are rejecting books that they suspect were written using AI. (The sad dispute over whether writers use em-dashes points to mainstream publishers’ anxieties over AI use.)

Sixthly (and finally, for this round of the AI debate), Bly disputes the claim that AI’s “speed” outweighs its “performance.” Many AI training courses promote learning “to use AI to write your book for you in days.” But, he laments, this logic implies that “what matters most today is how fast you can write rather than how well you can write” (my emphasis).

Bly acknowledges that in many genres and media (such as daily newspapers), writing quickly is valuable where tight and “frequent deadlines” rule. But, he adds, “in other channels and types of writing, quality trumps speed every time.” In Bly’s home territory of copywriting, he says, “clients value landing pages that double revenues much more than those that could be written in half the time, but [which] hardly move the sales needle.”

Now, AI enthusiasts might be tempted to label Bly a curmudgeon and traditionalist. But he has earned millions of dollars over more than 40 years of copywriting (and written over 100 books, on copywriting among other topics, all by himself and none by AI), so we should not dismiss his rebuttal too quickly.

As the “Godfather of AI,” Geoffrey Hinton, argued when accepting the Nobel Prize two months ago, many of us have become anxious because danger inheres in building AI that is smarter than we are, and not only in our writing. Supra-human technology is easily manipulated by authoritarian rulers and governments (e.g. in the US, Russia, China, North Korea and elsewhere). Such regimes can and will, if unbridled, bring about the annihilation of any humanity with values and decency.

So “training AI to write like you” is not only a “fool’s errand,” as Bob Bly writes. Training AI to write in place of us (the implication of the question) is to usurp our legitimate place in the creative processes of reading, thinking and writing. Such an overthrow allows AI to outsmart us; indeed, it is already doing so, and faster than ever before.

Given the state of our world in the 21st century, that’s not only a “fool’s errand” but a fool’s demise.

Since the cat is already out of the bag, it behoves us creatives to lament all that can be (and is being) lost before the loss overtakes us all.

And now it’s your turn: what are the implications of the ways you use AI? How do you picture the future of humanity in an AI world?