With social media fragmenting, I’m bringing back my old “You Tell Me” Wednesday discussions to try to get good old-fashioned blog conversations going. If you’re reading in a feed reader or via email, please click through to the post to leave a public comment and join the discussion!
News broke this week that HarperCollins has struck a deal with an unnamed A.I. company (yesterday revealed to be Microsoft) to let it use text from nonfiction books in order to train its models. Writer Daniel Kibblesmith posted screenshots from an email that offered him $2,500 in exchange for allowing his book to be so utilized.
Kibblesmith’s one word response: “Abominable.”
The publishing and news industries are kind of all over the place when it comes to A.I., suing A.I. companies for unlicensed uses with one hand and striking deals with them with the other.
And use of A.I. as a writing assistant remains controversial among authors, with some wholeheartedly embracing it for certain elements, while others reject it wholesale.
Publishers Weekly rounded up a range of responses from industry professionals on the state of A.I. and the proposed Harper deal. The Authors Guild also released a measured statement.
What do you think? Is the A.I. cat out of the bag and is it time writers adapt and take their money while they can? Or does the A.I. bandwagon need to have its tires punctured?
Should writers fight or play ball? Or some combination of the two?
I’m not personally opposed to utilizing technology to advance storytelling. I am, after all, penning a writing blog post on a laptop. But when it comes to this generation of hallucinating A.I. and the companies behind it who raced to see who could rip off authors the fastest, count me for now on team “Abominable.”
Need help with your book? I’m available for manuscript edits, query critiques, and coaching!
For my best advice, check out my online classes, my guide to writing a novel and my guide to publishing a book.
And if you like this post: subscribe to my newsletter!
Art: The Threshing Machine by Albert Gabriel Rigolot
Neil Larkins says
I’m with you 100%, Nathan. To me, a writer using A.I. is like entering a paint by number in an art contest. Maybe that’s an extreme comparison but darn it, letting a digital thing compete with what my gray matter, sweat and long, lonely hours come up with just doesn’t cut it. It’s as you said, abominable.
paul W stephens says
Unless you’re the type who’s cheated on tests in school (including writing papers outside of the classroom), one should rely on one’s own thoughts, opinions, and expressive words.
To me, A.I. is like someone wearing a wig. With many, if you have an eye, you can spot a fake.
In the Author’s Note of my saga memoir/story, I’ve made it clear the work was written without A.I. What they’re about to read has the authentic human touch.
p.s.
Shayne Huxtable says
Not for me in any form. There’s something intensely satisfying about creating stories that take time to develop from a few rough ideas to a fully fleshed manuscript.
abc says
Team Fight!
Eva says
Humans are so naive that we allow it – books have been written and movies made about the consequences, and still not enough of us join the dots. The creative oracles are always right.
A huge percentage of the world population seems to be willingly dumbing down….off they go then, over the cliff with you lemmings…haha
Peter Taylor says
I received an email today promoting: “This NEW innovative web app uses artificial intelligence to write and illustrate high-quality children’s books in just minutes in a way that’s enhanced and better than ever. No need for writing or illustrating skills. Just input your idea and let the AI bring your story to life. …60% of all children’s books are bought online…”
That’s dreadful!! Whose work have they referenced?
I’m writing and photographically illustrating a potential Middle Grade ‘Book of Small Things’ – objects kids’ parents, grandparents and ancestors may have owned or used (including world record holders as the smallest of their kind). One section is on buttons. In Victorian days, when someone enjoyed a book, they often added a story button to their clothes, hats, gloves or handbag. The buttons featured a scene or character to reveal the wearer’s reading taste. And friends enjoyed a puzzle, working out the title of the book. But how were the buttons obtained, and where were they purchased?
Though I searched, I didn’t find an answer until I used AI. I do believe it has potential to be useful for non-fiction, and also to introduce inaccuracy. I imagine some of its sources are more reliable than others. Hmm. Should I really perpetuate the result? Will non-fiction books become less accurate if we rely on AI for facts,…and then AI trains itself on our new text…
((It seems the story buttons, also called narrative buttons, were purchased at specialty stores and from jewelers, and publishers and authors also distributed them at readings and events. They sound likely! Besides novels, some buttons depicted well-known fairy tales, Aesop’s fables and historical events.))
Wendy O'Connell says
I’m with you 100%. However, if you are using it for grammar, I might not have a problem with that, but as a human, you have to decide what to accept and what to leave. AI could destroy the human voice if you let it, and I personally don’t like the hard edges AI generates.
Terin Miller says
As a writer, and a semi-retired journalism professional, I find it abominable. I switched from typewriters (a technological advance akin to the Gutenberg Press) to laptops once the technology advanced beyond the TRS 80 (affectionately known to reporters as the “Trash 80”) and phone couplers.
I used technology to convey my writing faster (a necessity at the time in my profession). That isn’t to say the technology actually CHANGED my writing. It didn’t change my style (editors, whose job, among others, was to force us to conform to whatever the publication’s ‘style guide’ said, did), it didn’t change my voice, it didn’t change my story in the sense it wasn’t helping me ‘create.’
I already find ‘intuitive spelling,’ which 9 times out of 10 gets my word choice WRONG, irritating, and even ‘Grammarly,’ which does well with most academic English grammar rules but not with actual creative expression and certainly NOT with modern American journalism (unless every sentence needs to be changed to say things correctly, like “about which”), an annoyance, akin to an editor questioning your word choice and grammar throughout a story that needs to be dictated to get on the wires fast.
HarperCollins, for those unaware, is in fact owned by NewsCorp. Rupert and Lachlan Murdoch’s NewsCorp. Which also owns The Wall Street Journal.
No doubt, if AI could be allowed to make up news or ‘write like’ a professional, it would be welcomed by some major corporations as it would eliminate the need to even outsource news, as, say, Reuters did to India at one time. But that’s where the problem still lies. Or, rather, that’s the situation in which the problem exists.
Because while AI may be able to write ‘nonfiction,’ nonfiction theoretically has to be checked for accuracy.
I don’t want to ‘teach’ anyone, or anything, how to write like me, if they can’t be trusted to be accurate not in their writing, but in the facts ‘about which’ they’d be writing.
Chris Bailey says
I have used AI to produce meeting minutes from meeting notes. It saved a little time, but also completely scrambled some facts. So. Not a fan.
Petrea Burchard says
I see the value of AI (“intelligence” is a misnomer though) for things like research, certain medical procedures, things like that. I do not see its value in art. In fact, AI renders art…not art. So put me on Team Abominable.
Petrea Burchard says
Oh and $2500? For what might have taken months or years to write?
“Abominable” is not quite the word I’m thinking of.
D.K. Dale says
One of my objections is that AI is being imposed rather than offered. Also, authors have had their back-list titles hoovered up and scraped, apparently in some cases with the publisher’s permission, but how many of the authors were asked?
A broader objection is that AI is one of various areas of tech that increasingly facilitate mild to deep confusion over what is true, accurate, real—internet bots, deep fakes, the list goes on. And I think feeling uncertain over what is true/real and what isn’t is isolating, part of the ‘loneliness epidemic.’
On the other hand, what keeps us sane is genuine one-on-one connection and interaction, and one place where many of us find that is in books authored by another human soul—not by data centers.