One of the biggest topics in the writing community right now is the growth of artificial intelligence — or, more accurately, large language models that can simulate writing. These aren’t actually “intelligent.” They’re basically glorified autocomplete. They were “trained” on written work available on the Internet, and from that they learned to predict what’s most likely to come next in response to a prompt.
One reason this is an issue is that the people whose works were used to train these models weren’t asked if that was okay, so it’s unauthorized use of their work. Another reason is that this is essentially a machine that automates mosaic plagiarism. It’s not writing anything new. It’s just cobbling together bits and pieces of other written work to create a word mosaic. There have been human authors caught doing this, taking existing books and copying and pasting bits and pieces together, and they were found out when readers recognized phrases from other books. It’s not a direct copy, but it’s not original, either. This technology just automates that.
Another reason it’s an issue is that it may make it harder to make a living as a writer, because people who don’t understand what it does think it offers a shortcut. This is one of the things screenwriters are fighting about in the current writers’ strike. They’re concerned that studios will use AI to “write” scripts and then hire writers to “edit” or rewrite them into something usable. There are different payment scales depending on whether someone gets credit for the story, for the script, or just for a rewrite. Studios could try to save money by crediting a writer only for doing a “polish” on an AI-generated script, not for the story or the original draft, even though turning that output into something that could be filmed might take more effort than writing a new script from scratch.
For non-fiction writing, like marketing communications (my field when I’m not writing fiction), technical writing, and journalism, writers have already been fired and replaced with AI. Never mind that it’s extremely dangerous to use it for fact-based writing, because it makes things up. It doesn’t find information. It just generates something that seems plausible based on information that’s already out there. There’s an attorney currently in huge trouble because he turned his legal research over to one of the AI engines to have it write his briefs, and it cited entirely fictional cases. It created a legal brief modeled on other legal briefs, but the cases didn’t exist. I have author friends who’ve played with it, since it’s supposedly a good tool for writing marketing copy, author bios, and the like, and they ran into the same thing: it didn’t accurately describe the book, it invented facts for the bio, and it added non-existent titles to the list of books in the series.
For fiction writers, there’s already an impact in the short-fiction market, as publications have had to close to submissions because they were getting deluged with AI-written drivel. Most publications don’t want to publish anything AI-written because it can’t be copyrighted, and because it’s an amalgam of other works, there’s a potential plagiarism issue. Plus, it’s not very good. It can imitate other writers’ styles, but it has no real authorial voice, no story logic, no real soul. Apparently, word got out on some “side hustle” advice channel that an easy way to make money is to let AI write short stories for you. Never mind that even at the big publications you’re making a couple of hundred bucks if you manage to sell something. But the swamp of these bogus stories that aren’t good enough for publication, whether or not they’re AI-written, is creating more work for editors and making it harder for real writers to get past the noise, especially if they’re newcomers. Editors may start focusing on authors they’ve already worked with or know by reputation, because that makes it more likely the story is worth publishing. A new writer without a reputation may get lost in the shuffle.
Novelists are likely to see the impact in discoverability. The online bookstore algorithms tend to favor new releases, and an author may get an overall boost when they have a new release. If someone can churn out a book with AI in a day, they can flood the marketplace with constant new releases, crowding out the authors who take weeks or months to write a book the hard way. Even if readers don’t end up buying those books, their listings will stay front and center. It will be harder for readers to discover new books and authors.
Publishers already look for new books that are like what’s currently successful, and it’s not hard to imagine some of them seeing this as a shortcut. Get the machine to produce something like the current hot thing, then have an editor clean it up. Then they don’t have to deal with authors and they can get to market faster, jumping on the trend before it passes.
One argument I’ve heard for using AI is that it “democratizes” writing, making it so everyone can do it. Writing is hard, they say, and not everyone wants to put in the time to do it. To which I say, if you don’t enjoy doing it and don’t want to do it, you don’t have to do it. You can do something else. If you do enjoy it but are frustrated because your skills don’t match your vision, this may seem to provide a convenient shortcut. Just plug your idea into the computer, and it writes the story for you. But it doesn’t really get you past that frustration gap, because if you aren’t writing, you aren’t learning how to write. Plugging your idea into a computer isn’t going to help you grow as a writer. You’ll just get better at phrasing the prompts you feed it. If you aren’t willing to put in the work to write until you get good at it, then maybe you don’t enjoy the process of writing and should do something else with your time.
I suspect this is another outcome of that side hustle culture, the idea that everything you do should be monetized. If you enjoy writing, you’ve got to be able to make money at it somehow, and right now. You’re not making money during the time you spend writing just to get better at writing, so you want that shortcut. I also suspect there’s a lot of overlap between the “writing is hard and this democratizes it” people and the people who believe that everyone can write, so it isn’t really a specialized skill anyone should get paid for.
I just don’t understand the idea of automating the things that are fun and that are part of human expression, like art and writing. They talk about how, even though writing jobs and opportunities will be lost, there will be new careers in editing AI-generated output. But that’s automating the fun part and keeping just the tedious part. You’re not actually doing the thing when you use these tools. You’re getting output as though you’ve done it. I’ve found that I’m sanest when I’m in the creation phase of writing, when I’m coming up with ideas and writing early drafts. When I come close to burnout, it’s when I’m in the proofreading phase. I’d hate to get to a point where that’s the only part I get to do.
Where I’d love some automation and artificial intelligence is in a truly good spellchecker, one that looks at context, so if your typo accidentally creates a real word that’s spelled correctly, the spellchecker can tell it’s the wrong word for the context and flag it. It would catch when you use the wrong version of a word (like “their” vs. “there”). It would be able to tell whether or not you need that comma. And it would be trained on fiction, so it would work better than the existing grammar checkers. Automate the tedious, boring stuff, not the fun, creative parts.