Book read: Zombie 69 by Kitty Glitter
Pages: 20
Sometimes you read a book and you wonder… What did I just read?
That is how I felt after just a few pages of Zombie 69. This short twenty-page book contains two stories: the first about zombies, the second about a cat and dog that live with some humans. Neither story makes any sort of sense. They both feel meandering and broken, almost as if they were written by a primitive AI generator, not a person.
I don’t remember the zombie story very well. It had to do with zombies going to high school (why?) and doing ordinary teen things; they just happen to be zombies who have to pop an eye back into its socket now and then.
The second story was… weird. The cat and dog can talk to each other, and they swear a lot. A random person breaks into their house and kidnaps their child, and then the dog goes a little savage and blames the cat. The cat, for its part, starts talking about time shifts and the girl being pushed out of time. Then it just ends. The girl is still gone; there is no explanation of the time shifts, no explanation of why the cat is able to drive an RC car, and no hint of who the man who kidnapped the girl was. It just ends.
This clipped, piecemeal storytelling gave both stories a generated feel. It felt almost like they were built from some sort of Mad Libs template. (I am not saying they were, of course; I can’t know that. But the feeling is there.)
This chaotic structure is often how people recognize AI-generated art. The AI can combine aspects of different pictures, even blend them together, but it often gets anatomy or structure wrong. It adds extra fingers, or makes one eye much bigger than the other. Whatever created the artwork, be it human or program, the result has an uncanny valley feel. It seems like it should be art, or human, but it just feels… wrong. And sometimes we aren’t even able to tell what is wrong about it, just that something doesn’t work.
In written work it’s much the same. The algorithms pull from sources all over the internet and stitch them together, but at heart it’s a predictive text structure. Just like the predictive text on your phone doesn’t always suggest the right word for the sentence you’re writing, ChatGPT sometimes adds whole paragraphs that merely restate what it already said, or breaks structure, or leaves out key details. When dealing with factual information it may even be flat-out wrong, because it pulled from the wrong information online. Remember, a predictive text formula is only as good as the information being fed into it, and a LOT of the information online is just wrong. How could a language model be expected to be right all of the time?
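If you’re curious what “predictive text” means under the hood, here’s a toy sketch in Python. It’s my own illustrative example, not how ChatGPT actually works (real language models are vastly more sophisticated), but the core idea is the same: the program picks each next word based only on what tended to follow it in the text it was fed, with zero understanding of meaning.

```python
import random

def build_chain(text):
    """Map each word to the list of words that follow it in the text."""
    words = text.split()
    chain = {}
    for current, following in zip(words, words[1:]):
        chain.setdefault(current, []).append(following)
    return chain

def generate(chain, start, length=8, seed=0):
    """Walk the chain from a start word, picking a follower at random
    each step. No grammar, no meaning -- just 'what came next before'."""
    rng = random.Random(seed)
    word = start
    output = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:
            break  # dead end: this word never had a follower
        word = rng.choice(followers)
        output.append(word)
    return " ".join(output)

# Feed it junk and it produces junk -- garbage in, garbage out.
sample = "the cat sat on the mat and the dog sat on the rug"
chain = build_chain(sample)
print(generate(chain, "the"))
```

Even this ten-line version can spit out sentences that look almost right and yet mean nothing, which is exactly the uncanny feeling these stories gave me.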
I don’t know if this particular book was written by an algorithm or is just a pair of random stories pulled from the author’s wildest dreams, but the feeling is the same. It doesn’t feel… cohesive, or right. And it makes zero sense.
How would you fix the very structure of your storytelling? My suggestion is beta readers, or a writers’ workshop. Being part of a writers’ workshop and getting honest feedback about my writing helped me get the words right faster than just spitting words into a void and hoping they made sense. And in the age of the internet, with Facebook, forums, meet-ups and more, finding a group of people dedicated to helping each other get better at writing is easier than ever. It can be online, so that you don’t have to put faces to the criticism, or you can opt for in person, where you can sharpen your people skills, too. Either way, good feedback on your work is crucial to not just finding your voice, but refining it.
As for AI… There are arguments for and against AI generation. I tend to be of the opinion that it’s a tool just like any other random plot generator, and that if you, as the writer, don’t take that generated idea and actually write it yourself, then it will never be a great story. AI simply rewrites what has already been written. Good for ideas, plot summaries, or settings, but not good for a finished product. Not yet, at least. (That’s a little spooky to think about, really.)
There’s a lot more to go into about AI-generated art as a whole, including copyright, stolen assets and more, but that’s a much bigger topic than I can cover in one blog post. So for now we’ll just take this from these short stories: if you don’t have a cohesive story that makes sense, people might think you’re a computer. And if you want to get better at writing, you might try a writers’ workshop.
Next story: Immortals by Eva Fairwald.