I was reading a blog post, as I often do, but something felt wrong with this one. It wasn't that it was bad; I've read and written plenty of bad blog posts, and they never provoked this reaction. I was halfway through when I understood: this post had been heavily edited with AI. I don't think it was fully generated. The ideas behind the sentences and the general topic were probably picked by the author, but the individual words, the rhythm, the metaphors, the style… that was almost certainly AI.
Once I realized this, I lost all interest in the post. I hadn't really been enjoying it anyway, with that sensation nagging at the back of my mind while reading. Normally, even when there's a paragraph you don't like, or a particular post that doesn't land, you can still learn something and try to understand what the author wanted to share. But now I didn't feel like reading words they did not write. I felt tricked. To get what the author meant, I could just glance at the headings and check the last three sentences. The rest was probably filled in by the AI, producing content so that people could consume it. There's no point in searching for the deeper meaning behind certain words if putting them there was not the outcome of any thinking.
I don't know what the process was for that blog post, but I can understand the temptation. I used to run my finished posts through ChatGPT to fix grammar issues, and it would sometimes reply with suggestions to change certain parts. They often looked better than what was initially there. So if it improves the post, why not accept this change to this sentence, or that one too? I just re-read some of the posts where I did that. Now that they're cold, no longer fresh in my mind, I can see where I accepted a change suggested by the AI. It's in the sentences that feel plain and soulless. They're not personal anymore, and that made them worse. I've re-edited a couple of posts where I spotted them and replaced those sentences with something more natural. I feel terrible about it, but I think I learned a good lesson.
This reminds me of something I read a while ago: "If you ask too many people for feedback on something you've written, you'll end up with something 'average'. Every strong opinion carefully removed, all personality lost, taste diluted so much it's barely present anymore. The post pleases everyone but loses all interest." Maybe something similar is happening here. LLMs are trained on so much content that they spit out the average of it. By using them, your post ends up being average for everyone, instead of great for some people and bad for others.
I write because it helps me think. I write because I want to share. I write because I like to read, and I would love to give other people that same enjoyment. The title of this piece is not the complete truth. It's a hope. Every time I spot something that was written with AI, I wonder what I didn't catch. The longer the post, the easier it is to see, and there are plenty of blogs where the question never arises because I trust the author. But that's one of the sad new realities of this technology: we can never be sure. So please, don't use AI to write your blog posts. Use your own words. It might take more time, but it will also make the posts better. They'll be yours and yours only.