Can ChatGPT write my blog?
ChatGPT is going to change the world. Anything oriented toward content production is at risk.
But could it write my blog? Specifically, could it write the posts I published in 2022?
I’m interested in this question because it will help us determine where there will be space for human writing in the future, if there is any space at all.
It also chops away at the old problem of modernism versus post-modernism.
A badly summarized history of the old problem:
All the way back in the 18th century, we began poking away at the problem of truth in a unique way. We entered what’s called the modern era and demarcated that entry by the argument that we can know what’s true and real. We invented the scientific method, dove heavily into mathematics and logic, and designed systems that today power and drive the world. We were a truth-oriented society. It’s been said before, but I think we’re coming to the end of that world, and ChatGPT is one of the horsemen.
But what comes next? The old world and its pattern are over, and the new world has come. The new world is a world of aesthetics and qualia and embodiedness and phenomena. It’s the world where we ask “What is it like?” and try to answer it.
By the nature of what it is, ChatGPT cannot know what it is like. ChatGPT is Searle’s Chinese Room: it can take input and give us convincing output, but it will struggle to produce content that requires depicting what living is like, because it just doesn’t know.
What ChatGPT could reproduce:
The essays
I’m not a competent essayist. Essays require a level of fastidiousness that I just don’t care about in a blog. I want to draft the post, edit it, then publish it. Sometimes this means the essay is trite, sometimes it’s strained and unclear, and other times it approaches something good. For most of these posts, I think I would have been better off writing a long prompt to ChatGPT and revising the output bit by bit. I think “Sanctifying the Profane” and “Engineering for the Mind” would likely have been improved with a co-writer.
The poetry
While the current ChatGPT might not be the best tool for this, we can imagine a poetry AI (e e PT? Aisho? Allwordsworth?) that would be immensely useful for writing poems. I suspect the pattern would be drafting, then tinkering, then drafting again, then tinkering again.
But how could it write something like “All She’s Got”? I like this poem. I like the rhythm in the first section and the theme of death and not enough time that runs throughout, but the last section is a rogue section. It connects to the theme but widens it into something broader and cosmic. That turn felt like the truly creative turn of the poem, and now that it’s been explained, maybe GPT could do it, but I don’t know how I would ask it to. Or if I would even need to.
This is where I think we’re going to get destroyed by AI writers. It seems like poetry shouldn’t be vulnerable, but the magic of poetry is that it leaves space for interpretation. ChatGPT can absolutely thrive in that ambiguity, letting our semantic minds fill in the meaning for its syntactical outputs.
The stories
I’m torn here. 2022 marked a turn in how I write: I shifted to exploring ideas in fiction instead of in nonfiction essays. ChatGPT could do a wonderful job creating the vibe and text of the “Alchemy” series, but I’m skeptical that it could invent the rules themselves, and, if prompted to invent rules, whether it could invent ones that have any resonance with our lives. The rules don’t come out of a book somewhere; they come out of my experience of struggling with goal setting and personal transformation over a decade. There’s no internet query that would find them, and probably no clear and concise way of articulating them outside of Buddhist scripture.
I’m not trying to inflate my ego here, but there is something subtle I’m working on, and nothing I’ve seen ChatGPT create seems to cut into it. I’m poking and prodding at the raw feel of living, digging at its raw phenomena and trying to make sense of them. I just don’t know if GPT can do that. It’s plausible that GPT can hide in the same ambiguity that poetry creates, but I’m less convinced here. To tell a story about personal transformation, you need the right components. Perhaps we can train a model to recognize them, but it’s hard enough for human beings to see the pattern. Does that mean GPT is going to be better or worse at it? I don’t know! Perhaps the problem of meaning and transformation is more like spotting tumors in an X-ray (something AI is amazing at compared to humans). More plausible to me is that meaning and transformation are subtle, messy, highly qualified and nuanced problems that are hard to write recipes and algorithms for.