Friday, April 15, 2022

GPT-D, the dream-maker

I’ve been impressed by the creativity of my dreams – they show imagination and invention that my waking life doesn’t have. They’ve become tiresome, though, because I can’t make sense of them, try as I might. I’m beginning to think that there is no meaning to be found because there is no dream-self; it’s just a mindless algorithm churning out material.

There are certainly recurring themes and images in my dreams, but I can’t figure out what they mean. What is my dream-self trying to say to me? I have no idea. Here’s a story: 

There was once a king, and every morning his aunt would walk into the throne room. If he looked at her, she would tell him things: usually stories, but sometimes just snippets. They were puzzling and mostly – but not entirely – irrelevant to what was going on in his life. He couldn’t make sense of what she said, nor could he find a use for the information. Sometimes she’d disappear for a while, but she always returned. And if he ignored her for too long, she would eventually insist on telling him something whether he wanted to hear it or not.

That’s it. 

My dreams aren’t gibberish, and my recollections don’t seem to be my conscious mind imposing arbitrary structure on chaos. The dreams mostly make sense: the settings, characters and events are intelligible and plausible. Even though no two dreams are the same, there are patterns; recently, for example, several dream plots have featured luggage being left somewhere, leading its owner to panic.

I’ve tried a couple of dream interpretation approaches, to no avail. I can invent interpretations, but they feel forced. Dream interpretation, even when done by professionals, reminds me of literary close reading; it’s largely a projection of the interpreter’s ideas rather than a revelation of some underlying truth. So even if the dreams were messages from the unconscious, the decoding doesn't reveal what the sender intended. To me, it’s an unsatisfying hermeneutic game.

But perhaps dreams aren’t meaningful messages in the first place. Perhaps there is no sentient entity sending them. Perhaps they’re just like text generated by an AI like GPT-3.

GPT-3 is a language prediction model (OpenAI, Wikipedia, TechTarget). It uses neural-network machine learning to take seed text and generate follow-on text. It has been applied in many ways, including generating newspaper pieces and horror movie plots. However, there’s nobody behind the curtain; GPT-3 is a real-world philosophical zombie, an entity whose output is (at least sometimes) indistinguishable from a person’s, but that has no conscious experience.
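For concreteness, here’s a rough sketch of the ‘seed text in, continuation out’ idea. It doesn’t use GPT-3 itself (that sits behind OpenAI’s API), but its smaller, openly available cousin GPT-2, via the Hugging Face transformers library; the prompt is just an invented example.

    # A minimal sketch, assuming the Hugging Face "transformers" library is installed.
    # GPT-2 stands in for GPT-3 here: the same idea, a much smaller model.
    from transformers import pipeline, set_seed

    generator = pipeline("text-generation", model="gpt2")
    set_seed(42)  # make the pseudo-random sampling repeatable

    seed_text = "Last night I dreamt I left my luggage on a train, and"
    result = generator(seed_text, max_new_tokens=40, num_return_sequences=1)

    # The model continues the prompt with fluent, plausible text.
    print(result[0]["generated_text"])

The continuation is often coherent, even inventive, but nothing in the process intends anything; it is pattern-completion over the training data.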

My dreams feel like something GPT-3 could’ve produced. They’re comprehensible, inventive, and incorporate events from my prior experience (the ‘training set’). However, there’s nothing (such as a Self) trying to communicate with me; it’s just a brain process that spins out plausible patterns based on prior data. To the extent that the unconscious is not conscious, perhaps it, too, is a philosophical zombie.

Tangent: Perhaps egregores are philosophical zombies…

