> I look forward to the time when I can have as much high-quality (to me) fiction to read as I like, because it's all generated by LLM. Some time after that, I'd love to see the main Star Wars sequence done properly. I won't care that it isn't created by a vast team of humans.
I think the problem here is analogous to the "500 channels and nothing to watch" issue in the heyday of cable.
OK, let's say you have an LLM in your hand that can generate any story you want. High quality. So you say "tell me a story" and it tells you a story. But what story? Who is in it? What characters? Why are they there?
The only novelty that's going into this is the prompt. Everything else is regurgitated weights and probability associations. The question is: does the full infinite closure of recombination over some finite training set (no matter how large) encompass enough of the essence of creativity to produce something "new"?
This is a hard question to answer because it forces us to try to define creativity, or - lacking that - at least try to identify where it comes from.
I don't have a clear answer to this but I'll suggest a line of thinking that seems plausible.
When a person writes a story, it's not derived as an amalgam of everything they have read. It's not some probabilistic weighted average of all those associations. The story they write is also derived from their lived experience. Their personal interactions, their observations, their musings, their passions, their fears.. and how all of those things interact with their circumstance, influencing their reactions, those reactions influencing their environment, and that feeding back into the above process.
There are two components that seem important here: the first is the existence of a rich, dynamic, and active _dialogue_ between the mind and its environment. It's not static, and it involves a feedback loop between the mind and the environment it models.
The second is a motive force. For humans the origin is biological. Fear, hunger, satiation, arousal, etc. - those core primitive emotional drives that originally developed to help us survive, but were then layered over with an intellect that elaborated on them. What originated as a motive force to drive the mating instinct evolves into a sonnet about an unattainable maiden. The fear of the dark that keeps us away from the places where we would be eaten.. evolves into stories about unfathomable creatures and impossible colours that drive men insane.
And I think there's a third one that's unelaborated and implied but should be made explicit: introspection & reflection. The ability to consider your choices and consequences with respect to your motivations, and adjust any number of things - from the motivations themselves, to expectations/understanding.
This creature would have a lived experience, some underlying motivations, and a feedback loop established between the two using introspection. I have no idea how you'd build any of that.. but it feels like that's what you'd need before you got yourself a good storyteller.
But by that point, you'd also be compelled to question whether or not it's even ethical to force it to tell you a story anymore.
I don't think it's impossible that some broader AI system will eventually be capable of genuinely creative output. LLMs are not that, though.
They seem more like a substrate.