I feel the same. I think the reason is that GenAI has effectively abstracted away the tooling layer. Not perfectly, and not always efficiently, but in terms of going from requirements → workable outcome, it has removed much of the pain of choosing one developer experience over another.
There are some limitations currently, such as functions not being able to contain insert/update/delete (DML) commands, but we'll be working on lifting this limitation in 5.0 (along with possibly allowing recursive functions). And because EdgeQL composes well, you'll get a lot of mileage out of our functions once we support DML in them.
Building something equivalent to PL/pgSQL is one of the things that will eventually happen, but unlikely to happen very soon.
I actually wonder if humans lack a crucial adaptive advantage if they do not intuitively understand how systems work. But then it occurs to me that some ancient philosophies and religions emphasised the need to be in tune with the surrounding world.
I think the more defective part of humans is our near-complete inability for long-term thinking and planning, especially collective long-term thinking and planning. Just look at our daily lives and jobs. When are long-term plans ever truly engaged with and acted upon? Almost never. There is far too much self-induced noise in society and the economy, and there's a hyper-focus on short-term results and concerns.
I am thinking more of an automatic ability to see how things are related. Chinese language(s) ↔ Taoism, in the context of a holistic approach to worldview [0]. I know I am exaggerating, but maybe some meditative training could help in this regard?
How would understanding systems have been significantly adaptive for humans before a few thousand years ago? I agree with you that humans generally lack this capability. We also lack the ability to understand exponential growth, which I think partly underlies our inability to comprehend systems.
My personal theory on this is that it comes from the fact that our sensory systems operate on a logarithmic response curve [0]. Note, for instance, how the decibel scale for measuring sound intensity is a logarithmic scale. Because our sensory systems respond logarithmically, that means an exponential increase in stimulus feels linear, at least until the point where the stimulus is damaging or so intense as to be uncomfortable. The end result is that we think "it's not so bad" until it's really bad.
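The "exponential feels linear" claim above can be made concrete with a tiny sketch, assuming the Weber-Fechner-style model that perceived intensity scales with the logarithm of the stimulus (the constant `k` and the doubling schedule here are illustrative assumptions, not measured values):

```python
import math

# Assumption: perception ~ k * log(stimulus), per the logarithmic
# response curve described above (Weber-Fechner law).
def perceived(stimulus, k=1.0):
    return k * math.log(stimulus)

# A stimulus that doubles at every step: exponential growth.
stimuli = [2 ** n for n in range(1, 6)]  # 2, 4, 8, 16, 32
levels = [perceived(s) for s in stimuli]

# Successive differences in perceived intensity.
steps = [b - a for a, b in zip(levels, levels[1:])]
print(steps)  # every step is the same constant, log(2) ~ 0.693
```

Each doubling of the stimulus adds the same fixed increment to the perceived level, so an exponentially growing input registers as a steady linear ramp, which is exactly the "it's not so bad until it's really bad" effect.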
I find your perspective interesting, but I have the feeling that you are not thinking in systems (!). The sensory systems are perceptual systems, but they are subsystems of a larger "cognitive" system, and we cannot be sure that it exhibits the same logarithmic response behavior.