
I can’t help but feel like the hype generators don’t use the tech themselves. Contrast this with crypto as a recent example: sure, there were some interesting tidbits there, but I just didn’t see the appeal.

With LLMs the changes are transformative. I’m trying to learn 3D modeling, and ChatGPT just gave me a passable sketch of what I had in my mind. So much better than googling for two hours. Is the cooling off because industry leadership promised AGI last year and it’s not here yet?



The old way wasn't really that you'd spend two hours googling for the exact procedure you needed; it was that you'd spend time working through getting-started tutorials, following along with example projects, and generally learning the entire program. Then, when you went to make something specific, you could come up with the required steps yourself because you knew the program.

I'm not discounting the value of having ChatGPT just hand you the answer straight up. If you just want to get the task done as fast as possible, that's a pretty cool option that didn't exist before. But the old way wasn't really worse.


The problem is that stacking up these solutions creates technical debt, and that debt can only be paid down by understanding the code base and fixing the issues. Can an AI coding agent do that? Sure, maybe. But what I find is that AI coding agents need your help to do pretty much everything, so you need a good understanding of the code base in order to be helpful. Eventually you sit and wonder whether vibe coding a stack of tools was actually helpful. Don’t get me wrong, I think Claude Code is amazing, but it’s not “and therefore there no longer need to be jobs” amazing.


> following along with example projects

What the LLM gives you is essentially an example project, and you can ask for the specific examples you need. You can compare and contrast alternative ways of doing it. You can give it an example in terms of what you already know, and ask it to show you how that translates into whatever you're trying to learn. You don't have to just blindly take what it produces and use it unread.


This is why EXAMPLES in getting started pages are so important.

LLMs are making up for the lack of this.

It’s the Backus-Naur approach vs the Human approach.

Humans learn by example. IMHO this is why math education (and most software documentation) fails so hard: it starts with axioms instead of examples.


But then it wouldn't be vibe coding! /s

It's endlessly mind-boggling to me how many people can't grasp the idea of just using LLMs as one tool in your engineering toolkit, and that you should still be responsible, thoughtful, and do code review, as you would if you delegated to a junior dev (or anyone!).

They see complete fools just accepting the output wholesale, and somehow extrapolate that to thinking everyone works that way


I really f*king hate this new brand of tech hype. How it used to be:

Here's the iPhone 13: it takes better pictures, lasts longer on battery, and plays games faster than the iPhone 12. Buy it for $699.

Now it has become:

Here's the iPhone 13, the greatest breakthrough in the history of civilization. But enough about that, let's talk about the iPhone 14. We've released a whitepaper showing that the iPhone 14 will almost certainly take your job, and that the iPhone 15 will kill us all unless further steps are taken. It's so powerful that we decided to instill powerful moral safeguards into it, so it will steer you towards goodness and prevent it from being used for evil (such as looking at saucy pictures). We also find it necessary to keep a permanent and comprehensive log of every interaction you have with it.

You also can't have it, but can hold it in your hand, provided you pay us $20/month and we deem you morally worthy of accessing this powerful technology. (Do not doubt any of this, we are intellectually superior to you, and have humanity's best interests at heart, you don't want to look like a fool, do you?)


“A new way to learn Blender” is not a multi-trillion dollar industry, is the thing. “Oh, this is kinda neat, and occasionally useful” just won’t cut it at current levels of investment.


I remember reading about a company that set out to solve the problem of open-source financing: paying the volunteer maintainers of projects used by billion-dollar companies.

At some point the company crossed a billion-dollar valuation, yet it had handed out only single-digit millions in pay to the maintainers.


The education market is certainly multi-trillion.


Who is going to create the content to train AI past 2025? If I don't have a job in 3D, I'm not posting to my 3D tools blog. If I can't sell classes on 3D tools, I'm not posting to my 3D tools blog or creating classes for 3D tools. In fact, if I'm working a Home Depot job, a second part-time job, and driving for Uber, all just so I can try to live, I'm not posting ANYTHING useful; I'm too busy trying to survive. The 'posting useful/insightful things in my spare time' ecosystem requires a social class with the hours, energy, and desire to do that.

Education has a shelf life. AI needs the pre-AI world in order to train and be useful, but AI also wants to replace the pre-AI world with a new AI world. So the world will need to freeze in place between the two.

AI is an entropy machine.


Only if you can replace it completely. The total value of education-material aggregators doesn't sum to multiple trillions, and that is essentially what these replace.

You could say it does a bit more than an education-material aggregator, but not that much more; it doesn't replace paid education in any way so far.


I think the only way you get there is if you assume _all state education spending worldwide_ goes to our friends the magic robots. That’d be a hell of a dystopia; Idiocracy made real.


Well, you said "industry". This is goalpost moving.


You’re the one who made it about _education as a whole_, rather than, realistically, shallow tutorials on how to do 3d modeling. Like, this is the sort of thing that most people learn on their own with the help of tutorials/written material.

(I am, FWIW, _super_ unconvinced that our magic robot friends will be even as helpful there as any decent tutorial on the subject, but even if they are, that’s not really touching on education writ large.)


> Is the cooling off because industry leadership promised AGI last year and it’s not here yet?

Effectively, yes: the promises are so huge that even the impressive usefulness and value it brings today is dwarfed in comparison.


Will ChatGPT be able to do this after people stop creating the content it trained on? AI is an entropy machine. The people making the content AI needed to ingest in order to be useful will be fired because AI takes their jobs. Those people will stop posting, and AI will be stuck in 2025, along with every industry AI touches and absorbs. 3D tools will have to stay at 2025 levels to remain AI-compatible. No more artistic development (sure, one or two home hobbyists will break through, but nothing like the hundreds of thousands of works in every niche we have today), no more job development, no more tools development. AI is an entropy machine.


> Is the cooling off because industry leadership promised AGI last year and it’s not here yet?

Exactly. The business world isn't remotely close to being rational. The technology is incredible, but that doesn't mean it's going to translate to massive business value in the next quarter.

The market reaction to this is driven by hype and by people who don't understand the technology well enough to see through the hype.


Then there’s me, spending my August learning Blender the manual way, like we did before 2022, to escape the dreadscape of AI-infected software engineering, and discovering that, wow, I really enjoy 3D modeling, to the point that I might at least make it my hobby, if not pivot my career to it.

I wonder if I should have listened to the hype generators (of which you sound like one) and just created ‘passable’ models with the help of an LLM, instead of exercising my brain, learning something new, and getting out of my comfort zone.

At the risk of sounding controversial, I’ll add that I hold a view of crypto’s utility versus LLMs’ that is diametrically opposed to yours, especially in the long term: one will allow us to free ourselves from the shackles of government policy as censorship expands; the other is a very fancy and very expensive nonsense regurgitator that will go on to destroy the Internet and any sense of credibility and truth, making people dumber at scale while lining the pockets of a lucky few.


It's "the rest". More complex projects, existing context (like infrastructure, code, business context, ...).

Building a small script is easy for chatgpt, but actually leveraging the workforce consistently turns out to be a lot harder than the hype promised.


AI gets pulled out every couple of decades and over-hyped. It never matches the hype; the term AI gets a bad rap and goes away, and the useful stuff gets a new name, like machine learning, to shake off the bad connotations. We saw this in the '60s, the '80s, the early aughts, and now today.

I remember in college during the late '90s, the hype was that CASE tools (Computer-Aided Software Engineering) were going to make software engineers irrelevant: you'd just tell the system your requirements and it would spit out working code. It never panned out.

Today, the only way the amount of investment returns a profit is if it replaces a whole bunch of workers. But if it replaces a whole bunch of workers, there will be a whole lot fewer people to buy stuff. So either the bubble bursts or a lot of people lose their jobs. Either way, we're in for a rough ride.



