> The problem here is taking "word for word" as "by dictionary meaning", which is never how laws are read.
Back in the days of "smart contracts" and "DAOs", this was something many well-meaning technical people struggled with. Humans and their societies are flexible, and therefore laws must be flexible as well (to a certain degree, before it becomes damaging).
It's also why a lawyer/expert is usually recommended when engaging with legal matters: we as laymen lack all the context around seemingly "simple" concepts, procedures and definitions. You can learn all of that or hire a professional.
I believe Enkidu also became much more wild over time. In Against the Grain the author suggests earlier tellings of Gilgamesh presented Enkidu as unusual, but human. When the same story was being recorded a thousand years later, he was a monster. If the story was preserving some ancient memory about Neanderthals, such significant change seems unlikely.
I just looked it up for Germany[0] and there were a whopping 3 (0.0%) new hydrogen fuel cell cars registered in February 2026. Even LPG cars outnumbered them, with 397 registered.
For comparison, 21.9% were BEVs, 11.5% plug-in hybrids, ~51% pure petrol or non-plug-in hybrids, and 14.8% diesel.
That would be a good experiment and could actually work.
I would love to try it; however, they would have to solve "global song availability" and "sponsored songs only stations".
But if they did try there is the chance of some niche communities forming.
It wouldn't even need to be live to begin with: a narrated playlist with a DJ and basic control functionality, such as fading into songs or a voice-over.
Not trivial but doable and I wonder why they never tried that.
It has been tried. I don't remember its name, but I remember that they have changed names at least once. It's a pretty obvious "app" for Spotify's API which they opened up a few years ago.
By 2030, advances in compute and software will vastly reduce requirements.
Nvidia has so much cash for R&D. You can bet they now have immense optimization and improvements in the pipeline, but why would they release anything groundbreaking right now?
There are no real competitors on their heels. As Intel did for decades, they will likely dole out improvements only when necessary to remain ahead.
By 2030, I expect 10x improvement. We're also seeing stunning optimizations in trained models.
I imagine desktops running many of these local by 2030, even phones.
Will we need even 1/10 of the datacenters for LLMs in 2030? Certainly, privacy concerns are a thing.
When I read this comment, all I see is: LLMs at the edge - or close to it - will become available. And whoever provides the best ecosystem across digital lifestyle and business wins.
Oh... Apple? lol.
Well that'd be funny wouldn't it.
Oh, and don't forget Apple got rid of its reliance on Intel too. No reason why this can't happen again.
Could not believe how almost no-one saw this [0]. It was quite obvious that companies like Apple, which are prioritizing local inference and potentially training, have already won the race to $0.
I did lol. I can't speak for enterprise in totality, but I see a world where Apple is the dominant provider of products/services for consumer and SMB's.
Google's lack of investment in marketing, design, and sales/distribution capabilities is going to hurt them badly. MSFT is no different in many respects - latching onto the investments in 'relationships' and 'switching costs' that have kept customers loyal to them.
There will be continued hyperscale AI in the datacenter for some use cases, and AI in the smartphone (or PC) for other use cases. It is guaranteed to split that way. Apple's remarkable capabilities around custom chips will enable it to continue to stay out in front in smartphones.
They were ridiculed for being 'behind on AI'. They haven't spent a dime on AI-related infrastructure and so on...
And yet, they could stand to be the biggest beneficiaries, if not the only ones, given that they have plenty of resources in reserve and are buying back stock - enabling insiders to have a greater say on actions in the future.
So, tl;dr: they are just standing by until everyone else dies? If that's the theory, then they HAVE to be doing some serious AI work internally - R&D, in secret - so they're essentially "ready to go"?
I guess: what's the downside to getting into the game now and gaining users, since they have loads of real cash anyway and a crash wouldn't really hurt them?
I don't use AI for the sake of it, I use it where and when it is useful. For example:
1. advanced autocomplete -- if you have or paste the structure of a JSON document or another format, or a class's fields, it is good at autocompleting things like serialization, case statements, or other repetitive/boilerplate code;
2. questions -- it can often be difficult to find an answer on Google/etc. (esp. if you don't know exactly what you are looking for, or if Google decides to ignore a key term such as the programming language), but an AI can often do better.
Like all tools, you need to read, check, and verify its output.
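To illustrate the first point, here's a hypothetical sketch (the `Config` class and its `to_dict`/`from_dict` helpers are made up for illustration) of the kind of repetitive serialization boilerplate an autocomplete is good at filling in once it has seen the class's fields:

```python
from dataclasses import dataclass

@dataclass
class Config:
    host: str
    port: int
    debug: bool

    # Repetitive boilerplate: once the fields above exist, an autocomplete
    # can typically generate these methods field-by-field.
    def to_dict(self) -> dict:
        return {"host": self.host, "port": self.port, "debug": self.debug}

    @classmethod
    def from_dict(cls, d: dict) -> "Config":
        return cls(host=d["host"], port=int(d["port"]), debug=bool(d["debug"]))

cfg = Config.from_dict({"host": "localhost", "port": 8080, "debug": False})
print(cfg.to_dict())
```

Tedious to type by hand, trivial to review - which is exactly the trade-off being described.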
Personally, I find this workflow jarring. I get into flow typing code, and then the AI autocompletes the next four lines on a tab press. Now my flow screeches to a halt because I have to switch from flow mode to review mode to make sure it actually autocompleted what I wanted.
Text editors/IDEs have simple autocomplete and the ability to do some expansion, e.g. a for loop with placeholders to fill in. Those work and are still useful.
JetBrains also has local line-based LLM models for various languages.
With LLM-based autocomplete it a) generally autocompletes more code at once, and b) will often pick up on patterns in the existing code. E.g. if you have a similar method, a list of print/string-buffer write statements, or other repetitive code in the file, it will often use that as a model for the generated code.
Sitting here on the sidelines having never configured snippets or macros or any of that in any of my editors, which I could have done like 30 years ago but never bothered in all this time, doing quizzical-dog look at all these people thrilled about LLMs.
I guess they might finally get me to use those things since they take the “configuring” and “remembering shortcuts” part out, but so much of this doesn’t look new at all. Super old, actually.
In my objective opinion, almost all AI use cases (coding or otherwise) exist just because of people's extreme laziness about spending a little time setting up some "automated" workflow, be it canned templates or whatever. The non-AI approach has the added benefit of being precise!
Customizable snippets are a feature editors support (which I mentioned, as they are related/similar to what the AI is doing), but they are different from the AI autocomplete behaviour.
If I have a JSON structure, I can paste that into the file as a comment, e.g.:

  # {"foo": 1, "bar": "test", "baz": 2}
  @dataclass
  class FooBar:
      foo:

and the AI will/can autocomplete/generate that to:

  @dataclass
  class FooBar:
      foo: int
      bar: str
      baz: int
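Filled out into a self-contained script, the example looks like this (the `FooBar` name and field values come from the comment above; the `json.loads` round-trip is just illustrative):

```python
import json
from dataclasses import dataclass

# The JSON structure that was pasted as a comment:
# {"foo": 1, "bar": "test", "baz": 2}

@dataclass
class FooBar:
    foo: int
    bar: str
    baz: int

# Parse the JSON and construct the dataclass from its keys.
data = json.loads('{"foo": 1, "bar": "test", "baz": 2}')
fb = FooBar(**data)
print(fb)  # FooBar(foo=1, bar='test', baz=2)
```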
The JetBrains local autocomplete is hilarious but occasionally useful. I find it really hit-and-miss in terms of when it will decide to autocomplete, and whether it will exhaustively complete all elements, miss some out, or get itself into a loop over several.
The out-of-the-box stuff is supposed to be kind of stupid. Are you guys really not editing your own snippets and shortcuts? Have people really been typing out "def do_something(foo, bar, baz)\n\t" manually?
Won’t you get much better results by maximizing your use of some sort of LLM? For many people, optimizing for LLMs would yield faster and better results than optimizing for any standard word processor or music-composition tool.
Speaking for myself (who can program and all that), AI solves some of the tedium in my day job (building UI components). Most of that work nowadays is boilerplate.
But at the moment it's also helping me solve more complex issues with building applications - it's JS, so you can imagine how complex it can be.
I yearn for a simpler workflow, to be honest; I don't want to rely on SO or LLMs to solve build issues. I want to work in Go, but there's only a handful of companies using it in my country, plus my CV basically says I mainly did front-end for the past ~15 years.
Selling shovels is quite lucrative, whether there is an actual mining business or just a gold rush.
At some point Jensen Huang will be out (retired, or forced out by stagnating sales) and can definitely look back on a very successful career. That much is certain.