Agreed. Windows 10/11 can run just fine on 4GB of RAM. You just can't run anything inside of Windows 10/11 with 4GB of RAM.
The last version of Windows where 4GB of RAM felt performant for me while running applications was Windows XP. Not that every computer running the 32-bit edition of Windows XP could even see/utilize a full 4GB of RAM properly, but at least it was fast.
I ran a Windows 7 system with 3GiB as a gaming machine and it was just fine. Windows 7... the last Windows release that was acceptable-ish. Memories...
Neo is powered by a fast and battery-friendly chip. It's no more a novelty than Chromebooks or Windows 11 notebooks with integrated graphics have been.
I’ll go further: there should be laws addressing account consolidation. Getting banned from an Apple or Google account is an incredibly wide blast radius. It would be like being banned from buying Unilever or Nestle food from your grocery store.
Email providers should be utilities and also legally require a warrant before disclosing any information whatsoever to the government.
Unfortunately the government is full of corrupt geriatrics who do not understand technology and are paid to continue not understanding technology as they sign bills prepared for them by ALEC.
No Google account has been banned for this. People just keep spreading this lie because no one agrees that they have the right to steal the OAuth token.
It's their OAuth token, it's not being stolen. It's just being copied from one place on their computer to another. This is no different than a competing browser importing your localStorage and cookies from Chrome on first launch.
No, the OAuth token is supposed to be used solely within the context of a first-party app. Clearly, if you need to extract the key by reverse engineering or set up a proxy to spoof requests to a service, you're doing something shady.
> No, the OAuth token is supposed to be used solely within the context of a first-party app.
The web doesn't work like that. The operators of google.com saying you must only use Chrome to load it is a ridiculous concept. It's not spoofing to use your own access credentials on your own computer to access your own account on an HTTP API.
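To make that point concrete, here is a minimal sketch of what "using your own access credentials on your own computer against an HTTP API" looks like. The endpoint and token below are hypothetical placeholders, not any real service's values; at the protocol level the server only ever sees HTTPS plus an Authorization header, no matter which client sent it.

    import requests  # third-party HTTP library
    
    # Hypothetical values for illustration only; any OAuth bearer token and
    # HTTPS endpoint behave the same way at the wire level.
    ACCESS_TOKEN = "example-oauth-access-token"      # token already issued to this user
    API_URL = "https://api.example.com/v1/profile"   # placeholder API endpoint
    
    # From the server's perspective this request is indistinguishable from one
    # made by the "official" client: it is just an HTTPS GET with a bearer token.
    resp = requests.get(API_URL, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
    resp.raise_for_status()
    print(resp.json())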
Technically speaking, they haven’t been able to. There’s really no way of stopping someone using an alternate client if it appears to the server the same way.
The only reason video game cheating is more difficult is that games use custom protocols and message types that need to be reverse engineered. Usually it's just easier to reuse the existing game client and patch it to report to the server that everything is normal.
Most people would agree both that getting rid of cheating is desirable and that the methods of control exerted over users to accomplish it are questionable. It's one of the few freedom/security tradeoffs where people generally agree we have to come down on the authoritarian side, because otherwise it destroys online gaming as a whole. That scenario doesn't apply here. The world is a complex place.
How do so many people think this happened? All of the articles I’ve read have been clear that it did not happen. Yet it’s all over the comments here. Why?
Sometimes I forget programming languages aren't a religion, and then I see someone post stuff like this. Programming languages really do inspire some of us to feel differently.
> The building we exited was another one of the terrafoam projects. Terrafoam was a super-low-cost building material, and all of the welfare dorms were made out of it. They took a clay-like mud, aerated it into a thick foam, formed it into large panels and fired it like a brick with a mobile furnace. It was cheap and it allowed them to erect large buildings quickly. The robots had put up the building next to ours in a week.
> The government had finally figured out that giving choices to people on welfare was not such a great idea, and it was also expensive. Instead of giving people a welfare check, they started putting welfare recipients directly into government housing and serving them meals in a cafeteria. If the government could drive the cost of that housing and food down, it minimized the amount of money they had to spend per welfare recipient.
> As the robots took over in the workplace, the number of welfare recipients grew rapidly. Manna replaced tens of millions of minimum wage workers with robots, and terrafoam housing became the warehouse of choice for them. Terrafoam buildings were not pretty, but they were incredibly inexpensive to build and were designed for maximum occupancy.
I switched to Ubuntu last week for my desktop. First time in my 25+ year career I’ve felt like Microsoft was wasting my time more than administering a Linux desktop would take. The slop effect is real.
I've used Kubuntu for several years, and now my wife does too. It's an official, supported flavor of Ubuntu that uses the KDE desktop instead of GNOME. It gives a more Windows-like or CDE (Common Desktop Environment, from UNIX systems) feel, whereas GNOME feels more Mac-like.
I am not getting what that linked URL is supposed to mean. It's a perfectly ordinary business page where Ubuntu is selling consulting for "your" projects and explaining why Ubuntu is great for developing AI systems.
I wasn't making an argument. It was a prediction that all major software (including the major Linux distros) will eventually be majority (>50%) AI-generated. Software that is 100% human-generated will be like getting a hand-knitted sweater at a farmers market. Available, but expensive and only produced at very small scale.
On what reasoning do you make this prediction? Just because corporations are mandating their employees to use AI right now does not mean it will continue.
Any new software developers entering the field from this point on will have to know how to use, and be expected to use, AI code-gen tools to get employment. Moving forward, eventually all developers will use these tools routinely. There will be a point in the future where no one left working has ever coded anything complex from scratch without AI tools. Therefore, all* code will involve AI code-gen, since all* developers will be using these tools.
* 'all' means 'nearly all', as of course there will be exceptions.
> Any new software developers entering the field from this point on will have to know how to use and be expected to use AI code-gen tools to get employment
So eventually, doesn't the KPI move from "more code" to "better code"? The pendulum will have to swing the other way at some point; seems like Microsoft is just accelerating that process.
> doesn't the KPI move from "more code" to "better code"?
I would love for this to be true. But another scenario that could play out is that this process accelerates software bloat that was already happening with human coded software. Notepad will be a 300GB executable in 2035.
And this will cause what I'm talking about -- when nobody can afford memory because it's all going into the ocean-boiling datacenters, all of a sudden someone selling a program that fits into RAM will have a very attractive product.
Half of the country has been left behind to the extent that they are worse off financially than their parents were at the same age and have no path to improvement. It just keeps getting worse for them. The other half of the country thinks they are uneducated buffoons who are morally bankrupt and need to do more to help themselves.
They blame the other half of the country for calling them "deplorables", and they blame immigrants, globalization, religion, gender, education, science, and anything else that makes an easy scapegoat. Blame whoever you want, but the symptom of this illness that the institutions failed to resolve is the current political reality.