Hacker News | p1necone's comments

They might have tried, but this would be pretty hard to achieve for real - especially for the older/worse models. For changes that do more than alter a couple of lines, LLM output can be very obvious. Stripping all comments from the changeset might go a long way to making it more blind, but then you're missing context that you kinda need to review the code properly.

This is the most common way this happens in my experience - people naively assume that by giving just a date and not a time surely it won't do timezone conversion, but it does (and even worse, that behaviour is not at all consistent between different languages/systems). Oh, and fun fact: JS parses 'YYYY/MM/DD' (slashes instead of dashes) differently from the dashed format as well...
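To make the gotcha concrete, here's a minimal sketch. The behaviour of the slash format is implementation-defined; the comments assume V8 (Node/Chrome):

```typescript
// A bare dashed date string is parsed as UTC midnight, so in any timezone
// behind UTC it displays as the *previous* calendar day locally.
const dashed = new Date("2024-03-10");
console.log(dashed.getUTCHours()); // 0 (UTC midnight)

// The slash format falls outside the ISO 8601 fast path; V8 parses it as
// *local* midnight instead - a different instant in most timezones.
const slashed = new Date("2024/03/10");
console.log(slashed.getHours());   // 0 (local midnight, in V8)
```

So the "same" date string, with only the separator changed, can refer to two different instants in time.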

The 'safe' way that I try to make everyone use for 'wall clock'/'business' dates is 'YYYY-MM-DDT00:00:00' (without the Z) - this unambiguously parses as 'this date in the current timezone' in basically every language's Date type, and it's ISO 8601 compliant. However, it's still a pain in the ass to keep straight when serializing, as the 'default' output is usually a timezone-converted UTC string (Z at the end).
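A quick sketch of both halves of that comment - the safe parse, and the serialization trap:

```typescript
// T00:00:00 with no trailing Z parses as local midnight on that date,
// in every timezone.
const localMidnight = new Date("2024-03-10T00:00:00");
console.log(localMidnight.getHours()); // 0, regardless of timezone
console.log(localMidnight.getDate());  // 10, regardless of timezone

// The serialization trap: toISOString() always converts to UTC, so in
// any timezone ahead of UTC the date portion shifts back to the 9th.
console.log(localMidnight.toISOString()); // e.g. "2024-03-09T22:00:00.000Z" in UTC+2
```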


You clearly haven't worked on codebases with other developers

The first thing I do with any new system is immediately wipe the drive and install a fresh copy of Windows/Linux, so bundled shovelware is meaningless to me, and presumably many others.

(Of course it would be even better if they just came with a totally stock install already, but that's not worth hundreds of dollars to me)


I can imagine doing that for Linux ... but why tech people battle Windows at this point is beyond me.

That’s an added Windows license though?

The Windows license is stored inside the BIOS. When you install fresh Windows, it'll get activated automatically.

What's more troublesome is that some laptops require drivers and customizations, so you need to tinker with your fresh Windows by installing a carefully selected subset of drivers, so your hardware works and at the same time you don't reinstall the same shovelware. The driver situation for Windows is truly dire. There are drivers from the laptop manufacturer (e.g. Lenovo). There are drivers from the part manufacturer (e.g. Nvidia). There are drivers that Windows was bundled with. There are drivers that Windows will download and install automatically as part of Windows Update. It's a huge mess and I don't think anybody knows how to navigate it. So there's no reliable recipe to create a "stable" Windows from scratch.


> When you install fresh Windows, it'll get activated automatically.

The same happens with some crapware provided by the vendor. You can wipe the drive all you want, but an ASUS motherboard will ask Windows to automatically install "essential drivers" - to be specific, "Armoury Crate".


With a Lenovo ThinkPad, Windows downloads all the needed drivers and even BIOS updates.

My T14 Gen 1 (we're at Gen 7 now, I think?) still gets updates. It's pretty neat.


You can extract the key and write it down. It's a 2-minute job.

My laptop is always either plugged into a dock at work, or plugged into a dock or just a power supply at home. I feel like there's an untapped market for 'same laptop, but slightly cheaper because there's no battery in it at all'.

Like you say, most Windows laptops have such garbage battery life already that it's not practical to use them unplugged.


> 'same laptop, but slightly cheaper because there's no battery in it at all'

So, a simple computer? You can even choose your keyboard, mouse and screens.


Not the same - I still want to be able to just use and carry round the one thing without needing a monitor, mouse, keyboard etc at every single location, but I basically never need to use it somewhere where there isn't a wall socket available.

When I bought a Thinkpad a few months ago there was an option to order it with a smaller battery (which I selected).

It seems ridiculous on the surface, since you'd think you'd just buy a desktop or something, but with a laptop with no battery, and hypothetically better everything else, it would eliminate the need for a bunch of other peripherals

Ah, but then you'd need to hard shut down to carry it home. The battery should keep the RAM active during the commute while sleeping.

Kind of an interesting idea. Only the portability but none of the mobile computing capability.

It does kind of seem like, outside a few select models, the PC market just gets the laptop part of laptops so so wrong. Bad touchpads, bad screens, no battery life, unpleasant industrial design usually, crammed with crapware and other bullshit. I hand it to the few companies that do try harder to remedy these.


Eh, I want some battery, it's nice when you need to move rooms or someone kicks the power cable out. Even 15 minutes would be enough for a chonkster machine like this.

I wonder if a big capacitor would be cheaper than a battery, probably not with how huge in scale battery production is at this point.

The early hybrid car of computers

Or if even that feels too verbose, just a 'nullable' modifier on the variable or field definition, with the default being not nullable.

(Although Optional/Maybe types are definitely my preference based on the languages I've used)
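For what it's worth, TypeScript under strictNullChecks behaves roughly like the modifier described above: types are non-nullable by default, and `| null` acts as the opt-in 'nullable' marker. A sketch (not a claim about any particular language proposal):

```typescript
// With strictNullChecks, this parameter can never be null; the compiler
// rejects greet(null) at the call site.
function greet(name: string): string {
  return `Hello, ${name}`;
}

// Opting in: '| null' acts like a 'nullable' keyword, and the compiler
// forces a null check before the value can be used as a plain string.
function greetMaybe(name: string | null): string {
  if (name === null) return "Hello, stranger";
  return `Hello, ${name}`; // narrowed back to string here
}
```

The union-type approach sits somewhere between a bare modifier and a full Optional/Maybe wrapper: there's no explicit unwrap step, but the compiler still won't let you forget the check.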


Being limited to 8GB of RAM is genuinely the only thing on that list I care about (no backlight and no fast charging are teetering on the edge of me caring, but they aren't worth multiple hundreds of dollars) - Apple silicon is so fast now that (at least for my purposes) the performance segmentation between price points is basically meaningless.

A keyboard backlight is such a cheap and useful addition to a keyboard, it feels insulting not to get it. I cannot believe this is one of the ways they decided to cheap out.

I wouldn’t even care about the 8GB of ram if I could just add some myself.


> A keyboard backlight is such a cheap and useful addition to a keyboard

Useless LEDs that burn battery budget.

The thing everyone seems to be missing is this isn't a laptop for you or me. It is to compete with Chromebooks in the educational market, and to have a SKU to sell in developing countries.


Thank goodness they removed this fantastic thing everyone wants to give you an extra fourteen seconds of use time per battery charge. Come on man.

As for the importance of it, if you want to give these to kids, you should have something more rugged, more replaceable, and more built for all kinds of environments (including kids who don’t have a conveniently well-lit place to focus on schoolwork at home).

A large school could have thousands upon thousands of broken Chromebooks waiting to be shipped off - literally multiple pallets. I’ve seen it more than once. Absolutely nobody is begging for an unrepairable, unexpandable, more-expensive version of what they all already have. It’s garbage for school, dead out of the gate.


>> fantastic thing everyone wants

I wouldn't normally comment on such stuff as it's clearly a personal preference, but just to underline that it is in fact a preference vs everyone, I have used keyboard lighting exactly once in the ~decade it's been available to me. On a laptop with predictable keyboard, it genuinely doesn't matter to me.

(On a laptop with unpredictable keyboard, light is mitigating, not fixing the problem :)


Why do you need to see your keyboard?

Touch typing is a useful skill for everyone to have and doesn't take long to acquire.

Not to mention even the light of the display should be enough for you to be able to read the key caps if you really need to. Keyboard backlight seems like a gimmick with limited use to me. I always thought it was purely aesthetic.


You're sitting back in a chair watching YouTube in the dark. Hit F for fullscreen. (OK, that was the easy level because of the key bump.) Now hit L to skip 10 seconds forward. Now hit < and > to adjust speed.

The backlighting is useful. But no, it's not for typing, for most people.


I don't have a habit of sitting in the dark.

Also I don't understand what would be hard about your challenge. My hands automatically move to the home row, feel the key bumps and I instantly know where every key is. I never need to look at my keyboard. Not to mention having to move my eyes down from the displays would be annoying.

I mean, people like backlit keyboards. So if it fits your use case, great. It still makes sense not to include one in a base model. I actually actively avoid keyboards with any lighting.


Congratulations on your keyboard superiority. I was just explaining why mere mortals like myself like backlight.

The fact that one in ten million people is annoyed by one of the softest lights ever invented by mankind is not a good reason to not include said feature in a product my guy.

Most people don’t have the touch typing skill and do not care to learn it. It literally matters zero per cent if they would benefit from learning that.

"everyone wants"? I am not even sure I understand the utility. Typing in the dark? For, idk, living in a cave?

14 seconds? Lights are expensive when it comes to batteries.

Isn’t the iPad already competing in these segments? Because unless reality has changed dramatically, this is still fairly pricey and a full-fledged laptop that doesn’t make for direct competition with Chromebooks.

Even in my home country of Portugal, 700€ is a lot to throw at a ‘laptop’ that will be somewhat obsolete in 3–4 years, assuming Apple continues the trend of graphics-intensive, memory-hungry OS releases. An iPad seems like a better candidate for students or those on a budget.

I’m actually not sure who the Neo is for. Unless it’s a 3-model trick to price the Air upwards.


> I wouldn’t even care about the 8GB of ram if I could just add some myself.

I think that’s pretty unreasonable when they’re using an iPhone SoC to keep it cheap because they have massive volume. It was only ever available in 8GB and never designed for user upgradable memory because it’s for a phone.


It’s basically a web browser machine, that’s fine.

Damn, everyone is using AI for copyediting now aren't they? Once you notice the patterns you see it everywhere.

* "This isn't X. It's Y"

* "Some sentence emphasizing something. Describing the same thing with different framing. Describing it a third time but punchier."

* The em-dash of course

* A hard to describe sense of "cheesiness"

I only hope the models get good enough to not be so samey in the future.


Once you see it you can't unsee it. Although maybe this is how corporate blogslop has always been, and we're only noticing now that it's infected everything.

> "These are not complaints, merely observations."

> "There are repairable laptops, and then there are ThinkPads."

> "iFixit approached the relationship as collaborators, not critics."

> "[...] they didn’t declare victory and go home. They kept pushing."

> "Designing for repairability doesn’t mean compromising innovation or premium experiences; when done well, it actually drives smarter innovation, better modularity, and more resilient platforms."

> "It would be one thing to make a highly repairable but low-volume niche device or concept. Instead, Lenovo just threw down a gauntlet by notching a 10/10 repairability score on their mainstream-iest business laptop."

> "This is [...] how repair goes from being an enthusiast’s “nice-to-have” to being baked into procurement checklists and fleet-management decisions."


There's a desperate grasping for drama and simplicity about it -- same as most mass-media news stories. I recall reading somewhere that the two watchwords of journalism are "simplify, and exaggerate". Maybe add to that: "Make all your metaphors cliches, so the reader doesn't have to think about what is meant."

Yeah, it's weird. It's like one person writes articles for the whole world. Probably will be fixed in a few AI iterations to present more styles, but right now it's everywhere. Articles, even forum posts.

I found a way to 'de-smell' LLM copy: tell it to take a second pass that processes the text output with the William Burroughs cut-up method. Works well for a small subset of use cases.

Presumably the smelly AI text problem is just ... a problem that will be solved. Or maybe we'll just get used to it.


I believe it's already a solved problem, especially with base models (pre-RL), but they still push the LLM voice, either to make it easy to identify or because they think it's likeable. So it's not that OAI, Anthropic, and Google can't get rid of the assistant voice; it's that they don't want to.

We've gone the wrong direction on the verbosity scale.

Unless I'm reading for pleasure, I want everything in concise summaries. I don't need flowery language. Or even complete sentences.

Maybe an LLM verbosity slider that dynamically truncates text we don't need. I'll dial mine down.



I recently destroyed the screen on a Google Pixel during a repair following a shoddily written set of iFixit instructions. I wish I had checked the comments, where many people complained that the instructions were wrong.

It was about a very fragile part of the process, and the omission seemed atypical for iFixit. It made me suspect the instructions might not have been wholly human-written. I feel a bit vindicated in that suspicion.

The most generous interpretation I can have for this type of article is that it's a second-order phenomenon. If it was written by a human, it was written by one who consumes a lot of AI generated content and whose standards for what they produce have slipped.


I’ve only tried doing a phone repair per iFixit’s instructions once, and the instructions sucked. They explained in excruciating detail how to take the phone apart and then the instructions just ended. No details on reassembly.

>A hard to describe sense of "cheesiness"

This is the "Reddit" factor. I picked up on it being LLM written with this sentence:

"This is the treacherous, final-boss stage where repairability usually dies,"


Ah, yes, everything needs to be phrased as an existential crossroads now. Same thing the other day when I was debating between olives or pickles on my pizza.

Now that I know pickles are a pizza topping, maybe.

Only in the final boss stage.

LLMs bring up the “final boss” analogy a lot too. I’ve gotten that in my own prompts.

> I only hope the models get good enough to not be so samey in the future.

Why would you hope to be more easily fooled?


Not GP but I'm personally hoping that if I'm inevitably doomed to be exposed to this horseshit every day that it becomes tolerable to read. For world-shaking language-based superintelligences, they can't write to save their very expensive lives.

> I'm personally hoping that if I'm inevitably doomed to be exposed to this horseshit every day that it becomes tolerable to read.

Thank you for replying, but that doesn’t answer the question. Why would you want to make made up bullshit output more tolerable to read? Being intolerable to read is a feature, it’s a useful signal to know a piece of text may not have had human review, and that you should spend your time reading something else.

I use that same strategy with website consent banners. If a website is so invasive that they go out of their way to make rejection hard (which, by the way, is against the law), I know it’s a company not worth supporting.


It indicates a baseline competency of the AI user, or whoever they trust to use it, and it will hurt brand trust, and trust in humans, even more.

I'm glad I haven't let AI write much for me; it's better for it to help me develop my ideas and writing, and to do the work to learn, explore, and end up with something where my brain is in the gym. Passive generation might not always map well to passive consumption.


What annoys me the most is that the information has become much less dense. There's a lot of unnecessary repetition. I feel like I need to feed every article through an LLM just to get a summary of it.

If only a human could edit the output before posting.

Ironically, the editors probably haven't opened a text editor for months.

Em dashes aren’t an actual tell IMO. Many people use them.

Surely you mean: Em dashes aren’t an actual tell IMO — many people use them.

Maybe he isn't one but has a close friend who is? That would describe me.

Em dashes aren’t an actual tell IMO: many people use them.

There are dozens of us!

— dozens!

It is though if the rest of the prose is trash.

Joke's on you—humans write trash all the time.

> everyone is using AI for copyediting now aren't they?

If the studies that say humans prefer AI writers are to be believed, then you'd be a fool not to.


Depends on the type of human you want to attract.

* "This isn't X. It's Y"

I find that Gemini uses that phrase way too much.


Ugh I have actually started hating Gemini for this specifically.

I don’t mind the AI-generated aspect. I mind the lack of caring that makes it look like AI slop.

Generate with carefully steered AI, sanity check carefully. For a big enough project writing actually comprehensive test coverage completely by hand could be months of work.

Even state-of-the-art AI models seem to have no taste, or sense of 'hang on, what's even the point of this test?', so I've seen them diligently write hundreds of completely pointless tests, and sometimes the reason they're pointless is some subtle thing that's hard to notice amongst all the legit-looking expect code.
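As an illustration (an entirely made-up example, not from any real codebase), this is the shape of test that looks busy but proves nothing:

```typescript
// The function supposedly under test.
function applyDiscount(price: number, isVip: boolean): number {
  return isVip ? price * 0.8 : price;
}

// A 'pointless' test of the kind described above: it re-implements the
// discount logic as a stub and then asserts against its own stub, so
// applyDiscount could be arbitrarily broken and this would still pass.
const stubbedDiscount = (price: number, isVip: boolean) =>
  isVip ? price * 0.8 : price;
console.assert(stubbedDiscount(100, true) === 80); // asserts the stub...
// ...while applyDiscount itself is never invoked.
```

The subtle part is exactly what the comment describes: each line looks like a legitimate test, and you only notice the problem by tracing what's actually being called.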


My plan for this in my current toy language project is to allow things like 'import * from Foo', but save a package.lock-esque file somewhere on first build; after that you need to run some kind of '--update' command to bring in totally new names.

The problem I'm trying to solve is less general discoverability and more ensuring that purely additive changes in libraries aren't technically breaking changes due to the risk of name clashes.
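A rough sketch of that mechanism (all names hypothetical; the real project would presumably do this inside its compiler):

```typescript
// Lock file maps a module name to the export names pinned at first build.
type LockFile = Record<string, string[]>;

function resolveWildcard(
  moduleName: string,
  exported: string[],
  lock: LockFile,
  update = false,
): string[] {
  const pinned = lock[moduleName];
  if (pinned === undefined || update) {
    // First build (or an explicit --update): pin every current export.
    lock[moduleName] = [...exported].sort();
    return lock[moduleName];
  }
  // Later builds: only names recorded in the lock are bound, so a library
  // adding a new export can never silently clash with a local name.
  return exported.filter((name) => pinned.includes(name));
}
```

With this, a library release that adds a new export changes nothing in downstream code until the consumer deliberately runs the update command, which is exactly the "additive changes aren't breaking" property.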


