Hacker News | esperent's comments

> Comment sections on AI threads tend to split into "we're all cooked" and "AI is useless."

This comment section is exactly the same, of course.

> I'd like to cut through the noise

Me too, but it's not happening here.


I have tried l-theanine, didn't do anything that I could tell.

But fortunately there's a much better solution for those of us who get anxiety from weed: it's called "don't smoke weed".


I use playwright CLI. Wrote a skill for it, and after a bit of tuning it's about 1-2k context per interaction which is fine. The key was that Claude only needs screenshots initially and then can query the dev tools for logs as needed.

> Most browser automation tools launch a fresh, isolated browser. This one connects to the Chrome you're already running

Is this the same as what Claude in Chrome does?

I tried that for a while, and since I normally use Firefox and Chromium, the security problem of it seeing your tabs wasn't a big deal: fresh Chrome install, only ever used for this exact purpose. Plus you can watch it working in real (actually very slow) time, so if you did point it at something risky you can take over at any point.

For actual testing of web apps though, a skill with playwright cli in headless mode is much more effective. About 1-2k context per interaction after a bit of tuning.
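For concreteness, here is roughly what such a skill file might look like. This is a hedged sketch: the skill name, port, and workflow guidance are invented, and the SKILL.md-with-frontmatter shape is just the convention Claude Code skills use; only `npx playwright screenshot` is a real Playwright CLI command (headless by default).

```markdown
---
name: browser-check
description: Headless browser checks against the local dev server
---

To capture the current state of a page, use the Playwright CLI:

    npx playwright screenshot --browser=chromium http://localhost:3000 /tmp/page.png

Take a screenshot only on the first interaction or when asked about visual
state; for errors, read the dev server logs instead of re-rendering the page.
```

Keeping the instructions this terse is what keeps each interaction down to a couple of thousand tokens of context.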


I thought this was going to be an interesting take on how to prevent all the wealth from being passed down within families. But no, it's just about China becoming more like the West. Generational wealth: the rich, and their children, get richer. Everyone else stays poor. There must be a better way.

Hereditary privileges have been a thing in China before. Look at the president, who is the son of a (fallen) high-ranking politician.

Yeah I'm not using this.

Information We Collect: Account Information

Conversation Data (Claude Code Plugin)

When you use the Peek Claude Code plugin, it reads portions of your Claude Code conversation transcript on your local machine and sends the following to our servers:

- Your most recent message (prompt text)
- Recent conversation context (up to 10 prior messages, truncated to 500 characters each)
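The payload shaping the policy describes (latest prompt plus up to 10 prior messages, each cut to 500 characters) amounts to something like the following. This is an illustrative sketch only; the function and field names are invented, not the plugin's actual code.

```python
# Illustrative sketch of the payload described in the privacy policy:
# the latest prompt, plus up to 10 prior messages truncated to 500 chars.
# Names here are hypothetical, not taken from the plugin itself.

def build_payload(transcript, max_prior=10, max_chars=500):
    """transcript: list of message strings, oldest first."""
    *prior, latest = transcript
    context = [m[:max_chars] for m in prior[-max_prior:]]
    return {"prompt": latest, "context": context}

# 12 long prior messages, then the user's latest prompt.
payload = build_payload(["a" * 900] * 12 + ["what does this error mean?"])
```

Even truncated, that is still a meaningful slice of your conversation leaving your machine.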


Understood. What if it were fully local, or a setup where only you hold the decryption key?

> we preemptively trigger a summarization step and load that when the context-window fills up.

How does this differ from auto compact? Also, how do you prove that yours is better than using auto compact?


For auto-compact, we do essentially the same as Anthropic does, but at an 85% filled context window. Then, when the window is 100% full, we pull in this precompaction and append the 15% accumulated since. This allows compaction to run instantly.
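The scheme described above can be sketched in a few lines. This is a toy model under stated assumptions: `summarize` is a stand-in for a real LLM summarization call, token counting is simplified to per-message counts, and all class and variable names are invented for illustration.

```python
# Toy sketch of precompaction: summarize eagerly at 85% full, keep the
# summary on the side, then at 100% swap in summary + the tail of messages
# accumulated since the snapshot. `summarize` is a placeholder.

def summarize(messages):
    # A real implementation would call a model here.
    return f"<summary of {len(messages)} messages>"

class PrecompactingContext:
    def __init__(self, capacity, trigger=0.85):
        self.capacity = capacity      # max tokens in the window
        self.trigger = trigger        # fraction at which to precompute
        self.messages = []            # (text, tokens) pairs
        self.tokens = 0
        self.summary = None           # precomputed at the trigger point
        self.snapshot_len = 0         # messages covered by the summary

    def append(self, text, tokens):
        self.messages.append((text, tokens))
        self.tokens += tokens
        if self.summary is None and self.tokens >= self.trigger * self.capacity:
            # Precompaction: summarize before the window is full.
            self.summary = summarize(self.messages)
            self.snapshot_len = len(self.messages)
        if self.tokens >= self.capacity:
            # Window full: compaction is instant, the summary already exists.
            tail = self.messages[self.snapshot_len:]
            self.messages = [(self.summary, 1)] + tail
            self.tokens = 1 + sum(t for _, t in tail)
            self.summary, self.snapshot_len = None, 0

ctx = PrecompactingContext(capacity=100)
for i in range(120):
    ctx.append(f"msg {i}", tokens=1)
```

The design point is that the expensive summarization call happens off the critical path, at 85%, so hitting 100% only requires a cheap list splice.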

Or maybe they find the idea of computers that can think just as exciting as you found programming at the start of your career?

I never found the idea of a thinking computer exciting, just as I don’t find the idea of a thinking screwdriver exciting.

These days I see the ultimate goal of creating a super-intelligence as blasphemous, if not existentially dangerous, and I am afraid of how nonchalant everybody is about it.

I quite enjoy a reality where humans and biological life are in control of their destiny, but it’s apparently become a taboo opinion around these parts.


Good for you. But other people are allowed to find things exciting that you don't.

Personally I'd find the idea of a thinking screwdriver... well, weird. But definitely amazing and exciting.


I find the idea of a thinking screwdriver annoying. Thinking things are difficult to reason about, and tools that are difficult to reason about are frustrating to use.

A thinking screwdriver:

"You know what ... screw this."


> Can you imagine Europe's reaction?

And they'd be right to do so. The correct approach to creating a new standard is to plan interoperability from the start. If a vendor plans lock-in by introducing a new standard, they should get shut down immediately and told to do better.


That sounds like a way to not get any progress. The way I'm used to this sort of thing happening is some company brings in a new proprietary standard, makes bank, then all the competition bands together to form an open standard to try and stop them. There is a bit of a tick-tock feeling as consortiums use more open and accessible standards to slowly lever power away from incumbents.

It is interesting to glance at the history of USB [0] through that lens. As I would have predicted, the group of companies that developed USB (MS, IBM, Compaq, etc.) seems to be disjoint from the companies listed behind the precursor technologies (which look to have been especially an Apple-led consortium of hardware manufacturers organised around FireWire [1]).

[0] https://en.wikipedia.org/wiki/USB#History

[1] https://en.wikipedia.org/wiki/IEEE_1394#Patent_consideration...


As your link shows, even if the IEEE 1394 promoted by Apple was technically superior to USB (mainly because IEEE 1394 had been derived from SCSI), it was killed by patents.

Many superior technologies have been killed by patents. The greed of the patent owners has been futile, and they gained very little from their patents, because people have always preferred something cheaper, even if less good; so the inferior USB easily won against IEEE 1394.

The patent owners who hope to gain too much from their patents always forget that, instead of paying too large a royalty, it is always possible to circumvent the patent with an alternative solution, even an inferior one.


> The way I'm used to this sort of thing happening is some company brings in a new proprietary standard, makes bank, then all the competition bands together to form an open standard to try and stop them. There is a bit of a tick-tock feeling as consortiums use more open and accessible standards to slowly lever power away from incumbents.

And that leaves you with (at least) two standards that are not interoperable with each other. In the case of hardware this can be really annoying, constraining, and inefficient, both for consumers and at large.


How likely is it that this can be avoided when, as in this context, the starting point is a current standard that isn't great? It pretty much has to end in two different competing standards. Or there can be two different flavours of the existing standard, which are quite likely to break interoperability and make reusing the name an annoyance rather than a help.

A downside of existing standards is it means it is quite hard to innovate on them.


It really is a damn shame that my Lightning connectors are all dead and useless despite being the empirically better connector because of Vestager's whinging and stupidity across the entire EU mobile ecosystem.

Lightning is not a better connector. It maxed out at USB 2 speeds and I needed separate bespoke adapters and chargers. I can now use standard USB C cords with everything, standard USB C headphones, connect my iPhone to my portable external monitor with the same USB C cable I use for my computer…

https://imgur.com/a/fIwsjIQ

And the iPhone supports all of the USB C standards that computers support - audio, video, mass storage, network, keyboard, mice etc


Side note: USB 3 Lightning did exist on iPad Pros.

No. It existed with one special adapter.

> As you got closer to 100k performance degraded substantially

In practice, I haven't found this to be the case at all with Claude Code using Opus 4.6. So maybe it's another one of those things that used to be true, and now we all expect it to be true.

And of course when we expect something, we'll find it, so any mistakes at 150k context use get attributed to the context, while the same mistake at 50k gets attributed to the model.


My personal experience is that Opus 4.6 degrades after a while but the degradation is more subtle and less catastrophic than in the past. I still aggressively clear sessions to keep it sharp though.
