Hacker News | muzani's comments

We considered it at a past job when cost-cutting. But nobody wants the boss to know when they're playing DOTA.

More seriously though, Slack was one of those tools that just held context for everything: all the engineering decisions and past error messages. Discord potentially could too, but it doesn't seem to be built for this use case.

Can it handle a thousand channels? It sounds ridiculous, but companies scale, and so do observation tools, customer service, and such.

I once joined an elite gaming clan that operated on Slack because it was easy to spin up multiple channels per player and handle notifications. Discord potentially could do the same, but channel management is a lot harder. That clan eventually moved over to Discord anyway, because nobody wants to pay $ per head.


They're all trained on each other. Claude says it's DeepSeek if you ask it in Mandarin.

Most people seem to think that phenomenon is not the same thing. People have shown by experimenting with different prompts that even in Mandarin, Claude correctly says it's Claude when it is doing something for you. But if you ask it about its identity, it sometimes says DeepSeek. The current theory is that it has simply run into Chinese content containing chat logs where a DeepSeek model answers that it is DeepSeek. The inconsistency across different prompts suggests this is something other than distillation.

LLMs work perfectly well without a pseudocode skill. They natively understand pseudocode just as well as they understand Indonesian.

That's not the point of the skill.

I've always compared LLMs to the Watt steam engine. LLMs are an engine designed to convert electricity into attention.

You use an engine to pump water and drill holes. Then engines got more complex and moved heavy objects. Then they started powering warships. Warships had always been powered by wind, but the engine gave just enough power to build ships out of heavier materials and carry heavy armaments, which changed the balance of power and led to a third of the world declaring independence from European colonial powers.

LLMs are like that. They can process a lot of text. They can think better than humans because they're doing large searches. They can be far more creative than any human. We're not there yet, though; there was a century gap between the Watt engine and steam-powered warships.

Calling it AI has had... very negative effects on how people use the technology. Instead of using it to process things, it's used at a layer where it doesn't belong. People are trying to accelerate to the warship era with lots and lots of money, but money doesn't work that way.


It could be a kind of watermark. It's possible they aimed for it to be just 5% more noticeable but overshot it. Also humans tend to spot these things better than computers.

It used "verdant" excessively in the past, but that's a less noticeable word than "goblin".


Grok was the best every now and then, but they committed the capital crime of trying to charge 50% more than competitors who were only 20% or so worse.

There's kind of this weird thing in Silicon Valley where CEOs and investors keep telling people to raise prices! The definition of a moat is you can raise prices!!!

But these folks usually don't have the experience of being broke or saving for retirement. They don't understand that an extra $10/month is a heck of a lot of money. So nobody really wants to try it, which dampens the virality.


Recruiters are the opposite; they're often incentivized to get you the job. But where recruiter bonuses are involved, the hirer is biased toward hiring a slightly worse applicant who doesn't cost the recruiter fee.


> biased to hiring a slightly worse applicant

I understand your reasoning, but in practice, I don't think this is true. It would be true if companies thought with a coherent set of incentives. Instead, individual incentives are at play here.

If a company is paying for a recruiter, it usually means:

- It isn't highly cash constrained

- It values the time of its ICs, managers, and HR more than the fee

- Valuation for the role is not cost-based, but value-based

Only at the penny-pinching startup stage is the recruiter fee a real factor in a multi-year investment that should be yielding a high return. Beyond that, the bias evaporates, and the real incentives lie with individuals and available budgets.


It's back to the original GPT-5 state that everyone hated. People aren't unsubscribing from ChatGPT just for political reasons; I believe the tone also puts them off.

mate I get tired just responding to coderabbit

Out of curiosity, how many interviews would you get applying to 1000 jobs?
