> Local model enthusiasts often assume that running locally is more energy efficient than running in a data center,
It's a well-known truism on /r/LocalLLaMA that running locally is rarely cheaper than an API, unless you batch requests: then it can be massively cheaper, on the order of 10x.
> I think they mean that the DeepSeek API charges are less than it would cost for the electricity to run a local model.
Partly because it is hosted in China, where energy is cheap. In the ex-USSR country where I live, electricity is inexpensive too. And since my central heating is inadequate and I had to run a small space heater all winter anyway, the GPU's waste heat displaced the heater's output, so running locally came out effectively free.
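For anyone curious about the actual numbers behind "rarely cheaper unless batched": here is a back-of-envelope sketch. Every figure in it (GPU draw, decode speed, electricity price, batching gain) is an illustrative assumption, not a measurement; plug in your own.

```python
# Back-of-envelope: electricity cost per million output tokens for a
# local GPU. All constants below are illustrative assumptions.

GPU_WATTS = 350          # assumed GPU draw under load
TOKENS_PER_SEC = 30      # assumed single-user (unbatched) decode speed
PRICE_PER_KWH = 0.05     # assumed cheap electricity, USD/kWh

# Time and energy to generate one million tokens, one request at a time.
seconds_per_mtok = 1_000_000 / TOKENS_PER_SEC
kwh_per_mtok = (GPU_WATTS / 1000) * seconds_per_mtok / 3600
cost_per_mtok = kwh_per_mtok * PRICE_PER_KWH

print(f"unbatched: {kwh_per_mtok:.2f} kWh = ${cost_per_mtok:.3f} per M tokens")

# Batched serving runs many requests concurrently on the same hardware,
# raising aggregate throughput and dividing the per-token energy cost.
BATCH_SPEEDUP = 10       # assumed aggregate throughput gain from batching
print(f"batched:   ${cost_per_mtok / BATCH_SPEEDUP:.4f} per M tokens")
```

Under these assumptions the unbatched cost lands around $0.16 per million tokens, and batching divides that further, which is why the comparison against cheap API pricing is so sensitive to whether you batch. And if the GPU's heat replaces a resistive space heater (both turn electricity into heat at essentially 100%), the marginal winter cost really can approach zero.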
1. Word choice, phrasing, and sentence structure make it seem likely. Ironically, one has to go on vibes: you get a feel for the voice and tone of LLMs after a while. It's also a new account with a single comment.
Not the latest SSM and hybrid attention ones.