
A $xxxx laptop that's 2.5 years old, and probably more powerful than the average laptop bought today, or even next year. I don't think it's a fair reference point.


His point isn't that you can run a model on an average laptop, but that the same laptop can still run frontier models.

It speaks to advancements in the models themselves, not just throwing more compute/RAM at the problem.

Also, his laptop isn't that fancy.

> It claims to be small enough to run on consumer hardware. I just ran the 7B and 13B models on my 64GB M2 MacBook Pro!

From: https://simonwillison.net/2023/Mar/11/llama/


The article is pretty good overall, but the title did irk me a little. Reading "2.5 year old", I assumed it was a fairly low-spec machine, only to find out it was an M2 MacBook Pro with 64 GB of unified memory, so it can run models bigger than what an Nvidia 5090 can handle.

I suppose that it could be intended to be read as "my laptop is only 2.5 years old, and therefore fairly modern/powerful" but I doubt that was the intention.
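The unified-memory comparison checks out with simple arithmetic. A minimal back-of-envelope sketch in Python, assuming weight size is roughly parameters × bits-per-weight and ignoring KV cache and runtime overhead (the 32 GB figure is the RTX 5090's published VRAM spec):

```python
# Rough estimate of LLM weight memory (illustrative only; real runtimes
# add KV cache and framework overhead on top of the raw weights).

def weight_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate size of model weights in GB at a given quantization."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A 70B model at ~4-bit quantization: ~35 GB of weights.
# Fits in 64 GB of unified memory, but exceeds a 5090's 32 GB of VRAM
# before KV cache and overhead are even counted.
print(f"70B @ 4-bit:  ~{weight_gb(70, 4):.0f} GB")

# A 13B model at full 16-bit precision: ~26 GB, comfortable on a 64 GB Mac.
print(f"13B @ 16-bit: ~{weight_gb(13, 16):.0f} GB")
```

The same arithmetic also explains why quantization matters so much for consumer hardware: dropping from 16-bit to 4-bit cuts the footprint by 4x.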


The reason I emphasize the laptop's age is that it is the same laptop I have been using ever since the first LLaMA release.

This makes it a great way to illustrate how much better the models have gotten without requiring new hardware to unlock those improved abilities.


About a $3700 laptop...



