
"All you need is £10k of Apple laptops..."


Yes, but still: a local model, lightning in a bottle, somewhere between GPT-3.5 and GPT-4 (closer to 4), yours forever, for about that price is a pretty good deal today. It probably won't be a good deal in a couple of years, but for the value it is not that unsettling. When ChatGPT first launched two years ago we all wondered what it would take to have something close to it locally with no strings attached, and it turns out the answer is "a couple of years and about $10k" (all thanks to open weights provided by some companies; training such a model still costs millions), which is neat. It will never be more expensive than it is now.


That is... plausible, if you bought a newish M2 to replace your 5-6 year old MacBook Pro which is now just lying around. Or maybe you and your spouse can share CPU hours.


No, you need two of the newest M3 MacBook Pros with maxed-out RAM, which in practice some people might have, but it is not something you can cobble together from old hardware.

And not having tried it, I'm guessing it will run at 1-2 tokens per second or less, since the 70B model on one of these runs at 3-4, and now we are distributing the process over the network, which is best case maybe 40-80 Gb/s.

It is possible, and that’s about the most you can say about it.
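
For what it's worth, a rough back-of-envelope model is consistent with the 1-2 tokens/s guess: in a pipeline split across two machines, each generated token has to stream roughly half the weights from memory on each Mac, while the activations crossing the link per token are tiny. The Python sketch below is purely illustrative; every number in it (model size, memory bandwidth, hidden dimension, link speed, latency) is an assumption, not a measurement.

    # Back-of-envelope estimate of decode speed for a model pipeline-split
    # across two machines. All parameter values are assumptions.

    def tokens_per_second(
        model_size_gb: float = 230.0,      # assumed ~4-bit quant of a very large model
        mem_bandwidth_gbps: float = 400.0, # assumed usable memory bandwidth per Mac, GB/s
        hidden_dim: int = 16384,           # assumed hidden size at the pipeline split
        bytes_per_activation: int = 2,     # fp16 activations
        net_bandwidth_gbps: float = 40.0,  # Thunderbolt/10GbE-class link, Gb/s
        net_latency_s: float = 0.001,      # assumed per-hop latency
    ) -> float:
        """Crude estimate: per token, each half of the weights is read from
        memory once, and the activations at the split cross the network."""
        # Each machine streams half the weights per generated token.
        compute_time = (model_size_gb / 2) / mem_bandwidth_gbps  # seconds per machine
        # Activations sent across the split (tiny compared to the weights).
        transfer_bytes = hidden_dim * bytes_per_activation
        net_time = transfer_bytes * 8 / (net_bandwidth_gbps * 1e9) + net_latency_s
        # Decoding is sequential, so the two compute phases and the hop add up.
        per_token = 2 * compute_time + net_time
        return 1.0 / per_token

    if __name__ == "__main__":
        print(f"~{tokens_per_second():.1f} tokens/s")  # ~1.7 with these assumptions

Under these assumptions the network hop contributes about a millisecond per token; the dominant cost is memory bandwidth on each machine, which is why the estimate lands in the same 1-2 tokens/s range either way.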



