

The speed of many machine learning models is bound by the speed of the memory they're loaded on, so that's probably the biggest one.
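A rough back-of-envelope makes the point (a sketch with assumed numbers, not benchmarks): in single-stream decoding, every active weight has to be read from memory once per token, so generation speed tops out near memory bandwidth divided by the bytes of weights touched per token.

    # Rough upper bound for memory-bandwidth-bound token generation.
    # All figures below (quantization, bandwidths) are illustrative assumptions.

    def tokens_per_sec(active_params_billion, bytes_per_param, bandwidth_gb_s):
        """Each generated token reads every active weight from memory once."""
        bytes_per_token = active_params_billion * 1e9 * bytes_per_param
        return bandwidth_gb_s * 1e9 / bytes_per_token

    # 27B dense model at 4-bit (~0.5 bytes/param) on dual-channel DDR5 (~90 GB/s)
    print(tokens_per_sec(27, 0.5, 90))    # ~6.7 tokens/s
    # Same model on a GPU with ~1000 GB/s of VRAM bandwidth
    print(tokens_per_sec(27, 0.5, 1000))  # ~74 tokens/s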
Rather than CPUs, I think these are a much bigger deal for GPUs, where memory is much more expensive. I can get 128 GB of RAM for 300 CAD; the same amount of VRAM would cost several grand.
Seems pretty underwhelming. They're comparing a 109B model to a 27B one and it's kind of close. I know it's only 17B active, but that's irrelevant for local users, who are more likely to be filtered by memory than by speed.
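To put numbers on the memory-vs-speed tradeoff (a hedged sketch; the 4-bit quantization and the exclusion of KV cache and runtime overhead are assumptions): what has to fit in memory is set by total parameters, while decode speed tracks active parameters.

    # Illustrative comparison of an MoE's total vs. active parameters.
    # Footprints ignore KV cache and runtime overhead; 4-bit weights assumed.

    def weight_gb(params_billion, bytes_per_param=0.5):
        """Approximate weight footprint in GB."""
        return params_billion * 1e9 * bytes_per_param / 1e9

    moe_total, moe_active = 109, 17   # MoE: hold 109B, read only 17B per token
    dense_total = 27                  # dense: hold and read all 27B per token

    print(f"MoE weights:   ~{weight_gb(moe_total):.0f} GB to load")   # ~54 GB
    print(f"Dense weights: ~{weight_gb(dense_total):.1f} GB to load") # ~13.5 GB
    # Per-token reads favour the MoE (17B vs 27B), but a local user needs
    # roughly 4x the memory before that speed advantage is even reachable.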
“Free market” fans when free market