Hacker News

The industry will shift, yes. At some point, remote LLM compute will be like AWS.

Everyone can run bare metal at home, or VMs, or containers. Many don't.

However, you'll still want the best model and toolset. So there is some place for them to pivot to. Something for them to sell or license.

It will be interesting to see where it all lands, a decade from now. Who will be left?




If you are using LLMs for tool use locally, then in a decade it will no longer make sense to pay for hosted solutions. Your device will have the compute power to run powerful LLMs trivially.

If you need LLMs at scale to serve many customers, then hosted solutions make sense for the availability aspect. But by this point models can be offered by any generic services provider, like AWS or Cloudflare. Pure AI companies that just offer hosted models and nothing else will go extinct if they don’t expand to offer more services.


> If you are using LLMs for tool use locally, then in a decade it will not make sense anymore to pay for hosted solutions. Your device will have compute power to run powerful LLMs trivially.

LLMs that would have been impossible to run on consumer hardware a couple of years ago are now running on consumer hardware. I'm less concerned about compute power; the real constraint is memory.

It could be several years before new RAM capacity comes online. Even then, it won't be cheap.
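To put the memory point in rough numbers: a sketch of how much memory a model's weights need at common quantization levels. The bit widths and the overhead factor for KV cache and runtime buffers are illustrative assumptions, not measurements of any particular model or runtime.

```python
def weight_memory_gb(params_billion: float, bits_per_param: float,
                     overhead: float = 1.2) -> float:
    """Approximate GB needed to hold the weights.

    `overhead` is an assumed fudge factor for KV cache and runtime
    buffers; real usage varies by context length and implementation.
    """
    bytes_total = params_billion * 1e9 * (bits_per_param / 8)
    return bytes_total * overhead / 1e9

# Illustrative model sizes at 16/8/4-bit quantization:
for params in (7, 70):
    for bits in (16, 8, 4):
        print(f"{params}B @ {bits}-bit ≈ {weight_memory_gb(params, bits):.0f} GB")
```

Even under these rough assumptions, a 70B model at 4-bit quantization wants on the order of 40+ GB, which is why RAM capacity, not raw compute, is the bottleneck for local frontier-class models.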

I expect that in the future, hosted frontier models will be a utility like electricity or cable TV, part of a package most people will subscribe to.



