In reply to lyralycan (@lyralycan@sh.itjust.works)
If you run your own AI model and watch how long a response takes, and how much it spikes resource usage for even a few seconds, you can start to imagine what it takes to host at least three copies of a multi-terabyte LLM in memory, with far lower latency and a much larger knowledge base (Google's Gemini, for instance), serving millions of prompts per minute. Then multiply that by every company hosting a major public AI service.
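To make the scale concrete, here is a back-of-envelope sketch of the memory footprint described above. The model size, replica count, and overhead multiplier are illustrative assumptions, not measured figures for any real service:

```python
def replica_memory_tb(model_size_tb: float, replicas: int, overhead: float = 1.2) -> float:
    """Rough total resident memory (TB) for `replicas` copies of a model
    kept loaded at once, with a crude multiplier for runtime overhead
    such as KV caches and activation buffers (assumed, not measured)."""
    return model_size_tb * replicas * overhead

# e.g. three copies of a hypothetical 2 TB model:
print(replica_memory_tb(2.0, 3))  # roughly 7.2 TB resident, before any batching
```

And that is just memory held continuously, before counting the compute burned per prompt at millions of prompts per minute.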
Then remember that the *only* good things to come out of AI are natural-language inference for voice commands and *slightly* improved developer workflows.
Hosting it yourself is just a reminder of the rapid environmental, ecological, and cultural destruction that is the AI bubble.
In summary: *perhaps*, if the host wants it to streamline their dev process. Otherwise it can be replaced with a far more efficient conventional algorithmic program, which is what we had before.