In reply to SuspciousCarrot78 (@SuspciousCarrot78@lemmy.world) · 1d ago
Yeah, me too :)
https://bobbyllm.github.io/llama-conductor/
https://codeberg.org/BobbyLLM/llama-conductor
I'm thinking about coding a cloud sidecar at the moment, with the exact feature you mentioned... but that's scope creep for what I have in mind.
Irrespective of all that, I agree: an open cloud co-op could be a good way to get SOTA (or near-SOTA; GLM 5.1 is about as close as we have right now) access when needed.
(Not teaching you to suck eggs, so this comment is for the lay-reader):
For coding, you can do some interesting stuff where the cloud model is the "general" and the locally hosted LLM is the "soldier" that does the grunt work. We have some pretty decent "soldiers" runnable on consumer-level hardware now (I still like Qwen 3 Coder)... they just don't quite have the brains to see the full/big picture for coding.
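To make the "general/soldier" idea concrete, here's a minimal routing sketch: big-picture work (planning, review) goes to the cloud model, mechanical work (writing functions, tests) goes to the local one. The endpoints, model names, and task categories here are all hypothetical placeholders, not anything from llama-conductor; both backends are assumed to speak an OpenAI-style chat API (which llama.cpp's server does expose).

```python
# Hedged sketch of the "general/soldier" split for coding assistants.
# URLs, backend names, and task kinds below are made-up illustrations.
from dataclasses import dataclass


@dataclass
class Backend:
    name: str
    base_url: str


# The cloud "general" plans; the local "soldier" (e.g. Qwen 3 Coder
# behind a llama.cpp server) does the grunt work.
CLOUD = Backend("cloud-general", "https://api.example.com/v1")
LOCAL = Backend("local-soldier", "http://localhost:8080/v1")

# Task kinds that need the big picture; everything else stays local.
BIG_PICTURE = {"plan", "review", "architecture"}


def route(task_kind: str) -> Backend:
    """Pick a backend: strategy to the cloud, grunt work to the local model."""
    return CLOUD if task_kind in BIG_PICTURE else LOCAL


def build_request(task_kind: str, prompt: str) -> dict:
    """Build an OpenAI-style chat request for whichever backend was chosen."""
    backend = route(task_kind)
    return {
        "url": f"{backend.base_url}/chat/completions",
        "json": {"messages": [{"role": "user", "content": prompt}]},
        "backend": backend.name,
    }
```

Usage would look like `build_request("plan", "Lay out the module structure")` hitting the cloud, while `build_request("implement", "Write the CSV parser")` stays on localhost. The real trick, of course, is the prompt scaffolding that lets the general hand the soldier well-scoped tasks; this sketch only shows the dispatch.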