steel_for_humans
@steel_for_humans@piefed.social
· 1d ago
Say I have a GPU with 32 GB of VRAM and I'm on Linux: what local LLM would be good for coding?
Currently I just have an iGPU ;) but that's always an option, albeit a very expensive one.