In reply to @Traister101@lemmy.today · 6d ago
That's their problem. If they're using an LLM and can't verify its output, they shouldn't be using an LLM.