Elektrine
!technology · @wizardbeard
In reply to 5 earlier posts
@muelltonne@feddit.org on feddit.org
@ceenote@lemmy.world on lemmy.world
So, like with Godwin’s law, the probability of an LLM being poisoned as it harvests enough data to become useful approaches 1.
@Gullible@sh.itjust.works on sh.itjust.works
I mean, if they didn’t piss in the pool, they’d have a lower chance of encountering piss. Godwin’s law is more benign and incidental. This is someone maliciously handing out extra Hitlers in a game of Secret Hitler and then feeling shocked at the breakdown of the game.
@saltesc@lemmy.world on lemmy.world
Yeah, but they don’t have the money to introduce quality governance into this, so the brain trust of Reddit it is. Which explains why LLMs have gotten weirdly socially combative too; as if two neckbeards having at it, Google skill vs. Google skill, were a rich source of A+++ knowledge and social behaviour.
@yes_this_time@lemmy.world on lemmy.world
If I’m creating a corpus for an LLM to consume, I feel like I would probably create some data source quality score and drop anything that makes my model worse.
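The quality-score idea could be sketched like this — a toy, purely hypothetical filter (the `Source` class, `probe_loss` heuristic, and threshold are all illustrative stand-ins, not any real training pipeline's API), where each source gets a proxy loss and anything scoring badly is dropped:

```python
# Hypothetical sketch: assign each data source a quality proxy and drop
# sources that would make the model worse. All names here are invented
# for illustration; a real pipeline would use held-out loss from probe
# models, not this toy diversity heuristic.

from dataclasses import dataclass


@dataclass
class Source:
    name: str
    texts: list  # documents harvested from this source


def probe_loss(texts):
    """Stand-in for 'train a small probe and measure held-out loss'.
    Here: a toy heuristic that penalizes repetitive, low-diversity text."""
    words = " ".join(texts).split()
    if not words:
        return float("inf")
    diversity = len(set(words)) / len(words)
    return 1.0 - diversity  # lower is better, like a loss


def filter_corpus(sources, max_loss=0.5):
    """Keep only sources whose quality proxy clears the threshold."""
    return [s for s in sources if probe_loss(s.texts) <= max_loss]


sources = [
    Source("wiki-dump", ["the quick brown fox jumps over the lazy dog"]),
    Source("spam-farm", ["buy buy buy buy buy buy buy buy"]),
]
kept = filter_corpus(sources)
print([s.name for s in kept])  # prints ['wiki-dump']
```

The interesting design question — which the reply below gets at — is what `probe_loss` costs when it has to be honest rather than a cheap heuristic.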
@wizardbeard@lemmy.dbzer0.com in !technology · Dec 16
Then you have to create a framework for evaluating whether the addition of each source is “positive” or “negative”. Good luck with that. They can’t even map inputs in the training data back to their actual source correctly or consistently. It’s absolutely possible, but pretty much anything that adds more overhead per individual input in the training data is going to be too costly for any of them to pursue. O(n) isn’t bad, but when your n is as absurdly big as the training corpora these things use, that has big effects. And there’s no telling whether it would actually only be an O(n) cost.
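The overhead argument can be made concrete with a toy leave-one-source-out sketch (the `train_cost` function and the cost model are invented stand-ins, not how any lab actually does this): scoring every source by retraining without it means one full run per source, so the total work is roughly O(s·n) in source count s and corpus size n — far worse than a single O(n) pass over the data.

```python
# Illustrative cost model for the overhead described above: evaluating each
# source's effect via leave-one-out ablation requires one full training run
# per source on top of the baseline. train_cost is a toy stand-in that
# pretends training cost scales linearly with corpus size.

def train_cost(corpus):
    return len(corpus)  # pretend one unit of work per training document


def leave_one_out_cost(sources):
    """Total work to score every source by retraining without it."""
    corpus = [doc for s in sources for doc in s]
    baseline = train_cost(corpus)  # one full run on everything
    ablations = sum(
        train_cost([d for s2 in sources if s2 is not s for d in s2])
        for s in sources  # one full run per excluded source
    )
    return baseline + ablations


sources = [["a"] * 10, ["b"] * 10, ["c"] * 10]  # 3 sources, n = 30
print(leave_one_out_cost(sources))  # prints 90: 30 baseline + 3 runs of 20
```

Even in this toy, the bill grows with s·n rather than n — which is the “no telling if it would actually only be an O(n) cost” point in miniature.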
View on lemmy.dbzer0.com
About Community

Technology (!technology@lemmy.world)

This is a most excellent place for technology news and articles.


Our Rules
  1. Follow the lemmy.world rules.
  2. Only tech-related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; this includes using AI responses and summaries. To ask if your bot can be added, please contact a mod.
  9. Check for duplicates before posting; duplicates may be removed.
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots
  • @L4s@lemmy.world
  • @autotldr@lemmings.world
  • @PipedLinkBot@feddit.rocks
  • @wikibot@lemmy.world
83,902 members · 18,816 posts · Created June 11, 2023
© 2026 Elektrine. All rights reserved.