!technology · post by @finitebanjo
In reply to 4 earlier posts:
@alphacyberranger@sh.itjust.works:
@brucethemoose@lemmy.world:
Aside: WTF are they using SSDs for? LLM inference in the cloud is basically only done in VRAM. Rarely, stale K/V cache is cached in RAM, but new attention architectures should minimize that. Large-scale training, contrary to popular belief, is a pretty rare event that most data centers and businesses are incapable of. …So what do they do with so much flash storage!? Is it literally just FOMO server buying?
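To put the VRAM point in rough numbers, here is a minimal back-of-envelope sketch in Python. The model shape (layers, K/V heads, head dimension) and context length are assumed for illustration, not taken from the thread; the takeaway is only that the K/V cache working set lives in GPU or host memory, not on flash.

```python
# Back-of-envelope K/V cache size for one inference request.
# All model-shape numbers below are assumptions (roughly a 70B-class
# transformer with grouped-query attention), not figures from the thread.

n_layers = 80        # transformer layers (assumed)
n_kv_heads = 8       # key/value heads after grouped-query attention (assumed)
head_dim = 128       # dimension per head (assumed)
bytes_per_elem = 2   # fp16/bf16 cache
context_len = 8192   # tokens kept in the cache

# Two tensors (K and V) per layer, one (n_kv_heads * head_dim) vector per token.
kv_bytes_per_token = 2 * n_layers * n_kv_heads * head_dim * bytes_per_elem
cache_gib = kv_bytes_per_token * context_len / 2**30

print(f"{kv_bytes_per_token / 1024:.0f} KiB per token, "
      f"{cache_gib:.1f} GiB for a {context_len}-token context")
# ~320 KiB/token and ~2.5 GiB per long request: it fits in VRAM or spills
# to host RAM, and is far too latency-sensitive to page out to SSDs.
```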
@T156@lemmy.world:
Storage. There aren’t enough hard drives, so datacentres are also buying up SSDs.
@brucethemoose@lemmy.world:
…since it’s needed to store training data. Again, I don’t buy this. The training data isn’t actually that big, nor is training done on such a huge scale so frequently.
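For scale, here is a rough estimate of how much raw storage a frontier-sized text corpus needs, which is the "isn't actually that big" claim in numbers. The token count and bytes-per-token below are assumptions chosen for illustration, not figures from the thread.

```python
# Rough size of a large text pretraining corpus.
# Both constants are assumptions chosen for illustration.

tokens = 15e12          # ~15 trillion training tokens (assumed, frontier-scale)
bytes_per_token = 4     # ~4 bytes of UTF-8 text per BPE token on average (assumed)

raw_text_tb = tokens * bytes_per_token / 1e12
print(f"~{raw_text_tb:.0f} TB of raw text")   # ~60 TB

# Even with several preprocessed copies, shards, and checkpoints, a text
# corpus sits in the tens-to-hundreds-of-TB range -- a rack's worth of
# SSDs, not an exabyte problem. Image and video corpora are what scale
# into the petabytes.
```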
finitebanjo in !technology
@finitebanjo@lemmy.world · Dec 15
As we approach the theoretical error-rate limit for LLMs, as proven in the 2020 research paper by OpenAI and corrected by the 2022 paper by DeepMind, the required training and power costs rise toward infinity. In addition, the companies might keep many different, nearly identical datasets to try to achieve different outcomes. Things like books and Wikipedia pages aren't that bad; maybe a few hundred petabytes could store most of them. But images and videos are also valid training data, and that's much larger, and then there is readable code. On top of that, all user inputs have to be stored so they can be referenced again later if the chatbot offers that service.
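The "error-rate limit" argument can be made concrete with the parametric scaling law fitted in the DeepMind paper the post refers to (Hoffmann et al., 2022), which revised the 2020 OpenAI (Kaplan et al.) fits. Below is a minimal sketch using the paper's published fitted constants; the specific model and dataset sizes in the loop are illustrative assumptions, not numbers from the post.

```python
# Chinchilla-style parametric loss: an irreducible floor E plus terms that
# shrink slowly with parameter count N and training tokens D.
# Constants are the fits reported in Hoffmann et al. (2022).

def chinchilla_loss(N, D, E=1.69, A=406.4, B=410.7, alpha=0.34, beta=0.28):
    """Predicted pretraining loss for N parameters trained on D tokens."""
    return E + A / N**alpha + B / D**beta

# Illustrative (assumed) sizes, each step 10x bigger in both N and D.
for N, D in [(70e9, 1.4e12), (700e9, 14e12), (7e12, 140e12)]:
    flops = 6 * N * D   # standard rough estimate of training compute
    print(f"N={N:.0e}, D={D:.0e}: loss ~ {chinchilla_loss(N, D):.3f}, "
          f"compute ~ {flops:.1e} FLOPs")

# Loss creeps toward the floor E while compute (and power) grows 100x per
# step -- the "costs rise toward infinity" behaviour described in the post.
```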
