In reply to 8 earlier posts
@MalReynolds@slrpnk.net on slrpnk.net
Cheap fuckers cheaping out, shocker (context is (V)RAM). AI speedrunning enshittification, who’d have thunk.
@pixxelkick@lemmy.world on lemmy.world
Uh… no, it’s just the free models being free; they’re intentionally lower cost to provide free options for people who don’t wanna pay subscription fees.

(context is (V)RAM) Eh, sort of. It’s more operating costs: the larger the context size, the more expensive the model is to run, literally in terms of power consumption. Keep in mind we are on the scale of fractions of cents here, but multiply that by millions of users and it adds up fast.

But the end result is that the agent will fuck stuff up, and will even quickly /forget/ it fucked that up if you don’t catch it asap. A lot of them have a context window that can be wiped out within, like, 2 minutes of steady busywork…
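The “context window gets wiped out by busywork” behaviour described above can be sketched concretely. This is a hypothetical illustration, not any vendor’s actual implementation: the token estimate is crude, and the 2,000-token budget is a made-up stand-in for a cheap model’s limit.

```python
def trim_to_budget(messages, budget_tokens=2000):
    """Keep only the newest messages that fit in the budget; older ones vanish."""
    kept, used = [], 0
    for msg in reversed(messages):      # walk from newest to oldest
        cost = max(1, len(msg) // 4)    # crude estimate: ~4 chars per token
        if used + cost > budget_tokens:
            break                       # everything older falls out of context
        kept.append(msg)
        used += cost
    return list(reversed(kept))

# A few minutes of steady agent busywork fills the budget and pushes
# the user's opening instruction out of the window entirely.
history = ["RULE: never touch the prod database"] + [
    f"tool output chunk {i}: " + "x" * 400 for i in range(30)
]
window = trim_to_budget(history)
print(history[0] in window)   # → False: the rule has left the model's memory
```

With ~105 “tokens” per chunk, only the 19 newest chunks fit, so the rule set at the start of the session is silently dropped, which is exactly why the agent “forgets” it fucked up.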
@davidagain@lemmy.world on lemmy.world
I love how your response to the catastrophic results of stupidly trusting ai is “pay more money to ai companies”. Sane person’s response: don’t trust llms.
@pixxelkick@lemmy.world on lemmy.world
What are you talking about? No. I never said that. I just explained /why/ it happened; literally nowhere in my post did I say, or imply, that someone should pay for more expensive models. What are you smoking?

You just have to be aware that they have a very short memory when using a cheap model, and assume anything you wrote 1 minute ago has already left its memory, which is why they produce pretty dumb output if you try and depend on that… so… don’t depend on that.
@davidagain@lemmy.world on lemmy.world
Everyone else who has any sense: llms are shit and you shouldn’t trust them with executive power.

You: just the cheap ones.

Me: no, all of them.

What kind of lunatic trusts control of anything important to a fundamentally stochastic process?
@pixxelkick@lemmy.world on lemmy.world
> You: just the cheap ones

I never said that. I just said that the cheap ones are especially shitty. People on this site really lack reading comprehension, it seems.
@davidagain@lemmy.world on lemmy.world
> no its just the free models…

> You just have to be aware… when using a cheap model

> You: just the cheap ones

> I never said that.

Ohhhhhhhhh ok, yes, of course you never said or implied that. Not your repeated message at all. And yet you can’t keep away from addressing your criticism towards free or cheap LLMs! It’s like your subtext, or your underlying belief, is that if you just pay big tech enough money, they can just build a big enough set of server farms and it’ll be ok.

No, it will not be ok, and the enshittification has begun from an already shitty base point. All LLMs are shit; the cheap and free ones are just easier to spot as generating shit, if you ask them about things you know about. But you have to accept that they’re ALL shit and STOP making get-out clauses for the expensive ones by firing your criticisms exclusively at the cheap or free ones.

Giving ANY LLM executive power over your data is A BIG MISTAKE, because you’re putting your data in the control of something which operates, at its heart, as a random number generator. They’re trained to sound right. People trust them because they sound right. This is a fundamental error.
@pixxelkick@lemmy.world on lemmy.world
The only people who have these issues are people who are using the tools wrong or poorly. Using these models in a modern tooling context is perfectly reasonable: going beyond just guard rails and instead outright only giving them explicit access to approved operations in a proper sandbox.

Unfortunately that takes effort, know-how, skill, and understanding of how these tools work. And unfortunately a lot of people are lazy and stupid, take the “easy” way out, and then (deservedly) get burned for it.

But I would say, yes, there are safe ways to grant an llm “access” to data in a way where it does not even have the ability to muck it up. My typical approach is keeping it sandboxed inside a docker environment, where even if it goes off the rails and deletes something important, the worst it can do is cause its docker instance to crash. And then setting up, via MCP tooling, that the commands and actions it can perform are an explicit opt-in whitelist. It can only run commands I give it access to.

Example: I grant my LLMs access to git commit and status, but not rebase or checkout. Thus it can only commit stuff forward; it can’t change branches, rebase, nor push either.

This isn’t hard imo, but too many people just yolo it and raw dawg an LLM on their machine like a fuckin idiot. These people are playing with fire imo.
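The opt-in git whitelist described above can be sketched in a few lines. This is a hypothetical illustration of the pattern, not real MCP tooling code; `ALLOWED` and `run_agent_command` are made-up names for the gate an MCP server would put in front of the agent.

```python
import shlex
import subprocess

# Explicit opt-in: only these (program, subcommand) pairs ever run.
# Deliberately absent: ("git", "rebase"), ("git", "checkout"), ("git", "push").
ALLOWED = {
    ("git", "status"),
    ("git", "commit"),
}

def run_agent_command(command_line: str) -> str:
    """Run the agent's requested command only if it is on the whitelist."""
    argv = shlex.split(command_line)
    if tuple(argv[:2]) not in ALLOWED:
        return f"denied: {argv[:2]} is not on the whitelist"
    result = subprocess.run(argv, capture_output=True, text=True)
    return result.stdout or result.stderr

print(run_agent_command("git rebase main"))   # prints a "denied" message
```

The point of the design is that denial is the default: the agent can hallucinate any command it likes, but anything not explicitly approved never reaches the shell.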
@davidagain@lemmy.world · 12d
You’ll be the 4753rd guy with the “oops, my llm trashed my setup and disobeyed my explicit rules for keeping it in check” story.

You know, programmers who use llms believe they’re much more productive because they keep getting that dopamine hit, but when you actually measure it, they’re slower by about 20%. You appointed yourself boss over a fast and plausible intern who pastes and edits a LOT of stack overflow code, but never really understands it and is absolutely incapable of learning. You either spend almost all of your time in code review now for your stupid sycophantic llm interns who always tell you you’re right but never learn from you, or you’re checking in vast quantities of shit to your projects.

You know really subtle, hard-to-find bugs on rare cases that pass your CI every single time? Or ones that no one in their right mind would have made, but which compile and look right at first glance? They’re now your main type of bug. You are rotting your projects with your random number generator.

And you think that all the money you’re paying for your blagging llms protects you from them fucking up everything for you. But it doesn’t. And you’ll also find that your contract with your llm supplier expressly excludes them from any liability whatsoever arising from you using it, instead pre-blaming you for trusting it.
