Elektrine

Devial

@Devial@discuss.online
lemmy 0.19.16
0 Followers
0 Following
Joined November 21, 2025

Posts

Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 13, 2025
It happened once, to one aircraft, and it’s solvable with a software update. You’re more likely to be struck by lightning the next time you leave your house than to run into this problem on a flight, and that was before the software update.
View full thread on discuss.online
12 · 1 · 0 · 0
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 13, 2025
A bit IS represented by one or zero. A bit can take the state of charged or not charged; that’s what a bit physically is. In low-level code, those states are represented by binary numbers. Or do you think there are actual physical numbers 0 and 1 floating around in your RAM?
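The distinction the post draws, physical states versus the symbols used to notate them, can be sketched in a few lines of Python. The “charged”/“not charged” labels below are illustrative stand-ins for whatever two physical states the hardware actually uses, not a model of real RAM cells:

```python
def bits(byte: int) -> list[int]:
    """Notate the eight two-valued states of a byte with the symbols 0 and 1."""
    return [(byte >> i) & 1 for i in range(7, -1, -1)]

# The same states could just as well be notated with any other pair of labels.
def as_states(byte: int) -> list[str]:
    return ["charged" if b else "not charged" for b in bits(byte)]

print(bits(0b10100001))      # [1, 0, 1, 0, 0, 0, 0, 1]
print(as_states(0b10100001))
```

Either notation describes the same byte; 0/1 is just the convention that makes arithmetic on those states convenient.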
11 · 1 · 0 · 0
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 13, 2025
Read that sentence again. They didn’t say bits represent 0s and 1s, they said bits are represented BY 0s and 1s, which is entirely correct. Physically speaking, in a modern silicon-based PC, bits are the presence or absence of electrons in an electron well. That presence or absence is often represented by binary numbers, because it makes the math easy, though it can also be represented in other ways, such as “HI” and “LO”. The statement from the article is entirely correct.
10 · 3 · 0 · 0
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 13, 2025
This isn’t a scientific journal or newspaper. It’s a mainstream article by the BBC, intended to be consumed, and understood, by people who have zero knowledge of how computers, bits, or binary numbers work, so I really don’t see the issue here.
16 · 6 · 0 · 0
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 13, 2025
It’s not going to become a major problem. We have radiation-hardened computing hardware and ways to deal with single-event effects; in fact, we’ve got a lot of practice doing these things, because guess what: satellites also need working computing hardware, and they’re exposed to orders of magnitude more radiation than aircraft. Manufacturers will just have to start taking it into consideration more in the future and ensure that flight computers have redundant ECC memory.
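One of the standard mitigations alluded to here, triple modular redundancy, is simple enough to sketch. This is a toy illustration of the voting idea, not flight software; real systems do this in hardware alongside ECC:

```python
def tmr_store(word: int) -> list[int]:
    """Keep three redundant copies of a word (toy triple modular redundancy)."""
    return [word, word, word]

def tmr_load(copies: list[int], width: int = 32) -> int:
    """Majority-vote every bit position, so one upset copy is outvoted by the other two."""
    a, b, c = copies
    word = 0
    for i in range(width):
        if ((a >> i) & 1) + ((b >> i) & 1) + ((c >> i) & 1) >= 2:
            word |= 1 << i
    return word

mem = tmr_store(0xDEADBEEF)
mem[1] ^= 1 << 7                     # simulate a single-event upset flipping one bit
assert tmr_load(mem) == 0xDEADBEEF   # the flipped bit is voted away on read
```

A single upset in any one copy is masked on read; two simultaneous upsets at the same bit position would defeat it, which is why voting is combined with scrubbing and ECC in practice.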
21 · 6 · 0 · 0
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 12, 2025
To address your two points. “Where did people get the idea that the word porn implies artistic merit or consent?” I didn’t say merit (or consent, though I assume that one’s a typo), I said artistic intent, which every creative work by definition has. There is nothing ethically wrong with porn in a vacuum, so categorising CSAM as a subcategory of something that isn’t inherently ethically wrong in my opinion makes it a bad term. CSAM should be clearly and strictly delineated from consensual porn. “CP can stand for a lot of things, but it’s common parlance now. CSAM just causes confusion.” Ah yes, the acronym with MORE common definitions somehow causes less confusion. That makes perfect sense. Of course. That explains why so many people in this thread were confused by it. Oh no, wait. They weren’t. Also, really? Now you’re stooping to the old “why so mad bro?”. You’re the one having a meltdown; I’m wasting time at work by sharing an opinion. You’re the one who got upset enough about me using a common abbreviation, one that no one in the thread was remotely confused by, to kick off this entire shit. You decided you needed to pedantically comment on this. I’m simply defending myself from your pedantic grammar-nazi shit.
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 12, 2025
I’m not comparing you to Ben Shapiro; I’m comparing your grammar-nazi pedantry to a single specific instance of someone else’s grammar-nazi pedantry. I also gave several explicit reasons why using CP over CSAM is idiotic, not just “my friends say so”. But hey, if you want to be like that, sure. You’re right, everyone else is wrong; you do you and keep using CP instead of CSAM, and keep getting irrationally upset and angry at people who think CSAM is a better term. Happy now?
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 12, 2025
Big “Ben Shapiro ranting about renewable energy because of the first law of thermodynamics” energy right here. “The majority of people can be wrong.” No they can’t, not with regard to linguistics. Words and language, by definition and by the convention of every serious linguist in the world, mean what the majority of people understand them to mean. That’s how language works.
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 12, 2025
Also, the data set wasn’t hosted, created, or explicitly used by Google in any way. It was a common data set used in academic papers on training nudity detectors. Did you seriously just read the headline, guess what happened, and are you now arguing, based on that guess, that I, who actually read the article, am wrong about its content? Because that’s sure what it feels like reading your comments…
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 12, 2025
So you didn’t read my comment, did you? He got banned because Google’s automated monitoring system, entirely correctly, detected that the content he unzipped contained CSAM. It wasn’t even a manual decision to ban him.
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 12, 2025
Material can be anything: it can be images, videos, theoretically even audio recordings. “Images” is a relevant and sensible distinction. And judging by the downvotes you’re collecting, the majority of people disagree with you.
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 12, 2025
Which of the letters in CSAM stands for images, then?
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 12, 2025
They didn’t get mad. Did you even read my comment?
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 11, 2025
The article headline is wildly misleading, bordering on a straight-up lie. Google didn’t ban the developer for reporting the material; they didn’t even know he reported it, because he did so anonymously, and to a child protection org, not Google. Google’s automatic tools correctly flagged the CSAM when he unzipped the data and subsequently nuked his account. Google’s only failure here was not unbanning him on his first or second appeal. And whilst that is absolutely a big failure on Google’s part, I find it very understandable that the appeals team generally speaking won’t accept “I didn’t know the folder I uploaded contained CSAM” as a valid ban appeal reason. It’s also kind of insane how this article somehow makes a bigger deal out of this developer being temporarily banned by Google than of the fact that hundreds of CSAM images were freely available online and openly shareable by anyone, and to anyone, for God knows how long.
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 11, 2025
They reacted to the presence of CSAM. It had nothing whatsoever to do with it being contained in an AI training dataset, as the comment I originally replied to seems to think.
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 11, 2025
They didn’t react to anything. The automated system (correctly) flagged and banned the account for CSAM, and as usual, the manual ban appeal sucked ass and didn’t do what it’s supposed to do. This is barely newsworthy. The real headline should be about how hundreds of CSAM images were freely available and shareable from this data set.
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 11, 2025
Did you even read the article? The dude reported it anonymously, to a child protection org, not Google, and his account was nuked as soon as he unzipped the data, because the content was automatically flagged. Google didn’t even know he reported this.
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 09, 2025
Has this dude never heard of the tobacco, alcohol, or gun industry? He’s talking about commercial heroin like it’s some outlandish and unthinkable idea that a harmful thing would become a billion-dollar industry.
Boosted by Technology @technology@lemmy.world
In reply to @Devial@discuss.online
@Devial@discuss.online in technology · Dec 03, 2025
If you gave your AI permission to run console commands without checks or verification, then you did in fact give it permission to delete everything.
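The safeguard the post implies is cheap to put in front of any tool-running agent. A minimal sketch, under assumptions: the marker list, function name, and prompt text below are illustrative, not from any real agent framework:

```python
# Hypothetical guard an agent harness could place in front of shell access.
DESTRUCTIVE_MARKERS = ("rm -rf", "rm ", "mkfs", "dd if=", "drop table")

def run_with_confirmation(command: str, confirm=input) -> bool:
    """Refuse to run a command that looks destructive unless a human approves it."""
    if any(m in command.lower() for m in DESTRUCTIVE_MARKERS):
        answer = confirm(f"Run destructive command {command!r}? Type 'yes' to proceed: ")
        if answer.strip().lower() != "yes":
            return False
    # subprocess.run(command, shell=True, check=True)  # actual execution elided in this sketch
    return True

assert run_with_confirmation("ls -la") is True
assert run_with_confirmation("rm -rf /", confirm=lambda prompt: "no") is False
```

A substring deny-list is easy to evade, so real harnesses tend to invert this: allow-list known-safe commands and require confirmation for everything else.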
124 · 3 · 0 · 0

Company

  • About
  • Contact
  • FAQ

Legal

  • Terms of Service
  • Privacy Policy
  • VPN Policy

Email Settings

IMAP: mail.elektrine.com:993

POP3: pop3.elektrine.com:995

SMTP: mail.elektrine.com:465

SSL/TLS required

Support

  • support@elektrine.com
  • Report Security Issue

Connect

Tor Hidden Service

khav7sdajxu6om3arvglevskg2vwuy7luyjcwfwg6xnkd7qtskr2vhad.onion
© 2026 Elektrine. All rights reserved.