microcontroller exposing its JTAG at you like “single-step on me mommy 🥺”
I will forever hate ARM for including user-defined extension signals such as ARUSER in their AXI specifications.
it’s not that I particularly hate the idea of vendors adding random-ass extensions as a sideband to AXI. see, I know perfectly well that, if ARM chose to not define these signals, vendors would add them anyway, and I’d be looking at an AXI bus with signals such as ARINTEL™FOO.
see, the problem is that I am now instead looking at 20 different AXI buses with 40 different USER signal widths, and all the shit has non-descriptive names such as ARUSER[13].
oh, the argument was that at least it’s all now packed into a single field that generic interconnect can uniformly handle, and that it’s going to obey some ground rules in the spec? whoops bad news, turns out vendors gonna vendor and the USER signals on the bus I’m looking at decidedly do not obey AXI spec rules for extension signals.
and that vendors gonna vendor and the bus I’m looking at includes some custom vendor-specific signals in addition to *USER because why not.
and that vendors gonna vendor and the AXI-stream bus I’m looking at somehow has a 4-bit TREADY signal. what the fuck does that even mean. send help.
notice: catgirls are valued members of society.
therefore, when writing technical documentation please avoid using discriminatory and nekophobic language such as “information provided under NDA”, “license key required”, or “register programming interface is proprietary and will not be documented”.
two electronic devices: exchange data in any way whatsoever
EEs: this is literally equivalent to slavery and needs to be discussed as such
:duckduckgo: how many kilometers from the German border do the spezi withdrawal symptoms start
needs a paper
opens sci-hub
gets hit in the face with “sci-hub AI-powered research assistant”
instantly takes 9001 psychic damage
screams, closes sci-hub, decides to no longer need paper, nor any other papers
current status: playing quake brutalist jam 3 and holy shit this is good
I never cared much for the first quake, but this mod manages to make it way more fun somehow
I didn’t know I needed a gun that shoots rebar in my life, but here we are and it is excellent
(also, holy shit it feels good when enemies shout “KILL HER” etc. when they see you instead of the usual assumed gender; no, the PC gender isn’t configurable, you just have to play as Rover the lesbian)
in retrospect maybe I should’ve asked my roommate what drugs she’s taking before instinctively snorting the powder off her leg or something
currently rewatching a series of unfortunate events, and oof, this exchange hits me every time
- I know he can be prickly, but you have to understand he had a very terrible childhood.
- I understand. I’m having a very terrible childhood right now.
you’ve heard of problematic age gap, now get ready for problematic enum variant size gap
tbf clippy is definitely the kind of software that would happily cancel someone
PoV: you have ordered 1 (one) item from aliexpress
jesus fuck that’s a lot of emails
there’s many things which piss me off about modern web design practices, but the worst one would be interfaces which show you a list or table of small items and force you to look at this shit with pagination of like 50 items at a time
and then you try clicking previous and next to actually find whatever it is that you’re looking for, and pray to all gods that this thing isn’t too dumb to remember where it was and doesn’t just return some random items from the middle
motherfuckers will pour megabytes worth of javascript down your pipe, but sending more than a few kilobytes worth of actual payload is too much bandwidth for them
anyway this rant brought to you by my attempt to look at a PR review on github, where I got stuck trying to figure out how to load more than 40 comments at once
you know what is good at handling large amounts of text at a time? computers. you should try it sometime. it’s kinda amazing how much text you can fit in a few hundred megs of RAM.
every chat protocol made after IRC persistently keeps you in your chat rooms, even when you’re not connected to the server at the moment. this is generally considered to be a good thing
however, consider: this removes the peak comedy of someone saying “let me try this out real quick” and getting loudly kicked out of the room by their own OOM killer taking exception to an unchecked memory leak.
so, it’s impossible to say if it’s bad or not
a decade or so ago, I was writing an H.264 decoder (needed a custom one for stupid reasons which of course had to do with hardware reverse engineering).
the first order of business was to implement CABAC: the final entropy coding stage of H.264 (i.e. the first layer I had to peel starting from the bitstream), a funny variant of arithmetic coding. the whole thing is quite carefully optimized to squeeze out bits from video frames by exploiting statistics. in addition to carefully implementing the delicate core logic, I also had to copy-paste a few huge probability tables from the PDF, which of course resisted copy-paste as PDFs like to do, and I had to apply some violence until it became proper static initializers in C source code.
furthermore, testing such code is non-trivial: the input is, of course, completely random-looking bits. and the way bitstreams work, I’d have to implement pretty much the whole thing before I got to the interesting part.
so, a few hours later, I figured I’m done with CABAC and reconstructing H.264 data structures, and pointed my new tool at some random test videos. and it worked first try! the structures my program spit out looked pretty much as expected, the transform coefficient matrices had pretty shapes and looked just as you’d expect them to, and I was quite happy with that.
and then I moved on to actually decoding the picture from the coefficients, and this time absolutely nothing worked. random garbage on screen. I spent a long time looking at my 2D transform code searching for bugs, but couldn’t find anything.
and then it hit me exactly what “entropy coding” means. I implemented something that intimately knows and exploits the statistical properties of what video transform coefficients and other structures look like, their probabilities and internal correlations, and uses that to squeeze out entropy and reconstruct it on the other end. my “looks good” testing meant absolute jack shit: I could’ve thrown /dev/urandom into the CABAC decoder instead of actual H.264 video, and it would still look like good video data at this stage until you actually tried to reconstruct the picture.
and sure enough, it turned out I fucked up transcribing some rows from the PDF around a page break or something.
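if you want to see this failure mode for yourself, here’s a toy sketch of the general technique. to be clear: this is an LZMA-style binary range decoder with a single adaptive bit model, *not* the actual H.264 CABAC engine (which has its own renormalization scheme and hundreds of context models); all names and constants here are mine. the point it demonstrates is the one above: feed it literally any bytes, even pure noise, and it emits a perfectly valid-looking bit sequence, because an arithmetic decoder maps every input bitstream to *some* plausible symbol stream.

```python
import os


class ToyRangeDecoder:
    """Toy binary range decoder with one adaptive probability model.

    Illustrative sketch only -- not CABAC, not spec-accurate anything.
    """

    def __init__(self, data: bytes):
        self.data, self.pos = data, 0
        self.range_ = 1 << 32
        self.value = 0
        for _ in range(4):          # preload 32 bits of input
            self.value = (self.value << 8) | self._byte()
        self.p0 = 128               # P(bit == 0) in units of 1/256

    def _byte(self) -> int:
        # past the end of input, just keep feeding zeros
        b = self.data[self.pos] if self.pos < len(self.data) else 0
        self.pos += 1
        return b

    def decode_bit(self) -> int:
        # split the current range according to the probability model
        split = (self.range_ >> 8) * self.p0
        if self.value < split:
            bit, self.range_ = 0, split
            self.p0 += (256 - self.p0) >> 5   # adapt: 0 just got likelier
        else:
            bit = 1
            self.value -= split
            self.range_ -= split
            self.p0 -= self.p0 >> 5           # adapt: 1 just got likelier
        while self.range_ < (1 << 24):        # renormalize, pull in bytes
            self.range_ <<= 8
            self.value = (self.value << 8) | self._byte()
        return bit


# decoding pure noise still yields a structurally valid bit sequence:
dec = ToyRangeDecoder(os.urandom(64))
bits = [dec.decode_bit() for _ in range(200)]
assert all(b in (0, 1) for b in bits)   # “looks fine!” -- and means nothing
```

every output here is a well-formed symbol stream no matter what went in, which is exactly why my “the structures look right” test caught nothing.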
10 years later, I think of this experience every time I see a vibecoded pull request, or other manifestation of AI bullshit. all the right shape, and no substance behind it.
and people really should learn to tell the fucking difference.