• 0 Posts
  • 46 Comments
Joined 1 year ago
Cake day: June 19th, 2023

  • Dismissive dickwad behavior is good, actually - if you’re dismissing Nazis. Or anyone else who deserves a blunt rejection. It is fine and valid to deny people civility, when their rhetoric is inherently abusive. Respect and patience have limits.

    Swearing at people absofuckinglutely has its place in online discourse. If not for the assholes themselves - then for the people they’re trying to fool.

    Anyway.

    Discworld has a few parallel threads. Release order starts with The Colour Of Magic, which is fun and short, but not exactly top-notch material. See explanatory flowchart. Those first few novels have a real Season One vibe.

    The traditional introduction seems to be whichever book catches your eye. Or whichever you happened to find first, if you’d heard good things about the series. That’s how I wound up reading Ringworld by Larry Niven, because cultivating your interests in the 90s was a much fuzzier experience.


  • Honestly I miss reddit circa 2015. Obviously before The Idiot and half the world lurching toward fascism - but also back when “fuck off, Nazi” was treated better than being a goddamn Nazi.

    The proliferation of “civility” is poisonous to online discourse. It is always the wrong metric. Trolls love being polite monsters. r/Politics even went a step further and demanded all opinions be taken in good faith. Do those idiots know what trolling is? Do they not understand bad faith… as a concept? It only works because people mistake it for good faith. Demanding everyone do that is a gift to trolls.

    Moderation requires common-sense identification of who’s being an asshole. It’s never about no-no words. If a script could handle the job, we would let it.

    Lemmy has far too many communities with rules going ‘never be rude to anyone ever!!!’ and then zero enforcement when someone calls you a cunt for gently correcting their grammar. That is the worst of both worlds. Anyone sincerely trying is going to hold back from just dealing with assholes appropriately, like an adult, but those people are then left with no recourse against pointlessly toxic shitheads. I don’t want a screaming match. I want words to matter.

    Also if you enjoy Douglas Adams, Terry Pratchett has a similar deep snark. Discworld’s a whole mess of books but you can kinda jump in anywhere. I recommend Guards! Guards! or Going Postal. He did Good Omens with baby Neil Gaiman, and they’d write chapters separately, then throw out every joke they’d both thought of.


  • It is a right bitch that the reason to leave is 100% the bastards in charge. The community was fine. (Okay, giant asterisks all over that, but you know what I mean. The community was not the cause for masses walking away with a sea of middle fingers lit by burning bridges.)

    I’m not here because it’s better. I’m here because fuck Spez. And fuck enshittification. Fifteen years and these greedy incompetents made it impossible to come back without feeling like betrayal. The only reason I’m not deleting anything is that I don’t do that shit. Nothing any human being put effort into deserves to be lost forever.

    Elmo did us the favor of turning his stolen harassment engine into an all-stick-no-carrot experience in a fucking hurry.


  • “Here’s all the ways we tell people not to use parallelism.”

    I’m sorry, that’s not fair. It’s only a fraction of the ways we tell people not to use parallelism.

    Multi-threading is difficult, which is why I said it’s a fucking obstacle. It’s the wrong model. The fact you’d try to “slap it on” is WHAT I AM TALKING ABOUT. You CANNOT just apply more cores to existing linear code. You MUST actively train people to write parallel-friendly code, even if it won’t necessarily run in parallel.

    Javascript is a terrible language I work with regularly, and it has abundant features that should be parallel - yet almost none of them are. It has absorbed elements of functional programming that are excellent practice, even if for some goddamn reason they’re actually executed in-order.

    Fetches are single-threaded, in Javascript. I don’t even know how they did that. Grabbing a webpage and then responding to an event using an inline function is somehow more rigidly linear than pre-emptive multitasking in Windows 95. But you should still write the damn things as though they’re going to happen in parallel. You have no control over the order they happen in. That and some caching get you halfway around most locks.
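    Something like this is all it takes - endpoints made up, obviously, but the shape is the point:

    ```javascript
    // Fire every fetch up front and accept the results in whatever order
    // they finish. The URLs here are placeholders, not a real API.
    async function loadAll() {
      const urls = [
        "https://example.com/api/users",
        "https://example.com/api/posts",
        "https://example.com/api/flags",
      ];

      // Awaiting each fetch one at a time would force a linear order that
      // nothing about the problem actually requires. Written this way, the
      // code is already correct if the runtime ever does overlap the work.
      const results = await Promise.all(
        urls.map(async (url) => {
          const response = await fetch(url);
          return response.json();
        }),
      );

      // Nothing below this point cares which request finished first.
      return results;
    }
    ```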

    Javascript, loathsome relic, also has vector processing. The kind insisted upon by that pedant in the other subthread, who thinks the 512-bit vector units in a modern Intel chip don’t qualify, but the DSP on a Super Nintendo does. Array.forEach and Array.map really fucking ought to be parallelisable. Google could use its digital imperialism to force millions of devs to adopt better standards, just by following the spec and not processing keys in a rigid order. Bad code treating it like a simplified for-loop would break. Good code… wouldn’t.
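    The break-vs-don’t-break line is roughly this (toy values, nothing real):

    ```javascript
    const values = [3, 1, 4, 1, 5, 9];

    // Good: the callback only looks at its own element. Run these in any
    // order - or all at once - and the result is identical.
    const doubled = values.map((x) => x * 2);

    // Bad: the callback leans on shared mutable state and the iteration
    // order, so it only works because elements are processed rigidly
    // first-to-last.
    let runningTotal = 0;
    const runningTotals = values.map((x) => {
      runningTotal += x;
      return runningTotal;
    });
    ```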

    We want people to write that kind of code.

    Not necessarily code that will run in parallel. Just code that could.

    Workload-centric thinking is the only thing that’s going to stop “let’s add a little parallelism, as a treat” from producing months of needless agony. Anything else has to be dissected, warped beyond recognition, and stitched back together, with each step taking more effort than starting over from scratch, and the end result still being slow and unreadable and fragile.


  • “The way we teach this relationship causes harm.”

    “Well you don’t understand this relationship.”

    “I do, and I’m saying: people plainly aren’t getting it, because of how we teach it.”

    “Well lemme explain the relationship again–”

    Nobody has to tell people not to use parallelism. They just… won’t. In part because of how people tend to think, by default, and in part because of how we teach them to think.

    We would have to tell students to use parallelism, if we expect graduates to choose it freely. It’s hard and it’s weird and you can’t just slap it on at the end. It should become what they do first.

    I am telling you in some detail how focusing on linear performance, using the language of the nineteen goddamn seventies, doesn’t need to say multi-threading isn’t worth it, to leave people thinking multi-threading isn’t worth it.

    Jesus, even calling it “multi-threading” is an obstacle. It makes parallelism sound like some fancy added feature. It’s the version of parallelism that shows up in late-version changelogs, when for some reason performance has become an obstacle.


  • I am a computer engineer. I get the math.

    This is not about the math.

    Speeding up a linear program means you’ve already failed. That’s not what parallelism is for. That’s the opposite of how it works.

    Parallel design has to be there from the start. But if you tell people adding more cores doesn’t help, unless!, they’re not hearing “unless.” They’re hearing “doesn’t.” So they build shitty programs and bemoan poor performance and turn to parallelism to hurry things up - and wow look at that, it doesn’t help.

    I am describing a bias.

    I am describing how a bias is reinforced.

    That’s not even a corruption of Amdahl’s law, because again, the actual dude named Amdahl was talking to people who wanted to build parallel machines to speed up their shitty linear code. He wasn’t telling them to code better. He was telling them to build different machines.

    Building different machines is what we did for thirty or forty years after that. Did we also teach people to make parallelism-friendly programs? Did we fuck. We’re still telling students about “linear portions” as if programs still get entered on a teletype and eventually halt. What should be a 300-level class about optimization is instead thrown at people barely past Hello World.

    We tell them a billion processors might get them a 10% speedup. I know what it means. You know what it means. They fucking don’t.
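    (The arithmetic, for anyone in that third group - standard Amdahl’s law, numbers mine: with parallel fraction p and N processors,

    $$
    S(N) = \frac{1}{(1 - p) + \frac{p}{N}}, \qquad \lim_{N \to \infty} S(N) = \frac{1}{1 - p}
    $$

    so a program that’s only about 9% parallel tops out at S ≈ 1/0.91 ≈ 1.1 - a 10% speedup - no matter how many processors you throw at it.)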

    Every student’s introduction to parallelism should be a case where parallelism works. Something graphical, why not. An edge-detect filter that crawls on a monster CPU and flies on a toy GPU. Not some archaic exercise in frustration. Not some how-to for turning two whole cores into a processor and a half. People should be thinking in workloads before they learn what a goddamn pointer is. We betray them, by using a framing of technology that’s older than disco. Amdahl’s law as she is taught is a relic of the mainframe era.
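    A crude sketch of that exercise, just to show the shape of the workload (the function and the grayscale-input assumption are mine):

    ```javascript
    // Naive edge detect over a grayscale image stored as a flat array.
    // Every output pixel is a pure function of the input pixels around it -
    // the nested loop is an accident of the language, not of the workload,
    // so nothing stops all of these being computed at once.
    function edgeDetect(src, width, height) {
      const out = new Float32Array(width * height);
      for (let y = 1; y < height - 1; y++) {
        for (let x = 1; x < width - 1; x++) {
          const i = y * width + x;
          // Horizontal and vertical gradients from the four neighbours.
          const gx = src[i + 1] - src[i - 1];
          const gy = src[i + width] - src[i - width];
          out[i] = Math.sqrt(gx * gx + gy * gy);
        }
      }
      return out;
    }
    ```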

    Telling kids about the limits of parallelism before they’ve started relying on it has been an excellent way to ensure they won’t.


  • This is rapidly going to stop being a polite interaction if you can’t remember your own claims.

    SIMD predates the term vector processing, and was in print by 1966.

    Vector processing is at least as old as the Cray-1, in 1975. It was already automatically parallelizing what would’ve been loops on prior hardware.

    Hair-splitting about whether a processor can use vector processing or exclusively uses vector processing is a distinction that did not exist at the time and does not matter today. What the overwhelming majority of uses refer to is basically just SIMD extensions. Good luck separating the two when SIMT is a thing.