Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you'll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut'n'paste it into its own post — there's no quota for posting and the bar really isn't that high.

The post-Xitter web has spawned so many "esoteric" right-wing freaks, but there's no appropriate sneer-space for them. I'm talking redscare-ish, reality-challenged "culture critics" who write about everything but understand nothing. I'm talking about reply-guys who make the same 6 tweets about the same 3 subjects. They're inescapable at this point, yet I don't see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn't be surgeons because they didn't believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can't escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

  • BlueMonday1984@awful.systems (OP) · 1 month ago

    Baldur's given his thoughts on Bluesky - he suspects Zitron's downplayed some of AI's risks, chiefly in coding:

    There's even reason to believe that Ed's downplaying some of the risks because they're hard to quantify:

    • The only plausible growth story today for the stock market as a whole is magical "AI" productivity growth. What happens to the market when that story fails?
    • Coding isn't the biggest "win" for LLMs but its biggest risk

    Software dev has a bad habit of skipping research and design and just shipping poorly thought-out prototypes as products. These systems get increasingly harder to update over time and bugs proliferate. LLMs for coding magnify that risk.

    Weā€™re seeing companies ship software nobody in the company understands, with edge cases nobody is aware of, and a host of bugs. LLMs lead to code bases that are harder to understand, buggier, and much less secure.

    LLMs for coding isn't a productivity boon but the birth of a major Y2K-style crisis. Fixing Y2K cost the world's economy over $500 billion USD (corrected for inflation), most of it borne by US institutions and companies.

    And Y2K wasn't promising magical growth on the order of trillions, so the perceived loss of a failed AI bubble in the eyes of the stock market would be much higher.

    On a related note, I suspect programming/software engineering's public image is going to spectacularly tank in the coming years - between the impending Y2K-style crisis Baldur points out, Silicon Valley going all-in on sucking up to Trump, and the myriad ways the slop-nami has hurt artists and non-artists alike, the pieces are in place to paint an image of programmers as incompetent fools at best and unrepentant fascists at worst.