Mfw my doomsday ai cult attracts ai cultists of a flavor I don’t like

Not a fan of yud but getting daily emails from delulus would drive me to wish for the basilisk

  • @visaVisaOP
    13 points · 27 days ago (edited)

    Is the whole x-risk thing as common outside of North America? Realizing I’ve never seen anyone from outside the anglosphere, or even just America/Canada, be as God-killingly Rational as the usual suspects

    • @jaschop
      14 points · 27 days ago (edited)

      Might be semi-related: the German aerospace/automotive/industrial research agency has an “AI Safety” institute (institute = top-level department).

      I got a rough impression from their website. They don’t seem to be doing anything that successful. Mostly fighting the unwinnable battles of putting AI in everything without it sucking, and twiddling machine learning models to make them resilient against malicious data. Besides that, trying to keep the torch of self-driving cars alive for the German car industry. Oh, and they’re doing the quantum AI bit.

      They’re a fairly new institute, and I’ve heard rumors they’re not doing great. Maybe the organization resists the necessary insanity to generate new AI FOMO at this point. One can dream.

      • @visaVisaOP
        5 points · 27 days ago

        Kinda interesting that it’s focused on smaller-scale risks like malicious data instead of ahhh extinction ahhh

        • @jaschop
          7 points · 27 days ago

          Yeah, the same thing struck me. I’d guess they were jumping on the buzzword, but x-risk was just deemed gaudy and unserious.

      • @visaVisaOP
        2 points · 27 days ago

        True, but I was specifically referring to researchers, since most of the researchers repping extinction risk are LW- or Yud-influenced (Musk, Hinton, etc.)