• rocket_dragon@lemmy.dbzer0.com · 17 hours ago

      Next step is an AI that detects AI labyrinth.

      It gets trained on labyrinths generated by another AI.

      So you have an AI generating labyrinths to train an AI to detect labyrinths which are generated by another AI so that your original AI crawler doesn’t get lost.

      It’s gonna be AI all the way down.
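
      For anyone curious what that loop actually looks like, here’s a toy Python sketch of the adversarial setup being joked about. The generator, detector, and threshold scheme are all made up for illustration; real labyrinth generators and detectors would be far more sophisticated.

      ```python
      # Toy adversarial loop: one model emits decoy "labyrinth" pages,
      # another tries to flag them. Everything here is hypothetical.
      import random

      def generate_labyrinth_page(seed: int) -> str:
          """Stand-in generator: emits plausible-looking decoy text."""
          rng = random.Random(seed)
          words = ["archive", "index", "node", "entry", "record", "ledger"]
          return " ".join(rng.choice(words) for _ in range(50))

      def detect_labyrinth(page: str, threshold: float) -> bool:
          """Stand-in detector: flags pages whose vocabulary is too repetitive."""
          tokens = page.split()
          return len(set(tokens)) / len(tokens) < threshold

      # Each round the detector re-tunes against the generator's latest
      # output, which is exactly the recursion the comment describes.
      threshold = 0.5
      for round_num in range(3):
          pages = [generate_labyrinth_page(seed) for seed in range(100)]
          caught = sum(detect_labyrinth(p, threshold) for p in pages)
          print(f"round {round_num}: detector flagged {caught}/100 decoy pages")
          threshold *= 0.9  # tighten the detector for the next round
      ```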

      • finitebanjo@lemmy.world · 17 hours ago

        All the while, each AI costs more power to run than a million human beings, and the world burns down around us.

        • LainTrain@lemmy.dbzer0.com · 15 hours ago

          The same way they justify cutting benefits for the disabled to balance budgets, instead of taxing the rich or just not giving them bailouts, they will justify cutting power to you before they cut it to a data centre running 10 corporate AIs all fighting each other, unless we as a people stand up and actually demand change.

            • LainTrain@lemmy.dbzer0.com · 9 hours ago (edited)

              In my country blue is the conservatives… But I agree with the sentiment! It worked for California, and it can work for your whole country. Let the Dems stop fearing they’ll lose elections: give them comfortable margins, then massively support progressives who can bring in the good stuff. They won’t have a chance if the party core thinks the very future of elections is on the line, but if they think they’ll likely win anyway, you might just be able to push through a progressive candidate and end the neoliberal decay.

      • brucethemoose@lemmy.world · 16 hours ago (edited)

        LLMs tend to be really bad at detecting AI-generated content, and I can’t imagine specialized models are much better. For the crawler operators, it’s also vastly more expensive and takes more human work, and the effort has to be replicated for every crawler since they’re so freaking secretive.

        I think the hosts win here.