• elshandra@lemmy.world · 9 months ago

    The best solution to any problem is to go back in time to before the problem was created, sure. That cat’s so far out of the bag, and it’s only going to multiply and evolve.

    • Admiral Patrick@dubvee.org · 9 months ago (edited)

      I mean, yeah that’s true, but harm reduction is also a thing that exists. Usually it’s mentioned in the context of drugs, but it could easily apply here.

      • elshandra@lemmy.world · 9 months ago

        Interesting take: addiction to the convenience provided by AI driving the need to get more. I suppose at the end of the day it’s probably the same brain chemistry involved. I think that’s what you’re getting at?

        In any case, this tech is only going to get better and more commonplace. Take it, or run for the hills.

          • elshandra@lemmy.world · 9 months ago

            Ah, so more like self-harm prevention, gotcha.

            I guess like any tool, whether it is help or harm depends on the user and usage.

              • elshandra@lemmy.world · 9 months ago (edited)

                Oh, right. Microsoft is a corp. They don’t care about the harm they do until it costs them money.

                e: also, I love to bash on MS, but they’re not the problem here. These things are being built all over the place — in companies, in governments, in enthusiasts’ backyards. You can tell Microsoft, Google, and Apple to stop developing the code; you can tell Nvidia to stop developing CUDA. It’s not going to matter.

                  • elshandra@lemmy.world · 9 months ago

                    I suppose it could be harm reduction. Like peeling a bandaid off slowly instead of ripping it off.

                    They’re here, they might not be everywhere yet, but they’re here to stay as much as photoshopped images or trick photography are. Just more lies to hide the truth.

                    All we can do now is get better at dealing with them.

        • Admiral Patrick@dubvee.org · 9 months ago

          I’m heading for the hills then. I’m perfectly capable of thinking for myself without delegating that to some chatbot.

          • elshandra@lemmy.world · 9 months ago

            Everyone is. As time and tech progresses, you’re going to find that it becomes increasingly difficult to avoid without going off-grid entirely.

            Do you really think corps aren’t going to replace humans with AI as soon as they can profit by doing so? That states aren’t eventually going to do the same?