• Peanut@sopuli.xyz
    2 days ago

    As a peasant, I know that professional help is not always available or viable. AI could very well have saved some of my friends who felt they had no available help and took their own lives. That being said, public-facing language models should come with a warning about exacerbating psychosis. Notably the sycophantic models like ChatGPT.

    • Umbrias@beehaw.org
      2 days ago

      problem: actual mental help has low availability

      solution: ai can stand in where needed

      outcome: ai mental health tools expand systemically while actual therapists remain inaccessible, as insurance refuses to cover them. mental health outcomes worsen across the board.

    • sabreW4K3@lazysoci.alOP
      2 days ago

      This says everything, really. We live in a profit-driven society, so where we should invest in public health to ensure that mental healthcare is available for everyone, we instead count pennies, driving public health workers to go private or quit entirely. As a result, there aren't enough healthcare professionals to go around. If we can alleviate that even a little, we absolutely should invest heavily in it: have people use AI, supervise the AI, make changes, and make it the best we can. Because a free AI, which is the dream, could help save thousands.