I'm using Ollama on my server with the WebUI. It has no GPU, so it's not quick to reply, but not too slow either.

I'm thinking about removing the VM as I just don't use it. Are there any good uses or integrations into other apps that might convince me to keep it?

  • yesman@lemmy.world
    2 months ago

    Think of an LLM as a stupid office worker: you wouldn't rely on it to make critical decisions, but it's valuable for tedious stuff.

    For example, my calendar app changed the way you enter new events, breaking my workflow. Now I just type out a skeletal schedule and have an LLM convert it into a .csv that I import.
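
    A minimal sketch of that kind of conversion, using Ollama's local REST API (the model name and the schedule.txt input file are placeholder assumptions, not my exact setup):

    ```python
    import requests

    # Hand-typed skeletal schedule (placeholder file name).
    with open("schedule.txt") as f:
        notes = f.read()

    prompt = (
        "Convert the following schedule notes into CSV with the columns "
        "date,start_time,end_time,title. Output only the CSV rows.\n\n" + notes
    )

    # Ollama's local generate endpoint; stream=False returns one JSON object.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()

    # The generated text comes back in the "response" field.
    with open("schedule.csv", "w") as f:
        f.write(resp.json()["response"].strip() + "\n")
    ```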

    I'm thinking of ripping my CD collection again, and I'm researching a way to use an LLM to tidy up the metadata.

    I had a folder full of random stuff I'd saved for years and had an LLM organize and categorize it for me. I had to tweak the prompt enough that it was a medium-difficulty task, but it was still way easier than doing it manually.
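
    A sketch of what that kind of categorization pass can look like against the same local API (the folder path, category list, and model name are placeholder assumptions, and the real prompt needs more tweaking than this):

    ```python
    import shutil
    from pathlib import Path

    import requests

    FOLDER = Path("~/misc").expanduser()                    # placeholder folder
    CATEGORIES = ["documents", "images", "code", "other"]   # placeholder buckets

    def categorize(filename: str) -> str:
        """Ask the local model to pick one category for a filename."""
        prompt = (
            f"Pick exactly one category from {CATEGORIES} for the file "
            f"'{filename}'. Reply with the category name only."
        )
        resp = requests.post(
            "http://localhost:11434/api/generate",
            json={"model": "llama3", "prompt": prompt, "stream": False},
            timeout=120,
        )
        resp.raise_for_status()
        answer = resp.json()["response"].strip().lower()
        return answer if answer in CATEGORIES else "other"

    # Snapshot the listing first, since subfolders get created while moving files.
    for path in list(FOLDER.iterdir()):
        if path.is_file():
            dest = FOLDER / categorize(path.name)
            dest.mkdir(exist_ok=True)
            shutil.move(str(path), str(dest / path.name))
    ```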

  • RandomLegend [He/Him]@lemmy.dbzer0.com
    2 months ago

    It's a tool like any other. If you don't have a use case for it, just don't use it.

    I use it to summarize release notes and generate some minor descriptions for generic stuff in my TTRPG campaigns.

    • DrinkMonkey@lemmy.ca
      2 months ago

      generate some minor descriptions for generic stuff in my TTRPG campaigns.

      Need a quick 200-word description of the interior of an apothecary? Or a band of marauding orcs? It's been a huge time saver for me.

  • minnix@lemux.minnix.dev
    2 months ago

    Ollama without a GPU is pretty useless unless you're using it with Apple silicon. I'd just get rid of it until you get a GPU.