• circuscritic@lemmy.ca

      Yes and no: I have self-hosted models on one of my Linux boxes, but even with a relatively modern 70-series Nvidia GPU, it’s still faster to use free non-local services like ChatGPT or DDG.

      My rule of thumb for SaaS LLMs is to never enter any data that I wouldn’t also be willing to upload in cleartext to Google Drive or OneDrive.

      Sometimes that means modifying text before submitting it, and other times relying entirely on self-hosted tools.
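
      For anyone wondering what the “modify text before submitting” step can look like, here’s a rough Python sketch that masks obvious identifiers locally before a prompt ever leaves the machine. The patterns are just examples, not a complete scrubber:

      ```python
      import re

      # Illustrative patterns only; a real scrubber would cover names, keys, etc.
      PATTERNS = {
          "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
          "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
      }

      def scrub(text: str) -> str:
          """Replace matches with placeholder tokens like [EMAIL] or [IPV4]."""
          for label, pattern in PATTERNS.items():
              text = pattern.sub(f"[{label.upper()}]", text)
          return text

      prompt = "Why does mail from admin@example.com bounce when sent via 10.0.0.5?"
      print(scrub(prompt))
      # -> Why does mail from [EMAIL] bounce when sent via [IPV4]?
      ```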