Have you tried Llama? If so, is it useful according to your criteria?
Llama is the model I use most often, followed by ChatGPT and Claude.
Others as well, but yes, it's incredibly helpful for the tasks I use it for.
Self-hosted?
Yes and no. I have self-hosted models on one of my Linux boxes, but even with a relatively modern 70-series Nvidia GPU, it's still faster to use free non-local services like ChatGPT or DDG.
My rule of thumb for SaaS LLMs is to never enter any data that I wouldn't also be willing to upload in cleartext to Google Drive or OneDrive.
Sometimes that means modifying text before submitting it; other times it means relying entirely on self-hosted tools.