• Ziggurat@sh.itjust.works
    1 year ago

    Is it still Nvidia-only, or does AMD work? I’m in the market for a new PC, and I’ve heard AMD works better on Linux. Google also tells me that Automatic1111 runs on AMD, but I don’t know if anyone has tried it.

    • Murdoc@sh.itjust.works
      1 year ago

      All I saw when I checked your link was this:
      System Requirements
      Windows 10/11, Linux or Mac.
      An NVIDIA graphics card, preferably with 4GB or more of VRAM or an M1 or M2 Mac. But if you don’t have a compatible graphics card, you can still use it with a “Use CPU” setting. It’ll be very slow, but it should still work.
      8GB of RAM and 20GB of disk space.

    • tal@lemmy.today
      1 year ago

      I do Stable Diffusion on an AMD GPU, an RX 7900 XTX, on Linux. So, yeah, that works.

      The problem is that well-performing support is relatively new (the RDNA 3 cards are okay), and older cards may or may not be practical; that particular card is their latest generation. I agree that, generally speaking (not talking specifically about generative AI), AMD is preferable on Linux these days. For generative AI in general (not Linux-specific), Nvidia started earlier than AMD, so most generative AI software was written to run on Nvidia on Windows and is only now adding AMD support. The catch is that Nvidia also charges considerably more than AMD for their hardware, and you want a high-VRAM card for Stable Diffusion… I’d recommend at least 16GB if possible.

      Also, a popular library for doing some generative AI work, “transformers”, doesn’t currently run on AMD cards. Stable Diffusion can use it (it speeds operations up) but doesn’t require it. A few other things do require it, though, like Tortoise TTS, a piece of software that can generate speech in someone’s voice given some samples of that voice.

      AMD has actually been shipping Linux support for generative AI before Windows support: they only just released Windows support for my card, and it’s been usable on Linux for a while.
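
      For the curious, getting Automatic1111 going on an AMD card under Linux looks roughly like the sketch below. This assumes ROCm is already installed on the system; the environment variable shown is what users commonly report needing for cards ROCm doesn’t officially support, so treat it as a starting point rather than a recipe:

      ```shell
      # Sketch of an Automatic1111 setup on an AMD card under Linux.
      # Assumes a working ROCm install on the host.
      git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui.git
      cd stable-diffusion-webui
      # Cards ROCm doesn't officially support may need to masquerade as a
      # supported one; e.g. some RDNA 2 cards are commonly run with:
      #   export HSA_OVERRIDE_GFX_VERSION=10.3.0
      # webui.sh detects an AMD GPU, installs the ROCm build of PyTorch
      # into its own venv, and then starts the web UI.
      ./webui.sh
      ```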

      If you’re going Linux and you’re willing to get an RDNA 3-based card (new, high-end, and a lot of VRAM) and you don’t need transformers (which you don’t for Stable Diffusion), then yeah, I’d go AMD. If you’re going Linux and don’t care about generative AI, I’d get AMD regardless. If you’re trying to use older cards, I’d consider Nvidia; I had been using an older Nvidia card before this, and while it could do the generative AI stuff on Linux, for everything else I’d prefer the AMD card.

      Be warned that generative AI stuff is really VRAM-hungry; I would prioritize getting something with a ton of VRAM over performance (so if you go Nvidia, don’t get their -Ti models, which run more quickly but have less memory). With Stable Diffusion, VRAM places hard caps on the size of the image that you can process at one time (though you can generate lower-resolution and then upscale an image in chunks).

      Right now, AMD cards of the sort that I’d recommend run something like $500 to $1000. If you can wait another year or so, I’d expect prices for higher-VRAM cards to come down (though Stable Diffusion’s VRAM requirements have risen as newer models aimed at higher-resolution images come out, so…)

      Your main system’s specs don’t matter much for Stable Diffusion: throwing a lot of CPU or main memory at it won’t make much difference. Faster storage might speed up initially loading the model onto the card, dunno. But basically, all the heavy lifting happens on the GPU, so the real requirement is a beefy, high-VRAM GPU; unlike with games, you can do fine with an older computer and a high-end GPU if your only concern is running Stable Diffusion.

      Another option, if one wants to just dabble with Stable Diffusion doing generative AI a bit, is to rent a computer in a datacenter somewhere that has a high-end GPU in it. I have not done this, but vast.ai does this sort of thing, rents machines out on something like an hourly basis. That can be more cost-effective if you’re only going to be sporadically doing this sort of thing. An Nvidia 4090 might run something like $2000, but a remote computer with one could rent for (checks) 50 cents an hour. So if you only have time to play around with this on, say, every other weekend, you could rent the thing for a weekend for $24. Buying the card would require 83 weekends to break even.
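
      The break-even arithmetic is easy to sanity-check. Here’s a quick sketch using the figures above ($2000 card, $0.50/hour rental, 48-hour weekends), which are rough illustrative prices rather than current quotes:

      ```python
      # Back-of-the-envelope: buy a ~$2000 GPU, or rent one at ~$0.50/hr?
      CARD_PRICE = 2000.00   # rough price of an Nvidia 4090
      HOURLY_RATE = 0.50     # rough hourly rental rate
      WEEKEND_HOURS = 48     # renting for a full two-day weekend

      weekend_cost = HOURLY_RATE * WEEKEND_HOURS
      break_even_weekends = CARD_PRICE / weekend_cost

      print(f"one weekend of rental: ${weekend_cost:.2f}")
      print(f"weekends to match the card price: {break_even_weekends:.0f}")
      # With these numbers: $24.00 per weekend, ~83 weekends to break even.
      ```

      So at every-other-weekend usage, renting stays cheaper for years, ignoring the resale value of a card you’d own.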

      • Abel@lemmy.nerdcore.social
        1 year ago

        I’m running a 3060 with 12 GB of VRAM on a 2014 rig. Can confirm it works well; the only sad part is the occasional random whole-computer crash.

      • Ziggurat@sh.itjust.works
        1 year ago

        Thanks for your insight.

        So indeed, it looks like AMD compatibility is still a work in progress, or requires expensive hardware (and knowing that in a year, the models may have become even heavier).

        Renting GPU time might be the proper way to go.