And how much do they cost? And how do you like them?

  • DALLEmmyBot@lemmy.world · 10 months ago

    Does anyone know if SDXL can split tasks across SLI cards? I’ve been thinking of building a dual Tesla A80 rig since they’re so cheap, but I want to be able to render on all 48 GB as one.

    For OP: I run entirely on OpenAI using API calls.
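
    A bare-bones image request against the OpenAI API looks roughly like the sketch below (using the `openai` Python package; the model name, size, and prompt are illustrative assumptions, not anything specified above):

    ```python
    # Minimal sketch: generate one image via the OpenAI Images API.
    # Assumes OPENAI_API_KEY is set in the environment; the model and
    # size below are illustrative assumptions.
    from openai import OpenAI

    client = OpenAI()

    result = client.images.generate(
        model="dall-e-3",  # assumed model name
        prompt="a watercolor fox in a snowy forest",
        size="1024x1024",
        n=1,
    )

    print(result.data[0].url)  # URL of the generated image
    ```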

    • tal@lemmy.today · 10 months ago

      You can’t pool VRAM across cards like that for a single task, such as rendering one massive high-resolution image.

      There might be some way to split a series of queued tasks across the cards, though (see the sketch at the end of this comment).

      *googles*

      According to this, not in Automatic1111 currently, but there’s some other frontend that can:

      https://github.com/AUTOMATIC1111/stable-diffusion-webui/discussions/1621

      StableSwarmUI supports this out of the box. An ex-employee of Stability.ai made it.

      https://github.com/Stability-AI/StableSwarmUI

      Motivations

      The “Swarm” name is in reference to the original key function of the UI: enabling a ‘swarm’ of GPUs to all generate images for the same user at once (especially for large grid generations).
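
      To make the queue-splitting idea concrete, here’s a minimal sketch, assuming two CUDA devices and the Hugging Face `diffusers` package (this is not how Automatic1111 or StableSwarmUI actually implement it). Each worker pins one SDXL pipeline to one GPU and pulls prompts off a shared queue, so the queue is parallelized even though no single image spans both cards:

      ```python
      # Minimal sketch: split a queue of SDXL prompts across two GPUs.
      # Assumptions: two CUDA devices, the `diffusers` package, and the
      # public SDXL base checkpoint. Each image still fits on ONE card;
      # this parallelizes the queue, it does not pool VRAM.
      import queue
      import threading

      import torch
      from diffusers import StableDiffusionXLPipeline

      MODEL = "stabilityai/stable-diffusion-xl-base-1.0"

      prompts = queue.Queue()
      for i, p in enumerate(["a red fox", "a blue whale", "a green dragon"]):
          prompts.put((i, p))

      def worker(device: str) -> None:
          # One pipeline per GPU; fp16 keeps VRAM use reasonable.
          pipe = StableDiffusionXLPipeline.from_pretrained(
              MODEL, torch_dtype=torch.float16
          ).to(device)
          while True:
              try:
                  idx, prompt = prompts.get_nowait()
              except queue.Empty:
                  return  # queue drained, this worker is done
              image = pipe(prompt).images[0]
              image.save(f"out_{idx}.png")

      threads = [
          threading.Thread(target=worker, args=(d,))
          for d in ("cuda:0", "cuda:1")
      ]
      for t in threads:
          t.start()
      for t in threads:
          t.join()
      ```

      In practice you’d usually use separate processes (or a backend-per-GPU design like StableSwarmUI’s) to sidestep Python’s GIL, but the queue-splitting idea is the same.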