• Jeena@piefed.jeena.net
    13 hours ago

    That is a good question, actually. I was looking into buying a graphics card to run some LLMs, but it seems you need a huge amount of VRAM, which makes the cards cost a huge amount of money.

    I read that you can buy several cheaper cards and connect them so the combined VRAM is enough, but that was all within one computer.
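    The idea behind pooling cheaper cards can be sketched as a simple packing problem: the model's layers get assigned to whichever card still has free VRAM. This is just a toy calculation with made-up numbers (the model size, layer size, and card capacity are all illustrative assumptions, not measurements of any real GPU or model):

    ```python
    # Toy sketch: split a model's layers across several cheaper cards so
    # their combined VRAM is enough. Sizes are in MiB (integers, to avoid
    # floating-point drift); all numbers are illustrative assumptions.

    def split_layers(n_layers, layer_mib, card_vram_mib, n_cards):
        """Assign each layer to the first card that still has room."""
        free = [card_vram_mib] * n_cards
        placement = []
        for _ in range(n_layers):
            for card, room in enumerate(free):
                if room >= layer_mib:
                    free[card] -= layer_mib
                    placement.append(card)
                    break
            else:
                raise MemoryError("not enough combined VRAM")
        return placement

    # e.g. a ~25 GiB model (32 layers x 800 MiB) on two 16 GiB cards:
    plan = split_layers(32, 800, 16384, 2)
    print(plan.count(0), plan.count(1))  # layers placed on each card
    ```

    Real frameworks do roughly this kind of layer placement automatically, but the point is the same: two 16 GiB cards can hold a model that no single one of them could.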

    It would be cool if several people could pool their GPUs over the network, similar to SETI@home. Even better if it were done federated.
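    One way the SETI@home-style idea could work is pipeline parallelism: each participant hosts a slice of the model's layers and forwards the activations to the next participant. Here is a minimal toy sketch of that flow, where plain functions stand in for transformer blocks and a local loop stands in for the network hops (none of this is a real protocol, just the shape of the idea):

    ```python
    # Toy sketch of distributed pipeline inference: each participant's
    # node holds a slice of the model and passes activations along.
    # Functions stand in for real layers; the loop stands in for the
    # network hops between participants.

    class Node:
        def __init__(self, layers):
            self.layers = layers  # this participant's slice of the model

        def forward(self, activations):
            for layer in self.layers:
                activations = layer(activations)
            return activations

    # Three participants each host a third of a six-"layer" model.
    model = [lambda x, i=i: x + i for i in range(6)]
    nodes = [Node(model[0:2]), Node(model[2:4]), Node(model[4:6])]

    x = 0
    for node in nodes:  # in reality, each hop crosses the network
        x = node.forward(x)
    print(x)  # 0 + 0 + 1 + 2 + 3 + 4 + 5 = 15
    ```

    The catch with doing this over the internet is latency: activations have to cross the network once per slice, per token, which is why this works much better inside one machine than across volunteers' homes.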