• psychadlligoat@piefed.social
    3 days ago

    Except blockchain mining has no practical use that can’t be solved with simpler tech; AI does have those applications.

    • thesmokingman@programming.dev
      3 days ago

      Not yet, not for some time, and certainly not with a single local GPU running at minimal use. Both you and the commenter I was responding to seem to forget how massive the @home projects were.

      • nagaram@startrek.website
        2 days ago

        For simple productivity tools like Copilot, or text generation like ChatGPT?

        It absolutely is doable on a local GPU.

        Source: I do it.

        Sure, I can’t run always-on simulations to find new drugs or do protein sequencing or whatever. But it helps me code. It helps me digest software manuals. That’s honestly all I want.

        Also, aren’t massive compute projects like the @home ones a good thing?

        Local LLMs run fine on a 5-year-old GPU, a 3060 12 GB. I’m getting performance on par with cloud-run models. I’m upgrading to a 5060 Ti just because I want to play with image gen.