• ExtremeDullard@piefed.social · 6 days ago

    The entire world, since AI was trained on massive amounts of existing code stolen from everyone everywhere. What AI is spitting out is somebody’s work that they weren’t paid for.

    • Zexks@lemmy.world · 6 days ago

      So the whole world is FOSS, then? Since every developer learned to code by looking at other people's examples? We all know this isn't the case, so you might be missing some key points of understanding.

    • StoneyPicton@lemmy.ca · 6 days ago

      Although I agree with some of that sentiment, in many cases it's no different than someone trying to learn programming by reading textbooks and looking at code shared on the internet. The difference is scale, which may be a hard legal argument to apply.

      • ExtremeDullard@piefed.social · 5 days ago

        It’s not the same.

        I agree that the whole world has always worked by people learning or imitating what others before them have done. In fact, it’s a key factor in human progress: people learn what others have done - more often than not, for free, with the originators of the ideas being copied receiving no money - and then improve on the things they learn. Standing on the shoulders of giants and all that.

        But here’s why the human experience is not the same as AI:

        Humans need a lot of time and effort to learn from others. Those others don’t get paid for the experience new generations “steal” from them, but they themselves “stole” their forebears’ experience. It’s an unending chain of “stealing”: nobody gets paid because everybody essentially pays it forward, so to speak, by letting their own experience get pilfered to offset the pilfering they themselves did.

        AI just takes the sum of human knowledge and spews it out without any effort, without injecting any improvement into the chain of experience and without giving anything novel to the next generation. All it does is the pilfering but none of the pay-it-forward, and the only people who profit from AI are the AI companies and their Epstein-class billionaire CEOs. Everybody else - the users - gets lazy and complacent and stops progressing.

        • StoneyPicton@lemmy.ca · 5 days ago

          I agree with and fully support your representation of the human side. It’s one reason the patent system needs some adjusting to better reflect that. I don’t agree with your interpretation of AI, though. It’s early stages, and the AI you are being exposed to is not the same as what’s being developed in the background. Accelerated advancement will bring a litany of improvements and discoveries that will make previous contributions seem slow. This is just my own take, and I hope we have time to find out.

  • StoneyPicton@lemmy.ca · 6 days ago

    Another question for consideration: at what point will AI be considered sentient, and therefore entitled to ownership of its own IP and retention of all profits it generates? Because of that implication, there will forever be battles to prevent this designation.

    • Phoenixz@lemmy.ca · 5 days ago

      At a point long, long ago in a galaxy far away.

      AI as we currently have it is a lot of A and no I. LLMs are basically statistical databases measuring distances from word to word or pixel to pixel.

      There is no awareness. It’s literally a fancy-ass anthropomorphized database you’re talking to.
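      The "word-to-word statistics" idea above can be caricatured with a toy bigram model - a deliberately simplified sketch for illustration only, since real LLMs learn neural representations rather than literal lookup tables, but the core loop is still predicting the next token:

```python
from collections import Counter, defaultdict

# Toy caricature of "statistical word-to-word" generation:
# count which word follows which in a tiny corpus, then
# always emit the most frequent successor.
corpus = "the cat sat on the mat the cat ran".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1  # tally each observed word pair

def next_word(word):
    # most common successor seen in the training data
    return follows[word].most_common(1)[0][0]

print(next_word("the"))  # "cat" - seen twice after "the", vs. "mat" once
```

      No understanding is involved anywhere in that loop; it only replays frequencies from its training data, which is the commenter's point taken to an extreme.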

      • StoneyPicton@lemmy.ca · 3 days ago

        I get that and agree. I’m really just referring to future capabilities that existing A"I" (if you like, lol) will contribute to in ways I don’t think we can safely predict.