• xmunk@sh.itjust.works
    9 months ago

    Please explain to the lawyers in 1980 who wrote the contract for a published book what precisely generative AI, Twitter, and the internet are, so they could be sure to account for their fair use in the contract… until five years ago none of us knew what this stuff would do. And, I’d mention that Google Books has been pummeled by lawsuits for pretty much the same reason and ended up needing to pull almost all books from its free reading section.

    • Rocket@lemmy.ca
      9 months ago

      The training of an attention network is not meaningfully different from training a perceptron, which was invented in 1957. Interconnected networks of computers date back to 1969. finger captured the spirit of Twitter in 1971. The businesses may have changed, but the concepts you speak of were very well understood in 1980.
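
      To be concrete, here is a minimal sketch (my own illustration, not anything from 1980) of the 1957-style perceptron learning rule. The data, learning rate, and epoch count are arbitrary choices for the example; the point is that the core loop -- nudge weights toward the right answer on labelled examples -- is the same broad idea attention networks are trained with today, just at vastly larger scale and with gradient descent instead of this simple update.

      ```python
      # Illustrative perceptron sketch; the OR data, learning rate, and epoch
      # count are arbitrary picks for demonstration, not from any real system.

      # Training data: the OR function, with a bias input appended.
      data = [
          ((0.0, 0.0, 1.0), 0),
          ((0.0, 1.0, 1.0), 1),
          ((1.0, 0.0, 1.0), 1),
          ((1.0, 1.0, 1.0), 1),
      ]

      weights = [0.0, 0.0, 0.0]
      learning_rate = 0.1

      for epoch in range(20):
          for inputs, target in data:
              activation = sum(w * x for w, x in zip(weights, inputs))
              prediction = 1 if activation > 0 else 0
              error = target - prediction
              # Perceptron update: shift the weights toward the correct answer.
              weights = [w + learning_rate * error * x
                         for w, x in zip(weights, inputs)]

      print(weights)  # converges to a separating hyperplane for OR
      ```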

      A contract precisely specifying “generative AI”, “Twitter”, and “the Internet” would be silly even today. That would be like a DVD specifying “You may only watch this while sitting on a Chesterfield flower-print couch in your living room on an RCA CRT television”. My understanding from you is that the intent is simply to prohibit non-humans from reading the work. That is something that could easily have been written in 1980.

      Hell, robots doing human-like things were front and centre in popular culture in the 1980s. It would have been impossible not to realize that robots reading books was an almost certain future. To act surprised now that it is happening is an act nobody is buying.

      • xmunk@sh.itjust.works
        9 months ago

        There is a massive difference between AI tech in the 70s and today. The scale we’re able to achieve is orders of magnitude beyond what was dreamed of. These modern issues were expected to take much longer to arrive, which would have given the legal system more time to catch up. Our legal system can force a common baseline of behavior on this new technology, and that will be necessary for a healthy balance of power.

        • Rocket@lemmy.ca
          9 months ago

          There is a massive difference between AI tech in the 70s and today.

          Not really. We’ve learned a few tricks along the way, but the fundamentals of neural networks have not changed at all. The most significant progress in AI has come from compute becoming orders of magnitude faster, which we knew, with reasonable confidence, was going to happen. Moore’s Law and all that.
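
          As a back-of-envelope illustration (my own arithmetic, not a claim about any particular chip): doubling compute roughly every two years, as Moore’s Law suggests, already implied around a million-fold increase between 1980 and 2020.

          ```python
          # Back-of-envelope sketch: growth factor implied by "doubling every
          # two years". The years and doubling period are illustrative only.
          def moores_law_factor(start_year, end_year, doubling_period_years=2.0):
              return 2 ** ((end_year - start_year) / doubling_period_years)

          print(f"1980 -> 2020: ~{moores_law_factor(1980, 2020):,.0f}x")  # ~1,048,576x
          ```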

          The scale we’re able to achieve is orders of magnitude beyond what was dreamed of.

          While I disagree, the scale is irrelevant. The slow systems in the 1970s were maybe only ingesting one book rather than millions of books, but legally there is no difference between one book and millions of books. If we are to believe that there is no legal right for a machine to read a book, then reading just one book is in violation of that.

          our new technology

          What new technology? The Attention Is All You Need paper, which gave rise to LLMs, showed a way to train neural networks faster, but it is not the speed at which machines can read books that is in question. Nobody has suggested that the legal contention is in traffic law, with computers breaking speed limits.

          We’ve been doing this for decades upon decades upon decades. Incremental improvements in doing it faster do not change the legal situation. To pretend that the world was suddenly flipped upside down is ridiculously disingenuous.