• Eggyhead@fedia.io · 1 day ago

    This brings up an interesting question I like to ask my students about AI. A year or so ago, Meta talked about people making personas of themselves for business: if a customer needs help, they can do a video chat with an AI that looks like you and is trained to give the responses you want it to give. But what if we could do that just for ourselves? Instead, let an AI shadow us for a number of years, so that it can mimic the language we use and the thoughts we have well enough to effectively stand in for us in casual conversations.

    If the murder victim in this situation had trained his own AI in such a manner, after years of shadowing and training, would that AI be able to mimic its master’s behavior well enough to give its master’s most likely response to this situation? Would the AI in the video still have forgiven the murderer, and would that forgiveness hold more significant meaning?

    If you could snapshot yourself as you are right now and keep it as a “living photo” AI that would behave and talk like you when interacted with, what would you do with it? And if you could have a snapshot AI of anyone in the world in a picture frame on your desk, someone you could talk to and interact with, who would you choose?

    • lime!@feddit.nu · 1 day ago (edited)

      it would hold the same meaning as it does now, which is nothing.

      this is automatic writing with a computer. no matter what you train it on, you’re using a machine built to produce things that match other things. the machine can’t hold opinions, can’t remember, can’t answer from its training data. all it can do is generate a plausible transcript of a conversation and steer it with the input.

      one person does not generate enough data in a lifetime to train on, so you’re necessarily using aggregated data from millions of people as a base. there’s also no meaning ascribed to anything in the training data. if you give the model all of a person’s memories, the output conforms to that data the way water conforms to a shower nozzle. it’s just a filter on top; something like the sketch below.
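
      roughly, that “filter on top” amounts to fine-tuning a pretrained base model on a small personal corpus. a minimal sketch, assuming a hugging face-style setup; the model name and the data file are placeholders, not anything anyone actually shipped:

      ```python
      # sketch only: the base weights already come from text by millions of
      # people; the personal corpus just re-weights that base a little.
      # "gpt2" and "my_life_logs.txt" are placeholders.
      from datasets import load_dataset
      from transformers import (AutoModelForCausalLM, AutoTokenizer,
                                DataCollatorForLanguageModeling, Trainer,
                                TrainingArguments)

      base = "gpt2"  # stand-in for any pretrained causal LM
      tokenizer = AutoTokenizer.from_pretrained(base)
      tokenizer.pad_token = tokenizer.eos_token  # gpt2 has no pad token by default
      model = AutoModelForCausalLM.from_pretrained(base)

      # one person's lifetime of text: tiny next to the pretraining corpus
      data = load_dataset("text", data_files={"train": "my_life_logs.txt"})
      train = data["train"].map(
          lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
          batched=True, remove_columns=["text"],
      )

      trainer = Trainer(
          model=model,
          args=TrainingArguments(output_dir="persona", num_train_epochs=1),
          train_dataset=train,
          # causal-LM collator copies input_ids into labels (mlm=False)
          data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
      )
      trainer.train()  # the result is the same machine, lightly steered
      ```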

      as for the final paragraph, i want computers to exhibit as little personhood as possible, because i’ve read transcripts of the ELIZA experiments. it could do little more than pick out a subject-verb-object pattern and respond with the same noun it was fed, and people were still saying it should replace psychologists.
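
      a toy reconstruction of that kind of pattern matching, to show how shallow it is (not Weizenbaum’s actual rule set, just the flavor):

      ```python
      import re

      # toy ELIZA-flavored responder: match a pattern, echo the phrase back.
      # illustrative only; not the original program's rules.
      RULES = [
          (re.compile(r"\bi need (.+)", re.I), "why do you need {0}?"),
          (re.compile(r"\bi am (.+)", re.I), "how long have you been {0}?"),
          (re.compile(r"\bmy (\w+)", re.I), "tell me more about your {0}."),
      ]

      def respond(line: str) -> str:
          for pattern, template in RULES:
              match = pattern.search(line)
              if match:
                  return template.format(*match.groups())
          return "please go on."  # stock reply when nothing matches

      print(respond("I am upset about my job"))
      # -> how long have you been upset about my job?
      # no pronoun swap, no understanding, no memory
      ```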

    • Lag@lemmy.world · 1 day ago

      I wouldn’t want to talk to an AI either. Just have it send me a voicemail recording of the video, transcribed into text, delivered to my spam folder.