If AI and deepfakes can be trained on video or audio of a person and then convincingly reproduce that person, what does this mean for trials?

It used to be that an audio or video recording was strong evidence, often weighing more than witness testimony, but soon enough perfect forgeries could enter the courtroom just as they already do on social media (where you're not sworn to tell the truth, though the consequences are real).

I know fake information is a problem everywhere, but I started wondering what will happen when it creeps into testimony.

How will we defend ourselves while still using real video or audio as proof? Or are we just doomed?

  • RangerJosie@lemmy.world · 7 hours ago

    We’re not. It’s going to upend our already laughably busted “justice” system to new unknown heights of cartoonish malfeasance.

  • Randomgal@lemmy.ca · 15 hours ago

    A bit dramatic, imo. For most of legal history we didn’t have recorded video or audio at all, and while they are great tools at present, they are still not the silver bullet people expect them to be at trial. (Think Trump and his cucks.) Furthermore, most poor people try to avoid being recorded when committing crimes.

    It will probably mean the focus shifts to other kinds of evidence and evidence-gathering methods. But it’s definitely not the end of law as we know it; far from it.

  • SirEDCaLot@lemmy.today · edited · 18 hours ago

    Eventually, we will just have to accept that photographic proof is no longer proof.

    There are ways you could guarantee an image is valid. You would need a hardware security module inside the camera that signs a hash of the picture with a built-in signing key that can’t be extracted, plus a serial number it generates. That proves the image came from a particular camera, and if you change even one pixel of that image, the signature no longer matches. I don’t see this happening anytime soon, at least not mainstream. One or two camera manufacturers offer this as a feature, but it’s not on things like surveillance cameras or cell phones, nor will it be anytime soon.
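The scheme described above can be sketched in a few lines. This is a hypothetical illustration, not any real camera's firmware: a real device would use an asymmetric signature (e.g. ECDSA) inside a secure element so verifiers never need the secret; HMAC-SHA256 stands in here just to keep the sketch self-contained. The names (`sign_image`, `verify_image`, `DEVICE_KEY`) are invented for the example.

```python
import hashlib
import hmac
import os

# Stand-in for the non-extractable key inside the camera's security module.
DEVICE_KEY = os.urandom(32)
serial_counter = 0  # the serial number the module generates per shot


def sign_image(image_bytes: bytes) -> dict:
    """Hash the image and tag (hash + serial) with the device key."""
    global serial_counter
    serial_counter += 1
    digest = hashlib.sha256(image_bytes).digest()
    tag = hmac.new(DEVICE_KEY, digest + serial_counter.to_bytes(8, "big"),
                   hashlib.sha256).hexdigest()
    return {"serial": serial_counter, "sha256": digest.hex(), "sig": tag}


def verify_image(image_bytes: bytes, record: dict) -> bool:
    """Recompute the tag; any changed pixel changes the hash and fails."""
    digest = hashlib.sha256(image_bytes).digest()
    expected = hmac.new(DEVICE_KEY, digest + record["serial"].to_bytes(8, "big"),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["sig"])


photo = b"raw sensor data..."
rec = sign_image(photo)
assert verify_image(photo, rec)                 # untouched image verifies
assert not verify_image(photo + b"\x01", rec)   # one changed byte fails
```

Changing a single byte of the image (the analogue of one pixel) breaks verification, which is the whole point of the design.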

  • Call me Lenny/Leni@lemm.ee · 18 hours ago

    A camera can only show us what it sees; it doesn’t objectively dictate a viewer’s interpretation of it. I remember some of us being called down to the principal’s office (before the age of footage-based scandals, which if anything show a shortcoming in the people advancing the rulings being so in awe of the footage, sadly a common occurrence, and one that adds to the “normal people distaste” I have, something authorities have made sure I’m no stranger to). The principal might say “we saw you on the camera doing something against the rules,” only to be answered with “that’s not me, I have an alibi,” or “that’s not me, I wouldn’t wear that jacket,” or “that’s not me, I can’t do that person’s accent” (the aforementioned serial slander of me being a prime example of where this would be the case). You might say footage is witness testimony from a machine, and machines have “just started” getting into the habit of not being very honest with the humans in the court. I remember my first lie.

  • LesserAbe@lemmy.world · 1 day ago

    I think other answers here are more essential - chain of custody, corroborating evidence, etc.

    That said, Leica has released a camera that digitally signs its images, and other manufacturers are working on similar features. That lets people verify whether an image is original or has been edited. From what I understand, Leica also has a scheme where edits can be signed as well, so there’s a whole chain of documentation. Here’s a brief article

  • logos@sh.itjust.works · 1 day ago

    Fake evidence, e.g. forged documents, is nothing new. Courts take things like origin, chain of custody, etc. into account.

    • ColeSloth@discuss.tchncs.de · 22 hours ago

      Sure, but if you meet up with someone and they later produce a completely fabricated audio recording of the conversation, there’s no chain of anything. Audio used to be damning evidence, and it was fairly easy to detect when it had been hacked together to sound different. If that goes away, audio just becomes useless as evidence.

  • AbouBenAdhem@lemmy.world · 1 day ago

    Maybe each camera could have a unique private key that it could use to watermark keyframes with a hash of the frames themselves.
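The keyframe idea above can be sketched as a tag chain: each keyframe's hash incorporates the previous frame's hash, so splicing or substituting a frame breaks everything downstream. This is a hypothetical illustration with invented names (`tag_keyframes`, `chain_is_valid`, `CAMERA_KEY`); a real design would embed an asymmetric signature as a watermark in the frame itself, while HMAC keeps this sketch stdlib-only.

```python
import hashlib
import hmac
import os

# Stand-in for the camera's unique private key.
CAMERA_KEY = os.urandom(32)


def tag_keyframes(frames):
    """Tag each keyframe with a hash chained to the previous frame."""
    prev = b"\x00" * 32
    tags = []
    for frame in frames:
        digest = hashlib.sha256(prev + frame).digest()
        tags.append(hmac.new(CAMERA_KEY, digest, hashlib.sha256).hexdigest())
        prev = digest
    return tags


def chain_is_valid(frames, tags):
    """Recompute the chain; any altered frame invalidates it from there on."""
    return tags == tag_keyframes(frames)


frames = [b"frame-1", b"frame-2", b"frame-3"]
tags = tag_keyframes(frames)
assert chain_is_valid(frames, tags)                          # intact video
assert not chain_is_valid([b"frame-1", b"frame-X", b"frame-3"], tags)  # spliced
```

Chaining is what makes this stronger than signing frames independently: an attacker can't reorder or drop keyframes without the mismatch showing up.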

    • OsrsNeedsF2P@lemmy.ml · 1 day ago

      Usually when I see non-technical people throw out ideas like this they’re bad, but I’ve been thinking about this one for a few minutes and it’s actually kinda smart.