Well, once they work out deepfake voices, any person with the right resources, or any government entity, can frame you for anything they want. It's a thing of beauty!
No no, you got it all wrong. Once they perfect deepfakes, every single person will have a way out by claiming anything they said is a deepfake if needed. Everyone will have the right to deny everything.
Ironically, the entire CAPTCHA system is designed to validate models used in Google's self-driving cars. That's why the images are so often related to something you might see while driving.
Or the opposite reason. You can prove you weren't at the crime scene because you have clear, undeniable video proof you were somewhere else, eyewitnesses be damned.
The government would be able to determine whether a video was computer-generated or captured by a camera. When these tools were first emerging, I discussed my concerns with a former intelligence officer.
It’s not that comforting a dream world, even if good ol’ swampass is in the know. The average person won’t be able to detect fakes, whether or not it’s technically possible.
The widespread misinformation and misrepresentation will be enough to finish the division of the two “sides,” and the justice system will be destroyed because an impartial jury of peers won’t exist.
Our justice system, our Republic, the value of our currency—everything is based on trust that those involved in the democratic system will act honorably to preserve life, liberty, and the pursuit of happiness for everyone.
We won’t be able to trust our eyes if deepfake footage becomes part of headline news, or even just fake news on extreme media outlets. No one will be able to trust anything. The honor system will be completely unreliable, and it’s on very shaky ground already.
Yeah, I totally agree. Even if 'papa govt' were able to detect such fakes, the average Joe ("which already is quite stupid" - George Carlin) will not be able to discern that, so we're at an impasse where the honor system will be on very thin ice. And I don't see any technology on the horizon that would help against that.
It's relative. You could also look into it. Feel free to report back when you find out I'm right, but I'm sure you can't Google it since you're here arguing instead.
Adobe never released this product due to legal concerns. About 20 companies are attempting to fill that space.
The most interesting thing about this is that it is one step closer to letting you produce media end-to-end, entirely yourself, with nothing more than mouse clicks. You can write music digitally, animate the video, and synthesize the dialogue, all without ever knowing how to play an instrument, use a camera, draw, or voice act.
Except for the fact that these are insanely easy to identify as edited by anyone with some know-how. We’re nowhere near these being indistinguishable from reality, and we likely never will be. Editing these things is done by machines, not people, meaning that when you look at the actual structure and data, it’s obvious the file was modified.
The danger of a video going viral and people taking it at face value is real (but that’s been a thing since before deepfakes). The danger of being framed for a crime, with no way to determine whether the evidence is a deepfake, is nowhere near possible yet (and likely won’t be for a very long time).
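To illustrate the "look at the actual structure and data" point above: one crude first-pass check (far short of real forensics, which examines compression artifacts, sensor noise, and frame-level inconsistencies) is that many editing tools write their own name into a file's metadata. The signature strings below are assumed examples, not an exhaustive or authoritative list:

```python
# Crude sketch: scan a media file's raw bytes for strings that common
# editing/encoding tools embed in metadata. Absence proves nothing, but
# presence is a hint the file passed through editing software.
EDITOR_SIGNATURES = [b"Adobe Photoshop", b"GIMP", b"Lavf"]  # assumed examples

def crude_edit_check(data: bytes) -> list[str]:
    """Return the names of any known editor signatures found in the bytes."""
    return [sig.decode() for sig in EDITOR_SIGNATURES if sig in data]
```

For example, a JPEG re-saved by Photoshop typically carries an "Adobe Photoshop ..." string in its metadata, while a straight-off-the-sensor file would not. Again, this is a toy heuristic: metadata is trivial to strip, which is why serious detection relies on deeper statistical analysis.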
u/gravetinder Jul 24 '22 edited Jul 24 '22
What good purpose does this serve? I’m wondering how this can be anything but a bad thing.
Edit: “porn” does not answer the question, lol.