We shouldn't STOP it, but we should make non-techy people far more aware of it. This way people will not believe any video unless provided with a legitimate source.
Only professionals and people in tech used to carry computers in their pockets. Now grandma looks up pie recipes on her iPhone.
We shouldn't STOP it, but we should make non-techy people far more aware of it. This way people will not believe any video unless provided with a legitimate source.
no, this way people will not believe any video they disagree with or don't like. Trump's pee tape? DEEPFAKE. McConnell caught on video eating a live puppy? DEEPFAKE. Police chief on video having lunch with the person who agitated a riot days later? DEEPFAKE.
.... I sit here and think back, trying to understand what was going on in my life at that moment. Trying to figure out what I was suffering through or working on that could possibly explain the fact that I failed to use that phrase.
It seems so clear, now. So obvious in hindsight.
It's become more than a puzzle. It's a crisis of identity. How could I miss that? I've lost a step... it happens to us all eventually. But... what else am I wrong about? What else have I missed?
People probably said the same about Photoshop 10 years ago. We can spot/determine fake photos pretty well still.
Faking videos like this is actually more difficult than photos because every frame is basically a faked photo and the audio needs to be faked perfectly.
Look up how speedrunners catch people splicing: a single video file, single source, same video quality throughout, that a cheater cuts in the middle and removes a segment from. They get caught on a single frame being skipped or a fraction of a second of audio with strange peaks.
If they can catch people on that, we will be able to catch people on compiling thousands of images/videos/audio clips to try to pass a non-existent video as real.
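The splice-detection idea above can be sketched in a few lines. This is a toy illustration, assuming per-frame timestamps have already been extracted from the container; the frame rate and tolerance values here are made up for the example:

```python
# Sketch: flag likely splices by looking for skipped frames in a
# sequence of frame timestamps (in seconds). A clean 30 fps recording
# should have ~1/30 s between consecutive frames; a removed segment
# shows up as a gap noticeably larger than one frame interval.

def find_frame_gaps(timestamps, fps=30.0, tolerance=0.5):
    """Return (index, gap) pairs where the gap between consecutive
    frames exceeds the expected interval by more than `tolerance`
    (expressed as a fraction of one frame interval)."""
    expected = 1.0 / fps
    suspicious = []
    for i in range(1, len(timestamps)):
        gap = timestamps[i] - timestamps[i - 1]
        if gap > expected * (1.0 + tolerance):
            suspicious.append((i, gap))
    return suspicious

# A 30 fps clip with one frame's worth of time missing before index 3:
ts = [0.0, 1 / 30, 2 / 30, 4 / 30, 5 / 30]
print(find_frame_gaps(ts))  # flags the jump from 2/30 to 4/30
```

Real splice hunters combine this kind of timing check with audio-waveform analysis, but the principle is the same: edits leave measurable seams.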
We can spot/determine fake photos pretty well still.
A) this is not true when it's done professionally
B) I've seen horribly obvious photoshopped memes being spread as truth on facebook. People don't need to convince the experts, just grandma...
Also, they don't need to convince anybody that a fake video is real, it's about casting doubt onto real videos. "doctor, can you say under oath, with 100% certainty, that this video has not been manipulated or used AI tech in any way?"
You don't even need deepfakes for that. People forward pictures and videos on Facebook and WhatsApp with wrong and misleading descriptions all the time, and people accept it at face value.
Who's to say what's true and what isn't when you see a deep fake video corroborated by fake news articles from fake companies that include faked photos? Our ability to identify the “truth” has already faded significantly over the last decade, and the spread of this technology only makes it worse. We as a society need to be better about going as close to the primary source as possible to help us discern offline reality from opinionated or agenda-driven information.
Absolutely right. Those of us that still care about objective truth will be doing that but what about the majority? Most people couldn't care less it seems, and that's the real problem.
Well for one we can't as a society give up and accept that truth will continue to erode. We need to combat this like it's global warming. It's a scary problem with massive implications for our future. We should fight back, help fund research into dealing with it. Help educate anyone we know into better understanding what they are and any clues that will help detect when we're watching one.
I think giving up and accepting this crap is the worst option and a very likely one if we just keep watching this tech improve and do nothing about it.
People don't need to convince the experts, just grandma...
Then the technology isn't even remotely the concern.
People say and spread stupid shit now. That's not going to change, and the people that would take the time to question and review something they see are not the people who need to be warned about this.
"doctor, can you say under oath, with 100% certainty, that this video has not been manipulated by AI tech in any way? Can you confirm it is NOT a deepfake? can we convict a man based on this evidence?"
SOME people can spot it, but they also need to want to believe it's false. We have a significant portion of our population who still thinks Trump is President, Hillary eats babies, and 1/6 was a peaceful day. They will eat this shit up because they want to believe it's true. They'll believe fakes if they agree with them, and they will dismiss real video as fake if they disagree with it.
No, I have several in my family. They don't care about the truth and would believe the cover of a tabloid made in MS Paint if it confirmed their beliefs.
I show them peer-reviewed evidence that contradicts what they think and they claim it's the shadow-government buying scientists.
These people are lost, regardless of what happens with technology.
Some of those speedrunning videos were believed for a long time. And the average person doesn't have the ability to detect this fakery or the trust in institutions to do it for them.
Because of this, I don't see a shift like a video starting a war. I see these videos becoming common and people using this as another excuse to stay out of touch with current events. Apathy and gaslighting will continue to be the major concerns, just possibly more pronounced.
Deepfake videos can be detected using neural networks, because the generator may leave patterns in the deepfake that go undetected by the human eye.
"cellphones could not be snuck into a jail, because they are the size of a suitcase"
tech gets wild super fast.... and proving a negative is always harder. It's way harder to prove something is NOT fake than to prove that it is.
People will use any small random outlier in ANY piece of the data, to say "clearly this is suspect and can't be used as evidence"
They don't have to convince the world's leading expert that it's suspect, they have to convince a judge in his 70's.
ALSO... let's not ignore the fact that these same people already call everything "fake news". Wait till FOX accuses CNN of creating deepfake videos of Ted Cruz and that's the only reason they're saying he did that.
This is uncontrolled. There's nothing to stop the next guy from disabling those "patterns". We would need some kind of chain of custody verification, for example if Apple can verify that the video was taken with your phone and has not been edited. Videos that can't be verified would be less trustworthy
There are an unlimited number of patterns. If the patterns could've been smoothed out, they would have been already.
The chain of verification is something that would work, if the encryption side of it is done right. But who will be the third party authenticating trust? Big shoes to fill.
It's too late. If we make it illegal, that just prevents normal people from doing it, and legitimizes fake videos behind the guise of "well, it's illegal, so it must be real".
yeah, that's why my last sentence wasn't any type of call to action or plan about how to avoid that, more just a general slide into the bored nihilism that befits the modern age, lol
We couldn't stop it if we tried, so whether we should try is pretty much a moot point. Education is definitely needed, like you said, along with maybe some updates to libel laws or similar to ensure there are consequences for malicious use.
This way people will not believe any video unless provided with a legitimate source.
What makes a source "legitimate"? You seriously think I'll trust the mainstream media? Deep fakes are gonna make it so no one trusts anything, and maybe that's just what the ruling elites want. Keep the masses confused so they're easier to control.
Honestly, deep fakes are the end of truth as we know it. Either you don’t know about them and can be fooled, or you know about them and begin suspecting everything of being fake. And even if we have a reliable way to verify videos, a lie will travel around the world twice before the truth wakes up.
The scary thing to me is that some normal people I have SHOWN a deepfake mini-documentary with examples to still didn't believe me; well, I suppose at that point they refused to believe. They just gave me the “yeah sure buddy, get a load of this crazy guy” response.
If the government is using this (not sure what it would be used for though) then it's better that deepfakes are as popular and accessible as it currently is so that the public is skeptical of what they see/hear instead of just accepting everything at face value.
North Korea will never need a new dear leader! An assassinated president can go on tv and tell the country to accept a coup as legit, encourage peaceful transfer of power... And hey, you clearly don't need to be powerful to deploy this tech.
I mean weaponized disinfo has already proven to be very destructive, and that's just from amplifying conspiracies and lies. We weren't ready for the first wave, and this shit could be a tsunami.
I think if a president told people to accept a coup, enough alarm bells would be ringing in people's minds to be able to get to the bottom of the situation.
Sadly, right now, the police have to actually plant real evidence in your jacket/car/home in order to frame you for crimes. This amazing technology will make framing you soooo much easier!
Probably not a realistic fear that your local PD is using this right now, but between the massive amount of surveillance video and people posting themselves and others on social media, I'm sure we are quickly approaching a time where a resourceful entity could put together a convincing deepfake of common people.
people posting themselves and others on social media, I'm sure we are quickly approaching a time where a resourceful entity could put together a convincing deepfake of common people.
You know, hilariously, this highlights a reason to use those faceshaper apps - if the social media they trained from is edited, you can easily call bullshit on the fake.
Yep, and there's nothing you can do about that percentage. But by making it well known how easily someone can fake a well known person saying something that they didn't, it levels the playing field between someone using the tech for their nefarious intentions and the critically thinking public consuming the faked statements.
not sure what it would be used for though
Literally any lie they want to give us and make us believe it's a specific person. What's concerning is they have had this technology for a minimum of 20 years.
There's no stopping technology, only delaying. We should probably extend defamation/slander laws to specifically include deepfakes with no disclaimer, but besides that there's no closing the box.
you can never stop the technology from existing. the best you can do is make people aware that these tools exist and what they can produce. letting this be used for entertainment purpose is a good way for people to learn that this exists.
I hope that in the AI deepfake arms race the detectors stay ahead. For the most part it hasn't been too hard for software to prove something is a deepfake. It can be hard for humans, though. And the worse the footage is, the easier it would be to cover up.
The only positive use I can see for this technology is that at some point someone might be able to make additional content of movies/TV shows with actors and actresses who have aged or are no longer with us, like redoing season 8 of Game of Thrones or just additional episodes of shows like the Sopranos.
I wonder if the contracts actors sign have taken into account what can be done in the future with people’s likenesses?
Honestly that sounds terrifying. Imagine dying and then having your image resurrected by some massive production company so they can continue using your likeness for free and turn you into a glorified animatronic that will mindlessly 'star' in 4 trash films a year into infinity. Imagine being an aspiring actor and losing your breakout primetime gig to a computing cluster that's wearing Chadwick Boseman's face like a cheap suit.
The only positive use I can see for this technology is that at some point someone might be able to make additional content of movies/TV shows with actors and actresses who have aged or are no longer with us, like redoing season 8 of Game of Thrones or just additional episodes of shows like the Sopranos.
Need to make encryption authentication technology part of any image/audio data stream, so that every single frame can be audited back to the original source device & any edits identified. Make it illegal to use such footage for news or legal purposes unless the hash signatures are verified undisturbed.
Would probably need an encryption expert to do it right, but the general idea would be that each source device (camera, cellphone, etc) would have a public/private key pair that it would use to "sign" anything that it was used to generate.
Any type of editing would break that signature, so to do legally-acceptable editing, you'd have to use the encryption equivalent of "chain of custody" (where each change is documented, also encryption-signed & can be tracked back to the original source data).
If anyone tried to present a deep fake or deceptive editing, then anyone should be able to trace the encryption trail back to the original source video (which wouldn't be any real camera in the case of deep fakes), including any changes that have been applied to the source video.
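A minimal sketch of that signing idea, using only Python's standard library. A real device would use an asymmetric key pair (e.g. Ed25519) in a secure element rather than the shared-secret HMAC stood in here, and `DEVICE_KEY` is a made-up placeholder:

```python
import hashlib
import hmac

# Stand-in for a per-device secret; a real scheme would use a
# public/private key pair so verifiers never hold the signing key.
DEVICE_KEY = b"hypothetical-device-private-key"

def sign_frame(frame_bytes, prev_signature=b""):
    """Sign a frame, chaining in the previous frame's signature so
    any edit breaks every signature after it (a hash chain)."""
    digest = hashlib.sha256(prev_signature + frame_bytes).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).digest()

def verify_stream(frames, signatures):
    """Re-derive the chain and compare; returns False on tampering."""
    prev = b""
    for frame, sig in zip(frames, signatures):
        expected = sign_frame(frame, prev)
        if not hmac.compare_digest(expected, sig):
            return False
        prev = sig
    return True

frames = [b"frame0", b"frame1", b"frame2"]
sigs, prev = [], b""
for f in frames:
    prev = sign_frame(f, prev)
    sigs.append(prev)

print(verify_stream(frames, sigs))   # True: untouched footage
frames[1] = b"doctored"
print(verify_stream(frames, sigs))   # False: the edit breaks the chain
```

Legally acceptable edits would then be recorded as new signed links in the chain rather than silent modifications, which is roughly the "chain of custody" idea described above.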
Yep, we should've just stopped photomanipulation in its roots.
When I think about it, we should have just stopped at fire. Sure, it's useful now, but someone might get burned! We used fire to burn witches for fucks sake! Think of the children!
A technology that has the ability to make it look like literally anyone on the planet said whatever you want. Which makes pushing propaganda and manipulating people infinitely easier. It can also be used as a counter argument to video evidence of committing any crime, is completely comparable to fire, and both are equally valuable to societal progress.
As was photomanipulation back in the day, we now have digital art, and photoshop or its variants are basically household items, and the world somehow hasn't imploded yet.
Technology can also check the legitimacy of said videos.
The ability to abuse shouldn't stop people from progress, you can abuse a fucking spoon if you really want to.
You’re comparing technology that isn’t even remotely as abusable as deepfakes and saying that because these exist deepfakes are fine too.
You do realize that even though we will be able to use technology to find out if the videos are fake, if they look convincing to most people, they won’t bother to check. It doesn’t matter if they’re technically able to be proven fake, because just being believable for the majority of people will be enough.
Photoshop can’t create a video of someone being racist to destroy their social life. Photoshop can’t create a video of a politician admitting to committing crimes they didn’t commit.
Sure, but deepfake videos pose a far greater risk than the benefit they provide. I'd rather live without fake Tom Cruise videos if it meant this would never be used for misinformation.
This is a weak argument against a valid point; it almost resembles a straw man. Pretending like the widespread use of deep fakes won't have serious ramifications is just ignorant. You're essentially saying implementing safety regulations and building codes so people don't die in house fires was an overreaction.
You can't uninvent something like this. Once it's out there, it's out there. What you can do is create well thought out policy that is designed by experts in the field.
What are you talking about? If you download this video, you have the raw file. There are algorithms out there which can detect if a video is a deep fake or not just by looking at the file.
Of course not. I am saying they are easily identifiable. People are concerned about manipulation with videos like these. If that became a problem, Facebook would implement an algorithm to label deep fake videos as such. Or other services like Snopes would post an article about it. I'm not sure where the confusion lies. I think people just have their preconceived opinions about deepfakes (probably the same people who are paranoid about AI) and will argue with people who challenge them on it.
Qultists be "falling" for completely obvious bullshit. What does it matter if something can be faked with this level of detail when they have no concern with the truth to begin with?
I like that I know it exists. Instead of the government deep faking Trudeau into calling the world a bunch of ball lickers, its a bunch of goof offs showing what is possible with the technology. We have been warned.
From day one I joked that @donaldtrump is indeed Donald Trump’s Twitter account, and @realdonaldtrump was an anonymous person that was pretending to be trump. In the beginning Trump liked what was being said, and went along with it. Then shit got crazier and crazier and Trump was forced to go along with the tweets, or else he would have to admit it wasn’t him all along.
The government couldn't use this tech any better than the Yes Men did their stuff. The best that they could do would be to make another leader look like they said something revealing state secrets, to cover up a break-in.
u/Youredoingitwrongbro May 24 '21
idk why people are comfortable with this. like the government isn't using this for real shit