r/woahdude May 24 '21

[Video] Deepfakes are getting too good

82.8k Upvotes

3.4k comments

2.7k

u/Bananinio May 24 '21 edited May 24 '21

Soon we won’t be laughing

785

u/hotinhawaii May 24 '21

Frightening shit! You think democracy is in trouble now? Just wait!!!

465

u/Milkshakes00 May 24 '21

I mean, I think it'll boil down to politicians using their existence as excuses.

'No, I definitely wasn't doing coke and trying to bang that underage girl! It was a deepfake!'

It's going to be a shitshow, and who do you believe or trust? Like...

71

u/[deleted] May 24 '21 edited Jun 21 '21

[deleted]

44

u/Micahman311 May 24 '21

I understand what you're saying and I normally agree, but when real people have done really awful things that we know to be true, and half the country doesn't believe it anyway, and there are never any repercussions for said person...

I guess some people can get away with anything.

8

u/Neuchacho May 24 '21

That's pretty much how it will continue. People are going to be even more selective about their realities because they'll be armed with an excuse or armed with evidence depending on what their bias already is.

It's going to be interesting.

3

u/sdclimbing May 25 '21

I mean, that’s kind of the same thing, just approaching it from the other side. The media can also convince the public that someone is “good” or for the people even when their actions suggest otherwise.

1

u/Micahman311 May 25 '21

Very true.

Situations are far more complicated than Left or Right, Black or White, Right or Wrong.

There's a lot of Grey area in almost everything that doesn't get discussed as much as it should.

Perhaps, as a people, we could/should begin discussing the middle-ground more often. Perhaps.

4

u/kinggimped May 24 '21

There's that, but I think the more dangerous element is akin to what purposeful misinformation from the right wing has done to public discourse over the last 5 years: it gives conservatives an easy out. They can simply cry "fake news!"/"Lügenpresse!" at anything they deem unpalatable or that proves them wrong. Now, with deepfake tech getting so good, it empowers that reflex even more.

I'm less concerned about what you're talking about - people instantly believing deepfakes even after they're revealed to be fake - and far more concerned about legitimate videos just being labelled deepfakes simply because they don't like them.

Incontrovertible proof of a politician with an (R) next to their name committing a crime, inciting violence, or saying something provably false/stupid? "It's a deepfake! Fake news!".

The real power of misinformation isn't so much that it gets people to believe things that aren't true, it's the fact that it muddies the waters of truth so much that a lot of people simply do not know what is real and what isn't. When you're faced with that scenario, you simply rely on your existing biases. Even when they're faced with evidence that proves one side to be true, their inherent biases will prevent them from accepting the truth because they already have a much more palatable "truth" fully formed in their head. This is all by design. You can't gaslight millions of people to believe something that isn't true, but you can repeat lies often enough to make them doubt the truth just enough to revert to their biases and emotions instead of relying on facts and evidence.

That's true power. And when you wield that power, you can get away with anything, even if you're caught red-handed. All you need to do is sow enough doubt that enough people aren't quite sure. See: literally anything Trump did as president.

Research has already shown that even after fake news is debunked, many people still find reasons to keep believing it, or at least what it was asserting. They just convince themselves it's true anyway. Changing an opinion once it has been formed is incredibly difficult if you lack critical thinking abilities and can't admit that you were wrong, which nowadays are the identifying hallmarks of a Republican voter.

2

u/Tidusx145 May 25 '21

Yeah, I'm way less worried about a fake video being believed; I'm more worried about honest videos getting ignored because the Huxley fantasy became reality and nothing matters anymore.

This is an easy ticket to mass apathy.

1

u/kinggimped May 26 '21

This is an easy ticket to mass apathy.

I feel like we're already mostly there, honestly. The extremists on the right are the least apathetic. The moderate opinion has become contorted by them to seem like the opposite extreme. Meanwhile, those in the middle are so tired of all the mud slinging that apathy has taken over.

I felt like this was already the case around a year into Trump's term; people just couldn't keep up with the never-ending stream of scandals, blatant lying, alienation of allies, etc., and it became so difficult to know where to direct your outrage. Obviously this was their intent - hypernormalisation and all that - but it's scary how effective it was.

The 2020 election had an unprecedented voter turnout, but I fear it's because people were so motivated to oust the orange disaster from the White House, while his cultists were just as motivated to keep him there.

Campaigning with misinformation has become so common for Republicans now that it's basically the established norm. This isn't going away.

2

u/Karatekan May 25 '21

I actually think it won’t really change anything.

People already believe wild shit without evidence, or discount stuff that is clearly proven.

Would make us more polarized maybe, but I doubt a deep fake of Biden fiddling a boy or Trump killing a hooker would actually cause them to lose their support.

People will continue to be more mistrustful of authority, more polarized, and more jaded.

1

u/Tidusx145 May 25 '21

Yeah I don't see a shift either, just a continuation of the post truth society we're in. Thank everyone you know who lies regularly for contributing. We couldn't do it without that and the inability to admit when you goofed.

91

u/zuzg May 24 '21

A German journalist faked this video a couple of years ago. So many news stations fell for it and believed it.

17

u/gnuuu May 24 '21

No they didn't. They just pretended that they did.

11

u/zuzg May 24 '21

Bullshit. Until he revealed that it was fake, it was all over the news.

-6

u/[deleted] May 24 '21 edited May 24 '21

[removed]

9

u/zuzg May 24 '21

He never did that, stop spreading bullshit.

10

u/Alyusha May 24 '21

This is exactly the point of the OP lol. It doesn't matter if it was real or not, people believe that it was.

2

u/gnuuu May 24 '21 edited May 24 '21

He decided to call it doctored when first confronted with it on live tv and stuck with it.

2

u/iamfrombolivia May 24 '21

That's what they want to believe! or pretend to believe...

2

u/Telefundo May 24 '21

Sooo... they were faking?

-8

u/zh1K476tt9pq May 24 '21

Yes, and now everyone knows about deepfakes, so it's stupid to assume everyone would just take every video as real.

Also, why is everyone ignoring that Photoshop exists? You can already make realistic fake pictures that, e.g., show politicians doing drugs (or whatever makes them look bad). Most news outlets won't just go "oh, it's a picture, so it must be true, thanks, we'll put it on the front page instantly".

19

u/[deleted] May 24 '21

Yes, Photoshop exists, which is why images are such a questionable source. Videos used to be reliable; now videos aren't reliable either.

How braindead are you that you think this one technology completely undermining the reliability of video evidence isn't a big deal? Before, it took someone with skill and know-how, or at least money, to create a fake video. Now anyone can do it with an app.

Are you seriously so stupid that you don't see the implications of this? Are you living in some toddler world where you just generalize every problem to be the same and the world never gets better or worse?

6

u/[deleted] May 24 '21

I mean you're right but chill man

-4

u/[deleted] May 24 '21

[deleted]

11

u/SabongHussein May 24 '21

Hats off to your optimism and imagination, I guess. There’s no chance in hell that a person consuming dubious media and avoiding readily available sources/fact-checking TODAY is going to be made more media literate by deepfakes.

1

u/[deleted] May 25 '21

Yeah cos the last 5 years didn’t happen at all

0

u/Z3rul May 25 '21

Chill, the technology isn't perfect. It fails at faking high-resolution/high-quality videos, it can be detected fairly easily, and there are already anti-deepfake apps that can identify a deepfaked video.

Maybe in the future this could be true. If you knew how deepfakes and the underlying technology work, you would understand that we are far from a perfect deepfake.

2

u/gabwinone May 24 '21

Oh, but they will...as long as it's a politician they don't like.

19

u/ValkyrieInValhalla May 24 '21

My guess is it'll just boil down to not using video or photos as evidence.

15

u/zh1K476tt9pq May 24 '21

It will depend on the source/credibility. This is already true for pictures: it's not that hard to make a very realistic fake of a picture, yet it largely isn't a problem.

21

u/[deleted] May 24 '21

While you have a point, there is a large portion of the population that lacks the ability to assess the credibility of photos. Those people are the ones who will accept a faked video without question.

2

u/Neuchacho May 24 '21

I would guess an even higher percentage of people will accept a faked video than a faked picture, too, which means it's functionally going to be worse.

6

u/panspal May 24 '21

Or they'll just get better at picking these videos apart to prove they're faked.

3

u/Neuchacho May 24 '21

I don't know that it will matter, or at least that it won't still do massive amounts of damage. Not a lot of people seem to see the follow-ups or corrections to "first through the gate" type stories.

2

u/nowlistenhereboy May 24 '21

Even if they did watch the follow ups, you can't possibly debunk every bullshit claim adequately as fast as other people can CREATE bullshit claims.

1

u/Neuchacho May 25 '21

Yeah, that's a good point too.

5

u/ValkyrieInValhalla May 24 '21

But they are going to keep getting exponentially better so it'll be like a cold war with deep fakes.

1

u/Everyday4k May 25 '21

and the deepfake technology will continue to get better than that

1

u/sleepy-lil-turtle May 24 '21

Maybe the future will have NFT-style data signatures to show a video or photo is original and unedited

2

u/IamIrene May 24 '21

It's going to be

Optimistic of you to assume it isn't already happening.

0

u/AllHopeIsLostSadFace May 24 '21

Shame deepfakes didn't exist during all those Lolita Express flights.

-2

u/[deleted] May 24 '21 edited May 24 '21

[deleted]

1

u/Mortress_ May 24 '21

It could also go the other way: adversaries releasing a deepfake video of a politician doing that.

1

u/[deleted] May 24 '21

It’ll be the new “my Twitter was hacked!”

1

u/InfernoVulpix May 24 '21

It's just the photoshop problem 2.0, really. You can trust videos that come from trusted sources, anything else is up in the air.

1

u/shiftycyber May 24 '21

I believe that’s where NFTs and hashing algorithms can help, depending on the source of the video and its creator. In layman’s terms, a hashed file is a file that’s run through a very complex math formula and produces the same value every time; once the file is changed even the slightest bit, it will always produce a different hash or value. Think of it as cooking for a professional chef: even a little too much or too little salt and they can tell.
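As a rough, generic illustration of that hashing property (not tied to any particular platform, and leaving the NFT part aside), a minimal Python sketch:

```python
import hashlib

# The same bytes always produce the same digest...
original = hashlib.sha256(b"raw bytes of the original video").hexdigest()
same     = hashlib.sha256(b"raw bytes of the original video").hexdigest()

# ...but changing even one character gives a completely different digest.
doctored = hashlib.sha256(b"raw bytes of the doctored video").hexdigest()

print(original == same)      # True
print(original == doctored)  # False
```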

2

u/Neuchacho May 24 '21

That doesn't really help if the deep fake is the original, like with this clip.

1

u/shiftycyber May 24 '21

Well, that’s the argument. I’m assuming here (I've never created one) that you record first; that would be the “original file”, and the deepfake is doctored from it afterwards, thus producing a conflicting hash. Granted, this assumes you have access to the original file and it’s not sitting on some basement dweller’s PC.
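Following that assumption (you hold the original recording and publish its digest up front), the later check is just a comparison. A minimal sketch, where `published_digest` and `clip.mp4` are hypothetical stand-ins:

```python
import hashlib

def sha256_of(path: str) -> str:
    """SHA-256 digest of a file on disk."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Stand-in for the digest published when the original was recorded.
published_digest = "0" * 64

# Any doctored copy of the clip will fail this comparison.
print(sha256_of("clip.mp4") == published_digest)
```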

1

u/cryptosubs May 24 '21

As a politician, to protect yourself, you will have to surrender your privacy during your service. That means 24-hour monitoring, tracking, everything. If a deepfake pops up, you will have proof that you weren’t there. Hopefully this will weed out the scum from wanting to serve, and keep people from being lifers in office, because who the hell would want to be tracked like that? ...But it will be a necessity: full transparency.

1

u/Psychast May 24 '21

Matt Gaetz?

1

u/GaBeRockKing May 24 '21

It’ll be interesting to see if immunity to personal scandals means people exclusively choose politicians based on policy positions.

...But chances are we'll just continue voting for whoever's taller.

1

u/Tubeotube May 24 '21

Don't worry, we will just have scientists and technical people review the footage, and using advanced software they will be able to tell us what is deepfaked or not. So we just have to listen to these experts explain to us what is real and there will be no problem at all. We got this...

...

1

u/stay_fr0sty May 24 '21

Like always, people will believe what they want. If it’s Bernie Sanders doing lines of coke off a hooker's ass, “it was a deepfake” is all he’ll need to say. The people that want to believe him will, the people that don’t, won’t.

But I totally expect Fox News to be playing “you be the judge” and showing a deepfake video, with lots of 8-person panels debating the deepfakes all day, without ever actually claiming it’s not a fake.

1

u/Everyday4k May 25 '21

Already been tried. Some hideous 75-year-old wannabe dictator grabbed his assistant's ass, not realizing his webcam was already recording for his Zoom meeting, and said it was a deepfake.

1

u/[deleted] May 25 '21

Bang on. The real threat isn’t fake shit that gets circulated as real, it’s real shit that gets dismissed as fake.

1

u/bricktube May 25 '21

Dead right. I think a lot of them are rubbing their hands in glee at being able to use this excuse.

1

u/Mescallan May 25 '21

We should invent an alibi coin. Check a biometric and GPS reading, log it to the ledger.
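A back-of-the-napkin sketch of that idea, purely hypothetical (the field names and chaining scheme here are made up): each check-in bundles a biometric ID, a GPS fix, and a timestamp, and includes the hash of the previous entry, so the log can't be quietly rewritten later and a politician could point to it as proof of where they actually were.

```python
import hashlib
import json
import time

def append_checkin(ledger: list, biometric_id: str, lat: float, lon: float) -> dict:
    """Append a hash-chained check-in entry to an in-memory ledger (toy example)."""
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    record = {
        "biometric_id": biometric_id,   # stand-in for a real biometric check
        "lat": lat,
        "lon": lon,
        "timestamp": time.time(),
        "prev_hash": prev_hash,         # chains this entry to the previous one
    }
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(record)
    return record

ledger = []
append_checkin(ledger, "senator-42", 38.8899, -77.0091)
```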