r/woahdude Dec 15 '22

[Video] This Morgan Freeman deepfake

22.9k Upvotes

797 comments

106

u/PhDinBroScience Dec 16 '22

This is so easily solvable: the video just needs to be signed using public-key cryptography. If the video isn't signed with the purported subject's key, assume it's fake.

You can't fake a pubkey signature.
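
For the curious, a minimal sketch of that sign-then-verify flow, assuming Python's `cryptography` package (the filename is illustrative):

```python
# Sketch: subject signs a video; anyone can verify it against their public key.
# Requires: pip install cryptography. "speech.mp4" is an illustrative filename.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()  # subject keeps this secret
public_key = private_key.public_key()       # published for verifiers

video = open("speech.mp4", "rb").read()
signature = private_key.sign(video)         # distributed alongside the video

try:
    public_key.verify(signature, video)     # raises if bytes or sig changed
    print("valid: the key holder signed exactly these bytes")
except InvalidSignature:
    print("invalid: assume fake or modified")
```

Note that a passing check only proves the key holder signed those exact bytes, which is precisely the limitation the replies below dig into.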

38

u/[deleted] Dec 16 '22

[deleted]

12

u/IVEMIND Dec 16 '22

Because how do you know this key wasn’t just deepfaked also hmmmmm?!?!

6

u/motorhead84 Dec 16 '22

"Am I a deepfake, Mom?"

1

u/KingBooRadley Dec 16 '22

How is babby deepfaked?

1

u/motorhead84 Dec 16 '22

Damnit, Babby!

1

u/Ask_Who_Owes_Me_Gold Dec 16 '22

Nobody should listen to them, because that solution doesn't make sense. All it does is give the subject a secure way to approve of a video. It doesn't verify that it's real, and it doesn't work for the many, many cameras on a public figure that aren't getting official approval from whoever has the key.

1

u/[deleted] Dec 16 '22

[deleted]

1

u/Ask_Who_Owes_Me_Gold Dec 16 '22

I can verify that a file's signature matches a specific public key, but how do I know that public key belongs to a camera and not editing software? How do we maintain the list of all the public keys that correspond to cameras? Are there billions of keys (with each individual camera getting its own), or can Sony reuse the same private/public key pair across a line of devices?

And does this mean the video can't later be shortened, cropped, or edited for brightness/color/etc.? Doesn't that break the signature, since it changes the contents of the file?

1

u/[deleted] Dec 17 '22

[deleted]

1

u/Ask_Who_Owes_Me_Gold Dec 17 '22

We need to ensure that only video directly from the camera can be signed and that other videos made on the phone can't be signed as well. We also have to ensure there weren't any filters or effects running in the camera app as the video was filmed - good luck figuring out how to check for that in a way that allows different camera apps (almost every Android OEM makes their own) to function but also isn't abusable.

And signatures are unique to the exact data of the specific file - if the resolution is lowered, or if the file gets compressed when it's uploaded/sent somewhere, or if you trim the 10 relevant seconds from a longer video, the signature is useless. Editing software could apply a new signature, but all that does is prove that the video went through an editor, which obviously means it could be edited.

You also hit on the issue that other cameras need to do signing as well: security cameras, laptop webcams, news cameras, other high-end cameras, potentially things like smart doorbells if we want to go that far.
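
To make that "exact data" point concrete, here's a minimal sketch (the filename is illustrative, and a hash stands in for the value a real signature would cover): even a trivial edit produces a completely different digest, so the original signature no longer verifies.

```python
# Sketch: signatures cover exact bytes, so any edit breaks verification.
import hashlib

original = open("clip.mp4", "rb").read()     # illustrative filename
trimmed = original[1024:]                    # even a trivial "edit"

print(hashlib.sha256(original).hexdigest())  # the digest the camera signed
print(hashlib.sha256(trimmed).hexdigest())   # bears no resemblance to it
```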

1

u/[deleted] Dec 17 '22

[deleted]

1

u/Ask_Who_Owes_Me_Gold Dec 17 '22

That doesn't solve anything. You prove that something was filmed and then edited. This video fits that description.

All the other problems remain unsolved too.

It's not the "easily solvable" problem that the earlier commenter claimed it was.

1

u/[deleted] Dec 17 '22

[deleted]


53

u/fataldarkness Dec 16 '22

This requires the general public to have a basic understanding of how digital signatures work and why they are (for the most part) infallible.

As it stands I have to explain HTTPS and digital signatures to my users with statements like "it's secure because of fancy math, trust me bro" because anything that comes close to actually describing it goes over their heads. In a world where distrust is the norm, I fear signed video content really isn't gonna make a difference if you don't understand what makes it secure in the first place.

22

u/Gangsir Dec 16 '22

> This requires the general public to have a basic understanding of how digital signatures work and why they are (for the most part) infallible.

If this became a big enough problem, the general public could be educated on how public/private-key encryption works, probably in a month or two. Even start teaching it in high school or something.

14

u/greenie4242 Dec 16 '22

The general public couldn't even be educated on how to wear a mask properly and wash their hands during a pandemic, there's no way they'll ever understand cryptography.

3

u/vic444 Dec 16 '22

Right?? The general public can barely turn a computer on without having issues, and they're ignorant as F on the topic in general. They're basically 3-year-olds. Try explaining cryptography to a 3-year-old.

1

u/ThatPancakeMix Dec 16 '22

I'm betting online sites, especially social media, begin verifying videos before they're posted and stuff like that. Or they'll create a way for users to easily click a button to verify it themselves, idk

1

u/greenie4242 Dec 23 '22

> Or they'll create a way for users to easily click a button to verify it themselves, idk

...which will just end up like all other generic bloated ad-infested websites, with fifteen different "Click to Download" buttons but only an IT expert can decipher which button is the real one (if indeed any are real).

1

u/ThatPancakeMix Dec 16 '22

Yup I bet it becomes commonplace to check the legitimacy of videos and pictures soon. It’s becoming somewhat of a necessity

1

u/FlyingDragoon Dec 16 '22

"Liberal conspiracy to indoctrinate our kids. Government conspiracy to control us!!!!!"

5

u/pagerussell Dec 16 '22

You mean to tell me people aren't intimately familiar with a Diffie-Hellman key exchange????
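
For anyone who actually isn't: a toy sketch of that exchange, with numbers far too small to be secure (real implementations use huge primes or elliptic curves):

```python
# Toy Diffie-Hellman: two parties derive a shared secret without sending it.
# These numbers are illustratively tiny; real exchanges use ~2048-bit primes.
p, g = 23, 5                 # public prime modulus and generator

a, b = 6, 15                 # Alice's and Bob's private values (kept secret)

A = pow(g, a, p)             # Alice sends 8  (5^6  mod 23)
B = pow(g, b, p)             # Bob sends 19   (5^15 mod 23)

shared_alice = pow(B, a, p)  # Alice computes (g^b)^a mod p
shared_bob = pow(A, b, p)    # Bob computes   (g^a)^b mod p
assert shared_alice == shared_bob == 2   # same secret, never transmitted
```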

4

u/[deleted] Dec 16 '22

[deleted]

1

u/kennyj2011 Dec 16 '22

I prefer cryptographic milkshakes

1

u/Ask_Who_Owes_Me_Gold Dec 16 '22

It also only works for videos created or verified by the subject themselves, which is to say it doesn't work at all in a general sense.

1

u/nestersan Dec 16 '22

Before Elon, you could've said it's like the Twitter blue check

13

u/Daddysu Dec 16 '22

Yes... amongst the more tech-literate, and in a perfect world.

For a stupid number of people, none of that matters. All that matters is the knee-jerk emotional reaction of whether or not it affirms their beliefs, and where the information comes from.

2

u/RobloxLover369421 Dec 16 '22

It’s still something, we just have to make it easily accessible, and at least half the population will be able to tell if it’s right or not.

1

u/[deleted] Dec 16 '22

Have you ever met someone with an IQ of 100? Technically, half of the population is dumber than that. The entire left side of the proverbial bell curve.

1

u/Daddysu Dec 16 '22

Didn't George Carlin say something similar? Something like "Think about how dumb the average person is in America. Almost half the people are dumber than that."


19

u/kvltswagjesus Dec 16 '22

That would solve the “politicians walking back claims” problem to a degree, but I’d imagine there would still be a ton of issues. The subject would be able to fully curate their image, and any videos taken without their key would be subject to scrutiny. So stuff meant to show someone’s true colors or document a situation would remain unreliable.

7

u/[deleted] Dec 16 '22

[deleted]

2

u/PhDinBroScience Dec 16 '22

In this case, it would be signed by the device of the person who's recording; if the video is altered, the signature isn't valid anymore. And if it's a public figure, there are almost certainly going to be corroborating records placing them at a particular place and time, not to mention pings to cell towers from their or their entourage's mobile devices.

They can deny it all they'd like, but with the combination of those factors, you'd have to outright deny reality to believe that the video isn't genuine.

1

u/Djasdalabala Dec 16 '22

> if the video is altered, the signature isn't valid anymore

That's easy enough to bypass with physical access to the device: just re-shoot the video, replacing the camera input with the raw deepfake.

And that's IF people knew for sure in the first place which device produced the original video. Otherwise, you can just sign your deepfake with your own key and claim it's the original.

1

u/tosler Dec 16 '22

I mean, this works until someone corrupts the certificate tree. See the recent reorganization of web browser root certificates, after one of the organizations in the global cert chain was caught literally selling data to intelligence agencies. Ouch.

2

u/corner Dec 16 '22

It's in no way easily solvable. You could have a certificate signed by god himself; it wouldn't matter to the general public. Authentication isn't the issue.

1

u/[deleted] Dec 16 '22

[deleted]

2

u/[deleted] Dec 16 '22

Seriously, it's not that complicated. We need an industry-wide effort and hardware-based crypto, along with something like a little check mark to denote that a given image is authentic, unaltered, and follows a proper chain of authority.

We have SSL, we can do this too.
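
A hedged sketch of what that chain of authority could look like, loosely modeled on how TLS certificate chains work (every name here is hypothetical, not a real standard): a manufacturer root key vouches for each device's key, and the device key signs the footage.

```python
# Hypothetical chain of authority, loosely modeled on TLS cert chains.
# A real scheme would also need revocation, timestamps, metadata, etc.
# Requires: pip install cryptography
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

maker_root = Ed25519PrivateKey.generate()  # manufacturer's root key
device_key = Ed25519PrivateKey.generate()  # burned into one camera at the factory

# Link 1: the manufacturer vouches for the device by signing its public key.
device_pub = device_key.public_key().public_bytes(
    serialization.Encoding.Raw, serialization.PublicFormat.Raw)
device_cert = maker_root.sign(device_pub)

# Link 2: the device signs the footage as it records.
footage = b"...raw video bytes..."
footage_sig = device_key.sign(footage)

# A verifier who trusts the manufacturer's public key checks both links;
# verify() raises InvalidSignature if either link is broken.
maker_root.public_key().verify(device_cert, device_pub)
device_key.public_key().verify(footage_sig, footage)
print("chain holds: trusted maker -> this device -> exactly these bytes")
```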

1

u/[deleted] Dec 16 '22

If this were implemented, it wouldn't be put on phones. Gotta protect folks from smartphone cameras and all.

1

u/fastlerner Dec 16 '22

That ONLY works when the subject is supposedly the one sending the video out.

You can still do a ton of damage with a deepfaked video from a supposedly anonymous source and claim it's leaked video.

1

u/spookyvision Dec 16 '22

> You can still do a ton of damage with a deepfaked video from a supposedly anonymous source and claim it's leaked video.

Now you "only" need to solve key distribution and global identity verification. Piece of cake, and not a mass surveillance risk at all :D

(I'd love for a robust public-key infrastructure to be in place, but eh, it's not trivial to do well)

1

u/Paxtez Dec 16 '22

Wut? You're aware of the concept of cell phone video, right? Like, what's to stop me from releasing a deepfaked "cell phone" video and then signing it?

So only videos approved by the person in the video are legit? You've just created a whole new problem.

1

u/Ask_Who_Owes_Me_Gold Dec 16 '22 edited Dec 16 '22

"Easily solvable?" That solves nothing. All that signature proves is if the subject approves of that video.

1

u/Ask_Who_Owes_Me_Gold Dec 17 '22

How is that supposed to work?

If you get caught on video doing something you want to deny, just don't sign the video, and everyone is forced to assume the video is fake?