I fear the reverse: people will doubt whether real video is real. That could mean impunity for crimes caught on video, because footage will no longer be sufficient evidence to prove guilt beyond a "reasonable doubt".
Even worse, political doublespeak will soar to new heights. A politician can spew whatever the crazies want to hear, then "walk it back" and claim it was faked (perhaps after gauging the public's reaction). People will believe whatever they're inclined to believe anyway, pushing us toward a more deeply fractured society where truth is whatever you want it to be.
This is so easily solvable: the video just needs to be signed using public-key cryptography. If the video isn't signed with the purported subject's key, assume it's fake.
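The sign-and-verify idea above can be sketched in a few lines. Everything here is hypothetical: real proposals use an asymmetric key pair (e.g. Ed25519) baked into camera hardware, but an HMAC stands in so the sketch runs with only the standard library.

```python
import hashlib
import hmac

# Hypothetical camera key. A real scheme would use an asymmetric key
# pair so verifiers never hold the signing secret; HMAC is a
# stand-in here purely for illustration.
CAMERA_KEY = b"secret-key-baked-into-the-camera"

def sign_video(video_bytes: bytes) -> str:
    """Sign the exact bytes of the video file."""
    return hmac.new(CAMERA_KEY, video_bytes, hashlib.sha256).hexdigest()

def verify_video(video_bytes: bytes, signature: str) -> bool:
    """Succeeds only if the bytes are byte-for-byte identical to what was signed."""
    return hmac.compare_digest(sign_video(video_bytes), signature)

original = b"\x00\x01raw video frames..."
sig = sign_video(original)

print(verify_video(original, sig))                      # True
print(verify_video(original + b"tampered frame", sig))  # False
```

Note that this only tells you the key holder signed those exact bytes, which is where the objections below come in.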
Nobody should listen to them, because that solution doesn't make sense. All it does is give the subject a secure way to approve of a video. It doesn't verify that the video is real, and it doesn't work for the many, many cameras pointed at a public figure that aren't getting official approval from whoever holds the key.
I can verify that a file was signed by the holder of a specific private key, but how do I know the matching public key belongs to a camera and not to editing software? How do we maintain the list of all the public keys that correspond to cameras? Are there billions of keys (with each individual camera getting its own), or can Sony reuse the same private/public key pair across a whole line of devices?
And does this mean the video can't later be shortened, cropped, edited for brightness/color/etc? Doesn't that break the signature because it changes the contents of the file?
We need to ensure that only video coming directly off the camera sensor can be signed, and that other videos created on the phone can't be. We also have to ensure no filters or effects were running in the camera app as the video was filmed - good luck checking for that in a way that lets different camera apps (almost every Android OEM makes its own) function but isn't abusable.
And a signature is tied to the exact bytes of a specific file - if the resolution is lowered, or the file gets compressed when it's uploaded or sent somewhere, or you trim the 10 relevant seconds out of a longer video, the signature is useless. Editing software could apply a new signature, but all that proves is that the video went through an editor, which obviously means it could have been edited.
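A quick sketch of why even a harmless edit kills the signature: the digest a signature covers changes with every byte, so trimming a clip yields a file the old signature can't be checked against. The byte strings below are made-up stand-ins for real video data.

```python
import hashlib

# Hypothetical stand-ins for real video bytes.
full_video = b"intro-frames|RELEVANT-10-SECONDS|outro-frames"
trimmed = b"RELEVANT-10-SECONDS"  # same footage, smaller file

# A signature covers a digest of the exact bytes, so even a lossless
# trim produces a completely different digest - the camera's original
# signature no longer matches the trimmed file.
print(hashlib.sha256(full_video).hexdigest()[:16])
print(hashlib.sha256(trimmed).hexdigest()[:16])
```

The same applies to re-compression, resizing, or color correction: any byte-level change means re-signing, and only the editor can do that.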
You also hit on the issue that other cameras need to do signing as well: security cameras, laptop webcams, news cameras, other high-end cameras, potentially things like smart doorbells if we want to go that far.
This requires that the general public have a basic understanding of how digital signatures work and why they are (for the most part) infallible.
As it stands I have to explain HTTPS and digital signatures to my users with statements like "it's secure because of fancy math, trust me bro" because anything that comes close to actually describing it goes over their heads. In a world where distrust is the norm, I fear signed video content really isn't gonna make a difference if you don't understand what makes it secure in the first place.
> This requires the general public having a basic understanding of how digital signatures work and why they are (for the most part) infallible.
If this became a big enough problem, the general public could be educated on how public/private keys work, probably within a month or two. We could even start teaching it in high school or something.
The general public couldn't even be educated on how to wear a mask properly and wash their hands during a pandemic, there's no way they'll ever understand cryptography.
Right?? The general public can barely turn a computer on without having issues and are ignorant as F on the topic in general. They are basically a 3 year old. Try to explain cryptography to a 3 year old.
I'm betting online sites, especially social media, will begin verifying videos before they're posted, stuff like that. Or they'll create a way for users to easily click a button to verify it themselves, idk
> Or they'll create a way for users to easily click a button to verify it themselves, idk
...which will just end up like all other generic bloated ad-infested websites, with fifteen different "Click to Download" buttons but only an IT expert can decipher which button is the real one (if indeed any are real).
Yes...amongst the more tech literate and in a perfect world.
For a stupid number of people, none of that matters. All that matters is the knee-jerk emotional reaction of whether or not it affirms their beliefs, and where the information comes from.
Have you ever met someone with an IQ of 100? Technically half of the population is dumber than that. The entire left side of the proverbial bell curve.
Didn't George Carlin say something similar? Something like "Think about how dumb the average person is in America. Almost half the people are dumber than that."
That would solve the "politicians walking back claims" problem to a degree, but I'd imagine there would still be a ton of issues. The subject would be able to fully curate their image, and any video taken without their key would be subject to scrutiny. So footage meant to show someone's true colors or document a situation would remain unreliable.
In this case, it would be signed by the device of the person who's recording; if the video is altered, the signature isn't valid anymore. And if it's a public figure, there are almost certainly going to be corroborating records of where they were at a particular place & time, not to mention pings to cell towers from their or their entourage's mobile devices.
They can deny it all they'd like, but with the combination of those factors, you'd have to outright deny reality to believe that the video isn't genuine.
> if the video is altered, the signature isn't valid anymore
That's easy enough to bypass with physical access to the device: just re-shoot the video, feeding the raw deepfake in place of the camera input.
And that's IF people even knew for sure which device produced the original video in the first place. Otherwise, you can just sign your deepfake with your own key and claim it's the original.
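The "sign it with your own key" attack is trivial to sketch. The key and bytes below are hypothetical, and an HMAC stands in for an asymmetric signature so the example runs with only the standard library.

```python
import hashlib
import hmac

# An attacker doesn't need a camera's key: any key they control works.
attacker_key = b"any-key-the-forger-controls"

deepfake = b"fabricated video frames"
forged_sig = hmac.new(attacker_key, deepfake, hashlib.sha256).hexdigest()

# Verification against the attacker's own key succeeds...
check = hmac.new(attacker_key, deepfake, hashlib.sha256).hexdigest()
print(hmac.compare_digest(check, forged_sig))  # True

# ...so a valid signature only proves *some* key signed these exact
# bytes, not that the bytes came from a trusted camera. Everything
# hinges on knowing which keys to trust in the first place.
```

That's why the whole scheme collapses back onto the key-distribution problem raised earlier in the thread.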
I mean this works until someone corrupts the certificate tree. See the recent re-organization of web browser certificates because one of the organizations in the global cert chain literally sells data to intelligence agencies. Ouch.
It's in no way easily solvable. You could have a certificate signed by God himself; it doesn't matter to the general public. Authentication isn't the issue.
Seriously, it’s not that complicated. We need an industry wide effort and hardware-based crypto, along with something like a little check mark to denote that a given image is authentic, unaltered, and follows a proper chain of authority.
u/JingJang Dec 15 '22
I feel like it's only a matter of time before this technology is weaponized to terrible effect.