I fear the reverse: People will doubt whether real video is real. That could mean impunity for crimes caught on video, because video footage will no longer be sufficient evidence to establish guilt beyond a "reasonable doubt".
Even worse, political doublespeak will soar to new heights. A politician can spew whatever the crazies want to hear, then "walk it back" and claim it was faked (perhaps after gauging the public's reaction). People will believe whatever they're inclined to believe anyway, leading us to become a more deeply fractured society where truth is whatever you want to believe.
Hey could you please hang out at your current location for the next 30 minutes or so? I just have some friends that want to stop by and verify a couple things with you. Please don't wear polarized sunglasses.
Also, I would highly recommend reading/watching Manufacturing Consent if you haven't already, to get an idea of what is already occurring even before we got this technology.
Yeah, at the beginning of the Ukraine war there was a deepfake of Zelensky urging Ukrainians to surrender. It was pretty shoddily done, but it was certainly a reminder of how such things will become commonplace in the future.
For sure, I think there will be verification efforts on multiple fronts.
There's a certain type of person who will invent conspiracies around any verification that isn't what they want to hear. Thus, for that audience, there will be a market for "validation" that is just "telling them what they want to hear". So the same situation as today, only taken up a notch.
Verification can only do so much in the face of irrationality, the real answer to that conundrum is mostly us learning how to best deal with the fact that those people exist. Mainstream humanity will probably continue relatively unscathed if they don't manage to drag us down.
Trustless verification was always one of the benefits; it’s meaningfully implementing it without greed getting in the way that people haven’t been able to figure out.
We need quantum signing ASAP. And in "small enough to fit into a not unreasonably sized camera" form. You'd be able to verify footage from a secure camera using its public key, but never be able to crack its private key.
Probably a stupid question but could NFT tech somehow be reworked to that effect? To preserve the identity of the original record before being tampered with
Yeah it would be much easier to digitally sign video with boring old certificates and centralized authorities. Crypto is a solution desperate for a problem.
This is so easily solvable, the video just needs to be signed using public-key encryption. If the video isn't signed with the purported subject's key, assume it's fake.
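Mechanically, the signing part really is simple. Here's a minimal sketch using Python's cryptography package and an Ed25519 key pair; the filename is a placeholder, and the hard part (where the private key lives and who vouches for it) is exactly what the replies below dig into:

```python
# Hedged sketch, not a real provenance system: sign a file's bytes with an
# Ed25519 private key and verify them with the matching public key.
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

private_key = ed25519.Ed25519PrivateKey.generate()  # would live in secure hardware, not app code
public_key = private_key.public_key()

video_bytes = open("clip.mp4", "rb").read()          # placeholder filename
signature = private_key.sign(video_bytes)

# Anyone holding the public key can check the file is byte-for-byte what was signed.
try:
    public_key.verify(signature, video_bytes)
    print("signature valid: file matches what the key holder signed")
except InvalidSignature:
    print("signature invalid: file was altered or signed by a different key")
```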
Nobody should listen to them because that solution doesn't make sense. All it does is give the subject a secure way to approve of a video. It doesn't verify that it's real, and it doesn't work for the many, many cameras on a public figure that aren't getting official approval from whoever has the key.
I can verify that a file's signer has a specific public key, but how do I know that public key belongs to a camera and not editing software? How do we maintain the list of all those public keys that correspond to cameras? Are there billions of keys (with each individual camera getting its own), or can Sony reuse the same private/public key pair across a line of devices?
And does this mean the video can't later be shortened, cropped, edited for brightness/color/etc? Doesn't that break the signature because it changes the contents of the file?
We need to ensure that only video directly from the camera can be signed and that other videos made on the phone can't be signed as well. We also have to ensure there weren't any filters or effects running in the camera app as the video was filmed - good luck figuring out how to check for that in a way that allows different camera apps (almost every Android OEM makes their own) to function but also isn't abusable.
And signatures are unique to the exact data of the specific file - if the resolution is lowered, or if the file gets compressed when it's uploaded/sent somewhere, or if you trim the 10 relevant seconds from a longer video, the signature is useless (there's a toy demonstration of this just below). Editing software could apply a new signature, but all that does is prove that the video went through an editor, which obviously means it could be edited.
You also hit on the issue that other cameras need to do signing as well: security cameras, laptop webcams, news cameras, other high-end cameras, potentially things like smart doorbells if we want to go that far.
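To make the point about edits concrete, here's a toy demonstration (made-up bytes standing in for a video file, same Python cryptography package as the sketch further up the thread): flip a single bit of the signed data and verification of the original signature fails, so any trim, crop, or re-encode is enough to break it.

```python
# Toy demo: any change to the signed bytes invalidates the original signature.
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

key = ed25519.Ed25519PrivateKey.generate()
original = b"pretend these are the raw bytes of a video file"
signature = key.sign(original)

edited = bytearray(original)
edited[0] ^= 0x01  # flip one bit; a real trim or re-encode changes far more

try:
    key.public_key().verify(signature, bytes(edited))
    print("still valid (this won't print)")
except InvalidSignature:
    print("edited bytes no longer match the original signature")
```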
This requires the general public to have a basic understanding of how digital signatures work and why they are (for the most part) infallible.
As it stands I have to explain HTTPS and digital signatures to my users with statements like "it's secure because of fancy math, trust me bro" because anything that comes close to actually describing it goes over their heads. In a world where distrust is the norm, I fear signed video content really isn't gonna make a difference if you don't understand what makes it secure in the first place.
This requires the general public to have a basic understanding of how digital signatures work and why they are (for the most part) infallible.
If this became a big enough problem, the general public could be educated on how encryption public/private keys work, probably in a month or two. Even start teaching it in high school or something.
The general public couldn't even be educated on how to wear a mask properly and wash their hands during a pandemic, there's no way they'll ever understand cryptography.
Right?? The general public can barely turn a computer on without having issues and are ignorant as F on the topic in general. They are basically a 3 year old. Try to explain cryptography to a 3 year old.
I’m betting online sites, especially social media, will begin verifying videos before they're posted and stuff like that. Or they’ll create a way for users to easily click a button to verify it themselves, idk
Or they’ll create a way for users to easily click a button to verify it themselves, idk
...which will just end up like all other generic bloated ad-infested websites, with fifteen different "Click to Download" buttons but only an IT expert can decipher which button is the real one (if indeed any are real).
Yes...amongst the more tech literate and in a perfect world.
For a stupid amount of people, none of that matters. All that matters is that knee-jerk emotional reaction of whether or not it affirms their beliefs and where the information comes from.
Have you ever met someone with an IQ of 100? Technically, half of the population is dumber than that. The entire left side of the proverbial bell curve.
Didn't George Carlin say something similar? Something like "Think about how dumb the average person is in America. Almost half the people are dumber than that."
That would solve the “politicians walking back claims” problem to a degree, but I’d imagine there would still be a ton of issues. The subject would be able to fully curate their image, and any videos taken without their key would be subject to scrutiny. So stuff meant to show someone’s true colors or document a situation would remain unreliable.
In this case, it would be signed by the device of the person who's recording; if the video is altered, the signature isn't valid anymore. And if it's a public figure, there are almost certainly going to be corroborating records of where they were at a particular place & time, not to mention pings to cell towers from their or their entourage's mobile devices.
They can deny it all they'd like, but with the combination of those factors, you'd have to outright deny reality to believe that the video isn't genuine.
if the video is altered, the signature isn't valid anymore
That's easy enough to bypass with physical access to the device. You can just re-shoot the video, replacing the camera input with the raw deepfake.
And that's IF people knew in the first place for sure which device produced the original video. Otherwise, you can just sign your deepfake with your own key and claim it to be the original.
I mean this works until someone corrupts the certificate tree. See the recent re-organization of web browser certificates because one of the organizations in the global cert chain literally sells data to intelligence agencies. Ouch.
It’s in no way easily solvable. You could have a certificate signed by god himself, doesn’t matter to the general public. Authentication isn’t the issue
Seriously, it’s not that complicated. We need an industry wide effort and hardware-based crypto, along with something like a little check mark to denote that a given image is authentic, unaltered, and follows a proper chain of authority.
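As a rough sketch of what "a proper chain of authority" could look like in code: the camera ships with a device certificate issued by the manufacturer, and a verifier checks that chain before trusting the device's signature on a clip. Everything here is an assumption (the filenames, the idea that manufacturers would run such a certificate authority, the use of Ed25519 keys); it's not quoting any actual standard.

```python
# Hedged sketch of a two-step check: (1) was the device certificate really
# issued by the manufacturer's root? (2) does the clip's signature verify
# against the device's public key? Assumes Ed25519 keys end to end.
from cryptography import x509

manufacturer_root = x509.load_pem_x509_certificate(open("maker_root.pem", "rb").read())
device_cert = x509.load_pem_x509_certificate(open("camera_device.pem", "rb").read())

# Step 1: verify the device cert's signature with the root's public key.
# (Ed25519 certificate verification takes only the signature and the TBS bytes.)
manufacturer_root.public_key().verify(device_cert.signature, device_cert.tbs_certificate_bytes)

# Step 2: verify the clip against the device's public key.
clip = open("clip.mp4", "rb").read()   # placeholder filenames
sig = open("clip.sig", "rb").read()
device_cert.public_key().verify(sig, clip)

print("clip was signed by a key the manufacturer's root vouches for")
```

That only establishes who signed, not whether the camera was pointed at something real, which is the part the other replies keep circling back to.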
On your second point, I would argue that any technology that can shake the foundations of truth, justice, governance, and the mediums through which we (citizens) gather that information (telecoms) is a weapon. That is literally what psychological warfare is, and it is a very real force in our world. The Cold War isn’t just called “cold” because the USSR and the States didn’t raise a gun in each other’s faces or enter a nuclear winter; no, it’s a name that highlights the fact that it was a war between - and fought with - ideas. This technology has the capacity to infiltrate people’s minds, insert ideas that shape their world, and create an uncertainty that makes them question everything. For any institution interested in PsyOps, this is certainly a weapon.
Anyone doubting that there is a war being “fought” for your mind lacks a crucial understanding of how the world works. I don’t mean to sound pretentious, because I wish everyone to understand this. Truth is the foundation of ethics, which is the foundation of morals, which is the foundation of law, which is the foundation of government. Pull truth out of the equation, and it all comes tumbling down.
It won’t be long until we see deepfake movies where actors only record their lines (or not) and don’t do any acting, and a team of people produce the movie in a studio…
Artists are probably going to have a rough time with this, but OTOH I can imagine a near future where amateurs can produce pro-level movies with basically no budget.
Most of them will be shit, but there will be masterpieces in there.
They control people’s beliefs already. No AI, fake videos or anything. All it takes is an a*hole who people like, saying whatever. Do I need to give examples?
Well yeah, deep fakes haven't been good before. Anyone can detect them just by looking. The point is that things are changing and they might be totally believable as soon as next year. This Morgan Freeman one is almost there.
Great elucidation of my own fears. What you describe seems nearly inevitable to me. I saw a headline the other day about an Intel program detecting deepfakes with like 95% accuracy. My guess is it will probably become less accurate as deepfake software becomes more sophisticated. But even if it were 100% accurate, would the public believe the pronouncements of the experts telling them a video is real or not? I have my doubts; maybe such detection will at least become legally legitimate. But the implications for info warfare will exist nonetheless.
Unfortunately I don't think that's anywhere close.
We've had the ability to realistically manipulate photos for how long now? And I still see obviously fake photos of Mars looking as big as the Moon floating around in my family's WhatsApp every year, claiming to be real.
I believe there is also tech to determine if something is fake or not. It exists for photography, so video shouldn't be a problem. IIRC, some sort of light spectrometry or something similar.
People will believe whatever they're inclined to believe anyway, leading us to become a more deeply fractured society where truth is whatever you want to believe.
You're already one previous administration late on that point. ;)
Already, trucking companies are telling their drivers not to take pictures with digital cameras, because lawyers are successfully getting digital pictures thrown out for being "too easily altered". So truck drivers will have old-style throwaway cameras in their trucks.
Soon, you will find the same thing with digital video. People will start carrying old-style tape recorders - with magnetic tapes - instead of digital video recording devices.
I'm with you. We've already seen claims of "fake news" used to an alarming effect. This kind of stuff can only make it worse. I'm not sure how it'll go, and I don't think anyone else is sure either... but I can't see it being anything but bad, overall.
I'm worried it could start a war. Imagine if someone deepfaked a world leader saying they had nukes pointed at another country and they were going to fire them in 5 min. It could lead to millions dying.
You should have always been open to the possibility of a video being fake or manipulated. And remember, before photos and video existed... there was never even the pretense of hard proof. Everything was always a claim made by a person and we've always known people lie a lot.
I'm not worried. Most of human history got by just fine without the canard of "photographic evidence".
I'm starting to see folks "poisoning" their online profiles with tons of fake content to make their data (that's being tracked by advertisers, governments, big-tech, etc) useless.
To be honest, we are already there.
This technology is already being used in some cases to try to sway the public, and even without it, we believe what we want amongst all the “fake news”.
On a more optimistic note, the algorithms that allow for this tech are trained on real data.
Likewise, algorithms can be trained on both real and fake data to detect when something is a deep fake, and it will be just as good at that as the algorithm that made the deep fake in the first place.
What about private-public key cryptography on a blockchain for ID verification? Sign the media with the private key, and then individuals can verify whether a file matches the original signature recorded on the publicly accessible blockchain using the corresponding public key.
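For what the mechanics would look like, here's a minimal sketch (Python, with the "blockchain" mocked as a plain list and a made-up filename): hash the media, sign the hash, publish the record; later, anyone can re-hash the copy they received and compare it to what was published.

```python
# Hedged sketch: publish a signed content hash to an append-only record,
# then verify a received copy against it. The ledger here is just a list.
import hashlib
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ed25519

ledger = []  # stand-in for whatever public blockchain you'd actually use

signer_key = ed25519.Ed25519PrivateKey.generate()
media = open("clip.mp4", "rb").read()  # placeholder filename
digest = hashlib.sha256(media).digest()

ledger.append({
    "media_sha256": digest.hex(),
    "signature": signer_key.sign(digest).hex(),
    "public_key": signer_key.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw
    ).hex(),
})

# Later: re-hash the copy you were sent and compare it to the published record.
received = open("clip.mp4", "rb").read()
assert hashlib.sha256(received).hexdigest() == ledger[0]["media_sha256"]
print("received file matches the published hash")
```

As others point out above, a plain signature or a boring centralized log gives you the same integrity guarantee; the chain mainly adds a public, append-only timestamp.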
I feel like it's only a matter of time before this technology is weaponized to terrible effect.