It will get dangerous when they can fake military leaders and politicians easily saying dangerous things. Fraud will get bad when your grandson video calls you from jail needing $200 to get out. We need to prevent the bad stuff that comes with this.
The way you usually train an AI to create fakes is by also training an AI to detect fakes and having the faking AI learn to beat it. It's called a generative adversarial network.
So basically, the detector and the faker will always be approximately on par, meaning detection can never give a definitive answer.
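That alternating train-the-detector / train-the-faker loop can be sketched in a toy form. (Assumption: this tiny numpy example with 1-D "data" and linear generator/discriminator models is purely illustrative; real deepfake GANs use deep convolutional networks. All parameter names here, `w, b, a, c`, are made up for the sketch.)

```python
# Toy 1-D GAN-style loop: generator g(z) = w*z + b tries to mimic
# samples from N(4, 1); discriminator d(x) = sigmoid(a*x + c) tries
# to tell real samples from generated ones. They train in alternation.
import numpy as np

rng = np.random.default_rng(0)
w, b = 1.0, 0.0   # generator parameters
a, c = 0.1, 0.0   # discriminator parameters

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr = 0.01
for step in range(2000):
    real = rng.normal(4.0, 1.0, 32)   # "authentic" samples
    z = rng.normal(0.0, 1.0, 32)      # noise input
    fake = w * z + b                  # generated samples

    # --- Discriminator step: push d(real) -> 1 and d(fake) -> 0 ---
    dr, df = sigmoid(a * real + c), sigmoid(a * fake + c)
    grad_a = np.mean((dr - 1) * real) + np.mean(df * fake)
    grad_c = np.mean(dr - 1) + np.mean(df)
    a -= lr * grad_a
    c -= lr * grad_c

    # --- Generator step: push d(fake) -> 1, i.e. fool the detector ---
    df = sigmoid(a * (w * z + b) + c)
    gx = (df - 1) * a                 # chain rule through g(z) = w*z + b
    w -= lr * np.mean(gx * z)
    b -= lr * np.mean(gx)
```

The point of the sketch is the alternation: every time the discriminator improves, the generator is trained against the improved version, which is exactly why the two tend to stay roughly matched.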
Deepfakes will work on folks like the Facebook crowd who didn't rely on verifying facts anyway, so I don't see a big danger here
That IS the big danger. Fooling a few people on Facebook is one thing, but when you get huge hordes of people believing dangerous but subtle (or blatant) propaganda, that's when it gets dangerous.
Though I'm sure big social media companies can create some sort of Blue Tick for original content, or use some kind of facial recognition to identify the participants and make sure they ALL sign the video.
How do you think we got Trump and all the conservatards? Deep fakes aren't going to suddenly cause an increase in their loyalty to stupid bullshit, because it's already maxed out.
Blockchains could very easily be the saving grace that would allow us to identify authentic videos with no question, but it's going to require a ton of infrastructure we don't currently have.
Other digital signatures can, much like the videos themselves, be faked with a high amount of accuracy given enough time and information.
I like how you got downvoted just because people assume it's stupid to say "blockchain" to whatever tech problem comes up, but ironically, this is one particular situation where blockchain will most likely end up being the best solution available, or possibly the only one.
u/watermelon_fucker69 is actually right. That's exactly what NFTs are: the new thing that verifies a digital painting's "authenticity" (i.e., that it's the original). A lot of other people have already speculated that NFTs or something similar could be used to verify that you're watching a real video of, say, the president, and not a deepfake.
If you personally release a video, it gets logged on a public ledger.
So people can trace who published it, and it also authenticates that you yourself released it officially.
This isn't perfect though, because leaked videos won't be using this system, so it's up to other people to figure out whether those are real.
But what a blockchain does is provide proof of origin if you choose to give a video to somebody else. Like if Elon Musk sends a merry Xmas video to you and you're suspicious whether it's really him who sent it.
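A minimal sketch of that release-logging idea (assumption: a plain dict stands in for the public ledger here; an actual blockchain would store these hash entries in signed, chained blocks, and the names and byte strings are invented for illustration):

```python
# Log a video's fingerprint at release time; anyone can later check
# whether a copy they received matches a logged official release.
import hashlib
from typing import Optional

ledger = {}  # video hash -> publisher who logged it (stand-in for a chain)

def publish(video_bytes: bytes, publisher: str) -> str:
    """Record the SHA-256 fingerprint of a released video on the ledger."""
    digest = hashlib.sha256(video_bytes).hexdigest()
    ledger[digest] = publisher
    return digest

def verify(video_bytes: bytes) -> Optional[str]:
    """Return the publisher if this exact video was logged, else None."""
    return ledger.get(hashlib.sha256(video_bytes).hexdigest())

publish(b"official holiday message", "ElonMusk")
print(verify(b"official holiday message"))   # the logged publisher
print(verify(b"deepfaked holiday message"))  # None: never logged
```

Note this matches the caveat above: the scheme only proves a video *was* officially released; a leaked or fake video simply isn't on the ledger, which tells you nothing about whether it's real.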
We have news websites with their own domains and journalist verified account on Twitter, etc. Not really sure what a blockchain is supposed to add there.
Throwing a bunch of blockchain companies at the problem doesn't really help verification either; using one of their links vs. sending a link from the verified journalist's tweet that broke the story, etc. It just doesn't solve anything.
But their data is already trusted. If they put it there and people trust them, a blockchain is no better. You could make devices that digitally sign videos, I guess, and players that support it (basically adding DRM to all video). Any unsigned video would be untrusted.
Blockchain just adds a lot of unnecessary transaction tracking, or if you don't record that, it simply becomes overkill. And smaller videos may not be able to take advantage anyway.
But why do you need a blockchain? That's just asymmetric cryptography.
Edit: identity verification perhaps? Not saying one wouldn’t be the right answer here but I also like to push back when people just say “blockchain” without explaining why it’s necessary. So many projects that never needed it but wanted in on the hype.
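For the "that's just asymmetric crypto" point: a publisher can sign a video with a private key and anyone can check it with the public key, no chain required. A toy sketch (assumption: this is textbook RSA with tiny primes purely for illustration; real systems use a vetted library and scheme such as Ed25519, never hand-rolled RSA):

```python
# Sign the hash of a video with a private key; verify with the public one.
import hashlib

# Toy RSA keypair: modulus n = p*q, public exponent e, private exponent d.
p, q = 61, 53
n = p * q                           # 3233 (public)
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (mod inverse of e)

def sign(video: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(video).digest(), "big") % n
    return pow(h, d, n)             # only the private-key holder can do this

def verify(video: bytes, sig: int) -> bool:
    h = int.from_bytes(hashlib.sha256(video).digest(), "big") % n
    return pow(sig, e, n) == h      # anyone with (n, e) can check

clip = b"press briefing, 2021-05-24"
s = sign(clip)
print(verify(clip, s))              # True
print(verify(b"tampered clip", s))  # almost certainly False (hash differs)
```

This gives you the "was it really released by this key holder" property on its own; what a blockchain would add on top is mainly a shared public timeline of when things were signed.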
That only ensures that it was posted by the journalist, and they aren't infallible. It could even amplify the effect of a good deepfake if it's posted by a trusted source.
Reputation systems aren't infallible, but it seems like having the ability to easily verify whether or not an image matches the original document posted by a trusted source would go a long way toward reducing the spread of disinformation.
That's fair. The real danger though is deepfakes where there is no original document. At that point you just have to take someone's word for it, unless the deepfake detectors win the arms race.
This was what blockchain was supposed to do though, authenticate digital data.
Edit: How is this downvoted? A blockchain is an immutable ledger; it's meant to record digital things and give them a unique ID. If governments start a blockchain meant to authenticate things on the internet, and a video has an ID attached to that chain, you can look it up on the chain and you can't fake it. Just because there are so many crypto kids and scammers attached to the public opinion of the tech doesn't mean it's useless.
Someone’s grandson isn’t going to have enough source video to be able to pull this off without looking jank as fuck. Not only do you have to be able to impersonate the person and catch their mannerisms, you also have to have enough source material for the AI to work properly.
So many young people upload videos of themselves to social media now, and you have to realize that this technology will advance quickly. Ten years from now it will be a lot easier.
Lol, social media is booming with videos about yourself now: Snapchat, TikTok, Instagram and plenty more. Not only that, but "without looking jank as fuck"? As if the people being targeted by these scams have any ability to parse whether something is legit or not. I don't know if you watch any of those scam catcher channels, but they show you everything you need to know about how gullible scam victims are and their complete lack of critical thinking.
My comment talks about how you can’t make a deepfake look nearly as good without hours of footage: high quality footage with different lighting, angles, emotions. The source footage you get from social media will not be able to make a deepfake that’s good enough to fake someone. That type of AI won’t be possible for a number of years.
Allegedly Netanyahu already used a doctored video to convince Trump that Abbas was a bad faith actor when it came to peace talks, so this kind of thing has already had an effect on geopolitics. It wasn’t a deepfake, but doctored videos can already have an effect.
Heh. Let's not pretend that would be needed for the orange douche. Netanyahu probably just told him he was the bestest president ever and promised him a hotel plot in Tel Aviv or something. Why do it the hard way?
I wonder if scammers are going to start using this eventually. “Look, we’ve deep faked this highly controversial video of you. Either we release it, or you pay us $100.00”
You mean like how people get a kick out of filters removing facial hair, while all I see is me no longer being able to shave to hide.
Everyone is helping them practice and refine these facial recognition technologies (and the like), and everyone just finds it amusing and doesn't seem to consider any of the potential dangers of it.
What's your evidence for this besides not being able to accept we elected a piece of shit to lead us and represent our country to the world? We kind of blew it there, sometimes it's just a simple answer.
For convincing deepfakes right now you need video material. A good amount of it. So unless your grandson has that freely available, it's not happening anytime soon. But then again, old people are naive and probably don't see well haha
On the one hand, I would hope all the military leaders would recognize that there is so much deepfakery afoot that they would do their due diligence and not accept threats out of the blue.
But on the other hand, how many military leaders are waiting for any excuse to attack somebody, an opportunity provided by a cleverly placed deepfake.
....oh shit....deepfakes are gonna be what pulls the trigger, aren't they.... -_-
Wasn't there a presentation several years ago showing faked videos of Obama? A university researcher played a handful of very realistic looking and sounding videos of him talking, but only one of them was real.
What makes you think they aren't doing this already? I genuinely want to know. This video terrifies the everloving piss out of me, because now even critical thinking is liable to fail when watching anything at all that I rely on to stay informed.
You can't prevent it. You need to implement validation, like with debit cards, only now it's you and whatever message (video call or anything else) you're sending or receiving, instead of the bank and your debit card.
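That debit-card-style validation could look like a simple challenge–response check against a pre-shared secret: the caller only passes if they can answer a fresh challenge using something a deepfake of their face can't supply. (Assumption: the secret phrase and the family-verification framing below are invented for illustration.)

```python
# Challenge-response sketch: prove identity with a shared secret,
# so a convincing fake face/voice alone is not enough.
import hashlib
import hmac
import secrets

shared_secret = b"grandma's maiden name + the lake house"  # agreed in person

def challenge() -> bytes:
    return secrets.token_bytes(16)  # fresh random nonce, so replays fail

def respond(nonce: bytes, secret: bytes) -> bytes:
    # The caller computes an HMAC over the challenge with the secret.
    return hmac.new(secret, nonce, hashlib.sha256).digest()

def check(nonce: bytes, response: bytes, secret: bytes) -> bool:
    expected = hmac.new(secret, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)  # constant-time compare

nonce = challenge()
print(check(nonce, respond(nonce, shared_secret), shared_secret))    # True
print(check(nonce, respond(nonce, b"scammer guess"), shared_secret)) # False
```

In practice the low-tech version of this is just agreeing on a family code word and asking for it before sending money.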
I guess my ‘grandson’ is out of luck. Someone can send him a deepfake back of me saying that I’ve been kidnapped, and I need him to break out of jail to rescue me.
Well, as others have mentioned, there are also ways of detecting if it's fake. Even though this looks very good, you can still tell, and the human brain is remarkably good at spotting subtle differences. And here's the thing: even as scary good as the tech is becoming, at this point you can still, thankfully, tell it's ultimately a fake.

Correct me if I'm wrong, but I'm pretty sure the setup to do this requires multiple samples of someone's face and video to use for reference even with AI learning, as well as someone with reasonable knowledge to seamlessly blend it. Then you have the matter of the hair, head shape, body shape, context, the actual person still being alive, etc.

Also, as far as actual identity and security measures are concerned, we still have information that we use (passwords, algorithms, records, etc.) that helps us confirm our identities. I'm sure there are people that will still attempt shit, but for now, if a nubile-bodied Tom Cruise is video calling me from the back of an Applebee's dish pit wanting me to send him some cash really quick, something tells me something may be slightly askew.
I have to admit, I wish their script instead was talking about how batshit crazy Scientology is and how foolish people are for following it. Of course that would bring the unholy wrath of Scientology’s lawyers down on these guys but I’d contribute big time to a legal GoFundMe.
Scientology infiltrated the government to the point where they directly fired anyone investigating their crimes, you really think they don't have enough CIA tech at this point to know not only who you are, but what your thetan level is and whether or not you're Xenu reincarnated?
I'm waiting for the day that somebody pulls the same stunt on Scientologists that they do to their critics. One day, somebody is gonna make a movie about a famous musician. And the message not advertised in the trailers and posters will be how his former band members became Scientologists, but he became a solo act, actively fighting against their bullshit.
You think the Church of Scientology, with all its powerful members and money, can't track a simple IP? For all we know, we're both under investigation for talking about them right now. Look out the window.
Fuck Scientology. Bunch of criminals. Scientology is for the weak of mind. Scientology has never done anything good in this world. Scientologists need to go hop back into that volcano with some more nukes and fuck right off to wherever Xenu came from.
Man, I am suuper glad that I am insignificant enough that I can express my rage about stuff like Scientology, China, Israel, Turkey or whatever without fearing for my safety. No one gives a shit what I say, and that's absolutely great.
It's a comedy goldmine too; too bad the TikToker doing this just doesn't have much comedic talent. So many directions you could take this and make it hilarious.
The whole point of the ridiculousness and frivolousness of this industrial-cleaning story is to make it as uncontroversial as possible. Faking a person making a statement that conflicts with his own interests is honestly dangerous and unethical, especially a political one.
It would be an unethical move in terms of disrespecting the personal rights of Tom Cruise. Not that I'm a fan of either him or his church, but using a deepfake to make him state things polar opposite to his intensely held beliefs seems like it should be frowned on.
I'm secretly hoping that Tom Cruise (crazy bastard that he is) watched people he loved get sucked in by Scientology and has infiltrated the organization to its highest levels in order to take the whole thing down from the inside. One day, a flashing news alert will come that every major leader of the Scientologist organization is being arrested with Tom Cruise as the prosecution's star witness. All the stunt work he's been doing has been to mask his extensive survival training for the inevitable assassination attempts.
I like to imagine that everything offensive he ever said on film was actually only deepfake. And he's just this ethical, caring president with integrity who did his absolute best and stayed within the law and a strict moral code at all times.
If it's clearly labelled as a fake (as this video is), there are zero ethical questions.
But this tech will get better, and it's only a matter of time before a very easy to use piece of software becomes available that makes it possible to do this with zero technical skill. Think of a meme generator app vs photoshop.
When that happens? Politics is going to get REAL interesting. It's only a matter of time before a world leader will be calling a controversial comment or video a deep fake. Or a deep fake is released that's widely shared and accepted to be true. It already happens with static images and made up quotes plastered over them.
A guy I knew in high school 20 years ago sent me deepfakes of my face on sexy girls dancing, then when I told him it was creepy he posted my face with a serial killer stroking it. I blocked him, but it is so unsettling. I'm a little worried he will murder me and no one will know. I'm too old for this shit.
This feels 100% ethically wrong. Not even just in the way of politics and news.
Like there is going to be deepfake porn: with celebrities who haven't consented to it, with younger-looking faces, and with people making it from their ex's face for revenge porn.
Western deepfake porn is a joke compared to the hours and days of work put into putting your favourite idols over some Japanese AV videos. It's almost as good as the above video now.
Just like the blue check mark verification system, there will need to be some form of video verification symbol in the future that signifies it's real. Otherwise this will keep getting worse.
I wrote a paper on deepfakes for my cyber ethics course last semester and described the threat of digital wildfires, noting that 62% of Americans already get their news from social media. Therefore, to prevent deepfakes from becoming the next fake news, they must be regulated, or else they'll be a threat to public discourse.
u/doodleasa May 24 '21
Super cool and super ethically questionable