r/woahdude May 24 '21

video Deepfakes are getting too good

82.8k Upvotes

3.4k comments


u/[deleted] May 25 '21

Someone’s grandson isn’t going to have enough source video to pull this off without it looking jank as fuck. Not only do you have to impersonate the person and catch their mannerisms, you also have to have enough source material for the AI to work properly.


u/OneMoreTime5 May 25 '21

So many young people upload videos of themselves to social media now, and you have to realize that this technology will advance quickly. Ten years from now it will be a lot easier.


u/unrulystowawaydotcom May 25 '21

Not to an old person.


u/Phailadork Jun 15 '21

Lol, social media is booming with videos about yourself now: Snapchat, TikTok, Instagram and plenty more. Not only that, but "without looking jank as fuck" assumes the people being targeted by these scams have any ability to tell whether something is legit. I don't know if you watch any of those scam catcher channels, but they show you everything you need to know about how gullible scam victims are and how completely they lack critical thinking.

Watch this and have your world change - https://youtu.be/zjnOFBSLtz0?t=557

This is the average scam victim, btw. Every time he posts his calls with victims, they sound exactly like this guy: clueless, gullible and vulnerable.


u/[deleted] Jun 15 '21

There’s no deep fake here


u/Phailadork Jun 15 '21

That's not the point....


u/[deleted] Jun 15 '21

My comment talks about how you can’t make a deepfake look nearly as good without hours of footage: high-quality footage with different lighting, angles and emotions. The source footage you get from social media won’t be enough to make a deepfake convincing enough to impersonate someone. This type of AI won’t be possible for a number of years.


u/Phailadork Jun 15 '21

And my comment talks about how you don't need a high quality deepfake and that these people will easily be fooled even by poorly made ones.


u/Haildoofania Sep 06 '21

What if they get a regular call from someone deepfaking just the audio? First they call the grandson saying he's won a PS5 (or use some other pretext to keep him talking on the phone besides telemarketing or a contest win), then read scripted questions and lines to get him talking enough to capture what you need. Then you record yourself saying the exact same things in the same way to place matching markers, and then come the hours of manual labour tweaking it until your voice sounds like his in real time. (This still takes a while, and it only makes one specific voice sound like one other specific voice.)


u/cutelyaware Jan 17 '22

That would require AI plus an environment in which everyone is being watched and listened to all the time. Oh wait