It’s Getting Harder to Spot a Deep Fake Video

8,360,380 views

Bloomberg Originals

Fake videos and audio keep getting better, faster and easier to make, increasing the mind-blowing technology's potential for harm if put in the wrong hands. Bloomberg QuickTake explains how good deep fakes have gotten in the last few months, and what's being done to counter them.
Video by Henry Baker, Christian Capestany
Like this video? Subscribe: kzfaq.info?sub_...
Become a Quicktake Member for exclusive perks: kzfaq.infojoin
QuickTake Originals is Bloomberg's official premium video channel. We bring you insights and analysis from business, science, and technology experts who are shaping our future. We’re home to Hello World, Giant Leap, Storylines, and the series powering CityLab, Bloomberg Businessweek, Bloomberg Green, and much more.
Subscribe for business news, but not as you've known it: exclusive interviews, fascinating profiles, data-driven analysis, and the latest in tech innovation from around the world.
Visit our partner channel QuickTake News for breaking global news and insight in an instant.

Comments: 5,100
@business
@business 3 жыл бұрын
We have some exciting news! We’re launching channel Memberships for just $0.99 a month. You’ll get access to members-only posts and videos, live Q&As with Bloomberg reporters, business trivia, badges, emojis and more. Join us: kzfaq.infojoin
@52sees
@52sees 3 жыл бұрын
Epic
@Amu_LEGEND
@Amu_LEGEND 3 жыл бұрын
Okay 👁️ 👁️ 👄
@lightningfun6486
@lightningfun6486 2 жыл бұрын
What
@TheProfessor66
@TheProfessor66 2 жыл бұрын
"Warfare campaign" that aged poorly with the Ukraine president deepfake.
@deleted9388
@deleted9388 Жыл бұрын
Anyone with 3 million subZ is a cointel pro shill
@520lun
@520lun 3 жыл бұрын
2018: Deep fake is dangerous 2020: DAME DA NE
@mrpizza_5139
@mrpizza_5139 3 жыл бұрын
DAME YO
@PopAwesomeJesusislife
@PopAwesomeJesusislife 3 жыл бұрын
DAME DA MO YOOO
@lassipls
@lassipls 3 жыл бұрын
ANTA GA
@aldorahman4755
@aldorahman4755 3 жыл бұрын
SUKIDE, SUKISUGITE
@520lun
@520lun 3 жыл бұрын
Dore dake
@nufizfunslower3438
@nufizfunslower3438 3 жыл бұрын
Imagine developing advanced technology and people using it to make memes
@joonatanlepind3124
@joonatanlepind3124 3 жыл бұрын
lmao I love that it's happening.
@diwakardayal954
@diwakardayal954 3 жыл бұрын
that's the internet
@JA-yz8eq
@JA-yz8eq 3 жыл бұрын
this is just movie technology released, I'm sure, for the pure purpose of high-level plausible-deniability tampering
@joonatanlepind3124
@joonatanlepind3124 3 жыл бұрын
@@JA-yz8eq no it's for making callmecarson sing big time rush
@leleled6467
@leleled6467 3 жыл бұрын
I guess people are doing this in an attempt to corrupt it before it's used for the worst
@aguyonasiteontheinternet578
@aguyonasiteontheinternet578 Жыл бұрын
The real terrifying thing about this video is that it was uploaded 4 years ago.
@audiobyamp4459
@audiobyamp4459 Жыл бұрын
I'm starting to see that all of the videos with startling information are usually old
@whizzerbrown1349
@whizzerbrown1349 Жыл бұрын
So far the only deepfakes I've seen popping up have been videos of members of parliament playing sweaty League of Legends matches, so personally the doom and gloom of this video has started eroding lol
@963freeme
@963freeme 10 ай бұрын
The newer videos of Trump look like deep fakes. In the Kanye West & Piers Morgan interview from last year, Kanye looked like a deep fake.
@myusernameis_pasword6860
@myusernameis_pasword6860 3 жыл бұрын
I think rules and laws about deep fakes should be put in place before this gets any worse, because real harm can be done to people's reputations, and people can even claim innocence for crap they did say!
@river_acheron
@river_acheron Жыл бұрын
How? Once something CAN be technologically done, there cannot be rules and laws to unlearn it. lol. Those that want to use deepfakes to scam are of course not going to listen to rules and laws against creating them! The ONLY solution here is to find a way to detect a deepfake from the real thing.
@tj_trout9855
@tj_trout9855 Жыл бұрын
Surely no one will break the law!
@myusernameis_pasword6860
@myusernameis_pasword6860 Жыл бұрын
@@tj_trout9855 of course people will, but at least with rules in place people have the ability to take action in court
@yoursleepparalysisdemon1828
@yoursleepparalysisdemon1828 Жыл бұрын
it’s just technology at this point. you don’t seem to understand why banning it would be hindering tech. don’t hate what you don’t understand.
@myusernameis_pasword6860
@myusernameis_pasword6860 Жыл бұрын
@@yoursleepparalysisdemon1828 I'm not hating it, I think it has cool applications. You misunderstand my point of view. What I'm saying is that there should be laws in place to defend those whose images are being used to say things they never said. I don't want to ban it, I just want to make sure that there is protection in place in case people misuse this technology.
@brianmchaney7473
@brianmchaney7473 5 жыл бұрын
2008: Pics or it didn't happen. 2018: Pics are a lie.
@AFCA-vn9bl
@AFCA-vn9bl 5 жыл бұрын
Brian McHaney pics have been a lie since Photoshop, but now even videos are a lie
@wolfenstien13
@wolfenstien13 5 жыл бұрын
Remember it or it didn't happen, Write it or it didn't happen, Paint it or it didn't happen, Print it or it didn't happen, Photograph it or it didn't happen, Record it or it didn't happen, Take a picture of it or it didn't happen, Video tape it or it didn't happen, What now?
@efloof9314
@efloof9314 5 жыл бұрын
White Coyote pull out your brain, hook it up to a system and show the memory of it happening
@Tofu3435
@Tofu3435 5 жыл бұрын
@@wolfenstien13 now? nothing happening.
@Dig_Duke_SFM
@Dig_Duke_SFM 5 жыл бұрын
@@DrumToTheBassWoop A literal human sacrifice to Satan to reveal the truth. Or it didn't happen.
@MrPatpuc
@MrPatpuc 5 жыл бұрын
This is terrifying. Imagine when deepfake videos can frame innocent people as guilty.
@22z83
@22z83 5 жыл бұрын
Well, soon videos won't be usable as evidence because of this
@billybadbean9077
@billybadbean9077 5 жыл бұрын
@@22z83 but it can still ruin lives
@treerexaudi
@treerexaudi 5 жыл бұрын
unless it is 16K res it isn't trustable xD. Even a simple mask in a robbery I saw can make it look like someone else, just because of the low-quality camera. It is silly and scary.
@MrSirFluffy
@MrSirFluffy 5 жыл бұрын
You can fake at 4K and lower resolutions to make it impossible to know if it's fake.
@deitrickorullian505
@deitrickorullian505 5 жыл бұрын
False allegations can be made against people without any real evidence to support them and people believe it. I can't imagine what this will do
@willdwyer6782
@willdwyer6782 Жыл бұрын
Putting Tom Hanks as Forrest Gump into archival TV footage could be considered early deepfake video. They digitally manipulated the lips of the other people in the scenes to move in sync with an impersonator's voice.
@yoursleepparalysisdemon1828
@yoursleepparalysisdemon1828 Жыл бұрын
isn’t a deepfake using ai or something? iirc it was done differently.
@sarah69420
@sarah69420 Жыл бұрын
@@yoursleepparalysisdemon1828 deep fake is a general idea of creating a fake video/audio/picture of or including someone not originally there or altering those who are, AI is just a tool to get that done
@yoursleepparalysisdemon1828
@yoursleepparalysisdemon1828 Жыл бұрын
@@sarah69420 the definition is that it was digitally altered.
@jadkleb2788
@jadkleb2788 3 жыл бұрын
Other than the funny comments and memes this is actually extremely terrifying...
@LAkadian
@LAkadian 2 жыл бұрын
Actually, those are terrifying too, for their unabashed idiocy.
@ChrisSche
@ChrisSche 5 жыл бұрын
It won’t be long until video, photographs, or audio recordings are no longer considered evidence in a court of law.
@JP-sm4cs
@JP-sm4cs 5 жыл бұрын
Make public broadcasts have a near invisible encryption watermark that distorts modifications? But yeah phone based evidence is screwed
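To make the watermark idea in the comment above concrete, here is a minimal, illustrative sketch (not something shown in the video) of a fragile watermark, assuming Python with NumPy and Pillow: a hash of each frame's high-order pixel bits is hidden in the least significant bits, so almost any re-encode or face-swap edit breaks the check. The function names and parameters are hypothetical.

```python
# Minimal sketch of a "fragile watermark": hide a checksum of the frame's
# high-order pixel bits in the LSB plane, so edits invalidate the mark.
import hashlib
import numpy as np
from PIL import Image

def embed_fragile_mark(img: Image.Image) -> Image.Image:
    """Write a SHA-256 of the image's upper 7 bits into the first 256 LSBs."""
    arr = np.array(img.convert("RGB"), dtype=np.uint8)
    payload = hashlib.sha256((arr & 0xFE).tobytes()).digest()       # 32 bytes
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))    # 256 bits
    flat = arr.reshape(-1)
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits           # overwrite LSBs
    return Image.fromarray(flat.reshape(arr.shape))

def mark_is_intact(img: Image.Image) -> bool:
    """True only if the stored hash still matches the masked pixel content."""
    arr = np.array(img.convert("RGB"), dtype=np.uint8)
    stored = np.packbits(arr.reshape(-1)[:256] & 1).tobytes()
    expected = hashlib.sha256((arr & 0xFE).tobytes()).digest()
    return stored == expected

if __name__ == "__main__":
    marked = embed_fragile_mark(Image.new("RGB", (64, 64), "gray"))
    print(mark_is_intact(marked))                     # True: untouched frame
    tampered = np.array(marked)
    tampered[10, 10, 0] ^= 0x80                       # simulate an edit
    print(mark_is_intact(Image.fromarray(tampered)))  # False: mark broken
```

A real broadcast watermark would be far more robust than this LSB toy, but the design choice is the same: tie a verifiable signal to the original content so later modifications are detectable.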
@eddyavailable
@eddyavailable 4 жыл бұрын
audio is very easily edited and manipulated nowadays.
@dennydarkko
@dennydarkko 4 жыл бұрын
Far Altright how do you know they haven’t already? 😂
@therogue9000
@therogue9000 4 жыл бұрын
Yes they will... Deepfakes are pointless on security cams; they are not at the right angle and the video is usually too low-res.
@ev.c6
@ev.c6 4 жыл бұрын
@@JP-sm4cs SURE. Like you can't fake the watermark either. You seem not to understand how complex the AI behind this deep fake is. If they can fake someone's facial expressions in a video like that, just imagine how easy it is to put some stupid watermark in a few frames.
@3p1cand3rs0n
@3p1cand3rs0n 5 жыл бұрын
I seriously thought they were going to reveal that the guy at 0:56 was, himself, a deep fake.
@thoyo
@thoyo 5 жыл бұрын
same here! his lips didn't seem to match his speech and his eyes looked a bit dead
@andrelee7081
@andrelee7081 5 жыл бұрын
I think that's just the power of a potato cam.
@TitorEPK
@TitorEPK 5 жыл бұрын
You're ready for the future.
@tacubaeulalio
@tacubaeulalio 5 жыл бұрын
Does anyone know what they are talking about when they mention the weather patterns or flowers? That part honestly confused me. I take it as they can make fake videos of the weather changing or flowers blooming, but I'm not sure why that would be useful?
@mishaj2647
@mishaj2647 5 жыл бұрын
ElyssaAnderson b
@arcosprey4811
@arcosprey4811 Жыл бұрын
Imagine how much worse it is now.
@spetzy1921
@spetzy1921 Жыл бұрын
This was 4 years ago. Let that sink in.
@sanmartinella4933
@sanmartinella4933 Жыл бұрын
And this tool is offered to the public, imagine what our governments have.
@cjezinne
@cjezinne 5 жыл бұрын
At first, I thought this was going to be bad... but then I saw the Nicolas Cage renders and then life made sense again
@hirokatsuvictor8755
@hirokatsuvictor8755 5 жыл бұрын
This has too much meme potential
@Matthew_Fog
@Matthew_Fog 5 жыл бұрын
Lol
@urface640
@urface640 5 жыл бұрын
2k likes yet only two (technically three now) replies lol
@llllIIIlllIIllllII
@llllIIIlllIIllllII 3 жыл бұрын
@@hirokatsuvictor8755 yeah...
@CodepageNet
@CodepageNet Жыл бұрын
this is all Nicolas Cage's fault!
@OsaZain
@OsaZain 5 жыл бұрын
Imagine the potential for blackmail :/
@Dylan-hy2zj
@Dylan-hy2zj 5 жыл бұрын
OsaZain if anything the reverse is true: for any video posted of you doing bad things, you can just say it was deepfake blackmail
@madman2u
@madman2u 5 жыл бұрын
+Dylan Adams *Except* for the fact that it won't matter. Anything remotely real-looking is going to work against your interests. It wouldn't matter if it's a fake because people will still believe you did or said X. People's reputations and lives have been ruined for less, and without evidence. Say someone accuses X of being a liar and a cheat. X says the video is a deep fake. The accusation, while not necessarily true, conflicts with X's statement. You can either take the video as evidence or the word of the person, who has a vested interest and will therefore lie to protect themselves. It's a lose-lose scenario for the accused; it's just a matter of how much you'll lose. Even if the video is then proven to be fake, the damage would've already been done. Unfortunately, bad news is so much easier to believe. It's not ideal at all... What we can do to combat this is to be more wary of the so-called evidence people come up with. Being objective is important, and if there is any doubt then one should always err on the side of innocence rather than guilt.
@OsaZain
@OsaZain 5 жыл бұрын
madman2u People tend to believe maligning things much much more easily than the positive ones as well :(
@holyn8
@holyn8 5 жыл бұрын
yea the potential for blackmailing is going down to 0% because of this technology. you can't use videos for evidence anymore. everything you see on a screen could be faked
@elias_xp95
@elias_xp95 5 жыл бұрын
What blackmail? It's now easier to claim it as fake. It's the opposite effect of blackmail.
@ghdhfgh6125
@ghdhfgh6125 Жыл бұрын
This video was posted 4 years ago. Imagine how hard it is now.
@bignickreacts
@bignickreacts Жыл бұрын
I love how there are millions of issues in the world that need solutions, and instead, we figured out how to be more manipulative. 🤦‍♂️
@straightbusta2609
@straightbusta2609 5 жыл бұрын
This is probably the secret behind the $1000 emoji machine from apple.
@Dominicn123
@Dominicn123 5 жыл бұрын
Face tracking has been around for years brah
@r32fandom89
@r32fandom89 5 жыл бұрын
ay u got 69 subscribers
@amfm4087
@amfm4087 5 жыл бұрын
No because this requires hours of time and a decent graphics card for just a short clip. The iPhone uses a different technology as the emojis are 3D models. This technology uses 2D pictures like jpeg and png.
@jerrell1169
@jerrell1169 5 жыл бұрын
Yeah they straight busta!
@Lou-C
@Lou-C 5 жыл бұрын
Me me big boy
@candifemale5118
@candifemale5118 3 жыл бұрын
Everyone before: Deepfake is so dangerous... Everyone now: *DAME DA NE*
@stinkygorilla2058
@stinkygorilla2058 3 жыл бұрын
Is the app still up?
@beruta1733
@beruta1733 3 жыл бұрын
relatable
@ohword9541
@ohword9541 3 жыл бұрын
dame yo
@grigoriyefimovichrasputin7897
@grigoriyefimovichrasputin7897 3 жыл бұрын
DAME DA NE
@generically.watched
@generically.watched 3 жыл бұрын
DAME YO DAME NA NO YO
@sifutophmasterofeyerolling2513
@sifutophmasterofeyerolling2513 Жыл бұрын
It's insane how much the technology has improved in just 4 years; we now have almost perfect TTS voices as well.
@Truthorfib
@Truthorfib Жыл бұрын
It's insane though that the focus was this instead of other things that truly benefit us as a whole. Goes to show where innovation is heading, and it's not toward our collective success but toward things like misinformation and espionage.
@timber8507
@timber8507 2 жыл бұрын
I wonder if this technology has been or will be used in the war right now? It's absolutely something to consider when watching media today.
@nocaptainmatt3771
@nocaptainmatt3771 2 жыл бұрын
Of course it is
@anetkasbzk98
@anetkasbzk98 2 жыл бұрын
Bingo. Deep fake raW
@shelby1246
@shelby1246 2 жыл бұрын
This comment aged well… I went watching deepfake videos after hearing about the recent one of Putin.
@AbelMaganaAvalos
@AbelMaganaAvalos 2 жыл бұрын
Zelensky got deepfaked
@timber8507
@timber8507 2 жыл бұрын
@@AbelMaganaAvalos Yeah, I saw that on the news.
@Jaylio
@Jaylio 5 жыл бұрын
0:56 dude looks more CGI than the fakes
@fortheloveofnoise9298
@fortheloveofnoise9298 5 жыл бұрын
Those oddly fluttering lips.....wtf
@jcesplanada528
@jcesplanada528 5 жыл бұрын
I know, right. I really thought it was fake too
@hectorhector3819
@hectorhector3819 5 жыл бұрын
.
@SlatDogg
@SlatDogg 4 жыл бұрын
I seriously thought that someone deep faked that video just to prove a point.
@Atombender
@Atombender 4 жыл бұрын
Until the end of the video I thought that it was fake. Damnit...
@thehenryrobot
@thehenryrobot 5 жыл бұрын
*This would never have happened if Nicholas Cage didn't exist* 😜
@justahuman2121
@justahuman2121 5 жыл бұрын
Just 2 likes? 0 comments? Pretty sure this comment will blow up one day. Edit: ok it's now 1.2k Edit: 4k now Edit: i bet it will hit 7k
@testname2635
@testname2635 5 жыл бұрын
@@justahuman2121 Agreed
@leonthethird7494
@leonthethird7494 5 жыл бұрын
HENRY THE RC CAR it's spelled Nicolas Cage
@tharv_2609
@tharv_2609 5 жыл бұрын
You everywhere
@mohammedraqib6418
@mohammedraqib6418 5 жыл бұрын
This is that day
@DjHardstyler
@DjHardstyler 7 ай бұрын
This aged well...already
@ResoundGuy5
@ResoundGuy5 5 жыл бұрын
This is going to end badly...
@glynemartin
@glynemartin 5 жыл бұрын
it's not gonna end...that's the problem...
@wutsit2yuhhuh246
@wutsit2yuhhuh246 5 жыл бұрын
@benzo I think you trust your government a little bit too much.
@wutsit2yuhhuh246
@wutsit2yuhhuh246 5 жыл бұрын
@benzo "We'll know our disinformation program is complete when everything the American public believes is false." -Former CIA Director William Casey
@joeljarnefelt1269
@joeljarnefelt1269 5 жыл бұрын
@benzo He said, "This is going to end badly." Point out where he stated that the entire development of these programs should be terminated.
@joeljarnefelt1269
@joeljarnefelt1269 5 жыл бұрын
@benzo Maybe you wouldn't want to develop it, or maybe you are just expressing your concerns about the possible misuses of the emerging technology.
@Nismoronic
@Nismoronic 5 жыл бұрын
Can I use it for memes tho
@lLl-fl7rv
@lLl-fl7rv 5 жыл бұрын
You're THE man.
@edd868
@edd868 5 жыл бұрын
Yes. Prepare for the oncoming deep fake meme war between 4chan and Reddit
@9yearoldepicgamersoldier129
@9yearoldepicgamersoldier129 5 жыл бұрын
Asking the real questions here.
@mariopokemon955
@mariopokemon955 5 жыл бұрын
Tesco Stig people already have; it can also be used to start a war, but it's no biggie
@VOLAIRE
@VOLAIRE 5 жыл бұрын
Yeah memes aren’t a big deal ha
@brianorozco1074
@brianorozco1074 Жыл бұрын
Honestly, this is terrifying
@geometrikselfelsefesi
@geometrikselfelsefesi Жыл бұрын
That's even from 4 years ago
@xtechn9cianx
@xtechn9cianx 2 жыл бұрын
Imagine if the world leaders are using this on Putin right now
@anetkasbzk98
@anetkasbzk98 2 жыл бұрын
Bingo
@funnypeoplefail19
@funnypeoplefail19 Жыл бұрын
Putin is maybe already dead?
@Ceshua
@Ceshua 5 жыл бұрын
Back in the day when everyone said: "Video evidence can't lie." 2018: (Edit) 2020: Baka Mitai
@akkafietje137
@akkafietje137 4 жыл бұрын
I saw it with my own eyes
@KnightmareNight
@KnightmareNight 4 жыл бұрын
Well, back then it couldn't. So they were still right.
@agentsmith9858
@agentsmith9858 4 жыл бұрын
@@KnightmareNight you missed the point
@TheAnonyy
@TheAnonyy 4 жыл бұрын
It could not then. Now it can lie. This is the problem with people embracing new technologies: you can't trust what you see, hear, feel, or smell; there are too many artificial things out there.
@rukna3775
@rukna3775 4 жыл бұрын
Ok boomer
@EnzoDraws
@EnzoDraws 5 жыл бұрын
1:47 why tf does the source look faker than the deep fake?
@aaronmicalowe
@aaronmicalowe 5 жыл бұрын
There is no such thing as a deep fake. This video is fake news.
@Unknown-dq2cj
@Unknown-dq2cj 5 жыл бұрын
🤣
@justanotheryoutubechannel
@justanotheryoutubechannel 5 жыл бұрын
Kalazakan Or more likely just a troll.
@kyoshinronin
@kyoshinronin 5 жыл бұрын
Overfitting
@TheVibes101
@TheVibes101 5 жыл бұрын
@@aaronmicalowe umm... I hope you are not serious.
@alinoprea54
@alinoprea54 Жыл бұрын
This was 4 years ago.
@venmis137
@venmis137 2 жыл бұрын
2018: Deepfake is a terrifying, dangerous technology. 2022: GENGHIS KHAN SINGS SUPER IDOL
@Cloudeusz
@Cloudeusz 5 жыл бұрын
Technology is a double edged sword
@fellowcitizen
@fellowcitizen 5 жыл бұрын
...that looks like a cup of tea.
@rvke5639
@rvke5639 5 жыл бұрын
with no handles
@user-ho1vt8vz2l
@user-ho1vt8vz2l 5 жыл бұрын
What is double edged sword then
@subzero5055
@subzero5055 5 жыл бұрын
@@user-ho1vt8vz2l you kill with it or get killed by it
@yungwhiticus8757
@yungwhiticus8757 5 жыл бұрын
Ad Victorium, Brother!
@TheAstronomyDude
@TheAstronomyDude 5 жыл бұрын
Nick Cage SHOULD be every actor in every movie.
@milanistaminetti
@milanistaminetti 5 жыл бұрын
TheAstronomyDude plot twist, he is
@Nilvolentibusje
@Nilvolentibusje 5 жыл бұрын
I support this
@YoungBlaze
@YoungBlaze 5 жыл бұрын
Hes actually everyone in real life
@antoniob4458
@antoniob4458 5 жыл бұрын
Where do I sign?
@muzgnasicianie
@muzgnasicianie 5 жыл бұрын
He is one of my favourite actors!
@double_lightsaber
@double_lightsaber Жыл бұрын
To think this was 4 years ago....
@skeetermcswagger0U812
@skeetermcswagger0U812 Жыл бұрын
Ever since I became aware of this technology, I got a really uncomfortable feeling about it. I knew it could be one of those technological 'superpowers' that wouldn't be safe if there was not a clear and adequate way to penalize the ways it could be used. It is, in a way, an ability to pirate and clone some forms of reality. Although there still seem to be some perceivable tells during the beginning stages of its development that may be obvious to many, and not just those with a 'trained eye', who knows at what point it's going to be capable of being indistinguishable to most if not all viewers? Do they 'have to' disclose this information? Who knows if some of the examples whose flaws are more readily obvious aren't just being used to distract viewers from the more capable versions of this technology already? Great... now I sound like a crazy person even to myself!🤦‍♂️
@aliceslab
@aliceslab Жыл бұрын
it's not crazy, the future will get more complex, and it is harder to control complexity than the simplicity of our origins.
@ES11777
@ES11777 5 ай бұрын
No, you are just smart and looking at it from all angles.
@ShufflingManu
@ShufflingManu 5 жыл бұрын
I am more concerned about influential people labelling real videos of them as deep fakes in order to avoid consequences than I am about someone trying to harm said people with deep fakes.
@OneEyeShadow
@OneEyeShadow 5 жыл бұрын
+Captain Caterpillar Like what? The entire point of the programme is to make it as seamless as possible - so when the technology is actually "there" that's not the case anymore.
@Fiufsciak
@Fiufsciak 5 жыл бұрын
@@OneEyeShadow Lol, nope. They may look seamless to a human eye but not to a software designed to expose fakes.
@swandive46
@swandive46 5 жыл бұрын
Like Trump?
@PureVikingPowers
@PureVikingPowers 5 жыл бұрын
@@swandive46 Is Trump even a real person? 🙄
@allenkennedy99
@allenkennedy99 5 жыл бұрын
That's actually very poor logic.
@GIRru11
@GIRru11 3 жыл бұрын
Everyone: This stuff is dangerous and scary! Me: DAME DA NE DAME YO!
@haxxruz6284
@haxxruz6284 3 жыл бұрын
Baka mitai best meme
@denniscuesta7009
@denniscuesta7009 3 жыл бұрын
Putin singing baka mitai is the funniest
@aPandesalboi
@aPandesalboi 3 жыл бұрын
You mean every memer
@bianca.611
@bianca.611 3 жыл бұрын
i can hear and see these videos damn it.
@madjaster9620
@madjaster9620 3 жыл бұрын
I came here from the yanderedev and others singing deepfake lmao
@DD-yq1tj
@DD-yq1tj 2 жыл бұрын
Interesting that this is getting suggested to me right now with the thumbnail of Putin 🤔
@intreoo
@intreoo Жыл бұрын
This is making me more and more paranoid about showing my face online. Not that I ever did though.
@yongamer
@yongamer 5 жыл бұрын
This can become so scary.
@YoungBlaze
@YoungBlaze 5 жыл бұрын
Like my exs mother!
@PasscodeAdvance
@PasscodeAdvance 5 жыл бұрын
I agree with the Internet person
@AthosRespecter
@AthosRespecter 5 жыл бұрын
@Throngdorr Mighty lol
@someonesomewhere6289
@someonesomewhere6289 5 жыл бұрын
@Throngdorr Mighty once this technology is developed further (and it will be), it won't matter how gullible you are, or whether you're wise to all the tricks. We be fucked.
@yongamer
@yongamer 5 жыл бұрын
@Throngdorr Mighty The thing is that a significant proportion of people are dumb. And this technology is going to improve. I don't see why a fake video using this technology could not go viral.
@spicychipgaming2080
@spicychipgaming2080 3 жыл бұрын
2018: deepfakes are dangerous and could harm other people 2020: hamster sings Japanese game OST
@Starry_Wave
@Starry_Wave 3 жыл бұрын
And Yandere Dev singing to the Big Time Rush opening theme.
@mangovibes2525
@mangovibes2525 3 жыл бұрын
Doom Changer lol I just saw that vid
@user-on8vk5gb6x
@user-on8vk5gb6x 3 жыл бұрын
DAME DAME
@Accidentalreef
@Accidentalreef 3 жыл бұрын
Charles! Man i thought u died! Im happy your back!
@shaunluckham1418
@shaunluckham1418 Жыл бұрын
Simple rule: don't believe anything on television or in online video. If you don't see it in person, it may or may not be compromised.
@cubycube9924
@cubycube9924 Жыл бұрын
It’s been 4 years... I wonder what’s going on now...
@crispsandchats
@crispsandchats 5 жыл бұрын
remember everybody: just because you can, doesn’t mean you should
@HullsColby
@HullsColby 5 жыл бұрын
I can deep throat a banana. But since you said so maybe I shouldn't.
@erazure.
@erazure. 5 жыл бұрын
Hulls Colby just a single banana? Step your game up, 3 bananas at once or a large cucumber minimum
@missionpupa
@missionpupa 5 жыл бұрын
Cool comment, but do you know how tragic it would be if humans actually followed that and denied everything that makes us human, our curiosity and desire to progress? Scientists and engineers throughout history haven't done things because they should, but because they could. We do things because they are possible, and you can't stop that.
@fruitygarlic3601
@fruitygarlic3601 5 жыл бұрын
@@missionpupa Stop being so pedantic. If something should be done, do it. If not, then don't. Imagine reaching so far you find something to argue about in something that doesn't necessarily disagree with you.
@lol-fh3oq
@lol-fh3oq 5 жыл бұрын
@@missionpupa I mean obviously there's still gonna be people who do it but that doesn't mean it's right.. Lmao, why're you reaching?
@ceece3817
@ceece3817 5 жыл бұрын
Black mirror do ya thing
@InnovAce
@InnovAce 5 жыл бұрын
ceec e black mirror and Altered Carbon
@SomeTrippyCanadian
@SomeTrippyCanadian Жыл бұрын
Sheesh this was 4 years ago! Just popped up on my feed. 2023 and it’s just getting crazier
@CarterLundy10
@CarterLundy10 Жыл бұрын
It’s crazy that this was 5 years ago. It’s just an every day thing to see these deep fake videos now.
@nicoh.1082
@nicoh.1082 3 жыл бұрын
This is terrifying. Imagine Nicolas Cage playing in every movie..
@Chattman4311
@Chattman4311 3 жыл бұрын
No, its perfection
@Raccon_Detective.
@Raccon_Detective. 3 жыл бұрын
No, it's perfection
@justaneditygangstar
@justaneditygangstar 3 жыл бұрын
No, it's perfection
@Traventity
@Traventity 3 жыл бұрын
No, It's perfection
@migocasty5515
@migocasty5515 3 жыл бұрын
No, its perfection.
@aguywithsubs8956
@aguywithsubs8956 5 жыл бұрын
The porn industry is evolving; get ready for VR porn
@andatop
@andatop 5 жыл бұрын
Vr porn has been a thing for a decade
@brandonontama2415
@brandonontama2415 5 жыл бұрын
It will get worse; soon it will be anime and video game characters. And then it will be a virtual reality where you can actually... things are really getting weird.
@florianp4627
@florianp4627 5 жыл бұрын
It has existed for a few years now, ever since the Oculus Rift dev kit initially came out
@brandonontama2415
@brandonontama2415 5 жыл бұрын
@@moogreal Crap...
@dirtiestharry6551
@dirtiestharry6551 5 жыл бұрын
I want subs ready player porn
@say12033
@say12033 Жыл бұрын
It's 2023 and now everyone can make deep fakes on their phone
@coralevy-yo8dh
@coralevy-yo8dh 6 ай бұрын
So many warnings in the forms of movies, books, tv series, video games etc showing us why this is dangerous. We never listen.
@WholesomeLad
@WholesomeLad 5 жыл бұрын
It's also getting harder to spot a fake deep comment
@everyone9500
@everyone9500 5 жыл бұрын
oof you're here
@npc304
@npc304 5 жыл бұрын
It's also harder to not be a nazi in my book. And just remember, the NPC meme is dehumanizing. We are all unique and special
@JJ-te2pi
@JJ-te2pi 5 жыл бұрын
@@npc304 Youre boring. Dead meme.
@rixille
@rixille 5 жыл бұрын
How do we know who is real and who isn't? Mass confusion is a powerful way to separate society.
@misterrogerroger5537
@misterrogerroger5537 5 жыл бұрын
How can mirrors be real if our eyes aren't real
@syrus1233
@syrus1233 3 жыл бұрын
"Deep fakes gained popularity through adding famous celeberties to porn scenes" Ahh porn, always innovating.
@reallyuglyaim991
@reallyuglyaim991 3 жыл бұрын
True *cri* ✊😌
@TheMaster4534
@TheMaster4534 3 жыл бұрын
The Russians have a word for that: Компромети́рующий материа́л ("compromising material"), or компромат (kompromat) for short.
@denierdev9723
@denierdev9723 3 жыл бұрын
Always defiling and immoral, too.
@jordanmendoza812
@jordanmendoza812 3 жыл бұрын
@@denierdev9723 your name checks out
@denierdev9723
@denierdev9723 3 жыл бұрын
@@jordanmendoza812 ?
@fakintru9398
@fakintru9398 Жыл бұрын
2023 AI: LOL
@sharilyn8262
@sharilyn8262 Жыл бұрын
Getting that alibi out before the evidence is gold. Divide the people with more confusion.
@keelo-byte
@keelo-byte 5 жыл бұрын
Forget the fake celebrity porn and political tapes, this technology should be used for only one thing... *remixing old school kung-fu movies.*
@caralho5237
@caralho5237 5 жыл бұрын
I imagine Bruce Lee dabbing and doing fortnite dances. Scary.
@xouslic742
@xouslic742 5 жыл бұрын
you mean remaster
@keelo-byte
@keelo-byte 5 жыл бұрын
@@xouslic742 no I meant remix. Sort of like "kungpow: enter the fist"
@Motorata661
@Motorata661 4 жыл бұрын
Bruce Lee, Jackie Chan, Jet Li, Donnie Yen: Kung-Fu Battle Royale, the movie
@pieterdejager7805
@pieterdejager7805 4 жыл бұрын
Bwahahaha...now ure talking!....
@cardorichard4148
@cardorichard4148 5 жыл бұрын
Trumps new favorite phrase, “Deep fake news.” 😂
@ggsay1687
@ggsay1687 5 жыл бұрын
It would be hard to deny if someone put his fase on insane person shouting insults on squer.
@leonscottkennedy3143
@leonscottkennedy3143 5 жыл бұрын
GG SAY *face
@ggsay1687
@ggsay1687 5 жыл бұрын
you missed the "squer", I think I was drunk
@L7vanmatre
@L7vanmatre 5 жыл бұрын
TRUMP LOL HAHA
@DioThermidor
@DioThermidor 5 жыл бұрын
You're pathetic.
@raulgalets
@raulgalets Жыл бұрын
this is 4 years ago
@e8tballz
@e8tballz 3 жыл бұрын
I’m sure this will be huge w/ scammers in a few yrs. They'll use this to FaceTime someone's grandparents and say they're in trouble and need cash or something ridiculous. Inevitable but sad.
@mayeighteen2812
@mayeighteen2812 3 жыл бұрын
Yes, or our children being subjected to deepfake porn or other degrading/defamatory videos made by their own classmates as a form of bullying... ruining their futures and sense of self or reality. 😔
@shawnli4746
@shawnli4746 5 жыл бұрын
If this technology evolves, get ready for the dystopia that Orwell predicted, and be ruled by faceless individuals...
@CriticalRoleHighlights
@CriticalRoleHighlights 5 жыл бұрын
This could be something a dystopian government uses when the masses wouldn't know any better _after_ a dystopia has occurred by other means, but dystopia will never occur because of it.
@lilahdog568
@lilahdog568 5 жыл бұрын
CRH our government could begin going after individuals simply by creating evidence in the form of deep fake videos
@nefelibata4190
@nefelibata4190 5 жыл бұрын
what is the point of the videos if you can't tell what is fake and what is not? you would need an expert on the case who is somehow being monitored by another expert and several other people, who have the best or worst intentions for humankind.
@michaelwatts5139
@michaelwatts5139 5 жыл бұрын
@@nefelibata4190 we already have people faking their gender
@mutanazublond4391
@mutanazublond4391 4 жыл бұрын
It has evolved, are you stupid, 90 percent of all actors used are non existant with fake backgrounds ... all of the documentaries are fake people ... etc etc
@jose-gr7jg
@jose-gr7jg 5 жыл бұрын
So Stan Lee will be able to do all the cameos??
@siyacer
@siyacer 4 жыл бұрын
In endgame
@ssssSTopmotion
@ssssSTopmotion 3 жыл бұрын
What's the point if it's not even him? That's what made it special
@sauceamericano4279
@sauceamericano4279 3 жыл бұрын
Yup
@sauceamericano4279
@sauceamericano4279 3 жыл бұрын
Yep
@thefirebeanie5481
@thefirebeanie5481 Жыл бұрын
Well, this was always inevitable. This aged like fine wine
@biomuseum6645
@biomuseum6645 Жыл бұрын
Why is it that these creepy technologies always get romanticized by telling people it will help small creators? Wasn’t that what they told us when they removed the dislike button? To help small creators? Don’t small creators adapt creatively to limitations and become more creative in the process instead of having all their whims on a silver plate?
@samswich1493
@samswich1493 3 жыл бұрын
2018: deep fakes are very realistic and dangerous 2020: truck sings dame da ne
@zuko1569
@zuko1569 5 жыл бұрын
Shapeshifting reptilians want to know your location
@joshuakoh1291
@joshuakoh1291 5 жыл бұрын
Zuzu "That's fucking rough buddy for me"
@kat_867
@kat_867 5 жыл бұрын
One's in the White House
@pastelxenon
@pastelxenon 5 жыл бұрын
@@kat_867 if youre an idiot
@kat_867
@kat_867 5 жыл бұрын
Pastel Xenon if? Lmao what 😂 just go away.
@sofialaya596
@sofialaya596 5 жыл бұрын
lmao
@wengdummy7687
@wengdummy7687 9 ай бұрын
OMG, this thing is not helping. It can be used to destroy someone's reputation.
@zyurxi7307
@zyurxi7307 2 жыл бұрын
The fact you can make national threats and make it seem as though it was someone else is truly scary.
@AngryGoose
@AngryGoose 3 жыл бұрын
DeepFakes:possibly dangerous Everyone: HAHA yanderedev go damedane
@altaica3522
@altaica3522 3 жыл бұрын
People are making fun of this video, but it's only a matter of time till someone uses this for malicious purposes.
@ljoxleyofficial8119
@ljoxleyofficial8119 3 жыл бұрын
They already are
@ljoxleyofficial8119
@ljoxleyofficial8119 3 жыл бұрын
Vincent DiPaolo what does this all mean?
@yimmy7160
@yimmy7160 3 жыл бұрын
You act like this is new and hasn't been done. This "tech" has been around for a while actually
@mirjanapucarevic2105
@mirjanapucarevic2105 3 жыл бұрын
It is very scary. How many lives will be destroyed?!
@HTMLpopper
@HTMLpopper Жыл бұрын
They warned us about this 4 YEARS AGO
@_.1447
@_.1447 Жыл бұрын
This was four years ago. Imagine the current potential...
@luciferexperiment8553
@luciferexperiment8553 4 жыл бұрын
if this stuff became public, they've been using it for years...
@honkhonk8009
@honkhonk8009 4 жыл бұрын
The porn industry made it and Google helped expand it with TensorFlow. it's nothing new lmao
@afterthought4627
@afterthought4627 4 жыл бұрын
What you think they been doing?
@Phantogram2
@Phantogram2 4 жыл бұрын
What? You forgot your tinfoil hat.
@Besomar_Surahat
@Besomar_Surahat 3 жыл бұрын
DAME DANE
@carpetchair5778
@carpetchair5778 3 жыл бұрын
Bruh
@LuxAeterna22878
@LuxAeterna22878 4 жыл бұрын
This is terrifying. One can only hope that equally ingenious methods of security will protect humanity against such powerful tools of deception.
@lil_weasel219
@lil_weasel219 2 жыл бұрын
yes that security is certainly uhm "impartial" eh and would never itself propagate similar things?
@braindavidgilbert3147
@braindavidgilbert3147 Жыл бұрын
I mean we talked the same way about editing at first. Look how it is now😊
@emptyhad2571
@emptyhad2571 Жыл бұрын
The age of AI has begun and it won’t stop
@nigachad4031
@nigachad4031 Жыл бұрын
voice actors are gonna go wild with this kind of technology
@raoulfr
@raoulfr 5 жыл бұрын
This technology comes from porn...what is humanity evolving into 😂!?
@jaliborc
@jaliborc 5 жыл бұрын
It doesn't come from porn. It comes from academia. The porn industry is using it after it was developed in academia for general purposes.
@beamboy14526
@beamboy14526 5 жыл бұрын
evolving to create a direct brain-to-computer porn indistinguishable from reality
@PhanniHam
@PhanniHam 5 жыл бұрын
porn is one of the biggest industries on the planet
@goforit7774
@goforit7774 5 жыл бұрын
porn will be banned like prostitution
@Lucky8s
@Lucky8s 5 жыл бұрын
@@goforit7774 Banned by who exactly?
@confusedwhale
@confusedwhale 5 жыл бұрын
It's true that it's getting harder to tell, but there is still something wrong with the robot face images. Long live the uncanny valley.
@themelonn6313
@themelonn6313 5 жыл бұрын
confusedwhale wow this will actually help us combat this lol. imagine an expert.
@David-gp3fd
@David-gp3fd Жыл бұрын
nah this is a foolish, short-sighted perspective... the human body has its limits and would have to evolve to keep up enough to tell. Unfortunately tech is evolving way faster than humans
@beepduck
@beepduck Жыл бұрын
bro you can tell so easily they're fake, no one moves their face like that
@yeke7720
@yeke7720 Жыл бұрын
"oh hi mark" at the end of the videos break me up😂😂
@HeavymetalHylian
@HeavymetalHylian 5 жыл бұрын
Spread the word. This needs to be on trending.
@yourneighbour5738
@yourneighbour5738 5 жыл бұрын
Yes we need more Nicholas Cage movies
@plsdontreplytomewitharmy5926
@plsdontreplytomewitharmy5926 5 жыл бұрын
that would inspire more people to do it then :/
@reecherdbrown8156
@reecherdbrown8156 5 жыл бұрын
The word's been spread and we've all stayed asleep
@justsomeguys1121
@justsomeguys1121 5 жыл бұрын
HoneyedHylian trending videos are hand picked by KZfaq staff
@littleme3597
@littleme3597 Жыл бұрын
They could keep dead people alive. LIKE biden and that old hag. S.C. person. Make it appear, she is still alive and speaking.
@vao5399
@vao5399 5 жыл бұрын
Honestly I feel like this is saying that this is some problem that's going to be hard to control, but it won't be. The scariest part about this is that the bigger news sources are getting desperate and lazy, so they won't fact-check this when it pops up.
@miamarie5426
@miamarie5426 5 жыл бұрын
Simon WoodburyForget how do we authenticate videos when people lie, audio can be faked/cut/manipulated, and pictures can obviously be photoshopped
@SuperPhunThyme9
@SuperPhunThyme9 4 жыл бұрын
@@miamarie5426 a deepfake leaves specific, pixel-level traces that Bloomberg here "forgot" to mention... but the audio is absolutely full of clues that no deepfake can come close to fixing.
@tony_5156
@tony_5156 4 жыл бұрын
We could have a big UN meeting and outright ban it, with harsh punishment and tough jail time; yup, no bail for you buddy, you're going straight to jail.
@honkhonk8009
@honkhonk8009 4 жыл бұрын
True. In courts, this won't even be an issue, but with our already lazy and retarded media, they're not even going to fact-check and they're going to treat it like proof. Won't be the first time
@shannonjaensch3705
@shannonjaensch3705 Жыл бұрын
Even sadder is that most never fact-check the news they hear or anything they are told by anyone. Lazy brains that are just consumed with trying to stay alive and in a state of low-consciousness physical body comfort.
@KeertikaAndFallenTree
@KeertikaAndFallenTree Жыл бұрын
I guess the way for everyone now is to make all their pictures private on social media and only let the people they personally know and trust have access. This is one of the scariest Pandora's boxes yet.
@getpriyanka
@getpriyanka 3 жыл бұрын
Governments fear the advanced technology of the world may fall into the wrong hands, but the only reason we want deepfakes is to make memes
@user-uj4ip2pt6h
@user-uj4ip2pt6h 5 жыл бұрын
we humans put technology development first and common sense second.
@muhdelyas-abgyas562
@muhdelyas-abgyas562 5 жыл бұрын
Realistic porn first and common sense second
@PasscodeAdvance
@PasscodeAdvance 5 жыл бұрын
Aliens are butter than us (or salter)
@realdeal5712
@realdeal5712 5 жыл бұрын
myownname myownlastname it is common sense to have porn video with your crush face on it
@SuperDanielHUN
@SuperDanielHUN 5 жыл бұрын
Even if 99.9% of the planet prefers sense, there is always that one guy that opposes it and creates a breakthrough as a result (sometimes). Galileo was completely insane for stating the earth is round at the time, and any person with "common sense" would say not to do it, because he'd be killed by the church and because the Bible already says its flat. Technology both helps and punishes humans, often in unexpected ways, Alfred nobel wanted to help miners with dynamite, he created a weapon of mass murder by accident. Common sense is neither universal nor definite, its rather technology and social changes that twists whats considered "common sense"
@reyxus9454
@reyxus9454 5 жыл бұрын
@@SuperDanielHUN "the bible already says it's flat" wtf are you on about
@JeremyBX
@JeremyBX 3 жыл бұрын
“Fake news on steroids” I really like that summarization
@joshuagulbrandson9397
@joshuagulbrandson9397 3 жыл бұрын
"This advance is not yet available to the public." I don't think that's up for them to decide.
@kenm1976
@kenm1976 2 жыл бұрын
You either have to have an eye for it, or another program has to recognize whether it's a modified photo or not. Although eventually, the machine learning will be researched so well that it will be impossible to spot the difference.
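As a rough illustration of the "another program recognizes modified photos" idea above, here is a minimal sketch of a frame-level real/fake classifier built on a pretrained ResNet, assuming a Python environment with PyTorch and torchvision. The weights file, class ordering, and preprocessing are assumptions for illustration, not a method shown in the video.

```python
# Minimal sketch: binary real/fake image classifier fine-tuned from ResNet-18.
import torch
from torch import nn
from torchvision import models, transforms
from PIL import Image

# Standard ImageNet-style preprocessing (assumed; a real detector may differ).
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def build_detector() -> nn.Module:
    """ResNet-18 backbone with a 2-way head: index 0 = real, 1 = fake (assumed)."""
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, 2)
    return model

@torch.no_grad()
def fake_probability(model: nn.Module, image_path: str) -> float:
    """Return the model's estimated probability that the image is fake."""
    model.eval()
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    probs = torch.softmax(model(x), dim=1)
    return probs[0, 1].item()

if __name__ == "__main__":
    detector = build_detector()  # untrained head: output is meaningless until fine-tuned
    # detector.load_state_dict(torch.load("deepfake_detector.pt"))  # hypothetical fine-tuned weights
    Image.new("RGB", (224, 224), "gray").save("frame.jpg")          # stand-in frame
    print(f"P(fake) ~= {fake_probability(detector, 'frame.jpg'):.2f}")
```

In practice such a head would be fine-tuned on labeled real and manipulated frames, and, as the comment notes, it becomes an arms race: better generators erase the very artifacts the classifier learns to key on.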
@samonterolanjayp.8229
@samonterolanjayp.8229 3 жыл бұрын
Plot twist: The expert they interviewed is a deepfake edit
@alessandrocarbotti9241
@alessandrocarbotti9241 3 жыл бұрын
I was thinking the same hahahahahah
@bobshmyder9749
@bobshmyder9749 3 жыл бұрын
Deepfake crisis
@jameschiwang
@jameschiwang 3 жыл бұрын
WoW 100Th LikE NOBoDy CareS
@Jordyesscoto
@Jordyesscoto 5 жыл бұрын
This popped out in my recommendations right after seeing Shane’s new video lol
@harperwelch5147
@harperwelch5147 3 жыл бұрын
This is really scary. It opens doors that are a Pandora’s box of trouble. I am sad that the creators don’t care what their work will very obviously lead to. This is no video game. It will be weaponized.
@dissolved5920
@dissolved5920 Жыл бұрын
No it wont
@micheala7304
@micheala7304 Жыл бұрын
I was actually talking about this the other day. There will be a point where video and audio cannot be used in court because there’s no way of telling the authenticity.... think how powerful that is
@tablespoon1277
@tablespoon1277 4 жыл бұрын
Classic representation of humans outsmarting themselves, ultimately leading to their own demise :)
@hasanbassari7364
@hasanbassari7364 4 жыл бұрын
ok boomer
@keyboardcorrector2340
@keyboardcorrector2340 4 жыл бұрын
Yet we're still here...
@erik6473
@erik6473 4 жыл бұрын
@@hasanbassari7364 😂😂👌
@indianapaullyj
@indianapaullyj 4 жыл бұрын
Yes indeed
@jaconova
@jaconova 4 жыл бұрын
@Johnny Moris What can you expect of people who go to concerts and deny themselves the real experience by recording everything with their phone?
@kylecollins5463
@kylecollins5463 4 жыл бұрын
Just imagine a world where a deep fake showing a leader saying he's pushed the nuclear button can be shared millions of times
@hwlz9028
@hwlz9028 4 жыл бұрын
Yep
@sorrymyenglishbad2535
@sorrymyenglishbad2535 4 жыл бұрын
Gotta be more subtle for more chaos.
@youtubespy9473
@youtubespy9473 3 жыл бұрын
Lol, why would anybody blow up their own world unless they were suicidal.
@pit2992
@pit2992 3 жыл бұрын
@@youtubespy9473 There are people who have nothing to lose.
@youtubespy9473
@youtubespy9473 3 жыл бұрын
@@pit2992 I said that "suicidal"
@toothpasteboy2019
@toothpasteboy2019 3 жыл бұрын
"Number one victory royale, yeah fortnite we are about to get down (get down) 10 kills on the board right now, just slaughter tomato town"
@_alltheseprettylights_
@_alltheseprettylights_ 3 жыл бұрын
This is where I draw the line with technology.
@derekg5006
@derekg5006 3 жыл бұрын
2018: Deepfakes are dangerous! 2020: JFK talks about Rick and Morty
@awesome117unsc
@awesome117unsc 5 жыл бұрын
Making memes danker than ever.
@kirkclarke7396
@kirkclarke7396 6 ай бұрын
Just imagine how many people would believe the world is flat if someone made loads of deep fake videos of famous people claiming earth is flat
@veredcohen9079
@veredcohen9079 Жыл бұрын
It can be a very bad trick in court when someone wants to smear someone, especially when the judge hides it. This is crazy