What are deepfakes and are they dangerous? | Start Here

May 8, 2024
297,066 views

Deepfake videos are becoming more common. These AI-generated fake videos are getting easier to create and more realistic. So what are deepfakes? How are they made? And should they worry us? #AJStartHere explains with Sandra Gathmann and the help of a special guest...
#DeepFake #DarkWeb #DeepFakes

Comments
  • This robust technology is a great way to destroy people's reputations.

    @excelsior31107 • 2 years ago
    • Among other things: fabricating evidence, discrediting real evidence, and many other extremely dangerous uses. You don't like a new government policy? Oh look, we have a video of you having intercourse.

      @amerlad • 2 years ago
    • It's not new.

      @RiversBliss • 2 years ago
    • @amerlad Well said! Not only government policy, but particularly disagreements over religion and race. Or if people want to discredit you in some way to suit their agenda. It's very easy. Still, people who are close to you usually know you better, which is why these deepfake videos fail to reflect reality over time.

      @dansierrasam79 • 2 years ago
    • /s

      @aminbinsalim1995 • 2 years ago
    • @amerlad :(

      @aminbinsalim1995 • 2 years ago
  • Appreciate the fact that "Sandra looks and sounds better than J-Lo".

    @fahdjamy • 2 years ago
    • Unimportant and unnecessary take.

      @danielwilson9342 • 2 years ago
    • Ayyyyeeee

      @SubliminalMessagesTV • 2 years ago
    • @danielwilson9342 You're right, but shut up.

      @SubliminalMessagesTV • 2 years ago
    • Totally agree.

      @klarag7059 • 2 years ago
    • @jaep struiksma Not from my perspective. The presenter looks more beautiful because she looks more natural than the overly made-up, "fictitious" image of beauty. The reporter is more real and relatable because of her more natural look.

      @klarag7059 • 2 years ago
  • Welcome to the Age of Deceptions.

    @salmanramzan2032 • 2 years ago
    • I read that first as DECEPTICONS.

      @samdacosta4676 • 2 years ago
    • Facts.

      @rogeramezquita5685 • 2 years ago
    • @samdacosta4676 You need to unload movies from your mind.

      @filhanislamictv8712 • 2 years ago
    • @filhanislamictv8712 Ha... ikr.

      @samdacosta4676 • 2 years ago
    • @filhanislamictv8712 It is related.

      @furrycheetah • 2 years ago
  • LMAO! The beginning actually had me. I was like "WTF", and then I remembered what the topic was xD

    @hoboryan3455 • 2 years ago
    • Same here 🤪

      @TheSenzerx • 2 years ago
    • @TheSenzerx I thought it was a YouTube ad for some Jennifer Lopez beauty product.

      @amenjamal8454 • 2 years ago
    • @amenjamal8454 Lol.

      @TheSenzerx • 2 years ago
    • Same... 😂😂😂

      @frfarahrahman • 2 years ago
  • Informative, yet chilling enough to make people think twice before posting pictures of themselves.

    @florence8532 • 2 years ago
    • Celebs are gonna be in danger lol, it's so easy to make a scandal.

      @PainfulGrowth • 1 year ago
    • @PainfulGrowth Considering how often a lot of people put images of themselves online (or, even if they don't, there are lots of people who will intentionally look for pictures of them to post online), I don't think it's just celebrities that are going to be in danger... 0_0

      @Scarshadow666 • 1 year ago
    • Considering how well social media like TikTok, YouTube, and Instagram take off due to people posting images of themselves, I doubt it'll hinder people unless they educate themselves about the dangers of deepfakes. 0_0

      @Scarshadow666 • 1 year ago
  • Wonder why she took her earrings off? It turns out that earrings or eyeglasses make it harder for the algorithm to isolate your face.

    @xja85mac • 2 years ago
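The occlusion effect this comment describes can be sketched with a toy example. This is not a real face-alignment pipeline (real systems use landmark detectors, not pixel matching); it is a minimal pure-Python illustration of why an object covering part of the face lowers a detector's match score, with all "pixel" values invented for the demo.

```python
# Toy illustration: a detector scores how well a small "face template"
# matches a region of an image. An occlusion such as an earring or
# eyeglasses overwrites face pixels, which lowers the match score and
# makes isolating/aligning the face harder.

def match_score(region, template):
    """Fraction of pixels where region and template agree."""
    hits = sum(1 for r, t in zip(region, template) if r == t)
    return hits / len(template)

# A 4x4 "face" flattened to a list; values are made-up pixel intensities.
face_template = [1, 2, 3, 4,
                 5, 6, 7, 8,
                 9, 8, 7, 6,
                 5, 4, 3, 2]

clean_face = list(face_template)     # unobstructed face
occluded_face = list(face_template)
occluded_face[13] = 0                # "earring" covers two cheek pixels
occluded_face[14] = 0

print(match_score(clean_face, face_template))     # 1.0
print(match_score(occluded_face, face_template))  # 14/16 = 0.875
```

Taking the earrings off before recording simply removes pixels that the model cannot explain, so more of the face region matches what it was trained on.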
  • I thought the J-Lo intro was an ad.

    @tauriqabdullah6130 • 2 years ago
  • Before anything horrible happens, I hope a global policy is created to protect those who are used for deepfake material, which could maybe involve cyber police. Edit: Honestly, it already seems to be getting out of hand, but the sooner the better.

    @sindhujasai1345 • 2 years ago
    • Too late, it has and it is.

      @KatyYoder-cq1kc • 1 month ago
  • In my opinion, that was the best session since the pandemic. Especially the face-changing Sandra.

    @MuhammadShahAlamSaqibi • 2 years ago
  • This series is actually pretty good.

    @rajinrashid2455 • 2 years ago
  • This lady is amazing, and your scripts are just on point.

    @ousman997 • 2 years ago
  • Technology has advantages and disadvantages, and this is one of them. May Allah save us from all evil people, amen.

    @itistrueitisafact5432 • 2 years ago
    • Amin. But deepfakes can be recognized; there are apps people use to verify whether a video was fabricated.

      @y.r5155 • 2 years ago
    • @y.r5155 So what, are you gonna scan and check every video for a possible deepfake?

      @KatariaGujjar • 2 years ago
    • @KatariaGujjar No, I'm talking about a celebrity or government official or someone well known. I'm a software engineer; I know how to create one and how to tell it's a deepfake.

      @y.r5155 • 2 years ago
    • @y.r5155 Celebrities and officials make thousands of videos daily. Who is going to check every single clip?

      @KatariaGujjar • 2 years ago
    • @KatariaGujjar Are you a Jew or Zoroastrian?

      @ADeeSHUPA • 2 years ago
  • The start was just 🤣🤣🤣🤣

    @mahmudulhaidersiyam3186 • 2 years ago
  • Sandra! Thank you so much 🧡

    @sultanrayder • 2 years ago
  • Fun fact: Sandra looks better than J-Lo 😂

    @zx7siovia213 • 2 years ago
    • So what? J-Lo can sing, dance, act, and choreograph, and she's got a better body. A woman's physical appearance shouldn't matter that much to you or anyone else. Stop comparing us like objects. It creates a competitive dynamic between women, and we should no longer allow men to do this to us.

      @blaze4158 • 2 years ago
    • Not a fact, just an opinion.

      @loveshell007 • 2 years ago
    • @loveshell007 Who is your comment directed to? You should know enough to be specific about whom you are addressing.

      @blaze4158 • 2 years ago
  • Wow, thanks for raising these concerns!

    @cinto1394 • 2 years ago
  • I actually don't think she looked like Jennifer Lopez. In fact, when I first watched it the sound was muted, and I just thought she had the same name as Jennifer Lopez. But I will agree this is very dangerous, and quite sick actually.

    @Mazzie2022 • 2 years ago
    • She only did it as an example... There are people out there who can make deepfakes looking 100% like the real celebrities!

      @andikoazri • 2 years ago
    • They probably did it very quickly, just as an example.

      @nusaibahibraheem8183 • 2 years ago
    • @nusaibahibraheem8183 Exactly...

      @andikoazri • 2 years ago
  • The intro really got me! 😂

    @rimshakhan9751 • 2 years ago
  • It comes down to people's moral character. There are always people who will misuse something 🙄

    @badripaudel77 • 2 years ago
  • Thank you! This is a very good overview of deepfake technology. The only thing you didn't mention is that realistic deepfakes are trained on huge datasets of images (10,000+), like the Tom Cruise deepfakes, where they have hours of footage with a huge range of facial expressions, and you need to map the face onto someone with a similar facial structure to achieve realistic results.

    @nickdupreez1843 • 2 years ago
    • With so many activities being online now (school, work, etc.), it's not so hard to get hours of footage of anyone. And that's not counting all the content people post of themselves on their social media accounts.

      @BA-mf4gi • 1 year ago
    • @BA-mf4gi I agree. The commenter seemed to be trying to make it sound like it's not that big a threat to anyone other than celebrities with huge amounts of footage, while ignoring the fact that modern phones and social media have driven large segments of the population to create a comparable amount of footage of themselves and post it all online. The commenter doesn't use personal social media like FB? Doesn't have friends on SM, so he doesn't know?

      @DarkPesco • 1 year ago
  • I was trying to find the skip-ad button at the beginning, thinking it was an Omaze ad or something.

    @saajidalikhan • 2 years ago
  • Thanks a lot for the informative videos, and please upload your videos on time.

    @ishaqueahmed6362 • 1 year ago
  • Genius editing 👌

    @webdecodedwithfahad4414 • 2 years ago
  • The start... Nailed it 🤣

    @shabeebkaringappara2917 • 1 year ago
  • That is really interesting to know. Technology has changed everything. I hope laws will be made to catch the culprits behind this.

    @MuhammadAbdullah-dy5dn • 2 years ago
  • When you also consider that we each carry a small camera to record things, the world is getting creepier by the minute.

    @gokulpayyanur1839 • 2 years ago
    • Yes. And this video only highlights visuals. Sound and acoustics are developing very fast too.

      @EmpressTouch • 2 years ago
  • All we need is knowledge, and to stay informed about what technology can do.

    @thisismyloooveeeyy8014 • 2 years ago
  • You are rocking it, Sandra ❤️

    @wonderfacts7782 • 2 years ago
  • Great informative video, thanks. Keep going.

    @haideralisuterwala9403 • 2 years ago
  • Very informative videos, good work 👍

    @sunny-pe6yt • 10 months ago
  • Very informative video! By the way, you look absolutely sober and beautiful!

    @Salman-qd2wl • 2 years ago
  • The same algorithms used to detect deepfakes can be used to train better deepfake networks.

    @azzyfreeman • 2 years ago
    • A sick cycle...

      @DarkPesco • 1 year ago
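The arms race this thread describes can be sketched with a toy loop in pure Python. This is an illustration of the adversarial dynamic only, not a real GAN: the "real" footage is reduced to a single invented statistic, the detector is a fixed distance threshold, and the generator's update rule is a made-up heuristic. The point is that the detector's catch rate is exactly the signal the generator uses to improve.

```python
import random

random.seed(0)

REAL_MEAN = 10.0   # invented summary statistic of "real" footage

def detection_rate(fakes, threshold=2.0):
    """Detector: flag a sample as fake if it lies more than
    `threshold` away from the real-data mean; return the catch rate."""
    flagged = sum(1 for x in fakes if abs(x - REAL_MEAN) > threshold)
    return flagged / len(fakes)

fake_mean = 0.0    # the generator starts out producing obvious fakes
for _ in range(60):
    fakes = [random.gauss(fake_mean, 1.0) for _ in range(200)]
    rate = detection_rate(fakes)
    # The detector's output is the generator's training signal: the more
    # often it gets caught, the harder it pushes toward the real statistic.
    fake_mean += rate * (REAL_MEAN - fake_mean) * 0.5

print(round(fake_mean, 2))   # close to REAL_MEAN: the detector trained it
```

Each round the detector catches fewer fakes, precisely because its verdicts were fed back into the generator, which is the cycle the comment is pointing at.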
  • Yo, her last words about rocks sounded like the hood, I love it lol.

    @NINJANOOB777 • 1 year ago
  • Al Jazeera is definitely a great example of it.

    @Head_of_the_Table2.0 • 2 years ago
  • Education, education, education. People who can think critically and exercise a healthy level of skepticism are difficult to deceive. New technology, old solutions.

    @mirygalas6508 • 2 years ago
  • Top-notch investigative journalism.

    @andym6603 • 2 years ago
  • It's very dangerous because it's destroying people's reputations, dignity, and careers, and creating depression.

    @dorcasnjeri2858 • 2 years ago
  • I love the show... very informative.

    @ikkmic451 • 2 years ago
  • Let's just appreciate all the hard work done to make such a video. It seems simple, but a lot of effort surely went into it.

    @moatazgamal34 • 2 years ago
  • Very interesting but terrifying at the same time. I wouldn't want that ever to happen to me or my friends and family. This isn't good; there will be so many problems with this.

    @michelefortner1190 • 2 years ago
    • 💯😩💯

      @maverickbourne2.0rph. • 11 months ago
  • Sandra has extraordinary grace and gravitas.

    @arvailankara • 2 years ago
  • I suggest these things be used for video game entertainment purposes, not for harm or conflict. And we can make deepfakes in such a way that they look real while everyone still knows they are deepfakes.

    @VAUIENLET • 2 years ago
  • The only video of Al Jazeera's I've respected.

    @slipknotj2581 • 2 years ago
  • The funny thing about this segment is that this information has been widely known for the last several years, the technology has only improved, and there's a slight possibility that it has already been used, with actual results, in modern media settings.

    @SubliminalMessagesTV • 2 years ago
  • This is extremely alarming.

    @birdsarecool6448 • 2 years ago
  • Engaging in open discussions about deepfakes is essential for raising awareness and building resilience. By fostering a culture of transparency and accountability, we can collectively navigate the challenges posed by deepfake technology, mitigating its negative impact on individuals and society.

    @user-mu3iy8fq3d • 5 months ago
  • So what apps did you use?

    @kristakaufman3593 • 1 year ago
  • The last bit was a killer 😀

    @01arthi • 2 years ago
  • There is a woman in Canada who has a YouTube channel; her name is Jasmine, and she is a doppelganger of Sandra. The name of the channel is Jasmine and Dawoud.

    @ShahbazAli-ji3jq • 2 years ago
  • The audio on this is brutal.

    @Recuper8 • 2 years ago
  • Sandra is my favorite journalist.

    @nawazsharif4634 • 2 years ago
  • I subscribed to your YouTube channel after watching this video. Very informative and timely!

    @juankitchen1008 • 1 year ago
  • Informative.

    @davidalao5336 • 2 years ago
  • Marvel is creating deepfakes of Tom Hiddleston using Loki.

    @izzatfauzimustafa6535 • 2 years ago
  • Could you please do an episode on the current situation in Somalia!!!

    @Kingofboys15 • 2 years ago
    • What's happening in Somalia???

      @ilhaans8086 • 2 years ago
  • At 2:16, the genius of the software user trumps that of its developer.

    @SunnySJamil • 2 years ago
    • Facts.

      @Hypocrisy.Allergic • 4 months ago
  • It's very hard to spot the fake ones, so be vigilant and prudent.

    @Aslaan1 • 2 years ago
  • Presenter ❤

    @user-mw9cl6pj3i • 2 months ago
  • Correction

    @Gustoking37 • 1 year ago
  • Soooo, any good free apps for deepfaking? For educational purposes, of course :)

    @tzogreekwarrior6 • 2 years ago
  • I was so annoyed that Sandra was substituted. Lol.

    @IbrahimAli-vv3df • 2 years ago
  • Kindly make a video on digital currency.

    @CSSWITHRIDAEZAINAB • 2 years ago
  • Nice work.

    @mubashirali7829 • 2 years ago
  • This tech will help protect the elites.

    @gandhi1945 • 2 years ago
  • *Very*

    @LivesInReality • 2 years ago
  • What if we found a way to, like, "watermark" a video 💦 The future will eventually depend upon markings to prove the ❤legitimacy❤ of media.

    @MagicMattHawkins • 1 year ago
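The watermark idea this comment raises is essentially content authentication. A minimal sketch using only Python's standard library: the publisher computes a keyed tag (HMAC) over the video bytes, and anyone holding the key can later verify the footage was not altered. The key name and sample bytes here are invented for the demo; real provenance schemes such as C2PA's Content Credentials use public-key signatures embedded in metadata so verification does not require a shared secret.

```python
import hashlib
import hmac

SECRET_KEY = b"publisher-signing-key"   # hypothetical key held by the outlet

def sign_video(video_bytes: bytes) -> str:
    """Produce a tag that only the key holder could have generated."""
    return hmac.new(SECRET_KEY, video_bytes, hashlib.sha256).hexdigest()

def verify_video(video_bytes: bytes, tag: str) -> bool:
    """Check that the footage matches the tag, i.e. wasn't altered."""
    return hmac.compare_digest(sign_video(video_bytes), tag)

original = b"\x00\x01\x02 raw video frames ..."
tag = sign_video(original)

print(verify_video(original, tag))                 # True
print(verify_video(original + b"tampered", tag))   # False
```

The limitation, which is why the problem is hard: a signature proves a file is unmodified since signing, not that what the camera recorded was real in the first place.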
  • They should have used this technology in MS Dhoni's biopic.

    @istriver. • 2 years ago
  • Lol @ "if it makes you feel a strong emotion, either really, really good or very mad, take an extra second to check if it's real". Yeah, because people experiencing strong emotions are definitely using logic in that moment and will think to check frame by frame for artefacts and ghosting before they join a bandwagon. Lol. The rule in the '90s was: don't believe anything you see on the internet, because you don't know what is and isn't fake. It continued to be the rule in the early 2000s too. Then suddenly, somewhere around 2010, people lost their minds and forgot that the internet is full of misinformation and fakery. If we just went back to the original rule, deepfakes online wouldn't pose a problem to anyone. Hopefully deepfakes might also encourage people to care about their data, about who has their voice print, and about who has access to their photos. Maybe they'll think twice about using that Russian novelty face-swap app, or letting a major company or the government have their voice print for "security purposes".

    @tjmarx • 2 years ago
    • Best comment yet.

      @fredkerfwappie8380 • 2 years ago
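The "check frame by frame for artefacts" advice above can be partly mechanized. A toy sketch in pure Python, with frames reduced to short lists of invented "pixel" values and an arbitrary threshold: crude face-swap compositing can make one region of a frame jump far from its neighbours, so an unusually large inter-frame difference is a cheap first-pass red flag (real detectors are far more sophisticated).

```python
def frame_diff(a, b):
    """Mean absolute pixel difference between two frames."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def flag_anomalies(frames, threshold=5.0):
    """Indices of frames that jump suspiciously far from their predecessor."""
    return [i for i in range(1, len(frames))
            if frame_diff(frames[i - 1], frames[i]) > threshold]

# Toy clip: smooth motion (intensity rises by 1 per frame), except one
# frame where half the pixels were crudely replaced by a pasted region.
clip = [[10 + t] * 8 for t in range(6)]
clip[3] = [10 + 3] * 4 + [90] * 4     # the "pasted" half of frame 3

print(flag_anomalies(clip))   # [3, 4]: the splice stands out both ways
```

The transition into and out of the tampered frame both exceed the threshold, which is why frames 3 and 4 are flagged.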
  • Funny enough, the only people I am afraid of regarding this technology is the government. Who knows what kind of devious plans they're going to be able to pull off because of this tech.

    @dumanimjo609 • 2 years ago
  • Caleb Carr's book Killing Time predicted this.

    @wh3resmycar • 2 years ago
  • Thanks.

    @emadabuhagag222 • 1 year ago
  • Few days!?

    @CAMIDRCS • 2 years ago
  • I'm just curious why the stupid name "deepfake" stuck. "Counter-facing" was the better descriptor.

    @danny-li6io • 2 years ago
  • Until 3:40 it was fun, but the whole scenario changed after that... It is actually terrifying!!

    @Farah_Gojali07 • 2 years ago
    • Rule 34 of the internet: if something exists on the internet, its NSFW version already exists.

      @krateproductions4872 • 2 years ago
    • Jin 😀

      @fahmidafaiza8207 • 2 years ago
    • @fahmidafaiza8207 Yess! 😍😍💜💜

      @Farah_Gojali07 • 2 years ago
  • Nicolas Cage as Lois Lane is the best.

    @habbyhouse • 2 years ago
  • That freaked me out.

    @rogeramezquita5685 • 2 years ago
  • At 4:14, that music made me think my stomach was rumbling.

    @ahiyanali7231 • 2 years ago
  • Make laws to watermark deepfake videos or face prosecution!

    @Amaaaaan1 • 2 years ago
  • Pretty interesting. They should redo Enter the Dragon, the Bruce Lee movie.

    @lancesay • 2 years ago
  • Pro-deepfake people and deepfake apologists are a problem. Mark my words.

    @jamdindali • 2 years ago
  • Wow, this show is awesome.

    @user-uc4iv3jx3v • 2 years ago
  • With advancement... these blur spots around the ears and the resolution issues will disappear, and we won't be able to recognize the fakes.

    @abdullahbinraghib5983 • 2 years ago
  • I've heard that nowadays movie stars use them to save time...

    @gcfoodandculture • 2 years ago
  • Welcome 💝💖🌹💞

    @shahidchoudhary9795 • 2 years ago
  • Audhubillah, at first I was confused 🤷‍♀️😂

    @halalpolice7544 • 2 years ago
  • The only good application I can see for deepfaking is replacing a stunt double's face with the actor's or actress's in movies and TV shows, instead of doing all sorts of scans and rigging for that, or hiding the stunt double's face.

    @hermie0600 • 1 year ago
  • Is water wet?

    @SLKFJAD • 2 years ago
  • Yes, very dangerous stuff..!!

    @osiasnocum7239 • 2 years ago
  • Not going to lie, she had me in the first half.

    @shriragreddy7193 • 2 years ago
  • I want to say... there is a reason why DeepFaceLab doesn't keep updating the preview image every second: that slows down the training process!

    @BillyHau • 2 years ago
  • Fitnah coming soon.

    @paklah245 • 2 years ago
    • Dajjal coming soon.

      @hareemshk9904 • 2 years ago
    • It's already here... we have to be more careful.

      @baekhyunsbambi6978 • 2 years ago
    • Too late, it is coming soon...

      @filhanislamictv8712 • 2 years ago
    • Allah is coming.

      @leylayetmez • 2 years ago
    • He's here.

      @jumambugah3946 • 2 years ago
  • Wtf, I tripped out when she said "I'm Jennifer Lopez" 😆 She is a pretty reporter though...

    @dezzelmoney • 2 years ago
  • JL..Palo Mayombe. FV magic.

    @stephencorsaro954 • 2 years ago
  • Sandra looks good and she knows it.

    @fivetimesyo • 2 years ago
  • Hahaha... I was amazed and dejected that the host got changed.

    @Counselor23_Nov • 2 years ago
  • Lol, last of my worries! Hahah.

    @gnomuka • 2 years ago
  • The implications of deepfake technology are chilling af.

    @darkvoid.0938 • 10 months ago
  • She's totally got Lopez beat on beauty (and especially class).

    @empmachine • 1 year ago
  • Absolutely frightening.

    @FarzTurk • 2 years ago
  • Yes is the answer to the video title.

    @aperson2730 • 2 years ago