Apple’s Software Chief Explains ‘Misunderstood’ iPhone Child-Protection Features (Exclusive) | WSJ
Neural Hashes, Safety Vouchers and More Fun Terms Explained
Apple’s tools for flagging child pornography and identifying explicit photos in kids’ messages caused backlash and confusion. In an exclusive interview, Apple software chief Craig Federighi sat down with WSJ’s Joanna Stern to defend the technology and explain how it will work. Illustration: Laura Kammermann/The Wall Street Journal
Personal Technology With Joanna Stern
Technology is overwhelming and making decisions about what gadget to buy is harder than ever. WSJ personal tech columnist Joanna Stern makes it all a bit easier in her lively and informative videos.
More from the Wall Street Journal:
Visit WSJ.com: www.wsj.com
Visit the WSJ Video Center: wsj.com/video
On Snapchat: on.wsj.com/2ratjSM
#WSJ #Apple #Privacy
Real title of video: Tim Cook throws Craig to the wolves.
🤣🤣🤣🤣
bro what 😂
@@cardboardpackage Tim Cook has been hiding while he throws Craig under the bus. I think the CEO of the company should be the one explaining this to its customers and media, not his VP of Software Engineering. Cook just threw him under the bus and started driving it.
@@MidNiteR32 nah. I think it was the right move. First of all Craig is more agreeable and second the risk is lower. And to be honest Craig managed it formidably imo
@@MidNiteR32 nonetheless your comment is quite on point ;)
Let’s make sure all the members of the Vatican have an iPhone
The pope just bought a Huawei phone
Or they picked the blackberry devices.
@Highground Trump and biden be sweating
Use iCloud*
@Highground We don't have to be selective. But if you want to be, the Republican party is where we should start
Loved how he mentions Telegram as the message app. No free mentions for you, Zuck.
Damn Facebook. Still don't get how their acquisition of Instagram and WhatsApp went through, oh wait...
How is Signal? I saw Elon Musk recommending it.
Yet Telegram founder hates Apple
@@albinjt1 so what are the benefits of loving Apple?
@@albinjt1 probably because Apple restricts certain channels on Telegram. It’s insane that I have to go the browser version of Telegram to view those restricted channels.
Apple: We're not scanning your images, we're just scanning your images
We're not scanning your images, we're just scanning OUR images.
If they are stored on their servers in this day and age I feel as if it’s your fault for trusting big tech. Either way, we’ll all forget about this in a couple of weeks. We basically already have
If they can install a program that tells me my battery is at 10% after 10 minutes of use, when a quick hard restart brings it back to 100%, there is no telling what they can install on your phone. If Ur f-ingdeau gives Apple a couple hundred million of our tax dollars because we proved that the vax was ineffective and self-immunity has an 80% success rate at beating the virus, there's no telling what those greedy blasters will do.
Correct me if I'm wrong, but in order to upload an image to the cloud, you need to scan the images first right?
There is plenty of information on how they "scan" the photos; it's even explained in layman's terms in this video.
3:21 “pornography of any other sort” I’m glad Craig essentially said that Apple knows and understands that people simply just have nudes on their phones
Yeah, some people are simple degenerate pigs, but not actually pedophiles.
@@nicolelea615 yes, and some people are photographers and part of that is nude photography, not porn.
@@nicolelea615 They might be photos of a spouse or partner. Or photos people took to track weight loss/gain progress.
@@billjamal4764 Like you, your dad, your uncle, etc.
@@nicolelea615 My friend, there are people out there that are just as bad as mentioned, but there are also people who have private photos of their partners/spouses. Don't put everyone and everything under one group, it's not fair. Hope you understand (:
Tim: “Hey Craig….” Craig: “NO NO NO NO NO NO!” Craig: “Hey everyone…😅”
Don’t get it
@@Jushwa it means Tim Cook told Craig to go do the interview
Tim Cook is the CEO; Craig is the software person, he knows what everything does, he made it. Tim Cook does not do software. And anyone that stores things in the cloud doesn't own the servers; all they own is the main device's storage. They don't scan on the device, they scan on the cloud only.
Now that this whole news has gotten out, actual petafiles aren’t going to be storing their Photos on iPhone anymore. So basically this feature is useless now.
The people stupid enough to store highly illegal material in cloud storage won't be stopped by this news. It was always one of the easiest ways to get caught.
@@hundvd_7 he says, sounding a little too informed
@@diedforurwins Sure, go ahead and call other people pedophiles. That will make you look smart.
“Petafiles” bro?
@@Dr.HouseMD down with those Petafiles!
“I think the customer owns the phone” It’s a yes or no answer
That's a big fat no.
That sounds like a yes to me?
It's definitely a little more complicated. I can "own" a car but there are a lot of restrictions on what I can do with it or to it, especially if you want to use it on a road. Ownership doesn't really imply full control most of the time; even with land you have tons of laws limiting what you can do with it.
@@joshgribbon8510 Exactly. We as consumers don't really own anything anymore, and that's the world over; we don't have any rights, just privileges, until someone decides to take them away.
@@mantasvilcinskas definitely more complicated than a yes.
“It takes 20 years to build a reputation and five minutes to ruin it. If you think about that, you'll do things differently.” -Warren Buffett Apple is feeling this hard, hence the panicked response to media.
Okay then purchase a Chinese phone.
It’s going to be misused like any other tool big tech and the government gets its hands on, period.
@Carrot Cruncher I'm an Android guy through and through, but as a network engineer I can tell you programs are much more flawed than people realize; they're doing this for the margin of error.
Apple fanboys will just bend over and accept everything
@@The-Heart-Will-Testify they are the ones who are mad at apple. Think before u comment.
First of all, thank you for covering the issue. I wish you pressed him on what type of audit he mentions, because to me anyone can force Apple to add a database via the FISA court. I want to know what is done to prevent that from happening instead of taking Apple as its word.
THIS!! I was pleased to hear about "auditability" -- but what exactly does he mean? Anyone got source / more info on that?
Before WSJ is allowed the privilege to interview craig they have to agree to terms and conditions
He seems to be very shady in his explanation as to what the company IS going to do.
So basically avoid apple's icloud services
@@zonka6598 if that's what works for you and gives you a sense of privacy then by all means, but just note that if you use Google they're already doing it, and worse, so yeah…
"It's not a backdoor. But it can be manually verified by humans in case our algorithm finds a match." Hmmmmmmm 🤔 That sounds suspiciously like a backdoor to me.
Spying withe extra steps
The files are on their servers
A backdoor to what? iCloud? Which Apple already controls?
Yeah... You don't need to upload your photos or use that service.... Or simply don't have CP.
@@bluebird1954 no it's the fact that it might be a faulty system. How can it differentiate an image of a child posing in a sexual manner with lingerie, to a baby taking a bath. Will it flag both, none, or one of those images? Simple things like that can really impact a person's future
“I think the customer owns the phone” Right to repair: no
now you can
Same tech can be used to identify political dissidents, protesters, and just about anybody. Imagine matching memes commonly shared by people of the groups to identify people for political persecution.
Yes. Even If we take them at their word and accept that they can't see other photos because they can only see the ones that neural network has very tightly matched for. They still haven't said anything about the possibility of them searching for other stuff.
All they need to do is change the hash and AI to look for other photos.
I'm sure your isp, phone provider, Google, facebook (including Instagram), and any other social media or messaging platform do that. If you truly care about privacy, you have to get an opensource operating system, and only use opensource apps. There's no way around it
THANK YOU was looking for this. This is smoke in Mirrors.
It’s already on Gmail, Facebook, Instagram, Twitter and YouTube.
I don't want my images to be scanned even if I don't engage in any illegal activities. It doesn't matter if it's AI or a human looking through my photos it just makes me feel uncomfortable.
They already are. Don’t you see how your photos app can recognize faces etc? I think people pressed about this have things to hide
@@johansm97 lol I don't understand how people think everything in their iPhones aren't already being touched by AI, especially photos. How do you think your photos look so good? Computational photography using AI. How do you think they group faces and show you memories? AI. This is just Apple using AI, in a much more careful way than other companies, to do something. That's all it is, and people are losing their minds
@@7billza they actually aren't, facial recognition on iphone is done on device, apple doesn't scan anything, it's the only company that believes in privacy
@@bouzianenadhir8503 they wouldn't have destroyed end to end encryption to Apple servers aka iCloud. If they can snoop around while a photo is uploading to cloud, it's not end to end encrypted. It's not private. As simple as that.
Don’t use iCloud then.
I’m glad that she pushed the "who owns your phone" question and the conclusion. I applaud WSJ for pushing the exec; this didn't feel like a scripted Apple BS interview. Now, how do we know those pictures being provided by those associations won't be manipulated into searching for other stuff? At the end of the day Apple has no idea what those hashes are. Who knows what the hash provided was.
@@Karantkr Multiple photo apps do this..
This felt not scripted? The forced laughs, fake "searching for the right words", multiple camera angles and after all that this felt unscripted?
@@MrSidneycarton you expect a trillion dollar company to shoot an interview with a single camera? 😒🙄 multiple camera angles are an industry standard
Paid interview
@@nixednamode3607 Not sure whether that was intended as sarcasm or not buddy.
Tim literally threw the guy at wolves. Hilarious
It's almost as if it's Craig's job to talk about software, 5 days a week. He even makes a few mil a year for doing it.
@@_sparrowhawk talk about software is his job, but that matter was super important and a word from Tim would have been welcomed
Nice copy and paste
He doesn't seem to understand the fundamental reason people are upset. The hash database is on your phone. The scanning is on your phone. This means that we have no guarantee that our phones will be private in the future.
I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before and we only have their word now. Why are you in an uproar now? When they first said your phone was Private, why didn't you roll your eyes and say "ya, but what about the future?"
@@bhavinbijlani They already built the tech to do it. That was their argument against creating a backdoor in 2015. Now it exists and Apple has no excuse that they “can’t comply.” They’ve already stated that they developed the technology to comply.
It feels like in China.
People are upset because they don’t understand the underlying technology, the same way that a lack of education about natural forces and science leads people, still, to call someone a witch and persecute them.
@@carlosgomez-ct6ki The world has gone the same way. We want to have privacy, but every company/gov wants to get it.
Tim checks the laptops of his engineers........ apple engineer: I swear its just for the image classification algorithm.
Trying to confuse the rocket detection algorithm with similar images 😏
Bruh 🤣
Craig practicing his “Good Morning” for Tim Cook’s Replacement 👀😂 Reference | 1:20
I would be happy if Craig took over for Tim.
@@triple7marc Same, he’s so perfect for the Role . Full of Life and so Enthusiastic .
"A thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor"
But isn’t it better than having the door wide open as it is on many cloud services? I think this is the best balance they could find between not hosting CSAM on their servers and also protecting customer privacy.
@@Dlawderek no
@@Dlawderek What about not building the door at all. Law enforcement is NOT the duty of private companies, and there are very good reasons for that.
@@Dlawderek absolutely not, however good their intentions are, i will never agree to having my private data monitored. Something many forget is that one’s privacy is protected by law. Even if police were to illegally obtain even legitimate evidence against one (be it through illegal wiretapping or else) that evidence will be rejected as unlawfully obtained. What apple is doing here is basically rephrasing “we will hack into your storage and check if you have anything illegal” into “we will scan all your photos and if you don’t agree then we will stop providing service to you even if you paid for it”. Barbarism.
@@Dlawderek NO. If you upload to a cloud service, it's not your hardware or a private space. Apple is now saying my hardware is actually theirs too, to do with as they please...
They also announced it on a Friday afternoon because they knew there would be blowback and they just wanted people to forget about it during the weekend. Well that’s not happening.
And they were about to lose sales. I thought the whole phone was the cloud, I'm not understanding.
Remember the iCloud hack in 2014? All the celebrities' pictures leaked. Yeah, Apple has some nice security there. Thank God I don't have an Apple account.
Apple always releases negative news on a Friday afternoon
@@fynkozari9271 Dude that was 2014 lol Apple has only gotten better with security since then.
@Apple Genius that’s actually sad
It seems like Apple still doesn't understand just how strange this has made their most loyal and fervent customers feel. This has the potential to really spiral out of control on PR terms, much like the 'apple purposely slows down phones' headlines came out of the throttling due to battery age thing. This loyal base kinda sets the tone for what the sentiment around Apple is, and right now they are spewing and the issue isn't going away. I think the way Craig handled this won't do anything to dampen the concerns either, condescendingly dismissing the backdoor concerns and also giving no details on how it will be expanded or just how we can guarantee Apple is limiting it to child porn. I understand Apple has a new head of PR, it's making people question just what Apple has been up to before that their slick PR glossed over. Some kinda line has been crossed here that I've never felt/seen in my 25 years of using and following Apple.
I feel exactly the same way. I’ve been apple only since I was 8… huge fan of the company… they basically bought my house… some line is being crossed here. Like maybe I’m not in love anymore…
windows is in background nice
The apple loyalists will always fall in line. As a person who used to think "this is surely the last straw for Apple fans" I don't doubt anymore. I buy the stock and get "rich" with the winning team. Public backlash needs to be HUGE to stop this. Apple hard-core fans aren't revolting against Apple. I've put my money on that.
We're not because we don't run our lives with pitchforks and torches.
@@ThinkyParts how old are you now?
As a longtime Apple customer (1986) I was thrilled with Tim Cook's statement about privacy and your history of resisting law enforcement and government when it comes to privacy. Now you have appointed yourself the law. And now you are going to scan my phone without my permission. At least the government has to get a warrant. Just a month ago I got rid of my Fitbit watch because Google bought the company, and bought an Apple Watch because of Apple's supposed commitment to privacy. You are not the government, so I have no recourse if you abuse my privacy. So you can do whatever you think is right and I have no recourse. There are only two operating systems in the world, and we just have to accept that Big Brother Apple, like Big Brother Google, knows what's best for the unwashed. We have just about as much recourse as people in China.
Or Apple wanted to avoid government parties such as the FBI and CIA for so long (with past features such as an option to destroy all user data should the phone's password be typed 10 times wrong) that continuing to do so could jeopardize the company. Donald Trump single-handedly managed, with an executive order, to force Google to stop providing the official version of Android and its services to Huawei, and Huawei was almost ready to exit the market. Now imagine Apple being forced to show all users' iCloud data to governments due to child-pornography claims even though you don't have any. That would suck for them and for users' privacy. Apple (for now) found an in-between solution that still protects legit users' data on iCloud and protects Apple from governments by finally giving them an actual "backdoor" after many years (as it seems from Craig's tone). The only time this feature will get out of hand is if it expands to political parties or political correctness, such as someone posting a funny LGBTQ image that seems insulting in Apple's eyes. Then things will not look good for Apple.
An American claiming they endure anything like the tyranny in China is just ignorant.
@@milantoth6246 It was extreme. My concern is the fact that the internet and media companies are becoming a necessity. Most businesses or utility companies assume you have internet access. The problem is the tools you need to access the internet are companies that can make arbitrary decisions that change your access to the internet, and you have no recourse. There are only two operating systems in reality Apple and Android, private companies.
You can de-google Android phones though, since it is open source. Check out Rob Braxman's channel on how to do it, if privacy is so important to you.
@@justinberman7386 along with Graphene and Calyx which pretty much only work on Pixels, there is also /e/OS which supports a wider range of phones.
It doesn’t matter what the steps are between if A is uploading a photo and Z is them reviewing/alerting authorities. They “Review your private photos” despite the letters in between. Don’t get lost in the steps.
This explanation from Apple is even more worrying. They describe a technical solution where no one will be able to independently evaluate what content triggers the alert. Hash results will be encrypted so that no one will know what content matches what "hash of interest," on the device or on the backend. Any politically sensitive content could be part of the database without anyone ever knowing. Never trust anyone's word to keep you safe from technology abuse.
The explanation is literally a lie. We don't process the images on your phone, here's the misunderstanding: oeighoihzgoiehrg hieogheorighe oriheoirhgoierhg "scanning on your phone, yes, but," eoitgoiehgeihgo That could be the TL;DW of the video tbh
@@user-hm7zn6bz4y It's literally not that hard to understand it
The alternative is, as Google and MS do, to scan the whole cloud content of all users. Apple wants to be in a position of not being able to see our data. And that's the way to protect our privacy while trying to follow the laws of the US and EU, which want more and more supervision.
Oh, it gets worse. At some point audits (real humans) get involved. At that level who knows what can happen, and if anything did go wrong at Apple, how would you know? What happens when hackers find a way to inject foul hashes, or FISA requests force Apple to apply this tech for political reasons (under the guise of domestic terrorism)? In fact, the timing is extraordinarily on point with recent updates to terrorism policy.
@@tomboss9940 The alternative is better. You, as a user, can decide whether or not your content undergoes the screening. While your data rests on your computer or phone, it remains yours and under your sole control. What Apple is doing is potentially removing that control from your hands: any data on your phone may be monitored without you even granting that right. The only thing preventing them from doing that is their good will. Technology history has taught us that you should never trust anyone's word to prevent technology abuse (be it knowing or not).
The reason to worry about this photo scanning is that there's no way it doesn't evolve. Currently, it only checks 1) photos being uploaded to iCloud 2) that match a database of known CSAM. Importantly, this doesn't do anything about new CSAM created in the abuse of children. Catching new material is the obvious next step. And there's no way to achieve that with the current hashing architecture. It has to be done by constantly monitoring all media on the phone, probably with some "AI moderator". And there's no way that some government doesn't demand that this monitoring be used to detect something other than CSAM, like political dissent (remember: China is Apple's biggest market). That's the worry. This new tech is only *kinda ok* as long as it doesn't evolve a single step beyond what it is now. And there's virtually no chance of that happening.
en.wikipedia.org/wiki/Slippery_slope
@@RHStevens1986 Sure, but also worth considering: en.wikipedia.org/wiki/Foot-in-the-door_technique
The typical American ignorance that radiates from this single comment is amazing.
iOS 16: they will start scanning your on-device photo library. Mark my words, guys 😎
Yeah, this tech should evolve because this step alone doesn’t solve the problem. Regardless, It was either going to get created to do the right thing or the wrong thing. That’s just how it works. For now it’s use-case is positive.
This is painful even for him to sell this…my god. This is a problem. iCloud photos are now turned off for me.
dont jinx this to me dude, i just switched to icloud
Apple cannot call itself the privacy company anymore.
How about Google drive and Dropbox, they also scan for CP
@@starbutterflygaming8881 true, but they never really were known for their privacy stance, unlike Apple.
@@romakrelian thats apple stan right there
...do you people not know how to interpret English language? Why is there still confusion
@@drinkwoter or because nobody is ever safe when buying a phone
"customers own their phones for sure" They can't even repair them without going to Apple!
You own it, until you want to repair it ;)
I just did it today tho
if you repair, you’ll get a warning message in settings 🥲
You don't own an IPhone, you just use it.
I've been repairing Apple products for 2 years, and aside from battery replacements I wouldn't recommend that non-techie/unqualified/unconfident people do other things like replacing screens, Lightning ports, Face ID sensors etc.
This was a weak interview. Craig threw some big fancy words when asked to simply describe the system. No hard questions asked and this seemed more like a PR move than an interview
Basically a paid interview for pr purposes.
"Craig, tell us why it's okay to treat your customers as if they are guilty until proven innocent, and why you want to foist the system resources onto the users instead of your data centers..." That's what should have been asked.
@@KarstenJohansson because if it was checked at iCloud servers people would go oh no they are spying on us
@@IndexError They wouldn't say that when the check is done on their personal device?
@@ssud11 Yep, they also pay through future access to things, not just through money. So if they cover this in a way that Apple likes they get future access to news first because they're seen as trusted.
I appreciated that she conducted this fiercely straightforward interview (mostly a kind of interrogation) for the good of every Apple device user. Thank You ✌🏼
How do they perform the hashing? That's the interesting question here. It can definitely be abused to find other features inside the image. They store those hashes in their servers anyway.
A hash is a one-way algorithm: the data, in this case the image, is passed through it to produce a hash value, which is then compared against the list of hash values in the CSAM database. (Worth noting: Apple's NeuralHash is a perceptual hash, designed so that visually similar images produce the same value; with an ordinary cryptographic hash, even a small change in the contents alters the value drastically.) Either way, to anyone who looks at your stuff it's nothing more than a string of random-looking numbers.
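The comparison described above can be sketched in a few lines. This is a minimal illustration only: SHA-256 stands in for Apple's NeuralHash (a perceptual hash with very different properties), and the blocklist entry is a placeholder, not a real database value.

```python
import hashlib

# Placeholder blocklist; the real database holds NCMEC-derived hashes.
# This entry is just SHA-256 of b"test", included so the demo can match.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(data: bytes) -> str:
    # SHA-256 is a stand-in here; Apple's NeuralHash is perceptual,
    # so visually similar images map to the same value.
    return hashlib.sha256(data).hexdigest()

def matches_blocklist(data: bytes) -> bool:
    # The comparison yields only match / no match; an unmatched
    # photo's digest reveals nothing about its contents.
    return image_hash(data) in KNOWN_HASHES

print(matches_blocklist(b"test"))           # True (its digest is in the set)
print(matches_blocklist(b"holiday photo"))  # False
```

The point the commenters are circling: whoever sees only the digests learns nothing about unmatched images, but whoever controls the set of known hashes decides what counts as a match.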
While I applaud the CSAM implementation, the issue becomes how far reaching will this become? It's a slippery slope.
This question can and should not be asked to Apple directly but to the government and entities responsible for controlling data security. All companies have a similar or identical technology and unlike Apple they’ve been using it for decades now.
Same thought. I think this is what happens when legislation cannot keep up with how fast tech develops.
I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before and we only have their word now.
This was built for china to spy on dissidents
@@tiagomaqz other companies scan things on their cloud. Apple is scanning on your device AND the cloud
If this is allowed, what's stopping them from reporting your drug pics to the police? Wake up people
If drugs are illegal where you live, then why not? People doing illegal activities should be reported.
DRUGS ARE BAD MQWAYYYY
@@gobi817 You missed the point entirely. Also, simply having a picture of drugs is not illegal.
@@gobi817 Because you have a reasonable expectation of privacy on your personal cell phone and companies don't have the right to search and report your content to the police. They shouldn't be looking at your data beyond what is necessary to provide cell phone service. iCloud was marketed as a way to store your data, not a service to scan for and prevent illegal activity.
Just don’t upload your photos to Apple then? Also, I don’t think people send well known pictures of drugs to other people. Funnily enough, if Apple has a hash for your drug photo, this proves you didn’t take it yourself.
Bro, what did they use to train the models?
I really wonder what the testing phase for the algorithm looked like.
Oof
Still doesn’t hit on the real concerning issue
Wait until China asks them to quietly scan other photos...
Many of us understood exactly what this was from day one, this "talking down to" by Apple is gross. You don't control what's in the database and a government can change it from just CSAM to anything they want. Creating the backdoor is the problem.
Exactly. There was never any misunderstanding.
I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before and we only have their word now.
And you can just....you know not upload anything to the cloud....
Exactly, those who provide the hashes can change it to look for anything.
I disagree completely with calling this a "backdoor". Apple is not *entering* your phone to do anything, Apple is scanning what *you decide to send to them*. This is more of a bouncer than a backdoor.
Really good interview; he explained everything well. The point many people are missing is that cloud services already scan all the photographs you upload. Apple is just deploying a method that wouldn't scan all your photographs, but instead generates hashes that are checked against the CSAM database.
9:46 How much of the shelf life of a phone is lost this way? How hot will their phones get?
The second part reminds me of that Black Mirror episode. We are getting there.
Which one?
As a parent and security expert, I get that feature. The first one is the one I'm more curious about...
Which episode?
@@LuthandoMaqondo Arkangel
@@ShubhamKumar-xu2od Mm, nice one. It hadn't occurred to me, but agreed. This is the worry with technology: little by little, we're getting to that point.
They should have done this interview from the start, and the worry about future changes still stands.
Right? This makes the suspicion grow even more.
@@Eugenepanels yeah it’s really strange how they try to underplay this change in a way. They should have done a comprehensive press release from the start, considering how important this change is.
Dude the whole thing was leaked before they could properly present this. That’s why it’s causing problems, because it wasn’t officially presented by Apple.
@@TomorowGames as far as I know it wasn't leaked but released by Apple themselves via their newsroom, but I'll check if I'm wrong…
@@TomorowGames Yeah, exactly. It was leaked way before the proper launch and as a result there was a ton of false information and fearmongering.
So, just wondering if I'm getting this right: Apple isn't scanning our whole iCloud, just the photos we start uploading now that this update is out?
In a practical sense, Apple at least has good intentions by doing this. It's unarguably a good thing that they are planning on tracking down phones that happen to have child-abuse material on them. I do see why people are mad though. Apple has always had a long history of keeping information secure for its customers, and this seems like a slap in the face to those who use iPhone because of its security.
Because Apple has always been so vocal about privacy and not letting other apps track your data. And also cuz Apple has started to show ads on their platforms and is hiring people to create the kind of targeted ad network they have been opposing for so long, pushing out the whole competition. Google scans your data all the time and flags illegal stuff on gDrive, but it's not a big deal cuz they never said they won't do it, or that tracking personal data is a bad thing, like Apple has been saying.
7:05 Should have dug deeper here. The reference hashes belong to child pornography today; tomorrow some state might want to force Apple to add additional reference hashes, e.g., of Winnie the Pooh pictures. If too many Winnie the Pooh pics get uploaded to the cloud, we have our manual verification prompt, thus our backdoor. One could try to weaken the hashes, too, so they cover more pictures, prompting the manual verification on all kinds of pictures. In the end, you still need to trust Apple to only check for the hashes they tell you about. Not quite the advertised "you don't have to trust a single entity".
I’m so lost with why people are upset 🤷🏻♂️ So what if a manual verification prompt occurs if we have too many Winnie the Pooh picture? Are you saying that then Disney could then advertise to us more or something? Like apple aren’t gonna report you to the police for having Winnie the Pooh on your phone
@@UnkleRiceYo they will if you're in China. That's the point, slick. In China it's an unwritten rule not to have the photo referencing their leader as Winnie the Pooh, so they arrest people who do. Apple's software could easily be rolled out to match the picture and report people in China.
@@Chaser-mw1fb So Apple is to blame because of China’s unfair censorship laws? Also, there’s no indication whatsoever that they will be doing anything of the sort.
@@Dlawderek I believe you don't understand the issue. A country like China could say "hey Apple, in addition to CSAM, also scan for the following images when uploading to iCloud (e.g. Hong Kong freedom activism photos)." If Apple then goes "nah, we promised our customers not to do that," China could go "do it or you're no longer allowed to sell your products in China" (a huge market that brings a lot of revenue). It really isn't hard to understand. The problem is not what Apple is doing but the possibility of the misuse.
@@UnkleRiceYo did you bother trying to understand why this is an actual problem?
It sounds like “you are holding it wrong”
I was looking for this comment.
Me too
Someone is old enough to remember ;)
I think that Apple has “misunderstood” that I value my privacy more than the convenience their products and services can offer me.
They aren’t looking at your photos. The only people that should be worried about this are child predators… which may be telling of why you care so much.
@@cmtheone I’ve worked with law enforcement to put predators in jail before. It’s funny that you’re too dim to see how having your privacy tampered with in the name of the greater good isn’t concerning. Then again, you’re the ideal complacent sheeple that big companies and governments want us all to be. Enjoy your ignorance friend.
@@jackoryan292 Switch to Samsung brother 👍 I'd recommend the S21 great phone 👍 I love my iPad but come on man switch to Samsung brother 👍
@J0p4 google has been doing this for ages. As well as Microsoft. So if you’re going to use them for image cloud storage it’s even worse.
@lol what makes you think child predators will store their photos on their phones? Same idiocy as using gun registration to stop violent criminals from using guns to rob a bank.
It sounds like a blind raid without probable cause or a warrant; they can't see exactly what you have in your house as they rummage around, but they'll check anyway. It's either private or it's not.
I appreciate the tone and balance of this interview. Nice job. My biggest problem with these features is that Apple is assuming a moral position. Let me say I am 100% aligned on these behaviors being immoral/heinous. What concerns me is simply that they are taking a moral position. What happens when next month, it’s not child porn but “hate words” in iMessage? Hate defined however Silicon Valley defines it. Applying tech to moral subjects is a very slippery slope. To suggest they can’t or won’t misuse this kind of tech in the future is just ignorant/naive.
Legislation or court systems in other countries could easily add requirements to Apple’s scanning database. It’s hard to believe Apple executives could be this short-sighted about a technology. In order to save face Apple can simply say there are problems with the technology and shelve this for the time being.
I think CSAM and “hate words” are not even nearly in the same league. CSAM is illegal and demonstrably dangerous. The 1st amendment protects your “hate words” so I find it hard to believe that Apple would scan or flag this content. This is a “slippery slope” logical fallacy.
@@Dlawderek “Hateful content” like Nazi imagery is illegal in some European countries. What’s to stop governments from requiring to Apple to include that in the database of images they scan for?
@@davehugstrees Maybe they will. If they start censoring political speech by looking through people's images and reporting them, I would be mad. This is not that. If that day comes, we can all turn off our iCloud storage and/or get rid of our Apple products. I don't think outrage is justified in a case where they are taking very cautious steps to curb the storage of CSAM on their servers. It takes 30 instances of hashcodes matching known CSAM before there is an audit. Even if some photos are flagged mistakenly (which I understand to be very rare) it would never reach 30 by mere chance. Even if it did, I would not mind someone at Apple verifying that I have no illegal images in my iCloud. There shouldn't be anything here to worry about.
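The 30-match threshold mentioned above is the load-bearing part of Apple's false-positive argument: nothing is escalated for human review until matches accumulate. Here is a minimal sketch of that gating logic; the threshold value comes from Apple's public description, while the function and variable names are hypothetical.

```python
# Illustrative sketch of a match-count threshold: an account is only
# escalated for human review once matches against the known-CSAM hash
# list cross a threshold, so a stray false positive triggers nothing.
# The value 30 is from Apple's public description; the rest is made up.
THRESHOLD = 30

def should_escalate(matched_image_count: int) -> bool:
    """True only once enough independent matches have accumulated."""
    return matched_image_count >= THRESHOLD

print(should_escalate(1))    # False: a single stray match is ignored
print(should_escalate(29))   # False: still below the threshold
print(should_escalate(30))   # True: triggers the manual audit step
```

The point of the threshold is statistical: even if individual hash collisions happen occasionally, reaching 30 of them by chance on one account is astronomically unlikely.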
Anyone who says it will never be misused or increase in scope is lying to themselves.
We will still continue to "misunderstand" after this video explanation
This is the exact moment "misunderstand" becomes "defame."
@@samsonsoturian6013 NO. DON'T TOUCH MY PHONE. DON'T USE MY IPHONE'S COMPUTATIONAL POWER TO DO THE FIRST HALF OF THE WORK. NONE OF MY BUSINESS. I DON'T WANT TO BE INVOLVED.
He gave a vague answer. In the future Apple is planning to scan our entire phones. People like you still don't understand and still think Apple is god, that whatever they do is perfect. I feel bad for you, brother.
@@KaizenAction296 ok, conspiritard
"how do you know this is a nude image or a rocketship?" LOL top-tier questions!
Having a picture of a Blue Origin rocket... iPhone user: *nervous sweating*
How do I turn on the feature
5:36 "Human moderators". So basically private icloud content can be viewed by apple tech support moderators
It will probably be a highly specialized team who can do that, not just anyone at Apple, let alone tech support.
@@harsimranbansal5355 still a violation of privacy
@@blackhatson13 they are big tech companies… and they follow many rules and regulations; it isn't as simple as every employee over there being able to come and view our private iCloud photos…
Didn't Facebook also do this with its team of "human moderators"? I'm not saying that won't invade my privacy, but without those human moderators we could be seeing terrorism, porn, and other nasty things in our messages/chats
This has always been the case
"I THINK our customers own their phones" What a great vote of confidence......
for sure.
I was shocked he used that language. I’m guessing Craig, Tim and anyone else giving media interviews are demanding the questions upfront. Then Apple legal, corp comm and marketing can train the two of them with exactly what to say that will answer SOME questions, but not enough to commit to anything that could lead up to being used in a courtroom or in Congress against them.
Lol exactly
If the customers owned their phones, they'd be able to install software from wherever they wanted to obtain it. They'd also be able to replace the battery themselves, even if it meant buying a special tool for the job.
THINK DIFFERENT
This explained this better than most videos/articles I've seen on this. And I'm still not sure where I stand. Crime against kids needs to be fought, but..... I'm still nervous about future uses of this hash matching thing. Anyway, great interview, WSJ.
I like what you did there with the rocket ship reference
Apple is like a man who comes up to a lady during her shower, saying he will keep his eyes closed and just scan for security. People just don't believe it and don't buy it. The point is not "a safe way to scan phones." The point is "DON'T scan my phone." Don't means don't
But Apple doesn't scan your phone. I think you meant photos on the iCloud server.
Connection to your phone
I like your woman in the shower analogy, but you should have elaborated more on that story. Left me wonder what happens next. When can the man open his eyes?
Yeah a MAN, ofc it has to be a MAN
From Apple: "Could governments force Apple to add non-CSAM images to the hash list? -> Apple will refuse any such demands." So once again… you're missing the point. Yes, a government could force this… but "trust us." Why should we trust them? Who's the next leadership team? Apple needs to stop this now. I'm honestly considering breaking up with them for the first time in 30 years.
Do it, I know I am
Didn’t Apple refuse to unlock an iPhone to US federal government once for a crime case?
@@hsing-kaichen5062 they also gave in to China and put a data center in China for Chinese iCloud. So China just has to walk over to their iCloud data center in China, pull the physical data, and they have Chinese iPhone data. They already caved to China once. China will ask to add their own CSAM database; will you disagree then? And what's in that database? We won't know.
And go where? Analog? Pick your evil…
if you don't trust them, don't use iCloud Photos, and switch to a different cloud photo service 🤷🏼♂️
I think a hash was already being uploaded before this update, to check whether the uploaded file is still the same or got corrupted during upload. The hash is only a calculated "fingerprint" of the file. The only new thing is that this hash is now checked against a database.
Correct. Google, Facebook and Microsoft have been doing all this without encryption for years and no one bats an eye.
Thank you for covering this in a much more educated way than other media does.
7:45 what are the multiple levels of auditabililty? Will you seriously say “no” to China?
They've already said "yes" to China when they gave up their security keys to decrypt Chinese iCloud data. They're just going to fold again.
And will likely more yes to other governments
Even more than that, this whole thing probably started with China, because Huawei got banned. The CCP lost its surveillance tools and turned to Apple for an answer. What else could force Apple to suddenly launch such a contradictory program?
They don't provide encryption for phones sold in China and Saudi Arabia
@@mukamuka0 lol remember Apple is an American company. If anyone is asking them to do anything it’s the CIA
What timing, to drop this right after the *Pegasus* scandal, which is still unaddressed
Yeah they are very similar
What is that? Can you explain?
What? Wasn't it already patched?
@@tophan5146 watch rene ritchie’s video about it
@@kevinhernandezarango5005 never gonna be patched
I haven't used facetime but those shots of the stream on the computer look amazing, doesn't look like there's any lag at all.
I like your comment about cyber security.
@@gxlorp thanks, don't usually put two comments on a video but here we are I guess.
@@gxlorp I know you're being sarcastic but I actually have two comments so here's my other comment which is about cyber security: "Vatican confirms they have just bought a new installment of untraceable, Linux phones for all members"
Can banks scan/look at what's stored in your safe without a warrant? If not, why can tech companies scan our private files without a warrant?
You can just not upload it
Because you agree to it when you upload to cloud
“We’re not scanning your photos, you see, we’re scanning your photos.”
actually it's "we aren't scanning your photos on your phone, we are scanning your entire iCloud photo library." It's even worse
Actually they aren't scanning any files. They are creating an encrypted hash that is checked against their database of CP hashes. Hashes cannot be decoded, and the only way to identify one is to have a matching hash. Thus, the only data that is "revealed" in this process is CP data, which should be banned. This is not to say that I agree with what they are doing, or that I don't recognize the potential of what this may become as it relates to privacy, but the fundamental feature doesn't actually breach privacy unless the user uploads CP.
@@ohmyghost88 that is literally scanning
@@ohmyghost88 nice try Craig we know that’s you
Hashing isn’t scanning. The whole point of hashing is to efficiently store and retrieve data without scanning. The hash does not know the contents of the file, it just calculates a number (hash) that is used during transport to check if errors occurred (the checksum is calculated at the source and destination) and it needs to be sent again. Did you guys take a computer networking class or not?
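One clarification for this thread: the image hashes at issue are perceptual, not transport checksums, so visually similar images are designed to produce similar hashes. A toy "average hash" can illustrate the idea. This is only a sketch of the general technique; Apple's actual NeuralHash is a neural network, not this, and the pixel data below is made up.

```python
# Toy perceptual hash (average hash): each bit says whether a pixel is
# brighter than the image's mean. Small edits barely change the bits,
# so near-duplicate images stay close in Hamming distance, unlike a
# cryptographic hash, where any change scrambles the whole digest.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Number of differing bits; small distance = visually similar."""
    return sum(x != y for x, y in zip(a, b))

img = [[10, 200], [220, 30]]
tweaked = [[12, 198], [221, 29]]   # slightly edited copy
other = [[200, 10], [30, 220]]     # different image layout

h1, h2, h3 = average_hash(img), average_hash(tweaked), average_hash(other)
print(hamming(h1, h2))  # 0 -> the near-duplicate still matches
print(hamming(h1, h3))  # 4 -> clearly different image
```

This is also why "hashing isn't scanning" is debatable here: a perceptual hash does summarize image content, which is exactly what makes the matching work.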
I wish Joanna would have asked about the future "enhancement and expansion" of this thing, as Apple announced. Dystopian world we are about to live in.
Meanwhile Google has already been inhabiting that world for *years* now.
@@exiles_dot_tv indeed, the rest of big tech are dragging us all to that dark place.
I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before and we only have their word now.
Have you been asleep the last few decades or are you just a Microsoft/Google fan boy?
@@PedroLopezBeanEater I haven't and I'm not anyone's fan boy.
The sound bite seems to be correct
I don’t want my phone to use AI to scan my photos
Same
They actually cut the Apple campus out from any word Craig said which could be used to make memes! That means there has to be an agreement for this interview. I wonder if that includes other limitations as well since the interviewer didn’t pressure Apple that much. This feels more like Apple marketing than journalism
I thought similar. I noticed how this was cut too....like why did they have alternate camera angles for a meeting that took place on FaceTime?
I mean, what did you expect from Apple? They are one of the strictest companies and absolutely love controlling the narrative.
Given that this is an exclusive, this is most likely a way for Apple to take control of the situation. Most companies will only agree to these types of interviews if only certain questions are asked to control the narrative.
Literally everything you see in news these days is just propaganda. Journalism is dead
Time to make some memes with a huge Apple watermark out of pure spite
Another part of this issue is the idea of who owns the content. Regardless of where it is stored. If the police need a warrant to search a safety deposit box at a bank, shouldn't Apple need a warrant before searching photos? The idea of fiduciary duty and trust. If someone is purposely posting items to a public location by all means search away. But when photos are privately being stored in the cloud it feels very invasive.
I’d love to see what the new TOS are for iCloud once Apple implements this.
You're not privately storing them, though. You're storing them on Apple's iCloud servers, where Apple becomes responsible for any content you have on there.
@@crusherman2001 And that is the issue. When I store paper files in a safety deposit box, (1) the bank can't nose through my stuff and (2) the bank has the responsibility of keeping my files secure. I still own the documents. For all Apple's talk of privacy, this could be manipulated to be very Big Brother...
@@crusherman2001 If you have a reasonable expectation of privacy, they can't just go through your images to report them to the police. For example, if you pay to store your physical items at a storage place, they can't go through and search your stuff and report it to the police. Now, if they have a reason to think you are doing something illegal (smell of weed coming out, for example), they can report it to the police who will then need reasonable suspicion or a warrant to search your stuff. This proactively searching and reporting people to the authorities is not only a terrible invasion of privacy, but it's one that could create legal issues for innocent users.
With Dropbox, Google, and MS, this is happening now. Apple wants to safeguard iCloud; that's why they came up with this (complex) solution to avoid having to look at all your photos. The plan is to encrypt all parts of iCloud in a way that Apple cannot read. This solution is a counter-offer to the US's and EU's intrusive laws in the works for "child protection" (a scapegoat for sniffing through all our cloud data and communications).
"I think our customers own their phones, huh, for sure." Too bad his thinking is not reflecting what is really happening... #righttorepair p.s. this is not an interview, merely a communication from Apple...
Pretty sure she asked questions, that makes it an interview. And he answered those questions. No fuss about it
It's weird and totally against their "privacy stance". I turned off iCloud for photos already, but I doubt their snooping stops there.
"People have misunderstood" People are not stupid, we understand what you are doing and we have a problem with it
They are making it up to look at our private images 🤬
"journalism" = regurgitating what big companies tell you
Where are Global Human Rights activists ? Taliban is more violent than ISIS
Nothing to see here, move along, pleb
She is just like some Apple activist, protecting Apple at all costs
This is an important step in protection, but what is preventing them from adding hashes to their database that have nothing to do with child protection?
10:50 "I think our customers own their phone for sure" *Fights against Right to Repair* 😂
I am a developer, I agree with Craig on this tech or specific software implementation, but I am worried about what privacy really should be.
I'm also a developer, but I kind of don't agree with him. Actually, providing any information that can identify your data (which can in turn identify you through another process) is quite scary. That is not a backdoor YET, but it will surely become one when someone finds access to it (and we aren't even talking about deliberately using it). This is very innovative, but also very dangerous (as a developer or a client) because of the potential of this kind of technology. On the good side, it is awesome tech that could help improve other research fields :)
@@datahearth1738 Apple should really implement true zero trust. The feature is OK, but it's a matter of whom they trust. As a developer, I would never touch this kind of sensitive user data.
@@revtane9 Yup, right.
Nice of Apple to brush us off as "confused" or "misinformed". We're not "confused"; we simply don't want this on the iPhone. Joanna, I wish you would've asked him how easy it would be to simply replace CSAM with other illegal/copyrighted material. Even though Apple said they would "deny" if governments asked them to do that, why did Apple feel the need to encrypt iMessage? After all, they can just deny, right?
A major reason for Apple to encrypt iMessage was Privacy, not just security (aka hackers)
This is a carefully crafted PR piece wherein Apple explains that they are, indeed, scanning your photos. Nearly all Apple customers use iCloud.
You're right. But then, Apple is not above the law. So the idea here, that Apple hardens iCloud so that they themselves cannot read it or share it, is fine. So is the idea to flag incriminating content and review it in-house before informing the police. The alternative, with Google, Facebook, and MS, is that the NSA has full access.
"Who owns this phone?" "Well, customers do, but good luck running any software other than ours on it." Answers to moot questions keep average consumers misinformed.
You’ve obviously never jailbroken an iPhone
Yeah, and that's totally intentional.
This will not work if you upload to iCloud through Cryptomator first... And of course, you encrypt your DNS and add a VPN...
I do like Apple's attempt at blocking all the internet creeps, but at the cost of everyone's privacy. That's a big ask.
Hey Siri tell me a joke. Apple is privacy
Did you watch the video? Or did you just not understand how the system works? Yes, Apple scans iCloud images for child porn, but the scans on device stay on device, and Apple doesn't know what photos you have unless the neural hashes match the CSAM database when you upload to iCloud, and then only after further levels of review.
@@frappes_ What if the neural network matches photos of social activists and uploads them to CIA “accidentally”?
@@frappes_ And do you remember that PRISM was done in the name of anti-terrorism?
They could provide an option for these types of features.
This is the first time I've seen apple being so flustered in an interview. Smells fishy...
So he is saying data you upload to Apple is theirs to scan if they want.
10:06, did anyone else notice that he was being coached through an earpiece about what not to say 😏
Here is the issue: the US government cannot just go around, go through each device, and report what is on there. So neither should Apple, or any other private company for that matter. This is a much bigger issue than it is being presented as.
How are they doing multi-camera shots from a web call?
😂😂😂 This is edited, my friend
Yeah, but this is meant to give the impression that it's a 1:1 conversation. This feels staged.
Sounds convincing. But this is still a backdoor to expand for the govt.
Literally no, Google has been doing this for the past 10 years
Hashes can be made from photos, but a hash cannot be converted back into the photo. Apple does not see your photos when generating the hash.
@@akhileshjayaranjan5628 exactly
@@akhileshjayaranjan5628 If Apple cannot see your photos then what's even the point of this system? Algorithms are faulty and Apple admits that if this system flags something, there will have to be a human to double-check. And that's the big problem right there: they *can* check your photos. Who is to say that they or a local/US government agency wouldn't just check every photo instead of only the ones that have been flagged?
I definitely agree. Even Craig seems confused about what a backdoor is and why they're introducing one. I'd say "a degree of analysis is done on your device" is scanning on your device
So, at 7:05, the senior VP of software at Apple really doesn't understand what a backdoor is?
This seems pretty clearly about them not wanting liability, given the current rise in tech companies potentially being held responsible for content on their clouds/sites
Yes... it's satisfying to claim this is about child safety (which it might be)... but it's more likely for liability reasons.
False. CSAM scanning of cloud content is done by others apart from Apple; nothing new there, as it becomes a liability. What is new is that they will scan on-device data, and if hashes match, the iPhone will report to Apple and the authorities. Now, the database used for matching is not under Apple's jurisdiction and can be shifted to cover any kind of material the state deems "illegal" or against its interest. These two parts are not something a customer agreed to when paying for ownership of a device, not even when renting one.
Then they should scan using their own resources, just like Google et al do. He's just admitted that they don't want to pay for the resources when their customers can be hoodwinked into it.
What would you rather have: a human look at your photos, or an algorithm? I'm confused about the problem
If the code really respects privacy, why not open-source it for review?
"A multi part algorithm" aka a program "runs on your device". So yes they are running a program on you device to scan your photos. Lol!
Close, but no cigar. They are running a program that compares your photo HASHES (alphanumeric strings) to the hashes in their database.
Someone obviously didn't listen: only the hash gets created on your device; then it goes to iCloud to compare the hashes, which is NOT on your device.
@@izayahmartin so basically the iPhone is sending hashes to Apple's cloud in real time? Crazy
@Gc Vi it’s also sending the photos there. Who would have expected that of a cloud storage? Lol.
@@KP3droflxp Not sending the photos. The hash (a short fixed-length string of characters and numbers) is generated on your device and then sent to Apple's server. They never hold or see your photos.
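The flow debated in this thread (fingerprint on device, compare against a blocklist) can be sketched roughly as below. This is a simplified illustration using a plain SHA-256 digest; Apple's actual system uses a perceptual NeuralHash with cryptographic blinding (private set intersection), and the function names and hash entries here are hypothetical.

```python
import hashlib

# Hypothetical blocklist of known-bad image fingerprints (hex digests).
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """One-way digest: it cannot be reversed back into the image."""
    return hashlib.sha256(image_bytes).hexdigest()

def flagged(image_bytes: bytes) -> bool:
    """Only the fingerprint is compared; the image itself is never inspected."""
    return fingerprint(image_bytes) in KNOWN_BAD_HASHES

print(flagged(b"known-bad-image-bytes"))  # True: exact match in the blocklist
print(flagged(b"my-vacation-photo"))      # False: no match, nothing revealed
```

With a cryptographic hash like this, a non-matching photo reveals nothing; the privacy debate is about what happens once a perceptual variant, a server-side database, and human review are layered on top.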
Craig is being trotted out like Colin Powell was about the Iraq war
Yes!
I think that lady is just angry that she can’t keep watching her cp.
6:28 he said it will not scan images in your messages, but you can only send images in iMessage, and iMessage sends them via iCloud 🤷🏻♂️
Why does he keep bringing up "only applies when you upload to iCloud"? Every photo is automatically uploaded to iCloud, fool. What do you mean?
Who's the fool? It only uploads to iCloud if you tell it to. It's an option, and not on by default.
@@soylocomoco1162 If you don't have iCloud turned on, it incessantly pesters you throughout the day. They also made it very difficult to do physical backups of your phone the way you used to be able to.
@@DebraJohnson not true
Even if this feature remains harmless, the implications of future abuse of said feature are bad. Apple should stop this before it gets worse for their reputation.
You knew Google had been doing this since 2008, right? And, of course, all the other tech companies.
Yeah, but Apple is doing it differently. 😒
Two questions this interviewer should have asked:
1. If I don't use iCloud, will this have any effect on me?
2. If there are only 29 CSAM images, will there be no alert for that account?