Apple is going to scan the photos on your phone for child abuse pictures

Some analysis on what Apple has announced.

https://www.eff.org/deeplinks/2021/...t-encryption-opens-backdoor-your-private-life

Child exploitation is a serious problem, and Apple isn't the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.

To say that we are disappointed by Apple’s plans is an understatement. Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again. Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.
 
A few things here.

Apple is not looking at your pictures here. They aren't looking for pictures of your naked kids or anything that could be mistaken for child pornography. They are taking a hash of your picture and comparing it to the hashes of known illegal content. The whole point of a hash (well, one of them at least) is the ability to compare two things without any knowledge of what those two things are. This is how entering your password on any properly designed website works: the site never knows your password, it only knows the digest that is created when the password is hashed (hopefully after being salted and run through multiple rounds). That is what is happening here.
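The password analogy can be sketched in a few lines of Python. This is a minimal illustration of salted, iterated hashing and digest comparison, not Apple's actual scheme; the password, salt size, and iteration count are arbitrary:

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes) -> bytes:
    # Salt the password and run it through many rounds of SHA-256 (PBKDF2).
    # The site stores only this digest, never the password itself.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt = os.urandom(16)
stored_digest = hash_password("hunter2", salt)  # saved at signup

# At login, hash the submitted password the same way and compare digests.
# Neither side of the comparison reveals the underlying password.
attempt = hash_password("hunter2", salt)
print(hmac.compare_digest(stored_digest, attempt))  # True for a match
```

`hmac.compare_digest` is used instead of `==` because it compares in constant time, which avoids leaking information through timing.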

This is the price you pay for storing your photos in iCloud. iCloud isn't really private because Apple controls the keys. The same is true for iMessage. If you really want true security and privacy, meaning only you have access, you need to manage the encryption, transmission, and storage yourself or use dedicated software for that. It is possible but not convenient and most people won't do it.

The story is a little unclear about the client-side scanning. It says Apple only scans what is going up to iCloud, but if the phone is doing the heavy lifting anyway, I'm not sure that is really true. If you were only scanning uploads, it would make much more sense to do the hashing on the server side, so I don't really trust that claim.

Saying you are only using this for child safety is just an easy way to silence the privacy advocates. While hash collisions are rare, they aren't non-existent, and who controls the hashes? If you show me the hash of an illegal photo, I have no way to verify it; I have to take your word for it. That is the entire point of a hash: you can compare two things without knowledge of either. Eventually you may be cleared in court, but by then your life could be ruined. This also assumes you are in the US. What is to stop less democratic governments from using this to silence dissidents?

This is a bad idea, but we have already given up so much of our privacy for convenience that the cat is out of the bag.

This is also not really a back door. This is a front door that Apple is sharing the keys for.
 
I am 100% for stopping such activities and feel like it is somewhat of a good idea to have a way to find offenders. I have nothing to hide, and they would be very bored with my camera roll. That said, I do find it a bit creepy that a company would do that.
Oddly enough, when I worked for the state as a Family Case Manager, we used iPhones to take pictures of abuse to be used as evidence. Apple would be flagging the state of Indiana big time if they are still issuing iPhones to their caseworkers.
 
Not sure if it's been mentioned but they're talking specifically about child sexual abuse ...not skinned knees
 


Almost everyone can get behind trying to stop child sexual abuse/porn, so the invasion of privacy is approved for that purpose only, but the door is now open and the slope is slippery.

As much as I want child sexual abuse/porn detected and stopped, I think this is a very bad idea.
 
You really think they are interested in our boring lives? I tend to err on the side of privacy; I just wish people would argue about it in a more level way. They don't give two hoots about our everyday photos, so what's the point of arguing the whole "it's just an excuse to..." viewpoint? It tends to turn people off to the privacy argument, IMO.
I'm not so sure. I think they want any information about us that will (in their minds) help to sell us a product. Sort of like mentioning fast food online and then getting ads for McDonalds, Wendy's, etc. As much as people document their lives now through pictures (both posted online and not), I bet you there's more information about you than you think.

Now, ok, they SAY all they're doing is comparing "hashes" (whatever those are). But if they have permission to examine your pictures, how does the way they examine them change privacy concerns?
 


The other side of this, is that Apple has made a huge push in favor of user privacy ...like no other tech company. To the point where Google and Facebook are fuming because of the impact to their advertising revenue.
 
I'm not so sure. I think they want any information about us that will (in their minds) help to sell us a product. Sort of like mentioning fast food online and then getting ads for McDonalds, Wendy's, etc. As much as people document their lives now through pictures (both posted online and not), I bet you there's more information about you than you think.

Now, ok, they SAY all they're doing is comparing "hashes" (whatever those are). But if they have permission to examine your pictures, how does the way they examine them change privacy concerns?
You're free to think they actually care about your photos looking for ad revenue but personally I think that's outlandish, no offense meant (haven't had my coffee yet :flower3: ).

At least from what they've released "images are transformed into unique numbers that correspond to that image."

Furthermore "Before an image is stored in Apple's iCloud, Apple matches the image's hash against a database of hashes provided by National Center for Missing and Exploited Children (NCMEC). Apple will only be able to review images that match content that's already known and reported to these databases"

Even prior to the above being explained no I don't personally think they care enough about your photos for a purpose of ad revenue.
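In spirit, the pre-upload check described in that quote looks something like the sketch below. To be clear about assumptions: Apple's real system uses a perceptual hash ("NeuralHash") and a private set intersection protocol, not a plain SHA-256 set lookup, and the blocklist here is made up (it contains the well-known SHA-256 of empty input just so the example has a match):

```python
import hashlib

# Hypothetical blocklist of hex digests distributed to the device.
# In the real system the list comes from NCMEC and is not inspectable.
known_bad_hashes = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_hash(data: bytes) -> str:
    # Stand-in for Apple's perceptual hash: here, just SHA-256 of the bytes.
    return hashlib.sha256(data).hexdigest()

def check_before_upload(data: bytes) -> bool:
    # True means the photo matched known, catalogued content.
    return image_hash(data) in known_bad_hashes

print(check_before_upload(b""))                      # True: digest is on the list
print(check_before_upload(b"vacation photo bytes"))  # False: no match
```

One difference worth noting: a cryptographic hash like SHA-256 changes completely if a single byte changes, which is why the real system uses a perceptual hash that survives resizing and recompression.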
 
I am really surprised that people are so willing to give up their civil liberties, when we all know that people who abuse their children do not take pictures of the abuse and keep it on their phone. This is an excuse for Apple (and whomever else) to look through people’s private photographs, as if they are coming into your home and looking through the old box of photos you have in your closet. This is completely illegal and tramples upon our basic right to privacy. If people do not start waking up and claiming their rights, we will all be in a lot of trouble, if we are not already.
This is to catch and prosecute those who produce and purchase child pornography! These folks do not deserve privacy!
 
when we all know that people who abuse their children do not take pictures of the abuse and keep it on their phone



To the quoted part: WHAT!!??!! It is known that people are sex trafficking their own children. I don't doubt for one minute they are selling/sharing pictures too. I think last month I read of a mom who gave consent for her 10-year-old daughter to be used for sex in exchange for drugs. Our world is very sick and twisted at times.

Do I like that my phone is being looked at? Not really. Do I think it could be a slippery slope? Very much so. But do I also feel that this is to help innocent children stop being sexually abused and trafficked? YES, yes, I do!

After 9/11, when they revealed they could tap our phones and listen in, I said then (and I'll say now) you'd be really bored listening to and looking at my life. Now I know for certain they can look at my pictures in iCloud. So can Shutterfly. I still willingly keep them there. It doesn't bother me because I know they are doing it, and I am giving consent. I always figured if they wanted to get it they could anyway. Nothing on the internet is lockbox-secure. If you don't approve, then store your pictures a different way.

I get it, it feels invasive. I see it too. And like I said previously it is a very slippery slope. But right now I am ok with this as sex trafficking and child sexual abuse is abhorrent and sadly more prevalent than we truly grasp.
 
But right now I am ok with this as sex trafficking and child sexual abuse is abhorrent and sadly more prevalent than we truly grasp.
(To the part I underlined):

It's not something we like to think about but my metro (largely the other side of the state line but it happens on my side too) seems to be a sex trafficking area (among dog fighting and pet kidnapping sadly where they steal pets :( )

In June they found 31 victims (14 were missing children, the youngest being four years old, and 17 were adults). There were 82 arrests made out of it just between June 17th and 26th. One of the cities is outside the metro, about 2 1/2 hours west of me, but the other two cities were in my metro on the other side of the state line. The arrests included charges of "soliciting prostitution, commercial sex trafficking, sodomy, narcotics violations, felony assault on a police officer, sex offender registry violations and outstanding warrants."

On my side of the state line, it seems to be more "massage parlors" that have been discovered in the past; it's one of the reasons I've said before that you'd be surprised what nice neighborhoods and areas have.

It's honestly sickening all of it.
 
I'm confused over the resistance, there is no right to privacy when breaking the law.

It's been illegal to take suspicious photos of kids my whole lifetime, and the photo-shack developers back in the day were obligated to report you for this crime, so this is in no way new. The fact that we now take photos on a personal device is not an absolution of the crime, and I'm thrilled that items can be pulled for review instead of allowing these monsters to skulk in the shadows. If machine learning / AI can end this plague by pointing a neon light at a monstrous freak for more scrutiny, then I'm all for it. When we opt into these devices we opt into the categories they use (my Amazon Photos does it clearly), and I suspect it's these machine-monitored categories that flag human intervention, sort of like a traffic camera, and those are already everywhere with legal precedent. A machine is not invading privacy any more than the old Xerox did when it made copies; the machine is doing its job, so there is no invasion until the point a human steps in, and at that point the category could itself be probable cause (best guess).

These monsters prey on and trample the rights of those who can't fight back, it makes their pathetic selves feel powerful & that is the smallest weakest sort of person imaginable, so I'm not going to support any freak nests seeking to protect themselves, not at all.
 
The other side of this, is that Apple has made a huge push in favor of user privacy ...like no other tech company. To the point where Google and Facebook are fuming because of the impact to their advertising revenue.
Obviously lip service at this point. Companies change all the time.
 
I'm all for stopping child abuse, but this seems like a very Big Brother act that could be applied to so many other situations where control is desired.

Isn't the term Big Brother only accurate if you are talking about a government agency that has the ability to exert power and control over individuals? A private company has no such power and people have the freedom to use or not use their products.
 
Isn't the term Big Brother only accurate if you are talking about a government agency that has the ability to exert power and control over individuals? A private company has no such power and people have the freedom to use or not use their products.
The database being matched against was created and is maintained by the National Center for Missing and Exploited Children (NCMEC), which was established by Congress and is largely funded by the federal government.

The government maintains control over what hashes are stored in the database. Some of the analysis articles on the subject have pointed out just how easy it would be to insert hashes that matched other criteria found in images and Apple and the public have no insight into what image that hash represents.
 
The database being matched against was created and is maintained by the National Center for Missing and Exploited Children (NCMEC), which was established by Congress and is largely funded by the federal government.

The government maintains control over what hashes are stored in the database. Some of the analysis articles on the subject have pointed out just how easy it would be to insert hashes that matched other criteria found in images and Apple and the public have no insight into what image that hash represents.
This is something I read: "the system is built so that it only works and only can work with images cataloged by NCMEC or other child safety organizations, and the way it built the cryptography prevents it from being used for other purposes. Apple can't add additional hashes to the database, it said. Apple said that it is presenting its system to cryptography experts to certify that it can detect illegal child exploitation images without compromising user privacy."
 
