Apple Undermines Its Famous Security 'For The Children'

from the this-is-dangerously-dumb dept

Apple is somewhat famous for its approach to security on its iPhones. Most famously, Apple went to court to fight the FBI’s demand that it effectively insert a backdoor into its on-phone encryption (by being able to force an update to the phone). Apple has tons of goodwill in the security community (and the public) because of that, though not in the law enforcement community. Unfortunately, it appears that Apple is throwing away much of that goodwill and has decided to undermine the security of its phones… “for the children” (of course).

This week, Apple announced what it refers to as “expanded protections for children.” The company has been receiving lots of pressure from certain corners (including law enforcement groups who hate encryption) claiming that its encryption was helping hide child sexual abuse material (CSAM) on phones (and in iCloud accounts). So Apple’s plan is to introduce what’s generally called “client-side scanning” to search for CSAM on phones, as well as a system that scans iCloud content for potentially problematic material. Apple claims that it’s doing this in a manner that is protective of privacy. And, to be fair, this clearly isn’t something that Apple rolled out willy-nilly without considering the trade-offs. It’s clear from Apple’s detailed explanations of the new “safety” features that it is trying to balance the competing interests at play here. And, obviously, stopping the abuse of children is an important goal.

The problem is that, even with all of the balancing Apple has done here, it has definitely moved down a very dangerous and very slippery slope toward using this approach for other things.

Apple’s brief description of its new offerings is as follows:

Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts. First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

Finally, updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.

Some of the initial concerns about these descriptions — including fears that, say, LGBTQ+ children might be outed to their parents — have been somewhat (though not entirely) alleviated with the more detailed explanation. But that doesn’t mean there aren’t still very serious concerns about how this plays out in practice and what this means for Apple’s security.

First, there’s the issue of client-side scanning. As an EFF post from 2019 explains, client-side scanning breaks end-to-end encryption. The EFF’s latest post about Apple’s announcement includes a quick description of how this introduces a backdoor (a rough code sketch of the general technique follows the excerpt below):

We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change. Take the example of India, where recently passed rules include dangerous requirements for platforms to identify the origins of messages and pre-screen content. New laws in Ethiopia requiring content takedowns of “misinformation” in 24 hours may apply to messaging services. And many other countries (often those with authoritarian governments) have passed similar laws. Apple’s changes would enable such screening, takedown, and reporting in its end-to-end messaging. The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers.

We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of “terrorist” content that companies can contribute to and access for the purpose of banning such content. The database, managed by the Global Internet Forum to Counter Terrorism (GIFCT), is troublingly without external oversight, despite calls from civil society. While it’s therefore impossible to know whether the database has overreached, we do know that platforms regularly flag critical content as “terrorism,” including documentation of violence and repression, counterspeech, art, and satire.
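To make the mechanism concrete, here’s a deliberately simplified sketch of what client-side scanning looks like in general. To be clear, this is not Apple’s implementation (Apple’s own documents describe a perceptual “NeuralHash” and a cryptographically blinded database); the sketch below uses exact SHA-256 matching and made-up function names purely to show where the trust problem lives:

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist handed to the device by some authority. In a real
# deployment this would be a database of perceptual hashes of known CSAM;
# nothing in the code constrains what else could be added to it.
BLOCKLIST: set[str] = set()

def file_digest(path: Path) -> str:
    """Exact-match fingerprint of a file. Real systems use perceptual hashes
    that survive resizing/re-encoding; SHA-256 is used here only to keep the
    sketch simple and runnable."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_before_send(path: Path) -> bool:
    """Runs on the device, before the content is end-to-end encrypted."""
    return file_digest(path) in BLOCKLIST

def send_message(path: Path, encrypt_and_send, report) -> None:
    """encrypt_and_send and report are hypothetical callbacks."""
    if scan_before_send(path):
        # This reporting channel sits entirely outside the E2E encryption:
        # whoever controls BLOCKLIST decides what gets reported.
        report(path)
    encrypt_and_send(path)
```

The scanning step sits outside the encryption entirely, so expanding what gets flagged is just a change to the blocklist (or the classifier), which is exactly the “external pressure” problem the EFF describes.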

It’s actually difficult to find any security experts who support Apple’s approach here; Alec Muffett summed it up in a single tweet.

This is the very slippery slope. If you somehow believe that governments won’t demand Apple cave on a wide variety of other types of content, you haven’t been paying attention. Of course, Apple can claim that it will stand strong against such demands, but now we’re back to being entirely dependent on trusting Apple.

As noted above, there were some initial concerns about the parent notifications. As EFF’s description notes, the rollout does include some level of consent by users before their parents are notified, but it’s still quite problematic (a rough sketch of the decision flow follows the excerpt below):

In these new processes, if an account held by a child under 13 wishes to send an image that the on-device machine learning classifier determines is a sexually explicit image, a notification will pop up, telling the under-13 child that their parent will be notified of this content. If the under-13 child still chooses to send the content, they have to accept that the “parent” will be notified, and the image will be irrevocably saved to the parental controls section of their phone for the parent to view later. For users between the ages of 13 and 17, a similar warning notification will pop up, though without the parental notification.

Similarly, if the under-13 child receives an image that iMessage deems to be “sexually explicit”, before being allowed to view the photo, a notification will pop up that tells the under-13 child that their parent will be notified that they are receiving a sexually explicit image. Again, if the under-13 user accepts the image, the parent is notified and the image is saved to the phone. Users between 13 and 17 years old will similarly receive a warning notification, but a notification about this action will not be sent to their parent’s device.

This means that if, for instance, a minor using an iPhone without these features turned on sends a photo to another minor who does have the features enabled, they do not receive a notification that iMessage considers their image to be “explicit” or that the recipient’s parent will be notified. The recipient’s parents will be informed of the content without the sender consenting to their involvement. Additionally, once sent or received, the “sexually explicit image” cannot be deleted from the under-13 user’s device.

Whether sending or receiving such content, the under-13 user has the option to decline without the parent being notified. Nevertheless, these notifications give the sense that Apple is watching over the user’s shoulder, and in the case of under-13s, that’s essentially what Apple has given parents the ability to do.

It is also important to note that Apple has chosen to use the notoriously difficult-to-audit technology of machine learning classifiers to determine what constitutes a sexually explicit image. We know from years of documentation and research that machine-learning technologies, used without human oversight, have a habit of wrongfully classifying content, including supposedly “sexually explicit” content. When blogging platform Tumblr instituted a filter for sexual content in 2018, it famously caught all sorts of other imagery in the net, including pictures of Pomeranian puppies, selfies of fully-clothed individuals, and more. Facebook’s attempts to police nudity have resulted in the removal of pictures of famous statues such as Copenhagen’s Little Mermaid. These filters have a history of chilling expression, and there’s plenty of reason to believe that Apple’s will do the same.
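For readers who want the branching spelled out, here is a hypothetical sketch of the notification flow as the EFF description above lays it out. The function and callback names are invented for illustration; this is an interpretation of the described behavior, not Apple’s code:

```python
from dataclasses import dataclass

@dataclass
class Account:
    age: int
    child_safety_enabled: bool  # the feature is opt-in per family account

def handle_flagged_image(account: Account, image, user_accepts: bool,
                         classifier_flags_explicit, warn_user,
                         notify_parent, save_to_parental_controls) -> str:
    """Decision flow for an image being sent or received, per the EFF
    description above. All callables passed in are hypothetical."""
    if not account.child_safety_enabled or not classifier_flags_explicit(image):
        return "deliver"                      # nothing special happens

    warn_user(image)                          # the child sees a warning first
    if not user_accepts:
        return "declined"                     # no parent is notified

    if account.age < 13:
        notify_parent(account)                # parent is told...
        save_to_parental_controls(image)      # ...and the image is retained
    # ages 13-17: warning only, no parental notification
    return "deliver"
```

Even written this plainly, the sharp edges EFF flags are visible: the sender’s device never warns the sender about the recipient’s settings, and the retained copy outlives the conversation.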

There remains a real risk of false positives in this kind of system. There’s a blog post, very much worth reading, explaining how automated matching technologies fail, often in catastrophic ways. You really need to read that entire post, as brief excerpts wouldn’t do it justice; as it notes, the risk of false positives here is very high, and the cost of such false positives can be catastrophic. Obviously, CSAM is also catastrophic, so there is a real challenge in balancing those interests, but there are legitimate concerns that the balance here isn’t properly calibrated.
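To see where false positives come from in this kind of system, consider how perceptual-hash matching generally works (again a generic sketch, not Apple’s NeuralHash or its parameters): each image is reduced to a short bit-string designed so that visually similar images get similar bit-strings, and a “match” is declared when the Hamming distance to a database entry falls below some threshold. The 64-bit hash size and threshold below are illustrative assumptions.

```python
from math import comb

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

def matches(image_hash: int, database: list[int], threshold: int = 10) -> bool:
    """Flag the image if it is 'close enough' to any known-bad hash.
    The threshold is an illustrative assumption, not Apple's parameter."""
    return any(hamming(image_hash, h) <= threshold for h in database)

# Near-matching is what makes the system robust to re-encoding, but it is
# also what creates collision risk: each database entry "claims" every hash
# within the threshold, i.e. sum(C(64, k) for k = 0..threshold) of them.
claimed = sum(comb(64, k) for k in range(11))
print(f"64-bit hashes within distance 10 of a single entry: {claimed:,}")
```

Loosen the threshold and unrelated images start colliding; tighten it and trivially edited copies slip through. None of this says Apple’s parameters are badly chosen, only that “matching” here is probabilistic, and the error rates depend on choices the public can’t audit.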

Obviously, Apple is trying to walk a fine line here. No one wants to be supporting CSAM distribution. But, once again, all the pressure on Apple feels like people blaming the tool for (serious) abuses by its users, and demanding a “solution” that opens up a very dangerous situation. If there were some way to guarantee that these technologies wouldn’t be abused or wouldn’t mess up, you could kind of see how this makes sense. But history has shown time and time again that neither of those things is really possible. Opening up this hole in Apple’s famous security means more demands are coming.

Companies: apple

Comments on “Apple Undermines Its Famous Security 'For The Children'”

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: There it is…

That’s only by our own choosing. It’s completely possible for governments to take a more nuanced approach that targets some aspects of CSAM, such as origination or commercial distribution, as crimes, while not going after individual possession with criminal charges or invasive technical measures. Instead, we’ve taken a problem which is insoluble by its very nature (“Stop all creation and sharing of abuse images by anyone, anywhere”) and used it to grant a blank check to law enforcement for very invasive measures that harm civil liberties and don’t stop CSAM.

Anonymous Coward says:

Re: Re: There it is…

I don’t know if I agree with your entire post, but I feel it might be good to move away a bit from surveillance or the police as the ultimate solution to every single problem.

Is it possible to get computer graphics to the point where you can create child porn without the children (artificial child porn)? I like the idea of getting someone to have sex with a robot (is it realistic enough to catch their attention?) rather than an actual child, too.

It might be possible to reduce all sorts of child abuse by reducing the number of covid lockdowns, as these have been strongly correlated with increased online child porn distribution (up 25%) and offline child abuse. But this will require people to actually get vaccinated, which I can’t believe is so hard to do in countries desperate to get out of this covid hell.

We are literally in a position where anti-vaxxers are happy to come up with every excuse as to why people shouldn’t get vaccinated, and these are usually the most "think of the children" sorts too. As it’s an international problem, we’d need to ramp up vaccinations globally, and supplies may be thin in many countries.

Anti-abuse campaigns could be more effective by explaining how molestation and rape are harmful to a child, how someone can fool themselves into thinking they’re doing "something minor", and how to avoid that pitfall.

They overwhelmingly focus on things being "immoral" or "illegal", and that mentality inevitably gets extended to deterring people from using any sort of artificial child porn (which might actually be counter-productive) by making it look like the state is watching them and ready to arrest them. It is the harm to the child which makes it bad, not simply that someone might find it offensive.

For grooming on social media, we’d have to run online safety campaigns and strengthen social supports, so that teens aren’t so desperate for emotional support online, where they might run into wily predators. Facebook is taking steps to make it harder for them to be exploited too.

Many exploited children are LGBT+. Reducing the amount of anti-LGBT hostility could make them a lot less vulnerable to predators, although some portions of society are very happy to exploit them to pass totalitarian laws, but very unwilling to actually help them.

Stopping child porn is an impossible problem, but we might be able to decrease it. Getting all crimes to zero should never be an absolute goal in a democratic society, as it is impossible to do so without turning this society into a totalitarian one, and that may still not be enough.

This comment has been deemed insightful by the community.
Manabi (profile) says:

Re: Re: There it is…

Most people don’t realize this, but that’s because the vast majority of the organizations pushing for stronger and stronger laws against "CSAM"¹ have as their root goal banning all pornography of all types. They use the "think of the children" angle to make it difficult for people to push back against the laws. Then once the laws are passed, and people start believing the arguments that merely viewing "CSAM"¹ will make people rape children, they can start arguing that regular pornography must be banned as well. Because clearly, if viewing "CSAM"¹ makes people rape children, viewing adult pornography will make people rape adults.

It’s insidious, because it’s very hard to argue against without being labeled as pro-child-sexual-abuse. And they simply are not happy no matter how restrictive the laws get. They want 2D drawings banned as "CSAM"¹, even in cases where the characters are 18+, but have "child-like" features. They’re very much like copyright extremists in that regard.

They’re anti-porn zealots, who don’t actually care all that much about actual children. If they did, they’d realize some of the things they insist on cause quite a bit of damage to children. Just ask the teens and preteens who’ve been charged with production of "CSAM"¹ for sending nude photos of themselves to another preteen/teen. Or ask any of the children who get sexually abused by a family member, while most of society is focused on "stranger danger," because that’s scarier and easier to use to get bad laws passed.

¹ The above is why I dislike the term CSAM. To the NCMEC, simple nude pictures of children are CSAM. Nude photos are simply not automatically child sexual abuse, as any parent who’s taken a photo of their toddler in the bath can tell you. When you read an article about someone being busted for having hundreds of photos of "CSAM" on their computer, most of those are probably nude photos, not actual sexual activity. It lets them make things sound far worse than they really are.

Anonymous Coward says:

Re: Re: Re: There it is…

NCMEC also parrots the idiot idea, darling of legislators everywhere, that end-to-end encryption should be compromised by law enforcement backdoors: https://www.missingkids.org/theissues/end-to-end-encryption

Which I guess makes sense given their project here with Apple.

The whole organization is sketchy as hell: it’s privately operated but was created by Congress and receives government funding.

Scary Devil Monastery (profile) says:

Re: Re: Re: There it is…

"…the vast majority of the organizations pushing for stronger and stronger laws against "CSAM"¹ have as their root goal banning all pornography of all types."

…because the most vocal such organizations are all religious in nature. Cynically put, they use "For the children" as a wedge to make shit harder for teenagers having premarital sex.

Quite a few such organizations once started as credible anti-abuse NGOs focused on stopping sex trafficking and sex tourism to countries where underage prostitution was common (Thailand, notably), but these organizations were rapidly infiltrated and taken over by the religious right, which, bluntly put, doesn’t give a single solitary fuck about the welfare of endangered children but certainly goes the distance to object to people of any age humping one another for any purpose other than procreation within the bounds of wedlock.

Hence why some countries, surprisingly enough my own, have an age limit for what counts as CSAM which is higher than the actual age of consent, making it fully legal for a 50-year-old and a 16-year-old to get it on, but god help the two 17-year-olds sexting each other.

…which neatly segues into how, in the most religious US states, child marriages are still legal and under no particular threat from the religious community so bent on banning sex, making it possible to marry the criminally young but not for two teens to be a couple.

This comment has been deemed insightful by the community.
anon says:

Re: What about Signal?

No, Signal isn’t compromised. Apple’s iOS is compromised. This is worse because you can’t just download a different-but-equally-secure application to duplicate its functionality. Especially if there’s a ‘match everything’ hash in that collection…

Manabi (profile) says:

Re: Re: What about Signal?

The worst part of this is the machine-learning scanning for sexual images, and that is limited to iOS’s Messages app. Teens/preteens wanting to sext without Apple’s nanny-state scanning can simply use another app. Seems pretty pointless to me, as that’s exactly what teens/preteens will do.

The CSAM detection part scans against a database of known CSAM maintained by the NCMEC. It won’t flag the photo a parent takes of their toddler playing in the bath, but the fact that people are assuming it will should worry Apple. People are assuming the worst, and this is going to be a PR nightmare for them because of it.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re: Re:3 What about Signal?

For real secure communications you need to go to an offline system for encryption and decryption, along with a safe way of copying encrypted files to and from the online machines. The latter is possible by using an Arduino as an SD card reader and writer. It’s the usual problem: you can have security or you can have convenience.

This comment has been deemed insightful by the community.
Anonymous Coward says:

No one wants to be supporting CSAM distribution.

This is obviously false; if true, there would be no need to use technology to stop it.

Obviously, CSAM is also catastrophic

That’s not obvious at all. Child abuse is certainly harmful, with sexual abuse being a subset of that, but we’re talking about pictures of abuse (along with entirely non-abusive pictures, e.g. those taken and sent willingly). Not good, but "catastrophic" is hyperbole; is it so much worse than photos of other crimes, such as murders or beatings, which are perfectly legal to possess?

The overreactions to these pictures are certainly harmful, including to children.

nasch (profile) says:

Re: Re:

is it so much worse than photos of other crimes, such as murders or beatings, which are perfectly legal to possess?

If people are committing murders or beatings to sell the photos of them then no. If not, then that’s the difference. I’m not saying the way we’re tackling it is the right way, but the focus on the photos and videos is because in some cases the abuse is being done in order to produce the photos and videos.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

'Solving' a sliver by removing the arm

The encryption issues are bad enough but one line stuck out to me as an indication that they really did not think this one through:

In these new processes, if an account held by a child under 13 wishes to send an image that the on-device machine learning classifier determines is a sexually explicit image, a notification will pop up, telling the under-13 child that their parent will be notified of this content. If the under-13 child still chooses to send the content, they have to accept that the “parent” will be notified, and the image will be irrevocably saved to the parental controls section of their phone for the parent to view later.

The software flags content of under-13 children as sexually explicit and, rather than block it, saves it to the device. There are just so many ways that can and will go horribly wrong, from prosecutors bringing charges for knowing possession of CSAM (before anyone chimes in to say that would never happen, teenagers have been charged for creating/possessing CSAM of themselves) to someone else getting access to those photos, whether from a used phone that wasn’t wiped or simply from loaning it to someone.

If the goal was to reduce CSAM then this very much does not seem the proper way to go about it.

This comment has been flagged by the community.

Koby (profile) says:

Re: 'Solving' a sliver by removing the arm

This might open an avenue to subversion. Get a burner phone, acquire some prohibited material, and send it to a judge or prosecutor. Someone else could then "leak" some rumor about the subject. Lo and behold, the proof is on their device. If it is demonstrated that their system is untrustworthy, then perhaps the manufacturer will decide to discontinue it.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re: 'Solving' a sliver by removing the arm

Get a burner phone, acquire some prohibited material, and send it to a judge or prosecutor. Someone else could then "leak" some rumor about the subject. Lo and behold, the proof is on their device.

Why do people like you come up with such crazy ideas in order to fuck somebody over?

It’s like you are mad at life and spend all your waking hours trying to find new and creative ways to make somebody else’s life just as miserable as your own.

The sooner you fuck off, the better off everybody else will be.

Scary Devil Monastery (profile) says:

Re: Re: Re: 'Solving' a sliver by removing the arm

"Why do people like you come up with such crazy ideas in order to fuck somebody over."

That One Guy is correct in his assessment, which means Koby’s just taking it from there.

I don’t usually spring to Koby’s defense given his penchant for lying through his teeth concerning free speech issues, but in this case he’s right. If it is possible to use existing law to drop wholly innocent people – children and parents alike – into some utterly horrifying legal nightmare, then there will be ten thousand trolls to whom the idea of hurting other people precisely this way is a sexual fetish in itself.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: Re: 'Solving' a sliver by removing the arm

Yes, engaging in what I’m sure would be multiple felonies in a way that would garner a lot of government attention would definitely be the smart and proper way to point out why this move is a bad one, brilliant idea there.

Scary Devil Monastery (profile) says:

Re: Re: Re: 'Solving' a sliver by removing the arm

Unfortunately he’s probably correct. A number of the friends he keeps advocating for are among the people who would gladly and with gay abandon set out to drop innocent people into horrifying legal messes. Specifically targeting BLM and ACLU advocates, liberals, transgender activists and such.

This comment has been deemed insightful by the community.
That Anonymous Coward (profile) says:

Re: 'Solving' a sliver by removing the arm

And here it was pitched in other media as them comparing things uploaded to iCloud to the list of known CP.
Then they would be reviewed by a human.
Does that mean that Apple’s secret viewers would then be in possession of CP?

Oh look, a shitshow to get some good PR that will definitely not completely undermine our privacy & security and make uploading a Winnie the Pooh picture in China an arrestable offense… oh wait.. it is.

Sorry, I fscking refuse to use the longer & longer name.
Google George Carlin’s bit about shell shock.
We keep giving things longer and longer names & it takes away some of the horror we should be feeling about horrific things.
You wouldn’t call rape "non-consensual coitus"; that makes it sound like something different & maybe not as bad, but we make things longer & more polite so some people’s sensibilities aren’t offended.
Kids were exploited, photographed, raped; how the fsck does "CSAM" convey what an actual horror that is?

"a used phone that wasn’t wiped or simply loaning it to someone."
Or taking it into the Apple store only to discover an employee sent himself all your pics & contact details and then was hitting on you?

This comment has been deemed insightful by the community.
Manabi (profile) says:

Re: Re: 'Solving' a sliver by removing the arm

We know that Google & Facebook hire people to review reported images & videos already, and that those people have a high turnover rate and develop psychological problems because of the images they’re required to view. (Not just child pornography but graphic and brutal torture, gore, etc.) Apple probably does as well for iCloud, but they’ll have to hire a lot more if they go through with this.

Also it should be pointed out that most of the people hired to do this are contractors, not direct employees, and don’t get the nice benefits normal employees get. Often that includes not having health insurance through the job at all, as well as being paid close to minimum wage, so they can’t even afford to get psychological help to deal with the things the job requires them to view daily. This will help no children but harm a lot of adults.

I’m with you on not calling it CSAM, but child pornography’s often a misnomer as well. The NCMEC database contains a LOT of photos that are simply nudes of children not engaging in sexual activity. That’s not what people think of when they hear child pornography, so renaming it to "child sexual abuse material" was really uncalled for. They’re deliberately trying to make it sound worse than it often is. Whenever you hear that someone busted for having CP on their computer had "hundreds" or "thousands" of images, probably less than 10% of those are what people actually think of as pornographic, much less "abuse material."

And let’s not forget they want 2D drawings banned as CP as well. Those don’t involve any actual children, so no children are possibly harmed by them.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Private corporation w 1A Rights to not associate w child por

If you can’t see the difference between controlling what appears on a public noticeboard and rummaging through your house for things someone does not like, you have serious problems in understanding people’s rights.

This comment has been deemed funny by the community.
Anonymous Coward says:

Apple, Google, Facebook and their ilk are clearly not the ones at fault here. It is time we face the fact of who our true enemies are: The Children. Techdirt, since time immemorial, has hinted at their ungodly powers to sway the will of the most powerful corporations and governments. We need to stop them.

Personally, I have never seen one of these little fuckers so I have no idea how we can defeat them but we have to try.

Because if we don’t, then… The Children have already won.

This comment has been deemed insightful by the community.
Snodoubt (profile) says:

New iCloud ad coming soon - “now with 100% more rainbow table

Phew! I was worried they were going to undermine my phone’s security. It’s nice to know that they just need to hash every word and sentence combo in the world and then they can decrypt my e2e encrypted iMessages if the hashes match. Since they have been working on this for years, I’ll assume that’s already taken care of.

Anonymous Coward says:

This is a bullshit argument. This scanning technology already exists, whether or not it’s being used. And if any government decides they want Apple to implement this for nefarious purposes, they can do that today, whether or not it’s already in use. And in that case, Apple has a decision to make; note that they already threatened to pull out of the UK market due to a possible government edict. And no one (other than the usual army of Apple haters) could argue with a straight face that Apple itself would do this for anything other than a good cause.

This comment has been deemed insightful by the community.
PerfectBlend says:

Shoutout & slippery slope

A few years ago, TechDirt was mentioned @GOGPodcast, and I followed TechDirt. I was a bit surprised. Who is this “Mike dude” making sense all the time? So, I started listening to the podcast. I cannot remember a single episode I disagreed with. I’ve been a long-time EFF supporter. Not because I think the EU is so much better, but good stuff and bad stuff radiates to the EU (and vice versa).

So, the nonsense has moved from the internet to my backyard. First the ridiculous copyright act and now Apple. We’re all sympathetic to fighting child abuse. Often forgotten: the abuse usually comes from people known to the child. The unpleasant truth is, once you start to mess with security (and privacy in private spaces), the ship has sailed.

My cynical tweet “It takes one gag order” (to change this thingy into a nightmare) isn’t so far from the truth. We all know where this starts, nobody knows where it will end. Let’s hope it doesn’t end with the “legal but inappropriate” proposal our UK friends are trying to avoid.

This comment has been deemed insightful by the community.
scotts13 (profile) says:

Last straw

I’ve worked for and with Apple since 1983. My house is full of their stuff to this day. They’ve done some things I didn’t agree with, but overall I overwhelmingly supported the company. This surveillance is 100% the opposite of everything I thought they stood for. It rolls out, and I’m done.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Hasn’t Apple, by definition, caved, just by doing this? We should trust Tim Cook, who funnels Apple’s profits offshore to evade taxes? Who changes cable standards every few years to increase revenues, then pulls cables to save pennies? Cook would sell his mother if it reduced costs, which is why we’re in this mess, because they went to China.

And haven’t they also caved to demands by China, because, well, China could pretty much shut down their entire business? All your data is stored on Chinese servers, if you are Chinese.

I buy Apple products because of the "implied" higher standard of security. It may or may not actually exist, or not as well as I’d like, but I trust them more than, say, Amazon with my information. Thing is, the competitors are almost all cheaper, and if there’s just a distinction without a difference then, well, I might as well save some money.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re:

This is honestly bordering on QAnon level shit.

I also resent any argument set up to make opposition to it appear inherently morally repugnant. I have been called pro-criminal far too many times for my opinions on the government to find the least bit of good faith in that strategy. If that’s all they can come up with they have nothing.

This comment has been flagged by the community.

Anonymous Coward says:

The best part about techdirt articles is reading all the outraged, pearl-clutching comments.

Anyways, doesn’t Apple have like two billion iOS devices out there? How are they going to manage it all? Is this just for certain countries? I mean, in some places, like Afghanistan, they marry off girls as young as 11!

I foresee a great big reversal of this in less than two years.

The stupidity of this whole scheme just screams "invented at Apple," probably by its legal dept. Most of Apple’s innovations have been other people’s ideas.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Isn’t what they are doing a first step towards content moderation at the OS level? It was obvious that sooner or later they would find a way to make it accepted by the public. It’s all about the narrative spin. Maybe an alternative will come out – I am thinking about a Raspberry Pi device or something similar that uses the phone just as a dongle.

This comment has been deemed funny by the community.
Anonymous Coward says:

What about all the other contributors?

So what are the various automobile companies doing to make sure that they’re not somehow a part of distribution… obviously, some sort of transportation took place for that phone to make it to the end user… and they might drive somewhere to take those CSAM pictures.
And how about the power companies? Are they monitoring what’s happening with all that electricity? Certainly there’d be less CSAM if there were no electricity for it!
And all grocery stores and restaurants allowing people to eat there without making sure there’s no CSAM happening… otherwise, they’re providing nourishment for this to continue.

Oh the humanity!

n00bdragon (profile) says:

And, obviously, stopping the abuse of children is an important goal.

At the risk of sounding like a child-molesting nogoodnik: How important is this, really? Has there ever been any attempt to quantify, in numerical terms, how many children are sexually abused via iCloud? From the way Apple’s press release is worded, you would think that iCloud was a digital cornucopia of child abuse photographs that America’s legions of predators took for some reason. Saying something like "We just don’t know how many abuse images there could be" is like saying there is just no way of telling how much terrorism is planned out via the English language. How many actual real pedophiles is this change going to reel in? Can anyone take even an order-of-magnitude guess?
