{"id":18712,"date":"2021-08-30T11:10:13","date_gmt":"2021-08-30T15:10:13","guid":{"rendered":"https:\/\/me-en.kaspersky.com\/blog\/what-is-apple-csam-detection\/18712\/"},"modified":"2021-09-30T19:42:31","modified_gmt":"2021-09-30T15:42:31","slug":"what-is-apple-csam-detection","status":"publish","type":"post","link":"https:\/\/me-en.kaspersky.com\/blog\/what-is-apple-csam-detection\/18712\/","title":{"rendered":"How Apple plans to monitor users"},"content":{"rendered":"<p>In early August 2021, Apple <a href=\"https:\/\/www.theverge.com\/2021\/8\/5\/22611721\/apple-csam-child-abuse-scanning-hash-system-ncmec\" target=\"_blank\" rel=\"nofollow noopener\">unveiled its new system<\/a> for identifying photos containing images of child abuse. Although Apple\u2019s motives \u2014 combating the dissemination of child pornography \u2014 seem indisputably well-intentioned, the announcement immediately came under fire.<\/p>\n<p>Apple has long cultivated an image of itself as a device maker that cares about user privacy. New features anticipated for iOS 15 and iPadOS 15 have already dealt a serious blow to that reputation, but the company is not backing down. Here\u2019s what happened and how it will affect average users of iPhones and iPads.<\/p>\n<h2>What is CSAM Detection?<\/h2>\n<p>Apple\u2019s plans are outlined <a href=\"https:\/\/www.apple.com\/child-safety\/\" target=\"_blank\" rel=\"nofollow noopener\">on the company\u2019s website<\/a>. The company developed a system called CSAM Detection, which searches users\u2019 devices for \u201cchild sexual abuse material,\u201d also known as CSAM.<\/p>\n<p>Although \u201cchild pornography\u201d is synonymous with CSAM, the National Center for Missing and Exploited Children (<a href=\"https:\/\/www.missingkids.org\/HOME\" target=\"_blank\" rel=\"nofollow noopener\">NCMEC<\/a>), which helps find and rescue missing and exploited children in the United States, considers \u201cCSAM\u201d the more appropriate term. NCMEC provides Apple and other technology firms with information on known CSAM images.<\/p>\n<p>Apple introduced CSAM Detection along with several other features that expand parental controls on Apple mobile devices. For example, parents will receive a notification if someone sends their child a sexually explicit photo in Apple Messages.<\/p>\n<p>The simultaneous unveiling of several technologies resulted in some confusion, and a lot of people got the sense that Apple was now going to monitor all users all the time. That\u2019s not the case.<\/p>\n<input type=\"hidden\" class=\"category_for_banner\" value=\"ksc-trial-generic\">\n<h2>CSAM Detection rollout timeline<\/h2>\n<p>CSAM Detection will be part of the iOS 15 and iPadOS 15 mobile operating systems, which will become available to users of all current iPhones and iPads (iPhone 6S, fifth-generation iPad and later) this autumn. Although the function will theoretically be available on Apple mobile devices everywhere in the world, for now the system will work fully only in the United States.<\/p>\n<h2>How CSAM Detection will work<\/h2>\n<p>CSAM Detection works only in conjunction with iCloud Photos, which is the part of the iCloud service that uploads photos from a smartphone or tablet to Apple servers. It also makes them accessible on the user\u2019s other devices.<\/p>\n<p>If a user disables photo syncing in the settings, CSAM Detection stops working. Does that mean photos are compared with those in criminal databases only in the cloud? Not exactly. 
<p>Does that mean photos are compared with those in criminal databases only in the cloud? Not exactly. The system is deliberately complex; Apple is trying to guarantee a necessary level of privacy.</p>

<p>As Apple <a href="https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf">explains</a>, CSAM Detection works by scanning photos on a device to determine whether they match photos in NCMEC's or other similar organizations' databases.</p>

<p><em>[Image: Simplified diagram of how CSAM Detection works. Source: https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf]</em></p>

<p>The detection method uses NeuralHash technology, which in essence creates digital identifiers, or hashes, for photos based on their contents. If a hash matches one in the database of known child-exploitation images, then the image and its hash are uploaded to Apple's servers. Apple performs another check before officially registering the image.</p>
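<p>Conceptually, the on-device step boils down to computing a hash for each photo and testing it against a set of known hashes. The sketch below is a simplified stand-in, assuming the hash is an opaque hex string and the database is just a Python set; the real NeuralHash model and Apple's blinded database lookups are far more involved.</p>

<pre><code>import hashlib

# Hypothetical stand-in for NeuralHash: a cryptographic hash of the raw bytes.
# (NeuralHash is actually a perceptual, content-based hash; see the collision
# example later in the article for why that distinction matters.)
def neuralhash_stub(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# The database of known-CSAM hashes, as provided to Apple by NCMEC (fake values here).
KNOWN_HASHES = {neuralhash_stub(b"known-bad-image-1"),
                neuralhash_stub(b"known-bad-image-2")}

def scan_photo(image_bytes: bytes) -> dict:
    """Return a voucher-like record for one photo. In the real system the match
    result is encrypted so Apple cannot read it below the threshold (next sketch)."""
    h = neuralhash_stub(image_bytes)
    return {"hash": h, "matched": h in KNOWN_HASHES}

print(scan_photo(b"holiday-photo"))          # matched: False
print(scan_photo(b"known-bad-image-1"))      # matched: True
</code></pre>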
<p>Another component of the system, a cryptographic technology called <em>private set intersection</em>, encrypts the results of the CSAM Detection scan such that Apple can decrypt them only if a series of criteria are met. In theory, that should prevent the system from being misused — that is, it should prevent a company employee from abusing the system or handing over images at the request of government agencies.</p>

<p>In an August 13 interview with the <em>Wall Street Journal</em>, Craig Federighi, Apple's senior vice president of software engineering, <a href="https://www.wsj.com/articles/apple-executive-defends-tools-to-fight-child-porn-acknowledges-privacy-backlash-11628859600">articulated</a> the main safeguard for the private set intersection protocol: to alert Apple, 30 photos need to match images in the NCMEC database. The private set intersection system will not allow the data set — information about the operation of CSAM Detection and the photos — to be decrypted until that threshold is reached. According to Apple, because the threshold for flagging an account is so high, a false match is very unlikely — a "one in a trillion chance."</p>

<p><em>[Image: An important feature of the CSAM Detection system: to decrypt the data, a large number of photos need to match. Source: https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf]</em></p>
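<p>Stripped of the cryptography, the server-side rule is a simple threshold: Apple accumulates match vouchers per account and only gains the ability to inspect them once roughly 30 of them correspond to known images. The sketch below models only that counting logic, assuming a threshold of 30; it does not implement private set intersection or threshold secret sharing.</p>

<pre><code># Toy model of the reporting threshold. In the real system, threshold secret
# sharing means the server literally cannot decrypt the vouchers below the
# threshold; here the "cannot decrypt" part is only simulated by a boolean.

MATCH_THRESHOLD = 30  # figure cited by Craig Federighi in the WSJ interview

class AccountVouchers:
    def __init__(self):
        self.vouchers = []          # one encrypted match result per uploaded photo

    def add(self, matched: bool):
        self.vouchers.append({"matched": matched})

    def reviewable(self) -> bool:
        """Apple can only decrypt and human-review once the threshold is hit."""
        matches = sum(1 for v in self.vouchers if v["matched"])
        return matches >= MATCH_THRESHOLD

acct = AccountVouchers()
for _ in range(29):
    acct.add(matched=True)
print(acct.reviewable())   # False: one short of the threshold, nothing is revealed
acct.add(matched=True)
print(acct.reviewable())   # True: the account is flagged for manual review
</code></pre>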
<p>What happens when the system is alerted? An Apple employee manually checks the data, confirms the presence of child pornography, and notifies the authorities. For now the system will work fully only in the United States, so the notification will go to NCMEC, which is sponsored by the US Department of Justice.</p>

<h2>Problems with CSAM Detection</h2>

<p>Potential criticism of Apple's actions falls into two categories: questioning the company's approach and scrutinizing the protocol's vulnerabilities. At the moment there is little concrete evidence that Apple made a technical error (an issue we will discuss in more detail below), although there has been no shortage of general complaints.</p>

<p>For example, the Electronic Frontier Foundation has <a href="https://www.eff.org/deeplinks/2021/08/if-you-build-it-they-will-come-apple-has-opened-backdoor-increased-surveillance">described</a> these issues in great detail. According to the EFF, by adding image scanning on the user side, Apple is essentially embedding a back door in users' devices. The EFF has <a href="https://www.eff.org/deeplinks/2019/11/why-adding-client-side-scanning-breaks-end-end-encryption">criticized</a> the concept since as early as 2019.</p>

<p>Why is that a bad thing? Consider having a device on which the data is completely encrypted (as Apple asserts) that then begins reporting to outsiders about that content. At the moment the target is child pornography, leading to the common refrain "If you're not doing anything wrong, you have nothing to worry about." But as long as such a mechanism exists, we cannot know that it won't be applied to other content.</p>

<p>Ultimately, that criticism is political more than technological. The problem lies in the absence of a social contract that balances security and privacy. All of us — from bureaucrats, device makers, and software developers to human-rights activists and rank-and-file users — are trying to define that balance now.</p>

<p>Law-enforcement agencies complain that widespread encryption complicates collecting evidence and catching criminals, and that is understandable. Concerns about mass digital surveillance are also obvious. Opinions, including opinions about Apple's policies and actions, are a dime a dozen.</p>

<h2>Potential problems with implementing CSAM Detection</h2>

<p>Once we move past ethical concerns, we hit some bumpy technological roads. Any program code produces new vulnerabilities. Never mind what governments might do; what if a cybercriminal took advantage of CSAM Detection's vulnerabilities? When it comes to data encryption, the concern is natural and valid: if you weaken information protection, even with only good intentions, then anyone can exploit the weakness for other purposes.</p>

<p>An independent audit of the CSAM Detection code has just begun and could take a very long time. However, we have already learned a few things.</p>

<p>First, code that makes it possible to compare photos against a "model" has <a href="https://www.macobserver.com/news/neural-hash-extracted/">existed</a> in iOS (and macOS) since version 14.3. It is entirely possible that this code will be part of CSAM Detection. <a href="https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX">Utilities</a> for experimenting with the image-matching algorithm have already turned up <a href="https://github.com/anishathalye/neural-hash-collider">collisions</a>. For example, according to Apple's NeuralHash algorithm, the two images below have the same hash:</p>

<p><em>[Image: According to Apple's NeuralHash algorithm, these two photos match. Source: https://github.com/anishathalye/neural-hash-collider]</em></p>
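<p>Such collisions are possible because NeuralHash is a perceptual hash: it is designed so that visually similar images produce the same or similar hashes, which also makes it feasible to craft distinct images that collide. The sketch below implements a much simpler perceptual hash (an 8x8 "average hash") purely to illustrate how such hashes are computed and compared; it is not NeuralHash, and real collision hunting relies on the extracted model via the tools linked above.</p>

<pre><code>from PIL import Image  # pip install pillow

def average_hash(img: Image.Image, size: int = 8) -> int:
    """Toy perceptual hash: shrink to size x size grayscale, then set one bit
    per pixel depending on whether it is brighter than the image's mean."""
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    return int("".join("1" if p > mean else "0" for p in pixels), 2)

def hamming_distance(h1: int, h2: int) -> int:
    return bin(h1 ^ h2).count("1")

def make_split_image(left: int, right: int) -> Image.Image:
    """Build an 8x8 test image: left half one brightness, right half another."""
    img = Image.new("L", (8, 8), left)
    for x in range(4, 8):
        for y in range(8):
            img.putpixel((x, y), right)
    return img

a = make_split_image(0, 255)    # hard black/white split
b = make_split_image(60, 200)   # different pixel values, same bright/dark layout
# Distance 0: the images differ pixel-for-pixel yet share the same perceptual hash.
print(hamming_distance(average_hash(a), average_hash(b)))
</code></pre>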
<p>If it is possible to pull out the database of hashes of illegal photos, then it is possible to create "innocent" images that trigger an alert, meaning Apple could receive enough false alerts to make CSAM Detection unsustainable. That is most likely why Apple separated the detection, with part of the algorithm working only on the server side.</p>

<p>There is also this <a href="https://pseudorandom.resistant.tech/obfuscated_apples.html">analysis</a> of Apple's <a href="https://www.apple.com/child-safety/pdf/Apple_PSI_System_Security_Protocol_and_Analysis.pdf"><em>private set intersection</em></a> protocol. The complaint is essentially that even before the alert threshold is reached, the PSI system transfers quite a bit of information to Apple's servers. The article describes a scenario in which law-enforcement agencies request that data from Apple, and it suggests that even false alerts might lead to a visit from the police.</p>

<p>For now, the above are just the initial steps of an external review of CSAM Detection. Its success will depend largely on the famously secretive company providing transparency into CSAM Detection's workings — and in particular, its source code.</p>

<h2>What CSAM Detection means for the average user</h2>

<p>Modern devices are so complex that it is no easy feat to determine how secure they really are — that is, to what extent they live up to the maker's promises. All most of us can do is trust — or distrust — the company based on its reputation.</p>

<p>However, it is important to remember this key point: CSAM Detection operates only if users upload photos to iCloud. Apple's decision was deliberate and anticipated some of the objections to the technology. If you do not upload photos to the cloud, nothing will be sent anywhere.</p>

<p>You may remember the notorious <a href="https://en.wikipedia.org/wiki/2015_San_Bernardino_attack">conflict between Apple and the FBI</a> in 2016, when the FBI asked Apple for help unlocking an iPhone 5C that belonged to a mass shooter in San Bernardino, California. The FBI wanted Apple to write software that would let it get around the phone's password protection.</p>

<p>The company, recognizing that complying could result in unlocking not only the shooter's phone but anyone's phone, refused. The FBI backed off and ended up hacking the device with outside help, <a href="https://www.washingtonpost.com/technology/2021/04/14/azimuth-san-bernardino-apple-iphone-fbi/">exploiting vulnerabilities in the software</a>, and Apple maintained its reputation as a company that fights for its customers' rights.</p>

<p>However, the story isn't quite that simple. Apple did hand over a copy of the data from iCloud. In fact, the company <a href="https://www.reuters.com/article/us-apple-fbi-icloud-exclusive-idUSKBN1ZK1CT">has access</a> to practically any user data uploaded to the cloud. Some of it, such as <a href="https://support.apple.com/en-us/HT202303">Keychain passwords and payment information</a>, is stored with end-to-end encryption, but most information is encrypted only for protection from <em>unsanctioned</em> access — that is, from a hack of the company's servers. That means the company can decrypt the data.</p>
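<p>The practical difference is who holds the decryption key. The sketch below illustrates that distinction with the <code>cryptography</code> library's Fernet primitive: if the provider stores the key alongside the data, it can read the data whenever it chooses (or is ordered to); if only the user holds the key, as in end-to-end encryption, the provider cannot. This is a generic illustration, not a description of Apple's actual iCloud key management.</p>

<pre><code>from cryptography.fernet import Fernet, InvalidToken  # pip install cryptography

photo = b"vacation.jpg bytes"

# "Server-side" model: data is encrypted at rest, but the provider keeps the key,
# so it can decrypt on demand.
provider_key = Fernet.generate_key()
stored_blob = Fernet(provider_key).encrypt(photo)
print(Fernet(provider_key).decrypt(stored_blob) == photo)   # True: provider can read it

# End-to-end model: only the user holds the key; the provider stores ciphertext
# it cannot decrypt.
user_key = Fernet.generate_key()        # never leaves the user's devices
e2e_blob = Fernet(user_key).encrypt(photo)
try:
    Fernet(provider_key).decrypt(e2e_blob)
except InvalidToken:
    print("provider cannot decrypt end-to-end encrypted data")
</code></pre>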
<p>The implications make for perhaps the most interesting plot twist in the story of CSAM Detection. The company could, for example, simply scan all of the images in iCloud Photos (as Facebook, Google, and many other cloud service providers do). Apple created a more elegant mechanism that would help it repel accusations of mass user surveillance, but instead it drew even more criticism — for scanning users' devices.</p>

<p>Ultimately, the hullabaloo hardly changes anything for the average user. If you are worried about protecting your data, you should look at <em>any</em> cloud service with a critical eye. Data you store only on your device is still safe. Apple's recent actions have sown well-founded doubts. Whether the company will continue in this vein remains an open question.</p>
712"}],"version-history":[{"count":7,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts\/18712\/revisions"}],"predecessor-version":[{"id":18906,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts\/18712\/revisions\/18906"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/media\/18719"}],"wp:attachment":[{"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/media?parent=18712"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/categories?post=18712"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/tags?post=18712"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}