Apple Delays CSAM Scanning Amid Backlash

Written by Matt Milano

Apple has announced it is delaying its plans to implement CSAM (Child Sexual Abuse Material) scanning amid backlash from all sides.

Apple previously announced plans to implement CSAM scanning in the next versions of iOS, iPadOS, and macOS. While many companies, including Microsoft, Google, Dropbox, and Facebook, scan their servers for CSAM, Apple's solution was unique in that part of the matching process would take place on-device.
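To illustrate the distinction at a high level, the core of client-side detection is checking an image's fingerprint against a database of known-CSAM hashes before the photo ever leaves the device. The sketch below is not Apple's implementation; it is a deliberately simplified illustration of blocklist matching that uses an ordinary SHA-256 digest in place of Apple's NeuralHash perceptual hash and omits the cryptographic protections (private set intersection, threshold reporting) the real design included. All names in it are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of fingerprints of known prohibited images.
# In a real deployment this would be a perceptual-hash database
# distributed in a blinded/encrypted form, not plain SHA-256 digests.
KNOWN_HASHES: set[str] = set()


def fingerprint(image_path: Path) -> str:
    """Compute a fingerprint of the image bytes.

    Simplification: a cryptographic hash only matches exact
    byte-for-byte copies; production systems use perceptual hashes
    that tolerate resizing and re-encoding.
    """
    return hashlib.sha256(image_path.read_bytes()).hexdigest()


def check_before_upload(image_path: Path) -> bool:
    """Return True if the image can be uploaded (no blocklist match)."""
    return fingerprint(image_path) not in KNOWN_HASHES
```

The architectural point of contention is simply where that check runs: the other companies mentioned above compare hashes on their own servers after upload, while Apple's proposal moved the matching step onto the user's device.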

The backlash was immediate and severe, leading Apple to try to explain how the system worked. Ultimately, it has not succeeded in assuaging people's concerns and is now delaying the feature, according to a statement on its website.

Update as of September 3, 2021: Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

Unfortunately, Apple's statement is small comfort. Many critics have pointed out that any attempt to continue toward the original goal remains dangerous, since an on-device scanning pipeline, once built, could be pressed into matching against far broader categories of content. In our upcoming multi-part breakdown of Apple's plans, we'll explain why.
