WebProNews

Tag: NCMEC

  • Apple Will Check Photo Uploads for Child Sex Abuse Images

    Apple will begin checking photos being uploaded to its iCloud service against a database of Child Sexual Abuse Material (CSAM), in an effort to protect children.

In the battle over encryption, known as the Crypto Wars, governments have often used protecting children as justification for promoting backdoors in encryption and security. Unfortunately, no matter how well-intentioned, as we have highlighted before, there is no way to create a backdoor in encryption that is safe from exploitation by others.

    Apple appears to be trying to offer a compromise solution, one that would preserve privacy, while still protecting children.

    Apple outlined how its CSAM system will work:

    Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

    Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.

    Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.
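The threshold idea can be illustrated with a short sketch. To be clear, this is not Apple's actual implementation: the real system uses NeuralHash perceptual hashes, private set intersection, and threshold secret sharing, whose details are not reproduced here. A plain SHA-256 lookup, a placeholder hash database, and a visible per-account counter stand in for those primitives.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; exact-match SHA-256 for simplicity."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known-image hashes (placeholder content).
KNOWN_HASHES = {image_hash(b"example-known-image")}

MATCH_THRESHOLD = 3  # account becomes reviewable only past this many matches

class Account:
    """Tracks per-account matches, mimicking the sealed 'safety voucher' idea."""

    def __init__(self) -> None:
        self.match_count = 0

    def upload(self, image_bytes: bytes) -> bool:
        """Check an upload against known hashes; True once the threshold is hit."""
        if image_hash(image_bytes) in KNOWN_HASHES:
            self.match_count += 1
        # Below the threshold, match results would remain cryptographically
        # sealed from Apple; only at the threshold can an account be reviewed.
        return self.match_count >= MATCH_THRESHOLD
```

The key design point the sketch captures is that a single match reveals nothing; only an accumulation of matches past the threshold makes an account reviewable at all.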

    Needless to say, Apple’s announcement has been met with a variety of responses. The Electronic Frontier Foundation (EFF), in particular, has been highly critical of Apple’s decision, even accusing the company of going back on its former privacy stance and embracing backdoors.

The EFF is particularly concerned that Apple's new system could be broadened to cover speech, or virtually anything else, that governments disapprove of. While there is certainly a concern the system could be abused that way, using an on-device method to screen for something as vile as CSAM is a far cry from using it to monitor speech.

In many ways, Apple’s new approach to combating CSAM is similar to its approach to combating malware. There have been times in the past when Apple took the liberty of proactively removing particularly dangerous malware from devices. Critics could argue that Apple could extend that, at the behest of governments, to removing any programs deemed offensive. But that hasn’t happened. Why? Because there’s a big difference between removing malware and censoring applications.

    The National Center for Missing & Exploited Children, admittedly a critic of end-to-end encryption, praised Apple’s decision.

    “With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material,” John Clark, chief executive of the NCMEC, said in a statement, via Reuters. “The reality is that privacy and child protection can co-exist.”

    Ultimately, only time will tell if Apple has struck the right balance between privacy and child protection. It’s worth noting Microsoft, Google and Facebook already have similar systems in place, but Apple believes its system offers significant benefits in the realm of privacy.

    In addition to going a long way toward protecting children, it’s also possible Apple’s willingness to make this concession will disarm one of the biggest arguments against end-to-end encryption, preserving the technology against legislative action.

  • Microsoft Unveils Tool To Help Protect Children From Sexual Predators

It’s estimated that 89% of sexual solicitations made by predators to children occur within chat or instant messages. Microsoft is determined to help change that with the release of “Project Artemis.”

    Project Artemis is a tool to help identify predators in online chat. It was “developed in collaboration with The Meet Group, Roblox, Kik and Thorn,” a tech nonprofit specializing in technology that helps protect children from sexual abuse.

    The tool is designed to evaluate conversations, looking for communication styles and patterns predators use to target children. According to Microsoft, “the development of this new technique began in November 2018 at a Microsoft ‘360 Cross-Industry Hackathon,’ which was co-sponsored by the WePROTECT Global Alliance in conjunction with the Child Dignity Alliance.”

    Once deployed, the tool “evaluates and ‘rates’ conversation characteristics and assigns an overall probability rating. This rating can then be used as a determiner, set by individual companies implementing the technique, as to when a flagged conversation should be sent to human moderators for review. Human moderators would then be capable of identifying imminent threats for referral to law enforcement, as well as incidents of suspected child sexual exploitation to the National Center for Missing and Exploited Children (NCMEC).”
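The rate-then-route flow described above can be sketched in a few lines. Project Artemis' actual model and signals are not public, so the keyword scoring, the `RISK_PHRASES` table, and the default threshold below are purely illustrative assumptions; the real system presumably uses far richer conversational features.

```python
# Hypothetical sketch of rating a conversation and routing it to human
# moderators once it crosses a company-set threshold. All phrases and
# weights here are invented for illustration.
RISK_PHRASES = {
    "how old are you": 0.3,
    "don't tell your parents": 0.5,
    "send me a picture": 0.4,
}

def rate_conversation(messages: list) -> float:
    """Assign an overall probability-style rating between 0.0 and 1.0."""
    score = 0.0
    for message in messages:
        text = message.lower()
        for phrase, weight in RISK_PHRASES.items():
            if phrase in text:
                score += weight
    return min(score, 1.0)

def flag_for_review(messages: list, threshold: float = 0.6) -> bool:
    """Each implementing company sets its own threshold for human review."""
    return rate_conversation(messages) >= threshold
```

Note that the tool itself only produces the rating; the decision of where to set the threshold, and what a human moderator does with a flagged conversation, is left to each company, matching the description above.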

    The tool will be freely available through Thorn “to qualified online service companies that offer a chat function.” Interested parties can contact Thorn directly at [email protected].

  • Facebook To Feature AMBER Alerts

    AMBER Alerts for missing children are now available to Facebook users.

Facebook users can sign up to receive AMBER Alert bulletins for their state, which will be delivered through their news feed. A total of 53 new AMBER Alert pages have been created: one for each state, plus Puerto Rico, the U.S. Virgin Islands and the District of Columbia. Facebook users will also be able to share the alerts with their friends.

     

Information about the new initiative was announced today by Facebook, the U.S. Department of Justice and the National Center for Missing & Exploited Children (NCMEC). The announcement was made the day before the 15th anniversary of the abduction and murder of 9-year-old Amber Hagerman, namesake of the national AMBER Alert Program.

“As the National AMBER Alert Coordinator, I am pleased to see the growth of the program’s national network,” said Laurie O. Robinson, Assistant Attorney General, Office of Justice Programs.

“I would like to thank NCMEC and Facebook for working together to develop another way the public can join with us to bring home missing and abducted children. We each can play our part by being aware of and responsive to AMBER Alert postings that we will now see on Facebook.”