
Why Apple Abandoned Its Plan to Scan Your iCloud Photos

by Staff

Privacy wins yet again as Apple users no longer need to worry about the company scanning their iCloud photo libraries.


Apple announced plans to scan your iCloud content for child sexual abuse material (CSAM) in August 2021 to protect children from abuse by predators.


The company planned to launch a new CSAM detection feature in iCloud Photos that would scan for such content and report matching images to Apple while maintaining user privacy.

However, the new feature was met with mixed reactions. Over a year since the initial announcement, Apple is officially dropping its plan to scan iCloud Photos for CSAM.


Apple Drops Its Plan to Scan iCloud for Child Abuse Material

According to a report by WIRED, Apple is walking away from its plan to scan your iCloud for child abuse content. The tool would have scanned photos stored in iCloud to find any that matched known CSAM images identified by child safety organizations. It could then report those images, because possessing CSAM is illegal in most jurisdictions, including the US.
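Conceptually, the detection step amounts to checking each photo's fingerprint against a database of known-image fingerprints. The sketch below is a deliberately simplified illustration of that idea using an ordinary cryptographic hash; Apple's actual proposal used NeuralHash, a perceptual hash robust to minor image changes, combined with private set intersection and a reporting threshold, none of which are shown here. All names and hash values are hypothetical.

```python
import hashlib

# Hypothetical set of fingerprints of known images (illustrative values only).
# In Apple's proposed design this database came from child safety organizations
# and used perceptual NeuralHash values, not plain SHA-256 digests.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if this image's fingerprint appears in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

A cryptographic hash like SHA-256 only matches byte-identical files, which is why real systems rely on perceptual hashing instead; the example is meant only to convey the match-against-a-known-set structure.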

Why Apple Canceled Its Plan to Scan iCloud Photos

After the initial announcement in 2021, Apple faced backlash from customers, groups, and individuals advocating for digital privacy and security worldwide. Apple has long marketed the iPhone on privacy and security, and many viewed the plan as a step backward. While it was a win for child safety organizations, over 90 policy groups wrote an open letter to Apple later the same month, urging the company to cancel the plan.

The letter argued that while the tool was intended to protect children from abuse, it could be leveraged to censor free speech and threaten user privacy and security. Facing mounting pressure, Apple paused the launch to collect feedback and adjust the feature. But the feedback did not favor Apple's plan to scan iCloud Photos, and the company is now officially dropping it for good.

In a statement to WIRED, the company said:

“We have … decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.”

Apple’s New Plan to Protect Children

Apple is redirecting its efforts to improve Communication Safety features announced in August 2021 and launched in December of the same year.

Communication Safety is optional and can be enabled by parents and guardians to protect children from both sending and receiving sexually explicit images in iMessage. The feature automatically blurs such photos and warns the child about the risks involved. Apple devices also warn users who try to search for CSAM-related content.

With the CSAM-detection tool canceled, the company plans to improve Communication Safety further and expand it to more communication apps.




© 2022 Iman Hearts. All rights reserved.