
Apple delays plan to check iPhones for child abuse images

Source: Global Hot Topic Analysis | Editor: hotspot | Time: 2025-07-02 18:04:01

The pushback against Apple's plan to scan iPhone photos for child exploitation images was swift and apparently effective.

Apple said Friday that it is delaying the previously announced system that would scan iPhone users' photos for digital fingerprints that indicated the presence of known Child Sexual Abuse Material (CSAM). The change is in response to criticism from privacy advocates and public outcry against the idea.

"Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material," a September 3 update at the top of the original press release announcing the program reads. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."


Announced in August, the new feature for iOS 15 would have checked photos in an iPhone user's photo library — on the device before sending the photos to iCloud — against a database of known CSAM images. If the automated system found a match, the content would be sent to a human reviewer, and ultimately reported to child protection authorities.
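The matching step described above can be sketched in simplified form. Note that Apple's announced design used NeuralHash perceptual fingerprints and cryptographic private set intersection against a blinded database; the plain hash lookup below is a hypothetical illustration of the general idea, not Apple's implementation, and all names in it are invented:

```python
import hashlib

# Hypothetical database of fingerprints of known flagged images.
# Apple's real system used NeuralHash perceptual hashes and a blinded,
# on-device database; plain SHA-256 lookups here are purely illustrative.
KNOWN_FINGERPRINTS: set[str] = set()


def fingerprint(image_bytes: bytes) -> str:
    """Compute a digital fingerprint of an image (illustrative only)."""
    return hashlib.sha256(image_bytes).hexdigest()


def flag_for_review(photo_library: list[bytes]) -> list[int]:
    """Return indices of photos whose fingerprints match the database.

    In the announced design, matches (above a threshold) would go to a
    human reviewer before any report to child protection authorities.
    """
    return [
        i for i, img in enumerate(photo_library)
        if fingerprint(img) in KNOWN_FINGERPRINTS
    ]
```

A key property of this scheme is that only exact (or, with perceptual hashing, near-exact) matches to already-known images are flagged; the device never "looks at" photo content in any semantic sense.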


The fact that the scanning happened on the device alarmed both experts and users. Beyond it being generally creepy that Apple would have the ability to view photos users hadn't even sent to the cloud yet, many criticized the move as hypocritical for a company that has leaned so heavily into privacy. Additionally, the Electronic Frontier Foundation criticized the ability as a "backdoor" that could eventually serve as a way for law enforcement or other government agencies to gain access to an individual's device.

"Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor," the EFF said at the time.

Experts who had criticized the move were generally pleased with the decision to do more research.

Others said the company should go further to protect users' privacy. The digital rights organization Fight for the Future said Apple should focus on strengthening encryption.

While other companies, like Google, scan cloud-based photo libraries for CSAM, and the overall goal of protecting children is obviously a good one, Apple thoroughly bungled the rollout of this product, with privacy concerns justifiably overshadowing its intended purpose. Better luck next time, folks.

Topics Apple Cybersecurity iPhone Privacy


Copyright © 2025 Global Hot Topic Analysis
