(Image credit: Apple)

Will this affect me?

If you do not have photos with CSAM on your iPhone or iPad, nothing will change for you.

If you do not make a Siri inquiry or online search related to CSAM, nothing will change for you.

If your iPhone or iPad's account is set up with a family in iCloud and your device is designated as a child in that network, you will see warnings and blurred photos should you receive sexually explicit photos.

If your device isn't linked to a family network as belonging to a child, nothing will change for you.

Lastly, your device won't get any of these features if you don't upgrade to iOS 15, iPadOS 15, or macOS Monterey. (The latter will presumably scan iCloud photos for CSAM, but it's unclear if the Messages intervention for sexually explicit photos will also apply when macOS Monterey users use the app.)

These updates are only coming to users in the US, and it's unclear when (or if) they'll be expanded elsewhere – but given that Apple is positioning these as protective measures, we'd be surprised if it didn't extend them to users in other countries.

(Image credit: Apple)

Why is Apple doing this?

From a moral perspective, Apple is simply empowering parents to protect their children and performing a societal service by curbing CSAM.