TechCrunch has confirmed that Apple will soon roll out a new technology to scan photos uploaded to iCloud for child sexual abuse material (CSAM). The rollout will happen later this year as part of a collection of technologies meant to make Apple's products and services safer for children to use. Most cloud services already scan images for material that violates their terms of service or the law, including CSAM. They can do this because, while the images may be stored encrypted, the provider holds the encryption key. Apple encrypts photos in transit and stores them encrypted, but keeps the decryption key so that it can decrypt them when necessary: to serve data stored in iCloud under subpoena, or to make your iCloud photos available in a web browser. Update: Apple has published a Child Safety page that describes the new feature and how it works.
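The scanning approach described above generally works by comparing a hash of each uploaded image against a database of hashes of known prohibited images. The sketch below illustrates that workflow only; it is not Apple's implementation. Production systems (Microsoft's PhotoDNA, Apple's NeuralHash) use perceptual hashes that tolerate resizing and re-encoding, whereas the cryptographic SHA-256 used here matches only byte-identical files. The blocklist contents and function names are hypothetical.

```python
import hashlib

# Hypothetical blocklist: hex digests of known prohibited images.
# (This entry is the well-known SHA-256 digest of the empty byte string,
# used purely so the example is self-contained and testable.)
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_of(data: bytes) -> str:
    """Hash the raw bytes of an upload. Real systems use perceptual hashes."""
    return hashlib.sha256(data).hexdigest()

def screen_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known-bad hash and should be flagged."""
    return sha256_of(image_bytes) in KNOWN_BAD_HASHES

print(screen_upload(b""))                 # matches the listed digest: flagged
print(screen_upload(b"holiday photo"))    # no match: not flagged
```

The key design point is that the provider never needs to interpret the image content itself, only to compare digests; this is why such scanning is compatible with server-side encryption as long as the provider holds the decryption key.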