GirlChat #738668
[...snip...]
The Siri and Search systems will "intervene when users perform searches for queries related to CSAM" and "explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue." (CSAM = Child Sexual Abuse Material.) Source: https://arstechnica.com/tech-policy/2021/08/apple-explains-how-iphones-will-scan-photos-for-child-sexual-abuse-images/