blog.xot.nl
berthub.eu
A quick update - The European Commission and EU member states have been pondering, for years now, whether they should force WhatsApp/Apple/Signal/Telegram to scan all our private messages for suspected child sexual abuse material (CSAM). For various reasons it is a horrendous idea to break end-to-end encryption in this likely highly ineffective way. Variations of the proposal have also included a mandate to scan images using AI, and even to read our text messages to see if we aren't "grooming" children.
piotr.is
Update Dec 7, 2022: Apple scrapped its plan to scan iCloud Photos libraries for CSAM. My wife is pretty tech-savvy. While neither a software engineer nor a computer scientist, she has a good understanding of computing technologies, statistics, and formal methods, and an intuitive (but quickly growing) grasp of machine learning.
www.eff.org
Apple has announced impending changes to its operating systems that include new "protections for children" features in iCloud and iMessage. If you've spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system.
onlysky.media
The latest round of reports on sexual war crimes in Israel on October 7 also reveals how little respect women are given by all sides in war.