element.io
pxlnv.com
Since 2022, the European Parliament has been trying to pass legislation requiring digital service providers to scan for and report CSAM as it passes through their services. Giacomo Zandonini, Apostolis Fotiadis, and Luděk Stavinoha, Balkan Insight, with a good summary in September: Welcomed by some child welfare organisations, the regulation has nevertheless been met with [...]
www.thestack.technology
Spying on MPs and breaking encryption: Investigatory Powers (Amendment) Bill damned as "unprecedented" and "deeply troubling" by industry
berthub.eu
A quick update - The European Commission and EU member states have been pondering, for years now, whether they should force WhatsApp/Apple/Signal/Telegram to scan all our private messages for suspected child sexual abuse material (CSAM). For various reasons it is a horrendous idea to break end-to-end encryption in this likely highly ineffective way. Variations of the proposal have also included a mandate to perform such scanning of images using AI, and even to also read our text messages to see if we aren't "grooming" children.
www.tn.gov
Tennessee Occupational Safety and Health Administration (TOSHA)