26 August 2021

Apple’s child abuse (CSAM) scanner a deliberate failure; another agenda in play


Apple would have you believe that its CSAM (Child Sexual Abuse Material) AI scanner on your phone will prevent child abuse, but the reality is very different.

It is easily defeated: anyone determined to share such material can alter images so they no longer match the scanner’s database.

It only works on Apple devices.

It serves as a pretext for invading privacy for other purposes.
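To illustrate the defeatability point: scanners of this kind match images against a database of perceptual hashes. The sketch below is not Apple’s NeuralHash; it is a deliberately simple “average hash” toy, used here only as an assumed stand-in, showing how nudging a few pixel values near the hash’s threshold flips bits and breaks a database match while the image stays visually the same.

```python
# Toy average-hash ("aHash") sketch -- NOT Apple's NeuralHash.
# It only illustrates why perceptual-hash matching can be evaded.

def average_hash(pixels):
    """pixels: flat list of grayscale values; returns a bit string
    with one bit per pixel (1 if the pixel is above the mean)."""
    avg = sum(pixels) / len(pixels)
    return ''.join('1' if p > avg else '0' for p in pixels)

# A tiny "image" as grayscale values (hypothetical data).
original = [10, 200, 120, 121, 50, 180, 119, 122]
h1 = average_hash(original)  # bits keyed to the mean, ~115.25

# An adversary nudges only the values sitting near that threshold.
# The picture is visually near-identical, but several hash bits
# flip, so a lookup for the original hash no longer matches.
evaded = [10, 200, 112, 125, 50, 180, 123, 114]
h2 = average_hash(evaded)

print(h1)          # '01110111'
print(h2)          # '01010110'
print(h1 != h2)    # True -- the database match is broken
```

Robust perceptual hashes tolerate small random changes, but that same threshold structure is exactly what a deliberate adversary exploits.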

See the technicalities as explained by Rob Braxman in the video:



