What are the risks and problems of face search engines like Clearview AI and PimEyes? Since institutional protection against these systems is failing us, how can we protect ourselves against this? Three people involved in the fight against biometric mass surveillance share their experiences and reflections. Come to this talk to exchange experiences, learn what tools there are for your protection, how to use them and how you can help stop the creep of mass surveillance technologies.
Face search engines like [Clearview AI](https://reclaimyourface.eu/how-to-reclaim-your-face-from-clearview-ai/) and [PimEyes](https://edition.cnn.com/2021/05/04/tech/pimeyes-facial-recognition/index.html) have all our faces and process our biometric data. They never asked us whether we like their *service* or whether they may use our data. Users of these search engines can now identify us anytime, anywhere.
Since biometric data enjoy special protection under the GDPR, we filed complaints in multiple European states. We report how data protection authorities did nothing for a long time, and we tell of the first successes. However, it has become clear that the GDPR alone does not protect against biometric surveillance.
That's why we have joined forces to form the **[Reclaim Your Face](https://reclaimyourface.eu/)** campaign. Together, we call on the European Commission to strictly regulate the use of biometric technologies in order to avoid undue interference with fundamental rights. In particular, we ask the Commission to prohibit, in law and in practice, indiscriminate or arbitrarily targeted uses of biometrics that can lead to unlawful mass surveillance.
These two face search engines are not the only examples of everyday biometric surveillance. However, it is difficult to track where else we are being monitored: there is a lack of transparency and oversight, and public authorities and private companies rarely report on their own initiative what they have been up to. We share how we've used FOIA requests, among other things, to create a little more publicity.