Free tools and resources for Data Protection Officers!

Tag Archives for "biometrics"

FIDO Alliance adds a biometrics certification program to help fight spoofing

In a move aimed at raising standards across biometric user verification systems, the FIDO Alliance, an industry consortium, has launched a certification program for biometric systems.

The goal of the Biometric Component Certification Program is to provide a framework for certifying biometric subsystems that can in turn be integrated into FIDO Certified authenticators. While biometric verification systems such as fingerprint readers are already widely adopted in the mobile space, there has been no standardized way to validate the accuracy and reliability of biometric recognition systems in the commercial marketplace.

Source: FIDO Alliance adds a biometrics certification program to help fight spoofing | TechCrunch

10 ways China watches its citizens

From tracking the activity of mobile app users to setting up a social credit scorecard, the world’s most populous country is taking surveillance technology to new heights. With a population of 1.3 billion, China’s plan to create a facial recognition system that can identify people within three seconds – with a 90 per cent accuracy rate – may seem ambitious, but that does not stop it from trying.

Read full article: Drones, facial recognition and a social credit system: 10 ways China watches its citizens | South China Morning Post

Facial recognition system to be used in 2020 Tokyo Olympics

A facial recognition system will be used across an Olympics for the first time as Tokyo organizers work to keep security tight and efficient at dozens of venues during the 2020 Games. The NeoFace technology developed by NEC Corp. will be customized to monitor every accredited person – including athletes, officials, staff and media – at more than 40 venues, games villages and media centres.

Source: Facial recognition system set to be used in Olympic security | CTV News

Genetics testing companies agree on rules to share data

Ancestry, 23andMe and other popular companies that offer genetic testing pledged on Tuesday to be upfront when they share users’ DNA data with researchers, hand it over to police or transfer it to other companies, a move aimed at addressing consumers’ mounting privacy concerns.

Source: Ancestry, 23andMe and others say they will follow these rules when giving DNA data to businesses or police – The Washington Post

Facebook’s Push for Facial Recognition Prompts Privacy Alarms

Facebook is working to spread its face-matching tools even as it faces heightened scrutiny from regulators and legislators in Europe and North America. Already, more than a dozen privacy and consumer groups, and at least a few officials, argue that the company’s use of facial recognition has violated people’s privacy by not obtaining appropriate user consent.

Source: Facebook’s Push for Facial Recognition Prompts Privacy Alarms – The New York Times

DNA Tests on Separated Migrant Children Raise Privacy Issues

The Trump administration’s decision to use DNA testing to help reunite children separated from their parents at the Mexican border is sparking concerns among privacy advocates about how data will be used. Potential concerns include government surveillance of migrant families, or using the health information gleaned from DNA tests to deny access to services in the future. There are also concerns that DNA samples from children won’t be obtained with proper consent.

Source: DNA Tests on Separated Migrant Children Raise Privacy Issues – Bloomberg

Facial recognition software is not ready for use by law enforcement

Facial recognition technologies, used in the identification of suspects, negatively affect people of color. To deny this fact would be a lie. And clearly, facial recognition-powered government surveillance is an extraordinary invasion of the privacy of all citizens — and a slippery slope to losing control of our identities altogether. There’s really no “nice” way to acknowledge these things.

Read article: Facial recognition software is not ready for use by law enforcement | TechCrunch

London cops’ facial recognition doesn’t work

London cops’ facial recognition kit has correctly identified only two people to date – neither of whom was a criminal – and the UK capital’s police force has made no arrests using it. The force’s automated facial recognition (AFR) technology has a 98 per cent false positive rate.
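For readers unfamiliar with the metric, a false positive rate like the 98 per cent cited above is simply the share of the system's match alerts that flagged the wrong person. The sketch below shows the arithmetic; the counts are hypothetical round numbers chosen for illustration, not the Met's actual figures.

```python
def false_positive_rate(false_alerts: int, total_alerts: int) -> float:
    """Share of a system's match alerts that flagged the wrong person."""
    return false_alerts / total_alerts

# Hypothetical illustrative numbers (not the Met's reported counts):
total_alerts = 100   # faces the system flagged as matches
true_matches = 2     # flags that were actually correct
rate = false_positive_rate(total_alerts - true_matches, total_alerts)
print(f"False positive rate: {rate:.0%}")  # prints "False positive rate: 98%"
```

Note that this is a rate over alerts the system raised, not over everyone scanned — which is why a system can flag thousands of passers-by, be wrong 98 per cent of the time it alerts, and still have scanned most people without incident.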

Source: Zero arrests, 2 correct matches, no criminals: London cops’ facial recog tech slammed • The Register