Taking ethical action in identity: 5 steps for better biometrics

February 8th, 2019

By Ned Hayes, General Manager, SureID

Glance at your phone. Tap a screen. Secure access granted!

This is the power of biometric identity at work. The convenience of unlocking your phone with a fingertip or your face is undeniable. But ethical issues abound in the biometrics field.

The film Minority Report depicted one possible future: precise advertising targeted to an individual based on a scan of their face. But the Spielberg film also demonstrated some of the downsides of biometrics – the stunning lack of privacy and consumer protection.

What’s fascinating is that many of these concerns were anticipated over a century ago. In 1890, Louis Brandeis and his colleague Samuel Warren co-authored an article in the Harvard Law Review advocating “the right to be let alone.” Brandeis, a future Supreme Court Justice, argued that the development of “instantaneous photographs” and their dissemination by newspapers for commercial gain had created the need for a new “right to privacy.”

Today, technology has potentially swamped that right to privacy. From one public CCTV camera to the next, a long-term history can be stitched together from multiple video sessions into one end-to-end picture of an individual’s journey. The owner of a shopping mall or private entertainment facility could easily track behavior from store to store, delivering detailed data to store owners and making predictions about an individual’s behavior over time.

There’s a fix for the Minority Report problem: transparency. Companies that control biometrics should be transparent about what they collect, how it is collected and stored, and the potential for abuse or misidentification. When an error occurs, companies should disclose it and provide a publicly available fix for the mistake.

Just as you have a right to know what Facebook is collecting on you, you should also have the right to know which companies can identify your face and for what purpose. You shouldn’t be surprised to discover you’ve been recognized in a crowded public place, and you should know whether law enforcement has access to that data.

The degree to which your shopping behavior is “private” is arguable, but it is inarguable that we should discuss this topic rather than just letting commercial terms dictate what the world knows about us.

Unfortunately, we don’t have a good grounding today in what an informed public discussion looks like. A recent Pew study found that 74% of the American public doesn’t realize that Facebook targets advertising to individuals based on a profile it has built of their interests. This is not the fault of the consumer: it is a problem caused by tech companies that have not served the public with full transparency and open information.

All of these ethical issues can be addressed, but we need to start now. Here are some basic steps that can assist you and your team in anticipating and addressing potential ethical issues.

1. Put humans in the loop: First, we should ensure that a human being is always in the loop. Human beings are not immune to errors or biases, but having a qualified person review a facial or fingerprint match to determine whether it’s correct should be standard practice. Today it is not, and far too many people are misidentified by faulty machine logic. Machines should not determine where the boundaries of personal freedom, privacy, or human rights lie (see the first sketch after this list for one way to route uncertain matches to a person).

2. Limit government surveillance: Laws and regulations should limit the use of surveillance and publicly gathered biometric data (such as facial recognition or latent prints captured by optical sensors). Exceptions should be confined to protecting human life or to uses authorized by court order.

3. Build systems that don’t discriminate: It’s easy to say “don’t discriminate,” but the reality is harder. When designing a machine learning system, take care to acknowledge possible bias and course-correct for it by testing with different populations, across different cultures, and in different regions (see the second sketch after this list). Companies that use biometric systems should be held to account for how their algorithms might inadvertently discriminate against individuals.

4. Be open and transparent: Companies should be crystal clear with consumers about the intended use of a person’s biometric data, and should not extend that usage to areas that were not initially disclosed. Always ask the consumer, always respect the response, and never abuse the user’s trust. Many companies are surprised by how much consumers will allow when they are properly and fully informed.

5. Clarify what consent means: Laws and local regulations should specify that consumers both understand the use case and agree to allow surveillance or biometric gathering when they enter a store or use an online service (the third sketch after this list shows one way to record that consent).
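To make step 1 concrete, here is a minimal Python sketch of confidence-based review routing, where ambiguous matches are never decided by the machine alone. The thresholds, the score scale, and the function names are illustrative assumptions, not part of any real SureID or Sterling system.

```python
# Illustrative sketch only: thresholds and names are assumptions,
# not a real biometric vendor API.

REVIEW_THRESHOLD = 0.90   # matches below ACCEPT but above this go to a person
ACCEPT_THRESHOLD = 0.99   # even "high" scores should be sampled for audit

def route_match(candidate_id: str, match_score: float) -> str:
    """Decide whether a biometric match is auto-accepted, human-reviewed,
    or rejected. A qualified reviewer, not the machine, makes the final
    call in the ambiguous middle band."""
    if match_score >= ACCEPT_THRESHOLD:
        return "accept"          # still log for periodic human audit
    if match_score >= REVIEW_THRESHOLD:
        return "human_review"    # a person confirms or rejects the match
    return "reject"

# Example: a 0.94 score is never auto-accepted; a person decides.
print(route_match("candidate-123", 0.94))  # -> "human_review"
```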
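For step 3, one practical form of bias testing is to measure error rates separately for each population the system serves, so a disparity is visible rather than hidden in a single aggregate number. The sketch below assumes a labeled evaluation set of (group, predicted, actual) outcomes; the data format and group labels are hypothetical.

```python
# Illustrative sketch only: the data format and group labels are assumptions.
from collections import defaultdict

def false_match_rate_by_group(results):
    """results: iterable of (group, predicted_match, actual_match) tuples
    from a labeled evaluation set. Returns the false-match rate per group:
    of the trials where the true answer was "no match", how many did the
    system wrongly accept?"""
    non_matches = defaultdict(int)    # trials where actual_match is False
    false_matches = defaultdict(int)  # of those, ones the system accepted
    for group, predicted, actual in results:
        if not actual:
            non_matches[group] += 1
            if predicted:
                false_matches[group] += 1
    return {g: false_matches[g] / non_matches[g] for g in non_matches}

# Tiny example across two hypothetical populations:
evaluation = [
    ("group_a", True, False), ("group_a", False, False),
    ("group_b", False, False), ("group_b", False, False),
]
print(false_match_rate_by_group(evaluation))
# -> {'group_a': 0.5, 'group_b': 0.0}  # a disparity worth investigating
```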
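And for step 5, consent is easier to respect when it is recorded as a structured fact rather than an undifferentiated checkbox: what the person agreed to, whether they agreed, and when. The field names below are hypothetical, chosen only to illustrate the idea.

```python
# Illustrative sketch only: field names are assumptions. The point is that
# consent should record *what* was agreed to and *when*, so any new use
# outside the recorded use case requires asking again.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class BiometricConsent:
    subject_id: str       # whose biometric data this covers
    use_case: str         # the specific, disclosed purpose
    granted: bool         # the consumer's actual response
    granted_at: datetime  # when the response was recorded

consent = BiometricConsent(
    subject_id="shopper-42",
    use_case="face match for checkout-free payment in this store only",
    granted=True,
    granted_at=datetime.now(timezone.utc),
)
```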

The path toward creating and supporting best-in-class technology doesn’t begin with writing some code or designing hardware. Instead, your technical system often emerges from a thicket of ambiguous and ever-changing customer needs. Hidden in those needs is a set of unstated ethical quandaries. When you deliver a system that uses biometrics for identification and access, you open up one or more ethical questions. To make your system responsive to consumer concerns, anticipate apprehensions, listen openly to questions, share your planned usage, and provide full details on exactly what you are doing with biometric data.

These steps should assist you in delivering systems that people not only use every day, but trust implicitly with their most personal and private information.

Sterling is not a law firm. This publication is for informational purposes only and nothing contained in it should be construed as legal advice. We expressly disclaim any warranty or responsibility for damages arising out of this information. We encourage you to consult with legal counsel regarding your specific needs. We do not undertake any duty to update previously posted materials.
