Adelphi University

11/04/2025 | Press release | Distributed by Public on 11/04/2025 09:43

Does Facial Recognition Infringe Upon Civil Liberties?

Published: November 4, 2025
  • Faculty
  • Research & Creative Works
  • Publications
  • Academic Distinction
  • Technology
Winston Waters, JD, MBA '15, professor of accounting and law in the Robert B. Willumstad School of Business

Adelphi scholar makes the case for federal legislation.

"AI has so much potential, but the question is, how do we teach it so it can learn and be applied accurately?"

Winston Waters, JD, MBA '15

The technology we use every day, whether to communicate, work, play games or even check out at the grocery store, is gathering more and more of our personal information, often without our knowledge or consent. Winston Waters, JD, MBA '15, professor of accounting and law in the Robert B. Willumstad School of Business, believes the growth of artificial intelligence (AI) will only escalate the trend.

"I had the perception that AI was going to become huge," Waters said, "so I thought, 'Let me take a closer look at this.'" His subsequent article, "The need for comprehensive federal legislation to regulate facial recognition technology," published in the Southern Law Journal, explores the collection of biometric data through a legal lens. Without uniform state laws or a comprehensive federal law to regulate artificial intelligence, Waters argues, our basic civil liberties are currently and will continue to be violated, particularly among underrepresented groups.

Biometrics are the distinguishing traits that allow an individual to be automatically identified. They encompass physical characteristics like retinas, voices and fingerprints, how we walk or what we smell like, even our unique typing patterns. Thanks to automated processes, any traits that can be captured and measured by sensors can also be "matched" within seconds to existing identification databases.
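As an illustration of this "capture and match" process, biometric matching can be sketched as nearest-neighbor search over numeric templates. The following Python is a minimal, hypothetical example, not any vendor's actual system: random vectors stand in for faceprints or fingerprint templates, and a similarity threshold decides whether a probe counts as a match.

```python
import numpy as np

# Hypothetical gallery: each identity is represented by a numeric
# "template" vector, a stand-in for a faceprint or fingerprint template.
rng = np.random.default_rng(0)
gallery = {name: rng.normal(size=128) for name in ["alice", "bob", "carol"]}

def cosine_similarity(a, b):
    """Similarity between two template vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match(probe, gallery, threshold=0.5):
    """Return the best-matching identity, or None if no score clears the threshold."""
    best_name, best_score = None, -1.0
    for name, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# A probe that is a noisy capture of "bob" should match him.
probe = gallery["bob"] + rng.normal(scale=0.1, size=128)
print(match(probe, gallery))
```

Real deployments search millions of templates with specialized index structures, but the core decision, best score versus a fixed threshold, is the same, which is why the choice of threshold matters so much in the error-rate discussion below.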

"In the beginning, there were a number of online services that made images available to these identification databases. Facebook had over 250 billion images alone, in addition to LinkedIn, Amazon and various organizational rosters," Waters noted. Police databases correlated these countless points of information with the FBI's fingerprint database, which reportedly contains more than 156 million fingerprints.

The benefits of an instantly accessible cache of biometrics, particularly in fast-moving spaces with large crowds such as airports, casinos and concert venues, were immediately apparent to public safety and national security officials. Police now use facial recognition to aid in criminal investigations, which, according to Waters, has created significant concerns about constitutional infringement, such as the gathering of personal information without consent or a warrant.

Because AI facial recognition technology is imperfect at capturing darker-skinned faces, biases arise in its application. Biometric recognition is based in part on preexisting material: casual photographs, videos, mug shots and driver's licenses. This information is used to create a "faceprint" that allows someone to be classified by race, gender and age. Any flaws in these systems, such as traditional cameras' difficulty capturing darker skin tones, automatically become baked into the AI learning models, generating algorithmic bias and perpetuating larger inequalities in criminal justice. As Waters points out, these systems often deliver higher false positive rates for darker-skinned individuals. "AI has so much potential, but the question is, how do we teach it so it can learn and be applied accurately?" he said. "Moreover, how do we remove the inherent biases of the person inputting information into AI databases?"
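The false-positive disparity can be made concrete with a small, entirely synthetic simulation. The numbers below are invented for illustration, not measured from any real system: if a model produces slightly higher non-match similarity scores for one group (as can happen when training data underrepresents that group), a single global decision threshold yields unequal false positive rates.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic, illustrative data only: similarity scores for pairs of
# *different* people ("non-matches") in two groups. Group B's scores are
# shifted slightly higher, mimicking a model trained on fewer examples
# from that group.
nonmatch_a = rng.normal(loc=0.30, scale=0.10, size=10_000)
nonmatch_b = rng.normal(loc=0.40, scale=0.10, size=10_000)

# One global decision threshold, as deployed systems typically use.
threshold = 0.55

# Fraction of non-matches wrongly flagged as matches, per group.
fpr_a = float(np.mean(nonmatch_a >= threshold))
fpr_b = float(np.mean(nonmatch_b >= threshold))
print(f"false positive rate, group A: {fpr_a:.3%}")
print(f"false positive rate, group B: {fpr_b:.3%}")
```

Even though the threshold is identical for everyone, group B's innocent non-matches cross it far more often, which is exactly the "inadvertent targeting of innocents" Waters describes.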

All AI systems are based on algorithms. Algorithms are regularly used in the criminal justice system to predict future crimes, conduct risk assessments and calculate sentencing. In particular, algorithms that predict the likelihood that a defendant will commit an offense have been heavily criticized for their perceived bias against certain underrepresented groups, as well as the opacity of their operation. "If police use facial recognition technology that is deficient in this way, not only will it be ineffective, but it will also result in the inadvertent targeting of innocents-and particularly innocents within minority communities," Waters said. And because these algorithms are programmed by people, and people have implicit biases, those biases appear in the technology, adversely affecting populations of color.

Similar data-driven analytical techniques are used in predictive policing, which describes any system that analyzes available data to identify where crime is likely to occur and who is likely to commit crime. Law enforcement agencies are increasingly relying on these techniques for crime control and forecasting. Yet few predictive policing vendors are fully transparent about how their systems operate, what specific data is used in each jurisdiction that deploys the technology, or what accountability measures the vendor employs in each jurisdiction to address potential inaccuracy, bias or evidence of misconduct.

Despite these concerns, and with our civil liberties increasingly on the line, regulations surrounding facial recognition technology have appeared piecemeal, on a state-by-state basis. Though Congress has discussed legislation that would govern the collection, use and sharing of personal data across industries and states, national privacy standards have yet to be established. "Most of these systems are developed by private companies, and there's a lot of corporate secrecy around biometrics technology," Waters said. "But there needs to be some level of accountability." As more and more privacy cases are brought against private businesses, public entities and law enforcement, the U.S. Supreme Court will also likely play a role in the future of surveillance and data-gathering regulation. For its part, the American Civil Liberties Union (ACLU) has articulated clear opposition to the use of facial recognition technology due to its inherent biases.

But Waters cautions that under the current federal administration, which has close ties to private tech companies, unregulated biometric data collection will continue to infringe on our privacy and constitutional rights. "It'll be worse before it gets better."

Read more in the 2025 issue of Academic & Creative Research Magazine, where we highlight the innovation and imagination shaping Adelphi's academic community.

1 Waters, W. (2022). The need for comprehensive federal legislation to regulate facial recognition technology. Southern Law Journal, 31(1), 112-152.

About Winston Waters, JD, MBA '15

Winston Waters, JD, MBA '15, is a professor of accounting and law in the Robert B. Willumstad School of Business. He is former dean ad interim of the School and has authored articles on artificial intelligence, the court system, Medicaid trusts and corporations. Waters has successfully argued appellate cases in the area of guardianships and trusts.
