Face Surveillance Moratorium Support Letter

The following is a letter of support for S.1385 and H.1538, An Act relative to unregulated face recognition and emerging biometric surveillance technologies. Individuals who work in academia and whose subject matter includes technology, as well as individuals who work in the tech industry, are invited to sign on to this letter of support.

The text of the letter reads:

To the Massachusetts legislature:

We are workers, business owners, and academics whose work touches on technology and its impact on human beings and society. We write to express strong support for S.1385 and H.1538, legislation that would place a moratorium on government use of face surveillance and other remote biometric tracking technologies, until the state legislature passes comprehensive privacy, civil rights, racial justice, and transparency protections.

As technologists and people who study technology, we are keenly aware that artificial intelligence and machine learning systems like facial recognition too often codify existing biases and discrimination, and exacerbate historical inequities by amplifying information asymmetries.

Research conducted by graduate student Joy Buolamwini at Massachusetts’ own MIT Media Lab, for example, has repeatedly demonstrated that face recognition algorithms developed by prominent technology companies misidentify the faces of darker-skinned women up to 33 percent of the time. This year, researchers at the Georgia Institute of Technology found that object detection systems are five percent less likely to recognize darker-skinned people. And in 2018, a researcher at Wake Forest University found that facial analysis systems wrongly categorized Black NBA players’ faces as angrier and more contemptuous than those of their white colleagues—even though all the men were smiling in the tested images.

Compounding the racial discrimination built into these algorithms is the fact that law enforcement agencies, including those in Massachusetts, use facial recognition technology of unknown reliability to scan images against mugshot databases. People who are arrested are not necessarily guilty of anything, let alone a serious offense. Extreme racial disparities in arrest rates—including for offenses like marijuana possession that are no longer criminal in Massachusetts—mean mugshot databases look much browner and Blacker than the overall population.

When police run face surveillance algorithms that are themselves racially biased against databases containing disproportionate numbers of Black and brown people, they run the risk of falsely accusing large numbers of people of color of crimes, leading to false arrests and worse.

Indeed, when the ACLU in 2018 compared images of members of Congress to a mugshot database, 28 members were falsely “matched”—disproportionately members of color.

Massachusetts is home to more technology workers per capita than any other state in the nation. We already produce the best technologies and technological research; now we must lead by ensuring those technologies are used only in ways that promote racial and gender justice, personal autonomy, and privacy, and do not threaten our free and open society.

S.1385 and H.1538 will help ensure these powerful systems are deployed only to protect, not to harm. Communities must have a right to decide how surveillance systems like facial recognition will be used by the government, and this legislation provides that opportunity.

As technology practitioners and researchers, we strongly urge you to set an example for the rest of the nation and pass this critical legislation.

*Affiliations listed for identification purposes only*