Tell Olympia: Pass the Tech Fairness Bill

TAKE ACTION NOW

Technology vendors are now selling computer systems that decide whether you get a job or housing, what you pay for health care, how your community is policed, how much bail is set, and how long your sentence is. Public agencies are adopting these systems widely, quickly, and invisibly, without transparency or safeguards. Because these systems make decisions using biased and often flawed historical data, they pose a serious risk to civil rights and liberties. Tell Olympia to create common-sense safeguards for government use of automated decision systems.

Message Recipients:
Your State Lower Chamber Representatives

Your Message
Support HB 1655 for tech fairness.
Dear Representative:

I write to urge you to pass HB 1655 in its strongest possible form. This bill establishes guidelines for public agency use of automated decision systems, protecting consumers, improving transparency, and creating market predictability.

Technology vendors are now selling computer systems that decide whether you get a job or housing, what you pay for health care, how your community is policed, how much bail is set, and how long your sentence is. Public agencies are adopting these systems widely, quickly, and invisibly, without transparency or safeguards. Our Legislature should establish clear guidelines for these systems that both protect communities and boost innovation by ensuring fairness and transparency. HB 1655 takes an important first step in that direction.

Public agencies buy these systems under the banner of efficiency and cost savings, but lawmakers and the public lack the information needed to evaluate those claims, because vendors use non-disclosure agreements and litigation to keep the inner workings of these systems largely secret.

Meanwhile, flawed automated decisions are already harming people and their families. In some cases, these error-prone systems operate with no human review, and no procedures exist to appeal their decisions or even understand how they were made. For example, automated systems have wrongly rejected public assistance applications from people with disabilities, and faulty risk assessments have recommended different sentences for the same crime based on race. Critical decisions about life, health, and safety should not be left to such flawed systems.

A growing body of research shows that these systems are often biased against marginalized communities. Many of these tools rely on historical data to make their decisions, but those data are often skewed by past discrimination. Predictive policing systems designed to determine where law enforcement should deploy resources largely reflect the historic over-policing of neighborhoods of color. Pre-trial risk assessments often wrongly rate people of color as riskier than others and recommend higher bail. And algorithms that screen job applicants often replicate the existing lack of diversity within companies.

The Legislature should put common-sense safeguards in place around these quickly expanding, invisible, and powerful automated decision systems. As a technology leader, Washington should aim to enhance innovation while protecting consumers by promoting fairness, accountability, and transparency. I urge your support for HB 1655, which would: establish minimum standards of fairness and accountability for any government agency buying or using automated decision systems; ensure public transparency and approval before these systems are acquired; and address bias and discrimination in automated decision systems that treat vulnerable communities less favorably.

Sincerely,

[First Name] [Last Name]
[Your Address]
