Technology targeting cycle.
Communities aren’t at the decision-making table.

The issue we have identified through these historical examples is that the communities most impacted by surveillance technologies have rarely been engaged in their creation and deployment. Data scientists and technologists build the technologies, but communities bear their impacts. So, the key questions we need to ask are: Who defines the problems that technology purports to solve? And, importantly, who has the power to decide whether those technologies are built and how they’re deployed? What would it take to ensure that most of that power resides with communities? And what would it take to make sure that these communities can determine the trajectory of technology?

What does community-centric tech policy look like?
I’m going to present examples of how we’re working to build community-centric technology in Washington. The technology policy work we’re doing is rooted in redistributing power so that new policies explicitly put communities in the driver’s seat. These policies focus on:

– Redistributing power;
– Centering equity/voice of impacted communities;
– Creating transparent and accountable processes;
– Questioning assumptions; and
– Creating opportunities to say “no.”

Over the past several years, we’ve seen the growth of a multi-sector movement to create community-centric technology and community-centric technology policy, comprising impacted communities, the Tech Equity Coalition, Mijente, Athena (Amazon-specific), technology workers and labor, climate change activists, and antitrust groups.

In Seattle, we have built what we call the Tech Equity Coalition, which is composed of representatives of organizations serving historically marginalized communities and communities that have been impacted by surveillance, such as immigrants, religious minorities, and people of color. This very diverse group works to push technology to be more accountable to them.

A key piece of the Tech Equity Coalition’s work has been advocating for and implementing surveillance laws like the Seattle Surveillance Ordinance. This is a first-of-its-kind law in the U.S. that requires public oversight of government use of surveillance technologies. The law was first passed in 2013, but it suffered from an overly broad definition of surveillance technology, overbroad exemptions, and a weak enforcement mechanism that failed to hold agencies accountable for complying with the law. It was amended in 2017 and 2018 to rectify these problems and to create a community-focused advisory working group tasked with providing privacy and civil liberties impact assessments of different technologies.

This law is intended to be a vehicle to allow communities to directly influence surveillance technologies, rather than have these technologies just happen to them. The Seattle Surveillance Ordinance is a vehicle for not only community engagement and capacity-building, but also lawmaker education and engagement with communities.

The diagram on the next page gives a brief overview of how the law works in theory: every agency using a surveillance technology in Seattle must write a Surveillance Impact Report detailing how the technology is used and submit it to the public. The public then has a chance to comment on those technologies. Those comments are compiled, and the community advisory working group reviews them and makes final recommendations to the City Council, the body that decides what laws are passed in Seattle.

We’re currently in the implementation process and we’ve reviewed a little over half of the technologies on the master list. Some technologies we’ve reviewed include the Seattle Police Department’s Automated License Plate Reader System (ALPR) and CopLogic system, as well as the Seattle Department of Transportation (SDOT)’s Acyclica sensor system. We’ve had a lot of successes and challenges.

One success of the Ordinance has been our ability to learn a great deal of new information about the surveillance technologies in use by government agencies. We’ve had a chance to see what technologies exist and to assess the privacy and civil liberties implications of their use. This transparency is integral to building accountability. One technology we’ve learned a lot about is Acyclica, the sensor system used by SDOT. For example, we learned that SDOT didn’t address important questions about Acyclica in its Surveillance Impact Report, such as questions around data ownership and retention.

But a challenge has been that, so far, the process hasn’t given us an opportunity to actually say “no” to a technology, even if we are able to provide recommendations. In addition, transparency is not accountability, and we’ve had to contend with the logistics of engaging with the Ordinance, including limited community resources, uncooperative policymakers, complicated reports and technologies, and a very slow process.

Another example of the Tech Equity Coalition’s work is statewide legislation like the face surveillance moratorium bill we are currently pushing forward in Washington state. We helped introduce this bill in our legislature last year as an effort to press pause on the use of racially biased, inaccurate, and deeply flawed facial recognition technology. Our aim is to give impacted communities a real chance to say “no”: to decide if, not just how, a powerful technology should be used.

To build power within communities to advocate for technology policies that are accountable to them, we have been collaborating with the Tech Equity Coalition on community capacity-building projects. These projects aim to complement the expertise that impacted communities already bring to the table, because, as I highlighted earlier, technology has always impacted these communities.

Community capacity-building projects
The first community capacity-building project is the Algorithmic Equity Toolkit, which seeks to help folks easily identify and ask important questions about AI-based automated decision systems. We collaborated with the Critical Platform Studies Group to build an initial version of the toolkit.

A component of the Algorithmic Equity Toolkit.

Source: Algorithmic Equity Toolkit, available at: https://aekit.pubpub.org/

The second community capacity-building project is a countersurveillance workshop toolkit that we’re collaborating on with the coveillance collective. The toolkit is designed to help communities host their own workshops and disseminate information about the wider surveillance ecosystem. We hosted two pilot workshops last fall, and we’re working with members of the Tech Equity Coalition to build on those pilots.

A field guide for spotting surveillance technologies from a pilot workshop held in Seattle, Washington.

Source: The coveillance collective’s “Field Guide to Spotting Surveillance Infrastructure”, available at: https://coveillance.org

So, with that brief overview of the work we’re doing in Washington to build community-centric technology policy, I’ll leave you with a few key takeaways.

– Communities are really the ones that are best placed to identify and define problems for technology to solve;
– Communities are keenly aware of technologies’ impacts, even if they’re not using technical language to describe them; 
– Life stories and experiences are valuable in themselves, not only once they’re labeled as community data; and
– Community organizations and members are willing and eager to lead.

And I’ll leave you with a few key questions to ponder. Not “Is this technology good or bad?” but:

– For what purposes are we building technology, and with what rules and values?
– Who will be impacted? Are they in the room? What other collaboration structures can we create?
– Who gets to say “no”?
– Beneficial for whom? Who defines benefit?

Finally, I’ll leave you with one last question: What would it mean for those with the least power in conversations about technology to have far more or even the most power in the future of their design and implementation?