'An uprising': Youth activists bring digital rights to forefront

A new youth group known as Encode Justice seeks to stir an "uprising for technology and algorithmic justice," according to founder Sneha Revanur. The group's goals range from fighting racially biased algorithms to protecting privacy rights.

By Avi Asher-Schapiro , Thomson Reuters Foundation
Los Angeles

Sneha Revanur, a 16-year-old student from California, is on a mission to do for digital rights what Greta Thunberg's movement has done for the climate fight – put young activists on the front lines.

Ms. Revanur founded youth digital rights group Encode Justice last year after joining a successful campaign against state plans to use an algorithm to set prisoners' bail terms, a system that critics said was racially biased.

"We've seen it in climate, we've seen it on the guns issue, but there hasn't been an equivalent uprising for technology and algorithmic justice among my peers," said Ms. Revanur, a high school senior from San Jose, California.

"We are the next generation of technologists, regulators, activists – it's impacting our lives on a daily basis, and in the future, we have the most to lose," she said during a lunch break between her classes.

In a little over a year, Encode Justice has grown from a small group of Ms. Revanur's peers to more than a dozen chapters across the United States, as well as teams dedicated to researching policy issues and campaign strategy.

Ms. Revanur modeled the group on a professional issue advocacy body, and it is involved in digital rights issues at every level of government – from surveillance regulations in Baltimore to a privacy bill in Washington State.

Most recently, members lobbied federal lawmakers in Washington, D.C., for a national ban on facial recognition technology.

On weekends, the advocacy team convenes phone banks to lobby officials on specific policy items. So far, they have met more than 20 elected officials.

"These youth are amazing – and they recognize just like they did with climate – that adults have failed," said Kade Crockford, director of the technology for liberty program at the American Civil Liberties Union of Massachusetts.

Mx. Crockford is working with Encode Justice to organize a series of protests in schools to push for a state ban on facial recognition technology in Massachusetts.

"This newer generation does not see digital rights as a remote, far away issue – they see it as central to what it means to be free," Mx. Crockford said, praising the group for tracking legislation at all levels of government.

"It's very sophisticated," Mx. Crockford said.

'Real world implications'

The summer before her senior year of high school in Virginia, Mrudula Rapaka started seeing Ms. Revanur's posts about the California bail algorithm on Instagram and decided to get involved.

"I learned how dangerous these algorithms could be to marginalized groups," she said.

Ms. Rapaka started volunteering her time to call up voters in California and ask them to vote against an upcoming state ballot measure that would have rolled out the bail system statewide.

California voters eventually rejected the measure, which was also opposed by groups including Human Rights Watch and the American Civil Liberties Union, by a 13% margin.

Ms. Rapaka, who lives near Washington, D.C., now helps direct Encode Justice's advocacy work.

Raising awareness of digital rights issues among youth is another pillar of the group鈥檚 activities, said Vidya Bharadwaj, the organization鈥檚 education director.

"Not everyone realizes that algorithms surround them in their everyday life," she said. "I try to explain how this tech is profiling us – there are real world implications, and minorities are impacted more so than others."

Ms. Bharadwaj gets in touch with high school teachers by email, asking if she can give Zoom presentations to their students on key issues such as the discrimination and privacy risks posed by facial recognition technology.

She has also given classroom talks about social media algorithms, which are often engineered to keep users hooked, and can boost misinformation or polarizing posts.

"The message really lands when it comes from your peers," she said, estimating that Encode Justice's classroom presentations have reached more than 3,000 students in the last four months alone.

An aspiring computer programmer, Ms. Bharadwaj also runs GirlCon, a group dedicated to helping young women find careers in science and technology, and draws on the work of pioneering women AI critics such as Cathy O'Neil and Joy Buolamwini.

At a recent presentation to a computer science class in Mississippi, Ms. Bharadwaj explained how algorithmic bias works, giving an example of how a computer model could estimate the price of a house in a way that undervalues Black homes.

It is not a hypothetical issue: Last year, real estate platform Redfin was sued by housing advocates who said the methods it used to assess home values were racially biased.

The group is also beginning to take on privacy issues that directly impact students, from the use of test proctoring algorithms in schools to systems that profile students who are likely to be violent.

Ms. Revanur said it had been a whirlwind to watch Encode Justice grow from 10 friends and classmates into a fully fledged advocacy organization, one that is even starting to see international chapters sprout up.

"We're doing something that hasn't been done before," she said. "And it's awesome to see how integral youth can be to this movement."

This story was reported by the Thomson Reuters Foundation.