Google employees made headlines this week when 11 staffers signed a petition calling on the search giant to cancel its ongoing Dragonfly program, which would create a censored version of Google search for Chinese users by filtering out any content the Chinese government deems inappropriate. The petition follows an earlier letter, signed by more than 1,400 Google employees, calling for greater transparency about the project. Both efforts, however, fall well short of another Google employee initiative from earlier this year, when more than 4,000 employees signed a petition urging the company to sever ties with a U.S. Defense Department initiative aimed at streamlining the analysis of intelligence data. Google ultimately sided with its employees on that issue, choosing not to renew its contract with the American government.

“We are Google employees, and we join Amnesty International in calling on Google to cancel project Dragonfly, Google’s effort to create a censored search engine for the Chinese market that enables state surveillance,” the petition delivered this week reads.

Google, which operated for years under the now-defunct corporate motto “don’t be evil,” found itself in a frenzy earlier this year when staffers banded together to oppose “Project Maven,” an endeavor aimed at using the same machine learning techniques employed in Google’s image search to sift through hours of surveillance footage and identify people, items, or events that warrant human analysis. In effect, the Defense Department was looking for a program that could analyze hours of footage and produce a list of timestamps meriting further investigation, but Google employees and others in Silicon Valley characterized the project as some sort of “targeting system.”

“If ethical action on the part of tech companies requires consideration of who might benefit from a technology and who might be harmed, then we can say with certainty that no topic deserves more sober reflection — no technology has higher stakes — than algorithms meant to target and kill at a distance and without public accountability,” an open letter to Google from the International Committee for Robot Arms Control stated at the time.