It’s something just about all of us have done at one time or another: go to Google, type in a query, and click over to the Google Images tab. Instantly, you’re met with a bevy of pictures depicting just about exactly what you asked for. The rapid availability of countless images depicting a visual representation of your own written words is just one of the many technological marvels we’ve grown so accustomed to that we hardly register it as miraculous at all — it’s just a thing that happens, like birds chirping or rain falling.
But what’s actually going on behind the scenes is a complex system of machine learning, artificial intelligence and robust coding that all comes together to produce an instant response to your strange hankering to see pictures of the kind of car you drove in high school.
Google’s ability to instantly assess what’s depicted in an image and marry that assessment to English words on command isn’t only valuable to people hoping to screen the internet for content that suits their preferences. Chief among the technology’s many other applications could be rapidly assessing the material found in reconnaissance feeds provided by drones flying above battlefields the world over.
The Department of Defense knows it, and so does Google; it’s the basis of their new joint venture, dubbed “Maven,” but not everyone at the search giant’s offices is pleased with the concept. A number of Google staffers have resigned over the company’s relationship with the Pentagon, and other industry insiders have begun circulating petitions demanding that Google back out of the deal and leave war fighting to war-fighting enterprises like defense giants Lockheed, Northrop Grumman, and General Dynamics.
“If ethical action on the part of tech companies requires consideration of who might benefit from a technology and who might be harmed, then we can say with certainty that no topic deserves more sober reflection—no technology has higher stakes—than algorithms meant to target and kill at a distance and without public accountability,” an open letter to Google from the International Committee for Robot Arms Control states.
“Google has moved into military work without subjecting itself to public debate or deliberation, either domestically or internationally. While Google regularly decides the future of technology without democratic public engagement, its entry into military technologies casts the problems of private control of information infrastructure into high relief,” the letter continues.
It’s clear that many within the tech industry have a problem with the concept of “one of their own” working with the Defense Department. In fact, judging by the complaints voiced by departing Google staffers, most seem to take issue specifically with Google’s involvement rather than with the endeavor itself.
“It’s not like Google is this little machine-learning startup that’s trying to find clients in different industries,” a resigning employee told Gizmodo. “It just seems like it makes sense for Google and Google’s reputation to stay out of that.”
“I tried to remind myself right that Google’s decisions are not my decisions. I’m not personally responsible for everything they do. But I do feel responsibility when I see something that I should escalate it,” another said.
I recognize my own inherent biases at play here. I know that the Pentagon has been clear about the short-term goals of Project Maven, which have absolutely nothing to do with “killing at a distance and without public accountability,” and that the project instead aims to sift through torrents of intelligence feeds and flag things that may be important enough for human operators to look at. Maven isn’t making any decisions; it’s helping with the search. When I go looking for a picture of an old car to use in an article with Google Image search, the search doesn’t choose the photo for me — it sifts through the countless images available to it and provides me with a list of things that come closest to the parameters I set forth. It narrows the field of images on the internet down to just the ones that may be pertinent to me so that the task becomes manageable — and that’s what Maven hopes to do with drone feeds.
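To make that analogy concrete, here is a minimal, purely illustrative sketch of the kind of filtering I’m describing. It is not Maven’s code and assumes nothing about the Pentagon’s systems; it simply uses an off-the-shelf pretrained classifier (torchvision’s ResNet-50 with its stock ImageNet labels) to show how a large stack of frames can be narrowed down to the few that score highly for a label of interest, with a person still making every call that matters.

```python
# Illustrative sketch only: not Project Maven's code. It shows how a pretrained
# classifier can narrow a stack of frames to the few worth a human's attention,
# the way an image search narrows the whole web down to one results page.
import torch
from PIL import Image
from torchvision import models
from torchvision.models import ResNet50_Weights

weights = ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()        # resize, crop, and normalize for the model
labels = weights.meta["categories"]      # the stock ImageNet class names

def flag_frames(frame_paths, keyword, threshold=0.5):
    """Return (path, label, score) for frames whose top label matches the keyword."""
    flagged = []
    for path in frame_paths:
        img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
        with torch.no_grad():
            probs = model(img).softmax(dim=1)[0]
        score, idx = probs.max(dim=0)
        label = labels[int(idx)]
        if keyword.lower() in label.lower() and score.item() >= threshold:
            flagged.append((path, label, score.item()))
    return flagged

# Hypothetical usage: hand an analyst only the frames that look like pickup trucks.
# flag_frames(["frame_0001.jpg", "frame_0002.jpg"], "pickup")
```

The point of the sketch is the shape of the system, not the specifics: the model only ranks and filters, and a person still has to look at whatever gets flagged and decide what, if anything, to do about it.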
In my mind, this technology does play a role in ending lives, but that’s honestly poor framing. Good quality intelligence saves lives — admittedly, often thanks to killing the right folks at the right times.
It’s pretty easy to sit at your desk in the Google offices and feel as though your technology shouldn’t play a role in killing anyone, but it’s harder for me to side with you when I know this technology could save the lives of people I know. Men and women are out there in the fight, tasked with taking action when intelligence surfaces indicating an impending threat or an important objective on the horizon. Quickly and reliably sifting through these feeds could mean knowing where IEDs are being put together, identifying high-value targets before they have a chance to move again, and finding the bad guys who need killing before they’re able to kill anyone else. These Google employees would rather free themselves from the burden of knowing people are dying than try to help save lives.
Admittedly, there’s truth to the idea that this technology will likely bolster eventual efforts to field entirely autonomous systems capable of making their own decisions about engaging a target, but that means these employees are resigning over the idea that another technology may eventually surface and progress unchecked, a technology that’s independent of the system they’re working on. That’s like wanting to cancel the Apollo missions because the fin design on the Saturn V could be used on missile platforms. The technology is far from nefarious, even if it could one day play a role in something you’re pretty sure you don’t like.
I’m not sure where these tech industry professionals think the equipment used by the U.S. military comes from, but by and large, it comes from companies in the private sector working on contract or in partnership with the Department of Defense. Many of those endeavors not only didn’t result in wanton destruction, they helped build the world we live in today. Ya know, like the internet itself, which began as a DOD initiative through DARPA.
These Google employees have every right to voice their concerns about Project Maven, and they’re welcome to resign because they find the idea of working with the military too difficult to stomach, but I find it hard to paint them as the ethical heroes I’ve seen them hailed as. Maven has the capacity to save American lives, to streamline the nation’s intelligence apparatus over the battlefield, and to give America an increased advantage over nations that don’t often squabble in the public square about new defense technologies as they develop them with American targets in mind. And as most of these former Googlers seem to know, these systems are going to happen anyway; they just don’t like the idea of their brand getting caught up in the effort.
If you ask me, resigning from Google over a partnership with the Defense Department has less to do with an ethical line drawn in the sand and more to do with maintaining the imaginary barriers many Americans place between themselves and the brutal realities of the world outside our borders. It’s easier to pretend the fighting isn’t going on than it is to look it in the face. It’s easier to ignore the needs of our Soldiers, Sailors, Marines, and Airmen than it is to accept responsibility for our role in putting them in harm’s way.
It’s easier to resign than it is to face the cruel reality that we’re a nation with opponents and enemies, diplomatic and otherwise; that there are combat operations going on right now, as we sit comfortably at our desks making lofty judgments about what is and isn’t socially acceptable in Silicon Valley.
I’m not making any judgments about the character of those choosing to voice their opposition to Project Maven or Google’s participation in it, but you’ll be hard-pressed to find me congratulating anyone for putting on blinders and pretending war isn’t an unfortunate, but nonetheless real, part of the world we live in.
Image courtesy of Wikimedia Commons