Last month, we covered news that Google and the Pentagon were collaborating on developing artificial intelligence for analyzing drone footage. Reports have indicated that Google employees were exceedingly unhappy over this, and a recent open letter sent to company CEO Sundar Pichai makes that unhappiness plain from its first lines:
We believe that Google should not be in the business of war. Therefore we ask that Project Maven be cancelled, and that Google draft, publicize and enforce a clear policy stating that neither Google nor its contractors will ever build warfare technology.
The letter notes that while Google has promised its work will not be used to operate drones or launch weapons, the technology in question could be used for other military purposes. It then declares that being seen as an ally of the US military in these matters could irreparably harm Google’s efforts to attract new talent by grouping it with companies like Palantir, Raytheon, and General Dynamics.
It’s difficult to draw clean lines between military and non-military usage of some technologies. When does built-in object recognition become a technology you don’t want to help develop?
The Google employees state, “We cannot outsource the moral responsibility of our technologies to third parties.” The authors reject the idea that the situation is permissible simply because companies like Amazon and Microsoft are participating. The letter, which was signed by more than 3,100 people, declares that Google’s unique position requires that it hold itself to a different, higher standard.
Google’s formal response, in contrast, amounts to a hand-waved “No, seriously, everything is fine.” We quote:
Maven is a well-publicized DoD project, and Google is working on one part of it—specifically scoped to be for non-offensive purposes and using open-source object-recognition software available to any Google Cloud customer. The models are based on unclassified data only. The technology is used to flag images for human review and is intended to save lives and save people from having to do highly tedious work.
Any military use of machine learning naturally raises valid concerns. We’re actively engaged across the company in a comprehensive discussion of this important topic and also with outside experts, as we continue to develop our policies around the development and use of our machine-learning technologies.
The response is boilerplate PR-speak and avoids engaging with any of the substantive questions raised by the employees. These types of issues have arisen before, in many different contexts. For all the talk you sometimes hear about how the federal government shouldn’t be in the business of picking winners and losers, the fact is, the federal government purchases enormous quantities of, well, stuff, from computers and office equipment to military hardware.
One of the largest changes to federal procurement in recent decades was Secretary of Defense William Perry’s 1994 directive that the military rely on COTS (commercial off-the-shelf) products whenever possible, as opposed to paying for custom hardware and in-house development. The debate over the degree to which this has been good or bad often comes down to the specifics of a given project, but one thing is obvious: If you’re going to use COTS hardware and software, you’re going to be working far more frequently with the companies that developed it. If nothing else, that creates more opportunities for blurred lines between what is and isn’t acceptable support for a given policy or usage.
By all indications, Google has no plans to drop its collaboration on Project Maven. What that will mean longer-term is uncertain.