In July of last year, 13 Army commanders and technology executives from the United States met at the Pentagon’s Silicon Valley outpost, about two miles (three kilometers) from Google’s headquarters.
It was the second meeting of a council created in 2016 to advise the Army on ways to apply technology to the battlefield. Milo Medin, a Google vice president, steered the conversation toward the use of artificial intelligence in war games. Eric Schmidt, Google’s former chief, proposed using that tactic to map out strategies for potential confrontations with China over the next 20 years.
A few months later, the Department of Defense hired Google’s cloud services division to work on Project Maven, a sweeping effort to improve its surveillance drones with technology that helps machines think and see.
The pact could generate millions in revenue for the Alphabet internet giant. But within a company whose employees largely reflect the liberal sensibilities of the San Francisco Bay Area, the contract is about as popular as President Donald Trump. Not since 2010, when Google withdrew from China after clashing with state censors, has an issue so shaken the company’s foundations. Nearly 4,000 Google employees, out of Alphabet’s roughly 85,000, signed a letter asking Google’s chief executive, Sundar Pichai, to cancel the Project Maven contract and halt all work in “the business of war.”
The petition cites Google’s history of avoiding military work and its famous “don’t be evil” motto. One of Alphabet’s AI research laboratories has even distanced itself from the project. Employees who oppose the deal see it as an unacceptable link to a US administration that many of them oppose, and a disturbing first step toward autonomous killing machines. About a dozen employees are resigning in protest over the company’s continued involvement in Maven, Gizmodo reported on Monday.
The internal backlash, which coincides with a broader protest over how Silicon Valley uses data and technology, has prompted Pichai to act. He and his lieutenants are drafting ethical principles to guide the deployment of Google’s powerful artificial intelligence technology, according to people with knowledge of the plans. Those principles will shape the company’s future work. Google is one of several companies competing for a Pentagon cloud services contract worth at least $10 billion. A Google spokesman declined to say whether the company’s pursuit of that contract has changed in light of the internal conflict over military work.
Pichai’s challenge is to find a way to reconcile Google’s idealistic roots with its future. Having spent more than a decade building the industry’s most formidable arsenal of AI research and talent, Google wants to combine those advances with its rapidly growing cloud computing business. Rivals are rushing to strike deals with the government, which spends billions of dollars a year on everything related to the cloud, and no government entity spends more on that technology than the military. Medin and Alphabet director Schmidt, who both sit on the Pentagon’s Defense Innovation Board, have pushed Google to work with the government on counterterrorism, cybersecurity, telecommunications and more.
To master the cloud business and fulfill Pichai’s dream of becoming an “AI-first company”, Google will find it difficult to avoid the business of war.
Within the company there is no greater advocate for working with the government than Diane Greene, the head of Google Cloud. In an interview in March, she defended the Pentagon partnership and said it was wrong to characterize Project Maven as a turning point. “Google has been working with the government for a long time,” she said.
The Pentagon created Project Maven about a year ago to analyze vast amounts of surveillance data. Greene said her division obtained only a “small part” of the contract, without providing details. She described Google’s role in benign terms: scanning drone footage for land mines, for example, and then flagging them to military personnel. “The kind of thing that saves lives,” Greene said. The software is not used to identify targets or to make attack decisions, Google says.
Many employees find those rationales unconvincing. Even members of the AI team have voiced objections, saying they fear that working with the Pentagon could damage relationships with consumers and Google’s ability to recruit.
“Lethal autonomous weapons… [will] allow armed conflict to be fought at a larger scale than ever, and at timescales faster than humans can comprehend,” the employees’ letter says. “We do not have much time to act.” DeepMind, Alphabet’s London-based AI laboratory, has assured staff that it is not involved in Project Maven, according to a person familiar with the decision. A DeepMind spokeswoman declined to comment.