Google's Project Maven: Helping the Pentagon Create AI for Drones


Google has been working with the U.S. Department of Defense through Project Maven, providing AI for analyzing drone footage.

The program uses machine-learning and image-recognition algorithms that learn from millions of hours of drone footage collected by the military. The goal is to identify people and objects of interest. While a Google spokesperson says the program is "scoped for non-offensive purposes," a petition letter signed by almost 4,000 Google employees demands that Google stop.

"The technology is being built for the military, and once it's delivered, it could easily be used to assist in [lethal] tasks."

"We believe that Google should not be in the business of war."

"If ethical action on the part of tech companies requires consideration of who might benefit from a technology and who might be harmed, then we can say with certainty that no topic deserves more sober reflection—no technology has higher stakes—than algorithms meant to target and kill at a distance and without public accountability."

Google did not heed this contingent of its employees and continued with the project. As a result, several employees resigned in protest.

An RQ-4 Global Hawk surveillance drone, which can survey as much as 100,000 square kilometers of terrain a day

Project Maven is also known as the Algorithmic Warfare Cross-Functional Team. It was launched in April 2017 and is led by Air Force Lt. Gen. John "Jack" Shanahan.

The AI and machine-learning software is meant to help the Air Force analyze data and make better use of full-motion video surveillance. Instead of having human operators sift through hours and hours of surveillance video, Project Maven's algorithms can aid them with the work.

Among its objectives is to develop and integrate "computer-vision algorithms needed to help military and civilian analysts encumbered by the sheer volume of full-motion video data that DoD collects every day in support of counterinsurgency and counterterrorism operations," according to the Pentagon.
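The triage idea described above can be sketched in a few lines. This is a toy illustration, not Project Maven's actual code: it assumes a vision model has already scored each video frame for objects of interest (the scores and threshold below are invented), and it shows how only the high-scoring segments would be surfaced to a human analyst.

```python
def frames_of_interest(scores, threshold=0.8):
    """Return (start, end) index ranges of consecutive frames whose
    detection confidence meets the threshold, so analysts review
    short flagged clips instead of the full footage."""
    ranges = []
    start = None
    for i, score in enumerate(scores):
        if score >= threshold and start is None:
            start = i                      # a flagged segment begins
        elif score < threshold and start is not None:
            ranges.append((start, i - 1))  # the segment just ended
            start = None
    if start is not None:                  # footage ended mid-segment
        ranges.append((start, len(scores) - 1))
    return ranges

# Hypothetical per-frame confidences from an object detector.
scores = [0.1, 0.2, 0.9, 0.95, 0.85, 0.3, 0.1, 0.88, 0.91, 0.2]
print(frames_of_interest(scores))  # → [(2, 4), (7, 8)]
```

Here ten frames are reduced to two short segments for review; at the scale of "millions of hours" of footage, that kind of filtering is the workload reduction the program is after.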

It uses Google's TensorFlow APIs, which help military analysts detect certain objects in imagery. Through the federal FedRAMP program, Google was granted a level 4 authority to operate (ATO), allowing its AI to process footage gathered by military drones in order to build its machine-learning models.

It's estimated that the project cost just under $70 million in its first year.

According to Google employees, the company's involvement in Project Maven could "irreparably damage Google's brand and its ability to compete for talent." The letter also mentioned "biased and weaponized AI," Google's motto "Don't Be Evil," and how the company cannot "outsource the moral responsibility of our technologies to third parties."

They say that the contract with the Pentagon puts "Google's reputation at risk…" and demand that CEO Sundar Pichai cancel the project immediately and "draft, publicize, and enforce a clear policy stating that neither Google nor its contractors will ever build warfare technology."

After all, Google is not a small startup that is desperately looking for clients.

Elsewhere, groups of academics and researchers who study AI have published an open letter of their own, calling for Google to withdraw from the Project Maven contract.

Weeks later, Google Cloud CEO Diane Greene announced that the company would not renew the contract with the government, meaning that after it expires in 2019, Google will not sign another contract to develop Maven. According to Greene, Google chose not to pursue Maven further because of the backlash it had caused the company.