Contractor Exploited Unsuspecting Black People To Help Google Develop Face Recognition AI

09/10/2019

Artificial Intelligence (AI) needs a lot of data to learn. The more information it consumes, the better it should be able to identify patterns, understand what's what, and apply that knowledge to future use cases.

The same goes for Google.

The company wanted to develop a more robust and capable face recognition AI for its Pixel 4 phones. The objective was to get people with darker skin tones to agree to record videos of themselves, so that their image and likeness could be added to an AI training database.

In other words, Google was looking for people willing to sell their facial data.

To do this, Google hired Randstad.

Google's face scanning device. (Credit: 9to5Google)

However, Randstad, the contractor working on Google's behalf, was said to use aggressive tactics and confusing approaches. The contractor's workers reportedly exploited unsuspecting Black people, many of whom were homeless.

The contracted workers employed aggressive techniques, fast-talking to confuse targets and, in some cases, lying. They enticed targets with a $5 gift certificate, then rushed them through an array of questions and a consent agreement.

Often, the workers didn't mention Google, nor did they let the targets know that they were being recorded.

“We were told not to tell (people) that it was video, even though it would say on the screen that a video was taken,” a source said, adding that each target's video was stored under the worker's profile and periodically reviewed in performance meetings.

According to people familiar with the subject, those workers were told to walk away if a target got suspicious.

The consent agreement targets were supposed to accept before the workers could gather their face data. (Credit: 9to5Google)

Workers were told to target homeless people because those people “didn’t know what was going on,” and were less likely to object over privacy concerns or talk to the media. The workers were also said to have tricked college students and attendees of the BET Awards festivities in Los Angeles, among others.

To keep targets less aware of what was happening, the recording device was encased in a large, rugged metal frame and sealed with tamper-detection stickers and security screws.

And to make sure the workers worked efficiently, the contractor was said to incentivize them with the chance of becoming full employees if they met their quotas.

The move to obtain more data about Black people is a response to bias in most of the data that is already available: the abundant existing data sets contain far more white faces than Black faces. For face recognition AI, this imbalance makes the technology less accurate for the under-represented group.
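That effect can be shown with a minimal, purely hypothetical sketch (Python with NumPy; the groups, sample counts, and toy nearest-centroid classifier below are all invented for illustration and have nothing to do with Google's actual models or data): a classifier trained on data dominated by one group performs markedly worse on an under-represented group whose features differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Synthetic two-class "face embedding" data for one demographic group.
    x0 = rng.normal(loc=shift, scale=1.0, size=(n, 8))        # class 0
    x1 = rng.normal(loc=shift + 1.5, scale=1.0, size=(n, 8))  # class 1
    return np.vstack([x0, x1]), np.array([0] * n + [1] * n)

# Hypothetical imbalance: 1,000 training samples from group A,
# only 50 from group B, whose features follow a different distribution.
xa, ya = make_group(1000, shift=0.0)
xb, yb = make_group(50, shift=3.0)
x_train = np.vstack([xa, xb])
y_train = np.concatenate([ya, yb])

# Toy nearest-centroid classifier: the class centroids end up dominated
# by group A simply because group A supplied most of the training data.
c0 = x_train[y_train == 0].mean(axis=0)
c1 = x_train[y_train == 1].mean(axis=0)

def predict(x):
    d0 = np.linalg.norm(x - c0, axis=1)
    d1 = np.linalg.norm(x - c1, axis=1)
    return (d1 < d0).astype(int)

# Balanced held-out sets per group: accuracy collapses for group B.
xa_test, ya_test = make_group(500, shift=0.0)
xb_test, yb_test = make_group(500, shift=3.0)
print("group A accuracy:", (predict(xa_test) == ya_test).mean())
print("group B accuracy:", (predict(xb_test) == yb_test).mean())
```

On this synthetic data, group A scores near-perfect while group B hovers around chance, the same pattern that real-world audits of face recognition systems have repeatedly reported for under-represented groups.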

But the company claims that it was unaware of Randstad's methods.

Google says it wasn't aware that the contractor was going after homeless people. A Google manager reportedly instructed Randstad to target people with darker skin, without specifying the methods that should be used.

After the news surfaced in The New York Daily News, Google suspended the project over Randstad's controversial approach.