The term "gender" originally referred to grammatical categories in languages, and was once used interchangeably with a person's sex.
In modern usage, however, gender refers to the social, psychological, and cultural attributes and expectations associated with being male, female, or non-binary. It is often conflated with biological sex, which is determined by physical attributes such as chromosomes, hormones, and reproductive organs, but gender is far more complex, because the term encompasses a wide range of identities and expressions.
Microsoft built an AI that could automatically detect gender, and only later realized it was playing with fire.
Many people, including transgender people, prefer to be referred to in a way that aligns with their gender identity, meaning the pronouns, names, and gendered terms that match how they identify. An AI that misclassifies users can therefore cause real harm.
This is why Microsoft said in 2022 that it would retire its AI-powered gender classifier, and that existing users at the time would have until mid-2023 to discontinue use of these attributes before they were retired.
However, more than a year later, some users reported that the AI was still around and that they could still access it as normal.
Microsoft said at the time:
"To mitigate these risks, we have opted to not support a general-purpose system in the Face API that purports to infer emotional states, gender, age, smile, facial hair, hair, and makeup. Detection of these attributes will no longer be available to new customers beginning June 21, 2022, and existing customers have until June 30, 2023, to discontinue use of these attributes before they are retired."
Despite this statement, 404 Media reported that Microsoft was still providing the gender detection tool, according to Ada Ada Ada, an algorithmic artist whose practice involves processing photographs of herself through a variety of major AI image analysis tools and social media sites.
Ada Ada Ada said that she set up version 3.2 of Microsoft’s Image Analysis API to detect her age and gender based on her photographs in early 2022.
This was before Microsoft announced it was retiring the AI.
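Her setup reportedly relied on version 3.2 of the Image Analysis REST API, which exposed face attributes such as age and gender. As a rough illustration only (not her actual code), a client might construct and parse such a request like this; the endpoint hostname, key handling, and helper names are hypothetical:

```python
# Hypothetical sketch of querying version 3.2 of the Azure Image Analysis
# ("Computer Vision") REST API for face attributes. The endpoint host is a
# placeholder, not a real resource; no network call is made here.
import json
from urllib.parse import urlencode


def build_analyze_url(endpoint: str, features: list[str]) -> str:
    """Construct the v3.2 /analyze request URL for the given visual features."""
    query = urlencode({"visualFeatures": ",".join(features)})
    return f"{endpoint}/vision/v3.2/analyze?{query}"


def extract_face_attributes(response_body: str) -> list[dict]:
    """Pull the age/gender fields out of a v3.2-style 'faces' response."""
    data = json.loads(response_body)
    return [
        {"age": face.get("age"), "gender": face.get("gender")}
        for face in data.get("faces", [])
    ]


url = build_analyze_url("https://example.cognitiveservices.azure.com", ["Faces"])
sample = '{"faces": [{"age": 30, "gender": "Female"}]}'
print(url)
print(extract_face_attributes(sample))
```

In a real deployment, the request would also carry an API key header and an image URL or binary body; the point here is simply that the deprecated attributes were ordinary fields in the JSON response, so any client that kept calling the old endpoint kept receiving them.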
But two years later, in 2024, she said that she could still access the tool, meaning that Microsoft never actually turned it off.
In a surprising twist, Microsoft did not know that people were still able to access the tool until 404 Media reached out for comment.
Image-recognition AI is an exceptionally powerful technology because it allows computers to understand their surroundings and interact with what is presented to them visually.
Microsoft first announced the AI in a blog post back in 2015.
However, AI that claims to classify gender has long been criticized for often being wrong, and for being particularly harmful to transgender and gender non-conforming people. Keeping it around was a liability for Microsoft.
This is why Microsoft wanted it dead, until two years later it realized the tool hadn't actually died.
Microsoft said its most recent Image Analysis API, version 4.0, doesn't come with gender or age detection capabilities at all, that it has deprecated those functions in previous versions of the API, and that no new users should have access to those features.
Microsoft added that while version 3.2 of Image Analysis remains generally available, customers should not have had access to the deprecated age and gender function.
In other words, Ada Ada Ada's continued access to the AI was, in Microsoft's words, an "error."