Dangers posed by facial recognition, such as mass surveillance and mistaken identity, have been widely discussed in recent years. But digital rights groups say an equally insidious use case is currently slipping under the radar: using the same technology to predict someone’s gender. Now a new campaign has been launched to ban it: more than 60 NGOs have sent a letter to the European Commission asking it to outlaw automatic gender recognition.
“Trying to predict someone’s gender from digitized clues is fundamentally flawed,” says Os Keyes, a researcher who has written extensively on the topic. The technology tends to reduce gender to a simplistic binary and, as a result, often harms trans people who don’t fit into these narrow categories.
When these systems are used for tasks like limiting entry to physical spaces or verifying someone’s identity for an online service, they lead to discrimination.
Commercial facial recognition systems, including those sold by big tech companies like Amazon and Microsoft, frequently offer gender classification as a standard feature.
With facial recognition tech, if someone has short hair, they’re categorized as male; if they’re wearing makeup, they’re categorized as female. Similar assumptions are made on the basis of biometric data like bone structure and face shape. The result is that people who don’t fit easily into either category, like many trans individuals, face discrimination.
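To make the problem concrete, here is a deliberately crude sketch of the kind of binary heuristic logic such a system encodes. It is purely illustrative: the feature names, thresholds, and scoring are invented for this example and are not drawn from any real product.

```python
from dataclasses import dataclass

@dataclass
class FaceFeatures:
    # Hypothetical, simplified features a classifier might extract.
    hair_length_cm: float
    wearing_makeup: bool
    jaw_width_ratio: float  # crude stand-in for "bone structure"

def classify_gender(face: FaceFeatures) -> str:
    """Toy heuristic classifier. The structural flaw is in the
    last line: every face is forced into one of exactly two labels."""
    score = 0
    if face.hair_length_cm < 10:
        score += 1   # short hair counted as a "male" signal
    if face.wearing_makeup:
        score -= 1   # makeup counted as a "female" signal
    if face.jaw_width_ratio > 0.8:
        score += 1   # wider jaw counted as a "male" signal
    return "male" if score > 0 else "female"  # no third option exists

# A trans man with long hair who wears makeup is misgendered
# by construction, regardless of how the thresholds are tuned:
print(classify_gender(FaceFeatures(hair_length_cm=30,
                                   wearing_makeup=True,
                                   jaw_width_ratio=0.7)))  # -> "female"
```

The point of the sketch is structural rather than statistical: no amount of threshold tuning changes the fact that the return statement admits only two answers, which is exactly the failure mode Keyes describes.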
“These systems don’t just fail to recognize that trans people exist. They literally can’t recognize that trans people exist,” says Keyes.
Current applications of gender recognition tech include digital billboards that analyze passersby to serve them targeted advertisements; digital spaces like the “girls-only” social app Giggle, which admits users by guessing their gender from selfies; and marketing stunts, like a Berlin campaign that offered discounted subway tickets to women on Equal Pay Day by trying to identify them from facial scans. Researchers have also discussed potentially far more consequential use cases, like deploying the technology to limit entry to gendered spaces such as bathrooms and locker rooms.
Being rejected by a machine in such a scenario has the potential not only to be humiliating and inconvenient but also to trigger an even harsher reaction from others present.
Ultimately, technology that tries to reduce the world to binary classifications based on simple heuristics will always fail when faced with the variety and complexity of human expression, inevitably hurting those who are already marginalized by society.
Source: The Verge