Dangers posed by facial recognition, like mass surveillance and mistaken identity, have been widely discussed in recent years. But digital rights groups say an equally insidious use case is currently slipping under the radar: using the same technology to predict someone’s gender. Now, more than 60 NGOs have launched a campaign against automatic gender recognition, sending a letter to the European Commission asking it to ban the technology.
“Trying to predict someone’s gender from digitized clues is fundamentally flawed,” says Os Keyes, a researcher who’s written extensively on the topic. The technology tends to reduce gender to a simplistic binary and, as a result, is often harmful to trans people who don’t fit into these narrow categories.
When the resulting systems are used for things like restricting entry to physical spaces or verifying someone’s identity for an online service, they lead to discrimination.
Commercial facial recognition systems, including those sold by big tech companies like Amazon and Microsoft, frequently offer gender classification as a standard feature.
With facial recognition tech, if someone has short hair, they’re categorized as male; if they’re wearing makeup, they’re female. Similar assumptions are made based on biometric data like bone structure and face shape. The result is that people who don’t fit easily into these two categories — like many trans individuals — face discrimination.
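The heuristics described above can be sketched as a toy rule-based classifier. This is purely illustrative: the rules and feature names here are hypothetical, not taken from any vendor’s actual model, but they show how binary assumptions produce misclassifications for anyone who doesn’t match them.

```python
def predict_gender(features: dict) -> str:
    """Toy rule-based gender classifier illustrating the kinds of
    appearance heuristics the article describes. The rules below are
    hypothetical assumptions, not a real system's logic."""
    if features.get("wearing_makeup"):
        return "female"
    # Hypothetical cutoff: short hair (under 10 cm) is read as "male".
    if features.get("hair_length_cm", 0) < 10:
        return "male"
    return "female"

# A trans woman with short hair and no makeup is misclassified:
print(predict_gender({"hair_length_cm": 5, "wearing_makeup": False}))
```

The failure is baked into the design: any rule set that maps appearance features onto exactly two labels will, by construction, misread people whose presentation doesn’t match the assumptions behind those rules.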
“These systems don’t just fail to recognize that trans people exist. They literally can’t recognize that trans people exist,” says Keyes.
Current applications of this gender recognition tech include digital billboards that analyze passersby to serve them targeted advertisements; digital spaces like “girls-only” social app Giggle, which admits people by guessing their gender from selfies; and marketing stunts, like a campaign in Berlin that offered discounted subway tickets to women on Equal Pay Day and tried to identify them via facial scans. Researchers have also discussed potentially more harmful use cases, like deploying the technology to restrict entry to gendered areas such as bathrooms and locker rooms.
Being rejected by a machine in such a scenario has the potential to be not only humiliating and inconvenient but also to trigger an even more severe reaction from others present.
Ultimately, technology that tries to reduce the world to binary classifications based on simple heuristics will always fail when faced with the variety and complexity of human expression, inevitably hurting those who are already marginalized by society.
Source: The Verge