News

This EU-funded AI judges your face and tells you how “normal” you are

Getting judged by society is bad enough, so you can imagine what it feels like to have an AI system expertly analyze your face and give it a rating for attractiveness. Don’t worry though; unlike the anxiety-inducing remarks you might get from people, this AI is actually judging you for a greater social cause.

Facial recognition is riddled with errors and biases, from its problematic bias toward lighter skin tones to its failure to treat every facial image equally. And of course, there are the privacy concerns. Against this backdrop, a new website called How Normal Am I? is using algorithms to judge users’ age, attractiveness, BMI, life expectancy, and gender.

The website was created by SHERPA, an EU-funded project that explores the impact of AI on ethics and human rights. In an interview with The Next Web, artist-in-residence at SHERPA Tijmen Schep explains and showcases the interesting system in detail.

The first thing the system does is ask you to face the webcam so that its algorithm can analyze your face and rate it for attractiveness. Schep explains that similar algorithms are used in dating apps like Tinder to match equally attractive people and in social media platforms like TikTok to promote content made by “good-looking” people.

The point that Schep makes with the demonstration of his system is that facial analysis algorithms are incredibly dependent on the data they are trained with. Since their training data comprises thousands of photos that are manually labeled by a group of people, and perceptions of beauty vary all over the world, any such algorithm is likely to classify someone as beautiful or ugly based on how its training samples have been labeled.
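This dependence on labels can be illustrated with a toy example. The sketch below is not SHERPA’s actual model; it is a minimal k-nearest-neighbour scorer over made-up “face embedding” numbers, showing how the very same face can receive a high or a low rating purely because two hypothetical annotator pools labeled the training photos differently.

```python
# Toy illustration (NOT the real system): a k-nearest-neighbour
# "attractiveness" scorer. The features never change — only the
# human-supplied labels do, and the prediction follows the labels.

def knn_score(train, query, k=3):
    """Average the labels of the k training faces whose (hypothetical)
    feature vectors are closest to the query face."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(train, key=lambda fv: dist(fv[0], query))[:k]
    return sum(label for _, label in nearest) / k

# Made-up 2-D embeddings standing in for face features.
faces = [(0.2, 0.9), (0.3, 0.8), (0.8, 0.2), (0.7, 0.1)]

# Two annotator pools rate the *same* faces in opposite ways.
labels_pool_a = [9, 8, 3, 2]   # pool A favours the first cluster
labels_pool_b = [2, 3, 8, 9]   # pool B favours the second cluster

train_a = list(zip(faces, labels_pool_a))
train_b = list(zip(faces, labels_pool_b))

query = (0.25, 0.85)           # a new face near the first cluster
print(knn_score(train_a, query, k=2))  # scores high under pool A
print(knn_score(train_b, query, k=2))  # scores low under pool B
```

Identical input, opposite verdicts: the “beauty” the algorithm reports is nothing more than an echo of whoever labeled its training set.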

“If you have a low score, it might just be because the judgment of these algorithms is so dependent on how they were trained,” explained Schep. “Of course, if you got a really high score, that’s just because you are incredibly beautiful.”

Another noteworthy aspect of facial analysis systems is the ease with which they can be manipulated and deceived. For instance, when Schep’s system began analyzing its subject for age, the subject fooled it into identifying him as ten years younger than he actually was simply by shaking his head.

Schep believes that facial recognition technology has given us that odd feeling of being “watched” all the time, especially as it continues to become a bigger part of our lives. He hopes to use his system to raise awareness of the long-term risks associated with such technology, chief among them the loss of our right to privacy.

“You might feel more pressure to behave ‘normally’, which for an algorithm means being more average. That’s why we have to protect our human right to privacy, which is essentially our right to be different. You could say that privacy is a right to be imperfect,” Schep said.

Hamza Zakir

Platonist. Humanist. Unusually edgy sometimes.

