News

This EU-funded AI judges your face and tells you how “normal” you are

Getting judged by society is bad enough, so you can imagine what it feels like to have an AI system expertly analyze your face and give it a rating for attractiveness. Don’t worry though; unlike the anxiety-inducing remarks you might get from people, this AI is actually judging you for a greater social cause.

Facial recognition is rife with errors and biases, be it its well-documented tendency to perform worse on darker skin tones or its inability to treat every facial image equally. And of course, there are the privacy concerns. Against this backdrop, a new website called How Normal Am I? is using algorithms to judge users’ age, attractiveness, BMI, life expectancy, and gender.

The website was created by SHERPA, an EU-funded project that explores the impact of AI on ethics and human rights. In an interview with The Next Web, SHERPA’s artist-in-residence, Tijmen Schep, explains the system and demonstrates it in detail.

The first thing the system does is ask you to face the webcam so that its algorithm can analyze your face and rate it for attractiveness. Schep explains that similar algorithms are used in dating apps like Tinder to match equally attractive people and in social media platforms like TikTok to promote content made by “good-looking” people.

The point that Schep makes with the demonstration of his system is that facial analysis algorithms are incredibly dependent on the data they are trained with. Since their training data comprises thousands of photos that are manually labeled by a group of people, and perceptions of beauty vary all over the world, any such algorithm is likely to classify someone as beautiful or ugly based on how its training samples have been labeled.
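Schep’s point can be illustrated with a toy experiment. The sketch below is purely hypothetical (the feature vectors, labeler weights, and model are all invented for illustration, not taken from Schep’s system): it trains two simple linear models on the same synthetic “faces,” each using attractiveness labels from a different group of labelers, and shows that the identical new face receives two different scores depending on whose labels the model learned from.

```python
import numpy as np

# Hypothetical illustration: the same "face" feature vectors, scored by
# two different groups of labelers, produce two different models.
rng = np.random.default_rng(0)
faces = rng.normal(size=(100, 5))  # synthetic feature vectors, not real faces

# Two labeler groups with different notions of "beauty": each weights
# the same features differently (weights are invented for illustration).
weights_group_a = np.array([1.0, 0.5, 0.0, -0.5, 0.2])
weights_group_b = np.array([-0.3, 0.1, 1.2, 0.4, -0.8])
labels_a = faces @ weights_group_a
labels_b = faces @ weights_group_b

# Fit one linear model per label set via ordinary least squares.
model_a, *_ = np.linalg.lstsq(faces, labels_a, rcond=None)
model_b, *_ = np.linalg.lstsq(faces, labels_b, rcond=None)

# The identical new face gets two different "attractiveness" scores,
# purely because the training labels differed.
new_face = rng.normal(size=5)
print("score under group A's labels:", new_face @ model_a)
print("score under group B's labels:", new_face @ model_b)
```

The model itself is identical in both runs; only the human-supplied labels change, yet the learned scoring function changes with them, which is exactly the dependence on labeling that Schep highlights.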

“If you have a low score, it might just be because the judgment of these algorithms is so dependent on how they were trained,” explained Schep. “Of course, if you got a really high score, that’s just because you are incredibly beautiful.”

Another noteworthy aspect of facial analysis systems is the ease with which they can be manipulated and deceived. For instance, when Schep’s system began analyzing its subject’s age, the subject was able to fool it into identifying him as ten years younger than he actually was simply by shaking his head.

Schep believes that facial recognition technology has given us that odd feeling of being “watched” all the time, especially as it continues to become a bigger part of our lives. He hopes to use his system to raise awareness of the long-term risks associated with such technology, chief among them the loss of our right to privacy.

“You might feel more pressure to behave ‘normally’, which for an algorithm means being more average. That’s why we have to protect our human right to privacy, which is essentially our right to be different. You could say that privacy is a right to be imperfect,” Schep said.

Hamza Zakir

Platonist. Humanist. Unusually edgy sometimes.
