Who Gets to Decide What’s “Better”?

[Image: A woman in a black dress poses confidently in front of a yellow train. The background shows trees and a clear sky. White painted lines accent the dress.]

When I first began experimenting with AI beauty filters for Facetune Portraits, I thought I was exploring aesthetics: how technology distorts faces, how algorithms shape identity. But it quickly became clear that beauty was only the surface.


This isn’t just about beauty. It’s about who gets to decide what’s “better.”

Behind every edit is a decision. Behind every “enhancement” is a hierarchy. These systems are built from millions of data points, and those data points reflect culture, bias, and power. The result is a digital mirror that doesn’t just show us what we look like — it tells us what we should look like.


The question isn’t just what AI thinks is beautiful. It’s who trained it to think that way.


My work uses those same tools to reveal how our collective insecurities have been automated, scaled, and sold back to us as self-improvement. By painting over AI distortions, I reclaim the parts of ourselves that algorithms erase — the asymmetry, the texture, the individuality.


When art intervenes, it reminds us that beauty isn’t a system. It’s a story — one that belongs to everyone.

