This weekend, the photo-editing app Lensa flooded social media with celestial, iridescent, and anime-inspired “magic avatars.” As is typical in our milkshake-duck internet news cycle, arguments about why using the app was problematic proliferated at a speed second only to that of the avatars themselves.
I’ve already been lectured about how using the app implicates us in teaching the AI, stealing from artists, and engaging in predatory data-sharing practices. Each concern is legitimate, but less discussed is a more sinister violation inherent in the app: its algorithmic tendency to sexualize subjects to a degree that is not only uncomfortable but also potentially dangerous.
Lensa’s terms of service instruct users to submit only appropriate content: “no nudes” and “no kids, adults only.” And yet, many users—primarily women—have noticed that even when they upload modest photos, the app not only generates nudes but also ascribes cartoonishly sexualized features, like sultry poses and gigantic breasts, to their images. I, for example, received several fully nude results despite uploading only headshots.

The sexualization was also often racialized: Nearly a dozen women of color told me that Lensa whitened their skin and anglicized their features, and one woman of Asian descent told me that in the photos “where I don’t look white they literally gave me ahegao face.” Another woman who shared both the fully clothed images she uploaded and the topless results they produced—which she chose to mod