Lensa’s AI-powered Magic Avatars creates realistic images of people, but for Asian women, the results are more sexualized than others’.

While I agree there is a big issue with biased and sexist training data, this entire article is about the Lensa app, which (I assume) uses the default Stable Diffusion model trained on the LAION-5B dataset.

Intentionally creating sexualized pictures is banned in their guidelines. And yet no one thought of adding a solid negative prompt that suppresses any kind of nudity or eroticism? That still wouldn't fix the training data, but at least people wouldn't unwillingly be presented with porn of their own images.
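To illustrate the point about negative prompts, here is a minimal sketch of how an app could always append a fixed safety negative prompt before generation. The helper name and term list are hypothetical; the approach assumes the generation backend accepts a negative prompt string (as the `diffusers` Stable Diffusion pipelines do via their `negative_prompt` parameter).

```python
# Hypothetical sketch: merge a fixed safety denylist into every
# negative prompt before it is passed to the image pipeline.
# The term list below is illustrative, not from the article.

SAFETY_NEGATIVE_TERMS = [
    "nudity", "nude", "naked", "nsfw", "erotic", "sexualized",
]

def build_negative_prompt(user_negative: str = "") -> str:
    """Merge the fixed safety terms with any user-supplied negatives."""
    safety = ", ".join(SAFETY_NEGATIVE_TERMS)
    return f"{user_negative}, {safety}" if user_negative else safety

# With a library like diffusers, the merged string would be passed as
# the negative_prompt argument of the pipeline call, e.g.:
#   pipe(prompt, negative_prompt=build_negative_prompt("blurry"))
```

This doesn't repair the underlying training data, but it is exactly the kind of cheap, app-level guardrail the comment is describing.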

Also, anyone can curate a dataset and fine-tune a Stable Diffusion model, so why is Lensa relying on the default model, which is more of a quick-and-dirty tech demo? They had all the tools to do this right but decided not to use even the easy, lazy ones.
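On the curation side, one low-effort step would be filtering the training metadata before fine-tuning. A minimal sketch, assuming LAION-style metadata with a predicted-unsafe score (LAION publishes such a field, commonly named `punsafe`); the threshold, helper name, and sample rows here are illustrative.

```python
# Hypothetical sketch: drop training rows that the dataset's own
# NSFW detector flagged, before any fine-tuning happens.

def filter_training_rows(rows, max_punsafe=0.1):
    """Keep only rows whose predicted-unsafe score is below threshold.

    Rows missing a score default to 1.0 (unsafe) and are dropped.
    """
    return [r for r in rows if r.get("punsafe", 1.0) < max_punsafe]

rows = [
    {"url": "a.jpg", "punsafe": 0.02},
    {"url": "b.jpg", "punsafe": 0.95},
    {"url": "c.jpg"},  # no score: treated as unsafe, dropped
]
print(filter_training_rows(rows))  # only a.jpg survives
```

Again, this is a sketch of the "easy, lazy" option, not a full fix for the bias baked into the data itself.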
