Lensa AI image app helps some trans people to embrace themselves

Tuesday, 20 December 2022 11:00 GMT

A composite image of Abbie Zeek, a stage manager in Australia, and an edited image of Abbie Zeek created by Lensa, an artificial intelligence-based photo editing tool. Abbie Zeek/Handout via Thomson Reuters Foundation

Some trans people say the app helps affirm their gender identity, though others raise concerns over bias in algorithms
  • AI photo editing apps' portraits split opinion
  • Some trans people say images help affirm their gender
  • Calls for greater diversity in AI to avoid bias

By Adam Smith

LONDON, Dec 20 (Openly) - When Junia Joplin tried out Lensa – a popular app that generates stylised images based on photographs – she saw a version of herself that had never existed but made perfect sense.

Joplin, who started transitioning as a transgender woman five years ago at age 39, said setting the app to create female images from her teenage snapshots had helped her to feel more at ease with her past.

"It was moving. Some of them looked so realistic," Joplin, an associate pastor from Toronto, told Context.

"So many of my memories don't make sense, like I'm a woman who's had a bunch of memories of some man's life imprinted on her consciousness," said Joplin.

"But seeing 'young June', it became easier to envision myself as a young girl."

Lensa, made by California-based Prisma Labs, uses artificial intelligence (AI) for its "Magic Avatars" feature that generates a selection of original portraits and cartoons based on photos.

The app asks users to upload a selection of pictures of themselves and choose whether their avatars should be shown as male, female or other.

The resulting stylised, brightly coloured, and sometimes scantily clad images have been plastered over social media feeds in recent weeks.

Lensa has drawn criticism - some users have complained that their avatars are sexualised with big breasts and little clothing, or that they reflect racial stereotypes.

Prisma Labs said Lensa results sometimes reflect biases in the millions of images that the app is trained on, despite developers' efforts to screen them out. This month, the firm updated the app to better filter out explicit image results.

But for some trans and non-binary people who struggle with gender dysphoria - or a mismatch between their gender identity and their body - the app can be affirming.

"I suffer from dysphoria around parts of my body that could be perceived as 'male'," said Abbie Zeek, 27, a trans stage manager for a theatre company in Australia.

Zeek sent her Lensa-generated results to a friend, asking if the images were true to life.

"When they showed me the ones that looked like how they saw me, I burst into tears because the woman that was looking back at me from the photos not only looked like me, she looked like the woman I wanted to be," she said.

A composite image of Philip Li, also known as Le Fil, a pop artist in London, and an edited image of Li created by Lensa, an artificial intelligence-based photo editing tool. Philip Li/Handout via Thomson Reuters Foundation

Gender diversity questions

AI tools that use existing text, audio files, images and videos to create new content are becoming increasingly realistic and more widespread.

However, some users said tools like Lensa may still not provide representation for users who reject the boundaries of binary gender.

Philip Li, an artist and performer who goes by the stage name Le Fil, selected the "other" gender option.

Li identifies as non-binary, meaning they do not see themselves as either male or female, and uses the pronouns they and them.

However, Lensa's AI added breasts to their images.

Li, who is British-Chinese, also said the pictures failed to capture their face accurately but followed stereotypes of small Asian women by softening their jaw and thinning their limbs.

"In most of the images, the face was quite distorted, as if it didn't know what to make of me - one image was a severed torso," they said.

"The images were mainly terrible in their likeness."

Li and others raised concerns that AI apps can create unattainable beauty standards for young users.

These idealised images of femininity can be particularly harmful for trans women by adding to pressure to "pass" as a woman without any question mark over their gender, Li added.

"Every stage of the trans journey is just as valid and in need of representation too", they said.

AI workforce diversity call

Art-making algorithms are trained by scanning pre-existing images, which they can then copy and remix. Lensa's Magic Avatars feature learns from more than 5 billion images scraped from user-generated platforms, stock image sites and works by famous artists.

"We value art and creativity because it requires authenticity, originality, often great skill and virtuosity, deep human insight," said Jon McCormack, a computer science professor at Australia's Monash University.

"Current artificial intelligence and machine learning systems possess none of these qualities, they are just good statistical mimics."

AI image-generating companies can ensure wider and more realistic representation across age, body type, gender, and other demographics by filtering the images they learn from, said McCormack.

Even then, it could prove difficult to avoid stereotypes - some genres such as fantasy often include gendered stereotypes and sexualised images of women, said McCormack.

For Li, this underscores the need for better LGBTQ+ representation among workers in the AI industry.

"When AI starts developing films, adverts, photoshoots, we'll just be completely forgotten again. All you'll have are hyper masculine men and super feminised women," they said.

"For AI to integrate with humans, it needs to reflect humanity, not an idealised version of it."


(Reporting by Adam Smith, @adamndsmith, Editing by Sonia Elks. The Thomson Reuters Foundation is the charitable arm of Thomson Reuters. Visit https://www.context.news)

Openly is an initiative of the Thomson Reuters Foundation dedicated to impartial coverage of LGBT+ issues from around the world.

