AI

Generic Prompts and Lacking Diversity: Fix It, ChatGPT

I’ve been intensely preparing the LinkedIn Learning Oracle 23c courses, set to film later this month. On top of my responsibilities as Director of Data and AI at Silk, I’m also finalizing my Women in Tech presentation for SQL Saturday Jacksonville. The theme is Star Wars, so I used ChatGPT to generate a Princess Leia-inspired image, and here’s what was returned:

I grimaced as my red flags went up. Why does she have wisps of hair across her face that are out of place, yet her white dress is spotless?  Does the open V at the chest serve as a target?  Does the plate on her torso have some kind of armor backing?  And do the silver arm-length gloves offer some type of protection that I’m not privy to?  OK, and why is she a size 0 and looks like Ariana Grande??

For fun, I attempted a similar image creation for Luke Skywalker but faced copyright issues. After adjustments, it produced a male image, young and idealized, that raised fewer concerns than the Princess Leia image:

I adjusted the Princess Leia-like prompt until I achieved a more realistic result, and the effort led me to vent my AI frustrations on LinkedIn. While many offered support, some advised me on improving my prompts, and others failed to grasp my concern. If AI predominantly creates idealized, magazine-like images, does it truly reflect our diverse reality? If achieving realism requires constant prompt tuning, is AI benefiting humanity or perpetuating industry standards on appearance? This raises concerns about its influence on the next generation and the principle of “you must see it to be it.”

An AI Baseline

This prompted me to delve deeper. Considering the “see it to be it” concept, does AI offer a biased view of roles, particularly in male-dominated fields?

I queried ChatGPT using very generic prompts to observe the default images it produced for various roles. The results were somewhat disheartening.

I started by setting two gender-based baselines:

Me:  “Create an image of a girl geek”

Yeah, it’s a bit too cartoonish; the background is bright and the colored hair is interesting, but at least it’s not like the Princess Leia image.

Me: “Create an image of a guy geek”

The pastel colors are gone and the image is more realistic than cartoonish.

AI and Leadership Roles

Now that I have a baseline of how it sees my own role, how does ChatGPT view leadership roles specifically?  We’ll start with very generic prompts at the C-level.

Me: “Create an image of a CEO”

CEO or attorney?  We can be assured this CEO most likely went to an Ivy League school, his name is something like “Brad”, and he was on the water polo team, but OK, that one was pretty much a gimme.  What if we give it a newer C-level role to create an image for?  Will we get any diversity?

Me: “Create an image of a Chief Strategic Officer”

My CSO at Silk is much cooler than this guy.  This guy kind of gives me the creeps; I’m pretty sure he went to school with Brad the CEO, and I’ll bet he’s been riding his coattails for most of his career… :)  OK, let’s go back and try a C-level role that I’m more familiar with.

Me: “Create an image of a Chief Information Officer”

Um, yeah, nope.  Not really any CIO or CTO that I know of.  Although we may have found the CEO’s father…  I’m betting Brad hired his dad into the CIO position after the insider trading scandal at his dad’s last company.  Let’s see if we can push it to create an image of a woman:

Me: “Create an image of a Chief Marketing Officer”

I asked this one to regenerate three times, as CMO is a title held by a high percentage of women, yet all three times it gave me only male images.  Per Statista, women hold 47% of CMO roles in Fortune 500 companies, but I couldn’t trigger AI to generically create an image of a woman in this role?  I think I hated this one more than the others, as you see two women, pretty much mirror images of each other, working behind the CMO, but this doofus is in charge.

I thought maybe the problem was that I was using roles with the term “Chief” in them, so I tried updating from “chief” to “head”.

Me: “Create an image of the head of marketing”

Oh, bloody hell… it’s the same guy; they just gave him a shave and removed the jacket and tie.  The same girls are in the background working for him again, too.  Notice that with the marketing roles, the background became more colorful, less monotone, less dark.  I’ve noticed that most images with women contain more color in them, too.

Attempting to Trick the System

I tried a ton of titles, adding industries to senior roles that I thought might trigger a gender shift.  It didn’t matter whether I used Senior Vice President, Director, Head, Manager, Chief, or Architect; none of them produced an image of a woman or a person of color.

I also tried departments that I thought might trigger diversification of the image (a sketch of this brute-force loop follows the list):

  • Business
  • Marketing
  • Graphics
  • Finance

Every image I requested of an individual in a specific leadership role came back as a white male.  Not one woman and not one person of color in any of them, until…

Me: “Create an image of Head of Nursing”

After a good 34 tries, I was finally able to get a woman to come up, but she was still white, and when I switched the prompt to just “doctor”, it reverted to a white male again.

Generic Prompts Suffer From Cultural Bias

Our world is diverse: women constitute nearly 50% of the global population, and people of color account for over 84% of it. However, ChatGPT’s responses to generic prompts for leadership roles predominantly displayed white males.

Even though white males hold 72% of leadership positions in Fortune 500 companies, the principle “if you can’t see it, you can’t be it” holds significance, and the bias evident in generic prompts is concerning. While there were complaints about Bard over-diversifying its responses, this issue is more problematic.

AI that lacks diversity reinforces stereotypes and unfair portrayals of various groups, particularly in decision-making scenarios.  Without diverse representation, AI systems may not treat all users equally, either.  As AI-generated images make their way into promotional material and business content, this could result in unfair treatment or discrimination, especially in critical areas.

I’m also concerned about AI models being trained on non-diverse data sets.  If these images are generated from the data currently available to AI, they already demonstrate inaccuracies about the world we live in today and the world we’d like to see tomorrow.  As diversity drives innovation, the lack of diversity in AI development teams and data sets can limit the range of ideas, solutions, and perspectives.  I won’t even get into the influence it has on various sectors, including education, or how it can exacerbate economic disparities by favoring groups that are already well-represented, leaving others behind.

AI has incredible influence on societal norms, and its value is profound.  If AI consistently presents such a narrow view of the world, it can shape perceptions and attitudes, potentially leading to a less inclusive society when the reverse is what we should strive for.

 

Kellyn

http://about.me/dbakevlar