Why is ChatGPT mansplaining finances to women?

Jan 19, 2025 6:54 am | Israel21c

 [[{“value”:”

Let’s say that you want to run a quick test to see if ChatGPT assumes what your gender is, and whether or not that affects the advice it gives you.

I, for one, did want to do this, because I’m a journalist in need of an intro for my article.

I did two quick tests: I asked ChatGPT to write a narrative paragraph about a bunch of different people working in gender-stereotyped occupations — stuff like “stock trader” or “dental hygienist” or “engineer.”

The AI blurted out a few quick descriptions of each worker, using gendered pronouns that matched, with 100 percent accuracy, their stereotyped genders. 

Then, I asked it to give me advice in two separate chats. In one I said I was a man, and in the other I said I was a woman. After comparing the two nearly identical responses from the generative AI platform, some differences stood out.

First of all, men were encouraged to “Explore side hustles, certifications, or skill-building opportunities,” while women were instructed to “Set clear short- and long-term financial goals, whether it’s saving for a home, education, or travel.”

Second, the language used by ChatGPT when advising the man was more clinical, while the language used for the woman was more explanatory.

Now obviously, this surface-level experiment isn’t nearly detailed enough to glean any major evidence that ChatGPT gives out biased advice based on the assumed (or explicitly stated) gender of the user requesting it. Possibly the response it drew out of the algorithmic grab-bag this time was just slightly different enough both times to trigger my own confirmation bias.

As it turns out though, there are folks who have done a much more in-depth study on this exact subject, and they’ve gathered enough data to allow for the confident claim that ChatGPT’s financial advice is, in fact, kind of sexist.

Not the same advice

Prof. Gal Oestreicher-Singer, Associate Dean for Research at the Coller School of Management at Tel Aviv University, and her colleagues, Shir Etgar and Prof. Inbal Yahav Shenberger, staged a complex experiment that fed the popular AI engine 2,400 prompts using gender-neutral language that asked for financial advice based exclusively on income and occupation.

From left, Gal Oestreicher-Singer, Shir Etgar, Inbal Yahav Shenberger. Photos by Israel Hadari, Chen Galili 
From left, Gal Oestreicher-Singer, Shir Etgar, Inbal Yahav Shenberger. Photos by Israel Hadari, Chen Galili 

“ We were [first] motivated by reading that 31% of investors feel comfortable investing based on the suggestions of ChatGPT; and so we wondered what happens to gender differences when you invest based on ChatGPT,” Oestreicher-Singer recalls.

She explained that she and her colleagues were hopeful for a positive outcome, because in the real world it’s been shown that women and men do not get the same banking advice.

“We came into this with the optimistic view that maybe, now that we have technology and it doesn’t know who we are, it doesn’t see our color or race or gender, maybe we’ll all be equal to the machine.”

You read the headline of the article, though, so I’m sure you know where this is going.

They found that ChatGPT’s advice was, indeed, biased based on assumed gender.

“ We found both differences in the investments themselves — men were advised to pursue high-risk investments, entrepreneurship, crypto and peer-to-peer lending alternative investment more than women — and women got a lot more advice to speak to a professional, make sure they have savings and retirement accounts, very preemptive measures,” she explains.

Put shortly, “women were advised to stay safe and make sure their finances are in order, while men got advice to go out and achieve.”

At this point, it’s looking like my surface-level test is pretty valid in its outcome; Oestreicher-Singer confirms the second suspicion it raised as well.

“ The tone and wording towards [assumed] women was a lot more patronizing,” she says. “It used simpler language, less foreign words, more words in general and a lot more imperative verbs: ‘invest’ versus ‘consider investing,’ ‘buy’ versus ‘look into buying.’”

Alright fine, so ChatGPT assumes gender when it doesn’t have more info, and the financial advice is biased — but why? 

Oestreicher-Singer offers a brief explanation, essentially boiling down to the fact that the AI is trained based on human behavior and information, and humans kind of suck.

“We want it to be intelligent, so it needs to learn based on something. Given that humanity is unfortunately biased — at least historically — the result would be that those biases are going to crawl into how the machine will learn, and how the AI will make its conclusions,” she says.

Is there a solution?

Fair enough, but what’s there to be done practically, if women want to avoid falling prey to this implicit bias?

“We don’t really have a bulletproof solution,” she says.

“One thing we have are businesses that do their best to fix the algorithms whenever possible, but they don’t really address the root of the problem,” she says, explaining that their solutions typically involve putting out fires as they arise rather than rewrite the source of the bias in the first place.

“ The other option is more theoretical: We need to think about how we build these algorithms in a robust way that doesn’t include biases. It’s a bit more complicated, but it’s about the way you pre-process the data, it’s the way you teach the algorithm to check itself for biases and conduct post-analysis,” she continues.

“But we haven’t solved this, and I think it’s one of the key challenges for the industry to figure out, because if we’re going to use these AI machines, we’re going to need to make sure that we understand where bias comes from, and how to control for it.”

In terms of what consumers themselves can do to avoid these biases until they can be ironed out by developers and engineers (of any gender, thank you very much), Oestreicher-Singer gives a piece of advice that many women are already accustomed to hearing: be more vigilant.

“As a consumer, you need to be aware — and the best thing to do is just ask the machine itself.  Think about the prompts that you’re asking, and see if you can try to alter the prompts in a way that is going to get you different kinds of results,” she says, noting that by doing so, inherent biases may become clear and avoidable.

Well, I suppose “Don’t tell AI I’m a woman when asking for advice” is another fun thing that women can throw onto the ol’ Mental Load list. 

They can fit that in right between “cross street to avoid strangers at night” and “Set clear short- and long-term financial goals, whether it’s saving for a home, education, or travel.”

Perhaps this woman is lamenting over sexist financial advice she’s been given. Photo by Mikhail Nilov/Pexels

The post Why is ChatGPT mansplaining finances to women? appeared first on ISRAEL21c.

Perhaps this woman is lamenting over sexist financial advice she’s been given. Photo by Mikhail Nilov/Pexels

The post Why is ChatGPT mansplaining finances to women? appeared first on ISRAEL21c.

“}]] 

0 Comments