Opinion

AI has a Gender Bias Problem – Just Ask Siri

By Contributor | 2019-09-23 | 5 Mins Read

Virtual personal assistants. Shutterstock.com

by Rachel Adams

Suggest to Samsung’s Virtual Personal Assistant Bixby “Let’s talk dirty”, and the female voice will respond with a honeyed accent: “I don’t want to end up on Santa’s naughty list.”

Ask the same question to the programme’s male voice and it replies “I’ve read that soil erosion is a real dirt problem.”

In South Africa, where I live and conduct my research into gender biases in artificial intelligence, Samsung now offers Bixby in various voices depending on which language you choose. For American English, there’s Julia, Stephanie, Lisa and John. The voices of Julia, Lisa and Stephanie are coquettish and eager. John is clever and straightforward.

Virtual Personal Assistants – such as Bixby, Alexa (Amazon), Siri (Apple) and Cortana (Microsoft) – are at the cutting edge of marketable artificial intelligence (AI). AI refers to using technological systems to perform tasks that people usually would.

They function as applications on smart devices, responding to voice commands through natural language processing, and their ubiquity is rapidly increasing worldwide. A recent UNESCO report estimated that as early as next year we will have more conversations with our virtual personal assistants than with our spouses.

Yet, as I’ve explored in my own research with Dr Nora Ni Loideain from the Information Law and Policy Centre at the University of London, these technologies betray critical gender biases.

With their female names, voices and programmed flirtatiousness, the design of virtual personal assistants reproduces the discriminatory stereotype of the female secretary who, according to that stereotype, is often more than just a secretary to her male boss.

It also reinforces the role of women as secondary and submissive to men. These AI assistants operate on the command of their user. They have no right to refuse these commands. They are programmed only to obey. Arguably, they also raise expectations for how real women ought to behave.

These assistants are also designed to free their user from menial work such as making appointments and purchasing items online. This is problematic on at least two fronts. First, it suggests the user has more time for supposedly more important work. Second, it makes a critical statement about the value of the secretarial work performed, first by real women and now by digitalised women, in the digital future.

“What are you wearing?”

One of the more overt ways in which these biases are evident is the use of female names: Siri and Cortana, for instance. Siri is a Nordic name meaning “the beautiful woman that leads you to victory”.

Cortana takes its name (as well as visuals and voice) from the game series Halo. In Halo, Cortana was created from a clone of the brain of a successful female scientist, coupled with a transparent and highly sexualised female body. She functions as a fictional aide for gamers with her unassuming intelligence and mesmeric shape.

All the virtual personal assistants on the market today come with a default female voice which, like Bixby, is programmed to respond to all kinds of suggestive questions and comments. Asked “What are you wearing?”, Siri responds:

“Why would I be wearing anything?”

Alexa, meanwhile, quips: “They don’t make clothes for me”; and Cortana replies, “Just a little something I picked up in engineering.”

Bias and discrimination in AI

It is increasingly acknowledged that AI systems are often biased, particularly along race and gender lines. For example, a recruitment algorithm recently developed by Amazon to sort resumes for job applications displayed gender bias by downgrading resumes which contained the word “women” or which referred to women’s colleges. Because the algorithm was trained on historical data reflecting the preferential recruitment of men, the bias ultimately could not be removed and the tool was dropped.
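How a model inherits bias from its training data can be shown with a deliberately simple sketch. This is a hypothetical toy scorer, not Amazon's actual system: it "learns" word weights from invented historical hiring outcomes, and because resumes mentioning "women's" were rejected in that history, the word itself ends up penalised.

```python
# Toy illustration (hypothetical data, not Amazon's system): a resume
# scorer trained on biased historical outcomes reproduces the bias.
from collections import Counter

# Invented historical data: (resume text, was the candidate hired?)
history = [
    ("software engineer chess club", True),
    ("software engineer rugby captain", True),
    ("software engineer women's chess club", False),
    ("software engineer women's college", False),
]

# "Training": count how often each word appears in hired vs rejected resumes.
hired_words = Counter(w for text, hired in history if hired for w in text.split())
rejected_words = Counter(w for text, hired in history if not hired for w in text.split())

def score(resume: str) -> int:
    """Score a resume by how strongly its words associate with past hires."""
    return sum(hired_words[w] - rejected_words[w] for w in resume.split())

# "women's" appears only in rejected resumes, so it drags the score down,
# even though it says nothing about the candidate's ability.
print(score("software engineer chess club"))          # neutral score
print(score("software engineer women's chess club"))  # penalised
```

The point of the sketch is that no one wrote a rule penalising the word: the penalty emerges entirely from the skewed outcomes in the training data, which is why such bias is hard to patch out after the fact.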

As research has shown, there is a critical link between the development of AI systems which display gender biases and the lack of women in teams that design them.

But there is rather less recognition of the ways in which AI products incorporate stereotyped representations of gender within their very design. For AI Now, a leading research institution looking into the social impact of AI, there is a clear connection between the male dominated AI industry and the discriminatory systems and products it produces.




Read more: The fourth industrial revolution risks leaving women behind


The role of researchers is to make visible these connections and to show the critical links between the representations of women, whether in cultural or technological products, and the treatment of women in the real world.

AI is the leading technology in the so-called Fourth Industrial Revolution, the wave of technological advances – from biotechnology to AI and big data – that is rapidly reshaping the world as we know it. As South Africa continues to engage with the promises and pitfalls of what this holds, it will become increasingly important to consider and address how the technologies driving these changes may affect women.

Rachel Adams, Research Specialist, Human Sciences Research Council

This article is republished from The Conversation under a Creative Commons license. Read the original article.
