Gender Bias In Artificial Intelligence And What Women Can Do About It
When my husband and I bought our first house, my income was higher than his, my credit score was better, and I’d been the one who dealt directly with the bank. (No shade to my husband. These are just the facts!)
But when we went to file the paperwork, the lender listed him as the primary buyer because the bank's operational set-up simply defaulted to the male applicant in a two-buyer scenario.
Our experience wasn't a unique one. Many institutions worldwide use guidance from artificial intelligence (AI) to help them make decisions. AI is powered by algorithms that ingest and learn from data sets, with the goal of identifying and analyzing trends. This process is called machine learning. The benefit of machine learning is that it expedites information-heavy processes – like screening for home-buying approval – because it can quickly estimate the probability that an applicant will be a reliable customer who pays back loans on time. The same is true for college applications, credit card approvals, and pretty much anything you do that requires a credit check.
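To make that concrete, here is a minimal sketch in Python of what such a screening model can look like. The features, numbers, and applicants are all invented for illustration (this is not any real lender's model), but the mechanics are the same: the algorithm learns from historical outcomes, then scores a new applicant in a fraction of a second.

```python
# A minimal sketch of a machine-learning screening model.
# All features and data are invented for illustration; this is
# not any real lender's model.
from sklearn.linear_model import LogisticRegression

# Historical applicants: [income in $1,000s, credit score, years of credit history]
X_train = [
    [85, 720, 10],
    [40, 580, 2],
    [95, 760, 15],
    [30, 550, 1],
    [60, 680, 7],
    [55, 600, 3],
]
# What actually happened: 1 = repaid on time, 0 = defaulted
y_train = [1, 0, 1, 0, 1, 0]

model = LogisticRegression()
model.fit(X_train, y_train)

# Scoring a new applicant takes milliseconds, which is why
# institutions use models like this to expedite approvals.
applicant = [[70, 700, 8]]
print(model.predict_proba(applicant)[0][1])  # estimated probability of on-time repayment
```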
But the inherent risk in relying on algorithms is the same as the reward: It lets a machine guide the decision based on a data set that you can’t review yourself.
How Gender Bias Informs Artificial Intelligence
In the scenario I encountered, the fact that I am a woman means that my creditworthiness will look lower in the eyes of an algorithm. This is because 300 million fewer women than men have access to the Internet via a mobile phone – and therefore, fewer women than men are able to report information about themselves. As a result, the data that is collected skews heavily male.
In addition, because marital status and gender could legally inform creditworthiness until the 1970s, women were significantly hindered in building the financial status and credit history that would produce an equal data set. Years of casting aside women's earning potential and financial trustworthiness resulted in an imbalanced baseline. So when an algorithm looks to history to determine the factors that would influence whether I would be a reliable buyer, it sees both that women typically receive lower credit limits and that I am a woman, and that registers as a ding against me.
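Here is a toy demonstration of that dynamic. The data below is synthetic (I have simply baked decades of discriminatory decisions into the historical approval labels), but it shows how faithfully a model reproduces the bias it is trained on: the learned weight on the "is female" feature comes out strongly negative, even though gender says nothing about whether a person repays a loan.

```python
# A toy demonstration (not a real credit model) of how historical
# bias becomes a learned penalty. The data set is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
income = rng.normal(60, 15, n)       # income in $1,000s
is_female = rng.integers(0, 2, n)    # 1 = female, 0 = male

# Biased historical labels: past approvals depended on income AND
# on gender, reflecting decades of discriminatory decisions.
approved = (income + rng.normal(0, 5, n) - 10 * is_female > 55).astype(int)

X = np.column_stack([income, is_female])
model = LogisticRegression(max_iter=1000).fit(X, approved)

# The model faithfully reproduces the bias it was trained on:
# the coefficient on is_female comes out strongly negative.
print(f"income coefficient:    {model.coef_[0][0]:+.2f}")
print(f"is_female coefficient: {model.coef_[0][1]:+.2f}")
```

Nothing in the code singles women out on purpose; the penalty is simply the pattern the algorithm found in the history it was handed.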
Reducing gender to two choices – male or female – also distorts the data, because it misrepresents anyone who would characterize themselves otherwise.
What We Can Do About It
Get involved in the process.
Because algorithms are informed through the collection of data – which is collected by people – women can get more involved in the collection process. Right now, women make up only 22 percent of the professionals in AI and data science fields. While we can't change history, we can provide important voices in unearthing and correcting the ways that bias exists.
Get curious.
When we run into examples of gender bias based on machine learning, we can question where it's coming from and share our experiences. As an example, you can fill out this tracker from the Berkeley Haas Center for Equity, Gender and Leadership to report instances where you encounter bias in AI. The center reviews reported examples for further exploration and possible inclusion in its playbooks, which help business leaders mitigate data bias in AI. With an eye toward fixing processes rather than accepting them, we can effect change.
Stay informed.
Last, as AI has moved to the fore of public conversation in recent months with the rise of products like ChatGPT, it's important for us to stay informed about how AI will impact us in the coming years. Early technology often escapes regulation, and the more data we feed it in these early days, the more we risk both contributing to and being exposed to flawed data sets. While interest in innovation is important, so is understanding our broader impact on possible inequity.