
Most Artificial Intelligence Exhibits Gender And Racial Bias, Claims New Research

A study published in the journal Science claims AI displays biases for racial and gender stereotyping.

As Artificial Intelligence (AI) comes closer to acquiring human-like language skills, it is also absorbing humans' hidden biases for racial and gender stereotyping at a subconscious level.


As AI and the algorithms that control it become part of our everyday lives, there is growing evidence that the data sets and inputs it is trained on do not reflect the full diversity of humanity.

The research focuses on a specific machine-learning technique known as “word embedding”, which has become a key part of how AI systems interpret speech and text. Word embeddings represent each word as a vector of numbers, so that words used in similar contexts end up close together in the vector space.
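For readers unfamiliar with the technique, the sketch below shows the basic idea in Python: words become vectors, and relatedness between words is measured as similarity between those vectors. The tiny four-dimensional vectors are invented purely for illustration; real systems such as word2vec or GloVe learn vectors with hundreds of dimensions from billions of words of text.

```python
import numpy as np

# Toy word embeddings: each word is mapped to a vector. The numbers below are
# made up for illustration only -- real embeddings are learned from huge corpora.
embeddings = {
    "doctor": np.array([0.9, 0.1, 0.3, 0.7]),
    "nurse":  np.array([0.8, 0.2, 0.4, 0.6]),
    "banana": np.array([0.1, 0.9, 0.8, 0.1]),
}

def cosine_similarity(a, b):
    """Similarity between two word vectors: closer to 1.0 means more related."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["doctor"], embeddings["nurse"]))   # high
print(cosine_similarity(embeddings["doctor"], embeddings["banana"]))  # low
```

Because the vectors are learned from human-written text, whatever associations exist in that text are carried into the geometry of the space.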

What You See Is What You Get

The results suggest that AI decision-making can become inherently biased along racial and gender lines, albeit unintentionally. At a basic level, AI can only teach itself from the material and influences around it, or from the information supplied by its programmers. Essentially, it’s a WYSIWYG (What You See Is What You Get) problem.

The research found, among other things, that female names were often strongly associated with artistic, family and creative terms, while male names were far more strongly attached to career-, maths- and science-related terms.
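The kind of measurement involved can be illustrated with a small sketch in the spirit of the word-embedding association tests used in the study: compare how close a name sits to one set of attribute words versus another. The word lists and the `load_vectors` helper below are placeholders; the actual study used published, pre-trained embeddings such as GloVe.

```python
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def association(word_vec, attrs_a, attrs_b):
    """Mean similarity to attribute set A minus mean similarity to set B.
    Positive values mean the word sits closer to A than to B in the space."""
    return (np.mean([cosine(word_vec, a) for a in attrs_a]) -
            np.mean([cosine(word_vec, b) for b in attrs_b]))

def load_vectors(words, dim=50, seed=0):
    # Hypothetical stand-in: in practice you would look these words up in a
    # pre-trained embedding model rather than generate random vectors.
    rng = np.random.default_rng(seed)
    return {w: rng.normal(size=dim) for w in words}

career = ["career", "salary", "office", "business"]
family = ["family", "home", "children", "relatives"]
names  = ["john", "amy"]

vecs = load_vectors(career + family + names)

for name in names:
    score = association(vecs[name],
                        [vecs[w] for w in career],
                        [vecs[w] for w in family])
    print(f"{name}: career-vs-family association = {score:+.3f}")
```

With the random placeholder vectors the scores are meaningless; with real embeddings trained on web text, the researchers found name sets splitting along exactly these career-versus-family and science-versus-arts lines.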

Cultural history

The researchers behind the study wrote: “Our findings suggest that if we build an intelligent system that learns enough about the properties of language to be able to understand and produce it, in the process it will also acquire historical cultural associations, some of which can be objectionable… Already, popular online translation systems incorporate some of the biases we study. Further concerns may arise as AI is given agency in our society… If machine-learning technologies used for, say, résumé screening were to imbibe cultural stereotypes, it may result in prejudiced outcomes… Our work suggests that behaviour can be driven by cultural history embedded in a term’s historic use. Such histories can evidently vary between languages.”

Part of the problem is that humans don’t expect AI to mislead them; we expect honest, objective answers free from bias and racism. So, while AI has been found to make decisions that are fundamentally flawed along gender and racial lines, at least it isn’t doing so deliberately. Phew!

This could be a problem for the future and direction of AI, though. While progress in AI research isn’t exponential, it is accelerating all the time. Language, however, changes far more slowly, and can take hundreds of years to evolve. It’s just sad that even AI is being tainted by our own prejudices.