Time and again, we’ve heard stories of machine learning algorithms and their developers being accused of discrimination.

Recently, I watched this video explaining how algorithms can produce racist results, which are then used in crucial decision-making systems. YouTube’s demonetisation tool was found to be biased against queer content, flagging videos whose titles contained LGBTQ-related vocabulary. In the tweet below, a Google Translate user shows how the tool missed the mark by exhibiting gender stereotypes when translating some Kiswahili phrases into English.

All of these stem from a huge problem in artificial intelligence known as bias.

Just as humans can be influenced by prejudice when making decisions, ML…

Allan Vikiru.

Software engineer currently based in Nairobi, KE • portfolio: https://allanvikiru.github.io
