As a black male living in North America, it is sometimes hard to admit the bias that is inherently present in me. This is especially difficult because most of what I see on social media and the news are stories of people like me experiencing unfair treatment because of their skin colour. It is also hard to admit my own bias because I have experienced bias myself, and that admission is clouded by the few negative experiences I have had with a few bad apples.
At Collision Conference in New Orleans, I had the privilege of listening to the talk Diversity, coding & bias in AI by Rebecca Parsons of ThoughtWorks. The talk shed light on some of the biases that exist in our society today.
AI is biased
I found the talk very interesting because Rebecca did not focus only on racial and gender bias. She gave a 360-degree view of the biases in our society and demonstrated the importance of fostering a culture of inclusivity to create an equitable tech future. My main takeaways from the talk are:
- The thoughts, intentions, and biases of the makers of an artificial intelligence are transferred to the AI.
- Accepting that we are biased is an important step in improving equitability in tech.
- Developers should not build only for people like themselves; they should build for and test with people of different backgrounds and orientations.
- An AI's actions and decisions are shaped by what its developers considered important while building it.
A No-Bias Resolution
To make any improvement or succeed in any endeavor, it is essential to first acknowledge the problem, make a plan, and set goals to solve it. Beyond planning and goal-setting, a consistent, conscious effort is required to reach a long-term solution.
Going forward, I have resolved to do three things to promote inclusivity for an equitable tech future. I encourage you to tweak and adapt them for your own situation:
1. Acknowledge and reaffirm my own bias:
I believe that acknowledging and reaffirming my biases is the first step to getting rid of them. Acknowledging them will enable me to tackle them, and reaffirming them will serve as a constant reminder and check.
2. Promote inclusion in user testing:
Advocate for people of various demographics to be brought in for user testing. This will help the products I work on serve a wider range of people and prevent situations like the personal assistant designed for doctors that worked well for male doctors' voices but not for female doctors'.
3. Speak up when someone makes an inappropriate comment:
Speaking up against inappropriate comments is something that’s close to my heart, and I want to keep working on it, especially as it relates to women. I care about this because I have three sisters, as well as other strong and powerful women in my life whom I care about and work with. Sadly, a few years ago, I did not stand up against an inappropriate comment. While having lunch with some guys, one of them said something like “Why won’t she get it? She’s a pretty girl in skirts”. I was shocked and my jaw dropped in disbelief, but despite my shock, I didn’t confront him. From that experience, I learned not to stay silent when I hear comments like that. I have been speaking up against them ever since, and I am now more determined than ever to keep doing so.
What we should all do
I am biased. You are biased. AI is biased. We are all biased.
As we continue to advance technology and develop AI that can almost pass for a person, may we collectively and individually acknowledge and check our biases, include different demographics in user research and development, and speak up against inappropriate comments made about people who are not like us.