Accessibility Tip of the Week

Be intelligent about AI.

  • Summary: When using artificial intelligence (AI), thoughtfully reduce and check for bias.
  • Who it helps: All people with disabilities, as well as other marginalized groups.
  • Additional benefits: Understanding AI bias and limitations will benefit your entire audience.

AI-based tools are increasingly used in hiring, writing, and other activities. They are often viewed as timesavers because they can quickly produce or summarize content. The challenge, however, is that AI models are often trained on datasets that reflect limited historical and cultural perspectives. As a result, these datasets often encode biases, such as overrepresenting men in leadership roles or containing little information about people with disabilities. These biases can go unnoticed unless you intentionally look for and correct them.

What can I do?

Follow these tips to improve your AI usage:

  • Take time to better understand bias in AI. We recommend starting with Christopher Land’s presentation Disability Bias in Artificial Intelligence.
  • Double-check the results of any AI tools that you use, and keep human judgment as part of the process. For example, if you use an AI tool to screen job applicants, periodically have a person review the applicants who were screened out to identify trends that discriminate against people with disabilities.
  • If you are responsible for training AI models, carefully select and curate datasets to be as unbiased as possible.
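For teams with technical resources, the periodic review suggested above can be supported by a simple statistical check. The sketch below is a hypothetical example, not part of this tip: it applies the "four-fifths rule" heuristic (a group's selection rate should be at least 80% of the highest group's rate) to screening counts. All function names, group labels, and numbers are illustrative assumptions, and such a check supplements, never replaces, human review.

```python
# Hypothetical sketch: auditing AI screening results for adverse impact
# using the "four-fifths rule" heuristic. All names and numbers below
# are illustrative, not drawn from any real screening tool.

def selection_rate(passed: int, total: int) -> float:
    """Fraction of applicants in a group who passed the AI screen."""
    return passed / total if total else 0.0

def four_fifths_check(groups: dict) -> dict:
    """groups maps a group name to (passed, total) counts.

    Returns a dict mapping each group name to True if its selection
    rate is at least 80% of the highest group's rate, False if the
    gap suggests a potential bias trend worth human review."""
    rates = {name: selection_rate(p, t) for name, (p, t) in groups.items()}
    highest = max(rates.values())
    return {name: rate >= 0.8 * highest for name, rate in rates.items()}

# Fabricated example counts, for illustration only.
results = four_fifths_check({
    "disclosed disability": (10, 40),      # 25% pass rate
    "no disclosed disability": (60, 120),  # 50% pass rate
})
# 25% is below 80% of 50% (= 40%), so the first group is flagged
# for human follow-up.
```

A failed check does not prove discrimination; it only flags a disparity that a human should investigate, which is exactly the kind of spot check the tip above recommends.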

This work is licensed under CC BY-NC-SA 4.0.

If you are promoting accessibility within your organization or community, sending out easy-to-understand tips can be a helpful addition to your strategy. You are welcome to share the tips here under the Creative Commons license mentioned above, as long as you cite Accessible Community as your source.