1: Listen to the “‘Racism is America’s Oldest Algorithm’: How Bias Creeps Into Health Care AI” episode of the Color Code podcast (via SoundCloud or Apple Podcasts). Take notes.

A: Ziad Obermeyer discussed a case of algorithmic bias that he & other medical researchers uncovered; they then worked with the company that built the health care AI to develop non-discriminatory solutions. Obermeyer found that the algorithm systematically underestimated Black patients’ need for medical care while fast-tracking white patients for the same care.

What role do flaws in model design play in producing this case of algorithmic bias (for example, flaws in the definition of success, the variables within the model, the data used to train the algorithm, etc.)? Identify & explain one limitation of using algorithmic audits to catch algorithmic bias.
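To make the design-flaw point concrete, here is a minimal toy sketch of the mechanism Obermeyer described: when health care cost is used as the training label (the “definition of success”) in place of actual health need, a group that generates lower costs at the same level of sickness receives lower risk scores. All data, group labels, and numbers below are hypothetical illustrations, not the actual model or audit from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: both groups are equally sick on average,
# but Group B generates lower health care costs at the same level
# of illness (e.g., due to unequal access to care).
n = 10_000
group = rng.integers(0, 2, n)       # 0 = Group A, 1 = Group B
illness = rng.normal(50, 10, n)     # true health need (not observed by the model)
cost = illness * np.where(group == 1, 0.7, 1.0) + rng.normal(0, 5, n)

# Flawed definition of success: treat (predicted) cost as if it
# measured health need. Here we use cost directly as the risk score
# to isolate the effect of the proxy label.
risk_score = cost

# Simple audit check: compare risk scores among equally sick patients.
sick = illness > 60
print("Mean risk score among equally sick patients:")
print("  Group A:", risk_score[sick & (group == 0)].mean().round(1))
print("  Group B:", risk_score[sick & (group == 1)].mean().round(1))
# Group B's lower scores would push equally sick patients below the
# cutoff for extra care -- the pattern Obermeyer documented.
```

Note that this kind of audit only works because the sketch has access to the true health need; in practice, an auditor often lacks a ground-truth measure of the outcome the proxy label is standing in for, which is one limitation of algorithmic audits.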

B: Consider this scenario: a software engineer & data scientist has been hired to develop a health care AI that identifies patients at risk of heart disease, but the designer knows little about the complex history of racism or how racist discrimination works.

Define care ethics & then explain how the engineer fails to meet its requirements. Identify & define the ethical elements of care the engineer fails to meet, and explain your reasoning as to how the engineer falls short of each one.
