Data for good

This breast cancer survivor wants to use data and AI to help others beat the odds

September 20, 2023 | By Deborah Lynn Blumberg

Former dental hygienist Dorothy Oteng feels lucky to be alive. At 35, the mother of two from Maryland was diagnosed with breast cancer. While Black women are diagnosed with breast cancer at a slightly lower rate than white women, they are about 40% more likely to die from the disease.

After a double mastectomy, chemotherapy and radiation, Oteng knew she wanted to be part of the solution and help improve diagnoses for other women. So she was ecstatic to be accepted into Howard University’s new Center for Applied Data Science and Analytics, which is training the next generation of data scientists to eliminate biases in artificial intelligence and address data equity issues.

“There aren’t a lot of women in data science, and especially African American women,” says Oteng, one of the nine Mastercard Impact Data Science Scholars who earned a full or partial scholarship to study at CADSA. She’s part of an inaugural class of 33, more than half of whom are women. “I want to diversify the field,” she says.

AI is playing an increasingly important role in diagnosing and treating disease. But there’s growing evidence that these algorithms can disadvantage communities of color. In one recent scientific study, researchers found that a health care algorithm used to predict which patients need extra care incorrectly identified Black patients as healthier and needing less treatment, potentially jeopardizing their health.
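To make that failure mode concrete, here is a minimal Python sketch on synthetic data. It assumes, purely for illustration, a mechanism often cited in reporting on such systems: using past health care spending as a proxy for medical need, so that a group facing barriers to care spends less at the same level of illness and is ranked as healthier. Every number and group name below is invented for the demonstration.

```python
# Illustrative sketch (synthetic data): how a proxy label can encode bias.
import random

random.seed(0)

def simulate_patient(group):
    """Return (true_need, observed_cost) for one synthetic patient."""
    need = random.uniform(0, 10)  # true medical need, identical across groups
    # Assumption for the sketch: access barriers cut group B's observed
    # spending roughly in half at the same level of need.
    access = 1.0 if group == "A" else 0.5
    cost = need * access * random.uniform(0.8, 1.2)
    return need, cost

patients = [(g, *simulate_patient(g)) for g in ("A", "B") for _ in range(5000)]

# "Model": flag the top 20% of patients by cost for extra care (a cost proxy).
threshold = sorted(c for _, _, c in patients)[int(len(patients) * 0.8)]

for group in ("A", "B"):
    # Among genuinely high-need patients (need > 7), how many get flagged?
    high_need = [c for g, n, c in patients if g == group and n > 7]
    flagged = sum(1 for c in high_need if c >= threshold)
    print(f"group {group}: {flagged / len(high_need):.0%} of high-need patients flagged")
# The cost proxy flags most of group A's high-need patients and almost
# none of group B's: same need, different label.
```

Note that the flagging rule itself never sees a group label; the bias enters entirely through the choice of proxy.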

Racial biases can seep into AI in two ways, says William Southerland, the Howard biochemistry professor running CADSA: first, through the unrecognized biases of the people who design an algorithm, and second, when the data being analyzed or interpreted is not drawn from diverse sources. In the first case, a designer’s unexamined assumptions get built into the algorithm itself; in the second, even a well-designed algorithm can only reproduce the skew of the data it learns from.

“The best way to combat both issues is diversity in the data science workforce,” says Southerland, “because a diverse workforce inherently has the collective sensitivity to recognize bias in algorithms and in the data pool. If these pathways to bias in AI are not addressed, they can exacerbate existing disparities and possibly create new challenges.”
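Southerland’s second pathway, a non-diverse data pool, can be demonstrated in a few lines. The sketch below is a toy model on synthetic data: it assumes a diagnostic signal that reads more weakly in one group, and a training pool that is 95% the other group, so the single cutoff that looks accurate overall quietly fails the underrepresented group. All values are assumptions made for the demonstration.

```python
# Illustrative sketch (synthetic data): a model fit to a non-diverse data
# pool can look accurate overall while failing a group it rarely saw.
import random

random.seed(1)

def make_example(group):
    """One (feature, is_sick) pair; the feature reads differently per group."""
    sick = random.random() < 0.3
    # Assumption: the same condition produces a weaker signal in group B,
    # e.g. a measurement calibrated on group A.
    signal = 2.0 if group == "A" else 1.0
    feature = random.gauss(signal if sick else 0.0, 0.7)
    return feature, sick

# Training pool: 95% group A (the "not obtained from diverse sources" case).
train = ([make_example("A") for _ in range(950)]
         + [make_example("B") for _ in range(50)])

# "Model": the single cutoff that maximizes accuracy on the training pool.
cut = max((c / 10 for c in range(-20, 40)),
          key=lambda c: sum((f > c) == s for f, s in train))

for group in ("A", "B"):
    sick_cases = [f for f, s in (make_example(group) for _ in range(5000)) if s]
    missed = sum(1 for f in sick_cases if f <= cut)
    print(f"group {group}: misses {missed / len(sick_cases):.0%} of sick patients")
# The cutoff tuned on the skewed pool sits near group A's signal, so it
# misses a far larger share of sick patients in group B.
```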

CADSA was launched in part through a $5 million grant from Mastercard, and one of its initial focuses is addressing bias in the AI-driven credit approval process. Banks, for example, could deny a potential homebuyer a mortgage based on their zip code because an AI model was built on the assumption that someone from a poorer neighborhood is more likely to default. And each denial feeds back into the lending data, reinforcing the pattern for the next applicant from that neighborhood.
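A hypothetical version of that zip-code pattern, again on synthetic data, is sketched below, together with the “four-fifths rule,” a common disparate-impact screen borrowed from employment law and often applied in algorithmic audits. The two neighborhoods, score distributions and cutoffs are all assumptions made for the illustration.

```python
# Illustrative sketch (synthetic data): a zip-code feature standing in for
# race or income in a credit model, and one common check against it.
import random

random.seed(2)

def applicant(zip_group):
    """(zip_group, credit_score): scores drawn identically for both groups."""
    return zip_group, random.gauss(680, 50)

applicants = [applicant(z) for z in ("affluent", "poorer") for _ in range(5000)]

def model_approves(zip_group, score):
    # Assumed learned behavior: the model absorbed a historical pattern and
    # demands a higher score from applicants in the poorer zip codes.
    cutoff = 650 if zip_group == "affluent" else 700
    return score >= cutoff

rates = {}
for group in ("affluent", "poorer"):
    decisions = [model_approves(z, s) for z, s in applicants if z == group]
    rates[group] = sum(decisions) / len(decisions)
    print(f"{group} zip codes: {rates[group]:.0%} approved")

# Four-fifths rule: flag the model if the disadvantaged group's approval
# rate falls below 80% of the advantaged group's rate.
ratio = rates["poorer"] / rates["affluent"]
print(f"impact ratio: {ratio:.2f} -> {'flagged' if ratio < 0.8 else 'ok'}")
```

In this toy setup the applicants are statistically identical; the gap in approval rates, roughly half the four-fifths threshold, comes entirely from the zip-code cutoff the model learned.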

Fixing these types of biases can help create equity while benefiting other historically locked-out or disadvantaged groups. One of the Impact Data Science Scholars plans to study ways to use data to close wealth gaps. Another, a nutritionist, wants to use data science to solve global food-insecurity and health-disparity issues.

After attending an informational session on Howard’s program, Oteng knew she wanted to apply. “It feels like a community,” she says.

So far, Oteng has taken two courses: an introduction to applied data science and a class on computational social data justice. “I loved it immediately,” she says. “Howard has a very good support system. We all help each other.”

CADSA takes an interdisciplinary approach to improving AI. Oteng is learning from professors in computer science, engineering and the arts. The hope is that bringing a broader range of perspectives to building algorithms will eliminate bias before it becomes a problem.

Ultimately, teaching at Howard and expanding the research on Black women’s health would be “a dream,” Oteng says, as a way to help future generations of data scientists and to give back. “For me, it’s been a journey,” she says, “and I’m going to see what opens up to me.”

Deborah Lynn Blumberg, contributor