SciTech

Research Spotlight: Researchers are adding fairness to automated systems

Anupam Datta, associate professor of electrical and computer engineering at Carnegie Mellon, is taking on the challenge of adding fairness to automated systems. With a $3 million grant from the National Science Foundation, Datta hopes to improve automated systems that affect healthcare, criminal justice, and advertising by making them, in a sense, more fair.

Datta said in a Carnegie Mellon press release, “A key innovation of the project is to automatically account for why an automated system with artificial intelligence components exhibits behavior that is problematic for privacy or fairness.”
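To make that idea concrete: one approach from Datta's earlier research in this vein, quantitative input influence, gauges how much a single input, such as gender, drives a system's decisions by randomizing that input and counting how often the decision changes. Below is a minimal illustrative sketch in Python; the toy model and data are invented for the example and are not from the project itself.

# Minimal sketch of measuring one input's influence on a model's decisions:
# randomize that input and count how often the decision flips. Inspired by
# "quantitative input influence"-style analyses; model and data are made up.
import random

def influence(model, rows, feature, values, trials=100):
    """Fraction of (row, trial) pairs where randomizing `feature` flips the decision."""
    flips = 0
    for row in rows:
        base = model(row)
        for _ in range(trials):
            perturbed = dict(row, **{feature: random.choice(values)})
            if model(perturbed) != base:
                flips += 1
    return flips / (len(rows) * trials)

# Toy model that (problematically) keys on gender:
model = lambda r: 1 if r["gender"] == "M" and r["score"] > 50 else 0
rows = [{"gender": g, "score": s} for g in ("M", "F") for s in (40, 60, 80)]
print(influence(model, rows, "gender", ["M", "F"]))  # a high value flags reliance on gender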

For example, some artificial intelligence systems were recently found to exhibit prejudice and bias as a result of the content they were fed.

The potential for these issues to affect industries and consumers, especially as machine learning software becomes more widespread, is what brought this research team together.
The team is made up of Matthew Fredrikson, assistant professor of computer science; Ole Mengshoel, principal systems scientist in electrical and computer engineering; Helen Nissenbaum, professor of information science at New York University; Thomas Ristenpart, associate professor of computer science at Cornell University; and Michael C. Tschantz, senior researcher at the International Computer Science Institute in Berkeley, California.

Defining privacy and fairness for these systems was one of the first challenges the team faced. As the press release notes, “Doing so is critical, since these methods are increasingly used to power automated decision systems.”
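What might such a definition look like? One widely used candidate, though not necessarily the one this team will adopt, is demographic parity: a system's positive-decision rate should be roughly equal across groups. A toy sketch in Python, with invented data:

# Illustrative sketch (not from the CMU project): demographic parity asks
# whether a model's positive-decision rate is similar across groups.

def demographic_parity_gap(predictions, groups):
    """Return the largest difference in positive-prediction rates
    between any two groups.

    predictions: list of 0/1 model decisions
    groups: list of group labels (e.g., "A", "B"), aligned with predictions
    """
    rates = {}
    for pred, group in zip(predictions, groups):
        total, positives = rates.get(group, (0, 0))
        rates[group] = (total + 1, positives + pred)
    positive_rates = [pos / total for total, pos in rates.values()]
    return max(positive_rates) - min(positive_rates)

# Example: a gap near 0 suggests parity; a large gap flags disparate treatment.
preds  = [1, 0, 1, 1, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(demographic_parity_gap(preds, groups))  # 0.5 -- group A is favored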

For example, Datta and Tschantz’s prior research found that women are, on average, shown fewer online ads for high-paying jobs than men are. Their current research will also focus on ways to uphold and promote intellectual property and copyright laws online; Fredrikson commented, “This project will be a great opportunity to... improve machine learning to be more privacy-friendly.”
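As an illustration of how such an ad-serving disparity can be checked statistically, the sketch below runs a standard two-proportion z-test on hypothetical impression counts. The numbers are invented for the example and are not the study's data.

# Hypothetical numbers for illustration only -- not the actual study's data.
# A two-proportion z-test asks whether the rate at which one group is shown
# a high-paying-job ad differs by more than chance would explain.
import math

def two_proportion_z(shown_a, total_a, shown_b, total_b):
    """z statistic and two-sided p-value for the difference in proportions."""
    p_a, p_b = shown_a / total_a, shown_b / total_b
    pooled = (shown_a + shown_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided
    return z, p_value

# e.g., ad shown to 180 of 1,000 simulated male profiles
# vs. 120 of 1,000 simulated female profiles
z, p = two_proportion_z(180, 1000, 120, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p suggests a real disparity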

Fairness and objectivity are especially needed in tech. Many videos have surfaced online of black people unable to use automatic soap dispensers because the sensors fail to detect darker skin.

Moreover, sexism is still rampant in Silicon Valley. The Google manifesto is a prime example of the tech industry’s closeted, pseudo-intellectual sexism.

Women at companies like Uber have described how much of a boys’ club those companies are.
It isn’t shocking, then, that the automated systems these tech giants build bear traces of their workplace culture.

This research is a step in the right direction. Improving the fairness of automated systems will offer the same user experience to all consumers, regardless of gender or ethnicity.