
Confronting and combatting algorithm bias at CoNECD conference

Algorithms help us make hard decisions every day. Credit card companies, job boards, and more use fast-thinking algorithms to decide, seemingly fairly, who fits their chosen criteria. But they aren't always as fair as they appear.

At the annual Collaborative Network for Engineering and Computing Diversity (CoNECD) conference in January, the Director of the MS Data Analytics Engineering program, James Baldo, presented how algorithm biases arise and where data analysts and algorithm creators could make changes to increase fairness.

"I looked at algorithm bias from a high-level technical perspective to show the audience that yes, algorithms can be biased, but that there is more to it than a yes or no analysis," says Baldo.

There are numerous aspects of algorithms that could hold bias. The data used for the algorithm, the core of the algorithm itself, and even the people interpreting the algorithm's results could be where bias sneaks into the decision-making process, says Baldo.

"Algorithms use artificial intelligence and are designed by computational data that may have an inherent and unintentional bias," he says. "Employment decisions are a good example. If an algorithm is using a sample of data to sift through applicants for a software engineering job that doesn't include many women, the algorithm could unintentionally sift women out."
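
To make the example concrete, here is a minimal sketch (not taken from Baldo's presentation) of how a model trained on historical hiring data that underrepresents women among past hires can reproduce that skew. The data, feature names, and model choice are invented purely for illustration.

# Illustrative only: a toy hiring classifier trained on synthetic,
# historically skewed data. All values are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Synthetic applicants: years of experience plus a gender flag (1 = woman).
experience = rng.normal(5, 2, n)
is_woman = rng.integers(0, 2, n)

# Historical hiring labels that (unfairly) favored men at equal experience,
# mimicking an unintentionally biased training sample.
hired = ((experience + rng.normal(0, 1, n) - 1.5 * is_woman) > 5).astype(int)

X = np.column_stack([experience, is_woman])
model = LogisticRegression().fit(X, hired)

# The learned model inherits the historical skew: predicted hire rates
# differ by gender even though experience is drawn from the same distribution.
preds = model.predict(X)
print("predicted hire rate, men:  ", preds[is_woman == 0].mean())
print("predicted hire rate, women:", preds[is_woman == 1].mean())

Nothing in the model "decides" to discriminate; the disparity comes entirely from the labels it was trained on, which is the point Baldo makes about unintentional bias in the data.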

Baldo says there have been numerous studies on algorithm bias. Still, he shared his thoughts at the conference because he felt it was important for attendees to understand the root causes of algorithm bias.

"One slide I presented discussed how we achieve fairness with algorithms. We need to look at the data and try and detect biases in it. We can train people who interpret the data and educate them on the best practices," says Baldo. "I mainly wanted my presentation to raise awareness, and I wanted to take some of the mystery out of algorithms."
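
One common way to look at the data and detect bias, sketched here as a hypothetical example rather than a method attributed to the talk, is to compare selection rates across groups. The disparate impact ratio below follows the widely cited "four-fifths rule," under which ratios below roughly 0.8 are often flagged for review; the decisions and group labels are invented.

# Illustrative only: a simple disparate impact check on model decisions.
import numpy as np

def disparate_impact(predictions: np.ndarray, group: np.ndarray) -> float:
    """Ratio of selection rates: protected group (group == 1) vs. reference (group == 0)."""
    rate_protected = predictions[group == 1].mean()
    rate_reference = predictions[group == 0].mean()
    return rate_protected / rate_reference

# Made-up model decisions (1 = selected) and group membership.
preds = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0])
group = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])
print(f"Disparate impact ratio: {disparate_impact(preds, group):.2f}")

A check like this does not prove or rule out bias on its own, but it is the kind of data-level inspection Baldo describes as the starting point for achieving fairness.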

From a data perspective, Baldo works to embed these best practices into the data analytics engineering program. "We are trying to embed education on the effects of algorithm bias into the MS program. Since it is an interdisciplinary program, we have to work together to figure out how to do that collaboratively."

Baldo sees algorithm bias awareness and prevention as crucial for building the data analytics workforce. "The social fabric of this is very important, and we have a responsibility as engineers to address it, and this was a start."