by Ted Pedersen | at Minnebar spring 2020 (canceled)
Online information systems have been shown to exhibit some of the same prejudices, biases, and even hatred found in human society. These systems can unfairly deny resources to people because of their gender, race, religion, or other characteristics. They can also powerfully propagate and amplify negative and stereotypical attitudes toward members of underrepresented groups. This problem is often referred to as algorithmic bias. In this session we will discuss various examples of algorithmic bias and some of the reasons why it occurs. We will also discuss possible responses or solutions to these problems. The session is intended to be interactive, and participants are encouraged to share their own experiences of algorithmic bias and how it may have affected them or those close to them.
Ted Pedersen is a professor of Computer Science at the University of Minnesota, Duluth. His research interests are in Natural Language Processing; most recently he has been working on detecting different forms of hate speech using a variety of methods from Machine Learning and Deep Learning. He teaches classes in NLP, AI Ethics, and the History of Computing. More research- and teaching-related information is available at link