This session is facilitated by Matt Claypotch, Meg Tobin, and Lyzi Diamond
About this session
Participants will experiment hands-on with a remixable software project designed to demonstrate how supposedly “neutral” algorithms are vulnerable to the biases of the developers who build them and the data they’re fed. They will be able to explore these concepts through a simple user interface, and dig into the underlying code as their skills allow.
Goals of this session
To teach attendees how software can inherit the biases of the people who create it and the data it is trained on.