Abstract

The field of algorithmic fairness has highlighted ethical questions which may not have purely technical answers. For example, different algorithmic fairness constraints are often impossible to satisfy simultaneously, and choosing between them requires value judgments about which people may disagree. Achieving consensus on algorithmic fairness will be difficult unless we understand why people disagree in the first place. Here we use a series of surveys to investigate how two factors affect disagreement: demographics and discussion. First, we study whether disagreement on algorithmic fairness questions is caused partially by differences in demographic backgrounds. This is a question of interest because computer science is demographically non-representative. If beliefs about algorithmic fairness correlate with demographics, and algorithm designers are demographically non-representative, decisions made about algorithmic fairness may not reflect the will of the population as a whole. We show, using surveys of three separate populations, that there are gender differences in beliefs about algorithmic fairness. For example, women are less likely to favor including gender as a feature in an algorithm which recommends courses to students if doing so would make female students less likely to be recommended science courses. Second, we investigate whether people's views on algorithmic fairness can be changed by discussion and show, using longitudinal surveys of students in two computer science classes, that they can.
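The incompatibility the abstract alludes to can be made concrete with a small, hedged sketch (not from the paper itself). The snippet below illustrates one well-known impossibility result: when two groups have different base rates, a classifier cannot simultaneously equalize positive predictive value, false negative rate, and false positive rate across them (Chouldechova, 2017). All group names and numbers here are hypothetical, chosen only to show the arithmetic.

```python
# Toy illustration (not the paper's analysis) of a standard fairness
# incompatibility: with unequal base rates, equalizing PPV and FNR across
# groups forces unequal FPRs. Uses the identity
#   FPR = p/(1-p) * (1-PPV)/PPV * (1-FNR),
# where p is the group's base rate.

def implied_fpr(base_rate: float, ppv: float, fnr: float) -> float:
    """False positive rate implied by a group's base rate, PPV, and FNR."""
    p = base_rate
    return (p / (1 - p)) * ((1 - ppv) / ppv) * (1 - fnr)

# Hypothetical groups with different base rates but identical PPV and FNR.
ppv, fnr = 0.8, 0.2
for group, base_rate in [("A", 0.5), ("B", 0.3)]:
    print(f"group {group}: base rate {base_rate:.2f} -> "
          f"FPR {implied_fpr(base_rate, ppv, fnr):.3f}")

# Output: FPR 0.200 for group A vs 0.086 for group B. Holding PPV and FNR
# equal across groups with different base rates makes equal FPRs impossible,
# so choosing which criterion to satisfy is a value judgment, not a purely
# technical one.
```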
