Capturing contextual morality
To build fairer Artificial Intelligence applications, a thorough understanding of human morality is required. Given the variable nature of human moral values, AI algorithms will have to adjust their behaviour to the moral values of their users in order to align with end-user expectations. Quantifying human moral values is, however, a challenging task that cannot easily be accomplished through, for example, surveys. To address this problem, we propose the use of game theory in longitudinal mobile sensing deployments. Game theory has long been used in disciplines such as economics to quantify human preferences by asking participants to choose between sets of hypothetical options and outcomes. The behaviour observed in these games, combined with data from mobile sensors, enables researchers to obtain unique insights into the effect of context on participants' convictions.
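As a concrete illustration of this idea, the sketch below pairs a single round of a canonical economic game (a dictator game, in which a participant splits an endowment with another player) with the context sensed at the moment of the decision. The data structures and field names are hypothetical and serve only to show how game responses and sensor readings could be logged together; they do not describe any particular deployment.

```python
# A minimal, illustrative sketch (not an actual study implementation): one
# dictator-game trial recorded together with the mobile-sensed context in
# which the decision was made. All names and fields are assumptions.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class SensedContext:
    """Context captured by the phone at the moment of the decision."""
    timestamp: datetime
    location_label: str   # e.g. "home", "work", "commuting"
    is_alone: bool        # e.g. inferred from nearby Bluetooth devices
    activity: str         # e.g. "still", "walking"


@dataclass
class DictatorGameTrial:
    """One round of a dictator game: the participant splits an endowment."""
    endowment: int        # points available to split
    amount_given: int     # points allocated to the (hypothetical) recipient
    context: SensedContext

    @property
    def generosity(self) -> float:
        """Fraction of the endowment shared with the other player."""
        return self.amount_given / self.endowment


# Example: a single logged trial answered while the participant was at work.
trial = DictatorGameTrial(
    endowment=10,
    amount_given=4,
    context=SensedContext(
        timestamp=datetime.now(timezone.utc),
        location_label="work",
        is_alone=False,
        activity="still",
    ),
)
print(f"Generosity: {trial.generosity:.0%} while at {trial.context.location_label}")
```

Collecting many such trials per participant over weeks allows researchers to relate the observed choices to the sensed context, rather than relying on a single self-reported snapshot of moral preferences.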