In settings ranging from consumer credit to poverty targeting, an increasing number of decisions are being guided by machine learning algorithms. In Kenya, over one quarter of the adult population has taken out 'digital credit' loans via mobile money, where machine learning algorithms are used to assess the likelihood of repayment based on how an applicant uses his or her phone. However, the use of algorithms raises two issues. First, if people strategically manipulate their behavior to attain desired outcomes, then decision rules may cease to be effective. Second, consumers want to know how these decisions are being made -- but disclosing the decision rule makes it easier to game the system. This paper develops a new class of estimators that are stable under manipulation, even when the decision rule is fully transparent. We explicitly model the costs of manipulating different behaviors, and identify decision rules that are stable in equilibrium. Through a large field experiment in Kenya, we show that decision rules estimated with our strategy-robust method outperform those based on standard machine learning approaches.
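The manipulation problem the abstract describes can be illustrated with a toy sketch: an applicant facing a transparent linear scoring rule raises the feature that is cheapest to manipulate per point of score gained until they cross the approval threshold. All feature names, weights, and costs below are hypothetical assumptions for illustration, not values from the paper.

```python
# Toy illustration of gaming a transparent decision rule. The feature
# names, weights, costs, and threshold are all hypothetical.

def best_response(features, weights, costs, threshold):
    """Return the applicant's manipulated features: raise the feature
    with the lowest cost per unit of score until the score crosses
    the approval threshold."""
    score = sum(weights[k] * features[k] for k in features)
    if score >= threshold:
        return dict(features)  # already approved; no manipulation needed
    # Cost per unit of score gained, for positively weighted features.
    ratios = {k: costs[k] / weights[k] for k in features if weights[k] > 0}
    cheapest = min(ratios, key=ratios.get)  # cheapest feature to game
    needed = (threshold - score) / weights[cheapest]
    manipulated = dict(features)
    manipulated[cheapest] += needed
    return manipulated

weights = {"topups_per_week": 2.0, "contacts": 0.5}   # published rule
costs = {"topups_per_week": 1.0, "contacts": 3.0}     # gaming top-ups is cheap
applicant = {"topups_per_week": 1.0, "contacts": 4.0}  # true behavior, score = 4
gamed = best_response(applicant, weights, costs, threshold=10.0)
```

In this sketch the applicant inflates only `topups_per_week`, so after disclosure the feature no longer reflects true repayment behavior; a strategy-robust rule would anticipate this best response when choosing its weights.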
Speaker: Josh Blumenstock, UC Berkeley