**Algorithm Used in Jordan to Identify Low-Income Families Found to Be Flawed and Biased**
A Human Rights Watch report focused on Amman, Jordan, revealed that an algorithm used to rank low-income families by socio-economic indicators and allocate World Bank-funded cash assistance was inaccurate and biased. The investigation, based on 70 interviews and an analysis of World Bank documents, found that the algorithm excluded people in dire need of assistance. The findings are troubling because international organizations increasingly advocate the use of algorithms in social assistance programs.
**The Use of Algorithms in Social Assistance Programs**
The World Bank has been encouraging countries to develop technologies that identify and rank people living in poverty so that cash transfers and social assistance can be targeted at them. One such program is Takaful in Jordan, which uses an algorithm that weighs indicators such as property ownership, car ownership, business ownership, household size, and electricity and water consumption to determine eligibility for assistance. Launched in 2019 by Jordan's National Aid Fund (NAF) with $1 billion from the World Bank, Takaful currently has 220,000 enrolled families.
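Eligibility systems of this kind are commonly built as proxy means tests: because household income is hard to verify, observable indicators are combined into a weighted score that stands in for welfare, and applicants are ranked by that score. The sketch below illustrates that general pattern only; the indicator names, weights, and scoring direction are hypothetical assumptions, since the actual Takaful formula has not been published.

```python
# Hypothetical sketch of a proxy-means-test ranking. The indicators,
# weights, and scoring direction are illustrative assumptions, not
# the actual Takaful formula, which has not been made public.

HYPOTHETICAL_WEIGHTS = {
    "owns_property": 0.25,       # asset ownership pushes the score up
    "owns_car": 0.20,
    "owns_business": 0.15,
    "household_size": -0.10,     # larger households read as poorer
    "electricity_kwh": 0.15,     # higher consumption reads as wealthier
    "water_consumption": 0.15,
}

def welfare_score(household: dict) -> float:
    """Weighted sum of observable indicators; lower means poorer."""
    return sum(weight * float(household.get(key, 0))
               for key, weight in HYPOTHETICAL_WEIGHTS.items())

def rank_applicants(households: list[dict]) -> list[dict]:
    """Order applicants from poorest to least poor by proxy score;
    aid then goes to as many of the lowest-scoring as the budget allows."""
    return sorted(households, key=welfare_score)
```

The criticisms in the report follow directly from this structure: whatever nuance the chosen indicators and weights miss, the resulting ranking cannot recover.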
**Inaccuracies and Biases in the Algorithm**
The Human Rights Watch study found that the algorithm failed to capture the economic complexity of life below the poverty line in Jordan. While some factors, such as households headed by women, were taken into account, other indicators, such as asset ownership and electricity consumption, proved to be poor proxies for poverty. In some cases, families were disqualified because they had recently inherited property yet lacked the income to support themselves. Individuals who owned a car less than five years old, or who owned a failing business, were automatically excluded regardless of their actual financial situation.
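The car and business rules described above behave as hard cut-offs applied before any scoring: a single binary attribute disqualifies a household outright. A minimal sketch of that logic follows; the field and function names are hypothetical, while the five-year threshold is the one reported by Human Rights Watch.

```python
import datetime

CURRENT_YEAR = datetime.date.today().year

def passes_hard_filters(household: dict) -> bool:
    """Binary pre-screening of the kind HRW describes: one attribute
    disqualifies a household regardless of its actual income."""
    # A car newer than five years old excludes the household outright,
    # even if the car is essential for work or hauling water.
    car_year = household.get("car_model_year")
    if car_year is not None and CURRENT_YEAR - car_year < 5:
        return False
    # A registered business excludes the household even if it is failing.
    if household.get("owns_registered_business", False):
        return False
    return True
```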
**Personal Stories Illustrate the Flaws of the Algorithm**
The report highlighted the story of Mariam, a woman living outside the city who depends on her car to commute and to transport water. Despite that need, the algorithm's treatment of car ownership counted against her, and she was dropped from the Takaful program in 2022.
**Algorithmic Decision-Making in other Countries**
The use of algorithms to make critical decisions that affect people's lives is not unique to Jordan. The report mentioned the case of Rotterdam, where an algorithm used to rank welfare recipients by their risk of committing fraud was found to be biased and flawed. Similarly, crime prediction software used by police departments in the U.S. has been repeatedly criticized for racial bias. In Australia, an automated system wrongly accused around 400,000 welfare recipients of overpayment and issued them debt notices in 2016.
**Concerns Raised by the Flawed Algorithm**
Amos Toh, a researcher at Human Rights Watch, expressed concern that the algorithm's rigid indicators and measurements deprived families in Jordan of support. According to Toh, these limitations overlook the nuances of each family's circumstances at any given time, with the result that people in genuine need are excluded.
**The Algorithm and the National Aid Fund**
The algorithm used in Jordan was developed by the National Aid Fund, a social protection agency, using data from the National Unified Registry. The World Bank provided a $2.5 million loan for the creation of the registry, which was managed by a third-party contractor called Optimiza. The study pointed out, however, that human errors and inconsistencies in how the data was compiled could distort individuals' rankings in the algorithm.
**Responding to the Concerns**
The National Aid Fund responded to the concerns raised by Human Rights Watch, stating that each of the 57 indicators used in the algorithm carries a certain weight and that the indicators were designed around the concept of "multidimensional poverty." The NAF also said it conducts house visits to verify the data collected in application forms and maintains a complaints and appeals process for citizens.
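Taken at face value, NAF's description amounts to a weighted composite score of the general form below, where $x_i(h)$ is the value of indicator $i$ for household $h$ and $w_i$ is its weight; the specific indicators and weights have not been disclosed.

$$\text{score}(h) = \sum_{i=1}^{57} w_i \, x_i(h)$$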
**The World Bank’s Stance**
The World Bank defended the use of algorithms to target people in poverty, calling them effective tools for stretching limited resources and reducing poverty. It has provided loans for similar projects in seven other countries in the region, including Egypt, Iraq, Lebanon, Morocco, Yemen, and Tunisia. In Tunisia alone, the World Bank has loaned $100 million to develop and integrate machine learning and AI models into the country's social registry to detect fraud and improve compliance.
**The Social Tension Generated by the Algorithm**
According to Toh, the algorithm's crude ranking of poverty can create social tension by pitting one household's needs against another's. This raises questions about the efficacy and fairness of relying solely on algorithms to determine eligibility for social assistance programs.
In conclusion, the Human Rights Watch report reveals flaws and biases in the algorithm Jordan uses to identify low-income families. The study draws attention to the limits of algorithms in capturing the complex socio-economic realities of people living in poverty, and it underscores the need for more nuanced approaches when designing and deploying algorithms in social assistance programs.