Synopsis
This book has traced a single question across two centuries, five transitions, and sixteen preceding chapters: who gets to decide what counts? The arithmetic of power has always answered that question in the same way — with instruments that appear neutral, criteria that appear technical, and models that appear objective, while encoding the priorities of those who built them in ways that are invisible to those they govern. An ethics of measurement does not begin by demanding better algorithms. It begins by demanding that the question be asked openly, answered accountably, and revisited perpetually. Transparency is the minimum. Participation is the requirement. Accountability is the right. Humility is the discipline. These are not abstract virtues — they are formal properties that mathematical systems can have or lack, and this chapter specifies what it would mean for the institutions examined in this book to have them.

The EU AI Act (2024) represents the most ambitious regulatory attempt to impose some of these properties on algorithmic systems; this chapter examines what it achieves and where it falls short — particularly in its foundation-model provisions and its enforcement gap. Foxglove’s litigation against the DWP’s fraud-detection system illustrates the central point: the opacity that transparency requirements most need to pierce is not the opacity of a sophisticated neural network but the opacity of a political decision protected from scrutiny by mathematical dress.

Timnit Gebru’s dismissal from Google for attempting to publish a paper on the risks of large language models is not exceptional. It is structural. This is what the arithmetic of power does to the people who contest it from inside.
In This Chapter
- How Mollie Orshansky’s final interview — asked whether she was proud of what she created — captures the book’s central distinction in miniature: between visibility for remedy (her intention) and visibility for management (what was done with the measure)
- How Timnit Gebru’s firing is structural rather than exceptional: the arithmetic of power’s response to internal contestation, and what this implies for the political conditions required to sustain ethical measurement work inside institutions
- How the EU AI Act’s risk-based framework and transparency requirements represent genuine progress while remaining inadequate in specific ways: the lobbying that weakened the foundation-model provisions, the enforcement gap, and the absence of meaningful participation rights for affected communities
- How Foxglove’s DWP litigation reveals this chapter’s core argument: the opacity most urgently requiring transparency requirements is not algorithmic complexity but institutional concealment of political choices
- How the Scottish Government’s developing digital rights framework — operating under devolved powers in dialogue with the Scottish Social Security Act and the Child Poverty Act — demonstrates what institutional expression of the book’s argument looks like in partial and imperfect practice
- Theoretical frameworks: How Hacking’s dynamic nominalism, Porter’s trust in numbers, and Desrosières’s politics of large numbers reveal poverty measurement as a social practice that constructs, rather than merely describes, reality
- Institutionalizing transparency: How poverty statistics agencies can make measurement choices explicit, with examples from the Office for National Statistics and the US Census Bureau showing both possibilities and limits
- Extending the right to explanation: How the GDPR’s algorithmic-accountability framework could be applied to poverty measurements, giving the measured legal standing to challenge the metrics that govern them
- Political conditions for ethical measurement: How community data sovereignty, counter-expertise funding, consultation mechanisms, and legal accountability require redistributing power, not just refining methodology

This chapter’s core argument is that ethical poverty measurement is a practice, not a methodology: it requires political conditions that no technical refinement can substitute for.

Connection Forward

Chapter 18, the Conclusion, brings the book’s five transitions into view simultaneously, gathers the Scottish thread, the gender thread, and the counter-tradition, names the choice that is being made continuously (in every design decision, every threshold, every feature selection), and asks the final question: what would it mean to count lives as if they mattered?