Is Data the Master’s Tool? Part 2
What if your data is 'The Ease of Doing Business'?
A recent Odd Lots podcast featured the brilliant Scottish sociologist Donald MacKenzie. Frustratingly, the conversation focused on the speed limits of high-speed trading and not on MacKenzie’s magnificent 2008 book, An Engine, Not a Camera. For those who were here for part 1 of the discussion of whether data is ‘the master’s tool’, MacKenzie’s a vital thinker.
His book’s title comes from Milton Friedman, who argued that economic theory was an “engine” of enquiry rather than a photographic reproduction of the world. Friedman meant this as a defence of simplification: models don’t need to be realistic, they just need to be useful.
MacKenzie took Friedman’s metaphor and turned it against him. Financial economics, he argued, did more than analyse markets. It altered them. It was an engine “in a sense not intended by Friedman: an active force transforming its environment, not a camera passively recording it.” (p12) The Black-Scholes model didn’t just describe how options were priced. Once traders started using it, options began to be priced the way the model predicted. The model made the world more like itself.
The Black-Scholes equation, published in 1973, offered a formula for calculating the "correct" price of a financial option — essentially, a bet on whether a stock will rise or fall by a certain date. Before the formula existed, options traders priced these contracts by gut feeling, experience, and negotiation. After it was published, traders began using the formula to set their prices — and, as MacKenzie documents, the actual prices in the market converged toward the model's predictions. The equation didn't discover a pre-existing truth about how options were valued. It created the truth it claimed to describe, because enough people used it that the market reorganised itself around its assumptions. When the model eventually failed — spectacularly, in the 1987 crash — the prices diverged from the formula and never fully returned. The camera had been an engine all along, and in a crisis the engine seized up.
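For readers curious what the formula actually computes, here is a minimal sketch of the Black-Scholes price for a European call option in Python. The parameter names and example values are illustrative, not drawn from the essay; the point is only that the “correct” price falls out of a short calculation from five inputs, which is precisely what made it so easy for traders to adopt.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    # Standard normal cumulative distribution, via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option.

    S: current stock price, K: strike price, T: years to expiry,
    r: risk-free interest rate, sigma: annual volatility.
    """
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# A stock at $100, a one-year option to buy at $100, 5% rates,
# 20% volatility: the model declares a single "correct" price (about $10.45)
price = black_scholes_call(S=100, K=100, T=1.0, r=0.05, sigma=0.2)
```

Note what the formula takes as given: a volatility number, a risk-free rate, and a set of assumptions about how prices move. Once every trading desk plugged in the same inputs, the convergence MacKenzie documents followed almost mechanically.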
MacKenzie borrows the term ‘performativity’ from the philosopher J. L. Austin, who distinguished between utterances that report on reality and utterances that do something. When I say “I apologise,” I’m not describing an apology. I’m performing one. MacKenzie’s argument is that economic models, data sets, and indicators can work the same way. They don’t just measure the economy. They perform it.
This matters for the question I left open in Part 1. There, I suggested that if data is more like raw material than like a tool, then the same soil data collected by a corporation could, under different ownership, serve a cooperative. I still think that’s partly true. But MacKenzie complicates this, and asks whether data is discovered or made. Some data isn’t raw material at all. Some data is an engine from the moment of its construction.
The invention of GDP
Consider GDP. It’s the most consequential number in modern political life — the figure that determines whether governments are judged to be succeeding or failing, the metric that shapes elections and policy and the allocation of trillions in capital. It is entirely made up, and yet its calculation drives them all.
In 1934, the economist Simon Kuznets presented his report to the United States Congress, “National Income, 1929–32.” The context was the Great Depression. Policymakers had no reliable measure of how badly the economy had contracted — they were flying blind. Congress commissioned Kuznets, a Russian émigré working at the National Bureau of Economic Research, to develop a system for measuring national economic output.
Kuznets delivered. But he also delivered a warning. In his 1937 report, he wrote that the welfare of a nation “can scarcely be inferred from a measurement of national income.” He understood that what he’d built was a tool with a specific, limited purpose: tracking the aggregate business cycle to prevent another depression. It was a camera pointed at a very particular part of reality.
What happened next is a case study in what MacKenzie would call performativity. During the Second World War, Kuznets’s methodology was adapted — against his explicit objections — to measure not income but total production, including military spending. Kuznets had argued that military expenditure should be excluded in peacetime, since it didn’t contribute to welfare. He lost that argument. By 1944, at the Bretton Woods conference, GDP had become the global standard for comparing national economies. Seven years after Kuznets first warned about its misuse, his camera had become an engine.
And what an engine. Once governments were measured by GDP growth, they began to optimise for it. Policies that increased measurable output — regardless of whether they improved lives — became rational. Activities that GDP doesn’t count — household labour, subsistence farming, ecological services, care work — became invisible to the policymakers whose worlds were organised around the number. As the political theorist Timothy Mitchell has argued, GDP didn’t just measure “the economy.” It constructed “the economy” as a statistical object, composed of formally defined aggregates like “demand” and “supply,” and in doing so constituted the position of the macroeconomic policymaker — a new kind of actor, responsible for managing GDP’s statistical artefacts.
For our purposes, this matters centrally. GDP is not a neutral observation of economic reality. It is a designed object, built with specific assumptions about what counts and what doesn’t. Military spending counts. Care work doesn’t. Pollution counts (as production). Clean air doesn’t. Every one of those choices is political, and every one of them shapes the world that organises itself around the number. The data performs the economy it claims only to be measuring.
The Ease of Doing Business
If GDP is the engine that nobody quite intended to build, the World Bank’s Ease of Doing Business Index was an engine from the start.
Launched in 2003, the index ranked 190 countries on how easy it was to start and run a business. The metrics were straightforward: how many procedures to register a company, how long to get a construction permit, how easy to enforce a contract. Fewer regulations meant a better business environment, which meant more growth, which meant more development. The data was an argument disguised as a measurement.
And the argument worked — performatively. Governments didn’t just accept their ranking. They began competing to improve it. India’s Prime Minister Modi made climbing the rankings a central plank of his economic agenda, coordinating across agencies and creating sub-national rankings to pressure bureaucrats. Russia’s President Putin decreed that Russia would improve its rank by a hundred places, and created a new agency to make it happen. Russia climbed from 125th to 28th — even as the actual number of new businesses being created in Russia was falling and enterprise failures were rising.
The index didn’t describe reality. It produced a reality in which governments dismantled labour protections, cut environmental safeguards, and slashed corporate taxes — not because these reforms would necessarily improve the lives of their citizens, but because they would improve their score.
And then, like MacKenzie’s counterperformativity — where the use of a model makes reality less like its own predictions — the whole thing collapsed. In 2018, the World Bank’s own chief economist, Paul Romer, revealed that the index’s data had been manipulated to penalise Chile during the presidency of the left-wing Michelle Bachelet. An independent investigation later found that senior Bank leadership — including the then-president Jim Yong Kim and the then-CEO Kristalina Georgieva (now head of the IMF) — had pressured staff to alter China’s ranking while the Bank was seeking a $13 billion capital increase from Beijing. Saudi Arabia’s and Azerbaijan’s scores were also manipulated.
In September 2021, the World Bank discontinued the index entirely.
An index designed to reward deregulation, funded by the institution that advised governments on how to improve their scores (a conflict of interest that Bank employees themselves flagged), inevitably became a site of political manipulation. When Georgieva thanked one of the inventors of the index for doing his “bit for multilateralism,” the irony was almost too neat. The data had been designed as an engine from the start.
Cameras that become engines
MacKenzie distinguishes three levels of performativity. At the weakest level, economic models simply get used — traders adopt a pricing formula, governments adopt a metric. At a stronger level, that use has effects — it changes how markets behave, how governments allocate resources. At the strongest level — what he calls “Barnesian performativity,” after the sociologist Barry Barnes — the use of the model makes reality more like the model’s predictions. The model validates itself. As Barnes observed, a metal disc is “money” if, collectively, we treat it as money. A country is “business-friendly” if, collectively, we organise around the index that says so.
When I argued that data is raw material — that soil data collected by Bayer could serve a cooperative if you changed the ownership — I was thinking about data as a camera. And some data is a camera: a soil moisture reading is a soil moisture reading whether a corporation or a cooperative collects it.
But GDP isn’t a soil moisture reading. The Ease of Doing Business Index isn’t a soil moisture reading. These are engines. They are data systems designed — consciously or not — to move the world in a particular direction. GDP was designed to measure aggregate output, and the world reorganised itself around maximising aggregate output. The Ease of Doing Business Index was designed to reward deregulation, and governments around the world deregulated.
So the question from Part 1 — “is data the master’s tool?” — needs refining. Some data is already an engine by the time it reaches you. It has assumptions built into its architecture, politics embedded in what it counts and what it ignores, a direction it wants the world to move. You can change the ownership of that data. You can put it in the hands of a cooperative instead of a corporation. But if the data itself was built to optimise for extraction — if the metric rewards deregulation, or equates welfare with output, or treats care work as invisible — then changing who holds it doesn’t change what it does.
Lorde would have been unsurprised. The master builds the measurement systems that tell you what counts as progress. If you adopt those measurements uncritically — if you take GDP or “ease of doing business” as neutral descriptions of reality rather than as engines designed to produce particular outcomes — then you are using the master’s tools even when you think you’re just reading the data.
Next time: is there any data that is just ‘waiting to be discovered’, neutral in ways that might be repurposed by the people in the house that needs dismantling?
This is Part 2 of “Is Data the Master’s Tool?” Part 1 explored whether agricultural data can serve liberation, through the story of a railway in Gujarat and a dairy cooperative that ran milk on the master’s tracks. Part 3 will ask what counter-hegemonic data systems might look like in practice.



Hi Raj. This is great and brings up 2 thoughts for me. Firstly, the discussion of GDP as a made-up engine reminds me that the current politics of the data-driven class (such as the so-called 'progress movement' and everything adjacent to or funded by Peter Thiel) is driven by an obsession with so-called 'stagnation': the idea that our economies have seen decades of stagnating growth and that only by priming something called 'innovation' can we escape from what Tyler Cowen calls 'The Great Stagnation'. Stagnation diagnoses in turn depend upon GDP, related productivity measures and other made-up metrics, and your piece reminds me these diagnoses are more engine than camera. As Thiel and his armies now convert the global economy to data and innovation bubbles, the engine looks likely to once again seize up or fall over.
Secondly, are you familiar with Kelly Bronson's book on agricultural digitalization, 'The Immaculate Conception of Data'? It's a really good study of the early days of digital farming, and her main point, reflected in the title, is that some activists, policymakers and industry alike make the mistake of assuming data is a neutral object that is just found and collected in the wild in some sort of immaculate state, whereas social scientists, communities, indigenous scholars etc know that it is a made object, formed for a purpose, infected with ideology and fraught with the intentions and politics by which it was chosen. In that sense it's not quite true that a soil moisture reading is a soil moisture reading whether a corporation or a cooperative collects it. The act of collecting involves decisions on what to include and exclude, how to measure, and what tools (with what capabilities, dual uses, ownerships, additional intel, silences) are employed. The making of data is deeply political. Anyway - probably you are headed that way in your third piece... ;-)
This is such an important piece; it makes me question the functional purpose of metrics that existed far before I was born. The good-faith assumption I carry is that the ubiquity of metrics like GDP indicates that they’re capturing an important - and perhaps even close to holistic - part of what’s going on around us. But reading this puts into context that GDP was designed in a specific historical context for a specific purpose, and therefore we should question whether it deserves to be the definitive metric that we look at to gauge the economic health of a society.