Why trying to perfectly manage a thing usually kills the thing
This post was first published on Medium
Modern statecraft is largely a project of internal colonization. — James C. Scott
A long time ago in Prussia, around the time of the Seven Years’ War, the king ordered the measurement of the forests.
Fiscal officials had become aware of an alarming shortage of wood, which meant fewer revenues, which meant fewer muskets and pikes and musketoons, which meant fewer victories against Austria, and France, and Russia.
The problem, as Prussian officials saw it, was that many of the old growth forests of beech, elm, oak, and spruce had been degraded by felling, and the kingdom’s methods of predicting new growth — basically, dividing the land into plots and expecting every plot to grow as quickly as the next — were inadequate.
And so, the task of precisely measuring the forests fell to an assortment of foresters and cameral scientists, including an inspector in Saxony named Johann Beckmann — a man we remember today for coining the term “technology.” Beckmann’s task was to forecast revenues by measuring the number and growth of a widely varying population of trees.
To do this, Beckmann and several assistants walked abreast through a plot of forested land, carrying compartmentalized boxes with color-coded nails corresponding to five categories of tree sizes. The assistants tagged each tree with the appropriate nail until the sample plot had been covered. At the end of the walk, the foresters subtracted the remaining nails from the original total and arrived at an inventory of trees by class for the entire plot.
With the trees properly tagged, the foresters could determine the average volume of wood per size class, and thus calculate how much revenue the state could expect from each individual tree. By counting the number of trees in each class, they could also calculate the revenue yield of an entire plot of forest.
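To make the arithmetic concrete, here’s a toy version of that bookkeeping. Every number in it (nail counts, volumes, prices) is invented for illustration; the method, not the data, is the point.

```python
# A toy version of the foresters' bookkeeping. All numbers are invented.
NAILS_ISSUED = 200            # nails of each color handed out per size class
nails_left = {1: 160, 2: 95, 3: 120, 4: 170, 5: 188}   # returned after the walk
avg_volume = {1: 0.1, 2: 0.5, 3: 1.2, 4: 2.5, 5: 4.0}  # cubic meters per tree, by class
price_per_m3 = 3.0            # hypothetical price per cubic meter

plot_revenue = 0.0
for size_class, remaining in nails_left.items():
    trees = NAILS_ISSUED - remaining          # nails used = trees tagged in this class
    volume = trees * avg_volume[size_class]   # expected salable wood for the class
    plot_revenue += volume * price_per_m3

print(f"Expected revenue for the plot: {plot_revenue:.1f}")
```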
The foresters then made elaborate tables summarizing entire forests, predicting inventory, growth, and yield. They had narrowed their attention to measuring single instances of beech and elm and oak, and thus could summarize entire swaths of land on a piece of paper.
In other words, they could now see the forest because they could see the trees.
But they didn’t stop there. Some trees were more profitable than others. So they carefully seeded and planted and cut groves of trees until they had created a forest that was even easier to summarize and manipulate. They transformed diverse and chaotic old-growth forest into a new, more uniform type of forest that, as James C. Scott wrote in “Seeing Like a State,” “closely resembled the administrative grid of its measurement techniques.” In other words, the grid of the trees matched the grid of the tables used to count the trees. You could say that we took the measure of our trees, and thereafter those trees became the measure of us.
To create the new forests, the scientists cleared underbrush. They planted trees in rows. They created ranks and files of trees, like soldiers. And as the forest scientists would cut one row, another identical row would take its place.
This made everything, from a management point of view, easier. Clearing land became easier (you knew what you wanted, and what you didn’t).
Felling trees became easier (you knew when a tree was valuable, and when it wasn’t, and where exactly that tree was).
Planting and measuring and forecasting became easier (you knew where to plant them, and their average size, and how quickly they grew).
And all this simplicity of order allowed for simple care. Assistants who had barely been trained could walk the rows, noting size and variation and disease. It also became easier to experiment on any one population of trees.
This science of forestry, as Scott tells it, was an attempt to make the forests “legible” to the state. With the data about forests neatly summarized, the Kingdom of Prussia could better forecast revenues, assess taxes, monitor farmland, and control populations. The state could literally “read” its forests on a piece of parchment.
This reveals a simple and timeworn maxim: You can’t improve what you don’t measure.
It doesn’t require a tremendous cognitive leap to imagine the techniques of the 18th-century European state being applied by the 21st-century companies of Silicon Valley. But where the revenue of states was tied to the measurement of land, the revenues of Facebook, Google, and Amazon are tied to the measurement of minds.
“To organize the world’s information and make it universally accessible and useful” is simply Google’s application of forestry science to web pages, and books, and terrestrial maps, and human interest in those things.
Or consider Facebook’s mantra, “to connect the world’s people.” On its face, this statement is literally true and useful. But it’s true and useful in the manner of categorizing trees with colored nails. It’s not difficult to imagine a Zuckerberg of an earlier century arguing for the adoption of his era’s organizational inventions, like the metric system in revolutionary France (“People, people! Measuring everything in centimeters connects you to the neighboring hamlet!”) or the imposition of fixed surnames on local populations in England (“People, people! Adopting the surname Robertson connects you to all the other Robertsons!”). Yes, the metric system replaced varied local systems of measurement and allowed for trade between towns. It also allowed that trade to be taxed. Yes, surnames made it easier to find people. They also made it easier to tax those same people. This is not a political argument. It’s just a fact.
In the case of both Google and Facebook, the very act of categorization means that all the world’s information and all the world’s connected people are more easily manipulated (for “good” or “evil”). Categorizing makes that manipulation easier, just as putting a child’s toys in a plastic bin makes the toys easier to move, or categorizing a town’s ethnic population makes its people easier to save or to kill, or measuring trees allows you to profit from those trees.
I mean, you could say Facebook is simply human forestry.
But what should make you raise a Spockishly curious eyebrow is how far tech’s synoptic view expands on the forestry science of 18th-century Prussia.
Most obviously, Facebook, Google, and Amazon don’t need to make legible the physical layout of the world (though some are, in fact, doing that with, e.g., Google Maps and self-driving vehicles). At the end of the day there is only so much land, and that land only comes in so many shapes, and anyway nations are only so big. The physical world is a world of limited space. You could say the physical world doesn’t scale.
Rather, Facebook, Google, or Amazon need to make legible the infinite wants and desires and the infinite needs and necessaries of an infinite variety of humans. Those humans live and work in a digital space, and the digital world is a world of unlimited vistas. Where the state wants to take a vig on every human transaction for a physical thing—property, homes, capital gains, trees—technology companies want to take a vig on every human thought that leads to a transaction of any thing, whether it’s physical or otherwise.
In fact, the very idea of considering physical space misses the point. In 18th-century Europe, it was the “what” of trees that mattered. But on 21st-century Earth, it is not simply the “what” of a human that matters. It is the “how” and the “why” of how that human acts in aggregate with other humans (also in aggregate).
In one way, this is not a new observation. Who hasn’t heard, while trying to ignore junior marketing execs talking loudly in a bar, some version of the popular saying, “If you’re not paying for something, you’re not the customer; you’re the product being sold”?
But in another way, we’ve all been slow to understand that what’s being sold isn’t so much facts — i.e., the “what” of surname, birthday, hometown, etc., which marked the early search and social networks. Instead, what’s being measured and sold today is the “how” and “why” of actual decisions.
For example, consider how the Kingdom of Prussia planted its stands of trees.
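The pattern was a staggered grid; a few lines of code can print a rough version of it (the spacing here is arbitrary, purely illustrative):

```python
# Print a small staggered grid of "trees": each row is offset half a step
# from the row before it, so straight lines radiate outward from any point.
ROWS, COLS = 5, 6
for row in range(ROWS):
    offset = "  " if row % 2 else ""   # stagger alternate rows
    print(offset + "*   " * COLS)
```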
This is a repeating quincunx. You can stand at any one point and see straight lines radiating outward. Thomas Browne saw the hand of God in this pattern. Thomas Edison apparently tattooed it on his own hand (after also inventing the first electric pen, so there’s that).
In one way, you could say that Facebook, Google, or Amazon consider their data like a quincunx. Infinite rows of objects, neatly ordered in relational databases, easily legible to managers and accessible to developers for cutting or felling or experimentation.
In another way, you could also say that Facebook, Google, or Amazon consider their users in such a manner. Infinite rows of people objects, neatly ordered in relational databases, easily legible to managers and accessible to developers for cutting or felling or experimentation.
In yet another way, you could say that Facebook, Google, or Amazon also consider their users’ decisions this way. But decisions aren’t objects. Decisions are movements amongst and between objects.
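A minimal sketch of the difference (the item names are invented; this shows the shape of the data, not any company’s actual schema):

```python
# Objects fit neatly into rows, like trees in a plot.
objects = ["video_A", "video_B", "video_C", "product_X"]

# A decision is not another row. It is a movement between objects:
# an ordered log of hops recording where a user went next.
clickstream = [
    ("video_A", "video_B"),
    ("video_B", "video_C"),
    ("video_C", "product_X"),
]

print(" -> ".join([clickstream[0][0]] + [hop[1] for hop in clickstream]))
```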
To Facebook, Google, Amazon, etc., the question began with “who” and “what.” What user, what object, what platform, etc. But eventually, these companies want to get to “why.” Why, in which categories, and in what volume do people decide to click on video N, and then N + 1, and then N + 2? Or product N + 1. Or whatever.
As with measuring a forest, knowing those clicks allows a company to forecast revenues. But creating the condition of those clicks? Predicting their pathing? That’s planting a forest in the image of your own measurement technique.
Or put it like this: Predicting the growth of trees meant creating a mental model of a forest. Predicting the clicks of humans means creating a mental model of mental models.
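One crude way to read “predicting their pathing”: count how often users hop from one item to the next, then guess that the most common hop will repeat. This is a toy first-order sketch with invented sessions; real systems are vastly more elaborate, but the shape of the idea is the same.

```python
from collections import Counter, defaultdict

# Invented click sequences from three hypothetical users.
sessions = [
    ["video_1", "video_2", "video_3"],
    ["video_1", "video_2", "product_9"],
    ["video_2", "video_3", "video_3_extreme"],
]

# Count transitions: how often does one item lead to the next?
transitions = defaultdict(Counter)
for session in sessions:
    for current, nxt in zip(session, session[1:]):
        transitions[current][nxt] += 1

# "Predict" the next click as the most common follower of the current item.
def predict_next(item):
    followers = transitions.get(item)
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("video_2"))  # -> video_3
```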
You may not be surprised to learn that, eventually, the science of forestry created a few problems.
Perfectly manicured plots of trees looked impressive, and in the first generation, they produced record amounts of salable timber. But after a single crop rotation, or about a hundred years, the newly planted forests began to wither. Some began to die.
In measuring the forests for timber, the scientists had assumed other non-measured characteristics of the forest would stay constant. They didn’t measure soil building. They didn’t measure symbiotic relations among insects and fungi. They didn’t understand the importance of deadfalls and snags. They didn’t account for the birds and mammals that contribute to soil creation.
The resulting forests were weak. They were vulnerable to storms. They were vulnerable to fires. They were vulnerable to disease. They died.
I don’t think it’s hysterical to suggest that our online communities are experiencing a similar form of forest death. By measuring “engagement” and holding all other variables of the human condition constant, our technology companies are creating weak populations vulnerable to strife, storms, fire, and disease. Partisan filter bubbles. Fake news. YouTube playlists that lead you down narrative paths that are ever more extreme.
And it’s worth noting: It’s not hard to predict what humans will click on. Any journalist who has watched Chartbeat in horror could tell you. Slideshows of lascivious teachers. Stories of grisly murders. McMansion house porn. Breaking news. Terrorist attacks. There is a simple rule governing all of this: Humans will pay attention whenever reality diverges from expectation. Pedophilia in a pizza shop? Backroom uranium sale? President boffing a porn star? Click. Click. Click.
What we’ve seen in the last few years and months is the realization that algorithms that measure engagement while holding other factors constant create divided and scared populations. Hence Facebook rejiggering its platform to rank news outlets and privilege news from family and friends.
But as hopeful as this seems, it isn’t a retreat from measuring engagement. This is a retrenchment. A fine-tuning. Because, again, humans click on the salacious. The extreme. The sensational. They click to learn how far reality diverges from expectation. Which means: if you can depict a reality with intimate fidelity, if you can create a mental map of how users make decisions, you can guarantee clicks (engagement) by presenting any information that diverges from that reality by a measured degree.
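Reduced to a caricature in code: if you hold a model of what a reader expects, you can rank stories by how far they diverge from it and serve the most divergent first. Every score, story, and scoring rule below is invented to illustrate the logic, not to describe any real ranking system.

```python
# How unsurprising the user would find each story (1.0 = fully expected, 0.0 = shocking).
# All stories and scores are invented for illustration.
user_expectation = {
    "city council passes budget": 0.95,
    "celebrity scandal": 0.40,
    "pizza-shop conspiracy": 0.05,
}

def engagement_score(expectation):
    # The caricature: attention rises as reality diverges from expectation.
    return 1.0 - expectation

feed = sorted(user_expectation,
              key=lambda story: engagement_score(user_expectation[story]),
              reverse=True)
print(feed)  # the most "divergent" (and most clickable) stories come first
```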
And that’s what these tools we’re using do. That’s what they will always do. They’re making a mental model of our mental models, and just, y’know, tweaking it.
The science of “engagement” is the science of measuring precisely how much and how often we can freak each other out.
And then taking a vig on it.
I like letters. So does Tita. She's my dog. She opens all the mail.