from Peter Radford
This is a long speculation, for which I apologize, provoked by the following:
“In an information civilization, societies are defined by questions of knowledge — how it is distributed, the authority that governs its distribution and the power that protects that authority. Who knows? Who decides who knows? Who decides who decides who knows? Surveillance capitalists now hold the answers to each question, though we never elected them to govern. This is the essence of the epistemic coup.” — Shoshana Zuboff, New York Times, January 2021
Coups are all the rage right now. Is Zuboff right that we are the victims of an “epistemic coup”? What do we even mean by an epistemic coup?
It gets complicated.
For years now we have been bombarded by articles, many of which emanate from the business consulting world where Zuboff made a living, arguing that we are moving from a world dominated by material supply and demand into a world dominated by digital supply and demand. This new world has to be differentiated from its predecessor, so people have taken to calling it the “knowledge economy”. This is trivial and well known to all of us. Information has emerged as the critical ingredient. Knowledge workers are the darlings of the media and political class. Education is the panacea for all ills, especially the so-called hollowing out of the middle class. The “learning” organization is the business idea du jour.
Well, not to be too curmudgeonly, but this is all a bit of an overreaction. Yes, we are living in a digital world, but the epistemic revolution occurred ages ago. Zuboff is right, though, to speak of an urgent need to recognize what’s going on.
What we call the Industrial Revolution, that period in history during which our recent ancestors hurtled us all out of the languor of the long Middle Ages, when progress was repressed by various aristocratic, military, and religious elites, was actually an epistemic revolution. We lose sight of this fact when we turn our gaze and get bedazzled by the feats of engineering and practical discovery that mark the onset of industrialization. More to the point, we are also likely to get lost in the various explanations of the social and political upheaval surrounding that revolution. Marxism and neoclassical economics are both utopian ideologies rooted in that explosion of prosperity, when humanity could suddenly contemplate a future stripped of drudgery. Their optimistic roots misguide and mislead us when we need to be practical and recognize the uncertainties of reality.
We can get a clue to the actual nature of the change when we consider that the word “revolution” itself is a bit controversial. We tend to think of revolutions as short, sharp events with some definitive change occurring rapidly. The Industrial Revolution is hardly such an event. It was a long slog. In some senses it lasted pretty much the entire nineteenth century and didn’t reach full fruition until well into the twentieth. The entire process of moving us out of an archaic, agriculturally-based world and into what we call an industrially-based modern world was spread over well more than a century.
Why did it take so long? Because it takes that long to create the knowledge that drove the entire thing forward. Practical inventions sometimes got ahead of the science, as with the steam engine. Sometimes it was the other way around. Innovations take time to work their way to their fullest exploitation — again, steam is a good example. But the common feature is the learning necessary to make the change both possible and practical. Learning is laborious and never without setbacks.
Not only this, but as many have argued, squeezing all that prosperity from the new ideas needs a context in which such value extraction is both socially and politically tenable: there has to be a culture in place that accepts and can absorb the change. And, as countless economic historians tell us, that culture has to gel periodically into institutions that provide the structure and support for change.
None of this is novel. But all the drivers of the Industrial Revolution are epistemic in their origins: the new knowledge and know-how, the cultural support, and the creation of the institutional framework. You can’t build the world’s first iron bridge without first conceiving of it. The idea pre-dates the event.
So, with all that said, and let me repeat my acknowledgement of its triviality, what are we to make of Zuboff’s identification of a current epistemic revolution under way?
There is clearly a sea-change going on in the nature of society and thus the economy. Our contemporary understanding of, and ability to process, information is running far ahead of our cultural and institutional ability to contain it and extract social, rather than purely economic, value from it. That distance between our capacity to process information and our ability to adapt society to that capacity is the source of great discomfort to many of us. Worse, the realization that everything, absolutely everything, can be viewed through the lens of information is even more disturbing. While undoubtedly correct, viewing everything as information requires us to adapt ourselves to the vision and not ignore its consequences.
Nowhere is this discomfort more evident than in the application of algorithmic analysis to our day-to-day lives.
One of the great factors in economics has always been the existence of consumer preferences. These are assumed to be the inscrutable, but pervasive, drivers of our economic activity. Our preferences are our wants and desires expressed, or “revealed” in the jargon, when we go out and buy things. Economists have grappled with this inscrutability. The identification of preferences is vital to the smooth functioning of the theories underlying basic supply and demand. That we cannot know them precisely, because they exist only in the minds of consumers until they are acted upon, is why economics has long had an air of being kludged together. They drive the entire market theory, yet they are ethereal: they exist only at the moment of impact. Then they disappear again into the vapor. This is why economists were reduced to the notion of revelation.
All this is being overturned by the advent of algorithmic analysis.
Now, because so much of our existence includes activity in a digital domain, our actions can be captured. Our histories can be stored. Our tendencies can be identified. We leave our preferential footprints in the sands of the internet, where electronic surveillance collects them and makes them amenable to interpretation. Our preferences are not just revealed; they are predicted. There is a machine looking inside your head. It is thinking what you are thinking.
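The mechanism needs no sophistication to be unsettling. Here is a deliberately toy sketch, with entirely hypothetical data and a made-up function name, of how captured histories become predictions: count what tends to follow what in everyone's click trails, and announce the most common successor as your "preference".

```python
from collections import Counter

def predict_next(histories, last_item):
    """A toy predictor: across all captured histories, find which item
    most often follows last_item. Real recommenders are far more
    elaborate, but the raw material is the same: logged behavior."""
    followers = Counter()
    for history in histories:
        for current, successor in zip(history, history[1:]):
            if current == last_item:
                followers[successor] += 1
    return followers.most_common(1)[0][0] if followers else None

# Surveillance in miniature: three users' (invented) click trails.
logs = [
    ["shoes", "socks", "laces"],
    ["shoes", "socks", "shirt"],
    ["hat", "shoes", "socks"],
]

print(predict_next(logs, "shoes"))  # prints "socks"
```

The point of the sketch is only this: the "machine looking inside your head" is, at bottom, arithmetic over traces you left behind, and it scales with the volume of traces collected.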
And, so powerful are these machines, and so acute are their predictions, that the owners of these machines are able to bend the arc of your preferences by putting in front of you options that you are predisposed to accept. You have become the raw material for the production and the sale of the product that you desire even before you are aware of that desire. This is a perverse digital update of Say’s Law: supply creates its own demand.
All this is just one dire consequence of the onset of the information civilization that Zuboff mentions. It is the arrival of a dystopia driven by the potential elimination of human indecision or whimsy and its replacement by prediction and the illusion of reliability dependent upon the pervasive collection and analysis of information. We are de-humanizing humanity in order to extract a profit.
Remember: all is information, information is all.
You and I are simply packages of information. That was a dull, obscure philosophical statement in the era before the explosion in our computational ability, coupled with the incessant collection of information. Now it is a practical expression of reality. We are information. We can be computed. We can be predicted. We can be reduced to machines. By other machines.
Or can we?
All life is information, but the value of that information comes in two contradictory forms. The first is the predictability and replicability of information. Codes are useful. They allow us to memorize and replicate useful things. When we unpack the code and embody its information in a physical form we get a reliable and recognizable output. But codes are only as useful as their context. Change the context and the code becomes less reliable. This is where the second value of information comes in. Feedback from its context allows an information processor to evolve. New codes can be discovered. Life adapts. But the ability to learn comes at a cost: it is a burden because it implies error and discovery, and there is potential waste in both. This redundancy in learning makes it the bane of efficiency. Pursue efficiency at all costs, and you abandon, or diminish, the ability to learn.
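The trade-off described above has a standard illustration in computing, the so-called explore/exploit dilemma, which can stand in here for the argument. The sketch below (assumed values, not from the post) uses an "epsilon-greedy" chooser: epsilon is the fraction of choices wasted on exploration. Set it to zero, that is, pursue efficiency at all costs, and the chooser can never correct a stale belief.

```python
import random

def choose(estimates, epsilon, rng):
    """Epsilon-greedy choice: with probability epsilon, explore a random
    option (accepting some waste); otherwise exploit the current best guess."""
    if rng.random() < epsilon:
        return rng.choice(list(estimates))
    return max(estimates, key=estimates.get)

rng = random.Random(0)
# Stale beliefs: the chooser "knows" old_code is better, wrongly or not.
estimates = {"old_code": 0.9, "new_code": 0.1}

# With epsilon = 0, a hundred choices never once try new_code,
# so nothing could ever update the estimates.
picks = {choose(estimates, 0.0, rng) for _ in range(100)}
print(picks)  # prints {'old_code'}
```

With any positive epsilon the chooser occasionally samples the alternative: wasteful in the short run, but the only route by which it can discover that the context has changed.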
So a challenge for an owner of an algorithm is how to get their code to learn. Or, conversely, how to get their code to unlearn any human biases expressed in the original code.
Algorithms have their origins as human constructs. They suffer from human biases. A recent example is the digital portrayal of various human beings. The portrayals are now so well expressed that they appear to be photographic representations of actual human beings. But they are not. They are artifacts of an algorithmic imagination. Or rather, of the imagination of the human that created the algorithm. So when researchers asked a machine to portray people in various forms of dress, over half the females created were in bikinis. The men were all soberly dressed in suitable office attire. Or thereabouts. Bias? Obviously. Concerning? No kidding.
We get similar biases in pretty much all algorithms. The lack of transparency in their development and application is what Zuboff cries out against. The owners of the machines own the information. That means they own us. They have tilted, dangerously, the balance towards profit and away from democracy.
Here’s another despatch from the front lines of Zuboff’s coup: when you look for something using a search engine, how do you know that what you find is valid, or that it is the full extent of the available information? You don’t. You cannot. You are reliant on the utility of the search engine. But search engines are owned by people trying to make a profit. The profit motive introduces a source of bias. What the search engine returns might engender a profit opportunity for the search engine owner. Your search is raw material for profit. You are information. Being mined for profit.
But enough.
I have lingered too long.
Two things. First, all economic growth, and especially that of our modern era, is epistemic in nature. Economies are learning devices, with the learning expressed in new opportunities, material progress, prosperity, improvements in health, and other advances. Second, the current exploitation of our impressive ability to understand and manipulate information is running far ahead of our cultural and institutional reaction to it. There ought to be an urgency in getting democratic control over the owners of the machines. It is one thing to innovate — or, to use their jargon, “disrupt” — it is another to impose biases on us or to intrude, untrammeled, into our lives. Disruption needs to be met by a counter-construction that satisfies all of us.
We are information and we own ourselves. So we own our information. We have a right to determine its fair use. Such algorithmic justice is a new concept. We will be hearing a lot more about it as we re-balance, once more, the contest between capitalism and democracy.
The capitalists, as usual, are ahead of us. We, as usual, must rein them in. Quickly.