A new economy is emerging. And this new economy is powered by a new type of fuel: data. As the data economy becomes increasingly prominent, there are troubling signs that it is worsening existing power imbalances, and creating new problems of domination and lack of accountability. But it would be wrong simply to draw dystopian visions from our current situation. Technological change does not determine social change, and there is a whole range of potential futures – both emancipatory and discriminatory – open to us. We must decide for ourselves which one we want.
This is the second of four papers exploring power and accountability in the data economy. These will set the stage for future interventions to ensure power becomes more evenly distributed. This paper explores data and algorithms in the labour market, while future papers will examine the companies built on data that mediate our interface with the digital world, and the data economy more broadly.
Our research so far has identified a range of overarching themes around how power and accountability are changing as a result of the rise of the digital economy. These can be summarised into four key points:
- Although the broader digital economy has both concentrated and dispersed power, data is very much a concentrating force.
- A mutually reinforcing government-corporation surveillance architecture – or data panopticon – is being built, that seeks to capture every data trail that we create.
- We are over-collecting and under-protecting data.
- The data economy is changing our approach to accountability from one based on direct causation to one based on correlation, with profound moral and political consequences.
This four-part series explores these areas by reviewing the existing literature and conducting interviews with respected experts from around the world.
The labour market has always been a delicate balance between workers and employers. History is in some sense the story of employers trying to get the most out of their employees while workers organise and fight for power and more control over their lives. The introduction of data-gathering technology, and the analysis and use of the data it produces, has disrupted that balance and shifted power firmly back to employers. This is especially true within the new on-demand labour platforms like Deliveroo or Amazon Mechanical Turk, but it is also filtering into all areas of work. We have identified a number of major issues related to data and labour:
- The extension of surveillance tools, both temporally and spatially, creates a Panopticon-like scenario: even though workers know they are probably not being directly monitored at all times, the fact that they could be monitored at any moment elicits a psychological response equivalent to permanent surveillance.
- Many companies that are gathering and analysing data about their workers frame it as beneficial for everyone. The potential benefits are, however, highly skewed towards management, and in practice allow for the intensification of work and the reduction of the workforce.
- Employers are increasingly using algorithms as a tool to obscure the specific decisions being made. At the same time, the black-box nature of algorithms and the difficulty of questioning their decisions lead to a loss of accountability.
- Although there is a hope that data and algorithms can work to remove individual bias, critiques suggest that algorithms are often blind to biases inherent in their training data, with companies rarely, if ever, recording false negatives.
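The false-negative problem in the final point can be made concrete with a toy sketch (not drawn from the paper; the data and the hiring scenario are entirely hypothetical). If a firm trains a screening model only on the people it previously hired, rejected-but-qualified applicants never appear in the training data, so any bias in past rejections is invisible to the model:

```python
# Hypothetical historical hiring records: (score, group, hired).
# Group "B" applicants scored highly but were rejected, so they
# become unrecorded false negatives.
applicants = [
    (80, "A", True), (75, "A", True), (90, "B", False),
    (85, "B", False), (70, "A", True), (95, "B", False),
]

# Training data is built only from people who were actually hired,
# because outcomes (job performance) exist only for them.
training = [(score, group) for score, group, hired in applicants if hired]

# Every training example comes from group "A": the rejected
# group-"B" applicants are invisible, so the historical bias
# cannot be detected from the training data alone.
groups_seen = {group for _, group in training}
print(groups_seen)  # → {'A'}
```

The point of the sketch is that the bias lives in which records exist at all, not in any single field of the data, which is why auditing the model without auditing the data-collection process misses it.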