Ideas that make a difference
Every three months the Design Column at Museum Boijmans van Beuningen focuses on a news item in the form of a small exhibition. The column is a place where new ideas are made visible, where the power of imagination is given expression. Designers and artists are especially interested in experimental imagination. With their idiosyncratic vision, they see things differently and are capable of bringing about change. The Design Column creates a space for these innovative concepts.
The Design Column is not only a presentation but also an opportunity for reaction and dialogue. Everyone is cordially invited to participate in roundtable conversations. If you would like to participate in this conversation, you can contact the curators at email@example.com.
Blog Design Column
Accompanying each Design Column the museum keeps a blog. Here you can find reactions and up-to-date information on the current and previous editions – [Link]
Design Column #6 is called Dataism.
The topic is all over the news – governments and corporations gathering data about people.
Here is the exhibition brief
Sharing information has never been as easy as it is in today’s networked society. Email and brief online messages play important roles in everyday information transfer. Information is no longer transferred directly from one person to another, but is part of an expanding network of ‘big data’.
While this ‘dataism’ brings with it a great degree of freedom, it also has huge implications. Telephones are tapped, our medical records are stored centrally in an electronic file, cookies track our internet history and we can be found wherever we are through the global positioning system (GPS) in our mobile phones. Details from this huge quantity of information, also known as ‘big data’, acquire significance when they are combined into a relevant story by someone or something. But how are these stories put together, and does that interpretation reflect reality?
Design Column #6 Dataism explores the role of the individual in this ‘big data’ scenario. It is up to us not to lose ourselves in an illusion of omniscience and remain aware of the consequences of this system of networked information.
My reaction – a comment on the politics of data
The fascinating works in the latest exhibition Design Column #6 explore the rich human and philosophical dimensions of data.
I became quite absorbed by them. So much so that my commentary below grew to several pages. So I begin with a summary.
I was prompted to ask – Where did the enormous investment and confidence in data come from, especially when they’re also somewhat threatening and mysterious? Data are a key component of a scientific view of things, but now that corporations and governments are recording so much, we feel suspicious that our privacy and sense of self are being challenged. The typical response from governments and corporations has been that data collection can be taken too far, but that there’s nothing inherently wrong with collecting data about anything, especially if it concerns security and profit.
I have found all this difficult to fathom. Surely we simply need data to sort out the facts, but how can this sometimes be a matter of good taste and sensitivity to people’s feelings?
I dug around a little in the history of some old philosophical questions.
Is secure knowledge based on direct encounters with things (as captured by data)? Responses today to the criticism that government and business are collecting too much private data have not involved the doubt that data are somehow always compromised. But can description, at the heart of the notion of data, ever be neutral and objective? Can a person be objectively described by data, or by any other means? Can description be separated from interpretation, ideology and politics? If so, how do we make an objective description of the world the basis for policy and government – how do we reconnect description and interpretation in politics, when they have been kept separate in producing (social) scientific knowledge? Way back in the eighteenth century David Hume pointed out that there is no obvious way to derive a series of statements about what ought to be the case from a series of statements of what is the case.
It seemed that a key term in all this was security, and a host of connected notions like confidence, anxiety, threat, trust: seeking a secure basis for knowledge; keeping our business or state intelligence secret and secure; having confidence in the intelligence agencies who promise security in the face of terrorist threat; finding a secure basis for business and economic decision making; trusting a corporation with our personal data; having confidence that government is based upon an awareness of what is happening in the world, rather than on dogma and ideology.
So I realized these philosophical questions are actually about politics and business management.
Julian Oliver – Transparency Grenade
The first scientists wanted to escape the dogma of the Church and the power of the Prince and base knowledge on what they observed in experiments and fieldwork. But it has not proved possible to provide a sound philosophical case that data, records produced from direct observation, are inherently neutral and value-free, by arguing, for example, that data somehow connect directly with reality. Data are always collected for a particular purpose or from a certain viewpoint, with certain values in mind: data are always inherently biased in this way. Data collection also always involves possibilities of error.
So how do you justify trust in the data gatherer? In the seventeenth century natural philosophers trusted that the honor of a gentleman scholar would prevent him from bearing false witness. By the nineteenth century data collection and processing were being handed over to professionals who were meant to follow standardized procedures, such that they could be audited on their performance. This separated data collection from processing from summary from decision making and policy. A division of labor and management model offered a solution to a philosophical problem.
I think we are now seeing the results of this break-up of the process of building knowledge of our world. We need to reconnect, democratically, with the decision making that determines what we want to know so that we all can share in the open and transparent construction of knowledge, seeking a bigger picture of the world we inhabit.
OK – that’s the summary.
Here’s the detail –
Information about so much is now so readily available. Information technology has become a defining feature of daily experience. We live in an information society with a knowledge economy where the generation of information, its processing, transmission and transformation into knowledge and experience have become fundamental sources of productivity and power, and indeed entertainment.
Industrial mass manufacture from the eighteenth century required new instruments and measures, new kinds of information to devise and regulate production, distribution and consumption: determining the precise chemical composition of a metal alloy so that quality and consistency of manufacture might be maintained; determining the necessary supply of coal to fuel the furnaces. Now the production of intangible information is an end in itself, with actionable knowledge the primary focus, rather than energy and matter.
Here’s a sketch of what is happening.
Walmart, the American multinational retail corporation with 8500 stores all over the world, records more than 1 million customer transactions every hour. The details of location, price, quantity, and much more are imported into databases estimated to contain more than 2.5 petabytes (PB) of data. A petabyte is a million gigabytes (GB), or enough to digitally store 100 million high resolution photographs (@10 MB each) – the equivalent of 167 times the information contained in all the books in the US Library of Congress.
The NASA Center for Climate Simulation stores 32 PB of climate data that it uses to simulate the world’s weather and environmental change on its “Discover” supercomputing cluster.
eBay, the e-commerce giant, stores almost 90 PB of data about customer transactions and behaviors associated with $3500 of product sales a second, with 500 million concurrent live auction listings, split into more than 50,000 categories. eBay sites have more than 100 million active users, generating up to 100 terabytes (TB – a thousand gigabytes) of new data each day. The company has a special custom computer system called “Singularity” that performs deep-dive analysis of customers and transactions, so as to identify patterns and trends, to get to know the business better.
The Utah Data Center is currently being constructed by the United States National Security Agency at a cost of some $2 billion. The facility contains 100,000 square feet of computer servers running on 65 megawatts of electricity at a cost of about $40 million a year. It is part of the NSA’s plan to expand its data gathering and processing capability to handle yottabytes of data. A yottabyte is a billion million gigabytes (10^24 bytes). Such a massive capacity anticipates that Internet traffic will pass a zettabyte – a thousand exabytes, or a thousandth of a yottabyte – a year by the end of 2016, spread across 2.7 billion Internet and mobile media users. The data gathered and stored by the NSA come from a variety of sources that have been the subject of much attention since Edward Snowden, a contractor for the NSA, leaked a slew of secret documentation to the press in June 2013: they include surveillance satellites, telephone intercept stations, arrangements, approved or not, with Internet and communications companies such as Verizon, Yahoo and Google to look at the data they keep on their customers, and just plain old intelligence snooping.
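These quantities become a little easier to grasp with some arithmetic. Here is a quick sketch in Python, using the decimal definitions of the storage units, as the figures above do:

```python
# Decimal (SI) storage units, as used in the figures above.
KB, MB, GB = 10**3, 10**6, 10**9
TB, PB, EB = 10**12, 10**15, 10**18
ZB, YB = 10**21, 10**24

# Walmart's estimated 2.5 PB, expressed as 10 MB photographs:
# a single petabyte alone holds 100 million of them.
photos = 2.5 * PB / (10 * MB)
print(photos)  # 250 million photographs

# A yottabyte is a billion million gigabytes.
assert YB == 10**9 * 10**6 * GB

# A zettabyte is a thousand exabytes, or a thousandth of a yottabyte.
assert ZB == 1000 * EB == YB / 1000
```

Nothing here is specific to any one data center; it simply shows how quickly the prefixes compound.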
Data gathered needs filtering, searching, analyzing and interpreting. eBay and Walmart and just about every corporation today seek to learn about their customers’ desires among the web browsing and social media chat, the store visits and transactions. The NSA aims to spot suspicious communications among the vast sea of everyday conversation, and wants to break through any encryption to get a clear view of what people try to keep secret. NASA and other agencies want to model the world’s climate and anticipate what changes are coming. Big data demand super computers at the limit of number-crunching capability.
I suggest that we need to be careful with terms, if we are to see into this post-industrial world.
People and machines collect data. They may be raw observations, values, typically numbers, but they may also be names, words, locations, any kind of basic record. Data don’t speak for themselves; they are mute. Working with data is a search for meaning. We can summarize data visually, using graphs, maps, statistics. Data analysis, using complex or even quite simple statistics, can find patterns we might not otherwise see. Data are useless until they are turned into information that can help build insights or knowledge: about the way the world’s climate works, for example, about how and why people behave the way they do, about how the economy works, or just simply about what people are actually up to in their lives and dreams.
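The move from mute data to information can be shown in miniature. A toy sketch, with entirely invented hourly sales figures, where a bare summary statistic says little on its own but a simple analysis recovers a pattern:

```python
# Hourly sales counts for one imaginary store (invented figures).
sales = [3, 4, 2, 5, 30, 28, 31, 6, 4, 3]

# A bare summary: the mean on its own is mute.
mean = sum(sales) / len(sales)

# A simple pattern search turns the same data into information:
# which hours stand well above the average?
peak_hours = [hour for hour, s in enumerate(sales) if s > 2 * mean]

print(mean)        # 11.6
print(peak_hours)  # [4, 5, 6] – a midday rush the mean alone conceals
```

The numbers are the same throughout; only the questions asked of them change.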
The purpose? I would like to be idealistic and say that gaining insights by working on data aims to deliver better decisions, rooted in knowledge rather than ideology, that will make the world a better place to live in. More accurately I think big data are intended to inform decision making and policy, to optimize the process of government and business. It is clear that agencies, states and corporations will go to extreme lengths to achieve this supposed end.
an attitude and a conundrum
There’s an attitude towards data assumed in all this colossal effort. Data are, as the word signifies, “given”. Though they are collected by people and machines, gathered by sensors and according to procedures, data are typically, after appropriate filtering for errors, accepted as they stand, as observations about the way the world is. We have every reason to desire and cherish, to invest in such apparent direct access to the world. Nevertheless, as historian, social scientist and archaeologist, I am very aware that this attitude is part of a long and political campaign to separate observation from systematic accounts of the world, to sever the connection between description and analysis and interpretation.
That there’s something awkward about this attitude comes out in the debates at the moment about privacy. Should Google be keeping every item of data about its users’ activities? Should state surveillance of the sort pursued by the NSA involve accessing just about every recorded act in daily life, public or private? The answer given by some politicians has been that if you haven’t done anything wrong, you don’t need to be concerned about the data gathered by security and intelligence services. Google asks you to share your private data so that the corporation can put it all together and offer you a better service, and even make the world a better place. Surely the Google geeks in California are only computing, not conspiring to take over the world. I believe many of us feel very suspicious about these answers to the question of privacy, even though we have all signed privacy agreements. Google and Government assume the attitude that data are neutral observations, though they can be made to be very useful. We are asked to trust those who gather and use data.
The attitude is connected to a conundrum. If the acquisition and processing of vast amounts of data are the core of international security and informed decision making and policy, why have there been such spectacular failures to see what is going on in the world, when the world seems so out of control? Notoriously, the economic crash of 2008 came as almost a complete surprise to those governments, banks and economists who are some of the greatest users of big data analysis. And what is the reward for releasing so much data about your private life to Google or Facebook? More targeted advertising? Anticipating your purchasing decisions? Certainly the reward is bigger profit for the corporations. It is impossible to assess how much insight is being delivered by ubiquitous security surveillance because, Catch-22, it’s kept a secret in the security world of NSA “never-say-anything”. But the story of US foreign and anti-terrorist policy is hardly one of spectacular success. Will a yottabyte of information about everyone’s everyday life actually improve democracy? Or indeed, have all the data and information about climate change, and the overwhelming scientific consensus about human involvement, led to radical policy change? The connection between data and decisive political action seems anything but straightforward.
I’ll return to this attitude and conundrum later. Here I mention that at the heart of both is an old philosophical problem—how do we move from observation and encounter to data to information to knowledge to policy, decision making and government?
The exhibits in Design Column #6 probe this world of big data and prompt us to think deeper, particularly by modeling artifacts and behaviors that reveal the attitude and conundrum I have just outlined.
Adam Harvey in CV Dazzle, ‘Anti-Drone’ Wear, and DNA Spoofing [Link] offers ways of confusing and escaping surveillance and the collection of data about your movements. Cameras are watching from above and on the street. They can recognize you. But not if you’re wearing make-up that alters your appearance to such an extent that automatic facial recognition software no longer works, or wearing a hoody that makes infra-red detection by drones (unmanned aircraft) impossible. With ‘DNA Spoofing’ you encrypt your genetic material, leaving not even a microscopic trace that can be identified by forensic science. To some this may all seem like the kind of thing desired by those who would wish to subvert authority and the state. The hoody has become a mark of someone wishing to signify some antipathy towards authority. But what, then, of Adam’s burkas, those symbols of Islamic women hidden from gaze? Adam Harvey is asking us to consider the processes of observation, detection and recognition at the heart of data collection. Are they as secure and neutral as they purport to be? What of the misrecognized targets of drone assassination?
Misrecognition is the core of Mirror Piece by Marnix de Nijs [Link]. You look into what seems like an ordinary mirror. But it is much more. Your face is scanned and matched against a database of famous people. A mechanical voice then publicly identifies you as one of these famous – or more accurately, notorious – people – murderers, criminals, terrorists. The controversial characteristics of these figures are also announced. The public denunciation links you, of course, to all kinds of suspect practices. You become the criminal accused, sharing their physiognomy.
Again we’re prompted to consider the process of data gathering and pattern matching. Here the connection is an old one. Criminology in the nineteenth century was one of the first fields to use physical attributes, facial features, fingerprints, anthropometric, biometric and then psychometric data, for identification and management of a criminal population.
Statistics as a mirror of reality? Mirror Piece even poses the question – Do you recognize yourself? Do you know who you really are? I am prompted to recall Oscar Wilde’s The Picture of Dorian Gray – the painted portrait of the beautiful young Dorian witnesses the decay and corruption of his soul in a life of debauchery, while Dorian remains youthful and unblemished by his lifestyle.
Private or public. Secret or transparent. Here is how Julian Oliver describes his Transparency Grenade [Link]
The lack of corporate and governmental transparency has been a topic of much controversy in recent years, yet our main tool for encouraging greater openness is the slow, tedious process of policy reform.
Presented in the form of a Soviet F1 hand grenade, the Transparency Grenade is an iconic cure for these frustrations, making the process of leaking information from closed meetings as easy as pulling a pin.
Equipped with a tiny computer, microphone and powerful wireless antenna, the Transparency Grenade captures network traffic and audio on site. When you pull the pin and set off the grenade, it securely and anonymously streams everything to a dedicated server where it is mined for information. Email fragments, HTML pages, images and voice extracted from this data are then presented on an online, public map, shown at the location of the detonation.
The metaphor is a powerful one, with its references to military, state and terrorist force and intervention, as well as to the key figure and role of the witness and whistleblower. We are seeing more and more how privacy, what some want to keep secret, or personal, or secure, is so intimate with power, wealth and resources. And there is a politics as well as ethics to unconcealing what may remain hidden.
In Study for The Rhythm of Life [Link] Mike Thompson, Susana Cámara Leret, and Dave Young make tangible some of the communication processes in our bodies that are invisible to the naked eye. Working with scientists from the Netherlands Metabolomics Centre (NMC), they measure the emission of photons from our hands by means of highly light-sensitive equipment, and then turn the patterns into music. We are confronted with the processes of metamorphosis that are essential to data, or indeed to self-awareness. A hand becomes a structured array of photon emissions and then strange music of inhuman origin – a pattern of which we are typically unaware. We might listen to the rhythmic patterns that are this representation, this information that is self, experience, this personal emission, and again ask – Is this me? Even though invisible, intangible.
The insight follows one popularized by the European Romantic movement, that we may find God in a grain of sand, in the infinity of being. For what are we but a depthless well of data wherein we are not what we seem?
Might our notion of selfhood not extend beyond our bodies? Marcia Nolte in Corpus 2.1 [Link] presents us with models of augmented selves. A sensor might be embedded to augment human memory, for example. Skin, as integument, serves as both a barrier and a link between the world inside and the world outside the human body. So Marcia’s portraits focus on the human skin and show how modifications to the body, such as enhanced eyes, may appear so slight, and not entail the monstrous cyborgs of science fiction. Data are here mediating, connecting fields, such as sensory inputs and outputs. We have long delegated tasks of data gathering and indeed processing to instruments; we have long used various kinds of artifact and machines to help us communicate and remember. These functions are core to being human. Why might we not therefore bring them inside ourselves with implants and cyborg attachments? A recording machine can surely be the equivalent of our memory.
Of course a science fiction scenario here is of the implant that monitors our behavior and even thoughts, processes its data to anticipate and so control us. The relationship between data, predictive modeling and the regulation of practice actually blurs the neat distinction between observation, record, decision making and action.
Designed by Bastian Bischoff and Per Emanuelsson, founders of studio Human Since 1982, and constructed with engineer David Cox, The Clock Clock [Link] is an array of 24 analog two-handed clocks all showing different times. And then, every minute, the hands of the four groups of six clocks combine to indicate the time in the four digits of a 24hr display. Again this is a remarkably beautiful metaphor of order or pattern hidden in behavior, with a twist that the analog turns digital. We watch and monitor and then, at the right moment, the pattern emerges out of mechanical movement as a visual effect. But only momentarily – order and sense then literally dis-integrate as the clock hands move on and the four digits dissolve. We seize the moment – a temporary conjuncture, when things come together just right. This prompts the question whether all pattern and order are similarly time bound, to slip away if one misses the moment. Process, precipitation and intervention, even as a glance in the right direction, are all implicated in the work performed upon data in order to convert it to information.
Metamorphosis is the subject also of Silence Is Golden But This Is No Silence from Sarah van Sonsbeeck. [Link] A shimmering mirror made of gold leaf offers a reflection upon the Dutch saying “silence is golden”. Is being silent worth its weight in gold? What would that be? Can we buy silence?
This is now quite a provocative question given the heated controversy over US covert intelligence gathering, when even the Chancellor of Germany had her phone tapped by the NSA. The piece raises questions about the equivalences that data records assume. Data are typically values recorded against quantitative or qualitative variables – Chancellor Merkel phoned home 36 times and for 2 hours 24 minutes in June 2008 (or whatever); I am 66 inches tall and weigh 150 pounds. Though accurate (let’s suppose), these are hardly equivalences.
Here we’re looking at two eurocents’ worth of gold: the Euro coins, originally found in van Sonsbeeck’s studio, exchanged for, transformed into 3.06 grams of gold leaf that covers 1892 square centimeters of board (43 x 44 cm). The mirror doesn’t speak, but that doesn’t mean it is mute. The gold reflects the room and people around it; we see all sorts in the less than perfect reflection. Value can be treated abstractly as a quantity of gold, but such an abstraction has to take a material form that always connotes, implies, suggests: there is always something waiting to fill the silence, just as equivalence is never complete – we are never completely described by our data.
New media artist Geoffrey Lillemon and the Stööki art collective have made an online animation that defies the logic of information value. Unlike the value of gold, which is directly connected to its scarcity, the value of an information post on a web site is connected to how many people visit and look at the page. Data and information value is related to ubiquity. On Facebook you are invited to like the things you see and read; your mouse click liking that photograph adds to its rating. The value of a YouTube video is related to the number of times people have watched it. Web traffic, page visits, clicked links are the keys to calculating data and information value.
Somewhat perversely I have always found it intriguing to find those YouTube videos that no one has ever viewed (there’s a counter by every video), or to discover a Google search that returns just one hit. As perverse is Like to Death [Link], Lillemon’s figure of death that disintegrates with every click of the like button. For the animation to persist one must not “like” it. It takes 20,000 “likes” to destroy the figure altogether. The project’s makers are referencing the curse of social media – digital networks that can likewise only be destroyed if a huge number of people give them up.
Like to Death is also a celebration of a kind of value that is not related to popularity. The likes of Amazon.com operate a “long tail” retailing strategy of selling a large number of unique items in relatively small quantities rather than large numbers of popular items (though they do that too). Statistical means, normal distributions, correlation and fitting data to mathematically defined relationships filter out, ignore or marginalize outliers, rarities, quirks, exceptions to the rule. This is because finding pattern in data is all about probability and likelihood: you want to know what is likely or normal, not what is weird and anomalous. But the vastness of the World Wide Web allows space for all kinds of anomalies, quirks, eccentricities, a long tail to a distribution of goods (and data), where value may indeed lie in uniqueness and diversity, and yet not in the abstract scarcity represented by a commodity like gold.
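The statistical point can be sketched in a few lines. Assuming a made-up catalogue whose sales fall off roughly as 1/rank (a classic long-tailed shape), the many rarities together rival the few hits that a normal-curve view would focus on:

```python
# A long-tailed catalogue (invented figures): the item at rank k sells
# roughly in proportion to 1/k - a few hits, very many rarities.
sales = [round(10000 / k) for k in range(1, 50001)]

head = sum(sales[:100])   # revenue from the 100 most popular items
tail = sum(sales[100:])   # revenue from all the rest, item by tiny item

# The long-tail observation: the aggregated rarities are collectively
# worth at least as much as the bestsellers.
print(head, tail, tail > head)
```

The exact numbers are arbitrary; what matters is that summing the tail, rather than averaging it away, is what reveals its value.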
data have a history
When did we become so interested in data, in numbers, in applied math and statistics?
I have mentioned that there is a history to the idea of data, though I have not found it easy to piece together. Nevertheless, here are four short historical vignettes that I find very thought provoking.
How did a banker in fifteenth-century Florence convince people that he was trustworthy? Show them his books. Double-entry bookkeeping was a way of setting out numbers, debits and credits, so that errors could be easily identified in an overall view of the business. It is an accounting procedure, a coherent system of writing and representation, where numbers, banking data, are associated with credibility, credulity, honesty. Here was a system of manipulating data that could carry moral connotation. Note that it is not the data themselves that are a sign of honesty and virtue, but the system of what we would now call professional procedures that generates them and their display in the ledger. And more – bookkeeping is both a system of writing, of representation, and a model for effective government, because bankers, like politicians, need to obey the rules of a system designed to ensure trust.
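The error-catching logic of double-entry bookkeeping can be sketched very simply. A minimal, deliberately ahistorical illustration (the account names are invented): every transaction is entered twice, once as a debit and once as a credit, so the whole ledger must sum to zero, and any other total exposes an error somewhere in the books.

```python
# A minimal sketch of the double-entry idea (invented accounts).
ledger = []

def record(debit_account, credit_account, amount):
    ledger.append((debit_account, amount))    # debit entry
    ledger.append((credit_account, -amount))  # matching credit entry

record("cash", "loans", 100)  # the bank lends 100 florins
record("cloth", "cash", 40)   # and buys cloth for 40

# The trial balance: debits and credits cancel across the whole ledger.
balance = sum(amount for _, amount in ledger)
print(balance)  # 0 - anything else signals a bookkeeping error
```

The virtue on display is not in the numbers themselves but in the procedure that forces them to check one another.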
What was the best foundation for effective government in the seventeenth century? Political arithmetic – government by information? Charles Stuart, King of England, had been executed by Parliament and rule by divine right was under question as civil society began to grow. “Reason of state” arguments advised the Prince to strengthen the nation’s resources instead of trying to govern by abstract principles. This was a radical new theory of politics that offered numerical representation (of trade figures, for example) as an effective, because impartial, instrument of rule. Data could make visible a state’s health (or its maladies). The political arithmetic of William Petty proposed that arguments for policy based on number, weight, and measure could compel assent as assuredly as mathematics in natural philosophy. Meanwhile Robert Boyle and members of the scientific Royal Society insisted that one could gather data completely free of theory, bias and ideology – all one had to do was contrive proper experiments and field work and assemble credible witnesses to record the data. A credible witness for the early Royal Society might well be an independently wealthy gentleman. By the time of the Napoleonic state in the nineteenth century, handing data collection over to professional state officials was another sound option, for those with the power.
In exploring the origins of commercial society how did historians in Edinburgh in the eighteenth century deal with the lack of evidence and data? By using conjecture, filling in the gaps through theory – what David Hume called fictions, arguing that they were nevertheless essential for secure knowledge. Moral philosopher and political economist Adam Smith relied upon abstract theoretical concepts such as “society” and “the market” as the basis for his new science of wealth. Such abstractions were also its objects of analysis, though they could not be established through data. Rather, Smith held that numerical data should be collected in the light of philosophical theories, in order that they might work to establish the impartiality of theory, that is, establish its descriptive potential.
How were mathematical statistics associated with the eugenics movement, internationally popular until the 1930s, which proposed improving the quality of humanity by promoting the reproduction of individuals with more desirable genetic traits over those considered to possess less desirable traits? Victorian polymath Francis Galton (1822-1911) was the first to apply statistical methods to the study of human differences and the inheritance of intelligence. He himself devised many statistical tools, including correlation and regression. Galton introduced the use of questionnaires and surveys for collecting data on human communities. A pioneer in anthropometrics and psychometrics (quantifying human physical and psychological diversity), he was convinced that abilities were inherited and so founded the eugenics movement as a means to promote social reform. Galton also placed the study of fingerprints at the heart of a scientific criminology, complementing the identification system of Alphonse Bertillon and the anthropological criminology of Cesare Lombroso, which held that criminality was inherited and could be identified by physical features – physiognomy.
are data innocent? – ask a professional
We all want what counts as knowledge to be based upon the way the world is. If data were completely neutral observations about the world, carrying no bias or interpretation, they could be held to offer direct access to the way things are. This would be a great place to start.
I’m not sure how many people have ever been convinced that it is as simple as this. Data have to come from somewhere. They involve people witnessing things and events, recording and collecting their observations. Things are missed, mistakes are made, and we have to trust that data have not been deliberately falsified. Even numbers are not neutral and free from interpretation, because they embody assumptions about what can and should be counted, and how one can move from numbers to knowledge. Then there’s the conundrum of moving from data, statistics and interpretation to decision and action, especially when there’s more at stake than just economic profit. This is a messy business.
Yet data and computation appeal more than ever to government officials and corporations. So how have data come to be seen as the simple secure foundation of knowledge and action?
The conundrum – data are not neutral but we would like them to be – has been treated as something that has nothing to do with data, numbers, interpretation and analysis, statistics and computation in themselves. To have data be simply descriptive, you lay down procedures for data collection, interpretation and analysis, and have them followed by serious professionals and officials who risk losing their accreditation if they fail to comply. You have data collection belong to a different stage of the project of producing knowledge, a stage that can be put in the hands of a different kind of professional expert. You convert the desire for data neutrality into a matter of professional and political responsibilities distributed across corporations, state institutions and agencies.
This division of labor in the construction of knowledge is reinforced and authorized when, as began with bookkeeping, numbers and their representation are separated graphically and in their notation from narrative and commentary, and, of course, from political argument and rhetoric. So a wonderful concern about how we might get to know the world and act appropriately together in state and society, as explored by the political theorists and economists of the seventeenth and eighteenth centuries in Europe and America, is converted into a particular management model dealing with the administration of research teams, companies and government departments.
Expert and professional practices, though they here concern fundamental matters of the organization of community, are not a matter for common everyday discussion. Why should a democratic assembly, for example, concern itself with neutral matters of scientific fact authorized by trustworthy professional experts? When concerns do arise, the response, as we are seeing today, is to divert attention away from the status of data, and from how a collective process of establishing knowledge of ourselves has been taken from people, and to focus instead on matters of the regulation of professional conduct.
Here are the kinds of things we hear. “We just need proper regulation, proper accountability.” “There’s no need to be concerned – the professionals may have gone too far, but the system, when tweaked, will rein them in.” “You don’t need to know the details of what they’re doing, and anyway it needs to be kept secret, because it’s all really about keeping you safe and secure.”
This is how a state reproduces itself in the face of the people it is meant to serve.
a bigger picture that we can all share
Let me pick up again my archaeological perspective, a long term view.
Humans have always lived in an information society – families and communities communicating in the interminable soap opera of life, vying for satisfaction and success, meeting challenges and dangers, pursuing projects. We have always been trying to figure things out. We have always looked for pattern in an ocean of noise, trying to read the signs, even if they were flights of birds that were once thought to herald the future.
We have always needed summary views and advice, when we don’t know where to look for neat answers. Now we have the capacity to search through more noise, more effectively and less subject to superstition and dogma, but the patterns we seek are arguably the same as ever. We seek security, to know what might be coming, seeing ahead but not being seen, for that might be precarious. We want to do what is right, to know better who we are and how the world works.
People have shared with family and friends, sought out advisors, experts, wise people, looked to create summaries, to find order and pattern. We have created archives to augment human memory, delegated the task of recording to scribes, of seeing beyond the local to spies, missionaries, ambassadors, helped in their work by systems of writing, carriages, ships, postal systems.
There has always been too much data. How can you capture, in all its richness, the experience of just one lifetime? I do not accept that in one digital day I experience more information than is contained in all the newspapers of the last century (or whatever the statistic is), because I don’t think that there is any significant equivalency. The equivalent of today is, precisely, today, though, of course, I may choose to record, compress and summarize in a particular way.
Typically there has just not been enough of the kind of data that can give us the information we want. Even in seventeenth century London when William Petty was formulating his political arithmetic, there were hardly any summary data about the effects of the plague on the population of London. Now we can focus on some problems, say traffic, or health care provision, gather data and compute the flows and bottlenecks.
As I indicated in the previous section, I think the key changes are managerial and administrative, concerning the ways that data are administered and retrieved, and, above all, the way that sharing and delegating the tasks of recording and archiving, with instruments and machines and across teams in organizations, has allowed data collection, storage and processing on a much greater scale.
Today data are gathered through more and speedier connections. Remote sensors have been recruited to act as proxies for agents with tape recorders and notebooks, satellites wirelessly connected, alliances struck up with corporations and foreign intelligence services. More and varied relationships have been created that can help us see a bigger picture, but the components of the process – observation, witnessing, recording, summarizing, reporting – remain the same.
The design projects in our exhibition display the rich human side to the issues at stake in the current debate about data: questions of self and identity, of value, diversity, equivalences and metamorphoses that run far beyond abstract economic transaction.
Data are mainly about such rich connections – social, cultural, technical networks – how we organize ourselves to get to know our world. Data should be about the politics of sharing this richness. While big data may impress with their size and cost and inspire dreams of omniscience and control, the crucial concern is how, as ever, we can all simply share a bigger picture and work together openly to know the wonders of the world of which we are a part.
The trouble is that the way the process of building knowledge has been arranged can run counter to such a democratic vision in its obscurity and exclusivity.