Green Justice

I came across a very insightful keynote in the New Internationalist while reading about the 1992 Earth Summit. Perhaps I should subscribe to this magazine again!


Green Justice, by David Ransom

The caged birds and animals were silent in the darkness. The place was deserted. I followed a familiar path, around the dozing exhibits in the Monkey House, past the life-sized memorial to Guy the gorilla and into the Reptile House of the London Zoo. Drinks were on offer at a private environmental function.

An iguana peered out from its brightly-lit box as I talked to a woman from one of the largest international nature conservation agencies. As it turned out we shared – with the iguana, no doubt – a degree of gloom about ecology. Private polls had revealed, she said, that her organization's supporters were not prepared to change their lifestyles to save the planet. Well, if they – the active supporters of nature conservation – weren't, who would be? Anyway, it meant no more campaigning to change lifestyles and consumption patterns in the North. That would put her organization out of business and her out of a job.

I should not have been surprised. It is, after all, election year in both the US and the UK. The environment is not a priority on the election campaign trails. Imagine this from one of the candidates: 'Read my lips! No more carbon emissions!'; 'Cut consumption and save the earth!' Such slogans would be, everyone agrees, among the shortest political suicide notes ever written. No-one – or almost no-one – in the North is prepared to vote for change. We all – or almost all – still require of our political leaders that they at least promise greater material prosperity all round.

This is a problem, and it is our problem in the North. Because, unless the evidence provided by scientists and our own eyes deceives us, the threat to the environment comes from what we consume in the North. It does not matter which measure you take – against emissions of greenhouse gases like carbon dioxide and methane, ozone-depleting chlorofluorocarbon (CFC) gases, toxic or radioactive wastes, garbage or chemicals – the degradation of the earth and the threat of global warming comes from the wealthy minority who live largely in the North.

But it is not just a problem for the North. It is also a problem for the poor majority who live largely in the South. The unusual skin cancers that are appearing in southern Chile may be the result of a hole in the ozone layer over Antarctica, but the CFC gases that punched the hole come from the North. The truth that people on a small planet depend upon each other for their survival is self-evident.

What is less clear, however, is what that means in practice. Here the problem for the people of the South is not so much with the planet as with the people of the North. Because, since we in the North won't pay the real cost of living as we do, we look for an escape route. Almost by tradition, we find it to the South.

So, with a bit of fine-tuning of the evidence, we change the subject. We say that if the South fits itself out with refrigerators, cars or televisions on the same scale as the North, the pollution of the planet will spiral upwards out of control. Anyway there are just too many people in the South – its population is growing too fast. Their governments are incompetent and corrupt. Besides, the people of the South are more vulnerable to environmental disaster from, say, rising sea levels or soil erosion than we are in the North. We shift our attention away from what we can see happening here and now, onto what might happen somewhere else in the indefinite future.

There may be an element of truth in all these points. They are important issues that have to be tackled. But that does not matter. They are being deployed by the North not because we have the slightest intention of tackling them, but to get us off the environmental hook. It's not a question of blame and guilt, but of power and responsibility.

For, precisely because of its wealth, the North actually does have the power to inflict the environmental costs of its 'lifestyle' on the South. It can use the South as its environmental sink. It can impose conditions on the South for the receipt of aid or credit. It can and in practice does insist that the South remains poor and relatively 'green'. The vulnerability of the South has become the world's single biggest environmental problem. A psychologist might say that the South is being required to 'collude' with the North's self-deception about the responsible stance it thinks it is taking.

You can see this process at work in Rio de Janeiro, where in June the United Nations Conference on Environment and Development (UNCED) will convene. It's billed as the biggest-ever gathering of world leaders, a veritable 'Earth Summit', and it's meant to usher in a new era of global co-operation to save the planet.

Rio is a very tempting venue and no doubt there will be a good turn-out for the conference. But the city has its problems – violence, drugs, destitute children on the streets and the most terrible pollution.

The Avenida Brasil runs from the international airport into town and crosses one of the most run-down areas of the city. When it rains the road floods, the traffic stops and the more unruly elements have on occasion been known to make piratical raids on stranded vehicles.

Not, you might think, a problem that ranks high among Rio's priorities. But somehow it crept to the top of the list. Very little can happen in debt-strapped Brazil these days without the say-so of its Northern creditors. Since they will be turning out in large numbers for the 'Earth Summit', and doubtless would not take at all kindly to being hijacked on the road to Rio, one can only assume they had at least some influence over the preparations.

So, in a city where just last year I was all but blinded by air pollution, hundreds of millions of dollars are now being spent on an elevated motorway – the 'Red Route' – that will smash up the communities of the turbulent citizens of Rio in the interests of whisking UNCED notables over their heads and into CFC-consuming air-conditioned hotels.

These notables will then focus on an agenda they have fixed for themselves – and particularly on climate change and biodiversity. The general principle is that the South should sign up to an agreement to limit emissions of 'greenhouse' gases like carbon dioxide, and preserve tropical rainforests as 'gene banks' for scientific research and the medical industry in the North. As for 'development', the only specific suggestion so far is that a 'Green Fund' should be set up to support environment-friendly initiatives, administered by the World Bank.

Now, the World Bank may be many things, but an environmental protection agency it certainly is not. Lawrence Summers, the Bank's chief economist, observes flippantly in a leaked internal memo that in some African countries air pollution is 'probably vastly, inefficiently low compared to Los Angeles or Mexico City'. He argues that the only thing preventing the export of more pollution from North to South is the physical difficulty of moving it. The London Economist remarked, on publishing the memo, that 'on the economics, his points are hard to answer'. [1]

But shifting the stuff is getting easier all the time. Carlos Milstein, deputy director of the Office of Technology Imports in Argentina, claims that 'in 20 years of working at customs I have never seen the quantities of industrial waste and trash [that are now] coming into this country from the US and Europe.' Last October it amounted to 200 tons a week of hazardous waste; local entrepreneurs are now planning to import 250,000 tons of plastics a year for incineration and land dumping.

Ironically, the worst dumpers are the nations with the toughest environmental laws, like the Netherlands, Austria, Switzerland, Germany and the US. Tough laws at home mean higher costs and so instead of cleaning up their act many companies simply ship their filth elsewhere. It may look good locally in the rich world – but it makes no difference in global terms. In Brazil, for example, huge lead smelting plants are working flat out recycling the lead from car batteries returned by well-meaning motorists in the North. Workers in and around these smelters now have very high levels of lead in their blood. [2]

To the free-market economists of the North it all makes perfect sense. Goods come and go as they please, and disposing of Northern toxic wastes in the South is cheaper and easier than it is in the 'environmentally-conscious' North. This is the way North-South trade usually works – and has worked for the past decade. During this time heavily indebted Southern countries have been required by the World Bank to follow what are called 'structural adjustment' policies. These policies demand exports of any kind in exchange for credit. If you have natural resources like copper or wood, then you must produce more of them and cheaper. Because everyone else is required to do the same thing there is a glut on the world market, your exports get cheaper and cheaper and so you must export more and more.

The net result is that the North gets plentiful raw materials cheaply – and therefore doesn't have to worry about conservation – while the South is left with torn up forests, polluted rivers, gigantic holes in the ground and an impoverished people living in an almighty mess. All that's new is that the South now has the option of importing toxic wastes as well as exporting raw materials.

The body that sets the rules for world trade – and so could intervene to stop this happening – is the General Agreement on Tariffs and Trade (GATT). Its latest 'round' of negotiations has been trundling along since 1987, trying to reach agreement on things like 'Intellectual Property Rights' and 'Trade-related Investment Measures'. It is currently stalled on a dispute between the US and the European Community over farm subsidies. It does have a committee on the environment, set up in 1972. But, in all that time, it has never actually met.

The undeniable truth is that, in practice, the North is much less bothered about the environment than it pretends to be. We might churn out scientific papers and documentaries, we might listen with rapt attention to environmental Jeremiahs, but given a chance most of us still go shopping in cars – and only use our feet when we vote against self-denial. Hit by the most severe economic recession in 50 years, yet finding the pollution of the planet continues unabated, [3] we may well discover that the rest of the century will have to be spent revising conventional wisdom on both the environment and the economy of the North. But we have not started yet.

Not, you might think, a very auspicious point from which to launch a UN Conference on Environment and Development. Alternatively, you could say that never was such a conference more sorely needed – so long as it produces results. But what results? What is to be done?

Well, that's not really in dispute. Ever since the first environment conference in Stockholm in 1972 – and the publication of the Brandt Report in 1980 [4] and the Brundtland Report in 1987 [5] – there's been very little argument about what should be done.

The catchphrase today is 'sustainable development'. It comes from Brundtland and it contrasts a largely 'unsustainable' present with our duty to respect the interests of future generations and the need for minimal standards of well-being world-wide. Arguing against 'sustainable development' these days is tantamount to arguing in favour of sin.

But saying things is not enough. Protection of the environment requires conscious, positive, human intervention – legislation, enforcement, education, public information and debate. It needs action like that taken by the Organization of African Unity in 1988, banning the import of hazardous wastes and substantially reducing the trade as a result. There's no way round this. But conventional wisdom at the World Bank and GATT runs in precisely the opposite direction, towards unfettered competition for profit, open borders, deregulation, commercial secrets and letting the market decide.

Junkets like the UN Conference on Environment and Development may produce more cynicism than action. But unless we think of them as at least one of the available tools, we are toying with pessimism as if it were some self-indulgent luxury.

For a start, outside the official conference, Rio is likely to see the largest-ever gathering of Non-Governmental Organizations (NGOs) at what's being called the '92 Global Forum. Their vitality, both North and South, is one of the more hopeful signs. The most effective weapon they have is their ability to think radically – to reflect and influence public attitudes. On environmental issues in the North they have clearly had an impact already. The challenge now is to develop and modify that experience by learning from the South.

Is there anything that the North can learn from the South? Environmental organizations in the North have in the past tended to treat the South as if it were a mirror of their own preoccupations. At best there's been a romantic interest in the environmental wisdom of indigenous peoples. Deep as that wisdom runs, and much as we may have to learn from it, the most relevant fact about indigenous peoples is that they continue to be persecuted to the verge of extinction.

But what about the vast majority who live in the cities, towns and villages of Latin America, Africa, Asia? What about those whom we in the North tend to think of as population rather than people? Is it true that people who live without adequate education or health care, often on the edge of hunger, are preoccupied with 'survival' and have no wish to explore the global village?

Well, human survival is what 'sustainable development' is supposed to be all about. When it comes to developing survival strategies the people of the South are experts. In fact they know a lot more about it than the self-proclaimed experts of the North – the World Bank officials who have the power to decide what shape survival should take. It is these people who should be taking lessons from the people of the South – not the other way round.

But something more than an exercise in humility is required. The experts of the North need to recognize that the people of the South matter as much as they do. If that were to happen the world would have to become a very different place.

Take just one example. These days 'democracy' is as much in vogue as 'sustainable development'. But do we have a democratic world? Do we have one-person-one-vote in the global village? If not, why not? What do we have and why? Answer these questions as you will, you are still left with the fact that if we did have global democracy then the views of people in the South would count a great deal more than they do now – they are, after all, the majority.

A couple of months ago I was listening to the veteran ecologist Edward Goldsmith at a public meeting in London. He was berating the powers that be for failing to tackle the world's environmental problems. It was, he confessed, a negative and depressing message. So what was to be done? He turned to a Canadian priest and a group of activists from the Pastoral Land Commission in Brazilian Amazônia who were sharing the platform with him. They had been struggling for years against brutal repression and for land reform. Also on the platform were campaigners against the Narmada valley dams in India. 'Our future', said Edward Goldsmith, 'rests with them.'

It was a dramatic gesture. I suppose there was a dash of 1960s 'Third Worldism' about it. I half expected a suitably green ghost of Che Guevara to descend onto the platform. But I also thought he was right. It's time to start listening.

[1] The Economist, London, 8 February 1992.
[2] John Vidal, 'The new waste colonialists', The Guardian, 14 February 1992.
[3] The Sunday Independent, 16 February 1992.
[4] North-South: A Programme for Survival, Pan Books, 1980.
[5] Our Common Future, The World Commission on Environment and Development, OUP, 1987.
Source: http://newint.org/features/1992/04/05/keynote/

Wine and bottles

New wine in an old bottle. Or old wine in a new bottle. Something like that. The same issue can be looked at from many different angles, and the same problem can have many solutions. How you pose the question, how you frame the problem, often matters more than the answer itself.

Please click the link (the title) to read the full post on the Stanford University blog.

Shift Your Lens: The Power of Re-Framing Problems

What is the sum of 5 plus 5?

What two numbers add up to 10?

The first question has only one right answer, while the second has an infinite number of solutions, including negative numbers and fractions. These two problems, which rely on simple addition, differ only in the way they are framed. In fact, all questions are the frame into which the answers fall. And as you can see, by changing the frame, you dramatically change the range of possible solutions.
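A minimal sketch (in Python, purely illustrative) makes the contrast concrete: the closed frame admits exactly one answer, while the open frame describes an unbounded family of answers parameterized by a free choice.

```python
# The closed frame: "What is 5 plus 5?" has exactly one answer.
closed_answer = 5 + 5  # -> 10

# The open frame: "What two numbers add up to 10?" is a family of solutions:
# any x pairs with 10 - x, including negatives and fractions.
def pairs_summing_to(target=10, choices=(7, -3, 2.5, 0.1)):
    return [(x, target - x) for x in choices]

print(closed_answer)          # 10
print(pairs_summing_to())     # [(7, 3), (-3, 13), (2.5, 7.5), (0.1, 9.9)]
```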


“If I had an hour to solve a problem and my life depended on the solution, I would spend the first fifty-five minutes determining the proper question to ask, for once I know the proper question, I could solve the problem in less than five minutes.” – Albert Einstein

Big data

"Big Data" được nói đến khá nhiều trong thời gian gần đây. Cụm từ này được dịch sang tiếng Việt là "dữ liệu lớn" (ví dụ như bài trên Tia Sáng của Gs. Hồ Tú Bảo). Mình nghĩ dịch là "siêu dữ liệu" nghe hay hơn. Tuy nhiên, "siêu dữ liệu" lại bị gán cho từ "metadata" mất rồi.

The article below is lifted from Prof. Trần Hữu Dũng's site. His page also links to a rebuttal that pushes back against the optimistic takes on big data. If you want to read it without the FP folks pestering you to log in, just view it in Print Preview mode.

I tend to agree with the article below. I have long been uneasy about sampling and data handling in social science research. The way methods get applied in research often looks like a kind of religious faith: no questions asked!

—————————————————–

The Rise of Big Data

How It's Changing the Way We Think About the World

Kenneth Neil Cukier and Viktor Mayer-Schoenberger
 

KENNETH CUKIER is Data Editor of The Economist. VIKTOR MAYER-SCHOENBERGER is Professor of Internet Governance and Regulation at the Oxford Internet Institute. They are the authors of Big Data: A Revolution That Will Transform How We Live, Work, and Think [1] (Houghton Mifflin Harcourt, 2013), from which this essay is adapted. Copyright © by Kenneth Cukier and Viktor Mayer-Schoenberger. Reprinted by permission of Houghton Mifflin Harcourt.

Everyone knows that the Internet has changed how businesses operate, governments function, and people live. But a new, less visible technological trend is just as transformative: “big data.” Big data starts with the fact that there is a lot more information floating around these days than ever before, and it is being put to extraordinary new uses. Big data is distinct from the Internet, although the Web makes it much easier to collect and share data. Big data is about more than just communication: the idea is that we can learn from a large body of information things that we could not comprehend when we used only smaller amounts.

In the third century BC, the Library of Alexandria was believed to house the sum of human knowledge. Today, there is enough information in the world to give every person alive 320 times as much of it as historians think was stored in Alexandria’s entire collection — an estimated 1,200 exabytes’ worth. If all this information were placed on CDs and they were stacked up, the CDs would form five separate piles that would all reach to the moon.
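The CD image can be sanity-checked with rough arithmetic. The sketch below assumes a 700 MB capacity and 1.2 mm thickness per disc; those figures are my assumptions, not numbers given in the article.

```python
# Back-of-envelope check of the "CDs stacked to the moon" image.
# Assumed (not from the article): 700 MB per CD, 1.2 mm per disc.
total_bytes = 1200e18           # roughly 1,200 exabytes
cd_bytes = 700e6                # assumed CD capacity
cd_thickness_km = 1.2e-6        # 1.2 mm expressed in kilometres
moon_distance_km = 384_400      # mean Earth-Moon distance

n_cds = total_bytes / cd_bytes
stack_km = n_cds * cd_thickness_km
print(f"{n_cds:.1e} CDs, stack of {stack_km:.1e} km, "
      f"about {stack_km / moon_distance_km:.1f} Earth-Moon distances")
# -> about 1.7e+12 CDs and roughly five times the distance to the moon,
#    consistent with the article's "five separate piles".
```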

This explosion of data is relatively new. As recently as the year 2000, only one-quarter of all the world’s stored information was digital. The rest was preserved on paper, film, and other analog media. But because the amount of digital data expands so quickly — doubling around every three years — that situation was swiftly inverted. Today, less than two percent of all stored information is nondigital.

Given this massive scale, it is tempting to understand big data solely in terms of size. But that would be misleading. Big data is also characterized by the ability to render into data many aspects of the world that have never been quantified before; call it “datafication.” For example, location has been datafied, first with the invention of longitude and latitude, and more recently with GPS satellite systems. Words are treated as data when computers mine centuries’ worth of books. Even friendships and “likes” are datafied, via Facebook.

This kind of data is being put to incredible new uses with the assistance of inexpensive computer memory, powerful processors, smart algorithms, clever software, and math that borrows from basic statistics. Instead of trying to “teach” a computer how to do things, such as drive a car or translate between languages, which artificial-intelligence experts have tried unsuccessfully to do for decades, the new approach is to feed enough data into a computer so that it can infer the probability that, say, a traffic light is green and not red or that, in a certain context, lumière is a more appropriate substitute for “light” than léger.

Using great volumes of information in this way requires three profound changes in how we approach data. The first is to collect and use a lot of data rather than settle for small amounts or samples, as statisticians have done for well over a century. The second is to shed our preference for highly curated and pristine data and instead accept messiness: in an increasing number of situations, a bit of inaccuracy can be tolerated, because the benefits of using vastly more data of variable quality outweigh the costs of using smaller amounts of very exact data. Third, in many instances, we will need to give up our quest to discover the cause of things, in return for accepting correlations. With big data, instead of trying to understand precisely why an engine breaks down or why a drug’s side effect disappears, researchers can instead collect and analyze massive quantities of information about such events and everything that is associated with them, looking for patterns that might help predict future occurrences. Big data helps answer what, not why, and often that’s good enough.

The Internet has reshaped how humanity communicates. Big data is different: it marks a transformation in how society processes information. In time, big data might change our way of thinking about the world. As we tap ever more data to understand events and make decisions, we are likely to discover that many aspects of life are probabilistic, rather than certain.

APPROACHING "N=ALL"

For most of history, people have worked with relatively small amounts of data because the tools for collecting, organizing, storing, and analyzing information were poor. People winnowed the information they relied on to the barest minimum so that they could examine it more easily. This was the genius of modern-day statistics, which first came to the fore in the late nineteenth century and enabled society to understand complex realities even when little data existed. Today, the technical environment has shifted 179 degrees. There still is, and always will be, a constraint on how much data we can manage, but it is far less limiting than it used to be and will become even less so as time goes on.

The way people handled the problem of capturing information in the past was through sampling. When collecting data was costly and processing it was difficult and time consuming, the sample was a savior. Modern sampling is based on the idea that, within a certain margin of error, one can infer something about the total population from a small subset, as long as the sample is chosen at random. Hence, exit polls on election night query a randomly selected group of several hundred people to predict the voting behavior of an entire state. For straightforward questions, this process works well. But it falls apart when we want to drill down into subgroups within the sample. What if a pollster wants to know which candidate single women under 30 are most likely to vote for? How about university-educated, single Asian American women under 30? Suddenly, the random sample is largely useless, since there may be only a couple of people with those characteristics in the sample, too few to make a meaningful assessment of how the entire subpopulation will vote. But if we collect all the data — “n = all,” to use the terminology of statistics — the problem disappears.
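A small simulation shows why the subgroup case collapses. Everything below is invented for illustration (the population size, a 2 per cent subgroup, an 800-person exit poll); the point is only that the overall estimate stays stable while the subgroup figure, resting on a handful of respondents, jumps around.

```python
import random

random.seed(1)

# Invented population of 200,000 voters. A small subgroup (2%) favours
# candidate A at 70%; everyone else favours A at 48%.
population = []
for _ in range(200_000):
    in_subgroup = random.random() < 0.02
    votes_a = random.random() < (0.70 if in_subgroup else 0.48)
    population.append((in_subgroup, votes_a))

def exit_poll(sample_size=800):
    sample = random.sample(population, sample_size)
    overall = sum(v for _, v in sample) / sample_size
    sub = [v for s, v in sample if s]
    sub_share = sum(sub) / len(sub) if sub else float("nan")
    return overall, sub_share, len(sub)

for _ in range(3):  # three repeated "exit polls"
    overall, sub_share, k = exit_poll()
    print(f"overall A: {overall:.2f}   subgroup A: {sub_share:.2f} "
          f"(based on only {k} respondents)")
# The overall estimate hovers near 0.48; the subgroup estimate, built on
# roughly 16 people, swings widely from poll to poll.
```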

This example raises another shortcoming of using some data rather than all of it. In the past, when people collected only a little data, they often had to decide at the outset what to collect and how it would be used. Today, when we gather all the data, we do not need to know beforehand what we plan to use it for. Of course, it might not always be possible to collect all the data, but it is getting much more feasible to capture vastly more of a phenomenon than simply a sample and to aim for all of it. Big data is a matter not just of creating somewhat larger samples but of harnessing as much of the existing data as possible about what is being studied. We still need statistics; we just no longer need to rely on small samples.

There is a tradeoff to make, however. When we increase the scale by orders of magnitude, we might have to give up on clean, carefully curated data and tolerate some messiness. This idea runs counter to how people have tried to work with data for centuries. Yet the obsession with accuracy and precision is in some ways an artifact of an information-constrained environment. When there was not that much data around, researchers had to make sure that the figures they bothered to collect were as exact as possible. Tapping vastly more data means that we can now allow some inaccuracies to slip in (provided the data set is not completely incorrect), in return for benefiting from the insights that a massive body of data provides.

Consider language translation. It might seem obvious that computers would translate well, since they can store lots of information and retrieve it quickly. But if one were to simply substitute words from a French-English dictionary, the translation would be atrocious. Language is complex. A breakthrough came in the 1990s, when IBM delved into statistical machine translation. It fed Canadian parliamentary transcripts in both French and English into a computer and programmed it to infer which word in one language is the best alternative for another. This process changed the task of translation into a giant problem of probability and math. But after this initial improvement, progress stalled.

Then Google barged in. Instead of using a relatively small number of high-quality translations, the search giant harnessed more data, but from the less orderly Internet — “data in the wild,” so to speak. Google inhaled translations from corporate websites, documents in every language from the European Union, even translations from its giant book-scanning project. Instead of millions of pages of texts, Google analyzed billions. The result is that its translations are quite good — better than IBM’s were — and cover 65 languages. Large amounts of messy data trumped small amounts of cleaner data.

FROM CAUSATION TO CORRELATION

These two shifts in how we think about data — from some to all and from clean to messy — give rise to a third change: from causation to correlation. This represents a move away from always trying to understand the deeper reasons behind how the world works to simply learning about an association among phenomena and using that to get things done.

Of course, knowing the causes behind things is desirable. The problem is that causes are often extremely hard to figure out, and many times, when we think we have identified them, it is nothing more than a self-congratulatory illusion. Behavioral economics has shown that humans are conditioned to see causes even where none exist. So we need to be particularly on guard to prevent our cognitive biases from deluding us; sometimes, we just have to let the data speak.

Take UPS, the delivery company. It places sensors on vehicle parts to identify certain heat or vibrational patterns that in the past have been associated with failures in those parts. In this way, the company can predict a breakdown before it happens and replace the part when it is convenient, instead of on the side of the road. The data do not reveal the exact relationship between the heat or the vibrational patterns and the part’s failure. They do not tell UPS why the part is in trouble. But they reveal enough for the company to know what to do in the near term and guide its investigation into any underlying problem that might exist with the part in question or with the vehicle.
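A heavily reduced sketch of that logic (not UPS's actual system; the readings and thresholds below are invented) flags a part as soon as its recent sensor values drift well outside their historical range, without modelling why:

```python
import statistics

# Invented vibration readings: a long, healthy baseline and a drifting tail.
baseline = [1.0 + 0.05 * (((i * 7919) % 13) - 6) / 6 for i in range(500)]
recent = [1.02, 1.05, 1.11, 1.18, 1.26, 1.33]   # readings from the last few trips

mean = statistics.mean(baseline)
spread = statistics.stdev(baseline)

# Purely correlational rule: "patterns like this preceded failures before".
z_scores = [(x - mean) / spread for x in recent]
if sum(z > 3 for z in z_scores) >= 3:
    print("Flag the part: replace it at the next depot visit.")
else:
    print("No action needed.")
```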

A similar approach is being used to treat breakdowns of the human machine. Researchers in Canada are developing a big-data approach to spot infections in premature babies before overt symptoms appear. By converting 16 vital signs, including heartbeat, blood pressure, respiration, and blood-oxygen levels, into an information flow of more than 1,000 data points per second, they have been able to find correlations between very minor changes and more serious problems. Eventually, this technique will enable doctors to act earlier to save lives. Over time, recording these observations might also allow doctors to understand what actually causes such problems. But when a newborn’s health is at risk, simply knowing that something is likely to occur can be far more important than understanding exactly why.

Medicine provides another good example of why, with big data, seeing correlations can be enormously valuable, even when the underlying causes remain obscure. In February 2009, Google created a stir in health-care circles. Researchers at the company published a paper in Nature that showed how it was possible to track outbreaks of the seasonal flu using nothing more than the archived records of Google searches. Google handles more than a billion searches in the United States every day and stores them all. The company took the 50 million most commonly searched terms between 2003 and 2008 and compared them against historical influenza data from the Centers for Disease Control and Prevention. The idea was to discover whether the incidence of certain searches coincided with outbreaks of the flu — in other words, to see whether an increase in the frequency of certain Google searches conducted in a particular geographic area correlated with the CDC’s data on outbreaks of flu there. The CDC tracks actual patient visits to hospitals and clinics across the country, but the information it releases suffers from a reporting lag of a week or two — an eternity in the case of a pandemic. Google’s system, by contrast, would work in near-real time.

Google did not presume to know which queries would prove to be the best indicators. Instead, it ran all the terms through an algorithm that ranked how well they correlated with flu outbreaks. Then, the system tried combining the terms to see if that improved the model. Finally, after running nearly half a billion calculations against the data, Google identified 45 terms — words such as “headache” and “runny nose” — that had a strong correlation with the CDC’s data on flu outbreaks. All 45 terms related in some way to influenza. But with a billion searches a day, it would have been impossible for a person to guess which ones might work best and test only those.
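The ranking step can be illustrated with a toy version: correlate each term's weekly search share with the flu series and keep the strongest. The sketch below uses made-up numbers and plain Pearson correlation (statistics.correlation, Python 3.10+), which is only a stand-in for whatever Google actually computed.

```python
from statistics import correlation  # Python 3.10+

# Invented weekly series: CDC-style flu incidence and search shares for four terms.
flu = [2, 3, 5, 9, 14, 11, 6, 3]                      # cases per 1,000 doctor visits
searches = {
    "runny nose":    [1.0, 1.4, 2.1, 3.8, 5.9, 4.7, 2.5, 1.2],
    "headache":      [2.0, 2.2, 2.9, 4.1, 5.5, 5.0, 3.1, 2.3],
    "basketball":    [5.0, 4.7, 5.2, 4.9, 5.1, 4.8, 5.0, 5.3],
    "cheap flights": [3.0, 3.5, 2.8, 3.1, 2.9, 3.3, 3.0, 3.2],
}

ranked = sorted(
    ((term, correlation(series, flu)) for term, series in searches.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
for term, r in ranked:
    print(f"{term:>14}: r = {r:+.2f}")
# The flu-like terms come out strongly positive; the unrelated ones stay weak.
```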

Moreover, the data were imperfect. Since the data were never intended to be used in this way, misspellings and incomplete phrases were common. But the sheer size of the data set more than compensated for its messiness. The result, of course, was simply a correlation. It said nothing about the reasons why someone performed any particular search. Was it because the person felt ill, or heard sneezing in the next cubicle, or felt anxious after reading the news? Google’s system doesn’t know, and it doesn’t care. Indeed, last December, it seems that Google’s system may have overestimated the number of flu cases in the United States. This serves as a reminder that predictions are only probabilities and are not always correct, especially when the basis for the prediction — Internet searches — is in a constant state of change and vulnerable to outside influences, such as media reports. Still, big data can hint at the general direction of an ongoing development, and Google’s system did just that.

BACK-END OPERATIONS

Many technologists believe that big data traces its lineage back to the digital revolution of the 1980s, when advances in microprocessors and computer memory made it possible to analyze and store ever more information. That is only superficially the case. Computers and the Internet certainly aid big data by lowering the cost of collecting, storing, processing, and sharing information. But at its heart, big data is only the latest step in humanity’s quest to understand and quantify the world. To appreciate how this is the case, it helps to take a quick look behind us.

Appreciating people’s posteriors is the art and science of Shigeomi Koshimizu, a professor at the Advanced Institute of Industrial Technology in Tokyo. Few would think that the way a person sits constitutes information, but it can. When a person is seated, the contours of the body, its posture, and its weight distribution can all be quantified and tabulated. Koshimizu and his team of engineers convert backsides into data by measuring the pressure they exert at 360 different points with sensors placed in a car seat and by indexing each point on a scale of zero to 256. The result is a digital code that is unique to each individual. In a trial, the system was able to distinguish among a handful of people with 98 percent accuracy.
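A stripped-down version of the identification step (invented data and a plain nearest-neighbour match, not Koshimizu's actual method) treats each seating as a vector of 360 pressure values and assigns a new reading to the closest enrolled driver:

```python
import math
import random

random.seed(0)
POINTS = 360  # pressure sensors in the seat, each read here on a 0-255 scale

# Enrolled drivers, each with a characteristic pressure profile (invented).
enrolled = {name: [random.randint(0, 255) for _ in range(POINTS)]
            for name in ("alice", "bob", "carol")}

def seating(profile, jitter=6):
    """One seating event: the driver's profile plus a little sensor noise."""
    return [max(0, min(255, p + random.randint(-jitter, jitter))) for p in profile]

def identify(reading):
    """Return the enrolled driver whose stored profile is nearest to the reading."""
    return min(enrolled, key=lambda name: math.dist(enrolled[name], reading))

print(identify(seating(enrolled["bob"])))    # -> bob
# An anti-theft system would demand a password whenever even the nearest
# match is too far away to be trusted.
```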

The research is not asinine. Koshimizu’s plan is to adapt the technology as an antitheft system for cars. A vehicle equipped with it could recognize when someone other than an approved driver sat down behind the wheel and could demand a password to allow the car to function. Transforming sitting positions into data creates a viable service and a potentially lucrative business. And its usefulness may go far beyond deterring auto theft. For instance, the aggregated data might reveal clues about a relationship between drivers’ posture and road safety, such as telltale shifts in position prior to accidents. The system might also be able to sense when a driver slumps slightly from fatigue and send an alert or automatically apply the brakes.

Koshimizu took something that had never been treated as data — or even imagined to have an informational quality — and transformed it into a numerically quantified format. There is no good term yet for this sort of transformation, but “datafication” seems apt. Datafication is not the same as digitization, which takes analog content — books, films, photographs — and converts it into digital information, a sequence of ones and zeros that computers can read. Datafication is a far broader activity: taking all aspects of life and turning them into data. Google’s augmented-reality glasses datafy the gaze. Twitter datafies stray thoughts. LinkedIn datafies professional networks.

Once we datafy things, we can transform their purpose and turn the information into new forms of value. For example, IBM was granted a U.S. patent in 2012 for “securing premises using surface-based computing technology” — a technical way of describing a touch-sensitive floor covering, somewhat like a giant smartphone screen. Datafying the floor can open up all kinds of possibilities. The floor could be able to identify the objects on it, so that it might know to turn on lights in a room or open doors when a person entered. Moreover, it might identify individuals by their weight or by the way they stand and walk. It could tell if someone fell and did not get back up, an important feature for the elderly. Retailers could track the flow of customers through their stores. Once it becomes possible to turn activities of this kind into data that can be stored and analyzed, we can learn more about the world — things we could never know before because we could not measure them easily and cheaply.

BIG DATA IN THE BIG APPLE

Big data will have implications far beyond medicine and consumer goods: it will profoundly change how governments work and alter the nature of politics. When it comes to generating economic growth, providing public services, or fighting wars, those who can harness big data effectively will enjoy a significant edge over others. So far, the most exciting work is happening at the municipal level, where it is easier to access data and to experiment with the information. In an effort spearheaded by New York City Mayor Michael Bloomberg (who made a fortune in the data business), the city is using big data to improve public services and lower costs. One example is a new fire-prevention strategy.

Illegally subdivided buildings are far more likely than other buildings to go up in flames. The city gets 25,000 complaints about overcrowded buildings a year, but it has only 200 inspectors to respond. A small team of analytics specialists in the mayor’s office reckoned that big data could help resolve this imbalance between needs and resources. The team created a database of all 900,000 buildings in the city and augmented it with troves of data collected by 19 city agencies: records of tax liens, anomalies in utility usage, service cuts, missed payments, ambulance visits, local crime rates, rodent complaints, and more. Then, they compared this database to records of building fires from the past five years, ranked by severity, hoping to uncover correlations. Not surprisingly, among the predictors of a fire were the type of building and the year it was built. Less expected, however, was the finding that buildings obtaining permits for exterior brickwork correlated with lower risks of severe fire.

Using all this data allowed the team to create a system that could help them determine which overcrowding complaints needed urgent attention. None of the buildings’ characteristics they recorded caused fires; rather, they correlated with an increased or decreased risk of fire. That knowledge has proved immensely valuable: in the past, building inspectors issued vacate orders in 13 percent of their visits; using the new method, that figure rose to 70 percent — a huge efficiency gain.
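In spirit, that ranking can be reproduced with any off-the-shelf classifier over building attributes. The sketch below is a toy stand-in rather than the city team's model: the features, labels and coefficients are invented, and an ordinary scikit-learn logistic regression does the scoring.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Invented training data, one row per building:
# [age_years, tax_lien, utility_anomaly, brickwork_permit]
n = 5000
X = np.column_stack([
    rng.integers(1, 120, n),     # building age
    rng.integers(0, 2, n),       # tax lien on record
    rng.integers(0, 2, n),       # anomalous utility usage
    rng.integers(0, 2, n),       # recent exterior brickwork permit
])
# Invented ground truth: age, liens and anomalies raise fire risk,
# a brickwork permit lowers it (mirroring the article's finding).
logits = 0.02 * X[:, 0] + 0.8 * X[:, 1] + 0.6 * X[:, 2] - 0.9 * X[:, 3] - 3.0
y = rng.random(n) < 1 / (1 + np.exp(-logits))

model = LogisticRegression(max_iter=1000).fit(X, y)

# Rank incoming overcrowding complaints by predicted fire risk, so the
# 200 inspectors visit the riskiest buildings first.
complaints = np.array([[95, 1, 1, 0], [12, 0, 0, 1], [60, 1, 0, 0]])
for features, p in sorted(zip(complaints.tolist(),
                              model.predict_proba(complaints)[:, 1]),
                          key=lambda t: -t[1]):
    print(features, f"predicted risk: {p:.2f}")
```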

Of course, insurance companies have long used similar methods to estimate fire risks, but they mainly rely on only a handful of attributes and usually ones that intuitively correspond with fires. By contrast, New York City’s big-data approach was able to examine many more variables, including ones that would not at first seem to have any relation to fire risk. And the city’s model was cheaper and faster, since it made use of existing data. Most important, the big-data predictions are probably more on target, too.

Big data is also helping increase the transparency of democratic governance. A movement has grown up around the idea of “open data,” which goes beyond the freedom-of-information laws that are now commonplace in developed democracies. Supporters call on governments to make the vast amounts of innocuous data that they hold easily available to the public. The United States has been at the forefront, with its Data.gov website, and many other countries have followed.

At the same time as governments promote the use of big data, they will also need to protect citizens against unhealthy market dominance. Companies such as Google, Amazon, and Facebook — as well as lesser-known “data brokers,” such as Acxiom and Experian — are amassing vast amounts of information on everyone and everything. Antitrust laws protect against the monopolization of markets for goods and services such as software or media outlets, because the sizes of the markets for those goods are relatively easy to estimate. But how should governments apply antitrust rules to big data, a market that is hard to define and that is constantly changing form? Meanwhile, privacy will become an even bigger worry, since more data will almost certainly lead to more compromised private information, a downside of big data that current technologies and laws seem unlikely to prevent.

Regulations governing big data might even emerge as a battleground among countries. European governments are already scrutinizing Google over a raft of antitrust and privacy concerns, in a scenario reminiscent of the antitrust enforcement actions the European Commission took against Microsoft beginning a decade ago. Facebook might become a target for similar actions all over the world, because it holds so much data about individuals. Diplomats should brace for fights over whether to treat information flows as similar to free trade: in the future, when China censors Internet searches, it might face complaints not only about unjustly muzzling speech but also about unfairly restraining commerce.

BIG DATA OR BIG BROTHER?

States will need to help protect their citizens and their markets from new vulnerabilities caused by big data. But there is another potential dark side: big data could become Big Brother. In all countries, but particularly in nondemocratic ones, big data exacerbates the existing asymmetry of power between the state and the people.

The asymmetry could well become so great that it leads to big-data authoritarianism, a possibility vividly imagined in science-fiction movies such as Minority Report. That 2002 film took place in a near-future dystopia in which the character played by Tom Cruise headed a “Precrime” police unit that relied on clairvoyants whose visions identified people who were about to commit crimes. The plot revolves around the system’s obvious potential for error and, worse yet, its denial of free will.

Although the idea of identifying potential wrongdoers before they have committed a crime seems fanciful, big data has allowed some authorities to take it seriously. In 2007, the Department of Homeland Security launched a research project called FAST (Future Attribute Screening Technology), aimed at identifying potential terrorists by analyzing data about individuals’ vital signs, body language, and other physiological patterns. Police forces in many cities, including Los Angeles, Memphis, Richmond, and Santa Cruz, have adopted “predictive policing” software, which analyzes data on previous crimes to identify where and when the next ones might be committed.

For the moment, these systems do not identify specific individuals as suspects. But that is the direction in which things seem to be heading. Perhaps such systems would identify which young people are most likely to shoplift. There might be decent reasons to get so specific, especially when it comes to preventing negative social outcomes other than crime. For example, if social workers could tell with 95 percent accuracy which teenage girls would get pregnant or which high school boys would drop out of school, wouldn’t they be remiss if they did not step in to help? It sounds tempting. Prevention is better than punishment, after all. But even an intervention that did not admonish and instead provided assistance could be construed as a penalty — at the very least, one might be stigmatized in the eyes of others. In this case, the state’s actions would take the form of a penalty before any act were committed, obliterating the sanctity of free will.

Another worry is what could happen when governments put too much trust in the power of data. In his 1998 book, Seeing Like a State, the anthropologist James Scott documented the ways in which governments, in their zeal for quantification and data collection, sometimes end up making people’s lives miserable. They use maps to determine how to reorganize communities without first learning anything about the people who live there. They use long tables of data about harvests to decide to collectivize agriculture without knowing a whit about farming. They take all the imperfect, organic ways in which people have interacted over time and bend them to their needs, sometimes just to satisfy a desire for quantifiable order.

This misplaced trust in data can come back to bite. Organizations can be beguiled by data’s false charms and endow more meaning to the numbers than they deserve. That is one of the lessons of the Vietnam War. U.S. Secretary of Defense Robert McNamara became obsessed with using statistics as a way to measure the war’s progress. He and his colleagues fixated on the number of enemy fighters killed. Relied on by commanders and published daily in newspapers, the body count became the data point that defined an era. To the war’s supporters, it was proof of progress; to critics, it was evidence of the war’s immorality. Yet the statistics revealed very little about the complex reality of the conflict. The figures were frequently inaccurate and were of little value as a way to measure success. Although it is important to learn from data to improve lives, common sense must be permitted to override the spreadsheets.

HUMAN TOUCH

Big data is poised to reshape the way we live, work, and think. A worldview built on the importance of causation is being challenged by a preponderance of correlations. The possession of knowledge, which once meant an understanding of the past, is coming to mean an ability to predict the future. The challenges posed by big data will not be easy to resolve. Rather, they are simply the next step in the timeless debate over how to best understand the world.

Still, big data will become integral to addressing many of the world’s pressing problems. Tackling climate change will require analyzing pollution data to understand where best to focus efforts and find ways to mitigate problems. The sensors being placed all over the world, including those embedded in smartphones, provide a wealth of data that will allow climatologists to more accurately model global warming. Meanwhile, improving and lowering the cost of health care, especially for the world’s poor, will make it necessary to automate some tasks that currently require human judgment but could be done by a computer, such as examining biopsies for cancerous cells or detecting infections before symptoms fully emerge.

Ultimately, big data marks the moment when the “information society” finally fulfills the promise implied by its name. The data take center stage. All those digital bits that have been gathered can now be harnessed in novel ways to serve new purposes and unlock new forms of value. But this requires a new way of thinking and will challenge institutions and identities. In a world where data shape decisions more and more, what purpose will remain for people, or for intuition, or for going against the facts? If everyone appeals to the data and harnesses big-data tools, perhaps what will become the central point of differentiation is unpredictability: the human element of instinct, risk taking, accidents, and even error. If so, then there will be a special need to carve out a place for the human: to reserve space for intuition, common sense, and serendipity to ensure that they are not crowded out by data and machine-made answers.

This has important implications for the notion of progress in society. Big data enables us to experiment faster and explore more leads. These advantages should produce more innovation. But at times, the spark of invention becomes what the data do not say. That is something that no amount of data can ever confirm or corroborate, since it has yet to exist. If Henry Ford had queried big-data algorithms to discover what his customers wanted, they would have come back with “a faster horse,” to recast his famous line. In a world of big data, it is the most human traits that will need to be fostered — creativity, intuition, and intellectual ambition — since human ingenuity is the source of progress.

Big data is a resource and a tool. It is meant to inform, rather than explain; it points toward understanding, but it can still lead to misunderstanding, depending on how well it is wielded. And however dazzling the power of big data appears, its seductive glimmer must never blind us to its inherent imperfections. Rather, we must adopt this technology with an appreciation not just of its power but also of its limitations.

 
Copyright © 2002-2012 by the Council on Foreign Relations, Inc.

Link: http://www.foreignaffairs.com/articles/139104/kenneth-neil-cukier-and-viktor-mayer-schoenberger/the-rise-of-big-data

On Globalization

I managed to finish this book on the road recently. It raises plenty of issues worth chewing on: public goods, governance of the commons, international development, reining in the greed and aggression of market institutions, and so on. All in all, a book worth reading!

Soros introduces the book himself (copied from Sách Hay):

My purpose in writing this book is not only to discuss how the global capitalist system works, but also to propose some ways of improving it. To that end I have adopted a narrower definition of globalization: I equate globalization with the free movement of capital and the growing dominance of financial markets and multinational corporations over national economies. This approach has the advantage of narrowing the scope of the discussion. I can assert that globalization as it stands is out of balance: the development of international institutions has not kept pace with the growth of international financial markets, and political arrangements lag far behind the process of economic globalization. On that basis I propose practical measures to make global capitalism more stable and more equitable…

… In this book I argue strenuously against the dangers of market fundamentalism – the belief that the common interest emerges from the unrestrained pursuit of self-interest. Since September 11, America's pursuit of military power, first through the proclamation of the Bush doctrine and then by carrying that doctrine into the invasion of Iraq, has become an even more serious threat to the world." – Soros

A new economic paradigm?

This handsomely bearded gentleman (then again, show me a well-bearded man who isn't handsome!) has a short piece in the Financial Times that has already been reposted all over the place.

The other day in Bangkok, over drinks with J the Chatterbox, we got into an argument touching on this very topic while talking about climate change, REDD and the like. I will add more detailed comments later.

A few extra notes to go with the old man's piece:

 


Needed: a new economic paradigm
By Joseph Stiglitz

The blame game continues over who is responsible for the worst recession since the Great Depression – the financiers who did such a bad job of managing risk or the regulators who failed to stop them. But the economics profession bears more than a little culpability. It provided the models that gave comfort to regulators that markets could be self-regulated; that they were efficient and self-correcting. The efficient markets hypothesis – the notion that market prices fully revealed all the relevant information – ruled the day. Today, not only is our economy in a shambles but so too is the economic paradigm that predominated in the years before the crisis – or at least it should be.

It is hard for non-economists to understand how peculiar the predominant macroeconomic models were. Many assumed demand had to equal supply – and that meant there could be no unemployment. (Right now a lot of people are just enjoying an extra dose of leisure; why they are unhappy is a matter for psychiatry, not economics.) Many used “representative agent models” – all individuals were assumed to be identical, and this meant there could be no meaningful financial markets (who would be lending money to whom?). Information asymmetries, the cornerstone of modern economics, also had no place: they could arise only if individuals suffered from acute schizophrenia, an assumption incompatible with another of the favoured assumptions, full rationality.

Bad models lead to bad policy: central banks, for instance, focused on the small economic inefficiencies arising from inflation, to the exclusion of the far, far greater inefficiencies arising from dysfunctional financial markets and asset price bubbles. After all, their models said that financial markets were always efficient. Remarkably, standard macroeconomic models did not even incorporate adequate analyses of banks. No wonder former Federal Reserve chairman Alan Greenspan, in his famous mea culpa, could express his surprise that banks did not do a better job at risk management. The real surprise was his surprise: even a cursory look at the perverse incentives confronting banks and their managers would have predicted short-sighted behaviour with excessive risk-taking.

The standard models should be graded on their predictive ability – and especially their ability to predict in circumstances that matter. Increasing the accuracy of forecasts in normal times (knowing whether the economy will grow at 2.4 per cent or 2.5 per cent) is far less important than knowing the risk of a major recession. In this the models failed miserably, and the predictions of policymakers based on them have, by now, totally undermined their credibility. Policymakers did not see the crisis coming, said its effects were contained after the bubble burst, and thought the consequences would be far more short-lived and less severe than they have been.

Fortunately, while much of the mainstream focused on these flawed models, numerous researchers were engaged in developing alternative approaches. Economic theory had already shown that many of the central conclusions of the standard model were not robust – that is, small changes in assumptions led to large changes in conclusions. Even small information asymmetries, or imperfections in risk markets, meant that markets were not efficient. Celebrated results, such as Adam Smith’s invisible hand, did not hold; the invisible hand was invisible because it was not there. Few today would argue that bank managers, in their pursuit of their self-interest, had promoted the well-being of the global economy.

Monetary policy affects the economy through the availability of credit – and the terms on which it is made available, especially to small- and medium-sized enterprises. Understanding this requires us to analyse banks and their interaction with the shadow banking sector. The spread between the Treasury bill rate and lending rates can change markedly. With a few exceptions, most central banks paid little attention to systemic risk and the risks posed by credit interlinkages. Years before the crisis, a few researchers focused on these issues, including the possibility of the bankruptcy cascades that were to play out in such an important way in the crisis. This is an example of the importance of modelling carefully complex interactions among economic agents (households, companies, banks) – interactions that cannot be studied in models in which everyone is assumed to be the same. Even the sacrosanct assumption of rationality has been attacked: there are systemic deviations from rationality and consequences for macroeconomic behaviour that need to be explored.
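The cascade mechanism itself is easy to sketch. The toy model below uses entirely invented balance sheets (it is not any published model): banks hold claims on one another, and a bank fails once its write-offs on already-failed counterparties exceed its capital, so a single initial failure can propagate through the network.

```python
# Toy bankruptcy cascade on a small credit network (illustrative numbers only).
# exposures[a][b] = amount bank a is owed by bank b.
exposures = {
    "A": {"B": 60, "C": 10},
    "B": {"C": 50, "D": 20},
    "C": {"D": 40},
    "D": {},
}
capital = {"A": 30, "B": 40, "C": 35, "D": 25}

def cascade(first_failure):
    failed = {first_failure}
    changed = True
    while changed:
        changed = False
        for bank, claims in exposures.items():
            if bank in failed:
                continue
            # Write off every claim on a counterparty that has already failed.
            loss = sum(amount for debtor, amount in claims.items() if debtor in failed)
            if loss > capital[bank]:
                failed.add(bank)
                changed = True
    return failed

print(cascade("D"))
# D's failure sinks C (40 > 35), then B (50 + 20 > 40), then A (60 + 10 > 30):
# a cascade that a representative-agent model cannot even express.
```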

Changing paradigms is not easy. Too many have invested too much in the wrong models. Like the Ptolemaic attempts to preserve earth-centric views of the universe, there will be heroic efforts to add complexities and refinements to the standard paradigm. The resulting models will be an improvement and policies based on them may do better, but they too are likely to fail. Nothing less than a paradigm shift will do.

But a new paradigm, I believe, is within our grasp: the intellectual building blocks are there and the Institute for New Economic Thinking is providing a framework for bringing the diverse group of scholars striving to create this new paradigm together. What is at stake, of course, is more than just the credibility of the economics profession or that of the policymakers who rely on their ideas: it is the stability and prosperity of our economies.

The writer, recipient of the 2001 Nobel Memorial Prize in economics, is University Professor at Columbia University. He served as chairman of President Bill Clinton’s Council of Economic Advisers and as chief economist of the World Bank. He is on the Advisory Board of INET.

Capitalism 3.0: A Guide to Reclaiming the Commons

I finished this book just as the Nobel Committee awarded this year's economics prize to Prof. Elinor Ostrom. For an introduction to this year's prize and the problem of managing the commons, see here: http://www.thiennhien.net/news/193/ARTICLE/9757/2009-10-26.html


You can read an introduction to the book here >>>

The governance and ownership of the commons will probably become a mainstream concern of economic research and applied economic theory in the years ahead. Ecosystem services and natural resources (in the broad sense, not just minerals and timber) cannot stay free forever; in the long run their cost has to be priced in. Only then can fairness and justice in access to the environment and to ecosystems be assured.

Redefining development

Title: Development Redefined: How the Market Met Its Match

Authors: Robin Broad and John Cavanagh. They are a married couple 🙂

I just finished this one. The couple travelled all over the place together, poking around and mulling things over, and wrote this book critiquing the grand old men of development theory and economics.

There is plenty on how the World Bank, the IMF and the other bigwigs have been leading everyone around by the nose. The part about the "light at the end of the tunnel", however, is rather thin.

Too sleepy now. I will fill in more later when I have time (if I am still in the mood).