
Wednesday, April 27, 2011

The New Geopolitics of Food

From the Middle East to Madagascar, high prices are spawning land grabs and ousting dictators. Welcome to the 21st-century food wars.

BY LESTER R. BROWN | MAY/JUNE 2011

In the United States, when world wheat prices rise by 75 percent, as they have over the last year, it means the difference between a $2 loaf of bread and a loaf costing maybe $2.10. If, however, you live in New Delhi, those skyrocketing costs really matter: A doubling in the world price of wheat actually means that the wheat you carry home from the market to hand-grind into flour for chapatis costs twice as much. And the same is true with rice. If the world price of rice doubles, so does the price of rice in your neighborhood market in Jakarta. And so does the cost of the bowl of boiled rice on an Indonesian family's dinner table.
Welcome to the new food economics of 2011: Prices are climbing, but the impact is not at all being felt equally. For Americans, who spend less than one-tenth of their income in the supermarket, the soaring food prices we've seen so far this year are an annoyance, not a calamity. But for the planet's poorest 2 billion people, who spend 50 to 70 percent of their income on food, these soaring prices may mean going from two meals a day to one. Those who are barely hanging on to the lower rungs of the global economic ladder risk losing their grip entirely. This can contribute -- and it has -- to revolutions and upheaval.
Already in 2011, the U.N. Food Price Index has eclipsed its previous all-time global high; as of March it had climbed for eight consecutive months. With this year's harvest predicted to fall short, with governments in the Middle East and Africa teetering as a result of the price spikes, and with anxious markets sustaining one shock after another, food has quickly become the hidden driver of world politics. And crises like these are going to become increasingly common. The new geopolitics of food looks a whole lot more volatile -- and a whole lot more contentious -- than it used to. Scarcity is the new norm.
Until recently, sudden price surges just didn't matter as much, as they were quickly followed by a return to the relatively low food prices that helped shape the political stability of the late 20th century across much of the globe. But now both the causes and consequences are ominously different.



In many ways, this is a resumption of the 2007-2008 food crisis, which subsided not because the world somehow came together to solve its grain crunch once and for all, but because the Great Recession tempered growth in demand even as favorable weather helped farmers produce the largest grain harvest on record. Historically, price spikes tended to be almost exclusively driven by unusual weather -- a monsoon failure in India, a drought in the former Soviet Union, a heat wave in the U.S. Midwest. Such events were always disruptive, but thankfully infrequent. Unfortunately, today's price hikes are driven by trends that are both elevating demand and making it more difficult to increase production: among them, a rapidly expanding population, crop-withering temperature increases, and irrigation wells running dry. Each night, there are 219,000 additional people to feed at the global dinner table.
More alarming still, the world is losing its ability to soften the effect of shortages. In response to previous price surges, the United States, the world's largest grain producer, was effectively able to steer the world away from potential catastrophe. From the mid-20th century until 1995, the United States had either grain surpluses or idle cropland that could be planted to rescue countries in trouble. When the Indian monsoon failed in 1965, for example, President Lyndon Johnson's administration shipped one-fifth of the U.S. wheat crop to India, successfully staving off famine. We can't do that anymore; the safety cushion is gone.
That's why the food crisis of 2011 is for real, and why it may bring with it yet more bread riots cum political revolutions. What if the upheavals that greeted dictators Zine el-Abidine Ben Ali in Tunisia, Hosni Mubarak in Egypt, and Muammar al-Qaddafi in Libya (a country that imports 90 percent of its grain) are not the end of the story, but the beginning of it? Get ready, farmers and foreign ministers alike, for a new era in which world food scarcity increasingly shapes global politics.

THE DOUBLING OF WORLD grain prices since early 2007 has been driven primarily by two factors: accelerating growth in demand and the increasing difficulty of rapidly expanding production. The result is a world that looks strikingly different from the bountiful global grain economy of the last century. What will the geopolitics of food look like in a new era dominated by scarcity? Even at this early stage, we can see at least the broad outlines of the emerging food economy.
On the demand side, farmers now face clear sources of increasing pressure. The first is population growth. Each year the world's farmers must feed 80 million additional people, nearly all of them in developing countries. The world's population has nearly doubled since 1970 and is headed toward 9 billion by midcentury. Some 3 billion people, meanwhile, are also trying to move up the food chain, consuming more meat, milk, and eggs. As more families in China and elsewhere enter the middle class, they expect to eat better. But as global consumption of grain-intensive livestock products climbs, so does the demand for the extra corn and soybeans needed to feed all that livestock. (Grain consumption per person in the United States, for example, is four times that in India, where little grain is converted into animal protein. For now.)
At the same time, the United States, which once was able to act as a global buffer of sorts against poor harvests elsewhere, is now converting massive quantities of grain into fuel for cars, even as world grain consumption, which is already up to roughly 2.2 billion metric tons per year, is growing at an accelerating rate. A decade ago, the growth in consumption was 20 million tons per year. More recently it has risen by 40 million tons every year. But the rate at which the United States is converting grain into ethanol has grown even faster. In 2010, the United States harvested nearly 400 million tons of grain, of which 126 million tons went to ethanol fuel distilleries (up from 16 million tons in 2000). This massive capacity to convert grain into fuel means that the price of grain is now tied to the price of oil. So if oil goes to $150 per barrel or more, the price of grain will follow it upward as it becomes ever more profitable to convert grain into oil substitutes. And it's not just a U.S. phenomenon: Brazil, which distills ethanol from sugar cane, ranks second in production after the United States, while the European Union's goal of getting 10 percent of its transport energy from renewables, mostly biofuels, by 2020 is also diverting land from food crops.
This is not merely a story about the booming demand for food. Everything from falling water tables to eroding soils and the consequences of global warming means that the world's food supply is unlikely to keep up with our collectively growing appetites. Take climate change: The rule of thumb among crop ecologists is that for every 1 degree Celsius rise in temperature above the growing season optimum, farmers can expect a 10 percent decline in grain yields. This relationship was borne out all too dramatically during the 2010 heat wave in Russia, which reduced the country's grain harvest by nearly 40 percent.
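In code, the crop ecologists' rule of thumb amounts to a simple linear penalty on yield. The short Python sketch below illustrates it with made-up numbers; only the 10-percent-per-degree figure comes from the article, and real yield responses are of course messier than a straight line.

    # Illustration of the rule of thumb quoted above: roughly a 10 percent
    # decline in grain yield for each 1 degree Celsius above the growing-season
    # optimum. Baseline yield and temperatures are hypothetical placeholders.
    def expected_yield(base_yield_tons, temp_c, optimum_c, loss_per_degree=0.10):
        """Apply the 10-percent-per-degree penalty to a baseline harvest."""
        excess = max(0.0, temp_c - optimum_c)
        return base_yield_tons * max(0.0, 1.0 - loss_per_degree * excess)

    # A hypothetical 100-million-ton harvest facing a 4 degree C overshoot loses
    # about 40 percent -- the same order of magnitude as Russia's 2010 shortfall.
    print(expected_yield(100.0, temp_c=29.0, optimum_c=25.0))  # -> 60.0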
While temperatures are rising, water tables are falling as farmers overpump for irrigation. This artificially inflates food production in the short run, creating a food bubble that bursts when aquifers are depleted and pumping is necessarily reduced to the rate of recharge. In arid Saudi Arabia, irrigation had surprisingly enabled the country to be self-sufficient in wheat for more than 20 years; now, wheat production is collapsing because the non-replenishable aquifer the country uses for irrigation is largely depleted. The Saudis soon will be importing all their grain.
Saudi Arabia is only one of some 18 countries with water-based food bubbles. Altogether, more than half the world's people live in countries where water tables are falling. The politically troubled Arab Middle East is the first geographic region where grain production has peaked and begun to decline because of water shortages, even as populations continue to grow. Grain production is already going down in Syria and Iraq and may soon decline in Yemen. But the largest food bubbles are in India and China. In India, where farmers have drilled some 20 million irrigation wells, water tables are falling and the wells are starting to go dry. The World Bank reports that 175 million Indians are being fed with grain produced by overpumping. In China, overpumping is concentrated in the North China Plain, which produces half of China's wheat and a third of its corn. An estimated 130 million Chinese are currently fed by overpumping. How will these countries make up for the inevitable shortfalls when the aquifers are depleted?
Even as we are running our wells dry, we are also mismanaging our soils, creating new deserts. Soil erosion as a result of overplowing and land mismanagement is undermining the productivity of one-third of the world's cropland. How severe is it? Look at satellite images showing two huge new dust bowls: one stretching across northern and western China and western Mongolia; the other across central Africa. Wang Tao, a leading Chinese desert scholar, reports that each year some 1,400 square miles of land in northern China turn to desert. In Mongolia and Lesotho, grain harvests have shrunk by half or more over the last few decades. North Korea and Haiti are also suffering from heavy soil losses; both countries face famine if they lose international food aid. Civilization can survive the loss of its oil reserves, but it cannot survive the loss of its soil reserves.
Beyond the changes in the environment that make it ever harder to meet human demand, there's an important intangible factor to consider: Over the last half-century or so, we have come to take agricultural progress for granted. Decade after decade, advancing technology underpinned steady gains in raising land productivity. Indeed, world grain yield per acre has tripled since 1950. But now that era is coming to an end in some of the more agriculturally advanced countries, where farmers are already using all available technologies to raise yields. In effect, the farmers have caught up with the scientists. After climbing for a century, rice yield per acre in Japan has not risen at all for 16 years. In China, yields may level off soon. Just those two countries alone account for one-third of the world's rice harvest. Meanwhile, wheat yields have plateaued in Britain, France, and Germany -- Western Europe's three largest wheat producers.
IN THIS ERA OF TIGHTENING world food supplies, the ability to grow food is fast becoming a new form of geopolitical leverage, and countries are scrambling to secure their own parochial interests at the expense of the common good.
The first signs of trouble came in 2007, when farmers began having difficulty keeping up with the growth in global demand for grain. Grain and soybean prices started to climb, tripling by mid-2008. In response, many exporting countries tried to control the rise of domestic food prices by restricting exports. Among them were Russia and Argentina, two leading wheat exporters. Vietnam, the No. 2 rice exporter, banned exports entirely for several months in early 2008. So did several other smaller exporters of grain.
With exporting countries restricting exports in 2007 and 2008, importing countries panicked. No longer able to rely on the market to supply the grain they needed, several countries took the novel step of trying to negotiate long-term grain-supply agreements with exporting countries. The Philippines, for instance, negotiated a three-year agreement with Vietnam for 1.5 million tons of rice per year. A delegation of Yemenis traveled to Australia with a similar goal in mind, but had no luck. In a seller's market, exporters were reluctant to make long-term commitments.
Fearing they might not be able to buy needed grain from the market, some of the more affluent countries, led by Saudi Arabia, South Korea, and China, took the unusual step in 2008 of buying or leasing land in other countries on which to grow grain for themselves. Most of these land acquisitions are in Africa, where some governments lease cropland for less than $1 per acre per year. Among the principal destinations were Ethiopia and Sudan, countries where millions of people are being sustained with food from the U.N. World Food Program. That the governments of these two countries are willing to sell land to foreign interests when their own people are hungry is a sad commentary on their leadership.
By the end of 2009, hundreds of land acquisition deals had been negotiated, some of them exceeding a million acres. A 2010 World Bank analysis of these "land grabs" reported that a total of nearly 140 million acres were involved -- an area that exceeds the cropland devoted to corn and wheat combined in the United States. Such acquisitions also typically involve water rights, meaning that land grabs potentially affect all downstream countries as well. Any water extracted from the upper Nile River basin to irrigate crops in Ethiopia or Sudan, for instance, will now not reach Egypt, upending the delicate water politics of the Nile by adding new countries with which Egypt must negotiate.
The potential for conflict -- and not just over water -- is high. Many of the land deals have been made in secret, and in most cases, the land involved was already in use by villagers when it was sold or leased. Often those already farming the land were neither consulted about nor even informed of the new arrangements. And because there typically are no formal land titles in many developing-country villages, the farmers who lost their land have had little backing to bring their cases to court. Reporter John Vidal, writing in Britain's Observer, quotes Nyikaw Ochalla from Ethiopia's Gambella region: "The foreign companies are arriving in large numbers, depriving people of land they have used for centuries. There is no consultation with the indigenous population. The deals are done secretly. The only thing the local people see is people coming with lots of tractors to invade their lands."
Local hostility toward such land grabs is the rule, not the exception. In 2007, as food prices were starting to rise, China signed an agreement with the Philippines to lease 2.5 million acres of land slated for food crops that would be shipped home. Once word leaked, the public outcry -- much of it from Filipino farmers -- forced Manila to suspend the agreement. A similar uproar rocked Madagascar, where a South Korean firm, Daewoo Logistics, had pursued rights to more than 3 million acres of land. Word of the deal helped stoke a political furor that toppled the government and forced cancellation of the agreement. Indeed, few things are more likely to fuel insurgencies than taking land from people. Agricultural equipment is easily sabotaged. If ripe fields of grain are torched, they burn quickly.
Not only are these deals risky, but foreign investors producing food in a country full of hungry people face another political question of how to get the grain out. Will villagers permit trucks laden with grain headed for port cities to proceed when they themselves may be on the verge of starvation? The potential for political instability in countries where villagers have lost their land and their livelihoods is high. Conflicts could easily develop between investor and host countries.
These acquisitions represent a potential investment in agriculture in developing countries of an estimated $50 billion. But it could take many years to realize any substantial production gains. The public infrastructure for modern market-oriented agriculture does not yet exist in most of Africa. In some countries it will take years just to build the roads and ports needed to bring in agricultural inputs such as fertilizer and to export farm products. Beyond that, modern agriculture requires its own infrastructure: machine sheds, grain-drying equipment, silos, fertilizer storage sheds, fuel storage facilities, equipment repair and maintenance services, well-drilling equipment, irrigation pumps, and energy to power the pumps. Overall, development of the land acquired to date appears to be moving very slowly.
So how much will all this expand world food output? We don't know, but the World Bank analysis indicates that only 37 percent of the projects will be devoted to food crops. Most of the land bought up so far will be used to produce biofuels and other industrial crops.
Even if some of these projects do eventually boost land productivity, who will benefit? If virtually all the inputs -- the farm equipment, the fertilizer, the pesticides, the seeds -- are brought in from abroad and if all the output is shipped out of the country, it will contribute little to the host country's economy. At best, locals may find work as farm laborers, but in highly mechanized operations, the jobs will be few. At worst, impoverished countries like Mozambique and Sudan will be left with less land and water with which to feed their already hungry populations. Thus far the land grabs have contributed more to stirring unrest than to expanding food production.
And this rich country-poor country divide could grow even more pronounced -- and soon. This January, a new stage in the scramble among importing countries to secure food began to unfold when South Korea, which imports 70 percent of its grain, announced that it was creating a new public-private entity that will be responsible for acquiring part of this grain. With an initial office in Chicago, the plan is to bypass the large international trading firms by buying grain directly from U.S. farmers. As the Koreans acquire their own grain elevators, they may well sign multiyear delivery contracts with farmers, agreeing to buy specified quantities of wheat, corn, or soybeans at a fixed price.
Other importers will not stand idly by as South Korea tries to tie up a portion of the U.S. grain harvest even before it gets to market. The enterprising Koreans may soon be joined by China, Japan, Saudi Arabia, and other leading importers. Although South Korea's initial focus is the United States, far and away the world's largest grain exporter, it may later consider brokering deals with Canada, Australia, Argentina, and other major exporters. This is happening just as China may be on the verge of entering the U.S. market as a potentially massive importer of grain. With China's 1.4 billion increasingly affluent consumers starting to compete with U.S. consumers for the U.S. grain harvest, cheap food, seen by many as an American birthright, may be coming to an end.
No one knows where this intensifying competition for food supplies will go, but the world seems to be moving away from the international cooperation that evolved over several decades following World War II to an every-country-for-itself philosophy. Food nationalism may help secure food supplies for individual affluent countries, but it does little to enhance world food security. Indeed, the low-income countries that host land grabs or import grain will likely see their food situation deteriorate.
AFTER THE CARNAGE of two world wars and the economic missteps that led to the Great Depression, countries joined together in 1945 to create the United Nations, finally realizing that in the modern world we cannot live in isolation, tempting though that might be. The International Monetary Fund was created to help manage the monetary system and promote economic stability and progress. Within the U.N. system, specialized agencies from the World Health Organization to the Food and Agriculture Organization (FAO) play major roles in the world today. All this has fostered international cooperation.
But while the FAO collects and analyzes global agricultural data and provides technical assistance, there is no organized effort to ensure the adequacy of world food supplies. Indeed, most international negotiations on agricultural trade until recently focused on access to markets, with the United States, Canada, Australia, and Argentina persistently pressing Europe and Japan to open their highly protected agricultural markets. But in the first decade of this century, access to supplies has emerged as the overriding issue as the world transitions from an era of food surpluses to a new politics of food scarcity.
At the same time, the U.S. food aid program that once worked to fend off famine wherever it threatened has largely been replaced by the U.N. World Food Program (WFP), where the United States is the leading donor. The WFP now has food-assistance operations in some 70 countries and an annual budget of $4 billion. There is little international coordination otherwise.
French President Nicolas Sarkozy -- the reigning president of the G-20 -- is proposing to deal with rising food prices by curbing speculation in commodity markets. Useful though this may be, it treats the symptoms of growing food insecurity, not the causes, such as population growth and climate change. The world now needs to focus not only on agricultural policy, but on a structure that integrates it with energy, population, and water policies, each of which directly affects food security.
But that is not happening. Instead, as land and water become scarcer, as the Earth's temperature rises, and as world food security deteriorates, a dangerous geopolitics of food scarcity is emerging. Land grabbing, water grabbing, and buying grain directly from farmers in exporting countries are now integral parts of a global power struggle for food security.
With grain stocks low and climate volatility increasing, the risks are also increasing. We are now so close to the edge that a breakdown in the food system could come at any time. Consider, for example, what would have happened if the 2010 heat wave that was centered in Moscow had instead been centered in Chicago. In round numbers, the 40 percent drop in Russia's hoped-for harvest of roughly 100 million tons cost the world 40 million tons of grain, but a 40 percent drop in the far larger U.S. grain harvest of 400 million tons would have cost 160 million tons. The world's carryover stocks of grain (the amount in the bin when the new harvest begins) would have dropped to just 52 days of consumption. This level would have been not only the lowest on record, but also well below the 62-day carryover that set the stage for the 2007-2008 tripling of world grain prices.
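The counterfactual rests on nothing more than the round numbers already given; a reader can redo the arithmetic in a few lines. The sketch below uses the roughly 2.2-billion-ton annual world consumption figure cited earlier to convert tons into days of carryover, so the result is an approximation, not the author's exact calculation.

    # Back-of-envelope check of the Moscow-vs-Chicago counterfactual above.
    # World consumption (~2.2 billion tons/year) comes from earlier in the
    # article; the rest are the paragraph's own round numbers.
    WORLD_CONSUMPTION_TONS = 2.2e9
    TONS_PER_DAY = WORLD_CONSUMPTION_TONS / 365        # roughly 6 million tons/day

    russia_loss = 0.40 * 100e6    # 40% of a hoped-for ~100-million-ton harvest
    us_loss = 0.40 * 400e6        # 40% of a ~400-million-ton U.S. harvest
    print(russia_loss / 1e6, us_loss / 1e6)            # 40.0 vs 160.0 million tons

    # The extra 120 million tons lost in the U.S. scenario, expressed as days of
    # world consumption drained from carryover stocks -- roughly 20 days, enough
    # to push stocks down toward the 52-day level mentioned above.
    print(round((us_loss - russia_loss) / TONS_PER_DAY))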
Then what? There would have been chaos in world grain markets. Grain prices would have climbed off the charts. Some grain-exporting countries, trying to hold down domestic food prices, would have restricted or even banned exports, as they did in 2007 and 2008. The TV news would have been dominated not by the hundreds of fires in the Russian countryside, but by footage of food riots in low-income grain-importing countries and reports of governments falling as hunger spread out of control. Oil-exporting countries that import grain would have been trying to barter oil for grain, and low-income grain importers would have lost out. With governments toppling and confidence in the world grain market shattered, the global economy could have started to unravel.
We may not always be so lucky. At issue now is whether the world can go beyond focusing on the symptoms of the deteriorating food situation and instead attack the underlying causes. If we cannot produce higher crop yields with less water and conserve fertile soils, many agricultural areas will cease to be viable. And this goes far beyond farmers. If we cannot move at wartime speed to stabilize the climate, we may not be able to avoid runaway food prices. If we cannot accelerate the shift to smaller families and stabilize the world population sooner rather than later, the ranks of the hungry will almost certainly continue to expand. The time to act is now -- before the food crisis of 2011 becomes the new normal.

More Than 1 Billion People Are Hungry in the World

But what if the experts are wrong?

BY ABHIJIT BANERJEE, ESTHER DUFLO | MAY/JUNE 2011

For many in the West, poverty is almost synonymous with hunger. Indeed, the announcement by the United Nations Food and Agriculture Organization in 2009 that more than 1 billion people are suffering from hunger grabbed headlines in a way that any number of World Bank estimates of how many poor people live on less than a dollar a day never did.
But is it really true? Are there really more than a billion people going to bed hungry each night? Our research on this question has taken us to rural villages and teeming urban slums around the world, collecting data and speaking with poor people about what they eat and what else they buy, from Morocco to Kenya, Indonesia to India. We've also tapped into a wealth of insights from our academic colleagues. What we've found is that the story of hunger, and of poverty more broadly, is far more complex than any one statistic or grand theory; it is a world where those without enough to eat may save up to buy a TV instead, where more money doesn't necessarily translate into more food, and where making rice cheaper can sometimes even lead people to buy less rice.
But unfortunately, this is not always the world as the experts view it. All too many of them still promote sweeping, ideological solutions to problems that defy one-size-fits-all answers, arguing over foreign aid, for example, while the facts on the ground bear little resemblance to the fierce policy battles they wage.
Jeffrey Sachs, an advisor to the United Nations and director of Columbia University's Earth Institute, is one such expert. In books and countless speeches and television appearances, he has argued that poor countries are poor because they are hot, infertile, malaria-infested, and often landlocked; these factors make it hard for them to be productive without an initial large investment to help them deal with such endemic problems. But they cannot pay for the investments precisely because they are poor -- they are in what economists call a "poverty trap." Until something is done about these problems, neither free markets nor democracy will do very much for them.
But then there are others, equally vocal, who believe that all of Sachs's answers are wrong. William Easterly, who battles Sachs from New York University at the other end of Manhattan, has become one of the most influential aid critics in his books, The Elusive Quest for Growth and The White Man's Burden. Dambisa Moyo, an economist who worked at Goldman Sachs and the World Bank, has joined her voice to Easterly's with her recent book, Dead Aid. Both argue that aid does more bad than good. It prevents people from searching for their own solutions, while corrupting and undermining local institutions and creating a self-perpetuating lobby of aid agencies. The best bet for poor countries, they argue, is to rely on one simple idea: When markets are free and the incentives are right, people can find ways to solve their problems. They do not need handouts from foreigners or their own governments. In this sense, the aid pessimists are actually quite optimistic about the way the world works. According to Easterly, there is no such thing as a poverty trap.
This debate cannot be solved in the abstract. To find out whether there are in fact poverty traps, and, if so, where they are and how to help the poor get out of them, we need to better understand the concrete problems they face. Some aid programs help more than others, but which ones? Finding out required us to step out of the office and look more carefully at the world. In 2003, we founded what became the Abdul Latif Jameel Poverty Action Lab, or J-PAL. A key part of our mission is to use randomized control trials -- similar to the experiments used in medicine to test the effectiveness of a drug -- to understand what works and what doesn't in the real-world fight against poverty. In practical terms, that meant we'd have to start understanding how the poor really live their lives.
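For readers unfamiliar with the method, here is a minimal sketch of the core logic of a randomized evaluation: assign a program at random, then compare average outcomes across the treated and control groups. The data are simulated purely for illustration and do not come from any J-PAL study; real evaluations involve far more careful design and statistical inference.

    # Toy illustration of a randomized control trial: random assignment makes the
    # simple difference in group means an estimate of the program's average effect.
    import random

    random.seed(0)
    households = 1000
    treated = [random.random() < 0.5 for _ in range(households)]

    # Hypothetical outcome (say, days of adequate food per month): a noisy
    # baseline plus a built-in effect of about 2 days for treated households.
    outcomes = [20 + random.gauss(0, 4) + (2 if t else 0) for t in treated]

    n_treat = sum(treated)
    treat_mean = sum(y for y, t in zip(outcomes, treated) if t) / n_treat
    ctrl_mean = sum(y for y, t in zip(outcomes, treated) if not t) / (households - n_treat)
    print(round(treat_mean - ctrl_mean, 2))   # close to the built-in effect of 2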
Take, for example, Pak Solhin, who lives in a small village in West Java, Indonesia. He once explained to us exactly how a poverty trap worked. His parents used to have a bit of land, but they also had 13 children and had to build so many houses for each of them and their families that there was no land left for cultivation. Pak Solhin had been working as a casual agricultural worker, which paid up to 10,000 rupiah per day (about $2) for work in the fields. A recent hike in fertilizer and fuel prices, however, had forced farmers to economize. The local farmers decided not to cut wages, Pak Solhin told us, but to stop hiring workers instead. As a result, in the two months before we met him in 2008, he had not found a single day of agricultural labor. He was too weak for the most physical work, too inexperienced for more skilled labor, and, at 40, too old to be an apprentice. No one would hire him.
Pak Solhin, his wife, and their three children took drastic steps to survive. His wife left for Jakarta, some 80 miles away, where she found a job as a maid. But she did not earn enough to feed the children. The oldest son, a good student, dropped out of school at 12 and started as an apprentice on a construction site. The two younger children were sent to live with their grandparents. Pak Solhin himself survived on the roughly 9 pounds of subsidized rice he got every week from the government and on fish he caught at a nearby lake. His brother fed him once in a while. In the week before we last spoke with him, he had eaten two meals a day for four days, and just one for the other three.
Pak Solhin appeared to be out of options, and he clearly attributed his problem to a lack of food. As he saw it, farmers weren't interested in hiring him because they feared they couldn't pay him enough to avoid starvation; and if he was starving, he would be useless in the field. What he described was the classic nutrition-based poverty trap, as it is known in the academic world. The idea is simple: The human body needs a certain number of calories just to survive. So when someone is very poor, all the food he or she can afford is barely enough to allow for going through the motions of living and earning the meager income used to buy that food. But as people get richer, they can buy more food and that extra food goes into building strength, allowing people to produce much more than they need to eat merely to stay alive. This creates a link between income today and income tomorrow: The very poor earn less than they need to be able to do significant work, but those who have enough to eat can work even more. There's the poverty trap: The poor get poorer, and the rich get richer and eat even better, and get stronger and even richer, and the gap keeps increasing.
But though Pak Solhin's explanation of how someone might get trapped in starvation was perfectly logical, there was something vaguely troubling about his narrative. We met him not in war-infested Sudan or in a flooded area of Bangladesh, but in a village in prosperous Java, where, even after the increase in food prices in 2007 and 2008, there was clearly plenty of food available and a basic meal did not cost much. He was still eating enough to survive; why wouldn't someone be willing to offer him the extra bit of nutrition that would make him productive in return for a full day's work? More generally, although a hunger-based poverty trap is certainly a logical possibility, is it really relevant for most poor people today? What's the best way, if any, for the world to help?
THE INTERNATIONAL COMMUNITY has certainly bought into the idea that poverty traps exist -- and that they are the reason that millions are starving. The first U.N. Millennium Development Goal, for instance, is to "eradicate extreme poverty and hunger." In many countries, the definition of poverty itself has been connected to food; the thresholds for determining that someone was poor were originally calculated as the budget necessary to buy a certain number of calories, plus some other indispensable purchases, such as housing. A "poor" person has essentially been classified as someone without enough to eat.
So it is no surprise that government efforts to help the poor are largely based on the idea that the poor desperately need food and that quantity is what matters. Food subsidies are ubiquitous in the Middle East: Egypt spent $3.8 billion on food subsidies in the 2008 fiscal year, some 2 percent of its GDP. Indonesia distributes subsidized rice. Many states in India have a similar program. In the state of Orissa, for example, the poor are entitled to 55 pounds of rice a month at about 1 rupee per pound, less than 20 percent of the market price. Currently, the Indian Parliament is debating a Right to Food Act, which would allow people to sue the government if they are starving. Delivering such food aid is a logistical nightmare. In India it is estimated that more than half of the wheat and one-third of the rice gets "lost" along the way. To support direct food aid in this circumstance, one would have to be quite convinced that what the poor need more than anything is more grain.
But what if the poor are not, in general, eating too little food? What if, instead, they are eating the wrong kinds of food, depriving them of nutrients needed to be successful, healthy adults? What if the poor aren't starving, but choosing to spend their money on other priorities? Development experts and policymakers would have to completely reimagine the way they think about hunger. And governments and aid agencies would need to stop pouring money into failed programs and focus instead on finding new ways to truly improve the lives of the world's poorest.
Consider India, one of the great puzzles in this age of food crises. The standard media story about the country, at least when it comes to food, is about the rapid rise of obesity and diabetes as the urban upper-middle class gets richer. Yet the real story of nutrition in India over the last quarter-century, as Princeton professor Angus Deaton and Jean Drèze, a professor at Allahabad University and a special advisor to the Indian government, have shown, is not that Indians are becoming fatter: It is that they are in fact eating less and less. Despite the country's rapid economic growth, per capita calorie consumption in India has declined; moreover, the consumption of all other nutrients except fat also appears to have gone down among all groups, even the poorest. Today, more than three-quarters of the population live in households whose per capita calorie consumption is less than 2,100 calories in urban areas and 2,400 in rural areas -- numbers that are often cited as "minimum requirements" in India for those engaged in manual labor. Richer people still eat more than poorer people. But at all levels of income, the share of the budget devoted to food has declined and people consume fewer calories.
What is going on? The change is not driven by declining incomes; by all accounts, Indians are making more money than ever before. Nor is it because of rising food prices -- between the early 1980s and 2005, food prices declined relative to the prices of other things, both in rural and urban India. Although food prices have increased again since 2005, Indians began eating less precisely when the price of food was going down.
So the poor, even those whom the FAO would classify as hungry on the basis of what they eat, do not seem to want to eat much more even when they can. Indeed, they seem to be eating less. What could explain this? Well, to start, let's assume that the poor know what they are doing. After all, they are the ones who eat and work. If they could be tremendously more productive and earn much more by eating more, then they probably would. So could it be that eating more doesn't actually make us particularly more productive, and as a result, there is no nutrition-based poverty trap?
One reason the poverty trap might not exist is that most people have enough to eat. We live in a world today that is theoretically capable of feeding every person on the planet. In 1996, the FAO estimated that world food production was enough to provide at least 2,700 calories per person per day. Starvation still exists, but only as a result of the way food gets shared among us. There is no absolute scarcity. Using price data from the Philippines, we calculated the cost of the cheapest diet sufficient to give 2,400 calories. It would cost only about 21 cents a day, very affordable even for the very poor (the worldwide poverty line is set at roughly a dollar per day). The catch is, it would involve eating only bananas and eggs, something no one would like to do day in, day out. But so long as people are prepared to eat bananas and eggs when they need to, we should find very few people stuck in poverty because they do not get enough to eat. Indian surveys bear this out: The percentage of people who say they do not have enough food has dropped dramatically over time, from 17 percent in 1983 to 2 percent in 2004. So, perhaps people eat less because they are less hungry.
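The 21-cent figure depends on Philippine prices we do not have here, but the structure of the calculation -- the classic least-cost diet problem -- is easy to sketch. The version below handles only the calorie requirement and uses placeholder prices and calorie counts for bananas and eggs, so its output differs from the authors' number; the real calculation presumably adds further nutrient constraints, which is why their diet mixes the two foods.

    # Sketch of a least-cost diet calculation with a single calorie constraint.
    # Prices and calorie counts are assumed placeholders, not Philippine data.
    TARGET_CALORIES = 2400

    foods = {
        # food: (price in dollars per unit, calories per unit) -- assumed values
        "banana": (0.03, 105),
        "egg":    (0.10, 78),
    }

    def cheapest_diet_cost(target, foods):
        """With only a calorie floor, fill the target with the cheapest calories."""
        price, calories = min(foods.values(), key=lambda pc: pc[0] / pc[1])
        return target * price / calories

    print(round(cheapest_diet_cost(TARGET_CALORIES, foods), 2))  # ~$0.69 with these prices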
And perhaps they are really less hungry, despite eating fewer calories. It could be that because of improvements in water and sanitation, they are leaking fewer calories in bouts of diarrhea and other ailments. Or maybe they are less hungry because of the decline of heavy physical work. With the availability of drinking water in villages, women do not need to carry heavy loads for long distances; improvements in transportation have reduced the need to travel on foot; in even the poorest villages, flour is now milled using a motorized mill, instead of women grinding it by hand. Using the average calorie requirements calculated by the Indian Council of Medical Research, Deaton and Drèze note that the decline in calorie consumption over the last quarter-century could be entirely explained by a modest decrease in the number of people engaged in heavy physical work.
Beyond India, one hidden assumption in our description of the poverty trap is that the poor eat as much as they can. If there is any chance that by eating a bit more the poor could start doing meaningful work and get out of the poverty trap zone, then they should eat as much as possible. Yet most people living on less than a dollar a day do not seem to act as if they are starving. If they were, surely they would put every available penny into buying more calories. But they do not. In an 18-country data set we assembled on the lives of the poor, food represents 36 to 79 percent of consumption among the rural extremely poor, and 53 to 74 percent among their urban counterparts.
It is not because they spend all the rest on other necessities. In Udaipur, India, for example, we find that the typical poor household could spend up to 30 percent more on food, if it completely cut expenditures on alcohol, tobacco, and festivals. The poor seem to have many choices, and they don't choose to spend as much as they can on food. Equally remarkable is that even the money that people do spend on food is not spent to maximize the intake of calories or micronutrients. Studies have shown that when very poor people get a chance to spend a little bit more on food, they don't put everything into getting more calories. Instead, they buy better-tasting, more expensive calories.
In one study conducted in two regions of China, researchers offered randomly selected poor households a large subsidy on the price of the basic staple (wheat noodles in one region, rice in the other). We usually expect that when the price of something goes down, people buy more of it. The opposite happened. Households that received subsidies for rice or wheat consumed less of those two foods and ate more shrimp and meat, even though their staples now cost less. Overall, the caloric intake of those who received the subsidy did not increase (and may even have decreased), despite the fact that their purchasing power had increased. Nor did the nutritional content improve in any other sense. The likely reason is that because the rice and wheat noodles were cheap but not particularly tasty, feeling richer might actually have made them consume less of those staples. This reasoning suggests that at least among these very poor urban households, getting more calories was not a priority: Getting better-tasting ones was.
All told, many poor people might eat fewer calories than we -- or the FAO -- think is appropriate. But this does not seem to be because they have no other choice; rather, they are not hungry enough to seize every opportunity to eat more. So perhaps there aren't a billion "hungry" people in the world after all.
NONE OF THIS IS TO SAY that the logic of the hunger-based poverty trap is flawed. The idea that better nutrition would propel someone on the path to prosperity was almost surely very important at some point in history, and it may still be today. Nobel Prize-winning economic historian Robert Fogel calculated that in Europe during the Middle Ages and the Renaissance, food production did not provide enough calories to sustain a full working population. This could explain why there were large numbers of beggars -- they were literally incapable of any work. The pressure of just getting enough food to survive seems to have driven some people to take rather extreme steps. There was an epidemic of witch killing in Europe during the Little Ice Age (from the mid-1500s to 1800), when crop failures were common and fish was less abundant. Even today, Tanzania experiences a rash of such killings whenever there is a drought -- a convenient way to get rid of an unproductive mouth to feed at times when resources are very tight. Families, it seems, suddenly discover that an older woman living with them (usually a grandmother) is a witch, after which she gets chased away or killed by others in the village.
But the world we live in today is for the most part too rich for the occasional lack of food to be a big part of the story of the persistence of poverty on a large scale. This is of course different during natural or man-made disasters, or in famines that kill and weaken millions. As Nobel laureate Amartya Sen has shown, most recent famines have been caused not because food wasn't available but because of bad governance -- institutional failures that led to poor distribution of the available food, or even hoarding and storage in the face of starvation elsewhere. As Sen put it, "No substantial famine has ever occurred in any independent and democratic country with a relatively free press."
Should we let it rest there, then? Can we assume that the poor, though they may be eating little, do eat as much as they need to?
That also does not seem plausible. While Indians may prefer to buy things other than food as they get richer, they and their children are certainly not well nourished by any objective standard. Anemia is rampant; body-mass indices are some of the lowest in the world; almost half of children under 5 are much too short for their age, and one-fifth are so skinny that they are considered to be "wasted."
And this is not without consequences. There is a lot of evidence that children suffering from malnutrition generally grow into less successful adults. In Kenya, children who were given deworming pills in school for two years went to school longer and earned, as young adults, 20 percent more than children in comparable schools who received deworming for just one year. Worms contribute to anemia and general malnutrition, essentially because they compete with the child for nutrients. And the negative impact of undernutrition starts before birth. In Tanzania, to cite just one example, children born to mothers who received sufficient amounts of iodine during pregnancy completed between one-third and one-half of a year more schooling than their siblings who were in utero when their mothers weren't being treated. It is a substantial increase, given that most of these children will complete only four or five years of schooling in total. In fact, the study concludes that if every mother took iodine capsules, there would be a 7.5 percent increase in the total educational attainment of children in Central and Southern Africa. This, in turn, could measurably affect lifetime productivity.
Better nutrition matters for adults, too. In another study, in Indonesia, researchers tested the effects of boosting people's intake of iron, a key nutrient that prevents anemia. They found that iron supplements made men able to work harder and significantly boosted income. A year's supply of iron-fortified fish sauce cost the equivalent of $6, and for a self-employed male, the yearly gain in earnings was nearly $40 -- an excellent investment.
If the gains are so obvious, why don't the poor eat better? Eating well doesn't have to be prohibitively expensive. Most mothers could surely afford iodized salt, which is now standard in many parts of the world, or one dose of iodine every two years (at 51 cents per dose). Poor households could easily get a lot more calories and other nutrients by spending less on expensive grains (like rice and wheat), sugar, and processed foods, and more on leafy vegetables and coarse grains. But in Kenya, when the NGO that was running the deworming program asked parents in some schools to pay a few cents for deworming their children, almost all refused, thus depriving their children of hundreds of dollars of extra earnings over their lifetime.
Why? And why did anemic Indonesian workers not buy iron-fortified fish sauce on their own? One answer is that they don't believe it will matter -- their employers may not realize that they are more productive now. (In fact, in Indonesia, earnings improved only for the self-employed workers.) But this does not explain why all pregnant women in India aren't using only iodine-fortified salt, which is now available in every village. Another possibility is that people may not realize the value of feeding themselves and their children better -- not everyone has the right information, even in the United States. Moreover, people tend to be suspicious of outsiders who tell them that they should change their diet. When rice prices went up sharply in 1966 and 1967, the chief minister of West Bengal suggested that eating less rice and more vegetables would be both good for people's health and easier on their budgets. This set off a flurry of outrage, and the chief minister was greeted by protesters bearing garlands of vegetables wherever he went.
It is simply not very easy to learn about the value of many of these nutrients based on personal experience. Iodine might make your children smarter, but the difference is not huge, and in most cases you will not find out either way for many years. Iron, even if it makes people stronger, does not suddenly turn you into a superhero. The $40 extra a year the self-employed man earned may not even have been apparent to him, given the many ups and downs of his weekly income.
So it shouldn't surprise us that the poor choose their foods not mainly for their cheap prices and nutritional value, but for how good they taste. George Orwell, in his masterful description of the life of poor British workers in The Road to Wigan Pier, observes:
The basis of their diet, therefore, is white bread and margarine, corned beef, sugared tea and potatoes -- an appalling diet. Would it not be better if they spent more money on wholesome things like oranges and wholemeal bread or if they even, like the writer of the letter to the New Statesman, saved on fuel and ate their carrots raw? Yes, it would, but the point is that no ordinary human being is ever going to do such a thing. The ordinary human being would sooner starve than live on brown bread and raw carrots. And the peculiar evil is this, that the less money you have, the less inclined you feel to spend it on wholesome food. A millionaire may enjoy breakfasting off orange juice and Ryvita biscuits; an unemployed man doesn't.… When you are unemployed … you don't want to eat dull wholesome food. You want something a little bit "tasty." There is always some cheaply pleasant thing to tempt you.
The poor often resist the wonderful plans we think up for them because they do not share our faith that those plans work, or work as well as we claim. We shouldn't forget, too, that other things may be more important in their lives than food. Poor people in the developing world spend large amounts on weddings, dowries, and christenings. Part of the reason is probably that they don't want to lose face, when the social custom is to spend a lot on those occasions. In South Africa, poor families often spend so lavishly on funerals that they skimp on food for months afterward.
And don't underestimate the power of factors like boredom. Life can be quite dull in a village. There is no movie theater, no concert hall. And not a lot of work, either. In rural Morocco, Oucha Mbarbk and his two neighbors told us they had worked about 70 days in agriculture and about 30 days in construction that year. Otherwise, they took care of their cattle and waited for jobs to materialize. All three men lived in small houses without water or sanitation. They struggled to find enough money to give their children a good education. But they each had a television, a parabolic antenna, a DVD player, and a cell phone.
This is something that Orwell captured as well, when he described how poor families survived the Depression:
Instead of raging against their destiny they have made things tolerable by reducing their standards.
But they don't necessarily lower their standards by cutting out luxuries and concentrating on necessities; more often it is the other way around -- the more natural way, if you come to think of it. Hence the fact that in a decade of unparalleled depression, the consumption of all cheap luxuries has increased.
These "indulgences" are not the impulsive purchases of people who are not thinking hard about what they are doing. Oucha Mbarbk did not buy his TV on credit -- he saved up over many months to scrape enough money together, just as the mother in India starts saving for her young daughter's wedding by buying a small piece of jewelry here and a stainless-steel bucket there.
We often see the world of the poor as a land of missed opportunities and wonder why they don't invest in what would really make their lives better. But the poor may well be more skeptical about supposed opportunities and the possibility of any radical change in their lives. They often behave as if they think that any change that is significant enough to be worth sacrificing for will simply take too long. This could explain why they focus on the here and now, on living their lives as pleasantly as possible and celebrating when occasion demands it.
We asked Oucha Mbarbk what he would do if he had more money. He said he would buy more food. Then we asked him what he would do if he had even more money. He said he would buy better-tasting food. We were starting to feel very bad for him and his family, when we noticed the TV and other high-tech gadgets. Why had he bought all these things if he felt the family did not have enough to eat? He laughed, and said, "Oh, but television is more important than food!"
Abhijit V. Banerjee and Esther Duflo direct the Abdul Latif Jameel Poverty Action Lab at the Massachusetts Institute of Technology and are authors of Poor Economics: A Radical Rethinking of the Way to Fight Global Poverty, from which this excerpt is adapted.

Tuesday, April 26, 2011

Post-Uprising Depression

Two good books about an important but confusing country which has been driven, partly by American intervention, into strange ways
from The Economist
Pakistan: A Hard Country. By Anatol Lieven. PublicAffairs; 558 pages; $35. Allen Lane; £30.
Deadly Embrace: Pakistan, America, and the Future of the Global Jihad. By Bruce Riedel. Brookings Institution Press; 180 pages; $24.95 and £16.99.
IT IS a shame that these books should be published at a time when the world is riveted by events in the Middle East. Pakistan’s population is more than half the size of the entire Arab world; for most of the past three decades it has been involved in a war with a superpower, first against it, and now on the same side as it; it suffers from an Islamic insurgency that has killed 30,000 people over the past four years; it is regarded by students of geopolitics as the most likely location of nuclear conflict; and the reasons why it does not work as a country are many and fascinating.
The trouble with Pakistan’s story is that the country is one rather depressing stage on from the Middle East. Its people have risen up bravely against autocrats (three times over, if you count only the generals, or four if, like some Pakistanis, you count Zulfikar Ali Bhutto as well) and had several unsuccessful attempts at democracy. So it ricochets between military and civilian governments, with a state that does not work very well but has not collapsed, and an insurgency that is not turning into a civil war but won’t go away. Unlike the Middle East, it is not full of hope.
Yet for drama, colour and complexity, the place is hard to beat; and Anatol Lieven captures the richness of the place wonderfully. His book has the virtues of both journalism and scholarship—not surprising, since Mr Lieven used to be a reporter for the Times and is now at King’s College, London. He has travelled extensively and talked widely, to generals, shopkeepers, farmers, lawyers and bureaucrats.
He quotes the people he meets with both sympathy and scepticism, pointing to “Pakistani society’s ability to generate within an astonishingly short space of time several mutually incompatible versions of a given event or fact, often linked to conspiracy theories which pass through the baroque to the rococo”—a characteristic which anybody who has worked there will recognise. He has a great affection for the country, which he describes as “a place that cries out for the combined talents of a novelist, an anthropologist and a painter.” Aside from occasional bits of horrible writing, he does it justice.
The notion that Pakistan is approaching the condition of a failed state is popular these days. Mr Lieven rejects it. The state may be weak, but in his view society is strong, which both holds the place together and frustrates attempts to modernise it. For instance, Mr Lieven finds the official bit of the legal system—the police, lawyers and judges—horribly wanting. “When I visited the city courts in Quetta, Baluchistan, a majority of the people with whom I spoke outside had cases which had been pending for more than five years, and had spent more than 200,000 rupees [$4,500] on legal fees and bribes—a colossal sum for a poor man in Pakistan.”
Many therefore turn to tribal courts, or to the Pakistani Taliban in areas where they are strong. Few outsiders would recognise some of the tribal courts’ decisions as justice—girls are traditionally given as compensation for particularly serious crimes—yet service is speedy and generally reckoned to be superior to that provided by the state. Indeed, this is one of the main reasons why the Taliban’s rise was, at least initially, widely welcomed.
Democracy, similarly, sits uncomfortably with traditional society. Politics is dominated by big landowners and tribal chiefs, who regard their job not as developing the country’s economy and civil institutions for the good of all Pakistanis, but as distributing patronage to their clan or tribe; and that’s how government is run. Values diverge radically from those normally associated with representative democracy. In 2008, three teenage Baluch girls were shot and buried alive for refusing to marry the husbands chosen for them by their tribes. A tribal chief, a senator belonging to the Pakistan People’s Party of President Asif Ali Zardari, commented: “these are centuries-old traditions and I will continue to defend them. Only those who commit immoral acts should be afraid.” The man was subsequently made a federal minister.
Mr Lieven thinks growing resentment at the hierarchical nature of Pakistani society has helped the Taliban. Educated Pakistanis would ask of some Islamist on the rise: “Who on earth can respect a former bus conductor as a leader?” The answer, says Mr Lieven in rather cross italics, is “another bus conductor…It is precisely the lowly origins of the Taliban…which endear them to the masses.”
Still, Mr Lieven reckons that because of the strength of traditional social bonds, which tie individual to family, and family to tribe or clan, “Pakistani society is probably strong enough to prevent any attempt to change it radically through Islamist revolution, which is all to the good.” Bruce Riedel is less sanguine. He regards “a jihadist victory” in Pakistan as “neither imminent nor inevitable…[but] a real possibility that needs to be assessed”. It might come about, he reckons, as a result of a military coup by an officer sharing the world-view of General Zia ul Haq, or as a result of an insurgent victory; neither of which Mr Lieven’s analysis suggests is likely.
Though Mr Lieven knows Pakistan from the inside, Mr Riedel, who has advised no fewer than four American presidents, knows power from the inside—something he is keen to share with the reader. Every chapter starts with some version of “We were aboard Air Force One en route to California when I began briefing President Barack Obama…”
For readers who can successfully suppress their irritation, his book provides a useful account of the dysfunctional relationship between Pakistan and America. The governments are supposedly close allies, yet betray each other with monotonous regularity. After the Soviet Union left Afghanistan, America abandoned Pakistan for India. Pakistan both helps America in its war against the Afghan Taliban and—playing both sides—allows Taliban fighters to conduct attacks in Afghanistan from Pakistani territory. Pakistan’s people regard America with deep suspicion, and Pakistan’s Taliban is taking up the baton of global (and particularly anti-American) terror from a weakened al-Qaeda.
Although the books disagree somewhat about Pakistan’s prospects, they are not far apart on at least one important aspect of its past. America’s interventions, argues Mr Riedel, have made it “harder for Pakistanis to develop a healthy democracy that can effectively fight terror”, by encouraging military interference in civilian affairs. “It has above all been the US-led campaign in Afghanistan,” says Mr Lieven, “which has been responsible for increasing Islamist insurgency and terrorism in Pakistan since 2001.” These two books, in different ways, sharply illustrate an uncomfortable truth about American foreign policy: that the war in Afghanistan has helped foster in Pakistan exactly the sorts of tendencies that America went into Afghanistan to wipe out.

Source:http://www.economist.com/node/18526715/print

Files reveal what al-Qaida did after 9/11

WikiLeaks releases secret documents on Gitmo detainees

By PETER FINN WASHINGTON POST

April 24, 2011, 10:41PM


On Sept. 11, 2001, the core of al-Qaida was concentrated in a single city: Karachi, Pakistan.
At a hospital, the accused mastermind of the bombing of the USS Cole was recovering from a tonsillectomy. Nearby, the alleged organizer of the 2002 bombing in Bali, Indonesia, was buying lab equipment for a biological weapons program. And in a safe house, the man who would later describe himself as the intellectual author of the Sept. 11 attacks was with other key al-Qaida members watching the scenes from New York and Washington unfold on television.
Within a day, much of the al-Qaida leadership was on the way back to Afghanistan, planning for a long war.
A cache of classified military documents obtained by the anti-secrecy organization WikiLeaks presents new details of their whereabouts on Sept. 11, 2001, and their movements afterward. The documents also offer some tantalizing glimpses into the whereabouts and operations of Osama bin Laden and his Egyptian deputy, Ayman al-Zawahiri.
The documents, provided to European and U.S. news outlets, including The Washington Post, are intelligence assessments of nearly every one of the 779 individuals who have been held at Guantanamo Bay, Cuba, since 2002. In them, analysts have created detailed portraits of detainees based on raw intelligence, including material gleaned from interrogations.
Detainees are assessed "high," "medium" or "low" in terms of their intelligence value, the threat they pose while in detention and the continued threat they might pose to the United States if released.
The documents tend to take a bleak view of the detainees, even those who have been ordered released by the federal courts because of a lack of evidence to justify their continued detention. And the assessments are often based, in part, on reporting by informants at the military detention center, sources that some judges have found wanting.
In a statement, the Pentagon, which described the decision to publish some of the material as "unfortunate," stressed the snapshot and incomplete nature of the assessments, known as Detainee Assessment Briefs, or DABs.
"The Guantanamo Review Task Force, established in January 2009, considered the DABs during its review of detainee information. In some cases, the Task Force came to the same conclusions as the DABs. In other instances the Review Task Force came to different conclusions, based on updated or other available information," said Pentagon press secretary Geoff Morrell and Ambassador Daniel Fried, the Obama administration's special envoy on detainee issues. "Any given DAB illegally obtained and released by Wikileaks may or may not represent the current view of a given detainee."

Histories of detainees

Regardless of how detainees are currently assessed, many of the documents shed light on their histories, particularly those of the high-value detainees. When pieced together, they capture some of the drama of al-Qaida's scattering in the wake of the Sept. 11 attacks. They also point to tensions between certain members of the terrorist group.
Among other previously unknown meetings, the documents describe a major gathering of some of al-Qaida's most senior operatives in early December 2001 in Zormat, a mountainous region of Afghanistan between Kabul and Khost. There, the operatives began to plan new attacks, a process that would consume them, according to the assessments, until they were finally captured.
Four days after the Sept. 11 attacks, bin Laden visited a guesthouse in Afghanistan's Kandahar province. He told the Arab fighters gathered there "to defend Afghanistan against the infidel invaders."
It was the beginning of a peripatetic three months for bin Laden and Zawahiri. Traveling by car among several locations in Afghanistan, bin Laden handed out assignments to his followers, met with some of the Taliban leadership and delegated control of al-Qaida to the group's Shura Council, presumably because he feared being captured or killed as U.S. forces closed in.
At some point, bin Laden and Zawahiri used a secret guesthouse in or relatively near Kabul. The al-Qaida leader welcomed a stream of visitors and issued a series of orders, including instructions to continue operations against Western targets. He dispersed his fighters from training camps and instructed women and children, including some of his wives, to flee to Pakistan.

Meetings in Afghanistan

In October, bin Laden met in Kabul with two Malaysians, Yazid Zubair and Bashir Lap, both of whom are now at Guantanamo Bay, and lectured them on history and religion. On the day the U.S.-led coalition began bombing Afghanistan, bin Laden met in Kandahar with Taliban official Mullah Mansour.
Bin Laden and Zawahiri also met that month with Taliban leader Jalaluddin Haqqani, who continues to lead a deadly insurgency against the United States and its allies in Afghanistan.
Bin Laden, accompanied by Zawahiri and a handful of close associates in his security detail, escaped to his cave complex in Tora Bora in November. Around Nov. 25, he was seen giving a speech to the leaders and fighters at the complex.
According to the documents, bin Laden and his deputy escaped from Tora Bora in December 2001. At the time, the al-Qaida leader was apparently so strapped for cash that he borrowed $7,000 from one of his protectors, a sum he paid back within a year.
In December, al-Qaida's top lieutenants gathered in Zormat. They included Khalid Sheik Mohammed, the self-described mastermind of the Sept. 11 attacks; Abd al-Rahim al-Nashiri, the alleged planner of the USS Cole attack; and Abu Faraj al-Libbi, a key facilitator for bin Laden.
The place was teeming with fighters who were waiting for al-Qaida to return their passports so they could flee to Pakistan.

U.S., Israeli targets

Nashiri reported that while at Zormat he was approached by two Saudi nationals who wanted to strike U.S. and Israeli targets in Morocco. Nashiri said he had been considering an operation in the Strait of Gibraltar and thought that the British military base there, which he had seen in a documentary, would be a good target.
Nashiri's willingness to approve a plot on his own was later the source of some tension within the organization, particularly with Mohammed.
In May or June 2002, Mohammed learned of the disrupted plan to attack the military base in Gibraltar and was upset that he had not been informed of it.
Nashiri separately complained that he was being pushed by bin Laden to continue planning aggressive operations against U.S. interests in the Arabian Gulf region without much regard for his own security.
Indeed, Nashiri was arrested in the United Arab Emirates in late 2002.
After the Zormat conclave, Mohammed and other senior al-Qaida figures began to return to Karachi.
The documents state that Mohammed "put together a training program for assassinations and kidnappings as well as pistol and computer training." It was not intended for specific operations but to occupy the bored fighters stuck in safe houses.
At the time, money was flowing into the country for Mohammed, according to the documents, allowing him to acquire safe houses and fund operations.
In November 2002, his nephew Baluchi took delivery of nearly $70,000 from a courier.
Gradually, Mohammed and the other operatives were picked off by Pakistanis working with the CIA and the FBI. When Ramzi Binalshibh, a key liaison between the Sept. 11 hijackers and al-Qaida, was arrested at a safe house in Karachi on the first anniversary of the Sept. 11 attacks, there was a four-hour standoff while the Yemeni and two others held knives to their own throats and threatened to kill themselves rather than be taken.
There are few geographic references in the documents for bin Laden after his flight into Pakistan.
He apparently sent out letters from his hiding place through a trusted courier, who then handed them to Libbi, who had provided the secret guesthouse in Kabul immediately after the Sept. 11 attacks.
After the capture of Mohammed in March 2003, Zawahiri fled from the house where he had been staying.
The documents state that Zawahiri left on his own and sought out an Afghan, who delivered him to Libbi.
In May 2005, while waiting for bin Laden's courier at a drop point, Libbi was arrested by Pakistani forces.
Zawahiri, in response, moved again. His residence, documents state, "was changed to a good place owned by a simple old man."
He remains at large.

Terror’s Training Ground

By Ayesha Siddiqa
A few years ago, I met some young boys from my village near Bahawalpur who were preparing to go on jihad. They smirked politely when I asked them to close their eyes and imagine their future. “We can tell you without closing our eyes that we don’t see anything.”
It was not entirely surprising. South Punjab is a region mired in poverty and underdevelopment. There are few job prospects for the youth. While the government has built airports and a few hospitals, these projects are symbolic and barely meet the needs of the area. It’s in areas like this, amid economic stagnation and hopelessness, that religious extremists find fertile ground to plant and spread their ideology.
The first step is recruitment – and the methodology is straightforward. Young children, or even men, are taken to madrassas in nearby towns. They are fed well and kept in living conditions considerably better than what they are used to. This is a simple psychological strategy meant to help them compare their homes with the alternatives offered by militant organisations. The returning children, like the boys I met, then undergo ideological indoctrination in a madrassa. Those who are indoctrinated always bring more friends and family with them. It is a swelling cycle.
Madrassas nurturing armies of young Islamic militants ready to embrace martyrdom have been on the rise for years in the Punjab. In fact, South Punjab has become the hub of jihadism. Yet, somehow, there are still many people in Pakistan who refuse to acknowledge this threat.

Four major militant outfits, the Sipah-e-Sahaba Pakistan (SSP), Lashkar-e-Jhangvi (LeJ), Jaish-e-Mohammad (JeM) and Lashkar-e-Tayyaba (LeT), are all comfortably ensconced in South Punjab (see article “Brothers in Arms”). Sources claim that there are about 5,000 to 9,000 youth from South Punjab fighting in Afghanistan and Waziristan. A renowned Pakistani researcher, Hassan Abbas, cites a figure of 2,000 youth engaged in Waziristan. The area has become critical to planning, recruitment and logistical support for terrorist attacks in Pakistan and Afghanistan. In fact, in his study on the Punjabi Taliban, Abbas quotes Tariq Pervez, the chief of a new government outfit named the National Counter-Terrorism Authority (NCTA), as saying that the jihad veterans in South Punjab are instrumental in providing the foot soldiers and implementing terror plans conceived and funded mainly by Al-Qaeda operatives. This shouldn’t come as a surprise considering that the force that conquered Khost in 1988-89 comprised numerous South Punjabi commanders who fought for the armies of various Afghan warlords such as Gulbuddin Hikmatyar and Burhanuddin Rabbani. Even now, all four major organisations are involved in Afghanistan.
These facts are not unknown to the provincial and federal governments or the army. Not long ago, the federal interior minister, Rehman Malik, equated South Punjab with Swat. The statement was rejected by the IG Punjab. Perhaps the senior police officer was not refuting his superior but challenging the story by Sabrina Tavernise of The New York Times (NYT), which had highlighted jihadism in South Punjab, especially in Dera Ghazi Khan. The NYT story even drew a reaction from media outlets across the country. Few understood that South Punjab is rightly equated with Swat, not because of violence but because of the presence of elements that aim to take society and the state in another direction.
An English-language daily newspaper reacted to the NYT story by dispatching a journalist to South Punjab who wrote a series of articles that attempted to analyse the existing problem. One of the stories highlighted comments by the Bahawalpur Regional Police Officer (RPO) Mushtaq Sukhera, in which he denied that there was a threat of Talibanisation in South Punjab. He said that all such reports pertaining to South Punjab were nothing more than a figment of the western press’s imagination. Many others express a similar opinion. There are five explanations for this.
Firstly, opinion makers and policy makers are in a state of denial regarding the gravity of the problem. Additionally, they believe an overemphasis on this region might draw excessive US attention to South Punjab – an area epitomising mainstream Pakistan. Thus, it is difficult even to find anecdotal evidence regarding the activities of jihadis in this sub-region. We gain some knowledge of what is happening only from chance incidents, such as the blast in a madrassa in Mian Chunoon that exposed the stockpile of arms its owner had stored on the premises.
Nothing is off limits: Militants attack the Sri Lankan cricket team in Lahore in March 2009.

Secondly, officer Sukhera and others like him do not see any threat because the Punjab-based outfits are “home-grown” and are not seen as directly connected to the war in Afghanistan. This is contestable on two counts: South Punjabi jihadists have been connected with the Afghan jihad since the 1980s and the majority is still engaged in fighting in Afghanistan.
Thirdly, all these outfits were created by the ISI to support General Zia-ul-Haq’s Islamisation process, in essence to fight a proxy war for Saudi Arabia against Iran by targeting the Shia community, and later to fight the Kashmir war; officials therefore feel confident that the outfits will never spin out of control. Those that do become uncontrollable, such as Al-Furqan, are abandoned. This outfit was involved in the second assassination attempt on Musharraf and had initially broken away from the JeM after the leadership developed differences over assets, power and ideology. Thus, district officials and intelligence agencies turned a blind eye to the killing of the district amir of Al-Furqan in Bahawalpur in May 2009. The JeM, for its part, continues its engagement with the establishment. In any case, groups that remain partly committed to the Kashmir cause and confrontation with India continue to survive. This is certainly the perception about the LeT. But in reality, the Wahhabi outfit has also been engaged elsewhere, such as in the Afghan provinces of Kunar and Badakhshan, since 2004.
Fourthly, there is confusion at the operational level of government about the definition of Talibanisation, which is then reflected in the larger debate on the issue. Many, including the RPO, define the process as an effort by an armed group to use force to change the social conditioning of an area. By that definition the militant outfits in the Punjab, which continue to coexist with the pirs, prostitutes and the drug mafia, show no sign of following in the footsteps of Sufi Mohammad, Maulana Fazlullah or Baitullah Mehsud. Because the authorities recognise only the pattern set by the Afghan warlords or by militants in Pakistan’s tribal areas, they fail to see that what is happening in the Punjab may not be Talibanisation as they define it, but could eventually prove just as lethal.
Finally, many believe that Talibanisation cannot take place in a region known for practising the Sufi version of Islam. Many besides the Bahawalpur RPO subscribe to this theory. A year ago, in an interview with an American channel, Farahnaz Ispahani, an MNA and wife of Pakistan’s ambassador to Washington, Husain Haqqani, stated that extremism couldn’t flourish in South Punjab because it was a land of Sufi shrines. This is partially true: the Sufi influence would work as a bulwark against the Talibanisation of society. However, Sufi Islam cannot fight poverty, underdevelopment and poor governance – all key factors that encourage Talibanisation.
South Punjab boasts names such as the Mazaris, Legharis and Gilanis, most of whom are not just politicians and big landowners but also belong to significant pir families. But they have done little to alleviate the sufferings of their constituents. A visit to Dera Ghazi Khan is depressing. Despite the fact that the division produced a president, Farooq Khan Leghari, the state of underdevelopment there is shocking. Reportedly, people living in the area in the immediate vicinity of the Leghari tribe could not sell their land without permission from the head of the tribe, the former president, who has been the tribal chief for many years. Under the circumstances, the poor and the dispossessed became attractive targets for militant outfits offering money. The country’s current economic downturn could raise the popularity of militant outfits.
In recent history, the gap created due to the non-performance of Sufi shrines and Barelvi Islam, or the exploitative nature of these institutions, has been filled partly by the Deobandi and Ahl-e-Hadith madrassa conversion teams and groups, such as the Tableeghi Jamaat, and militant outfits. This alternative, unfortunately, is equally exploitative in nature. Sadly, today the shrines and Barelvi Islam have little to offer in terms of “marketing” to counter the package deal offered by the Salafists for the life hereafter, especially to a shaheed: 70 hoors (virgins), a queen hoor (virgin queen), a crown of jewels and forgiveness for 70 additional people. This promise means a lot for the poor youth who cannot hope for any change in a pre-capitalist socio-economic and political environment, where power is hard to re-negotiate. Furthermore, as stated by the former information minister Mohammad Ali Durrani, who had been a jihadi from 1984-90, a poor youth suddenly turning into a jihadi commander is a tremendous story of social mobility and recognition that he would never get in his existing socio-economic system. More importantly, the Deobandis and Ahl-e-Hadith offer a textual basis for their package, which is difficult for the pirs to refute due to the lack of an internal religious discourse in the Islamic world. The modern generation of pirs has not engaged in an internal discourse to counter this ideological onslaught by the Salafis. The main belief of Salafism is that all Muslims should practice Islam as it was during the time of Prophet Muhammad. The religion at that time, according to them, was perfect. Salafism – which pre-dates Wahhabism – is often used interchangeably with Wahhabism, which is actually an extension of Salafism.
*  *  *
Punjab offers a different pattern of extremism and jihadism. The pattern is closer to what one saw in Swat, where Sufi Mohammad and his TNSM spent years indoctrinating society and building up a social movement before they became embroiled in a conflict with the state. South Punjab’s story is, in a sense, like Swat’s in that there has been a gradual strengthening of Salafism and a build-up of militancy in the area. The process of conversion, though, dates back to before 1947. Still, the 1980s were clearly a watershed, when both rabid ideology and jihad were introduced to the area. Zia-ul-Haq encouraged the opening of religious seminaries that, unlike the more traditional madrassas usually attached to Sufi shrines, subscribed to Salafi ideology. In later years, South Punjab became critical to recruiting people for the Kashmir jihad. The ascendancy of the Tableeghi Jamaat and of madrassas that presented a more rabid version of religion gradually prepared the ground for the later invasion by militant groups. Two reports prepared around 1994, the first by the district collector of Bahawalpur and the second by the Punjab government, highlighted the exponential rise in the number of madrassas and how these fanned sectarian and ideological hatred in the province. These reports also stated that all of these seminaries were funded by the government through the zakat fund.
The number of seminaries increased during and after the 1980s. According to a 1996 report, there were 883 madrassas in Bahawalpur, 361 in Dera Ghazi Khan, 325 in Multan and 149 in Sargodha district. The madrassas in Bahawalpur outnumbered those in any other city, including Lahore. These numbers relate to Deobandi madrassas only and do not include the Ahl-e-Hadith, Barelvi and other sects. Newer estimates from the intelligence bureau for 2008 show approximately 1,383 madrassas in the Bahawalpur division, housing 84,000 students. Although the highest number of madrassas is in Rahim Yar Khan district (559), followed by Bahawalpur (481) and Bahawalnagar (310), it is Bahawalpur that has the highest number of students enrolled (36,000). The total number of madrassa students in Pakistan has reached about one million.
Visions of paradise: JeM supporters walk by a banner of jihadi art. Photo: Tariq Mahmood, AFP.

Everyone has been so focused on FATA and the NWFP that they have failed to notice the huge increase in religious seminaries in these districts of South Punjab. According to a study conducted by historian Tahir Kamran, the total number of madrassas in the Punjab rose from 1,320 in 1988 to 3,153 in 2000, an increase of almost 140%. These madrassas were meant to provide a rapid supply of jihadis to the Afghan war of the 1980s. At the time of 9/11, the Bahawalpur division alone could boast of approximately 15,000-20,000 trained militants, some of whom had resettled in their areas during the period that Musharraf claimed to have clamped down on the jihad industry. Many went into the education sector, opened private schools and even joined the media.
These madrassas play three essential roles. First, they convert people to Salafism and neutralise resistance to a more rabid interpretation of the Quran and Sunnah in society. Consequently, the majority of the Barelvis cannot present a logical resistance to the opposing ideology. In many instances, the Barelvis themselves get converted to the idea of jihad. Second, these madrassas are used to train youth, who are then inducted into jihad. Most of the foot soldiers come from the religious seminaries. One of the principles taught to the students is that jihad is a sacred duty that must continue until the end of a Muslim’s life or the end of the world. Lastly, madrassas are an essential transit point for the youth, who are recruited from government schools. They are usually put through the conversion process after they have attended a 21-day initial training programme in the Frontier province or Kashmir (see box “A Different Breed”).
State support, which follows two distinct tracks, is also instrumental in the growth of jihadism in this region. On the one hand, there has generally been a link or understanding between political parties and militant groups. Since political parties are unable to eliminate the militants, and most politicians are in any case sympathetic towards them, they tend to curb their activities through political deal-making instead. The understanding between the SSP and Benazir Bhutto after the 1993 elections, and the alleged deal between the PML-N and the SSP during the 2008 elections, illustrate the relationship between major political parties and the jihadis. Currently, the SSP in South Punjab is more supportive of the PML-N.
The second track involves operational links between the outfits and the state’s intelligence apparatus. As mentioned earlier, some of the outfits claim to have received training from the country’s intelligence agencies. Even now, local people talk of truckloads of weapons arriving at the doorstep of the JeM headquarters and other sites in the middle of the night. While official sources continue to claim that the outfit was banned and does not exist, or that Masood Azhar is on the run from his hometown of Bahawalpur, the facts prove otherwise. For instance, the outfit continues to acquire real estate in the area, such as a new site near Chowk Azam in Bahawalpur, which many believe is being used as a training site. Although the new police chief has put restraints on the JeM and barred it from constructing on the site, the outfit continues to appropriate more land around the area. Junior police officials even claim to have seen tunnels being dug inside the premises. The new facility sits alongside the Lahore-Karachi national highway, which means that in the event of a crisis, the JeM could block the road as has happened in Kohat and elsewhere. Furthermore, the outfit’s main headquarters in the city is guarded by AK-47-armed men who harass any journalist trying to take a photograph of the building. In one instance, even a police official was shooed away and later intimidated by spooks from an intelligence agency for spying on the outfit. Despite the claim that the SSP, the LeJ and the JeM have broken ties with intelligence agencies and are now fighting the army in Waziristan, the fact remains that their presence in the towns of South Punjab continues unhindered.
Is it naivety and inefficiency on the part of officialdom or a deliberate effort to withhold information? The government claims that Maulana Masood Azhar has not visited his hometown in the last three years. But he held a massive launch for his new publication, Fatah-ul-Jawad: Quranic Verses on Jihad, in Bahawalpur on April 28, 2008. Moreover, JeM’s armed men manned all entrances and exits to the city that day – and there was no police force in sight. The ISI is said to have severed its links with the JeM for assisting the Pashtoon Taliban in inciting violence in the country. Sources from FATA claim, however, that the JeM, Harkat-ul-Mujahideen (HuM) and LeT are viewed with suspicion by the Taliban because of their links with state agencies.
In addition, intelligence agencies reportedly ward off anyone attempting to probe the affairs of these outfits. In one case, a local in Bahawalpur city attracted daily visits from a certain agency after he assisted a foreign journalist. Similarly, only six months ago, a BBC team was chased out of the area by agency officials. In fact, intelligence officials, who had forgotten about my existence since my last book was published, revisited my village in South Punjab soon after I began writing on militancy in the area, and have gone to the extent of planting a story in one of the Urdu newspapers to malign me in my own area. In any case, no serious operation was conducted against these outfits after the Mumbai attacks and the recent spate of violence in the country. Hence, all of them continue to survive.
The Deobandi outfits are not the only ones popular in South Punjab. Ahl-e-Hadith/Wahhabi organisations such as the Tehreek-ul-Mujahidden (TuM) and the LeT also have a following in the region. While the TuM, a relatively small organisation, has support in Dera Ghazi Khan, the LeT is popular in Bahawalpur, Multan and the areas bordering Central Punjab. Headquartered in Muridke, the LeT is popular among the Punjabi and Urdu-speaking Mohajir settlers.
There are obvious sociological reasons for LeT’s relative popularity among these people. The majority of this population comprises either lower-middle-class farmers or middle-class trader-merchants. The middle class is instrumental in providing funding to these outfits. And the support is not confined to South Punjab alone. In fact, middle-class trader-merchants from other parts of the Punjab also feed jihad through their funding. This does not mean that there are no Seraiki speakers in Wahhabi organisations, but just that the dominant influence is that of the Punjabis and Mohajirs. The Seraiki-speaking population is mostly associated with the SSP, LeJ and JeM, not to mention the freelancing jihadis that have direct links with the Tehrik-e-Taliban Pakistan (TTP).
The LeT’s presence in South Punjab is far more obvious than others courtesy of the wall chalkings and social work by its sister outfit, the Jamaat-ud-Dawa. Despite the rumours of friction between the LeT and the JuD leadership, the two segments operate in unison in South Punjab. Three of the favourite areas of recruitment in South Punjab for all outfits are Cholistan in Bahawalpur, the Rekh in Dera Ghazi Khan, and the Kacha area in Rajanpur. The first two are desert areas known for their poverty and underdevelopment, while the third is known for dacoits. However, another known feature of Kacha in Rajanpur is that the clerics of the Lal Masjid come from this area and have partly managed to push back the dacoits. Local sources claim that the influence of the clerics has increased since they started receiving cooperation from the police to jointly fight the dacoits.
Organisations such as the LeT have even begun to recruit women in the Punjab. These women undergo 21 days of ideological and military training. The goal is to ensure that these women will be able to fight if their menfolk are out on jihad and an enemy attacks Pakistan.
The militant outfits are rich, both ideologically and materially. They have ample financial resources that flow from four distinct sources: official channels (in some cases); Middle Eastern and Gulf states (not necessarily through official channels); donations; and the Punjabi middle class, which is predominantly engaged in funding both madrassas and jihad for social, moral and political ends. With regard to donations, the militant outfits are extremely responsive to the changing environment and have adapted their money-collection tactics. Gone are the days of money-collection boxes. Now, especially in villages, followers are asked to raise money by selling harvested crops. And in terms of the Punjabi middle class, there are traders in Islamabad and other smaller urban centres that contribute regularly to the cause. These trader-merchants and upcoming entrepreneurs see donations to these outfits as a source of atonement for their sins. According to Tahir Kamran’s study “Deobandiism in the Punjab,” Deobandiism (and Wahhabiism) is an urban phenomenon. If so, the existence of these militant outfits in rural Punjab indicates a new social trend. Perhaps, due to greater access to technology (mobiles, television sets, satellite receivers, etc.), the landscape (and rustic lifestyles) of Punjab’s rural areas has changed. There is an unplanned urbanisation of the rural areas due to the emergence of small towns with no social development, health and education infrastructure. Socially and politically, there is a gap that is filled by these militant outfits or related ideological institutions.
Fortunately, they have not succeeded in changing the lifestyles of the ordinary people. This is perhaps because there are multiple cultural strands that do not allow the jihadis to impose their norms the way they have in the tribal areas or the Frontier province. This is not to say that there is no threat from them in South Punjab: the liberalism and multi-polarity of society are certainly at risk. The threat is posed by the religious seminaries and the new recruits for jihad, who change social norms slowly and gradually. Sadly, nothing, not even the area’s powerful political system, which is in any case extremely warped, helps ward off the threat of extremism and jihadism. Ultimately, South Punjab could fall prey to the myopia of its ruling elite.
*  *  *
So how do the state and society deal with this issue?
Deploying the military is not an option: in the Punjab it would create divisions within the powerful army because of regional loyalties. The foremost task is to examine the nature of the state’s relationship with the militants as strategic partners: should this relationship continue to exist, to the detriment of the state? Once that question is resolved, all militant forces can be dealt with through an integrated police-intelligence operation.
This, however, amounts to winning only half the battle. The other half deals with the basic problems faced by the likes of those young jihadis-in-training from Bahawalpur who said they “don’t see anything” in their futures. Presently, there is hardly any industrialisation in South Punjab, and the mainstay of the area, agriculture, is faltering. The region requires economic strengthening: new ideas in agriculture, capital investment and new, relevant industries. Now is the time for the government to plan beyond the usual textile and sugar industries, which have arguably turned into huge mafias draining the local economy rather than feeding it.
Investment in social development is desperately needed. A larger social infrastructure that provides jobs and an educational system that is responsive to the needs of the population can contribute to filling the gaps. The message of militancy is quite potent, especially in terms of the dreams it sells to the youth, such as those disillusioned boys from my village. Jihad elevates youngsters from a state of being dispossessed to an imagined exalted status. They visualise themselves taking their places among great historical figures such as Mohammad bin Qasim and Khalid bin Waleed. It is these dreams for which the state must provide an alternative.