Despite the global food crisis of 2007–8, the coming famine hasn’t happened yet. But it will, says Australian science writer Julian Cribb: it is “a looming planetary emergency…arriving even faster than climate change.” And he’s far from alone. “The world is in transition from an era of food abundance to one of scarcity,” says the environmentalist Lester Brown.
They’re wrong, and their virulent strain of technopessimism—which is finding lots of resonance in the media these days—has been wrong for a long time. In his 1968 book The Population Bomb, Paul R. Ehrlich wrote: “The battle to feed all of humanity is over. In the 1970s hundreds of millions of people will starve to death in spite of any crash programs embarked upon now. At this late date nothing can prevent a substantial increase in the world death rate.” Ehrlich himself rode the frayed coattails of Thomas Malthus, who two centuries ago warned that the combination of an arithmetic increase in food supply and a geometric increase in population would result in famine, pestilence, and war.
It didn’t happen. And what’s more, it’s not going to happen in the foreseeable future, despite the admonitions of today’s crop of Cassandras. In the advanced economies, people now spend 15 percent or less of their disposable income on food; the share has never been lower. On the eve of the French Revolution in 1789, less than a decade before Malthus first published his essay, it took nearly the entire daily wage of an unskilled worker to buy two loaves of bread, enough to feed a family of four. Today, a Parisian working at minimum wage can do the same in about 15 minutes.
There are two reasons for this great change: Farmers are working more land, and they are wringing more food from it. This latter factor is what’s critical now. From 1800 to 1950, the acreage under cultivation pretty much kept pace with world population growth. But since 1950 the amount of such land per capita has fallen by half even as food supply per capita has increased by about 30 percent. In addition, global trading networks greatly expanded to bring food from regions with food surpluses to those with food deficits.
Those predicting the onset of scarcity argue that the trend toward ever-cheaper food is beginning to falter. And indeed, around 2001 global food prices began to edge up. The shift had several causes. One was a decline since 1990 in the year-on-year growth in yields of major cereal crops like wheat and rice. More significant, though, were a rise in demand for food in rapidly developing countries like China, as people there moved from staples to more grain-intensive meat, and the diversion of crops like corn, sugar, and oilseeds to biofuel production.
We shouldn’t be overly vexed by these burdens on the food supply. There is indeed every reason to believe that productivity could go on rising briskly for at least several decades, by which time the world’s population is expected to top out at around 10 billion. However, this productivity increase won’t happen by itself—we have to invest in technology, not only in our accustomed centers of research but in the very regions of the world that lag behind the modernized nations. Research has to happen where the crops are. In the end, all agricultural innovation, like politics, is local.
You probably understand that agricultural productivity depends on science and technology. But you may not realize just how much it does so. At the beginning of the 20th century, essentially any increase in food production came from new lands going under the plow. By the end of the century, most increases in production came from better seeds, breeds, fertilizers, pest control, automation, and management skill. All of those improvements were almost entirely due to research.
Perhaps the greatest examples of what research has done are the discoveries of genetic inheritance by the Austrian monk Gregor Mendel in the mid-19th century and the Haber-Bosch process for the chemical fixation of nitrogen from the air, developed in Germany in the early 1900s. The lack of nitrogen was the single outstanding constraint on crop yields, and removing that constraint allowed a given amount of labor, on a given amount of land, to produce vastly more food.
Use of synthetic nitrogen fertilizer expanded rapidly, especially after World War II, when munitions factories could be converted to manufacture fertilizer. Then researchers capitalized on this innovation by applying Mendel’s discoveries to the breeding of crop varieties that responded to higher doses of fertilizer. These included corn (maize) hybrids that could be planted more densely and “semidwarf” varieties of wheat and rice, which responded to higher doses of nitrogen by producing more grain rather than growing too tall to stand. This combination unleashed a green revolution starting around the 1940s in North America and the 1960s in developing countries. It’s true that the revolution unleashed by nitrogen fixation has had unwanted side effects: Much of the fertilizer currently applied is wasted, escaping into the atmosphere or running off to pollute waterways. Solving these problems is challenging—farmers must improve the efficiency with which they apply fertilizer. But that’s already happening in many places.
Meanwhile, to speed the adoption of new practices, governments have provided programs to train farmers in new agricultural methods, financial credit to help them invest in new techniques, and infrastructure like roads and electrification to improve access to markets and the quality of rural life. Reward for effort is also critical. China got a boost in productivity when it abandoned its experiment with communal farms and went back to family-managed farms, which tied earnings more directly to the farmer’s own work. Secure tenure to land gives farmers a strong incentive to invest in long-term improvements like irrigation and soil conservation.
These developing-world examples are vital, because increasingly it is improvements in these countries that drive global agricultural productivity. As developing countries have become richer, their share of both production and consumption has ballooned. They now account for about two-thirds of global agricultural output, up from 42 percent in 1961. And their diets have shifted measurably from staples like starches and grains to include more meats, milk, fruits, nuts, and oils. This means that the old standby commodities—wheat, rice, and corn—aren’t as important as they once were. To focus on statistics about their yield trends misses the bigger picture.
Agricultural productivity is typically measured in terms of output per unit area of land, or per person employed. Some of the highest yields have been achieved in the developed countries of Asia, where the total harvest of crop and livestock products per hectare is about eight times that in the United States. In these countries intensive agriculture—with a lot of labor, capital, and fertilizer invested in each hectare, and multiple harvests per hectare per year—has long been the norm. The highest outputs per worker, on the other hand, are in North America, where technology allows a single farmer to cover a lot of land. There, a farm worker produces about US $90 000 of crops and livestock per year, compared to a global average of about $2000.
But it’s a mistake to think that you can improve productivity simply by throwing resources at it—adding more fertilizers, chemicals, machinery, and energy. You’d get more produce, but you’d spend more money to get it. What matters is how you get more out of a given combination of land, labor, capital, and materials—what we call total factor productivity, or TFP. A rise in TFP reflects underlying technical and managerial innovations. It’s not just about using more machines and chemicals; it’s really a story of learning how to get more with less.
Measuring TFP requires accurate data on what farmers produce and what they use to produce it. It also requires a good way of aggregating all these inputs and outputs. That’s important because a rise in the price of one input may induce farmers to use less of it and more of something else—for instance, if the cost of labor goes up, a farmer might react by employing fewer people to pull up weeds and applying more herbicides instead. If you aren’t careful with such matters, you could get the impression that the farmers are suddenly raising their productivity (output per worker), when in fact TFP may have remained constant.
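The substitution pitfall just described is easy to see in miniature. Here is a minimal sketch of a TFP index, using entirely made-up numbers and hypothetical cost-share weights (real studies use more careful aggregators, such as Törnqvist indices, but the logic is the same):

```python
# Minimal sketch of a total-factor-productivity (TFP) index.
# All numbers are illustrative, not real data: inputs are aggregated
# with hypothetical cost-share weights, as growth accounting does.

def tfp_index(output, inputs, cost_shares):
    """TFP = output index divided by a cost-share-weighted input index."""
    input_index = sum(cost_shares[k] * inputs[k] for k in inputs)
    return output / input_index

# Baseline year: every index set to 1.0.
base = tfp_index(1.0,
                 {"labor": 1.0, "herbicide": 1.0},
                 {"labor": 0.7, "herbicide": 0.3})

# Later year: labor got expensive, so the farm replaced weeders with
# herbicide.  Output per worker jumps (1.0 / 0.7, about +43 percent),
# but output and the aggregate input index are unchanged, so TFP is flat.
later = tfp_index(1.0,
                  {"labor": 0.7, "herbicide": 1.7},
                  {"labor": 0.7, "herbicide": 0.3})
```

In this toy case, output per worker rises by more than 40 percent while TFP does not move at all, which is exactly why partial measures like output per worker can mislead.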
Until recently we lacked such detailed information for the many developing countries and ex-Soviet countries, but now, for the first time, we can put everything into a single model of agricultural TFP growth for the entire world. The results are encouraging. Globally, the rate of productivity growth in agriculture has accelerated, and most of this acceleration stems from improved productivity in developing countries. Today, the developing countries are about as productive in TFP terms as the industrialized nations were back in the 1960s. And they are catching up to productivity levels already achieved in industrialized nations, although sub-Saharan Africa still lags the most.
The good news is that there has been a nearly threefold improvement in global agricultural output between 1961 and 2009. But there’s even better news: Only about 60 percent of that improvement can be attributed to the use of more land, labor, capital, or materials. The rest came from improvement in TFP—more efficient use of land, labor, and capital through better management and, of course, technology. As an analogy, think of the difference between making transistors by hand, as opposed to fabricating them by the millions on a silicon chip.
Further analysis of the data revealed the best news of all: Over that 48-year period, TFP’s contribution grew. By the decade ending in 2009, it accounted for about three-quarters of the annual increase in the global food supply.
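The decomposition behind those figures is back-of-envelope arithmetic that anyone can check. A sketch, using only the article's round numbers (a roughly threefold rise in output from 1961 to 2009, with about 60 percent of the growth from added inputs), not the underlying study's actual series:

```python
import math

# The article's round numbers: global agricultural output nearly
# tripled between 1961 and 2009, and roughly 60 percent of that
# growth came from using more land, labor, capital, and materials.
years = 2009 - 1961            # 48 years
output_growth = 3.0            # ~threefold rise in output
input_share, tfp_share = 0.60, 0.40

# Implied average annual output growth, about 2.3 percent per year.
annual_output = output_growth ** (1 / years) - 1

# Growth accounting splits log output growth between inputs and TFP,
# implying average TFP growth of roughly 0.9 percent per year.
log_growth = math.log(output_growth)
annual_tfp = math.exp(tfp_share * log_growth / years) - 1
```

Small annual rates like these compound into the large long-run differences the article describes.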
Over time, farmers have raised their TFP by becoming more precise in applying inputs [see “Farming by the Numbers,” in this issue]. Tractors are being guided by computers with GPS to apply seeds and fertilizers more precisely, according to what a soil test indicates is needed for the field, or even a section of the field. New types of sprinkler and drip irrigation systems are able to irrigate crops using much less water than before. New pesticide compounds require far lower doses, in part because they are applied only when computer models predict the crop is at risk of significant pest infestation.
Because of advances in weed control, farmers are increasingly implementing “no-till” techniques, which let them plant the next crop directly into the stubble of the old crop while controlling weeds with herbicides. This cuts out the need for plowing, allows the use of smaller, cheaper tractors, saves time and fuel, reduces erosion, and helps water retention. Improved breeds of livestock and poultry, fed specialized diets at each stage of their development, are fabulously productive: A typical dairy cow in the United States now gives more than four times as much milk as its counterpart did in 1950. And while the number of pigs raised on U.S. farms has hardly changed in 60 years, the production of pork has more than doubled, thanks to larger, leaner, faster-growing animals.
Why do I think we are nowhere near the end of this remarkable rise in efficiency? After all, many experts who had no initial prejudice against the effectiveness of technology in agriculture thought that proven technologies were close to their limits years ago.
My doctoral thesis advisor, the late Vernon W. Ruttan, a leading scholar of world agriculture, began as a technological optimist but changed his tune toward the end of his career. In a 1998 presentation, he said, “I find it much more difficult to tell a convincing story about the likely sources of increase in crop and animal production over the next half century than I did a half century ago…. I find it difficult to escape a conclusion that both public and private sector agricultural research, in those countries that have achieved the highest levels of agricultural productivity, has begun to experience diminishing returns.”
I remain more optimistic, for several reasons. First, let’s look at the modernized countries, where a new fusion of biotechnology, genomics, and information technologies is helping agricultural scientists to develop technologies faster and on a grander scale. Public attention has focused on the contentious use of genetically engineered plants and animals in agriculture—in which a gene from one species is inserted into the genome of another to give it a desirable trait, such as the ability to withstand herbicides. Then farmers can use large amounts of herbicides to kill weeds without harming their crops.
The practice is controversial, involving the use of an unfamiliar technology and leading as it has to greatly increased use of herbicides, which can have undesirable side effects. It may be, however, that the bigger story in developed countries is how scientists are finding new life (so to speak) in the old-fashioned technique of selective breeding. For example, DNA-based molecular markers allow breeders to screen large populations of plants for traits of interest rather than rely on just a visual identification of the trait. By removing some of the guesswork, such methods lower the cost and speed the development of new varieties. Or take livestock breeding: Today, instead of having to wait years for a female calf to mature before you’re able to measure how much milk she can give, dairy breeders can predict it pretty accurately from the start. All they have to do is run some of a bull calf’s genetic markers through a computer model to predict the milk production of his future daughters. Breeders can then immediately select which bull calf to raise.
But the main reason for my optimism is the clear evidence that many developing countries are finally getting serious about doing their own agricultural research, especially in places where land and labor productivity are low. And what happens in those places matters because they are producing ever more of the world’s food—about two-thirds of the global total in 2010. Even if Ruttan was right that agricultural productivity in rich countries is hitting a ceiling, faster growth in developing countries can keep the global average rising.
It was quite an achievement, back in the 1960s, when the developing regions managed to grow more food with a given amount of land, labor, and capital—the total factor productivity I mentioned earlier. It is world-changing that since around 1990 these regions have raised the annual growth rate of TFP to around 2 percent. The magic of compound interest is at work here: An annual increase of 1 percent will double output in 70 years; a rate of 2 percent will double it in 35. And that’s just the improvement the developing countries are getting every year by increasing productivity—the way they use resources. If you throw in the results from using more resources, you get an overall increase in output of more than 3 percent a year. That’ll double the overall output of these regions in just 23 years.
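The doubling times quoted above follow from the standard compound-growth formula, and can be checked exactly:

```python
import math

def doubling_time(annual_rate):
    """Years for output to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + annual_rate)

# The "rule of 70" figures quoted above, computed exactly:
doubling_time(0.01)   # ~69.7 years at 1 percent growth
doubling_time(0.02)   # ~35.0 years at 2 percent growth
doubling_time(0.03)   # ~23.4 years at 3 percent growth
```

The familiar shortcut of dividing 70 by the percentage growth rate is just an approximation of this formula, accurate for small rates.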
Consider China and Brazil. For two decades they have sustained exceptionally high growth in TFP, and that growth has reaped remarkable dividends. China can now just about feed its own 1.36 billion people, and at a higher standard than ever before; and by some metrics Brazil has emerged as second only to the United States in its exports of food. Southeast Asia and Latin America also have done well. And it appears that the Eastern European countries have finally gotten their acts together since the breakup of the Soviet Union in 1991, which sent their inefficient, centrally planned agricultural sectors into a tailspin. TFP stagnated during the Soviet era, but it took off around 2001.
Only sub-Saharan Africa has failed to significantly improve its productivity. Some of my colleagues would take issue with that statement, but my analysis indicates that the productivity of land, labor, and capital invested in African agriculture is growing at only about 1 percent per year. My view is that this slow growth is a consequence of several factors: weak agricultural research, poor transfer of what research results there are to farmers, unfavorable price policies, poor infrastructure, and insecure land tenure.
What distinguishes the most successful countries? Several things, but two stand out: national research institutions and policies that strengthen incentives and market access for farmers.
Agricultural advances aren’t like other kinds. Today’s Paris fashions are disseminated within weeks throughout the world, with every stitch in place. A logic chip designed in Silicon Valley doesn’t care whether it runs a laptop in the next town or on another continent. A hybrid-electric car can be driven out of its factory in Japan and soon make its way onto a street in Chicago. Not so with agricultural innovations—they must be tailored to whoever would use them.
A famous example of this involves the efforts of Soviet premier Nikita Khrushchev to improve corn. He had long noted the high yields of that crop in the United States. Upon becoming premier he ordered that it be planted in the Soviet Union, including in some farms in Siberia. So spectacular were the resulting crop failures that they were held against him some years later during his ouster, when his successors ridiculed his “hare-brained schemes.”
Crops need to be bred for specific climates, day lengths, and soil types, and to be resistant to local pests and diseases. Much of this work can’t be done for you; you just have to do it yourself. That’s why it’s so heartening that many developing-world countries are getting serious about doing their own agricultural research. Brazil’s recent success in expanding soybean production came only after a decade of research to adapt soybeans—normally grown in the temperate zone—to tropical conditions. Large parts of sub-Saharan Africa are off-limits to cattle because the tsetse fly transmits sleeping sickness to them, but crossbreeding work is under way to improve a local cattle breed, the N’Dama, that is resistant to the disease.
And it’s not just organisms that need to be redesigned. It’s big and complex systems and networks consisting of myriad machines, techniques, routines, chemicals, infrastructures, and more. When Deere & Co. introduced its tractors to India, it had to redesign the transmissions so that the machines could perform the double duty required there—plowing and road transport. Because of the importance of localizing innovation, the single most important difference between countries that have sustained long-term productivity growth and those that have not may well be in whether they have invested in their own research systems.
A second thing that distinguishes agriculture technology from other industries is the need to stay one step ahead of nature. Crop pests and animal diseases are quick to evolve resistance to any new measure we introduce to control them. And now it’s not only pests that are changing but the climate itself. Climate change poses a real risk to agriculture, especially after 2050 if global emissions of greenhouse gases are not curbed. Research is critical to enable farmers to adapt to these changes, not only to improve but merely to maintain agricultural productivity. It’s similar to the predicament of the Red Queen in Lewis Carroll’s Through the Looking-Glass, who had to run as fast as she could just to stay in the same place.
Of course, research won’t do any good if the results never leave the lab. Success in raising productivity also requires an economic environment that rewards farmers and agribusinesses that innovate. When China and Vietnam abandoned their misguided experiments with communal farms and reverted to family-run operations, their productivity took off. When the government of Ghana imposed heavy taxes on cocoa, its main export, many farmers abandoned the crop and productivity languished. When taxes were reduced, farmers rehabilitated old plantations and adopted improved seed stock, and output quickly recovered. Of course, political and social factors can harm as well as help. War, brigandage, and disease—such as HIV in Africa and avian flu in Asia—have seriously depressed agricultural productivity growth in some countries.
Such episodes are also reflected in the history of the western countries. Back in George Washington’s time, one of the surest ways for U.S. farmers to make money was to distill some of their grain into whiskey, which alone was compact enough to carry to distant markets, given the poor state of transport at the time. Indeed, it was to overcome just such problems that the federal government later justified investments in canals and railroads.
Even today, transportation remains a challenge in some agricultural regions. Years ago, the Brazilian government used research and infrastructure investment to improve agricultural productivity in and access to the Cerrado, a vast tropical savanna deep inside the country. And the Cerrado has now become the “soybean belt” of South America.
I’m not worried about the world’s sheer capacity to grow enough food of sufficient quality to feed us and our descendants for many years to come. Regions that have lagged behind the agricultural technology frontier, like much of sub-Saharan Africa, could follow the examples of Brazil and China, for instance, which tapped international expertise, invested heavily in local research, made critical reforms to policies and institutions, and thereby raised their farmers’ productivity immensely.
Agricultural innovation is among the best ways—better perhaps even than medical innovation—to help the broad mass of people in the poorest countries of the world to live more comfortable lives. No improvement in manufacturing, services, communications, or transportation can enrich the entire bottom half of the world’s population as high-yield agriculture has done, time and again.
In later editions of his essay, Malthus himself expressed some optimism that catastrophe could be avoided, emphasizing the possibility that population growth could be reduced if people delayed marriage and had fewer children. He also acknowledged the possibility of raising productivity but insisted that this avenue faced natural limits. But the evidence so far seems to suggest that it is not science that is the limiting factor but rather the willingness of society to understand and support the scientific endeavor.
About that, I’m slightly less optimistic.
This article originally appeared in print as “Why the Pessimists Are Wrong.”
About the Author
Keith Fuglie is branch chief for resources, environmental, and science policy at the U.S. Department of Agriculture’s Economic Research Service. The views expressed here are his own and do not reflect those of the USDA or the ERS.