A Brief History of Money
Or, how we learned to stop worrying and embrace the abstraction
In the 13th century, the Chinese emperor Kublai Khan embarked on a bold experiment. China at the time was divided into different regions, many of which issued their own coins, discouraging trade within the empire. So Kublai Khan decreed that henceforth money would take the form of paper.
It was not an entirely original idea. Earlier rulers had sanctioned paper money, but always alongside coins, which had been around for centuries. Kublai’s daring notion was to make paper money (the chao) the dominant form of currency. And when the Italian merchant Marco Polo visited China not long after, he marveled at the spectacle of people exchanging their labor and goods for mere pieces of paper. It was as if value were being created out of thin air.
Kublai Khan was ahead of his time: He recognized that what matters about money is not what it looks like, or even what it’s backed by, but whether people believe in it enough to use it. Today, that concept is the foundation of all modern monetary systems, which are built on nothing more than governments’ support of and people’s faith in them. Money is, in other words, a complete abstraction—one that we are all intimately familiar with but whose growing complexity defies our comprehension.
Today, many people long for simpler times. It’s a natural reaction to a world in which money is becoming not just more abstract but more digital and virtual as well, in which sophisticated computer algorithms execute microsecond market transactions with no human intervention at all, in which below-the-radar economies are springing up around their own alternative currencies, and in which global financial crises are brought on for reasons difficult to parse without a Ph.D. Back in the day, the thinking goes, money stood for something: Gold doubloons and cowrie shells had real value, and so they didn’t need a government to stand behind them.
In fact, though, money has never been that simple. And while its uses and meanings have shifted and evolved throughout history, the fact that it is no longer anchored to any one substance is actually a good thing. Here’s why.
Let’s start with what money is used for. Modern economists typically define it by the three roles it plays in an economy:
It’s a store of value, meaning that money allows you to defer consumption until a later date.
It’s a unit of account, meaning that it allows you to assign a value to different goods without having to compare them. So instead of saying that a Rolex watch is worth six cows, you can just say it (or the cows) cost $10 000.
And it’s a medium of exchange—an easy and efficient way for you and me and others to trade goods and services with one another.
All of these roles have to do with buying and selling, and that’s how the modern world thinks of money—so much so that it seems peculiar to conceive of money in any other way.
Yet in tribal and other “primitive” economies, money served a very different purpose—less a store of value or medium of exchange, much more a social lubricant. As the anthropologist David Graeber puts it in his recent book Debt: The First 5,000 Years (Melville House, 2011), money in those societies was a way “to arrange marriages, establish the paternity of children, head off feuds, console mourners at funerals, seek forgiveness in the case of crimes, negotiate treaties, acquire followers.” Money, then, was not for buying and selling stuff but for helping to define the structure of social relations.
How, then, did money become the basis of trade? By the time money makes its first appearance in written records, in Mesopotamia during the third millennium B.C.E., that society already had a sophisticated financial structure in place, and merchants were using silver as a standard of value to balance their accounts. But cash was still not widely used.
It’s really in the seventh century B.C.E., when the small kingdom of Lydia introduced the world’s first standardized metal coins, that you start to see money being used in a recognizable way. Located in what is now Turkey, Lydia sat on the cusp between the Mediterranean and the Near East, and commerce with foreign travelers was common. And that, it turns out, is just the kind of situation in which money is quite useful.
To understand why, imagine doing a trade in the absence of money—that is, through barter. (Let’s leave aside the fact that no society has ever relied solely or even largely on barter; it’s still an instructive concept.) The chief problem with barter is what economist William Stanley Jevons called the “double coincidence of wants.” Say you have a bunch of bananas and would like a pair of shoes; it’s not enough to find someone who has some shoes or someone who wants some bananas. To make the trade, you need to find someone who has shoes he’s willing to trade and wants bananas. That’s a tough task.
With a common currency, though, the task becomes easy: You just sell your bananas to someone in exchange for money, with which you then buy shoes from someone else. And if, as in Lydia, you have foreigners from whom you’d like to buy or to whom you’d like to sell, having a common medium of exchange is obviously valuable. That is, money is especially useful when dealing with people you don’t know and may never see again.
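The mismatch Jevons described can be sketched in a few lines of Python. The traders and goods below are hypothetical, but the logic is the one in the text: a barter trade needs a single counterparty who both has what you want and wants what you have, while money lets you split the trade in two.

```python
# Toy illustration of Jevons's "double coincidence of wants."
# Hypothetical traders, each with one good on offer and one good desired.
traders = [
    {"has": "shoes",  "wants": "grain"},
    {"has": "cattle", "wants": "bananas"},
    {"has": "shoes",  "wants": "cattle"},
]

# Barter: you hold bananas and want shoes. You need someone who has
# shoes AND wants bananas -- both conditions at once.
barter_partners = [t for t in traders
                   if t["has"] == "shoes" and t["wants"] == "bananas"]

# Money: sell your bananas to anyone who wants them, then buy shoes
# from anyone who has them. Only one condition per trade.
sellers_of_shoes = [t for t in traders if t["has"] == "shoes"]

print(len(barter_partners))    # 0 -- no double coincidence in this market
print(len(sellers_of_shoes))   # 2 -- money makes either seller a match
```

The point of the sketch is that money turns one hard matching problem into two easy ones, which is exactly why it proved so useful in a crossroads kingdom like Lydia.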
The Lydian system’s breakthrough was the standardized metal coin. Made of a gold-silver alloy called electrum, one coin was exactly like another—unlike, say, cattle. Also unlike cattle, the coins didn’t age or die or otherwise change over time. And they were much easier to carry around. Other kingdoms followed Lydia’s example, and coins became ubiquitous throughout the Mediterranean, with kingdoms stamping their insignia on the coins they minted. This had a dual effect: It facilitated the flow of trade, and it established the authority of the state.
Modern governments still like to place their stamp upon money, and not just on bills and coins. In general, they prefer that money—whether physical cash or digital—be issued and controlled only by official entities and that financial transactions (especially international ones) be traceable. And so the recent rise of an alternative currency like Bitcoin [see “The Cryptoanarchists’ Answer to Cash,” in this issue], which is based on a cryptographic code that allows for anonymous transactions and that so far has proved to be uncrackable, is the kind of thing that tends to make governments very unhappy.
The spread of money throughout the Mediterranean didn’t mean that it was universally used. Far from it. Most people were still subsistence farmers and existed largely outside the money economy.
But as money became more common, it encouraged the spread of markets. This, in fact, is one of the enduring lessons of history: Once even a small part of your economy is taken over by markets and money, they tend to colonize the rest of the economy, gradually forcing out barter, feudalism, and other economic arrangements. In part this is because money makes market transactions so much easier, and in part because using money seems to redefine what people value, pushing them to view things in economic, rather than social, terms.
Governments were quick to embrace hard currency because it facilitated the collection of taxes and the building of military forces. In the third century B.C.E., with the rise of Rome, money became an important tool for unifying and expanding the empire, reducing the costs of trade, and funding the armies that kept the emperors in power.
The decline of the Roman Empire, starting in the third century C.E., saw a decline in the use of money as well, at least in the West. Parts of the former empire, like Britain, simply stopped using coins. Elsewhere people still used money to balance accounts and keep track of debts, and many small kingdoms minted their own coins. But in general, the circulation of money became less central, as cities shrank in size and commerce dwindled.
The rise of feudal society also undercut money’s role. The basic relationship between master and vassal was mediated not by payment for services rendered but rather by an oath of loyalty and a promise of support. Land was not bought and sold; it belonged, ultimately, to the king, who granted use of the land to his lords, who in turn provided plots of land to their vassals. And feudalism discouraged trade; a feudal estate, or fief, was often a closed community that aimed to be self-sufficient. In such a setting, money had little use.
Money’s decline in feudal times is worth noting for what it reveals about money’s essential nature. For one thing, money is impersonal. With it, you can cut a deal with, say, a guy named Jeff Bezos, whom you don’t know and will probably never meet—and that’s okay. As long as your money and his products are good, you two can do business. Similarly, money fosters a curious kind of equality: As long as you have sufficient cash, all doors are open to you. Finally, money seems to encourage people to value things solely in terms of their market value, to reduce their worth to a single number.
These characteristics make money invaluable to modern financial systems: They encourage trade and the division of labor, they reduce transaction costs—that is, the cost incurred in executing an economic exchange—and they make economies more efficient and productive. These same qualities, though, are why money tends to corrode traditional social orders, and why it is commonly believed that when money enters the picture, economic relationships trump all other kinds.
It’s unsurprising, then, that feudal lords had little use for the stuff. In their world, maintaining the social hierarchy was far more important than economic growth (or, for that matter, economic freedom or social mobility). The widespread use of money, with its impersonal transactions, its equalizing effect, and its calculated values, would have upended that order.
Money’s decline didn’t last, of course. By the 12th century, even as the Chinese were experimenting with paper currency, Europeans began to embrace a new view of money: Instead of being something to hoard or spend, money became something to invest, to be put to work in order to make more money.
This idea came with a renewed interest in commerce. Trade fairs sprang up across Europe, frequented by a community of merchants who had begun to do business across the continent. This period also saw the emergence of a banking industry in the city‑states of Italy. These new institutions introduced a host of financial innovations that we still use today, including municipal bonds and insurance. The banks fostered the use of credit and debt, which became ever more central to the economy as kings borrowed to finance their military adventures and merchants borrowed to fund their long-distance trade.
The invention of the bill of exchange, which laid the groundwork for the emergence of paper money in the West, also occurred during this period. The bill of exchange was a sort of precursor to the traveler’s check: a document representing a quantity of gold that could be exchanged for the real thing in a different city. Traveling merchants liked the bills because they could be carried around with far less risk (and exertion) than the precious metal.
By the 16th century in Europe, many of the ideas about money that shape our thinking today were in place. Still, money remained a physical thing—that thing being a piece of gold or silver. A gold coin wasn’t a symbol of value; it was an embodiment of it, because everyone believed that the gold had intrinsic worth. Likewise, the amount of money in the economy was still a function of how much gold and silver was available. The rulers of Spain and Portugal didn’t quite appreciate the limits of this system, however, which led them to plunder their New World colonies and accumulate vast hoards of precious metals, which in turn triggered periods of rampant inflation and enormous tumult in the European economy.
These days, countries have central banks to oversee their money supplies, as well as to set interest rates, combat inflation, and otherwise control their monetary policy. The United States has the Federal Reserve System, the Eurozone has the European Central Bank, the Maldives has the Maldives Monetary Authority, and so on. When the Federal Reserve wants to increase the money supply, it doesn’t have to go looking for El Dorado. Neither does it phone up the United States Mint and order it to start printing more dollars; in fact, only about 10 percent of the U.S. money supply—about $1 trillion of the roughly $10 trillion total—exists in the form of paper cash and coins.
Instead, the Fed buys government securities, such as treasury bills, on the open market, typically from regular private banks, and then credits the banks’ accounts with the money. As the banks lend, invest, and otherwise spend this new money, the overall money supply that’s circulating increases. If, on the other hand, the Fed wants to decrease the money supply, it does the opposite: It sells government bonds on the open market, again typically to private banks, and then deducts the sales price from the banks’ accounts. The banks have less money to spend, and the money supply shrinks.
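The two-sided mechanism can be captured in a toy sketch (the dollar figures are made up for illustration): when the Fed buys securities, a bank’s reserves rise and its bond holdings fall, and a sale does the reverse.

```python
# Toy sketch of open-market operations (illustrative numbers only).

def fed_buys(reserves: float, bonds: float, amount: float):
    """Fed buys `amount` of securities from a bank: the bank hands over
    the bonds, and its reserve account is credited with new money."""
    return reserves + amount, bonds - amount

def fed_sells(reserves: float, bonds: float, amount: float):
    """Fed sells `amount` of securities to a bank: the bank pays out of
    its reserves and takes the bonds onto its books."""
    return reserves - amount, bonds + amount

reserves, bonds = 100.0, 50.0   # a bank holds $100B in reserves, $50B in bonds

# Expansionary move: the Fed buys $10B of treasury bills.
reserves, bonds = fed_buys(reserves, bonds, 10.0)
assert (reserves, bonds) == (110.0, 40.0)   # more money to lend and spend

# Contractionary move: the Fed sells $25B of bonds.
reserves, bonds = fed_sells(reserves, bonds, 25.0)
assert (reserves, bonds) == (85.0, 65.0)    # less money in circulation
```

The sketch deliberately ignores interest, prices, and the rest of the balance sheet; the point is only the direction of the flows the article describes.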
The sophisticated and relatively opaque machinations by which central banks keep economies afloat may make the Spanish Empire’s inflationary foibles look quaintly naive. But in fact the fine-tuning of monetary policy—the delicate juggling of interest rates, money supply, and other financial mechanisms so that an economy keeps expanding at a steady, manageable rate, without excessive inflation, unemployment, debt, or boom and bust cycles—is still a work in progress, as the ongoing economic woes in both Europe and the United States demonstrate.
Back to the 1600s: The view of money as commodity began to shift only with the widespread adoption of paper currency, which found the warmest welcome in the American colonies. In 1690, for instance, the Massachusetts Bay Colony issued paper money to fund a military campaign, and did so without explicitly promising to redeem the bills for gold or silver.
Later, during the American Revolutionary War, the Continental Congress printed “continentals” to pay for the new country’s war debts. These bills were in principle backed by gold, but so many were issued that their collective value far exceeded the available gold. When soldiers and merchants discovered they’d been paid in near-worthless scrip, it inspired a backlash against paper money; the U.S. Constitution, for instance, prohibited the states from making anything but gold and silver coin legal tender. It wasn’t until 1862, during the Civil War, that Congress finally passed a law allowing the government to print paper money, or “greenbacks.”
That’s not to say that paper money was unavailable before then. Even as the U.S. government minted nothing but coins, private banks, often called “wildcats,” began issuing what in effect became thousands of currencies. Like the wartime continentals, these bank notes were in theory backed by gold, but it was hard to know whether a bank actually had enough gold to back up its notes, bank regulation being pretty much nonexistent at the time. Unsurprisingly, the wildcat era was fertile ground for fraud. What is surprising perhaps is that most banks did a reasonable job of keeping their currency and their gold reserves in balance, and the U.S. economy grew briskly.
The Bank of England, meanwhile, took a far more sober approach. In 1821, it adopted the gold standard, promising to redeem its notes for gold upon request. As other countries followed suit, the gold standard became the general rule for developed economies. The discovery of major new gold fields over the course of the 19th century ensured that the money supply kept growing.
The gold standard, as it was intended to do, brought stability to prices and was enormously beneficial to property holders and lenders. However, it also brought deflation—that is, prices generally fell—because as countries’ populations and economies grew, their governments had no easy way to increase the money supply short of mining more gold, and so money in effect became more scarce. Deflation was hard on farmers and borrowers, who longed for a little inflation to help them with their debts; when money gradually loses some of its value, so, too, do people’s debts.
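The arithmetic behind the farmers’ complaint is simple enough to work through directly. The figures below are hypothetical, but they show how steady inflation erodes the real burden of a fixed debt while steady deflation inflates it:

```python
# Toy illustration (made-up numbers): the real burden of a fixed $1,000
# debt after ten years of constant inflation or deflation.

def real_debt(nominal_debt: float, annual_inflation: float, years: int) -> float:
    """Value of the debt in today's purchasing power after `years` of a
    constant annual inflation rate (a negative rate means deflation)."""
    return nominal_debt / (1 + annual_inflation) ** years

debt = 1000.0
# With 2% annual inflation, the debt's real burden shrinks (to roughly $820).
print(round(real_debt(debt, 0.02, 10), 2))
# With 2% annual deflation, the same debt grows heavier in real terms.
print(round(real_debt(debt, -0.02, 10), 2))
```

Which is exactly why debtors under the gold standard longed for “a little inflation”: the same nominal obligation got lighter every year instead of heavier.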
The gold standard also didn’t prevent economies from falling into recession, and when they did—as during the worldwide slump known as the Long Depression, which lasted from 1873 to 1896—adherence to the standard made it difficult to do any of the things that might have quickly set things right, like cutting interest rates or pumping more money into the economy. As a result, economies took a long time to recover from downturns.
Of course, clever financial minds will always find an end run around the rules. Having a gold standard, it turns out, didn’t completely limit the growth of money. Banks could still make loans against their gold reserves, and they did so freely. Economic historians now believe that the amount of paper currency in circulation dwarfed the actual amount of gold and silver that banks had on hand. And so, while money was still tethered to gold in people’s minds, it had already begun to become unhooked.
What finally derailed the gold standard was World War I. Governments needed more money for their militaries than they had in gold, and so they simply began printing it. And though many countries tried to return to the gold standard after the war, the Great Depression ended that experiment for good.
The result? Currencies today are “fiat” currencies, meaning they’re backed by the authority of the issuing government, and nothing more. In the United States, for example, that means the government accepts only dollars as payment for taxes and requires its creditors to accept dollars in payment for debts. But if people were to lose faith in the dollar and stop accepting it in everyday transactions, it would eventually become worthless.
Many people find this situation unnerving, which is why there are perennial calls to return to the gold standard. The reliance on fiat money, we’re told, gives too much power to the government, which can recklessly print as much money as it wants. Yet the truth is that this has always been possible. Even with the gold standard, governments revalued their currencies from time to time, in effect dictating a new price for gold, or they ignored the standard when it proved too limiting, as during the First World War.
What’s more, the notion that gold is somehow more “real” than paper is, well, a mirage. Gold is valuable because we’ve collectively decided that it’s valuable and that we’ll accept goods and services in exchange for it. And that’s no different, ultimately, from our collective decision that colorful rectangles of paper are valuable and that we’ll accept goods and services in exchange for them.
The reality is that it’s a good thing that we’ve moved away from the gold standard and the idea that money needs to be tied to something else. In the first place, it’s honest: As soon as we left behind the habit of trading cattle for barley (both of which had intrinsic value), money became a social convention, and paper money just makes that convention obvious. These days, instead of worrying about where we’re going to find more gold and silver, we can focus on how to wisely manage the money supply for the greater good.
Second, and more important, abandoning the gold standard has given central banks much more flexibility in dealing with economic downturns. Recessions are downward spirals: Instead of spending and investing, people and businesses hold on to their cash, which shrinks overall demand, which forces businesses to cut back, which creates unemployment, which shrinks demand even more.
One solution is for governments to make up the difference by spending more. But it’s also important for interest rates to drop and for the money supply to increase, thereby making it easier for people to borrow money and helping overcome their reluctance to spend. Such actions are easier for the folks at the Federal Reserve and other central banks to pull off when they don’t have to worry about maintaining the gold standard. And recessions have been shorter and less painful since the gold standard was abandoned. Even the most recent global downturn, severe as it was, was minor compared to the Great Depression.
Of course, all this talk of central bankers tinkering with the money supply is precisely what critics of the fiat money system dread, because they believe it will inevitably lead to runaway inflation. And history does show that when a government massively and carelessly expands the money supply, it ends up with hyperinflation and a worthless currency, as happened in Weimar Germany in 1923 and in Zimbabwe just a few years ago.
But such episodes are rare. In the past 90 years, the United States and Europe have had just one sustained bout of high inflation—in the 1970s. That track record should engender some faith; on the whole, central bankers act responsibly, and healthy industrial economies aren’t prone to regular inflationary spirals. But that faith is apparently hard to muster; instead, it feels to many of us as if inflation is always about to soar out of control.
This irrational fear is ultimately a legacy of the way money evolved: We cling to the belief that money needs to be backed by something “solid.” In that sense, we’re just like Marco Polo—still a bit amazed by the thought that you can base an entire economy on little pieces of paper.
And yet we do. For more than 80 years, we’ve been living in a world in which money can be created, in effect, out of thin air. As we’ve already discussed, the central banks can create money, but so can ordinary banks. When a bank makes a loan, it typically just puts the money into the borrower’s bank account, whether or not it has that money on hand—banks are allowed to lend more money than they have in their reserves. And so with each home equity loan, car loan, and mortgage, banks add incrementally to the money supply.
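The cumulative effect of that lend-and-redeposit cycle is the textbook “money multiplier,” which a short simulation makes concrete. The 10 percent reserve ratio and $100 deposit below are illustrative assumptions, not actual regulatory figures:

```python
# Toy model of fractional-reserve money creation: a bank keeps a fraction
# of each deposit as reserves and lends out the rest; the loan is spent,
# redeposited somewhere in the banking system, and the cycle repeats.

def money_created(initial_deposit: float, reserve_ratio: float,
                  rounds: int = 1000) -> float:
    """Total deposits generated from one initial deposit after `rounds`
    of the lend-and-redeposit cycle at a given reserve ratio."""
    total, deposit = 0.0, initial_deposit
    for _ in range(rounds):
        total += deposit
        deposit *= (1 - reserve_ratio)   # the lendable part becomes a new deposit
    return total

# With a 10% reserve ratio, $100 of new money ends up supporting roughly
# $1,000 of deposits -- the geometric series converges to 1 / reserve_ratio.
print(round(money_created(100.0, 0.10), 2))
```

This is the sense in which ordinary banks, not just central banks, “add incrementally to the money supply” with every loan they write.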
There is, to be sure, something a bit eerie about all this, and periods like the recent housing bubble, when banks made an extraordinary number of bad loans, should remind us of the dangers of runaway credit. But it’s a mistake to yearn for a more “solid” foundation for the monetary system. Money is a social creation, just like language. It’s a tool that can be used well or poorly, and it’s preferable that we have more freedom to use that tool than less.
Over the course of history, the material substance of money has become less important, to the point that these days people talk easily about the possibility of a cashless society. The powerful combination of computers and telecommunications, of smartphones and social media, of cryptography and virtual economies, is what fuels such talk. And that progression makes sense because what matters most about money is not what it is, but what it does. Successful currencies, after all, are those that people use: They lubricate commerce, allow people to exchange goods and services, and thus encourage people to work and create. The German sociologist Georg Simmel described money as “pure interaction,” and that description seems apt—when money is working as it should, it is not so much a thing as it is a process.
This, perhaps, is what Kublai Khan understood seven centuries ago. It’s what we’re still trying to understand today.
About the Author
James Surowiecki writes The New Yorker’s popular business column “The Financial Page.” He is also the author of the best seller The Wisdom of Crowds (Doubleday, 2004). He found the task of condensing a few millennia’s worth of material into one magazine article challenging, but also incredibly compelling. “Money is one of those things that’s completely familiar and completely mysterious,” he says, “and that makes it a great subject.”