How Much Information Was Consumed By Americans In 2008?

About 100K Words and 34 Gigabytes of Information Per Day


This time of year, the news is filled with year-end and decade-end retrospectives. One that I recently ran across was the "How Much Information?" study by the Global Information Industry Center (GIIC) at the University of California, San Diego. The study, led by Roger Bohn and James Short, tries to answer questions such as, "What is the rate of new information growth each year? Who produces the greatest amounts of information annually? How does information growth in North America compare with growth in other geographies, markets, and people globally?"

The GIIC, its web site says, "seeks to identify and describe through its research programs the underlying issues and consequences of technology enabled change in information and communications practices in government and industry, and those affecting individuals."

What the GIIC researchers found (or, more accurately, estimated) was that in 2008 the average American consumed 100,500 words and 34 gigabytes of information over the course of 11.8 hours on an average day. In 1980, by comparison, Americans averaged 7.4 hours a day consuming information.
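As a rough back-of-the-envelope check (my own arithmetic, not the study's, and the assumption of decimal gigabytes is mine), those daily totals imply the following hourly rates:

```python
# Back-of-the-envelope arithmetic on the GIIC figures.
# Assumes decimal units (1 GB = 10**9 bytes); the study does not specify.

words_per_day = 100_500
bytes_per_day = 34 * 10**9
hours_per_day = 11.8

print(f"{words_per_day / hours_per_day:,.0f} words per hour")                 # ~8,517
print(f"{bytes_per_day / hours_per_day / 10**6:,.0f} MB per hour")            # ~2,881
print(f"{bytes_per_day / (hours_per_day * 3600) / 10**6:.2f} MB per second")  # ~0.80
```

In other words, the average American was taking in information at nearly a megabyte per second for roughly half of their waking hours.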

"Information" is defined as flows of data delivered to people and is measured in the bytes, words, and hours of consumer information. Video sources (moving pictures) dominate bytes of information (television mostly), with radio not that far behind.
 
However, the study notes that "computers have had major effects on some aspects of information consumption. In the past, information consumption was overwhelmingly passive, with telephone being the only interactive medium. Thanks to computers, a full third of words and more than half of bytes are now received interactively. Reading, which was in decline due to the growth of television, tripled from 1980 to 2008, because it is the overwhelmingly preferred way to receive words on the Internet."

The study is interesting to look at, especially in comparison to the 2000 and 2003 studies by Peter Lyman and Hal Varian at the University of California, Berkeley, which looked at how much information had been produced since 1980.

They estimated that in 2000 the average American household spent 43 hours per year on the Internet (excluding work). Now it is closer to 60 hours per month.
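A quick sketch of what that growth amounts to (my own arithmetic; the 60-hours-per-month figure is the article's, annualized here purely for comparison):

```python
# Comparing household Internet time in 2000 vs. the time of writing (2009).

hours_per_year_2000 = 43       # Lyman/Varian estimate, excluding work
hours_per_year_now = 60 * 12   # 60 hours per month, annualized

print(f"{hours_per_year_now / hours_per_year_2000:.1f}x increase")  # ~16.7x
```

That is roughly a seventeenfold increase in under a decade.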
