Tech Talk

Why the Evidence of Water on the Moon is Bad News

The big science story this week was the confirmation that the moon is covered in water. And not just in the shadowed, polar craters where scientists suspected it, but all over. Water on the moon--that has to be great news, right?

Not really. While unexpected discoveries are always interesting scientifically, this one is actually bad news for space exploration. It would have been better if lunar water wasn't quite so ubiquitous.

When I previewed NASA's LCROSS mission last March, I noted that its purpose was not really to determine whether the moon had water, but to determine how that water was distributed. Even though LCROSS is still scheduled to smash into the moon's surface in a couple of weeks, it now seems likely that it will only confirm this week's findings: that there are trace amounts of water everywhere on the moon.

For human exploration, highly concentrated deposits of ice would be a much more useful distribution. The possibility of mining water on the moon has often been cited as both the means and the end for sending people back to the moon (Bill Stone, for example, has been a big proponent). Even with the possibility of big deposits, though, I was always skeptical that mining lunar water ice could ever be efficient enough to be worthwhile. Now, it seems unlikely that any usable amount of water can be extracted from the surface. Geologist Robert Clark estimated that one ton of lunar soil might yield "as much as 32 ounces of water" (a little less than a liter). That means it will take a lot of work to get a little liquid, even with the help of innovative suction robots.
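To put Clark's figure in perspective, here's a quick back-of-the-envelope conversion. The constants below are my own assumptions (a short 2000-pound ton and U.S. fluid ounces), not numbers from the study:

```python
# Back-of-the-envelope: how wet is lunar soil at Clark's upper bound of
# "as much as 32 ounces of water" per ton?
# Assumptions: short ton (907.18 kg), US fluid ounce (29.57 mL), and that
# a liter of water has a mass of about 1 kg.

OZ_TO_LITERS = 0.02957   # US fluid ounce in liters
TON_TO_KG = 907.18       # short ton in kilograms

water_liters = 32 * OZ_TO_LITERS          # ~0.95 L per ton of soil
water_kg = water_liters * 1.0             # ~0.95 kg (1 L of water ~ 1 kg)
mass_fraction = water_kg / TON_TO_KG      # ~0.001, i.e. roughly 0.1 percent

print(f"{water_liters:.2f} L of water per ton of soil")
print(f"water mass fraction: {mass_fraction:.2%}")
```

In other words, even at the optimistic end of the estimate, the regolith is about one-tenth of one percent water by mass.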

I'm now interested to see how this finding is used as political spin. Back in the Bush years, NASA's Constellation program promised a fully functioning moon base within two decades (the program continues to move forward, but its future remains in limbo). Constellation's ultimate destination, at least in theory, was Mars. But, in practice, the moon quickly became the primary target, and the agency then needed to come up with a bunch of after-the-fact justifications for going there. The prospect of mining lunar water always seemed like one of these. Regardless of the impracticality of collecting it, the water finding may energize moon proponents. One Space.com article declared that "Water Makes Moon Suddenly a More Attractive Destination." The New Scientist similarly raised hopes:

Newly confirmed water on the moon could help sustain lunar astronauts and even propel missions to Mars, if harvesting it can be made practical.

In this summer's Mars special report, we examined whether going back to the moon before Mars is even necessary. While a moon program may make us better prepared for the longer and more difficult journey, it could easily turn into a wasteful distraction that eats up valuable time and resources. Frankly, the chance of finding evidence of life on Mars will always make it a more attractive destination than our close, but definitely dead, satellite. And if it's water you're really after? Mars has more of that, too.

Top Image: NASA's Cassini spacecraft observations of the moon on Aug. 19, 1999 show water and hydroxyl at all latitudes on the surface, even areas exposed to direct sunlight. [Credit: NASA/JPL-Caltech/USGS] Middle Image: These images show a very young lunar crater on the side of the moon that faces away from Earth. [Credit: ISRO/NASA/JPL-Caltech/USGS/Brown Univ.] Bottom Image: A fresh, 6-meter-wide (20-foot-wide) crater on Mars on Oct. 18, 2008, (left) and on Jan. 14, 2009. [Credit: NASA/JPL-Caltech/University of Arizona]

If There's An Innovation Gap, Where Is It?

A BusinessWeek article this week, "Turning Research into Inventions and Jobs," asserts that there's plenty of basic research in the world. What there's not enough of, the authors argue, is products - products that exploit this research.

But too often overlooked in discussions over research spending is a fundamental fact: We've already got an abundance of research. The next transistor, semiconductor, or breakthrough in MRI technology may already have been discovered. The problem is, we've dropped the ball on translating this science into invention. The vast majority of great research is languishing in filing cabinets, unable to be harnessed by the entrepreneurs and scientist-businesspeople who can set it free. We consider this shortfall academia's equivalent of Alaska's "bridge to nowhere."

The article, by Vivek Wadhwa and Robert E. Litan, was written in disagreement with an earlier BusinessWeek article, "How Science Can Create Millions of New Jobs," which asserted that "Reigniting basic research can repair the broken U.S. business model and put Americans back to work." I think Judy Estrin might agree with that, and she would certainly disagree with the new article.

Almost exactly a year ago, Estrin's book, Closing the Innovation Gap, was published by McGraw-Hill. Andy Grove of Intel called it “A passionate look at innovation by a proven innovator concerned about the level of short-sightedness surrounding us.” Grove is right — it's a topic Estrin is remarkably qualified to talk about. She's now at Stanford, but back in the day she co-founded seven different technology companies. When one of them was bought by networking goliath Cisco Systems, she became its chief technology officer. (Chapter Two of the book, "The Innovation Ecosystem," is available here.)

I did a long podcast interview with Estrin when the book came out. In it, she asserts just the opposite of what Wadhwa and Litan say. Her view is that there is a dearth of fundamental research - that, in fact, we're still living off the seed corn of the 1960s and 1970s, and it's running out. Here's a snippet from the interview.

SPECTRUM: Engineering and innovation are in your blood. Your mother was the second woman ever to get a Ph.D. in electrical engineering.... And your father taught at UCLA — he helped start up the computer science department. You went there in the early 1970s; you were on Vint Cerf's research team as the Internet Protocol was being invented there. But you say in your book that was also the time innovation started to decline.
ESTRIN: It started in the '70s with a narrowing of the horizons in the research community first, and that's where we began to pull back on planting the seeds for the future. It came in a variety of ways: in terms of government funding for research - not just the magnitude of the funding but how it was allocated - the types of funding that were coming out of the government agencies, and how it was allocated amongst different fields of science.
But the other thing that happened in the '70s and '80s is that corporations began to focus on becoming more and more efficient and more and more productive - a good thing, on the surface you would say. Of course they need to do that, but as they did, and as they started to focus on productivity and efficiency, they essentially took all the slop out of the system - and often innovation happens, comes out of, some of those inefficiencies. And they became so efficient that people began to invest just for the short term, and in order to have sustainable innovation, you have to be willing to invest in things that you don't know what the outcome is going to be, that you don't know is going to succeed. And as corporations became more efficient they cut back on investing in things that didn't have a direct correlation to their quarterly or this year's earnings.
So we stopped planting seeds for the future, not just in research but in corporations, for a while. The startup ecosystem was still thriving, so a lot of innovation was coming out of Silicon Valley and other places where startups thrived.
But when we hit 2000 - the bursting of the Internet bubble, the corporate scandals, and the tragedy of 9/11 - we saw a shift here in Silicon Valley of people becoming more risk-averse and more short-term focused. So as a result I have people coming and saying to me, “Well, come on, Judy, there's been lots of innovation over the last couple of years - look at the iPod, look at the consumer Internet, look at what's happened in biotech.”
And yes, there is still innovation going on - my claim is not that there is not - but the innovation we're doing tends to be more incremental, and is built upon years and years of seeds that were planted. We're not continuing to plant enough seeds to sustain us out 10, 20, 30 years from now.
I have a quote in the book that I really liked. When I interviewed Marc Andreessen, who developed the initial browser, he was telling me how quickly he was able to bring the browser to market, and I looked at him and just said, “That was an incredibly short time,” and he said, “You know, the browser we developed was just the icing on a cake that had been baking for 30 years.”

Saudi Arabia Aims to Become Data Visualization Hub

Saudi Arabia's biggest experiment in higher education, the King Abdullah University of Science and Technology, has just opened its doors to an international student body, as we reported earlier this month. The King has gambled billions of dollars on raising a university out of the desert that he hopes will compete against other top-notch institutions worldwide. Intellectual freedom isn't exactly the first thing that jumps to mind when one thinks of Saudi Arabia, and for a country whose technological contributions basically begin and end with oil, the hurdle is significant.

In recognition of this challenge, the king has recruited an international collection of about 70 faculty members (rumor: an assistant professor makes about US $200 000) and built laboratories with staggering price tags. The campus supercomputer, Shaheen, is the fastest in the Middle East and had a starting price of about $50 million, which will certainly grow. A nanofabrication clean room, one of the cleanest clean rooms in academia, came with a price tag that was “much, much larger than Shaheen,” according to Khaled Salama, an electrical engineering professor at KAUST.

I'm attending the inauguration ceremonies this week and got a quick tour of some of the university's laboratories, including the supercomputer and the clean room. From my perspective, if you've seen one clean room, you've seen them all. What did draw my attention were the visualization labs, which are using Shaheen's computing power to add a visual dimension to large data sets. The first example I'm posting here is a visualization of the human brain, in which researchers are attempting to trace how signals travel between different regions by mapping the flow of water through the brain.


Million Dollar Netflix Prize Won

At last, it's official: the research team known as BellKor has won the $1 million Netflix Prize by improving the accuracy of the Netflix recommender algorithm by 10 percent. It won because it submitted its entry 20 minutes before that of Ensemble, the only other team to make the 10 percent mark.
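For readers who haven't followed the contest: the 10 percent figure refers to improvement in root-mean-square error (RMSE) on predicted star ratings. Here's a minimal sketch of the metric, using made-up ratings rather than real Netflix data:

```python
import math

def rmse(predicted, actual):
    """Root-mean-square error between predicted and actual star ratings."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))

# Toy data, invented for illustration: true ratings and two prediction sets.
actual   = [4, 3, 5, 2, 4, 1, 5, 3]
baseline = [3.8, 3.5, 4.0, 2.9, 3.6, 2.1, 4.2, 3.3]  # stand-in for Cinematch
improved = [4.1, 3.2, 4.6, 2.4, 3.9, 1.5, 4.7, 3.1]  # stand-in for BellKor

# The prize criterion: relative reduction in RMSE versus the baseline.
improvement = 1 - rmse(improved, actual) / rmse(baseline, actual)
print(f"RMSE improvement over baseline: {improvement:.1%}")
```

The toy numbers here show a far bigger gap than the contest's; squeezing out the real-world 10 percent took three years of work by thousands of teams.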

Netflix sure got its money's worth. CEO Reed Hastings told the New York Times, “You look at the cumulative hours and you're getting Ph.D.'s for a dollar an hour.” Beyond the technical harvest, Netflix got the kind of halo effect that no company has enjoyed since 1997, when IBM's “Deep Blue” machine won a chess match against then-world champion Garry Kasparov.

BellKor and Ensemble were both coalitions formed by teams that had to join forces to stay in the race. BellKor's own core, though, had always been the team to beat. Lead members Robert M. Bell, Chris Volinsky, and Yehuda Koren began their work at AT&T Labs Research, where the first two still work; Koren has since joined Yahoo Research, in Israel. They snagged the competition's first $50 000 milestone by achieving the best improvement, short of 10 percent, by a certain date. The three scientists, together with Jim Bennett, then a vice president at Netflix, described their work for IEEE Spectrum in "The Million-Dollar Programming Prize" (May 2009).

Of course, Netflix has launched a follow-up competition. This time, though, to keep the contest from dragging on indefinitely, it has set no particular target, deciding instead to award $500 000 to whoever's in the lead after six months and an equal sum to the leader after 18 months.

The competition has advanced the entire field of recommender systems, to the benefit of other companies. Earlier this year Tom Conrad, CTO of Pandora, the Internet radio company, told IEEE Spectrum that “we have benefited by peering inside some of the thinking that went into the Netflix prize. We have incorporated some ideas into our own system.”

DOE Mad Science Wing Finally Gets a Director

On Friday, President Obama announced his pick to head the Advanced Research Projects Agency-Energy (ARPA-E): Lawrence Berkeley National Laboratory's Arun Majumdar.

ARPA-E, a high-risk research incubator in the U.S. Department of Energy, was signed into law in 2007 but languished for two years as funding and interest lagged. In February, the incoming Obama administration lavished $415 million on the fledgling organization.

So what’s next for Majumdar? The new director will need to be confirmed, a process that will take at least a month. By the time he is confirmed, it will likely be time for Congress to adjourn for the winter. In a report released in July, former HSARPA director Jane "Xan" Alexander laid out a roadmap for success for the new director.

ARPA-E was recommended in an influential 2005 report co-authored by Obama Energy Secretary and Nobel Prize-winning physicist Steven Chu, and signed into law in 2007. The agency is modeled on the Defense Advanced Research Projects Agency (DARPA), credited with developing the Internet. ARPA-E’s goal is to create game-changing energy technologies from high-risk research gambles. Other agencies have adopted DARPA’s framework, leading to the creation of IARPA (Intelligence) and HSARPA (Homeland Security). With a director who reports only to the Energy secretary and a lean core staff, capped at 120 people, directly responsible for all funding, such agencies can award grants fast and remain free of the bureaucracy that famously slows government to a crawl.

The plans are there, and the money is there, and now a new director is in place. My advice? The first step needs to be to get your own web site.

FCC to Tackle Internet Rules

FCC Chairman Julius Genachowski announced that the Commission will kick off a rulemaking proceeding on Internet regulation shortly, as soon as the formality of a vote is completed. The long-anticipated announcement has more to do with cleaning up the state of Internet regulation, which the previous chairman left in a bit of a mess. The FCC is supposed to make rules in public proceedings before enforcing them, but the previous commission slapped Comcast's wrists with a set of rules that it had declared unenforceable when they were written, the Four Freedoms that make up the Internet Policy Statement. 

As expected, Genachowski announced that he intends to propose an anti-discrimination rule and a transparency rule. These had been considered mutually exclusive, so the combination is a bit of a surprise.

As the purpose of the speech was to announce the rulemaking procedure and not the precise nature of the rules themselves, it wasn't the most stirring piece of oratory. There were some curious moments early in the narrative when the chairman walked through the history of ARPANET and touted the architectural wonder of the Internet (the speech is posted on a new Web site the FCC created today):

Historian John Naughton describes the Internet as an attempt to answer the following question: How do you design a network that is “future proof” -- that can support the applications that today’s inventors have not yet dreamed of? The solution was to devise a network of networks that would not be biased in favor of any particular application. The Internet’s creators didn’t want the network architecture -- or any single entity -- to pick winners and losers. Because it might pick the wrong ones. Instead, the Internet’s open architecture pushes decision-making and intelligence to the edge of the network -- to end users, to the cloud, to businesses of every size and in every sector of the economy, to creators and speakers across the country and around the globe. In the words of Tim Berners-Lee, the Internet is a “blank canvas” -- allowing anyone to contribute and to innovate without permission.

While this is pretty much standard Internet mythology, it's not accurate enough for regulatory work. Network engineers know that no single-service network, which is what the Internet has become post-BGP, can ever be application neutral. The Internet's best-effort delivery service is fine for generic content applications like the Web, and much less fine for real-time services and for high-bandwidth content applications like P2P file sharing. There is no such thing as a truly neutral network; we can only approach neutrality to the extent that the network can tailor delivery services to the needs of applications. That's why we have Quality of Service logic in IEEE 802 LANs, WLANs, WPANs, and WWANs. One-size-fits-all is a myth. A network with no QoS does pick winners and losers, make no mistake about it.
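To make the point concrete, here's a toy strict-priority scheduler - in the spirit of, but vastly simpler than, the traffic-class mechanisms in IEEE 802 networks. The traffic classes and packet labels are invented for illustration:

```python
import heapq

# Toy strict-priority scheduler: lower priority number = more latency-sensitive.
# A crude stand-in for the traffic classes in IEEE 802 QoS mechanisms.
VOICE, VIDEO, BULK = 0, 1, 2  # illustrative traffic classes

def transmit_order(packets):
    """Return packet labels in send order: by priority class, FIFO within a class."""
    queue = []
    for seq, (prio, label) in enumerate(packets):
        # The arrival sequence number breaks ties, preserving FIFO within a class.
        heapq.heappush(queue, (prio, seq, label))
    order = []
    while queue:
        _, _, label = heapq.heappop(queue)
        order.append(label)
    return order

arrivals = [(BULK, "p2p-1"), (VOICE, "voip-1"), (BULK, "p2p-2"), (VIDEO, "stream-1")]
print(transmit_order(arrivals))  # voice jumps the queue; bulk transfers wait
```

Even this trivial scheduler "picks winners and losers" - the voice packet goes out first no matter when it arrived - which is exactly why a no-QoS network is not neutral either: it just picks different winners.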

The chairman's lack of precision is par for the course in political circles, but there's a significant danger to innovation from trying to apply these metaphoric descriptions too literally. When your network has a structural bias in favor of a particular class of applications, it needs to permit management practices to overcome it. It's not clear that the FCC has the digital chops to appreciate this.

So the shoe has finally dropped, and the FCC is on the road to fulfilling President Obama's campaign promise to protect the open Internet. This could result in clarity, certainty, and a good environment for investment, or it could degenerate into a circus as the Comcast proceeding did in 2008. Chairman Genachowski is a bright and earnest public servant, and the odds are better than even money that the rulemaking will not do significant harm - but you never know how these things will turn out until the votes are counted. Those of us who do network engineering for a living need to keep a close watch on this proceeding.

Can You Trust Crowd Wisdom?

Can you trust crowd wisdom? An article this week on the MIT Technology Review website asks that question and answers it in the negative - or rather, says that new research indicates the answer is no: "Researchers say online recommendation systems can be distorted by a minority of users."

When searching online for a new gadget to buy or a movie to rent, many people pay close attention to the number of stars awarded by customer-reviewers on popular websites. But new research confirms what some may already suspect: those ratings can easily be swayed by a small group of highly active users.
Vassilis Kostakos, an assistant professor at the University of Madeira in Portugal and an adjunct assistant professor at Carnegie Mellon University (CMU), says that rating systems can tap into the "wisdom of the crowd" to offer useful insights, but they can also paint a distorted picture of a product if a small number of users do most of the voting. "It turns out people have very different voting patterns," he says, varying both among individuals and among communities of users.

What's the official name for the bait-and-switch fallacy? This Tech Review article commits it. It wants you to think the research is about the reliability of recommendation systems, but it isn't. It wants you to think that there's a hidden problem of only a few people voting, when the research is really talking about the fact that a relatively small fraction of people do a large share of the total voting at places like IMDb.

That's not to say that there aren't problems with the voting at IMDb. Is Inglourious Basterds really the 43rd best movie ever made, better than The Departed (#57), Slumdog Millionaire (#75), Braveheart (#100), Unforgiven (#110), No Country For Old Men (#116), Million Dollar Baby (#150), or Crash (#232), each of which won the Academy Award for Best Picture in its respective year? Of course not. But the problem isn't a handful of voters influencing the vote - the fewest number of votes for any one of these movies is 85 000. The problem is 18-year-olds with no historical memory of cinema giving a movie a 10 the same night they see it, while those of us over 40 are carefully weighing whether Yojimbo gets an 8 or a 9.

Suppose for the sake of argument there's an 80/20 rule for IMDb voting - that is, 80 percent of all votes are cast by 20 percent of the people who vote. Is that a problem? What if it turns out there's an 80/20 rule for electoral voting in the United States? Does that invalidate the election process?

In other words, consider the entire aggregation of election votes cast by everyone alive who has ever voted. It might very well be the case that a handful of people turn out to every election, casting votes for every county supervisor and municipal judge election, while a large number of people turn out once every four years to vote for the U.S. President, while another large group votes even less frequently than that. It might well turn out that 20 percent of all citizens cast 80 percent of the votes. In fact, in the absence of Soviet-style mandatory voting, it would be surprising if something like that weren't the case.
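A quick simulation makes the point: if heavy and casual voters draw their ratings from the same underlying opinion distribution, concentrated voting barely moves the aggregate. All the parameters below are invented for illustration:

```python
import random

random.seed(42)  # deterministic run, for reproducibility

def simulate_mean_rating(n_voters=10_000, heavy_share=0.2,
                         heavy_votes=16, casual_votes=1):
    """An 80/20 scenario: 20% of voters cast 16 ballots each, the rest cast 1.
    Everyone draws ratings (1-10) from the same uniform opinion distribution."""
    ratings = []
    for i in range(n_voters):
        votes = heavy_votes if i < n_voters * heavy_share else casual_votes
        ratings.extend(random.randint(1, 10) for _ in range(votes))
    heavy_total = int(n_voters * heavy_share) * heavy_votes
    share = heavy_total / len(ratings)        # fraction of votes from heavy voters
    return share, sum(ratings) / len(ratings)  # that fraction, and the mean rating

share, mean = simulate_mean_rating()
print(f"heavy voters cast {share:.0%} of votes; mean rating {mean:.2f}")
```

With these parameters the heavy voters cast exactly 80 percent of the votes, yet the mean rating stays pinned near the population average of 5.5 - concentration only distorts the aggregate if the heavy voters' opinions actually differ from everyone else's.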

As might be expected, the paper itself, which was presented at the 2009 IEEE International Conference on Social Computing and is available from Kostakos's website here [PDF], isn't about the unreliability of crowd wisdom at all. It looked at three different online voting systems with different barriers to entry for voting. (Its conclusion that experts can be encouraged to vote more often by lowering the barriers to voting seems to me rather circular and obvious, given that it defines experts simply as people who vote often.)

The paper takes for granted that if an item has been reviewed or voted on only a couple of times, the result is unreliable, and it doesn't seem to have anything particular to say about the reliability of a recommendation based on a large number of votes or reviews. It doesn't, by the way, even contain the word "distorted" - that seems to have come from a conversation or interview with Kostakos, not from the paper itself.

Nor does the paper have anything to say about "online recommendation systems" - when discussing Amazon, for example, it considers only the voting and reviewing on the Amazon site, and not the feature by which it recommends other products based on what other people looked at or bought. This reviewer's recommendations: One shaky thumb up for the research, two firm thumbs down for Tech Review's report on it.

IEEE Standards Board Member to Rejoin Iggy Pop and The Stooges


Many of James Williamson’s colleagues—at Sony, where, until a few months ago, he was Vice President of Technology Standards, and at IEEE, where he serves as a member of the Standards Association Board of Governors and the Association's Corporate Advisory Group—didn’t know about the years he spent as a punk guitarist and member of The Stooges. His calm manner and even temper at standards meetings belied his previous reputation as one of the loudest and raunchiest punk rockers in the business.

Williamson co-wrote the songs and played guitar on the 1973 album Raw Power, now considered a punk classic. He collaborated with Iggy Pop on the 1975 album Kill City, then turned to electrical engineering, earning his BSEE degree from California State Polytechnic University.

He did return to music briefly, contributing to Iggy Pop’s 1979 album New Values, then focused on his technical career.

But now, recently retired from Sony, he’s picking up the guitar again. Williamson, who hasn’t performed in front of a paying audience in 35 years, has reportedly started practicing for his musical comeback. The Stooges are currently booked to appear next year at the All Tomorrow’s Parties Festival in London, possibly the first stop on a tour.

No word yet as to whether IEEE members will be able to purchase concert tickets at a discount.

Apple Just Announced a Flip-killer, the iPod Nano Video Camera

I've been thinking about putting a Flip video camera high on my Christmas list - so much more convenient than lugging around my old digital video cassette camera for family events. But Apple's intro today of its Flip-killer - a video camera that, oh, by the way, is built into an iPod Nano - just sank that idea. Not just because it's an iPod too (I'm thinking I wouldn't use it for music; I'd be saving the memory for movies), but because I have complete faith in Apple making the user interface easy, I won't need to load more software (Flip requires a special app), and it'll go right into iTunes without the conversion that Flip videos require. Plus it's thinner, boasts a five-hour battery life, and is about the same price ($149 for 8 GB). And oh yeah, I like the colors. Which could present a problem: do I want pink, or red, or blue...

Followup: I saw my first video Nano in the wild shortly after 7 p.m., just eight hours after the announcement - in the hands of a parent taking videos at a back-to-school event. It was a red one. It got away before I could check it out.

Tech Museum of Silicon Valley Announces 2009 Laureates

This week, the Tech Museum of Silicon Valley announced its 2009 laureates. Among the 15 honorees:
—Joseph Adelegan, whose project in Nigeria takes the waste stream from slaughterhouses and turns it into methane for electricity generation or cooking gas.
—Sean White, who is digitizing the plant collection of the Smithsonian to create an Electronic Field Guide that will identify species through object recognition.
—The Alternative Energy Development Corp. of South Africa, which is using zinc air fuel cells for household electricity.
—Solar Ear, a Brazilian company building inexpensive hearing aids that come with solar rechargers.
—Geogebra, an organization developing open-source software for teaching geometry, algebra, and calculus.

The Tech Awards annually honor efforts to use technology to improve the lives of people around the world. One laureate in each of five categories—environment, economic development, education, equality, and health—will receive a cash prize of $50 000, to be announced at a gala on November 19th. This year’s James C. Morgan Global Humanitarian Award recipient, Al Gore, will also be recognized at the gala.

The announcement came at the unveiling of a new Tech Museum gallery, “Technology Benefiting Humanity.” The exhibit includes interactive looks at the inventions of eleven previous laureates, including Solar Sailor, a company that combines wind, solar, and hybrid technology to power boats, and Adaptive Eyecare, a company that is developing glasses with lenses whose power can be adjusted by the wearer.


Tech Talk

IEEE Spectrum’s general technology blog, featuring news, analysis, and opinions about engineering, consumer electronics, and technology and society, from the editorial staff and freelance contributors.
