Tech Talk

Startling Claims about China-Pakistan Nuclear Cooperation

An article by a former U.S. Air Force Secretary in this month's Physics Today magazine says that China turned over the blueprints for its own first atomic bomb to Pakistan, started to provide Pakistan with nuclear weapons technology as early as 1982, and likely helped Pakistan conduct that country's first nuclear weapons test at a Chinese test site. The article, by Thomas Reed, whose career started with nuclear weapons work at the Lawrence Livermore National Laboratory in the early 1950s, is based largely on allegations by Danny Stillman. Stillman, as director of technical intelligence at the Los Alamos National Laboratory, was given official tours of Chinese nuclear facilities two decades ago. The two of them, Reed and Stillman, are the authors of a book telling the whole story, Nuclear Express, to be published early next year by Zenith Press.

Their arresting claims are best treated with caution. The American Institute of Physics, publisher of Physics Today, raises the question in its press release about the article of why the Chinese would have given Stillman the red carpet treatment he describes. "Why should the Chinese escort a knowledgeable American official on what became a sort of nuclear Marco Polo tour analogous to the fabled journey by the Venetian merchant through the heart of 13th century China?" asks Phil Schewe, AIP's chief physics information officer. Reed speculates that the Chinese wanted the West to be aware of their work, in which they took pride, and to treat them and their country with greater respect.


But does that explain why they would imply they directly helped Pakistan test nuclear weapons and say they started to share sensitive nuclear technology with Pakistan decades ago? To the extent they did such things (and there's been no doubt for a long time that to some extent they did), they have helped create a mega-problem for the West, one that is not making the West feel grateful.


Even if the most arresting allegations of the Reed-Stillman article and book turn out not to hold water, the secondary allegations are still absorbing. The various statements about China-Pakistan nuclear cooperation actually are tacked onto the article only at the very end. The bulk of the article describes visits to various highly sensitive facilities in detail, evoking a weirdness that often reminded this reader of passages in Don DeLillo's Underworld.


Stillman was impressed by the sophistication of the instrumentation the Chinese were using for nuclear test diagnostics, which he says "were every bit as good as those used in American nuclear tests." But he found an alarming absence of automated controls on Chinese nuclear weapons in the early 1990s; the Chinese basically were relying on human guards deemed loyal and trustworthy.

Kolbert Casts Cold Eye on Candidates' Climate Credentials

The New Yorker's Elizabeth Kolbert, author of an acclaimed magazine series and book about climate change, takes a wary look at the two U.S. presidential candidates in the latest issue of OnEarth, a quarterly published by the Natural Resources Defense Council. Though both McCain and Obama have sponsored legislation to curb carbon, "both [also] have supported laws whose goals are directly at odds with cutting emissions," Kolbert observes. Obama, for example, has favored incentives for corn ethanol, despite evidence that it is as bad as oil or worse. Last year he sponsored legislation to support conversion of coal to liquids, "about the worst possible move the country could make," as Kolbert puts it.

Obama has since amended his position on the issue to say he supports coal-to-liquids only if the technology emits at least 20 percent less carbon over its lifecycle than competing conventional fuels. About the best that can be said about that, I'd add, is that the otherwise brilliantly well-informed Obama seems to have been not so well informed about an issue he claims to care a lot about.

As for McCain, his outspoken support for offshore oil drilling is hardly consistent with the notion that we should use less oil so as to emit less carbon, Kolbert points out. And with the benefit of additional experience we may add: if he cares so much about climate, why would he pick a vice presidential candidate who considers the jury still to be out on the science, ignoring the effects of global warming all around her?

As Thomas Friedman noted in The New York Times, McCain had stood apart from President Bush by opposing drilling in the Arctic National Wildlife Refuge and by advocating action on climate. But now, as the Sierra Club's Carl Pope says, he has picked a running mate who's dismissive of alternative energy. "While the northern edge of her state literally falls into the rising Arctic Ocean, Sarah Palin says, 'The jury is still out on global warming.'"

Shortcuts at U.S. Nuclear Fuel Facility?

The Chemical Engineer, a magazine published by Britain's Institution of Chemical Engineers, is reporting allegations that safety standards are being neglected in the design of a nuclear fuel fabrication facility being built in South Carolina. The $4 billion plant, near Aiken, will produce so-called mixed-oxide fuel, consisting of uranium and plutonium recovered from spent nuclear fuel or nuclear weapons. Dan Tedder, an emeritus professor of chemical engineering at the Georgia Institute of Technology, told The Chemical Engineer that basic process design information was incomplete, with serious implications for safety.

Tedder, who served last year as an independent technical reviewer on the project for the Nuclear Regulatory Commission, predicted that safety problems will manifest themselves when the plant is operational. "The documentation provided in the license application is very superficial" and "isn't consistent with reasonable and generally accepted good engineering practice," Tedder told The Chemical Engineer.

The NRC has dismissed Tedder's accusations as unfounded but has barred access to documents in dispute, on security grounds.

Nuclear Physics Hip-hop Video Climbs the Charts

A few weeks back, associate editor Sally Adee spotted a music video on YouTube that struck her fancy; it also offered some real scientific background on the controversial Large Hadron Collider (LHC), the world's largest subatomic particle accelerator. So she posted it to our blog (see Large Nerd Collider).

(The controversy over the LHC comes from skeptics who claim that operating the accelerator could produce artificial black holes that might eventually swallow the whole planet; see Courts Weigh Doomsday Claims at MSNBC.)

For a publication such as IEEE Spectrum, which sometimes runs the risk of seeming a bit, uh, studious (if that's the right word), it was a breath of fresh air. So we decided to promote it in our weekly Tech Alert newsletter, which goes out to thousands of our readers. Guess what happened next?

Yep, the video, "Large Hadron Rap," went viral. As of today, it's been viewed at YouTube more than 750 000 times. It's so hot that the Associated Press has now noticed. The news service ran an article on the hip-hop video yesterday: This ain't no jive, particle physics rap is a hit.

The creator of the tune, Kate McAlpine, a 23-year-old Michigan State University graduate, has now become something of a phenom in the science community.

"Rap and physics are culturally miles apart, and I find it amusing to try and throw them together," McAlpine, a science writer at CERN in Switzerland, commented recently.

Just in case you've missed it, here's the video again:

Now, we're not saying that we had anything to do with its meteoric popularity, but we at least recognized a good thing when we saw it.

Peace out!

Confessions of a Hot Chips n00b

I spent last week at the Hot Chips conference, which, for you non-cognoscenti, is an all-star conference on high-performance microprocessors. I watched as Intel, NVIDIA, IBM, AMD, and a constellation of other chip designers presented Power-pointy microchip architectures until my brain had disintegrated into a thin gruel. I would like to share some observations, but they will all be borrowed, as my melting neurons were unable to produce their own.

It's not news that everything is about multicore and GPGPUs (that's general-purpose graphics processing units), and the Hot Chips lineup reflected that fact. For those of you unlucky enough to know even less than I do, a GPGPU is a sort of semi-holy grail for system-on-a-chip architectures. GPUs have been used for, well, graphics rendering and processing pretty much since the dinosaurs roamed the earth. But recently, with Moore's Law sending the semiconductor industry into its screaming death spiral, people have looked for ways out of relying solely on CPUs (central processing units), which are brainy but compensate for their intelligence by being a lot less energy efficient per computation.

If you can make the CPU the brains of the operation, so to speak, you can have it direct a bunch of heavy-lifter GPUs, whose strength lies in their amazing ability to crunch numbers that would make your head explode. They can do that because they are built to churn through floating-point operations in parallel.

But the problem is that in order to use these hired thugs for anything other than video processing, you basically have to lie to them and tell them they're working with graphics. You do that with a thin layer of code that converts your instructions into the only language they can understand: red, blue, and green pixels and where to put them. NVIDIA was the first to do so, inventing a language called CUDA. Then the Cell processor came along. Now it's Intel's turn, with the much-vaunted Larrabee architecture, which isn't even a chip yet. But it has still made big waves, because it takes GPU manipulation out of the proprietary NVIDIA pool. Now you don't have to learn to use CUDA. That is the chip engineering equivalent of a swift slap across the face with a white glove: 85 percent of the world's programmers already know how to use Intel's x86 architecture (not to mention C).
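Vendor specifics aside, what CUDA-style languages expose is a data-parallel kernel model: you write the body for one logical thread, and the hardware fans it out across thousands of data elements at once. A toy Python sketch of the thread-per-element idea (the function names and the sequential "launcher" are mine, purely illustrative, not real CUDA):

```python
# Toy sketch of the GPU "kernel" programming model: write the body for
# ONE logical thread; the hardware runs one instance per data element.
# (Names and the sequential "launch" loop are illustrative, not CUDA.)

def saxpy_kernel(tid, a, x, y, out):
    # A real GPU thread touches exactly one index; there is no loop here.
    out[tid] = a * x[tid] + y[tid]

def launch(kernel, n, *args):
    # A GPU would run these bodies in parallel; we just iterate.
    for tid in range(n):
        kernel(tid, *args)

n = 8
x = [float(i) for i in range(n)]
y = [1.0] * n
out = [0.0] * n
launch(saxpy_kernel, n, 2.0, x, y, out)  # out[i] = 2*x[i] + y[i]
```

The selling point of a CUDA-like language is that you write the kernel once and the runtime spreads it across the GPU's thread army, instead of you disguising the computation as pixel operations.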

A quick rundown of several technologies at the show, and the associated commentary, after the jump.

1. A company called Audience has built a chip called the A1010, a voice processor based on the human hearing system. This digital signal processing chip replaces the traditional fast Fourier transform with something they call a fast cochlear transform. (A Spectrum story that examines this technology in depth is coming soon.)

A bigwig in attendance thought it was a great idea because it works more like the human ear than like a machine. Cell phones equipped with these babies will block out the nasal lady announcing the gate change, the shrieking baby, the man nattering about The Big Merger, and even a jackhammer. The best part? It can probably do it on your end too, adjusting the volume to cancel out noise not just on the other guy's end but on yours. To my (limited) knowledge, the only way they could possibly do that is by installing a little finger that extends out of the phone to plug your ear, but who knows.
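Audience hasn't published how its fast cochlear transform works, so any code for it would be guesswork. For contrast, though, here is a minimal sketch of the conventional FFT-style approach such a chip would replace: transform the signal, zero out low-energy frequency bins, and transform back. The toy O(N^2) DFT, the test signal, and the threshold are all made-up illustrations:

```python
import cmath
import math
import random

# A minimal spectral noise gate, the conventional FFT-style approach a
# cochlear transform would replace. All numbers here are illustrative.

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(spec):
    n = len(spec)
    return [sum(spec[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def spectral_gate(x, threshold):
    # Keep only frequency bins with significant energy; broadband noise
    # is spread thinly across all bins and falls below the gate.
    return idft([c if abs(c) >= threshold else 0.0 for c in dft(x)])

random.seed(0)
n = 64
tone = [math.sin(2 * math.pi * 5 * t / n) for t in range(n)]   # the "voice"
noisy = [s + random.gauss(0.0, 0.2) for s in tone]             # plus babble
clean = spectral_gate(noisy, threshold=8.0)
```

The tone concentrates its energy in two frequency bins while the noise smears across all 64, which is why a simple magnitude gate can separate them; the claim behind the A1010 is that a cochlea-like decomposition does this job better than uniform FFT bins.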

And the most important information? The thing is now in the LG Cyon and the Sharp SH705iII.

2. Faint blurring around objects on HD video, called "halos," is apparently a bigger problem than colon cancer, judging by the concentrated brainpower going into solving it. Witness three separate chips, rolled out by AMD (which bought ATI to integrate graphics and CPUs), NXP Semiconductors, and Toshiba, each taking on this life-threatening situation so that AV nerds need never again struggle with faint halos around the helicopters populating their video games.

You see, HD right now is "fudged." Twenty-four frames per second of source video are translated to the 60-Hz refresh rate that an HDTV displays, which means that most of the frames the set shows each second must be interpolated by image-processing software. The result is a weird kind of visual time lag that the presenter showed in a frame-by-frame analysis of a helicopter flying past a building with many windows in the background. Let's say the helicopter is in front of the first row of windows in one frame and in front of the second row in the next. Because the TV has pushed through only about a third of the information the processor needs to visually interpret the three-dimensional location of the helicopter relative to the building, the chip just starts making things up. So between the first and second rows of windows in the background, instead of a smooth wall, you see a schizophrenic pattern of "new" windows that the computer threw in there as its best guess for what we should be looking at.
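The cadence arithmetic behind that fudging is easy to check: lay 24 source frames against 60 display slots per second and count how many slots land exactly on a source-frame time. A quick sketch:

```python
# 24 source frames/s shown on a 60 Hz panel: output slot i (at time
# i/60 s) coincides with a source frame only when i/60 is a whole
# multiple of 1/24, i.e. when (i * 24) % 60 == 0.
FPS_SRC, FPS_OUT = 24, 60
aligned = [i for i in range(FPS_OUT) if (i * FPS_SRC) % FPS_OUT == 0]
synthesized = FPS_OUT - len(aligned)
# Only 12 of the 60 slots per second line up exactly; the other 48
# must be repeated or motion-interpolated, and interpolation is where
# the halos come from.
```

So roughly four out of five displayed frames are the chip's invention, which is why these interpolation engines get so much silicon thrown at them.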

Anyway, AMD's mediaDSP solves that problem via a whole system of flow charts that I never want to see again. This seems like a pretty boring payoff for buying ATI.

3. The really big reveal of the day was the architecture of Anton, a specialized chip optimized to be a molecular dynamics simulation engine. Yeah, yeah, bear with me.

It's a supercomputer from New York-based D. E. Shaw Research, following in the footsteps of everyone else turning to GPUs to take over for CPUs. D. E. Shaw's constellation of polymaths working on machines that can simulate molecular dynamics includes biologists, computational chemists, and electrical engineers, among other scientists. Here's why you want a molecular dynamics simulation engine: drug design.

Right now, drug design averages five years before clinical trials can even get started. You have to start with 10 000 petri dishes and test 10 000 interactions. Then you have to pay lots of people to physically examine the results. Then, of those 10 000 interactions, you pick the best ones. You whittle them down until you have something you can test in mice. Then you test it in monkeys. Then you can start clinical trials with human test subjects. That's ten years for one drug. But if you could model the entire first part, you'd get rid of your 10 000 petri dishes; your countless man-hours wasted on people checking the interactions manually; your lather, your rinse, your repeat. With the right supercomputer, you could shorten drug development cycles by five years.

"Think of it as a CAD tool for drug design," said presenter Martin Deneroff. The reason no one has been able to do that is the enormity of the parallel processing power required: more than you can find in a Roadrunner, a Cray, and a Blue Gene/L all put together. "The designers of these supercomputers couldn't build a general purpose computer that goes faster," said Deneroff. "They're not useful for drug design." On any of these machines, just one simulation would take a year to complete. Anton was designed to get that number under one month. The solution? Throw out all general-purpose processing functions: you'd have a computer that literally does molecular dynamics and nothing else.
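The presentation didn't spell out Anton's algorithms, but the core loop of any molecular dynamics engine is the same: compute the forces between atoms, then integrate Newton's equations in tiny time steps, over and over. A toy 1-D sketch using the standard velocity Verlet integrator (the spring constant, mass, and time step are made-up numbers, nothing to do with Anton's actual design):

```python
# Toy 1-D molecular dynamics loop: compute forces, then advance
# positions and velocities with velocity Verlet. This is the generic
# MD recipe, NOT Anton's design; k, r0, m, dt are made-up numbers.

def force(r, k=1.0, r0=1.0):
    # Harmonic "bond": pulls the separation r back toward r0.
    return -k * (r - r0)

def simulate(r, v, dt=0.01, steps=1000, m=1.0):
    f = force(r)
    for _ in range(steps):
        r += v * dt + 0.5 * (f / m) * dt * dt   # position update
        f_new = force(r)
        v += 0.5 * (f + f_new) / m * dt         # velocity update
        f = f_new
    return r, v

r, v = simulate(r=1.5, v=0.0)  # start with the bond stretched, at rest
```

Velocity Verlet is the workhorse here because it conserves energy well over very long runs; the whole Anton bet is that this one loop, and nothing else, deserves its own custom silicon.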

I'm pretty confident that Anton was the biggest deal at Hot Chips. But, on a note of caution, and as was made clear in the Monday night panel session looking back at all the bad predictions and misfires of the past 20 years, my opinion may be subject to change.

4. Godson-III is a next-generation multicore microprocessor (4 cores, then eventually 8) from the Chinese Academy of Sciences, which, if all goes well, should eventually compete with the rest of the world's great-great-granddaddy generation of microprocessors. A time machine to the year 1997!

That performance lag belies the INCREDIBLE fact that China has been putting resources into microprocessor R&D for, um, three years now, as opposed to the 35 to 40 that's been the norm among the lockstep IC-manufacturing frenemies around the globe. I am not making that up: "20 years ago in China, the decision was made not to support R&D in microprocessors," explained the presenter. "Consequently, our microprocessor R&D started only recently." The chips will be fabbed by STMicroelectronics.

More than a few noses were out of joint about China's porous relationship with intellectual property, specifically MIPS, a chip architecture that has been used in microprocessors since the mid-1980s (it was invented by a Stanford University-based company). Earlier generations of Godson were known for their unfettered use of the "MIPS-like" architecture. To its credit, the Chinese government later paid for a MIPS license, so Godson-III's use of MIPS is legitimate. But there are still a lot of sore feelings about how quickly the Godson chip series has progressed, given its short R&D timeline. The implications are obvious.

China, perhaps deservedly, didn't seem to get much respect around these parts. Top comments overheard during and after the presentation:

Anonymous: [snickers and leans over to neighbor] "Think they fab 'em in Taiwan?"

Anonymous big deal: "I wonder how many patents they violated just taping these things out?"

Yet a third publicly considered the irony and absurdity of China making money on intellectual property.

But Real World Technologies wunderkind David Kanter puts it best: "Even if they were more or less 'copies' from other designs, it's still an impressive and significant feat."

More rap music for engineers

CERN's Large Hadron Collider, going into operation on Sept. 10 near Geneva, is the subject of a hot new rap song by science writer Kate McAlpine, with over 700,000 views on its main YouTube location and more on alternate sites. Watch it below.

It could be that engineer-rap, like Rajeev Bajaj's geek rhythms, profiled in IEEE Spectrum here, is catching on.

Small Galaxies Show Influence of Dark Matter

Astronomers at the University of California at Irvine (UCI), looking at dwarf galaxies orbiting our own Milky Way, have found that most of them display characteristics suggesting they were formed by the effect of dark matter.

A news report in National Geographic informs us that the scientists, using the relative speeds of stars, determined that 18 of the 23 known satellite galaxies have a common central mass of about ten million times that of the sun.

Prior to their findings, the nearby dwarf galaxies were thought to be much less massive at their cores.

The astronomers have theorized that this points to the presence of the mysterious substance known as dark matter, which physicists think comprises the bulk of the mass of the universe and enables galaxies to coalesce, even though it cannot be directly observed.

The current model of the universe, as put forth by leading scientists, theorizes that galaxies form as dark matter's gravity attracts normal matter (or atoms), producing the starry cosmos we see at night.

"We've gone down to the smallest galaxies we can see," said Manoj Kaplinghat, a UCI astrophysicist who worked on the research. "What's surprising is there's so much dark matter, even though these guys are little. They barely have a few thousand stars."
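The National Geographic report doesn't detail the team's method, but the basic trick behind "using the relative speeds of stars" is a virial-style estimate: a system's dynamical mass scales as the stellar velocity dispersion squared times a characteristic radius, divided by the gravitational constant. A back-of-the-envelope sketch with round illustrative numbers (not the study's actual data):

```python
# Back-of-the-envelope dynamical mass from stellar motions: a
# virial-style estimate, M ~ sigma^2 * r / G. The dispersion and
# radius below are round illustrative numbers, not the study's data.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
PARSEC = 3.086e16      # meters per parsec

sigma = 10e3           # stellar velocity dispersion, ~10 km/s
radius = 300 * PARSEC  # characteristic radius, ~300 pc

mass_solar = sigma**2 * radius / G / M_SUN
```

With these round numbers the estimate lands near 7 million solar masses, the same ballpark as the ten million the UCI team reports, even though a few thousand stars account for only a tiny fraction of that mass; the rest has to be something dark.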

It's something to think about the next time you take a stroll on a clear summer night and gaze up at the heavens.

Out of Africa: Zambia Online comes to Stanford

Every Labor Day, Stanford University assembles a remarkable group of journalists from all over the world in order to spend an academic year sampling courses, meeting professors and generally enriching their intellectual outlook at one of America's top universities. The global journalists are gathered through the prestigious Knight fellowship program.

Yesterday, this year's one African journalist, a distinguished Zambian writer and online journalist, Chanda Chisala, arrived. I met Chisala at the airport, partly out of courtesy and partly because on my recent visit to Lusaka, he showed me great hospitality.

Full disclosure: I sometimes teach courses at Stanford, so I am predisposed to find value in the university's programs. Chisala, meanwhile, has become a valued friend, and I frankly want him to gain a wider audience during his stay in Palo Alto, California.

Chisala plans to study at Stanford the effect of the Internet on African media. The subject is close to his heart: in his own Zambia, he is the leading actor in bringing serious locally-produced content to the Net. He also thinks deeply about the political consequences of new media forms, and especially the effect on citizen participation in government. He is both an original thinker on African social issues and an important critic of conventional notions of black identity.

In Lusaka, Chisala is best known for his work on Zambia Online. He also is an important developer of software and Internet services in Zambia. He sees no contradiction between using new technologies to communicate and working to create and adapt new technologies to the conditions of his southern African country.

Chisala's journey at Stanford -- and in America -- is only just starting. What he brings back to Africa from my country will be worth a look, I am sure.

Apple's bad battery news could be good news for Boston Power

In the newspaper last week I read about Apple's admission that some first-generation iPod Nanos are overheating due to battery problems. The announcement came in the wake of a Japanese government report that credited overheating in first-generation Nanos with causing three fires, two light burn injuries, and twelve damaged cases. In this list it included one iPod Nano that scorched a tatami mat back in January and a second unit that burned sheets of paper this month.

In the mail last week I got an invitation to a party celebrating the fact that Boston Power is shipping its first generation of lithium-ion batteries.

Conclusion: If timing is everything, Boston Power has great timing. This is the startup company I wrote about in the March issue of IEEE Spectrum that began looking for money to fund development of a safer, more reliable lithium-ion battery just before the big Sony battery recalls of 2006. It probably would have gotten funded eventually without battery flameouts being in the news, but Sony's problems certainly didn't hurt.

Now, after keeping a fairly low profile all year, Boston Power is about to send lithium-ion batteries to as-yet-unidentified laptop manufacturers, who will quickly be installing them into computers and shipping them out to consumers. The company also announced that product developers can order batteries for evaluation from its website.

Which is why the Nano's troubles are good news for Boston Power. There's nothing like a scorched tatami mat or two to make no-meltdown technology just that much more attractive.

Out of Africa: death of the digital divide

Is the "digital divide," one of the most popular technology buzz terms of the decade, dead?

The question was posed to me last night by Eric Osiakwan, an old friend and Internet promoter from Accra, Ghana. Osiakwan is visiting the U.S. to attend a gathering of global geeks convened by Harvard's Berkman Center next week. As soon as we sat down for beers and pizza in Berkeley's Jupiter cafe, he asked me whether I had seen the confession by Jeffrey Sachs, the economist, in the London Guardian.

I had not, so Eric found the piece on my iPod. Sure enough, Sachs was admitting that for too long he had underplayed the importance of information technology -- computing, communications and the Internet -- in reducing poverty in Africa.

"The digital divide is beginning to close," Sachs opined.

"Extreme poverty is almost synonymous with extreme isolation, especially rural isolation. But mobile phones and wireless internet end isolation, and will therefore prove to be the most transformative technology of economic development of our time," Sachs added.

"The digital divide is ending not through a burst of civic responsibility, but mainly through market forces," he continued. "Mobile phone technology is so powerful, and costs so little per unit of data transmission, that it has proved possible to sell mobile phone access to the poor."

At this point Osiakwan beamed proudly, but then made an important critical point: on a global scale the digital divide is closing, but within African countries the divide remains -- and may even be worsening, as in many places the gulf between rich and poor is widening.

Another major issue, Osiakwan told me, is the shifting nature of computing and communications. Five years ago, the experts thought the computer would be the engine of networking, even in very poor parts of Africa. Great and expensive initiatives, often backed by governments and charities, arose with the aim of bringing computers to the African masses.

Computers of course remain important in Africa. But two factors changed the equation. First, used computers began to proliferate in African cities. These machines often cost as little as $150, and they are fully functional desktop machines, effective though usually running a generation-old operating system. Even laptop computers can be found for $250; again, they are used but in good condition.

These aging computers can no longer be sold in Europe and the U.S., but they are sought after in Africa, and for good reason. They cost one-fifth to one-tenth as much as a new machine.

The second factor dislodging the computer from center stage is the rapid rise of the mobile phone. Phones are becoming more powerful, so that the prospect of convergence between the Internet and mobility is approaching.

As we dive into our pizzas, Osiakwan admits to me that he never anticipated the sudden ascent of mobile phones -- and the relative lack of excitement about computers today in Africa.

"Mobile phones are where the action is," he says, "but the Internet remains the foundation for the new information society arising in Africa. Without the Internet, the phone would only be for talking."

And we all know, talk is cheap.

