Tech Talk

Stanford's High Energy Physics Lab goes to...the dump


This week, bulldozers are knocking down and carting away the Stanford High Energy Physics Laboratory, a.k.a. the Hansen Experimental Physics Laboratory, to make room for a new Science and Engineering Quad. The 1949 building, originally funded by the Office of Naval Research, housed a long line of pioneering accelerators and led to the development of the Stanford Linear Accelerator. From 1952 to 1990 alone, it was home to more than 700 research projects, 13 National Academy of Sciences members, three Nobel laureates, and 750 Ph.D.s. Experiments included the first large-scale superconducting accelerator and the first high-energy colliding-beam test.

The tunnel housing Stanford's free-electron laser lies four stories below the building; it will be preserved.

The site above ground will become the university's second Science and Engineering Quad, encompassing the Jerry Yang and Akiko Yamazaki Environment and Energy Building (Y2E2), the School of Engineering Center, the Center for Nanoscale Science and Technology, and the Bioengineering and Chemical Engineering Building.


Karl Brown at the controls of the Mark II accelerator in 1949 or 1950.

Top photo by Linda A. Cicero / Stanford News Service

What's Up with All the Slashed Internet Cables?

As the pace of repair work picked up on three Internet cables in the Middle East this week, word arrived that more damage has occurred to nearby undersea fiber-optic lines in the last 24 hours. The slew of slashed cables has caused a frenzy of speculation in the blogosphere about their causes. As of today, Egyptian officials still had no explanation for the damage to the first two lines, slashed a week ago, but they said there was no evidence that ships' anchors caused the breakage.

The two newly damaged lines reportedly belong to some of the same systems that were cut recently, namely the FLAG Europe-Asia and SeaMeWe-4 networks. Landline and satellite connections have ameliorated some of the outages in the Middle East and South Asia, but it is estimated that some 85 million Internet users have been adversely affected. According to one report, nearly 90 percent of Internet traffic in these parts of the world is routed through undersea cables.

Officials for the cable operators predicted that engineers working on repair ships at sea should be able to restore service in approximately one week for the earlier incidents. FLAG Telecom, operator of two of the damaged cables, told the Associated Press today that it is laying an entirely new "fully resilient" cable that will be able to withstand harsher treatment in underwater conditions.

"We are still treating this as a crisis," a FLAG spokesman told the AP. "But the new cable will provide a diversity in routes and be more resilient."

[See our earlier entry, "Internet Problems Mount for Asia/Europe Connection," for more details on last week's cable outages.]

Bush S&T Funding Initiative Keeps US Nanotechnology Funding Near Present Level

It's not clear whether President Bush's latest announcement of the American Competitiveness Initiative (ACI) is just a pie-in-the-sky funding proposal from a lame-duck president, or whether it really has a chance of being funded. But if it does receive funding, the US government's already top-tier nanotechnology funding will inch just a little higher in some areas and grow significantly in others.

According to an article on the new funding proposal, fiscal year 2009 will see the National Science Foundation (NSF) receive $397 million for nanotechnology research, up from the $389.9 million proposed for the 2008 budget, a slight increase of 1.8%.

But the National Institute of Standards and Technology (NIST) would get a 20% boost in its overall budget, with part of that increase going toward nanotechnology, along with four new facilities for astronomy and physics research budgeted at $148 million.

Nanotechnology aside, the recently announced R&D funding of a record $147 billion, representing a 3% increase over 2008, has observers calling the budget proposal "unreal" and consigning the R&D budget to the same fate as the last two years': trouble.

With the US facing the specter of a recession, the idea of investing in science for future economic growth is not the worst idea to come from inside the beltway.

Low-tech voting on Super Tuesday

Hanging chads never looked so good.

After several years of touch-screen voting in California elections, sometimes smooth, sometimes difficult, it's back to old technology. Old old technology. Paper-and-pen technology.

California Secretary of State Debra Bowen last summer decertified machines made by Diebold Election Systems, Sequoia Voting Systems, and Hart InterCivic; they can now be used only under special security precautions. So I voted the old-fashioned way. And it was chaos. There didn't seem to be much organization to the way poll workers handed out the paper ballots; it would have been easy to get one without signing in or verifying that I was at the right precinct. There was absolutely no privacy: the ballots were huge and easy to read at a distance. "Privacy envelopes" were supposedly available, but a poll worker told me they'd received only three, as I handed him my completely visible ballot. Somewhere in the back of all of this, a poll worker told me, sat one electronic voting machine for the visually impaired.

Meanwhile, in San Francisco, the city just settled a $3.5 million lawsuit against voting machine manufacturer AutoMARK, which had sold the city uncertified machines, necessitating hand counting of paper ballots in the November election. The city replaced them with machines from Sequoia Voting Systems for today's race, mostly devices that read paper ballots, but a few touch screens are out there for those who want to use them. This morning, blogger Kevin Ho faced the Sequoia voting machine red screen of death and watched voters using paper ballots stream past him as the system rebooted.

And, in New Jersey, Gov. Jon Corzine's scheduled 6:15 a.m. vote was delayed nearly an hour as poll workers struggled to fix touch-screen machines. These were also from Sequoia Voting Systems.

For those eager to hear election results, it may be a long night.


My polling place again suffered from an election-day paper shortage. Last year they ran out of paper for the printers and had to shut down the electronic voting machines. This year, many polling sites, including mine, ran out of paper ballots toward the end of the day. Late-night news showed video of election workers photocopying stacks of ballots to be driven out to the polls, where voters were waiting long past closing time to vote.

So high-tech wasn't the answer, low-tech isn't the answer...November should be interesting.

Nanotechnology's Role in Reducing CO2 Emissions

If you accept that man-made CO2 emissions are a major contributing factor to global warming, then finding ways to combat those emissions with nanotechnology is an application area of increasing interest.

A recent column in Nanotech-Now poses the question of whether nanotechnology can be economically used to fight CO2 emissions. The brief column mainly focuses on the main contributor to man-made CO2 emissions: electric power plants.

The nanotechnologies being investigated for reducing carbon emissions in this area are still somewhat speculative, and they may never yield an economical solution. Retrofitting power plants with bioenzyme "scrubbers" appears to remain price-prohibitive.

But an attempt was made last year to quantify the impact nanotechnology could have on reducing carbon emissions. The company I work for, Cientifica, released in 2007 a free white paper entitled Nanotech/Cleantech: Quantifying the Effect of Nanotechnologies on CO2 Emissions.

The results indicated that the impact of nanotechnologies on emission reductions will be in three main areas:

• The reduction of emissions from transportation through weight reduction and improved drive-train efficiency

• The use of improved insulation in residential and commercial buildings

• The generation of renewable photovoltaic energy


Reduction of emissions due to use of nanotechnologies. Source: Cientifica

So, using available technologies, nanotechnology was estimated to be able to reduce carbon emissions by 200,000 tons by 2010, mainly through weight savings and improved combustion in transport applications.

Sounds great... but by 2010 this will reduce carbon emissions by only 0.00027%.
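As a quick sanity check, the tonnage and the percentage quoted above together imply a particular global-emissions baseline. This back-of-the-envelope sketch recovers it, using only the two figures in this post rather than any independent estimate:

```python
# The two figures quoted above:
reduction_tons = 200_000      # estimated nanotech-enabled reduction by 2010
reduction_pct = 0.00027       # that reduction as a share of global emissions

# The global baseline implied by those numbers, in tons:
implied_global_tons = reduction_tons / (reduction_pct / 100)
print(f"{implied_global_tons:.3g}")  # on the order of 7.4e+10 tons
```

In other words, the 0.00027% figure is being measured against a baseline of roughly 74 billion tons, which is why even a 200,000-ton reduction barely registers.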

However, if the materials-based advances enabled by nanotechnologies that are still under development, as discussed in the Nanotech-Now column, are successfully applied, then the impact could be far more dramatic.

Realism might not matter for next-gen prostheses

Both of DARPA's Revolutionizing Prosthetics programs, Deka's 2007 "Luke arm" and the larger 2009 project, are working on a cosmesis for their bionic arms: a flesh-colored, hand-shaped, natural-looking glove to disguise the alloy and plastic underneath. They've gone to great lengths to make it look like the real thing. At Deka, I saw a silicone hand that had been painstakingly hand-painted from the inside out to keep the paint from ever rubbing off. The paint job convincingly mimics veins, knuckles, and tendons. The realistic nails are added on later, the fingertips felt real enough to creep me out, and the thing felt like a real (if cold, dead) hand.

Todd Farrington, a software engineer at Deka, has been wearing two classic hook-and-cable prosthetic arms since he was electrocuted at the age of 12. He still has his arms above the elbow. We share the same hunt-and-peck approach to keyboard use, but there's an important difference between us: he can write poetry. Farrington is eager to start test-piloting the Luke arm.

But Farrington and another Deka test pilot, Chuck Hildreth, made it clear to me that making the new arm look real was the last thing on their minds.

Farrington says that cosmeses are important to people who have recently lost limbs. They want to get back a feeling of normalcy. But within a few years, Farrington says, looking normal stops being the most important thing. "They're only going to care about functionality," he says. That's one of the many reasons why today's prosthetic limbs, skin-color-matched and hand-shaped though they may be, end up collecting dust in the closet instead of on the patient.

The problem with cosmeses is that while they look pretty, especially Deka's $10,000 model, they impede the function of the prosthesis beneath. In fact, Farrington's own prosthetics illustrate his point: "Face it, it's never really going to look real," he says. "It's not going to be convincing." So even though his right arm is a pale peach-colored plastic to approximate his skin, when it was time to get the left arm, Farrington called the prosthetist and told him to go with the less expensive carbon fiber. He pulls up his sleeve and shows me silver-black carbon fiber, shimmering with a pattern of subtle scales. I ask him if he would wear a cosmesis over the Luke arm. He grins and shakes his head quickly.

I think that while these next-gen arms were being conceived, the idea of the cosmesis came and went. When you see an arm that moves as naturally and gracefully as the real thing, you stop caring if it's got skin. It's a weird little cognitive-dissonance trick your brain plays on you: last year at DARPATech, I met Jesse Sullivan, who was wearing the Revolutionizing Prosthetics 2009 Proto-2 arm. He's got it wired into his reinnervated pectoral muscles, so when he wants to move his hand, he doesn't have to stop and think about it. He just moves his hand. After about a minute, my brain could not compute the prosthetic-hand/natural-movement paradox and just gave up. My primate brain could not keep up, and so, against my will, I perceived Jesse as having a fully functioning arm, covered by a robot suit.

Now that we're not giving people hooks and claws anymore, there's no need to be so insistent with the camouflage. With the Blade Runner making headlines, it seems like everyone's starting to get used to the idea of being modded. Somehow, an amputee just isn't all that pitiable when he can outrun you and everyone you know.

Prediction markets front and center at relaunched tech magazine

Remember The Industry Standard? It was one of a spate of magazines that first exploded, in the sense of growing very quickly, and then exploded, in the sense of being blown to smithereens, during the dot-com boom and bust.

The International Data Group announced today that it is relaunching the magazine as a web-only publication. It's hardly the first to do so, but there's a twist.

The Industry Standard, once known as the bible of the Internet economy, returns with a new publishing model that includes editorial content, a community-driven prediction market, and social-networking components.

Dovetailing with the editorial content is the prediction market, a way of betting on the outcome of future events, [Derek Butcher, the online publication's vice president and general manager] said.

The prediction market uses community input and proven algorithms to forecast events in the technology industry, according to the statement. Registered users can use mock currency to place "virtual bets" on the outcome of these events.

"For example, a prediction might state, 'Apple will ship 10 million iPhones by the end of 2008,' or 'High-tech venture funding will decrease by 15% in Q2 2008,'" according to a company statement. "As the community members place bets on a given prediction, the resulting market price of the prediction represents the community's consensus as to the probability of that event occurring."

Butcher said he doesn't know of any other media sites that prominently feature a prediction market operating in conjunction with editorial content.

We're interested to see how prediction markets do there, because we at Spectrum have a high regard for them.

Indeed, we touted the virtues of prediction markets just last September ("Bet On It"). It's clear that they do a good job of collecting the wisdom of the crowd, which often is collectively wiser than any individual poll or pundit. (So much so, in fact, that they are being increasingly adopted by corporations, which was the focus of the article.)

Dan Gross, a senior editor at Newsweek, seems to have ignored all that in favor of the cachet of offering a contrarian opinion in a long but uninformed segment of the national public radio show On The Media.

Discussing the premier prediction website for political wagering, TradeSports, in Dublin, Gross said of the $40 million bet there, "when you compare it to the activity in the real stock market," it was "a tiny amount." It's hard to know what motivates such a comparison. The $130,000 you'd pay for a Jaguar XJ Convertible pales by comparison to the total annual revenue of the Ford Motor Company, but that doesn't mean it isn't an expensive car.

Gross's most egregiously wrong thoughts about prediction markets, though, are contained in a single soundbite:

"It's clear they are just reacting to conventional wisdom rather than setting it."

Prediction markets aren't supposed to set the conventional wisdom; they're supposed to exceed it by, in effect, giving the smartest opinions the greatest voice. People who know best back their knowledge with their dollars. Uninformed wagers that move the market away from its best guess merely attract more smart money, swinging things back again. Unfortunately, as is increasingly common these days, On The Media host Brooke Gladstone asked about none of this in a 6-minute, 35-second radio segment.

Referring specifically to wagers on the U.S. presidential primaries and market predictions of the eventual party nominees, Gross said:

"What you see is the action of the prices really following what happens in the polls and what happens as the tallies are counted."

This is the "just" in Gross's "just reacting to conventional wisdom," and it's the most wrongheaded thing he said in an interview where that wasn't an easy choice to make. In a word, as the philosopher Homer Simpson would say, "Doh."

Of course experts are going to use all available data in making their assessments, and as new data becomes available, they're going to recalculate their predictions. What Gross needs to show is that the markets move in lockstep with the latest information in some completely mechanical way. Otherwise, they're reacting, but not just reacting.

For example, if a savvy political observer, considering Sen. Barack Obama's impressive win in South Carolina, looks deeply into the exit polls there, she might notice that the senator did particularly well in certain demographics that are not as strongly represented in the next race. She might then shade her prediction away from the conventional wisdom, which doesn't look deeply into the exit polls, and bet that Obama won't do quite as well as others expect.

Gross looked at the way Obama's share price, a wager that Obama would win the Democratic nomination, went up after he won the Iowa caucuses, and went back down after his main rival, Sen. Hillary Clinton, won in New Hampshire. This is a surprise?

Gross acknowledges how well the prediction market InTrade did in 2004 state elections, getting nearly all of them right. But he says that the same claim can't be made about polls, "because of their margin of error." Well, the fact is, every poll got more than a few races wrong, even beyond the margin of error.

"They're more accurate than the polls on the day of the election after 8 or 9 months of campaigning... they are completely inaccurate if you want to know, today, who's going to win the election in November."

But that's a classic straw-man argument. The issue is not whether prediction markets can make perfect predictions today about an election nine months down the road. The issue is whether they can predict better than the polls and better than the pundits. Gross sidesteps this obvious question, and the obvious answer (yes, they can), and On The Media host Gladstone unfortunately lets him get away with it.

Luckily, few seem eager to join Gladstone and Gross in embracing the contrarian stance. Rather, most seem interested in seeing more prediction markets, such as the ones at the resuscitated Industry Standard. We wish them good fortune.

Electronic medical records: A billion here, $77 billion there--it starts to add up

Would electronic medical records save the United States $77 billion?

Hillary Clinton, Senator from New York and one of the leading candidates for the 2008 presidency, said so on Thursday night.

You can hear it for yourself. It's about five and a half minutes into this YouTube video.

If you don't want to listen, here's the key soundbite:

According to the Rand Corporation, hardly a bastion of liberal thinking, they have said that we would save $77 billion dollars a year. That money could be put into prevention. It could be put into chronic care management. It can be put into making sure that our health care system has enough access so that if you are in a rural community somewhere in California or somewhere in Tennessee or somewhere in Georgia, you'll have access to health care. If you're in an inner city area, and you see your hospital, like the Drew Medical Center, closed on you, then you're going to have a place once again where you can get health care in the immediate area.

Clinton wants to pay for universal health care, in part, with these savings. And she's been talking about it for a while. She mentioned the $77 billion figure in a key policy speech on the eve of the New Hampshire primary that reinvigorated her campaign. It's obviously an important matter, yet I'm not sure the press has taken even the quickest look at the RAND study on which so much of Clinton's health plan depends.

If they had, they'd notice that the study was released back in 2005. Not only that, but it was two years in the making, according to a RAND press release issued at the time. So the savings might be greater, adjusted for inflation, or a lot less, depending on how outdated the data is.

The study actually claims $81 billion in annual savings, according to a press release issued at the time. For some reason Clinton isn't counting $4 billion that "would be saved each year because of improved safety, primarily by reducing prescription errors as computerized systems warn doctors and pharmacists of potential mistakes."

Leaving aside those safety savings, what would the $77 billion in savings result from? Richard Hillestad, a senior management scientist at RAND who led the study,

estimates that if 90 percent of doctors and hospitals successfully adopt health information technology and use it effectively, resulting efficiencies would save $77 billion annually. The biggest savings would come through shorter hospital stays prompted by better-coordinated care; less nursing time spent on administrative tasks; better use of medications in hospitals; and better utilization of drugs, labs and radiology services in outpatient settings.

In other words, it's not as if we save $77 billion from eliminating the manual operations of paper records and can then plunge the savings into improved care. The savings come from the very improvements in healthcare that electronic health records make possible. So the question arises: Is Clinton double-counting the benefits of electronic health records, once in the saving of the money and then again in the spending of it? Look at one specific instance the RAND release gives:

For example, health information technology could make a major contribution to improving care for patients with chronic conditions such as diabetes, who account for 75 percent of the nation's medical care costs, according to researchers.

That sounds an awful lot like the "chronic care management" that Clinton cited as something that money could be put into.

To be sure, the estimated cost of Clinton's proposed changes to health care runs about $110 billion, half of which she says can come from ending the Bush tax cuts, which are set to expire soon. "The other $55 billion," she explained, "would come from the modernization and the efficiencies," of which, presumably, electronic health care is only one. It is, though, the only one she discussed at length in the debate.
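For the record, the arithmetic behind all these figures is simple; this sketch just ties together the numbers quoted in this post (all in billions of dollars per year):

```python
# RAND's estimate and the portion Clinton cites:
rand_total_savings = 81        # RAND's full annual-savings claim
safety_savings = 4             # prescription-error savings Clinton omits
efficiency_savings = rand_total_savings - safety_savings
print(efficiency_savings)      # 77 -- the figure Clinton quotes

# How Clinton proposes to fund the plan:
plan_cost = 110                # estimated cost of her health care changes
from_tax_cuts = plan_cost // 2 # half, from ending the Bush tax cuts
from_efficiencies = plan_cost - from_tax_cuts
print(from_efficiencies)       # 55 -- "the other $55 billion"
```

The sums check out; the open question, as argued above, is whether the $55 billion in "efficiencies" and the new spending it would fund are two names for the same money.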

There's also, then, a question of timing. The RAND study says

"It's going to take 10 to 15 years to achieve wide adoption of electronic medical information, even if all the ongoing efforts are successful," Hillestad said.

Does Clinton plan to wait for the savings to materialize before reforming health care? Surely not.

Health care in general is a serious issue, as is the specific one of electronic health records. It was given quite a bit of debate time this week. Unfortunately, Wolf Blitzer, who moderated the CNN-sponsored debate, arrived unprepared to challenge Clinton's airy claims about it, despite their having been made more than three weeks earlier in an important speech.

For those who want more substance than air, Spectrum has plenty to offer. Way back in 2002, we published "Welcome To The (Almost) Digital Hospital."

More recently, contributing editor Robert N. Charette looked specifically at the promises and problems of electronic medical records in "Dying for Data."

In fact, Bob's been a little obsessed by the topic. Last summer he blogged about it three times in one month, here, here, and here. The last two are about the critical issue of privacy.

And earlier this month, Bob wrote about a fascinating 3-D visualization tool for electronic health records being developed at IBM, "Visualizing Electronic Health Records With 'Google-Earth for the Body.'"

We'll continue to follow this and other tech-related issues as the presidential campaign continues. Some of the claims made by the candidates involve some pretty interesting and complicated technologies. For this one, though, all you had to do was read a two-year-old press release.

Internet Problems Mount for Asia/Europe Connection

For the third time this week, a vital cable routing Internet service between Europe and Asia has been severed. On Wednesday, two lines running under the Mediterranean Sea were cut off the coast of Egypt, most likely by anchors dropped by mooring ships. And today, a third high-capacity cable off the coast of Dubai has been damaged, also likely caused by ship activity. The combined disruptions have put a severe strain on network services across the Middle East and South Asia, according to numerous media accounts.

In a report today, BBC News relates that the latest blow to the regions came when the FALCON undersea cable, operated by U.K.-based FLAG Telecom, was severed 56 kilometers from Dubai in the Persian Gulf. It was the second major underwater accident for the FLAG (Fiber-Optic Link Around the Globe) system in less than 48 hours. Two days earlier, its FLAG Europe-Asia cable was sliced 8.3 km out to sea from Alexandria, Egypt.

Also damaged in the first accident was the SEA-ME-WE 4 (South East Asia-Middle East-Western Europe 4) cable, which runs parallel to FLAG Europe-Asia; it too lost service. SEA-ME-WE 4 is operated by a consortium of companies throughout Europe, Africa, and Southern and Southeast Asia. Both fiber-optic systems directly connect providers and users from Western Europe to Eastern Asia.

A communications analyst, Eric Schoonover of the research firm TeleGeography, told CNN News that Wednesday's accident was the more serious of the two, according to a late-breaking report. He noted that SEA-ME-WE 4 and FLAG Europe-Asia carry about three-fourths of the online service available between Europe and the Middle East. The FALCON system that was interrupted earlier today operates on a ring that makes much of its capacity redundant in cases of physical damage to individual cables.

Still, users throughout this populous portion of the world were stymied by the combined outages. Angry customers voiced strong opinions on the matter to various news outlets throughout the affected regions.

"Everyone is trying to absorb the shock," Joseph Metry, a network supervisor at Orascom Telecom Holding SAE, one of the largest phone companies in the Middle East and North Africa, said in a Times of London account.

On its Web site (which is operating normally), FLAG Telecom stated that a repair ship is expected to arrive at the site of the FLAG Europe-Asia accident in four days and that it expects the damage will be repaired "within a week thereof." As for FALCON, the company said, "[A] repair ship has been notified and [is] expected to arrive at the site in [the] next few days."

Meanwhile, network traffic cut off by the damaged cables is slowly being re-routed through other systems (such as the older SEA-ME-WE 3 cable) and connectivity has begun to pick up again.

Egypt's minister of Communications and Information Technology, Tarek Kamil, said he expects his nation's infrastructure will suffer over the days ahead but will gradually improve. "However, it's not before ten days until the Internet service returns to its normal performance," Kamil told the state Al-Ahram newspaper.

The remarkable coincidence of losing three major communications channels in such short order tells us much about how quickly the world has become connected from any point on the globe to another, and how much we take this for granted. While such robust connectivity has served to bring us together, its fragility also reminds us that we are still bound by circumstances beyond our control. Increasingly, this is a lesson experienced by all the people of our planet.

U.S. Government Terminates Its Major Clean Coal Project

By far the most important project in the U.S. government's carbon-sequestration program came to a screeching end on 31 January with the announcement by the U.S. Secretary of Energy that the department was pulling the plug on FutureGen. The basic idea of FutureGen, which goes back more than a decade, was to develop an integrated carbon-free coal-gasification technology, in which the gas would drive electric power turbines, separated hydrogen might power fuel cells, and the captured carbon dioxide would be permanently disposed of in geologic repositories. With the demise of FutureGen (whether it turns out to be somewhat exaggerated or not, as Mark Twain once said of his own alleged death), all the more significant is the clean-coal plant being built in eastern Germany, which uses an alternative carbon-capture technology called oxyfuel.

That project will be the first larger-than-laboratory-scale electric power plant in which the carbon is captured for permanent disposal.

The eastern German plant, located in a town called Schwarze Pumpe, not far from the Czech and Polish borders, is a joint project of the Swedish national energy company Vattenfall and the French power-engineering company Alstom. At the demonstration facility, which is to be completed this spring, nitrogen will be separated from air before combustion, so that post-combustion flue gases consist essentially of just water and carbon dioxide. The initial air-separation process is costly in terms of both energy and money, but the dividend comes with the simplification of the CO2 removal process.

As for FutureGen, the concept for the plant was outlined in a 1997 report by the energy panel of the President's Committee of Advisors on Science and Technology (PCAST), the last such top-level look at long-term U.S. energy policy (if one excepts the controversial 2001 Cheney report, which was produced behind closed doors, without the same kind of open scientific review). The Bush administration adopted the project in 2003, defining it as a public-private partnership in which a group of private energy companies would pay for the gasification and generation plant, while the government would cover the carbon capture and disposal costs. In the meantime, however, the estimated cost of the project has soared from about $1 billion to $1.8 billion, the Energy Department says.

A site for FutureGen had been selected in Illinois, and so the department's decision to shelve the project drew howls of protest from several of the state's heavyweight political leaders, including presidential candidate Barack Obama and Rep. Rahm Emanuel, the Democratic Party's most influential strategist. Like a phoenix, the project may of course rise again, but by the time it does, Vattenfall's Schwarze Pumpe plant will be up and running, and follow-on commercial-scale oxyfuel projects will likely be well along.


Tech Talk

IEEE Spectrum’s general technology blog, featuring news, analysis, and opinions about engineering, consumer electronics, and technology and society, from the editorial staff and freelance contributors.
