Tech Talk

One Atom + Two Photons = Quantum Computing Switch

A scheme that uses a single atom to switch the direction of a single photon could pave the way toward quantum computers much more powerful than today’s machines.

The setup is described this week in the online issue of Science by researchers from the Weizmann Institute of Science in Rehovot, Israel. In simple terms, the atom can be in one of two states, either “left” or “right.” If the atom is in the left state, a photon that strikes it from the left will continue on in the same direction, as if it hadn’t hit the atom at all. A photon coming from the right, however, will be reflected back in the direction it came from, and at the same time the interaction will cause the atom to flip from left to right. Left and right can stand in for the 1s and 0s of digital logic.

Barak Dayan, head of the Weizmann Quantum Optics group, says the basic principle at work is interference. In one direction the photon does not interact with the atom and so continues in the same direction. But in the other direction, there is destructive interference between the incoming photon and the atom's outgoing emission in the direction of travel, so the only direction the light can travel is back whence it came. Each such reflection toggles the state of the atom.
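The toggle rule can be captured in a toy classical sketch (this ignores quantum superposition entirely; the function and state names are illustrative, not from the paper):

```python
def route_photon(atom_state, photon_direction):
    """One photon event: returns (photon outcome, new atom state).

    A photon arriving from the side that matches the atom's state is
    transmitted unchanged; a photon from the other side is reflected,
    and the reflection toggles the atom's state.
    """
    if photon_direction == atom_state:
        return "transmitted", atom_state  # no interaction
    flipped = "right" if atom_state == "left" else "left"
    return "reflected", flipped           # reflection flips the atom

# An atom in the "left" state transmits photons from the left...
print(route_photon("left", "left"))    # ('transmitted', 'left')
# ...but reflects photons from the right, toggling itself to "right".
print(route_photon("left", "right"))   # ('reflected', 'right')
```

Because each reflection flips the atom, the photon itself carries away a record of the atom's previous state, which is what makes the device useful as a switch.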

One type of quantum computer under development uses the electrical states of ions as the bits, or rather qubits, that make up its logic. The difficulty, Dayan says, is that to communicate the state of one ion to another, the ions either have to be maneuvered next to each other or the state has to be transferred from ion to ion, one at a time, to its final destination, in an atomic version of “Post Office.” Using light instead, he says, means any qubit can share its information with any other, regardless of how far apart they are, thus simplifying the system.

The group’s next step will be to impose on the photons quantum information such as superposition, essentially getting them to be both right and left at the same time. It’s that ability to hold multiple states at once, instead of just 1 or 0, that promises to make quantum computers much more powerful than digital computers. Dayan hopes his group can achieve that goal in a few months.

Fresh Hope for an Abandoned NASA Spacecraft

News of ISEE-3’s demise may be a bit premature. A months-long bid to bring the 35-year-old International Sun-Earth Explorer 3 back into the vicinity of Earth seemed to have met its end this week, when the citizen group attempting to “reboot” the mission failed to get the thrusters to produce much more than a burp.

Tests done on Wednesday during a communications session with the Arecibo telescope seemed to suggest that the culprit was a lack of “pressurant.” The spacecraft appears to have run out of the nitrogen gas it used to force hydrazine fuel from the tanks and into the thrusters.

But the reboot team, led by editor Keith Cowing and entrepreneur Dennis Wingo, CEO of California-based Skycorp Incorporated, isn’t quite ready to give up. One of the project volunteers has suggested that perhaps the nitrogen isn’t actually gone. It may in fact still be there, but dissolved in with the hydrazine.


Motion Capture Technology Goes Into the Wild for Dawn of the Planet of the Apes

The men and women who survived a deadly virus that wiped out much of Earth's human population hunker down amid the ruins of San Francisco; meanwhile, a growing ape population has built a lovely and thriving community outside the city in Muir Woods. The tension between the two societies drives the action-packed Dawn of the Planet of the Apes, the sequel to the 2011 Rise of the Planet of the Apes, which starred James Franco.

As this sequel begins, Franco's character has been dead for a decade, and the apes have had plenty of time to create their version of civilization. But in real time, it's been just three years since the Rise movie. In the world of motion picture technology, though, that's an eternity. Long enough to create computer graphics gear robust enough to take out of the studio and deep into a real forest. And long enough that moviemakers no longer need to give a recognizable Hollywood star top billing to bring in audiences.

In fact, if you passed the leading man of Dawn—Andy Serkis—on the street, you wouldn't recognize his face at all, for you never see it on the screen. That's because his performance in the woods (actually, forests near Vancouver, not San Francisco) wasn't filmed traditionally; it was motion captured and used as a framework for a realistic computer-created digital ape, Caesar. And, for the first time to my knowledge, it's the performances of the motion-capture actors, not the actors portraying humans, that are getting all the good reviews from critics; there is even talk of the first best-actor Oscar nomination for a motion-capture performance.


Contraceptive Implant Hands Women Remote Control

Women may soon bid farewell to birth control pills and welcome a new type of contraception in the form of microchip implants. An MIT startup backed by the Bill & Melinda Gates Foundation plans to start preclinical testing for the birth control chip next year and pave the way for a possible market debut in 2018.


Can The Human Brain Project Succeed?

An ambitious effort to build human brain simulation capability is meeting with some very human resistance. On Monday, a group of researchers sent an open letter to the European Commission protesting the management of the Human Brain Project, one of two Flagship initiatives selected last year to receive as much as €1 billion over the course of 10 years (the other award went to a far less controversy-courting project devoted to graphene).

The letter, which now has more than 450 signatories, questions the direction of the project and calls for a careful, unbiased review. Although he’s not mentioned by name in the letter, news reports cited resistance to the path chosen by project leader Henry Markram of the Swiss Federal Institute of Technology in Lausanne. One particularly polarizing change was the recent elimination of a subproject, called Cognitive Architectures, as the project made its bid for the next round of funding.

According to Markram, the fuss all comes down to differences in scientific culture. He has described the project, which aims to build six different computing platforms for use by researchers, as an attempt to build a kind of CERN for brain research, a means by which disparate disciplines and vast amounts of data can be brought together. This is a "methodological paradigm shift" for neuroscientists accustomed to individual research grants, Markram told Science, and that's what he says the letter signers are having trouble with.

But some question the main goals of the project, and whether we're actually capable of achieving them at this point. The program's Brain Simulation Platform aims to build the technology needed to reconstruct the mouse brain, and eventually the human brain, in a supercomputer. Part of the challenge there is technological. Markram has said that an exascale-level machine (one capable of executing 1000 or more petaflops) would be needed to "get a first draft of the human brain," and the energy requirements of such machines are daunting.
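For scale, "exascale" means at least 10^18 floating-point operations per second, which is where the 1000-petaflop figure comes from:

```python
# Unit check for the scales mentioned above.
PETAFLOPS = 10**15  # floating-point operations per second
EXAFLOPS = 10**18

# An exascale machine does at least 1000 times the work of a
# 1-petaflop machine per second.
print(EXAFLOPS // PETAFLOPS)  # 1000
```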

Crucially, some experts say that even if we had the computational might to simulate the brain, we're not ready to. "The main apparent goal of building the capacity to construct a larger-scale simulation of the human brain is radically premature," signatory Peter Dayan, who directs a computational neuroscience department at University College London, told the Guardian. He called the project a "waste of money" that "can't but fail from a scientific perspective." To Science, he said, "The notion that we know enough about the brain to know what we should simulate is crazy, quite frankly."

This last comment resonated with me, as it reminded me of a feature that Steve Furber of the University of Manchester wrote for IEEE Spectrum a few years ago. Furber, one of the co-founders of the mobile chip design powerhouse ARM, is now in the process of stringing a million or so of the low-power processors together to build a massively parallel computer capable of simulating 1 billion neurons, about 1% as many as are contained in the human brain.

Furber and his collaborators designed their computing architecture quite carefully in order to take into account the fact that there's still a host of open questions when it comes to basic brain operation. General-purpose computers are power-hungry and slow when it comes to brain simulation. Analog circuitry, which is also on the Human Brain Project's list, might better mimic the way neurons actually operate, but, he wrote,

“as speedy and efficient as analog circuits are, they’re not very flexible; their basic behavior is pretty much baked right into them. And that’s unfortunate, because neuroscientists still don’t know for sure which biological details are crucial to the brain’s ability to process information and which can safely be abstracted away.”

The Human Brain Project's website admits that exascale computing will be hard to reach: "even in 2020, we expect that supercomputers will have no more than 200 petabytes." To make up for the shortfall, it says, "what we plan to do is build fast random-access storage systems next to the supercomputer, store the complete detailed model there, and then allow our multi-scale simulation software to call in a mix of detailed or simplified models (models of neurons, synapses, circuits, and brain regions) that matches the needs of the research and the available computing power. This is a pragmatic strategy that allows us to keep building ever more detailed models, while keeping our simulations to the level of detail we can support with our current supercomputers."
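The "mix of detailed or simplified models" idea amounts to fitting model detail to a compute budget. A minimal sketch of that budgeting logic (the region names, costs, and downgrade rule here are all hypothetical, invented only to illustrate the strategy):

```python
# Hypothetical cost of simulating one brain region at each level of
# detail, in arbitrary compute units.
COST = {"detailed": 100.0, "simplified": 1.0}

def plan_simulation(focus_regions, all_regions, budget):
    """Model the regions under study in detail and the rest simply,
    downgrading detailed regions until the plan fits the budget."""
    plan = {r: "detailed" if r in focus_regions else "simplified"
            for r in all_regions}
    total = sum(COST[level] for level in plan.values())
    for region in sorted(focus_regions):
        if total <= budget:
            break
        plan[region] = "simplified"  # downgrade to stay within budget
        total -= COST["detailed"] - COST["simplified"]
    return plan

regions = ["cerebellum", "hippocampus", "thalamus", "V1"]
plan = plan_simulation({"hippocampus", "V1"}, regions, budget=150.0)
# Two detailed regions would cost 202 units; the budget of 150 forces
# one of them down to the simplified model.
```

The real software presumably makes this trade-off at the level of neurons and synapses rather than whole regions, but the shape of the compromise is the same: detail where the science needs it, simplification where the hardware demands it.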

This does sound like a flexible approach. But, as is par for the course with any ambitious research project, particularly one that involves a great amount of synthesis of disparate fields, it's not yet clear whether it will pay off. 

And any big changes in direction may take a while. According to Science, which reached out to the European Commission, the proposal for the second round of funding will be reviewed this year, but the first review of the project itself won't begin until January 2015.

Rachel Courtland can be found on Twitter at @rcourt.

DARPA Wants a Memory Prosthetic for Injured Vets—and Wants It Now

No one will ever fault DARPA, the Defense Department's mad science wing, for not being ambitious enough. Over the next four years, the first grantees in its Restoring Active Memory (RAM) program are expected to develop and test prosthetic memory devices that can be implanted in the human brain. It's hoped that such synthetic devices can help veterans with traumatic brain injuries, and other people whose natural memory function is impaired.

The two teams, led by researchers Itzhak Fried at UCLA and Mike Kahana at the University of Pennsylvania, will start with the fundamentals. They'll look for neural signals associated with the formation and recall of memories, and they'll work on computational models to describe how neurons carry out these processes, and to determine how an artificial device can replicate them. They'll also work with partners to develop real hardware suitable for the human brain. Such devices should ultimately be capable of recording the electrical activity of neurons, processing the information, and then stimulating other neurons as needed. 


Freeman Dyson Predicts the Future

Interactive Video: Choose the sections you want to watch by clicking on subjects on the video’s menu screen.

When we started making a list of visionaries to interview for our special issue commemorating IEEE Spectrum’s 50th anniversary, Freeman Dyson was one of the first names to come up.

The celebrated physicist’s career got off to a quick start in the late 1940s, with a critical contribution to the then-nascent field of quantum electrodynamics. Since then it’s ranged far and wide, touching on subjects as varied as solid-state physics, biology, and climate change.

But for many, Dyson is known for his most speculative ideas. He is the man for whom the Dyson sphere is named—a hypothetical structure, built by an alien civilization, that could capture most or all of the energy emitted by a star (and leave a telltale excess of infrared light that could be picked up by our telescopes). Dyson was also one of the key players on Project Orion, which ran from 1958 to 1963 and which conceived of a spacecraft, powered by a series of controlled nuclear explosions, that could have potentially carried humans to Saturn by 1970.

We wanted to see what this bold and imaginative thinker might have to say about humanity’s next 50 years. He welcomed IEEE Spectrum to his office at the Institute for Advanced Study in Princeton, N.J., last October, just a few days after a celebration honoring his 90th birthday.  

In the video posted here, you’ll find an interactive version of his discussion with Associate Editor Rachel Courtland. Topics include the possibility of finding extraterrestrial life, the future of space exploration, and what might become of our efforts to better understand the human brain. One of Dyson’s wilder ideas is a sort of “super-chicken,” a biological system that could allow people without a wealth of natural resources to grow their own chairs, tables, and other objects.

Toward the end of the discussion, Courtland couldn’t help but ask Dyson what it’s like to make predictions about the far future. “The point about prediction is not that it’s true. Prediction is just either a warning or a hope,” he responded. “Predictions should never claim to be true. But you can certainly claim that they’re possibilities you ought to think about.”

Boeing Gets $2.8 Billion to Help Build World's Most Powerful Rocket

The most powerful rocket ever built will use four space shuttle engines and two solid rocket boosters to propel NASA astronauts to Earth orbit and beyond—and that translates into a lot of rocket fuel. The U.S. space agency recently finalized a $2.8 billion contract with Boeing to build the rocket's core stage, which will contain the hundreds of metric tons of liquid hydrogen and oxygen needed to fuel the four main engines.


Printed Diode Is Fast Enough to Speak With Smartphones

One day, your smartphone will have conversations with your refrigerator, and your car will ask a city’s streets for tip-offs about free parking spaces. The ‘Internet of Things’ describes a world where pretty much everything is connected to everything else, enabling myriad applications from streamlined shopping to energy conservation.

It’s easy to imagine giving home appliances the hardware needed to plug into the Internet of Things – but what about a T-shirt, a magazine, or even an orange? These things need electronic labels—flexible, printed electronics that can draw power from their environments and use it to dispense useful information about the object.

A research team based in Sweden and the UK has now created the first printed e-label that can communicate with a smartphone. As a proof of principle, the device harvests energy from the smartphone’s signal and uses it to illuminate a small display. The device was unveiled today in the Proceedings of the National Academy of Sciences of the United States of America.


No Tech Solution for Civilian IED Threat

The U.S. military's approach to dealing with improvised explosive devices (IEDs) has been described by an expert at a military academic institution as "hide and pray: hiding behind more armor and praying that there’s a technical solution to all this." But there is no hiding for ordinary civilians caught in IED blasts. Even the latest battlefield technologies for countering IEDs may not be practical as protection for civilians in crowded urban areas.

Homemade bombs, the signature weapon used against United States and coalition military forces in Iraq and Afghanistan, have taken an increasingly deadly toll on civilians in recent years, according to The Guardian. The latest data suggest that IEDs have killed or maimed more than 53,000 civilians over the past three years, in incidents ranging from the conflicts in the Middle East to the Boston Marathon bombing.


Tech Talk

IEEE Spectrum’s general technology blog, featuring news, analysis, and opinions about engineering, consumer electronics, and technology and society, from the editorial staff and freelance contributors.

