Tech Talk

Bell Labs Sets New Record for Internet Over Copper

Traditional copper telephone lines can now run ultra-fast broadband service, at least in the lab.

Bell Labs, the research arm of Alcatel-Lucent, has developed a prototype technology that can deliver upload and download speeds of up to 10 gigabits per second (Gbps) simultaneously.

The technology, XG-FAST, is an extension of a new broadband standard, G.fast, which will be commercially available next year. XG-FAST uses an increased frequency range (up to 500 MHz) compared to G.fast to deliver higher speeds, but over shorter distances. In the lab, researchers achieved speeds topping 1 Gbps on a single copper pair over a distance of 70 meters. The eye-popping 10-Gbps rate was achieved over 30 meters using two pairs of lines, a technique referred to as bonding.
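
The two reported results, side by side, along with the implied per-pair rate (this sketch assumes bonding simply aggregates per-pair rates, which is the usual meaning of the term):

```python
# Reported XG-FAST lab results: (aggregate rate in Gbps, distance in m, copper pairs)
results = [
    (10.0, 30, 2),  # two bonded pairs
    (1.0,  70, 1),  # single pair
]

for rate, distance, pairs in results:
    per_pair = rate / pairs
    print(f"{rate} Gbps over {distance} m: {per_pair} Gbps per pair on {pairs} pair(s)")
```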

For some Internet providers, 70 meters may be enough to expand coverage. Many service providers have laid fiber across their networks, but extending it into every last home is an expensive proposition.

Alcatel-Lucent said the new technology should allow for Internet connections over cable that are “indistinguishable” from fiber-to-the-home in places where it’s not “physically, economically or aesthetically viable to lay new fiber cables all the way into residences.”

“XG-FAST can help operators accelerate [fiber-to-the-home] deployments, taking fiber very close to customers without the major expense and delays associated with entering every home,” Federico Guillén, president of Alcatel-Lucent’s Fixed Networks business, said in a statement.

For the past few years, Alcatel-Lucent has also been working on other ways to improve the speed of fast Internet over copper. Another nagging issue it has been wrestling with is the crosstalk that can leak between customers' copper wires. Alcatel-Lucent has introduced vectoring, which adjusts the signals sent from the home back to the hardware in the street cabinet in order to minimize the interference. However, with G.fast, crosstalk "is more like cross-shouting," according to Alcatel's TechZine blog, and will require even more innovation if it's to be overcome.
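
In essence, vectoring is a precoding trick: if you can measure how signals leak between copper pairs, you can pre-distort what you send so the leakage cancels out at the receivers. A minimal NumPy sketch, assuming a toy two-pair channel with made-up coupling values:

```python
import numpy as np

# Toy 2x2 crosstalk channel for two copper pairs (values are illustrative,
# not real line measurements): diagonal = direct gain, off-diagonal = leakage.
H = np.array([[1.0, 0.3],
              [0.2, 1.0]])

signals = np.array([1.0, -1.0])  # symbols intended for lines 1 and 2

# Without vectoring, each receiver sees its own symbol plus crosstalk.
received_plain = H @ signals

# Vectoring pre-distorts the transmitted signals with the channel inverse,
# so the crosstalk cancels by the time it reaches the receivers.
received_vectored = H @ (np.linalg.inv(H) @ signals)

print(received_plain)     # crosstalk-corrupted symbols
print(received_vectored)  # recovers the intended symbols [1, -1]
```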

Vectoring was paired with Alcatel-Lucent’s very-high-speed DSL technology (VDSL2) starting in 2011, but the latest breakthrough at Bell Labs dwarfs the speeds achieved with VDSL2, albeit over a far shorter distance. Earlier this year, Alcatel-Lucent set a new world record for real-world fiber speeds of 1.4 terabits per second.

“Our demonstration of 10 Gbps over copper is a prime example: by pushing broadband technology to its limits, operators can determine how they could deliver gigabit services over their existing networks, ensuring the availability of ultra-broadband access as widely and as economically as possible,”  Marcus Weldon, president of Bell Labs, said in a statement.

But the short distances over which XG-FAST operates in the lab may not be enough to deliver faster Internet over copper to those outside of dense, urban environments. Chris Green, a principal technology analyst at the Davies Murphy Group consultancy, told BBC News that in small towns and especially rural locations, the distance from the street cabinet to the home would still likely render this latest breakthrough impractical.

“The problem that rural properties have is that they are usually very far away from the nearest telephone exchange,” he told BBC News. “You can usually measure it in miles.”


Graphic: Bell Labs

Hackaday Prize Competition Gears Up

The World Cup is over. But we engineering nerds will still have the Hackaday Prize competition to entertain us.

After all, we really enjoyed the thrill of the Ansari X Prize competition for nongovernmental flights into space, which ran from 1996 until 2004, when Burt Rutan and his colleagues claimed it. And there was the American Helicopter Society’s Sikorsky Prize for a human-powered helicopter. The prize, which was established in 1980 and long remained a tantalizing challenge to athletic aeronauts, was finally awarded almost exactly a year ago.

Such engineering-design competitions are indeed great fun to follow. But unless you’re part of a well-funded and well-organized team, participation is out of the question. The Hackaday Prize is different, because any avid DIYer can throw his or her hat into the ring. All you need to do is come up with an idea for “an open, connected device” and describe the design by 4 August 2014. Then if you make the first cut, you’ll have until 29 September to build the hardware and be in the running for the grand prize: a trip into space, which will be awarded in Munich in November.

Wait, how is Hackaday going to send somebody into space? I for one wouldn’t want to ride in a rocket they built!

Mike Szczys, managing editor at Hackaday, explains that they are not erecting some sort of Jules Verne cannon over there; they are offering to buy the winner a ticket to space at some point in the future, after commercial flights become available. And that might not take long: Virgin Galactic (an outgrowth of the aforementioned Ansari X Prize competition) hopes to begin its commercial sub-orbital operations sometime this year.

Even if you’re not keen on rocketing to the Kármán line, if you win, you can collect a cash award of US $196,418 instead—which, as Szczys explains, was chosen because it’s a Fibonacci number somewhat shy of what he and his colleagues expect a ticket into space will cost.
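
The number checks out: a few lines of Python confirm that 196,418 sits in the Fibonacci sequence.

```python
def fibs_up_to(limit):
    """Generate Fibonacci numbers (1, 1, 2, 3, 5, ...) up to limit."""
    a, b = 1, 1
    while a <= limit:
        yield a
        a, b = b, a + b

seq = list(fibs_up_to(200_000))
print(196_418 in seq)  # True
print(seq[-3:])        # [75025, 121393, 196418]
```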

What sort of gizmos qualify for the Hackaday Prize? You can read up on all the details, but the requirements really only call for something that is open and connected, which is easy enough to satisfy. Of course, it would have to be really cool to win, place, or show. And the competition is bound to be brutal.

Current entries include a frequency-modulated continuous-wave radar (a much slicker version of the MIT coffee-can radar, which I described in 2012) and a 3D-printable Raman spectrometer controlled by a Raspberry Pi. My favorite at the moment, though, is a homebrew proton-precession magnetometer, something I once tried (and failed) to hack together myself.

So if you’re a DIYer who has always wanted to enter an engineering-design competition but were never in a position to do so, now’s your chance. Gentlemen (and ladies), start your soldering irons. And may the best hack win.

How D-Wave Built Quantum Computing Hardware for the Next Generation

Photo: D-Wave Systems

One second is here and gone before most of us can think about it. But a delay of one second can seem like an eternity in a quantum computer capable of running calculations in millionths of a second. That's why engineers at D-Wave Systems worked hard to eliminate the one-second computing delay that existed in the D-Wave One—the first-generation version of what the company describes as the world's first commercial quantum computer.

One Atom + Two Photons = Quantum Computing Switch

A scheme that uses a single atom to switch the direction of a single photon could pave the way toward quantum computers much more powerful than today’s machines.

The setup is described this week in the online issue of Science by researchers from the Weizmann Institute of Science in Rehovot, Israel. In simple terms, the atom can be in one of two states, either “left” or “right.” If the atom is in the left state, a photon that strikes it from the left will continue on in the same direction, as if it hadn’t hit the atom at all. A photon coming from the right, however, will be reflected back in the direction it came from, and at the same time the interaction will cause the atom to flip from left to right. Left and right can stand in for the 1s and 0s of digital logic.

Barak Dayan, head of the Weizmann Quantum Optics group, says the basic principle at work is interference. In one direction the photon does not interact with the atom and so continues in the same direction. But in the other direction, there is destructive interference between the incoming photon and the outgoing emission from the atom in the direction of travel, so the only direction the light can travel is back from whence it came. Each such reflection toggles the state of the atom.
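
The switching behavior reduces to a simple rule. Here is a toy state machine capturing just the logic from the description above (none of the underlying interference physics is modeled, and the state names are this sketch's own shorthand):

```python
def photon_hits(atom_state, photon_from):
    """Apply the described switch rule and return (new_atom_state, outcome).

    atom_state:  'left' or 'right'
    photon_from: 'left' or 'right', the side the photon arrives from
    """
    if atom_state == photon_from:
        # Matching side: the photon passes through as if the atom weren't there.
        return atom_state, "transmitted"
    # Opposite side: the photon reflects back, and the reflection
    # toggles the atom's state.
    flipped = "left" if atom_state == "right" else "right"
    return flipped, "reflected"

state = "left"
state, outcome = photon_hits(state, "right")
print(state, outcome)  # right reflected
```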

One type of quantum computer under development uses the electrical states of ions as the bits, or rather qubits, that make up their logic. The difficulty, Dayan says, is that to communicate the state of one atom to another, the atoms either have to be maneuvered to be next to each other or the state has to be transferred from ion to ion one at a time to its final destination, in an atomic version of “Post Office.” Using light instead, he says, means any qubit can share its information with any other, regardless of how far apart they are, thus simplifying the system.

The group’s next step will be to impose on the photons quantum information such as superposition, essentially getting them to be both right and left at the same time. It’s that ability to hold multiple states at once, instead of just 1 or 0, that promises to make quantum computers much more powerful than digital computers. Dayan hopes his group can achieve that goal in a few months.

Fresh Hope for an Abandoned NASA Spacecraft

News of ISEE-3’s demise may be a bit premature. A months-long bid to bring the 35-year-old International Sun-Earth Explorer 3 back into the vicinity of Earth seemed to have met its end this week, when the citizen group attempting to “reboot” the mission failed to get the thrusters to produce much more than a burp.

Tests done on Wednesday during a communications session with the Arecibo telescope seemed to suggest that the culprit was a lack of “pressurant." The spacecraft appears to have run out of the nitrogen gas it used to force hydrazine fuel from the tanks and into the thrusters.

But the reboot team, led by editor Keith Cowing and entrepreneur Dennis Wingo, CEO of California-based Skycorp Incorporated, isn’t quite ready to give up. One of the project volunteers has suggested that perhaps the nitrogen isn’t actually gone. It may in fact still be there, but dissolved in with the hydrazine.

Motion Capture Technology Goes Into the Wild for Dawn of the Planet of the Apes

The men and women who survived a deadly virus that wiped out much of Earth's human population hunker down amidst the ruins of San Francisco; meanwhile, a growing ape population has built a lovely and thriving community outside of San Francisco in Muir Woods. The tension between the two societies drives the action-packed Dawn of the Planet of the Apes, the sequel to the 2011 Rise of the Planet of the Apes that starred James Franco.

As this sequel begins, Franco's character has been dead for a decade, and the apes have had plenty of time to create their version of civilization. But in real time, it's been just three years since the Rise movie. In the world of motion picture technology, though, that's an eternity. Long enough to create computer graphics gear robust enough to take out of the studio and deep into a real forest. And long enough that moviemakers no longer need to give a recognizable Hollywood star top billing to bring in audiences.

In fact, if you passed the leading man of Dawn—Andy Serkis—on the street, you wouldn't recognize his face at all, for you never see it on the screen. That's because his performance in the woods (actually, forests near Vancouver, not San Francisco) wasn't filmed traditionally; it was motion captured and used as a framework for a computer-created realistic digital ape, Caesar. And, for the first time to my knowledge, it's the performances of the motion capture actors, not the regular actors portraying humans, that are getting all the good reviews from critics; there is even talk of the first best-actor Oscar nomination for a motion-capture performance.

Contraceptive Implant Hands Women Remote Control

Women may soon bid farewell to birth control pills and welcome a new type of contraception in the form of microchip implants. An MIT startup backed by the Bill & Melinda Gates Foundation plans to start pre-clinical testing for the birth control chip next year and pave the way for a possible market debut in 2018.

Can The Human Brain Project Succeed?

An ambitious effort to build human brain simulation capability is meeting with some very human resistance. On Monday, a group of researchers sent an open letter to the European Commission protesting the management of the Human Brain Project, one of two Flagship initiatives selected last year to receive as much as €1 billion over the course of 10 years (the other award went to a far less controversy-courting project devoted to graphene).

The letter, which now has more than 450 signatories, questions the direction of the project and calls for a careful, unbiased review. Although he’s not mentioned by name in the letter, news reports cited resistance to the path chosen by project leader Henry Markram of the Swiss Federal Institute of Technology in Lausanne. One particularly polarizing change was the recent elimination of a subproject, called Cognitive Architectures, as the project made its bid for the next round of funding.

According to Markram, the fuss all comes down to differences in scientific culture. He has described the project, which aims to build six different computing platforms for use by researchers, as an attempt to build a kind of CERN for brain research, a means by which disparate disciplines and vast amounts of data can be brought together. This is a "methodological paradigm shift" for neuroscientists accustomed to individual research grants, Markram told Science, and that's what he says the letter signers are having trouble with.

But some question the main goals of the project, and whether we're actually capable of achieving them at this point. The program's Brain Simulation Platform aims to build the technology needed to reconstruct the mouse brain and eventually the human brain in a supercomputer. Part of the challenge there is technological. Markram has said that an exascale-level machine (one capable of executing 1000 or more petaflops) would be needed to "get a first draft of the human brain", and the energy requirements of such machines are daunting.

Crucially, some experts say that even if we had the computational might to simulate the brain, we're not ready to. "The main apparent goal of building the capacity to construct a larger-scale simulation of the human brain is radically premature," signatory Peter Dayan, who directs a computational neuroscience department at University College London, told the Guardian. He called the project a "waste of money" that "can't but fail from a scientific perspective". To Science, he said "the notion that we know enough about the brain to know what we should simulate is crazy, quite frankly.”

This last comment resonated with me, as it reminded me of a feature that Steve Furber of the University of Manchester wrote for IEEE Spectrum a few years ago. Furber, one of the co-founders of the mobile chip design powerhouse ARM, is now in the process of stringing a million or so of the low-power processors together to build a massively parallel computer capable of simulating 1 billion neurons, about 1% as many as are contained in the human brain.
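
Using the article's own numbers, the per-core load on Furber's machine is easy to estimate with a quick back-of-envelope check:

```python
cores = 1_000_000        # "a million or so" low-power ARM processors
neurons = 1_000_000_000  # 1 billion simulated neurons

# Each core must handle on the order of a thousand neurons.
print(neurons / cores)  # 1000.0

# If 1 billion neurons is about 1% of the human brain, the brain
# has roughly 100 billion neurons by the article's figures.
print(neurons / 0.01)   # 100000000000.0
```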

Furber and his collaborators designed their computing architecture quite carefully in order to take into account the fact that there's still a host of open questions when it comes to basic brain operation. General-purpose computers are power-hungry and slow when it comes to brain simulation. Analog circuitry, which is also on the Human Brain Project's list, might better mimic the way neurons actually operate, but, he wrote,

“as speedy and efficient as analog circuits are, they’re not very flexible; their basic behavior is pretty much baked right into them. And that’s unfortunate, because neuroscientists still don’t know for sure which biological details are crucial to the brain’s ability to process information and which can safely be abstracted away.”

The Human Brain Project's website admits that exascale computing will be hard to reach: "even in 2020, we expect that supercomputers will have no more than 200 petabytes." To make up for the shortfall, it says, "what we plan to do is build fast random-access storage systems next to the supercomputer, store the complete detailed model there, and then allow our multi-scale simulation software to call in a mix of detailed or simplified models (models of neurons, synapses, circuits, and brain regions) that matches the needs of the research and the available computing power. This is a pragmatic strategy that allows us to keep building ever more detailed models, while keeping our simulations to the level of detail we can support with our current supercomputers."
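
The mixed-detail strategy the project describes might look something like the following sketch. Everything here is invented for illustration (the region names, the cost units, and the function itself); it is not the project's actual software:

```python
def plan_simulation(region_priority, flops_budget, flops_detailed):
    """Hypothetical sketch of a mixed-detail strategy: spend detailed-model
    compute on the highest-priority brain regions for a given study, and
    fall back to simplified models for the rest (treated as free here).
    """
    plan, remaining = {}, flops_budget
    # Highest-priority regions get first claim on the compute budget.
    for region in sorted(region_priority, key=region_priority.get, reverse=True):
        if remaining >= flops_detailed:
            plan[region] = "detailed"
            remaining -= flops_detailed
        else:
            plan[region] = "simplified"
    return plan

plan = plan_simulation({"hippocampus": 3, "cortex": 2, "cerebellum": 1},
                       flops_budget=10, flops_detailed=6)
print(plan)  # only the hippocampus fits a detailed model in this budget
```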

This does sound like a flexible approach. But, as is par for the course with any ambitious research project, particularly one that involves a great amount of synthesis of disparate fields, it's not yet clear whether it will pay off. 

And any big changes in direction may take a while. Although the proposal for the second round of funding will be reviewed this year, according to Science, which reached out to the European Commission, the first review of the project itself won't begin until January 2015.

Rachel Courtland can be found on Twitter at @rcourt.

DARPA Wants a Memory Prosthetic for Injured Vets—and Wants It Now

No one will ever fault DARPA, the Defense Department's mad science wing, for not being ambitious enough. Over the next four years, the first grantees in its Restoring Active Memory (RAM) program are expected to develop and test prosthetic memory devices that can be implanted in the human brain. It's hoped that such synthetic devices can help veterans with traumatic brain injuries, and other people whose natural memory function is impaired. 

The two teams, led by researchers Itzhak Fried at UCLA and Mike Kahana at the University of Pennsylvania, will start with the fundamentals. They'll look for neural signals associated with the formation and recall of memories, and they'll work on computational models to describe how neurons carry out these processes, and to determine how an artificial device can replicate them. They'll also work with partners to develop real hardware suitable for the human brain. Such devices should ultimately be capable of recording the electrical activity of neurons, processing the information, and then stimulating other neurons as needed. 

Boeing Gets $2.8 Billion to Help Build World's Most Powerful Rocket

The most powerful rocket ever built will use four space shuttle engines and two solid rocket boosters to propel NASA astronauts to Earth orbit and beyond—and that translates into a lot of rocket fuel. The U.S. space agency recently finalized a $2.8 billion contract with Boeing to build the rocket's core stage, which will contain the hundreds of metric tons of liquid hydrogen and oxygen needed to fuel the four main engines.

Tech Talk

IEEE Spectrum’s general technology blog, featuring news, analysis, and opinions about engineering, consumer electronics, and technology and society, from the editorial staff and freelance contributors.
