Tech Talk

Amazon's Jeff Bezos Debuts Spacecraft in First Flight Test

The commercial spaceflight race continues to heat up as Amazon founder Jeff Bezos announced his Blue Origin spaceflight company’s first flight test of a new space vehicle this week. The New Shepard vehicle marks Blue Origin’s attempt to create a fully reusable rocket system capable of both vertical takeoff and landing.

Five Things You Might Not Know About Moore’s Law

In the 50 years since Gordon Moore published his prediction about the future of the integrated circuit, the term “Moore’s Law” has become a household name. It’s constantly paraphrased, not always correctly. Sometimes it’s used to describe modern technological progress as a whole.

As IEEE Spectrum put together its special report celebrating the semicentennial, I started a list of key facts that are often overlooked when Moore’s Law is discussed. Here they are (sans animated GIFs):

1. Moore’s Law changed over time. Gordon Moore originally predicted that the complexity of integrated circuits—and so the number of components on them—would double every year. In 1975, he revised his prediction to a doubling every two years (see the arithmetic sketch after this list).

2. It’s not just about smaller, faster transistors. At its core, Moore’s prediction was about the economics of chipmaking, building ever-more sophisticated chips while driving down the manufacturing cost per transistor. Miniaturization has played a big role in this, but smaller doesn’t necessarily mean less expensive—an issue we’re beginning to run into now. 

3. At first, it wasn’t just about transistors. Moore’s 1965 paper discussed components, a category that includes not just transistors, but other electronic components, such as resistors, capacitors, and diodes. As lithographer Chris Mack notes, some early circuits had more resistors than transistors.

4. The origin of the term “Moore’s Law” is a bit murky. Carver Mead is widely credited with coining the term “Moore’s Law”, but it’s unclear where it came from and when it was first used. 

5. Moore’s Law made Moore’s Law. Silicon is a remarkable material, but maintaining Moore’s Law for decades has been hard work, and it’s getting harder. As historian Cyrus Mody argues, the idea of Moore’s Law kept Moore’s Law going: it has long been a coordinating concept and common goal for the widely distributed efforts of the semiconductor industry.
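
A quick bit of arithmetic shows how much that 1975 revision matters. Here’s a minimal sketch comparing the two schedules; the 1965 starting count of 64 components is a round number chosen for illustration, not one of Moore’s actual data points:

```python
# Compound an illustrative 64-component chip from 1965 under Moore's
# original yearly doubling and his 1975 two-year revision. The starting
# count is a round number for demonstration, not Moore's actual data.
def components(year, start_year=1965, start_count=64, doubling_years=1.0):
    return start_count * 2 ** ((year - start_year) / doubling_years)

for year in (1975, 1985, 1995):
    yearly = components(year, doubling_years=1.0)    # original prediction
    two_year = components(year, doubling_years=2.0)  # 1975 revision
    print(f"{year}: yearly doubling -> {yearly:,.0f} components; "
          f"two-year doubling -> {two_year:,.0f}")
```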

Virtual Reality Pioneer Looks Beyond Entertainment

Anyone who wants to learn how to use virtual reality to hack the human brain usually ends up visiting Jeremy Bailenson, founding director of the Virtual Human Interaction Lab at Stanford University.

Bailenson has received visits from heads of state, the U.S. military, and NASA during a career that spans almost two decades. But Bailenson recently observed that his VR lab’s technological capabilities are rapidly becoming obsolete as leading technology companies such as Oculus VR, Samsung, Google, Valve, Sony, and Microsoft compete to develop virtual reality headsets that can entertain the masses.

DARPA's Self-Steering EXACTO Bullets Home in on Moving Targets

Bullets are dumb. Really dumb. This is a problem, because bullets spend a significant portion of their actively useful life in very rapid transit from one place to another without paying the least amount of attention to what’s going on around them while they do so. If you or I behaved in such an ignorant manner while traveling from place to place, we’d almost certainly get run over by a bus.

In order to not get run over by a bus, those of us who are cleverer than bullets do our best to be aware of our surroundings while we travel, compensating for changes in both our environment and our destination. DARPA has managed to imbue bullets with a similar level of intelligence, allowing them to steer themselves to a moving target while dynamically adjusting for whatever sorts of things might send them off-target, such as crosswinds or the lousy aim of whoever pulled the trigger.
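
DARPA hasn’t published EXACTO’s guidance law, but the general flavor of steering toward a moving target can be illustrated with proportional navigation, a classic guidance technique that commands turns in proportion to how fast the line of sight to the target is rotating. Here’s a hypothetical 2-D sketch of that idea; every value in it is invented for illustration:

```python
# Hypothetical 2-D proportional-navigation sketch. DARPA has not
# disclosed EXACTO's actual guidance law; this only illustrates the
# general idea of homing on a moving target by nulling the rotation
# of the line of sight. Every number here is invented.
import math

N = 4.0                       # navigation gain (typical textbook value)
dt = 0.001                    # simulation time step, seconds
px, py = 0.0, 0.0             # projectile position, meters
vx, vy = 800.0, 0.0           # projectile velocity, m/s
tx, ty = 1000.0, 30.0         # target position, meters (off to one side)
tvx, tvy = 0.0, 5.0           # target velocity, m/s (drifting sideways)

los_prev = math.atan2(ty - py, tx - px)   # initial line-of-sight angle
closest = float("inf")

for step in range(5000):
    tx += tvx * dt; ty += tvy * dt        # target moves
    px += vx * dt; py += vy * dt          # projectile moves

    dx, dy = tx - px, ty - py
    dist = math.hypot(dx, dy)
    closest = min(closest, dist)
    if dist < 1.0:                        # within a meter: call it a hit
        print(f"hit at t = {step * dt:.3f} s")
        break

    # Proportional navigation: lateral acceleration proportional to
    # the line-of-sight rotation rate times the projectile's speed.
    los = math.atan2(dy, dx)
    los_rate = (los - los_prev) / dt
    los_prev = los
    speed = math.hypot(vx, vy)
    a_lat = N * speed * los_rate

    # Apply that acceleration perpendicular to the velocity vector.
    ux, uy = -vy / speed, vx / speed
    vx += a_lat * ux * dt
    vy += a_lat * uy * dt
else:
    print(f"no hit; closest approach was {closest:.1f} m")
```

The same closed loop corrects for anything that perturbs the trajectory, whether a crosswind or a bad initial aim, because the guidance only ever reacts to where the target appears to be drifting.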

Now Any Team Can Buy the Performance Analysis Engine that Helped Germany Win World Cup

Performance analysis software that helped Germany win the 2014 soccer World Cup will soon be available to sports clubs all over the world.

On Monday, SAP unveiled its Sports One solution at Bayern Munich’s Allianz Arena. Sports One is a sport-specific, cloud-based unified platform for managing things like business operations and fan engagement, both already used by Bayern, which won its twenty-fifth Bundesliga title last weekend.

IBM Shows First Full Error Detection for Quantum Computers

Quantum computers must overcome the challenge of detecting and correcting quantum errors before they can fulfill their promise of sifting through millions of possible solutions much faster than classical computers. 

“With our recent four-qubit network, we built a system that allows us to detect both types of quantum errors,” says Jerry Chow, manager of experimental quantum computing at IBM’s Thomas J. Watson Research Center, in Yorktown Heights, N.Y. Chow, who along with his IBM colleagues detailed the experiments in the 29 April issue of the journal Nature Communications, says, “This is the first demonstration of a system that has the ability to detect both bit-flip errors and phase errors” in quantum computing systems.
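
The principle behind those two checks can be seen in a toy calculation. The numpy sketch below is far simpler than IBM’s four-qubit square lattice, which reads out the parities with dedicated ancilla qubits, but the logic is the same: a two-qubit Bell state is a +1 eigenstate of both the ZZ and XX parity operators, a bit-flip (X) error flips the ZZ parity, and a phase-flip (Z) error flips the XX parity, so the pair of checks distinguishes the two error types.

```python
# Toy numpy illustration of distinguishing bit-flip and phase-flip
# errors with parity checks. This is far simpler than IBM's four-qubit
# lattice (which reads out the parities with ancilla qubits), but the
# logic is the same: one check catches X errors, the other catches Z.
import numpy as np

I = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])    # bit-flip error
Z = np.array([[1.0, 0.0], [0.0, -1.0]])   # phase-flip error

# Bell state (|00> + |11>)/sqrt(2): a +1 eigenstate of both ZZ and XX.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

ZZ = np.kron(Z, Z)
XX = np.kron(X, X)

def parities(state):
    """Expectation values of the two stabilizer checks."""
    return state @ ZZ @ state, state @ XX @ state

print("no error:   ZZ, XX =", parities(psi))                  # (+1, +1)
print("bit flip:   ZZ, XX =", parities(np.kron(X, I) @ psi))  # (-1, +1)
print("phase flip: ZZ, XX =", parities(np.kron(Z, I) @ psi))  # (+1, -1)
```

In real hardware the parities can’t be computed from a known state vector, of course; they have to be measured without disturbing the encoded information, which is what the extra qubits in IBM’s lattice are for.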

How Oculus Story Studio Learned Storytelling in Virtual Reality

Telling stories in virtual reality requires a new storytelling language. Veterans of legendary animation studio Pixar discovered that hard truth when they first founded what has become Oculus Story Studio. Their current recipe for virtual reality storytelling borrows liberally from Hollywood’s tried-and-true cinematic techniques, video game interactivity, and even live theater experiences.

The Murky Origins of “Moore’s Law”

When I called up Carver Mead in preparation for the 50th anniversary of Moore’s Law, I was eager to track down a particularly elusive detail: the very first time the term “Moore’s Law” was used.

The Caltech professor, now retired, is often credited with coining the term. This happened around 1970, it’s often said, but I couldn’t find evidence to back up that claim. I’m hardly the only one. In a chapter of an excellent book celebrating 40 years of Moore’s Law, historian David Brock has stated that the origins of the phrase “remain murky.”

I thought I might as well ask the man himself while I had him on the phone. And I got very excited when Mead seemed to remember it like it was yesterday.

He told me the term first popped up during an interview he did in the late 1960s or early 1970s with Electronics writer Larry Waller, and that it appeared in print very shortly after. Mead wasn’t sure whether it was he or Waller who coined the term. “It sort of bubbled up in the discussion,” Mead told me, “and then it came out in the article and it stuck.”

I later found that Mead tells a similar story in Ashlee Vance’s friendly guide to Silicon Valley. I thought I must be on to something. Somewhere, buried in an old issue of Electronics magazine, was the first mention of the term “Moore’s Law.” And it seemed it might be years earlier than the earliest citation given in the Oxford English Dictionary, which is to a 1977 article in Science. 

I spent a long wintry Saturday afternoon at the New York Public Library zipping through back issues of Electronics on microfilm and came up completely dry. 

So I tracked down Larry Waller to see if Carver Mead’s recollection aligned with his own. Waller said he didn’t start at Electronics until 1975, and that he had conversations with Mead around 1979 and 1980. At that time, Waller said, he was talking to Mead about Mead’s work in circuit design automation, and the term Moore’s Law was already kicking around. “It was just so pervasive I never knew who was credited with saying it first,” he told me. “I know Carver and I talked about it, but I just don’t remember the context.”

I went back to Mead to tell him what I’d found, and he said he really thought the interview happened much earlier, in the early 1970s. “It must not have been Larry, and maybe not Electronics,” he wrote. “I am really sorry I don’t have a copy of that interview.” A helpful engineering librarian at Caltech hunted through academic journals and popular magazines of the period to see if something might turn up, but she couldn’t find anything that seemed to fit the bill.

Somewhere floating out there, there could be such an article (and please tell me if you find it). But regardless of where the phrase “Moore’s Law” first turned up and who said it first, it seems fair to say that Mead was instrumental in putting the concept of Moore’s Law on the map, as he worked to convince people that gains in integrated circuit technology would continue for a long time to come. 

“As Mead traveled throughout the silicon community in the early 1970s, he succeeded in building a belief in a long future for the technology, using Moore’s plots as convincing evidence. In doing so, Mead also played a key role in fusing Moore’s law with this belief in the future of electronics and building an expanding awareness of both,” Brock has written. “While Mead may not have been the originator of the phrase Moore’s law (its precise origins remain murky), he undoubtedly acted as its charismatic Johnny Appleseed.”

As contributions go, that’s nothing to sniff at. And it was far from Mead’s only contribution to modern electronics.

A New Bionic Eye: Infrared Light-Powered Retina Implant Coming

Writers are going to need a new metaphor. For centuries, “bringing eyesight to the blind” signaled something miraculous. But with the first visual prosthetic now on the market and a number of others close behind, curing a person of blindness may soon seem like less of a miracle and more of a routine medical correction.  

At the IEEE Neural Engineering meeting in Montpellier, France, last week, researchers described their progress toward this goal. In one talk, a Stanford scientist described a clever visual prosthetic that’s photovoltaic, thus doing away with batteries or bulky recharging systems. The tech is being commercialized by the French company Pixium Vision, with clinical trials scheduled for 2016.

The 100-millimeter-square chip sits behind the retina, the part of the eye that contains the photoreceptor cells that respond to the light of the world by triggering electric pulses in other cells. Those pulses are part of a chain reaction that sends information up the optic nerve to the brain. In certain retinal diseases, the photoreceptor cells die off, but the remaining relay cells are undamaged. Different visual prostheses target different cells within this system for electrical stimulation.  

Henri Lorach (from Daniel Palanker’s lab at Stanford) says his team’s advance is in using the same light signal to both transmit the image of the outside world and to power the implanted chip. The most advanced version of the chip has 70-micron pixels, each of which includes photodiodes and a stimulating electrode. “We cannot use ambient light to power these devices, because it’s not strong enough,” Lorach said, “so we use high-powered infrared light.” 

When this system is tested in humans, the subjects will wear goggles containing a recording camera. A connected “pocket processor” will convert that recording into an infrared image, which the goggles will then beam into the eye. The chip receives the pattern and stimulates the underlying cells accordingly. In testing on rats, the researchers determined that neurons in the brain respond to this stimulation in much the same way they respond to natural light, and that the power of the infrared light necessary to induce that reaction was well below the safety threshold.
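
Pixium’s actual image processing is proprietary, so purely as a hypothetical illustration of the pocket-processor step, the sketch below block-averages a grayscale camera frame down to a small grid of stimulation levels; the array size and the normalization are invented for the example:

```python
# Hypothetical sketch of the "pocket processor" step: map a camera
# frame onto per-pixel stimulation levels for a small implant array.
# The real Pixium processing is proprietary; the array size, block
# averaging, and normalization here are invented for illustration.
import numpy as np

def frame_to_stimulation(frame, array_shape=(20, 20)):
    """Block-average a grayscale frame down to an implant-sized grid."""
    ah, aw = array_shape
    h, w = frame.shape
    trimmed = frame[: h - h % ah, : w - w % aw]   # make dims divisible
    blocks = trimmed.reshape(ah, trimmed.shape[0] // ah,
                             aw, trimmed.shape[1] // aw)
    levels = blocks.mean(axis=(1, 3))             # one value per implant pixel
    # Scale to [0, 1]; the goggles would turn this into the infrared
    # intensity beamed at each photovoltaic pixel.
    span = levels.max() - levels.min()
    return (levels - levels.min()) / (span if span else 1.0)

camera_frame = np.random.rand(480, 640)  # stand-in for a real camera frame
print(frame_to_stimulation(camera_frame).shape)  # -> (20, 20)
```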

Lorach’s team also got promising results when it came to visual acuity. Their rats achieved a vision level that translates to 20/250 in humans, which means the person would probably be able to read the top letter on an eye chart, but none of the letters below. With the next-generation device, Lorach said, “we’re working to get to 20/120, which would be below the limit of legal blindness” in the United States.

These results signal an impressive leap forward. Second Sight, the company that got FDA approval for the first visual prosthesis in 2013, currently offers patients about 20/1300 vision. The German company Retina Implant AG, whose system has been approved by European regulators, offers about 20/500.

Australia’s Bionic Vision is planning a clinical trial of its technology in the next year, said researcher Nigel Lovell at the Neural Engineering meeting. Lovell and other speakers also emphasized the need to study the code of electric pulses by which the eye’s cells transmit information, in hopes of dramatically improving the crude vision produced by current prosthetic devices.

Computer Models Show Terror Birds Hunted by Sound

Terror birds, one of South America’s most feared prehistoric predators, didn’t use super sharp eyesight to catch prey, as do eagles, hawks, and other modern-day raptors. Using sophisticated computed tomography X-ray scans and 3-D modeling software, researchers were able to show that these flightless giants actually hunted by listening to their quarry’s footsteps.

Terror birds, or phorusrhacids, to give them their scientific name, were top predators in South America for 50 million years after the dinosaurs died out, finally going extinct 1.8 million years ago. Terror bird fossils have also been discovered in North America, Africa, and Europe. These feathered killers stood from 1 to 3 meters tall. They had huge hooked beaks, and the biggest ones could use them to kill with one gigantic stab.

In 2010, paleontologists discovered the almost complete skeleton of a new species of terror bird near Mar del Plata in Argentina. They named it Llallawavis scagliai, after Galileo Juan Scaglia, one of Argentina’s most celebrated naturalists. Scaglia’s grandson, Fernando, was part of the discovery team.

Subsequent analysis revealed that this bird would have stood 1.2 meters tall and weighed around 18 kilograms. The remains were so well preserved, moreover, that a team led by Federico Degrange, assistant researcher of vertebrate paleontology at the Universidad Nacional de Córdoba, was also able to study the bird’s auditory system.

Degrange used a high-speed medical CT scanner to produce detailed 2-D images of the bird’s inner ear. CT scans enable scientists to see minute detail inside fossils. These images were then transformed into 3-D segmented models using Materialise Mimics, 3-D imaging software. Mimics (Materialise Interactive Medical Image Control System) uses a medical imaging algorithm called “marching cubes” that takes partial volume into account to produce more accurate 3-D models.
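
Mimics itself is proprietary, but the marching-cubes idea can be demonstrated with an open implementation. The sketch below uses scikit-image’s marching_cubes on a synthetic spherical volume standing in for real CT data:

```python
# Marching-cubes surface extraction in the same spirit as what Mimics
# does, sketched with scikit-image on a synthetic volume. This is NOT
# the Mimics pipeline; the spherical test "scan" and the threshold are
# invented for illustration.
import numpy as np
from skimage import measure

# Synthetic CT-like volume: a 64^3 grid whose value falls off with
# distance from the center, so the "object" is a fuzzy sphere.
grid = np.linspace(-1.0, 1.0, 64)
x, y, z = np.meshgrid(grid, grid, grid, indexing="ij")
volume = 1.0 - np.sqrt(x**2 + y**2 + z**2)

# Marching cubes walks every 2x2x2 cell of voxels, classifies each
# corner as inside or outside the threshold, and emits triangles where
# the surface crosses the cell, interpolating vertex positions between
# voxel values (that sub-voxel interpolation is what accounts for
# partial volume).
verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)

print(f"{len(verts)} vertices, {len(faces)} triangular faces")
```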

The results were published in the Journal of Vertebrate Paleontology. Degrange explains that he based his deductions about Llallawavis’ hearing capabilities on the length of the terror bird’s cochlea.

“We calculated that it would have had a mean hearing range of approximately 3800 Hz and a mean hearing sensitivity of approximately 2300 Hz,” he says. Mean hearing range is the average difference between the highest and lowest frequencies that the bird could have heard. Hearing sensitivity is the frequency at which the bird’s hearing was most acute.

Llallawavis would have been much better at hearing low-frequency sounds than humans are; we hear best between 4,000 and 5,000 Hz. The ability to hear low frequencies would also have helped it locate and track the small mammals and birds it preyed upon: the bird would have been able to hear footsteps even if the prey animal was hidden in the undergrowth. Crocodiles hear low-frequency sounds. So, too, did Tyrannosaurus rex.

Degrange also thinks that Llallawavis would have had a deep booming voice, a bit like an ostrich. “This is plausible to hypothesize because the vocalization range of most birds falls within the lower half of their hearing sensitivity range,” he says.

Degrange admits that he doesn’t know about the bird’s voice for sure, though. The tracheobronchial syrinx, the structure that produces sound in birds, was missing from the remains. This structure is made mainly of cartilage and didn’t survive 2.5 million years in the ground.

Degrange is currently studying the terror bird’s eye bones, braincase, and skull. He intends to discover more about its vision and other senses, and from this deduce whether Llallawavis was active during the day, at night, or during twilight hours. This, he hopes, could provide clues as to why terror birds died out.
