Tech Talk

The Google Lunar XPrize Race Is Officially On

Today, the Google Lunar XPrize (GLXP) announced that Israeli team SpaceIL is the first to sign a verified launch contract that covers the first leg of its journey to the moon. The educational nonprofit’s spacecraft is slated to launch on a SpaceX Falcon 9 rocket in the second half of 2017.

This is huge news for the GLXP, which is offering at least US $20 million to the first private team to land on the moon and perform certain tasks. The lunar deadline is currently set at the end of 2017, but a more pressing deadline has been looming for quite some time.

Until now, the contest was just months away from what could have been its real expiration date. Under the rules as they stood, if no team showed evidence of a launch agreement by the end of 2015, the contest would be over.

With SpaceIL’s contract, at least one team is now officially in the running for the prize. “It really is the new space race now,” GLXP’s senior director Chanda Gonzales says.

As I explained earlier this year, finding a way to get off the ground has been a big challenge for GLXP competitors. Rocket berths are expensive, and while a team could potentially launch more cheaply as a secondary payload, it’s a tough pitch when you’re proposing to add a spacecraft loaded with potentially explosive rocket fuel.

“It’s more than tricky,” says SpaceIL CEO Eran Privman. “We found it almost impossible.” Privman explains that the team at one point had been working on securing a launch with Russia. “There we managed to get an initial agreement, but it failed due to geopolitical issues.” Attempts to secure a berth as a secondary also failed, he says, so “we decided to change the game.”

In the end, the team signed a contract with Seattle-based launch broker Spaceflight. SpaceIL’s craft will launch in a cluster containing more than 20 payloads, each contracting with Spaceflight. The launch will cost SpaceIL more than US $10 million, Privman says. (It’s not clear how much more than $10 million the actual fee is, but it’s likely a fraction of the oft-quoted “front-door price” for a whole Falcon 9, some $60 million.)

According to the revised GLXP rules laid out in May, any remaining teams who wish to compete must provide notification of a launch contract by the end of next year. But it may not take that long before we see other teams join the starting line.

Last week, aspiring lunar mining company Moon Express announced that it has signed a launch contract with Rocket Lab, a New Zealand-based start-up that’s hoping to drive down the cost of space access. 

The contract is for five launches, totaling about US $30 million, Daven Maharaj, Moon Express vice president of operations, told me on Friday.

Moon Express hopes to launch on Rocket Lab’s Electron rocket, which is expected to begin test launches shortly. Moon Express has reserved two Electron launches in 2017, which could be a good move if there are any issues with the new rocket. “There is definitely some risk there,” Maharaj says. “That’s one of the advantages—by picking two slots we’re basically guaranteeing ourselves a scheduled relaunch.”

The GLXP has yet to receive the launch contract documentation from Moon Express, Gonzales says. When it does, the agreement will be evaluated as the SpaceIL one was, she says, on both financial and technical terms. 

Of course, even with a more established company, no flight can be guaranteed, and no flight date is set in stone. SpaceX is still working on resuming flights following a launch failure in June. “I believe that [the delay] shouldn’t affect the manifest in the second half of 2017,” SpaceIL’s Privman says. “But you know that predicting is very unpredictable.”

Even with these uncertainties, it’s exciting to see the Google Lunar XPrize move into this new stage. The competition was first announced in 2007, and its original deadline for claiming the full top prize was 2012. It’s been a long road. Perhaps one day soon it will end with a few new landers dipping their feet in lunar regolith.

Follow Rachel Courtland on Twitter at @rcourt.

Tunnel Transistor May Meet Power Needs of Future Chips

A new kind of transistor consumes 90 percent less power than conventional transistors, dramatically exceeding a theoretical limit for electronics, researchers say. These findings could one day lead to super-dense low-power circuits as well as ultra-sensitive biosensors and gas sensors, the investigators added.

The relentless advance of computing power over the past half-century has relied on constant miniaturization of field-effect transistors (FETs), which serve as the building blocks of most microchips. Transistors act like switches that flick on and off to represent data as zeroes and ones.

A key challenge that FETs now face is reducing the power they consume. The switching properties of conventional FETs are currently restricted by a theoretical limit of 60 millivolts per decade of current at room temperature. This limit, known as the subthreshold swing, means that each 60-millivolt increase in voltage leads to a 10-fold increase in current. Lowering the swing would yield better channel control, so switching would require less energy.
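The relationship between swing and current can be made concrete with a little arithmetic. The sketch below (our illustration, not from the researchers) shows how a given subthreshold swing converts a gate-voltage step into a current multiple, and why a steeper-than-60-mV device switches with less voltage:

```python
# Illustrative arithmetic: in the subthreshold regime, a gate-voltage step
# delta_V multiplies the drain current by 10 ** (delta_V / SS), where SS is
# the subthreshold swing in millivolts per decade.

def current_ratio(delta_v_mV, swing_mV_per_decade):
    """Factor by which subthreshold current grows for a gate-voltage step."""
    return 10 ** (delta_v_mV / swing_mV_per_decade)

# At the room-temperature limit of 60 mV/decade, a 60 mV step gives a
# 10-fold current increase, exactly as the text describes.
print(current_ratio(60, 60))    # -> 10.0

# A hypothetical steeper device at 30 mV/decade reaches a 100-fold increase
# with the same 60 mV step -- the kind of gain tunnel FETs are chasing.
print(current_ratio(60, 30))    # -> 100.0
```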


Fujitsu Makes a Terahertz Receiver Small Enough for a Smartphone

It’s a good time to be alive for pixel peepers. TV makers are pushing 4K-resolution sets to replace our present 1080p screens; Apple’s iMacs sport a 5K resolution; and NHK, Japan’s national broadcaster, is testing 8K broadcasting equipment, targeting 2020 and the Tokyo Olympics for its introduction.

To help wireless devices cope with the higher speeds demanded by such applications, Fujitsu has developed a 300-GHz prototype receiver compact enough to fit into a cellphone. Though limited to about 1 meter in range, the company says the device can download 4K and 8K video almost instantly.


How Atoms Dance in Dielectrics

Atoms within the tiny crystals in many dielectric and ferroelectric ceramics twist and dance when an electric field is turned on. But until now, nobody could tell exactly how they twisted and danced, even though understanding those movements could be the key to developing more compact, higher performance capacitors and condensers.

In a new Scientific Reports paper, researchers at North Carolina State University—with colleagues from the U.S. National Institute of Standards and Technology and the University of New South Wales—show how new ways of analyzing X-ray diffraction data can reveal the details in a ceramic’s structural response to an electric field.

A ceramic can be a complex jumble of amorphous material that glues together crystalline particles of different sizes in different orientations. Traditionally, scientists probe crystal structures by reading the interference patterns produced by X-rays passing through them. They usually analyze only the primary signal, the strongest bands or brightest rings, which give information on the overall structure. This works well on homogeneous material in regular lattices. In a heterogeneous ceramic, though, with crystals pointing every which way, a lot of detail is buried in the faint secondary signals and diffusion patterns.

The NCSU-led team captured this detail by applying the pair distribution function (PDF). The method, first described in the 1960s but used increasingly over the past decade, lets researchers read all of the signals in a wedge of the diffraction pattern to calculate the length and orientation of the bonds between pairs of atoms. (See diagram.) Researchers take snapshots of the ceramic with and without an applied electric field, taking a census of how many microcrystals are in which orientations in each condition.
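The core idea of a pair distribution function can be sketched in a few lines. This toy example (our illustration, not the NCSU pipeline, which derives the PDF from a Fourier transform of the measured diffraction pattern) simply tallies the distances between every pair of atoms in a model crystal fragment:

```python
# Toy sketch of the idea behind a pair distribution function: a PDF is, in
# essence, a histogram of interatomic distances. Real PDF analysis recovers
# these distances from the full diffraction pattern; here we compute them
# directly from hypothetical model coordinates.
from itertools import combinations
import math

def pair_distances(atoms):
    """Return sorted interatomic distances for a list of (x, y, z) positions."""
    return sorted(math.dist(a, b) for a, b in combinations(atoms, 2))

# A 2x2x2 fragment of a cubic lattice with 4-angstrom spacing (hypothetical).
a = 4.0
atoms = [(i * a, j * a, k * a)
         for i in range(2) for j in range(2) for k in range(2)]
dists = pair_distances(atoms)

# Peaks appear at the nearest-neighbour spacing a, the face diagonal
# a*sqrt(2), and the body diagonal a*sqrt(3); shifts in these peaks under
# an applied field reveal how bonds stretch and reorient.
print(dists[0], dists[-1])
```

Comparing such distance distributions with and without an applied field is what lets the researchers census which bonds move, and how.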

“A good analogy would be that analyzing the bright rings is like examining a skyscraper from far away and determining that each office is 500 square feet,” said NCSU’s Tedi-Marie Usher in a press release. “However, by also analyzing the weak X-rays scattered from the sample, we can determine that some offices are 400 square feet and others are 600 square feet, and some have the desk on the east side, and others have the desk on the north side.”

The research team analyzed three perovskite materials— barium titanate (BaTiO3), sodium bismuth titanate (Na0.5Bi0.5TiO3), and strontium titanate (SrTiO3)—all members of a class of ceramics with useful dielectric, ferroelectric, and piezoelectric properties.

In an electric field, dielectrics become polarized and ferroelectrics reverse their polarity. This segregation—positive charge on one side, negative on the other, nobody in the middle—impedes current flow across the gap, letting energy build up. The material’s relative permittivity (or dielectric constant) is an index of how effectively it can store energy as an electric field. Air and a vacuum have permittivities of 1 (or very close to it). Silicon’s is 12. In barium titanate (which was also the first piezoelectric ceramic identified), the permittivity ranges from 1,200 to more than 10,000.

As the material polarizes, its atoms reorient themselves in their crystal lattices. In sodium bismuth titanate, for example, bismuth atoms align with the electric field, changing their relationships with the surrounding titanium ions. (See animation. The bismuth atoms are shown in purple, the titanium in blue.)

“Dipolar effects are known to have large contributions to dielectric permittivity,” said NCSU’s Jacob Jones, the team leader, in a press release.  “The measurement tells us the population of dipoles that are reorienting. We could combine this information with the microscopic spontaneous dipole magnitude and calculate a net contribution to the macroscopic dielectric permittivity.”

Right now, there appears to be no one-size-fits-all mechanism for structural shifts in dielectric ceramics.

“One of the interesting findings here is that each of the three dielectric materials we tested exhibited very different behaviors at the atomic level,” Jones said. “There was no single atomic behavior that accounted for dielectric properties across the materials.”

So there is much more to be learned. Said Jones:

Of immediate interest are a broad range of dielectric materials that exhibit complicated (and elusive) structures. For example, perovskites with components Na0.5Bi0.5TiO3 [sodium bismuth titanate], K0.5Bi0.5TiO3 [potassium bismuth titanate], and Pb(Mg0.33Nb0.67)O3 [lead magnesium niobate] and their solid solutions. These compounds exhibit unique local structures that will respond to field differently. This is a useful technique to reveal interesting physical origins of unique dielectric, piezoelectric, and ferroelectric properties of these materials. We are also interested in extending this technique to an emerging class of high entropy alloys (HEAs) and entropy-stabilized oxides (ESOs), where unique elements may behave differently in response to mechanical stress or electric fields. The unique local behavior of the elements in these multi-element materials is not accessible in traditional diffraction measurements.

How Much Power Will Quantum Computing Need?

Google’s Quantum AI Lab has installed the latest generation of what D-Wave Systems describes as the world’s first commercial quantum computers. Quantum computing could potentially solve certain problems much faster than today’s classical computers while using comparatively less power to perform the calculations. Yet the energy efficiency of quantum computing still remains a mystery.

For now, D-Wave’s machines can scale up the number of quantum bits (qubits) they use without significantly increasing their power requirements. That’s because D-Wave’s quantum computing hardware relies on a specialized design consisting of metal niobium loops that act as superconductors when chilled to a frigid 15 millikelvin (about −273 °C). Much of the D-Wave hardware’s power consumption—slightly less than 25 kilowatts for the latest machine—goes toward running the refrigeration unit that keeps the quantum processor cool. The quantum processor itself requires a comparative pittance.
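Thermodynamics explains why refrigeration dominates the power budget. The sketch below (our own ideal-case arithmetic, not D-Wave’s figures) applies the Carnot limit on the work a refrigerator must spend to lift heat from a millikelvin cold stage to room temperature:

```python
# Rough ideal-case arithmetic: even a perfect Carnot refrigerator must do
# W = Q * (T_hot - T_cold) / T_cold of work to remove heat Q from the cold
# stage. At millikelvin temperatures that ratio is enormous, which is why
# cooling dominates a superconducting machine's power budget.

def carnot_work_per_watt(t_cold_K, t_hot_K=300.0):
    """Minimum watts of input work per watt of heat lifted from t_cold_K."""
    return (t_hot_K - t_cold_K) / t_cold_K

# Removing 1 W of heat at 15 mK costs roughly 20,000 W even in the ideal
# limit; real dilution refrigerators are far less efficient still.
print(carnot_work_per_watt(0.015))
```

The upside, as the article notes, is that this refrigeration cost is largely fixed: adding qubits barely moves it.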


David DiVincenzo on his Tenure at IBM and the Future of Quantum Computing

Theoretical physicist David DiVincenzo is widely viewed as one of the pioneers of quantum computing. He authored a 1996 paper (PDF) outlining five criteria he predicted would make quantum computing a reality; it has become a de facto roadmap for most of the research in quantum computing since then. In 1998, with Daniel Loss, he proposed using electron spins for storing data as qubits in quantum dots, which might prove to be the best choice for creating a working quantum computer.

In 2010, DiVincenzo was invited by Germany's Alexander von Humboldt Foundation to become Director of the Institute of Theoretical Nanoelectronics at the Peter Grünberg Institute in Jülich, and a professor at the Institute for Quantum Information of RWTH Aachen University. Previously he was a research director at the IBM T.J. Watson Research Center in Yorktown Heights, N.Y.

We met DiVincenzo in his Spartan office at the Physikzentrum of RWTH Aachen University, which is located “ten minutes by bicycle” from The Netherlands, where DiVincenzo has made his home.

IEEE Spectrum: You turned to investigating quantum computing while working as a theoretical physicist at IBM. What caught your interest?

DiVincenzo: I became interested around 1993. It was not very much of a field at that time, but it was a field. There were two very eminent IBM scientists who were already involved for much longer: Rolf Landauer and Charles Bennett. Landauer is remembered for his contributions to the fundamental understanding of computing—questions like: What is the minimum amount of energy required to perform a computation?

Landauer was quite important in the original discussions of quantum computing because he provided a skeptical point of view.  He thought that the sensitivity to imperfections in matter would be devastating for quantum computing. But he was interested in the concept of error correction that arose at that time and that could be applied to quantum computing.  And this really turned the story around. Bennett was famous for introducing the ideas of quantum physics into information science and cryptography. In 1993, he worked on what is now known as quantum teleportation.

I was fascinated by these developments, and at that time, IBM was flexible enough so that I could just jump in. I started contributing various ideas, and the following year, the Shor Factoring Algorithm was discovered. This made it clear that quantum computing could be done.

Spectrum: So, the research culture at IBM was definitely an important factor in your research career.

DiVincenzo: I would say that the research culture at IBM was always distinct. There was a whole evolution over the decades. For years it was thinking of itself in relation to Bell Labs: Are we as famous as Bell Labs?  That’s the history of the 1970s. I joined the lab in the 1980s; I had many friends that were there from the beginning, and I think I had a feeling for what the culture was like. IBM tried to really build up its research in the 1960s. In that period, they were definitely looking at themselves hoping to be another Bell Labs, which was in its heyday. By the 1980s, I felt that at that point they really did not have to worry whether they [were turning out] science of a comparable quality as Bell Labs. The cultures were similar; they would take rather young scientists and immediately give them all the resources of an institute, basically, without any of the responsibilities. This is a fantastic model which has proven to be not so sustainable—at least not in the corporate world. [And beginning in the early 1990s, it wasn’t really sustainable within IBM.]

Spectrum: How did this change affect your work at IBM?

DiVincenzo: IBM had a heavy financial crisis in 1993, its most severe one. It had a moment when it was really questioning its whole business model and whether it should be broken up into smaller companies. IBM undertook a whole sequence of different steps, such as getting out of personal computers, and each one made it appear that physics had become less relevant to IBM. The physics department got much smaller that year, and I remained in that smaller department. But we had a fantastic time after that in quantum computing.

Spectrum: So at least the research culture at IBM survived.

DiVincenzo: Here I would say something about the culture of IBM versus Bell. Bell evolved into a very competitive internal culture. People were really knocking against each other. Internal seminars were quite an ordeal because you were subjected to really heavy scrutiny. Internal dealings among scientists at IBM were much more congenial. 

The rest of physics at IBM was suffering. IBM still has a physics department, but at this point almost every physicist is somehow linked to a product plan or customer plan. At IBM, right from the beginning, there was always hope that these physicists who were dreaming up interesting things could actually contribute. Research became more directed as time went on.

Spectrum: It is now five years since you joined two German universities. How would you compare the life of a researcher in Germany—a country traditionally known for its emphasis on academic freedom ever since the 19th century? Do researchers in Germany have the same freedom that researchers had at IBM during the 1960s?

DiVincenzo: No. But I think they are freer and have more flexibility than what you find in the U.S. academic culture. Of course, in the U.S., there is heavy attention given to third-party funding. In Germany, this is not completely true. If you have a chair, you actually do have fixed resources that go with the chair, which is not the case in the U.S. But it is typically not enough to do any major project, and it shrinks over time, so you should connect yourself with some third-party funding. However, there is a pretty strong long-term consensus here that we don’t tinker with science funding too much.

Spectrum:  Basically, future quantum computers might be based on qubits of two types: atoms or ions suspended by laser beams, and ions or electrons trapped in matter, such as in defects or in quantum dots. Which will prevail?

DiVincenzo: We’re close enough to the quantum computer that we kind of can foresee its complexity in a classical sense—that is, how much instrumentation is required for this to work as a quantum computer. I think that systems that involve lasers add a really big jump in complexity because they would require a laser system for every qubit. Say we need a million qubits; we will have a system with a complexity well beyond anything that has ever been done in optical science.

Now, for example, with quantum dots there are no lasers in sight.  Everything is done with electronics and at gigahertz frequencies.  You will need controller devices containing transistors that work at 4 Kelvin. That is an interesting challenge. It turns out that some conventional transistors, right out of the foundry, do work at 4 Kelvin. This becomes the beginning of some real scientific research: How do you make a transistor that really functions in a good way at 4 Kelvin without dissipating too much energy? Energy dissipated in the instrumentation close to the quantum system at that temperature could be one of a whole set of challenges that will not be solved easily. My own personal view is that we're a decade or so away from some really new machines that at least will partially fulfill what we've been thinking about since 1993.  

Cybersecurity System IDs Malware Hidden in Short Twitter Links

Twitter and Facebook users can all too easily get a computer virus when they click on malware links shared by unsuspecting friends. To identify such malicious links on social media, UK researchers have developed a system that recognizes potential cyber attacks within seconds of clicking on a shortened Twitter link.


Goodbye MagStripe, Hello Chip Cards

Developed at IBM in the 1960s, rolled out in the 70s, caught on globally in the 80s, ubiquitous in the 90s, and now stepping aside—the magnetic stripe card has had a brilliant career.

But now the magnetic stripe card is going into forced retirement, replaced by the chip card after just too many security breaches that cost banks and retailers far too much money.

You’ve probably already received chip card replacements for the mag stripe cards in your wallet; if not, you will soon. And retailers have been busily replacing their card readers—if they haven’t, as of today, 1 October, card-issuing banks will no longer eat the costs of fraud; it will fall on the merchant.

Magnetic stripe cards are juicy targets for criminals. They contain information about the customer encoded on the stripe; if criminals get this information by skimming it using an altered card reader or hacking into a retailer’s network, they can use it to make counterfeit cards or sell it online; these fake cards will work until the breach is detected and the card numbers changed.

A safer alternative is the chip card, which communicates with the card reader to create unique data every time it is used. Even if a hacker somehow managed to grab that data and create a counterfeit card, it would be useless for future transactions. And chip cards can be used in combination with a PIN, instead of just a signature, making them harder to use even for thieves who steal the physical cards.
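The anti-replay idea can be sketched in a few lines. This is a simplified illustration of the principle, not the actual EMV protocol; the secret key, message layout, and truncation below are all hypothetical:

```python
# Simplified sketch of why chip cards resist replay: each transaction
# cryptogram mixes a per-card secret key with an incrementing transaction
# counter, so a captured value is useless for the next purchase.
# (Illustrative only -- not the real EMV cryptogram format.)
import hashlib
import hmac

CARD_SECRET = b"per-card key provisioned at issuance"  # hypothetical key

def transaction_cryptogram(counter, amount_cents):
    """Authentication code over the transaction counter and amount."""
    msg = counter.to_bytes(4, "big") + amount_cents.to_bytes(8, "big")
    return hmac.new(CARD_SECRET, msg, hashlib.sha256).hexdigest()[:16]

first = transaction_cryptogram(1, 4999)   # a $49.99 purchase
second = transaction_cryptogram(2, 4999)  # the identical purchase, next use
# Same amount, different counter -> different cryptogram, so replaying
# `first` against the bank at counter 2 fails verification.
print(first != second)   # -> True
```

A skimmed magnetic stripe, by contrast, yields the same static data on every swipe, which is exactly what makes counterfeit mag-stripe cards work until the breach is caught.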

Ironically, the biggest complaint about chip cards, which have to be inserted into a card reader and left there until the payment is complete, is that they are causing the same problem that mag-stripe technology was created to solve—lines that back up when a lot of transactions need to be handled in a short period of time. Back in 1967, airlines had ordered the first wide-body aircraft, and saw that they’d soon be dealing with far more customers arriving at check-in counters than they were used to. And retailers were finding that more and more customers were paying with credit cards, turning filling out charge slips by hand into more and more of a burden. They needed a technology to make identifying customers and making payments go faster.

You’ve likely already noticed that chip card transactions take longer than swipe-card transactions did—part of it is unfamiliarity, as you fumble to figure out where to put the card into the reader and perhaps have to restart a transaction you interrupted by removing the card too soon. But part of it is the technology—it takes a little longer for the system to verify the transaction with the issuing bank and to create the unique transaction code than it did to complete a magnetic stripe card transaction.

This is one reason Jerome Svigals, who led the development of magnetic stripe technology at IBM, predicted that chip cards won’t have anything near the long career of their predecessor, and will soon be supplanted by mobile-phone-triggered payments. Indeed, many of the new readers installed by retailers to handle chip cards also include the capability to communicate with smartphones.

For more on the story behind the development of the magnetic stripe card and the reasons for its dominance for so many decades, see “The Long Life and Imminent Death of the Mag-Stripe Card” and “The Mag-Stripe Era Ends.”

Hajj Pilgrimage Safety Challenges Crowd Simulator Technology

A crowd stampede that caused at least 769 deaths at the annual Muslim pilgrimage to Mecca provided a stark reminder of the safety challenges at the most congested public space in the world. The millions of pilgrims who visit Mecca and other holy sites in Saudi Arabia each year often crowd into spaces as densely packed as six people per square meter. Such crowd densities present a huge challenge for both public safety authorities and computer simulation software designed to help model and predict crowd disasters.

