Tech Talk

Tunnel Transistor May Meet Power Needs of Future Chips

A new kind of transistor consumes 90 percent less power than conventional transistors, beating a theoretical limit on switching efficiency, researchers say. These findings could one day lead to super-dense low-power circuits as well as ultra-sensitive biosensors and gas sensors, the investigators added.

The relentless advance of computing power over the past half-century has relied on constant miniaturization of field-effect transistors (FETs), which serve as the building blocks of most microchips. Transistors act like switches that flick on and off to represent data as zeroes and ones.

A key challenge that FETs now face is reducing the power they consume. The switching of conventional FETs is restricted by a theoretical limit of 60 millivolts per decade of current at room temperature. This limit, known as the subthreshold swing, means that each 60-millivolt increase in gate voltage yields at most a 10-fold increase in current. A device with a lower swing turns on more sharply, so switching requires less voltage and therefore less energy.
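The exponential relation behind that limit can be sketched numerically. The following is a minimal illustration, not from the research itself; the 30 mV/decade figure is a hypothetical steep-swing value, not a measured one:

```python
def current_ratio(delta_v_mv, swing_mv_per_decade=60.0):
    """Factor by which subthreshold current grows when the gate voltage
    rises by delta_v_mv, for a given subthreshold swing."""
    return 10 ** (delta_v_mv / swing_mv_per_decade)

# At the room-temperature limit of 60 mV/decade, 60 mV buys one decade:
print(current_ratio(60))                            # 10.0
# A steeper-swing device gets the same 10x from half the voltage,
# which is why a sub-60 mV/decade switch saves power:
print(current_ratio(30, swing_mv_per_decade=30))    # 10.0
```

Halving the swing halves the voltage needed for the same on/off current ratio, and since switching energy scales with the square of voltage, the power savings compound quickly.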


Fujitsu Makes a Terahertz Receiver Small Enough for a Smartphone

It’s a good time to be alive for pixel peepers. TV makers are pushing 4K-resolution sets to replace our present 1080p screens; Apple’s iMacs sport a 5K resolution; and NHK, Japan’s national broadcaster, is testing 8K broadcasting equipment, targeting 2020 and the Tokyo Olympics for its introduction.

To help wireless devices cope with the higher speeds demanded by such applications, Fujitsu has developed a 300-GHz prototype receiver compact enough to fit into a cellphone. Though limited to about 1 meter in range, the company says the device can download 4K and 8K video almost instantly.


How Atoms Dance in Dielectrics

Atoms within the tiny crystals in many dielectric and ferroelectric ceramics twist and dance when an electric field is turned on. But until now, nobody could tell exactly how they twisted and danced, even though understanding those movements could be the key to developing more compact, higher-performance capacitors.

In a new Scientific Reports paper, researchers at North Carolina State University—with colleagues from the U.S. National Institute of Standards and Technology and the University of New South Wales—show how new ways of analyzing X-ray diffraction data can reveal the details in a ceramic’s structural response to an electric field.

A ceramic can be a complex jumble of amorphous material that glues together crystalline particles of different sizes in different orientations. Traditionally, scientists probe crystal structures by reading the interference patterns produced by X-rays passing through them. They usually analyze only the primary signal, the strongest bands or brightest rings, which give information on the overall structure. This works well on homogeneous material in regular lattices. In a heterogeneous ceramic, though, with crystals pointing every which way, a lot of detail is buried in the faint secondary signals and diffusion patterns.

The NCSU-led team captured this detail by applying the pair distribution function (PDF). The method, first described in the 1960s but used increasingly over the past decade, lets researchers read all of the signals in a wedge of the diffraction pattern to calculate the length and orientation of the bonds between pairs of atoms. (See diagram.) Researchers take snapshots of the ceramic with and without an applied electric field, taking a census of how many microcrystals are in which orientations in each condition.

“A good analogy would be that analyzing the bright rings is like examining a skyscraper from far away and determining that each office is 500 square feet,” said NCSU’s Tedi-Marie Usher in a press release. “However, by also analyzing the weak X-rays scattered from the sample, we can determine that some offices are 400 square feet and others are 600 square feet, and some have the desk on the east side, and others have the desk on the north side.”
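In essence, a PDF analysis recovers the statistics of interatomic distances. The toy sketch below bins pairwise distances in an idealized lattice; the lattice and bin sizes are invented for illustration, and real PDF work Fourier-transforms the full scattering pattern rather than binning known atomic positions:

```python
import math
from itertools import combinations

def pair_distances(positions):
    """All pairwise atomic distances -- the real-space statistic that a
    pair-distribution-function analysis extracts from the diffraction
    pattern (strong and weak signals alike)."""
    return [math.dist(a, b) for a, b in combinations(positions, 2)]

def histogram(dists, r_max, bins):
    """Crude binned count of distances, a stand-in for the PDF G(r)."""
    counts = [0] * bins
    width = r_max / bins
    for d in dists:
        if d < r_max:
            counts[int(d / width)] += 1
    return counts

# Toy cubic lattice: peaks at 1, sqrt(2), sqrt(3), 2, ... lattice units
grid = [(x, y, z) for x in range(3) for y in range(3) for z in range(3)]
counts = histogram(pair_distances(grid), r_max=4.0, bins=40)
# An applied field that shifted atoms would shift these peaks -- the
# change the researchers read from their with/without-field snapshots.
```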

The research team analyzed three perovskite materials—barium titanate (BaTiO3), sodium bismuth titanate (Na0.5Bi0.5TiO3), and strontium titanate (SrTiO3)—all members of a class of ceramics with useful dielectric, ferroelectric, and piezoelectric properties.

In an electric field, dielectrics become polarized and ferroelectrics reverse their polarity. This segregation—positive charge on one side, negative on the other, nobody in the middle—impedes current flow across the gap, letting energy build up. The material’s relative permittivity (or dielectric constant) is an index of how effectively it can store energy as an electric field. Air and a vacuum have permittivities of 1 (or very close to it). Silicon’s is 12. In barium titanate (which is also the first piezoelectric ceramic identified), the permittivity can range from about 1,200 to 10,000 or more.
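The practical payoff of high permittivity can be illustrated with the parallel-plate capacitor formula. The geometry and voltage below are invented for illustration; only the permittivity values come from the article:

```python
EPS0 = 8.854e-12  # vacuum permittivity, farads per meter

def stored_energy(eps_r, area_m2=1e-4, gap_m=1e-6, volts=5.0):
    """Energy (joules) in a parallel-plate capacitor:
    C = eps_r * EPS0 * A / d, then E = 1/2 * C * V^2."""
    c = eps_r * EPS0 * area_m2 / gap_m
    return 0.5 * c * volts**2

for name, eps_r in [("vacuum", 1), ("silicon", 12), ("BaTiO3", 1200)]:
    print(f"{name:8s} {stored_energy(eps_r):.3e} J")
```

Stored energy scales linearly with permittivity, so a barium titanate dielectric at the same size and voltage holds over a thousand times the energy of a vacuum gap.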

As the material polarizes, its atoms reorient themselves in their crystal lattices. In sodium bismuth titanate, for example, bismuth atoms align with the electric field, changing their relationships with the surrounding titanium ions. (See animation. The bismuth atoms are shown in purple, the titanium in blue.)

“Dipolar effects are known to have large contributions to dielectric permittivity,” said NCSU’s Jacob Jones, the team leader, in a press release.  “The measurement tells us the population of dipoles that are reorienting. We could combine this information with the microscopic spontaneous dipole magnitude and calculate a net contribution to the macroscopic dielectric permittivity.”

Right now, there appears to be no one-size-fits-all mechanism for structural shifts in dielectric ceramics.

“One of the interesting findings here is that each of the three dielectric materials we tested exhibited very different behaviors at the atomic level,” Jones said. “There was no single atomic behavior that accounted for dielectric properties across the materials.”

So there is much more to be learned. Said Jones:

Of immediate interest are a broad range of dielectric materials that exhibit complicated (and elusive) structures. For example, perovskites with components Na0.5Bi0.5TiO3 [sodium bismuth titanate], K0.5Bi0.5TiO3 [potassium bismuth titanate], and Pb(Mg0.33Nb0.67)O3 [lead magnesium niobate] and their solid solutions. These compounds exhibit unique local structures that will respond to a field differently. This is a useful technique to reveal interesting physical origins of unique dielectric, piezoelectric, and ferroelectric properties of these materials. We are also interested in extending this technique to an emerging class of high-entropy alloys (HEAs) and entropy-stabilized oxides (ESOs), where unique elements may behave differently in response to mechanical stress or electric fields. The unique local behavior of the elements in these multi-element materials is not accessible in traditional diffraction measurements.

How Much Power Will Quantum Computing Need?

Google’s Quantum AI Lab has installed the latest generation of what D-Wave Systems describes as the world’s first commercial quantum computers. Quantum computing could potentially solve certain problems much faster than today’s classical computers while using less power to perform the calculations. Yet the energy efficiency of quantum computing remains a mystery.

For now, D-Wave’s machines can scale up the number of quantum bits (qubits) they use without significantly increasing their power requirements. That’s because D-Wave’s quantum computing hardware relies on a specialized design consisting of metal niobium loops that act as superconductors when chilled to a frigid 15 millikelvin (−273 °C). Much of the D-Wave hardware’s power consumption—slightly less than 25 kilowatts for the latest machine—goes toward running the refrigeration unit that keeps the quantum processor cool. The quantum processor itself requires a comparative pittance.


David DiVincenzo on his Tenure at IBM and the Future of Quantum Computing

Theoretical physicist David DiVincenzo is widely viewed as one of the pioneers of quantum computing. He authored a 1996 paper outlining five criteria he predicted would make quantum computing a reality; it has become a de facto roadmap for most of the research in quantum computing since then. In 1998, with Daniel Loss, he proposed using electron spins for storing data as qubits in quantum dots, which might prove to be the best choice for creating a working quantum computer.

In 2010, DiVincenzo was invited by Germany's Alexander von Humboldt Foundation to become Director of the Institute of Theoretical Nanoelectronics at the Peter Grünberg Institute in Jülich, and a professor at the Institute for Quantum Information of RWTH Aachen University. Previously he was a research director at the IBM T.J. Watson Research Center in Yorktown Heights, N.Y.

We met DiVincenzo in his spartan office at the Physikzentrum of RWTH Aachen University, which is located “ten minutes by bicycle” from the Netherlands, where DiVincenzo has made his home.

IEEE Spectrum: You turned to investigating quantum computing while working as a theoretical physicist at IBM. What caught your interest?

DiVincenzo: I became interested around 1993. It was not very much of a field at that time, but it was a field. There were two very eminent IBM scientists who were already involved for much longer: Rolf Landauer and Charles Bennett. Landauer is remembered for his contributions to the fundamental understanding of computing—questions like: What is the minimum amount of energy required to perform a computation?

Landauer was quite important in the original discussions of quantum computing because he provided a skeptical point of view.  He thought that the sensitivity to imperfections in matter would be devastating for quantum computing. But he was interested in the concept of error correction that arose at that time and that could be applied to quantum computing.  And this really turned the story around. Bennett was famous for introducing the ideas of quantum physics into information science and cryptography. In 1993, he worked on what is now known as quantum teleportation.

I was fascinated by these developments, and at that time, IBM was flexible enough so that I could just jump in. I started contributing various ideas, and the following year, the Shor Factoring Algorithm was discovered. This made it clear that quantum computing could be done.

Spectrum: So, the research culture at IBM was definitely an important factor in your research career.

DiVincenzo: I would say that the research culture at IBM was always distinct. There was a whole evolution over the decades. For years it was thinking of itself in relation to Bell Labs: Are we as famous as Bell Labs?  That’s the history of the 1970s. I joined the lab in the 1980s; I had many friends that were there from the beginning, and I think I had a feeling for what the culture was like. IBM tried to really build up its research in the 1960s. In that period, they were definitely looking at themselves hoping to be another Bell Labs, which was in its heyday. By the 1980s, I felt that at that point they really did not have to worry whether they [were turning out] science of a comparable quality as Bell Labs. The cultures were similar; they would take rather young scientists and immediately give them all the resources of an institute, basically, without any of the responsibilities. This is a fantastic model which has proven to be not so sustainable—at least not in the corporate world. [And beginning in the early 1990s, it wasn’t really sustainable within IBM.]

Spectrum: How did this change affect your work at IBM?

DiVincenzo: IBM had a heavy financial crisis in 1993, its most severe one. It had a moment when it was really questioning its whole business model and whether it should be broken up into smaller companies. IBM undertook a whole sequence of different steps, such as getting out of personal computers, and each one made it appear that physics had become less relevant to IBM. The physics department got much smaller that year, and I remained in that smaller department. But we had a fantastic time after that in quantum computing.

Spectrum: So at least the research culture at IBM survived.

DiVincenzo: Here I would say something about the culture of IBM versus Bell. Bell evolved into a very competitive internal culture. People were really knocking against each other. Internal seminars were quite an ordeal because you were subjected to really heavy scrutiny. Internal dealings among scientists at IBM were much more congenial. 

The rest of physics at IBM was suffering. IBM still has a physics department, but at this point almost every physicist is somehow linked to a product plan or customer plan. At IBM, right from the beginning, there was always hope that these physicists who were dreaming up interesting things could actually contribute. Research became more directed as time went on.

Spectrum: It is now five years since you joined two German universities. How would you compare the life of a researcher in Germany—a country traditionally known for its emphasis on academic freedom ever since the 19th century? Do researchers in Germany have the same freedom that researchers had at IBM during the 1960s?

DiVincenzo: No. But I think they are freer and have more flexibility than what you find in the U.S. academic culture. Of course, in the U.S., there is heavy attention given to third-party funding. In Germany, this is not completely true. If you have a chair, you actually do have fixed resources that go with the chair, which is not the case in the U.S. But it is typically not enough to do any major project, and it shrinks over time, so you should connect yourself with some third-party funding. However, there is a pretty strong long-term consensus here that we don’t tinker with science funding too much.

Spectrum:  Basically, future quantum computers might be based on qubits of two types: atoms or ions suspended by laser beams, and ions or electrons trapped in matter, such as in defects or in quantum dots. Which will prevail?

DiVincenzo: We’re close enough to the quantum computer that we kind of can foresee its complexity in a classical sense—that is, how much instrumentation is required for this to work as a quantum computer. I think that systems that involve lasers add a really big jump in complexity because they would require a laser system for every qubit. Say we need a million qubits; we will have a system with a complexity well beyond anything that has ever been done in optical science.

Now, for example, with quantum dots there are no lasers in sight.  Everything is done with electronics and at gigahertz frequencies.  You will need controller devices containing transistors that work at 4 Kelvin. That is an interesting challenge. It turns out that some conventional transistors, right out of the foundry, do work at 4 Kelvin. This becomes the beginning of some real scientific research: How do you make a transistor that really functions in a good way at 4 Kelvin without dissipating too much energy? Energy dissipated in the instrumentation close to the quantum system at that temperature could be one of a whole set of challenges that will not be solved easily. My own personal view is that we're a decade or so away from some really new machines that at least will partially fulfill what we've been thinking about since 1993.  

Cybersecurity System IDs Malware Hidden in Short Twitter Links

Twitter and Facebook users can all too easily get a computer virus when they click on malware links shared by unsuspecting friends. To identify such malicious links on social media, UK researchers have developed a system that recognizes potential cyber attacks within seconds of clicking on a shortened Twitter link.


Goodbye MagStripe, Hello Chip Cards

Developed at IBM in the 1960s, rolled out in the ’70s, caught on globally in the ’80s, ubiquitous in the ’90s, and now stepping aside—the magnetic stripe card has had a brilliant career.

But now the magnetic stripe card is going into forced retirement, replaced by the chip card after just too many security breaches that cost banks and retailers far too much money.

You have probably already received chip card replacements for the mag stripe cards in your wallet; if not, you will soon. And retailers have been busily replacing their card readers—if they haven’t, then as of today, 1 October, card-issuing banks will no longer eat the costs of fraud; liability shifts to the merchant.

Magnetic stripe cards are juicy targets for criminals. They contain information about the customer encoded on the stripe; if criminals get this information by skimming it using an altered card reader or hacking into a retailer’s network, they can use it to make counterfeit cards or sell it online; these fake cards will work until the breach is detected and the card numbers changed.

A safer alternative is the chip card, which communicates with the card reader to create unique data every time it is used. Even if a hacker somehow grabbed that data and created a counterfeit card, it would be useless for future transactions. And chip cards can be used in combination with a PIN, instead of just a signature, making it harder for even thieves who steal physical cards to use them.
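The "unique data every time" idea can be sketched with a toy per-transaction code. This is not the actual EMV cryptogram algorithm; the key, message format, and function are invented, HMAC-based stand-ins that only illustrate why replayed transaction data is useless:

```python
import hmac
import hashlib

def toy_cryptogram(card_key: bytes, amount_cents: int, counter: int) -> str:
    """Toy per-transaction code: a MAC over the transaction details and a
    counter that increments on every use. NOT the real EMV scheme."""
    msg = f"{amount_cents}:{counter}".encode()
    return hmac.new(card_key, msg, hashlib.sha256).hexdigest()[:16]

key = b"secret-card-key"   # hypothetical secret, lives only inside the chip
c1 = toy_cryptogram(key, 4999, counter=1)
c2 = toy_cryptogram(key, 4999, counter=2)
assert c1 != c2  # same purchase amount, different code -- a stolen c1 won't replay
```

Because the counter never repeats and the key never leaves the chip, an eavesdropper who captures one code cannot forge the next one, which is the property that defeats the skimming attacks described above.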

Ironically, the biggest complaint about chip cards, which have to be inserted into a card reader and left there until the payment is complete, is that they are causing the same problem that mag-stripe technology was created to solve—lines that back up when a lot of transactions need to be handled in a short period of time. Back in 1967, airlines had ordered the first wide-body aircraft, and saw that they’d soon be dealing with far more customers arriving at check-in counters than they were used to. And retailers were finding that more and more customers were paying with credit cards, turning filling out charge slips by hand into more and more of a burden. They needed a technology to make identifying customers and making payments go faster.

You’ve likely already noticed that chip card transactions take longer than magnetic stripe swipes. Part of it is unfamiliarity, as you fumble to figure out where to put the card into the reader and perhaps have to restart a transaction you interrupted by removing the card too soon. But part of it is the technology—it takes a little longer for the system to verify the transaction with the issuing bank and to create the unique transaction code than it did to complete a magnetic stripe transaction.

This is one reason Jerome Svigals, who led the development of magnetic stripe technology at IBM, predicted that chip cards won’t have anything near the long career of their predecessor, and will soon be supplanted by payments triggered by mobile phones. Indeed, many of the new readers installed by retailers to handle chip cards can also communicate with smartphones.

For more on the story behind the development of the magnetic stripe card and the reasons for its dominance for so many decades, see “The Long Life and Imminent Death of the Mag-Stripe Card” and “The Mag-Stripe Era Ends.”

Hajj Pilgrimage Safety Challenges Crowd Simulator Technology

A crowd stampede that caused at least 769 deaths at the annual Muslim pilgrimage to Mecca provided a stark reminder of the safety challenges at the most congested public space in the world. The millions of pilgrims who visit Mecca and other holy sites in Saudi Arabia each year often crowd into spaces as densely packed as six people per square meter. Such crowd densities present a huge challenge for both public safety authorities and computer simulation software designed to help model and predict crowd disasters.


Oxynitride Thin-film Transistors: Faster Screens with Faster Electrons

Currently, LCD HDTV screens operate according to ATSC standards, with 1080 gate lines and refresh rates up to 30 frames per second. Going to higher definition, such as 4000 or 8000 lines and refresh rates up to 240 frames per second, is currently out of reach for both OLED and LCD screens, because the pixels are driven by thin-film transistors that are simply too slow. Typically, their electron mobility is below 20 square centimeters per volt-second (cm²/V·s). “There is not much time to supply the voltage or the current through the transistor for a high frame rate and high definition display,” says Sanghun Jeon, a researcher at Korea University in Sejong. “We estimate that for future display technology, the mobility should be exceeding 100 cm²/V·s.” Jeon and other researchers at Korea University and at the Samsung Advanced Institute of Technology in Gyeonggi-do, Korea, report this week in Applied Physics Letters the creation of just such a transistor. The one they developed has an electron mobility of 138 cm²/V·s.
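A back-of-envelope calculation shows where the time pressure Jeon describes comes from. The numbers below are illustrative, not from the paper:

```python
def line_time_us(gate_lines, frames_per_second):
    """Time available to charge one row of pixels, in microseconds:
    each frame must scan every gate line once."""
    return 1e6 / (gate_lines * frames_per_second)

# Today's HD panel vs. a hypothetical 8000-line, 240-frame panel
print(line_time_us(1080, 30))    # ~30.9 microseconds per line
print(line_time_us(8000, 240))   # ~0.52 microseconds per line
```

Nearly 60 times less time to charge each row of pixels is why the drive transistors need several times today's electron mobility.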

The transistor consists of a thin film of zinc oxynitride, ZnON, a glassy composite of ZnO, ZnOxNx, and Zn3N2. The researchers formed the 50-nanometer-thick film by sputtering a zinc target in a mixture of nitrogen, oxygen, and argon gases. However, because nitrogen bonds more weakly with zinc than oxygen does, the layer was susceptible to oxidation when exposed to air. To prevent this, the researchers treated the layer with a high-energy plasma of argon ions. When these ions collided with the atoms in the ZnON, they rearranged the bonds between atoms, much as shaking a box of colored marbles redistributes the colors. The process resulted in a chemically uniform layer that is more resistant to chemical degradation.

The researchers compared scanning transmission electron microscope (STEM) images of an annealed thin film made by current production processes with images of their new experimental thin film. “The STEM image of argon-plasma-treated ZnON material shows uniform contrast throughout the entire film, indicating that the composition of argon-plasma-treated ZnON film is very uniform,” says Jeon. It is this uniformity that is at the root of the high electron mobility in the ZnON film, explains Jeon, who notes that defects at the interfaces and in the bulk of such semiconductors slow electrons down considerably.

The argon plasma process augurs well for future large-scale production, especially because ZnON is otherwise known for poor reproducibility. “We were surprised that by employing the argon process, we were able to reproduce the device well, together with a high mobility constantly exceeding 100 cm²/V·s,” says Jeon. However, more work is required before the thin-film transistors will control pixels on actual screens. Getting to that point will take two or three more years, says Jeon.

Navy Diversifies Ships' Cyber Systems to Foil Hackers

Cyber attacks could prove just as deadly to technologically advanced warships as missiles and torpedoes in the future. That is why the U.S. Navy has been developing a defense system to protect its ships against hackers who threaten to disable or take control of critical shipboard systems.

The Resilient Hull, Mechanical, and Electrical Security (RHIMES) system aims to prevent cyber attackers from compromising the programmable logic controllers that connect a ship’s computers with onboard physical systems. RHIMES uses slightly different versions of core programming for each physical controller so that a cyber attack can’t disable or take over all shipboard systems in one fell swoop.

“In the event of a cyber attack, RHIMES makes it so that a different hack is required to exploit each controller,” said Ryan Craven, a program officer of the Cyber Security and Complex Software Systems Program in the Office of Naval Research, in a press release. “The same exact exploit can’t be used against more than one controller.”
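ONR has not published RHIMES internals, but the "different hack per controller" idea can be sketched as software diversification. Everything below—the function, the XOR-mask scheme, the controller names—is an invented illustration of "same logic, different binary," not the Navy's actual design:

```python
import hashlib

def diversified_image(base_image: bytes, controller_id: str) -> bytes:
    """Toy diversification: mask identical control logic with a keystream
    derived from the controller's identity, so every controller carries
    a different bit pattern. Invented scheme, for illustration only."""
    stream = b""
    counter = 0
    while len(stream) < len(base_image):
        stream += hashlib.sha256(controller_id.encode() + bytes([counter])).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(base_image, stream))

base = b"IDENTICAL-CONTROL-LOGIC"
steering = diversified_image(base, "steering-plc")
engine = diversified_image(base, "engine-plc")
assert steering != engine  # same logic, different binaries
```

An exploit keyed to one image's byte layout would miss on the other, which is the property Craven's quote describes.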

That seemingly basic precaution could go a long way toward protecting crucial warship systems such as damage control and firefighting, electric power, steering, and engine control. The loss of one or more such systems could prove especially devastating in the middle of a naval operation or battle, especially if hackers turn the ship’s systems against the ship itself.

The threat of cyber attacks crippling or taking over large physical systems has already been demonstrated in recent years. Stuxnet, the “computer worm” developed by the United States and Israel, attacked Iran’s nuclear program by compromising the physical controllers of Iranian centrifuges and running them at high speeds to damage the equipment. (A similar Stuxnet-style effort aimed at North Korea’s nuclear program failed because it couldn’t access the crucial computers.)

“Another powerful example is the hacking of a German steel mill in 2014,” Craven explained. “The hackers reportedly got in and overheated a blast furnace, and even made it so that the plant workers couldn’t properly shut down the furnace, causing massive damage to the system.”

The Navy’s RHIMES approach to cybersecurity could also pay off outside of warships. A similar strategy might help safeguard the physical controllers found in cars, aircraft, and factories. That could work in tandem with complementary defenses such as air-gapped systems isolated from networks or adding analog systems and humans into the loop as safeguards.


Tech Talk

IEEE Spectrum’s general technology blog, featuring news, analysis, and opinions about engineering, consumer electronics, and technology and society, from the editorial staff and freelance contributors.
