Tech Talk

Scientists Control a Fly's Heart With a Laser


The anesthetized fruit fly lay immobile on a microscope slide, its wings taped to the glass, while scientists directed a blue laser at its abdomen.

The laser beam didn’t harm the fly. Instead, the light pulsed 10 times per second in a precise and regular cadence—and the fly’s heart beat 10 times per second in exact synchrony. The researchers had invented an optical pacemaker, which they describe today in the journal Science Advances.

Typical pacemakers, which were first developed in the 1950s, rely on the fact that the heart is an electric organ. Each natural heartbeat is caused by an electrical impulse that ripples through the cardiac cells, making them contract and push blood through the heart and around the body. Implanted electric pacemakers use electrodes to deliver a steady series of impulses to the cardiac tissue, helping hearts that have trouble maintaining a regular rhythm.

The optical pacemaker instead triggers those cardiac cell contractions with pulses of light, using the hot new research technique of optogenetics. The researchers first bred a strain of genetically altered fruit flies whose cardiac cells contained a light-sensitive protein taken from algae. Then they directed the laser at a fly’s heart, shining it through the fly’s intact exoskeleton, activating those protein-containing cells.  
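For a sense of the timing involved, here is a minimal Python sketch of a 10-hertz pacing loop. The `laser_on()` and `laser_off()` calls are hypothetical placeholders for whatever hardware actually gates the laser, and the pulse width is an assumed value, not a figure from the paper.

```python
import time

PACING_HZ = 10        # pulse rate reported in the study: 10 pulses per second
PULSE_WIDTH_S = 0.02  # assumed 20-ms on-time per pulse (illustrative, not from the paper)

def laser_on():
    pass  # hypothetical: assert the digital line that gates the blue laser

def laser_off():
    pass  # hypothetical: release the gating line

def pace(duration_s: float = 5.0) -> None:
    """Emit a regular train of light pulses at PACING_HZ for duration_s seconds."""
    period = 1.0 / PACING_HZ
    for _ in range(int(duration_s * PACING_HZ)):
        start = time.monotonic()
        laser_on()
        time.sleep(PULSE_WIDTH_S)
        laser_off()
        # Sleep out the remainder of the period to keep the cadence regular.
        time.sleep(max(0.0, period - (time.monotonic() - start)))

if __name__ == "__main__":
    pace(5.0)  # 50 pulses; the fly's heart follows at the same rate
```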


Immigrants Have a Growing Role in the U.S. Sci-Tech Workforce

Immigrants have a growing presence in the U.S. science and engineering workforce, according to a new report by the National Science Foundation. Between 2003 and 2013, the number of U.S. scientists and engineers increased from 21.6 million to 29 million. In that time, immigrants’ share of that workforce rose from 16 percent (3.4 million) to 18 percent (5.2 million).
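Those shares are easy to verify from the report’s headline numbers; here is a quick back-of-the-envelope check (values in millions, rounded as in the report):

```python
# Quick consistency check of the NSF figures cited above (values in millions).
total_2003, immigrants_2003 = 21.6, 3.4
total_2013, immigrants_2013 = 29.0, 5.2

print(f"2003 immigrant share: {immigrants_2003 / total_2003:.0%}")  # ~16%
print(f"2013 immigrant share: {immigrants_2013 / total_2013:.0%}")  # ~18%
```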

The percentage of foreign-born scientists and engineers who were employed in 2013 was about 81 percent—the same as their U.S.-born counterparts.

According to the report, 57 percent of immigrant scientists and engineers hail from Asia; this group includes naturalized citizens, permanent residents, and temporary visa holders. Immigrants from the Americas and the Caribbean make up 20 percent, and Europeans make up 16 percent. India was the leading source nation for immigrant scientists and engineers. Between 2003 and 2013, the number of U.S. scientists and engineers who immigrated from India nearly doubled from 515,000 to 950,000.

The report shows that foreign-born scientists and engineers are more likely to earn higher degrees than their U.S.-born counterparts. In 2013, 9 percent of immigrants earned a doctorate compared to 3.8 percent of U.S.-born citizens.

Nearly 15 percent of immigrants earned their highest degree in computer and mathematical science, and over 20 percent earned engineering degrees. For U.S.-born citizens, those shares were about 8 percent and 10 percent respectively.

The percentage of immigrants in the U.S. tech workforce is likely to keep increasing. While talk of a STEM skills shortage in the U.S. could be overblown, tech companies have been pushing for more H-1B visas, and corporate recruiters often cite STEM worker shortages. Last year, President Obama announced immigration plans that make it easier for foreign-born students to stay and work in the U.S.

The challenge might be training and retaining skilled, engaged scientists and engineers. Some foreign policy experts believe that the United States should wage a war for global talent and nab the world’s best scientists and engineers while the U.S. economy is getting stronger.

What do you think? Does the United States need to offer even more H-1B visas? Is that what’s necessary to attract the world’s best talent?

The Google Lunar XPrize Race Is Officially On

Today, the Google Lunar XPrize (GLXP) announced that Israeli team SpaceIL is the first to sign a verified launch contract that covers the first leg of its journey to the moon. The educational nonprofit’s spacecraft is slated to launch on a SpaceX Falcon 9 rocket in the second half of 2017.

This is huge news for the GLXP, which is offering at least US $20 million to the first private team to land on the moon and perform certain tasks. The lunar deadline is currently set at the end of 2017, but a more pressing deadline has been looming for quite some time.

Until now, we were just months away from what could have been the contest’s real expiration date. According to the present rules, if no team showed evidence of a launch agreement by the end of 2015, the contest would be over.

With SpaceIL’s contract, at least one team is now officially in the running for the prize. “It really is the new space race now,” GLXP’s senior director Chanda Gonzales says.

As I explained earlier this year, finding a way to get off the ground has been a big challenge for GLXP competitors. Rocket berths are expensive, and while a team could potentially launch more cheaply as a secondary payload, it’s a tough pitch when you’re proposing to add a spacecraft loaded with potentially explosive rocket fuel.

“It’s more than tricky,” says SpaceIL CEO Eran Privman. “We found it almost impossible.” Privman explains that the team at one point had been working on securing a launch with Russia. “There we managed to get an initial agreement, but it failed due to geopolitical issues.” Attempts to secure a berth as a secondary also failed, he says, so “we decided to change the game.”

In the end, the team signed a contract with Seattle-based launch broker Spaceflight. SpaceIL’s craft will launch in a cluster containing more than 20 payloads, each contracting with Spaceflight. The launch will cost SpaceIL more than US $10 million, Privman says. (It’s not clear how much more than $10 million the actual fee is, but it’s likely a fraction of the oft-quoted “front-door price” for a whole Falcon 9, some $60 million.)

According to the revised GLXP rules laid out in May, any remaining teams who wish to compete must provide notification of a launch contract by the end of next year. But it may not take that long before we see other teams join the starting line.

Last week, aspiring lunar mining company Moon Express announced that it has signed a launch contract with Rocket Lab, a New Zealand-based start-up that’s hoping to drive down the cost of space access. 

The contract is for five launches, totaling about US $30 million, Daven Maharaj, Moon Express vice president of operations, told me on Friday.

Moon Express hopes to launch on Rocket Lab’s Electron rocket, which is expected to begin test launches shortly. Moon Express has reserved two Electron launches in 2017, which could be a good move if there are any issues with the new rocket. “There is definitely some risk there,” Maharaj says. “That’s one of the advantages—by picking two slots we’re basically guaranteeing ourselves a scheduled relaunch.”

The GLXP has yet to receive the launch contract documentation from Moon Express, Gonzales says. When it does, the agreement will be evaluated as the SpaceIL one was, she says, on both financial and technical terms. 

Of course, even with a more established company, no flight can be guaranteed, and no flight date is set in stone. SpaceX is still working on resuming flights following a launch failure in June. “I believe that [the delay] shouldn’t affect the manifest in the second half of 2017,” SpaceIL’s Privman says. “But you know that predicting is very unpredictable.”

Even with these uncertainties, it’s exciting to see the Google Lunar XPrize move into this new stage. The competition was first announced in 2007, and its original deadline for claiming the full top prize was 2012. It’s been a long road. Perhaps one day soon it will end with a few new landers dipping their feet in lunar regolith.

Follow Rachel Courtland on Twitter at @rcourt.

DARPA Wants to Jolt the Nervous System with Electricity, Lasers, Sound Waves, and Magnets

Viewing the body as a chemical system and treating maladies with pharmaceuticals is so 20th century. In 21st century medicine, doctors may consider the body as an electrical system instead, and prescribe therapies that alter the electrical pulses that run through the nerves.  

That’s the premise of DARPA’s newest biomedical program, anyway. The ElectRx program aims to treat disease by modulating the activity of the peripheral nerves that carry commands to all the organs and muscles of the human body, and also convey sensory information back to the brain. 

Yesterday, DARPA announced the first seven grants under the ElectRx program. The scientists chosen are doing fairly fundamental research, because we’re still in the early days of electric medicine; they’ll investigate mechanisms by which to stimulate the nerves, and map nerve pathways that respond to that stimulation. They’re working on treatments for disorders such as chronic pain, post-traumatic stress, and inflammatory bowel disease. 

The proposed stimulation methods are fascinating in their diversity. Researchers will stimulate nerves not only with jolts of electricity but also with pulses of light, sound waves, and magnetic fields.

Three research teams using electrical stimulation will target the vagus nerve, which affects many different parts of the body. IEEE Spectrum explored the medical potential of vagus nerve hacking in a recent feature article, writing:   

Look at an anatomy chart and the importance of the vagus nerve jumps out at you. Vagus means “wandering” in Latin, and true to its name, the nerve meanders around the chest and abdomen, connecting most of the key organs—heart and lungs included—to the brain stem. It’s like a back door built into the human physiology, allowing you to hack the body’s systems.

The light-based stimulation research comes from the startup Circuit Therapeutics. The company was cofounded by Stanford’s Karl Deisseroth, one of the inventors of optogenetics, the new technique that inserts light-sensitive proteins into neurons and then uses pulses of light to turn those neurons “on” and “off.” Under the DARPA grant, the researchers will try to use pulses of light to alter neural circuits involved in neuropathic pain. 

To tweak the nervous system with sound waves, Columbia University’s Elisa Konofagou will use a somewhat mysterious ultrasound technique. In an e-mail, Konofagou explains that it’s already known that ultrasound can be used to stimulate neurons, but with the DARPA grant, she hopes to figure out how it works. Her hypothesis: As ultrasound propagates through biological tissue, it exerts mechanical pressure on that tissue, which stimulates specific mechanosensitive channels in neurons and causes them to “turn on.”

The final project will rely on magnetic fields to activate neurons, using a technique that could be called “magnetogenetics.” An MIT team led by Polina Anikeeva will insert heat-sensitive proteins into neurons, and will then deploy magnetic nanoparticles that bind to the surface of those neurons. When exposed to a magnetic field, these nanoparticles heat up and activate the neurons to which they’re attached. 

Figuring out how to alter the activity of the nervous system with these various tricks will be a pretty impressive accomplishment. But in the DARPA world, achieving that understanding is just step one. Next, the agency wants its grantees to develop “closed-loop” systems capable of detecting biomarkers that signal the onset of disease and then responding automatically with neural stimulation. Spectrum covered the first such closed-loop neural stimulators in a recent feature article, stating:

The goal of all these closed-loop systems is to let doctors take their expert knowledge—their ability to evaluate a patient’s condition and adjust therapy accordingly—and embed it in an implanted device.
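As a rough illustration of that closed-loop idea (a sketch, not any grantee’s actual design), the loop below samples a biomarker, compares it with a threshold, and triggers stimulation automatically; the sensor and stimulator interfaces are hypothetical stand-ins, and the threshold and amplitude are assumed values.

```python
import random
import time

BIOMARKER_THRESHOLD = 0.8  # assumed level above which the device intervenes
STIM_AMPLITUDE_MA = 1.0    # assumed stimulation amplitude, for illustration only

def read_biomarker() -> float:
    """Hypothetical sensor read; stands in for an implanted biochemical or electrical sensor."""
    return random.random()

def stimulate(amplitude_ma: float) -> None:
    """Hypothetical call to the neural stimulator."""
    print(f"stimulating peripheral nerve at {amplitude_ma} mA")

def closed_loop(cycles: int = 10, period_s: float = 1.0) -> None:
    """Sense, decide, actuate: the basic loop a closed-loop neuromodulation device runs."""
    for _ in range(cycles):
        if read_biomarker() > BIOMARKER_THRESHOLD:  # biomarker signals disease onset
            stimulate(STIM_AMPLITUDE_MA)
        time.sleep(period_s)

if __name__ == "__main__":
    closed_loop()
```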

Tunnel Transistor May Meet Power Needs of Future Chips

A new kind of transistor consumes 90 percent less power than conventional transistors, dramatically exceeding a theoretical limit for electronics, researchers say. These findings could one day lead to super-dense low-power circuits as well as ultra-sensitive biosensors and gas sensors, the investigators added.

The relentless advance of computing power over the past half-century has relied on constant miniaturization of field-effect transistors (FETs), which serve as the building blocks of most microchips. Transistors act like switches that flick on and off to represent data as zeroes and ones.

A key challenge that FETs now face is reducing the power they consume. The switching properties of conventional FETs are restricted by a theoretical limit of 60 millivolts per decade of current at room temperature. This limit on the subthreshold swing means that, below threshold, each 10-fold increase in current requires at least a 60-millivolt increase in gate voltage. Lowering the swing would yield better channel control, so switching would require less energy.
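That 60-millivolt figure is the room-temperature thermionic limit, (kT/q)·ln(10), which is easy to verify:

```python
import math

k = 1.380649e-23     # Boltzmann constant, J/K
q = 1.602176634e-19  # elementary charge, C
T = 300.0            # room temperature, K

# Minimum subthreshold swing for a conventional FET: (kT/q) * ln(10), in mV per decade.
swing_mv_per_decade = (k * T / q) * math.log(10) * 1000
print(f"{swing_mv_per_decade:.1f} mV/decade")  # ~59.6, usually rounded to 60
```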


Fujitsu Makes a Terahertz Receiver Small Enough for a Smartphone

It’s a good time to be alive for pixel peepers. TV makers are pushing 4K-resolution sets to replace our present 1080p screens; Apple’s iMacs sport a 5K resolution; and NHK, Japan’s national broadcaster, is testing 8K broadcasting equipment, targeting 2020 and the Tokyo Olympics for its introduction.

To help wireless devices cope with the higher speeds demanded by such applications, Fujitsu has developed a 300-GHz prototype receiver compact enough to fit into a cellphone. Though limited to about 1 meter in range, the company says the device can download 4K and 8K video almost instantly.
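To put “almost instantly” in perspective, here is a back-of-the-envelope estimate; both the link rate and the file size are assumptions chosen for illustration, not figures from Fujitsu.

```python
# Rough download-time estimate. Both numbers are illustrative assumptions.
assumed_link_rate_gbps = 20.0  # hypothetical sustained rate for a 300-GHz link
assumed_movie_size_gb = 10.0   # hypothetical size of a compressed 4K feature film

seconds = assumed_movie_size_gb * 8 / assumed_link_rate_gbps
print(f"~{seconds:.0f} s to pull a {assumed_movie_size_gb:.0f}-GB file "
      f"at {assumed_link_rate_gbps:.0f} Gb/s")  # ~4 s
```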


How Atoms Dance in Dielectrics

Atoms within the tiny crystals in many dielectric and ferroelectric ceramics twist and dance when an electric field is turned on. But until now, nobody could tell exactly how they twisted and danced, even though understanding those movements could be the key to developing more compact, higher performance capacitors and condensers.

In a new Scientific Reports paper, researchers at North Carolina State University—with colleagues from the U.S. National Institute of Standards and Technology and the University of New South Wales—show how new ways of analyzing X-ray diffraction data can reveal the details in a ceramic’s structural response to an electric field.

A ceramic can be a complex jumble of amorphous material that glues together crystalline particles of different sizes in different orientations. Traditionally, scientists probe crystal structures by reading the interference patterns produced by X-rays passing through them. They usually analyze only the primary signal, the strongest bands or brightest rings, which give information on the overall structure. This works well on homogeneous material in regular lattices. In a heterogeneous ceramic, though, with crystals pointing every which way, a lot of detail is buried in the faint secondary signals and diffusion patterns.

The NCSU-led team captured this detail by applying the pair distribution function (PDF). The method, first described in the 1960s but used increasingly over the past decade, lets researchers read all of the signals in a wedge of the diffraction pattern to calculate the length and orientation of the bonds between pairs of atoms. Researchers take snapshots of the ceramic with and without an applied electric field, taking a census of how many microcrystals are in which orientations in each condition.
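In spirit, a pair distribution function is a census of interatomic distances. The toy sketch below builds one directly from a set of atomic coordinates; in a real experiment the PDF is obtained by Fourier-transforming the full diffraction pattern, so treat this only as an illustration of what the function represents.

```python
import numpy as np

def pair_distances(positions: np.ndarray) -> np.ndarray:
    """All unique interatomic distances for an N x 3 array of atomic coordinates."""
    diffs = positions[:, None, :] - positions[None, :, :]
    dist = np.sqrt((diffs ** 2).sum(axis=-1))
    upper = np.triu_indices(len(positions), k=1)
    return dist[upper]

def toy_pdf(positions: np.ndarray, r_max: float = 10.0, bins: int = 200):
    """Histogram of pair distances: a toy stand-in for an experimentally derived PDF."""
    hist, edges = np.histogram(pair_distances(positions), bins=bins, range=(0.0, r_max))
    r = 0.5 * (edges[:-1] + edges[1:])
    return r, hist

# Purely illustrative: a 5 x 5 x 5 cubic lattice with a 4-angstrom spacing.
grid = 4.0 * np.array([[x, y, z] for x in range(5)
                       for y in range(5) for z in range(5)], dtype=float)
r, g = toy_pdf(grid)
print("most populated pair distance below 10 angstroms:", r[g.argmax()])
```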

"A good analogy would be that analyzing the bright rings is like examining a skyscraper from far away and determining that each office is 500 square feet,” said NCSU’s Tedi-Marie Usher in a press release. “However, by also analyzing the weak X-rays scattered from the sample, we can determine that some offices are 400 square feet and others are 600 square feet, and some have the desk on the east side, and others have the desk on the north side."

The research team analyzed three perovskite materials—barium titanate (BaTiO3), sodium bismuth titanate (Na0.5Bi0.5TiO3), and strontium titanate (SrTiO3)—all members of a class of ceramics with useful dielectric, ferroelectric, and piezoelectric properties.

In an electric field, dielectrics become polarized and ferroelectrics reverse their polarity. This segregation—positive charge on one side, negative on the other, nobody in the middle—impedes current flow across the gap, letting energy build up. The material’s relative permittivity (or dielectric constant) is an index of how effectively it can store energy as an electric field. Air and a vacuum have permittivities of 1 (or very close to it). Silicon’s is 12. In barium titanate (which was also the first piezoelectric ceramic identified), the permittivity can range from 1,200 to 10,000 or more.
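To see what those permittivity values mean in practice, here is a quick parallel-plate comparison; the plate area and gap are arbitrary values chosen only for illustration.

```python
EPSILON_0 = 8.854e-12  # vacuum permittivity, F/m

def parallel_plate_capacitance(eps_r: float, area_m2: float = 1e-4, gap_m: float = 1e-6) -> float:
    """C = eps_r * eps_0 * A / d, here for 1-cm^2 plates separated by 1 micrometer."""
    return eps_r * EPSILON_0 * area_m2 / gap_m

for name, eps_r in [("vacuum/air", 1.0), ("silicon", 12.0), ("barium titanate", 1200.0)]:
    print(f"{name:>16}: {parallel_plate_capacitance(eps_r) * 1e9:10.2f} nF")
# The same capacitor stores roughly a thousand times more charge with barium
# titanate between its plates than with air.
```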

As the material polarizes, its atoms reorient themselves in their crystal lattices. In sodium bismuth titanate, for example, bismuth atoms align with the electric field, changing their relationships with the surrounding titanium ions.

“Dipolar effects are known to have large contributions to dielectric permittivity,” said NCSU’s Jacob Jones, the team leader, in a press release.  “The measurement tells us the population of dipoles that are reorienting. We could combine this information with the microscopic spontaneous dipole magnitude and calculate a net contribution to the macroscopic dielectric permittivity.”

Right now, there appears to be no one-size-fits-all mechanism for structural shifts in dielectric ceramics.

“One of the interesting findings here is that each of the three dielectric materials we tested exhibited very different behaviors at the atomic level,” Jones said. “There was no single atomic behavior that accounted for dielectric properties across the materials.”

So there is much more to be learned. Said Jones:

Of immediate interest are a broad range of dielectric materials that exhibit complicated (and elusive) structures. For example, perovskites with components Na0.5Bi0.5TiO3 [sodium bismuth titanate], K0.5Bi0.5TiO3 [potassium bismuth titanate], and Pb(Mg0.33Nb0.67)O3 [lead magnesium niobate] and their solid solutions. These compounds exhibit unique local structures that will respond to field differently. This is a useful technique to reveal interesting physical origins of unique dielectric, piezoelectric, and ferroelectric properties of these materials. We are also interested in extending this technique to an emerging class of high entropy alloys (HEAs) and entropy-stabilized oxides (ESOs), where unique elements may behave differently in response to mechanical stress or electric fields. The unique local behavior of the elements in these multi-element materials is not accessible in traditional diffraction measurements.

How Much Power Will Quantum Computing Need?

Google’s Quantum AI Lab has installed the latest generation of what D-Wave Systems describes as the world’s first commercial quantum computers. Quantum computing could potentially solve certain problems much faster than today’s classical computers while using comparatively less power to perform the calculations. Yet the energy efficiency of quantum computing still remains a mystery.

For now, D-Wave’s machines can scale up the number of quantum bits (qubits) they use without significantly increasing their power requirements. That’s because D-Wave’s quantum computing hardware relies on a specialized design consisting of metal niobium loops that act as superconductors when chilled to a frigid 15 millikelvin (about −273 °C). Much of the D-Wave hardware’s power consumption—slightly less than 25 kilowatts for the latest machine—goes toward running the refrigeration unit that keeps the quantum processor cool. The quantum processor itself requires a comparative pittance.
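A toy model captures that scaling claim: refrigeration dominates, so total power barely moves as the qubit count grows. Only the roughly 25-kilowatt total comes from the reporting above; the per-qubit figure below is an assumed, essentially negligible value chosen for illustration.

```python
# Toy model of D-Wave-style power scaling: the dilution refrigerator dominates,
# and each additional superconducting qubit adds essentially nothing.
FRIDGE_KW = 25.0     # roughly the figure cited for the latest machine
PER_QUBIT_KW = 1e-6  # assumed negligible per-qubit draw, for illustration only

def total_power_kw(qubits: int) -> float:
    return FRIDGE_KW + qubits * PER_QUBIT_KW

for n in (512, 1024, 2048):
    print(f"{n:5d} qubits -> ~{total_power_kw(n):.3f} kW")  # output barely changes
```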


David DiVincenzo on His Tenure at IBM and the Future of Quantum Computing

Theoretical physicist David DiVincenzo is widely viewed as one of the pioneers of quantum computing. He authored a 1996 paper outlining five criteria he predicted would make quantum computing a reality; it has become a de facto roadmap for most of the research in quantum computing since then. In 1998, with Daniel Loss, he proposed using electron spins for storing data as qubits in quantum dots, which might prove to be the best choice for creating a working quantum computer.

In 2010, DiVincenzo was invited by Germany's Alexander von Humboldt Foundation to become Director of the Institute of Theoretical Nanoelectronics at the Peter Grünberg Institute in Jülich, and a professor at the Institute for Quantum Information of RWTH Aachen University. Previously he was a research director at the IBM T.J. Watson Research Center in Yorktown Heights, N.Y.

We met DiVincenzo in his spartan office at the Physikzentrum of RWTH Aachen University, which is located “ten minutes by bicycle” from the Netherlands, where DiVincenzo has made his home.

IEEE Spectrum: You turned to investigating quantum computing while working as a theoretical physicist at IBM. What caught your interest?

DiVincenzo: I became interested in it around 1993. It was not very much of a field at that time, but it was a field. There were two very eminent IBM scientists who had already been involved for much longer: Rolf Landauer and Charles Bennett. Landauer is remembered for his contributions to the fundamental understanding of computing, such as the question of the minimum amount of energy required to perform a computation.

Landauer was quite important in the original discussions of quantum computing because he provided a skeptical point of view.  He thought that the sensitivity to imperfections in matter would be devastating for quantum computing. But he was interested in the concept of error correction that arose at that time and that could be applied to quantum computing.  And this really turned the story around. Bennett was famous for introducing the ideas of quantum physics into information science and cryptography. In 1993, he worked on what is now known as quantum teleportation.

I was fascinated by these developments, and at that time, IBM was flexible enough so that I could just jump in. I started contributing various ideas, and the following year, the Shor Factoring Algorithm was discovered. This made it clear that quantum computing could be done.

Spectrum: So, the research culture at IBM was definitely an important factor in your research career.

DiVincenzo: I would say that the research culture at IBM was always distinct. There was a whole evolution over the decades. For years it was thinking of itself in relation to Bell Labs: Are we as famous as Bell Labs? That’s the history of the 1970s. I joined the lab in the 1980s; I had many friends that were there from the beginning, and I think I had a feeling for what the culture was like. IBM tried to really build up its research in the 1960s. In that period, they were definitely looking at themselves hoping to be another Bell Labs, which was in its heyday. By the 1980s, I felt that at that point they really did not have to worry whether they [were turning out] science of comparable quality to Bell Labs. The cultures were similar; they would take rather young scientists and immediately give them all the resources of an institute, basically, without any of the responsibilities. This is a fantastic model which has proven to be not so sustainable—at least not in the corporate world. [And beginning in the early 1990s, it wasn’t really sustainable within IBM.]

Spectrum: How did this change affect your work at IBM?

DiVincenzo: IBM had a heavy financial crisis in 1993, its most severe one. It had a moment when it was really questioning its whole business model and whether it should be broken up into smaller companies. IBM undertook a whole sequence of different steps, such as getting out of personal computers, and each one made it appear that physics had become less relevant to IBM. The physics department got much smaller that year, and I remained in that smaller department. But we had a fantastic time after that in quantum computing.

Spectrum: So at least the research culture at IBM survived.

DiVincenzo: Here I would say something about the culture of IBM versus Bell. Bell evolved into a very competitive internal culture. People were really knocking against each other. Internal seminars were quite an ordeal because you were subjected to really heavy scrutiny. Internal dealings among scientists at IBM were much more congenial. 

The rest of physics at IBM was suffering. IBM still has a physics department, but at this point almost every physicist is somehow linked to a product plan or customer plan. At IBM, right from the beginning, there was always hope that these physicists who were dreaming up interesting things could actually contribute. Research became more directed as time went on.

Spectrum: It is now five years since you joined two German universities. How would you compare the life of a researcher in Germany—a country traditionally known for its emphasis on academic freedom ever since the 19th century? Do researchers in Germany have the same freedom that researchers had at IBM during the 1960s?

DiVincenzo: No. But I think they are freer and have more flexibility than what you find in the U.S. academic culture. Of course, in the U.S., there is heavy attention given to third-party funding. In Germany, this is not completely true. If you have a chair, you actually do have fixed resources that go with the chair, which is not the case in the U.S. But it is typically not enough to do any major project, and it shrinks over time, so you should connect yourself with some third-party funding. However, there is a pretty strong long-term consensus here that we don’t tinker with science funding too much.

Spectrum:  Basically, future quantum computers might be based on qubits of two types: atoms or ions suspended by laser beams, and ions or electrons trapped in matter, such as in defects or in quantum dots. Which will prevail?

DiVincenzo: We're close enough to the quantum computer that we can kind of foresee its complexity in a classical sense—that is, how much instrumentation is required for this to work as a quantum computer. I think that systems that involve lasers add a really big jump in complexity because they would require a laser system for every qubit. Say we need a million qubits; we will have a system with a complexity well beyond anything that has ever been done in optical science.

Now, for example, with quantum dots there are no lasers in sight.  Everything is done with electronics and at gigahertz frequencies.  You will need controller devices containing transistors that work at 4 Kelvin. That is an interesting challenge. It turns out that some conventional transistors, right out of the foundry, do work at 4 Kelvin. This becomes the beginning of some real scientific research: How do you make a transistor that really functions in a good way at 4 Kelvin without dissipating too much energy? Energy dissipated in the instrumentation close to the quantum system at that temperature could be one of a whole set of challenges that will not be solved easily. My own personal view is that we're a decade or so away from some really new machines that at least will partially fulfill what we've been thinking about since 1993.  

