Tech Talk

Magnetosphere Satellites Launch

Update, 13 March: NASA’s Magnetospheric Multiscale mission had a picture-perfect launch from Cape Canaveral Air Force Station in Florida at 10:44 p.m. ET following a smooth countdown. All four spacecraft that are part of the mission "appear healthy following separation," the agency says.

To solve a mystery concerning powerful geomagnetic storms that can threaten Earth's satellites and power grids, NASA is launching a quartet of spacecraft into orbit on 12 March for a two-year mission to analyze magnetic fields around the Earth.

A geomagnetic storm in March 1989 blacked out the entire Canadian province of Quebec, leaving millions of customers in the dark and damaging transformers as far away as New Jersey. Storms 10 times more powerful, such as the 1859 solar superstorm, are possible.

Every step leading to such intense bursts of space weather is ultimately driven by a mysterious phenomenon known as magnetic reconnection, which occurs in clouds of electrically charged gas known as plasmas. Magnetic fields are trapped inside plasmas, and magnetic field lines can break and reconnect with each other within these clouds, explosively converting magnetic energy into heat and kinetic energy.


Creating Lasers in the Sky

Collaborators working at labs in Russia, Austria, and the United States have succeeded in pumping more than 200 gigawatts of power into a 0.1-millimeter-wide filament formed in the ambient air by a laser. In a paper published in the 17 February issue of Scientific Reports, they describe how they created laser pulses in the mid-infrared part of the spectrum. By making them 100 femtoseconds (10⁻¹³ s) long, they could pack enough energy into these pulses to carve out a filament in the air several meters long. The researchers claim a world record, but they also say they have hit a limit on the amount of power that can be transmitted via the filament because of the CO2 present in the atmosphere.
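
For a rough sense of scale, the pulse energy implied by those figures can be estimated by multiplying peak power by pulse duration. The back-of-envelope sketch below uses only the numbers quoted above and ignores the exact pulse shape:

```python
# Back-of-envelope estimate of the energy carried by one mid-infrared pulse,
# using the figures quoted in the article (ignores the exact pulse shape).
peak_power = 200e9   # watts (200 GW)
duration = 100e-15   # seconds (100 fs)

pulse_energy = peak_power * duration  # joules
print(f"Energy per pulse: {pulse_energy * 1e3:.0f} mJ")  # -> 20 mJ
```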

When lasers first appeared during the 1960s, expectations were high that laser guns would eventually replace rifles. However, laser guns were soon relegated to the world of science fiction: even the needle-thin beams produced by lasers quickly lose their punch as the photons spread out. “If you have a freely propagating laser beam, it will diffract,” says Alexey Zheltikov, a physicist who holds research positions at both Moscow State University and Texas A&M University and who led the research team. “This is a force of nature and there is nothing you can do as long as you are working in linear optics.”

When the intensity of the laser pulses is increased, nonlinear effects become dominant, Zheltikov explains. At high intensities the beam modifies the refractive index of air, and because the intensity is higher in the middle of the beam than at its edges, it creates a narrow, tube-like path with a higher refractive index at its center, bending the light inward.

But if it weren't for another nonlinear effect, there would be no filamentation.

This other effect occurs at even higher intensities, when the beam ionizes the air in its path. The electron density is highest at the center of the beam. It is there that the beam lowers the refractive index and forms a negative lens, or plasma lens. The laser pulses focus and defocus simultaneously, maintaining the stability of the filament. “Filamentation is the balance between two nonlinear phenomena, self-focusing and self-defocusing due to the plasma lens,” says Zheltikov.

To achieve self-focusing, the pulses must carry a minimum amount of power; but too much power results in too much ionization, which disrupts the delicate balance struck by the plasma lens. This excess ionization is more likely to occur with pulses of shorter wavelengths, where photons have more energy and more ionizing power. The researchers found that the amount of power that can be transmitted through a filament created with 1-micrometer pulses is much lower than the power that can be transmitted with 4-µm (mid-infrared) pulses. In fact, the power that can be transmitted through a filament is proportional to the square of the wavelength, so 4-µm pulses can transmit 16 times as much power as 1-µm pulses.
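
That wavelength-squared scaling follows from the standard textbook expression for the critical power for Kerr self-focusing, P_cr ≈ 3.77 λ² / (8π n₀ n₂). The snippet below is an illustrative sketch, not the authors' calculation; the nonlinear refractive index of air used here is a representative value, not one taken from the paper:

```python
import math

# Illustrative sketch of the critical power for Kerr self-focusing in air,
# P_cr ~ 3.77 * lambda^2 / (8 * pi * n0 * n2), which is why the transmittable
# power scales with the square of the wavelength.
n0 = 1.0003   # linear refractive index of air
n2 = 3e-23    # nonlinear refractive index of air, m^2/W (representative value, assumed)

def critical_power(wavelength_m):
    return 3.77 * wavelength_m**2 / (8 * math.pi * n0 * n2)

p_1um = critical_power(1e-6)
p_4um = critical_power(4e-6)
print(f"P_cr at 1 um: {p_1um / 1e9:.0f} GW")
print(f"P_cr at 4 um: {p_4um / 1e9:.0f} GW")
print(f"Ratio: {p_4um / p_1um:.0f}x")  # -> 16x, the factor quoted above
```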

Naturally, Zheltikov and his colleagues wanted the most powerful pulses possible. But pulses with wavelengths over 4 µm were quickly ruled out because CO2 in the atmosphere starts blocking the light. Still, at 4 µm they could go for laser pulses packing more than 200 GW of power—16 times the power with which the group had achieved filamentation in a previous experiment using a 1-µm laser.

However, suitable lasers that could produce 200-GW pulses were simply not available. The researchers solved this problem by starting out with a powerful 1-micrometer laser, then downconverting the wavelength and amplifying the pulses through several stages of optical parametric amplification. They ultimately obtained the required 100-femtosecond pulses with over 200 GW of power. “This is how we were able to achieve an unprecedented peak power in the mid-infrared,” says Zheltikov. Unfortunately, because light absorption by CO2 rules out higher-power pulses, this is the “ultimate experiment in the atmosphere,” Zheltikov adds.
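
Parametric downconversion obeys a simple photon-energy bookkeeping rule: each pump photon splits into a signal photon and an idler photon, so 1/λ_pump = 1/λ_signal + 1/λ_idler. The sketch below applies that relation to assumed wavelengths (a 1-µm pump and a 4-µm mid-infrared output); the exact stage design of the experiment is not specified here:

```python
# Photon-energy conservation in optical parametric amplification:
# 1/lambda_pump = 1/lambda_signal + 1/lambda_idler.
# The wavelengths below are illustrative assumptions, not the exact
# values used in the experiment.
lambda_pump = 1.0e-6   # m, near-infrared pump
lambda_idler = 4.0e-6  # m, desired mid-infrared output

lambda_signal = 1.0 / (1.0 / lambda_pump - 1.0 / lambda_idler)
print(f"Signal wavelength: {lambda_signal * 1e6:.2f} um")  # ~1.33 um
```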

Notwithstanding this limitation, there is still room for interesting applications such as remote sensing. It should be possible to focus the laser pulses in such a way that filamentation occurs several hundred meters up in the atmosphere. There, analysis of the light emitted by the filament would allow specific molecules, such as air pollutants, to be identified.

And then there is the "laser in the sky." In experiments with high-pressure gases, the researchers observed lasing within the filament and the return of the amplified light toward the laser. Created in the atmosphere, this backward signal could, for example, enhance remote-sensing capabilities or even allow the creation of artificial “guide stars” used in the adaptive optics of astronomical telescopes. “I have given a talk about this at a meeting with astronomers, and they are interested,” says Zheltikov.

2015 Hackaday Prize Competition Begins

The folks over at Hackaday—a popular blog about interesting new software and hardware hacks—have just announced the second annual Hackaday Prize competition. Like last year’s event, the grand prize on offer is a trip into space—when such a ticket becomes available to purchase—or US $196,883, a figure whose significance may elude you unless you’ve been studying your Monster-group numbers lately.


Physicists Find Hints of the Higgs in Superconductivity

When physicist Aviad Frydman and his team set out to perform experiments on ultrathin superconductors, they had no idea that they would encounter what might be an analogue of the Higgs boson—the particle whose discovery at the Large Hadron Collider paved the way for the 2013 Nobel Prize in Physics.

A multinational research team recently reported the first-ever observations of what is known as the Higgs mode in superconducting materials. The discovery of the Higgs mode, reported in late January 2015 in Nature Physics, might eventually help physicists gain a different perspective on one of nature's most basic building blocks.


NASA's Dawn Makes History by Orbiting a Dwarf Planet

NASA’s Dawn spacecraft has achieved the distinction of becoming the first to orbit a dwarf planet. The spacecraft recently began circling Ceres, the largest body in the main asteroid belt, which sits between Mars and Jupiter.


Draw Biosensors on Your Skin

Someday soon, on-demand diagnostics could be as simple as doodling on your arm or leg. Special sensor-laden inks would help diabetics monitor their blood sugar and allow people to stay on top of other elements of their body chemistry. The write-once, read-several-times inks could also let homeowners test for toxic pollutants, and help soldiers detect explosives and nerve agents on the battlefield.

Researchers at the University of California, San Diego (UCSD) who developed the inks published their results in the 26 February issue of the journal Advanced Healthcare Materials. They revealed that the main ingredients of these inks are the enzymes glucose oxidase, which responds to blood glucose, and tyrosinase, which responds to common pollutants known as phenols. To make these bio-inks serve as electrodes, they added electrically conductive graphite powder. They also added: chitosan, a clotting agent used in bandages, to help the ink stick to surfaces; xylitol, a sugar substitute, to help stabilize the enzymes during chemical reactions; and biocompatible polyethylene glycol, which is used in several drug delivery applications, to help bind all these ingredients together.

The scientists filled off-the-shelf ballpoint pens with their enzymatic inks. The pens could create a blood glucose sensor simply by drawing glucose-sensitive ink on a transparent, flexible strip that included an electrode. When a blood drop was placed on the sensor, the ink reacted with glucose in the blood and the sensor measured the reaction.

The UCSD team demonstrated that these biosensors were reusable. They only had to wipe the strips clean and draw more ink on them.

But the eye-popping achievement they reported was that the inks could eliminate painful finger pricks to draw blood simply by scrawling sensors directly on the body. The diagnostic body art, they said in the paper detailing their findings, would take glucose readings through the skin and deliver the results via a Bluetooth device. The investigators estimated one pen held enough ink to write the equivalent of roughly 500 blood glucose sensor strips.

The pens could also draw sensors on leaves to detect pollution using phenol-sensitive ink. The scientists noted that the inks could be modified to react with many other pollutants, such as heavy metals or pesticides, and could be drawn on a variety of surfaces, such as smartphone cases and building windows.

The UCSD researchers noted that these pens could help people build useful sensors easily and cheaply, anywhere and anytime they are needed, without having to know in advance where or when that might be. The group’s next steps include connecting the handwritten sensors wirelessly to monitoring devices and analyzing how they perform in difficult conditions such as extreme temperatures, varying humidity levels, and extended exposure to sunlight.

Q&A: Bringing Chappie to Life with VFX Supervisor Chris Harvey

What would existence be like for the first sentient robot? Pretty eventful, according to the action science-fiction movie Chappie, which officially opens today in the United States.

Directed by Neill Blomkamp, who previously helmed 2009’s District 9 and 2013’s Elysium, Chappie is the story of an eponymous upgraded police robot, programmed to learn and experience emotion. Post-upgrade, Chappie must then find his place in the world, while fending off forces determined to destroy him.

The title character is computer-generated, based on an on-set performance by the actor Sharlto Copley. Creating a believable robot that could engage an audience’s sympathies, and a near-future world for it to inhabit, was largely the responsibility of Chris Harvey, Chappie’s visual effects (VFX) supervisor. Harvey talked with me about how the movie’s creators approached portraying futuristic but believable technology:

Stephen Cass: What is the role of a VFX supervisor?

Chris Harvey: I oversee all of the visual effects in the film. So that includes planning, in pre-production, how we are going to shoot things that are going to need effects integrated later. Then during the shoot we’re making sure that we’re shooting it in the way that we need and acquiring the appropriate data. Depending on the director, we can be very involved in what those visual effects will be.… Once you get into post [production], you’re right next to the director: Everything from the effects creative staff is coming to you for approval and then presentation to the director.

SC: How did you work with Blomkamp—what was your brief?

CH: [I started working with Blomkamp] almost 8 months before shooting began… The brief was “we need to create this robot that everyone has to believe is real from a visual perspective [and make him so that people can] connect and relate and emote with him.” That was the big challenge of this movie—we had to take it past just believing he’s present.

SC: How do you go about building a world that’s realistic, but still visually engaging? How do you make extrapolations about how things might look in the future?

CH: Every film is different, because every film has a different visual style. … But the key is always looking to the real world and finding references.… Even if [the thing on screen] is something fantastical that no one’s ever seen before, not having something to ground it in the familiar reality of our world will always make it much harder to accept as being real.

Now, with Chappie we took that right to the extreme. There’s literally no part of Chappie that is just completely invented. Everything about his design is purpose-driven. There’s a function to every component. There’s a ridiculous amount of real-world material built into him. All of his joints and gears and all of that stuff comes from references to real robotics that exist today, so that he feels very, very tangible.

SC: What were the biggest technical challenges in making Chappie feel real and relatable, especially around actors on physical sets?

CH: All of the shooting happened in real locations, so we were on location the whole time. We shot Sharlto in all these scenes, but sometimes [he’s not exactly where he needs to be] so you have to erase him. No one likes to talk about that, but there was an army of people and all they did was just erase Sharlto out of the frame! That’s not new, but it’s certainly something that deserves acknowledgement. The animation was certainly a technical/creative achievement. We did not use motion capture like everyone thinks we did. We used Sharlto as reference, but the animators hand key-framed [the robot] on top of him.…

An interesting challenge was that [Chappie] is one character, but he evolves throughout the course of a scene. So even though you might think of him as one digital model or asset, there are 16 versions of him that we had to track from shot to shot throughout the whole film. [And, for example,] he has a battery light on his chest. And for that battery light we had to make sure it tracked the [correct] percentage from shot to shot.…

One thing we really tried to do a lot, in terms of making [Chappie] feel present in a scene, is physical interaction. We really didn’t shy away from that. If people wanted to hug him or touch him—we really encouraged the actors to interact with him as much as possible. We put these chains on him that could shake around and bump into things.... Those things are very, very hard to achieve, but the more of it that we could do, the more it sells that he’s there.

SC: Once, audiences accepted robots like Robby. Now expectations are higher—are those expectations continuing to rise?

CH: Tremendously so. There’s just such an onslaught of big visual effects films. Huge amounts of eye candy. So people are becoming educated. It’s not just films—that technology is making its way into games and into everyday life. People are much more educated on it and much less accepting to the point where there’s a lot of people out there who literally go in looking for the mistakes. Which I think is a shame! 


New York State Gets Behind Oxyfuel Carbon Capture

In a somewhat startling development, New York’s governor David Paterson announced on June 10 that the state will support construction of an experimental “oxyfired” electric generation plant, in which coal will be burned in an atmosphere of almost pure oxygen, so that nitrogen emissions are eliminated and carbon capture simplified. Sweden’s Vattenfall and France’s Alstom are completing a similar demonstration plant in eastern Germany, as described in the “Winners & Losers” January issue of Spectrum, and Babcock & Wilcox has had a serious oxyfuel R&D program in the United States. But oxyfuel has not been the mainstream approach …
