Tech Talk

FDA Approval For Mini NMR-Based Pathogen Detector

Startup T2 Biosystems in Lexington, Mass., got the U.S. Food and Drug Administration's (FDA’s) nod on Monday for a device that quickly and accurately detects dangerous pathogens. The instrument is based on miniaturized nuclear magnetic resonance (NMR) technology developed by MIT and Harvard Medical School researchers who founded the company eight years ago. The Harvard researchers are also developing the tool for cancer detection.

Today’s culture-based diagnostic tests for viral and bacterial infections are expensive and require a wait of a few days, even with the equipment available at full-scale laboratories. A speedy, portable, sensitive detector could save lives and money.

T2 Biosystems’ fully automated benchtop tool delivers results in three to five hours and is more sensitive than culture-based tests, according to the company. It works like this: A clinician loads a patient’s blood sample into a disposable test cartridge containing a few reagents, inserts the cartridge into the machine, and waits. The machine is capable of detecting a range of biological material including proteins, DNA, small molecules, viruses, and bacteria.

In conventional NMR machines, atoms aligned in a magnetic field are vibrated using a radio-frequency signal in order to measure their oscillation frequency. Those machines require large, powerful magnets.

In T2 Bio’s miniature NMR device, the magnet can be smaller because the sample volume is tiny and because the system measures how quickly the atoms’ vibrations decay instead of their frequency. Specifically, the instrument probes water molecules in a sample. Magnetic nanoparticles coated with antibodies that bind to the target molecule are added to the sample. If the target molecule is present in the sample, the nanoparticles cluster around the target, changing the signal decay rate.
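The decay-rate measurement at the heart of the instrument can be illustrated with a toy calculation. The sketch below is purely illustrative (the relaxation times are made-up values, not T2 Bio’s specifications): it simulates an exponentially decaying NMR signal and recovers its T2 relaxation time with a log-linear fit, so a sample whose nanoparticles have clustered around a target shows up as a shifted T2.

```python
import numpy as np

def t2_decay(t, t2):
    """NMR signal magnitude decaying with transverse relaxation time T2."""
    return np.exp(-t / t2)

def estimate_t2(t, signal):
    """Estimate T2 from a clean exponential decay via a log-linear fit."""
    slope, _ = np.polyfit(t, np.log(signal), 1)  # log(signal) = -t/T2 + const
    return -1.0 / slope

t = np.linspace(0.01, 2.0, 200)  # measurement times, in seconds

# Hypothetical relaxation times: nanoparticles clustering around a bound
# target change the water protons' T2 relative to an unbound sample.
t2_baseline, t2_clustered = 1.0, 0.4

for label, t2 in [("no target", t2_baseline), ("target present", t2_clustered)]:
    est = estimate_t2(t, t2_decay(t, t2))
    print(f"{label}: estimated T2 = {est:.2f} s")
```

A real instrument fits noisy echo-train data rather than a clean curve, but the decision rule is the same: compare the recovered decay constant against the no-target baseline.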

The FDA approved T2 Bio's diagnostic instrument and a test for Candida yeast that runs on the machine. The test can detect five Candida species that cause potentially fatal bloodstream infections. Clinical trials in over 1,500 people showed that the T2 system could detect Candida yeast with 91.1-percent accuracy, a major improvement over blood culture-based tests, which are 60 to 70 percent accurate.

A typical Candida-infected patient stays in the hospital for 40 days at a cost of over $130,000, states the company's website. Doctors usually put patients on antifungal drugs while waiting for blood culture results. Getting a result in a few hours would let doctors quickly deliver the most effective course of treatment. The company mentions a study that shows that providing the right antifungal therapy within 24 hours of symptom onset decreases the length of hospital stay by approximately ten days and decreases the average cost of care by approximately $30,000 per patient.

But the FDA doesn’t recommend replacing blood culture tests yet. Per the agency: “because false positive results are possible with the T2Candida, physicians should perform blood cultures to confirm T2Candida results.”

T2 Bio plans to charge between $150 and $250 for the test, according to an Xconomy report. The challenge the company now faces is getting hospitals to buy its machines and adopt its tests as part of their standard routine.


Google X Balloons Will Circle the Earth to Deliver Internet

A ring of balloons circling the Earth could begin delivering Internet service to mobile phone users in the Southern Hemisphere sometime in 2015. That's the vision of Google's Project Loon, an effort to get billions of people living off the grid in remote regions to join the rest of the world online.


Can Ultrascale Computing Remain Sustainable?

Over the last three decades, we have witnessed an enormous increase in the processing power of supercomputers, with gains in speed of roughly three orders of magnitude every 10 years. From gigaflops in the mid-1980s, computers reached teraflop speeds in the mid-1990s and petaflop speeds at the end of the first decade of this millennium. The next logical step, an exaflop computer, is still quite a distance away; China’s NUDT Tianhe-2 supercomputer, for instance, cranks out just 33.86 petaflops.

The Tianhe-2 runs on 24 megawatts of electricity, and extrapolations show that an exaflop computer built with today’s technology would require the entire output of a power station, which is clearly not sustainable.
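Back of the envelope, that extrapolation looks like this: a naive linear scaling from Tianhe-2’s published numbers, ignoring any future efficiency gains.

```python
# Naive linear extrapolation of Tianhe-2's power draw to exascale.
tianhe2_petaflops = 33.86
tianhe2_megawatts = 24.0

mw_per_petaflop = tianhe2_megawatts / tianhe2_petaflops
exaflop_megawatts = mw_per_petaflop * 1000  # 1 exaflop = 1000 petaflops

print(f"{exaflop_megawatts:.0f} MW")  # roughly 709 MW at today's efficiency
```

Some 700 megawatts is indeed on the order of a sizable power station’s entire output, which is why exascale research focuses on energy efficiency rather than raw scaling.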


Long-Lived Blue OLED Could Lead to Better Displays

Many displays in smartphones and televisions generate red and green light with phosphorescent organic light-emitting diodes but use more energy-hungry fluorescent devices for blue. That's because blue PHOLEDs only last for a couple of days. Now researchers have found a way to extend the lifetime of blue PHOLEDs by a factor of 10, bringing them much closer to commercial use.


Hacking the Cancer Genome

These days, cancer is as much a target for researchers with number-crunching skills and their data-mining tools as it is for scientists in biomedical research labs. And one potentially powerful big-data approach to conquering cancer—which involves discovering clever genetic tricks to make cancer cells kill themselves—has moved in a promising new direction this month.

Scottish, Israeli, and American researchers have reported a new method that analyzes existing cancer gene databases for clues to combinations of genes whose joint inactivation can kill tumor cells while leaving healthy cells untouched. The idea takes advantage of a phenomenon called synthetic lethality, which in oncology was first explored 17 years ago as a potentially fruitful new line of cancer treatment.

One of the new paper’s coauthors, Eytan Ruppin, director of the University of Maryland’s Center for Bioinformatics and Computational Biology, says cancer cells are like regular cells run amok. They, like regular cells, typically have some 10,000 genes. But in cancer cells, many more of these genes are inactive—meaning that for whatever reasons, those genes don’t produce the proteins that a healthy version of the cell would be producing.

Since the 1920s, it’s been observed that all cells in fact have networks of secret self-destruct switches: When both of a key pair of genes become inactive, the entire cell begins the process of shutdown and cell death.

So, Ruppin says, the hurdle that must be overcome to make this prospective broad-spectrum genetic cancer treatment work is discovering and cataloging as many of these secret “synthetically lethal” (SL) gene-pair combinations as nature has provided. Then, when a patient’s cancer is biopsied and its genome sequenced, an oncologist can look to see which genes in the patient’s cancer cells are inactive.

For example, say that the oncologist discovers in an SL database that an inactive gene in a patient’s tumor (call it Gene A) happens to have a corresponding synthetically lethal partner gene (call it Gene B).

In this case, then, a drug that inactivates Gene B will trigger the cell death process in the tumor but not in the person’s healthy cells. (Depending on what Gene B does, there might also be side effects from switching it off. But so long as Gene A remains active throughout the rest of the person’s body, those side effects should not include cell death.)

Ruppin and his collaborators used a clever data mining technique to discover more than a thousand candidate SL gene combinations. They plumbed the U.S. National Cancer Institute’s Cancer Genome Atlas, which itself contains thousands of genomes of different biopsied tumor samples.

They then ran searches for various inactive genes. So, for the sake of example, say they found some of the cancers in the database with Gene X inactivated. And some of the cancers in the database had Gene Y inactivated. If Genes X and Y don’t form an SL pair, there should then be plenty of examples in the database of cancers where both X and Y were inactive. However, if Genes X and Y do form an SL pair, then there should be almost no examples of tumors in the database where those two genes are both inactive.

“You would have expected them to be inactive together at a certain rate, given their individual inactive frequencies,” Ruppin says. “But when you look at the data, you find that they are never inactive together.” That, he adds, “is a very strong indication that they are synthetically lethal. Because whenever they were inactive together, they were actually eliminated from the population. Because these cells died.”
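The statistical idea Ruppin describes can be sketched in a few lines of code. The following is a simplified illustration, not the group’s actual pipeline: given a binary matrix recording which genes are inactive in which tumors, it flags pairs that are co-inactivated far less often than their individual inactivation frequencies would predict. The gene names and thresholds are invented for the example.

```python
import numpy as np

def candidate_sl_pairs(inactive, gene_names, max_ratio=0.1):
    """
    Flag gene pairs whose observed co-inactivation across tumors is far
    below what their individual inactivation frequencies would predict.

    inactive: boolean array of shape (n_tumors, n_genes);
              True means the gene is inactive in that tumor.
    """
    n_tumors, n_genes = inactive.shape
    freq = inactive.mean(axis=0)  # per-gene inactivation frequency
    pairs = []
    for i in range(n_genes):
        for j in range(i + 1, n_genes):
            expected = freq[i] * freq[j] * n_tumors  # count if independent
            observed = np.sum(inactive[:, i] & inactive[:, j])
            # Require enough expected co-occurrences to make absence meaningful.
            if expected >= 5 and observed <= max_ratio * expected:
                pairs.append((gene_names[i], gene_names[j]))
    return pairs

# Toy data: genes X and Y are each often inactive, but never together;
# gene Z is inactivated independently of both.
rng = np.random.default_rng(0)
n = 500
x = rng.random(n) < 0.4
y = ~x & (rng.random(n) < 0.5)  # Y is only ever inactive where X is active
z = rng.random(n) < 0.4
data = np.column_stack([x, y, z])

print(candidate_sl_pairs(data, ["X", "Y", "Z"]))  # → [('X', 'Y')]
```

The real analysis works on thousands of tumor genomes and uses proper statistical tests rather than a fixed ratio threshold, but the core signal is the same: pairs that are conspicuously never inactive together.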

Your TV Will Know You Better Than You Do

With satellite, cable, and terrestrial TV stations broadcasting in the hundreds and Internet-based entertainment content companies also competing for viewers’ attention, finding something to watch is, strangely, a growing challenge. To help simplify the task, researchers at Japan’s public TV and radio broadcaster Nippon Hoso Kyokai, better known as NHK, plan to begin testing technology to automatically assess in real time a viewer’s interest in a TV program or video and then suggest other programs to watch based on the results.


“Twisted” Radio Beams Data at 32 Gigabits per Second

A team led by engineers at the University of Southern California has sent multiple channels of data over a single frequency by twisting them together into a beam resembling a piece of fusilli pasta. By combining several polarized beams carrying information into a single spiraled beam, the team was able to send up to 32 gigabits per second across 2.5 meters of open air, a rate around 30 times as fast as an LTE wireless connection.


Can Japan Act Like Silicon Valley?

Photo: John Boyd
Stanford professor Richard Dasher.

During the 1970s and ’80s, Japanese electronics manufacturers like Toshiba, NEC, Hitachi, Sony, and Panasonic reigned supreme in memory chips, displays, and consumer electronics. Then along came the Korean chaebols led by Samsung, followed by smaller, nimbler Taiwanese manufacturers, who together toppled the Japanese giants, just as the Japanese had previously dethroned U.S. rivals such as Fairchild, RCA, Zenith, and Motorola.

But whereas the United States’ strength in entrepreneurship has helped it recover and move on to create new game-changing technologies, Japan is still struggling to get out of its stagnant rut.

One way Japanese corporations can do this, suggests Richard Dasher, director of Stanford University’s US-Asia Technology Management Center, is to take a leaf out of Silicon Valley’s playbook and adopt more open innovation, rather than continuing to rely on their primarily closed, internal R&D systems.

Dasher, speaking to the foreign press in Tokyo on 11 September, explained that Silicon Valley-style open innovation emerges in an advanced economy after major corporations have consolidated their position at home and overseas. At this stage of economic development, entrepreneurial types become dissatisfied with the opportunities available in a big company and look to work for themselves to bring new products and ideas to market—perhaps disrupting the status quo in the process.

At the same time large companies begin to fear stagnating and losing competitiveness and smart employees. But given that a typical company spends 90 percent of its R&D budget on development, “for the last 25 to 30 years in the U.S., companies have been looking outside to get new ideas that will integrate with what they have in the company in order to improve the pipeline for new ideas,” says Dasher. As a consequence, some 90 percent of successful U.S. start-ups enter the market not via an initial public offering, but rather through acquisition by large corporations.

By contrast big Japanese companies tend to look for external partners to fill a particular niche, which is more like outsourcing.

As for hallmark Japanese management characteristics such as the lifetime-employment system, which served the country so well during its high-growth and consolidation years, it now “looks like an inflexible labor market” when you have to move “even faster in an economy where there’s much more disruption,” notes Dasher.

There are other hindrances as well, such as the superior status employees of a major company may exhibit towards staff of a small acquisition. Even Japan’s famed customer service can become a problem, “for it can turn the company into something that’s reactive,” says Dasher. “If the customer doesn’t already want it, we won’t bother with it.”

On the other hand, he notes that the Japanese government has been laying the foundations for open innovation for the past two decades. He points to the government-university-industry consortiums, the pressure being put on universities to become more innovative, the changes being made in work regulations and labor policy, and the encouragement of small- and medium-sized businesses to compete for government contracts. Some Japanese companies have also begun backing venture-capital-funded firms, one example being the $300 million investment in World Innovation Lab, which focuses on start-ups both in Japan and in Silicon Valley.

“If Japanese companies get really hungry again and regain their drive … they will be powerhouses,” says Dasher. “There is a lot of good technology. There are a lot of good people—you just have to incentivize them in the right way.”

Unemployment For Engineering PhDs Lower Than National Average

Doctoral degrees are a major undertaking. A new NSF report indicates the payoff: a PhD in science or engineering makes you much more employable than someone without one.

The unemployment rate for those in the United States with engineering doctoral degrees was 1.9% in February 2013. That’s less than a third of the 6.3% unemployment rate of the general population 25 years of age and older.


Tech Talk

IEEE Spectrum’s general technology blog, featuring news, analysis, and opinions about engineering, consumer electronics, and technology and society, from the editorial staff and freelance contributors.
