Tech Talk

CESAsia: The Consumer Electronic Association’s Big Gamble

Once upon a time, in the 1980s and 1990s, there was a show called COMDEX. It was the show to attend if your product employed any kind of serious electronics. Then, the Consumer Electronics Show ate COMDEX’s lunch; the last COMDEX was held in 2003. Since then, CES has ruled supreme: 160,000 people from all over the world attended last January’s show, which had 52,000 exhibitors.

Now, perhaps as something of a pre-emptive strike against any would-be attempts to eat its lunch, the Consumer Electronics Association, which owns CES, has launched a spinoff show in Shanghai, CESAsia. The attraction of locating CESAsia in China is obvious: not only is China where most consumer electronics are manufactured, it’s also where these devices are finding a growing retail market.

Gary Shapiro, president and CEO of the Consumer Electronics Association, anticipates a 5 percent jump in the size of China’s domestic consumer technology goods market, from US $268 billion in 2014 to $281 billion in 2015.

Still, CESAsia is starting off small, with 15,000 attendees and around 250 exhibitors. In fact, there are fewer Chinese companies here in Shanghai than can be found at the Las Vegas show, where small outfits selling every part of the great consumer electronics supply chain crowd into giant halls. So, which companies are hawking their wares at CESAsia? Shapiro says that the exhibitors that are present are a “curated” group, consisting of only those that were “truly innovative or have a significant brand.”

Despite these relatively humble beginnings, Shapiro believes that great things are in store for CESAsia. He says that he expects the show to grow to rival the CES Las Vegas show. (Whether or not both shows could co-exist at similar scales was not addressed.) IEEE Spectrum will be here throughout this inaugural CESAsia, bringing you the best from the show.

Organic Electrochemical Transistors for Reading Brain Waves

A new type of implantable electrode can get much more sensitive readings of brain waves and cost substantially less, according to a scientist studying the device. Organic electrochemical transistors (OECTs) consist of conductive polymers and liquid electrolytes, which could make for an easy interface between, say, the surface of the brain and conventional silicon electronics.

Machine Learning Could Predict Outbreaks by Identifying Dangerous Rodents

Outbreaks of zoonotic diseases follow a depressing pattern: Somewhere, people come in contact with an animal bearing a disease that’s compatible with human biology. These infected people return to their communities and spread the illness.  Once public health authorities get wind of the outbreak, an all-out scramble begins to determine where the disease came from and how it’s spreading.

A team of computer modelers hope their research will put an end to that reactive model. They want to predict outbreaks, and hope to help prevent crises like the Ebola epidemic that ravaged West Africa in the past year.

Disease ecologist Barbara Han and her colleagues have taken a big data approach to identify animals that serve as disease reservoirs—animals that harbor viruses or bacteria that can be transmitted to humans. In a study published last week, the scientists fed information about all 2277 known rodent species into their computer models, and used machine-learning algorithms to identify more than 50 species that may be disease reservoirs.

The scientists drew information from several sources, including the massive PanTHERIA database, which lists what’s known of mammalian species’ physiology, behavior, range, social structure, and so forth. PanTHERIA is a “painstaking collection” of data from thousands of individual field studies, Han tells IEEE Spectrum.

Although the PanTHERIA compilation will probably never be comprehensive, the researchers’ computer models could work with the incomplete data set. “This was one strength of this approach,” says Han. “If you tried to wait until you know everything, you’d be paralyzed.” The researchers also used public health databases of species that harbor zoonotic diseases.

The researchers first used information about known rodent disease reservoirs to train their computer model. “What the algorithm is doing is picking out the key features that repeatedly show themselves to be predictors of a species being a disease reservoir,” Han explains.

Once the model had created that “profile” of a disease-bearing rodent, the researchers tested its ability to distinguish between reservoir and non-reservoir species. The model’s 90 percent accuracy rate gave them confidence to try it with rodent species whose reservoir status was unknown. By identifying more than 50 new species as highly likely to carry zoonotic diseases, the researchers have provided hypotheses that field researchers can test. “Our predictions are a jumping off point,” Han says. A couple of voles and a grasshopper mouse are at the top of the suspect list. 
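
The train-then-screen workflow described above can be sketched in miniature. Everything below is hypothetical for illustration: the species traits, labels, and the simple threshold "model" stand in for the real study's 2,277 species and its boosted-tree algorithms.

```python
# Toy sketch of the reservoir-prediction workflow: build a "fast-paced
# life" profile from labeled species, then screen an unlabeled one.
# All trait values and labels below are invented for illustration.

def fast_pace_score(lifespan_yr, litters_per_yr, maturity_days):
    """Higher score = faster-paced life history (shorter life,
    more litters per year, earlier sexual maturity)."""
    return litters_per_yr / (lifespan_yr * maturity_days / 100.0)

# (lifespan in years, litters per year, age at maturity in days, is_reservoir)
training = [
    (1.5, 5.0,  40, True),   # hypothetical vole-like species
    (2.0, 4.0,  50, True),
    (8.0, 1.0, 300, False),  # hypothetical slow-living species
    (6.0, 1.5, 250, False),
]

# "Train": place the decision threshold between the two classes' scores.
res_scores = [fast_pace_score(l, n, m) for l, n, m, r in training if r]
non_scores = [fast_pace_score(l, n, m) for l, n, m, r in training if not r]
threshold = (min(res_scores) + max(non_scores)) / 2.0

def predict(lifespan_yr, litters_per_yr, maturity_days):
    return fast_pace_score(lifespan_yr, litters_per_yr, maturity_days) > threshold

# "Test": accuracy on the labeled data, then flag an unlabeled species.
correct = sum(predict(l, n, m) == r for l, n, m, r in training)
accuracy = correct / len(training)
print(accuracy)                  # 1.0 on this tiny toy set
print(predict(1.2, 6.0, 35))     # fast-paced unknown species gets flagged
```

The real models weigh dozens of traits at once, but the shape of the pipeline is the same: learn a profile from known reservoirs, validate it, then rank the unknowns.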

Interestingly, the model didn’t pick out species that were closely related to each other. Instead, the most predictive feature was having what Han calls “a fast-paced strategy for life.” These are short-lived animals that reach sexual maturity quickly, and then reproduce prolifically.

This jibes with previous scientific findings about these fast-paced animal species and their immune systems. Such species may “allocate their resources to reproduction,” Han says, and have an immune system that isn’t very effective, allowing them to harbor diseases. Those diseases may not be of much concern to the species, as they’re “more concerned with knocking out as many babies as possible,” Han explains.  

It seems that Han’s computer models have given humanity a good tip: Beware of animals that live fast and die young. 

What Color Will Your Next Smartphone Be?

Whether smartphone owners prefer metallic tones or brighter, eye-catching colors, there’s a good chance that AkzoNobel has it. The Dutch multinational company has become a leading player in designing device coatings for huge consumer electronics brands such as Samsung. Recently, AkzoNobel unveiled the 10th edition of its Design Trend report that analyzed past color trends and showed off new designs reflecting the dominant color trends for the next generation of devices.

Why IoT Needs 5G

When 5G, the fifth generation of wireless communications technology, arrives in 2020, engineers expect that it will be able to handle about 1000 times more mobile data than today’s cellular systems. It will also become the backbone of the Internet of Things (IoT), linking up fixed and mobile devices—vending machines and cars alike—becoming part of a new industrial and economic revolution, some say. A new architecture, new communication technologies, and new hardware will make this transformation possible. A research team comprising Zhiguo Ding at Lancaster University and researchers at China's Southwest Jiaotong University took stock of the recent research and future needs of 5G in a review published last month in Science China [pdf]. In a Skype interview, Ding spoke to Spectrum about his views on how 5G will take shape in the near future.

Spectrum: 4G is now being deployed in many countries, but is it already out of date?

Ding: Actually, 4G is good for now. However, if you look at it in five or ten years, 4G will obviously not be able to meet the requirements of new applications coming up in the next few years. With 5G we will increase the data rate, reduce the end-to-end latency, and improve coverage. These properties are particularly important for many applications related to the IoT. One example is emerging autonomous cars and intelligent transportation, for which low latency is essential. Another example is interactive mobile games, where a lot is happening and which are really bandwidth hungry. Unfortunately, current 4G cannot support them.

Spectrum:  So  5G will play a fundamental role in the Internet of Things?

Ding: I think the Internet of Things will be the ideal application for 5G. What currently stands in the way of the IoT are disconnected systems. For example, we have RFID, we have short-range communications techniques, UWB, Bluetooth, and so on, and this could be a problem in the future if we talk about a bigger picture like a smart city, where a unified framework for seamless connection is required. 5G is a good opportunity to provide this unified framework.

Spectrum: How will 5G deal with the huge number of devices connected to the IoT?  Will there be sufficient bandwidth?

Ding: The previous 1G-4G systems rely on so-called orthogonal multiple access. Take time-division multiple access, used by 2G, as an example: We slice one second into a lot of time slots of short duration. We then allocate one particular time slot to each user, and one user cannot access a channel allocated to others. Such orthogonal multiple access will be difficult to sustain for future IoT applications. We will have a lot of devices, and we would have to allocate dedicated time slots to each of them. But in the end this is a luxury we cannot afford, since the number of available time slots and bandwidth resources will be insufficient. This is why orthogonal multiple access won't work for 5G.
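
The exhaustion problem Ding describes can be made concrete with a minimal sketch of TDMA-style orthogonal allocation. The slot counts and device numbers below are illustrative, not drawn from any real standard.

```python
# Minimal sketch of orthogonal (TDMA-style) allocation: each frame has a
# fixed number of time slots, and every user gets a slot to itself.
# Slot counts and device counts are illustrative.

def allocate_tdma(devices, slots_per_frame):
    """Give each device its own slot; return None once slots run out."""
    if len(devices) > slots_per_frame:
        return None  # orthogonal access cannot admit more users than slots
    return {dev: slot for slot, dev in enumerate(devices)}

phones = ["phone-%d" % i for i in range(8)]
print(allocate_tdma(phones, 8) is not None)   # 8 users, 8 slots: fits

# An IoT deployment with thousands of sensors overwhelms the frame:
sensors = ["sensor-%d" % i for i in range(10_000)]
print(allocate_tdma(sensors, 8))              # None: no slots left
```

Every admitted user gets interference-free service, but capacity is capped at one user per slot, which is exactly the luxury a dense IoT deployment cannot afford.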

Currently there is a lot of research exploring how we can develop non-orthogonal multiple access by putting a number of users into the limited bandwidth channels. Ideally, non-orthogonal multiple access can strike a better trade-off between system throughput and user fairness. Of course there is interference between users, which means that some users may experience low data rates. But interestingly, the IoT includes many devices that should be served in a timely manner at low data rates. One example is wireless healthcare, where wearable devices (heart monitors, biosensors, etc.) need to send patient data promptly to hospital servers, but the data rates these devices use are not likely to be high. By using non-orthogonal multiple access, we can squeeze a lot of IoT users and devices with different quality-of-service requirements into the same time slot or frequency channels. In this sense, the concept of non-orthogonal approaches is very exciting and perfect for the Internet of Things.
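
One common realization of the sharing Ding describes is power-domain NOMA, sketched below. The base station superposes two users' signals on the same channel, giving more power to the user with the weaker channel; the stronger user cancels the weak user's signal (successive interference cancellation) before decoding its own. The channel gains and power split here are illustrative numbers, not values from the review.

```python
import math

# Power-domain NOMA sketch: two users share one channel.
# Rates come from the Shannon formula with noise power normalized to 1.

def shannon_rate(signal, interference, noise=1.0):
    """Achievable rate in bits/s/Hz given signal and interference power."""
    return math.log2(1.0 + signal / (interference + noise))

total_power = 10.0
g_near, g_far = 4.0, 0.5   # channel gains: near (strong) and far (weak) user
a_far = 0.8                # fraction of power given to the weak (far) user

# Far user decodes its own signal, treating the near user's as interference.
rate_far = shannon_rate(a_far * total_power * g_far,
                        (1 - a_far) * total_power * g_far)

# Near user first cancels the far user's signal (SIC), then decodes
# its own signal interference-free.
rate_near = shannon_rate((1 - a_far) * total_power * g_near, 0.0)

print(round(rate_far, 2), round(rate_near, 2))
```

Both users transmit in the same slot: the far user gets a modest but timely rate, the near user a high one, which is the throughput-versus-fairness trade-off mentioned above.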

Another way to illustrate the benefit of breaking the orthogonality of multiple access is to view non-orthogonal multiple access as a special case of cognitive radio technologies. Currently, we allocate a single bandwidth, or channel, to a user, and we are not able to reuse it because that user is occupying the channel. With cognitive radio communications we can admit new users into this channel. If these users have good connections to the base station, we can realize a large data rate. Of course this will cause some performance degradation for the initial users, but such degradation can be insignificant if those initial users have poor connections or if a careful power-control mechanism is applied among the users.

Spectrum:  How is 5G going to deal with the "spectrum crunch"—the available RF bands are becoming full?

Ding: To solve the spectrum crunch we need a combination of many technologies. One way is to improve the efficiency of using the existing available bandwidth. In this sense, we can apply non-orthogonal multiple access, massive MIMO, cloud radio access networks, full duplexing, and so on.

Another way is to go to shorter, millimeter wavelengths, at 60 or 90 GHz, where more spectrum is available for telecommunications. There are some challenges here. For example, the higher the frequency, the more the signal is attenuated by the atmosphere, and this rules out long-distance transmission. In addition, there is the shadowing problem: you need an intact line of sight between the transmitter and receiver. This problem can be solved with multiple antennas; you have backup links between the transmitter and receiver even if one link is blocked. It is worth pointing out that millimeter-wave communications are promising for many applications in the IoT, since sensors might have line-of-sight connections and the distances among sensors might not be large.
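
The attenuation point can be quantified with the standard free-space path loss formula, FSPL(dB) = 20·log10(4πdf/c). The distance and band choices below are illustrative, and this captures only the free-space term, not the extra atmospheric absorption that is especially severe near 60 GHz.

```python
import math

# Free-space path loss comparison: a conventional microwave band
# versus a millimeter-wave band, at the same distance.

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in decibels."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

d = 100.0  # meters (illustrative link distance)
loss_wifi = fspl_db(d, 2.4e9)   # 2.4 GHz
loss_mmw  = fspl_db(d, 60e9)    # 60 GHz millimeter wave

# Moving from 2.4 GHz to 60 GHz costs 20*log10(60/2.4), roughly 28 dB,
# of extra free-space loss at the same distance.
print(round(loss_mmw - loss_wifi, 1))
```

That ~28 dB penalty, before atmospheric absorption is even counted, is why millimeter-wave links favor short, line-of-sight hops like the sensor-to-sensor ones Ding mentions.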

Spectrum:  What are the next steps?

Ding: The timeline for the development of 5G has not been officially confirmed. It is widely expected that formal discussion as well as standardization activities will start next year, and commercial deployment is expected to happen in 2020. Currently, industry and academia are working together to identify which standards and techniques should be adopted and which should not.

This story was corrected on 22 May 2015 to fix a photography error.

Monolithic 3-D IC Topped With Solar Cell for Internet of Things

A laboratory leading the charge toward monolithic 3-D ICs has now added ambient-light energy harvesters to its integration repertoire. Engineers at Taiwan's National Nano Device Laboratories (NDL) hope to use the integrated photovoltaic component to create a convenient, mobile, and durable Internet-of-Things chip.

The IoT chip is a 3-D stack that includes logic, a type of nonvolatile memory similar to flash, and SRAM, all built atop one another. Usually, ambient-light energy harvesters are placed beside other chips on a circuit board. But building the harvester right on top of a monolithic three-dimensional integrated circuit (3D IC) saved a lot of space, Chang-Hong Shen, director of the lab’s emerging-device division, said at a news conference in Taipei in March. “The integration reduced the size of the device by 60 percent,” he said.

It also saves energy, because the distance electricity must travel between the energy harvester and the transistors it powers was reduced 1,000-fold, from the order of millimeters to that of micrometers, Shen said. The light-energy-harvester portion consists of layers of silicon and germanium that produce an output voltage greater than 1 volt and generate about 7 milliwatts per square centimeter in outdoor light. (It generated about 20 microwatts per square centimeter indoors.)
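
A back-of-the-envelope calculation using the output figures quoted above shows what those power densities buy. The harvester area and the one-hour harvesting window are assumptions for illustration, not NDL's published specifications.

```python
# Rough harvested-power sketch using the article's quoted figures.
# The harvester area and duty window below are assumed for illustration.

OUTDOOR_W_PER_CM2 = 7e-3    # ~7 mW/cm^2 in outdoor light (from the article)
INDOOR_W_PER_CM2  = 20e-6   # ~20 uW/cm^2 indoors (from the article)

area_cm2 = 0.25             # hypothetical 5 mm x 5 mm harvester on the chip

outdoor_power_w = OUTDOOR_W_PER_CM2 * area_cm2
indoor_power_w = INDOOR_W_PER_CM2 * area_cm2

# Energy collected during one hour of sleep-mode harvesting indoors:
indoor_energy_j = indoor_power_w * 3600

print(outdoor_power_w, indoor_power_w, indoor_energy_j)
```

Even indoors, microwatts trickling in around the clock can keep a duty-cycled sensor's sleep current covered, which is consistent with Chyi's point that the harvester extends charge cycles rather than replacing a main supply.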

“We just collect ambient optical energy, which could have been wasted, without the demand for more sources of energy in the environment,” Shen said. “The energy collection can still be conducted when the compact chip is in sleep mode.”

According to Jen-Inn Chyi, senior vice president of Taiwan’s National Applied Research Laboratories, the technology can be used in a wide range of IoT applications. “It particularly suits some chips with simple functions. It might not be able to replace main power supply of chips. But it can certainly help prolong a charge cycle, further extending the lifespan of chips,” Chyi said.

According to Shen, light-powered IoT chips deployed outdoors could detect a structure’s seismic response or sense nearby movement and smoke. The technology is also well suited to wearables, such as smart watches. But powering energy-intensive smartphones is beyond this technology’s reach, Shen said.

This post was corrected on 21 May 2015.

Tunable Liquid Metal Antennas for Tuning in to Anything

Tuning in is getting to be a complicated thing. The Internet of Things will need more microwave bands with shorter wavelengths. Cell phones already need to link to GPS and Wi-Fi services on top of 4G and other cellular networks. And in the future they’ll likely also have to contend with millimeter-wave bands for 5G services. All of those need antennas of different lengths and shapes to accommodate the sometimes widely spread wavelength bands.

Monopole antennas, consisting of a single conducting rod, transmit maximum power when their length corresponds to half the wavelength of the RF signal, but for devices operating at different wavelengths this becomes a problem.  "The present solution is to have a switchable filter bank along with switchable and/or multi-band antenna,” says Jacob Adams, an engineer at North Carolina State University.  “These solutions take up a lot of space and a single widely tunable element has the potential to replace several of these fixed components."  He and colleagues describe in today's issue of the Journal of Applied Physics just such an element: a liquid metal antenna that can continuously adapt to different wavelengths by changing its length inside a capillary.  
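
The scaling behind that resonance condition is easy to compute: the optimal element length varies inversely with frequency, so widely separated bands demand very different lengths. The band choices below are typical examples picked for illustration.

```python
# Half-wavelength element lengths across the bands a phone must cover,
# following the resonance condition stated in the article.

C = 299_792_458.0  # speed of light, m/s

def half_wave_length_cm(freq_hz):
    """Element length (cm) equal to half the signal's wavelength."""
    return (C / freq_hz) / 2 * 100

for name, freq in [("GPS L1 (1.575 GHz)", 1.575e9),
                   ("Wi-Fi (2.4 GHz)", 2.4e9),
                   ("mmWave (60 GHz)", 60e9)]:
    print(name, round(half_wave_length_cm(freq), 3), "cm")
```

The spread, from centimeters at GPS frequencies down to millimeters at 60 GHz, is why a single fixed element cannot serve every band, and why a liquid metal column whose length changes inside a capillary is attractive.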

Google’s Patent Portal Is Closing Fast

If you want to sell Google your patent, you’ve got to move fast. The company’s Patent Purchase Promotion ends in little more than a week.

Announced in a Google blog post on 27 April, the promotion opened a window of time, starting 8 May, during which U.S. patent holders can put a price on their property, fill out a form, submit it to Google, and see what happens. The submission window closes 27 May.

Ion Electrospray Engines Could Take Cubesats to the Moon and Beyond

CubeSats are one of the cheapest, most efficient ways to get to space. Each CubeSat unit measures just 10 centimeters on a side, which is usually enough room for solar panels, communications equipment, and a small science payload. It isn’t enough room for an engine, and generally, most CubeSats are dumped into orbit and left to fend for themselves, tumbling aimlessly until drag pulls them to earth after a few months or so. This makes them cheap for a spacecraft (usually a little over $100,000 each including launch costs), but places rather severe limits on what they’re able to accomplish.

In 2013, NASA funded three different groups to develop small, highly efficient propulsion systems specifically designed to enable spacecraft like CubeSats to orient themselves, maneuver, and even change their own orbits. The propulsion technology that NASA is interested in is called ion electrospray, and MIT’s prototype is a modular, eight-thruster unit just 21 millimeters thick that can change the velocity of a CubeSat by a staggering 100 meters per second.
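
The 100 m/s figure can be sanity-checked with the Tsiolkovsky rocket equation, Δv = g₀·Isp·ln(m₀/m₁). The specific impulse and mass values below are plausible assumptions for an electrospray-propelled CubeSat, not MIT's published numbers.

```python
import math

# Tsiolkovsky rocket-equation sketch: how much delta-v a small
# high-Isp thruster delivers. Isp and masses are illustrative.

G0 = 9.80665  # standard gravity, m/s^2

def delta_v(isp_s, wet_mass_kg, dry_mass_kg):
    """Ideal velocity change from the rocket equation."""
    return G0 * isp_s * math.log(wet_mass_kg / dry_mass_kg)

cubesat_dry = 1.30   # hypothetical 1U CubeSat dry mass, kg
propellant = 0.010   # hypothetical 10 g of ionic-liquid propellant
isp = 1500.0         # electrospray thrusters reach Isp on this order, s

dv = delta_v(isp, cubesat_dry + propellant, cubesat_dry)
print(round(dv, 1), "m/s")
```

Under these assumptions, roughly 10 grams of propellant is enough for on the order of 100 m/s, which is what makes a 21-millimeter-thick thruster module plausible for orbit changes.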

Can Hackers Commit the Perfect Murder By Sabotaging an Artificial Pancreas?

Robotic systems are, at last, beginning to take over some of the burden of managing the fluctuations in blood glucose in patients with Type 1 diabetes. But a new report warns that as the systems get adopted more widely, the risk of criminal eavesdropping and sabotage will also increase.

The report, by Yogish C. Kudva and colleagues at the Mayo Clinic in Rochester, Minn., and at the University of Virginia in Charlottesville, appears in Diabetes Technology & Therapeutics.

“Deliberately wrong (high) glucose data sent to an unprotected mobile computing platform may cause the algorithm to deliver excessive insulin, whereas incorrect low glucose values could cause it to deliver too little,” the researchers write.

Make the machine administer too little insulin, and the blood-glucose level may rise high enough to send the patient into a ketoacidosis coma. Make it administer too much, and the glucose falls until the brain fails, causing the patient to faint or even die. To bad guys, it might seem like the way to commit the perfect murder.

Patients are particularly vulnerable to low blood glucose when sleeping. In fact, heading off such nighttime episodes is a chief selling point for the most advanced commercial artificial pancreas, the MiniMed 640G, which was recently approved in Australia and Europe.  If its algorithm predicts that a sleeping patient’s blood glucose is about to fall too far, the machine will sound an alarm; if the patient still doesn’t respond, the machine will stop the flow of insulin.

The researchers note that standards are already being developed to assure that all the parts of the artificial pancreas—the glucose sensors, the insulin pump, and the computer—be interoperable. They say these standards also ought to include provisions for encryption and other security measures. They also suggest that the system seek a second opinion by submitting its operations to the inspection of “intelligent safety algorithms, informed by additional data such as insulin delivery history.” 

Such a safety algorithm might suspect foul play if something extraordinary seemed to happen—for instance, if the sensors reported a sudden rise in blood sugar in the middle of the night, long after the patient’s final meal of the day. This could raise a red flag, inducing the artificial pancreas to wake the patient up to make an independent test of his blood-glucose level.
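
A minimal sketch of such a red-flag check might look like the function below. The thresholds and the function itself are hypothetical illustrations of the idea, not logic from the paper.

```python
# Hypothetical safety check: flag glucose readings that rise implausibly
# fast, or that rise sharply long after the patient's last meal.
# All thresholds are invented for illustration.

def suspicious_reading(prev_mgdl, curr_mgdl, minutes_elapsed,
                       hours_since_last_meal,
                       max_rise_per_min=4.0, fasting_rise_limit=40.0):
    """Return True if the new glucose value deserves a red flag."""
    rise = curr_mgdl - prev_mgdl
    if minutes_elapsed > 0 and rise / minutes_elapsed > max_rise_per_min:
        return True   # faster rise than physiology plausibly allows
    if hours_since_last_meal > 6 and rise > fasting_rise_limit:
        return True   # large overnight rise with no meal to explain it
    return False

# An 80 mg/dL jump in 5 minutes, hours after dinner: wake the patient for
# an independent fingerstick test instead of dosing insulin.
print(suspicious_reading(110, 190, 5, 8))    # True
# A modest rise shortly after a meal looks normal:
print(suspicious_reading(110, 130, 30, 1))   # False
```

A real implementation would also weigh insulin delivery history, as the researchers suggest, but even this simple plausibility gate would block the attack scenario of a single spoofed high reading.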

The artificial pancreas is the culmination of a 50-year slog in bioengineering—one that is finally paying off because of improvements in insulin, sensors, and algorithms. Read all about it in the upcoming June issue of IEEE Spectrum, which is devoted to a single topic: “Hacking the Human OS.”

