Tech Talk

Finding the Perfect Blend of Full and Half Duplex for Future Cell Phone Networks

The sales pitch for full duplex is a powerful one: these new radios could instantly double the capacity of today’s wireless networks by transmitting and receiving signals on the same frequency, at the same time. That promise has made network engineers eager to deploy it in cellular base stations and mobile devices ever since the technology began to pick up steam around 2007.

But in reality, transmitting and receiving messages at the same time on the same frequency has an unfortunate side effect. It causes twice as much interference as performing each function in turn or on separate bands. So while full duplex radios can dramatically improve spectrum efficiency, the resulting interference means more connections would be lost if a network were constructed wholly of them.
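The capacity-versus-interference tradeoff can be made concrete with a toy model (my own illustrative numbers, not the researchers'): suppose full-duplex links double raw capacity, but the network-wide connection-drop rate grows with the fraction of links running full duplex. A sketch in Python:

```python
# Toy model (assumed parameters, not from the article): full duplex doubles
# raw capacity, but the extra interference it creates raises the fraction of
# dropped connections across the network. With a fraction f of links running
# full duplex, search for the blend that maximizes effective throughput.

def network_throughput(f, base_drop=0.02, interference_slope=0.40):
    """Effective throughput of a network in which a fraction f of links are
    full duplex. Raw capacity scales as (1 + f); the connection-drop rate is
    assumed to grow linearly with f as interference accumulates."""
    drop_rate = base_drop + interference_slope * f
    return (1.0 + f) * (1.0 - drop_rate)

# Sweep f from 0 (all half duplex) to 1 (all full duplex).
best_f = max((f / 100.0 for f in range(101)), key=network_throughput)
print(f"best blend: {best_f:.2f} full duplex, "
      f"throughput x{network_throughput(best_f):.2f}")
```

Under these assumed numbers the optimum is an interior blend rather than either extreme, which is exactly the intuition behind mixing full and half duplex.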


Chilling Effects Watch: Oracle v. Google

A lawsuit between Oracle and Google that went to the jury this week has been called the “end of programming as we know it” and the case that “will decide the future of software.” The media is probably hyperventilating again, says one legal expert, but real chilling effects could still stem from this strange but important legal dispute.

On Monday, lawyers for Google and Oracle presented closing arguments to a jury in San Francisco in the latest installment of a lawsuit that’s been in and out of courtrooms since 2010. At issue is Oracle’s claim that Google’s Android mobile operating system, which Google acquired in 2005, infringes Oracle’s copyrights on the Java platform, 37 of whose application programming interfaces (APIs) Android undisputedly uses.

Google says Sun Microsystems (Java’s creator and the copyright’s owner when Android was being developed) touted Java’s open APIs at the time. But Oracle, which acquired Sun in 2010, clearly doesn’t see Java as being as open to outside developers as its creator did. It was only after Java’s owners failed to develop their own line of Java smartphones, Google says, that Oracle tried to elbow in on Android’s success.

“Oracle took none of the risk but wants all the credit and a lot of the money,” Google attorney Robert Van Nest said in his closing argument.

On the other hand, Oracle argued that copyright protection of Java APIs is not in question. Instead, Oracle lawyer Peter Bicks said the key point is whether Google enjoyed “Fair Use” protection of Java’s API. And on that score, he said, there was a “mountain of evidence” that Fair Use simply did not apply to Android’s use of the Java API. So, he said, the jury must find that Google violated Oracle’s copyright. And if the jury does side with Bicks, Google could face fines of as much as US $9 billion.

For everyone in software development—not just Google and Oracle watchers—this case could be significant, says Pamela Samuelson, Richard M. Sherman Distinguished Professor of Law at the University of California, Berkeley. Depending on the outcome, she says, it could alter how tech companies write, develop, and market their code. 


APIs pervade software development today, she explains, from Amazon Web Services to Facebook to Apple to Google, along with countless interfaces between one software or hardware platform and another. Imagine a world in which big companies could descend on successful software products and, after the fact, reap rewards from a portfolio of claimed API copyrights.

A lot, Samuelson says, hinges on Bicks’ assertion that Java’s APIs are protected by copyright law. That appears to be technically true in this lawsuit, she says. But it is arguably not true in other courtrooms around the country. And that crucial legal distinction could make a difference both in the outcome of this lawsuit and its applicability to other lawsuits down the line.

“The judge instructed the jury that it doesn’t make any difference if they think an API is copyrightable; the line in this courtroom is that they are copyrightable,” she says. Yet, she says, other courts, such as the U.S. Ninth Circuit and U.S. Second Circuit, have longstanding case law (including Computer Associates v. Altai and Sega v. Accolade, both from 1992, which Samuelson has detailed elsewhere) that knocks down copyright claims on APIs.

Of course if APIs can’t be copyrighted, then Oracle doesn’t have much on which to rest its claim of copyright infringement. Google’s lawyer Van Nest likened the situation to putting the word “hamburger” on a menu and then claiming copyright on the word. “The API is ‘hamburger’ there, it’s the menu,” he said in closing arguments on Monday. Whereas, he argued, the creative expression (and, in this analogy, the copyrightable expression) comes in how the hamburger is sourced, developed, made, cooked, and served.

“If you’re a small startup and you’re reimplementing somebody’s API, the half-million dollars that a litigation might cost is a chilling effect,” Samuelson says. “Big companies can fight like this, but smaller companies have a tougher time.”

There’s a wrinkle in the Oracle v. Google case that enabled the judge and Oracle’s lawyers to simply claim APIs are copyrightable. That is, Oracle’s original complaint against Google involved alleged patent infringement as well as alleged copyright infringement. The patent infringement claim has since been disqualified, but its legacy remains. 

Normally, Samuelson says, a copyright claim like Oracle’s would put it in line for a courtroom in the Second or Ninth Circuit, which, as the home circuits of New York City, Los Angeles, and Silicon Valley, have the deepest copyright case law tradition to draw upon. These are the circuits in which, she says, any API copyright claim would face the hardest uphill climb.

However, because Oracle’s suit involved patents, she says, appeals were instead routed to the court specializing in patent claims, the U.S. Court of Appeals for the Federal Circuit. And so, in a 2014 appeal, Oracle v. Google was argued before the Federal Circuit. That court, with its comparatively thin docket of legal precedents concerning software copyrights, ruled that Java’s APIs were copyrightable.

So on one hand, it’s possible that even a strong finding for Oracle against Google could still have limited knock-on effects for other cases. Samuelson said the Second and Ninth Circuits’ case law disputing the copyrightability of APIs would remain in place regardless of the Oracle v. Google outcome. So even if Oracle prevails, a judge in the Second or Ninth Circuit might still be persuaded to treat the Oracle finding as an outlier.

On the other hand, savvy litigants might also add patent claims to any API copyright claim—which could then put the new claimant back in line for the same Federal Circuit that ruled in favor of the copyrightability of Java’s APIs.

Thus, even if the Second or Ninth Circuit would be friendly waters for a defendant in an API copyright suit, that might not matter if the case could instead be heard in the Federal Circuit.

“This is the strangest Fair Use case I’ve ever seen,” Samuelson notes. Stay tuned, she adds, because whether Oracle or Google prevails, the decision will be worth hearing out.

5 Myths About 5G

Without a doubt, 5G is the hottest topic in wireless circles today. Many of the field’s most celebrated researchers and highest-paid executives are focused on forging this ultra-fast, high-bandwidth successor to 4G LTE. Among them, the opportunity to construct the next generation of wireless is often compared to Halley’s Comet: It comes around only once or twice in a person’s career.

5G enthusiasts say the widely heralded future wireless network will deliver lightning-quick mobile data speeds with virtually unlimited capacity, blanket cities with high-quality Internet access, provide low-bandwidth IoT connections to billions of devices, and even enable autonomous driving. But the industry has only just begun to set the standards that will define 5G’s capabilities and to launch the very early trials that will establish its parameters.

But in many cases, the term “5G” is bandied about as a panacea that already exists. That’s why Seizo Onoe, CTO of NTT DOCOMO, Japan’s largest mobile carrier, is traveling around to conferences trying to keep everyone’s expectations in check. “In the early 2000s, there was a concrete 4G technology but no one called it 4G,” Onoe laments. “Today, there are no contents of 5G but everyone talks about 5G, 5G, 5G.”

At first glance, Onoe may seem like an unlikely messenger. If 5G lives up to the hype, the world’s mobile carriers stand to benefit most from the new demand and services it will create. On the other hand, Onoe’s industry ties also make it in his best interest to keep his collaborators grounded in reality so that 5G can be deployed as quickly and successfully as possible. “I want to right the direction for where 5G is going,” he says.

On Wednesday, Onoe presented a keynote at the IEEE International Conference on Communications in Kuala Lumpur, Malaysia. He sought to dispel some of the most pervasive myths about 5G. It was the second time in two months that he attempted to spread this message. In April, he gave the same talk to a group of industry professionals at the Brooklyn 5G Summit in New York City.

Here are a few of the falsehoods about 5G that Onoe is eager to debunk:

1. 5G will be a “hot spot” system

Many experts believe telecom operators will deploy 5G over so-called small cell networks. Unlike cell towers of the past that broadcast signals indiscriminately over a wide area, they envision new base stations being affixed to rooftops and lampposts to serve hyper-local areas. In theory, this design should provide better and faster coverage to those fortunate enough to live in said areas (mainly, cities in wealthy countries).

Onoe says this belief is an unfortunate self-fulfilling prophecy. By labeling 5G as a small cell or “hot spot” system at this stage, the industry is closing itself to other innovations. That’s a problem, he says, because such a “hot spot” system may not be so convenient to build in rural areas. Without a commercially viable strategy, the small cell structure of 5G could end up widening the digital divide. 

Onoe says it would be better to keep an open mind to other technologies that could someday bring 5G to rural customers—or leave room for brilliant business models that could perhaps justify building far-reaching networks comprised of small cells. “At this point, I don't believe we can achieve that,” he says. “But in the past, [the industry ultimately] realized what I thought was impossible.” 

2. 5G will require substantial investment

One of the boldest statements in Onoe’s speech was that deploying 5G will not require a ton of investment. This is counterintuitive to anyone listening to predictions for widespread deployment of cutting-edge technologies from massive MIMO to millimeter wave, or projections for the number of base stations required to build out a small cell network.   

But rather than requiring a complete overhaul of existing networks as some imagine, Onoe believes 5G will be deployed largely on existing infrastructure. Better service, he insists, does not always correlate with greater capital expenditures. NTT DOCOMO’s 600 billion yen in capital expenditures last year marked a 15-year low, even as the data traffic across its networks grew 6300 percent since 2000. 

In fact, Onoe expects capital expenditures for NTT DOCOMO to drop throughout the 5G deployment, in keeping, he says, with trends from earlier wireless generations. To illustrate his point, Onoe opened a chart of the company’s capital expenditures over the past 20 years and asked the audience to guess when the company rolled out 3G and 4G LTE service. It’s impossible to tell based on expenditures alone. “For LTE, there was no increase in CapEx before the LTE launch,” he says. “That’s a fact.”

3. 5G will replace 4G

Another assumption that Onoe loves to challenge is that 5G networks will quickly render 4G obsolete. Not so, he says. The dominance of a new wireless network is more of an evolution than a sudden debut. "Of course this happens eventually but not overnight,” he says.

In this case, too, history is on his side. No wireless network has ever wholly replaced its predecessor, if only because there are so many areas of the world such as India where 3G and even 2G service is still the norm.

And though 5G promises perks that ride in on the coattails of high speed and capacity, there are plenty of cases where 4G networks will still be more than sufficient. For example, many IoT devices such as sensors may only need to transmit small amounts of data once every hour or day. These can operate on low bandwidth and do not require ultra-fast connections.

4. 5G will require more spectrum

There’s an oft-repeated line in the wireless world: With more smartphone users consuming more bandwidth per user, the portion of spectrum dedicated to mobile data is getting crowded—and we need more of it! But Onoe maintains that carriers can find plenty of existing spectrum to support 5G and free up more through re-farming, the recycling of spectrum currently dedicated to other uses.

To support his point, he again points to his experiences over more than 30 years in the industry. For example, he says, most people assumed 4G LTE service would require new spectrum, but NTT DOCOMO launched it in 2010 using only existing spectrum.

5. For 5G, everything will need something new

Many researchers and industry professionals are eager to find as many future uses for 5G as possible, and to enhance or expand existing services on the new network. Onoe insists that just because a new generation of wireless is in the works, it does not mean that it can or should serve every possible need under the sun—whether it’s autonomous driving, IoT, or mobile broadband service. “This is the most frustrating to me,” he says.

He admits to feeling a bit of déjà vu, with today’s hype reminding him of conversations about how 4G would suddenly enable new technologies and services. At the end of the day, says Onoe, 5G will eventually deliver on many of the promises that the industry has dreamt up—and possibly even a few others it has yet to consider. But it’s just too early, he says, for the industry to tout it as the path to so many potential futures.

Editor’s note: This post was corrected on May 27 to reflect NTT DOCOMO’s capital expenditures in billions of yen instead of American dollars.

Officer at a traffic stop wearing camera attached to his glasses

Police Body Cameras Seemingly Cause More Assaults on Officers

Barak Ariel, a lecturer at the University of Cambridge’s Institute for Criminology, wrote last month for IEEE Spectrum about his studies of police body cameras. He described there the startlingly good outcomes in the first large-scale trial of police body cameras, which he and two colleagues carried out in Rialto, Calif., in 2012 and 2013. That study indicated that these cameras reduce both the frequency with which officers resort to using force and the frequency with which citizens register complaints against officers.

Ariel also shared in that article some newer results from a wide-ranging set of trials testing the effects of these cameras on the police use of force—results that would temper anyone’s enthusiasm for these cameras. You see, in a few of the trials, the use of force by police officers seemingly went up when they were wearing cameras. I say “seemingly” because it’s impossible to tell whether the use of force actually went up or whether the cameras merely caused more reports of force being used by officers. And this result wasn’t consistent: In some places police use of force went down when cameras were worn; in others it stayed about the same. In any case, it was a troubling finding.

Now Ariel and his colleagues have some even more disappointing news, which has just been published in the European Journal of Criminology. It seems that when officers wear body cameras, they are more likely to be assaulted. Now that’s strange. You’d imagine that wearing these cameras could only do the opposite. 


NSA Can Legally Access Metadata of 25,000 Callers Based on a Single Suspect’s Phone

Despite changes to the law, the U.S. National Security Agency can still request metadata from tens of thousands of private phones if they are indirectly connected to the phone number of a suspected terrorist, according to a new analysis. The study is one of the first to quantify the impact of policy changes intended to narrow the agency’s previously unfettered access to private phone records, which was first revealed by Edward Snowden in 2013. 

For years before Snowden went public, the U.S. National Security Agency legally obtained metadata not only from suspects’ phones but also from those of their contacts and their contacts’ contacts (and even their contacts’ contacts’ contacts) in order to trace terrorist networks. This metadata included information about whom a user has called, when the call was placed, and how long these calls lasted.

Today, federal rules permit the NSA to recover metadata from phones within "two hops" of a suspect—that is, from someone who called someone who called the suspect within the past 18 months. Previously, federal regulations were more generous, permitting recovery of metadata from "three hops" away and dating back five years.

A new analysis led by researchers at Stanford University’s Computer Security Laboratory quantifies just what this policy change has meant. It found that, under the old five-year, three-hop rules, the NSA could legally recover metadata from about 20 million phones per suspect—and from “the majority of the entire U.S. population” if it analyzed all its suspects. Now, the stricter 18-month, two-hop rule permits the agency to recover metadata from about 25,000 phones with a single request, according to the Stanford study.
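To see why the hop rules matter so much, note that reach grows roughly multiplicatively with each hop. A back-of-the-envelope sketch (my own arithmetic, not the Stanford team’s methodology; the per-phone contact counts are assumptions chosen to land near the reported figures):

```python
# Rough model: if each phone has about d distinct contacts within the lookback
# window, the phones reachable within k hops of a suspect number roughly d**k
# (this ignores overlap between contact lists, which the real study handles).

def hop_reach(contacts_per_phone, hops):
    """Approximate number of phones within `hops` hops of one suspect."""
    return contacts_per_phone ** hops

# Assuming ~158 distinct contacts over 18 months lands near the study's
# two-hop figure of about 25,000 phones.
two_hop = hop_reach(158, 2)    # 24,964
# Assuming ~271 distinct contacts over five years lands near the old
# three-hop figure of about 20 million phones.
three_hop = hop_reach(271, 3)  # 19,902,511
print(two_hop, three_hop)
```

The longer lookback window matters as much as the extra hop: five years of call records implies far more distinct contacts per phone, so each hop fans out more widely.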

Mark Suster defended one of his investments, uBeam.

Engineer and Investor in Spat About Wireless Charging Startup uBeam

The engineer whose critical blog post has roiled high-profile wireless power company uBeam is challenging statements made about him by Mark Suster, a prominent venture capitalist and uBeam investor.

Paul Reynolds, a former engineering vice president at uBeam, recently began publishing an anonymous blog that described uBeam as “the next Theranos” (a troubled biotech startup). Reynolds, who confirmed that he is the author of the post, raises substantial questions about whether uBeam can deliver the “world without wires” that it has been promising for many years.

An RFID chip using RF-only logic lacks a rectifier and its associated circuitry.

RF-Only Logic Makes RFID Tags Tinier

Engineers at North Carolina State University have applied a new technology called RF-only logic to create passive RFID tags that are 25 percent smaller than today’s. And a smaller tag means a cheaper tag.

The space savings come from eliminating a circuit usually considered crucial to the chip’s operation: the rectifier. Passive RFID tags, the most common kind, are powered by an RF signal provided by the tag reader when it’s nearby. The RFID chip’s rectifier takes the AC radio signal and turns it into DC for use by the chip’s logic circuits. The innovation was to develop a set of circuit techniques that eliminates the need for the rectifier, allowing the logic to run directly from the oscillating radio signal.


Building Myanmar’s First Mobile Network From The Ground Up

In the past five years, no place in the world has witnessed a shift in mobile use as dramatic as the one taking place in Myanmar. The country, also known as Burma, was ruled by a military junta for nearly 50 years before a democratically elected government was installed in 2011. At that time, only three percent of Myanmar’s 50 million residents had access to mobile phones.

Since then, the country’s newest carriers have rapidly built out infrastructure for connecting millions of new customers for the first time. These newcomers have generated enormous demand for service, which carriers and their contractors are scrambling to meet.

Researchers set up this 128-antenna array in March at the University of Bristol to carry out the first of several attempts to achieve greater spectrum efficiency.

5G Researchers Set New World Record For Spectrum Efficiency

A team of 5G researchers has set a new world record for spectrum efficiency. Their achievement with massive MIMO (multiple-input, multiple-output) arrays, which are cellular base stations composed of dozens of antennas, is further evidence that this technology is a promising option for wireless engineers working to construct networks to deliver ultra-fast data speeds to more smartphones and tablets than ever before.

In an experiment on Wednesday, the group achieved a rate of 145.6 (bits/s)/Hz for 22 users, each modulated with 256-QAM, on a shared 20 MHz radio channel at 3.51 GHz with a 128-antenna massive MIMO array. That represents a 22-fold increase in spectrum efficiency over today’s 4G networks.
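The headline number translates into raw throughput with a little arithmetic (a sanity check of my own, assuming an ideal conversion of spectral efficiency into data rate with no link-layer overhead):

```python
# Convert the reported spectral efficiency into aggregate throughput, and
# compare it with the ceiling implied by 22 perfectly separated 256-QAM
# streams (8 bits per symbol each) over the shared channel.

spectral_efficiency = 145.6   # (bits/s)/Hz, summed across all users
bandwidth_hz = 20e6           # shared 20 MHz channel
users = 22
bits_per_symbol = 8           # 256-QAM: log2(256) = 8

aggregate_bps = spectral_efficiency * bandwidth_hz     # ~2.91 Gbit/s total
per_user_se = spectral_efficiency / users              # ~6.6 (bits/s)/Hz each
ideal_se = users * bits_per_symbol                     # 176 (bits/s)/Hz ceiling
fraction_of_ideal = spectral_efficiency / ideal_se     # ~0.83

print(f"{aggregate_bps / 1e9:.3f} Gbit/s aggregate, "
      f"{per_user_se:.2f} (bits/s)/Hz per user, "
      f"{fraction_of_ideal:.0%} of the 256-QAM ceiling")
```

By this reckoning the array squeezed roughly 2.9 gigabits per second through a single 20 MHz channel, recovering more than four-fifths of what the 22 modulated streams could carry even under perfect separation.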


