Tech Talk


You’re Being Tracked (and Tracked and Tracked) on the Web

The number of third parties sending data to and receiving data from popular websites each time you visit them has increased dramatically over the past 20 years. According to a new analysis of Web tracking, that means visitors to those sites may be more closely watched by major corporations and advertisers than ever before.

A team from the University of Washington reviewed two decades of third-party requests using the Internet Archive’s Wayback Machine. They found a four-fold increase in the number of requests logged on the average website from 1996 to 2016, and say that companies may be using these requests to track the behavior of individual users more frequently. They presented their findings at the USENIX Security Symposium in Austin, Texas, earlier this month.

The authors, PhD candidates Adam Lerner and Anna Kornfeld Simpson, along with collaborators Tadayoshi Kohno and Franziska Roesner, found that popular websites made an average of four third-party requests in 2016, up from fewer than one in 1996. However, those figures likely underestimate the prevalence of such requests because of limitations of the data contained within the Wayback Machine. Roesner calls their findings “conservative.”

For comparison, a study by Princeton computer science researcher Arvind Narayanan and colleagues that was released in January looked at one million websites and found that top websites host an average of 25 to 30 third parties. Chris Jay Hoofnagle, a privacy and law scholar at UC Berkeley, says his own research has found that 36 of the 100 most popular sites send more than 150 requests each, with one site logging more than 300. The definition of a tracker or a third-party request, and the methods used to identify them, may also vary between analyses.

“It’s not so much that I would invest a lot of confidence in the idea that there were X number of trackers on any given site,” Hoofnagle says of the University of Washington team’s results. “Rather, it’s the trend that’s important.”

Most third-party requests are made through cookies, which are snippets of information stored in a user’s browser. Those snippets enable users to log in automatically or add items to a virtual shopping cart, but they can also be recognized by a third party as the user navigates to other sites.

For example, a national news site might send a request to a local realtor to load an advertisement on its home page. Along with the ad, the realtor can send a cookie with a unique identifier for that user, and then read that cookie from the user’s browser when the user navigates to another site where the realtor also advertises.
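To make the mechanics concrete, here’s a minimal Python sketch of that exchange (purely illustrative: the site names are hypothetical, and real trackers pass identifiers through HTTP Set-Cookie and Cookie headers rather than function calls):

```python
import uuid

class AdServer:
    """Toy stand-in for a third-party ad server that tracks browsers."""

    def __init__(self):
        self.history = {}  # cookie ID -> list of first-party sites visited

    def serve_ad(self, cookie, first_party_site):
        if cookie is None:
            cookie = uuid.uuid4().hex          # first visit: mint an identifier
        self.history.setdefault(cookie, []).append(first_party_site)
        return cookie                          # the browser stores this cookie

tracker = AdServer()
cookie = tracker.serve_ad(None, "national-news.example")    # ad on the news site
cookie = tracker.serve_ad(cookie, "local-realtor.example")  # same ID elsewhere
print(tracker.history[cookie])  # the tracker has now linked both visits
```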

In addition to following the evolution of third-party requests, the team also revealed the dominance of players such as Google Analytics, which was present on nearly one-third of the sites analyzed in the University of Washington study. In the early 2000s, no third party appeared on more than 10 percent of sites. And back then, only about 5 percent of sites sent five or more third-party requests. Today, nearly 40 percent do. But there’s good news, too: pop-up browser windows seem to have peaked in the mid-2000s.

Narayanan says he has noticed another trend in his own work: consolidation within the tracking industry, with only a few entities such as Facebook or Google’s DoubleClick advertising service appearing across a high percentage of sites. “Maybe the world we’re heading toward is that there’s a relatively small number of trackers that are present on a majority of sites, and then a long tail,” he says.

Many privacy experts consider Web tracking to be problematic because trackers can monitor a user’s behavior from site to site. Combined with publicly available information from personal websites or social media profiles, this behavior can enable retailers or other entities to create identity profiles without a user’s permission.

“Because we don’t know what companies are doing on the server side with that information, for any entity that your browser talks to that you didn’t specifically ask it to talk to, you should be asking, ‘What are they doing?’” Roesner says.

But while every Web tracker requires a third-party request, not every third-party request is a tracker. Sites that use Google Analytics (including IEEE Spectrum) make third-party requests to monitor how content is being used. Other news sites send requests to Facebook so the social media site can display its “Like” button next to articles and permit users to comment with their accounts. That means it’s hard to tell from this study whether tracking itself has increased, or if the number of third-party requests has simply gone up.

Modern ad blockers can prevent sites from installing cookies and have become popular with users in recent years. Perhaps due in part to this shift, the authors also found that the behaviors third parties exhibit have become more sophisticated and wider in scope. For example, a newer tactic avoids cookies altogether by recording a user’s device fingerprint: identifiable characteristics such as the screen size of a smartphone, laptop, or tablet.
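Conceptually, a fingerprint is just a stable identifier derived from traits a device will readily report. Here’s a rough Python sketch of the idea (the trait names and values are hypothetical; in practice they are gathered by JavaScript running in the browser):

```python
import hashlib

# Hypothetical device traits; real fingerprinting reads values like these
# from browser APIs (screen dimensions, time zone, fonts, canvas output).
traits = {
    "screen": "1920x1080",
    "timezone": "America/Los_Angeles",
    "user_agent": "Mozilla/5.0 (...)",
    "fonts": "Arial,Helvetica,Times",
}
# Serialize the traits in a fixed order, then hash them into a short ID.
canonical = "|".join(f"{k}={v}" for k, v in sorted(traits.items()))
fingerprint = hashlib.sha256(canonical.encode()).hexdigest()[:16]
print(fingerprint)  # the same device yields the same ID, no cookie required
```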

When they began their analysis, the University of Washington researchers were pleased to find that the Wayback Machine could be used to track cookies and device fingerprinting, because it stores each site’s original JavaScript code, which let them determine which JavaScript APIs were called on each website. As a result, a user perusing the archived version of a site in the Wayback Machine winds up making all the same requests that the site was programmed to make at the time.

The researchers embedded their tool, which they call TrackingExcavator, in a Chrome browser extension and configured it to allow pop-ups and cookies. They instructed the tool to inspect the 500 most popular sites, as ranked by Amazon’s Web analytics subsidiary Alexa, for each year of the analysis. As it browsed the sites, the system recorded third-party requests and cookies, and the use of particular JavaScript APIs known to assist with device fingerprinting. The tool visited each site twice, once to “prime” the site and again to analyze whether requests were sent.
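The central bookkeeping step in a crawl like that is deciding whether a given request counts as third party, that is, whether it goes to a different domain than the site being visited. A naive sketch of that check in Python (it compares only the last two host labels; a rigorous version would use the Public Suffix List to find registrable domains, and the team’s exact heuristics may differ):

```python
from urllib.parse import urlparse

def registrable_domain(url):
    """Naive registrable domain: just the last two host labels."""
    host = urlparse(url).hostname or ""
    return ".".join(host.split(".")[-2:])

def is_third_party(page_url, request_url):
    return registrable_domain(page_url) != registrable_domain(request_url)

print(is_third_party("https://news.example.com/story",
                     "https://cdn.example.com/logo.png"))        # False
print(is_third_party("https://news.example.com/story",
                     "https://www.google-analytics.com/ga.js"))  # True
```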

Until now, the team says, academic researchers hadn’t found a way to study Web tracking as it existed before 2005. Hoofnagle of UC Berkeley says that using the Wayback Machine was a clever approach and could inspire other scholars to mine archival sites for other purposes. “I wish I had thought of this,” he says. “I’m totally kicking myself.”

Still, there are plenty of holes in the archive that limit its usefulness. For example, some sites prohibit automated bots such as those used by the Wayback Machine from perusing them. 

Ixion docked with ISS

NASA Funds Plan to Turn Used Rocket Fuel Tanks Into Space Habitats

NASA is very good about being on the cutting edge of space exploration, but it's less good about making non-cutting-edge space exploration efficient and cost-effective. The agency is acutely aware of this, which is why it's been trying to get commercial carriers to handle deliveries of (now) supplies and (soon) astronauts to the ISS.

The next step is for private companies to take over space station construction for (soon) Earth orbit and (eventually) deep space. To that end, NASA has selected six partner companies to develop full-sized ground prototypes and concepts for deep space habitats, with the eventual goal of deploying habitats near the moon as a stepping stone to Mars.

Five of the partners (Bigelow Aerospace, Boeing, Lockheed Martin, Orbital ATK, and Sierra Nevada) will be designing habitats that are built on Earth and launched into space on rockets. That makes sense, because it's how habitats have always been sent into space. The sixth partner, NanoRacks, is teaming up with Space Systems Loral and United Launch Alliance to try something completely different: taking empty fuel tanks from the upper stages of rockets and turning them into space habitats on-orbit.

A room-temperature fluorescent protein polariton laser in action

Glowing Protein Warms Up Low-Power Laser

Mother Nature probably wasn’t thinking about lasers when she invented the jellyfish, but it turns out the sea creature evolved a substance well suited to letting a new type of laser work at room temperature.


Are Your Apps Sluggish? Blame Summer

During late August here in the United States, it can start to feel like everything is moving a little bit slower. In the case of your apps, that may actually be true.

Earlier this year, a San Francisco company called Apteligent released a report, based on internal data, suggesting that app performance slows by 15 percent in the summer. The report identifies humidity as the culprit, though the company can’t say for sure from its data why the extra delay occurs.

However, it’s a reasonable guess that moisture in the air is the guilty party in light of research showing that radio signals attenuate in humidity as well as in rain, sleet, and snow. As radio waves travel through humid air, water molecules absorb part of their energy and scatter another portion, weakening the signal or causing data packets to be lost altogether, says Milda Tamošiūnaitė of the Center for Physical Sciences and Technology in Vilnius, Lithuania.

This effect is particularly bad at frequencies above 1 gigahertz, which are used for LTE service: because those waves are shorter, they repeat more often over a given distance and so have more opportunities to encounter obstacles as they travel. Those obstacles are also larger relative to the size of the wave than they would be for lower-frequency waves.
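The wavelength arithmetic is simple: wavelength equals the speed of light divided by frequency. A quick Python sketch comparing an illustrative low-band frequency with one above 1 gigahertz:

```python
C = 299_792_458  # speed of light, in meters per second

for f_hz in (700e6, 2.6e9):  # illustrative low-band and high-band frequencies
    wavelength_cm = C / f_hz * 100
    print(f"{f_hz / 1e9:.1f} GHz -> {wavelength_cm:.1f} cm wavelength")
# 0.7 GHz -> ~42.8 cm; 2.6 GHz -> ~11.5 cm. The higher-frequency wave is
# shorter, so any given obstacle is larger relative to the wave.
```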

Phone calls and data connections are both attenuated by rain, humidity, and other forms of airborne water—though phone calls are sometimes handled at lower frequencies, so they may be slightly less impacted than Web browsing. Previous research has often focused on the effect of rain on radio signals, but the specific role of humidity has been studied less.

The informal study by Apteligent hints at humidity’s potential impact on app performance, though it can’t be considered definitive. The company monitors tens of thousands of apps for clients including Hilton, Groupon, Netflix, and Pokémon Go. Its clients embed a bit of special code into their apps, and that code allows Apteligent to track what users are doing, how much data they are sending and receiving, whether they experience any delays, and what might be causing those delays.
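In spirit, that embedded code is a thin wrapper that timestamps each operation and reports the result. Here’s a hypothetical Python sketch of the pattern (Apteligent’s actual SDK is proprietary, so none of these names reflect its real API):

```python
import time

def timed_call(operation, label, log):
    """Run an operation and record how long it took, in milliseconds."""
    start = time.perf_counter()
    result = operation()
    elapsed_ms = (time.perf_counter() - start) * 1000
    log.append({"operation": label, "latency_ms": round(elapsed_ms, 1)})
    return result

log = []
timed_call(lambda: "fake response", "GET /feed", log)  # stand-in network call
print(log)  # e.g. [{'operation': 'GET /feed', 'latency_ms': 0.0}]
```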

“If it's a smartphone in the U.S. that has apps, the odds are very high that we're embedded in one of those,” says Andrew Levy, Apteligent’s co-founder and chief strategy officer.  

To examine the possible role of humidity in app performance, the company compared the average latency across its entire U.S. network of smartphone apps during the summer of 2015 with the network’s performance the following winter. It found that service was about 15 percent slower in the summer than in the winter, and its theory is that humidity caused the bulk of that slowdown.

Ultimately, the average delay only worsened by about 60 milliseconds—a period of time that customers aren’t likely to notice. For comparison, Tamošiūnaitė says light rain could attenuate a 2 GHz signal by 15 percent at a distance of about 3000 kilometers, or at 128 kilometers during heavy rain (Note: this example assumes that rain is the only factor causing signal degradation, which is never the case in real life).  
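For readers who think in decibels, the cited figures convert directly (assuming the 15 percent refers to lost power, which the source doesn’t specify):

```python
import math

def db_loss(fraction_remaining):
    """Convert fractional received power into a decibel loss."""
    return -10 * math.log10(fraction_remaining)

total_db = db_loss(0.85)                      # the 15 percent figure above
print(f"15% power loss = {total_db:.2f} dB")  # ~0.71 dB in total
print(f"light rain: {total_db / 3000:.6f} dB/km over 3000 km")
print(f"heavy rain: {total_db / 128:.5f} dB/km over 128 km")
```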

So what does all of this mean for developers? Paulo Michels, VP of engineering for app development company ArcTouch, says it won’t change his approach very much. He and his team of 60 software engineers, who have developed roughly 300 apps, aren’t focused on weather more than any other factor when building a new app. They already use common strategies such as compressing JPEGs, pre-processing videos to allow them to stream at multiple potential qualities based on a user’s network, and caching content on phones in order to avoid delays.
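Caching in particular is cheap insurance against a slow network. A minimal sketch of a time-to-live cache of the sort an app might keep (illustrative only; production apps generally rely on platform caching libraries):

```python
import time

class TTLCache:
    """Cache values for a fixed time-to-live to avoid repeat network fetches."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (timestamp, value)

    def get(self, key, fetch):
        entry = self.store.get(key)
        if entry and time.monotonic() - entry[0] < self.ttl:
            return entry[1]                 # still fresh: skip the network
        value = fetch(key)                  # stale or missing: fetch anew
        self.store[key] = (time.monotonic(), value)
        return value

cache = TTLCache(ttl_seconds=300)
cache.get("/feed", lambda k: f"downloaded {k}")  # hits the network
cache.get("/feed", lambda k: f"downloaded {k}")  # served from the cache
```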

“The network, of course, plays a big effect on overall app performance, but as mobile developers, we're used to considering the network as something unreliable and unpredictable,” he says.

Eric Richardson, a senior software engineer at WillowTree who has worked on more than 35 Android apps, says 60 milliseconds is no more than “the blink of an eye,” and that designing for peculiar weather conditions is not a major priority beyond the measures developers already take for poor network connections.

But he also says the Apteligent report might mean that developers should start to make an effort to test their apps in humid conditions as well as on dry days. Right now, his company relies primarily on simulated 3G and 4G networks running on Wi-Fi to evaluate their apps, as well as some beta testing in the real world.

“Up until now, I don't think weather has ever been on our minds,” he says. “But now that it is, I guess it kind of brings in the perspective to do more realistic testing as opposed to just sitting in the office connected to Wi-Fi.”

Image: Craig Mayhew and Robert Simmon/GSFC/NASA

Fighting Poverty With Satellite Images and Machine-Learning Wizardry

Governments and NGOs need economic data to decide how best to aid the poor. But reliable, up-to-date data on poverty levels is hard to come by in the world’s poorest countries.

Scientists have now devised an inexpensive technique that combines satellite images and machine learning to accurately predict poverty levels at the village level. Such a fine-grained gauge of poverty could help aid programs target those with the greatest needs. It could also be a valuable tool for researchers and policymakers to gather national statistics and set development goals.

Governments typically conduct surveys of income and consumption to measure poverty levels. These surveys cost hundreds of millions of dollars and are impossible to conduct in areas of conflict. World Bank data show that between 2000 and 2010, 39 out of 59 African countries conducted fewer than two surveys that were extensive enough to measure poverty.

Researchers have recently tried to estimate poverty levels by analyzing mobile phone usage data and satellite photos showing nighttime lighting. But mobile phone data are typically not publicly available. Nighttime lights, meanwhile, indicate wealthier regions, but they cannot differentiate among economic levels in the most impoverished regions. “In the poorest areas in Africa, the ones we care the most about, it’s almost uniformly dark at night,” says Neal Jean, an electrical engineering and computer science Ph.D. student at Stanford University.

Jean, earth system science professor Marshall Burke, and their colleagues came up with a clever machine-learning method that combines nighttime light intensity data with daytime satellite imagery. The technique, reported in the journal Science, is general and could be applied to any developing country, Jean says. 

In machine learning, a computer model is fed labeled data sets—say, thousands of images labeled “dog” or “cat.” Much like humans learn by inference after seeing enough examples, the model analyzes certain features in the images and figures out how to classify an animal in a picture as a dog or cat.

The researchers trained their machine-learning algorithm with millions of daytime satellite images, each labeled with a number that corresponded to how bright the area was at night. Daytime images, which contain features that indicate livelihoods, such as paved roads, metal roofs, and farmland, can help distinguish poor regions from ultrapoor ones. “The model looks for visual cues and automatically learns to find features in daytime imagery that correspond to nighttime light values,” Jean says.
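Here’s a toy sketch of that two-step structure on synthetic data (heavily simplified: the real pipeline trains a convolutional neural network on actual imagery, while random numbers and ridge regression stand in for everything here):

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
features = rng.normal(size=(1000, 32))        # stand-in learned image features
nightlights = features @ rng.normal(size=32)  # abundant proxy labels

# Step 1: learn to predict the plentiful nightlight labels from the features.
proxy_model = Ridge().fit(features, nightlights)

# Step 2: fit scarce survey data (say, 50 villages) on those same features.
surveyed = rng.choice(1000, size=50, replace=False)
consumption = nightlights[surveyed] * 0.5 + rng.normal(size=50)
poverty_model = Ridge().fit(features[surveyed], consumption)

print(poverty_model.predict(features[:3]))    # estimates for unsurveyed areas
```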


Popular Internet of Things Forecast of 50 Billion Devices by 2020 Is Outdated

If you follow discussions about the Internet of Things, you’ve probably heard this stunning prediction at least once: The world will have 50 billion connected devices by 2020. Ericsson’s former CEO Hans Vestberg was among the first to state it, in a 2010 presentation to shareholders. The following year, Dave Evans, who worked for Cisco at the time, published the same prediction in a white paper.

Today, that figure has arguably done more than any other statistic to set sky-high expectations for potential IoT growth and profits. Remarkably, those projections weren’t even close to the highest of the time—in 2012, IBM forecasted 1 trillion connected devices by 2015. “The numbers were getting kind of crazy,” recalls Bill Morelli, a market research director for IHS Markit.

A view of QUESS at the Shanghai Engineering Center for Microsatellites in May

China Launches World's First Quantum Communications Satellite

The first spacecraft designed to perform quantum communications was launched into space today from the Jiuquan Satellite Launch Center, at 1:40 a.m. local time.

The Chinese mission, dubbed Quantum Experiments at Space Scale (QUESS), is a milestone for researchers building the technology needed to create large-scale quantum communications networks. Thanks to the fundamental nature of quantum mechanics, which is sensitive to observation and prohibits the copying of unknown states, quantum links should in principle be unhackable. Grégoire Ribordy of the quantum cryptography firm ID Quantique told the Wall Street Journal that a quantum transmission is like a message scribbled on a soap bubble: “If someone tries to intercept it when it’s being transmitted, by touching it, they make it burst.”

Free of turbulent air (except for what you hit between Earth and orbit) and the distortions of fiber, space is an attractive place to pursue quantum communications. QUESS, which boasts the ability to generate pairs of entangled photons, will perform experiments in quantum entanglement and teleportation, Nature reports. But the first order of business will be quantum key distribution, “to establish a quantum key between Beijing and Vienna, using the satellite as a relay,” lead scientist Pan Jian-Wei told Nature in a Q&A published early this year.

Last year, Thomas Scheidl, a member of the Austrian Academy of Sciences team that is collaborating with Pan and his colleagues, explained to IEEE Spectrum how the process would work: 

 The satellite flies over a ground station in Europe and establishes a quantum link to the ground station, and you generate a key between the satellite and the ground station in Europe. Then, some hours later, the satellite will pass a ground station in China and establish a second quantum link and secure key with a ground station in China.

The satellite then has both keys available, and you can combine both keys into one key...Then you send, via a classical channel, the key combination to both of the ground stations. This you can do publicly because no one can learn anything from this combined key. Because one ground station has an individual key, it can undo this combined key and learn about the key of the other ground station.
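In code, that combination step is a one-time-pad XOR. A small Python sketch (the article doesn’t name the operation, but XOR is the standard trusted-relay construction, because the combined key reveals nothing about either key on its own):

```python
import secrets

def xor(a, b):
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

k_europe = secrets.token_bytes(16)  # key from the European ground-station pass
k_china = secrets.token_bytes(16)   # key from the Chinese ground-station pass

combined = xor(k_europe, k_china)   # announced publicly by the satellite
recovered = xor(combined, k_europe) # Europe undoes it with its own key
assert recovered == k_china         # both stations now share k_china
```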

With any luck, the two-year mission will be the first in a string of quantum communications spacecraft—and a progenitor of secure quantum communication for the masses. 

Intel’s Diane Bryant, executive vice president and general manager of the Data Center Group, with Nervana’s cofounder Naveen Rao

The Nervana Systems Chip That Will Let Intel Advance Its Deep Learning

Deep-learning artificial intelligence has mostly relied upon the general-purpose GPU hardware used in many other computing tasks. But Intel’s recent acquisition of the startup Nervana Systems will give the tech giant ownership of a specialized chip designed specifically for deep learning AI applications. That could give Intel a huge lead in the race to develop next-generation artificial intelligence capable of swiftly finding patterns in huge data sets and learning through imitation.


Stretchable Touch Pad Could Become Wearable Touch Screen

Video: Kim et al./Science (2016)

A new, stretchable transparent touch pad can be used to write words and play electronic games, and it may even one day be implanted inside the body, its inventors say.

Touch pads and touch screens are on nearly every smart device these days. But they can’t go on anything flexible, such as the human body. Scientists have explored stretchable touch panels based on carbon nanotubes, metal nanowires, and other advanced materials, but the performance of those panels fell off sharply when they were stretched, and, just as bad, they fell apart over time under repeated stretching.

To overcome these problems, scientists at Seoul National University created a touch pad made of the same kind of soft and very stretchable hydrogel used to make soft contact lenses. The hydrogel involved contains lithium chloride salts, which are electrically conductive and help the hydrogel hold onto the water it needs to stay soft.

A photo of Transhumanist Party presidential candidate Zoltan Istvan standing in front of an American flag.

AI for President

Zoltan Istvan, who represents the Transhumanist Party and bills himself as “the science candidate” in the 2016 U.S. presidential election, has garnered more media coverage than many third-party candidates, with recent mentions in Vocativ, The Verge, USA Today, and Pacific Standard. He also writes regularly for Motherboard and The Huffington Post.

Istvan’s popularity is likely due to a combination of his quirky campaign style (he drives around in a bus painted to resemble a coffin with “Science vs. The Coffin” written above the bumper) and an unconventional platform that pushes for gene editing, human life extension, and morphological freedom (the right to do anything to your body so long as it doesn’t harm others). As a broader movement, transhumanism focuses on leveraging science and technology toward the ultimate goal of overcoming death, largely through as-yet-unproven methods such as mind uploading, in which a person’s entire consciousness would be transferred to a digital system or machine.

