Tech Talk

Fukushima Workers Battle Radioactive Air and Water Inside Reactor Buildings

Special Report: Fukushima and the Future of Nuclear Power

Editor's Note: John Boyd is an IEEE Spectrum contributor reporting from Kawasaki, Japan. This is part of IEEE Spectrum's ongoing coverage of Japan's earthquake and nuclear emergency. For more details on how Fukushima Dai-1's nuclear reactors work and what has gone wrong so far, see our explainer.

Over the past week, workers have made grudging progress towards stabilizing Japan's Fukushima Dai-1 nuclear power plant. Yet a dramatic announcement today, in which Japanese prime minister Naoto Kan called for a new energy policy, suggests that the country may turn away from nuclear energy.

First, the grudging progress: Just a few days after air-filtering equipment began cleaning the radioactive air inside the damaged No. 1 reactor building, workers and government safety officials were able to enter the building early Monday morning to carry out a survey of conditions inside.

"We are making great progress there, so that workers are now able to enter the building," said Goshi Hosono, a senior aid to prime minister Kan, at a briefing for the foreign press on the afternoon of Monday, May 10. "Our next goal is to confirm that the situation there is safe and workable, and then to restore the (reactor) cooling system as soon as possible."

Such confirmation will be no easy matter. Tokyo Electric Power Co. (TEPCO) reported the same day that radiation levels in the building ranged from 10 millisieverts per hour to as high as 700 millisieverts per hour on the second level of the building. The latter figure is almost three times the 250-millisievert maximum dose workers are allowed to receive in one year.
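
For a sense of scale, here is the back-of-the-envelope arithmetic behind that comparison (a sketch only, not TEPCO's actual dosimetry procedure, which tracks each worker's accumulated dose):

```python
# Rough stay-time estimate: minutes until a worker at a constant dose rate
# reaches Japan's 250-millisievert emergency exposure limit for Fukushima
# workers. Illustrative only; real dose management accounts for accumulated
# exposure, shielding, and distance from the source.

ANNUAL_LIMIT_MSV = 250.0  # mSv, the emergency annual limit cited above

def max_stay_minutes(dose_rate_msv_per_hr: float) -> float:
    return ANNUAL_LIMIT_MSV / dose_rate_msv_per_hr * 60.0

for rate in (10.0, 700.0):  # the low and high readings reported in the building
    print(f"{rate:5.0f} mSv/h -> annual limit reached in {max_stay_minutes(rate):7.1f} minutes")

# At 700 mSv/h, the entire annual allowance is gone in about 21 minutes.
```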

A TEPCO official told the press that the company had "hoped the radiation level would go down to 1 millisievert per hour," but that was nowhere near the case. To allow workers to operate safely, the company may bring in lead shielding or might focus on removing highly radioactive debris, the result of the hydrogen explosion that damaged the building following the 11 March earthquake. The TEPCO official voiced concern that the continued high radiation levels could force a change in the company's plans. But despite the hazardous conditions, workers entered the building again to adjust the reactor’s measuring gauges in preparation for flooding the pressure vessel, which is done to cool the crippled reactor.

According to TEPCO's stabilization and clean-up plan, the company will aim to bring the plant to a cold shutdown in less than nine months. To achieve that goal, TEPCO first has to deal with the nearly 70 000 tons of radioactive water that has accumulated in the turbine basements and outside trenches associated with reactors 1, 2, and 3. The pooled water near reactor No. 2 has been designated highly radioactive.

“Until we are able to remove this highly radiated water (in the Unit 2 trench) we will not be able to proceed with any other work,” said Hosono.

Currently, the water is being transferred to a central waste-water treatment facility on the site to await decontamination. TEPCO is working towards decontaminating the water in cooperation with French nuclear supplier Areva and Kurion, a nuclear waste management company in the United States. Areva’s technology is based on a co-precipitation method that injects chemical reagents into the water to isolate elements such as radioactive iodine and cesium, which are then separated by precipitation for removal. Kurion then uses a vitrification process that turns the radioactive materials into a compacted glass to facilitate storage.

Hidehiko Nishiyama, the deputy director-general of the Nuclear and Industrial Safety Agency, participated in the press briefing with Hosono. He explained that the decontamination system will be used in conjunction with Japanese desalination equipment. "Once the contaminated water has gone through the systems and has been cleaned, it will be separated into two kinds of water, one of which will be stored in a tank, and one which will be recirculated (for cooling) the reactor." The water decontamination is expected to begin in June.

While the situation at Fukushima Dai-1 seems to be slowly improving, the general outlook for nuclear power in Japan got worse this week.

In a major development, Chubu Electric Power Co. agreed on Monday, May 10, to suspend operation of its Hamaoka nuclear power plant in Shizuoka prefecture, about 200 kilometers southwest of Tokyo, "until further measures to prevent tsunami are completed." The decision came after Prime Minister Kan's May 6 request that the power company cease all operations at the plant, citing concerns over its safety. Kan said the government had predicted an 87 percent chance that a magnitude 8 earthquake will strike the region within the next 30 years.

The coastal Hamaoka plant is located near a major fault line in central Honshu, Japan’s largest island. Reactors No. 4 and No. 5 are currently operating and the company had planned to restart the No. 3 reactor, currently offline for inspection, in July. Reactors No. 1 and No. 2 were shut down in January 2009 and are set to be decommissioned.

Although the plant is protected by sand dunes between 10 and 15 meters high, the company plans to build a concrete seawall to further protect against tsunami. Construction is expected to take two to three years, according to press reports. The gross capacity of the three active reactors is about 3.5 gigawatts and accounts for roughly 10 percent of the region’s power supply. The company had been providing TEPCO with make-up power, but will now have to suspend that service, as it scrambles to meet its own region’s demands.

On Tuesday, May 11, Prime Minister Kan took another dramatic step, announcing that Japan would seek a new national energy policy that puts an increased emphasis on renewable energy and conservation. "We need to start from scratch," Kan said in a press conference. "We need to make nuclear energy safer and do more to promote renewable energy."

Kan's announcement suggests that the government will drop an energy policy released last year, which called for the construction of 14 more nuclear reactors before 2030; that plan called for nuclear to supply 50 percent of the country's energy needs.
 

Researchers Create A Schizophrenic Computer

We know what schizophrenia looks like in humans. We think we know what schizophrenia looks like in mice. Now we may know what it looks like in a computer.

Researchers at the University of Texas have modeled the disease in a natural language parser, called DISCERN, as a way to test competing theories of the neural mechanisms that cause schizophrenia.

DISCERN was built to process and recall simple narratives. Through training, the system learns what words mean and how they work in sentences in a way that mimics activation patterns in the brain. With the original settings, DISCERN is able to digest a narrative, retain it, and reproduce the story in its "own" words. The system also identifies negative and positive aspects of the story to change the likelihood that it will remember a specific detail. In this way, DISCERN models the brain both as a semantic and emotional processor.

When the researchers at UT changed the rate at which DISCERN learned, they noticed drastically different results. After learning a third-person narrative about a mobster, DISCERN retold fragments of the story in the first person, adopting a wild, criminal autobiography. Mistaking who did what to whom, an error called "agent-slotting," is common among people with schizophrenic symptoms of delusion, and among the human patients who served as controls in the study.

When diagnosing and treating schizophrenics, doctors often look for disturbances in storytelling and semantics. Finding those same symptoms in DISCERN has strengthened a theory that hyper-learning causes schizophrenia.
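
DISCERN itself is a multi-module neural system far too large to reproduce here, but the hyper-learning idea can be illustrated with a toy model: train the same simple network twice, once at a normal learning rate and once at an excessive one, and watch recall collapse. A minimal sketch (this is not DISCERN's architecture, just a one-layer stand-in):

```python
import numpy as np

# Toy illustration of "hyper-learning": identical network, identical data,
# but an excessive learning rate makes gradient descent overshoot instead
# of converge. NOT the DISCERN architecture -- a one-layer linear stand-in.

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))        # 20 "story" feature vectors
Y = X @ rng.normal(size=(5, 3))     # the "correct recall" targets

def train(lr: float, steps: int) -> float:
    W = np.zeros((5, 3))
    for _ in range(steps):
        grad = X.T @ (X @ W - Y) / len(X)   # mean-squared-error gradient
        W -= lr * grad
    return float(np.mean((X @ W - Y) ** 2))  # final recall error

print("normal rate (0.1):", train(0.1, steps=1000))  # converges to near-zero error
print("hyper  rate (2.0):", train(2.0, steps=100))   # error has already exploded
```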

But the real triumph of this work could be that it validates the computer as a neural model, demonstrating its relevance to clinical research.

We know very well what schizophrenia looks like. But, so far, we don't really know what causes it. This has been the big, unsolvable problem with neuro-psychiatric disorders, where clinical ethics stand in the way (and rightly so) of the scientific method. If people were not people, then neuroscientists could study the phenomenon with reverse engineering, gradually coming to understand it by learning how to cause it. Instead, we try to create mice with schizophrenia.

A model that communicates with words but does not breathe or feel would be a breakthrough indeed.

You can find the study results here.

Broadcasters' Report Cries Foul on FCC Whitespace Auction Plans

If you search the U.S. Federal Communications Commission's National Broadband Plan for the word ‘femtocell’, you won’t get any results. According to the National Association of Broadcasters (NAB), that’s a worrisome sign. The FCC is convinced that there is a scarcity of spectrum for smartphones and other mobile devices, so it is keen on pushing a bill that would allow it to auction off some television broadcasters’ spectrum through the legislative process as quickly as possible.

But in a study released last week, the NAB counters that the regulator should put the brakes on its auction plans, or at least round them out with some alternative approaches. The study argues that the FCC hasn’t taken into full account the capacity-increasing potential of technologies like femtocells (which 70 to 80 percent of wireless carriers have said they plan to use), cognitive radio, better receiver standards, and spectrum sharing. And, says the NAB, the FCC hasn’t even done a proper accounting of how broadcast and wireless companies are using currently licensed spectrum.

The NAB recently asked Uzoma Onyeije, who served as an FCC attorney in the wireless and wireline bureaus from 2001 to 2006, to conduct a study on how comprehensively the FCC has mapped the current state of licensed spectrum resources, and how significantly alternate ideas to increase spectrum capacity would affect spectrum availability. The Onyeije study evaluates the sources and assumptions behind the FCC’s declaration of a wireless spectrum crisis, and questions whether sufficient data gathering has been conducted on spectrum utilization. These questions deserve answers, Onyeije said, before legislation locks in auctions as the number-one way to meet wireless spectrum demand.

The lack of a sound spectrum inventory is turning out to be the crux of the NAB's concerns, potentially outweighing the absence of femtocells from the FCC playbook. Christopher Ornelas, Vice President and Chief Strategy Officer for the NAB, and Matthew Hussey, a former engineer and now an aide to Senator Olympia Snowe (Republican of Maine), argue that even a basic comprehensive inventory of spectrum availability and usage is nonexistent, at the FCC or elsewhere.

If no one is entirely sure how much broadcast spectrum is underutilized, they asked, then why is there a rush to hold auctions? Until spectrum utilization is quantified in a granular way, how can the best practice approach to future utilizations be decided?

The NAB insisted that such questions are not designed simply to stall incentive auctions; in fact, the trade group does not oppose the auctions outright. The idea that wireless carriers have ever-growing spectrum requirements is not lost on NAB officials, and the association supports the FCC plan, provided that participation on the part of licensed television broadcasters is voluntary.

This caveat, however, may prove to be tricky. Television broadcaster CBS, for instance, has recently made it clear that it is uninterested in partaking in the auction and will retain the full range of its currently allotted spectrum. It will be interesting to see if other television networks follow suit. However, such a stand might not isolate them from the spectrum repackaging process altogether. Onyeije said that even if broadcasting companies choose not to partake in the auctions voluntarily, they still stand to have their spectrum shifted or repacked into a different band, which introduces the possibility of service disruptions.

But for now a voluntary auction may be the compromise – the companies pushing for immediate auctions represent over $1 trillion in revenue. And in March, 112 leading U.S. economists signed a letter to Congress endorsing voluntary incentive auctions for the spectrum. Holistic approaches to spectral efficiency will still happen, but they depend in part on technologies that are still in pilot phases. Christopher Guttman-McCabe, Vice President of Regulatory Affairs for CTIA (The Wireless Association), said that there’s a window for auctions now, and that requests to increase wireless spectrum should be addressed immediately if the exploding demand from the mobile marketplace is to be met.

Guttman-McCabe insisted that Onyeije’s critique is a delay tactic. Over two thousand companies -- even companies that are themselves developing alternate technologies to increase existing spectrum efficiency -- have requested auctions, he said.

For example, at the Mobile World Congress this year, Qualcomm demonstrated femtocell technology that led to substantial increases in network capacity, findings which the company also filed with the FCC last month. And, Guttman-McCabe said, Qualcomm is also an advocate for bringing more spectrum to wireless carriers.

Perhaps outside of Washington, diverse approaches to spectrum demand can exist harmoniously after all.

Intel Transistors Enter the Third Dimension

From what I remember of Honey, I Shrunk the Kids, getting zapped by Wayne Szalinski’s shrink ray was not a cause for celebration. But Intel senior fellow Mark Bohr, who submits to a virtual miniaturization process that leaves him 100nm tall, seems positively gleeful in this new video from the company.

Bohr is celebrating Intel’s new, smaller, and, for the first time, three-dimensional transistor. Dubbed Tri-Gate, the device employs a much-anticipated design innovation – multiple gates that wrap around a raised silicon channel.

The transistors will appear in Intel's new "Ivy Bridge" microprocessor, which will be made using a 22nm manufacturing process and begin shipping later this year. Intel's processor is the first to make the jump from 32nm to 22nm, continuing the decades-long race to keep boosting the number of transistors that can be packed into the same amount of space.

The switch to 3D addresses some fundamental problems that have been plaguing transistors as their feature size has shrunk. Transistors have long been two-dimensional affairs, with a source and a drain separated by a channel along which electrons flow. The only components that sit above the plane are the gate, which turns on and off the flow of electrons, and a thin, insulating layer sandwiched between the gate and the channel.

But as engineers have made transistors smaller and smaller, the distance from the source to the drain has gotten so short that electrons can leak through the lower part of the channel, where the gate's influence is weakest, wasting power. To solve the leakage problem, transistor designers have been eyeing three-dimensional designs in which electrons have no place to go that isn’t controlled by the gate.

Intel’s solution is a FinFET design that employs a ridge-like silicon channel that juts out of the silicon substrate. The transistor’s gate runs perpendicular to the channel and drapes over it, creating three gates around the channel.
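
One way to see the payoff: because the gate now wraps three sides of the fin, the effective channel width per unit of layout area goes up. A rough geometric comparison (the dimensions below are illustrative assumptions, not Intel's published 22nm figures):

```python
# Effective gate width of a tri-gate (FinFET) device versus a planar one.
# For a single fin, the gate controls two sidewalls plus the top, so
#   W_eff = 2 * H_fin + W_fin.
# All dimensions are assumed round numbers for illustration only.

fin_height = 34.0   # nm (assumed)
fin_width  = 8.0    # nm (assumed)
fin_pitch  = 60.0   # nm, fin-to-fin spacing (assumed)

w_eff = 2 * fin_height + fin_width   # channel width the gate controls per fin
planar_w = fin_pitch                 # planar channel width in the same footprint

print(f"W_eff per fin            : {w_eff:.0f} nm")
print(f"planar W, same footprint : {planar_w:.0f} nm")
print(f"drive-width gain         : {w_eff / planar_w:.2f}x")

# More gate-controlled channel per footprint means more drive current at a
# given voltage, or the same current at a lower, power-saving voltage.
```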

Intel reports its new 3D, 22nm transistors are 37 percent faster at low voltage compared to Intel's current 32nm planar transistors, making them better for smartphones and other handheld devices. They also draw less than half the power of the firm’s 32nm chips.

Despite that, the New York Times says, some think Intel is making a gamble, because alternative approaches may offer better gains on the power front:

There has been industry speculation that FinFET technology will give Intel a clear speed advantage, but possibly less control over power consumption than alternative approaches...The scope of Intel’s gamble is underscored by the fact that while the company dominates in the markets for data center computers, desktops and laptops, it has largely been locked out of the tablet and smartphone markets which are growing far more quickly than the traditional PC industry.

Intel made a tentative step into the low power market last month, with the introduction of its first processor for tablet computers.

Nuclear Regulator: Stopping Cybersecurity Threat Needs IT Experts and Systems Engineers

James Wiggins, director of the Office of Nuclear Security and Incident Response at the U.S. Nuclear Regulatory Commission (NRC), spoke today on future nuclear power plant security, emphasizing the need for it to be based in both operational and information technology expertise. While the role for IT professionals in power plants is crucial, he said, it must work in tandem with systems engineering expertise: understanding the physical consequences of cybersecurity threats is key to mitigating cyber risks.

The talk is part of a conference on Commercial Nuclear Power Cybersecurity, which got underway today with talks by representatives from a range of government security organizations, who each reviewed cybersecurity risks to nuclear power and the regulatory policies in development to deal with them. The University of Maryland conference brings together regulators, industry engineers, and academics to discuss collaborative approaches to deal with the unprecedented nature of cyber threats to an industry that’s already undergoing scrutiny from multiple angles.

Wiggins' perspective is rooted in his knowledge of power plant systems – he began his career with the NRC as a reactor inspector in 1980. But a lot has changed in system control since then -- more and more plants are using SCADA (supervisory control and data acquisition) to manage their operations digitally. Wiggins admitted that it’s tricky to apply the methods and paradigms of physical security to the digital realm.

The industry now has examples like Stuxnet to consider, and has to imagine scenarios, Wiggins said, like what someone could do with a thumb drive, or what someone could do within a single software program. The idea of securing self-contained parts of a power facility becomes moot when it houses a network of technologies.

That leads to Wiggins’ foremost cybersecurity concern: the complexity of the supply chain for nuclear power plant technologies. A nightmare scenario he has imagined, he said, is based on the network of companies and individuals involved in bringing digital systems to plants. While it’s possible to keep a plant disconnected from the Internet or to restrict physical access to it, SCADA system designs are harder to supervise, and encompass multiple entry points for threats, even at the operating-system level.

Wiggins said that the regulations and new plans that the NRC is refining in order to meet new cybersecurity challenges are ahead of cybersecurity preparation in other industries, but that it's a stretch to say that they are ahead of the curve.

Modifying the licensing process for future reactors is one component of new preparedness measures. Wiggins said that the NRC traditionally holds vendors responsible for operational success and failure in individual plants once they are certified. But in the construction of older plants, regulators discovered a series of mechanical problems caused by falsely certified materials. This led to a niche effort within the agency to prevent counterfeit and fraudulent materials from entering into nuclear power plant construction.

The 21st-century version of policing this kind of production involves the multifaceted SCADA systems used in modern plant operation and control. The NRC receives proposals for system component designs from different vendors, and certifies the ones that meet its regulatory criteria. Then, when licensees of new reactors work with the NRC, they can choose from NRC-certified designs. While this process works from a legal standpoint, Wiggins said, its role in mitigating modern manufacturing vulnerabilities is still in development.

Another high-priority challenge Wiggins addressed lies in NRC communications. He explained that when threats are identified by entities like the Department of Homeland Security, the NRC assumes the responsibility of alerting power plants. From there, the NRC presumes that the systems and personnel in the plants will react appropriately. However, he admitted that the communication system for disseminating such alerts may not yet function as well as he hopes it one day will.

Wiggins said communicating threats should ideally involve the right combination of information security professionals and operations engineers, but it's not always easy to map out a perfect communications network.

Taiwan R&D Group Pushes Packaging to Speed Development

A Taiwanese R&D center says it has come up with a packaging technology that will speed development of multi-chip systems.

Taiwan's National Chip Implementation Center (CIC) calls the technology MorPACK, for morphing package. MorPACK is a three-dimensional (3D) embedded-system integration platform that can integrate several different kinds of chips. CIC says the technology will cut down development time and cost, because it allows for the easy integration of chips with different functions into one tightly-bound package.

MorPACK is a stack of sub-modules, each composed of bare chips, a fiberglass and copper substrate, connection bridges, and solder balls. The lower part of the structure, called the common platform, consists of a processor, a north-bridge (a chip linking the processor to memory chips), synchronous DRAM, NOR flash memory, and a south-bridge (a chip linking the processor to peripheral devices). The upper part, called the custom substrate, is designated for a chip the user wishes to develop as the unique part of the system. This can be an IC designed to accelerate some specific function, such as decryption, or a programmable logic chip (FPGA).
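
To make the layering concrete, here is the stack described above expressed as a toy data structure (the names and fields are illustrative, not CIC's specification):

```python
from dataclasses import dataclass

# Descriptive toy model of the MorPACK stack described above.
# Field names and values are illustrative, not CIC's specification.

@dataclass
class SubModule:
    name: str
    chips: list[str]                     # bare dice mounted on the substrate
    substrate: str = "fiberglass + copper"
    interconnect: str = "connection bridges + solder balls"

# Lower half: the fixed platform every design shares.
common_platform = SubModule(
    name="common platform",
    chips=["processor", "north-bridge", "SDRAM", "NOR flash", "south-bridge"],
)

# Upper half: carries the user's unique chip, e.g. an accelerator ASIC or FPGA.
custom_substrate = SubModule(
    name="custom substrate",
    chips=["user accelerator IC or FPGA"],
)

morpack_stack = [custom_substrate, common_platform]  # top-down stacking order
for module in morpack_stack:
    print(module.name, "->", ", ".join(module.chips))
```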

"The MorPACK platform was invented to share part of electronics developers' workload,” says Chun-ming Huang, who leads the MorPACK development team at CIC. “They now just need to spend time and money tapping out accelerator [intellectual property] cores invented by themselves rather than on the whole module."

The CIC spent two years designing the MorPACK platform. The hope was to give university designers more integration than was allowed by existing university-serving prototyping systems, such as Multi-Project System-on-a-chip (MP-SoC). Although the MP-SoC concept has greatly reduced the fabrication cost of SoC designs, it can’t integrate fundamentally different types of chips onto the same piece of silicon.

Huang stressed that the MorPACK technology allows electronics developers to stack chip modules with different functions on top of each other.  Therefore, it can help save space and resources on a carrier board.  "The dense three-dimensional structure guarantees better performance and less power demand than that of a huge two-dimensional complex IC board,” Huang says.

Right now, the MorPACK platform measures 4 by 4 centimeters, but Huang says that his team has evaluated the possibility of shrinking it. "We aim to downsize it to one fourth next year, by adopting the (more expensive) flip-chip packaging technology, without compromising existing advantages," he says.

In Taiwan, seven research teams at six universities have adopted the MorPACK platform to develop IP cores needed by intelligent systems for biomedical, 3D communication, image processing and other applications. According to Tzi-dar Chiueh, the director general of CIC, for each IP core-design project, the reusable MorPACK platform on average helps researchers cut down development time by three to six months and reduce cost by NT$ 1.5 million ($50,850).

Chiueh says the center plans to transfer the technology to the private sector next year.

Workers Prepare to Enter Fukushima Reactor No. 1

Fuel Pool: Fuel rods in the reactor No. 4 spent fuel pool are seen in a still taken from a video and released by TEPCO.

Special Report: Fukushima and the Future of Nuclear Power

Editor's Note: This is part of IEEE Spectrum's ongoing coverage of Japan's earthquake and nuclear emergency. For more details on how Fukushima Dai-1's nuclear reactors work and what has gone wrong so far, see our explainer.

Workers are planning to enter the damaged No. 1 reactor building of the Fukushima Dai-1 Nuclear Power Plant this week to install filtering equipment for removing radioactive contamination from the air. Tokyo Electric Power Co. (TEPCO) said on Monday that preparations were underway for workers to place four devices in the building with the goal of reducing the radiation to a level low enough to create a “working environment inside the reactor building.” This is a necessary preliminary step before the company can begin to put in place a stable circulating cooling system for the reactor, whose original cooling system (along with those of the No. 2 and 3 reactors) was damaged by the tsunami that followed the 11 March earthquake.

A first step in restoring a cooling system is finding a way to better calibrate the water levels in the reactor, so TEPCO intends to install a new water-level gauge. The company has been injecting water into the pressure vessel via the “Low Pressure Injection Line” on a trial basis. The water injection is part of urgent efforts to fill the vessel above the fuel rods and so cool the reactor. But pressure inside the reactor vessel began falling after the procedure began, creating the potential for outside air to enter the vessel and lead to an explosion. TEPCO had set a goal of injecting up to 14 tons of water per hour, but reduced this to 10 tons and now has cut back the figure to six tons, in order to maintain an adequate internal pressure. 

Figuring out the water level in the reactor will necessitate a work team entering the building, but first it must be cleared of most of the radioactive air. To achieve this, a team will install four ambient air filtration systems in a passageway near the double doors, or airlock, of the reactor building. Each device, supplied by The Japan Environment Research Company, will be powered by temporary diesel generators and will clean the circulating air at a rate of 1680 cubic meters per hour. The company said that the workers installing the equipment will be limited to being in the building for ten minutes at a time, because of the high radiation levels.

A TEPCO official explained during a press briefing that they planned “to reduce the density of air contamination in the building by about 95 percent.” The workers could start installing the devices this Thursday.
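
TEPCO hasn't published the engineering behind that 95 percent target, but a standard well-mixed ventilation model gives a feel for the task. Assuming the four units scrub essentially all contamination from the air passing through them, airborne concentration decays exponentially with the air-exchange rate (the building volume below is a placeholder assumption, not a released figure):

```python
import math

# Well-mixed room model: dC/dt = -(Q/V) * C, so C(t) = C0 * exp(-Q*t/V).
# Time to reach a fraction f of the starting concentration: t = (V/Q)*ln(1/f).
# Assumes ideal filters, well-mixed air, and no fresh releases; the building
# volume V is an assumed placeholder, not a TEPCO figure.

Q = 4 * 1680.0   # m^3/h: four filter units at the reported flow rate
V = 25_000.0     # m^3: assumed free-air volume of the reactor building

t_95 = (V / Q) * math.log(1 / 0.05)   # hours to a 95 percent reduction
print(f"air changes per hour : {Q / V:.2f}")
print(f"time to 95% cleanup  : {t_95:.1f} hours")
```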

TEPCO is also continuing to pump nitrogen gas into the No. 1 reactor’s containment vessel to prevent the possibility of a hydrogen explosion like the one that struck the reactor building on 12 March.

Over at Reactor No. 2, workers continue to remove the highly radiated water that has pooled in the basement of the reactor’s turbine building and in trenches outside the building. “We have already transferred more than 2500 tons of water to the radiated waste disposal facility,” said Hidehiko Nishiyama, the deputy director-general of the Nuclear and Industrial Safety Agency, in a press briefing on May 2.

Nishiyama added that TEPCO will also have to inject nitrogen into the No. 2 reactor but faced the challenge of dealing with a suspected hole in the suppression pool or torus. “One idea is to fill the torus room where the suppression pool is located with sticky concrete,” said Nishiyama. “But there are some difficulties such as how to introduce the concrete from outside the building.”

A TEPCO official told IEEE Spectrum that the company also plans to construct a temporary wall or dyke to protect the No. 3 and 4 reactors, which are considered the most vulnerable to another tsunami, should one follow any of the major aftershocks that have continued since the original quake. The official said the wall would stretch for about 500 meters and be between 1 and 2 meters high, but built on ground some 10 meters above sea level. He added that the design was still under consideration.

Separately, Banri Kaieda, head of the Ministry of Economy, Trade and Industry, hinted at a plan to have all of Japan’s power companies with nuclear plants help pay compensation to those hurt by the Fukushima incident. Kaieda revealed this idea during an interview with the Yomiuri Shimbun, Japan’s largest newspaper. He said he didn’t want to see electricity rates hiked to do this, but rather urged the companies to sell assets and cut expenses.

“Domestic power companies have relied on their monopolization of regional energy supplies, profiting from their agreement not to interfere with each other,” Kaieda is quoted as saying. “Considering their mutual dependence on each other, I want them to share TEPCO’s burden of compensation.”

Forensics Experts Say iPhone Tracking Isn't All That Accurate

Last week the iPhone user community was abuzz after researchers presenting at the Where 2.0 conference announced their discovery that iPhones collect location data—and that this information can be used to create a track of the user’s roamings.

While this may have been shocking to iPhone users, it was old news to the law enforcement community, which has for several years used the iPhone’s tracking file as part of standard operating procedure when analyzing a phone for evidence.

And, it turns out, the data being stored isn’t all that useful for tracking a user and his iPhone. The data is simply not reliable, and there seems to be way too much of it. Because a variety of apps use location data, it’s stored with no consistency. It shows routes that may be off by whole streets or avenues. It shows activity that may be off in time by days or weeks.

There is a reasonable explanation for all of this confusion. The data was not designed to track your location; it was designed to improve your location identification.

The AT&T iPhone does not contain what is called the enhanced 911 chip, designed to locate a mobile phone by the use of two GPS satellites (versus the minimum three for conventional GPS devices). Instead, it relies on multiple technologies to determine its location-- satellites, WiFi hotspots, and cell towers combined depending on their availability. This hybrid method is very inaccurate in rural areas, though is usually slightly better in urban areas.  And while someone analyzing the collected data can often identify a basic route, it’s usually only accurate to within five to ten miles.

Here’s the way the data collection typically works.

The iPhone combines the reception and transmission of all of its wireless technologies (3G cell phone communications, Wi-Fi receivers and transmitters, and GPS receivers) into just two antennas. These are integrated into the metal ring around the camera, the audio jack, the metal screen bezel, and the iPhone circuitry itself.

Using these antennas, the iPhone doesn’t always get a solid GPS signal, especially when indoors or in basements. So for speed and better accuracy, it relies on a process called Assisted GPS (A-GPS).

Before looking for a GPS signal, it gets a rough estimate of its location from the cell phone network, which makes this determination based on signal strength from the nearest towers. It then tries to get location data from the GPS satellites, but if this isn’t available, it checks to see if the phone is getting any WiFi signals; based on the known location of those WiFi nodes the iPhone can help pinpoint the phone’s whereabouts. (Apple is leveraging a public crowd-sourced database of known WiFi hotspots to aid in this geo-locating.) If there aren’t any WiFi nodes available, it turns again to the cell phone network, where Apple is leveraging another public database of known cell towers. Using these known tower locations to collect the latitude, longitude, and the compass heading from where the iPhone is in relation to a nearby tower, the iPhone stores that information with a date/time stamp in a file called Consolidated.db.
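
In rough pseudocode, that positioning flow is a cascade of fallbacks. Here is a simplified sketch of the logic just described; every function name is a hypothetical stand-in (this is not Apple's API), with stubs that merely simulate signal availability so the control flow runs:

```python
import random

# Simplified sketch of the A-GPS fallback cascade described above.
# Every function is a hypothetical stand-in, NOT an Apple API; the stubs
# simulate whether each signal source happens to be available.

def cell_network_estimate():          # coarse fix from tower signal strength
    return (40.44, -86.91)            # placeholder coordinates

def gps_fix(seed):                    # satellites: precise, but often blocked indoors
    return seed if random.random() < 0.5 else None

def wifi_fix():                       # visible hotspots matched against the
    if random.random() < 0.7:         # crowd-sourced database of known nodes
        return (40.45, -86.90)
    return None

def tower_fix():                      # last resort: known-tower database, very coarse
    return (40.4, -86.9)

def locate_phone():
    fix = gps_fix(seed=cell_network_estimate())  # start from the rough estimate
    if fix is None:
        fix = wifi_fix()
    if fix is None:
        fix = tower_fix()
    return fix    # this is what gets time-stamped into Consolidated.db

print(locate_phone())
```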

Inside of the Consolidated.db database are five tables: CellLocation, CellLocationLocal, CellLocationHarvest, WifiLocation, and WifiLocationHarvest. The non-Harvest tables are directory listings of known cell tower and WiFi node locations, each stored with a degree of confidence. The Harvest tables appear to be temporary tables used for updating the CellLocation or WifiLocation tables. These lists represent only a small sample of the known cell towers and WiFi hotspots; the full database would be too large to store on the device.
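
Since Consolidated.db is an ordinary SQLite file, anyone with a copy (say, from a phone backup) can inspect it. A hedged example of pulling recent cell sightings; the table and column names follow what researchers have reported and may vary across iOS versions:

```python
import sqlite3

# Inspect recent cell-tower sightings in a copy of Consolidated.db.
# Table/column names (CellLocation, Timestamp, Latitude, Longitude) are as
# reported by researchers and may differ across iOS versions.

conn = sqlite3.connect("Consolidated.db")   # path to your copied database
rows = conn.execute(
    "SELECT Timestamp, Latitude, Longitude FROM CellLocation "
    "ORDER BY Timestamp DESC LIMIT 10"
).fetchall()

for ts, lat, lon in rows:
    # Timestamps are reportedly seconds since 2001-01-01 (Apple's reference
    # epoch), not the Unix epoch: one more trap for a naive analyst.
    print(f"{ts:>14.0f}  {lat:9.4f}, {lon:10.4f}")
conn.close()
```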

Consolidated.db doesn’t store snapshots of location data constantly throughout the day; rather, it grabs coordinate data when an application requests it, and occasionally rechecks to make sure the phone hasn’t strayed from its last recorded location.
With so many different developers creating applications that rely on the location features of the iPhone, the quantity and time increments of the location information collected and stored in the Consolidated.db file vary. The time-stamp information has occasionally been found to be incorrect, as each application collects and updates this information on whatever schedule it sees fit. A time stamp may reflect an update from Apple’s database, or the last time the cell tower or WiFi hotspot was located. Finally, an application can request a location fix with greater or lesser accuracy; some applications look for more WiFi nodes or cell towers to sharpen the fix.

So it turns out that the iPhone tracking database is truly all over the map. But a few things about the location data stored in Consolidated.db are certain. The information stored won’t show that you spend more time in certain locations than others. In other words, someone won’t be able to look through your data and figure out where you live or work or where your child goes to school. Nor will Consolidated.db define the paths you have traveled over specific periods of time, though it may loosely confirm, based on the time updates for those records, the places you and your iPhone have visited.

Law enforcement agencies will continue to use iPhone location data and other related intelligence to corroborate information, originally obtained from other sources, about where phone users have been. But relying on the iPhone data as evidence in itself may be a crime.

Richard Mislan is a Professor of Cyber Forensics at Purdue University. He wrote Cellphone Crime Solvers published in the July 2010 issue of IEEE Spectrum. Mislan edits his iPhone Consolidated.db regularly to confuse anyone who might look at it.

Tech Firms Looking for Software Talent

Looks like this might be a good time to be a software engineer.

In January, the career guidance site CareerCast.com rated Software Engineer as the best job for 2011. The website evaluates 200 professions covering a variety of industries, skill levels and salary ranges, and bases its rankings on five criteria: work environment, physical demands, outlook, income, and stress. According to the survey, software engineers have one of the least stressful jobs.

Our thirst for social media, cell phone apps and iPads is leading to more jobs for computer specialists of all stripes. Programmers, app developers, website and product designers, and cloud specialists all seem to be in demand at tech companies. From CareerCast.com:

"…the strong performance of Software Engineer this year can be attributed to two emerging industries: web applications and cloud computing. A proliferation of companies making applications for smartphones and tablets, along with the push to develop "cloud" software hosted entirely online, has made the job market for Software Engineers broader and more diverse. And a diverse job market brings improvements in stress factors such as Growth Potential and Competitiveness, as workers become less beholden to employers or vulnerable to outsourcing."

Early this year, Software Engineer was the job title with the largest number of openings at NetApp, Cisco and Intel (350, 407 and 250 respectively), companies that are on Fortune Magazine’s 2011 list of best workplaces.

Internet and social media icons Google, Facebook, Twitter, and Zynga are leading a hiring frenzy in Silicon Valley. These and other technology companies are vying for talented software engineers and developers, luring potential employees with unusual perks.

Google has announced that it plans to hire more employees this year than it did in 2007, when it added over 6100 workers. Facebook, which has roughly 2000 employees (a tenth of whom are said to have been at Google previously), is moving this summer from Palo Alto, CA to a new site in Menlo Park with a capacity for 3600 workers. "As the company grows, we’re aggressively hiring the best talent we can in the quickest manner possible," says Facebook representative Slater Tow.

Social gaming company Zynga expects to double its 1500-strong workforce over the next year, while Twitter plans to grow from over 400 employees right now to 3000 by 2013.

The hiring spree for IT talent isn’t limited to the US, either: Google and Microsoft are posting hundreds of jobs in Europe, Australia, India, and Japan.

These tech bigwigs aren't just competing for talent with each other. They also face competition from smaller web-based startups that are proliferating because of the recession.

Finally, a promising note from the Bureau of Labor Statistics: "Computer software engineers are among the occupations projected to grow the fastest and add the most new jobs over the 2008-18 decade, resulting in excellent job prospects."

PHOTO: Canadian Veggie, Flickr

TEPCO Gets Help in Cold Shutdown Plan at Fukushima

Special Report: Fukushima and the Future of Nuclear Power

Editor's Note: This is part of IEEE Spectrum's ongoing coverage of Japan's earthquake and nuclear emergency. For more details on how Fukushima Dai-1's nuclear reactors work and what has gone wrong so far, see our explainer.

Six weeks after the Great East Japan Earthquake struck the northeastern coast of the country and crippled the Fukushima Dai-1 Nuclear Power Plant, a stark reminder of what might still lie ahead came on April 26, the 25th anniversary of the Chernobyl nuclear accident in the former Soviet Union. A quarter of a century on and the International Atomic Energy Agency continues to “assist the Ukrainian government in evaluating the amount and types of radioactive waste that has accumulated at the nuclear power plant.” And the plant is still awaiting decommissioning, “a mammoth undertaking that requires extensive planning,” according to the agency.

Fukushima, too, faces a long road. Tokyo Electric Power Co. (TEPCO) unveiled a roadmap last week aiming to produce a cold shutdown of the three damaged reactors within the next nine months, but many experts believe that the timetable is based more on hope than on realistic goals, given the many hurdles that remain. Not least among these challenges are the high radiation levels preventing onsite workers, who are already fatigued, from carrying out their tasks for any sustained length of time.

“I think (the roadmap) is a very optimistic one, full of expectations, and I have grave doubts that it is viable,” says Atsushi Kasai, former laboratory chief of Japan’s Atomic Energy Agency. “No one really knows what the three reactors are like inside because of the very high radiation levels (in the reactor buildings). The workers will have to go in to investigate the situation. We simply don’t have enough data to know how long it will take to bring them to a cold shutdown.”

TEPCO is not battling alone in its fight to close the power station. Toshiba Corp., which supplied two of the reactors at the site, and Hitachi Ltd., which provided one, each now have over a thousand workers involved in helping to stabilize the plant, and Hitachi is working on its own shutdown plan.

To date, a total of 600 Toshiba employees have worked at both the Fukushima Dai-1 and Dai-2 sites (Dai-2 is in a state of cold shutdown), “with about 140 or 150 employees working as needed on a rotational basis at the Dai-1 site,” a Toshiba official told Spectrum. He added that while Toshiba has provided TEPCO with dosimeters, electric motors for cooling water pumps, transformers, distribution switchboards, and power cables, Toshiba itself is receiving a variety of assistance from its collaborative partners in the power business in the United States, including Westinghouse Electric Co., civil engineers The Shaw Group, power technologies provider Babcock & Wilcox, and utilities company Exelon Nuclear Partners.

“And with these companies we’ve also formed a task force and submitted a decommission draft plan to TEPCO,” the official said. “After the plant is stabilized, the plan outlines the opening up of the reactor, removal of the fuel rods and transporting them to a safe place.” The draft also includes steps to dismantle the plant and remove the radioactive debris. “The minimum time given is ten years,” the official added. “This could be extended because there might be complications.”

Hitachi, together with Hitachi-GE Nuclear Energy Ltd. and Hitachi-GE Nuclear Holdings LLC, also has over a thousand “staff members providing support and assistance where needed,” according to a Hitachi official. Earlier, the company submitted its own decommission plan to TEPCO, “but it was more a general outline,” said the official. “Now we are working with our [GE joint ventures], Exelon, and Bechtel Corp. on a more detailed proposal. But until we can check the condition of the reactors, we cannot give a timeline on how long it will take to decommission.”

Meanwhile, the battle to keep the three reactors in a relatively stable state continues. Today, TEPCO sent a remote-controlled robot into the No. 1 reactor building to check the reactor for water leaks. It found no leaks and no significant rise in radiation levels. As a result, the company will test increasing the amount of water it is injecting into the reactor from a daily 6 metric tons to a maximum of 14 metric tons starting Wednesday. The aim is to steadily boost the amount of water collecting in the torus, or suppression pool, until it covers the pressure vessel and can cool it down from the outside.

The company is also reconnecting outside power cables feeding the site so that in the event of another earthquake and tsunami, the plant might not suffer another total power outage.

 
