Tech Talk

Robots: The Expensive Way to Prepare Cheap Food

If you've ever watched the giant container loaders in Elizabeth, N.J., or Yokohama Harbor, you've probably wondered if the same robotic technologies could be used to make ramen soup.

Okay, maybe you never have, but someone seems to—Kenji Nagoya, said to be an industrial robot manufacturer and owner of a new fast-food restaurant where bowls of ramen in pork broth are prepared almost entirely by a pair of robots that look, to me at least, a bit like the container loaders I see from the New Jersey Turnpike.

In a widely copied Reuters video report, Nagoya says, “The benefits of using robots as ramen chefs include the accuracy of timing in boiling noodles, precise movements in adding toppings and consistency in the taste."

The robots are reported to be able to make only 80 bowls a day (though the automated process, which includes heating but not making the broth, is said to take less than 2 minutes per bowl). The bowls sell for $7 apiece. That gives the shop a total daily revenue of $560, which has to cover the cost of the ingredients, electricity, rent, and the humans who make the broth, serve customers, take their money, and so on. And the cost of the robots themselves, of course.
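The arithmetic is easy to check; a quick sketch (the 80-bowl limit and $7 price are from the news reports, while everything on the cost side is unknown):

```python
# Daily revenue ceiling for the robot ramen shop: at most 80 bowls
# a day at $7 a bowl. The cost side (ingredients, rent, electricity,
# human staff, the robots themselves) is not public.
bowls_per_day = 80
price_per_bowl = 7  # dollars

daily_revenue = bowls_per_day * price_per_bowl
print(daily_revenue)  # 560
```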

The shop therefore doesn't make a profit for Nagoya, but it's a great proof of concept and might someday lead to restaurant robots inexpensive enough to replace all those imprecise high school students currently preparing our fast food. (By the way, it's unclear to me whether Nagoya has anything to do with the soon-to-be-closing robot museum in the town of Nagoya.)

There's an additional video of the Nagoya ramen robots here.

The Nagoya robot story has completely overshadowed a robot “somewhere in Yamanashi,” Japan, that also helps make ramen soup. Restaurateur Yoshihira Uchida, for whom the robot was created, took the exact opposite strategy: having the robot custom-prepare the broth with “40 million recipes”—combinations of broth ingredients—while a human chef makes the noodles.
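Forty million recipes sounds astronomical, but independent choices multiply quickly. A sketch with an entirely hypothetical ingredient breakdown (Uchida's real one isn't public):

```python
# Entirely hypothetical breakdown: the point is only that independent
# choices multiply. Seven broth ingredients with 10 options each, plus
# one with 4, already yields 40 million distinct combinations.
from math import prod

options_per_ingredient = [10, 10, 10, 10, 10, 10, 10, 4]  # illustrative
combos = prod(options_per_ingredient)
print(combos)  # 40000000
```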

This Article Has Been Revised to Reflect the Following Correction

Last week, a slew of news outlets (emphasis on “outlets,” as if it were a contraction of “outhouse” and “toilets”) published a story about a man allergic to a particular radio signal. Not just a particular radio frequency, which would be crazy enough, but a particular air interface, a particular protocol, if you will: Wi-Fi signals. No, really. Here's the Daily Mail's headline:

Allergic to wi-fi! How 'electrosmog' leaves Afterlife DJ in agony

I know what you're thinking. Wi-Fi, at least the most popular flavors of it, uses the same 2.4 GHz frequency as cordless phones, Bluetooth headsets, and the microwave oven that Steve Miller, aka “Afterlife DJ,” probably pops his popcorn in. How could someone be allergic to Wi-Fi and not to a phone or microwave using the same frequency?

You shouldn't have to know anything about the IEEE 802.11 standard to instantly see that the story is nonsense, but apparently you do if you work for any of the publications that took to the story like a lemming to the sea.

Fox News—you know, the fair and balanced people—took it and ran (Man Allergic to Wi-Fi, Makes Him Sick, Dizzy, Confused), apparently getting it, like a virus, from The Sun.


This isn't an occasional phenomenon; it seems integral to the Web. And not just the Web: it's probably a story as old as history itself, or at least the 1970s. The national public radio watchdog show “On the Media” had a great piece (“Too Good to Check”) this weekend about crazy claims about Walter Cronkite that have been around for decades and that resurfaced on the occasion of his recent death.

Did you know, for instance, that Uncle Walter is so identified with the news business that in Sweden an anchorman is called a "Kronkiter"? And speaking of anchorman, did you know that the word was coined in the '50s to define Cronkite's role on broadcast TV? Turns out, despite what many media eulogies would have you believe, neither of those facts I just asserted is exactly true.

On the Media host Bob Garfield traced the virus's 30-year etiology with the help of Ben Zimmer, executive producer of the Visual Thesaurus.

BOB GARFIELD: Let's start with the Kronkiter bit. Read me, please, the excerpt from the AP obit.

BEN ZIMMER: Well, the obituary that ran in many newspapers came from The Associated Press. The version that ran in The Chicago Tribune, for instance, said, "Cronkite was the broadcaster to whom the title 'anchorman' was first applied. In Sweden, anchors were sometimes termed 'Kronkiters.'" That's with a K. "In Holland, they were 'Cronkiters.'" That's with a C.

BOB GARFIELD: It scans. I mean, it sort of sounds possible. But what you did was go back to see if it was, you know, true. What did you discover?

BEN ZIMMER: Well, I was not able to discover any evidence in Swedish, Dutch or any other language that news anchors were ever called Kronkiters. So I tried to figure out, well, who started telling this anecdote? And when I first looked, the earliest example I could find was in a 1978 book called Air Time: The Inside Story of CBS News, written by Gary Paul Gates, who was at one time a news writer for Cronkite.

Then in 1979, David Halberstam wrote The Powers That Be, and similarly he had, in Sweden, anchormen were known as Kronkiters. It seemed that these were the earliest examples of this story being told. And what I did was I actually contacted Gary Paul Gates to find out where he got the story from, and it turns out he says he got the story from Halberstam.

At least the Web, when it taketh away the truth, can also giveth it back. For example, when a responsible publication messes up, it can correct itself with lightning speed. And not in some minuscule correction published in an obscure corner of the paper days later that leaves the original nonsense untouched. The corrections can be made to the original article (something Foxnews.com, the Daily Mail, and The Sun have yet to do, by the way), with, hopefully, an editorial note describing the changes: all seven of them, in the case of—speaking of Walter Cronkite—an “appraisal” of the late great newscaster written by New York Times ace appraiser Alessandra Stanley.

Check out the mammoth 200-word correction (numbers added):

An appraisal on Saturday about Walter Cronkite's career included a number of errors. (1) In some copies, it misstated the date that the Rev. Dr. Martin Luther King Jr. was killed and (2)referred incorrectly to Mr. Cronkite's coverage of D-Day. Dr. King was killed on April 4, 1968, not April 30. Mr. Cronkite covered the D-Day landing from a warplane; he did not storm the beaches. In addition, (3) Neil Armstrong set foot on the moon on July 20, 1969, not July 26. (4) “The CBS Evening News” overtook “The Huntley-Brinkley Report” on NBC in the ratings during the 1967-68 television season, not after Chet Huntley retired in 1970. (5) A communications satellite used to relay correspondents' reports from around the world was Telstar, not Telestar. (6) Howard K. Smith was not one of the CBS correspondents Mr. Cronkite would turn to for reports from the field after he became anchor of “The CBS Evening News” in 1962; he left CBS before Mr. Cronkite was the anchor. Because of an editing error, (7) the appraisal also misstated the name of the news agency for which Mr. Cronkite was Moscow bureau chief after World War II. At that time it was United Press, not United Press International.

Correction: make that eight corrections. A week later, the Times added yet one more:

This article has been revised to reflect the following correction: Correction: August 1, 2009 An appraisal on July 18 about Walter Cronkite's career misstated the name of the ABC evening news broadcast. While the program was called “World News Tonight” when Charles Gibson became anchor in May 2006, it is now “World News With Charles Gibson,” not “World News Tonight With Charles Gibson.”

Why stop there? We can get to an even 10 for the Times on the subject of Cronkite if we count the two—yes, two—separate corrections to its obituary of (as opposed to appraisal for) Der Kronkiter.

This article has been revised to reflect the following correction:

Correction: July 21, 2009 Because of an editing error, an obituary Saturday about the CBS newsman Walter Cronkite misspelled the name of the church in Manhattan where his family plans to hold a private funeral service. It is St. Bartholomew's, not Bartholemew's.

This article has been revised to reflect the following correction:

Correction: July 23, 2009 An obituary on Saturday about Walter Cronkite misidentified the country in which he crash-landed a glider as a United Press correspondent in World War II. It was the Netherlands, not Belgium.

It's a wonderful thing, the Web: a gargantuan fact-checking machine. We're lucky to have one. It's just too bad that, in the era of the Web, we need one so often. Like the pharmacy owner in the Mad Magazine cartoon who sells both chocolate ice cream and acne medication, the Web fuels its own arms race of truth and falsity.

Devices for Diabetics Expand Inward and Outward

Funding and advising the development of an artificial pancreas is a major long-term initiative at the FDA. A couple of interesting advances have recently been made, both commercially and in research, that seem to bring us closer to this goal.

According to the FDA, an artificial pancreas would consist of three components:

"(1) an infusion pump to deliver the required drug, many of which are already available; (2) a continuous glucose monitor, several of which have been approved by the FDA for tracking and trending glucose levels; and (3) an algorithm to communicate between the pump(s) and glucose monitor. An algorithm will receive information from the glucose monitor and convert it to instructions for the infusion pump."
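The third component, the control algorithm, is the heart of the loop: it turns monitor readings into pump instructions. A minimal sketch of that idea, using a bare proportional controller with made-up numbers (real artificial-pancreas algorithms are far more sophisticated and safety-checked, and nothing here is clinical):

```python
# Minimal sketch of the closed loop: read glucose, compute an insulin
# dose, command the pump. A bare proportional controller; real
# artificial-pancreas algorithms are far more sophisticated, and
# every number here is illustrative, not clinical.

TARGET_MG_DL = 110   # hypothetical target glucose level
GAIN = 0.01          # hypothetical units of insulin per mg/dL of error

def insulin_dose(glucose_mg_dl):
    """Convert a monitor reading into a pump instruction (units)."""
    error = glucose_mg_dl - TARGET_MG_DL
    return max(0.0, GAIN * error)  # never command a negative dose

print(insulin_dose(180))  # reading above target: positive dose
print(insulin_dose(90))   # reading below target: no insulin
```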

Some continuous blood glucose monitoring devices are already on the market (here's a good comparative chart). All give periodic updates of blood glucose levels measured from a sensor inserted just beneath the skin. But all fall short, in some serious way or another, of what an artificial pancreas would require. A huge problem, it seems, is the lifespan of the device. The sensors in these models last only a few days and have to be reinserted regularly. Furthermore, the sensor is only partially implanted and connects to a transmitter through the skin.

Last month, engineers at the University of Calgary published an alternative design that mounts a glucose sensor onto a transponder chip. An external reader inductively powers the chip while reading the glucose level, eliminating the need to hook it up to a battery-powered transmitter. This makes the device very small, and thus more durable in the body. Removing the need for a battery also means that the entire chip and sensor can be fully implanted under the skin.

The design also uses an alternative chemical reaction to measure the glucose levels in the body, one that doesn't require oxygen. The oxygen-driven reaction used by other devices produces hydrogen peroxide that can corrode the sensor.

The device hasn't been tested in an organism yet, and once that happens it will be interesting to see how accurate it actually is, but these are definitely ideas that could improve available models.

Another thing that is changing is the extent to which these glucose monitors can communicate with computers and other devices. The MyGlucoHealth system uses a traditional pin-prick glucose meter but outfits it with a USB cable and Bluetooth capability, making it possible to synchronize data with a diabetes management system. It also keeps doctors and patients up to date on individuals' glucose-level fluctuations with text messages and email. This kind of network is likely to be vital in a system that closes the loop with an automatic insulin infusion pump.

Web 2.0 Meets Public Engagement in Nanotechnology

The UK government is taking the idea of public engagement with nanotechnology quite seriously. And it seems that the interactive capabilities of Web 2.0 were just the tool it needed to put this seriousness to work.

First we had BIS (the Department for Business, Innovation & Skills) launch a website earlier this month urging people to offer their opinions on the UK government's nanotechnology strategy and even shape its final form.

The premise of the BIS site was characterized by at least one UK-based nanotech expert as a “crowd-sourced nanotechnology strategy.” The site provides a SWOT analysis for each chapter; the chapters are divided between cross-cutting themes and industry sectors, and each comes with a handful of questions.

But for all the questions, it remains a fairly static site. The questions are posed for you rather than by you, for instance. And visually it gives off the aura that this material is not to be touched. One might say it's the 1.0 of Web 2.0 in design and feel.

On the other hand, a new UK public engagement website called Nano&me, set up by an organization called the Responsible Nano Forum and funded by a grant from the Department for Innovation, Universities and Skills, takes the visual and interactive capabilities of Web 2.0 and turns them up to 11.

I should say, in the name of full disclosure, that I helped edit some of the site's copy. But this material almost seems incidental in the context of the site, which gives every visitor an opportunity to produce their own copy and their own point of view, and to set the ground rules for the debate. Quite different from the BIS site, which tells you what the questions are and asks you just to respond to those.

I think if one were really to press the owners of these two sites on what they expect the sites to accomplish, you would probably finally get the answer that they are experiments, and that the truth is the outcomes are quite uncertain.

As a self-confessed cynic, I am not sure that these sites perform much of a civic duty other than to give politicos something to cover their pudendum and for the public to have the false sense that they are actually involved in shaping some policy even if it’s something as esoteric and ultimately meaningless to them as nanotechnology.

But even as a cynic, I have to admit that it is hard to know how these sites will turn out and what kind of impact they will ultimately have.

Measure for Measure

My daughter is moving to Colorado, and she and her older brother recently took a four-day road trip from his house in Pennsylvania to her new apartment. They both posted their photos on Shutterfly; many of the shots are essentially the same.

The big difference between their photo albums—and it makes a huge difference—is that he took a few extra minutes to add captions. So from his pictures I know that the bronze statue of Abe Lincoln they both stood next to is in Vandalia, which was once the capital of Illinois; that the baseball game they went to was in Kansas City; and that the giant cross by the highway is in Effingham, Illinois—which is enough information for a Google search that says the cross is 198 feet tall and was built by the Cross Foundation.

Of course, location information, such as Vandalia, Ill., and Kansas City, Mo., can already be included in a photograph’s metadata, if the camera has GPS. In my ideal universe, all cameras would, and they’d even have little keyboards so you could add a caption right when you take a picture. Photo metadata is phenomenally useful, and, in a world of photo clouds like Shutterfly and Flickr, it’s getting ever more so.
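In miniature, that metadata is just structured key/value data traveling with the image. A sketch (field names are my illustration, loosely modeled on EXIF-style tags, not an actual EXIF implementation; the coordinates are approximate):

```python
# Sketch of the metadata a caption-and-GPS-aware camera could attach
# to each shot. Field names are illustrative, loosely modeled on
# EXIF-style tags; this is not a real EXIF implementation.

photo_metadata = {
    "filename": "IMG_0412.jpg",
    "timestamp": "2009-07-18T14:32:00",
    "gps": {"lat": 38.9606, "lon": -89.0937},  # Vandalia, Ill. (approx.)
    "caption": "Abe Lincoln statue, Vandalia, once the capital of Illinois",
}

def describe(meta):
    """Turn stored metadata into a human-readable caption line."""
    gps = meta["gps"]
    return f'{meta["caption"]} ({gps["lat"]:.4f}, {gps["lon"]:.4f})'

print(describe(photo_metadata))
```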

You know what else needs metadata? Engineering numbers. That’s the premise behind Allen Razdow’s new start-up, True Engineering Technology.

Razdow was a co-founder of MathSoft, the company behind Mathcad. In a way reminiscent of Stephen Wolfram’s idea that the Mathematica universe would benefit from databases that could be queried (thus, Wolfram Alpha), Razdow decided that the software on an engineer’s desktop, like Mathcad—but also Microsoft Office—needed metadata for the numbers that move from one program to another. If I had to choose one of these ideas as a winner, it would be Razdow’s.

The idea apparently came slowly to Razdow, who wrote MathCAD back in the 1980s (for MS-DOS!), and with good reason—it requires you to think of numbers in a paradoxical way. While the common conception of computers is that they turn everything into numbers, Razdow’s insight is that the reality is just the reverse.

We take an engineering number, maybe it’s the hydrogen permeability of palladium, or the specific gravity of the railroad ties you just shipped to a customer, and put it into a report that strips it of almost all of its meaning—what reference book the number came from, or when and where it was measured and by whom, the tolerances, and so forth. We take numbers that are ripe with engineering meaning and mathematical context and turn them into flat text. Often—and paradoxically this happens particularly with those bastions of number-crunching, spreadsheets—you don’t even directly know the unit of measurement, because that’s contained in a column heading or a footnote or some other surrounding text.

Consider all the numbers that get used and reused for years, within your company and outside of it. Imagine you’ve worked out a more precise measurement of the hydrogen permeability of the particular palladium alloy to be used in an upcoming product. Or auditors from the Nuclear Regulatory Commission or the Food and Drug Administration have arrived to examine a new report that contains 70 different key engineering numbers that need to be checked, 50 of which were taken from a report that was vetted last year.

Razdow has in mind a plug-in that would encourage you to create metadata for important numbers and would let a number retain that metadata when it’s cut and pasted from one application to another, whether it’s MathCAD to a Word document, or to a PDF, or vice versa. You can hover over someone’s number and see some of that metadata, or, if they make it public, you can click on it and get all of it from a Web page devoted to that number on a public site that Razdow’s company will maintain. True Engineering Technology will make its money by selling a server appliance that will host and manage engineering numbers within an enterprise.
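In code, the core of the idea is simply a value that refuses to travel without its context. A sketch (the class and field names are my illustration, not True Engineering Technology's actual format; the sample value, aluminum's density, is from standard reference data):

```python
# Sketch of a "number with metadata": a value that carries its unit,
# source, and tolerance wherever it is pasted. The class and field
# names are illustrative, not True Engineering Technology's format.
from dataclasses import dataclass

@dataclass(frozen=True)
class TrueNumber:
    value: float
    unit: str
    source: str       # where the number came from
    tolerance: float  # +/- on the value

    def __str__(self):
        return (f"{self.value} {self.unit} "
                f"(+/- {self.tolerance}, from {self.source})")

# Aluminum's density, with its provenance attached rather than
# stranded in a column heading or footnote:
density = TrueNumber(2.70, "g/cm^3", "CRC Handbook", 0.01)
print(density)
```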

It’s a clever and much-needed idea. Like a lot of other Web 2.0 notions these days—think RSS, for example—it will need widespread adoption by users—in this case engineers—and the software applications that they use. Here’s hoping that happens.

How Will Nano Change the World?

Thus is posed the question for the new video contest put on by the American Chemical Society. In the first contest, the question was simply “What is Nano?” and it turned out that question was best answered with puppets in full-throated song.

I enjoy watching videos as much as the next guy, but I have not quite figured out what purpose these videos are supposed to serve other than to compete in a contest. Are the editors of the ACS's Chemical & Engineering News supposed to become informed of how nanotechnology is going to impact the world in ways they hadn't considered before? Are these videos supposed to become teaching tools for preschoolers, as with the puppet video?

I am entertained but I don’t get the purpose, or maybe there isn’t one.

 

The National Register of Historic Newspapers

When you hear analysts explain what's killing the newspaper business, the answer invariably boils down to two words: “The Internet” or even one: “Craigslist.” A recent CNET story, “Pew Center illustrates how Craigslist is killing newspapers” is typical.

The use of online classifieds sites, such as Craigslist, has more than doubled in the past four years, according to a study published Friday by the Pew Research Center. At the same time that Web classifieds are on the rise, the classifieds business that newspapers once depended on has collapsed.

So it was interesting to hear David Simon interviewed on Bill Moyers Journal. He was on the show back in April, though I listened to it only this week, through the miracle of podcasting. Simon is famous for creating “The Wire,” the HBO series that many critics think is the greatest television show of all time (see, for example, here, here, and here). But before that, he was a reporter for the Baltimore Sun for more than a decade, and he had some interesting things to say about newspapers and the Internet:

Yes, we were doing our job. Making the world safe for democracy. And all of a sudden, terra firma shifted, new technology. Who knew that the Internet was going to overwhelm us? I would buy that if I wasn't in journalism for the years that immediately preceded the Internet because I took the third buyout from the "Baltimore Sun." I was about reporter number 80 or 90 who left, in 1995. Long before the Internet had had its impact. I left at a time-- those buyouts happened when the "Baltimore Sun" was earning 37 percent profits.
You know, we now know this because it's in bankruptcy and the books are open. 37 percent profits. All that R&D money that was supposed to go in to make newspapers more essential, more viable, more able to explain the complexities of the world. It went to shareholders in the Tribune Company. Or the L.A. Times Mirror Company before that. And ultimately, when the Internet did hit, they had an inferior product-- that was not essential enough that they could charge online for it.
I mean, the guys who are running newspapers, over the last 20 or 30 years, have to be singular in the manner in which they destroyed their own industry. It-- it's even more profound than Detroit making Chevy Vegas and Pacers and Gremlins and believing that no self-respecting American would buy a Japanese car in 1973. That-- it's analogous up to a point, except it's not analogous in that a Nissan is a pretty good car, and a Toyota is a pretty good car. The Internet, while it's great for commentary and froth doesn't do very much first generation reporting at all. And it can't sustain that. The economic model can't sustain that kind of reporting. And to lose to that, because you didn't - they had contempt for their own product, these people. I mean, how do-
BILL MOYERS: The publishers. The owners.
DAVID SIMON: Yes, how do you give it away for free? You know, but for 20 years, they looked upon the copy as being the stuff that went around the ads. The ads were the God. And then all of a sudden the ads were not there, and the copy, they had had contempt for. And they had-- they had actually marginalized themselves.
By the time the Internet had its way, I mean, they're down to 180 now. You don't cover the City of Baltimore and a region like Central Maryland with 180 people. You don't cover it well.

The problem doesn't stop at newspapers. (The irony of viewing the entire Moyers show online for free, and reading a free full transcript of it, almost goes without saying.)

Back in March, eMarketer had a rather terrifying article about the magazine business.

It reported that in the U.S. alone, “525 magazines were shut down in 2008” and that “consumer magazine print ad spending in 2008 was down 7.1%.”

And yet, eMarketer's formula is what David Simon might call the same trip down the rabbit hole:

The big move in publishing, however, is online. But the transition may be coming too late for many titles. “While there are pockets of innovation, many print brands have fallen far short of the mark when it comes to their online presence,” says Ms. Krol. “Given the state of the business, publishers need to act quickly to capitalize on brand assets and provide accessible, compelling content to readers—who already have access to a wealth of content online.”
“You have to get your brand online—95% of the magazines out there haven't really done that,” Jim Spanfeller, CEO of Forbes.com, tells eMarketer. “They're not putting out a product that is compelling for an online audience.”
Digital ad revenues for consumer magazines averaged 6.4% in 2008, according to an analysis of 11 major magazine group publishers by Advertising Age, suggesting the publishers have a long way to go in building their digital businesses.

I imagine David Simon summing it up this way: In other words, publishers should put even more of their eggs in the anemic on-line ad-revenue basket.

And it's not as if Forbes is immune. Far from it.

As it happens, also in March, 24/7 Wall St. had a remarkable and depressing analysis of the business prospects of the business weeklies, “The Sun Sets On BusinessWeek, Forbes, And Fortune.”

The May 11 issue of Fortune Magazine is a perfect demonstration of what the three largest business magazines have done for decades. Its cover story, “How Bernie Did It,” is the culmination of a four-month investigation into the details of Bernie Madoff's life and business operations written and reported by three of Fortune's best editorial staff members, one of whom is a Pulitzer Prize winner. This issue of Fortune is also an example of why the magazine, and its competitors Forbes and BusinessWeek, will soon no longer be able to publish these kinds of stories. The May 11 issue has 92 printed pages and covers. There are only 21 pages of paid advertising compared with more than a hundred pages in a spring issue 20 years ago.

According to the article, “All three of these magazines lost money in the first quarter,” although actual numbers are unavailable because Fortune and BusinessWeek are parts of larger media organizations and Forbes is privately owned. The article says BusinessWeek “is in the worst shape of the three” and cites industry experts as saying it “has lost money for two years and will lose over $20 million this year if its advertising continues to move down at its current rate and the operation does not make large cost reductions.”

Its advertising pages fell 16% in 2008, according to data from industry research letter MIN. The magazine's ad pages are down 38% this year through the end of April, and in the most recent issue, the drop was an extraordinary 63%. The magazine has more than 220 editorial, support, and management personnel based on the BusinessWeek masthead. This does not include ad sales, production, or circulation staffs. That is a large number of people to put out a magazine that often has fewer than 60 editorial pages and a website with less traffic than TheStreet.com based on March figures from online audience research firm comScore. BusinessWeek online had 3.3 million unique visitors and 18 million pageviews. TheStreet.com, which has a larger monthly audience, had advertising sales of only $30 million in 2008, based on its 10-K, and that number certainly dropped in the first quarter of this year. That is probably a good benchmark for what BusinessWeek brings in for online advertising.

The article concludes “BusinessWeek will not be a weekly magazine with over 200 employees and a rate base of 900,000 at the end of the year. BusinessWeek will have to become a much, much smaller operation.”

As for Fortune, the article says, “management has said that the magazine still makes money, but based on most definitions of profit that is almost certainly not true.” After looking at its large staff and modest Web revenues, the article says “It will have to cut costs and its choices are similar to BusinessWeek's.”

Things are only a little better at Forbes: “Ad pages at Forbes were down 17% last year and are down 19% year-to-date. The most recent issue's ad pages were 33% lower than they were in the same issue last year.” The good and bad news is that staff has already been cut, and not just on the print side:

Forbes has a financial advantage over its two competitors. It has already gone through two large staff layoffs which totaled about 70 people, about 15% of the staff who worked at Forbes and Forbes.com.

24/7 Wall St. notes, though, that Forbes has a strong website, just as Forbes.com CEO Spanfeller bragged to eMarketer:

The print business at Forbes is doing as poorly as it is at BusinessWeek and Fortune. Forbes has the advantage of a much larger audience online. In the US, it has almost 5.6 million unique visitors and 66 million pageviews. Revenue from the Forbes online business is between $70 million and $80 million, but is not growing. Forbes management might say that its online operations are profitable and that its print business loses money. It is convenient to separate the two businesses, but they share so many resources, that this is not a realistic description of the Forbes overall business.

If we want investigative journalism—whether it's of local politics in Baltimore or business scandals like Madoff's twenty-year Ponzi scheme—we have to be willing to pay for it. So far, the reading and voting public has shown no appetite for doing so. We are right now dismantling journalistic enterprises that took decades to build.

One is reminded of New York's glorious Pennsylvania Station, part pink granite Doric columns, part iron art deco archways. When it was thoughtlessly torn down in 1963, the loss was so mourned that it led to the National Historic Preservation Act of 1966 and its National Register of Historic Places Program.

The equivalent losses have already been felt, and then some: the 146-year-old Seattle Post-Intelligencer, the Rocky Mountain News (140), the Tucson Citizen (138), the Cincinnati Post (128). But, as you can imagine, David Simon has a theory why there will be no national historic preservation act for investigative journalism. In his interview, Bill Moyers paraphrased something Simon had said in an earlier interview: "Oh, to be a state or local official in America... without newspapers. It's got to be one of the great dreams in the history of American corruption."

As the printing presses of these journalistic giants grind to a halt, they of course leave websites, ones about as nondescript as the current Penn Station, but woefully understaffed and far less useful. And like the old Penn Station, they won't be rebuilt. Nor will the great magazines whose demise will surely follow.

Responsible Recycling Video Contest Winners Announced

E-waste is a serious problem, and you can help by taking your e-waste to a responsible recycler. That's the message. The challenge, to video producers amateur and professional: turn that message into an interesting and informative video of a minute or so.

The winner: Michael Herp from Aubrey, Texas.

 

The second place video:

 

For more information about the contest, and to view the third place winners, visit the Silicon Valley Toxics Coalition.

Iberian Nanotech Center Opens with Promise of a New "Age of Discovery"

Assorted Spanish and Portuguese dignitaries assembled for the opening of the new International Iberian Nanotech Laboratory, located in Braga, Portugal.

Given such an occasion and such an assembled cast of politicos (and even royalty), we were bound to hear some reference to the golden age of Spain and Portugal's age of discovery. And we were not disappointed.

"In the age of discovery, we had a lot of success. With this project, Portugal and Spain will chart a new atlas of innovation and will make new discoveries," said Spain's prime minister, Jose Luis Rodriguez Zapatero.

I have commented somewhat skeptically on this research center earlier this year.

I am sure it will be a huge success, as far as we'll ever know. However it turns out, it should prove interesting to see how this center develops, since it should serve as a benchmark for other countries that build large and expensive nanotech research centers without much of a foundation for them in either their scientific or industrial communities.

 

Digital Wizards Bring Back NASA's Lost Moon Walk Video

Today marks the 40th anniversary of Neil Armstrong's first steps onto the moon. I remember staying up past my bedtime to watch the blurry images on the little black-and-white TV in my New Jersey living room.

NASA later recorded over the original videotape of those images in order to save the cost of buying new tapes. That didn't mean the recording of the event was lost, but the only copies remaining were simply that, copies, mostly made by TV broadcasters, with even less resolution than the original.

NASA turned to Hollywood, specifically, Lowry Digital Images, to fix the problem. Lowry’s an amazing company that works magic on old film and video—I first encountered the firm’s work when profiling then-Lowry employee Ian Caven for IEEE Spectrum’s Dream Jobs Special Report. Caven developed original solutions to difficult image processing problems for Lowry, developing software to remove interference, mold damage, and flicker from old movies.

To restore the Apollo 11 images, the digital imaging specialists at Lowry used a number of copies of the video, including an 8 mm film recorded by pointing a handheld camera at a video monitor at Mission Control. Some sections of the video existed only on this 8 mm film. Lowry Digital used its standard temporal image processing technique, in which it compares information from large numbers of consecutive frames to calculate the optimal contrast, resolution, and noise level, to process the images. It also developed new techniques to fix brightness, ghosting, and smearing. NASA instructed the restorers to leave in some of the original flaws, like dirt on the camera lens, for authenticity.
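The core of temporal processing can be sketched simply: average each pixel across a stack of frames so random noise cancels while the stable image survives. This is only the principle; Lowry's actual pipeline is proprietary and far more sophisticated (motion compensation, per-frame weighting, and so on):

```python
# Sketch of temporal noise reduction: average each pixel across many
# consecutive frames so random noise cancels while the static image
# survives. Lowry's real pipeline is proprietary and far more
# sophisticated; this only demonstrates the principle.
import random

def denoise(frames):
    """Average a stack of equal-length frames pixel by pixel."""
    n = len(frames)
    return [sum(frame[i] for frame in frames) / n
            for i in range(len(frames[0]))]

# A constant "scene" of brightness 0.5, hidden under per-frame noise:
random.seed(0)
scene = [0.5] * 8
frames = [[p + random.uniform(-0.2, 0.2) for p in scene]
          for _ in range(200)]

restored = denoise(frames)
print(max(abs(p - 0.5) for p in restored))  # far smaller than the 0.2 noise
```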

Some short highlights from the restored video are available now; the complete restoration project is slated to wrap up in September.


Tech Talk

IEEE Spectrum’s general technology blog, featuring news, analysis, and opinions about engineering, consumer electronics, and technology and society, from the editorial staff and freelance contributors.

