Tech Talk

Sun team wins $44-million optical interconnect grant from Defense Department

A team led by Sun Microsystems has won a $44.29-million, five-and-a-half-year research grant "focused on microchip interconnectivity via on-chip optical networks enabled by silicon photonics and proximity communication." According to the New York Times, Sun's team includes silicon photonics startups Luxtera and Kotura. (Sun's own press release doesn't mention either of those firms. Nice.) According to the Times, the Sun team beat out a joint proposal from Intel and HP, as well as proposals from IBM and MIT.

The money comes from a program called Ultraperformance Nanophotonic Intrachip Communications (UNIC. Really. Didn't anyone at Darpa try to pronounce this before they came up with the program name?). Darpa's site hasn't updated UNIC's status from solicitation to program, so don't bother looking in the program section under UNIC (ugh).

In Sun's press release the Darpa program manager described the goals like so: "DARPA's UNIC (Ultraperformance Nanophotonic Intrachip Communications) program will demonstrate high performance photonic technology for high bandwidth, on-chip, photonic communications networks for advanced (≥ 10 trillion operations/second) microprocessors. By restoring the balance between computation and communications, the program will significantly enhance DoD's capabilities for applications such as Image Processing, Autonomous Operations, Synthetic Aperture Radar, as well as supercomputing." That was a mouthful.

It's interesting to note that nowhere in Darpa's solicitation (even in the long form--I checked) is the word "proximity" used. Sun has made much in the past of its (still not commercial) technology that allows two chips placed next to each other to communicate via inductive coupling. We covered it back in 2003, but I'm afraid we haven't gotten that far in our archival postings; so no link, sorry.

Anyhow, a casual glance at the teams makes me think that it was really Intel vs. Luxtera the whole time. Every time Intel has had some great breakthrough, such as back in 2005 when it managed to make a silicon laser (but one that needs input from another laser to work), Luxtera has had some "anything you can do I can do better" comeback.

Intel was kind enough to lay down its vision of silicon photonics a couple of years ago. Anyone wanting to understand all the components you'd need for a working silicon photonics system should check out that link. It'll make a lot more sense than Sun's press release (which wasn't bad, really).

Intel has long appeared to be in the camp of folks who think an electrically pumped silicon laser is pointless to pursue. As a demonstration, it later pioneered, with a UC Santa Barbara team, a way of making said "other laser" by bonding an indium phosphide light source to a silicon structure. I think Luxtera is in that camp as well; so perhaps Darpa wasn't keen on throwing more money at that problem either.

By the way, there's gobs of research about bonding exotic semiconductors to silicon being reported this week at the Materials Research Society's spring meeting. We'd hoped to have a story about that for this week, but in light of the DoD grant, I think we're going to refocus the story. Look for it later this week or early next week.

How MIT woos African students

The competition for Africa's best technical students is strong, global -- and intensifying. In the past year, Chinese government officials have outlined plans for bringing thousands of sub-Saharan Africa's best students to the People's Republic of China, whose top science and engineering schools are increasingly worthy of international acclaim. African students are responding.

When I visited Rwanda last year, I met a university graduate considered among the best young software programmers in the country. Clothilde Tingiri, the programmer, dreamed of gaining a master's degree, as I've written elsewhere, not at the Massachusetts Institute of Technology in Cambridge, Mass., but at Tsing Hua University in Beijing. Tingiri's choice of China over America for further engineering studies was dictated chiefly by pragmatism: the Chinese offered her a generous scholarship.

Africa isn't usually viewed as a source of future scholars, but in a globalizing world, the best American universities are paying more attention than ever to foreign students. Which explains why on Tuesday, March 25, MIT is holding its first-ever "West Africa Networking Reception" in Accra, Ghana. The event is hosted by MIT's Sloan Africa Business Club, made up of students in MIT's Sloan School of Management. Thirty Sloan students are in Ghana this week, meeting with African alums and business students at universities in Ghana.

Accra is a good location for MIT's event, which aims both to raise awareness of the university in an Anglophone zone of Africa and to connect with its small but vital alumni community in the region.

Connecting Africa's best students to American intellectual networks is an important counterweight against the significant "brain drain" that continues to hamper life for Africa's engineers and scientists. When I lived in Ghana in 2003, one of the most significant software shops in the country was run by an MIT graduate named Mauli Tse. Another MIT graduate, Victor Mallet, launched a successful business-plan competition in Ghana, funded by his alma mater. Then, in 2005, MIT gave students at three African universities access, via the Internet, to five of its labs. "If you can't come to the lab, the lab will come to you," Jesus del Alamo, co-principal investigator on the Africa project and a professor in MIT's Department of Electrical Engineering and Computer Science, said at the time.

MIT's efforts are part of a long-term movement to better appeal to the best African students -- and to better support those who return to their home countries to pursue their careers.

And earlier this month, MIT held an "Africa Week," organized in part by the university's dynamic African Student Association.

MIT deserves credit for wooing talented Africans -- and helping Africans who decide to stay at home. Other universities are taking note -- and not only in the U.S.

Google hires top talent in India to focus on Google News

Rahul Roy-Chowdhury spent nearly two decades in the U.S. before joining Google as its product manager for Google News. His particular mandate is to broaden Google News's reach in India, and his company's sent him back to capitalize on India's strangely voracious appetite for news.

It's one of the few countries in which newspaper circulation is actually increasing. That's partly because rising incomes and education levels have stoked literacy, Roy-Chowdhury says, but it also has to do with the slow spread of Internet access. Of course, Google is planning for the day when every Indian can surf the Web for news, and it's clear that when they can, they'll still want what they like now: local news.

"If you leaf through almost any newspaper here, the sequence of sections is telling," he says. "Generally, city and state news is given pride of place, followed by some national news and generally a small amount of international coverage."

Roy-Chowdhury has to ensure that Google News not only provides that local coverage, but ultimately does so in the country's different languages. India has 22 major languages. The country's middle class, now about as big as the entire U.S. population, can speak English, but many still prefer to read and shop in their native tongues. "If we're able to do it successfully, we can reach many orders of magnitude more people than we could otherwise," he says. "I love the idea of potentially being able to make a difference at that scale."

Who could have known, 17 years ago, that he would return to India as the representative of a mammoth high-tech company? Back then, India was still mired in sluggish growth, the product of decades of excessive government meddling in the economy. "Right as I was getting ready to leave for the U.S., India's balance of payments weaknesses suddenly caused a crisis," he remembers. "India moved to a floating exchange rate, which immediately caused a severe devaluation of the rupee. I remember my father being quite upset, as my education in the US suddenly became 50 percent more expensive than it had been a month before!" Later on, India liberalized the economy, setting off the ongoing boom, which is particularly robust in the high-tech industry.

He studied mathematics at Hamilton College, then computer science at Columbia University. A stint at the New York offices of the investment bank Lazard Frères got him interested in business, which he pursued by getting an MBA at Stanford University, followed by a job at Solidcore, a Cupertino, California, startup that provides IT maintenance services. In 2007, Google gobbled him up.

Roy-Chowdhury now lives a stone's throw from the center of Bangalore, which is the beating heart of India's tech economy. There he sees the corporate offices of Intel, Microsoft, and a host of other tech multinationals, alongside malls packed with contemporary styles and name-brand fast food. It makes him feel right at home, because that is where he is.

Sci-fi writer and visionary Arthur C. Clarke laid to rest

Sir Arthur C. Clarke was buried in Colombo today. At the funeral, which was attended by his family, friends, and fans, music from 2001: A Space Odyssey was played. As most people know, Sir Arthur wrote the screenplay for the film.

Sir Arthur left written instructions that his funeral be strictly secular: "Absolutely no religious rites of any kind, relating to any religious faith, should be associated with my funeral."

He wanted to be remembered primarily as a writer, he said shortly before his death in a video that he made on his 90th birthday last December:

"Iâ''ve had a diverse career as a writer, underwater explorer, space promoter and science populariser. Of all these, I want to be remembered most as a writer â'' one who entertained readers, and, hopefully, stretched their imagination as well."

Sri Lankan president Mahinda Rajapaksa announced a moment of silence in Sir Arthur's honor; it coincided with the ceremony.

Sir Arthur died in the early hours of March 19 (Sri Lanka time) at Colombo's Apollo hospital from respiratory complications.

As per his wishes, Sir Arthur's gravestone will read:

Here lies Arthur C. Clarke

He never grew up and did not stop growing

RIP, Sir Arthur.

NASA Posts Remembrance of Arthur C. Clarke

The U.S. space agency has posted a memorial page on its Web site to honor the legacy of Sir Arthur C. Clarke, who passed away Wednesday in Colombo, Sri Lanka, at age 90. It notes on the page that "Clarke's work resonated deeply with NASA and its employees."

In a prepared statement issued Wednesday, NASA Administrator Michael Griffin said: "With the passing of Arthur C. Clarke we in the space community have lost yet another legendary pioneer of early spaceflight. In Sir Arthur's case, this loss uniquely spans two communities. He was among the earliest of those who developed and promoted serious space mission concepts, both for human exploration of the solar system and for utilization of near-Earth space for immediate human benefit."

NASA notes that last September Clarke sent a special video message to the Jet Propulsion Laboratory during the Cassini spacecraft's flyby of Saturn's moon Iapetus, which stated that the passage was of particular interest to fans of his 2001: A Space Odyssey, because that moon was his original setting for the famous monolith, which turns out to be a gateway to the stars.

"I want to thank everyone associated with this mission and the overall project," Clarke said in the video. "Science projects are tremendously important for our understanding of the solar system. And who knows, one day our survival on Earth might depend on what we discover out there."

NASA notes that Clarke was a visionary of remarkable powers. In 1945, he proposed the idea of using geostationary satellites as orbital telecommunications relays, an idea that began to become reality with NASA's Echo satellite in 1960. In 1954, Clarke wrote of a design for a lunar base featuring igloo-shaped habitats, not unlike the potential habitats NASA is now testing for its future lunar outpost, planned to be in place by 2020.

Alan Stern, NASA associate administrator for the Science Mission Directorate, points out that the space agency honored Clarke's most famous work in 2000, when it christened its newest Mars orbiter the 2001 Mars Odyssey.

"Arthur Clarke was a gifted writer of science and science fiction," said Stern, "and an unparalleled visionary of the future, inspiring countless young people throughout the middle and later 20th century with his hopeful vision of how spaceflight would transform societies, economies, and humankind itself."

NASA is also inviting users of its site to post their thoughts on the passing of the great author and futurist.

A user going by only the name Mark posted the following recollection: "As a teenager [in] 1968, after I was awed and motivated when seeing his visions in the film 2001, I saw Clarke speak at a local community college. He spoke about a future where we would be able to receive news and mail, and order groceries and clothing, from a home computer. He also said each home could be independently powered by a small nuclear reactor like those on space probes. Wow! Why not? Now I'm a NASA researcher...still dreaming of his visions for us."

It is a fitting memoriam from a public agency that owes so much to the thinking of a very private citizen of the world.

[Editor's Note: Please see our online feature "Final Thoughts from Sir Arthur C. Clarke (1917-2008)" for more on the passing of this remarkable human being.]

Nanobots to overrun the United Kingdom

UK scientists are warning that the number one technological threat to merry ole England is nanotechnology, likely in the form of "miniature robots" it enables, according to a recent report in the online version of the UK's Daily Telegraph.

The full list of 25 can be found here, and it is a hoot. Number nine on the list of "technological" threats is "frequency of extreme weather events". I must say, old chap, I have never seen weather described as technological before.

I am not familiar with the typical editorial policy of UK newspapers, but I understand that over 60 percent of Telegraph readers support the local conservative party. This might be why the list is devoid of any mention of CO2 emissions but has a great deal to say about the aftereffects of global warming, without indicating the causes.

But I mean, really, nanobots?! That's what you come up with as the number one technological threat?

While the article tries to cover itself by stating "although their impact is uncertain and some will turn out to be irrelevant", it never probes any further to see how likely these outcomes are.

Sloppy journalism is expected, especially when it comes to the subject of nanotechnology, but these scientists should presumably know better.

While it is not explicit in the article (that would be too much to ask), one has to guess that the "miniature" robots enabled by nanotechnology are produced via some sort of molecular assembler that churns out these dastardly robots by the billions, leading to great swarms, grey goo... blah, blah.

It's one thing to have Prince Charles completely clueless about what nanotechnology is except for what he read in a Michael Crichton novel, but does it really have to extend to British scientists?

Tesla Roadster will be THE status symbol of 2008

Tesla Motors this week began the much-delayed production of its all-electric Tesla Roadster. With only 300 due to roll off the assembly line this year, it's sure to be THE status symbol of 2008, and Tesla-spotting is sure to become a Silicon Valley sport.

The best-kept secret in town? The Tesla list, that is, who is due to receive their pre-ordered Tesla when. Some 900 people are on that list so far; not all will get their cars this year. Tesla Chairman Elon Musk will get vehicle number one; founder Martin Eberhard is in line for number two. The New York Times reported that George Clooney is number eight and that California Governor Arnold Schwarzenegger, former Hummer driver, is number ten. Silicon Valley gossip blog Valleywag says San Francisco Mayor Gavin Newsom is also one of the first in line.

It wasn't so long ago that the Prius was the defining purchase for someone looking for geek chic. But the Prius quickly became the car of the geek masses. The Tesla, with its $100,000 price roughly five times that of a Prius, is not likely to become commonplace so quickly.

Movie studios commit to digital cinema


Just over a year ago, in the December 2006 issue of IEEE Spectrum, author Russell Wintner predicted that computer servers and digital projectors were poised to blast film projectors out of commercial movie theaters. It was a brash prediction; at that point only 2000 of some 36,000 theaters in the U.S. and Canada were projecting with bits instead of film, and North America was ahead of the rest of the world.

Wintner's company, Access Integrated Technologies, one of several companies that sell digital cinema systems, along with industry veteran Technicolor, had an idea of how to spark the change: they proposed charging movie studios a virtual print fee. That is, the vendors of digital cinema equipment would install the equipment in theaters at no charge, and, since providing a movie in digital form costs the studios far less than the roughly $1200 to $2000 cost of a film print, the studios would pay a fee for every digital "print" loaded onto the theater servers until the vendors were reimbursed for the equipment cost and then some.
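The payback arithmetic behind the virtual print fee model is easy to sketch. The equipment cost and fee amounts below are illustrative assumptions, not figures from the article; only the $1200-to-$2000 cost of a physical film print comes from the text above:

```python
# Back-of-the-envelope sketch of the virtual print fee model.
# equipment_cost and fee_per_print are illustrative assumptions;
# only the physical film print cost range is from the article.
equipment_cost = 70_000       # assumed installed cost of one digital cinema system, $
fee_per_print = 1_000         # assumed virtual print fee, $, set below a film print's cost
physical_print_cost = (1200, 2000)  # per-print cost of film, $ (from the article)

# Vendor side: how many fee payments until the hardware is paid off.
prints_to_break_even = equipment_cost / fee_per_print

# Studio side: savings versus striking a physical print, per title per screen.
saving_low = physical_print_cost[0] - fee_per_print
saving_high = physical_print_cost[1] - fee_per_print

print(f"Vendor recoups equipment after {prints_to_break_even:.0f} virtual prints")
print(f"Studio saves ${saving_low} to ${saving_high} per print versus film")
```

With these assumed numbers the vendor breaks even after 70 digital "prints" per screen, while the studios still come out ahead on every release, which is why the fee can fund free equipment installation.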

Well, it looks like it's all going to work out just as Wintner told us. Last week four motion picture studios -- Disney, Fox, Paramount, and Universal -- agreed to pay virtual print fees; in return, Access committed to installing 10,000 digital cinema systems in U.S. and Canadian movie theaters, on top of the 3750 installed in the last year or so. That means that if you live in the U.S. or Canada, digital cinema will likely be coming to at least one of your local theaters soon.

If you want to read Wintner's article detailing digital cinema technology in general, Access's system in particular, and the company's virtual print fee plan, look here. If you want to find out how digital cinema will bring about a burst of 3-D offerings, look here.

Nanotechnology in Russia is Booming -- or is it?

According to a brief report from RBC (RosBusinessConsulting), which describes itself on its website as "the first Russian information agency, specializing in business news", Russian sales of nanotechnology products are expected to amount to $700 billion in 2008, a figure given by First Deputy Prime Minister Sergei Ivanov at a Federation Council session.

I'm not sure we can believe that number, because in another blurb distributed at the same time, the publication reports that by 2015 sales of Russia's nanotechnology products would amount to some RUB 900 billion (approximately USD 38.25 billion). Quite a drop. Even if we give them the benefit of the doubt and deduce that they meant RUB 700 billion rather than dollars, it's still a large number, and it shows little growth over the ensuing seven years.

But with either sum, let's give you a little context so you can judge for yourself whether these numbers are to be believed. According to organizations whose job it is to come up with these kinds of market estimates, the entire global market for nanotechnology (including semiconductors) in 2007 was somewhere between $130 billion and $150 billion. And projections for the market by 2015 run between $1 trillion and $1.5 trillion.

So according to the first report, Russia alone will have nanotechnology-enabled products valued at about 5 times the size of the entire global nanotechnology market in 2007. That's impressive, albeit somewhat dubious.
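A quick sanity check makes the gap concrete. The only assumption here is the ruble-dollar rate implied by RBC's own conversion (RUB 900 billion to roughly USD 38.25 billion, or about 23.5 rubles per dollar):

```python
# Sanity-check the two Russian nanotech figures quoted above.
claim_2008_usd_bn = 700        # Ivanov's reported 2008 sales figure, in $ billions
claim_2015_rub_bn = 900        # the separately reported 2015 figure, in RUB billions
rub_per_usd = 900 / 38.25      # exchange rate implied by RBC's own conversion (~23.5)

claim_2015_usd_bn = claim_2015_rub_bn / rub_per_usd   # converts back to 38.25
global_2007_usd_bn = 140       # midpoint of the $130-$150 billion 2007 estimate

print(f"2015 claim in dollars: ${claim_2015_usd_bn:.2f} billion")
print(f"2008 claim vs. 2007 global market: {claim_2008_usd_bn / global_2007_usd_bn:.1f}x")
print(f"2008 claim vs. 2015 claim: {claim_2008_usd_bn / claim_2015_usd_bn:.0f}x")
```

The 2008 figure is five times the whole 2007 global market and roughly eighteen times Russia's own 2015 projection, which is why at most one of the two numbers can be right.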

What makes this particularly doubtful is how recently Russia even decided to have a nanotechnology program. In June 2006, just under 2 years ago, President Vladimir Putin announced a grand strategy to make Russia a leading player in the field of nanotechnology that came with a pledge of $5 billion in initial spending, with other reports claiming that $7 billion would be spent in the first 8 years of the program.

That's certainly putting your money where your mouth is... but the Economic Development and Trade Ministry objected to the proposal to start the program in 2007 and proposed launching it in 2008 and completing it five years later. As a result, according to published reports, the government allocated about $150 million for nanotechnology in the federal budget for 2007. Not quite the billions of dollars everyone was so excited about.

But even if Russia had poured billions of dollars into nanotech on January 1, 2007, there is little chance that the money could have bought the necessary equipment, funded the research, and so on in just one year. The problem is further exacerbated by reports that while Russia has some great scientists and some history of doing groundbreaking work in nanotechnology, its infrastructure for performing cutting-edge nanotechnology research is sorely lacking.

Adding to the doubts about whether Russia can actually do the R&D that would lead to new nanotechnology-enabled products, the country doesn't have a lot of industrial sectors that could use nanotechnology other than the oil industry. According to the RBC blurb, Ivanov indicated that "nanotechnology research had made it possible to manufacture ultra-strong cutting faces for various instruments and rocket fuel with a great burn rate". Wow... just that accounts for $700 billion?

To provide some more context, take Taiwan, for example. It's a small country, highly industrialized, with some key industries, such as electronics and textiles, that can start using nanotechnology today and make an impact. And Taiwan recently reported that after six years under a clear program to come up with nanotechnology-enabled products, and $615 million in funding, it believes it can show an economic impact of $9.68 billion. That's a far cry from Russia's claim of $700 billion.

No, this figure for 2008 is not to be taken seriously, and the far smaller number for 2015 seems a wee bit optimistic. Either it's a mistake or someone is hyping the numbers. If it's the latter, that is somewhat troubling, since it may be an indication that the whole Russian nanotech program, already burdened with doubters who believe corruption will rule the way things are done, may be getting off on the wrong foot.

Intel, Microsoft give $20 million for multicore programming

Intel and Microsoft are clearly concerned that programming is not keeping pace with the increasing number of processor cores on chips. They say they will spend $20 million to create two "Universal Parallel Computing Research Centers" (UPCRCs), aimed at accelerating developments in mainstream parallel computing (read: multicore processors) for consumers and businesses in desktop and mobile computing. The new research centers will be located at the University of California, Berkeley (UC Berkeley), and the University of Illinois at Urbana-Champaign (UIUC). An additional $8 million will come from UIUC, and UC Berkeley has applied for $7 million in funds from a state-supported program that matches industry grants.

This is actually Microsoft's second big move in this area. It decided that it needed a nice big supercomputer to simulate the possibly hundreds of processor cores on future chips. So it struck a deal with the Barcelona Supercomputing Center to examine essentially some of the same things that Cal and UIUC will be looking into.

In our article about that deal, we called on Cal's David Patterson, who will be heading up one of the new Intel/Microsoft centers, for comment. He called the move to multicore "a rare opportunity to reinvent computing entirely."

You can see how concern about this topic has been growing simply from our own coverage:

We first talked about it when we profiled Sun's Niagara chip.

Then we started to worry out loud about software in our story about IBM's Cell processor.

We saw that worry in action when we profiled Insomniac Games as they struggled to make use of the Cell in a PS3 game.

And then we looked at a possible solution to the programming problem when we profiled RapidMind.

Cal's David Patterson has been worrying out loud about the state of multicore programming for some time. He said as much two years ago, in an article about the loss of U.S. Defense Department research dollars. Then, he told us:

"We really don't know how to write software in this new model," Patterson notes. "It's absolutely critical for the future of IT in the United States and around the world that we figure it out."

Still true today.
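For readers who haven't written parallel code, here's a minimal sketch (my own illustration, not from any of the articles above) of the easy case: independent work items fanned out across cores. The hard problems Patterson is talking about start when work items share data and must synchronize:

```python
# Embarrassingly parallel example: independent tasks mapped across cores.
# This is the easy case; shared state, synchronization, and load balancing
# are where multicore programming gets genuinely hard.
from multiprocessing import Pool

def work(n):
    """A stand-in for an independent, CPU-bound task."""
    return n * n

if __name__ == "__main__":
    with Pool(processes=4) as pool:         # four worker processes
        results = pool.map(work, range(8))  # result order matches input order
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Because each call to `work` is independent, the runtime can split the input across cores freely; the moment two tasks need to update the same data structure, the programmer inherits locks, races, and deadlocks, which is exactly the gap the UPCRC research aims to close.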

