Tech Talk

A bad day on MARS


Not the red planet, but rather the Monterey Accelerated Research System, an undersea observatory taking form, if haltingly, in the icy depths of the Pacific Ocean some 32 kilometers off California's coast. Last month an ROV from the Monterey Bay Aquarium Research Institute installed the observatory's main science node at a depth of 900 meters. But when engineers threw the switch on the node's 10,000-volt power supply, they discovered a ground fault in the main underwater electrical plug connecting the node to shore.

The fault necessitated surfacing both the 2-ton package of electronics and the observatory's trawl-resistant steel frame, a job requiring a large ship and complex logistics. Replacing the plug and reinstalling the node will set the project back at least several months.

Spectrum readers will recognize this setback as simply one more sign of the inherent challenge of connecting high power and broadband information links to deep-sea instruments, the subject of the 2005 Spectrum feature "Neptune Rising," which profiled a family of U.S. and Canadian projects sharing engineering and components to create the world's most advanced remotely operated, Internet-connected underwater research stations. One piece of the program, VENUS, is delivering real-time data from relatively shallow installations off Vancouver Island in British Columbia, while the deeper MARS and NEPTUNE projects remain works in progress.

Power is a key challenge. As "Neptune Rising" was going to press in the fall of 2005, engineers at NASA's Jet Propulsion Laboratory were troubleshooting bugs in the sophisticated power supply they had designed for MARS, a problem that would ultimately take another 14 months and a new engineering team at Alcatel to solve. Imagine the disappointment of the MARS team at being upended by a faulty plug after all that high-tech sweat and blood!

The Monterey Bay Aquarium Research Institute, in a communiqué issued last month, put the problems down to life on the cutting edge, quoting the words of David Packard (of Hewlett-Packard fame) when he founded the institute in 1987. Packard apparently admonished the new institute's researchers to take risks and ask big questions. "Don't be afraid to make mistakes," said Packard. "If you don't make mistakes, you're not reaching far enough." Let's hope the National Science Foundation officials supporting MARS agree.

Photo credit: David Fierstein, MBARI

MIT to be tuition-free for nearly a third of students


The Massachusetts Institute of Technology announced yesterday a far-reaching financial aid program that will make it possible for nearly a third of MIT undergraduate students to have their tuition charges completely covered.

Under the program, which will take effect in the 2008-2009 academic year, MIT will be tuition-free for families earning less than US $75,000 a year. MIT will also provide grants to those students to cover expenses beyond tuition, helping them graduate free of loan debt.

The MIT move follows a major financial aid plan announced by Harvard late last year. It also follows a request by the U.S. Senate Finance Committee for detailed tuition, financial aid, and endowment information from the nation's 136 wealthiest universities. (Here is MIT's response to the Senate request.)

The MIT initiative will increase the institute's financial aid budget to $74 million. Other details of the program (from the press release):

  • Families earning less than $75,000 a year will have all tuition covered. For parents with total annual income below $75,000 and typical assets, MIT will ensure that all tuition charges are covered with an MIT scholarship, federal and state grants, and/or outside scholarship funds. Nearly 30 percent of MIT students fall into this tuition-free category.

  • For families earning less than $75,000 a year, MIT will eliminate the student loan expectation. MIT will no longer expect students from families with total annual income below $75,000 and typical assets to take out loans to cover expenses beyond tuition. Under this provision, for example, students in this income group who participate in MIT's paid Undergraduate Research Opportunities Program (UROP) each semester would be able to graduate debt-free.

  • For families earning less than $100,000, MIT will eliminate home equity in determining their need. In determining the ability to pay for college, MIT will no longer consider home equity for families with total annual income below $100,000 and typical assets. On average, this will reduce parental contributions by $1,600. For families who rent, rather than own a home, MIT will provide a comparable reduction in the expected parental contribution.

  • MIT will reduce student work-study requirements for all financial aid recipients. During the past decade, MIT has steadily lowered the amount it expects students to provide through term-time work. MIT will take a further step in this direction by reducing the work-study expectation for all financial aid recipients by an additional 10 percent.
PS: The good news was accompanied by this not-so-good news: "Tuition and fees for the upcoming academic year will increase 4 percent to $36,390." MIT claims that "this figure represents less than half of what it costs MIT to educate an undergraduate" and that the Institute is "increasing funds available for financial aid this year at a far greater rate than the rise in tuition." But then again, one question remains: how fast is MIT's $10 billion endowment growing?
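For readers who like their policy in executable form, the provisions above reduce to a few threshold checks. Here is a toy Python sketch; the $3,000 base work-study figure and the function's overall shape are illustrative assumptions, not MIT's actual need-analysis formula, and the real thresholds also assume "typical assets":

```python
def mit_aid_estimate(income, home_equity=0, base_work_study=3_000):
    """Toy sketch of the 2008-09 MIT aid provisions described above.

    Only the published thresholds ($75,000 and $100,000) come from the
    announcement; base_work_study is a hypothetical placeholder.
    """
    return {
        # Under $75,000: tuition fully covered, no loan expectation.
        "tuition_covered": income < 75_000,
        "loan_expected": income >= 75_000,
        # Under $100,000: home equity excluded from need analysis.
        "home_equity_counted": 0 if income < 100_000 else home_equity,
        # Work-study expectation cut a further 10% for all recipients.
        "work_study": base_work_study * 0.9,
    }

print(mit_aid_estimate(60_000))
```

Running it for a $60,000-a-year family shows tuition covered, no loans expected, and no home equity counted, which matches the bullet points above.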

PPS: On a related note, speaking of tuition-free engineering colleges in the Boston area, let's not forget about the Franklin W. Olin College of Engineering, in Needham, Mass., about which I wrote a long feature titled "The Olin Experiment" and a follow-up blog post. I've been accused of having a bit of a crush on Olin, and I admit, I do admire this great little school. :)

Gloomy genetics company casts cloud over personal genomics prospects

There's some unexpected news coming from DeCode Genetics, the nifty Icelandic company that tests fragments of your inner cheek for genetic markers of diabetes, prostate cancer, concerns of the heart, and more. The company is scaling back its workforce by 15 percent, ostensibly triggered by an unexpected market downturn. "It would even be wise for other companies in our community to follow our example," says Kari Stefansson, the company's CEO, and I'm assuming that by community he doesn't mean Reykjavik.

So is this merely a hiccup, or a sign of larger problems on the road to personalized medicine? DeCode, the Google-backed 23andMe, and a Massachusetts-based company called Knome all launched whole-genome scans in November 2007, so the field is extremely young. The blog Eye On DNA thoroughly dissects Stefansson's comments and the outlook for DeCode and its "community," namely 23andMe, Knome, and a couple of others. In short, it's not entirely clear what caused this unexpected market shift, but it's worrying given how far the price of genome scans must fall before they become a viable option for people other than millionaires and celebrities of the science world.

For those with cash to spare, the going rate is $350,000, which is what biotechnology entrepreneur Dan Stoicescu paid Knome when, in January, he became the second person to pay to have his entire genetic code sequenced. The first, apparently, was a customer picked up by Knome's Beijing-based partner company.

A March 4 story in The New York Times quotes Stoicescu as saying he'd rather have a clearer picture of his health profile than buy a Bentley.

Right. So… it's not quite the $1,000 genome. Or as the Times relates it:

"I was in someone's Bentley once – nice car," said James D. Watson, the co-discoverer of the structure of DNA, whose genome was sequenced last year by a company that donated the $1.5 million in costs to demonstrate its technology. "Would I rather have my genome sequenced or have a Bentley? Uh, toss up."

But it's not all doom and gloom in the genomics world. Earlier this week, Google announced that it had invested an unspecified amount in the Personal Genome Project, a plan by George Church of Harvard University to create a database of 100,000 subjects' protein-coding DNA sequences correlated with health histories and body features. This investment marks Google's third large, public move into the medical space. First came its $3.9 million investment in 23andMe, and in February the search giant and the Cleveland Clinic teamed up to improve the management of electronic health records.

Nor is all genomics news sensible: a March 5 BBC story covers the debate over exhuming Galileo's body to scan his DNA for the genetic roots of his blindness. The story doesn't put a price on revealing Galileo's ocular misfortunes, but in this case, I'd rather have the Bentley.

Daylight saving time and electronic clocks: a confusing combination

It used to be simple. On the first day of daylight saving time I would walk around the house and reset all the clocks. Until I got around to doing that, I would simply add an hour in my mind.

These days, when I wake up on that "spring forward" Sunday morning, I'm not really sure what time it is until I turn on my cell phone. I take it on faith that the cell phone system updated on schedule. Then I have to go around the house and figure out which clocks have reset themselves and which still need to be reset. This is not always predictable.

The clocks in the cars, the coffee maker, and on the walls will stay on standard time until I reset them; they make no attempt to reset themselves.

My camera and video recorder will once again display the correct time after months of being an hour off. (I never set them back in the fall; my bad.)

Last year my Apple computer, confused by the recent rescheduling of daylight saving time in the U.S. (Congress moved it three weeks earlier in 2007), did not reset automatically. This year, thanks to a software update, it should reset its clock automatically. Microsoft has also sent out daylight saving time updates for its products; as long as you're updating your computer regularly, you're covered.
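The 2007 rule change that confused all those clocks is easy to pin down in code: before 2007, U.S. daylight saving time began on the first Sunday in April; the Energy Policy Act of 2005 moved the start to the second Sunday in March. A minimal Python sketch of just the start-date rule (end dates, time zones, and local exceptions are ignored):

```python
from datetime import date, timedelta

def nth_sunday(year, month, n):
    """Date of the n-th Sunday of the given month."""
    first = date(year, month, 1)
    # weekday(): Monday=0 ... Sunday=6
    offset = (6 - first.weekday()) % 7
    return first + timedelta(days=offset + 7 * (n - 1))

def us_dst_start(year):
    """U.S. DST start date: first Sunday in April before 2007,
    second Sunday in March from 2007 on."""
    if year < 2007:
        return nth_sunday(year, 4, 1)
    return nth_sunday(year, 3, 2)

print(us_dst_start(2006))  # 2006-04-02
print(us_dst_start(2007))  # 2007-03-11
print(us_dst_start(2008))  # 2008-03-09
```

A clock with the pre-2007 rule burned into its firmware, like the VCR described below, will dutifully spring forward on the first Sunday in April, three weeks late.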

My watch will wait until I reset it; my sons' watches are set automatically via a radio signal provided by the National Institute of Standards and Technology, so they should reset themselves. However, they may not; I recall the manual advising putting them near a window on occasion to make sure they get a strong signal. I'll have to remind the boys to check their watches, or to leave them on a windowsill the night before.

A TiVo (if I had one) would get an automatic clock correction from the mother ship. The DVD/VCR units that I do have hypothetically get updates, setting their clocks automatically from a time signal sent on the local PBS channel. This doesn't always work: back in 2000, some Los Angeles clocks ended up on New York time, other clocks were off by minutes, some by hours; the problem cropped up all around the country, depending on system quirks at local PBS stations. (I reported on this problem in the October 2000 issue of IEEE Spectrum.) The DVD/VCR that I have hooked up to the rooftop antenna will likely get its update. The other unit, which we use only to play, not to record, won't. It knows about daylight saving time; it just thinks it's coming three weeks from now. Since this VCR functions as my bedroom clock, I can't ignore it. Instead, I'll reset it, and then, three weeks from now, reset it again after it jumps forward an hour.

After, of course, I go flying out of bed in a mad rush, convinced I'm desperately late, having by then forgotten all about automatic clocks and daylight saving time.

Major Tech Publishing House Declares Bankruptcy

Ziff Davis Media, which rode the computer boom of the Eighties and Nineties to prominence in the world of technology publishing, has filed for bankruptcy protection. In a press release issued yesterday, the company said that it is taking the step voluntarily in order to restructure its debt load and reorganize its operations. It added that it expects to continue producing its line-up of print and online technology publications as usual during the legal proceedings.

Something of a legend in the U.S. publishing industry, Ziff Davis was an example of a private firm that was successfully able to re-invent itself as times changed. The company started out in the 1930s as a publisher of hobbyist titles such as Popular Aviation and Popular Photography, magazines that attracted advertisers of parts and equipment for the avocations of its readers.

In the Forties and Fifties, Ziff Davis continued to add enthusiast magazines such as Popular Electronics and Car and Driver to its roster. When co-founder William Ziff, Sr., died in 1953 (and co-founder Bernard Davis left), his son William took over the reins of the company and steered it to prosperity for decades. In the Eighties, the younger Ziff caught lightning in a bottle by starting up a number of publications geared to the nascent personal computer phenomenon. Titles such as PC Magazine and Computer Shopper were originally dismissed by industry competitors as marginal properties.

So, when Ziff was diagnosed with prostate cancer and decided to sell his company's assets in 1984, buyers such as CBS Publishing passed on bidding for the new computer publications, settling instead on the purchase of the premier enthusiast magazines for over US $700 million. This turned out to be a historic bit of irony, because Ziff's cancer treatment succeeded and he soon found himself back in the publishing business, running a much leaner company focused solely on technology publications, exactly at a time when the computer industry was exploding in popularity. Adding new titles in the field as opportunities arose, the entrepreneur quickly had Ziff Davis valued at more than the sum he had been paid by others for its core assets a few years earlier.

In the Nineties, with his publishing empire at its zenith, William Ziff, Jr., decided with his family to sell the company for good. With deals in place with private investment firms, Ziff Davis raised nearly $2 billion for its resources in 1994. The publishing portion of the company was subsequently sold by the Forstmann Little investment group to the Japanese publisher SoftBank in 1995, which in turn sold it to yet another private firm five years later, when it became Ziff Davis Media to reflect its strong online offerings.

Now, the topsy-turvy tale of Ziff Davis has come full circle. If approved by a federal bankruptcy court, the assets of the famed company will become the property of its creditors and debt holders. They will continue to fund its publications going forward.

In yesterday's announcement, the company's CEO, Jason Young, said: "Through this process, we will improve our capital structure and align it with the size of our current business operations. We have great strength in our industry leading brands and products and we believe that this restructuring will allow us to unlock the underlying value of our businesses and achieve our true growth potential."

In recent years, the transition from a paper-based operation to an electronic one has caused many traditional publishers such as Ziff Davis to re-organize their business models or succumb to financial pressures.

The ultimate irony for this publisher is that it helped create the very revolution that now threatens its existence.

[Full disclosure: The author is a former employee of Ziff Davis Publishing.]

Wireless Broadband: (Buy It and) Build It and They Will Come

The 700 MHz auction is winding to a close, with experts saying it may well end in the next few days. Already the activity seems over in the critical C block, where it's expected that a major carrier, probably Verizon, has won.

The bidding has been blind, and no one will know who bid what until it's all over, but Google is believed to have bid up to the point at which the FCC minimum of $4.6 billion was met. At that point, the open-access rules that the FCC imposed, at Google's behest, were guaranteed to govern the operation of the eventual network.

That is, whoever runs the network will have to allow customers to use their own devices; customers won't be limited to cellphones and data cards sold by the carrier. Similarly, developers will be able to create applications and services for the network without having to work with the carrier, whether it's a game like Tetris or a collection of ringtones.

The Register has an interesting article concluding that the C block winner just about has to be a major carrier. It's a long and complicated argument, but, briefly, it goes like this: The 700 MHz band being auctioned has been called "the last beachfront spectrum" because of its excellent propagation characteristics, but it's not really ideal for all possible uses. In particular, its ability to carry signals over long distances is great for rural areas, where carriers can cover more terrain with fewer base stations. But it can cause problems in urban areas (even though it does a better job of punching through apartment and office walls), because the very excellence of its propagation makes it more likely that signals will spill into nearby cells, causing interference there.
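The propagation advantage can be made concrete with the free-space path-loss formula, 20 log10(4 pi d f / c). A quick sketch; the 5-kilometer link distance is an illustrative assumption, not a figure from the auction coverage, and real cellular links add terrain and building losses on top of this:

```python
import math

def fspl_db(freq_hz, dist_m):
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    c = 3.0e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

# Same 5 km link at 700 MHz and in the 1.9 GHz PCS band: the lower
# band suffers roughly 8.7 dB less loss, which is what lets a rural
# carrier cover the same territory with fewer base stations.
loss_700 = fspl_db(700e6, 5_000)
loss_1900 = fspl_db(1.9e9, 5_000)
print(f"700 MHz: {loss_700:.1f} dB, 1.9 GHz: {loss_1900:.1f} dB, "
      f"advantage: {loss_1900 - loss_700:.1f} dB")
```

The same arithmetic cuts both ways: the signal that carries farther in the countryside also carries farther into a neighboring urban cell, which is the interference problem the Register describes.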

The upshot is that, in the Register's view, the 700 MHz spectrum is probably best used in a nationwide network that blends other 3G spectrum, such as the 1.9 GHz bands that AT&T and Verizon already control, for urban areas, with 700 MHz spectrum for rural ones.

Increasingly in the world of cellular there are such combinations in existence already. A device may have a radio operating in 800 MHz, 900 MHz, 1.9 GHz and shortly 1.7 GHz, 2.1 GHz and 2.5 GHz, all potentially in the same device. [....]

In the end it is more likely that a fat 2.5 GHz band (90 MHz wide) will become the best way to transmit large chunks of two-way mobile Internet access in a town, and that networks will eventually emerge combining that with longer wavelengths (lower frequencies) out of town. There are lots of combinations possible.

Which leads us to one of the first points we made, that 700 MHz is fundamentally a rural get-out clause for operators that "already have" networks. It makes rural build-out cheaper and more effective, and it can happen more rapidly. That in turn makes it more valuable to companies that already have nationwide wireless networks. As a result AT&T and Verizon are likely to be the major bidders.

The reference to 2.5 GHz is, of course, to Sprint and its under-construction 4G network, dubbed Xohm, which we picked as a winner in our January 2008 issue (see "Sprint's Broadband Gamble").

Recently there's been a lot of speculation that Sprint can't afford to build Xohm, or that its shareholders won't let it. But, as CNet reported last week, Dan Hesse, Sprint's new president and CEO,

indicated that Sprint is moving forward with its WiMax network, called Xohm. Previously, there had been some speculation that Sprint might abandon its WiMax efforts.

"Leave no doubt that the first priority is our core business," he said. "But Sprint has an enormous asset in the 100 MHz of spectrum (that is being used to build Xohm), and we have a three-year head start with Xohm."

Getting back to the Register's argument, I'm not convinced. After all, the 700 MHz spectrum will have been bought and paid for in urban areas as well as rural ones, at premium prices; the spectrum will have to be used for there to be a return on that investment.

The conclusion seems right, though: the likely winner is either Verizon or AT&T, and there's reason to think it's Verizon. First, Verizon made clear before the auction that it would bid aggressively on the C block, and it was Verizon that protested, initially, against the Google open-access rules. Second, experts expected AT&T to be more attracted to the piecemeal A and B blocks, which involve lots of little geographical areas. AT&T already has a robust 3G network that needs only to be filled in here and there, the thinking has been.

Let's consider again the fate of Xohm. Despite Dan Hesse's assurances, Sprint faces some enormous challenges, as the CNet article spells out. Sprint posted "a $29.5 billion loss for the fourth quarter of 2007 and warned that the company would continue to lose customers at an alarming rate in the coming year." It eliminated its dividend and "had to borrow some $2.5 billion just to get access to cash." Churn is an enormous problem: the company lost 683,000 customers in Q4 and will "likely lose another 1.2 million post-paid customers in the first quarter of 2008." The old CEO was fired last summer. A planned WiMax partnership with Seattle-based Clearwire has fallen apart.

Not long ago, I went to an invitation-only press briefing by the high-tech analytical firm ABI Research. We journalists, naturally, grilled the gurus there about Xohm's prospects in the face of these difficulties.

We were told that Xohm is such a good idea, that the uses for wireless broadband are so great, and that the need for a third competitive nationwide broadband offering into each home, beyond cable and DSL, is so compelling, that Xohm will succeed even if Sprint has to sell it to someone who will build it. Craig McCaw, the founder of Clearwire, could buy it and build it. Or a group of investors could.

But it occurs to me that Google, having not bought the 700 MHz C block despite seeming to have compelling reasons for doing so, could turn around and snap up Xohm. Google is worth more than twice the entire U.S. car-manufacturing industry; stranger things have happened.

Responsibility for the failed Nanodynamics IPO points to underwriter

Despite a reasoned conclusion, put forth on this blog, that the failure of Nanodynamics' $100 million IPO on the Dubai exchange may have been the result of poor underlying economics for the company, it appears that the fault may instead have been with the underwriter.

In an investigative piece for Nanotech-Now, Alan Shalleck of Nanoclarity deduces that the lead underwriter, Global Crown Capital Ltd. (GCC), failed to put the underwriting funds into the issuer's account, namely Nanodynamics', and that this breach of the rules forced Nanodynamics to voluntarily delist the stock and return the money to those who had purchased it.

It seems interest in the company was there: all the shares were pre-sold. But according to the rules of IPOs, the underwriter has to first buy all the shares and then sell them. According to the Shalleck piece, the problem was that GCC simply never put the $50 million for purchasing the shares into Nanodynamics' accounts.

If this is indeed the case, then, as Shalleck points out, caveat emptor to nanotech startups: make sure you have a good underwriter.

Taiwan claims almost $10 billion economic impact from nanotechnologies

After pouring a reported NT$18.9 billion ($615 million) into a six-year program developed by the Council for Economic Planning and Development (CEPD) to support the commercial application of nanotechnology, the CEPD is now prepared to say that the effort has helped create NT$300 billion ($9.68 billion) in nanotechnology production value.

We'll certainly have to take the Taiwanese government's word for this figure based on the published reports, since there is little other data to support the claim.

The only specifics provided pertain to a not-yet-launched commercial project in the field of MRAM (magnetic random access memory), which is reported to have a production value of over NT$10 billion ($320 million).

The more general information claims that "800 Taiwanese manufacturers mainly from electronic industries and traditional industries, which have upgraded these industries by making value-added products such as textiles with an anti-bacterial purifying finish," make up this $9.68 billion impact.

There is no question that Taiwan has a well-organized (an actual strategic plan), well-funded, and highly centralized nanotechnology initiative, with a native industrial base, primarily electronics, that stands to benefit from innovations provided by nanotechnology. So, if any country is going to reap the benefits of its nanotechnology initiative, it would likely be a country like Taiwan.

But it would be nice to see these numbers supported with some more specifics. It leaves me wondering where all the nanotechnology-economic-metric skeptics have gone.

Sony Pulls Plug on Historic Trinitron TV

As time marches on, even the best technologies fall by the wayside. So it is now with the storied Trinitron cathode-ray tube (CRT) monitor. On Monday, according to numerous sources, Sony Corp. said it will end production of all CRT television sets worldwide, closing the book on the Trinitron, which was considered the top brand name in TVs in its time.

The secret of the Trinitron's rise to prominence was the revolutionary design of its CRT. Invented in the 1950s by E.O. Lawrence, who called it the Chromatron, it used a single emitter to fire the electrons that produce red, green, and blue light and focused them through an aperture grille made of vertical filaments, as opposed to the conventional shadow-mask design then common. Sony patented Lawrence's design in 1967 and began a long marketing campaign that eventually resulted in the more expensive Trinitron becoming a hit in the North American and European marketplaces in the prosperous 1980s. For years, owning a Trinitron TV was a status symbol in many countries.

The quality of the Trinitron's picture also induced many computer manufacturers to license its CRT technology for use in some of the first commercially successful integrated color displays, as well as later standalone monitors. Firms such as Apple Computer, Dell, Gateway, IBM, and Sun Microsystems were all early adopters of the Trinitron.

At the height of its popularity, Sony produced 20 million Trinitron units a year. All told, the company manufactured some 280 million Trinitrons over four decades.

In 1996, however, Sony's patent rights expired, and competitors such as Mitsubishi began incorporating the technology into their monitors and TVs. More importantly, the CRT itself began to be challenged in the late Nineties by new flat-screen technologies such as plasma and liquid crystal displays. Where the Trinitron had been the "must have" color monitor of a previous generation, the flat displays became the status symbol of a new one. And the bulky CRTs were slowly consigned to the garbage pile of history.

(For more on the transition from CRTs to flat screens, please see our feature article "Goodbye, CRT" in our November 2006 issue.)

Monday's news from Sony really concerned only the last manufacturing facilities the company uses to make Trinitrons, for sale in some Latin American and Asian markets. Sony had previously shuttered other facilities that served the Japanese, European, and North American markets.

    "We are going to end production of cathode ray tubes at the end of March," a Sony spokesman said Monday.

    It will bring to a close a fascinating chapter in the history of electronics. Farewell, Trinitron, you were once the best.

Climate Skeptics Show Force in New York

The Heartland Institute's International Conference on Climate Change (motto: "global warming is not a crisis"), which ended yesterday in New York City, might be quickly written off because of the "dizzying range" and inconsistency of ideas expressed, as Andrew Revkin put it in The New York Times, or because of the presumed vested interests bankrolling the meeting. But that is too facile. If nothing else, Heartland, which describes itself as dedicated to "promoting free-market solutions to social and economic problems," mustered an impressive number of cosponsors: more than 50 organizations with a similar social philosophy, many of them in Europe.

Free marketeers have looked upon climate alarmism with a jaded eye, seeing it as a Trojan horse for statism, central planning, and internationalism. But can they actually muster credible scientists to support their suspicions? The Heartland meeting, which took place March 3-4, suggests that they still can.

To sample the situation, I attended a set of sessions on paleoclimatology, the study of the earth's climatic prehistory, gleaned from tree rings and ice cores, among other indicators. I found a good deal to support my skepticism about the climate skeptics: highly technical talks with eccentric claims by scientists who are not actually climate professionals; selective use of limited time periods and data sets to support sweeping conclusions; scant mention of the pioneering geologists, ecologists, and glaciologists who laid the foundation for what has been, in the last 50 years, a revolution in paleoclimatology.

In a whole morning of talks about paleoglaciology, I heard scant mention of the ice scientists Dansgaard and Oeschger, and no critical discussion of their work. It's as if one were to hear four hours of talks critiquing the revolution in modern physics without mention of Einstein and Bohr.

But it was not all eccentric, empty, or ill-informed, either. Commenting on the 650,000-year ice-core record of greenhouse gases and global temperatures, astrophysicist Willie Soon poked fun with a cartoon comparing the relative roles of the Sun and carbon dioxide: the former Chicago Bears tackle Refrigerator Perry represented the Sun, while CO2 was personified by an average-sized Asian man, namely Soon himself. Soon, who works at the Harvard-Smithsonian Center for Astrophysics, asked pointedly why the record shows changes in temperature leading rather than lagging changes in greenhouse gas levels, if it's the carbon dioxide and methane changes that supposedly cause climate change. "It's as if we said that cancer causes cigarette smoking," he said, echoing other speakers.
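Soon's lead-lag point is, at bottom, a time-series question: which of two correlated series shifts first? A minimal sketch of how such a lag is estimated by cross-correlation, using purely synthetic data with a built-in 8-step lead (no claim here about the actual ice-core record, where the analysis is far subtler):

```python
import numpy as np

# Synthetic stand-ins only: a shared driver signal, with "temperature"
# leading "CO2" by a known 8 steps.
rng = np.random.default_rng(0)
n, lead = 500, 8
driver = rng.standard_normal(n + lead)
temp = driver[lead:]                               # leads by `lead` steps
co2 = driver[:n] + 0.1 * rng.standard_normal(n)    # lags, plus noise

# Cross-correlate the zero-mean series; the location of the peak
# estimates the lag. With np.correlate, a negative peak lag means
# the first argument leads the second.
xc = np.correlate(temp - temp.mean(), co2 - co2.mean(), mode="full")
lags = np.arange(-(n - 1), n)
print("estimated lead of temperature over CO2:", -lags[np.argmax(xc)])
```

The sketch recovers the built-in 8-step lead. Note that a lead in the record does not by itself settle causation: feedback systems, where an initial warming releases gases that then amplify the warming, can show exactly this pattern, which is why the lead-lag argument is contested rather than decisive.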

That's not the only argument we'll be hearing more and more as the debate heats up over the next year about whether the United States should adopt a carbon-trading system and commit to a Kyoto-like schedule of greenhouse gas cuts. To judge from my Heartland sample, we'll also be hearing doubts cast on the generally accepted estimate of pre-industrial greenhouse gas levels, on the magnitude and geographic extent of the Medieval Warm Period and the Little Ice Age, and on the scope and significance of carbon uptake by the oceans.

The climate skeptics are complaining that they have trouble being heard, and they may have a point. On the eve of the Bali conference, with Australia and perhaps even the United States heading toward belated membership in the Kyoto regime, 100 scientists signed a petition to the U.N. Secretary-General rejecting climate alarmism. Sure, a lot of them were the usual suspects, people like Fred Seitz, the distinguished semiconductor physicist who first emerged in the public arena as an enthusiastic adherent of Ronald Reagan's Star Wars program and in recent years has just as enthusiastically denounced climate alarmism. But there were also some eye-catching new names, notably Freeman Dyson, the maverick math and physics theorist at Princeton's Institute for Advanced Study, who is always interesting and often right.

