Tech Talk

Re-focusing Environmental and Health Concerns of Nanotechnology

The latest scare screed from an NGO on the subject of nanotechnology comes from the Australia-based Friends of the Earth in its latest report, "Out of the laboratory and on to our plates: Nanotechnology in food and agriculture". The report comes replete with images of faceless scientists injecting some unknown chemical into a piece of fruit.

As propaganda goes, this is top-notch stuff. As far as keeping to the facts and avoiding misleading hype, it falls short. TNTLog does a thorough job of putting the report in its appropriate place.

But the environmental and health concerns surrounding nanotechnology do need to be addressed, none more urgently than the occupational safety and health issues facing workers in manufacturing processes that employ nanoparticles.

Nanowerk has written a spotlight piece on this issue that introduces a recent report and survey conducted by Kaspar Schmid and Michael Riediker from the Institute of Health Economics and Management at the University of Lausanne in Switzerland, entitled "Use of Nanoparticles in Swiss Industry: A Targeted Survey".

While the Friends of the Earth selectively quote portions of the 2004 Royal Society report to arrive at the conclusion that a moratorium on nanotechnology is needed (something the Royal Society report itself never recommends), that same report does express keen concern about "free" nanoparticles and the risk they may pose to workers.

The recent Swiss report is not headline-grabbing fear mongering; it is a quiet pursuit of facts. And one of its key findings is fairly disturbing: there are few, if any, best-practice regulations from either industry or government on how to handle nanoparticles.

If concerns about the environmental and health impact of nanoparticles are to be fruitfully pursued, then addressing the occupational health and safety of so-called nanoworkers is a good place to start, and the one where the risk is probably highest.

By engaging in scare tactics that depend on dubiously linking nanotechnology to genetic engineering and synthetic biology, such reports keep important nanotoxicological research into nanoparticles, and the best-practice regulations that would follow from it, from getting their proper place on the list of priorities.

Out of Africa: Gutenberg, birth certificates and the elusive hegemony of information technology

In Africa, the Gutenberg revolution remains unfinished. For many people, printed documents are a novel technology, yet to fully penetrate all levels of society.

The news this week that the southern African country of Malawi will require its citizens to have birth certificates for the first time got me thinking about a complex problem in African development: "information poverty" and the way old technologies retain the power to shock, to paraphrase the title of a recent book by David Edgerton, a British historian of technology.

Edgerton argues persuasively in his 2007 book, "The Shock of the Old," that well-established technologies retain the power to "shock" in a surprisingly large number of circumstances.

In Africa, many mature technologies have not yet been mastered. Electricity is probably the best understood example; the recent power shortages around Africa illustrate that. But printing, which began to transform Europe more than 500 years ago, is only now doing the same in sub-Saharan Africa. Printed documents, and the personal information that drives their creation, are only now becoming mainstream in many countries in the region.

In rural Africa, birth certificates remain atypical, partly because government officials charge too much for them. The charges are a form of extortion, but also a symptom of a mentality that treats printing as an exotic technology, a scarce resource, an alien instrument.

In Africa, as I once explored in a paper for the Web journal First Monday, the whole notion of information as an instrument of power, as a technology in the truest sense of the word, is poorly developed. In short, the motivation to master printing technology, and to value printed documents, is lacking because of an "information poor" environment.

In African cities, many streets -- dare I say most -- have no names. Home delivery of mail is virtually non-existent, and documentation of a person's identity is often lacking.

The costs of "information poverty" are manifold. Governments in Africa can't deliver certain services because it is often impossible to prove who received them.

In the case of children, the failure of most families to secure birth certificates creates the potential for mayhem. Who does a child belong to? The question can be impossible to answer without printed documents.

In Chad, another African country, a French non-profit recently caused an uproar by taking 103 local children -- presumed to have no parents -- and giving them to families in France. Even after the Chadian and French governments intervened to block the flawed adoptions (the children actually had parents), figuring out whom the kids belonged to wasn't easy, for lack of documentation.

Printed documents are taken for granted in industrialized societies, whose people instead invest heavily in combating the opposite problem, "information overload." Managing vast amounts of information is among the most lucrative pursuits of contemporary innovators: witness the great commercial success of Google.

Yet in Africa, "information poverty" remains a curious scourge.

Long live the printing press!

Two Spacecraft Prepare for Space Station Meetings

That's one up and one to go.

As NASA prepared for the early Tuesday morning launch of the space shuttle Endeavour, the European Space Agency (ESA) monitored the status of the latest cargo ship to fly into orbit on a mission to the International Space Station (ISS).

On Sunday, the new Jules Verne Automated Transfer Vehicle (ATV) climbed into orbit atop a special Ariane 5 rocket from its launch pad at the ESA spaceport at Kourou, French Guiana. The first in its class, the unmanned Jules Verne is Europe's alternative to the U.S. space shuttle fleet. It weighs nearly 18 metric tons and is equipped with electronics that automate its flight once in space. Its maiden mission is to prove its abilities by carrying a 4.5-ton cargo of parts, propellant, water, and oxygen to the ISS. Initially, it will linger in low Earth orbit at an altitude of 260 kilometers.

Three weeks from now, mission controllers will instruct its computers to begin an ascent that will put it on a path to rendezvous with the space station, which orbits 85 km higher. The delay has been planned to allow Endeavour to safely come and go from the ISS in the meantime.
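
For the curious, the orbit raising involved is modest. Here is a back-of-the-envelope sketch (my own numbers, not ESA's flight plan) treating the climb as a simple Hohmann transfer between circular orbits at 260 and 345 kilometers:

```python
import math

# Rough Hohmann-transfer estimate for the ATV's climb from its
# 260 km parking orbit to the ISS's orbit 85 km higher. Illustrative
# only -- the real rendezvous profile uses a series of smaller burns.
MU = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0  # mean Earth radius, m

r1 = R_EARTH + 260_000.0   # initial circular orbit radius
r2 = R_EARTH + 345_000.0   # target circular orbit radius
a_t = (r1 + r2) / 2        # semi-major axis of the transfer ellipse

v1 = math.sqrt(MU / r1)    # circular speed in the low orbit
v2 = math.sqrt(MU / r2)    # circular speed in the high orbit
dv1 = math.sqrt(MU * (2 / r1 - 1 / a_t)) - v1  # burn 1: enter transfer
dv2 = v2 - math.sqrt(MU * (2 / r2 - 1 / a_t))  # burn 2: circularize

print(f"total delta-v: {dv1 + dv2:.1f} m/s")   # about 50 m/s
```

Roughly 50 meters per second in all, a small nudge compared with the nearly 8 kilometers per second the Ariane 5 already supplied; the three-week loiter is about deconflicting traffic, not propulsion.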

In that regard, NASA said today it is making final preparations for the launch of Endeavour in the wee hours of Tuesday morning.

This flight, known as STS-123, will carry the first component of the Japan Aerospace Exploration Agency's Kibo laboratory and the Canadian Space Agency's two-armed robotic system, known as Dextre, to the ISS. The 1,500-kilogram robot will be assembled in space by American astronauts.

The 16-day mission will be helmed by Dominic Gorie with Gregory H. Johnson serving as pilot. The crew will include Mission Specialists Rick Linnehan, Robert L. Behnken, Mike Foreman, Garrett Reisman, and Japanese astronaut Takao Doi.

The overlap of the two missions arose from problems NASA experienced last December with internal sensors in the external fuel tank during the previous shuttle mission, the STS-122 flight of Atlantis (see our previous entry, "NASA Sets New Dates for Next Shuttle Launches," for more detail).

NASA and ESA hope this odd pairing of missions will catch up their joint timelines and bring the space station's construction back on schedule.

A bad day on MARS


Not the Red Planet, but rather the Monterey Accelerated Research System, an undersea observatory taking form, if haltingly, in the icy depths of the Pacific Ocean some 32 kilometers off California's coast. Last month an ROV from the Monterey Bay Aquarium Research Institute installed the observatory's main science node at a depth of 900 meters. But when engineers threw the switch on the node's 10,000-volt power supply, they discovered a ground fault in the main underwater electrical plug connecting the node to shore.

The fault necessitated surfacing of the 2-ton package of electronics as well as the observatory's trawl-resistant steel frame, requiring a large ship and complex logistics. Replacing the plug and reinstalling the node will set the project back at least several months.

Spectrum readers will recognize this setback as simply one more sign of the inherent challenge of delivering high power and broadband communications to deep-sea instruments, the subject of the 2005 Spectrum feature "Neptune Rising", which profiled a family of U.S. and Canadian projects sharing engineering and components to create the world's most advanced remotely operated, Internet-connected underwater research stations. One piece of the program, VENUS, is delivering real-time data from relatively shallow installations off Vancouver Island in British Columbia, while the deeper MARS and NEPTUNE projects remain works in progress.

Power is a key challenge. As "Neptune Rising" was going to press in the fall of 2005, engineers at NASA's Jet Propulsion Laboratory were troubleshooting bugs in the sophisticated power supply they had designed for MARS, a problem that would ultimately take another 14 months and a new engineering team at Alcatel to solve. Imagine the disappointment of the MARS team at being upended by a faulty plug after all that high-tech sweat and blood!

The Monterey Bay Aquarium Research Institute, in a communiqué issued last month, put the problems down to life on the cutting edge, quoting the words of David Packard (of Hewlett-Packard fame) when he founded the institute in 1987. Packard apparently admonished the new institute's researchers to take risks and ask big questions. "Don't be afraid to make mistakes," said Packard. "If you don't make mistakes, you're not reaching far enough." Let's hope the National Science Foundation officials supporting MARS agree.

Photo credit: David Fierstein, MBARI

MIT to be tuition-free for nearly a third of students


The Massachusetts Institute of Technology announced yesterday a far-reaching financial aid program that will make it possible for nearly a third of MIT undergraduate students to have their tuition charges completely covered.

Under the program, which takes effect in the 2008-2009 academic year, MIT will be tuition-free for families earning less than US $75,000 a year. MIT will also provide grants to those students to cover expenses beyond tuition, helping them graduate free of loan debt.

The MIT move follows a major financial aid plan announced by Harvard late last year. It also follows a request by the U.S. Senate Finance Committee for detailed tuition, financial aid, and endowment information from the nation's 136 wealthiest universities. (Here is MIT's response to the Senate request.)

The MIT initiative will increase the institute's financial aid budget to $74 million. Other details of the program (from the press release):

  • Families earning less than $75,000 a year will have all tuition covered. For parents with total annual income below $75,000 and typical assets, MIT will ensure that all tuition charges are covered with an MIT scholarship, federal and state grants, and/or outside scholarship funds. Nearly 30 percent of MIT students fall into this tuition-free category.

  • For families earning less than $75,000 a year, MIT will eliminate the student loan expectation. MIT will no longer expect students from families with total annual income below $75,000 and typical assets to take out loans to cover expenses beyond tuition. Under this provision, for example, students in this income group who participate in MIT's paid Undergraduate Research Opportunities Program (UROP) each semester would be able to graduate debt-free.

  • For families earning less than $100,000, MIT will eliminate home equity in determining their need. In determining the ability to pay for college, MIT will no longer consider home equity for families with total annual income below $100,000 and typical assets. On average, this will reduce parental contributions by $1,600. For families who rent, rather than own a home, MIT will provide a comparable reduction in the expected parental contribution.

  • MIT will reduce student work-study requirements for all financial aid recipients. During the past decade, MIT has steadily lowered the amount it expects students to provide through term-time work. MIT will take a further step in this direction by reducing the work-study expectation for all financial aid recipients by an additional 10 percent.

PS: The good news was accompanied by this not-so-good news: "Tuition and fees for the upcoming academic year will increase 4 percent to $36,390." MIT claims that "this figure represents less than half of what it costs MIT to educate an undergraduate," and that the Institute is "increasing funds available for financial aid this year at a far greater rate than the rise in tuition." But then again, one has to ask: how fast is MIT's $10 billion endowment growing?

PPS: On a related note, speaking of tuition-free engineering colleges in the Boston area, let's not forget about the Franklin W. Olin College of Engineering, in Needham, Mass., about which I wrote a long feature titled "The Olin Experiment" and a follow-up blog post. I've been accused of having a bit of a crush on Olin, and I admit, I do admire this great little school. :)

Gloomy genetics company casts cloud over personal genomics prospects

There's some unexpected news coming from DeCode Genetics, the nifty Icelandic company that tests fragments of your inner cheek for genetic markers of diabetes, prostate cancer, concerns of the heart, and more. The company is scaling back its workforce by 15 percent, ostensibly triggered by an unexpected market downturn. "It would even be wise for other companies in our community to follow our example," says Kari Stefansson, the company's CEO, and I'm assuming that by community he doesn't mean Reykjavik.

So is this merely a hiccup, or a sign of larger problems on the road to personalized medicine? DeCode, the Google-backed 23andMe, and a Massachusetts-based company called Knome all launched whole-genome scans in November 2007, so the field is extremely young. The blog Eye On DNA thoroughly dissects Stefansson's comments and the outlook for DeCode and its "community", namely 23andMe, Knome, and a couple of others. In short, it's not entirely clear what caused this unexpected market shift, but it's worrying given how far the price of genome scans must fall before they become a viable option for people other than millionaires and celebrities of the science world.

For those with cash to spare, the going rate is $350,000, which is what biotechnology entrepreneur Dan Stoicescu paid Knome when, in January, he became the second person to pay to have his entire genetic code sequenced. The first, apparently, was a customer picked up by Knome's Beijing-based partner company.

A March 4 story in The New York Times quotes Stoicescu as saying he'd rather have a clearer picture of his health profile than buy a Bentley.

Right. So… it's not quite the $1,000 genome. Or as the Times relates it,

"I was in someone's Bentley once -- nice car," said James D. Watson, the co-discoverer of the structure of DNA, whose genome was sequenced last year by a company that donated the $1.5 million in costs to demonstrate its technology. "Would I rather have my genome sequenced or have a Bentley? Uh, toss-up."

But it's not all doom and gloom in the genomics world. Earlier this week, Google announced that it had invested an unspecified amount in the Personal Genome Project, a plan by George Church of Harvard University to create a database of 100,000 subjects' sequences of protein-coding DNA segments correlated with health histories and body features. This investment marks the third large, public move by Google into the medical space. First came its $3.9 million investment in 23andMe, and in February the search giant and the Cleveland Clinic teamed up to improve the management of electronic health records.

Nor is all genomics news sensible: a March 5 BBC story covers the debate over exhuming Galileo's body to scan his DNA for the genetic roots of his blindness. The story doesn't put a price on revealing Galileo's ocular misfortunes, but in this case, I'd rather have the Bentley.

Daylight savings time and electronic clocks: a confusing combination

It used to be simple. On the first day of daylight savings time I would walk around the house and reset all the clocks. Until I got around to doing that, I would simply add an hour in my mind.

These days, when I wake up on that "spring forward" Sunday morning, I'm not really sure what time it is until I turn on my cell phone. I take it on faith that the cell phone system updated on schedule. Then I have to go around the house and figure out which clocks have reset themselves and which still need to be reset. This is not always predictable.

The clocks in the cars, the coffee maker, and on the walls will stay on standard time until I reset them; they make no attempt to reset themselves.

My camera and video recorder will once again display the correct time after months of being an hour off. (I never set them back in the fall; my bad.)

Last year my Apple computer, confused by the recent rescheduling of daylight savings time in the U.S. (Congress moved it three weeks earlier starting in 2007), did not reset automatically. This year, thanks to a software update, it should reset its clock automatically. Microsoft has also sent out daylight savings time updates for its products; as long as you're updating your computer regularly, you're covered.
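
For the curious, here's what those patches actually encode. A minimal sketch (my own illustration, not Apple's or Microsoft's updater) using Python's standard zoneinfo module, whose IANA time-zone database carries the post-2007 U.S. rules:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

# Since 2007, U.S. daylight time starts the second Sunday in March
# instead of the first Sunday in April. A clock with stale rules is
# an hour off for the three weeks in between.
tz = ZoneInfo("America/New_York")

for day in ("2008-03-08", "2008-03-09", "2008-04-06"):
    noon = datetime.fromisoformat(f"{day}T12:00").replace(tzinfo=tz)
    offset = noon.utcoffset().total_seconds() / 3600
    print(day, noon.tzname(), f"UTC{offset:+.0f}")

# 2008-03-08 EST UTC-5   still standard time
# 2008-03-09 EDT UTC-4   new rule: clocks have already sprung forward
# 2008-04-06 EDT UTC-4   old rule expected the jump only here
```

Gadgets that hard-coded the old rule, like some of the un-updated devices described below, sit in that three-week gap showing standard time.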

My watch will wait until I reset it; my sons' watches are set automatically via a radio signal provided by the National Institute of Standards and Technology, so they should reset themselves. However, they may not; I recall the manual advising to put them near a window on occasion to make sure they get a strong signal. I'll have to remind the boys to check their watches, or leave them on a windowsill the night before.

A TiVo (if I had one) would get an automatic clock correction from the mother ship. The DVD/VCR units that I do have hypothetically get updates, setting their clocks automatically to a time signal sent on the local PBS channel. This doesn't always work; back in 2000, some Los Angeles clocks ended up on New York time, some were off by minutes, others by hours, and the problem cropped up all around the country, depending on system quirks at local PBS stations. (I reported on this problem in the October 2000 issue of IEEE Spectrum.) The DVD/VCR that I have hooked up to the rooftop antenna will likely get its update. The other unit, which we use only for playback, not recording, won't. It knows about daylight savings time; it just thinks it's coming three weeks from now. Since this VCR functions as my bedroom clock, I can't ignore it. Instead, I'll reset it, and then, three weeks from now, reset it again after it jumps forward an hour.

After, of course, I go flying out of bed in a mad rush, convinced I'm desperately late, having forgotten about automatic clocks and daylight savings time by then.

Major Tech Publishing House Declares Bankruptcy

Ziff Davis Media, which rode the computer boom of the Eighties and Nineties to prominence in the world of technology publishing, has filed for bankruptcy protection. In a press release issued yesterday, the company said that it is taking the step voluntarily in order to restructure its debt load and reorganize its operations. It added that it expects to continue producing its line-up of print and online technology publications as usual during the legal proceedings.

Something of a legend in the U.S. publishing industry, Ziff Davis was an example of a private firm that successfully reinvented itself as times changed. The company started out in the 1930s as a publisher of hobbyist titles such as Popular Aviation and Popular Photography, magazines that attracted advertisers of parts and equipment for the avocations of their readers.

In the Forties and Fifties, Ziff Davis continued to add enthusiast magazines such as Popular Electronics and Car and Driver to its roster. When co-founder William Ziff, Sr., died in 1953 (and co-founder Bernard Davis left), his son William took over the reins of the company and steered it to prosperity for decades. In the Eighties, the younger Ziff caught lightning in a bottle by starting up a number of publications geared to the nascent personal-computer phenomenon. Titles such as PC Magazine and Computer Shopper were originally dismissed by industry competitors as marginal properties.

So, when Ziff was diagnosed with prostate cancer and decided to sell his company's assets in 1984, buyers such as CBS Publishing passed on bidding for the new computer publications, settling on the purchase of the premier enthusiast magazines for over US $700 million. This turned out to be a historic bit of irony: Ziff's cancer treatment succeeded, and he soon found himself back in the publishing business, running a much leaner company focused solely on technology publications, exactly at a time when the computer industry was exploding in popularity. Adding new titles in the field as opportunities arose, the entrepreneur quickly had Ziff Davis valued at more than the sum he had been paid by others for its core assets a few years earlier.

In the Nineties, with his publishing empire at its zenith, William Ziff, Jr., decided with his family to sell the company for good. With deals in place with private investment firms, Ziff Davis raised nearly $2 billion for its resources in 1994. The publishing portion of the company was subsequently sold by the Forstmann Little investment group to the Japanese publisher SoftBank in 1995, which in turn sold it to yet another private firm five years later, when it became Ziff Davis Media to reflect its strong online offerings.

Now, the topsy-turvy tale of Ziff Davis has come full circle. If approved by a federal bankruptcy court, the assets of the famed company will become the property of its creditors and debt holders, who will continue to fund its publications going forward.

In yesterday's announcement, the company's CEO, Jason Young, said: "Through this process, we will improve our capital structure and align it with the size of our current business operations. We have great strength in our industry leading brands and products and we believe that this restructuring will allow us to unlock the underlying value of our businesses and achieve our true growth potential."

In recent years, the transition from a paper-based operation to an electronic one has caused many traditional publishers such as Ziff Davis to re-organize their business models or succumb to financial pressures.

The ultimate irony for this publisher is that it helped create the very revolution that now threatens its existence.

[Full disclosure: The author is a former employee of Ziff Davis Publishing.]

Wireless Broadband: (Buy It and) Build It and They Will Come

The 700 MHz auction is winding down, with experts saying it may well end in the next few days. Already the activity seems over in the critical C block, where it's expected that a major carrier, probably Verizon, has won.

The bidding has been blind, and no one will know who bid what until it's all over, but Google is believed to have bid to the point at which the FCC minimum of $4.6 billion was met. At that point, the open-access rules that the FCC imposed, at Google's behest, will govern the operation of the eventual network.

That is, whoever runs the network will have to allow its customers to use their own devices; customers won't be limited to cellphones and data cards sold by the carrier. Similarly, developers can create applications and services for the network without having to work with the carrier, whether it's a game like Tetris or a collection of ringtones.

The Register has an interesting article concluding that the C block winner just about has to be a major carrier. It's a long and complicated argument, but, briefly, it goes like this: The 700 MHz band being auctioned has been called "the last beachfront spectrum" because of its excellent propagation characteristics, but it's not really ideal for all possible uses. In particular, its ability to transmit signals over long distances is great for rural areas, where carriers can cover more terrain with fewer base stations. But it can cause problems in urban areas (even though it does a better job of punching through apartment and office walls), because the very excellence of its propagation makes it more likely that signals will spill into nearby cells, causing interference there.
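
To put rough numbers on that propagation advantage, here's a quick sketch (my own idealized free-space calculation, ignoring terrain, walls, and antenna gains) comparing path loss at 700 MHz and at the 1.9 GHz PCS band:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss (Friis) in dB; distance in km, frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

d = 10.0  # km from the base station
print(f"700 MHz : {fspl_db(d, 700):.1f} dB")    # ~109.3 dB
print(f"1.9 GHz : {fspl_db(d, 1900):.1f} dB")   # ~118.0 dB
print(f"edge    : {fspl_db(d, 1900) - fspl_db(d, 700):.1f} dB")  # ~8.7 dB
```

That roughly 8.7 dB edge holds at any distance (it is just 20 log10(1900/700)), and it is what lets a rural 700 MHz cell cover far more ground per base station; in a dense city, the same reach becomes the cell-to-cell interference the Register worries about.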

The upshot is that, in the Register's view, the 700 MHz spectrum is probably best used in a nationwide network that blends other 3G spectrum, such as the 1.9 GHz bands that AT&T and Verizon already control, for urban areas, with 700 MHz spectrum for rural ones.

Increasingly in the world of cellular there are such combinations in existence already. A device may have a radio operating in 800 MHz, 900 MHz, 1.9 GHz and shortly 1.7 GHz, 2.1 GHz and 2.5 GHz, all potentially in the same device. [....]

In the end it is more likely that a fat 2.5 GHz (90 MHz wide) will become the best way to transmit large chunks of two-way mobile internet access in a town, and that networks will eventually emerge combining that with longer wavelengths (lower MHz) out of town. There are lots of combinations possible.

Which leads us to one of the first points we made, that 700 MHz is fundamentally a rural get-out clause for operators that "already have" networks. It makes rural build-out cheaper, more effective, and it can happen more rapidly. So that in turn makes it more valuable to companies that already have nationwide wireless networks. As a result AT&T and Verizon are likely to be the major bidders.

The reference to 2.5 GHz is, of course, to Sprint and its under-construction 4G network, dubbed Xohm, which we picked as a winner in our January 2008 issue (see "Sprint's Broadband Gamble").

Recently there's been a lot of speculation that Sprint can't afford to build Xohm, or that its shareholders won't let it. But, as CNet reported last week, Dan Hesse, Sprint's new president and CEO,

indicated that Sprint is moving forward with its WiMax network, called Xohm. Previously, there had been some speculation that Sprint might abandon its WiMax efforts.

"Leave no doubt that the first priority is our core business," he said. "But Sprint has an enormous asset in the 100 MHz of spectrum (that is being used to build Xohm), and we have a three-year head start with Xohm."

Getting back to the Register's argument, I'm not convinced. After all, the 700 MHz spectrum will have been bought and paid for in urban areas as well as rural ones, at premium prices; the spectrum will have to be used for there to be a return on that investment.

The conclusion seems right, though: the likely winner is either Verizon or AT&T, and there's reason to think it's Verizon. First, Verizon made clear before the auction that it would bid aggressively on the C block, and it was Verizon that protested, initially, against the Google open-access rules. Second, experts expected AT&T to be more attracted to the piecemeal A and B blocks, which involve lots of little geographical areas. AT&T already has a robust 3G network that only needs to be filled in here and there, the thinking has been.

Let's consider again the fate of Xohm. Despite Dan Hesse's assurances, Sprint faces some enormous challenges, as the CNet article spells out. Sprint posted "a $29.5 billion loss for the fourth quarter of 2007 and warned that the company would continue to lose customers at an alarming rate in the coming year." It eliminated its dividend and "had to borrow some $2.5 billion just to get access to cash." Churn is an enormous problem: the company lost 683,000 customers in Q4 and will "likely lose another 1.2 million post-paid customers in the first quarter of 2008." The old CEO was fired last summer. A planned WiMax partnership with Seattle-based Clearwire has fallen apart.

Not long ago, I went to an invitation-only press briefing by the high-tech analytical firm ABI Research. We journalists, naturally, grilled the gurus there about Xohm's prospects in the face of these difficulties.

We were told that Xohm is such a good idea, that the uses for wireless broadband are so great, and the need for a third competitive nationwide broadband offering into each home, beyond cable and DSL, so compelling, that Xohm will succeed even if Sprint has to sell it to someone who will build it. Craig McCaw, the founder of Clearwire, could buy it and build it. Or a group of investors could.

But it occurs to me that Google, having not bought the 700 MHz C block despite seemingly compelling reasons for doing so, could turn around and snap up Xohm. And for a company that's worth more than twice the entire U.S. car manufacturing industry, stranger things have happened.

Blame for the failed Nanodynamics IPO points to the underwriter

Despite the reasoned conclusion, put forth on this blog, that the failure of Nanodynamics' $100-million IPO on the Dubai exchange may have been the result of poor underlying economics for the company, it appears that the fault may instead lie with the underwriter.

In an investigative piece for Nanotech-Now, Alan Shalleck of Nanoclarity deduces that the lead underwriter, Global Crown Capital Ltd. (GCC), failed to put the underwriting funds into the issuer's (namely, Nanodynamics') account, and that this breach of the rules forced Nanodynamics to voluntarily de-list the stock and return the money to those who had purchased it.

It seems interest in the company was there; all the shares were pre-sold. But under the rules of IPOs, the underwriter has to first buy all the shares and then sell them. According to the Shalleck piece, the problem was that GCC simply never put the $50 million for purchasing the shares into Nanodynamics' accounts.

If this is indeed the case, then, as Shalleck points out, caveat emptor to nanotech startups: make sure you have a good underwriter.

Ode to the Pulsar P2 LED Watch

My refurbished Pulsar P2 "Astronaut" LED watch came in the mail today, an early Xmas gift to myself that I've been anticipating for more than ten years. That's about how long it's been since my dad gave me his old watch, and I've been looking for someone to fix it ever since. A recent fascination with the new crop of LED watches coming out of Japan led me to pull the old P2 out of the bottom drawer of my dresser a couple of weeks ago and renew my search for a repair person …
