Tech Talk

NASA turns 50 today

On 29 July 1958, President Dwight D. Eisenhower signed the National Aeronautics and Space Act. The agency began operations on 1 October of that year, almost a year after the Soviet satellite Sputnik 1 stunned the world.

In its 50 years of existence, NASA has reached many milestones, but its most important accomplishment is probably that it has expanded our horizons. The many pictures from the lunar missions, the Voyager expeditions, and the Hubble Space Telescope have forever changed our view of the cosmos. One picture in particular stands out: the Earth as a frail blue dot, photographed in the darkness above the lunar desert.

To celebrate its anniversary, NASA has just launched a historical image archive that will enthrall space buffs:

http://nasaimages.org/

The BBC has also created a page devoted to NASA's 50th anniversary that has footage of President John F. Kennedy's pledge to reach the moon and Neil Armstrong's historic walk on the moon:

http://news.bbc.co.uk/2/hi/science/nature/7523700.stm

Randy Pausch, Inspirational Computer Scientist (1960-2008)

The lecturer who urged his students to go out and achieve their childhood dreams has succumbed to a disease he fought in the public spotlight. Randy Pausch, a professor of computer science at Carnegie Mellon University, died on Friday, 25 July, of pancreatic cancer. He was 47.

Pausch was diagnosed with the disease in August 2006. A year later, he was told the cancer had spread. Coincidentally, he had already accepted an invitation to speak at Carnegie Mellon in a format called The Last Lecture, in which invitees are asked to ruminate about what they would tell others if they knew they had one last chance to impart some final wisdom. Pausch went ahead with his presentation, despite the fact that his doctors had estimated that he had only a few more months to live.

PHOTO: WIKIPEDIA

THE LAST LECTURER: IEEE Member Randy Pausch brought a love of life and learning to the students he mentored, helping them to achieve their dreams.

On 18 September 2007, Pausch delivered a speech entitled "Really Achieving Your Childhood Dreams" before a packed lecture hall on the Pittsburgh campus. As he approached the podium, he was given a standing ovation from the hundreds in attendance. The word was out regarding his health.

In his talk that day, Pausch urged his listeners to work vigorously to overcome the obstacles life presents, to help others achieve their goals, and to seize the moment, because "time is all you have...and you may find one day that you have less than you think."

As fate would have it, the presentation was videotaped, and thanks to the global reach of the Internet it went viral, reaching millions.

Pausch was approached by a publisher to expand his remarks into a book. The result was The Last Lecture, which became an overnight success (and is currently No. 1 on The New York Times advice best-seller list). That led to appearances on American TV talk shows, from "Good Morning America" to "The Oprah Winfrey Show."

Amid this whirlwind of media attention, Pausch found himself drafted into the role of unofficial spokesperson for people with pancreatic cancer, appearing in public service announcements. In March of this year, he testified before the U.S. Senate Appropriations Subcommittee on Labor, Health and Human Services, advocating increased government funding for cancer research.

A World of His Making

Pausch was born in Baltimore on 23 October 1960. His family moved to Columbia, Md., when he was a boy. The move was fitting, in retrospect: the town was the first fully planned community in the United States, and its design placed a premium on educational resources.

Pausch received his bachelor's degree in computer science from Brown University, in Providence, R.I., in 1982. He earned his Ph.D. in the same field from Carnegie Mellon in 1988. While pursuing his doctorate, he worked briefly in Silicon Valley for Adobe Systems and the Xerox Palo Alto Research Center. In the end, though, he decided that education was his true calling, so he took a teaching position at the University of Virginia's School of Engineering and Applied Science, where he worked from 1988 to 1997, specializing in virtual reality systems and human-computer interaction.

As he told the audience in his Last Lecture speech, Pausch pursued his personal dreams by working for a time for Walt Disney Imagineering and game maker Electronic Arts in California while on sabbaticals.

In 1997, Pausch accepted a position as an Associate Professor of Computer Science, Human-Computer Interaction, and Design at Carnegie Mellon, where he co-founded the university's Entertainment Technology Center. His course Building Virtual Worlds soon became a favorite of computer science students and other undergraduates alike. To help novices grasp the basics of using software to design virtual simulations, he invented the Alice programming environment, an intuitive Java-based 3D scripting tool, which he persuaded Electronic Arts to sponsor as an open-source project on behalf of Carnegie Mellon.

An IEEE member, Pausch received many honors for his work during his short life, including the National Science Foundation Presidential Young Investigator award, a Lilly Foundation Teaching Fellowship, and the Karl V. Karlstrom Outstanding Educator Award from the Association for Computing Machinery (ACM). He published extensively in technical journals, especially those of the ACM and IEEE. He also co-authored the textbook for the language he created, Learning to Program with Alice (Prentice Hall, New York, 2005), along with several other books on software.

Still, it will be his final book, The Last Lecture (Hyperion, New York, 2008), for which he will be most remembered. Its popularity (currently ranked by Amazon as its No. 2 bestseller) will ensure his place among the ranks of writers who have popularized science.

In May 2008, Time magazine named Pausch one of the world's 100 most influential people.

After his Last Lecture presentation last September, a spokesperson for Electronic Arts said the company would honor Pausch by creating a memorial scholarship for women, in recognition of his support of women in computer science and engineering. And Carnegie Mellon has set up an honorary fund in his memory.

Pausch passed away at his home in Chesapeake, Va., last Friday, surrounded by his wife, Jai, and their three children: Dylan, 6, Logan, 4, and Chloe, 2.

Tonight at 10 p.m. ET, the ABC network will present a special documentary on his life, appropriately titled The Last Lecture: A Celebration of Life.

It will undoubtedly include a reference to a line he delivered in his famous presentation: "We can't change the cards we're dealt, just how we play the hand."

Girls as Good as Boys in Math but Lack Interest in Engineering

Contrary to cultural mythology, there are no differences between young men and women when it comes to mastering mathematics. That's the conclusion of a major new study looking at the performance of grade-school students by gender on standardized math tests.

The results of the study, published this week in the journal Science, show that girls now score just as well as boys on the exams. The study, "Gender Similarities Characterize Math Performance," reviewed the annual test results mandated by the No Child Left Behind Act of 2002. With the cooperation of ten states, the researchers were able to compare the performance of more than 7 million children.

Yet, as further reporting shows, societal influences still conspire against young women pursuing careers in technology-dominated fields. And that remains unsettling to educators. Despite the news that young women in high school show the same aptitude for the basics of science and technology, they are not following those career paths at the college level in great numbers. This has been true for decades, and it has not changed as test scores at the lower levels have improved.

In a report last Friday from the Associated Press, we learn that women now earn 48 percent of undergraduate college degrees in math but still lag far behind in physics and engineering. Education researchers the AP spoke with think this discrepancy may be due to faults still built into the grade-school math curriculum.

In looking at their own data, the authors of the Science article, led by Janet S. Hyde, a professor of psychology at the University of Wisconsin, noticed that in most of the states they reviewed, and at most grade levels, there weren't any test questions involving complex problem solving, an ability needed to succeed at the highest levels of science and engineering.

The AP report notes that the U.S. Department of Education recently convened a panel that called for changes in state tests to emphasize the importance of critical thinking in problem solving.

At IEEE Spectrum, we've been following this issue for years (for example, please see Getting Women Into Engineering Still Frustrates from last year). In 2005, we covered an important initiative aimed squarely at getting young women and girls excited about pursuing careers in engineering. As one of our contributors wrote then: "[T]he Extraordinary Women Engineers Project ... is being driven by a nationwide coalition of professional engineering societies, including the American Society of Civil Engineers, the IEEE, and the National Academy of Engineering, as well as universities and technology companies." (For more, please see A League of Extraordinary Women by Prachi Patel-Predd.)

And as recently as a couple of days ago, Editor-in-Chief Susan Hassler blogged about a session she attended at the recent Brainstorm Tech meeting in Half Moon Bay, Calif., sponsored by Fortune magazine, in which a past president of the IEEE called on technology leaders to rededicate their efforts to attracting more young people into engineering by making them aware of "how engineers can make the world a better place."

Hassler's blog entry, Leah Jamieson Talks About Reinventing the Engineer, describes Jamieson, the Dean of Engineering at Purdue University, as challenging educators to move "away from the discipline-by-discipline approach and toward integrated experiences that allow students to appreciate how they'll be able to apply what they're learning."

After all, it's fine to instill facts and formulas in the minds of young men and women, but it's even better to help them understand how things work, and what can be done to make those things work better, by presenting context in a stimulating manner and inviting students to learn by doing. Encouraging young people to try, fail, and try again, without social bias, is an approach that must become fundamental to education at all levels. The lesson applies to both genders, but we must show our girls and young women that we really mean to follow it.

Should We Discount Discounting in Climate Policy?

Discounting of future costs and benefits is used ubiquitously in the evaluation of both private investments and public projects. Though highly technical in practice, the technique would seem (at least at first glance) to rest on a simple, commonplace, and almost undeniable empirical observation. Given a choice between being handed a hundred dollars today and a hundred dollars a year or two from now, wouldn't you rather just take the hundred right now? After all, even if you don't have any immediate use for the money, you can always invest it, so that after a year or two it will be worth, at historic rates of return, $110 or even $120.

But when applied to very long time frames and large, complicated situations with many unknowns or hard-to-knows, the subject of discounting tends to trip up and befuddle the greatest minds. Take climate policy, where the proper approach to discounting has become immensely controversial. In a recent New York Review of Books article, Freeman Dyson approvingly discusses recent work by Yale economist William Nordhaus while criticizing Sir Nicholas Stern, lead author of the British government's monumental 2007 review of climate policy.

Rather inexplicably, Dyson says incorrectly that Stern "rejects the idea of discounting future costs and benefits when they are compared with present costs and benefits." Actually, Stern stands accused not of that but of employing an excessively low discount rate, so that future benefits accruing from costly efforts to prevent climate damage appear bigger in present-day terms than they really should. (The higher the discount rate, and hence the closer it is to the normal long-term rate of return expected on investments, the smaller future benefits will be relative to current expenditures.)

In another recent article on this thorny subject, Oxford University ethicist John Broome correctly juxtaposes Stern's preferred 1.4 percent discount rate with Nordhaus's 6 percent, nicely graphing the implications. But having done that, Broome seems to provide a philosophically incomplete account of discounting, and takes the reader into a thicket of ethical complications and conundrums where we'd really prefer to have a path cleared.
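To make the difference concrete, here is a minimal sketch (in Python; the two rates come from the article, while the $1 million benefit and the time horizons are my illustrative assumptions) of how the present value of the same future benefit diverges under the two rates:

```python
# Present value of a future benefit under two discount rates.
# Rates from the article (Stern: 1.4%, Nordhaus: 6%); the benefit
# size and the horizons are illustrative assumptions, not theirs.

def present_value(future_benefit, rate, years):
    """Discount a benefit received `years` from now back to today."""
    return future_benefit / (1 + rate) ** years

benefit = 1_000_000  # hypothetical climate damage avoided, in dollars
for years in (10, 50, 100):
    stern = present_value(benefit, 0.014, years)
    nordhaus = present_value(benefit, 0.06, years)
    print(f"{years:>3} years out: "
          f"Stern ${stern:,.0f} vs. Nordhaus ${nordhaus:,.0f}")

# At 100 years, the 1.4 percent rate preserves about a quarter of the
# benefit's value; the 6 percent rate preserves roughly 0.3 percent,
# a gap of nearly two orders of magnitude.
```

That two-orders-of-magnitude spread, not any disagreement about climate science, is what separates Stern's urgency from Nordhaus's gradualism.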

According to an authoritative treatment, the idea of discounting goes back two hundred years, to economists writing soon after Adam Smith. As it was elaborated over the following century and a half, mainly by economists of the noted "Austrian school," it came to have two main components: "time preference" (that is, our predisposition to take our pleasure now and put off pain) and diminishing marginal utility (since we'll be richer in the future, added goods will have proportionally less value). In 1937, the young Paul Samuelson, the economist who grounded the whole field in advanced mathematics and turned it into a quantitative science, published a paper gathering the aspects of choosing between present and future values into one rather simple formula, which carried the day. Though Samuelson had reservations about his procedure, reservations our source says further research would validate, his modus operandi was so simple and elegant that it was irresistible, and it became the standard for almost all cost-benefit analysis.
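For readers who want to see it, the formula in its modern textbook form (a reconstruction from the standard account, not a quotation from Samuelson's 1937 paper) is the discounted-utility model, and the two components above recombine in the so-called Ramsey rule for the discount rate:

```latex
% Discounted-utility model (modern textbook notation)
U = \sum_{t=0}^{T} \frac{u(c_t)}{(1+\rho)^{t}}
% u(c_t): utility of consumption c_t in year t
% \rho: the pure rate of time preference

% Ramsey rule for the consumption discount rate
r = \rho + \eta g
% \eta: elasticity of marginal utility (how fast added riches pall)
% g: expected growth rate of consumption
```

As commonly reported, Stern's contested 1.4 percent comes from setting the pure time preference near zero (about 0.1 percent) and taking an elasticity of 1 with growth of about 1.3 percent; Nordhaus's 6 percent follows from parameters much closer to observed market returns.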

Broome, writing in the June issue of Scientific American, for some reason leaves pure time preference out of his account of discounting; perhaps he reasons that personal feelings about present and future happiness have no philosophical standing? Instead he focuses strictly on issues of marginal utility, arguing in essence that the well-being of future generations should not be heavily discounted relative to our own. That attitude is consistent with Stern's but leads, Broome himself concedes in a sidebar, to some bizarre considerations:

"If humanity becomes extinct or the human population collapses [as a result of climate change], vast numbers of people who would otherwise have existed will not in fact exist. The absence of so much potential humanity seems an overwhelmingly bad thing. But that is puzzling. If nonexistence is a harm, it is a harm suffered by nobody, since there is nobody who does not exist."

Do we really need to go there? Perhaps not. Last year, in a respectful but critical review of the Stern report, Harvard economist Martin L. Weitzman suggested that some of its key conclusions (its calls for rather aggressive and expensive action to constrain greenhouse gas emissions now) could be better justified by an insurance argument than by the usual Samuelsonian analysis, in which all projected costs and benefits are calculated to reveal an optimal consumption path.

Now, in an even more technical follow-up paper, Weitzman argues (if I'm following him correctly) that in situations involving very improbable but distinctly possible catastrophes, standard cost-benefit analysis is crippled not only by disagreements about discounting but also by deep uncertainties in the assessment of the consequences themselves. That is, if we can't really know how much damage could result from atmospheric greenhouse gas concentrations reaching, say, more than twice their pre-industrial level, how can we even begin to assess the net present value of that damage?
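A toy simulation conveys the flavor of that argument (my illustration, not Weitzman's actual model): with a thin-tailed damage distribution, averaging over many scenarios settles on a stable expected loss; give the distribution a fat tail and the average never settles, because it is dominated by rare, enormous draws:

```python
# Toy contrast between thin-tailed and fat-tailed damage estimates.
# An illustration of Weitzman's point, not his model: the distributions
# and parameters here are arbitrary assumptions.
import random

random.seed(42)

def mean_damage(draw, n):
    """Average of n simulated damage draws."""
    return sum(draw() for _ in range(n)) / n

thin = lambda: random.gauss(mu=10.0, sigma=3.0)  # well-behaved damages
fat = lambda: random.paretovariate(alpha=1.1)    # heavy-tailed damages

for n in (1_000, 10_000, 100_000):
    print(f"n={n:>7}: thin {mean_damage(thin, n):6.2f}   "
          f"fat {mean_damage(fat, n):8.2f}")
# The thin-tailed average settles near 10 almost immediately; the
# fat-tailed average stays erratic even at large n, because a handful
# of extreme draws dominate the sum.
```

If the true distribution of climate damages looks anything like the second case, a single "expected cost" figure, let alone its discounted present value, is a shaky foundation for policy.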

Weitzman's work appears to be a mathematical elaboration of arguments put forth several years ago by the conservative Chicago jurist Richard Posner, who argued (1) that it would be worth taking expensive action to reduce the probability of even a very improbable catastrophe, if the catastrophe is big and bad enough; and (2) that uncertainties about the catastrophe's likelihood and effects, rather than undermining the case for action, support it. That is, by symmetrical statistical reasoning, the higher the probability that climate change might be less severe than generally expected, the higher the probability that it might also be even worse.

Leah Jamieson Talks About Reinventing the Engineer

Leah Jamieson, past IEEE president and Dean of Engineering at Purdue University, addressed the plenary session at Fortune's Brainstorm Tech meeting in Half Moon Bay, California, on Wednesday.

Her call to action was threefold (attract, educate, manage) and included:

The challenge for all of us: Changing the perception of engineering and other technology professions. Making sure that young people understand how engineers can make the world a better place.

The challenge for educators: Changing the way engineering and other technology disciplines are taught. Moving away from the discipline-by-discipline approach and toward integrated experiences that allow students to appreciate how they'll be able to apply what they're learning.

The challenge for industry: Changing entry-level positions. Making these jobs stimulating and rewarding so newly minted engineers don't flee to other industries.

Professor Jamieson knows whereof she speaks. She is co-founder and past director of the Engineering Projects in Community Service (EPICS) program. Under that program, teams of undergraduates earn academic credit for multi-year, multidisciplinary projects that solve engineering- and technology-based problems for community service and education organizations. She and EPICS colleagues Edward J. Coyle and William C. Oakes were awarded the U.S. National Academy of Engineering's 2005 Bernard M. Gordon Prize for Innovation in Engineering and Technology Education. She's also received the NSF Director's Award for Distinguished Teaching Scholars, been inducted into Purdue's Book of Great Teachers, and been named Indiana Professor of the Year by the Carnegie Foundation and the Council for the Advancement and Support of Education.

The IEEE and IEEE Spectrum Attend Fortune's Brainstorm Tech

I've just returned from Fortune magazine's Brainstorm Tech conference, an invitation-only meeting held this past week in Half Moon Bay, California. Chaired by Fortune's tech guru David Kirkpatrick, the conference had an audacious goal: "to sharpen the thinking of attendees about the escalating impact of tech-driven change for all business and global society."

The IEEE helped sponsor this high-profile event and was well represented among the 200 or so tech movers and shakers in attendance.

Members participating included Rodney Brooks, MIT professor and CTO of iRobot; Vint Cerf, Google VP and Chief Internet Evangelist; John Chen, CEO and President of Sybase; Martin Fowler, Chief Scientist of ThoughtWorks; Seth Goldstein, CEO of SocialMedia Networks; Leah Jamieson, Dean of Engineering at Purdue University and Past President of the IEEE; Vyomesh Joshi, Executive VP, Imaging and Printing Group, HP; Pradeep Khosla, Dean of Engineering at Carnegie Mellon University; Jeff Kowalski, CTO, Autodesk; Victor Lawrence, Professor, Stevens Institute of Technology; Dave Morgan, Former Executive VP of Global Advertising Strategy, AOL; Vineet Nayar, CEO, HCL Technologies; Frank X. Shaw, President, Microsoft Accounts Worldwide, Waggener Edstrom Worldwide; David Sze, Partner, Greylock Partners; Ted Vucurevich, CTO, Cadence Design Systems; and Jody Westby, CEO, Global Cyber Risk, and Carnegie Mellon University Distinguished Fellow.

Over the next few days I'll tell you about some of the ideas and discussions that emerged from the brainstorming event.

But first, a little segue over to the personal submarine brought along to the meeting by our fellow sponsor, U.S. Submarines.

The Triton Model 1000 is a two- or three-person acrylic bubble-topped submersible that can travel to depths of 1000 feet for up to 6 battery-powered hours. The subs retail at US $1.8 million; the company has sold one so far and has a contract in hand for a three-seater version. You'll need a yacht to launch it from, or perhaps a serious piece of waterfront property with a good-sized dock. U.S. Submarines will train you to handle the Triton 1000, and as long as you're not bringing paying guests on board, you don't need a formal license to run it.

U.S. Submarines, however, is building these not only for wealthy individuals but ultimately to transport guests to its underwater resort, called Poseidon, which is supposed to be built off an island near Fiji.

Poseidon is currently in the final design stages, the company says; you can sign up on its website to be contacted when it starts taking guest reservations.

I'm not sure this is exactly what the organizers had in mind when they wanted us to think and talk about technology's impact on society, but this is a very cool transport device, and it's sure to rock the world of a select group of people.

Out of Africa: light and dark visions of text-messaging

An article of mine in The New York Times last Sunday on the emergence of underground computer geeks in Nairobi, Kenya, flushed out -- for me at least -- a neglected movement of scholars studying the effect of the mobile on African societies in particular and developing countries in general.

Of the varied research that's been brought to my attention, probably the most striking is a series of studies by two researchers at the Georgia Institute of Technology's Human-Centered Computing Program in Atlanta. The studies, which compare how people in Nairobi and Atlanta use information technology, are funded by Intel Corp. and supported by one of the company's researchers in Berkeley, Calif. Rather surprisingly, the trio of researchers is examining how religious behavior is influenced by new information technologies -- and how it influences them.

The big idea in the paper, published this spring in an ACM publication, is that in Nairobi, where evangelical Christianity has made deep inroads among middle-class and educated people, the mobile phone and the Internet are viewed as tools that improve the quality of one's religious experience. As the authors write of their Nairobi subjects, "When asked if they used technology to stay focused on their faith, participants answered with stories about using computers, software and mobile phones to do so."

One especially notable finding: text messaging is being used "to send and receive prayer requests."

Who knew that the devout could text for Jesus or praise the Almighty by tapping on a tiny Nokia keyboard?

The authors -- Susan Wyche and Rebecca Grinter of Georgia Tech and Paul Aoki of Intel -- stick too closely to the technological experience, in my view. In explaining the attraction of prayer through text messaging, they don't examine the role of Safaricom, Kenya's leading cell provider, in promoting the mobile phone as the ultimate expression of urban sophistication.

And then there's the profit motive. Text messages, after all, aren't free, so the explanation for their use by evangelicals might instead be found in the rampant commercialism of some strains of African Christianity.

The very same practice of text messaging by Nairobi denizens -- four times as many of whom have mobile phones as have bank accounts -- is the subject of another recent paper, this one forwarded to me only this morning by Oxford University's David Anderson, a leading historian of Africa who also directs the university's African studies center. Written by Oxford student Michelle Osborn and published in June in the Journal of Eastern African Studies, the paper documents the role that text messaging played in the post-election protests in Kenya earlier this year.

Many scholars have found that the mobile phone has contributed to democratization in African countries (and elsewhere in the developing world) by helping protesters evade government repression, organize demonstrations, and broadcast their message internationally. In the Kenya case, however, Osborn found some negative fallout from rampant text messaging during the disturbances, some of which were violent and resulted in the deaths of hundreds of people.

Text messaging, Osborn writes, "brought a new, unpredicted dimension" to the conflict between supporters of rival political parties, both of whom wanted their man to be elected president. "Politicians utilized rumors and SMS texts to galvanize supporters into collective action," some of which was ugly.

The mobile phone and the Internet are undeniably forces for good. But less positive outcomes are co-evolving alongside the welcome ones. As scholars devote more effort to studying these new technologies, their social effects -- and perhaps even how to design the devices more effectively -- should become better understood.

Industry Group Backs New Wireless HDTV Scheme

A consortium of high-definition TV makers has announced its support for a new technology for wirelessly connecting its products. The proposed system, called the Wireless Home Digital Interface (WHDI), will use a proprietary chip set and communications algorithms from Amimon Inc., a startup headquartered in Herzliya, Israel. The WHDI scheme is a variant of the proposed IEEE 802.11n standard.

Hitachi Ltd., Motorola Inc., Samsung Electronics Co., Sharp Corp., and Sony Corp. said yesterday that they will serve as promoter-level members of the special interest group. They intend to develop an upgraded set of specifications for the technology by the end of the year and then work for its acceptance as a new standard.

Amimon claims the WHDI spec will provide wireless connectivity at ranges of up to 100 meters for uncompressed high-def video within a home network. The company said in a press release yesterday that its system relies on a video modem operating in the unlicensed 5-gigahertz band to deliver wireless video and audio with less than 1 millisecond of latency. It said WHDI signals will travel through walls and other obstructions into multiple rooms and will work with a variety of future consumer electronics.
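Some back-of-envelope arithmetic shows why "uncompressed high-def" is such a demanding claim (a sketch; the 1080p60, 24-bits-per-pixel figures are my assumptions, not numbers from Amimon's release):

```python
# Rough throughput needed for uncompressed HD video.
# Assumptions: 1080p resolution, 60 frames/s, 24 bits/pixel, no blanking.
width, height = 1920, 1080
frames_per_second = 60
bits_per_pixel = 24

bits_per_second = width * height * frames_per_second * bits_per_pixel
print(f"~{bits_per_second / 1e9:.1f} Gbit/s")  # ~3.0 Gbit/s
```

Sustaining several gigabits per second through walls in the crowded 5-GHz band is why a purpose-built video modem, rather than a stock 802.11n radio, sits at the heart of the proposal.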

"The development of the new standard will ensure that when consumers purchase CE devices and take them home, they will enjoy a fast, easy, and hassle-free wireless connection that delivers the highest quality," said Yoav Nissan-Cohen, Amimon's chief executive officer. "The WHDI standard's objective is to enable an enriched customer experience with multi-vendor interoperability."

The company has created a site on the Web to promote the WHDI initiative.

According to an EETimes report, the WHDI group faces competition from a number of challengers. For example, another startup, SiBeam, already offers wireless high-def home networking in the 60-GHz band, though currently only at distances of up to 10 meters. SiBeam, based in Sunnyvale, Calif., has gathered an impressive roster of supporters as well, including Intel Corp., LG Electronics, Matsushita Electric, NEC Corp., Samsung, Sony, and Toshiba Corp. (some companies are backing both proposals). However, this consortium's effort, called WirelessHD, has yet to gain much traction in the market or in the standardization process.

It is still early in the race to eliminate cables from consumer electronics in the home, but with this news the competition just got considerably more interesting.

When is a terabyte not a terabyte?

Consider the following exchange from the generally excellent public radio show On The Media.

BROOKE GLADSTONE: Mathematician Martin Wattenberg observed in Wired that the sum total of all the words you'll hear in your lifetime amount to less than a terabyte of text. So then how much is a petabyte?

CHRIS ANDERSON: A petabyte is, mathematically it is, you know, 1,000 terabytes, but we have a hard time understanding that scale. We usually use the sort of, you know, the Library of Congress, as an example. The Library of Congress is sort of, you know, on the - you know, on a couple of terabytes scale; a petabyte's a thousand of those.

We've never seen petabyte scale data aggregations before. There's never been anything like that because we're still relatively early in, you know, the digital age. But Google has just hit that state. Google processes about a petabyte of information every 72 minutes, and a year from now it'll process a petabyte every half an hour, and so on.

There's a category mistake here to the tune of at least two, maybe three or four, orders of magnitude; it's an awful lot like mixing little-c calories and big-c Calories in the same sentence.

Consider a simple document containing the words "hello world" - 11 bytes of information, by a fairly straightforward way of counting them. The same words took up 19 456 bytes when put into a Microsoft Word document. Take an image-file snapshot of it, and it might blow up to a 76 468-byte file, as it did when I used the Grab utility a few minutes ago.
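It's the container, not the content, that does the inflating. A minimal sketch (pure Python standard library; the exact figures on your machine will differ from the Word and Grab numbers above) makes the point:

```python
# How the same 11 bytes of text grow with encoding and packaging.
import io
import zipfile

text = "hello world"
print(len(text.encode("ascii")))   # 11 bytes: the words themselves
print(len(text.encode("utf-16")))  # 24 bytes: 2 bytes/char plus a byte-order mark
print(len(text.encode("utf-32")))  # 48 bytes: 4 bytes/char plus a byte-order mark

# Wrap the text in a zip archive (a modern Word .docx is zipped XML):
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w") as archive:
    archive.writestr("document.txt", text)
print(len(buffer.getvalue()))      # ~130 bytes, mostly container overhead
```

Layer on styles, fonts, revision history, and embedded preview images, and 11 bytes of prose becomes 19 456 bytes of file without anyone typing another word.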

When we measure the Library of Congress, we tend to make successive estimates of the number of books, pages, words, and, finally, bytes, taking as a rough measure one byte per character. The makers of these estimates generally look for the smallest possible number, and I suppose we should be grateful they don't subject the result to some ZIP-like compression.
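The estimate is easy to reproduce. Here's a sketch with the usual round numbers (every figure below is an assumption for illustration; published Library of Congress estimates vary):

```python
# Back-of-envelope: the Library of Congress as plain text.
books = 30_000_000       # rough count of books (assumption)
pages_per_book = 300     # assumption
chars_per_page = 2_000   # at roughly one byte per character

total_bytes = books * pages_per_book * chars_per_page
print(f"~{total_bytes / 1e12:.0f} TB of bare text")  # ~18 TB

petabyte = 1e15
print(f"1 PB ~ {petabyte / total_bytes:.0f} such libraries")  # ~56
```

Whether the right text-only figure is 2 terabytes or 18, the point stands: a petabyte of plain text would be dozens of such libraries.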

When we consider Google's petabytes, we're presumably looking at the number of bytes its spiders crawl through, or the bytes on its millions of hard disks, throwing video, audio, JPEGs, and PDFs in with the text. Indeed, a lot of simple text gets counted in terms of the HTML pages it resides on. For comparison purposes, "hello world" is a cool 2500 bytes when you let Microsoft Word turn it into a .htm file.

Then there's the question of information, and Information. The Library of Congress is filled mainly with books. That is, it contains words that have been carefully thought out, then written, then vetted by a publisher, then pushed out into the world at great expense, with the expectation that they will be useful and interesting to thousands, often millions, of people, across several decades.

Google's cache, on the other hand, is filled with MySpace diary entries, LOLcat images, videoclips of Jon Stewart, and millions of copies of Abba songs. Compress those terabytes down to their truly useful and interesting elements, and you have, well, not much more than the 11 bytes of "hello world."

More memory devices masquerade as jewelry

Like many women, I've looked at a piece of my jewelry and thought, "Well, it's not that gorgeous, but I hang on to it for its memories."

Next time I say that, I might not be talking figuratively. Because yet another wearable USB memory stick is about to hit the market.

I've made fun of USB jewelry in the past, like the Swarovski crystal pendants introduced by Philips at this year's Consumer Electronics Show. But it could be turning into a trend. This week Super Talent Technology introduced the Pico-C Gold, a 24-karat gold-plated, 8-GB, $40 USB necklace. It's not bad looking, kind of high tech with a touch of 1980s fern bar. At least it doesn't try to hide the fact that it is, indeed, a memory stick.

Maybe I'll get one. For the memories.
