Tech Talk

Hot Town

Appropriately enough, the three weeks that the song "Light My Fire" spent as the #1 song on the Billboard Hot 100 in 1967 were the dog days of that summer. That phrase, by the way, refers to the ancient Romans, who noticed that Sirius rose with the sun from July 3 to Aug. 11. As the major star of the "Big Dog" constellation, Sirius is often called the "dog star." It's the brightest star in the nighttime sky. The Romans assumed that the two stars were acting in league to create the "days of great heat."

I was reminded of Light My Fire by an email from frequent Spectrum contributor Kieron Murphy. Another contributor, Brian Santo (author of our popular May 2009 feature, "25 Microchips That Shook the World" and its hilarious backstory sidebar, "Where in the World Wide Web Is Al Phillips?"), who also received the email, responded:

Even though I was still in elementary school in 1970, I had an intuitive grasp of what was going on with The Beatles, and The Stones, and Hendrix and Joplin and many of the other great artists I heard on the radio. But I never really got The Doors. More specifically, I never really got Morrison. I understand Krieger and Manzarek showed sparks of brilliance. I understand that Morrison's Lizard King schtick was dangerous/sexy. But IMHO, the man was a diffident poet/writer at best, and if he hadn't died (let's assume, for now, that he's really in the grave in Paris that I have actually visited) he would have been off the charts for years and doing "This is not your father's Oldsmobile" commercials with a couple of kids he'd been legally forced to adopt.

It's revealing that Brian would refer to the radio, and it's funny that he would picture the song in, of all things, a car commercial. According to Wikipedia,

when Buick wanted to buy the piece for use in a 1968 TV commercial ("Come on, Buick, light my fire") and Morrison, who had been out of town, learned that other group members agreed, Morrison called Buick and threatened to have a Buick smashed with a sledgehammer on a TV show should the (presumably ready) commercial be aired.

I'm a little older than Brian - not a lot, but perhaps just enough to feel very differently. I was sitting with a friend in a pizza place on 37th Road in Jackson Heights, Queens, my first week of 7th grade, when I first heard "Light My Fire" on the radio. The pizzeria was, literally, ovenlike, the pizza was thin, blistering, and delicious; the time was one of those proverbial fry-an-egg-on-a-New-York-sidewalk afternoons; the song was just as fiery hot and yet slow and lyrical; it was clearly about sex, something that, as a twelve-year-old, I was coming to understand the importance of, if I didn't quite understand it itself; the lyrics were kind of silly but the melody was big and ballad-like and beautiful and it went on forever — I had never heard a seven-minute-long song on the radio, and as the keyboard solo gave way to the guitar solo it seemed impossible to believe it was still the same song playing. It was, I now realize, opulent and yet not in the least self-indulgent. In the pizzeria, my friend and I both stopped talking somewhere during the guitar solo and just listened.

In a 13-minute radio story in 2000, NPR reporter Guy Raz said The Doors "broke the mold of the conventional hit pop song when Light My Fire went to the top of the charts."

Light My Fire clocked in at just over seven minutes. No one in the music industry believed it could work at that length.

John Densmore, the Doors drummer, told Raz, "In those days, if you wanted to be on AM radio, you had to be at three minutes." Raz says the band cut out the solos and "whittled it down to three minutes. But fans who owned the album swamped radio stations with requests for the full seven-minute version."

I wasn't one of those album owners — not yet. Sixteen months later, I turned 13, and of the three birthday albums I got from my friends, "The Doors" was the only one I had requested. The other two were Cream's "Wheels of Fire" and the Beatles' White Album — by then the psychedelic movement was in full sway, led by The Doors' eponymous album and the Beatles' Sgt. Pepper's Lonely Hearts Club Band. Sure, Sgt. Pepper was the first rock album to win Album of the Year at the Grammy Awards, and Rolling Stone magazine has named it the greatest album of all time. But "Light My Fire" was the song that changed radio forever. Guy Raz again:

No one had ever heard a song like it - seven minutes, free-form, psychedelic, Light My Fire was dark and brooding, haunting and romantic, at the same time. The song is a demarcation point in rock 'n' roll history. It shattered the acceptable boundaries of popular music. Themes of love, mortality, intoxication, and recklessness. All offer a glimpse into the turbulent era that was to come soon after its release.

With satellite radio, digital radio, and podcasts, radio is metamorphosing today, as it did in the late 1960s. The changes today are technological, though, while back then, as AM gave way to FM, radio — and music itself — became both more personal and more political.

What's different between now and then is how important radio was - more important, to music at least, than television or any other medium. (When the Doors or the Beatles appeared on Ed Sullivan, it certified a popularity that had been created by radio.) In 10 or 15 years, surely all radio programming will be delivered by the Internet, which will be given the AM and FM frequencies. It seems odd to think that once, the term "wireless" was synonymous with AM radio, and that the two leading communications technologies at the time of the seminal U.S. Communications Act of 1934, radio and telephony, will be digital afterthoughts, little more than a small fraction of the packets riding the TCP/IP radio waves.

For me and my friends of forty years ago, our favorite DJs defined more than our musical tastes; they helped us think about drugs and sex, philosophy and fashion, war and patriotism. They sometimes set the very calendar we lived by. I remember how, well into the 1970s, each year my friends and I would wait impatiently for the first hot late-spring day. The radio would be set to 102.7. WNEW-FM's afternoon DJ Dennis Elsas would come on the air and play The Lovin' Spoonful's "Summer in the City," and so would begin the dog days of summer.


Hot Chips for Games

IMAGE CREDIT: Wikimedia Commons

I'll be covering the 21st annual Hot Chips conference for the next couple of days.

Hot Chips is an industry nerd-off that brings together designers and architects of high-performance hardware and software amid the Spanish colonial architecture and rarefied air of Stanford University every August. The logic-heavy PowerPoints are interspersed with a few keynotes to remind everyone what’s at stake in all these mind-numbing comparisons of SIMD vs. MIMD.

One of the big ideas this year appears to be the future of gaming. On Tuesday, Intel’s Pradeep Dubey will chair a keynote presented by Electronic Arts chief creative officer Rich Hilleman.

When I first saw the title of the keynote, “Let's Get Small: How Computers are Making a Big Difference in the Games Business,” I pinched my arm because I thought for a second I was experiencing some horrible Life on Mars-style delusion/time travel. Computers making a big difference in the games business? Oh, you think so, doctor!

But it turns out to be more complicated than the title indicates. As usual, it all goes back to Moore’s Law.

Moore’s Law says that as transistor size keeps shrinking, more of them can be squeezed onto a given area of silicon, and as long as the price of that silicon stays roughly the same, those transistors just get cheaper as they get smaller. That means a given amount of processing power gets cheaper too. That’s why you can get so much computing for ever-decreasing amounts of outlay.
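Here's a minimal sketch of that arithmetic, with made-up numbers (the wafer cost and starting transistor count below are illustrative assumptions, not real foundry figures):

```python
# Toy model of Moore's Law economics: density doubles each generation while
# the cost of a wafer's worth of silicon stays roughly flat, so the cost per
# transistor halves. All numbers are illustrative assumptions.

wafer_cost = 5000.0     # assumed constant cost for a wafer's worth of silicon ($)
transistors = 100e6     # assumed transistor count per chip area at generation 0

for generation in range(5):
    cost_per_transistor = wafer_cost / transistors
    print(f"gen {generation}: {transistors:.0f} transistors, "
          f"${cost_per_transistor:.2e} per transistor")
    transistors *= 2    # density doubles: same silicon, twice the transistors
```

Run it and the cost per transistor halves every generation while the silicon outlay stays flat.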

These days you can get the kind of processing power in your mother’s basement that, just 10 years ago, was reserved for the Crays and Blue Genes and other monstrosities available only to government research facilities.

However, the cost of developing a PC or console game increases exponentially alongside Moore's Law. David Kanter, my go-to guru at Real World Technologies, explained it thus:

Moore's law says transistors double in density roughly every 18 months. Graphics performance is perfectly parallel with Moore’s law, which means graphics performance, too, roughly doubles every 18 months.

And when graphics performance doubles, you need higher-resolution artwork to render in a game. At that point, frame rates over 60 fps aren’t helpful; what you really want is more detail and new effects to wow your gamers.

And if you want that higher-resolution artwork, you need to hire more artists. That rule also tracks with Moore’s law—more and more artists are necessary for each generation of chip.

The upshot is that the cost of developing artwork scales with transistor counts for GPUs, which are themselves driven by Moore's Law. This means that the cost of big-name games—like Grand Theft Auto, Quake, Doom, and the like—increases exponentially. That's a big problem for developers whose pockets are shallower than EA’s.
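To make that scaling concrete, here's a toy projection under the doubling-every-18-months assumption; the starting transistor count and art budget are invented for illustration, not figures from Kanter or EA:

```python
# Toy projection of the argument above: if GPU transistor counts double every
# 18 months and art costs track the extra renderable detail, the art budget
# doubles each generation too. Starting values are invented assumptions.

gpu_transistors = 700e6   # assumed GPU transistor count at year 0
art_budget = 10e6         # assumed art budget for a big-name title at year 0 ($)

for year in range(0, 10, 3):          # step forward two 18-month doublings at a time
    print(f"year {year}: {gpu_transistors / 1e9:.1f}B transistors, "
          f"art budget ${art_budget / 1e6:.0f}M")
    gpu_transistors *= 4
    art_budget *= 4                   # art cost assumed to scale with transistor count
```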

And that is one reason the games market for phones is exploding. For these little rinky-dink displays (iPhone = 480 x 320 pixels; my phone = 220 x 176), development costs are so low compared with those for a PC or console title that anyone can make one (maybe even in their spare time).

Back in March, at the 2009 Game Developers Conference, ex-EA developer Neil Young (founder and CEO, ngmoco) delivered a keynote called “Why the iPhone just changed everything.”

He said that the iPhone--and the class of devices it represents--is a game changer on the order of the Atari 2600, Gameboy, PlayStation One, or Wii. He predicted that the iPhone will “emerge a gaming device as compelling as existing dedicated handheld game devices.”

Kanter suspects that at Tuesday's keynote, EA and Intel may discuss how ray tracing could make developing artwork easier and less expensive. Thoughts? Comments? Predictions? Do you buy the idea that gaming is splitting the world into an empire of AAA game development vs. a rebellion of mobile phone developers?

 

Business Articles on Nanotechnology Take on a Familiar Formula

I am simultaneously amazed and concerned when I read mainstream publications tackling the issue of business and nanotechnology. Last month we had the NY Times informing us that things were looking up for the commercialization of nanotech because it seemed industry and academic research centers were beginning to team up. What a novel idea.

Of course, this penetrating analysis followed the Grey Lady’s previous prediction over 18 months earlier that nanotech was going to finally experience its long-awaited boom with a series of IPOs…that never came.

But the latest bit of business journalism I’ve read on nanotech comes from one of the NY Times’ subsidiary publications, the Boston Globe. What is fascinating about this one is how it reads like a “Mad Lib” for nanotech articles formulated in 2001.

We get insights like this: the nanotechnology market in 2015 will be worth (fill in the blank with a number) trillion dollars. Then they even find ways to throw in all the favored terms used in 2001, like “nanobots,” or size definitions such as a “nanometer, equal to one-billionth of a meter.”

So, why am I amazed and concerned at the same time? I am amazed because we can still read articles that essentially repeat articles written nearly a decade before, or that have such a flimsy grasp of the mechanics of commercializing emerging technologies that they present industry/lab partnerships as an innovative idea. And I am concerned because I also read business articles from publications like these on topics I know far less about than nanotech. Should I be worried? I think maybe, yes.

DNA Scaffolding Technique Promises Sub-22 nm Lithography

In a paper to be published in next month’s Nature Nanotechnology, researchers at IBM’s Almaden Research Center and the California Institute of Technology describe a way to use DNA origami structures as a quasi circuit board, or scaffold, for precisely assembling components at resolutions as small as 6 nm.

The attractiveness of the process is that it works with lithography techniques already in use today. Spike Narayan, manager of Science & Technology at the IBM Almaden Research Center, is quoted in the IBM press release:

“The cost involved in shrinking features to improve performance is a limiting factor in keeping pace with Moore’s Law and a concern across the semiconductor industry,” he says. “The combination of this directed self-assembly with today’s fabrication technology eventually could lead to substantial savings in the most expensive and challenging part of the chip-making process.”

The BBC’s coverage of the same story followed Narayan’s quote above with the rather sobering reality that it could take as long as 10 years to see this technology integrated into the semiconductor industry.

Whenever you see the figure “ten years” used in future projections, you could just as easily add another zero to that number. It’s sort of like saying, “Who knows?”

Give Social Networking the Finger

Fingerprint authentication isn't just for security anymore. Authentec makes fingerprint sensors for enterprise computers, and its main clients have until recently been the military and companies that really need to keep their laptops secure.

In my last post about Authentec, I swooned about how they go the extra mile to protect you from finger-truncating impersonators and eyeball-gouging identity thieves. (The company doesn’t simply use a picture of the top layer of the skin; it uses radio frequencies to measure the valleys and ridges of the fingerprint beneath the outer layer of skin, or within the live layer. Because they’re measuring these RF fields within that live layer, a finger that has been separated from its owner won’t work in setting up that first RF field when a user contacts the sensor. Without the attached owner, there's no pattern and the finger is no good.)

Today, Authentec announced that it's putting those military-grade fingerprint sensors into netbooks. Nothing says top secret like a fluffy little netbook, right? It's the king of consumer-only applications, a cross between a lightweight laptop and a big-screen iPhone.

Here's where the fingerprint sensor goes to work for consumer netbooks. Instead of protecting your identity a la The Bourne Identity or Angels & Demons, in your netbook the sensors take on a completely different role: putting your fingerprints to work for more mundane tasks.

It's not just the one fingerprint that distinguishes you. The sensor easily differentiates among your ten fingerprints. The company's software (called TrueSuite) lets you assign different fingers to different functions, including accessing Facebook or Twitter accounts, or your email. The program can even condense processes that would normally take multiple steps into the swipe of a single finger.

For example, say you want to log into your Facebook account. Normally, you wake up your sleeping, locked laptop, type in your OS's password, open your browser, navigate to Facebook, and type in your username and password.

With the fingerprint sensor, you skip four of those five steps. Instead of doing any of the above, you swipe your designated finger. The software reads your finger and takes care of the rest. You set how it reacts to the swipe of each of your ten fingers: open Gmail, Facebook, Twitter, Flickr, Picasa--all you have to remember is what job you gave which finger.
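To picture how such a mapping might look, here's a hypothetical sketch; the table, function, and URLs are invented for illustration and have nothing to do with TrueSuite's actual configuration format or API:

```python
# Hypothetical finger-to-macro table in the spirit of what TrueSuite does.
# Everything here is invented for illustration; it is not AuthenTec's real API.

FINGER_ACTIONS = {
    "right_index":  ["unlock", "open https://www.facebook.com", "autofill login"],
    "right_middle": ["unlock", "open https://mail.google.com", "autofill login"],
    "left_index":   ["unlock", "open https://twitter.com", "autofill login"],
}

def on_swipe(finger_id: str) -> None:
    """Run the macro assigned to the recognized finger, one step at a time."""
    for step in FINGER_ACTIONS.get(finger_id, ["unlock"]):
        print("executing:", step)   # stand-in for actually performing the action

on_swipe("right_index")   # one swipe replaces the five manual steps above
```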


Authentec is also working on the LED lights that surround the sensor, which glow to give you a notification. Normally these would have limited use: if you swipe the wrong finger or you're the wrong person, you get a blinking red light; if you do it right, you get a green light. But the Authentec people have devised a few new uses for these LEDs. You can set your own colors the same way you set the actions for your fingers.

Say you’re taking some time out of your busy schedule for an important episode of Walker: Texas Ranger. Your laptop has long since gone to sleep and locked itself. To find out if you have mail, you’d normally have to stand up (all the way!), walk across the room (nooo...) and wake up and unlock your computer. That could take up to 10 seconds! But, with this app, you can glance across the room and see that you have a red flashing LED, which means there is a message waiting from your boss, or a blue flashing light indicating a note from your mother. Granted, you still have to move the muscles that control eyeball directionality, but there’s no such thing as a free lunch.

This is probably the best thing that could have happened to fingerprints. I think it’s not such a bad idea to take fingerprint authentication out of highly secure environments and repurpose it for more mundane applications. Fingerprints will become almost meaningless as a security measure within less than 20 years.

The two main things that will undermine security at every turn:
1) Poor administration. Read this Slashdot post to understand why: biometrics are just databases, and databases need to be securely and competently administered.

It's too difficult to manage a 2000 or even 200 member authentication database. The simplest administration is just not done because it is tedious or takes too much time. ... You have the human being that lets everyone into the building, security guards that think you work there because they've seen you before, meeting rooms filled with all-open network connections and a bunch of people that write down their password on a sticky note, even if it's as simple as their husband's name, brand of monitor or keyboard or something else.

2) Time. The younger you start, the less secure your fingerprints will inherently become: "Many people are trying to regard biometrics as secret but they aren't. Our faces and irises are visible and our voices are being recorded. Fingerprints and DNA are left everywhere we go and it's been proved that these are real threats." Slashdotter Kadin2048 commented that

The fact that you can't change your fingerprints is a real problem if they start to use biometric systems for authentication. Particularly since there are biometric-ID systems used by children: in my area, they're currently testing and preparing to roll out a school-lunch system that uses fingerprints (it's a debit system -- no more stolen lunch money, and no way to tell who's on the subsidized lunch program or not). When you start using biometrics that young, you have a long time for them to possibly get compromised and spoofed.

The fingerprints you have, you own for life, so any system has to be built on the assumption that they will be compromised. In particular, future systems should be built knowing that people are going to come in who've already had all 10 of their fingerprints compromised. The solution isn't to just come up with more biometric identifiers to use as secrets; the solution is to not use them as secrets at all.

Biometric identification can be used for convenience or for security, but it's probably best not to try both.
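One way to sketch that distinction, purely as a design illustration rather than any shipping product's scheme: let the fingerprint identify the account, and let a revocable secret do the authenticating.

```python
# Sketch of "biometrics as identifier, not secret": the fingerprint template
# only selects the account, like a username; a revocable passphrase is still
# the thing that authenticates. Names and data are invented for illustration.
import hashlib, hmac

ACCOUNTS = {
    "fingerprint_template_42": {   # template acts as an identifier, not a secret
        "user": "alice",
        "secret_hash": hashlib.sha256(b"alice's revocable passphrase").hexdigest(),
    },
}

def authenticate(template_id: str, passphrase: bytes) -> bool:
    account = ACCOUNTS.get(template_id)
    if account is None:
        return False
    candidate = hashlib.sha256(passphrase).hexdigest()
    # The comparison guards the passphrase, not the fingerprint: an attacker
    # who copies the fingerprint template still doesn't get the account.
    return hmac.compare_digest(candidate, account["secret_hash"])

print(authenticate("fingerprint_template_42", b"alice's revocable passphrase"))  # True
print(authenticate("fingerprint_template_42", b"guessed password"))              # False
```

Lose the passphrase and you can issue a new one; lose the fingerprint template and, in this design, you've only lost a username.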

Telcos Spend Much, But Are They Spending Smart?

For the second quarter of 2009, Sprint lost almost a million regular (“postpaid”) subscribers. The loss was partially offset by a gain of 777,000 prepaid subscribers, but beyond the net subscriber loss, prepaid subscribers generate an average revenue per user per quarter of $34, while “its ARPU for postpaid plans was $56,” according to an excellent summary by NetworkWorld. No wonder Sprint lost $384 million in the quarter.
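A quick back-of-envelope with just the numbers above (my own rough sketch; it ignores costs and everything else on Sprint's income statement) shows how the subscriber mix alone moves revenue:

```python
# Back-of-envelope revenue swing from Sprint's Q2 2009 subscriber shift,
# using only the ARPU figures quoted above. Illustration, not analysis.

postpaid_lost  = 1_000_000   # "almost a million" postpaid subscribers lost
prepaid_gained = 777_000
postpaid_arpu  = 56          # dollars of revenue per postpaid user for the period
prepaid_arpu   = 34          # dollars of revenue per prepaid user for the period

revenue_lost   = postpaid_lost * postpaid_arpu
revenue_gained = prepaid_gained * prepaid_arpu
print(f"postpaid revenue lost:  ${revenue_lost / 1e6:.1f}M")
print(f"prepaid revenue gained: ${revenue_gained / 1e6:.1f}M")
print(f"net swing:             -${(revenue_lost - revenue_gained) / 1e6:.1f}M for the period")
```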

And Sprint is hardly alone. The much smaller Vonage lost 89,000 customers (download its Q2 report here); “churn rose to 3.2 percent from 3.0 percent in the prior year's quarter.” U.S. Cellular lost 88,000 (Q2 report here) from a customer base of 6.2 million.

Sprint's losses come despite spending well over a billion dollars annually in marketing. Did you ever get the feeling that the huge sums that phone companies—especially cellphone carriers—spend on big advertising campaigns should instead be used to, oh, say, train their customer service employees or build more base stations?

A group called the CMO Council has the same feeling, and they've backed it up with an 80-page study. The report notes that the companies within the $4 trillion telecommunications industry are under great stress: companies that used to concentrate on doing one thing well, such as voice calls on a cellular network, now have to provide data services, broadband connectivity, multimedia messaging, FM radio reception, and so on. “In 2009, mobile phone users are expected to download over 10 billion applications to their mobile phones.”

Competition is fierce, and not just among cellular carriers. “Even as traditional phone companies controlled more than 83 percent of the North American market for voice services,” the study says, “competition with cable providers had saved consumers more than $23 billion and could save households and small businesses a total of $111 billion over the next five years.” Of course, competition saves customers money because it gives customers the ability to switch from one provider to another. Looked at from the carrier side, that's a deadly increase in the churn rate.

The study cites the analysts at McKinsey for two key facts:

  • Satisfying and retaining current customers is three to 10 times cheaper than acquiring new customers, and a typical company receives around 65 percent of its business from existing customers.
  • A five percent reduction in the customer defection rate can increase profits by 25 to 80 percent, and seven out of 10 customers who switch to a competitor do so because of poor service.

And yet, are companies truly fighting to retain customers? Two more stats:

  • A Gartner study found that 92 percent of all customer interactions happen via the phone, and 85 percent of consumers are dissatisfied with their phone experience.
  • A typical business only hears from four percent of its dissatisfied customers; the other 96 percent leave quietly. (University of Pennsylvania)

In a phone conversation last week, CMO Council executive director Donovan Neale-May told me that “there is a disconnect between marketing and the back-end IT service delivery groups.”

“So marketers are saying, hey, we're spending, in this case, when you look at how much money is being spent by these large telco wireless operators, I mean, the top advertisers like AT&T, they're spending over US $3 billion. Verizon, over $3 billion. Sprint $1.5 billion. Comcast, $670 million. DirectTV, $450 million. They're spending a lot of money on marketing.

“And they're saying it's costing them more and more to acquire and keep and they're also seeing greater churn rates. So you're spending large sums of marketing money on your brand, and on a promise or a claim, announcing and delivering new services, new plans, new pricing, new devices, new applications, yet there's a dissatisfaction—a high level of dissatisfaction—with unmet needs and expectations with products and services, usability and complexity, with billing problems...

“So on the one hand we see massive outlays of money, for demand generation, and for branding, and for making people feel good and nice about these brands, yet the marketing people aren't doing what they should be doing, which is interacting more with the different stakeholders within the operation, and engineering, and technical side of things to improve the billing and the financial side to improve the customer experience.”

Neale-May says it's not how much a company is spending, it's how. One last stat from the CEB report:

  • Companies that restructure call centers around a customer service strategy often cut their costs by up to 25 percent and boost the revenue they generate by as much as 35 percent. (McKinsey)

 

Highlights from National Instruments Week 2009

The annual NI Week was full of cool demos. There was the scarily named robotic "flying blade of amputation," a robotic keyboard-and-glockenspiel duet, and the always-popular Guitar Hero-playing robot (with a new twist). This video provides a taste of the best demos and highlights from the keynote sessions.

As always, NI debuted some new products and gave previews of what's still to come. The most impressive of the future features was the thin-client web interface that will soon be part of its LabVIEW software. NI already has hardware that runs web services accessible via URL. But the next generation of LabVIEW will allow users to build a web interface right in their browser, without knowing Flash or Java. The best way to understand the new capability is to watch the demo below. I've edited it down to just the essentials of the technology, without all the pageantry of the keynote.

Nanotech Agreement between IBM and Bulgaria Put on Hold

I suggested at the end of May that it appeared as though IBM was starting an entirely new line of business. Small countries were turning to IBM to jump-start their nanotechnology initiatives.

Well, we have seen the first of these announced partnerships fall by the wayside due to Bulgaria's inability to cover the expense.

It seems that Bulgaria's state spending increased by 30 percent in the first half of 2009. My speculation is that this increase is likely due to paying unemployment compensation for those laid off during the worldwide recession, and to trying a bit too hard to spend its way into joining the Eurozone. In any case, the finance minister started looking around for ways to cut the budget, and nanotechnology rose to the top of the list.

If there is a lesson to be learned here, it is that IBM has discovered one of the drawbacks of doing business with small economies: sometimes they just can't pay the bill.


Fantasy Sports Prep: Semiconductor Edition

IMAGE CREDIT: Wikimedia user Inductiveload

 

I'll be at Hot Chips in late August, along with the rest of the engineering press (eat your heart out, Perez!). In some ways you might say Hot Chips kicks off chip season, which continues with the International Electron Devices Meeting (IEDM) in December and culminates in the Super Bowl of chip talk, the International Solid-State Circuits Conference (ISSCC), in February.

In case you want to do some research for your fantasy semiconductor team, Real World Technologies' David Kanter has compiled an excellent post-game analysis of the 32nm process technologies rolled out at IEDM 2008 and ISSCC 2009. In particular, he discusses the semiconductor industry's move to 32nm manufacturing processes with high-k dielectrics and metal gates (HKMG). (I'd like to point out here that it's not all about process technology: the cyborg moth at ISSCC was so creepy I thought I had woken up in a dystopian Arnold Schwarzenegger movie. Is that reference dated now? Who's the new Arnold-- anyone care to update my pop culture database? But I digress.)

David was kind enough to summarize the article for me. However, you should still go read the whole thing yourself because this is the version for people who are easily distracted by robo-moths and Arnold Schwarzenegger's falling star.

New manufacturing technologies are essential to keeping Moore's Law on-track and driving continued advances in microprocessors, graphics processors, FPGAs, ASICs, networking and other industries that rely on semiconductor technologies. At IEDM 2008 and VLSI 2009, leading edge manufacturers announced their initial results for 32nm process technologies, discussing key techniques including transistor strain, immersion lithography, double patterning and for some, custom illumination.

The process technologies analyzed include:

1.  IBM and AMD's research work on a HKMG 45nm process using silicon-on-insulator (SOI), which is not expected to go into production.

2.  IBM and the Common Platform's HKMG 32nm bulk process.

3.  Intel's high performance HKMG 32nm process, slated for production at the end of 2009.

4.  TSMC's performance optimized HKMG 32nm and 28nm process expected in 2010.

5.  Intel's low power 45nm process for SOCs, the first low power process to feature HKMG.

6.  Toshiba and NEC's HKMG 32nm density optimized process, which currently uses custom illumination, rather than double patterning.

7.  IBM and AMD's high performance HKMG 32nm SOI process, expected to debut in late 2010.

The results for each process include key density metrics such as contacted gate pitch and SRAM cell size and transistor performance metrics such as NMOS and PMOS transistor drive strength.  We include a historical comparison that puts these newer manufacturing technologies into a historical perspective, going back as far as 130nm.  New to this year's coverage of IEDM and VLSI is a graphical comparison of density and performance for various 45nm and 32nm process technologies.

Of particular interest are several facts: First, the rest of the industry, including IBM and AMD will finally catch up to Intel's manufacturing technology, using high-k dielectrics and metal gates, at 32nm. Second, approximate parity between Intel, AMD and IBM for manufacturing CPUs. And finally, approximate parity between TSMC and the Common Platform for bulk foundry processes.

I for one would like to read more about Globalfoundries, but I imagine we'll get an eyeful of that from everyone over the next six months.

Nanotechnology Provides the "McGuffin" for Summer Movie Blockbuster

For those of you not familiar with the term “McGuffin”: according to Alfred Hitchcock, it comes from the story of two men traveling on a train.

One man asks the other what it is he's carrying in his luggage. The other man responds that it's a McGuffin. When the first man asks what a McGuffin is, the other says it's a gun for hunting lions in the Scottish Highlands. The first man, nonplussed, responds that there are no lions in the Scottish Highlands, to which the other man quickly replies that, in that case, it is no McGuffin.

In other words, a McGuffin is an empty, almost entirely meaningless plot device.

It seems that nanotechnology is becoming the new McGuffin for silly Hollywood action movies with the release of this summer’s blockbuster “G.I. Joe: Rise of Cobra.”

In this case, the McGuffin is a swarm of little nanobots that are loaded into a warhead and launched at a target, where they begin to devour it until they are turned off by remote control. In the trailers you can see the green swarm of nanobugs devouring the Eiffel Tower.

I am afraid nanotechnology is not faring too well in popular culture; it always seems to be a threat, whether in Michael Crichton’s “Prey” or the television program “Eleventh Hour.”

I guess it’s hard to make cleaner drinking water, cheaper alternative energy or better anti-cancer drug treatments into an exciting and compelling plot element.
