Nanoclast

Piezoelectrics and Thin Films Power Your Mobile With a Press of Your Finger

While the pedantic among us may quibble with phrases like “self-powering portable electronics” and start blathering about the second law of thermodynamics, new research from Australia is pushing the limits of piezoelectric materials for turning pressure into electrical energy for mobile devices.

The researchers have published their work in the journal Advanced Functional Materials after demonstrating a method for combining piezoelectric materials with thin-film technology that is more easily integrated into mass-production techniques.

"The concept of energy harvesting using piezoelectric nanomaterials has been demonstrated, but the realization of these structures can be complex, and they are poorly suited to mass fabrication,” says Dr. Madhu Bhaskaran, lead coauthor of the research. "Our study focused on thin-film coatings because we believe they hold the only practical possibility of integrating piezoelectrics into existing electronic technology."

When more easily integrated piezoelectric materials are combined with groundbreaking work in reducing the amount of energy consumed by electronic devices, like that done by Eric Pop and his team at the University of Illinois at Urbana-Champaign’s Beckman Institute for Advanced Science and Technology, it seems possible that we may be able to run our small electronic devices for longer than a few hours before we have to plug them into an outlet.
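To see why the pedants have a point about “self-powering,” a back-of-envelope sketch helps. All of the numbers below are my own illustrative assumptions, not figures from the research:

```python
# Back-of-envelope sketch with assumed numbers: how much energy one
# finger press can plausibly harvest, versus the energy stored in a
# small phone battery.

PRESS_FORCE_N = 2.0           # assumed typical keypress force (newtons)
PRESS_TRAVEL_M = 1e-3         # assumed key travel (about 1 mm)
CONVERSION_EFFICIENCY = 0.05  # assumed mechanical-to-electrical efficiency

# Mechanical work done by one press, and the electrical energy recovered.
mech_j = PRESS_FORCE_N * PRESS_TRAVEL_M
harvest_j = mech_j * CONVERSION_EFFICIENCY

# A small phone battery holds roughly 5 Wh = 18 000 J.
PHONE_BATTERY_J = 5.0 * 3600
presses_per_charge = PHONE_BATTERY_J / harvest_j

print(f"~{harvest_j * 1e6:.0f} microjoules per press")
print(f"~{presses_per_charge:.1e} presses to fill a 5-Wh battery")
```

Even with generous assumptions, a single press yields on the order of 100 microjoules, which is why harvested key presses are better matched to running microwatt-scale circuitry than to recharging a battery.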

Bad Videos Are Not the Way to Teach Nanotechnology

It’s that time of year again, when the daylight lasts longer, the temperatures begin to rise and the news cycle allows for whimsical pursuits.

This means that I have noticed a new round of videos being posted on the subject of nanotech.

The Nanoscale Informal Science Education (NISE) Network, which is supposed to educate and engage the public on the subject of nanotechnology, has apparently circulated in its most recent newsletter a video that was a runner-up in the American Chemical Society’s video contest back in 2009.

I also saw this week a video produced in 2010 for another video contest, this one for something called Time for Nano.

It all starts promisingly enough with the theme music from the Warner Brothers cartoons, but it quickly becomes apparent that the video is presenting a “camera pill” technology, ingested to take images from inside your body, that is somehow being passed off as nanotechnology.

Now, I have recently taken some swipes, well deserved I believe, at EU attempts at public outreach, and this is a prime example of what I am concerned about.

So childish and poorly informed is the approach to educating the public about nanotechnology that no one seemed to care that one of the videos submitted for a nanotechnology video competition had nothing to do with nanotechnology.

I would like to suggest that if we want to circulate a video to introduce the public to nanotechnology, let’s just all agree to use the Stephen Fry video (despite his own personal misconceptions on the topic) and happily call it a day.

EU Embarks on Yet Another Public Outreach Project

Europe is once again trying to appease…umh…I mean inform the public about the subject of nanotechnology with another public outreach program entitled Nanochannels.

I say ‘again’ because there have been similar efforts in the past. There was Nanologue, a completed EU project that few outside of those directly engaged in it have ever heard of.

Over at the blog Frogheart, the reasonable question is asked: why did no one mention the work previously done in this field by Nanologue when announcing Nanochannels? Since I was in the meeting at EuroNanoForum in which the outreach program was discussed, I can confirm that a dreary recap of the Nanologue project was provided. The extent to which this data informed the new project was hard to discern from the near-comatose expressions on the faces of the audience.

But moving on, in addition to Nanologue there was ObservatoryNANO that was supposed to inform government, industry and finance decision makers about issues pertaining to nanotechnology and in doing so often provided unintended guffaws.

One of the groups behind ObservatoryNANO and now Nanochannels is the Institute of Nanotechnology, which certainly has a background in this sort of thing, as it also provided some of the background material for last year’s nearly universally lambasted UK nanotechnology strategy report.

That report too managed to forget all the other reports that had come before. While starting from scratch is sometimes necessary and even a good thing, managing to dispose of an entire body of knowledge in order to offer up odd suggestions seems to miss the spirit of starting fresh.

It would appear that Nanochannels (a somewhat unfortunate name, in that “nanochannel” is a term that actually refers to a structure in nanotechnology) is similar to the UK project Nano&Me, except that Nano&Me was set up for £77k while Nanochannels has a budget of €894,940.

You would think that with that kind of money they would have a website, but no, unless of course you count a Facebook page.

You know, the EU might take a page from the US government in its nanotechnology outreach strategy and just start getting serious about the subject. Just because some groups have decided that juvenile discourse is the way to address the topic of nanotechnology doesn’t mean that governments should embark on a children’s-book approach to the subject to get it all sorted.

IBM Takes Graphene One Step Further with the First IC Built from It

In poker parlance it would seem that IBM has gone “all in” with graphene replacing silicon in the chips of the future.

The latest news in the graphene story is that IBM has built an integrated circuit out of the wonder material. The research, which was published in the journal Science last week, describes IBM’s success at building a “wafer-scale graphene circuit…in which all circuit components, including graphene field-effect transistor and inductors, were monolithically integrated on a single silicon carbide wafer.”

According to the Spectrum article cited above, it took researchers a year of engineering work to sort out how to connect the graphene to the other metallic elements of the circuit and how to perform lithography on it without damaging it.

To overcome the latter challenge the Spectrum article intriguingly says, “One way the team addressed the damage problem was to grow the graphene on a silicon-carbide wafer, then coat it with a common polymer, PMMA, and a resist that was sensitive to jets of electrons used in electron beam lithography.”

I am assuming, then, that there were other ways tested but this one turned out to be the best (I don’t have a Science subscription, so I don’t know if other methods were in fact tried).

Like anyone who follows developments in materials science around chips to any extent, I have become somewhat mesmerized by the advances that have been coming fast and furious around graphene.

But meanwhile some interesting developments are occurring with other materials that, like graphene, come in two-dimensional form but, unlike graphene, have a natural band gap. While molybdenite is not being positioned as a direct competitor with graphene in the post-silicon battlefield, one has to wonder whether there are other minerals out there in addition to molybdenite that could fit the bill and push graphene to the side.

Not that there is such a competitor out there mind you, but once upon a time not too long ago carbon nanotubes were the new wonder material that would someday replace silicon. If I were a betting man, I would be looking to hedge my wager somewhat.

Nanoscale Conductors Enable New Battery Architecture: The Semi-Solid Flow Cell

Researchers at MIT, led by W. Craig Carter and Yet-Ming Chiang, have developed a new architecture for batteries that combines the design of liquid-flow batteries with that of conventional lithium-ion batteries. The result is a 10-fold improvement in the energy density of liquid-flow batteries, which could reduce a battery system such as those now found in electric cars to about half its current size.

The research, which was originally published in the Wiley journal Advanced Energy Materials, was able to overcome the low energy density of liquid-flow batteries by creating a semi-solid material that “kind of oozes,” according to Chiang. The new material is able to store energy in “suspensions of solid storage compounds” and the “charge transfer is accomplished via dilute yet percolating networks of nanoscale conductors.”

The result is that the cathodes and anodes of the battery are particles that are suspended in the liquid electrolyte. And the two different suspensions are pumped through systems separated by a thin porous membrane.

The design also separates the storing and discharging of the battery into two different physical structures. According to Chiang, this separated architecture will enable batteries to be designed more efficiently.
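The appeal of that separation can be made concrete with a toy sizing calculation. In a flow-type architecture, energy capacity scales with the volume of the slurry tank while power scales with the electrode stack, so the two can be sized independently. Every constant below is an assumption of mine for illustration, not a figure from the MIT paper:

```python
# Toy sizing sketch for a flow-type battery: energy capacity is set by
# the tank of slurry, power by the electrode stack. All constants are
# assumed values for illustration only.

SLURRY_ENERGY_DENSITY_WH_PER_L = 130.0  # assumed semi-solid slurry density
STACK_POWER_DENSITY_W_PER_CM2 = 0.1     # assumed electrode power density

def tank_liters(energy_kwh):
    """Tank volume needed to store the requested energy."""
    return energy_kwh * 1000.0 / SLURRY_ENERGY_DENSITY_WH_PER_L

def stack_area_cm2(power_kw):
    """Total electrode area needed to deliver the requested peak power."""
    return power_kw * 1000.0 / STACK_POWER_DENSITY_W_PER_CM2

# An EV-like pack: 24 kWh of energy, 80 kW of peak power.
# Doubling range grows only the tank; doubling acceleration only the stack.
print(f"tank:  {tank_liters(24):.0f} L")
print(f"stack: {stack_area_cm2(80):.0f} cm^2")
```

In a conventional lithium-ion cell, by contrast, the same electrode material must provide both the energy and the power, so the two requirements cannot be traded off independently.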

Since the design is expected to reduce the size (and cost) of a battery system by as much as half, it is being touted as a way to make electric vehicles more competitive with internal combustion engines.

Chiang and his researchers have even gone so far as to dub the semi-solid liquid “Cambridge Crude,” no doubt a reference to their Cambridge, MA, location. The researchers also posit the idea that the “Cambridge Crude” could be pumped like gasoline to recharge a car. However, I would raise one small caveat about that notion: what are you supposed to do with the semi-solid liquid that you’re disposing of? I expect that one would run into all sorts of environmental concerns.

Nonetheless the researchers have developed what appears to be a new design architecture for batteries and have demonstrated that slurry-type active materials can be used for storing electrical energy.

If the research has potential for commercial development, it seems the right research team developed it: Chiang’s earlier work on lithium-ion batteries led to the MIT spinoff A123 Systems, and this technology has already been licensed by Chiang’s and Carter’s new company, 24M Technologies (itself a spinoff from A123 Systems).

Nanotech's Role in Clean Drinking Water Creeping Forward

While attending the EuroNanoForum 2011 conference this week in Budapest, Hungary, I was confronted at least once with the question of how nanotechnology could be used in water purification and desalination. You really can’t get through one of these things without hearing how nanotechnology could save the world.

But the water issue is one in which I have taken some interest over the years, putting together a conference on the subject seven years ago and even taking up the issue on Spectrum’s pages here, here and here.

A couple of years back, the Meridian Institute published a good white paper entitled “Conventional and Nano-Based Water Technologies,” which did a nice job of cataloguing all the nanotech-based solutions for water desalination.

We may have another one to add to that list in a recent article published in Physics World, written by Jason Reese, Weir Professor of Thermodynamics and Fluid Mechanics at the University of Strathclyde.

The article relates how CNTs are enabling a technique pursued by Reese that moves away from the high-energy-cost process of reverse osmosis. In this technique, Reese has shown that the CNTs can improve water permeability to 20 times that of modern commercial reverse-osmosis membranes. A factor-of-20 improvement in permeability should have a pretty significant impact on the energy requirements.
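For a rough sense of why permeability matters to energy cost, consider the standard solution-diffusion picture of a pressure-driven membrane: water flux J = A(ΔP − Δπ), where A is the permeability and Δπ the osmotic pressure. The sketch below uses assumed numbers of my own, not figures from Reese’s article; its point is that higher permeability lets the applied pressure approach the osmotic-pressure floor, but never beat it:

```python
# Rough sketch: effect of membrane permeability on desalination energy.
# Flux model (solution-diffusion approximation): J = A * (dP - dPi).
# All numbers are assumed, illustrative values.

OSMOTIC_PRESSURE_PA = 2.8e6   # ~28 bar, typical of seawater
TARGET_FLUX = 5e-6            # m^3/(m^2*s), an assumed design water flux
A_BASELINE = 3e-12            # m/(s*Pa), assumed membrane permeability

def required_pressure(permeability):
    """Applied pressure needed to drive TARGET_FLUX through the membrane."""
    return OSMOTIC_PRESSURE_PA + TARGET_FLUX / permeability

def pump_energy_kwh_per_m3(pressure_pa, recovery=0.5):
    """Idealized pumping energy per m^3 of fresh water, no energy recovery."""
    return (pressure_pa / recovery) / 3.6e6  # J/m^3 -> kWh/m^3

for factor in (1, 20):
    p = required_pressure(A_BASELINE * factor)
    e = pump_energy_kwh_per_m3(p)
    print(f"{factor:>2}x permeability: {p / 1e5:5.1f} bar, {e:.2f} kWh/m^3")
```

Under these assumptions, a 20-fold permeability gain cuts the required pressure from roughly 45 bar to 29 bar and the idealized pumping energy from about 2.5 to 1.6 kWh per cubic meter; the remaining cost is the thermodynamic minimum set by osmotic pressure.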

This is certainly a move in the right direction. However, I have to confess that when I put the NanoWater conference together seven years ago, I had somewhat greater expectations that we would be further along at this point. I am not entirely convinced it’s a lack of technological solutions, nanotech related or otherwise, that is the cause of the delay.

Examining the New Dawn of Dye-Sensitized Solar Cells with their Discoverer

Last month I had the rare opportunity on two separate occasions of sitting down with two world-renowned Swiss scientists.

First, I got to meet Nobel Prize winner Heinrich Rohrer, and on Monday of this week I got to chat at EuroNanoForum 2011 with last year’s winner of the Millennium Prize, Michael Grätzel. Grätzel is currently a professor at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, and in 1991 he discovered dye-sensitized solar cells (DSSC), sometimes called Grätzel cells in his honor.

While chatting over some lunch, I put three questions to him that were more or less the following: 

--Is the future of dye-sensitized solar cells primarily in the area of powering of electronic devices, i.e. laptops, or could it have a place in the power grid? 

--Was he aware of the work of Angela Belcher in using viruses to manipulate carbon nanotubes for use in dye-sensitized solar cells, and were there any other innovations that he saw as key to the further development of DSSCs?

--Is improving the conversion efficiency of DSSCs the most critical technological hurdle for the cells?

While I did record Dr. Grätzel’s responses, the audio quality was fairly poor because we were in a busy lunch area of the conference. So, I will quote some of his responses here.

To the first question above, Dr. Grätzel started by saying, “It’s certainly a disruptive technology, which is presently being commercialized mainly through niche applications such as providing electric power for portable electronic devices.

“It is also a very strong contender for building integrated photovoltaics (BIPV). The DSSC is the only solar cell that can be used to realize transparent glass facades, skylights and windows that produce electric power from light," he added.

“Other potentially huge markets presently targeted by industry involve printing the DSSC on coil-coated steel for roofing and cladding. The commercial production of flexible and lightweight devices already started in 2009. The DSSC is something that will add new markets to the present applications of silicon cells, but it will not confront conventional PV cells at this stage.”

However, he was quick to point out that DSSC does have distinct advantages over silicon cells.

“We [DSSC] have no competition, for example, here in these light conditions [low interior lighting]. Here we are the best. For indoor applications and for outdoor applications in ambient light, that is where money is being made now by the companies that have invested in DSSC,” he said. “But that’s not the only goal. The final target is to mass produce modules, which have presently reached 10% conversion efficiency, for large-scale solar electricity production, including applications for rooftops and solar farms.”

In 2010, Sony did in fact demonstrate a prototype module based on DSSC technology with a 10% conversion efficiency.

When it came to the question of conversion efficiency, Dr. Grätzel seemed resigned to the percentage game that seems to exist but believed that price per kilowatt hour (kWh) was a more significant metric.

“We have to play the game. We have to go and have our efficiencies validated by an accredited PV calibration laboratory. We cannot create a different world where we just say we are the best,” he said. “We are living exactly with the standards that silicon has set in terms of efficiency and stability.

“But, on the other hand, it is true that when it comes to the advantages we should also play those up as well,” he said. He added that under certain outdoor exposures DSSC will already outperform silicon in the key metric of price per kWh.

“In the end, what we would really like to see is kWh price used as a metric in addition to peak watt price. The peak watt price is a good standard but when it comes to outdoor applications it often does not reflect reality such as the performance under cloudy conditions and the drop of conversion efficiency with temperature encountered by silicon solar cells,” he said.
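The distinction between the two metrics is easy to make concrete. In the toy calculation below, which uses my own assumed numbers rather than Grätzel’s, two cells with identical peak-watt prices diverge on kWh price once their real-world capacity factors differ:

```python
# Toy comparison of $/peak-watt versus $/kWh. All numbers are assumed
# for illustration; the point is the ranking, not the absolute values.

HOURS_PER_YEAR = 8760

def lifetime_kwh_per_peak_watt(capacity_factor, years):
    """kWh one peak watt of capacity delivers over the system lifetime."""
    return capacity_factor * HOURS_PER_YEAR * years / 1000.0

def cost_per_kwh(price_per_peak_watt, capacity_factor, years=20):
    """Simple levelized cost: module price spread over lifetime output."""
    return price_per_peak_watt / lifetime_kwh_per_peak_watt(capacity_factor, years)

# Same assumed module price per peak watt; the second cell is assumed to
# keep more of its rated output in cloudy, hot, or indoor conditions,
# i.e., a higher effective capacity factor.
cell_a = cost_per_kwh(1.50, capacity_factor=0.15)
cell_b = cost_per_kwh(1.50, capacity_factor=0.18)
print(f"cell A: {cell_a:.3f} $/kWh, cell B: {cell_b:.3f} $/kWh")
```

This is exactly the effect Grätzel describes: peak-watt price is measured at standard test conditions, so it misses how a cell performs under the cloud cover, heat, and ambient light of actual deployment.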

When I asked about the work of Angela Belcher based on the DSSC, his response was clear: “That’s a real breakthrough; we can learn a lot from her fascinating experiment.”

But perhaps more intriguing is that he and his team are submitting their latest research this week on new dyes that break some previous conversion efficiency records for DSSC.

It seems as though DSSC technology is really taking hold recently and developments both commercially and in the laboratory are accelerating.

Definitions for Nanotechnology Inform EU Citizens as much as Regulatory Framework

Yesterday I attended EuroNanoForum 2011 in Budapest, Hungary, which marks the fifth running of this biennial event dating back to 2003.

The Forum serves as a kind of platform from which the European Commission can assess and trumpet its nanotechnology capabilities.

As anyone who has read this blog and my contributions to the TechTalk blog over the years knows, this regional mentality to the development of nanotechnology strikes me as kind of missing the point of how nanoscience and later nanotechnology comes to be developed. But it seems that governments forking out funds for these kinds of shindigs is what really keeps them going, so I suppose they can do whatever they want. It’s their party after all.

The conference had organized a special day for journalists that included a press conference with the plenary speakers, among them Michael Grätzel, the discoverer of dye-sensitized mesoscopic oxide particles for use in solar cells (I interviewed him and will blog on that tomorrow), and Rudolf Strohmeier, Deputy Director General, Directorate General for Research & Innovation for the European Commission.

Since Mr. Strohmeier is a self-described regulator, I thought it might be worth asking him about the wisdom of embarking on a quest to define what nanotechnology is before establishing a regulatory framework. I also thought I might check in to see where they’re at now in the rather lengthy process.

For those who might like a small primer on the topic, the EU believed it necessary to define what nanotechnology is before developing regulations for it and it all seemed to make sense until the process got stuck in the mud on the issue of “how much” or “how many” nanoparticles.

At the time I first came across the imbroglio, it just seemed all a bit silly. Even if they could determine whether the risk of nanoparticles came from either the number of nanoparticles or the weight of the nanoparticles in a material, it wouldn't really seem to sort out whether said material was of any risk.

But it took an article from Andrew Maynard over at the Risk Science Blog for me to see how wrong-headed the EU’s approach really is. By shoehorning in a definition that will work for regulators, we may be squeezing science out of the process.

As Maynard concludes:

“Five years ago, I was a strong proponent of developing a regulatory definition of nanomaterials.  Today, with the knowledge we now have, I think we need to start thinking more innovatively about how we identify new materials that slip through the regulatory net – whatever we decide to call them.  Only then will we have a hope of developing science-grounded regulation that protects people while supporting sustainable development.”

Below is an audio recording I made of my exchange with Mr. Strohmeier. Interestingly, according to him, the definition was necessary for educating EU citizens as much as for developing regulations. Patrick Vittet-Philippe, the Press and Information Officer for DG Research and Innovation of the European Commission, makes an additional comment at the end of the recording.


In fairness, I didn't really get a chance to follow up with Mr. Strohmeier to see if he could see the problems that arise when you arbitrarily arrive at a definition that may not always reflect the latest science on the topic. Nonetheless, I can't help but think that a definition that is as much about mollifying the public as it is about good science has inherent risks itself.

Mapping of Memristor Could Speed Its Commercialization

It’s been some time since I last checked in on Hewlett Packard’s drive to develop the memristor. At that time, HP had joined forces with Korean-based memory chipmaker Hynix Semiconductor Inc. to make memristor chips.

So, while I was waiting to see what would come from the collaboration between HP and Hynix, it seemed that the memristor was being touted not only as changing memory but also as replacing the transistor altogether, as evidenced by the comments on this recent blog post.

Although the latest news is not an announcement of a commercially available product (which looks as though it will be called resistive random access memory, or ReRAM), the research HP has conducted recently has been successful in mapping out what happens inside the 100-nm channels of the memristor.

The research was conducted by researchers at HP Labs and the University of California, Santa Barbara, and was initially published in the UK-based Institute of Physics journal Nanotechnology.

Basically the researchers were able to use X-rays to target precisely the channels within memristors in which resistance switching occurs and then they were able to sort out the chemistry and structure of the channel. If nanotechnology is anything, it is certainly having the tools necessary to see how things operate on the nanoscale and then exploit that knowledge to get things to do what you want.

And what HP no doubt wants is to get the memristor to market. For the first time, I am seeing a timeline offered up in which 2014 is the expected date to see it incorporated into electronic devices like mobile phones and tablets, with 10 times greater embedded memory than currently available.

IBM's Millipede Project, Social Networking and How Semiconductor Technology Can Save the World

Last year, thanks to Twitter, I came upon a blog penned by Ira Feldman who was providing coverage of the IEEE San Francisco Bay Area Nanotechnology Council Sixth Annual Symposium. 

If there are positives to social networking, this is certainly one of them: knowledge that would otherwise sit in a silo for just those who attended the conference can actually be shared with a larger community. I hope more conference attendees start to make this a practice.

Mr. Feldman has provided coverage once again of this year’s IEEE San Francisco Bay Area Nanotechnology Council annual symposium. 

In particular, Feldman has given us an analysis of the address by the keynote speaker, Dr. Spike Narayan, Functional Manager at IBM: “Nanotechnology: Leveraging Semiconductor Technologies to Address Global Challenges.”

According to Feldman, the presentation asked the question “can we leverage semiconductor technology to address global challenges of environment, energy, healthcare, and water?”

If the recent collaborative work between IBM and the Institute of Bioengineering and Nanotechnology in Singapore, which applied the body of knowledge accumulated in polymer building blocks for creating nanoparticles to the creation of a drug that fights drug-resistant bacteria, is any indication, then the answer is a resounding ‘yes.’

One of the examples Narayan apparently provided for “where semiconductor knowledge is indeed transferable to these other domains” is in the area of disk drives, with Feldman offering the IBM Millipede project as the most advanced example.

It is a curious story, that of the IBM Millipede project. The IBM Millipede essentially used an array of thousands of miniaturized atomic force microscopes (AFMs) as a memory device. Since it was based on the AFM that Gerd Binnig had invented, he was sometimes made a spokesman for its capabilities, as in his 2004 interview with Spectrum.

Although touted as the next step in mobile memory devices, it soon went almost unmentioned, and most everyone suspected that it had fallen victim to cheap and increasingly capable flash memory.

So, last week during the press conference with Gerd Binnig and Heinrich Rohrer an intrepid journalist (not me) dared ask about the fate of the IBM Millipede project. I took a gulp and waited for the reply.

Binnig, who had been a champion of the technology, remained unapologetically supportive of it but handed off the particulars of Millipede’s fate to Dr. Paul Seidler, coordinator of the new Nanotechnology Center at IBM Research in Zurich, to explain more thoroughly.

And just as many suspected, the IBM Millipede project in its original form of creating a mobile storage device is no longer, but it instead lives on in various other research projects within IBM. At least on the nanotechnology side of things, IBM Millipede has found its niche in probes for lithography.

Below is Dr. Seidler’s full response.



Getting back to Ira Feldman, he has wonderfully led us to an archive of all the presentations from the IEEE San Francisco symposium, and they can be found here.


Nanoclast

IEEE Spectrum’s nanotechnology blog, featuring news and analysis about the development, applications, and future of science and technology at the nanoscale.

 
Editor
Dexter Johnson
Madrid, Spain
 
Contributor
Rachel Courtland
Associate Editor, IEEE Spectrum
New York, NY