There was an interesting little story in the New York Times on Saturday in which Randall Stross, a professor of business at San Jose State University, made the case that while the availability of Internet services like Skype and Facebook doesn't (yet) match the old AT&T Bell Telephone system's 99.999 percent (roughly 5.26 minutes of downtime a year), they aren't too bad, all things considered.
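As a quick sanity check on that "five nines" figure, here's a small Python sketch (mine, not from the Times story) that converts an availability percentage into downtime per year:

```python
# Downtime per year implied by an availability percentage.
MINUTES_PER_YEAR = 365.25 * 24 * 60  # about 525,960 minutes

def downtime_minutes(availability_pct):
    """Return the expected minutes of downtime per year."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

print(round(downtime_minutes(99.999), 2))   # "five nines": about 5.26 minutes
print(round(downtime_minutes(99.9), 1))     # "three nines": nearly nine hours
```

Each extra nine cuts the annual outage budget by a factor of ten, which is why five-nines service was such an expensive engineering commitment.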

After all, Professor Stross argues, even in light of recent problems:

"Internet computing, however, isn't as unreliable as it may seem. After all, when was the last time you got to Goggle’s home page but couldn't complete your search?"

His article (which somewhat commingled the concepts of availability and reliability) led me to think about just how available our digital systems and devices should be.

I bring the issue up also because of Verizon Wireless's announcement today that it will now be selling Apple's iPhone, and the company is promising that its network can handle the expected increase in traffic. A Wall Street Journal article yesterday says that

"Verizon Wireless ... is confident enough in its network that it will offer unlimited data-use plans when it starts selling the iPhone around the end of this month."

Verizon is doing so as a way to distinguish itself from AT&T, which had to move to restrict data usage after complaints from customers (like me) about dropped calls and poor service, due in part to heavy users of smartphones like the iPhone. AT&T started selling the iPhone in 2007.

AT&T is already taking swipes at Verizon over the iPhone, according to this story today in the New York Times. Mark Siegel, an AT&T spokesman for corporate issues, said in a company statement quoted by the Times:

"I'm not sure iPhone users are ready for life in the slow lane."

Which kind of leads us back to my question about availability. Would you rather have an iPhone (or other smart device) that is fast but whose service is spotty, or one that is slightly slower but always available?

Given that as an introduction, and for those of you so inclined, what are your expectations for digital system and device availability? Do you expect them to be available "all the time" in the old Ma Bell sense, or do you accept outages as a normal fact of digital life?

Also, if you accept outages as "normal" now, do you expect your opinion to change in the future? In other words, should digital systems and devices become more available over time? And are you willing to pay more for greater availability?

Feel free to discuss the same issues in regard to the reliability of digital devices as well.

One final question: is anyone planning on switching from using an iPhone on AT&T to Verizon? I, and I think a lot of other people, would like to know why.


Metamaterials Could Solve One of 6G’s Big Problems

There’s plenty of bandwidth available if we use reconfigurable intelligent surfaces

[Figure: Ground level in a typical urban canyon, shielded by tall buildings, will be inaccessible to some 6G frequencies. Deft placement of reconfigurable intelligent surfaces will enable the signals to pervade these areas. Illustration: Chris Philpot]

For all the tumultuous revolution in wireless technology over the past several decades, there have been a couple of constants. One is the overcrowding of radio bands, and the other is the move to escape that congestion by exploiting higher and higher frequencies. And today, as engineers roll out 5G and plan for 6G wireless, they find themselves at a crossroads: After years of designing superefficient transmitters and receivers, and of compensating for the signal losses at the end points of a radio channel, they’re beginning to realize that they are approaching the practical limits of transmitter and receiver efficiency. From now on, to get high performance as we go to higher frequencies, we will need to engineer the wireless channel itself. But how can we possibly engineer and control a wireless environment, which is determined by a host of factors, many of them random and therefore unpredictable?

Perhaps the most promising solution, right now, is to use reconfigurable intelligent surfaces. These are planar structures typically ranging in size from about 100 square centimeters to about 5 square meters or more, depending on the frequency and other factors. These surfaces use advanced substances called metamaterials to reflect and refract electromagnetic waves. Thin two-dimensional metamaterials, known as metasurfaces, can be designed to sense the local electromagnetic environment and tune the wave’s key properties, such as its amplitude, phase, and polarization, as the wave is reflected or refracted by the surface. So as the waves fall on such a surface, it can alter the incident waves’ direction so as to strengthen the channel. In fact, these metasurfaces can be programmed to make these changes dynamically, reconfiguring the signal in real time in response to changes in the wireless channel. Think of reconfigurable intelligent surfaces as the next evolution of the repeater concept.
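To make the "tune the wave's phase to redirect it" idea concrete, here is a toy sketch (my own illustration, not from the article) of the phase profile a reconfigurable surface would apply across its elements to steer a reflected beam, using the generalized law of reflection; it assumes ideal, continuous phase control at each element:

```python
import math

def ris_phase_profile(n_elements, spacing_m, wavelength_m,
                      theta_in_deg, theta_out_deg):
    """Per-element phase shifts (radians) that steer a reflected beam.

    A linear phase gradient d(phi)/dx = (2*pi/lambda) *
    (sin(theta_out) - sin(theta_in)) across the surface redirects
    an incident plane wave from angle theta_in to theta_out.
    """
    k = 2 * math.pi / wavelength_m
    grad = k * (math.sin(math.radians(theta_out_deg)) -
                math.sin(math.radians(theta_in_deg)))
    # Wrap each element's phase into [0, 2*pi).
    return [(grad * i * spacing_m) % (2 * math.pi)
            for i in range(n_elements)]

# 28 GHz carrier (wavelength ~10.7 mm), half-wavelength element spacing,
# steering a normally incident wave to 30 degrees off-normal.
wavelength = 3e8 / 28e9
phases = ris_phase_profile(16, wavelength / 2, wavelength, 0, 30)
```

Reconfigurability comes from recomputing this profile on the fly: as the user moves and the desired outgoing angle changes, the controller rewrites the per-element phases.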
