The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it. … Today’s multimedia machine makes the computer screen into a demanding focus of attention rather than allowing it to fade into the background.
—Mark Weiser, Scientific American, September 1991
In 1988, Xerox PARC computer scientist (and later CTO) Mark Weiser put forward the idea of—and coined the term—ubiquitous computing. Sometimes shortened to ubicomp, it refers to the seamless integration of computing resources into most of the objects that people use to perform the activities of daily life. Today we’re more likely to call it pervasive computing or everyware. We’re not quite there, despite newfangled appliances such as smart TVs and smart refrigerators, but modern computing does have a pervasive feel to it. That feel comes mostly from the gadgets—smartphones and tablets (and soon, wearables like Google Glass)—that we routinely carry around with us. Thanks to cellular connections and Wi-Fi networks, we have near-constant access to computing power and online data, giving us what might be called near-ubiquitous computing. It’s not quite the ambient intelligence envisioned by ubicomp fans, but it’s a step or three in that direction.
There’s a problem, however. One of the chief characteristics of true ubicomp is that it is a calm technology, meaning that it remains in the background until needed and thus enables us to interact with it in a calm, engaged manner. Today’s mobile computing platforms are more like jittery technology, constantly beeping at us and alerting us to new messages, posts, updates, and news. (Hence, perhaps, the curious prevalence of phantom vibration, the perception of a cellphone’s vibration in the absence of an incoming call or notification.) Even watching TV is no longer simple as more and more people use their mobile tech for second screening (monitoring social media commentary about the show they’re watching) and chatterboxing (chatting online with people watching the same show).
If it’s by now axiomatic that even as we change technology, it changes us, then we have to wonder how we’re being altered by this constant connectivity. On the positive side, having so much information fingertip-ready is a boon for productivity and the quick settling of bar bets. On the downside, all this digital hectivity leads to FOMO, the fear of missing out on something interesting or fun, which can drive obsessive checking of social networks. We like to think we’re capable of being polyattentive (watching or listening to more than one thing at a time), but it’s more like what Microsoft researcher Linda Stone calls continuous partial attention, where we’re ostensibly focused on some task but a chunk of our attention is waiting for something more important to pop up. It’s no wonder many people suffer from nomophobia, the fear of being without a mobile phone or without a cellular signal. Our phones and tablets have become weapons of mass distraction.
The result is the shortened attention span that writer Nicholas Carr identified in his famous 2008 essay, “Is Google Making Us Stupid?” We’ve become self-interrupters who routinely suspend our own work to check social media or watch the latest viral video. Conveniently, it looks like we’re being productive members of society, but in reality our focus on the trivial and the fleeting means we’re just being fauxductive. We’re social notworking. True, our brains are engaged, but not always in a good way. We suffer from busy brain, a mental state that includes racing thoughts, anxiety, lack of focus, and sleeplessness. We indulge in binge thinking, obsessing over problems intensely but fruitlessly for short stretches.
Ubiquitous computing remains a technophile’s dream—and a utopian one at that, thanks to its vision of technology waiting in the background, not speaking unless spoken to. In its stead we have ubiquitous connectivity—always on, always interrupting, always in your face. And there’s nothing calm about that.
This article originally appeared in print as “Clamorous Computing.”