What if you could share a computer-aided design (CAD) model and even allow a colleague to manipulate it from afar? “Click on this link, check out my design, and feel free to add more holes or fill some in,” you might say. You wouldn’t have to instruct your distant coworker to install special software or worry about whether her operating system could run it. Imagine that all your programs and data were stored in the cloud and that even computationally intensive applications like multimedia editing ran just as well in your browser as they would if they had been installed locally.
Since the early days of the World Wide Web, a lot of smart, passionate people have wanted to make it into a vehicle for running almost any kind of program. What makes that dream so tantalizing is that the Web is different from other software platforms. It’s defined by open standards, so anyone can build on it. It’s not owned by any company, so developers are beholden only to their users. And it’s constructed largely around open-source technologies, so it has the potential to be very democratic.
Over the past three decades, a whole generation of developers has worked to make this vision a reality. We’ve added new capabilities to the Web, like audio and video streaming, 2D and 3D graphics, typography, peer-to-peer communication, data storage, offline browsing, as well as multitouch, location, and camera inputs. But for years we struggled with performance, specifically the ability to run Web applications as quickly as non-Web applications.
We overcame this hurdle in two ways. First, my colleagues and I were able to make giant, complex programs run almost as fast inside a browser as they do when installed directly on computers. Then we worked with other browser companies to create a new Web standard called WebAssembly, which makes safe, efficient-by-design code an official part of the Web.
Running applications as ordinary JavaScript, the Web’s built-in programming language, was convenient, simple, and safe for users. But it delivered a janky experience for compute-heavy applications: Visuals would freeze, audio would drop, and apps would take seconds or longer to respond to user input. Plug-ins, by contrast, let browsers run programs, even demanding video games, smoothly and quickly, but they require users to download and install extra software. Such downloads can inject malicious code into the user’s system. Plus, browser plug-ins run only on the specific browsers they were developed for. So if you are using another type of browser, you’re out of luck.
This approach made it easier to gain support from other browser companies because it meant less work and less risk for them. Today, Mozilla Firefox, Google Chrome, Apple Safari, and Microsoft Edge all provide support for WebAssembly in their most recent versions. So you can expect the Web to be immensely more powerful from here on out. Here’s how that change came about.
Like a lot of stories about tech innovation, this one started with video games. Team leader Martin Best of Mozilla reasoned that if we could make games run well on the Web, other computationally intensive applications would soon follow.
So Alon Zakai, the Mozilla engineer who created Emscripten, kept going until it worked. The first time he demonstrated his C++ game running in the browser, it shocked everyone in the room. When it popped up on the screen, all the components were there: graphics, audio, a changing 3D perspective, multiplayer interactions, and a choice of weapons. If there had been any doubts before, his demo put them to rest. In seconds, the other team members went online, clicked into the game, and started shooting rockets at one another in happy amazement.
With Emscripten basically working, Best wanted to see what it could do for commercial software. Could it translate a large, industrial-strength game to run on the Web? Developers of a popular game engine called Unity invited us to work alongside them in an experimental sprint to see how far we could get.
After a few months of programming, I had the optimizations for asm.js, the highly optimizable subset of JavaScript that Emscripten emits, working in Firefox, and we were ready to try it out, this time on another equally beefy game platform, Epic’s Unreal Engine. It took just a few days to get the entire multimillion-line code base running inside the Firefox browser; most of the required work involved things like audio and graphics, which a browser supports differently than a computer’s native operating system does.
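To give a flavor of the asm.js style, here is a minimal hand-written sketch for illustration (real Emscripten output is machine-generated and vastly larger). The "use asm" directive signals that the module sticks to a statically typed subset of JavaScript, which an asm.js-aware engine can compile ahead of time to fast native code:

```javascript
// A minimal module in the asm.js style. The "| 0" annotations mark
// values as 32-bit integers, so an asm.js-aware engine can compile
// the whole module ahead of time. In any other engine, the same
// code simply runs as ordinary JavaScript.
function AsmModule(stdlib, foreign, heap) {
  "use asm";
  function add(x, y) {
    x = x | 0;          // parameter type annotation: int
    y = y | 0;
    return (x + y) | 0; // return type annotation: int
  }
  return { add: add };
}

var mod = AsmModule(globalThis, {}, new ArrayBuffer(0x10000));
console.log(mod.add(2, 40)); // 42
```

Because asm.js is a strict subset of JavaScript, this fallback behavior came for free: browsers without the optimizations still ran the code, just more slowly.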
Our results were better than expected. The animation was buttery smooth right from the start. When we saw that, there were cheers and high fives all around. It was as though the final puzzle piece to years of work had fallen into place.
Epic’s CEO, Tim Sweeney, commented that he’d known it’d be possible to run such games in a browser, but he didn’t think it would come so soon. A few weeks later, we demonstrated some Unreal Engine games at the 2013 Game Developers Conference, showing for the first time a major game engine running well in a browser without the aid of plug-ins.
But we didn’t want asm.js or the optimizations used to run it to be exclusively for Firefox users. So we published the full specification of asm.js and actively encouraged other browser makers to add the same optimizations we were using. From 2013 to 2015, all four of the main browser engines improved performance dramatically in the way they ran asm.js, with the Microsoft Edge browser importing parts of Firefox’s open-source asm.js-optimization code.
Soon games translated by Emscripten into asm.js started appearing online. Popular Facebook games like Papa Pear Saga, Top Eleven, Cloud Raiders, and Candy Crush Saga use asm.js under the hood. And as predicted, once games demonstrated what was possible, many other uses followed.
Facebook, for example, now uses asm.js to compress users’ images in the browser before upload, saving bandwidth. Adobe compiled a core image-editing library written in C++ to asm.js for the Web version of Lightroom. And Wikipedia uses asm.js to play video formats when the browser doesn’t provide built-in support. Other Web uses now include computer vision, 3D mapping, user-interface design, language detection, audio mixing, digital signal processing, medical imaging, physical simulation, encryption, compression, and computer algebra.
These developments were wonderful, but it eventually became clear that to move forward, we had to create a new Web standard that would be more efficient to load than asm.js code. In particular, the new standard would replace spelled-out names for variables and instructions with a much more compact representation: binary numbers. It would also let us address some smaller issues that users had found with asm.js: they wanted features like 64-bit integer arithmetic and the ability to break an application into smaller chunks that could be downloaded and optimized separately. Thus, the idea for WebAssembly was born.
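The compactness of the binary representation shows up right at the start of a module. The entire preamble of a WebAssembly file is just eight bytes, a magic number spelling "\0asm" plus a version, and the browser’s built-in WebAssembly API will confirm that those eight bytes alone already form a valid (if empty) module. A sketch, assuming Node.js or any modern browser:

```javascript
// The smallest valid WebAssembly module: a 4-byte magic number
// ("\0asm") followed by a 4-byte little-endian version field.
const emptyModule = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, // "\0asm" magic
  0x01, 0x00, 0x00, 0x00, // version 1
]);

console.log(WebAssembly.validate(emptyModule)); // true
```

Everything after the preamble is a sequence of equally terse numbered sections, in place of the long textual names asm.js had to spell out.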
At the outset of Mozilla’s games program, there were competing proposals for standards of this sort, but none were quite right. You might think that one more proposal added to the fray would also go nowhere. But with asm.js already being widely used, we had the unique opportunity to capture that momentum and channel it into WebAssembly. If some browsers took a while to support WebAssembly (“wasm”) code, it would be easy for developers to produce both asm.js and wasm, just by flipping a switch in Emscripten. So they could always use asm.js, which ran everywhere, and send wasm only to browsers that could run it.
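That switch between the two outputs can be sketched with a few lines of feature detection in the loading code (the artifact file names here are hypothetical):

```javascript
// Detect whether this browser (or runtime) can run WebAssembly;
// if not, fall back to the asm.js build, which runs everywhere
// because it is plain JavaScript.
function supportsWasm() {
  return typeof WebAssembly === "object" &&
         typeof WebAssembly.instantiate === "function";
}

// Hypothetical pair of build artifacts produced by Emscripten.
const artifact = supportsWasm() ? "app.wasm" : "app.asm.js";
console.log(artifact);
```

This is why shipping both formats was so cheap for developers: one compile flag, one small runtime check.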
The next step was to convince the other browser makers that WebAssembly was a good idea. In some cases, this was surprisingly easy, because engineers at these companies had already been contemplating similar ideas themselves. Other cases involved long discussions over beers, flights to meet teams and convince managers, and a very kumbaya sit-down at San Francisco’s Yerba Buena Gardens during a Game Developers Conference. By 2015, everyone was finally on board and ready to officially embark on creating a new Web standard.
That process began with the creation of a World Wide Web Consortium (W3C) Community Group made up of engineers from the four major browser companies and other interested parties. We agreed that it would be unwise to try to solve every problem at once, because it would then take five years or more just to produce the specification. So from the outset our group adopted the goal of specifying and shipping what startup mavens call a “minimum viable product,” or MVP. We could then iteratively improve it based on feedback.
This brings us to the present. Browser companies have agreed on an initial MVP version of WebAssembly and have released compatible implementations. Emscripten can take code written in C++ and convert it directly into WebAssembly. In time there will be ways to run other languages as well, including Rust, Lua, Python, Java, and C#. With WebAssembly, multimillion-line code bases can now load in a few seconds and then run at 80 percent of the speed of native programs. And both load time and execution speed are expected to improve as the browser engines that run the code are made better.
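In practice Emscripten produces both the .wasm file and the JavaScript glue that loads it, but the underlying browser API is small enough to show directly. As a sketch, assuming any runtime with the WebAssembly API (a current browser or Node.js), the byte array below hand-encodes a complete module that exports a single add function:

```javascript
// A hand-encoded WebAssembly module exporting one function,
// equivalent to the text form:
//   (func (export "add") (param i32 i32) (result i32)
//     local.get 0  local.get 1  i32.add)
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // "\0asm", version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32,i32)->i32
  0x03, 0x02, 0x01, 0x00,                               // one function of that type
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export it as "add"
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section header
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0/1, i32.add, end
]);

// Compile and instantiate synchronously (fine for tiny modules;
// real applications use the asynchronous WebAssembly.instantiate
// so large downloads don't block the page).
const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(instance.exports.add(2, 40)); // 42
```

The exported function is callable from ordinary JavaScript, which is how Emscripten-compiled code plugs into the rest of a Web page.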
The WebAssembly Community Group has big plans. We are currently working to add features that can exploit the parallel processing that is possible with multicore CPUs. And we want to provide first-class tooling and performance for many languages, thereby giving developers the same freedom they have when they are writing code for native platforms.
Looking back to the original dream of allowing the Web to run all manner of programs just as well as if they had been installed locally, my colleagues and I can see there is still a lot of work left to do. But with WebAssembly, we’re happy to be one giant step closer to that goal.
This article appears in the December 2017 print issue as “Turbocharging the Web.”
About the Author
Luke Wagner is a research engineer at Mozilla.