Top Programming Languages 2017: Focus on Jobs

We analyze the languages that are in demand by employers

3 min read
Illustration: Alamy

While the default IEEE Spectrum ranking in the Top Programming Languages interactive gives a good aggregate signal of language popularity, here we take a deep dive into the metrics related to job demand. Two of our data sources, Dice and CareerBuilder, measure job openings for the languages included in the interactive, and consequently we have a preset for “Jobs” that weights the rankings heavily toward those metrics. So, if you want to build up your tech chops before looking for a programming job, which languages should you focus on?
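
To make the idea of such a preset concrete, here is a minimal Python sketch of how job-board counts might be normalized and combined into a single weighted score. The weights, sample figures, and function names are illustrative assumptions, not the actual methodology behind the Spectrum rankings.

```python
# A minimal sketch of a "Jobs"-style preset: normalize each metric, then
# weight the two job-board metrics heavily. All figures and weights below
# are hypothetical, not the values used by the IEEE Spectrum interactive.

dice_openings = {"C": 12000, "Python": 3000, "SQL": 9000}
careerbuilder_openings = {"C": 7300, "Python": 2400, "SQL": 6400}
other_signals = {"C": 0.7, "Python": 1.0, "SQL": 0.5}  # e.g. normalized search or GitHub activity

def normalize(scores):
    """Scale a metric so its largest value maps to 1.0."""
    top = max(scores.values())
    return {lang: value / top for lang, value in scores.items()}

def jobs_preset(weights=(0.45, 0.45, 0.10)):
    """Combine the metrics, weighting the two job-board counts heavily."""
    dice = normalize(dice_openings)
    cb = normalize(careerbuilder_openings)
    return {
        lang: weights[0] * dice[lang] + weights[1] * cb[lang] + weights[2] * other_signals[lang]
        for lang in dice
    }

for lang, score in sorted(jobs_preset().items(), key=lambda kv: -kv[1]):
    print(f"{lang}: {score:.2f}")
```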

Although Python has moved to the top of the default Spectrum ranking, if we instead go purely by the volume of openings that mention a language, we find that C beats Python by a ratio of 3.5 to 1, or about 19,300 job openings versus 5,400 across Dice and CareerBuilder combined. The Swiss army knife of database work, SQL, is also in high demand (15,400 openings), as are Web technologies such as JavaScript (9,300 openings), HTML (7,000 openings), ASP.NET (2,700 openings), and PHP (2,400 openings). R, the increasingly indispensable language for data scientists, peaked in 2016 and contracted slightly in 2017, shedding about 12 percent to land at 353 openings. (As a caveat, these numbers reflect job openings as of mid-June 2017, indexed over a 30-day window.)

In the mobile coding space, one of the clearest trends in the default ranking is the rise of Swift at the expense of Objective-C. We can see this in the jobs numbers too: openings for Objective-C shrank by slightly more than 19 percent, whereas those for Swift rose by almost 19 percent. And for the first time, there were more openings for Swift (439 openings) than for Objective-C (394 openings).

There are some significant changes further down the Jobs ranking that are worth keeping an eye on too, even though the absolute number of job advertisements that cite these languages is still relatively small. Both CUDA, a general-purpose language for programming GPUs, and Rust, which would feel familiar to most C/C++ programmers, fall into this category. Since 2016, CUDA has moved up four spots to 23rd in the Jobs ranking, while Rust has moved up a solid 10 positions to 25th. Still, the absolute number of job openings was just 40 for CUDA and 23 for Rust. These are still niche languages, but Rust is growing quickly, going from being used in 10,900 new GitHub repositories in 2016 to almost 17,100 in 2017.

This article wouldn’t be complete without mentioning some of the losers in the programming-language jobs calculus. Once a dominant Web programming language, Ruby is still widely used, but it is slipping; the number of job openings for Ruby has shrunk by a full third since 2016, to about 1,600. We aren’t the first to report the slump in Ruby’s popularity, and it’s far from dead, but this will be one to watch, as coders may already be shifting to alternatives like Python and Go. Demand for other languages, such as Clojure, Haskell, and Visual Basic, is also on the wane. When we started the rankings in 2014, ActionScript still clocked 87 job openings; in 2017 it continued its downward spiral, with only 20 openings, and it’s unlikely to make the Top Programming Languages at all next year. RIP, ActionScript.
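
The year-over-year swings cited above are easy to check against the article’s own figures; the short Python sketch below does so. The helper function is mine, and the 2016 Ruby figure of roughly 2,400 openings is inferred from the one-third drop to about 1,600, not stated directly in the article.

```python
# Verifying the year-over-year changes quoted above, using the article's figures.

def percent_change(old, new):
    """Year-over-year change from old to new, in percent."""
    return (new - old) / old * 100

# Rust: new GitHub repositories, 2016 -> 2017 (figures from the article)
print(f"Rust repos: {percent_change(10_900, 17_100):+.0f}%")    # about +57%

# Ruby: openings fell to about 1,600 in 2017; a one-third drop implies
# a 2016 figure of roughly 2,400 (inferred, not stated in the article)
print(f"Ruby openings: {percent_change(2_400, 1_600):+.0f}%")   # about -33%
```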

About the Author

Nick Diakopoulos is a pioneering data journalist and is currently an assistant professor at the College of Journalism at the University of Maryland, College Park. His research interests focus on algorithmic accountability and how newsrooms can use social technologies. Previously, he cofounded a program in computational journalism at the School of Interactive Computing at Georgia Tech.

Biggest Tech Companies Now Building the Biggest Data Pipes

Facebook will lay a record-capacity submarine cable across the Atlantic

4 min read

Google's Grace Hopper subsea cable landing in the seaside town of Bude in England. Photo: Google

Old-fashioned telecommunication carriers are falling behind in the global bandwidth race as the giants of content and cloud computing build their own networks. Facebook has commissioned electronics and IT giant NEC Corporation to build the world's highest-capacity submarine cable. When finished, it will carry a staggering 500 terabits per second (the equivalent of some 4,000 Blu-ray discs of data every second) between North America and Europe on the world's busiest data highway.

For decades, transoceanic cables were laid by consortia of telecommunication carriers like AT&T and British Telecom. As cloud computing and data centers spread around the world, Google, Amazon, Facebook, and Microsoft started joining cable consortia, and in the past few years Google began building its own cables. The new cable will give Facebook sole ownership of the world's biggest data pipeline.

Transoceanic fiber-optic cables are the backbones of the global telecommunications network, and their change in ownership reflects the rapid growth of data centers for cloud computing and content distribution. Google has 23 giant data centers around the globe, each one constantly updated to mirror the Google cloud for users in their region. Three years ago, flows between data centers accounted for 77 percent of transatlantic traffic and 60 percent of transpacific traffic, Alan Mauldin, research director at TeleGeography, a market-research unit of California-based PriMetrica, said at the time. Traffic between data centers is thought to be growing faster than the per-person data consumption, which Facebook says increases 20 to 30 percent a year.
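
For a sense of how quickly that kind of growth compounds, the snippet below converts the 20 to 30 percent annual rates Facebook cites into rough doubling times; the calculation is mine, not the article's.

```python
import math

# Doubling time implied by 20-30 percent annual growth in data consumption.
for annual_growth in (0.20, 0.25, 0.30):
    doubling_years = math.log(2) / math.log(1 + annual_growth)
    print(f"{annual_growth:.0%} per year -> doubles roughly every {doubling_years:.1f} years")
```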

Vying for maximum bandwidth at the intersection of Moore's Law and Shannon's limit

Fiber-optic developers have worked relentlessly to keep up with the demand for bandwidth. For decades, the data capacity of a single fiber increased at a faster rate than the number of transistors squeezed onto a chip, the quantity tracked by Moore's Law. But in recent years that growth has slowed as data rates approached Shannon's limit, the point at which noise in the transmission system overwhelms the signal. In 2016 the maximum data rate per fiber pair (each fiber carrying a signal in one direction) was around 10 terabits per second, achieved by sending signals at 100 gigabits per second on 100 separate wavelengths through the same fiber.
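
The arithmetic behind that 10-terabit figure, and the Shannon formula that caps each channel, can be sketched as follows. The wavelength count and per-channel rate come from the paragraph above; the channel bandwidth and signal-to-noise ratio in the Shannon calculation are made-up round numbers, used only to show the shape of the formula.

```python
import math

# (1) Wavelength-division multiplexing: aggregate capacity of one fiber,
# using the 2016 figures cited above.
wavelengths = 100
gbps_per_wavelength = 100
print(f"Aggregate per fiber: {wavelengths * gbps_per_wavelength / 1000:.0f} Tb/s")  # 10 Tb/s

# (2) Shannon's limit for a single channel: C = B * log2(1 + SNR).
# The bandwidth and SNR below are hypothetical illustration values.
bandwidth_ghz = 50           # channel bandwidth, in GHz
snr = 100                    # linear signal-to-noise ratio (20 dB)
capacity_gbps = bandwidth_ghz * math.log2(1 + snr)
print(f"Shannon limit for this channel: about {capacity_gbps:.0f} Gb/s")
```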

Developing more sophisticated signal formats offered some improvement, but not enough to keep pace with the demand for bandwidth. The only way around Shannon's limit has been to open new paths for data delivery.

In 2018, Facebook and Google placed bets on broadening the transmission band of optical fibers by adding signals at a hundred new wavelengths, squeezing 24 terabits per second through a single fiber. Each bought one pair of fibers on the Pacific Light Cable stretching from Hong Kong to Los Angeles. The leader of the consortium, Pacific Light Data Communications of Hong Kong, retained the other four pairs in the six-pair cable. Although the cable was soon laid, the U.S. Federal Communications Commission has refused to license its connection to the U.S. network because of security concerns arising from its Chinese ties.

Study: Recycled Lithium Batteries as Good as Newly Mined

Cathodes made with a novel direct-recycling process beat commercial materials

3 min read
iStockphoto

Lithium-ion batteries, with their reliance on riskily mined metals, tarnish the green image of EVs. Recycling to recover those valuable metals would minimize the social and environmental impact of mining, keep millions of tons of batteries out of landfills, and cut the energy use and emissions that come from making batteries.

But while the EV battery recycling industry is starting to take off, getting carmakers to use recycled materials remains a hard sell. "In general, people's impression is that recycled material is not as good as virgin material," says Yan Wang, a professor of mechanical engineering at Worcester Polytechnic Institute. "Battery companies still hesitate to use recycled material in their batteries."
