The 2015 Top Ten Programming Languages

New languages enter the scene, and big data makes its mark


What are the most popular programming languages? The only honest answer: It depends. Are you trying to land a job at a hot mobile app startup, model electricity flows across a continent, or create an electronic art project? Languages are tools, and what’s a “must have” in one domain can be a “whatever” in another. So for the second year in a row, IEEE Spectrum has teamed up with computational journalist Nick Diakopoulos to give you a popularity ranking that you can adjust to meet your own needs.

Our ranking system is driven by weighting and combining 12 metrics from 10 data sources. We believe these sources—such as the IEEE Xplore digital library, GitHub, and CareerBuilder—are good proxies for the popularity of 48 languages along a number of different dimensions. The weighting of these sources can be adjusted in our interactive Web app to give, say, more importance to languages that have turned up in job ads. Filters can be applied so that you can see only languages relevant to mobile or embedded development, for example. (Access to the Web app is US $0.99.)
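The core mechanism described here, a weighted combination of normalized metrics, can be sketched in a few lines of Python. This is a hypothetical illustration of the general approach, not Spectrum's actual code: the metric names, weights, and figures below are invented for the example.

```python
def rank_languages(metrics, weights):
    """Rank languages by a weighted sum of normalized metrics.

    metrics: {language: {metric_name: raw_value}}
    weights: {metric_name: weight}
    Returns language names sorted from highest to lowest score.
    """
    names = list(weights)
    # Normalize each metric to [0, 1] so sources on very different
    # scales (search hits, job ads, repo counts) are comparable.
    maxima = {m: max(vals.get(m, 0) for vals in metrics.values()) or 1
              for m in names}
    scores = {
        lang: sum(weights[m] * vals.get(m, 0) / maxima[m] for m in names)
        for lang, vals in metrics.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

# Illustrative toy data, not real figures:
metrics = {
    "Java":   {"job_ads": 100, "github_repos": 90},
    "C":      {"job_ads": 95,  "github_repos": 80},
    "Python": {"job_ads": 90,  "github_repos": 100},
}
weights = {"job_ads": 1.0, "github_repos": 0.5}
print(rank_languages(metrics, weights))  # ['Java', 'Python', 'C']
```

Adjusting the weights, as the interactive app allows, simply changes the coefficients in the sum: raising the `job_ads` weight, say, promotes languages that turn up often in hiring data.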

We put a number of preset weightings into the app for convenience. The default is the IEEE Spectrum ranking, with weights chosen to broadly represent the interests of IEEE members; here are this year’s top 10 languages from that weighting. (The column on the left is the 2015 ranking; the column on the right is the 2014 ranking for comparison.)

The big five—Java, C, C++, Python, and C#—remain on top, their order undisturbed, but C has edged to within a whisker of knocking Java off the top spot. The big mover is R, a statistical computing language that’s handy for analyzing and visualizing big data, which comes in at sixth place, up from ninth last year. Its rise reflects the growing importance of big data to a number of fields. There is also significant movement further down the rankings, as languages like Go, Perl, and even Assembly jockey for position.

A few languages have dropped out of the rankings since last year. Mostly this is because they had an insufficient presence in this year’s data to justify keeping them in. But in one case an entry was dropped because we agreed with comments on last year’s ranking that we had made a mistake in categorizing it as a language rather than just a framework. This was ASP.NET. We had originally included it because of our pragmatic approach to the definition of a programming language—a lack of Turing completeness is not an absolute bar, and we make no apologies for including things like HTML—but we were too broad on that one.

A number of languages have entered the rankings for the first time. Swift, Apple’s new language, has already gained enough traction to make a strong appearance despite being released only 13 months ago. Cuda is another interesting entry—it’s a language created by graphics chip company Nvidia that’s designed for general-purpose computing using the company’s powerful but specialized graphics processors, which can be found in many desktop and mobile devices. Seven languages in all are appearing for the first time.
