I've spent the last five years studying a wide array of programming languages and paradigms, and here are the big trends that I think will matter most in the near future.
Trend #1: Processors Are Getting More Numerous But Not Faster
Anybody who witnessed the quantum leap in software since the Apple II knows how much things have improved over the 30 years since personal computers were first released. Just take a look at these Wikipedia screenshots of Microsoft Flight Simulator's evolution:
Notice anything? Yeah. Me too! And Microsoft is planning a new version to be released in 2012 that looks even better.
It goes without saying that these improvements required years of programming expertise, but they also relied on explosions in processor speeds. Every 18-24 months, processor speeds would double and open up whole new possibilities that developers could--and did--take advantage of for their next release. This much is obvious.
What isn't so obvious is that this doubling in speed always happened on a single processor. This is incredibly important. It meant that programmers could steadily refine their existing programs without having to rewrite them from scratch and without having to learn a new style of programming or a new programming language. If a program ran today, programmers could assume that in 18-24 months it would run twice as fast without requiring any changes. This is the free lunch.
As Herb Sutter writes in his must-read article, The Free Lunch Is Over, individual processors have pretty much reached their top speeds. And how has Intel responded? Well, if you can't make one chip faster, just give it more than one processor core. That is parallelism, not raw speed. Enter the dual-core, quad-core, and multicore machines. But there's a catch. You can no longer run your old program on a multicore machine and have it transparently run faster. That old program was written for a single processor, in a language and programming paradigm geared for a single processor.
Multicore computing is a game changer. It means that for the first time in the history of personal computing, programmers are going to be required to write programs that spread the work across multiple processors. That is going to require new programming paradigms and programming languages (or at least updates to current languages). To see why, go check out this video: Software for a Concurrent World.
Whatever language you choose, it seems to me that having a language designed to take advantage of multiple cores is going to be a crucial requirement for the future; otherwise, you will be stuck with a program that cannot use the full capacity of your machine. Who wants that?
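To make that concrete, here is a minimal sketch in C++11 of spreading a single computation across cores with std::async. The function name parallel_sum and the ten-million-element vector are illustrative choices of mine, not anything from a real codebase:

```cpp
#include <future>
#include <iostream>
#include <numeric>
#include <vector>

// Sum a large vector by splitting the work in two: one half runs on a
// separate thread via std::async, the other on the calling thread.
long long parallel_sum(const std::vector<int>& data) {
    auto mid = data.begin() + data.size() / 2;

    // Launch the first half asynchronously; it may land on another core.
    std::future<long long> first = std::async(std::launch::async, [&] {
        return std::accumulate(data.begin(), mid, 0LL);
    });

    // Meanwhile, the calling thread sums the second half.
    long long second = std::accumulate(mid, data.end(), 0LL);
    return first.get() + second;  // combine both partial sums
}

int main() {
    std::vector<int> numbers(10000000, 1);
    std::cout << parallel_sum(numbers) << "\n";  // prints 10000000
}
```

Build it with something like g++ -std=c++11 -pthread and the two halves of the sum can genuinely run side by side.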
At the same time, languages that can deliver code that runs quickly on individual processors will continue to be important. Some problems are inherently sequential: each step depends on the result of the one before it, so the work cannot be effectively spread across cores. The only way to make these applications fast is to write them in languages that produce fast code.
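As a contrast to the sketch above, here is a toy example of that kind of problem--the logistic map, picked purely for illustration. Iteration i cannot start until iteration i-1 has finished, so throwing more cores at it buys nothing; only faster per-iteration code helps.

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

// A recurrence: x[i] depends on x[i-1], so the loop iterations must run
// one after another. Extra cores don't help; fast native code does.
int main() {
    std::vector<double> x(10000000);
    x[0] = 0.5;
    for (std::size_t i = 1; i < x.size(); ++i) {
        x[i] = 3.7 * x[i - 1] * (1.0 - x[i - 1]);
    }
    std::cout << x.back() << "\n";
}
```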
Not many languages meet these dual requirements.
Trend #2: Mobile Computing Rewards Energy Efficiency
It seems like you hear a lot of advocates for various scripting languages like Python, Perl, and Ruby say that efficient execution is no longer that big a deal. So what, they argue, if you use more memory and burn more processor cycles than languages like C++ that compile down to machine code? Computers these days are so fast and have so much memory that worrying about efficiency is a waste of time.
However, they are forgetting one important fact: computers have gone mobile. Forget laptops. I'm talking about tablet PCs and smartphones: devices that run almost exclusively off batteries.
Every computation eats up battery life. This means that efficiency equals longer periods between charging your phone. That matters. Even if the execution speed feels the same to the user, a program written in Ruby is probably going to use more juice than the same program written in Objective-C, and that will get noticed.
Differences in battery life already drive sales. As more people go mobile, battery life is going to matter even more, which means we are back to the good old days where languages that deliver stand-alone, highly efficient executables are going to deliver higher sales. And since broadcast power remains science fiction, it looks like this bottleneck is never going away.
Trend #3: Mobile Bandwidth Is Never Enough
There was this small innovation a few years back called the Internet. You may have heard of it. It has made many forms of communication possible; however, it has also imposed a powerful bottleneck in the form of how much you can download at a time. This isn't that big a deal when you're at home, but I'm not worried about that. Once again, I'm thinking about mobile computing across 3G networks.
I own a smartphone. I subscribe to Verizon's network. Most days, it is only a little painful for surfing the web, which means that it is almost sufficient for most mobile devices now. But I'm looking forward here to a point where people use improved tablet PCs for more than surfing the web. My iPad isn't quite good enough yet for managing spreadsheets or writing reports, much less a novel, but that day is coming.
I remember a time when desktop computers were standard and laptops were weak replacements that couldn't really do much. That wasn't too long ago. Now, laptops are the standard work machine for most people I know, and tablet PCs are the weak platform. Pretty soon, people will want to download more than just websites to them. Actually, they already do. And what are those programs written in? Objective-C. There will be more tablets in the future, a lot of them, and people won't have the bandwidth or patience for huge programs that take forever to download, or pain-in-the-ass programs that require a ton of dependencies to be downloaded just to make them run. Few things are more frustrating than downloading a program only to find that you also have to download some new version of Java or the correct version of Python or some other run-time environment just to get the damn thing to run. A stand-alone binary (a .exe) is faster and simpler any day of the week.
I should mention that tons of programmers are trying to side-step this problem by developing in HTML, CSS, and JavaScript on the front end while pushing the record keeping and deep functionality to the server on the back end using PHP attached to a database programmed with SQL. Take a look at what you have to know to make that work. How many programming languages is that you have to know? Four? Five? In theory, that's possible. In practice, managing all this crap quickly becomes very difficult.
Consider the level of talent it apparently takes just to keep Gmail or Facebook online. Then take an honest look at how static these websites are compared to most of the programs sitting on your computer. The word processor in Google Docs still feels like a toy, and that's basically just a box that accepts text! We won't even discuss what most web-based spreadsheets are like. If this is the best that Google can do, what are we likely to see from everyone else?
Trend #4: We Are Drowning In Platforms
Not too long ago, you could buy a computer program secure in the knowledge that it would run on your computer, provided your machine was new enough. The world was Windows. Sure, there were other operating systems, but only fanatics owned them. Apple! Give me a break. Only artists bought Apple machines in the 90's. Linux. Don't make me laugh. Nobody except comp-sci majors ran Linux on their machines. For that matter, nobody except computer scientists had even heard of Linux back then. I'll say it again, the world was Windows.
In many ways, it's still a Microsoft universe, but you can't bet on that anymore. Developers must cater to Apple products, to Linux-based Android smartphones, and to Windows. In that way, it feels much like the 80's again, when you had to choose which platform to join.
This leaves developers of desktop applications with two possible choices: (1) put all their eggs in one basket and hope that platform will endure or (2) try to develop across all platforms and hope the added complexity doesn't kill you.
Option 1 is scary these days. Windows still looks strong, but more and more I see people who ten years ago would have scoffed at the idea of buying anything but Windows proudly switching to Apple. Let me say that again: proudly switching to Apple. At the same time, Droids seem to be falling out of the sky. I see more Droids than iPhones. The tablet PC is coming in a big way. When it gets here, people who never intended to run Linux are going to run it by default since that's what will be on their machine. Of course, Linux is already on lots of the servers that businesses are running. It also provides a nice way to breathe new life into old machines for people who don't feel like buying a new computer just to get the newest eye candy. Remember, we are talking about tablet PCs that are actually productive, as in, businesses will begin issuing them to employees the way they issue PCs now.
For the first time in a long time, Option 2 is starting to make some sense. As such, programming languages that ease cross-platform development are going to become increasingly valuable. So, what languages are used for serious development now?
- Windows: the main language seems to be C#. The runner-up is C++.
- Apple: the main language is Objective-C. The runner-up is C++.
- Linux: the main language is C. The runner-up is C++.
- Android: the main language is Java. The runner-up is C or C++.
(Note: I define "serious development" as rock-solid, enterprise-quality software.)
There are of course other languages that get used for lesser projects. Python, Perl, and Ruby come up a lot for scripting purposes. I suppose Visual Basic is still used too, though that seems to have fallen off the radar compared to the 90's. Lua is hot for game scripting, PHP for server programming, JavaScript for browser development, and Java for business applications--the kind of stuff that used to be done in COBOL--but I'm interested in personal computer productivity.
None of this is without exception; however, for getting real work done--for projects where performance is an issue--the bulleted list above is pretty definitive. Does it suggest anything to you? It does to me. No language is number 1 across the board, but there is a language that is arguably number 2 from platform to platform.
CONCLUSION
The trends of this essay suggest that the ideal programming language of the near future is going to have the following characteristics:
- Good for parallel, multicore computing
- Good at producing fast code for when the work can't be parallelized
- Good at producing efficient code for saving on battery life
- Good at producing small, stand-alone code for saving on bandwidth
- Good at producing cross-platform code
So who are the contenders?
Haskell--a functional programming language--comes to mind, but programming in Haskell is just too different. I don't know of a single business program or game in mainstream usage written in Haskell. While we're at it, let's just go ahead and lump Scheme, Forth, and Common Lisp into this category as well.
Google has released Go--a procedural programming language that received some buzz when it was first released. Its developers designed it for the multicore world, so that's a big plus. On the other hand, it's brand new, and it has no classes or inheritance--just structs, methods, and interfaces. In an object-oriented world, that seems like a strange and significant omission for a non-functional programming language.
Java has a good chance too, but Java seems to be settling into the role of 21st-century COBOL. It is the main language for the Android operating system. That will really matter--especially at first--as non-Apple tablets come online, but that "non-Apple" qualifier is the 800-pound gorilla.
Ada presents an intriguing possibility. Ada is a really powerful language that would meet all these requirements brilliantly. Designed by committee for government usage and descended from Pascal--an educational language--Ada's problem is that it was never cool. But it deserves to be considered here.
Looking over the list, I can't honestly see a language more competitive than C++. It isn't famous for being easy or small, but it possesses many of the qualities that I think programmers are going to need in the future. It's battle tested, proven, and everywhere.
Strange. When I began this essay, I never expected to wind up here. I just wanted to work out what I thought the big trends of the future would be and which language(s) I thought would meet them. It's weird how things work out.
I guess it's time to give this post a title and publish it.