Trend #1: Processors Are Getting More Numerous But Not Faster
Anybody who witnessed the quantum leap in software since the Apple II knows how much things have improved in the 30 years since personal computers were first released. Just take a look at these Wikipedia screenshots of Flight Simulator's evolution:
Flight Simulator 1 for Apple II (1980)
Flight Simulator X (2006)
Notice anything? Yeah. Me too! And Microsoft is planning a new version to be released in 2012 that looks even better.
It goes without saying that these improvements required years of programming expertise, but they also relied on explosions in processor speeds as well. Every 18-24 months, processor speeds would double and open up whole new possibilities that developers could--and did--take advantage of for their next release. This much is obvious.
What isn't so obvious is that this doubling in speed always happened on a single processor. This is incredibly important. It meant that programmers could just steadily refine their existing programs without having to rewrite them from scratch and without having to learn a new style of programming or a new programming language. If a program ran now, programmers could just assume that in 18-24 months, it would run twice as fast without requiring any changes. This is the free lunch.
As Herb Sutter writes in his must-read article, The Free Lunch Is Over, individual processors have pretty much reached their top speeds. And how has Intel responded? Well, if you can't make one chip faster, just give each computer more than one processor instead. Strictly speaking this is parallel computing, though I'll loosely call it distributed computing here. Enter the dual-, quad-, and multicore processor computers. But there's a catch. You can no longer run your old program on a multicore machine and have it transparently run faster. That old program was written for a single processor, in a language and programming paradigm geared for a single processor.
Distributed computing is a game changer. It means that for the first time in the history of personal computing, programmers are going to be required to write programs that spread the problem across multiple processors. That is going to require new programming paradigms and programming languages (or at least updates to current languages). To see why, go check out this video: Software for a Concurrent World.
Whatever language you choose, it seems to me that having a language designed to take advantage of multiple cores is going to be a crucial requirement for the future; otherwise, you will be stuck with a program that cannot use the full capacity of your machine. Who wants that?
At the same time, languages that can deliver code that runs quickly on individual processors will continue to be important. Some problems are inherently linear: they cannot be effectively distributed. The only way to make these applications fast is to solve them in languages that produce fast code.
Not many languages meet these dual requirements.
Trend #2: Mobile Computing Rewards Energy Efficiency
It seems like you hear a lot of advocates for various scripting languages like Python, Perl, and Ruby say that efficient execution is no longer that big a deal. So what, they argue, if you use more memory and more execution time than languages like C++ that compile down to machine code? Computers these days are so fast and have so much memory that worrying about efficiency is a waste of time.
However, they are forgetting one important fact: computers have gone mobile. Forget laptops. I'm talking about tablet PCs and smartphones: devices that run almost exclusively off batteries.
Every computation eats up battery life. This means that efficiency equals longer periods between charging your phone. That matters. Even if the execution speed feels the same to the user between a program written in Ruby and the same program written in Objective-C, the Ruby program is probably going to use more juice, and that will get noticed.
Differences in battery life already drive sales. As more people go mobile, battery life is increasingly going to matter, which means we are back to the good old days when languages that deliver stand-alone, highly efficient executables are going to deliver higher sales. And since broadcast power remains science fiction, it looks like this bottleneck is never going away.
Trend #3: Mobile Bandwidth Is Never Enough
There was this small innovation a few years back called the Internet. You may have heard of it. It has made many forms of communication possible; however, it has also imposed a powerful bottleneck in the form of how much you can download at a time. This isn't that big a deal when you're at home, but I'm not worried about that. Once again, I'm thinking about mobile computing across 3G networks.
I own a smartphone. I subscribe to Verizon's network. Most days, it is only a little painful for surfing the web, which means that it is almost sufficient for most mobile devices now. But I'm looking ahead here to a point where people use improved tablet PCs for more than surfing the web. My iPad isn't quite good enough yet for managing spreadsheets or writing reports, much less a novel, but that day is coming.
I remember a time when desktop computers were standard and laptops were weak replacements that couldn't really do much. That wasn't too long ago. Now, laptops are the standard work machine for most people I know and tablet PCs are the weak platform. Pretty soon, people will want to download more than just websites to them. Actually, they already do. And what are those programs written in? Objective-C. There will be more tablets in the future, a lot of them, and people won't have the bandwidth or patience for huge programs that take forever to download, or pain-in-the-ass programs that require a ton of dependencies to be downloaded just to make them run. Few things are more frustrating than downloading a program only to find that you also have to download some new version of Java or the correct version of Python or some other run-time environment just to get the damn thing to run. A stand-alone binary (.exe program) is faster and simpler any day of the week.
I should mention that tons of programmers are trying to side-step this problem by developing in HTML, CSS, and JavaScript on the front end while pushing the record keeping and deep functionality to the server on the back end using PHP attached to a database programmed with SQL. Take a look at what you have to know to make that work. How many programming languages is that to know? Four? Five? In theory, that's possible. In practice, managing all this crap quickly becomes very difficult.
Consider the level of talent it apparently takes just to keep Gmail or Facebook online. Then take an honest look at how static these websites are compared to most of the programs sitting on your computer. The word processor in Google Docs still feels like a toy, and that's basically just a box that accepts text! We won't even discuss what most web-based spreadsheets are like. If this is the best that Google can do, what are we likely to see from everyone else?
Trend #4: We Are Drowning In Platforms
Not too long ago, you could buy a computer program secure in the knowledge that it would run on your computer, provided your machine was new enough. The world was Windows. Sure, there were other operating systems, but only fanatics owned them. Apple! Give me a break. Only artists bought Apple machines in the 90's. Linux. Don't make me laugh. Nobody except comp-sci majors ran Linux on their machines. For that matter, nobody except computer scientists had even heard of Linux back then. I'll say it again, the world was Windows.
In many ways, it's still a Microsoft universe, but you can't bet on that anymore. Developers must cater to Apple products, to Linux (including Android smartphones), and to Windows. In that way, it feels much like the 80's again, when you had to choose which platform to join.
This leaves developers of desktop applications with two possible choices: (1) put all their eggs in one basket and hope that platform will endure or (2) try to develop across all platforms and hope the added complexity doesn't kill you.
Option 1 is scary these days. Windows still looks strong, but more and more, I see people who ten years ago would have scoffed at the idea of buying anything but Windows proudly switching to Apple. Let me say that again: proudly switching to Apple. At the same time, Droids seem to be falling out of the sky. I see more Droids than iPhones. The tablet PC is coming in a big way. When it gets here, people who never intended to run Linux are going to run it by default since that's what will be on their machines. Of course, Linux is already on lots of the servers that businesses run. It also provides a nice way to breathe new life into old machines for people who don't feel like buying a new computer just to get the newest eye candy. Remember, we are talking about tablet PCs that are actually productive, as in, businesses will begin issuing them to employees the way they issue PCs now.
For the first time in a long time, Option 2 is starting to make some sense. As such, programming languages that ease cross-platform development are going to become increasingly valuable. So, what languages are used for serious development now?
- Windows: the main language seems to be C#. The runner-up is C++.
- Apple: the main language is Objective-C. The runner-up is C++.
- Linux: the main language is C. The runner-up is C++.
- Android: the main language is Java. The runner-up is C or C++.
(Note: I define "serious development" as rock solid, enterprise quality software)
There are of course other languages that get used for lesser projects. Python, Perl, and Ruby come up a lot for scripting purposes. I suppose Visual Basic is still used too, though that seems to have fallen off the radar compared to the 90's. Lua is hot for game scripting, PHP for server programming, JavaScript for browser development, and Java for business applications--the kind of stuff that used to be done in COBOL, but I'm interested in personal computer productivity.
None of this is without exception; however, for getting real work done--for projects where performance is an issue--the bulleted list above is pretty definitive. Does it suggest anything to you? It does to me. No language is number 1 across the board, but there is a language that is arguably number 2 platform to platform.
CONCLUSION
The trends of this essay suggest that the ideal programming language of the near future is going to have the following characteristics:
- Good for distributed computing
- Good at producing fast code for when distributed computing isn't possible
- Good at producing efficient code for saving on battery life
- Good at producing small, stand-alone code for saving on bandwidth
- Good at producing cross-platform code
So who are the contenders?
Haskell--a functional programming language--comes to mind, but programming in Haskell is just too different. I don't know of a single business program or game in mainstream usage designed in Haskell. While we're at it, let's just go ahead and lump Scheme, Forth, and Common Lisp into this category as well.
Google has released Go--a procedural programming language that received some buzz when it was first released. Its developers designed it for the multicore world, so that's a big plus. On the other hand, it's brand new and has no classes in the traditional sense. In an object-oriented world, that seems like a strange and significant omission for a non-functional programming language.
Java has a good chance too, but Java seems to be settling into the role of 21st century COBOL. It is the main language for the Android operating system. That will really matter--especially at first--as non-Apple tablets come online, but that "non-Apple" qualifier is the 800-pound gorilla.
Ada presents an intriguing possibility. Ada is a really powerful language that would meet all these requirements brilliantly. Designed by committee for government usage and descended from Pascal--an educational language--Ada's problem is that it was never cool. But it deserves to be considered here.
Looking over the list, I can't honestly see a language more competitive than C++. It isn't famous for being easy or small, but it possesses many of the qualities that I think programmers are going to need in the future. It's battle tested, proven, and everywhere.
Strange. When I began this essay, I never expected to wind up here. I just wanted to work out what I thought the big trends of the future would be and which language(s) I thought would meet them. It's weird how things work out.
I guess it's time to give this post a title and publish it.
Say "sea change" not "quantum leap". While a quantum leap is a sudden jump from one state to another, it is also the smallest change possible. When you want to say "massive change" do it another way.
Nah, Wikipedia quite clearly states that "Quantum Leap" is a sub-par sci-fi TV series.
Go does not lack objects; it has types + methods + interfaces rather than classes.
ReplyDeletehttp://golang.org/doc/go_faq.html#Is_Go_an_object-oriented_language
@Anon
lol
http://wordnetweb.princeton.edu/perl/webwn?s=quantum%20leap
Mostly agreed, but I'd add the minor nit that Go has a really nice object system -- it's a little unusual in that it's based on interfaces and structural types, but it's really powerful and easy to use. It's also great for taking advantage of multiple cores.
That said, it's still nearly impossible to ship Go apps on iOS because of the way Apple's tool chain works. Hopefully this will change at some point, but it may be a while.
Interesting article, though the title only has to do with basically the last paragraph :-) C++11 might be a language ready for the future, though with the increasing sophistication of JITs I wouldn't be surprised if JavaScript was actually the language of the future -- it's only 2 or 3 times slower than C now.
I've got a couple of minor issues with this article:
1. Processors are still gaining single-threaded performance, just not as fast as they used to. Especially in the mobile world, single-threaded performance is increasing at an incredible pace.
2. Go is object-oriented; it just has a different system than the one you'd see in C++ or Java.
The most highly optimized JS thing is probably 2 or 3 times slower than an average C++ program.
The most highly optimized C++ program is probably an order of magnitude faster than that...
I strongly agree with your main point: C++ is going to be around for a long time and has an important place when building high performance professional applications, especially on mobile platforms.
You neglect to mention two other important cross-platform languages: C and JavaScript. C is still an important language for writing high performance, cross-platform code and is still more widely used across all types of development and sizes of systems (kernel, services, applications, embedded systems, mobile devices, etc.). Some cross-platform development, typically games, targets OpenGL, which is a C API.
JavaScript, as part of an embedded web app, is another important tool for many cross-platform apps these days. Frameworks like PhoneGap allow JavaScript developers to call underlying native platform functions or their own low level components (possibly written in C++). Mobile and desktop systems all support embedding a browser component in your app, which is often the fastest way to develop a cross-platform UI.
Of course, if your app's UI needs exceed what the browser can do, JS isn't a good option, though many apps have areas like settings pages that generally fit the browser paradigm. Unfortunately, the reality is that to build a responsive custom UI for a cross-platform app, you will need to write a large amount of UI code using the platform's native language (Java for Android, Objective-C for iOS, etc). Cross-platform development always ends up being a giant mix of various languages and technologies. While C++ is often an important tool, it can't be the only tool in your cross-platform toolbox.
You're forgetting D (www.dlang.org)! Its performance is similar to C/C++ yet its developer productivity is 1-2 orders of magnitude higher IMO. It has OO, functional, and meta features (templates, mixins, CTFE), robust standard libs and can easily interface with C. Its multithreading support (thread-local storage + message passing by default) is vastly superior to the bug-prone C/C++ mutex approach. It can be picked up by old Java/C/C++ hands in very little time. All it needs is a little more momentum and toolchain (esp. ARM compiler support and better GUI libs).
Scala?
ReplyDeleteC++ just got some language support for multithreading. Also it's a real pain to decide who should relese the object which is used by more threads at the same time...that was one of the main reason google decided to use GC for Go.
Not mentioning that the final outcome product of a c++ project is the quality of the least talented coder, while in java/c# world that's not the case, you cannot mess really up the whole application like you can with c++ with some nicely (mis-)implemented constructors.
Glad to see D mentioned -- I'd love to see that language gain greater traction. A major point is that D *compiles* faster than C++ (and the difference is MORE pronounced the BIGGER the project), which really helps developer productivity. C++ has lots of legacy issues (.h/.cpp file separation & include order issues, clunky template syntax, too much preprocessor magic, compiler can't optimize string operations), and D is basically there to ask the question "what would happen if we rebuilt C++ today with those issues in mind?"
ReplyDeleteI don't get it. You start by saying
"Distributed computing is a game changer. It means that for the first time in the history of personal computing, programmers are going to be required to write programs that spread the problem across multiple processors. That is going to require new programming paradigms and programming languages (or at least updates to current languages)"
and then you dismiss functional languages like Haskell, Scheme, Lisp, etc because they are "too different"?
Why don't you ask the question the other way around? Name a single business application or consumer software application written in C++ that can flexibly and seamlessly scale to ANY number of processor cores now and in the future?
What makes you think parallel programming wouldn't require adoption of a new language (and a new mind set) which is different from the ones we've used to date?
What business application written in ANY LANGUAGE can seamlessly scale to ANY number of cores?
Fair enough, what application written in any language can scale effectively to somewhere in the range of 16-32 cores?
DeletePer Amdahl's law, that application would need to be somewhere in the range 95-97% parallel and the rest permitted to be serial. Consumer applications simply don't have that kind of profile. Scientific applications and simulations, perhaps, but the HPC community is specialized for this, and not the same sort of developer that writes consumer or business applications.
It's strange that you cite an Erlang video, then don't look at how scaling to 32 cores works really well. 97% parallel is easy in Erlang.
After I finished reading the article, the language left in my head was Erlang.
Erlang ships rock solid business apps. Learning functional programming was one of the best things I've done for myself, along with using functional practices in imperative code.
It's refreshing to see Ada considered. I learned Ada in 1997 when I did a co-op at an avionics company and it quickly became one of my favorite languages to develop in. It's a shame people dismiss it so easily.
ReplyDeleteClay -
I don't have the depth of expertise in any of this to make an intelligent comment.
I will say, however, that I still own my Atari 7800, which I wouldn't trade for any smartphone!
Nathan Z.
ReplyDeleteWe see a trend from statically compiled languages to dynamic scripting languages for a reason.
Just to make the long story short, the upcoming years are still going to be C-style imperative programming with some kind of support for namespaces (that's what people usually call object oriented). The exact flavor depends on libraries and industry best practice hype. Was it any different 10 years ago? :-)
ReplyDeleteI dont think you really make the case here. You say that programmers are going to have to program for distributed processors to get the most out of of their system, but that isn't true. There problems that are better solved by chuncking up and distributing the problem and others that are not. Do you really think that a Clock or Calculator application needs to be coded for multi core systems? If so I dont think you get it. The majority applications do not need this. Man I really can't wait for the next version of X-eyes to come out so it can really take advantage of the dual quad core system I have here. Man this Power Point deck would be so much cooler if it was using all of my cpus.
Programs that do not need multicore functionality will still run faster when there are more cores, simply because the underlying platform manages which core a process runs on. This distributes the load across the cores, regardless of whether all applications are written to take advantage of all of them.
I'm not sure what will be "language of the future", but I think D should *really* be one of the (serious!) contenders.
In my (simplistic and math-notation-loving) view:
D = (pragmatism+real world experience) ^ ( ([C++] - [annoying stuff] + [features from other languages-incl.C#,Java,Python]) * [optimisation down to ASM inclusion] * [language designed with compilation speed in mind] )
Don't trust my word - trust these guys (who have a lot more knowledge and experience than I ever will, and also names):
Walter Bright - develops C++ compilers for quite some time
(original designer of D)
Andrei Alexandrescu - knows C++, writes books, has written (at least one) book about C++
(supporter and contributor to D)
(both of them have wikipedia pages:))
The only minus of this language (and actually a very serious one) is that people are not using it; it's not that popular (yet!?)
(this means fewer tools,fewer ports,fewer testers.. you get the idea)
I will leave this link here, in case someone would want to read : dlang.org/
> (Note: I define "serious development" as rock solid, enterprise quality software)
I always laugh at such assumptions, that development with statically typed languages is "serious" and that higher-level ones are adequate only for "lesser" projects.
I've seen lots of bugs being introduced because some developer decided to use the "macho" language instead of one that adequately represented and expressed what was actually being done. I can only imagine the horrors this mindset brings.
I agree with Ricardo. I'm always curious as to how people define "rock solid, enterprise quality software".
Lower-level, performance-oriented, and systems languages solve different types of problems than higher-level languages, but that has nothing to do with them being of quality or being "rock solid".
You forgot D, you insensitive clod!
Along with Erlang
Given that neither language runs well on mobile platforms, I don't see how they can be contenders here.
DeleteGo does not 'lack objects', it's perfectly good at OOP!
Clojure and Erlang are way more likely contenders.
ReplyDeleteRunning code in the browser is going to be key in the future, especially since there is nothing that can beat the web as deployment infrastructure. This means Javascript (with html and css) will be one of the big players. The argument about having to know many languages was reduced to nothing once node.js entered the scene and enabled Javascript to be run on the server.
Don't forget that there is always a trade off. Nothing is all good, and nothing is all bad.
I was going to say, before Rob Levy beat me to it, that Clojure, or something similar, seems more likely to be the language of the future, at least as far as multi-core programming goes.
With default immutability and persistent data structures, Clojure just sidesteps many of the cache/memory coherence problems endemic in multi-core programming. C++ just makes it possible, but not easy.
"Every 18-24 months, processor speeds would double "
No, no, and no. Every 24 months, the number of transistors that can be put in a single integrated circuit doubles.
The number of transistors != the speed of the CPU.
Good at producing good, robust, error-free, predictable, self-evident code?
I don't know. And in embedded software you need that.
ReplyDeleteMostly agreed. Not going to add much more to this, except that my bet is also on C++. Thanks by the way, its always a pleasure to read a well argumented essay.
I'm with C++ as well... but the language itself should start to deprecate the most dangerous, yet still possible, code and coding styles, to steer programmers toward a more polished, rational, and safe C++11 style.
If experience shows us that writing something a certain way is an error, and in C++ there are various examples, we should be brave and impose some progress.
C++ is (potentially) becoming better, but it is still growing and its complexity keeps increasing!
Agreed. And most of the time, complex for nothing.