Saturday, September 29, 2012

How To Use DOSBox

Have you ever needed to run an old DOS program--like the brilliant Master of Orion--only to find that it won't work anymore on Windows? (And it definitely won't work on Linux or Apple machines.)

The Best Game Ever...
Playing old games used to be a problem. Now, though, thanks to DOSBox, all those old programs are totally available to you once again.

DOSBox is a program that pretends to be MS-DOS. It's basically like running a little slice of old-school DOS on your brand-new computer.

It's free. It's fast. I love it.

The only problem is that it's a little tricky to get running. (And the website could use a serious facelift.) So, I figured I'd post my three-step cheat sheet online for anyone who wants to get their old programs up and running again like it was still 1993.

Step 1: Download DOSBox. You can get DOSBox from its download site. There are a number of choices, depending on your operating system, but you should know pretty quickly which one is right for you. (Windows is the very first choice.)

Step 2: Install DOSBox. This is easy! On Windows, the download is just a single .exe file which automatically installs DOSBox on your computer. That's it!

Linux users have it even easier. Most major distributions include DOSBox in their package systems. Just go to your package manager and search for dosbox. Linux will do the rest.
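On a Debian- or Ubuntu-style system, for instance, the whole install comes down to one line in a terminal (other distributions have an equivalent command in their own package managers):

    sudo apt-get install dosbox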

If you've done it right, you should see an icon like this one somewhere:
The DOSBox icon

Step 3: Run DOSBox. This is the part that seems to confuse people. Just double-click on the DOSBox icon to get DOSBox running. You should see an old-school command line from the good old days of DOS. It will look like this:


NOW THIS IS THE IMPORTANT PART!!!!
You need to tell DOSBox where your programs are. This is called mounting a drive. The command to mount a drive is as follows:
mount c (followed by the place in your file system where you want DOSBox to look for your programs)

Here, I'll mount my home directory (I'm on Linux -- it's roughly the equivalent of C:\ in Windows):
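At the DOSBox prompt, that comes out as:

    mount c /home

On Windows, it's the same idea -- just point it at whatever folder actually holds your old games (the folder name below is only an example):

    mount c c:\oldgames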


See what I did? I just mounted /home as my C: drive, and DOSBox informed me that I was successful.

The only thing you have to do now is switch from the Z: drive (where DOSBox starts) to your new C: drive and start running your games.

This command is easy. Just type C:
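You'll know it worked because the prompt changes from the Z: drive DOSBox starts on to your newly mounted C: drive. It should look something like this:

    Z:\> c:
    C:\>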



Finally, just type dir to list your directory. To start a game, find the file ending in .exe, type its name, and you're in business. (The command to play Master of Orion is ORION.EXE.)
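Put together, a typical session looks roughly like this (I'm assuming here that the game was unpacked into a folder called ORION -- substitute whatever directory dir actually shows you):

    C:\> dir
    C:\> cd orion
    C:\ORION> orion.exe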


For more help, post comments here, and I'll fill in details. I'll write an advanced DOSBox post next with more details on how to get the most out of it (like taking these cinematic screenshots I'm showing you).

For now, I'm back to conquering the universe...

Snobol4 Major Mode for Emacs

I've been reading up on this older text processing language called Snobol--and specifically Snobol4--tonight and decided to take it for a spin.

(I'll have more to say on programming in Snobol later. For now, let me just say that I think it's refreshingly original.)

However, there is a total lack of support for programming in this language. So, I decided to whip up a quick editing mode for my favorite text editor: emacs.

As usual, emacs-fu came through with flying colors...

I took that script, edited it somewhat and plugged in my own--extremely limited--knowledge of Snobol4 to produce a mode that appears to properly highlight comments, labels, gotos, and keywords. It's still pretty rough, but I figured I'd post it anyway.

Without further ado...

Thursday, September 20, 2012

Lifetime Resolutions

I'm not a big fan of New Year's Resolutions.

They rarely seem to work, though I'm not sure why. Maybe it's because the promises are often made under the influence of alcohol or the stress of the holidays. It could also have something to do with the relatively short time span that a year offers to really do something meaningful in the middle of all the other commitments you've already racked up. Plus, you can always just dodge--resolving to do it next year if you drop the ball this time around.

Maybe people just stink at change.

In any case, I'm not a big fan.

I am, however, a huge believer in something that I'm terming Lifetime Resolutions. Here is my definition:

Lifetime Resolutions are the handful of long-term goals you must accomplish during your lifetime for your life to have been successful in your own eyes.

Now that's realistic. It gives you a long enough time span to really do something, and it establishes the only standard you can truly answer to: your own. Plus, it puts into perspective that if you let this deadline slip, there won't be another one. (At least, not without some form of reincarnation.)

Recent events have gotten me thinking about this more seriously than before.

So I'm building a formal list of Lifetime Resolutions. This is a big deal. Bigger than I expected when I began toying with the idea. After all, if I'm going to dedicate myself to a lifetime of something, I want to be able to say that I thought it through first.

In fact, it turns out that some of my "goals" weren't really even that important to me, while others, which always seemed to be put on the back burner, actually matter a lot when I force myself to take an honest look at them. So, I'm rearranging some of my priorities. I'm making my list, and in the process, I'm learning a few things about what really matters to me.

Try it. You might be surprised by what you discover about yourself. I was.

Wednesday, May 30, 2012

Facebook Doesn't Have Shoppers

Facebook claims that it's worth billions because of all the money it might make selling advertising, but that plan is going to fail for one simple reason: Facebook doesn't know that much about its users' shopping habits because Facebook Doesn't Have Shoppers.

What does Facebook know? It knows my birthday. It knows my friends. It knows my interests if I took the time to tell it my favorite movies and whatnot--which probably means I already own that stuff if I want it.

What it doesn't know is my shopping habits or future shopping needs. For that matter, I barely know this stuff.

Compare Facebook to Google--the big name in Internet Advertising. When I need to buy something, I google for it. At that moment, I am telling Google what I want. That's why Google is so effective at advertising. It doesn't have to guess what I need. It doesn't have to play fortuneteller; Google only has to respond to my explicit pleas for shopping help. Facebook doesn't have this luxury; it doesn't know me as a shopper.

Compare Facebook to Amazon--the big name in Internet Retail. Amazon has a record of my shopping history. It knows what I've bought. It knows what I've looked at while shopping. It knows what I've stored in my wishlist for later shopping. It knows what I'm interested in because I've helped improve its shopping recommendations for me. Facebook doesn't have this luxury; it doesn't know me as a shopper.

What Facebook has is a bunch of data that is mostly personal and mostly crap. That's nice for putting me in contact with people from fourth grade I've totally forgotten about, but it doesn't count for much in helping me find my next purchase. It doesn't help me shop! To help me shop, Facebook must predict what I need. But how can it? It doesn't have the data it needs to make those kinds of predictions because Facebook has never watched me shop!

This is why Facebook is going to fail. I buy electronics online, but Facebook doesn't know that, and even if it did, it doesn't actually know what I bought, which is crucial. I buy books and music online, but Facebook doesn't know that, and again, even if it did, it still wouldn't know what I bought. I buy toys and games online, but Facebook doesn't know that, with all the problems this entails. I buy junk off of eBay, but Facebook doesn't know that either. Google does because it helps me find all this stuff in the first place. Amazon does because it's usually where Google lands me. But Facebook doesn't have a clue.

Facebook's advertising isn't worth much. True, the website has a billion users, but it doesn't know those users' shopping habits or needs, and without that data, it cannot--in my opinion--compete with Amazon or Google. In other words, Facebook is just the latest incarnation of Geocities and Myspace: a website but not a business.

Finally, take a moment to reflect on who uses Facebook now. Who are those 1 billion strong? Why, it's none other than today's parents. Think about that... When our kids get old enough to become full citizens of the Internet--and that won't be long, just a few more years--those newly minted teens won't want to join mom and dad's website. That's uncool. They won't want their actions popping up for their parents to review. They will want their own place to hang out: someplace safe from prying parental eyes. That's not wrong; it's normal; it's unavoidable; and it's the measure of Facebook's life expectancy.

Thursday, March 22, 2012

The Failure of Programmable Programming Languages

There is this myth out there that the wave of the future is a programming language that can modify itself. This sounds odd at first, until you realize how cool that would be.  It means you could add any feature to your programming language that you need.  Right there.  Without waiting for the language designer to get around to it.

In theory...

In practice, the main languages people hold up as being capable of doing this--Lisp, Smalltalk, and Forth--seem to have totally failed. I run both Windows and Linux, and I can't think of a single application for either operating system that uses any one of those three languages.

Let me say that again: so far as I can tell, neither Lisp (in any of its flavors: Common Lisp, Scheme, or Logo), nor Smalltalk, nor Forth is in production in any meaningful way at any level on any personal computer.

I kind of wish it weren't so because I love these languages, in theory, but in practice they always fall down.

That is about as damning as it gets.

Go ahead. Prove me wrong.

Friday, January 20, 2012

Is C++ the Language of the Future?

I've spent the last five years studying a wide array of different programming languages and paradigms, and here are the big trends that I think will matter most in the near future.

Trend #1: Processors Are Getting More Numerous But Not Faster
Anybody who has witnessed the quantum leap in software since the Apple II knows how much things have improved in the 30 years since personal computers were first released. Just take a look at these Wikipedia screenshots of Flight Simulator's evolution:
Flight Simulator 1 for Apple II (1980)
Flight Simulator X (2006)

Notice anything? Yeah. Me too! And Microsoft is planning a new version to be released in 2012 that looks even better.

It goes without saying that these improvements required years of programming expertise, but they also relied on explosions in processor speed. Every 18-24 months, processor speeds would double and open up whole new possibilities that developers could--and did--take advantage of for their next release. This much is obvious.

What isn't so obvious is that this doubling in speed always happened on a single processor. This is incredibly important. It meant that programmers could just steadily refine their existing programs without having to rewrite them from scratch and without having to learn a new style of programming or a new programming language. If a program ran now, programmers could just assume that in 18-24 months it would run twice as fast without requiring any changes. This is the free lunch.

As Herb Sutter writes in his must-read article, The Free Lunch Is Over, individual processors have pretty much reached their top speeds. And how has Intel responded? Well, if you can't make one processor faster, just give the computer more than one processor instead. This is called distributed computing. Enter the dual-core, quad-core, and multicore computers. But there's a catch. You can no longer run your old program on a multicore machine and have it transparently run faster. That old program was written for a single processor, in a language and programming paradigm geared for a single processor.

Distributed computing is a game changer. It means that for the first time in the history of personal computing, programmers are going to be required to write programs that spread the problem across multiple processors. That is going to require new programming paradigms and programming languages (or at least updates to current languages). To see why, go check out this video: Software for a Concurrent World.
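To make that concrete, here is a minimal sketch--my own example, not anything from Sutter's article--of the same summation written twice in C++11: once the old single-processor way, and once spread across however many cores the machine reports. The arithmetic is trivial; the point is that the multicore version forces the programmer to carve up the work by hand, which is exactly the kind of thinking the free lunch never required.

    // A sketch, not a benchmark: the same sum done serially and in parallel.
    #include <cstddef>
    #include <iostream>
    #include <numeric>
    #include <thread>
    #include <vector>

    int main() {
        std::vector<int> data(1000000, 1);

        // The old way: one core grinds through the whole range.
        long serial = std::accumulate(data.begin(), data.end(), 0L);

        // The multicore way: carve the range into chunks, one per core.
        unsigned cores = std::thread::hardware_concurrency();
        if (cores == 0) cores = 2;  // the library may not know; guess something sane
        std::vector<long> partial(cores, 0L);
        std::vector<std::thread> workers;
        std::size_t chunk = data.size() / cores;

        for (unsigned i = 0; i < cores; ++i) {
            std::size_t begin = i * chunk;
            std::size_t end = (i + 1 == cores) ? data.size() : begin + chunk;
            workers.emplace_back([&partial, &data, i, begin, end] {
                partial[i] = std::accumulate(data.begin() + begin,
                                             data.begin() + end, 0L);
            });
        }
        for (std::thread& w : workers) w.join();

        long parallel = std::accumulate(partial.begin(), partial.end(), 0L);
        std::cout << serial << " == " << parallel << "\n";  // same answer, more cores
    }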

Whatever language you choose, it seems to me that having a language designed to take advantage of multiple cores is going to be a crucial requirement for the future; otherwise, you will be stuck with a program that cannot use the full capacity of your machine. Who wants that?

At the same time, languages that can deliver code that runs quickly on individual processors will continue to be important. Some problems are inherently linear: they cannot be effectively distributed. The only way to make these applications fast is to solve them in languages that produce fast code.

Not many languages meet these dual requirements.

Trend #2: Mobile Computing Rewards Energy Efficiency
It seems like you hear a lot of advocates for scripting languages like Python, Perl, and Ruby say that efficient execution is no longer that big a deal. So what, they argue, if you use more memory and burn more CPU cycles than a language like C++ that compiles down to machine code? Computers these days are so fast and have so much memory that worrying about efficiency is a waste of time.

However, they are forgetting one important fact: computers have gone mobile. Forget laptops. I'm talking about tablet PCs and smartphones: devices that run almost exclusively off batteries.

Every computation eats up battery life. This means that efficiency equals longer periods between charging your phone. That matters. Even if the execution speed feels the same to the user, a program written in Ruby is probably going to use more juice than the same program written in Objective-C, and that will get noticed.

Differences in battery life already drive sales. As more people go mobile, battery life is increasingly going to matter, which means we are back to the good old days, where languages that deliver stand-alone, highly efficient executables command higher sales. And since broadcast power remains science fiction, it looks like this bottleneck is never going away.

Trend #3: Mobile Bandwidth Is Never Enough 
There was this small innovation a few years back called the Internet. You may have heard of it. It has made many forms of communication possible; however, it has also imposed a powerful bottleneck in the form of how much you can download at a time. This isn't that big a deal when you're at home, but I'm not worried about that. Once again, I'm thinking about mobile computing across 3G networks.

I own a smartphone. I subscribe to Verizon's network. Most days, it is only a little painful for surfing the web, which means that it is almost sufficient for most mobile devices now. But I'm looking ahead here to a point where people use improved tablet PCs for more than surfing the web. My iPad isn't quite good enough yet for managing spreadsheets or writing reports, much less a novel, but that day is coming.

I remember a time when desktop computers were standard and laptops were weak replacements that couldn't really do much. That wasn't too long ago. Now, laptops are the standard work machine for most people I know, and tablet PCs are the weak platform. Pretty soon, people will want to download more than just websites to them. Actually, they already do. And what are those programs written in? Objective-C. There will be more tablets in the future, a lot of them, and people won't have the bandwidth or the patience for huge programs that take forever to download, or pain-in-the-ass programs that require a ton of dependencies to be downloaded just to make them run. Few things are more frustrating than downloading a program only to find that you also have to download some new version of Java or the correct version of Python or some other run-time environment just to get the damn thing to run. A stand-alone binary (.exe program) is faster and simpler any day of the week.

I should mention that tons of programmers are trying to side-step this problem by developing in HTML, CSS, and JavaScript on the front end while pushing the record keeping and deep functionality to the server on the back end using PHP attached to a database programmed with SQL. Take a look at what you have to know to make that work. How many programming languages is that you have to know? Four? Five? In theory, that's possible. In practice, managing all this crap quickly becomes very difficult.

Consider the level of talent it apparently takes just to keep Gmail or Facebook online. Then take an honest look at how static these websites are compared to most of the programs sitting on your computer. The word processor in Google Docs still feels like a toy, and that's basically just a box that accepts text! We won't even discuss what most web-based spreadsheets are like. If this is the best that Google can do, what are we likely to see from everyone else?

Trend #4: We Are Drowning In Platforms
Not too long ago, you could buy a computer program secure in the knowledge that it would run on your computer, provided your machine was new enough. The world was Windows. Sure, there were other operating systems, but only fanatics owned them. Apple! Give me a break. Only artists bought Apple machines in the 90's. Linux. Don't make me laugh. Nobody except comp-sci majors ran Linux on their machines. For that matter, nobody except computer scientists had even heard of Linux back then. I'll say it again, the world was Windows.

In many ways, it's still a Microsoft universe, but you can't bet on that anymore. Developers must cater to Apple products, to Linux on Android smartphones, and to Windows. In that way, it feels much like the 80's again, where you have to choose which platform to join.

This leaves developers of desktop applications with two possible choices: (1) put all their eggs in one basket and hope that platform will endure, or (2) try to develop across all platforms and hope the added complexity doesn't kill them.

Option 1 is scary these days. Windows still looks strong, but more and more, I see people who ten years ago would have scoffed at the idea of buying anything but Windows proudly switching to Apple. Let me say that again: proudly switching to Apple. At the same time, droids seem to be falling out of the sky. I see more droids than iPhones. The tablet PC is coming in a big way. When it gets here, people who never intended to run Linux are going to run it by default, since that's what will be on their machines. Of course, Linux is already on lots of the servers that businesses are running. It also provides a nice way to breathe new life into old machines for people who don't feel like buying a new computer just to get the newest eye candy. Remember, we are talking about tablet PCs that are actually productive, as in, businesses will begin issuing them to employees the way they issue PCs now.

For the first time in a long time, Option 2 is starting to make some sense. As such, programming languages that ease cross-platform development are going to become increasingly valuable. So, what languages are used for serious development now?
  • Windows: the main language seems to be C#. The runner-up is C++.
  • Apple: the main language is Objective-C. The runner-up is C++.
  • Linux: the main language is C. The runner-up is C++.
  • Android: the main language is Java. The runner-up is C or C++.
(Note: I define "serious development" as rock-solid, enterprise-quality software.)

There are, of course, other languages that get used for lesser projects. Python, Perl, and Ruby come up a lot for scripting purposes. I suppose Visual Basic is still used too, though that seems to have fallen off the radar compared to the 90's. Lua is hot for game scripting, PHP for server programming, JavaScript for browser development, and Java for business applications--the kind of stuff that used to be done in COBOL--but I'm interested here in personal computer productivity.

None of this is without exception; however, for getting real work done--for projects where performance is an issue--the bulleted list above is pretty definitive. Does it suggest anything to you? It does to me. No language is number 1 across the board, but there is a language that is arguably number 2 platform to platform.

CONCLUSION
The trends of this essay suggest that the ideal programming language of the near future is going to have the following characteristics:
  1. Good for distributed computing
  2. Good at producing fast code for when distributed computing isn't possible
  3. Good at producing efficient code for saving on battery life
  4. Good at producing small, stand-alone code for saving on bandwidth
  5. Good at producing cross-platform code
So who are the contenders?

Haskell--a functional programming language--comes to mind, but programming in Haskell is just too different. I don't know of a single business program or game in mainstream usage designed in Haskell. While we're at it, let's just go ahead and lump Scheme, Forth, and Common Lisp into this category as well.

Google has released Go--a procedural programming language that received some buzz when it first came out. Its developers designed it for the multicore world, so that's a big plus. On the other hand, it's brand new and lacks objects. In an object-oriented world, that seems like a strange and significant omission for a non-functional programming language.

Java has a good chance too, but Java seems to be settling into the role of 21st-century COBOL. It is the main language for the Android operating system. That will really matter--especially at first--as non-Apple tablets come online, but that "non-Apple" qualifier is the 800-pound gorilla.

Ada presents an intriguing possibility. Ada is a really powerful language that would meet all these requirements brilliantly. Designed by committee for government use and descended from Pascal--an educational language--Ada's problem is that it was never cool. But it deserves to be considered here.

Looking over the list, I can't honestly see a language more competitive than C++. It isn't famous for being easy or small, but it possesses many of the qualities that I think programmers are going to need in the future. It's battle-tested, proven, and everywhere.

Strange. When I began this essay, I never expected to wind up here. I just wanted to work out what I thought the big trends of the future would be and which language(s) I thought would meet them. It's weird how things work out.

I guess it's time to give this post a title and publish it.