Backwards Compatibility

The Java User Group Cologne event with Neal Gafter, which I mentioned in the post on closures, featured a discussion about backwards compatibility. You can't avoid that topic when talking about language evolution. During that discussion, Neal presented an interesting argument, which came back to me while reading a post by Bruce Eckel and which I'd like to present here.

Let's start with a quote from Bruce:

At the Java Posse Roundup ’08, in the last technical session, we once again discussed the future of Java. We might have come to the conclusion that backward compatibility is being maintained primarily to serve companies that have no intention of upgrading to newer versions of Java anyway.

For quite some time I found this argument convincing. But it's misleading, because the point of backwards compatibility is not to enable or prevent upgrading of the JDK/JVM. It's about the software that depends on it.

Let's look at an example: I'm using Eclipse daily for development, and Paint.NET when I have to hack together a few images. Eclipse requires a Java runtime, Paint.NET the .NET runtime. Now I need to install two new applications, again one based on Java and one on .NET. Both require the next major version, e.g. Java 7 or .NET 3.0. I upgrade both runtimes, install the new applications, maybe restart. Both Eclipse and Paint.NET still work.

What would happen if Java 7 didn't maintain backwards compatibility? Eclipse wouldn't work anymore; I'd have to get a new version compiled for Java 7 first. The same could happen with Paint.NET, and with the three billion other applications that would stop running.

An even more obvious example: I haven’t upgraded to Windows Vista yet, primarily because I don’t want to mess with software that doesn’t run anymore.

In other words: whenever someone tells you that backwards compatibility doesn't matter, don't think in terms of upgrading the server, think in terms of upgrading the client.

In the above examples, the server is the operating system, the JRE, or the .NET runtime, and the clients are the applications written on top of it. When creating software APIs, e.g. the YouTube API, the API is the server that has to maintain backwards compatibility. Not because it's a problem for a single client to migrate to an incompatible version, but because it's a problem for the clients of those clients, who can't afford to upgrade.
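To make that dependency chain concrete, here is a minimal, hypothetical Java sketch; all class and method names are invented for illustration. A library removes or renames a public method, and an application breaks even though it never calls that method itself, because one of its plugins does:

// Hypothetical example -- all names are invented for illustration.

// The "server": a library exposing a public API (think JRE, YouTube API, jQuery core).
class VideoService {
    String getVideo(String id) {          // removing or renaming this in version 2
        return "video:" + id;             // would be a backwards-incompatible change
    }
}

// A direct client of the library (think Eclipse, or a jQuery plugin).
class PlaylistPlugin {
    private final VideoService service = new VideoService();

    String firstVideo() {
        // Compiled against version 1. If version 2 renames getVideo() to
        // fetchVideo(), this call fails at runtime with NoSuchMethodError.
        return service.getVideo("intro");
    }
}

// A client of the client: the application end users actually run.
public class MyApplication {
    public static void main(String[] args) {
        // The application never touches VideoService directly, yet it still
        // breaks when the library changes, because its plugin does.
        System.out.println(new PlaylistPlugin().firstVideo());
    }
}

The application's own code wouldn't need a single change to stay correct, which is exactly why it's the one that can't afford the incompatible upgrade.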

Consider the jQuery JavaScript library. The core library is quite small and forms the basis for a ton of plugins. So we have the core as the server, plugins as the clients. When developing a web application, I’d use jQuery and a handful of plugins. My application depends on both the jQuery core itself and the plugins, so I’m a client of both of them.

To those existing plugins I want to add a new one, which relies on a new jQuery core release. As long as jQuery maintains backwards compatibility, or provides a painless migration path, as it did via compatibility plugins, I can upgrade jQuery core, use the new plugin, and keep all the others working. All hell breaks loose if it doesn't.
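The idea behind such compatibility plugins translates directly to Java: when an API changes, keep the old entry point around as a thin, deprecated delegate to the new one, so old clients keep running while new clients move on. A hypothetical sketch, with invented names and an assumed old default value:

// Hypothetical sketch of a migration path -- names are invented for illustration.
public class Animator {

    /** New API introduced in version 2, with an explicit duration. */
    public void animate(String element, int durationMillis) {
        System.out.println("animating " + element + " for " + durationMillis + "ms");
    }

    /**
     * Old API from version 1, kept as a thin delegate so plugins compiled
     * against version 1 keep working. This plays the role of a compatibility
     * plugin: old callers run unchanged, new callers are nudged towards the
     * replacement.
     */
    @Deprecated
    public void animate(String element) {
        animate(element, 400); // assumed old default duration, preserved for old callers
    }

    public static void main(String[] args) {
        Animator a = new Animator();
        a.animate("#menu");        // old plugin code still works
        a.animate("#menu", 1000);  // new plugin code uses the new signature
    }
}

The cost of keeping the deprecated delegate is small compared to breaking every plugin, and every application built on those plugins, in one release.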

-Jörn

Comments
  1. Casper

    Allow me to approach the topic from another perspective.

    The software we write every day is not perfect; there are flaws in it, and we often resort to refactoring or even rewriting from scratch because of the lessons learned. Version n+1 of an application is usually, hopefully, superior to version n.

    A language, its compiler, and its support library are ultimately just another piece of software, not unlike the programs we write every day, and thus also subject to software entropy and rot.

    Why are we so much more complacent at this level, when it can have profound implications for our future programs? Couldn't we remain backwards compatible within limits? It's been 15 years since the inception of Java; perhaps a reboot is in order.

    There must be ways to keep both the past and the future happy, either by adopting a rich enough versioning/dependency model or by providing code transition mechanisms, which should be entirely possible with a static language like Java.