At first sight, that appears ridiculous. By their nature, monopolies—and I certainly think of Microsoft as a monopoly—stifle creativity. By contrast, the open source world is coming up with all sorts of new products all the time. You just need to look at places like SourceForge or Freshmeat to see the incredible amount of new software that is being developed.
Before we continue, though, we should define what “innovation” means. The Shorter Oxford English Dictionary states: (1) To change into something new; to alter; to renew. ... (3) To bring in or introduce novelties; to make changes in something established; ... . That doesn't get us much further. It's fair to assume that Microsoft is implying changes for the better, which make the software easier to use and possibly make things possible that weren't possible before. In any case, that's the way I'm going to interpret it in this article.
So is open source software innovative? Taking off my daemon's advocate hat and putting a devil's one on instead, I wonder. The BSDs are proud of over a third of a century of high-quality software development, but tradition and innovation aren't the same thing. On the other hand, by my definition new development and innovation aren't the same thing either. There needs to be some added value.
What's wrong with Microsoft? Well, lots of things. They're not exactly the most popular company round here. Some talk of monopolistic behaviour, including high prices and restrictive licensing conditions, others of really terrible problems of security, performance or reliability. But that's not the issue here.
It's unlikely that you'll immediately associate innovation with Microsoft. To be fair to them, they do change things like the user interface with every release. Some of this might have to do with their desire to ensure that as many people as possible migrate to the new version, but one way or another, it's new.
Is it innovative? I suppose that depends on your perspective. For the sake of argument, let's assume that it is innovative, and move on to look at what the open source world is doing. It's doing lots of things, of course. But a surprising amount of activity is related to “desktop” software. This is, of course, an area in which Microsoft is active, and a lot of open source desktop software copies Microsoft. Is that innovation?
The proponents of systems such as KDE or Gnome will—correctly—point out that they do lots of things differently from Microsoft. They'll also claim that they're innovating. Are they? To a certain extent, yes, of course they are. But a lot of the work that goes into such systems is of a cosmetic nature, such as so-called “eye candy”, which doesn't fit my definition of innovation.
The proponents of systems such as OpenOffice (or OpenOffice.org, as they prefer to be called) will point out that they are introducing open document formats, but as far as I know they don't claim any basic innovations. They're doing a pretty good clone of the Microsoft environment. Good or bad, you can't call it innovation: at the core, it's using concepts which were developed in the Microsoft space.
Some years ago, during a training course, we did an experiment in which everything was continually changing. At the end of the experiment, the instructor asked how we liked the fact that nothing stayed the same. He must have had the wrong participants: we were used to it, and it didn't bother us. It turned out that the experiment had been designed to show how much people hate change. In our case it failed.
This experiment highlights two issues, though:
So why innovate? In the training course, the experiment involved making party hats. That's hardly a high-tech application, and there's not much that you can improve by innovation. By comparison, computers, especially on the desktop, are very new: indeed, at the time I did the training course, they were still considered quite a novelty. They give us untold opportunities for doing things differently, and, more importantly, better. Fifty years ago, computers were large, expensive machines used only for applications which were very difficult or impossible otherwise. Nowadays hardly any modern house or office is without a computer, and many have more than one.
But what do people do with computers? Excluding present company, it appears that most people use computers for the following activities:
There are other things you can do with computers. As computers get faster, multimedia is becoming more practicable. Even without faster computers, there are plenty of things you can do with computers. What, you may ask? Well, for one you could make a general purpose computer out of a device which was intended for specialized uses.
How many computers do you have in your household? You almost certainly have at least one, probably more. I count about 30, including old, mouldy collector's pieces. But those are just the “general purpose” computers. What about embedded computers? One of the problems is knowing whether a specific device has one or not. I have two microwave ovens: one about 5 years old (no question that it has a microprocessor in it), one twenty years old and pretty certainly run by discrete logic. Even in this extreme case, though, I can't be sure. And what about TVs, video recorders, CD players, DVD players, air conditioners, mobile telephones, normal telephones, clocks, and just about anything that consumes electricity and does more than heat or light? How many microprocessors control the function of your car? Even the keyboard of your PC has its own dedicated microprocessor. I can't begin to guess the number of embedded processors in my house, but it certainly dwarfs the number of “general purpose” computers.
The biggest difference between embedded and general purpose computers is the way the computer is used. The general purpose machine is definitely more flexible: you can run programs at will and install new ones when you feel like it. By contrast, just about every embedded computer is designed to do one job (and, hopefully, do it well). But do we use this flexibility? How often do you control your fridge temperature with your computer, or run your hi-fi system from it?
There's probably not much point in running your fridge with a computer: as far as I know, most fridges don't even use an embedded processor. A perfectly normal analogue thermostat is completely sufficient. The hi-fi system is a different matter: you could use a computer to control things, to record audio and video, to add functionality beyond that which a conventional system offers. Systems are available: the TiVo digital video recorder has been available for years. It runs a modified version of Linux.
In many ways, the TiVo is yet another embedded application. It's designed to do many of the same things that earlier VCRs did. The biggest difference is that it records onto hard disk instead of tape. But the hardware configuration offers opportunities which the TiVo doesn't use. The system is accessible: it bridges the gap between rigid embedded systems and conventional operating systems. Early on, Andrew Tridgell and friends worked on the TiVo and managed to connect it to a network. The result is a machine which transcends the barrier between conventional computers and embedded systems.
Is this innovation? I think so. We have ways to do things with computers now, mainly based on concepts which existed before computers. Useful innovation should make computers easier to use, and I think the TiVo hack passes the test. Adding eye candy or more ways to do one specific task doesn't. I find it easier to set the time on the TiVo from the command line than from the remote control (which, sadly, is not a particularly good advertisement for the device). This is the problem I have with the current desktop approaches: they make it easier to do one thing, but in the process they make it more difficult to do things that the designer didn't think of. UNIX was successful because it allowed people to do things that hadn't been envisaged. This was no accident: it was done by concentrating on concepts rather than individual problems. Plan 9 built on experience with UNIX. Microsoft and other vendors did the opposite: they ignored the lessons of UNIX and went their own way, building a system which is initially easy to use, but which requires more learning to be used properly.