I participated in a panel on the relationships between standards and open source software at the Open Source Business Conference in San Francisco yesterday, along with Bob Sutor (IBM), Tim Bray (Sun), and Jason Matusow (Microsoft), moderated by Andy Updegrove (Gesmer-Updegrove). (There are a few photos in a collection here, courtesy of Sun's OpenSolaris Community Manager, Jim Grisanzio.) I want to capture my notes here on the primary economic relationships.
When a market begins to mature, there comes a point when the incumbent vendor begins to over-deliver functionality to its customers, faster than they can absorb it (and therefore faster than they are willing to pay for it). At that point the technology space is mature enough that the best integrated solution is no longer seen by customers as the most valuable solution, and the space can componentize around a set of relevant standards.
Customers want such standards because they believe standards will give them more choice. The incumbent's competitors are only too happy to define standards that crack the market open, away from the incumbent. This is all pretty much straight out of Christensen's The Innovator's Solution, with some personal observations based on past experience.
This is where open source becomes relevant to the standards process. Around what do the competitors choose to build a standard? I would argue they choose a body of technology around which they have already collaborated for some time in the industry. Because the collaboration has been happening, the vendors have existing practice and experience to determine what works and what doesn't. Vendor marketing organizations likely even have certain "customers" they can build case studies around, thanks to the various non-vendor participants in the collaborative space.
- So when Digital Equipment Corp began over-delivering on VAX/VMS faster than customers could absorb the new innovations, its competitors chose UNIX, a large collaborative research project that had existed for years, as the base for standardization.
- TCP/IP won out over the ISO OSI networking initiatives because a working code base already existed out of the DARPA and IETF experience. ("Rough consensus and running code.")
- In the content management world we're seeing this play out again: different vendors are over-delivering in their respective content management spaces, and standards are appearing, rapidly adopted from the open source projects around which they have formed.
And we're seeing it play out around Microsoft Office and the rise of the Open Document Format (ODF) standard. Microsoft has over-delivered to the bulk of its Office customers, and those customers are frustrated. Microsoft's competitors rallied around the OpenOffice project as a base from which to define ODF, and the next release of OpenOffice supports ODF.
The interesting thing is the dilemma the incumbent faces. It has to continue to deliver new product releases with new features and functions; that's what Wall Street rewards it for doing. The incumbent may even be brilliantly creating new innovations in the space and working feverishly to deliver them. But the bulk of its customers are over-served. They can't digest the last round of innovations, so why would they want to pay for the next round? Remember: economically, innovation is a supply-side activity. Customers just want to buy solutions, and they don't want to buy things they won't use, no matter how innovative those things might be.
So vendors use collaboratively developed technology bases (today most often open source projects) as the foundation for new standards work.
Another part of the discussion concerned the purpose of standards. Unfortunately, it came towards the end of the panel, and I think parts of it were lost.
Standards exist to enable and encourage multiple implementations, regardless of the forum in which they are created; they benefit the consumer. A vendor specification, by contrast, exists to encourage multiple add-ons to a single implementation, regardless of whether it has gone through a formal standards body or consortium; it benefits the supplier. I have never liked the term "de facto standard": it is really a de facto technology. So in the case above, it is the incumbent's technology that is dominant in a market space.
So a successful standard (measured economically) has many implementations; a failed standard has only one. In the ODF case, a number of product implementations are already available, derived from the OpenOffice code base. In the Microsoft Office XML standards work underway at ECMA International, there is only the yet-to-be-shipped Microsoft Office 12 product.
Jason (rightly) called me out on this rather presumptuous observation, but in his explanation I thought I heard him say that Microsoft is defining the new Microsoft Office standard both to encourage multiple implementations AND to encourage developers to innovate around the Microsoft Office document formats. In the first case, it would be bad business to encourage multiple implementations of a product responsible for 50% of the company's revenue stream. In the second case, it's just another vendor specification that benefits the vendor, regardless of the standards imprimatur.
All in all, a good panel discussion (despite Tim's comment that it could have been more exciting). I look forward to continuing the debate, and I would love to see comments with other examples of a collaborative project acting as the base of a new standards effort.
I can’t help but think Jason was speaking with a mix of Redmond ideology and defensiveness, having been caught out on the subject of the Office XML File Formats, when he said those formats could lead to multiple implementations of Office.
Yes, Microsoft is pushing the Office XML File Formats through ECMA for standards ratification. Thing is, the last time Microsoft “used” ECMA to develop a standard, just one independent implementation emerged.
That standard was for C# and the Common Language Infrastructure (CLI). The only independent implementation was the late and incomplete Mono project.
Don’t expect multiple implementations of Office built on the Office XML File Formats. In fact, don’t expect much to change at all in the Office franchise. The sad fact is, Microsoft has Office sewn up in so many markets, and its hold is going to get a whole lot tighter as Microsoft deepens integration between the Office productivity suite and its server products, like SQL Server.
What you can expect, though, is an ecosystem of ISVs and service providers emerging around the Office XML File Formats to deliver new applications capable of tapping data held in Office documents. This ecosystem will further enrich the Office feature set and consolidate the suite’s hold at home and in business.
That Microsoft faces a mountain of problems persuading users to upgrade from ancient copies of Office, especially Office 97, to the latest edition, Office 2007, is not in question. But independent implementations? That takes more than open standards. That requires open code. And that ain’t happening. Not yet, at least.
Posted by: Gavin Clarke | 18 February 2006 at 20:07
Nice article. Your articulation of the incumbent's problem, over-delivery by the vendor and under-absorption by users (the demand side), is a concept definitely worth pondering. As you have mentioned, this is exactly what Microsoft is doing as an incumbent, and what we poor sods are experiencing as end users.
Rgds
Ec @ http://www.eit.in
Posted by: Ecacofonix | 10 June 2006 at 05:00