[Updated (18-May-2007, 11:13): Added a new paragraph (in italics) in the business model section towards the end.]
There are two ratios we've been unable to escape since we began programming [1]:
- The number of lines of code written by an "average" developer per day (~20).
- The average number of defects per thousand lines of code in a good development process (~1 defect per 1,000 LoC). (A quick sketch of the combined arithmetic follows.)
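To make those two numbers concrete, here's a minimal back-of-envelope sketch in Python. The ~20 LoC/day and ~1 defect per 1,000 LoC figures are the ones above; the working-day count and team size are illustrative assumptions, not measurements.

    # Back-of-envelope: what ~20 LoC/day and ~1 residual defect/KLoC imply.
    LOC_PER_DEV_PER_DAY = 20      # "average" developer throughput (figure above)
    DEFECTS_PER_KLOC = 1          # residual defects in a good process (figure above)
    WORKING_DAYS_PER_YEAR = 230   # illustrative assumption
    TEAM_SIZE = 10                # illustrative assumption

    loc_per_year = LOC_PER_DEV_PER_DAY * WORKING_DAYS_PER_YEAR * TEAM_SIZE
    residual_defects = loc_per_year / 1000 * DEFECTS_PER_KLOC

    print(f"Team output:     {loc_per_year:,} LoC/year")     # 46,000 LoC/year
    print(f"Defects shipped: ~{residual_defects:.0f}/year")  # ~46 defects/year

Even a good ten-person team ships something like a defect a week. That pressure is behind every advance described next.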
All the advances in programming, whether languages (assembler to "high level" languages to script-based languages and managed code), reuse strategies (modules, libraries, objects, components, distributed objects, etc.), or engineering practice and discipline (requirements, inspection, version/configuration management, build practices, test practices, extreme programming, agile methods), have been directed at beating those two ratios by writing more and better software with fewer lines of code. Writing good software is hard.
Indeed, it is difficult enough that we have shared software for as long as we have written software. This certainly predates Richard's desire for software freedom and the GNU Manifesto. By the time I was attending DECUS conferences in the late 1980s, the tape library was a well-oiled machine. You laid down your "cost of media" and received a magtape full of software. Some was binary only. Some was in source code form. All was covered by copyright, but some was placed in the public domain. Some developers accepted changes. Some allowed derivatives. Bandwidth and latency in the "network" were certainly different, but share we did. (As I understand it, the developers in the IBM world did similar things at a conference called, oddly enough, "SHARE".)
This is also why the economics of "packaged" software were so good, and customers laid down a lot of money through the 1980s and 1990s. Customers were used to writing all their own software. Each customer ate the entire cost of development, maintenance, and support. Domain expert software vendors came along, aggregated requirements, and amortized those costs across the customer base. Customers got more and better software faster and cheaper than they ever could have developed for themselves. The economics worked in the customer's favour. Computing hardware costs fell to the point where the PC emerged, and packaged software exploded to address the needs of users that didn't even know what programming was. And a lot of software companies made a lot of money.
Then the web happened. Bandwidth exploded. Latency vanished. We needn't trickle software across 300 baud acoustic couplers anymore. Conferences (events artificially bound to a calendar, clock, and geographic location) were no longer the necessary sharing point. We could form online communities at will, freed from those constraints of linear time and space, and follow our interests and passions.
Our understanding of sharing under copyright also matured.
Now, good software has always been written by good software developers without regard for how they share it. There is a discipline to developing good software, involving inspection, source code management, build automation, test automation, and facilities to encourage discussion and decision making. We all know what we should do to develop good software; good software developers know no other way.
Good software developers also like working with other good software developers. Web-enabled communities, complemented by copyright licenses that ensured everyone understood the "contract" of the community, allowed well-run projects to thrive. Good project managers know how to get the best out of a team of developers.
So we now have the ability to amortize the costs of developing, maintaining, and supporting good software across web-based communities. The engineering efficiency is compelling. Well-run open source software projects represent dynamically stable building blocks for solving a growing and complex set of problems. They preserve (and indeed expand) the value in the asset created. (Intellectual property is a set of legal tools used to protect an asset. Let's not confuse the value of the asset's protection to a business with the value of the asset itself to its users.)
This does NOT mean that these development communities can't be closed, or that they can't reside inside a company. But two dynamics are at work then. First, by closing a community you may be closing out the very developers that can best help grow the asset's value over time. Second, understand that a company with customers must serve certain economic dynamics beyond those of a project with participants and users.
A company builds solutions for customers. Customers pay money for the solution and have certain expectations around value-for-money based on the promises made by the company. A solution or product is much more than "just the software." (See the Core, Complement, and Context post.) Businesses selling solutions have an operational side: marketing to get the word out, sales and delivery channels, and all the costs associated with them. Yes -- the web is certainly changing the way that happens in everything from eBay and Amazon, to FedEx and UPS, to Microsoft, Adobe, Red Hat, JBoss, and all kinds of other vendors. But customers pay for value around a solution for which they perceive a need. That doesn't change whether they're buying Microsoft Windows Server 2003 or Red Hat Advanced Server.
And here we come up against the other economic dynamic that happens to "software" companies over time. Companies need to keep selling. So they need to keep finding new customers, and to keep adding features and providing sustained innovation on the base they're already selling to current customers. When they begin to over-deliver on customer needs, they need to shift their value proposition to something the customer is still willing to pay for -- enterprise agreements as a repackaging of license and support, for example. And the customer's needs aren't static either; customers are investigating their own shifting perceived needs at the same time. The money-for-value equation keeps changing for customers.
The new breed of companies using open source software as a feature are innovating here as well. Historically, if you were a packaged software vendor, you could only sell a customer again by adding features. You kept adding features (innovative or not) until you over-delivered. (Anyone remember when each new release of Hummingbird's PC X11 server had all manner of IETF clients jammed into it -- including email?) The "subscription" model later invented by such companies looked a lot like a feature update service with support and maintenance. The new web-enabled subscription services developed by the likes of Red Hat, JBoss, and MySQL provide regular value beyond new features.
Good engineers work to the constraints of the scarcest resource. In many cases this is time, and in some cases money. With open source software, engineers on both the customer AND vendor side have broader freedom in some vertical problem spaces (e.g. content management, office suites, system admin and operations) to make choices based on their own unique needs, which they typically understand better than their suppliers/vendors. So for a "UNIX" OS replacement they might (depending upon their unique needs) choose:
- Red Hat Advanced Server
- ~~Microsoft~~ Novell SuSE
- Open source Solaris (with or without support)
- Microsoft Windows Server 2003
- roll their own Linux distro (similar to Bank of America or Google)
Open source software is all about solution choice and value preservation in the software asset through shared development and community. We can collectively amortize the development, support, and maintenance of software. The rest is just software business.
[1] I don't need education on the inaccuracies of counting lines of code, nor on how a programmer writing less, or indeed removing code, may be the better and more productive programmer -- honest. I've never measured anyone's productivity based on the number of lines of code they wrote. The best simple description of all the debate I've read may be this bit from the introduction of this paper from Bell Labs in 2001:
"No single system of metrics for measuring software quality is universally accepted [4,5]. Intuitively, software quality is related to the ratio of the perceived usefulness of a product and its perceived buggyness. The usefulness of a product is related to its functionality, which is in turn related to code size. More functionality often implies more code. As a metric for buggyness one often uses the elusive standard of 'residual defect density.' The residual defect density is meant to measure the number of defects that remain in a software artifact after delivery to the end-user (the customer), weighted by code size. A typical target in software development is to achieve a residual defect density of less than one defect per one thousand lines of non-comment source code [4,10]."
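Restated as a formula (my paraphrase of the quoted definition, not notation from the paper itself):

\[
\text{residual defect density} \;=\; \frac{\text{defects found after delivery}}{\text{non-comment source lines}/1000}, \qquad \text{typical target} < 1
\]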
You say: " The "subscription" model later invented by such companies looked a lot like a feature update service with support and maintenance. The new web-enabled subscription services developed by the likes of Red Hat, JBoss, and MySQL provide regular value beyond new features. "
Really, Stephe? Like... what?
Posted by: Jason | 20 May 2007 at 23:00
You say: "Microsoft########Novell SuSE"
When did you start taking cheap shots to drive readership and "street cred", Stephe? Not that the giant of the northwest has what anyone would claim to be "clean hands", but I don't recall any aspect of that deal ceding any form of control or ownership of the contents of SuSE by Novell to Microsoft. Do you know differently?
Posted by: Jason | 20 May 2007 at 23:03
Jason! Welcome.
The Novell cheap shot was a wee bit cheap -- yes. No control was ceded. No street cred needed on my part. Readership will always be niched around people that care about open source, business, standards, and intellectual property. It was more a reference to Novell thinking it needs Microsoft's help. If you haven't read Brent Williams' commentary on the numbers, it's here:
http://stephesblog.blogs.com/presentations/BrentWilliamsEclipseConV02.pdf
On your subscription question: the "network" products tend to provide real-time data, tuning, and simple monitoring sorts of information. So instead of feature updates, you're getting "data" about your unique configuration (e.g. security patch requirements).
Posted by: stephe | 23 May 2007 at 21:10