Friday, August 13, 2010

On Quality Software

Sean sent us this link today to an interview with Dave Thomas and Joe Armstrong on the subject of software quality and craftsmanship.  (I highly recommend downloading the MP3 available on the link and keeping it in one's archives in case the link ever goes stale.  The interview is full of good points and insights that should be retained for the ages.)  It really got me thinking about software quality in the enterprise environment.

It's a conversation a lot of us have had with our managers, our fellow developers, and so on.  And it occurs to me that the response is often the same.  It comes across differently, but the message is clear: writing quality software in "the real world" is "too hard."  Sure, it's nice for open source projects or academia or anywhere else that money doesn't matter, but in a real business, where costs are important, it's just not worth all the extra effort compared to doing some good old rapid application development and whipping up an application to get the job done.

I find that this point of view is based on two core assumptions:

  1. Bad software costs less than good software.
  2. Good software is more difficult to write than bad software.

The first of these two assumptions confuses me, downright baffles me.  We've been in this business for a long time now.  Some of us are a little newer to the game than others, but the industry as a whole has been around for some time, and as such, there are certain lessons I would think we'd have learned by now.  After all, we're talking about businesses here: groups of professionals whose sole purpose in the office, day in and day out, is to concern themselves with "the bottom line."  Money.  Costs.

Can anybody in any position of management over a software development team truly claim with a straight face that supporting bad software is in any way cheap?  That it can just be written off as a justified cost and pushed aside and forgotten?  Teams of highly-paid developers spend a significant portion of their time troubleshooting legacy software that was rapidly whipped up to just get the job done.  We as an industry have enough of this old software lying around that we know full well at this point how difficult and, at the bottom line, expensive it is to support.

Now, what's done is done.  I understand that.  This software has already been written and invested in and it's "business critical" so we need to support it.  But it's not like we're done.  We're still writing software.  We're still creating new features and new applications and new projects.  So why is "the way we've always done things" still the status quo?  You know how much it's going to cost in the long run, that math has already been done.  Hasn't it?

You're either going to pay for it now or pay for it later.  Either way, it's going to cost you.  But there's a critical difference between "now" and "later" in this case.  "Now" is brief.  It's measurable.  You know when it's over.  "Later," on the other hand, is open-ended.  You don't know how long "later" is going to last.  You don't know how much it's going to cost, other than potentially a lot.  Let's express this with a little math...

  • Good software: 100 * 3 = 300
  • Quick and easy: 50 * n = 50n

So, assuming for a moment that creating the "quick and easy" software costs half as much as creating the "good" software (remember, this is hypothetical just for the sake of simple math; we'll get to that assumption in a minute), can you tell me which final amount is higher or lower?  Quick, make a business decision: 300 or 50n?
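To make the "now vs. later" point concrete, here's a quick sketch using those same hypothetical numbers.  The figures (100, 50, 3 periods) come straight from the example above and are illustrative only, not real project data; the break-even at six support periods simply falls out of the arithmetic.

```python
# Hypothetical figures from the text: "good" software has a bounded total
# cost of 100 * 3 = 300, while "quick and easy" software costs 50 per
# support period for an open-ended n periods.  All numbers are illustrative.

GOOD_TOTAL = 100 * 3  # bounded: you know when "now" is over


def cheap_total(n_periods: int) -> int:
    """Running total cost of the rushed version after n support periods."""
    return 50 * n_periods


# Past six periods, the "cheap" option is no longer cheap.
for n in (3, 6, 12, 24):
    winner = "cheap" if cheap_total(n) < GOOD_TOTAL else "good"
    print(f"n={n:2d}: cheap={cheap_total(n):4d} vs good={GOOD_TOTAL} -> {winner} wins")
```

The asymmetry is the whole point: 300 is a constant you can budget for, while 50n grows without bound for as long as "later" lasts.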

Now, the second assumption is one I can be more sympathetic to, mostly because the people who make it don't know any better.  If you haven't written both bad software and good software over the years and really grasped an understanding of the two, then I can't really expect you to truly understand the difference.  Whether you're a manager who just isn't really into the technical details or an inexperienced developer who hasn't really learned yet, the assumption is the same.

But the thing you need to understand about this assumption is that it's just patently untrue.  Writing good software doesn't cost a lot more than writing bad software.  It doesn't even cost a little more.  Maybe this assumption is based on the generality that in most industries good costs more than bad.  Yes, this is true.  But it's true for a different reason than you think.

Manufacturing quality goods costs more than manufacturing cheap goods because the manufacturing is mechanical and commoditized.  It's an industry of tools, not of people, and better tools cost more.  But software isn't manufacturing.  It isn't plugging together a series of pre-defined widgets on an assembly line.  (Microsoft would have you drinking their Kool-Aid and believing that it's all about the tools, but that's because they happen to sell the tools.  But we'll cover that another time.)  It's a craft, and it's performed by craftsmen, not assembly line workers.

Does good art cost more than bad art?  Sure, the classic good art costs a fortune, but that's because it's rare and can't truly be reproduced; software doesn't really have that problem.  For art being created right now, though, quality has nothing to do with the financing of the work.  Good or bad is a measure of the craftsman, not of the financial backing.

And a good software craftsman can often be found at roughly the same cost as a bad one.  This is because that cost is usually based on how long the craftsman has been doing the craft, not how good he or she happens to be at it.  We've all met "senior developers" who are only in that position because they happen to have been writing mediocre software for a long time, or because they're over a certain age, or because they've been with that company for a long time in various capacities such that when they got into the software gig they were considered "senior level."

But ask any craftsman in any industry.  Ask a seasoned carpenter or electrician or anybody.  Not taking into account the tools, is the actual act of doing something well inherently more difficult or, at the bottom line, more expensive than doing it poorly?  No, absolutely not.  It seems that way to laymen because the professionals "make it look easy."  But the fact is that, for those who know what they're doing, it is easy.  It actually takes a professional more effort to do something in an amateur way than to just do it the way they know how to do it.  It's not right, it's not natural, it forces them to stop and un-think every once in a while.

Do yourself a favor, do your company a favor, do your bottom line a favor, and actually care about the software you manage.  Your developers care about it, because craftsmen like to produce good things.  (If they don't care about it, then perhaps you should re-think your team.)  Produce better software so that, in the long run, your support efforts can be lower and you can have the time and resources to produce even more good software.  Your business end users may not know the difference; they may not know you've done anything at all.  They shouldn't have to.  Their software should just work.


  1. Excellent post, man. I'm glad you enjoyed that interview and I highly recommend it to anyone in the industry. I just don't see this changing anytime soon, though, since highly skilled and professional developers seem to be of the rarer lot. You have to find a company that cares, and they have to find developers who are craftsmen.

    I have to say I like this craftsman term for the field. Any developer can make something work, but a great developer would take less time and provide a design that is easier to use, maintain, and extend.

  2. I think I'm going to have a short series of these "On ____" posts stemming from this one. One of the points I want to elaborate on is the idea of valuing the tools (the Microsoft way) vs. valuing the craftsmen (the developer way).