Why can't the IT industry deliver large, faultless projects as quickly as other industries do?

Ed Yourdon's Death March touches on a number of these meta-questions.

In general, the software industry lacks much of the following, and that gets in the way of large projects:

  • Standardization and work item breakdown. This has certainly gotten better, but the design constructs still aren't there to break out a big system. In some ways, the industry can't even agree on what's needed for a given project, much less break it down into components. Aerospace, building construction, the auto industry, etc. all have very component-driven architectures with reasonably tight interfaces that allow fully parallel development; software still allows too much bleed-through in the corresponding areas.
  • A large body of successful, similar projects. The A380 wasn't the first big airplane Airbus built. There are plenty of large software applications out there, but many of them have suffered dramatically in one aspect or another and wouldn't come close to being called "successful."
  • A large body of designers and builders who have worked on a number of similar, successful projects. Related to the point above: without people who have been there and done that, repeatability becomes very difficult.
  • "Never" building the same thing twice. In many ways, one airplane is like any other: it has wings, engines, seats, etc. Large software projects rarely repeat themselves. Each OS kernel is significantly different; look at the disparity in file systems. For that matter, how many truly unique OSs are there? The big ones become clones of a base item at some point. AIX, Solaris, and HP-UX all hark back to AT&T System V; Windows has carried an incredible amount forward through each iteration; Linux variants generally trace back to the same core that Linus started. I bring it up because the variants tend to propagate faster than the truly unique, proprietary OSs.
  • Reliable project estimation. Because the repeatability factor is so low, it's difficult to project how large something will end up being and how long it will take to build. And since project managers and management can't put their hands on the code and actually see what is being done, unrealistic timelines get generated.
  • Heavy emphasis on QA / QC. Quality assurance isn't emphasized as heavily as it could or should be for larger projects. This goes back to the loose interfaces between components and the lack of rigid specifications for how components should work; that looseness allows unintended consequences and lets bugs creep in.
  • Consistently measurable qualifications. People generally speak of the number of years they've worked in X language or in programming, with time served standing in for caliber of skill. As has been mentioned many times before, interviewing and finding good programming talent is hard; part of the problem is that the definition of "good" remains very subjective.
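The "tight interfaces" point above can be made concrete with a small sketch. This is a minimal, hypothetical Python example (all names are illustrative, not from any real project): one team builds an implementation against a rigid contract while another team writes callers and shared conformance tests against the interface alone, which is what lets the work proceed in parallel.

```python
from abc import ABC, abstractmethod


class StorageBackend(ABC):
    """A deliberately tight contract: every backend must honor
    exactly these operations with exactly these semantics."""

    @abstractmethod
    def put(self, key: str, value: bytes) -> None:
        """Store value under key, overwriting any previous value."""

    @abstractmethod
    def get(self, key: str) -> bytes:
        """Return the stored value; must raise KeyError for unknown keys."""


class InMemoryBackend(StorageBackend):
    """Team A builds and tests this concrete implementation..."""

    def __init__(self) -> None:
        self._data: dict[str, bytes] = {}

    def put(self, key: str, value: bytes) -> None:
        self._data[key] = value

    def get(self, key: str) -> bytes:
        return self._data[key]  # dict already raises KeyError


def conformance_check(backend: StorageBackend) -> None:
    """...while Team B writes callers and shared tests against the
    interface alone, never against a concrete implementation."""
    backend.put("greeting", b"hello")
    assert backend.get("greeting") == b"hello"
    try:
        backend.get("missing")
    except KeyError:
        pass  # the contract demands this failure mode
    else:
        raise AssertionError("contract requires KeyError for unknown keys")


conformance_check(InMemoryBackend())
```

The looser the contract (vague error behavior, undocumented side effects), the more "bleed-through" between components, and the less the two teams can truly work independently.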

I don't mean to be all negative, and I think the software industry has made significant strides from where we've been. Forums like this and others have really helped promote conversation and discussion of design principles. There are organizations working to standardize on "baseline" knowledge for software engineers. There is certainly room for improvement, but I think the industry has come a long way in a reasonably short period of time.
