Saturday, 18 March 2006

Teaching Software Engineering

Heh. Having spent a not insignificant proportion of the last 1.5 years doing HR work, I feel a great deal of sympathy when reading about the plight of others stuck with the same duties. Some amusement too, as I recognize the problems and issues they face.

Today's fun article is about teaching the Waterfall method in schools. I wince in sympathy, because almost all of the people I interviewed, if they knew anything about development at all, knew only this method. Yet it is a method we (as in Exoweb) know doesn't work very well for us.

It's a nice, sunny Saturday afternoon so I'm too lazy to ruminate on why schools put too much emphasis on the Waterfall method and SEI methodologies, but I have been recently rambling to colleagues about a few complaints I had with my own college experience in software engineering:

  • Overly simplified
  • Short term projects
  • No Challenge

Overly Simplified

This is related to the Waterfall issue. I realize that colleges first try to teach us the basics, then move on to the more complex stuff. But sometimes the basics are so oversimplified that we learn the wrong things; the Waterfall method is a prime example. To me, the central failure of the Waterfall method is the assumption that it is possible to gather perfect requirements and that they will never change. Working life has taught me that no plan survives first contact with reality. That lesson was most painfully learned.

What is sad is that too many people I meet still stubbornly stick to what they were taught in college. I still see too many people/organizations spending months trying to gather all the requirements while competitors gain a head start by producing an imperfect but workable product. I see man-years of developer time spent haggling over little requirement details, only to find the client or market has changed requirements in the time it took for them to sort out the exact details.

Yes, requirements are important, and requirements gathering is the cheapest stage of the software development process at which to make changes. Cowboy hacking just as frequently leads to disasters. However, there is a point of diminishing returns, and most people following the Waterfall process go way past it. Agile Development offers the best middle ground that I have found to date.

So, to wrap up this section of the rant: if schools would quit oversimplifying things, the tragedy of the one-year requirements-gathering phase would not occur.

Short Term Projects

Almost all college projects last for the duration of a single class: one semester, a few months in length. This means a student typically spends a semester building a system that works, then forgets about it afterwards.

The problem with this approach is that, as in construction, it is much easier to build a small shack than a skyscraper. If you are just slapping a few pieces of wood together to cover some random stuff in your backyard, you aren't really concerned about how good the foundation is, or whether the darned thing collapses a few months later; it's not that hard to rebuild. Screw up the foundation of a skyscraper, on the other hand, and very horrible things happen. As in software, those screw-ups become apparent very late, when the cost of changing things (or of failure) is very high. Yet one-semester projects mostly teach us the habits required to build small shacks.


No Challenge

There is a quote from Peopleware that I enjoy about good builders:

"The minimum that will satisfy them is more or less the best quality they have achieved in the past."

This seems to be true of me (not that I consider myself a great builder) and of many great developers I respect. I cannot be sure it applies to everyone, but it seems true enough for most.

The problem is that most schools don't hold their students to high standards, or even show them that such standards exist. If the "best" they've ever produced is code that doesn't even compile (I know quite a few professors who don't bother to check this), then they will always be satisfied producing crap, because they don't know any better.

I see this in some of the fresh grads I interview: they are, in theory, among the smartest kids graduating from their college that year. They have the highest grades, they've achieved more than their peers ... they think they are kings of the world. The only problem is that, compared to the truly best in the world, they are crap. They don't automatically strive to improve their code; they use suboptimal algorithms, miss corner cases, and so on.

I have had classmates who graduated after two years of courses taught in C++, yet still did not know what a pointer is. I have interviewed candidates who graduated with a bachelor's degree in computer science but had never written a line of code in their lives. These schools do a great disservice to our profession and to society in general (think of the cost of all that crappy code out there).

I know this has been suggested before by others, but perhaps one thing that would improve matters is a minimum competency exam, administered by a certification board. Professions such as law, medicine and accounting all have professional organizations that set minimum standards and administer an exam that anyone must pass before practicing as a lawyer, doctor or certified public accountant. Perhaps we are approaching a time when software developers, too, must demonstrate minimum competency before being allowed to work on things like nuclear power plant controls or medical equipment. I know I would sleep better at night knowing my pointer-incompetent classmate was not writing the code for medical equipment that would one day be used on me.
