Studies show that over 70% of all Information Technology projects fail. Who is the perpetrator of these crimes? A little detective work identifies three possible suspects: technology, people, and processes. Which one is the culprit?
It’s extremely rare for a computer to make a mistake. Given the same inputs and state, a computer always generates the same outputs, so the fault doesn’t seem to lie in the hardware. Software is not as reliable as hardware; as the old saying goes, the only bug-free program is a trivial one. That said, it is usually not the software that causes projects to fail. Software is a flexible medium and easy to change, so if a bug is severe enough it can usually be trapped and killed, or a workaround discovered.
People make at least as many mistakes as software does. But people have been around far longer than computers, and before computers most projects seemed to complete successfully. It turns out people are pretty flexible too; they are good at adapting to almost any situation. Given a problem, a group of people can almost always come up with a creative way to solve it.
That leaves process as our last remaining suspect. Process is hard to put a finger on: it is usually invisible, inflexible, unstructured, and undocumented. Process is such a shifty character that it must be the villain. As it turns out, process is indeed the cause of most project failures, because: 1) complex processes interact with each other in complex ways, 2) the impact of process is rarely considered during analysis, design, development, and implementation, 3) an error in a process is nearly impossible to detect, and 4) processes are difficult to change.
To reduce the odds of failure, IT projects should do two things: 1) accurately document both the current-state and proposed future-state business processes impacted by the project, and 2) implement change management with all of the rigor of project management, so that process transformation keeps pace with the technical transformation.