It has been said of IT organizations that, in the past decade, they have suffered from process myopia.
Partly this is due to the transition of IT provisioning from a fundamentally “build” model to a “package implementation” model: package vendors work to add features to their products, which typically appear as ever more comprehensive process models embedded in the code.
But IT has also fallen into this myopia because the simple automation of the 1950s-1980s has been completed. These areas really could be almost completely encoded in software: most organizations want a highly structured and deeply repetitive process for, say, paying staff, paying bills, or handling payments.
The areas we started to automate after the reengineering craze initiated by the work of Michael Hammer and James Champy, however, required something different: reinventing processes that depended on the human elements of judgement, decision-making and alternative pathing, and making them regular enough to allow for an encoded process path in software.
Reengineering might be both passé and somewhat discredited, but the impulse lives on in process myopia.
Yet the areas we are working in now as IT professionals are far more often the realm of what are called “barely repeatable” processes.
These are composed of process elements that admit of rigour and repetitive structure, but which may be assembled and reassembled, with pieces added and subtracted, to make each transaction unique.
The judgement of the person executing the barely repeatable process, as to what is needed, what can be skipped, and in what order events should occur, is situational. (Indeed, this is where much of the differentiation in customer service, or in services embedded into a product, is expressed.)
Encoding these in a single definitive solution is a Sisyphean task, one that often leads to out-of-control projects with deep cost overruns, a severe loss of credibility and, in many cases, an incomplete and unsatisfactory “solution” forced to a premature conclusion.
It is in the attempt to jam the entire set of events into a single process model that such complexity and scope creep arise, no matter whether a single large package, or a suite of them connected by suitable bridges and tunnels, is the chosen implementation model.
Rather, what IT needs to do is to deliver the subcomponents as elements that can be assembled on an as-needed basis.
This may depend on packages — in this case, single-purpose ones — and some custom code to handle gaps, built against a robust and complete single data model (which may have its origins in the firm’s ERP solution).
But the elements can be manipulated by the user at “run time” to meet the needs of the moment, rather than prescribing to the n-th degree how the work must be done.
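The shape of such a system can be sketched in code. The following is a minimal illustration, not a prescription: the class and step names (`ProcessToolkit`, `validate`, `price`, `approve`) are hypothetical, standing in for whatever single-purpose elements a real organization would deliver. The point is that each element is small and rigorous on its own, while selection and ordering are deferred to the person handling the transaction.

```python
from typing import Callable, Dict, List

# A process element: takes a transaction (a dict against a shared data
# model) and returns an updated transaction.
StepFn = Callable[[dict], dict]

class ProcessToolkit:
    """A registry of single-purpose process elements (hypothetical name)."""

    def __init__(self) -> None:
        self._steps: Dict[str, StepFn] = {}

    def register(self, name: str, fn: StepFn) -> None:
        self._steps[name] = fn

    def run(self, step_names: List[str], transaction: dict) -> dict:
        # The selection and order of steps is chosen at "run time" by the
        # person executing this particular transaction, not fixed in code.
        for name in step_names:
            transaction = self._steps[name](transaction)
        return transaction

toolkit = ProcessToolkit()
toolkit.register("validate", lambda t: {**t, "validated": True})
toolkit.register("price", lambda t: {**t, "total": t["qty"] * t["unit_price"]})
toolkit.register("approve", lambda t: {**t, "approved": t.get("total", 0) < 1000})

# One transaction skips approval entirely; another includes it.
t1 = toolkit.run(["validate", "price"], {"qty": 2, "unit_price": 50})
t2 = toolkit.run(["validate", "price", "approve"], {"qty": 30, "unit_price": 40})
```

The same registered elements produce two different transaction paths; no single end-to-end process model is ever encoded.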