I’ve been approached many times for my thoughts on the subject of security.
In each case, the IT group asking was looking for advice on how to better secure systems, stop end users from modifying their computing environments, protect data against unauthorized use, and so on. In almost every case, a new physical-security effort was also under way, which had led to a corresponding network security upgrade.
When asked, “Why are you so concerned? What do you expect to happen that would require this level of security?” not one of them could give a good answer. Each was responding to a more general condition, which they couldn’t identify (in other words, they were not carrying out instructions from above).
What is going on here? I believe that, for many of these IT people, security has become a line in the sand, a last bastion of the old days of IT “being in control”.
Between outsourcing, consultants installing packages, business areas buying process support, an ever-increasing proliferation of device types and the like, security has become the untrumpable card to play.
I do not, of course, believe that systems should be unsecured. I do think that many organizations already over-secure their IT environments, and thus lose opportunity. Some examples:
One client is almost in despair: its attempts to move off the old client-server version of its ERP package have been turned down year after year.
Yet this same organization fights tooth and nail to control the desktop.
Letting a little diversity into the desktop realm, for instance giving the creative types in marketing the Apple devices they want (Macs, iPads, iPhones) and letting things like that spread, would quickly create a need for a web-based interface, which is a standard part of the upgraded package and unsupported in the old client-server version.
Another is adamant that access to data must be strictly limited.
In this organization, although they have spent tens of millions on data warehousing and business intelligence solutions, only select “analysts” are allowed to actually have access to the data.
Everyone else must send in an information request, and wait.
The analysts have learned over the years to give people exactly what is specified, since offering corrections only caused trouble ("how dare you presume I don't know what I want!"). Alas, the average request takes three attempts to get at the "right" information. Most people no longer bother.
These interactions have convinced me that companies like Microsoft had it right, back in the 1990s, when their answer to disclosure rules such as Regulation FD was simply to tell everyone "don't be stupid" and let them have access to all the company's information anyway. Universities, where every bit of technology known to mankind gets hooked up at one time or another, are on to something similar when they start from the premise that only a few common bits of technology (browsers, for instance) need to be defined.
Much good, in fact, seems to come out of these looser worlds, and their support costs are not much higher — and sometimes, a little less.
Do you suppose IT would gain more support by being more flexible and open, instead of making decisions that limit what the business can do?
I do. In fact, I think every move to tighten security ought to be weighed against the opportunities it will foreclose, and when in doubt, err on the side of opportunity.
The resulting serendipitous agility will more than pay for the odd incident.