Robert Oppenheimer, working at Los Alamos during the development of the atomic bomb, was once asked why he'd done it.
"When you see something that is technically sweet, you go ahead and do it," he said.
A lot of chatter in business today — and by business I include government and the third sector — turns on the magical properties of what’s known as “big data”.
There’s a presumption that spending millions massaging, sifting, analysing and reporting on masses of data, both internal and external, will somehow reveal the secrets needed to take the organization forward.
Been in your data centre lately? You’ve had reams of data for years. Why is it that suddenly you’re going to make use of it?
Oppenheimer, of course, didn’t have to bear the consequences of his experiment. He could play the disinterested scientist, only going after the facts, and leaving the moralizing and judgement to others.
Later, when he saw that he did, in fact, like the rest of us, have to bear the consequences, he changed his mind.
For an enterprise, it's not necessarily so easy. The wider world was sluggish enough, and diverse enough, that Oppenheimer could change his mind and survive doing so. Enterprises — even governments — don't have that kind of staying power.
Let's be clear: we've systematically used information technology to limit decision making over the past thirty-plus years.
People still have responsibilities on paper, but the reality of delegated financial authority, required sign-offs, and the like means that few can actually exercise much of it. Moreover, by removing all the slack from budgets, and requiring that absolutely every initiative "work out", we've put a high premium on not making mistakes, and on not backing away from approved courses of action.
In other words, we’ve created an environment (and systematically culled each and every iconoclast who’d press back against it) where almost no one will willingly stick their neck out and make a decision on their own.
Pray tell, therefore, why would suddenly making information available somehow unleash waves of innovation, responsiveness to markets, and the seizing of opportunities?
How many middle and senior managers are willing to stand up and say “I don’t know what’s going on” when asked? For that is the price tag of unleashing big data on the organization and demanding it be used for generating results: you won’t know everything that’s going on.
Safe-to-fail experimentation and control from above are mutually exclusive. Yet it’s only in safe-to-fail experimentation that the information exposed through analysing all that data gets tested, and innovation to put it to work occurs.
In other words, to really get value from masses of information, you have to unlock your organization. Seriously change it — and its management expectations and methods.
Here’s a paradox to reflect upon: in many ways the ancestor of our organizations, the military, is more capable of handling these changes than the average private sector enterprise is.
The role-based structure and fixed reporting lines inherent in armies, navies and air forces turn out to be surprisingly flexible, precisely because more senior officers know they can't direct the troops in real time. (Nor should they.) Colonels do not sign off on every little issue from the stores for corporals. Compare that to Senior Vice-Presidents signing off on whether or not a clerk gets a headset for their telephone (and yes, that's the norm these days).
You can't use information if people aren't free to put it to work. Thumbscrew-tight budgets, top-level control, and "no surprises, no failures" all work against that.
If you want value from masses of data, you'd better start by rethinking how your organization is managed.