Archive for October, 2013


I’ve found myself having some conversations about Cloud Computing recently: one with some work colleagues, and one with a student friend who is studying the subject.

I will be clear right from the start and say that I am not sold on the cloud. The idea of more closely aligned, connected and accessible technology services undoubtedly makes sense. The idea of organisations containing other organisations, and having suppliers, partners and other dependencies closely connected through technology, also makes sense.

But Cloud Computing always seems to be based on the premise of organisations adopting generic solutions that already exist somewhere. It seems much less about building something new, innovative and unique and taking control of it. That might be fine for certain applications that genuinely are generic – spreadsheets, document editing, email and the like. But what about our key business systems?

This, I would argue, exposes the two fundamental and competing forces that underpin IT, namely:

1. The desire to do new, novel and unique things – which means innovation and difference. Obviously we all hope that these can deliver ‘competitive advantage’ and distinctiveness.

Versus

2. The desire to adopt ‘best practices’ or ‘industry standard’ ways of doing things.

So, if we do new things and have innovation, we might end up with a labyrinthine, complex, or even somewhat ramshackle environment, but perhaps it works. On the other hand, if we adopt the standard, ‘tried and tested’, ‘best of breed’ ways of doing things, we run the risk of simply becoming a clone of our competitors. And worse still, dull into the bargain. By adopting the ‘best’ (a subjective idea anyway), people in the organisation might actually find it harder and harder to innovate and to change, constrained as we are by the generic Cloud services we have bought into. Ironic, considering we are led to believe we are living in an era of constant change.

Like the Project Triangle of Time-Quality-Cost, the above conflict of ‘different’ and ‘unique’ versus ‘standard’ and ‘best’ isn’t a ‘problem’ to be solved. It is an inevitability to be accepted, and rightly so. We should be wary of anyone who thinks ‘they’ve cracked it’. Technology provides almost endless possibilities, but it also runs the risk of creating unintentional barriers, even prisons, if we are led by the technology rather than influencing and controlling it.

This is a video you may have seen:

It shows a toddler sitting up holding a magazine. She tries to swipe it – she tries to expand it – she bangs it to try to make it play. Nothing happens. And in frustration she throws it away. To a toddler a magazine is a tablet that’s broken.

This is – it is said – how this generation is growing up. It will have a totally different set of norms and behaviours.

I have just stepped away from a somewhat heated online debate about this. Below is verbatim what I said:

I have to say I don’t agree. How do we know that the toddler thinks the magazine is a broken tablet? They might just toss it aside the way they would anything they don’t (yet) understand. Lego, say. Humans assess such things the way they assess anything else: through their individual world view, which changes over time. The same toddler probably won’t see the risk associated with grabbing a knife from the kitchen table or crawling over the top of a flight of stairs. Those ‘norms and behaviours’ aren’t fixed.

By extension, that doesn’t mean they won’t discover magazines later in life and really like and value them. By further extension (which is the underlying message behind this clip), it doesn’t mean that the same toddler/pre-schooler/teenager won’t grow up to value traditional media – radio, television, books and so on. People’s attitudes change. Television audiences are going up, not declining. Radio has been written off for 40 years but is still as strong as ever.

The trap many fall into (and Marketing people especially) is thinking they can examine the habits and behaviours of a specific group of people and extrapolate the results to predict the future. This is nonsense. The world doesn’t work like that.

It’s possibly also worth pointing out that Marie Claire isn’t aimed at toddlers.

I was talking to some IT friends the other day about requirements gathering. This subject has long been a source of problems (sorry – challenges) for our industry, for a whole range of reasons we all know well: vagueness, change, estimation, conflict and so on. But the discussion moved on to high-level ‘strategic’ or ‘transformational’ projects where you might not have any requirements at all. You might get some high-level goal like “we want to build an online shop” or “we want a replacement system to support more suppliers”, but these aren’t really ‘requirements’ as we have known them. And the more I talk to people, the more there seems to be a sense that the concept of traditional ‘Requirements’ is diminishing or often doesn’t exist at all. If there is a requirements list, it pretty much consists of:

#1: “Make it work”
#2:
#3:

So are we witnessing the death of the ‘requirement’?

You can contrast this with earlier years, when end users would often approach technology proactively with a rough idea. In my experience this would need shaping, but it would generally be pretty well formed to begin with. Technology then went ahead and built or implemented it. I daresay at this point some people will throw their hands up in horror at such a simplistic view of the world, but trust me, it can work. The notion of end users suggesting a project and technology responding with “Yes. I see what you mean. Let’s do it” can actually result in success. It really can be that simple.

This, I suppose, suggests a ‘bottom-up’ culture where people lower down the food chain feel confident and empowered enough to make such a suggestion, and where management ensure they are trusted and supported to see it through. I’m not disputing it can have a downside, of course: individual parts of the organisation doing their own thing, with uncoordinated chaos as a result. ‘Local optimisations’ can spring up. But not always. If people talk and share, it can work perfectly well.

The alternative is where people stop making suggestions, stop attempting to change or innovate and just accept what they have. What sort of a bizarre world is that?

So what of the requirement, then?

I don’t know the answer, frankly. What I do know is that some will say it is a non-issue, because the same people ‘further down the food chain’ know the intimate details of their jobs and processes so well that they can easily bring about the necessary change themselves. This is dangerous nonsense. Technology people need to be in there to assist the users. It’s not unreasonable for users to expect technology people to help educate them and to explain better or ‘right’ ways of doing their jobs. For me, this was a major driver behind going into this industry in the first place.

The death of requirements – if that is what it is – is not a good thing. Perhaps it is indicative of technology’s commoditisation and marginalisation. If that is the case, it is wrong, and ultimately catastrophically bad for everyone inside and outside our industry.