Archive for September, 2012


My organisation is going through radical change. The change is partly structural, partly a move to different ways of doing what we’ve always done. Both will, it is hoped, open up new opportunities for the future. The change is impacting pretty much all of our IT systems and services in one way or another, either by wholesale replacement of systems (long overdue in many areas) or by altering interfaces, data structures and so on.

The interesting thing about all of this is what it does to requirements gathering. Not too long ago, the organisation would come to IT with a reasonably well thought out idea, and it was for IT to scrutinise this and turn it into something that worked. Once the business analysis and systems analysis had been performed you might end up with something rather different, but not radically different, from what was proposed: people don’t generally come to IT unless they need to. Face facts, everyone.

Today, the new applications and services being implemented tend to operate at a very high level, and the “business” (or end users) merely contribute to requirements elaboration. They don’t submit fully formed “requirements” in the way they used to. Their expectation is that “it works”. That is their business requirement. That is not to say they don’t get involved – they often have detailed and passionate input into requirements elaboration and definition, but they don’t drive it. They expect IT to do that.

The conclusion, I suppose, is that we’re moving into an era where IT drive the “requirements” and don’t necessarily wait for the organisation or “the business” (a term I personally hate) to provide them.

I personally have no problem with this at all. It puts IT and Technology firmly in the driving seat of an organisation, where it belongs.

Jason Gorman writes an uncharacteristically strange article on his blog about SQL and relational databases. I take issue with this article on many, many fronts, and I think anyone who spends their time involved with business systems probably would. It’s just odd. There isn’t enough time to fully go into everything I have to say. Maybe a series of posts is needed.

At its heart is an issue that has been around for as long as I’ve been in the industry. The issue is that “software development” and “information systems” are in no way the same thing. Each requires different methods, techniques and tools. Each often requires different underlying technology. I would argue that they require a very different mindset and attitude on the part of the people involved. And, if I may say so, even a different outlook on life.

“Software development” is Photoshop. It’s iTunes. It’s Microsoft Office. It’s the latest game. It’s an app you downloaded onto your iPad. It’s also about scary things you rely on, written by a section of the computer industry you never get to meet: printer drivers. DLLs. Firmware in your television.

That’s all “software”.

“Information Systems”, on the other hand, are quite different. They are about business systems: managing business processes, running your organisation, and delivering competitive advantage and benefits of various kinds.

That’s not to say everything is different. But I have never regarded Information Systems development in the same light as software development. They are not the same.

Having established that, it is certainly true that relational databases generally form a significant part of the underlying technology of business systems. Recording data while at the same time separating it from the applications that use and operate on it is a fundamental principle, and one that, frankly, has served us well over the years and continues to do so. I don’t agree with Jason’s comments that this only leads somewhere bad, or is a practice to be avoided. For business systems, it is a demonstrably good thing.

But separating your data from your application has another big advantage that often gets overlooked: the characteristics and behaviour of the data itself can help influence your design. In Jason’s “outside in” approach (to be fair, supplemented by much of the current thinking based around “user centred” design), people forget that data can take on a sort of life of its own. By doing proper data analysis, we can actually discover scenarios and behaviours we or the users would never have thought of. All of this comes BEFORE we design the external interface – or at least before we finalise it. We actually end up with a better, more rigorous design as a result of using a database.
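The idea that the data itself surfaces scenarios can be sketched with a toy example. Everything here – the table, the columns, the dates – is hypothetical and purely illustrative; the point is that writing a rule down in the data layer forces a question (“can an order be approved before it is raised?”) that a screen-first design might never ask, and then enforces the answer on every application that touches the data.

```python
import sqlite3

# Hypothetical schema: declaring a business rule in the database,
# independently of any application or user interface.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE purchase_order (
        order_id    INTEGER PRIMARY KEY,
        raised_on   TEXT NOT NULL,
        approved_on TEXT,
        -- The data analysis question: can an order be approved
        -- before it is raised? Modelling the data, not the screens,
        -- is what teases this obscurity out.
        CHECK (approved_on IS NULL OR approved_on >= raised_on)
    )
""")

# A perfectly ordinary order: raised, then approved two days later.
conn.execute("INSERT INTO purchase_order VALUES (1, '2012-09-01', '2012-09-03')")

try:
    # A back-dated approval: the database rejects it no matter which
    # application, bulk load or integration attempted the write.
    conn.execute("INSERT INTO purchase_order VALUES (2, '2012-09-10', '2012-09-05')")
except sqlite3.IntegrityError as err:
    print("Rejected:", err)
```

The rule lives with the data, so it still holds when a second application, a migration script, or an overnight feed comes along years later – which is exactly the longevity argument made below.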

Back in the day, data-centred techniques were often criticised for being “slow” and “complicated” – people seemed to forget that the reason for the “slowness” was that the technique was helping you tease out some of the unglamorous obscurities of the system that you or your users might not have considered.

Therefore it’s time well spent.

Of course, you wouldn’t build a game, iTunes, or a printer driver that way, but that’s not the industry I or my fellow business systems people are in. It’s not “software”…

I recently saw an interview with the geneticist Paul Nurse. He said, in effect, that the danger with research is the temptation to preempt the outcomes – to think you know more than you do. That causes all sorts of problems. On the other hand, if you are neutral and learn to look, nature will provide the answers.

You could say the same about analysis. There is the temptation to make up your mind before the work has even started. Sometimes people justify this under the guise of “best practice” and the like. True, we have requirements to adhere to as best we can, and a business case, but they shouldn’t be treated as legally binding – things change, and that’s good, not bad.

Not too long ago we had a contractor BA in who was re-analysing a system I had worked with for many years and knew well. In fact, I was the company expert. Whilst I was explaining it, she would interrupt with comments like “well, what you would normally do there is…” and then go on to describe something she had obviously observed somewhere else.

This approach is dangerously close to just re-creating what you’ve seen or done before. And re-creation isn’t, to my mind, analysis. Who decides “what you would normally do”? Surely organisations are striving for difference and innovation? Settling for the “normal” isn’t necessarily an achievement. Apart from that, the system in question had served us well for the best part of 10 years and had survived with relatively minimal investment. That’s a success in my book. Yet she seemed to have negative presumptions and ideas about it before even speaking to people or getting into any detail.

Re-creationism sounds like some obscure religious sect, and in the IT industry, perhaps it is. But I’d say we should guard against it.