Category: Analysis


I recently saw an interview with the geneticist Paul Nurse. He said, in effect, that the danger with research is the temptation to preempt the outcomes. To think you know more than you do. That causes all sorts of problems. On the other hand, if you stay neutral and learn to look, nature will provide the answers.

You could say the same about analysis. There is the temptation to make up your mind before the work has even started. Sometimes people justify this under the guise of “best practice” and the like. True, we have requirements to adhere to as best we can, and a business case, but they shouldn’t be legally binding – things change, and that’s good. Not bad.

Not too long ago we had a contractor BA in who was re-analysing a system I had worked with for many years and had a good knowledge of. In fact I was the company expert. Whilst I was explaining it she would interrupt with comments like “well, what you would normally do there is…” and then go on to describe something she had obviously observed somewhere else.

This approach is dangerously close to just re-creating what you’ve seen or done before. And re-creation isn’t, to my mind, analysis. Who decides “what you would normally do”? Surely organisations are striving for difference and innovation? Settling for the “normal” isn’t necessarily an achievement. Apart from that, the system in question had served us well for the best part of 10 years and had survived with relatively minimal investment. That’s a success in my book. Yet she seemed to have negative presumptions and ideas about it before even speaking to people or getting into any detail.

Re-creationism sounds like some obscure religious sect, and in the IT industry, perhaps it is. But I’d say we should guard against it.

I had a manager once who said, in effect, that IT was like walking a tightrope. If the developers get too much power and influence they will start building space stations when renovating a Volkswagen Golf will do. But if the project managers get too much power and influence then innovation stops and there is an obsession with the ‘on time’ and ‘on budget’ points of the triangle at the expense of the third one, ‘quality’ (or what I prefer to interpret as ‘doing the right thing’).

I draw no conclusions from this, but it gives food for thought.

I have recently found myself having to restate a couple of realities of the software development process. Not a problem, except that I seem to have to do it a bit more often lately. I’m not sure why this is particularly; maybe it is just in response to the usual day-to-day events.

If you’re interested, these are they:

  • #1. “‘Analysis in progress’ doesn’t just mean ‘Business Analysis’. It can equally mean analysis in all of its many forms – data analysis, systems analysis, or simply a developer looking at code”.
  • #2. “The developer’s role isn’t just about building new features. Often it is about helping the team understand what apps we already have by looking at the code, discovering how it works and what it is capable of. Very often this needs to come BEFORE requirements and features are defined”.

Both of these realities are linked. Defining requirements and features often needs some form of technical investigation and analysis to understand what we already have. There may be portions of our applications that nobody understands – even the stakeholders and product owner. They may well look to IT to help them shape what is to be built. Some people throw their hands up in horror at this idea as if it represents some terrible failing or a ‘problem’ to be ‘solved’. This is rubbish. Far from being a failing, it contributes to the problem solving and helps shape the features. This is a good thing.

I can illustrate this with a recent example. One of the applications I support – a contract management system – developed a weird fault whereby certain data on the contracts seemed to ‘expire’ and throw strange error messages after about 5 days. To get round this problem the users found a workaround – the data in question was rekeyed with slightly different names. This worked, but resulted in duplicated data, quite apart from unease about why it was happening at all.

Much time was expended re-understanding the application and considering what might be causing it, but everyone had drawn a blank. The users had no clue, and many were new.

My theory was that because it was a contract system and the contracts had to go through various workflow statuses, maybe the error was because certain activities hadn’t been done in time (i.e. within the 5 days) and the contracts were failing validation as a result. The message was somewhat vague.

At the stand-up I asked for a developer to be allocated. But I then got into a weird loop whereby both the developer and development manager wanted user stories and scenarios. I explained that we don’t have that yet because we don’t understand the problem. “But we need a story before we can start” came the cry. “No we don’t”, I said. “We need to understand the problem first. I can demonstrate it, but someone will need to look at the code”. I restated points #1 and #2 above and after some further battling I eventually got a developer allocated.

I could have written a story, of course. I could have said

“as a user I want the system to allow me to create contracts”

But what’s the point? It doesn’t get us anywhere.

***

You might be interested in the outcome of this.

Well, it turns out that the application was built in such a way that data was held in memory at the application (not session) level, and not everything was being written to the database correctly. Because data was held in memory, everything appeared to work OK. Nothing untoward. The ‘5 days’ phenomenon was because the server recycled its memory every 5 days. This meant that when a user accessed a contract again after 5 days or more, the app would no longer have it in memory. It would then go back to the database, find information missing and throw an error. Nothing to do with the status of contracts. Nothing to do with anything around the business process or how it ought to behave.
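The failure mode can be sketched in a few lines. This is purely illustrative – the class, method names and the five-day recycle are my stand-ins, not the actual application code:

```python
# Sketch of the fault pattern: an application-level in-memory cache
# masks a failed database write until the cache is cleared (the
# server's periodic memory "recycle").

class ContractStore:
    def __init__(self):
        self.db = {}     # stands in for the real database
        self.cache = {}  # application-level (not per-session) memory

    def save(self, contract_id, data, db_write_ok=True):
        self.cache[contract_id] = data  # always lands in memory
        if db_write_ok:                 # the buggy app sometimes skipped this
            self.db[contract_id] = data

    def load(self, contract_id):
        if contract_id in self.cache:   # cache hit: looks fine to every user
            return self.cache[contract_id]
        if contract_id not in self.db:  # after a recycle, the truth emerges
            raise LookupError(f"contract {contract_id}: data missing")
        return self.db[contract_id]

    def recycle(self):
        self.cache.clear()              # what the server did every ~5 days


store = ContractStore()
store.save("C-1", {"name": "Acme"}, db_write_ok=False)  # write silently lost
assert store.load("C-1") == {"name": "Acme"}            # still "works" from memory
store.recycle()                                          # five days pass
# store.load("C-1") now raises LookupError
```

Note that nothing in the business process hints at this: the five-day boundary lives entirely in the server’s recycle schedule, which is exactly why only someone looking at the code and infrastructure could find it.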

Critically, there is no way on earth that either a Business Analyst or an end-user could possibly have discovered this outcome. It was totally different from any of our theories. Therefore the idea of writing a feature for it in advance, before the developer starts work, is totally banal. And as it turned out, no feature was even needed or written. All that was needed was some bug fixes supported by a couple of extra scenarios added to existing features.

A successful outcome but frustrating that it took so long to get there.

An initiative has started at work aimed at making the organisation “simpler”. One is invited to submit ideas as to how, and I suppose it is an initiative to be encouraged.

Having said that, I’ve come across these sorts of initiatives before and they always make me smile somewhat. The implication tends to be that what we have “now” (assumed to be the nightmarish non-simple “complex” organisation) somehow came about on its own. It just “grew” like weeds and brambles taking over your otherwise perfect and well designed garden. No-one was responsible for the complexity building up. It just happened somehow, by itself, right?

This is nonsense of the highest order. Complex organisations don’t just happen: they are designed that way. We made them. So it’s a bit rich that we now complain about what we’ve done and want to undo it, but without acknowledging how we got into the complexity in the first place. It wasn’t some random serendipitous event that brought it about.

I recently saw a television interview where a politician was complaining that police officers do too much desk-based office work and they need to reduce it.

I don’t think it occurred to him that the desk-based office work was there because politicians like him had told the police to do it. People don’t create administration and bureaucracy themselves. It has to be a response to some stimulus or instruction. You might also want to ask him why the admin tasks were brought in at all if lots of time and effort will be spent removing it all again later…

Then there was the UK government department that for years was called the Department of Trade and Industry. Then for a while it was bizarrely renamed the Department of Business, Enterprise and Regulatory Reform. The ‘Regulatory Reform’ bit seemingly also came from the ‘simplicity’ agenda. Again, it seemed to have escaped politicians’ notice that the very ‘regulations’ in need of ‘reform’ (i.e. that were supposedly ‘bad’) were the very ones created and passed by the politicians. Yet they were again behaving as if it was a case of taking a flame thrower to the stinging nettles.

Life is not a series of chance events: it is the product of decisions that people consciously make. These decisions might not always be good ones, but they nonetheless exist. We can’t deny them and pretend everything around us is random. It is important to learn WHY our environment is the way it is and what led us there. Only then can we make meaningful decisions about change – simplicity or no simplicity.

I find it quite ironic that we are continually told how we live in a world of constant and unpredictable change, and yet IT people often seem to get agitated by requirements changing. Or even by anything unexpected or out of the blue happening.

This agitation seems to be increasing. Why?

I could say that what I’ve just said may well come back to haunt me. Yet I first wrote about it 12 or more years ago, and I’ve certainly worked on projects that have fallen victim to requirements changing, sometimes in the most dramatic way. So you could say it already has. I’ve also lived through occasions where there weren’t even any proper ‘requirements’ at all. Just a vague set of statements from senior levels about what broadly (sometimes VERY broadly indeed) needed to happen. It was for IT to work out what they needed to do.

So sod it. Let’s say it again.

In my view, there is often not enough change. The problem doesn’t lie with ‘change’ in itself. It lies with the fact that people are often uneasy or nervous about instigating initiatives because they don’t understand the consequences of the change. They don’t know where it will end up. It could actually be fairly straightforward – but we don’t know. This can lead to a quest for the ‘safe’ route or perhaps abandoning initiatives entirely. It can also lead to people not being open and honest with their intentions for fear of opening a ‘can of worms’. Yet ‘cans of worms’ (or sacks of snakes sometimes) are what IT is all about, surely?

What concerns me is the mindset that seems to regard the unexpected or unforeseen as some sort of failure. It is increasingly seen as a ‘target’ or ‘commitment’ that hasn’t been met, as if it were indicative of personal deficiency.

This is all folly: It implies that change is ‘good’ on the one hand yet is a ‘problem’ to be ‘solved’ on the other when it, err, changes. We seem to be moving into an era where people are terrified of the unexpected or of anything that can’t be completely planned for up front. The myth is perpetuated that the unexpected, unpredictable and unplanned can in some way be designed out. That if only we do things “right” or learn some secret rules or knowledge we will end up with perfection.

This is also folly: To me, IT isn’t about trying to prevent, control or design out the unexpected, but to do what we can to prepare ourselves for the inevitability of things not going according to plan.

I hate this phrase with a passion. All it means is that you’ve sketched out some ideas as a diagram using something like Mind-Mapping, Rich Picture or a Context Diagram and that’s it. Nothing more, nothing less. Maybe it doesn’t even follow any convention at all. So why not simply say “I’ve produced a basic diagram to illustrate”?

Agggh!

Last year I wrote an article triggered by some wall charts I had seen describing the steps in “agile projects” and “waterfall projects” – both of which looked pretty much the same to me.

http://www.mikedorey.f2s.com/blog/index.php/2011/06/19/the-same-but-different/

Of course, “Agile projects” don’t really exist. The Agile philosophy is based around continually building on and improving a “product”, a piece at a time. Furthermore, this building never stops.

This is, of course, at odds with the traditional “Project” notion.

You can go further and get deep and meaningful about it and say that the whole “project” mindset negates the agile ideals, and in fact, it is the project concept itself that is the reason for delivery failures. The notion that we do something that conveniently ‘starts’ and ‘finishes’ (the basis of a waterfall project) might be administratively convenient, but is a bit of a nonsense if we are dealing with something that is going to be with us for a long time. As most IT systems are.

I have been attending some training courses recently. One of the most enjoyable aspects of a course is always meeting the other delegates and swapping ideas. Traditionally, you would chat about the subject matter and generally put the world to rights over a lunch consisting of plates of those small triangular sandwiches that have done so much to make the world a happier place. Even if you ate in surroundings reminiscent of a hospital accident and emergency waiting area it was always fun.

In these cash-strapped times, the sandwiches have long since gone. You’re lucky to even get those plain cookie-type biscuits that nobody likes or eats – unless they are on a training course. Socialising has given way to grabbing something from paid-for vending machines even less likely to dispense your chosen item than those mechanical grab machines in fairgrounds.

But last week, we still built up quite a rapport and had some interesting conversations.

Two of the delegates were in the process of introducing agile into their companies and asked for advice from the rest of us. There were some very VERY interesting insights about people’s experiences. This sort of thing is almost worth the cost of the course in itself and you certainly won’t get it from any book.

I was the fourth person in to give an opinion, and explained my view:

Agile is a better way of running your projects. It generally reflects what people know deep down to be the right things, but that have somehow been suppressed or forgotten. But it comes with a proviso. A big proviso.

That proviso is that the people on an agile project need to know what they are doing.

Knowing what they are doing doesn’t mean technically competent or even being able to apply technical skills. We can assume the team have all of that, which is why they are there. It goes deeper. It means understanding the applications and technology from a business perspective, and understanding the business context of the project and where it is going so we can shape the requirements. (Apologies if that sounds a bit high and mighty…).

If we have that, then we have what the agile books refer to as “a good team”.

We can then apply agile to it and hopefully make “a good team great”.

The “applications and technology from a business perspective” bit is key. Most projects nowadays involve existing, legacy, systems. This isn’t the 1970s. These applications will have business rules, logic and behaviour embedded in them. Often this is not properly understood, and furthermore it is not discoverable by the users. You need developers to help understand it by looking at the code. But if you have developers that don’t want to work on anything pre-existing, and feel aggrieved if they can’t work just on the glamorous “new” stuff (and I’ve encountered plenty of those, I can tell you) you will have a problem. This mindset certainly isn’t that of a good team.

To be fair, I’ve also encountered business analysts that are reluctant to look into pre-existing IT systems too closely. They seem to think this will in some way contaminate the finished solution. For heaven’s sake. They exist. Therefore you have to look at them. If people were using ring binders and a Rolodex to manage their process you would look at those, so why get agitated about pre-existing IT? There may well be much to learn from it.

Examining what you have, and having a deep and meaningful understanding of it, can shape and influence the new requirements. It isn’t necessarily a case of even understanding the requirements at this stage: Like cooking, it is more a case of understanding the ingredients and what they are capable of first. Then, we can focus on the recipe.

Time for some biscuits.

Project plans are, it seems to me, like weather forecasts: well intentioned, produced using the best available information, worth looking at certainly, but frequently wrong.

This doesn’t mean we shouldn’t have them, necessarily. Nobody suggests not having weather forecasts just because they are often wrong, and you could say the same about project plans. There are certainly situations where project plans don’t add any value and I don’t see why they are there, frankly. But I can accept that they sometimes have a part to play – if only to provide a guide to the general direction of travel. That is fine by me.

Ultimately it is people’s perception that is the problem. Plans and forecasts don’t seem to be enough nowadays. People expect absolute certainty and predictability in everything, even if we know in our heart-of-hearts this is an unachievable nonsense. Plans and forecasts need to be treated as such – with a degree of caution. Points of reference are sensible, but whole-hearted reliance and an expectation of quasi-scientific perfection is just silly.

We should know better by now.

The word “Wrong” also needs examination. People often seem to assume that something “wrong” can always be put “right”. It is often further implied that the wrong could have been avoided entirely if only we had done something differently: If only we had taken some (inevitably elusive) action. If only we had gone that little bit further to make the plan perfect. Damn. No matter – next time it will be better and we will improve, right?

This is just hindsight playing tricks, I’m afraid. A wonderful thing, but also depressing and destructive if you don’t keep it in check.

The most dangerous thing is that it perpetuates the myth – and that’s what it is – that the “perfect” project plan is achievable. Like the perfect weather forecast…

Requirements get a hard time. People often seem scared of them. They are viewed as dangerous animals. They need taming. They need to be controlled, prioritised, re-prioritised, scrutinised, numbered. They are scary, unpredictable beasts.

And more terrifying than that: They Change.

Yet this shouldn’t be a problem at all, should it? It is ironic that we supposedly live in a world of continual change, where adaptation is absolutely central, yet people seem to have a problem with requirements changing. Bizarre. The fact that requirements change isn’t the problem. It is how we deal with the change that is the problem. If we understand our systems and how they behave and what they are capable of, why would unexpected changes in requirements be a problem? If anything, such changes should be encouraged. That’s what we’re here for. Furthermore, it ought to be the creative, rewarding part of the job.

Compare that to a situation where you have a suite of poorly maintained and poorly understood systems. In this scenario, you’ll have problems. Simple as that. (For a further discussion of this, read the rest of this blog).

The project I recently completed was successful in the face of incredible change, alteration and general vagueness. It was an object lesson in it. What was eventually delivered was almost the complete opposite of what the programme manager and other senior staff suggested at the start they wanted. So to say the requirements changed and were vague is putting it mildly. As a result, it could have been a catastrophic disaster.

Yet it wasn’t. So why wasn’t it?

Well, simply because the application in question was well understood.

Through what seems to me to be an accident of history as much as anything, it had been well invested in over the years, in that it had dedicated development staff to maintain it. This in turn was because its technical platform necessitated it, rather than being a conscious choice. This is an investment that has been rewarded many times – on this and other projects.

It is sadly very common nowadays to see people attempt to counter these issues by imposing more and more ‘control’. More and more scrutiny. This inevitably results in more and more administration. More and more reviews. More and more reporting. Well intentioned as it is, it also takes people away from doing the work itself.