
Jason Gorman writes an excellent post which sums up so much of what we know about software development but aren’t always able to acknowledge. I don’t necessarily agree with the “Government shouldn’t be the customer” part – I see no problem with the government being the customer: it only becomes a problem if they have the wrong attitudes and ideology. This is vitally important in the public services.

The rest of it is pretty much on the money in my view.

I’ve found myself having some conversations about Cloud Computing recently. One with some work colleagues, and one with a student friend who is studying the subject.

I will be clear right from the start and say that I am not sold on the cloud. The idea of more closely aligned, connected and accessible technology services undoubtedly makes sense. The idea of organisations containing other organisations and having suppliers, partners and other dependencies closely connected through technology, also makes sense.

But Cloud Computing always seems to be based on the premise of organisations adopting generic solutions that already exist somewhere. It seems much less about building something new, innovative and unique and taking control of it. That might be fine for certain applications that genuinely are generic – spreadsheets, document editing, email and the like. But what about our key business systems?

This, I would argue, exposes the two fundamental and competing forces which underpin IT, namely:

1. The desire to do new, novel and unique things – which means innovation and difference. Obviously we all hope that these can deliver ‘competitive advantage’ and distinctiveness.

Versus

2. The desire to adopt ‘best practices’ or ‘industry standard’ ways of doing things.

So, if we do new things and innovate, we might end up with a labyrinthine, complex, or even somewhat ramshackle environment – but perhaps it works. On the other hand, if we adopt the standard, ‘tried and tested’, ‘best of breed’ ways of doing things, we run the risk of simply becoming a clone of our competitors. And worse still, dull into the bargain. By adopting the ‘best’ (a subjective idea anyway), people in the organisation might actually find it less and less easy to innovate and to change, as we become constrained by the generic Cloud services we have bought into. Ironic, considering we are led to believe we are living in an era of constant change.

Like the Project Triangle of Time-Quality-Cost, the above conflict of ‘different’ and ‘unique’ versus ‘standard’ and ‘best’ isn’t a ‘problem’ to be solved. It is an inevitability to be accepted, and rightly so. We should be wary of anyone who thinks ‘they’ve cracked it’. Technology provides almost endless possibilities, but it also runs the risk of creating unintentional barriers, even prisons, if we are led by the technology rather than influencing and controlling it.

This is a video you may have seen:

It shows a toddler sitting up holding a magazine. She tries to swipe it – she tries to expand it – she bangs it to try to make it play. Nothing happens. And in frustration she throws it away. To a toddler a magazine is a tablet that’s broken.

This is – it is said – how this generation is growing up. It will have a totally different set of norms and behaviours.

I have just stepped away from a somewhat heated online debate about this. Below is verbatim what I said:

I have to say I don’t agree. How do we know that the toddler thinks the magazine is a broken tablet? They might just toss it aside the way they would with anything they don’t (yet) understand. Lego, say. Humans assess such things the way they assess anything else: through their individual world view, which changes over time. The same toddler probably won’t see the risk associated with grabbing a knife from the kitchen table or crawling over the top of a flight of stairs. Those ‘norms and behaviours’ aren’t fixed.

By extension, that doesn’t mean they won’t discover magazines later in life and really like and value them. Extending further (which is the underlying message behind this clip), it doesn’t mean that the same toddler/pre-schooler/teenager won’t grow up to value traditional media – radio, television, books and so on. People’s attitudes change. Television audiences are going up, not declining. Radio has been written off for 40 years but is still as strong as ever.

The trap many fall into (and Marketing people especially) is that they think they can examine the habits and behaviours of a specific range of people and extrapolate the results to predict the future. This is nonsense. The world doesn’t work like that.

It’s possibly also worth pointing out that Marie Claire isn’t aimed at toddlers.

I was talking to some IT friends the other day about requirements gathering. This subject has long been a source of problems (sorry – challenges) for our industry, for a whole range of reasons we all know well – vagueness, change, estimation, conflict etc. But the discussion moved on to high-level ‘strategic’ or ‘transformational’ projects where you might not have any requirements at all. You might get some high-level goal like “we want to build an online shop” or “we want a replacement system to support more suppliers”, but these aren’t really ‘requirements’ as we have known them. And the more I talk to people, the more there seems to be an increasing sense that the concept of traditional ‘Requirements’ is diminishing or often doesn’t exist at all. If there is a requirements list then it pretty much consists of:

#1: “Make it work”
#2:
#3:

So are we witnessing the death of the ‘requirement’?

You can contrast this with earlier years, when end users would often approach technology proactively with a rough idea. In my experience this would need shaping, but would generally be pretty well formed to begin with. Technology then went ahead and built or implemented it. I daresay at this point there will be some people who will throw their hands up in horror at such a simplistic view of the world, but trust me, it can work. The notion of end users suggesting a project and technology responding with “Yes. I see what you mean. Let’s do it” can actually result in success. It really can be that simple.

This, I suppose, suggests a ‘bottom-up’ culture where people lower down the food chain feel confident and empowered enough to make such a suggestion. Furthermore, management ensure they are trusted and supported to see it through. I’m not disputing it can have a downside, of course – individual parts of the organisation doing their own thing, and uncoordinated chaos as a result. “Local Optimisations” can spring up. But not always. If people talk and share, it can work perfectly well.

The alternative is where people stop making suggestions, stop attempting to change or innovate and just accept what they have. What sort of a bizarre world is that?

So what of the requirement, then?

I don’t know the answer, frankly. What I do know is that some will say it is a non-issue, because the same people “further down the food chain” will know the intimate details of their jobs and processes so well that they can easily bring about the necessary change themselves. This is dangerous nonsense. Technology people need to be in there assisting the users. It’s not unreasonable for users to expect technology people to help educate them and to explain better or ‘right’ ways of doing their jobs. Personally, this is a major part of why I went into this industry in the first place.

The death of requirements – if that is what it is – is not a good thing. Perhaps it is indicative of technology’s commoditisation and marginalisation. If that is the case, it is wrong and ultimately catastrophically bad for everyone inside and outside our industry.

In recent weeks the newspaper media sections have all carried articles proclaiming that “television is back”. These have been prompted by a report from OFCOM, the UK communications regulator, on The Communications Market. The report states that audience figures for television are strong and that it holds a dominant position in the UK’s media landscape.

You could conclude that far from being “back”, it seems it never went away.

The report also talks about the phenomenon of “meshing” – watching television whilst talking about it on Facebook. This is also on the rise, and I wrote about it previously here. It is an interesting example of how ‘new’ media is attracting people to (rather than away from) ‘old’ media. In doing so, ‘old’ media is strengthening, not declining.

Who’d have thought it, eh?

The chart below is the one people should study and inwardly digest. From it you get a feel for just how strong television is, and for what most of the public always knew: on-demand services and the internet feature in their lives, but television is still the primary medium. The alternatives don’t represent a replacement for it, or a migration away from it. Note that television viewing is going up.

[Chart: viewing figures from the OFCOM Communications Market Report]

Radio – a medium people have been writing off for 40 years – is also strong and its reach stays pretty much the same year-on-year.

There has long been an unstated piece of received wisdom in media circles that traditional television and radio are on their way out at the hands of the Internet and the on-demand services that have stemmed from it. Few people state this belief openly; it is a subtly conveyed, ambient thing. A drip feeding.

And it is of course nonsense. There is not the tiniest fragment of evidence for television and radio’s demise.

So what are we to make of all this? Well, I’ve written previously on these pages about the inability of the technology industry (or part of the technology industry at any rate) to come to terms with the notion of more than one way of doing something. It is at its most comfortable with wholesale replacement. New versions. Clear demarcation between old and new. This is perpetuated by various ‘experts’, ‘visionaries’ and assorted ‘blue sky thinkers’ – most of whom have no idea about what is really going on.

The public clearly think otherwise and people on the bus home would give you greater insight. It is a pity they are not consulted more often.

In England, a new national curriculum is apparently being introduced for secondary schools. Two of the things in it are the 12-times table and fractions. But why are we still teaching either? The reason decimal notation was invented (by a Dutchman, which is supposedly where “going Dutch” comes from) was to get rid of the mind-wrenching nonsense of trying to work out what 1/3 of 7/8 is. As for the 12-times table, I don’t see why you need it when you have the metric decimal system – based on 10.

A far more productive and useful skill to be taught would be estimation.

I don’t say this specifically because estimation is such an issue for IT projects. I say it because it is so useful generally. I am in the process of having my garden landscaped, and it occurred to me that I have no idea how long my garden actually is. Furthermore, I wouldn’t have a clue how to estimate it. I would have to measure it myself or go back to the requisite paperwork.

Having said that, if I underestimate the bark chips I will need, that will simply be viewed as an inconvenience. Not a catastrophe in the way that unexpected events on IT projects tend to be interpreted.

London radio station LBC celebrates its 40th birthday later this year. It has had, by all accounts, a turbulent history, and radio historians amongst you will recall that the ‘LBC’ of today doesn’t really have any connection with the original station. In a quirk of regulatory fate, the station lost its broadcasting licence in the early 90s, to be replaced with a new service run by, if memory serves, Reuters. This service headed swiftly towards disaster and eventually changed its format and name back to LBC to avoid total oblivion. It was rare for the regulators not to renew a radio station’s licence, and it is ironic that they did this to one of the better ones.

Nowadays, in our wonderful deregulated radio ‘market’, commercial stations are free to do pretty much whatever they want, in the mistaken belief that removing regulation unleashes a tidal wave of otherwise pent-up creativity. What it really leads to, of course, is a culture where little or nothing of any ambition and passion is attempted – and certainly nothing that costs money. Yet good radio needn’t be expensive to do – it’s not digging coal out of the ground or landing a man on Mars. It’s sad that the whole thing is so corporatised and that much of the fun seems to have gone out of it. It’s all taken way too seriously.

But radio is still an amazing medium. In my opinion, if ballet is the highest form of dance, radio is still the highest form of electronic communication. You don’t need the Internet, television, Twitter, on-demand media or anything else if you can open a microphone, talk, and make a compelling and interesting programme. That’s the greatest gift to have.

Bizarrely I said all of this at a party last year to someone who turned out to be very senior in the BBC World Service. He almost exploded with agreement and enthusiasm – as if he had spent the last 10 years of his life trying to get the same message across to people. Better still, he bought me a drink.

Happy birthday LBC and in the words of “Radio Ga Ga” by Queen – “You had your time, you had the power, You’ve yet to have your finest hour”

The other day an interesting debate took place at work in our regular Business Analysis meeting regarding testing, Test Driven Development and where the involvement of the Business Analyst ends and the QA/Tester’s begins.

It’s a debate that has come up many times in different guises over the years, and I don’t really regard it as a problem that can be “solved” as such. Projects tend to be different from one another, and people’s levels of expertise and knowledge are different too. The discussion partly concerned who “owns” and completes the feature files needed for development to commence.

If you define a feature as a collection of user stories, and each user story as giving rise to a series of scenarios, then one solution to the BA/tester divide is that the BA produces the features and user stories (and epics if necessary, which is a different topic) and the QA/Tester then augments these with the scenarios, described in the automatable GIVEN-WHEN-THEN syntax. Alternatively, the BA can list out the scenarios as best they are understood at that point and the two then work together to complete them.

The latter approach is the one I would favour, but in any event it is probably what has to happen, simply because – without wishing to sound too pompous about it – as a BA you will find that the QA/Tester only knows what you tell them. Unless you have a tester with bags of domain knowledge, it simply won’t be possible to have a clear-cut hand-off from the BA to the QA/Tester. You can’t really expect them to just go off and write meaningful tests independently.
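
To make that concrete, here is a minimal sketch of the kind of feature file the BA and QA/Tester might end up with between them. The domain, the story and the steps are invented purely for illustration – they aren’t taken from any real project:

Feature: Online shop checkout
  As an online customer
  I want to pay for the items in my basket
  So that my order is delivered to me

  Scenario: Successful checkout with a valid payment card
    Given a registered customer with one item in their basket
    And a valid payment card
    When the customer confirms the order
    Then an order confirmation is displayed
    And a confirmation email is sent to the customer

Typically the BA supplies the feature and story text at the top, and the scenario steps underneath are what the two roles flesh out together before they are automated.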

But there is a much more interesting debate to be had around what is needed, in addition to the stories and scenarios, to enable people to build the solution. I have encountered people over the years who think stories and scenarios are all you need. This is rubbish. User Stories are a great technique and TDD has improved things, but it’s important we don’t get carried away and drift into a fantasy world. Not every characteristic and behaviour of a system can be expressed as stories. It just isn’t possible. You need to supplement them with other techniques, or you risk either being swallowed up in a miasma of over-elaborate stories that will have your users and customers zoning out (one of the common problems that used to happen with Use Cases), or creating a shallow, oversimplified view of the world that just isn’t detailed enough.

Our old friend UML can help in supplementing the stories. It is unfortunate that many people still seem to insist on misinterpreting UML and believing it is a methodology, in some way at odds with the agile movement. It isn’t – it is simply a communication tool. Of course, if you feel you don’t need it and stories are all you need, then all I can say is – good luck…

My nephew starts secondary school later this year but is already conversant with Scratch – a graphical programming environment which allows for drag and drop program creation. It is now being taught in school.

The fact we have kids new to secondary school being taught programming seems pretty impressive to me and can only be good for the industry in the long run. I’ll leave that to you to think about, though.

Scratch is interesting on a number of fronts. Firstly, it allows for the quick creation of graphical animations (sprites) and therefore looks to be a great way of getting kids interested. If, like me, you have been around a bit, you will think of the similarities with Logo, though it doesn’t look like there is much shared heritage.
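
For anyone who has never seen this kind of beginner programming, the flavour is easy enough to convey with a few lines of Python’s standard turtle module, which borrows Logo’s turtle graphics. This is only an illustrative sketch – a typed-in equivalent of the sort of thing a simple Scratch script does, namely moving a sprite around the screen to draw a shape – and not Scratch itself, which is assembled from blocks rather than typed:

import turtle                # standard-library module, modelled on Logo's turtle graphics

sprite = turtle.Turtle()     # the "sprite": a cursor that draws as it moves
sprite.shape("turtle")

for _ in range(4):           # draw a square, one side at a time
    sprite.forward(100)      # move 100 units forward
    sprite.right(90)         # turn 90 degrees clockwise

turtle.done()                # keep the window open until it is closed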

Secondly, the development environment is truly visual – see below. At long last we may be on the way to having programming environments that don’t require extensive typing of code, but instead offer a drag-and-drop approach.

Perhaps this could be called Visual Studio?

[Screenshot: the Scratch development environment]

The other day I was talking to one of my fellow BAs, whom I hadn’t seen for a while. After some general conversation about our projects and business processes, he bemoaned the fact that “it is not for IT to tell the business how to do their jobs”.

I have heard people use this phrase before, but I don’t find it any less alarming each time. To explain to people in ‘the business’ (a phrase I hate, by the way) not only how they should do their jobs, but also to train and educate them, is EXACTLY what IT should be doing.

If we can’t do this, it is usually for one of two reasons: either because of time constraints and priorities, or because we lack the knowledge to do so. Either is appalling. It is like phoning up Panasonic for product advice and being told “we don’t understand our products. We let the customers work out for themselves how best to use them, and we ask them if we have queries”.

Bizarre.

To give the person in question the benefit of the doubt, perhaps what he meant was that ‘the business’ should be free to innovate and change, and that it isn’t for others to impose rules, structure and explanation that might undermine that innovation. This sounds convincing at first, except that in the modern era technology is itself the innovator. Innovation isn’t brought about by the traditional ‘business’ making ‘requests’; it comes about through technology taking the lead. Nobody ‘requested’ the Internet, after all, and it wasn’t on someone’s business requirements list.

In my view, this conversation is indicative of how IT is increasingly seen as a detached and remote ‘supplier’ of services and solutions – often only reluctantly engaged as a last resort or late in the day. This is a terrible situation and certainly not one that motivated me to enter the profession.