Category: Technology


I’ve found myself having some conversations about Cloud Computing recently. One with some work colleagues, and one with a student friend who is studying the subject.

I will be clear right from the start and say that I am not sold on the cloud. The idea of more closely aligned, connected and accessible technology services undoubtedly makes sense. The idea of organisations containing other organisations and having suppliers, partners and other dependencies closely connected through technology also makes sense.

But Cloud Computing always seems based on the premise of organisations adopting generic solutions that already exist somewhere. It seems much less about building something new, innovative and unique and taking control of it. That might be fine for certain applications that genuinely are generic – spreadsheets, document editing, email and the like. But what about our key business systems?

This, I would argue, exposes the two fundamental and competing forces which underpin IT, namely:

1. The desire to do new, novel and unique things – which means innovation and difference. Obviously we all hope that these can deliver ‘competitive advantage’ and distinctiveness.

Versus

2. The desire to adopt ‘best practices’ or ‘industry standard’ ways of doing things.

So, if we do new things and have innovation, we might end up with a labyrinthine, complex, or even somewhat ramshackle environment – but perhaps it works. On the other hand, if we adopt the standard, ‘tried and tested’, ‘best of breed’ ways of doing things, we run the risk of simply becoming a clone of our competitors. And worse still, dull into the bargain. By adopting the ‘best’ (a subjective idea anyway), people in the organisation might actually find it harder and harder to innovate and to change, constrained as they are by the generic Cloud services we have bought into. Ironic, considering we are led to believe we are living in an era of constant change.

Like the Project Triangle of Time-Quality-Cost, the above conflict of ‘different’ and ‘unique’ versus ‘standard’ and ‘best’ isn’t a ‘problem’ to be solved. It is an inevitability to be accepted, and rightly so. We should be wary of anyone who thinks ‘they’ve cracked it’. Technology provides almost endless possibilities, but it also runs the risk of creating unintentional barriers – even prisons – if we are led by the technology rather than influencing and controlling it.

I was talking to some IT friends the other day about requirements gathering. This subject has long been a source of problems (sorry – challenges) for our industry, for a whole range of reasons we all know well – vagueness, change, estimation, conflict and so on. But the discussion moved on to high-level ‘strategic’ or ‘transformational’ projects where you might not have any requirements at all. You might get some high-level goal like “we want to build an online shop” or “we want a replacement system to support more suppliers”, but these aren’t really ‘requirements’ as we have known them. And the more I talk to people, the more there seems to be an increasing sense that the concept of traditional ‘Requirements’ is diminishing or often doesn’t exist at all. If there is a requirements list then it pretty much consists of:

#1: “Make it work”
#2:
#3:

So are we witnessing the death of the ‘requirement’?

You can contrast this with earlier years, where the end users would often approach technology proactively with a rough idea. In my experience this would need shaping, but it would generally be pretty well formed to begin with. Technology then went ahead and built or implemented it. I daresay at this point some people will throw their hands up in horror at such a simplistic view of the world, but trust me, it can work. The notion of end users suggesting a project and technology responding with “Yes. I see what you mean. Let’s do it” can actually result in success. It really can be that simple.

This, I suppose, suggests a ‘bottom-up’ culture where people lower down the food chain feel confident and empowered enough to make such a suggestion. Furthermore, management ensure they are trusted and supported to see it through. I’m not disputing it can have a downside, of course – individual parts of the organisation doing their own thing, with uncoordinated chaos as a result. “Local optimisations” can spring up. But not always. If people talk and share, it can work perfectly well.

The alternative is where people stop making suggestions, stop attempting to change or innovate and just accept what they have. What sort of a bizarre world is that?

So what of the requirement, then?

I don’t know the answer, frankly. What I do know is that some will say it is a non-issue, because the same people “further down the food chain” will know the intimate details of their jobs and processes so well that they can easily bring about the necessary change. This is dangerous nonsense. Technology people need to be in there to assist the users. It’s not unreasonable for users to expect technology people to help educate them and to explain the better or ‘right’ ways of doing their jobs. Personally, this is a major driver behind why I went into this industry in the first place.

The death of requirements – if that is what it is – is not a good thing. Perhaps it is indicative of technology’s commoditisation and marginalisation. If that is the case, it is wrong and ultimately catastrophically bad for everyone inside and outside of our industry.

In recent weeks the newspaper media sections have all carried articles proclaiming that “television is back”. These have been prompted by a report on The Communications Market from OFCOM, the UK communications regulator. The report states that audience figures for television are strong and that it has a dominant position in the UK’s media landscape.

You could conclude that far from being “back”, it seems it never went away.

The report also talks about the phenomenon of “meshing” – watching television whilst talking about it on Facebook. This is also on the rise, and I wrote about it previously here. It is an interesting example of how ‘new’ media is attracting people to (rather than away from) ‘old’ media. In doing so, ‘old’ media is strengthening, not declining.

Who’d have thought it, eh?

The chart below is the one people should study and inwardly digest. From it you get a feel for just how strong television is, and for what most of the public always knew: on-demand services and the internet feature in their lives, but television is still the primary medium. The alternatives don’t represent a replacement for, or a migration from, it. Note that television viewing is going up.

[Chart from the OFCOM report: television viewing figures]

Radio – a medium people have been writing off for 40 years – is also strong and its reach stays pretty much the same year-on-year.

There has long been an unstated piece of received wisdom in media circles that traditional television and radio are on their way out at the hands of the Internet and the on-demand services that have stemmed from it. Few people state this belief openly; it is a subtly conveyed, ambient thing. A drip feed.

And it is of course nonsense. There is not the tiniest fragment of evidence for television and radio’s demise.

So what are we to make of all this? Well, I’ve written previously on these pages about the inability of the technology industry (or part of the technology industry at any rate) to come to terms with the notion of more than one way of doing something. It is at its most comfortable with wholesale replacement. New versions. Clear demarcation between old and new. This is perpetuated by various ‘experts’, ‘visionaries’ and assorted ‘blue sky thinkers’ – most of whom have no idea about what is really going on.

The public clearly think otherwise and people on the bus home would give you greater insight. It is a pity they are not consulted more often.

The other day an interesting debate took place at work in our regular Business Analysis meeting regarding testing, Test Driven Development and where the involvement of the Business Analyst ends and the QA/Tester’s begins.

It’s a debate that has come up many times in different guises over the years and I don’t really regard it as a problem that can be “solved” as such. Projects tend to differ from one another, and people’s levels of expertise and knowledge differ too. The discussion partly concerned who “owns” and completes the feature files needed for development to commence.

If you define a feature as a collection of user stories, and user stories as giving rise to a series of scenarios, then one solution to the BA/tester divide is that the BA produces the features and user stories (and epics if necessary, which is a different topic) and the QA/Tester then augments these with the scenarios, described in the automatable GIVEN-WHEN-THEN syntax. Or the BA can list out the scenarios as best they are understood at that point, and the two then work together to complete them.

The latter approach is the one I would favour, but in any event this is probably what has to happen, simply because – without wishing to sound too pompous about it – the QA/Tester only knows what you, the BA, tell them. Unless you have a tester with bags of domain knowledge, it simply won’t be possible to have a clear-cut hand-off from the BA to the QA/Tester. You can’t really expect them to just go off and write meaningful tests independently.
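To make that concrete, here is a minimal, entirely hypothetical sketch of what such a feature file might look like, borrowing the “online shop” goal from earlier. Assume the BA has written the feature and its narrative, and the scenarios have then been completed in GIVEN-WHEN-THEN form (ideally, as above, by the BA and QA/Tester together):

```gherkin
# Hypothetical example: the feature and narrative come from the BA;
# the scenarios are completed afterwards, ideally by BA and QA/Tester together.
Feature: Online shop checkout
  As a customer
  I want to pay for the items in my basket
  So that my order is dispatched

  Scenario: Successful payment
    Given a basket containing 2 items
    When the customer pays with a valid card
    Then an order confirmation is shown
    And the order is queued for dispatch

  Scenario: Payment declined
    Given a basket containing 2 items
    When the customer pays with a card that is declined
    Then no order is created
    And the customer is asked for another payment method
```

The point is not the detail of the example but the shape of the hand-off: the narrative at the top is analysis work; the scenarios beneath it are where testing expertise earns its keep.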

But there is a much more interesting debate to be had around what is needed in addition to the stories and scenarios to enable people to build the solution. I have encountered people over the years who think stories and scenarios are all you need. This is rubbish. User Stories are a great technique and TDD has improved things, but it’s important we don’t get carried away and drift into a fantasy world. Not every characteristic and behaviour of a system can be expressed as stories. It just isn’t possible. You need to supplement them with other techniques, or you risk either being swallowed up in a miasma of over-elaborate stories that will have your users and customers zoning out (one of the common problems that used to happen with Use Cases) or creating a shallow, oversimplified view of the world that just isn’t detailed enough.

Our old friend UML can help supplement the stories. It is unfortunate that many people still seem to insist on misinterpreting UML and believing it is a methodology, in some way at odds with the agile movement. It isn’t – it is simply a communication tool. Of course, if you feel you don’t need it and stories are all you need, then all I can say is – good luck…
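As a small illustration, here is a hypothetical sketch (in PlantUML, a textual notation for drawing UML diagrams; all participant names are invented) of a sequence diagram that might supplement the checkout scenarios above – the kind of interaction detail that stories and GIVEN-WHEN-THEN scenarios alone tend to leave implicit:

```plantuml
@startuml
' Hypothetical sketch: the interactions behind the checkout stories.
' Participant names are invented for illustration only.
actor Customer
participant "Online Shop" as Shop
participant "Payment Gateway" as Payments

Customer -> Shop : submit order
Shop -> Payments : authorise payment
Payments --> Shop : authorisation result
alt payment authorised
  Shop --> Customer : order confirmation
else payment declined
  Shop --> Customer : request another payment method
end
@enduml
```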

My nephew starts secondary school later this year but is already conversant with Scratch – a graphical programming environment which allows for drag-and-drop program creation. It is now being taught in school.

The fact we have kids new to secondary school being taught programming seems pretty impressive to me and can only be good for the industry in the long run. I’ll leave that to you to think about, though.

Scratch is interesting on a number of fronts. It allows for the quick creation of graphical animations (Sprites) and therefore looks to be a great way of getting kids interested. If, like me, you have been around a bit, you will think of similarities with Logo, though it doesn’t look like there is much shared heritage.

Secondly, the development environment is truly visual – see below. At long last we may be on the way to having programming environments that don’t require extensive typing of code but instead offer a drag-and-drop approach.

Perhaps this could be called Visual Studio?

[Screenshot: the Scratch development environment]

The technology industry, or at least a fair chunk of the technology industry, doesn’t like the notion of more than one way of doing something. It is at its most comfortable with clear cut migrations of the ilk of VHS tape to DVD. “You now have this, so you won’t want that any more” is the thinking. New supersedes old. And it’s not just that new supersedes old, but that this happens in its entirety. Things move forward along predictable lines based on wholesale replacement.

So the report I recently stumbled across here about IP-TV (connected TV), as well as this article about Netflix releasing new content online ahead of television, are both interesting on a number of fronts. Not for the first time, the implication in both is that television is about to be shaken to its very foundations.

With the number of reports and amount of research, speculation, conjecture and general cerebral cortex being dedicated to this topic over the years, I am amazed that television has any foundations left. Yet television is still in a very strong position and shows no signs of being overtaken by IP, Cloud or other Internet-based TV options of the sort these reports describe.

And let’s be clear. The reason this isn’t happening is much the same reason radio didn’t disappear a decade or more ago.

In fact if we look back, it turns out that radio is a medium people have been writing off for 50 years. It survived “you won’t need it any more because we now have television – you know, with pictures” in the 50s and 60s, “you won’t need it any more because of compilation tapes and Walkmans” in the 70s and 80s, and “you won’t need it any more because of downloads” in the 90s and 00s. The latest onslaught – the ability to download or stream podcasts, music when you want and programmes when you want – hasn’t made much difference to radio’s popularity. The last research I saw on listening habits suggested something like 90% of the UK population listen to the radio at some point during the week. Pretty impressive, considering I was reading articles five or six years ago by assorted ‘experts’ and ‘visionaries’ predicting that all of this would kill it off once and for all.

Downloading stuff in no way replaces the immediacy, ‘newness’ and (if it is done well) unpredictability of radio. A collection of downloads does not a radio station make. No matter how big your music collection, no matter how many gigabytes of ‘stuff’ you have, much of the time you will be quite happy with the radio providing a schedule of something that someone has compiled in the hope you might like it.

Often you won’t, but often you will.

And pretty much the same philosophy applies to television. The ability to download or seek out what you supposedly might “want” when you want it doesn’t undermine the fact that people will often be quite happy viewing a channel, or maybe a series of channels, without having to spend time making selections – exactly as they do with radio. That is not to say that they will not also use on-demand services. But this will be at different times, based on occasion, mood and a whole range of subjective reasons that – thank heavens – the market research folks will never be able to fully categorise, pin down and document for administrative convenience. The issue isn’t that different people adopt different technologies. Nor is it about ‘migration’ from one thing to another, or ‘replacement’ of what is supposedly ‘old’. The issue that people can’t come to terms with is that the same people may use all of what is available. Just at different times.

“Both” is good. Unfortunately it doesn’t make for a very lively PowerPoint presentation.

I have spent a fair proportion of my professional life untangling applications and processes that nobody in the company properly understood. Quite why organisations let themselves get into this situation when very often they will have built the applications in the first place (they didn’t appear out of thin air) is an interesting question for another occasion. For now though, let’s just acknowledge that we’re stuck with situations like this.

I’ve met many Business Analysts who aren’t concerned in the slightest about what I have just said. Many even purposely steer clear of looking too carefully at any pre-existing applications. The belief seems to be that looking at what already exists in some way contaminates their judgement.

To be fair, there is something in this. What already exists can be a constraint on people’s thinking. But equally it can give useful insight. We can learn from the past. The reason things are the way they are is often significant.

Technology isn’t just about building something new. Sometimes it is about getting more from what we have. Reverse engineering can help us stabilise and improve what we have, improve our acceptance testing, and improve our knowledge generally. I find it surprising that it is not used more, and that so many people (analysts, developers and others) try to steer clear of it.

If we can reverse engineer a UFO (as I saw on TV a while back) why not use the technique on software more regularly?

There are various reports in the press that HMV are having poor pre-Christmas sales, and this perhaps means that my prediction of them not being with us for much longer will come true.

I for one will feel sad about this: I’ve spent a fair chunk of my life in various branches of HMV and a good proportion of my CD and DVD collection is from there. I still think the shared experience of a physical shop is important. Their demise is therefore significant.

It is fashionable nowadays not to be emotional about such things. One must write it off with an agnostic “one of those things”; others lapse into phrases and clichés such as “flawed business models”, as if they have any idea of what is really going on.

Bizarrely, I found myself saying all of this at a work event recently, where I ended up sitting next to someone who turned out to be a board member of my company. After all, what better way to break the ice than to talk about the future of digital media?

It is worth reminding ourselves that the drift away from CD – a key element of HMV’s woes – is in no way similar to what caused people to adopt it in the first place. People switched to CD originally from other formats (vinyl, cassette) because of ease of use. Not, as many ‘experts’ had predicted, because of durability and sound quality. In any event, claims of durability and sound quality turned out to be highly contentious and still are.

By contrast, people switched away from CD – generally to MP3 – because they realised they could download music for free. Once word got round that this was possible, why buy a CD when you can get it for nothing from some eastern European web site? So the motivation for MP3 in no way mirrored a technical choice or decision-making process in the way that past transitions – vinyl versus CD, VHS versus DVD, Blu-ray versus HD-DVD or, even more dramatically, analogue versus digital – did.

Obviously not all MP3 downloads are illegal, but I would still say that for people to claim MP3 has revolutionised music is rather like saying shoplifting has revolutionised high street fashion retail, or train fare evasion has made the country more mobile. The genie is now well and truly out of the bottle. And not necessarily entirely for the good.

Quite where does that leave people trying to make it all pay and, more serious still, those creators trying to do something new whilst paying the bills and putting food on the table?

I haven’t the faintest idea.

But we were out of time. And coffee.

I’ve written previously on these pages about the prolonged ‘future of television’ debate which has been simmering away on the back burner of the technology industry for years. The debate centres around two main arguments:

Firstly, so the argument goes, the advent of on-demand Internet and cable-based services renders the traditional ‘channel’ and ‘schedule’ obsolete. Why passively sit in front of a TV, or listen to a radio dispensing a schedule, when you can seek out exactly what you want, when you want it? In other words, if on-demand services are available, why would you want anything else?

Secondly, young people (whatever that means – let’s take it to mean people of school age) are turning their backs on traditional media. The Internet must hold the answers, since this is where they congregate.

Yet TV and radio audiences are still strong, and despite all the online and on demand services people still seem to value them and give them high approval ratings. It wasn’t too long ago I frequently read articles predicting how podcasts and on demand music services would kill off radio. But radio listening seems to be going up. In television, it is certainly true that audiences aren’t what they were: The days of shows getting 20 million-odd viewers on a Saturday night are over, but the point is those days were over long before the Internet – and certainly long before any on-demand services enabling you to watch TV content.

As for young people drifting away from traditional media, well, the argument seems to be that because they are not consuming television and radio today, they never will. This is a pretty weird idea because people’s views, opinions and attitudes change over time. How do we know they won’t ever come back later in life? Some people seem to think that by observing young people today, they are in some way predicting the future: that those behaviours will continue forever.

So I don’t accept these two arguments.

The big question is, why does all this perpetuate?

Part of the reason comes from the fact there is a portion of the technology industry that can’t come to terms with the idea of more than one way of doing something: There is something in the DNA that is focused on technology always bringing about total migration, replacement, and a single solution.

And to be fair, it did. At one time. In the 1950s and 1960s, say, technology would bring about the wholesale replacement of football-pitch-sized offices of people processing billing or payroll. These were clear migrational changes, and they were beyond dispute. But our modern world is different. The way people consume media and entertainment is fragmented. As a result, the technologies delivering it are fragmented. People are quite happy having alternatives, even if this involves the tried and tested and the ‘traditional’. Having a microwave in your kitchen does not mean a conventional oven is no longer needed. Blu-ray discs don’t mean the end of cinema. And ‘on demand’ internet services don’t mean the end of the schedule. Far from it.

Like most people, I’m happy with a bit of both please.

Once upon a time there was a service called Prestel. The idea was that by connecting a computer to the telephone line, you could access a remote central computer containing a database of information. A kind of much expanded and turbo-charged Teletext service. Further, and unlike Teletext, you could interact: you could send messages to other users, order goods and services, make reservations, and take part in various multi-user facilities such as chat rooms.

Sounds a bit like something we have today?


Nowadays, Prestel has long since disappeared and is virtually forgotten. IT, never an industry to have much regard for its own history and heritage, doesn’t celebrate past advances much. Even important historical figures such as Alan Turing (who is becoming regarded as something of a father of computing) are almost unknown today. It is left to volunteers and enthusiasts to raise the profile of the past. (A tip of the hat to Jason Gorman and others who are doing this.)

In one of my less successful predictions (though to be fair, I was still at school when I made it), I thought at the time that Prestel was the future. I thought people would embrace it with open arms. Instead it was pretty much written off as expensive and pointless by most, and it never achieved more than 90,000 subscribers. In a weird twist of fate, the point at which it may have started really breaking through into the mainstream coincided with the internet and World Wide Web taking off. It is perhaps ironic that one of the reasons for Prestel’s unpopularity was that connecting to it involved the cost of a local telephone call. Yet when the web took off (in the era of dial-up modems – no broadband yet), people seemed to forget that that was exactly what they had to do.

The rest is history. Prestel was dead and buried before it even got properly started.

Should we feel nostalgic or emotional about such things? I do – though I’m not entirely sure why. In a way, you could just dismiss the whole thing as a failure. After all, many printed magazines had more subscribers than Prestel ever did. But it did do one important thing: by planting the seeds of an online world, people could at least see the possibilities for real. It wasn’t just research-lab theory – it worked. Technology could deliver instant information and interaction into your home. People hopefully would (and did) get enthused and strive for a bigger, better and more ubiquitous online world. And we all know where that led.