September 2012

Don’t drown in data: Decision making needs curiosity

 “The important thing is not to stop questioning. Curiosity has its own reason for existing…” — Albert Einstein


How does your organisation make key decisions?



Data can drive decisions

Few would argue that business decisions can be critical: a single decision can affect the success of an entire organisation.  Decisions are often difficult to make because there is no obvious "right" or "wrong" answer; instead, organisations must focus on making the best available decision — for example, the one that will deliver the best return on investment or the most value to customers.  Informed decision making of this type relies on a good understanding of the business environment, context and operations.  Access to analytical data becomes essential, and it's easy to assume that "more data is better".


However, as organisations collect more data from more sources, and as the speed and availability of that data increase, a real challenge emerges.  How can decision makers use information and data effectively, without drowning in a sea of ever-changing numbers and figures?  A high volume of data, with no way of interpreting it efficiently and effectively, can lead to "analysis paralysis".  In the worst case it slows decision making to a snail's pace, as decision makers struggle to interpret what various disparate data sources are telling them.


I recently saw an interesting video in which Wendy Harrington (Chief Marketing Officer and EVP at Franklin Templeton Investments) spoke about the challenge of mining data.  She covered not only the challenges of data volume, but also the velocity of data and the need for a hypothesis-based approach, along with a healthy curiosity.  She also spoke about linking data and insight to customer value. These themes are well worth considering when you are analysing your own data:

Be disruptive: A hog’s head beats a leaflet any day

I recently spent two days at the Edinburgh Fringe Festival.  For those of you not familiar with it, the Fringe is a great place to see an eclectic range of theatre, art and other performances, but it is most renowned for providing a platform for new comedians.  There are literally thousands of performances across the whole of Edinburgh, creating a real buzz throughout the city.


The challenge for the performers is getting audience members through the door.   When they aren’t performing, they can be found in the street handing out leaflets and flyers to anyone who walks past.  Every surface is turned into a billboard, with hundreds upon hundreds of advertisements pasted on it:


A pillar with lots of flyers stuck to it

How do you get heard?


The challenge is that with so many leaflets, posters and flyers, people largely ignore them.  Notice that in the picture above, not a single person is paying attention to the posters; they are all walking past.  Do any of the posters particularly grab your attention?  I'm guessing not…


In saturated markets like this, it's difficult to get noticed. Walking down the Royal Mile in Edinburgh, particularly towards the end of the Fringe, is an interesting experience.  You are accosted by performers expending the last of their energy trying to convince you to attend one of their final shows.  Understandable, given they've been doing this for nearly a month solid. The local council puts out large recycling bins so that people can quickly ditch and recycle the handfuls of flyers that desperate performers have foisted into their hands.  What a waste!


In situations like this, the smart money is on doing something disruptive. Something different that grabs attention. 


Whilst walking through yet another busy street in Edinburgh, I could hear gasps and people saying "What on earth is that?".  Intrigued, I went over to take a look:

Debate: The danger of “negative” requirements

I recently read a thought-provoking thread on the IIBA LinkedIn forum relating to how business analysts should state requirements.  The conversation centred on one main question: is it advisable to define a system by stating what it shouldn't do, or should these "negatively framed" requirements be avoided?


There’s no hard and fast answer on this, and there are certainly varying views on what constitutes a well-written requirement. However, I firmly believe that it is highly preferable to define systems in terms of what they can do (rather than what they can’t).


I'll illustrate this with a real-life (but non-IT) example.  I live in Portsmouth, on the south coast of the UK.  Near the seafront there's a really steep hill.  Given the number of pedestrians in the area, cyclists are banned from riding down the path (as they'd likely lose control and collide with an unsuspecting passer-by).  A picture of the start of the hill is shown below. Trust me, it's much steeper than it looks in the picture!


Painted text on the ground saying "No cycling, no skateboarding"

Should negative requirements be avoided?


When it comes to projects, are you picking the winners?

Innovation is critical for organisational survival.  Whether it’s a new product innovation, or creating better and slicker processes, organisations need to adapt to stay afloat.  They need to find new ways to reach out to their customers, and new ways of meeting (and exceeding) their customers’ needs and expectations.


This creates a dilemma.  Knowing which project, product or initiative to progress next is a tricky choice.  There's often no absolute "right" or "wrong" answer, and as a result people often rely on experience to make decisions.  A recent HBR article showed that, on average, marketers rely on intuition and "gut feel" rather than data for 89% of customer-related decisions.


Stack of foreign coins

What is the ROI on your projects?

Yet a rich source of decision-making data is often already available to organisations: the benefits realised by previous projects.  That is, understanding which new products and projects delivered the business and customer value that was anticipated, and which didn't.  This boils down to understanding which projects were good investments and which were bad.


The discipline of "benefits realisation" is thoroughly ingrained within the project management profession, yet statistics show that organisations rarely define and track these benefits.  Figures from 2011 show that in the insurance industry, benefits are measured in only 25% of organisations.  Put another way, 75% of insurance organisations have no idea whether their projects have delivered any benefits at all!


Business woman with arms crossed

The tough truth: Your stakeholders don’t want a BA

I'm pleased to say that my most recent blog article has been published on "", where I contributed as a guest author. The article looks at business analysis from the perspective of SMEs and stakeholders… and has caused some controversy! I'd love to hear what you think, so please take a look and add a comment…

Business-person working at the office

Is your data deceiving you?

By Adrian Reed

It takes a lot for a "tech news" article to grab my attention these days, but I was astounded to read that an estimated 83 million Facebook accounts are either fake or duplicates.  That's one in every 12 accounts!  A sobering thought for those who use Facebook to connect with new contacts, prospects and acquaintances.


It's easy to point the finger at companies like Facebook.  You can imagine media commentators lambasting the social media giant: "How on earth could they have so much incorrect data?"  It's a valid question, but these figures need to be taken in context.


Business-person working at the office

Could your data be deceptive?

Let me ask you this question: if you work for a corporate organisation, how "clean", accurate and up-to-date is your data?  How much of it would you estimate is outdated, incomplete or just plain wrong?  Research by IBM puts the average at close to 23%.  That is mind-boggling!  Are you better or worse than average and, more significantly, could your data be so wrong that it is deceiving you?