In many ways it has never been easier to bring data sources together and calculate insights – you can quite literally press a button and ta-dah!  Before you chortle and say, “not where I work, honestly you should see the data”, remember that there was a time (not so long ago) when you had to wake up at the crack of dawn to get your allotted time on the Cray computer as it started up.  Having diligently waited for your turn on the computer, you then had to wait a very long time for results that these days your laptop can return virtually instantaneously.

But it’s not only computational power that has made huge advances.  In the last 20 years, the availability of analytical software with intuitive GUIs, from R and Python through to SAS, SPSS and Power BI, has made analytics available to everyone.  These factors, combined with increasing automation, see results and numbers churning out.

All this means that it has never been more critical to pay attention to:

  • The data available for use
  • The approach being used
  • The interpretation of the results

Some may find the sometimes slow, repetitive nature of data checking irksome, and the diligent checking of underpinning assumptions tedious.  True, there are aspects of statistics and data science that are slow, that cannot be rushed.  But, for me, that is where much of the beauty and enjoyment lies.  There is a happiness to be found in understanding what the data looks like, its shape and its properties.  Truth be told, there is an excitement when you realise that perhaps the data isn’t what you were expecting and new avenues to explore begin to open up.  In an era when we often seem to “fail fast”, might I suggest an alternative approach?  If we want the numbers, values and insights that we are creating to stand the test of time, surely we should be prepared to invest time and effort in laying the foundations?