18 Sep 2020

Adapting to the times: thoughts and practical recommendations from an analytics perspective

We all crave trends at the best of times. And when uncertainty strikes and massive potential structural resets in economic and customer behaviour loom, it’s natural to question how data should help us to manage the transition.

Traditionally, analytical maturity takes us on a journey from describing data through to using its history to make predictions and then automated prescriptions.

Neither that journey nor the long-term commercial imperative behind it is gone. But in times of significant change, the requirement to revisit our priorities and assess the value of our data assets – past, present and future – becomes equally critical.

The Power of Description

Never has the “boring” stuff – data definitions, data accuracy, latency, statistical significance of trends, avoiding misleading visualisations – been so high profile in the public mindset.

In our business mindset, knowing where we are right now should be equally high profile.

The reality is that, a lot of the time, we risk cutting corners on solid descriptive analytics (reporting, dashboards, data storytelling, data literacy, business intelligence – choose your favourite expression of the month!) simply because many things don’t change that often in the market, so we don’t bother to check in on them regularly.

When all bets are off, simply knowing something is changing becomes the first step of empowerment to be in any place to do something about it, whether tactically or strategically.

And when those changes could be granular and fast-moving in nature (e.g. massive relative swings across different product lines, customer segments and geographic markets), the accuracy and visibility of those trends at the most senior level becomes critical.

It’s easy to watch top-line vanity results with excitement or fear while failing to effectively track the leading indicators that would reveal invaluable knowledge to help steer a business. A mature approach to business intelligence right now could reveal pivotal indicators that point to further insight paths.
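
As a minimal sketch of what that kind of tracking could look like in practice, the snippet below flags granular metrics whose latest value breaks sharply from their own recent history, using a rolling z-score. The column names, window and threshold are illustrative assumptions, not a prescription.

```python
# Hypothetical example: flag segments whose latest weekly value deviates
# sharply from their own trailing history. "segment", "week" and "revenue"
# are placeholder column names.
import pandas as pd

def flag_moving_metrics(df: pd.DataFrame, window: int = 12,
                        threshold: float = 3.0) -> pd.DataFrame:
    """Return segments whose latest value sits more than `threshold`
    standard deviations from their trailing `window`-period mean."""
    df = df.sort_values(["segment", "week"]).copy()
    grouped = df.groupby("segment")["revenue"]
    # Shift by one so the latest point is scored against prior history only
    df["mean"] = grouped.transform(lambda s: s.rolling(window).mean().shift(1))
    df["std"] = grouped.transform(lambda s: s.rolling(window).std().shift(1))
    df["zscore"] = (df["revenue"] - df["mean"]) / df["std"]
    latest = df.groupby("segment").tail(1)
    return latest[latest["zscore"].abs() > threshold]
```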

The Importance of Understanding

Data visibility (and indeed literacy) alone does not cut it though. Our confidence to make decisions comes from understanding, not just the data itself.

The fast application of context is what bridges that gap. Human experience teaches us more than we often give ourselves credit for: the historical understanding of previous recessions and recoveries, combined with customer empathy grounded in a fundamental understanding of society. Equally, human thinking is prone to cognitive biases that can easily outweigh the value of that experience.

Skilled analysts and data scientists who can think in mathematical, human and business domain terms have a critical role to play in framing both the data itself and the analysis. And the engineers and architects orchestrating the accuracy and timeliness of that underlying data have an equally critical role in preserving its transparency and reliability.

But understanding also goes all the way back to measurement design and the interaction with customers. Much of the recent obsession with personalisation vs privacy has centred on behavioural profiling: trying to figure out what kind of person someone is without them revealing it.

Sometimes it’s right to just ask the audience directly, if you can deliver something more relevant to that segment in return. I don’t need you to understand me fully as an individual to acknowledge that my needs are different to those of someone in a completely different group.

Reframing our Predictions

Statistical models, machine learning and artificial intelligence approaches are not going away. But they all implicitly feed off history, and some of them may need a pause for thought and/or retuning.

In large-volume, fast-response data scenarios, machine learning algorithms may be able to start relearning relatively quickly, and the focus will rightly be on sensitive tuning and monitoring.
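
To make that concrete, here is a minimal sketch of incremental relearning with simple drift monitoring, assuming scikit-learn and a binary outcome; the ten-batch window and alert threshold are illustrative, not a recommended configuration.

```python
# Hypothetical example: score each new batch before learning from it
# ("prequential" evaluation), so degrading accuracy can flag possible drift.
import numpy as np
from sklearn.linear_model import SGDClassifier

model = SGDClassifier()
classes = np.array([0, 1])  # assumed binary outcome
recent_accuracy = []

def process_batch(X_batch, y_batch, alert_threshold=0.7):
    # Evaluate on the new batch BEFORE updating the model, so the score
    # reflects genuine out-of-sample performance.
    if hasattr(model, "coef_"):  # skip until the model has seen data
        recent_accuracy.append(model.score(X_batch, y_batch))
        if np.mean(recent_accuracy[-10:]) < alert_threshold:
            print("Possible drift: recent accuracy degrading, review tuning")
    # Then learn incrementally from the batch.
    model.partial_fit(X_batch, y_batch, classes=classes)
```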

For models looking to predict further into the future, the extensive previous history they rely on may no longer be as relevant, and new external variables may be emerging.

Overall, expect some retraining and, particularly for black-box, unexplainable algorithms, a risk that existing longer-term models fail to converge. Our recent white paper on AI vs AI Hype is still highly relevant now and worth a read.
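
One common response, sketched below under the assumption of a scikit-learn regressor, is to down-weight older history at retraining time rather than discard it outright; the half-life is an illustrative parameter to tune per use case.

```python
# Hypothetical example: exponential recency weighting at retraining time.
# A sample `half_life` weeks old counts half as much as one from today.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def recency_weighted_fit(X, y, ages_in_weeks, half_life: float = 26.0):
    weights = 0.5 ** (np.asarray(ages_in_weeks) / half_life)
    model = GradientBoostingRegressor()
    model.fit(X, y, sample_weight=weights)
    return model
```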

Set against that, the reality is that we make a lot of subconscious predictions all the time: simple assumptions that some things do not change. That prediction is often derived from data that we don’t even feel we need to regularly consult; it just becomes an accepted (perhaps half) truth.

Scenario planning and being prepared to use data over gut as new operating environments emerge will be critical to medium and long-term survival and success.

And forecasting will remain super-critical too; we’re just going to need to revisit it more often, consider more variables, and keep our interpretation grounded in the context of the business and indeed the world.
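
As one illustration of what “more often, more variables” can mean in practice, the sketch below re-fits a SARIMAX model (from statsmodels) with exogenous regressors whenever new data lands; the (1, 1, 1) order and the external variables themselves are placeholders to be chosen per business context.

```python
# Hypothetical example: frequent re-fitting of a forecast that takes
# external variables into account.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

def refit_and_forecast(sales: pd.Series, exog: pd.DataFrame,
                       future_exog: pd.DataFrame, steps: int = 4) -> pd.Series:
    """Re-fit on the full history to date, then forecast `steps` periods
    ahead using known or assumed future values of the external variables."""
    model = SARIMAX(sales, exog=exog, order=(1, 1, 1))
    results = model.fit(disp=False)
    return results.forecast(steps=steps, exog=future_exog)
```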

Planning for Change – 5 Practical Recommendations

A lot of statements written now risk not ageing terribly well. But I’ll do my best to be as timeless as possible with 5 practical recommendations to get ahead of the curve, whatever direction it might be pointing in:

  • Start actively measuring what you don’t expect to happen. This has always been a great approach to valuable diagnostic analytics (e.g. detailed error tracking and classification – see the sketch after this list), and it’s highly likely we’re going to need all the diagnostics we can get our hands on right now.
  • Identify which important trends you don’t regularly review right now because of their prior stability, and make them both visible and granular (i.e. at market, segment and product/service line levels).
  • “Past performance is no guarantee of future success”. Make sure any automated models have read that small print – or, better still, that the people mapping the requirements and designing, (re-)calibrating and deploying those models have.
  • The interaction between directly asked customer data and behavioural data (particularly digital) has always been a sweet spot: the who and the what. Now more than ever, it’s critical for marketing, compliance, technology architecture and measurement design plans to be aligned on maximising that as a critical data asset for an imminent future of change.
  • Relevancy is important, but remember relevancy is in the eye of the beholder. You can be highly relevant to me based on understanding one thing about me that puts me in a different group needing a different response. Don’t set “one-to-one personalisation” as your stretch target (do you really have a different relevant message for every one of your customers?) and then fail to deliver decent segmented marketing while that sits on the project (back) board.
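
On the first recommendation, here is a minimal sketch of what “measuring what you don’t expect” can look like: classify errors or unexpected events and compare their counts against a historical baseline, so that things which shouldn’t happen surface as trends rather than anecdotes. The categories, baseline and tolerance are illustrative assumptions.

```python
# Hypothetical example: count classified errors per reporting period and
# flag categories running well above their usual rate.
from collections import Counter

error_counts = Counter()

def track_error(category: str, baseline: dict, tolerance: float = 2.0):
    error_counts[category] += 1
    expected = baseline.get(category, 0)
    if expected and error_counts[category] > tolerance * expected:
        print(f"Unexpected spike in '{category}': "
              f"{error_counts[category]} seen vs ~{expected} expected")
```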

It’s tough out there right now. I hope that statement does age rapidly, but regardless we’re here to help in good times and bad, and if we don’t think we can make a difference we’ll always be honest about it. We’re always open for a chat – just drop us a line here.

 
