I recently posted Big data projects in Marketing are surprising. Constantly (Part 1), explaining that, in my experience, big data projects in Marketing follow a do-and-learn approach. I highlighted the importance of aligning stakeholders on a common definition of big data.
In part 2, I am still referring to the same project: Marketing campaign effectiveness in a bank.
This time, I am asking two fundamental questions:
How can marketers move from aspirations to operations?
Will we ever be able to understand customers’ behaviours?
The bank was seeing very low returns from running generic marketing campaigns. We believed that combining unstructured data with other customer data would help the bank gain greater insight.
“The question was where to focus?”, I said earlier in part 1. Our working sessions with stakeholders showed that the multitude of potential use cases brought a range of choices that were often confusing or even conflicting for marketers.
This raised a new question, an important one: “How can marketers move from aspirations to operations?” And then a second: “How do we select and prioritise the right use cases for business value?”
For example, we discussed several use cases in detail, such as customer insights, segmentation of the customer base, churn prediction, multi-channel interaction analysis, and next best action. We also discussed sentiment analysis and social media analysis. Obviously, the use cases we selected had to be related to the data available internally; purchasing external data for specific campaigns remained in scope, however.
Answering these questions required us to identify the most pressing issue we wanted to solve and to focus on business value. This value would come both from the data and from the use cases.
First, we focused on specific pain points and areas in order to understand which existing processes and solutions could be improved by combining structured and unstructured data. We leveraged those findings to identify specific business improvement opportunities. This meant measuring the right things, not everything. The outcome was a long list of candidate use cases.
Next, we shortlisted and prioritised those use cases. For prioritisation, we decided to avoid the “nose-guessing” approach – too unprofessional in my view – and be more thoughtful and structured. We established an evaluation framework against which all candidate use cases would be assessed. This exercise was based on the data exploration we had performed previously and on the criteria and associated weights we had agreed for prioritising use cases.
All use cases were weighted against criteria such as expected benefits, customer value, strategic value and automation potential. We also looked at technical complexity and ease of execution: the expertise level required, the data formats and quality, the data source complexity, the complexity of the use case itself, and the data structure. It was interesting to see that use cases have different resource intensities. While a 360-degree view of the customer is very storage-intensive, others like Next Best Action may be more compute-intensive.
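As a minimal sketch of how such a weighted evaluation framework can work: every candidate use case is scored against each criterion, and the scores are multiplied by the agreed weights. The criteria names, weights, and scores below are purely illustrative assumptions, not the bank’s actual framework.

```python
# Illustrative weighted use-case scoring; all names, weights and
# scores are hypothetical examples, not the bank's real figures.

CRITERIA_WEIGHTS = {
    "expected_benefits": 0.30,
    "customer_value": 0.25,
    "strategic_value": 0.15,
    "automation_potential": 0.10,
    "ease_of_execution": 0.20,  # inverse of technical complexity
}

# Each use case is scored 1-5 against every criterion by the stakeholders.
use_cases = {
    "churn_prediction": {"expected_benefits": 4, "customer_value": 4,
                         "strategic_value": 3, "automation_potential": 5,
                         "ease_of_execution": 3},
    "next_best_action": {"expected_benefits": 5, "customer_value": 5,
                         "strategic_value": 4, "automation_potential": 4,
                         "ease_of_execution": 2},
    "sentiment_analysis": {"expected_benefits": 3, "customer_value": 3,
                           "strategic_value": 3, "automation_potential": 3,
                           "ease_of_execution": 4},
}

def weighted_score(scores: dict) -> float:
    """Sum of criterion scores multiplied by the agreed weights."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

# Rank the candidates from highest to lowest weighted score.
ranking = sorted(use_cases, key=lambda uc: weighted_score(use_cases[uc]),
                 reverse=True)
for uc in ranking:
    print(f"{uc}: {weighted_score(use_cases[uc]):.2f}")
```

The value of such a sketch is less in the arithmetic than in the discussion it forces: stakeholders must agree on the criteria and weights before any use case is scored.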
We eventually wrote down a prioritised set of use cases. We wanted to select and solve the right problems, those with the most value to the bank.
In our case, it was marketing campaign effectiveness. Marketing wanted to understand new correlations and improve their campaign strategy. They wanted to better segment the customer base and identify the right customers for promotional campaigns. They also wished to quickly create and deliver service and product offers to the right channel, a kind of Next Best Action functionality. The solution would then analyse the campaign results, allowing marketers to fine-tune the existing campaigns or create new ones.
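To illustrate the segmentation idea in its simplest form, a recency/frequency rule can already split a customer base into campaign-relevant groups. The field names, thresholds, and segment labels below are purely hypothetical, not the bank’s actual model.

```python
# Hypothetical recency/frequency segmentation sketch; record fields
# and thresholds are illustrative assumptions only.
from datetime import date

customers = [
    {"id": "C1", "last_purchase": date(2023, 11, 20), "purchases_12m": 14},
    {"id": "C2", "last_purchase": date(2023, 4, 2),  "purchases_12m": 2},
    {"id": "C3", "last_purchase": date(2023, 10, 5), "purchases_12m": 5},
]

def segment(customer, today=date(2023, 12, 1)):
    """Classify a customer by recency and frequency of purchases."""
    recency_days = (today - customer["last_purchase"]).days
    if recency_days > 180:
        return "at_risk"      # long inactive: target with win-back campaign
    if customer["purchases_12m"] >= 10:
        return "loyal"        # frequent and recent: loyalty offers
    return "regular"          # everyone else: standard promotions

segments = {c["id"]: segment(c) for c in customers}
print(segments)
```

A real Next Best Action engine would of course use far richer features and models, but the principle is the same: map each customer to the offer and channel most likely to resonate.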
We often claim that it is now possible to “fully know” customers. Technology translates data into a single view of the customer, providing new insights into their needs and behaviours. It may sound promising. It is true that organisations have never had so much data about their customers. Yet, as we have seen, the challenge is to extract value from it.
A fundamental question came up: will organisations ever be able to understand customers’ behaviours? I mean in all their human complexity.
To be even more provocative, I would add “Should we accept the fact that we will never entirely know the complexity of customer behaviour?” Almost certainly, I would argue.
Today, we probably rely too much on data, analytics, and automation. We often try to achieve 99% accuracy in all our statistical approaches.
But, interestingly, we observe that data does not translate automatically into useful findings.
Data accuracy and internal silos may be two of the reasons. Underestimating the human touch might be another. In most cases, we have to admit that an 80% confidence level is enough for marketers to discover new correlations and patterns. I would say it is not about deciding whether to go “statistical” or “human”; it is about integrating technology with a human touch. Won’t you agree?
Had we been too technical-focused, intentionally or not, we would have probably missed the business objectives.
From the very beginning, it was clear that aligning everyone on a common definition of big data, on what it meant for the bank, and on our level of ambition was critical.
And, while it is true that big data projects are about data – all sorts of data – the focus should be on business value. Ultimately, we knew we had to generate value both from the data and from the use cases. In this approach, the evaluation process we used for selecting use cases greatly helped the bank focus on solving the issues that had the most value to it.
I think it is easy to understand why big data is increasingly recognised as a key value driver by most marketing executives. It is not surprising either that, as a consequence, it attracts larger budgets today, even in the current business environment. The result is a growing number of initiatives. But even now, they often take place at a tactical or departmental level. In short, organisational silos are still slowing down progress, despite the implementation of data lakes and MDM solutions.
During this project, I began to realise that big data initiatives have a centralising power. They really challenge how organisations are structured, how they work, and how they think.
They force marketing units to rethink their customer engagement narrative. They drive organisations to develop strategic views on data in general (probably also reinforced by regulations like GDPR, to be fair). They push them to create a roadmap to improve their maturity and capabilities over time. They force them to pay attention to skills. Using big data is not, by itself, a guarantee of success: an organisation needs skilled marketers and data scientists to grasp the benefits that big data solutions can deliver.
Central to the discussion is the maturity level of the organisation. So, a final question to ask is: “How can business leaders evaluate the maturity of their organisation?”
Considering that big data initiatives may provide a snapshot of what organisations could look like in the near future, this last question about maturity can quickly move to the top of most executives’ agendas. It should.
I hope you enjoyed the article! Please leave me a review; I love reading them!