Putting an agile method and data-driven marketing together creates faster overall budget optimisation
In my last article, I referred to establishing a start-up marketing team with an agile operating method. I should confess I skimmed over perhaps the most important aspect: there was already an agile operation within the DevOps team, so the agile methodology was understood by at least one part of the organisation, and at least one of the founders (the CTO) had experience of working in agile teams. The marketing team was, by comparison, less structured. When we began, cadence wasn’t measured, so I can only offer an opinion on whether it is now better or worse, but I do know that top-line growth has accelerated as a result of the additional testing and learning we’ve executed. We’ve doubled the number of tests we run since we began measuring, and we get much better metrics from the channels.
I know the number of tests is not the most refined measure of business objectives, but it is a good measure of agility, and the resulting acceleration in growth has made the marketing team heroes! As for how they did it, we’ve identified some key elements we believe were critical to the success the team has achieved:
The marketing team is a pretty diverse group in terms of experience. Very few had worked in start-up teams before, and some had never been so directly responsible for their channel’s success. This meant we had to build a respectful but dynamic team culture, keen to discuss and exchange ideas and hypotheses. Essentially, we were asking many different business questions, and these all needed to be recorded before we could prioritise which were likely to be the ‘business critical’ versus the ‘nice to know’ data-driven insights.
We have a healthy preoccupation with the KPIs: their weekly reporting, and the need to show up at the weekly review meeting with a good knowledge of the numbers, ready to report on what worked and what the next test might be for the week.
We call it the ‘data backbone’ – the infrastructure that keeps the data ‘always on’, building a warehouse of datapoints from which we derive insight from trends and patterns, codified for all to share in a stable cloud/API architecture rather than a spreadsheet culture of hand-crafted reports. In this instance we use Amazon AWS for storing data, Xplenty for ETL and Klipfolio for dashboarding. (Incidentally, we’re about to trial DOMO on a similar replacement infrastructure with a hedge fund client.) This always-on architecture means the KPIs are reported consistently, with minimal scope for human error.
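To make the ‘always on’ idea concrete, here is a minimal sketch of the kind of KPI roll-up such a pipeline automates. The field names (`channel`, `spend`, `conversions`) and the cost-per-acquisition metric are illustrative assumptions, not the actual warehouse schema:

```python
from collections import defaultdict
from datetime import date

# Hypothetical rows as they might land in the warehouse after ETL.
rows = [
    {"day": date(2019, 3, 4), "channel": "paid_search", "spend": 120.0, "conversions": 6},
    {"day": date(2019, 3, 5), "channel": "paid_search", "spend": 110.0, "conversions": 4},
    {"day": date(2019, 3, 4), "channel": "email", "spend": 20.0, "conversions": 5},
]

def weekly_kpis(rows):
    """Roll daily datapoints up to one KPI row per (ISO week, channel)."""
    totals = defaultdict(lambda: {"spend": 0.0, "conversions": 0})
    for r in rows:
        key = (r["day"].isocalendar()[1], r["channel"])
        totals[key]["spend"] += r["spend"]
        totals[key]["conversions"] += r["conversions"]
    # Derive cost-per-acquisition the same way every week; the point of an
    # always-on pipeline is that nobody recomputes this by hand in a spreadsheet.
    return {
        key: {**t, "cpa": t["spend"] / t["conversions"] if t["conversions"] else None}
        for key, t in totals.items()
    }

kpis = weekly_kpis(rows)
```

The value is not the arithmetic but the consistency: the same definition of each KPI runs on every refresh, so the weekly review starts from numbers nobody has to reconcile.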
Seeing is believing: we found that when the team has visibility of each channel, the trends and relationships are better understood, because they are discussed openly and different ideas can be exchanged amongst the team.
Frequent, on-demand data analysis
With good visibility, good data access and a healthy respect for ideas and hypotheses, the number of insights and investigations requested increased, giving the analyst plenty of interesting challenges, which were stored and prioritised on the TestBoard to manage the cadence.
Weekly Testing Schedule
Without an agreed testing schedule, the ‘nice to knows’ creep in, and this is the importance of the team’s peer-review effect: the act of verbally stating to your teammates the test and the reasons for it ensures that tests are done in value-priority order and are planned for serial and parallel execution to maximise each testing sprint.
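The TestBoard logic above can be sketched very simply. This is a hypothetical illustration, assuming each proposed test carries a peer-reviewed value score and that tests on different channels can run in parallel while tests on the same channel run serially; none of these names come from the actual board:

```python
from dataclasses import dataclass

@dataclass
class Test:
    name: str
    value_score: int   # peer-reviewed estimate of business value
    channel: str       # tests on different channels can run in parallel

def plan_sprint(backlog, slots_per_channel=2):
    """Group tests by channel, highest value first, capped at the number
    of serial slots a single channel can run in one sprint."""
    plan = {}
    for t in sorted(backlog, key=lambda t: t.value_score, reverse=True):
        lane = plan.setdefault(t.channel, [])
        if len(lane) < slots_per_channel:
            lane.append(t.name)
    return plan

backlog = [
    Test("new subject line", 3, "email"),
    Test("landing page copy", 9, "paid_search"),
    Test("send-time shift", 7, "email"),
    Test("broad match keywords", 5, "paid_search"),
    Test("footer CTA colour", 1, "email"),
]
sprint = plan_sprint(backlog)
```

The low-value ‘nice to know’ (the footer colour test) simply never makes the sprint, which is the peer-review effect expressed as a rule rather than a debate.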
This is so important that I’ve repeated it, because you may have dismissed it as idealistic clap-trap the first time.
It’s important to maintain an open approach to asking questions and to respectfully exchanging and discussing ideas for subsequent data analysis. Alternative or fresh perspectives brought into a team can so often help in building and refining models for uncovering insight from the business’s data.
Reflecting on the method described above, I started researching whether others were thinking or writing along the same lines. I found this article from McKinsey (with a really helpful film) demonstrating how, in parallel, we’ve been working on establishing agile marketing teams with a very similar approach. I’m glad a consensus might be emerging in this way; in particular, the rules of intensity in ‘making promises to your colleagues’ on daily tasks.