Tony Grant

Building A Culture Of Experimentation

Whilst running client CRO campaigns we’ve discovered that when teams adopt certain habits, conversion rate optimisation is far more successful. In this article, I discuss four mindsets that, if introduced and reinforced, lead to a favourable environment for conversion rate optimisation practices to thrive.


There is also an updated interview with a fellow CRO expert.

  1. Customer-centric approach

  2. Build a culture of optimisation

  3. Data drives decisions

  4. Failure is not a failing


Read on to find out how incorporating these mindsets inspires growth.

Customer-centric approach

One of the most common questions we’re asked when talking to businesses is ‘how do I increase my conversion rate?’ Answering it is very difficult: the answer depends heavily on factors that differ between companies, sectors and goals, and the conversion rate isn’t always the best thing to concentrate on.

Our approach focuses on learning about the customer: the different customer types; how, when and why they interact with your site and its content; and how they interact with your search campaigns.

Build a culture of optimisation

To be successful, you have to keep understanding your customers and making improvements in line with your findings. To do this, you need to shift perceptions within your team.

Build a culture of optimisation, where ‘improving performance’ becomes part of the team’s cognitive process. As long as goals are communicated effectively, your team will remain focused on delivering quality results time and time again. Try it. I've seen huge improvements in performance when companies embrace a culture of continuous improvement.

Data drives decisions

Data is very powerful. It produces concrete metrics that can be analysed to judge a website’s performance. When you’re looking for new and innovative ways to solve problems, empirical data is arguably more valuable than anything else. A solid benchmark of data gives the best starting point for identifying and prioritising areas to improve.
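Purely as a hypothetical illustration of what such a benchmark could look like, here is a minimal Python sketch that turns an analytics export into per-segment conversion-rate baselines. The file name, column names and ranking logic are assumptions made for the example, not a description of any real client setup.

```python
# Minimal sketch: build conversion-rate benchmarks per landing page and device
# from a hypothetical analytics export. File name and column names are invented
# for illustration, not taken from any real client data.
import pandas as pd

# Expected columns: landing_page, device, users, conversions
df = pd.read_csv("analytics_export.csv")

benchmark = (
    df.groupby(["landing_page", "device"], as_index=False)[["users", "conversions"]]
      .sum()
)
benchmark["conversion_rate"] = benchmark["conversions"] / benchmark["users"]

# High-traffic, low-converting areas are usually the first candidates to investigate.
print(
    benchmark.sort_values(["users", "conversion_rate"], ascending=[False, True]).head(10)
)
```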

Failure is not a failing

When you enter the world of testing, some things won’t turn out the way you expected. This should not be seen as a failure. Testing produces data and, as we know, data is invaluable to good decision making, so testing, no matter the result, is crucial. Any data that helps us make the right decisions in the future is always a big benefit.

Here is a recent interview with fellow CRO Specialist Rommil.


Hi Tony, how have you been? Thanks for taking the time to chat!

Thank you for inviting me to talk about experimentation. All is as well as it can be in these unprecedented times. Week 5 of lockdown for us in the UAE, keeping safe and staying busy, learning new skills, reading and exercising.


Could you share with us a bit about what you do at Informa and your career journey thus far?

I’m a CRO specialist, enabling companies to improve their online sales and increase opportunities at lower costs. After completing an MSc in Marketing Comms, I joined a wonderful digital agency called CandidSky as an account manager. I very quickly started to notice that many companies didn’t make decisions based on the data available to them. It was then that I immersed myself in the world of ‘CRO’: reading whitepapers, taking courses, speaking to people in the industry and, of course, trying and failing.


After a few years at payments company Worldpay based in London, I moved to Dubai to be the CRO specialist at Informa Markets. My main responsibility is to enable experimentation and drive a culture of test & learn across the division. I'm sure we'll come on to the how.


As someone who is driving a change of culture towards a data-driven “Test & Learn” one, can you share with us how you convince skeptics and get people excited about Experimentation?


If your key business purpose is growing customer value, competitive advantage and sustainable profitability, there is a good chance experimentation will help. You have to be excited and enthusiastic about what you do and believe in it, which is true of anything you do.


I try to understand why that person might be sceptical by collating information such as how mature their digital marketing strategy is and whether they have tried experimentation before. What did they do? How did they go about it? I then drill down accordingly; the answers direct me down the right route to take. The key is keeping the questions open-ended.


For example, when simply asked how their conversions are performing, 99% of people will say they could be better. This leads to: what do they think the issue is? Do they know?

If they know, as an experienced optimiser you might be able to identify some areas to look at. If they don't know, then simply demonstrate that there are methods we can use to find out, and provide previous examples.


Honesty and authenticity are absolutely key. Always leave people with a simple action or tip they can try quickly and easily, with a low barrier to entry. Always extend the offer to help based on the information you collate.


I see you’ve spent some time at UserTesting. What role does user testing play in Experimentation?

User testing, in my opinion, is underused by companies. Often in projects such as a new website build or redesign, this step is missed or deprioritised due to time/budget constraints.


User testing complements web analytics and is essential for any form of research. User testing helps you figure out 'why': why are people doing what they are doing? What are the bottlenecks? You can then spot patterns and begin the process of minimising perceived barriers, creating a stronger hypothesis.


Documentation is a cornerstone of a well-run experimentation practice. What are some of the key details you capture?

Agreed, it's an extremely important aspect of an experimentation programme, primarily to aid the programme's adoption and scalability. Apart from the obvious variation screenshots, hypothesis, dates and results, we tag the information in a certain way:

  • Behaviour type

  • Segment

  • Template page type

  • Product/service

  • Region

  • Market

This allows teams to view what has or hasn't worked. The main purpose for us at the moment is to help springboard teams who may be new to CRO: providing inspiration and educating them to run their own tests and experiences, again reducing the entry barrier. As we mature, it will become a playbook for future initiatives.
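As a rough sketch of what one entry in such a test log could look like, the snippet below mirrors the tags above in a simple Python record. The class name and example values are invented for illustration, not taken from a real experiment.

```python
# Rough sketch of a single experiment-log record using the tags described above.
# The class name and example values are invented for illustration.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class ExperimentRecord:
    hypothesis: str
    start_date: date
    end_date: date
    result: str                        # e.g. "winner", "loser", "inconclusive"
    screenshots: List[str] = field(default_factory=list)
    # Tagging dimensions that let teams filter and learn from past tests
    behaviour_type: str = ""           # e.g. "urgency", "social proof"
    segment: str = ""                  # e.g. "returning visitors"
    template_page_type: str = ""       # e.g. "exhibitor listing"
    product_service: str = ""
    region: str = ""
    market: str = ""

record = ExperimentRecord(
    hypothesis="Showing remaining stand availability will increase enquiries",
    start_date=date(2020, 3, 1),
    end_date=date(2020, 3, 21),
    result="winner",
    behaviour_type="urgency",
    segment="new visitors",
    template_page_type="exhibitor listing",
    region="MEA",
    market="construction",
)
```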


How do you ensure that the two practices, Experimentation and User Research, complement each other efficiently?

The two go together; you can't really do one without the other (you can, but we all know how that goes). You need solid research to inform what you are going to experiment on, and you need experiments to validate that the research is correct.


Time is split between these activities to support both:

  • Enablement and adoption: leaning towards driving tests, ensuring the tech stack is working properly, keeping up to date with industry practices, improving frameworks and processes, aligning with business goals, and handling the internal PR.

  • Research: improving knowledge and keeping behavioural insights up to date to increase the quality of experimentation. We also provide training, workshops and knowledge shares.


How do you decide how many Experiments to run per month? What is not enough and what is too much?


It completely depends on the show size. Six months out from the show, the marketing teams are obviously in the planning and strategy phase, and traffic volume and conversions are much lower during this period of the cycle.


Essentially, there are two metrics and a few variables we look at:

  • Metrics, including the number of users (not sessions) and the number of conversions.

  • Team maturity, bandwidth, and what support they need in place to be set up for success.


Based on the data above, we can begin adjusting the strategy accordingly: the number of tests, areas of interest, hypotheses, etc.

There are guidelines in place to help teams set their test cadence and avoid false positives/negatives.
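As a hedged illustration of the kind of guideline calculation that can sit behind a test cadence, here is a minimal sketch using the standard normal-approximation sample-size formula for a conversion-rate A/B test. The baseline rate, lift and weekly traffic figures are invented, and this is not necessarily the exact method Tony's team uses.

```python
# Minimal sketch: estimate how many users per variant a test needs, and roughly
# how long it will take, before a result can be trusted. Baseline rate, lift
# and weekly traffic below are invented example figures.
from math import ceil
from scipy.stats import norm

def users_per_variant(baseline_cr, min_detectable_lift, alpha=0.05, power=0.8):
    """Approximate sample size per variant for a two-sided A/B test on a
    conversion rate, using the standard normal-approximation formula."""
    p1 = baseline_cr
    p2 = baseline_cr * (1 + min_detectable_lift)   # relative lift
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil(((z_alpha + z_beta) ** 2 * variance) / (p2 - p1) ** 2)

n = users_per_variant(baseline_cr=0.03, min_detectable_lift=0.15)
weekly_users_per_variant = 4000          # hypothetical traffic, split 50/50
print(f"{n} users per variant, roughly {ceil(n / weekly_users_per_variant)} weeks")
```

In low-traffic phases of the show cycle the required duration stretches out quickly, which is when running fewer, bigger tests tends to make sense.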



As a practice leader, how do you measure the performance of an Experiment program?

Data is only powerful if it’s converted into insightful customer knowledge and business intelligence, and used to inform an ever-improving value proposition.


Formulate a maturity model and benchmark the programme against the criteria you deem important. We have quite a robust maturity model with criteria such as management buy-in, technology, people skills and experiment quality.


This assessment allows us to understand the different shows' needs in terms of support. Do they require more training? What are their awareness levels? Is technology holding them back? Holding these conversations with senior stakeholders helps us to strategically grow the programme.
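As a loose sketch of how such a benchmark might be scored, the snippet below weights a handful of the criteria mentioned above into a single maturity figure per show. The scores and weights are invented for illustration, not taken from the actual model.

```python
# Simple sketch of benchmarking a programme against maturity criteria.
# Criterion names follow the interview; scores and weights are invented.
criteria = {
    "management buy-in":  {"score": 3, "weight": 0.3},
    "technology":         {"score": 4, "weight": 0.2},
    "people skills":      {"score": 2, "weight": 0.3},
    "experiment quality": {"score": 3, "weight": 0.2},
}

# Weighted average on a 1-5 scale gives one maturity figure to track per show,
# plus an obvious view of which criterion is holding a team back.
maturity = sum(c["score"] * c["weight"] for c in criteria.values())
weakest = min(criteria, key=lambda name: criteria[name]["score"])
print(f"Maturity: {maturity:.1f} / 5, weakest area: {weakest}")
```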


In your opinion, what goes into an Experimentation roadmap or vision?

The experimentation roadmap needs to be underpinned by a business objective and a behavioural objective. The business objective means staying close to the company and understanding what success looks like, i.e. defining success.

The behavioural objective would be along the lines of: are people utilising the right methods, are experiments impacting behaviour, and are we learning as a company?

The roadmap is then formulated to deliver that vision: test quantity, areas of interest, types of tests, segments to target, and alignment with marketing and product sprints to maximise ROI.


It’s time for the Lightning Round!

  • Bayesian or Frequentist? Depends who I’m speaking to ;)

  • If you couldn’t work in Experimentation what would you do? Sky Sports Anchorman (but not like Ron Burgundy, maybe the suit)

  • What is your biggest Experimentation-related pet peeve? ‘I did a test once and it didn't work’

  • I see you’ve moved from the UK to the UAE. What food do you miss most from home? Mum's shepherd's pie

  • What is your favourite discovery in the UAE? The history of the region. I didn't know much about the UAE before moving. The culture is fascinating.

  • Describe Tony in 5 words or less. A man with no shins


Related Pages

Conversion Optimisation Minidegree with CXL - Learn how to run an experimentation program


© Smarter Web Conversions 2020