The Industry’s Take on Conversion Rate Optimization: What Works, What to Expect, and How to Maximize Your Testing Efforts

Earlier this year Econsultancy conducted a survey related to conversion rate optimization (CRO), which we at 6D Global took part in, and which was published in the Econsultancy Conversion Rate Optimization 2016 Report.

Firstly, if you’re not familiar with conversion rate optimization (CRO), here’s a brief overview (if you already know CRO, feel free to skip this paragraph). CRO goes by many names, including A/B testing and website experimentation, but the concept is the same: run controlled experiments on your website with actual visitors, tracking the differences between the original and one or more variations to determine whether the changes you’ve made have a statistically significant impact on site performance.
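To make "statistically significant" concrete: the standard way to compare an original against a variation is a two-proportion z-test on their conversion rates. The sketch below is a minimal, illustrative implementation using only the Python standard library; the function name and the example numbers (10,000 visitors per arm, 400 vs. 470 conversions) are our own assumptions, not figures from the report.

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a/conv_b: number of conversions in each arm.
    n_a/n_b: number of visitors in each arm.
    Returns the z statistic and the two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical example: original converts 400 of 10,000 visitors (4.0%),
# the variation converts 470 of 10,000 (4.7%)
z, p = two_proportion_z_test(400, 10_000, 470, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If the p-value falls below your chosen threshold (commonly 0.05), you would treat the lift as statistically significant rather than noise; testing platforms perform an equivalent calculation behind the scenes.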

Below are some insights and findings from the Econsultancy Conversion Rate Optimization 2016 Report with our own additional explanation and elaboration.

CRO Really Does Work

The numbers speak for themselves: “Seven in ten (71%) companies [doing CRO/testing] have seen their conversion rates improve over the last 12 months and 72% have witnessed a ‘significant’ or ‘small’ increase in sales conversions since 2015.”

That said, a word of caution: CRO is not for everyone. One of the biggest constraints on testing is traffic. This means not just overall site traffic, but also traffic to the particular parts of the site being tested, the number of variations per test, and the audience of the test (for example, mobile-only or new visitors only). What’s the magic number, you might ask? It depends, but as a general rule we won’t encourage a client to start testing if their site receives fewer than 30,000 to 50,000 unique visitors per month (and the more, the better).
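The traffic threshold isn’t arbitrary: it falls out of standard sample-size arithmetic. As an illustration, here is a rough power calculation using the textbook normal-approximation formula; the baseline rate (3%) and the 10% relative lift are hypothetical inputs we chose for the example, not figures from the report.

```python
from math import ceil

def sample_size_per_variation(baseline, relative_lift,
                              z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed per variation to detect a lift.

    Uses the normal-approximation formula at a two-sided 5%
    significance level (z_alpha = 1.96) and 80% power (z_beta = 0.84).
    """
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    # Sum of the Bernoulli variances of the two arms
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# Detecting a 10% relative lift on a 3% baseline conversion rate
n = sample_size_per_variation(0.03, 0.10)
print(n)
```

With these inputs the answer comes out in the tens of thousands of visitors per variation, which is why a site below roughly 30,000 to 50,000 monthly uniques would need to run a single two-arm test for months to reach a conclusion.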

Another point from the report is that 73% of those companies that increased their CRO budget saw improved conversion rates, a clear correlation between investment and results.

Traffic is one thing; another major contributing factor is investment. If a client isn’t willing to dedicate the time and resources testing requires, the outcomes will likely disappoint. Sound testing takes time: each test must be ideated, prioritized, built, launched, and monitored. Our recommendation is simple: don’t start testing unless you’re ready to make a committed, concerted effort and investment.

Not Everything is Sunny with CRO

Although the vast majority of companies conducting CRO are seeing an increase in conversion, the proportion who are either ‘quite’ or ‘very’ dissatisfied with their conversion rates has risen by 8% since last year. The report speculates that resources are to blame, citing them as the most significant barrier for testers; this also ties back to our earlier point about making a half-baked initial investment in CRO.

In our experience, dissatisfaction often stems from unrealistic expectations of what CRO can accomplish, and in what timeframe. When discussing outcomes with an agency, freelancer, member of your team, or your executives, make sure proper expectations are set. If anyone expects wildly improved metrics in a very short amount of time, that’s a red flag.

Testing moves much more slowly than outright website development because each test must be built, QA’d, launched, monitored, reported on, and closed. Each step is important to ensure sound testing methodology, which in turn builds confidence that the results we see in a test are directly attributable to the changes we made.

More Isn’t Always Better

Some people believe that to maximize return, you should run as many tests as possible. In fact, that’s not true. While the proportion of testers running more than three tests has increased, correlating test volume with results shows that those running many tests see either diminishing results or dramatically lower satisfaction. Interestingly, the report suggests the sweet spot is about three tests a month.

This is likely a matter of approach. Those conducting “as many tests as possible” are probably taking an unstructured, spaghetti-on-the-wall approach, i.e. running tests for the sake of running tests. In our experience, the best results from, and satisfaction with, a CRO engagement come from a strategic, methodical approach to testing, which gives you more confidence both in individually run tests and in the potential of the overall engagement.

In fact, the report points out that 84% of companies with a structured approach have seen improvements in conversion rates, while that same figure for those without a structured approach is just 64%.

A quote from the Econsultancy Conversion Rate Optimization 2016 Report elaborates on this: “Encouragingly, organizations are more likely to give their optimization strategy the attention it deserves in their bid to improve conversion rates, potentially because they’ve already experienced the benefits of developing a strategic plan instead of relying on guesswork.”

In short, with strategic testing practices in place, proper expectations set, and enough resources to support the initiative, the numbers show that CRO is worth the investment, from outcomes to satisfaction to ROI.

If you are a member of Econsultancy and want to read the full report click here to visit its site; or if you are interested in talking with a CRO professional at 6D Global, click here to get a conversation started.