You are measuring conversion rate wrong.

1. Conversion rate reporting

Conversion rate is the percentage of visitors who have completed goal Y after completing goal X.

The conversion rate most of us are familiar with measures the percentage of visitors to your website who complete a desired goal. It is usually a low single-digit percentage.

The most common way to measure conversion rate is to take one raw total and divide it by another. This is WRONG.

This is one of those instances where we were taught to do the wrong thing, and few of us ever questioned it.

Applying saliva to a wound comes to mind. In case you are wondering, the bacteria present in your saliva might hurt you more than heal you.

2. Right and wrong ways of reporting

To put things into perspective, here’s a case study to show the difference between how your digital data person (Tom) and a digital data guru (WAG) handle reporting.

Case study

We want to measure the effectiveness of our product application flow to find out if there is opportunity for optimization. Below is the flow:

  1. Customer lands on the product page.
  2. Customer clicks the “Apply” button.
  3. Customer starts filling in the application form.
  4. Customer completes the application form.

We will measure 3 conversion rates:

  • % of customers who clicked “Apply” button after landing on product page
  • % of customers who started application form after landing on product page
  • % of customers who completed application form after landing on product page

Tom, your digital data person

A. First, he extracts the 4 numbers he needs:

  • Number of customers who viewed product page
  • Number of customers who clicked on “Apply” button
  • Number of customers who started application form
  • Number of customers who completed application form

B. To calculate the 3 conversion rates, he starts dividing the numbers on hand:

  1. % of customers who clicked “Apply” button after landing on product page:
    • Click apply button / View product page
    • = 8,921 / 12,773 = 69.84%
  2. % of customers who started application form after landing on product page:
    • Start application form / View product page
    • = 3,666 / 12,773 = 28.70%
  3. % of customers who completed application form after landing on product page:
    • Complete application form / View product page
    • = 2,146 / 12,773 = 16.80%
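For the curious, here is a minimal sketch of Tom’s approach in Python with pandas. The events table, column names, and toy numbers are hypothetical (they are not from the case study); the point is that each metric is just the distinct-customer total for one event, divided against another total with no regard for sequence.

import pandas as pd

# Hypothetical event-level data: one row per tracked event.
# Customer 2 clicked “Apply” without ever viewing the product page
# (say, via a deep link), an assumption added purely for illustration.
events = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3, 3, 4],
    "event": [
        "view_product_page", "click_apply",
        "click_apply",
        "view_product_page", "click_apply", "start_form",
        "view_product_page",
    ],
})

# Tom's method: divide one raw total by another.
viewers = events.loc[events["event"] == "view_product_page", "customer_id"].nunique()
clickers = events.loc[events["event"] == "click_apply", "customer_id"].nunique()

print(f"Apply click rate (Tom): {clickers / viewers:.2%}")  # 3 / 3 = 100.00%

The click by customer 2 never followed a product page view, yet it still lands in the numerator. That is the seed of the inflation we will see below.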

C. With these healthy-looking numbers, you, the business user, will conclude,

Product application flow is doing well and does not require further optimization.

But the question remains… why does the product sales number suggest otherwise?

WAG, your digital data guru

A. First, he prepares the 4 segments he needs:

  • Customers who viewed product page
  • Customers who clicked on “Apply” button
  • Customers who started application form
  • Customers who completed application form

B. Next, he creates 3 sequential segments for customers who:

  • Viewed product page, then clicked on “Apply” button
  • Viewed product page, then clicked on “Apply” button, then started application form
  • Viewed product page, then clicked on “Apply” button, then started application form, then completed application form
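If you want to see the idea behind a sequential segment outside of an analytics tool, here is a minimal Python/pandas sketch. The events table, column names, and helper function are hypothetical and only mirror the concept (ordered steps per customer), not any specific tool’s segment builder.

import pandas as pd

# Hypothetical event-level data with timestamps.
# Customer 2 clicked without a page view; customer 3 clicked BEFORE viewing.
events = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3],
    "event": [
        "view_product_page", "click_apply",
        "click_apply",
        "click_apply", "view_product_page",
    ],
    "timestamp": pd.to_datetime([
        "2024-01-01 10:00", "2024-01-01 10:05",
        "2024-01-02 09:00",
        "2024-01-03 08:00", "2024-01-03 08:10",
    ]),
})

def in_sequence(group: pd.DataFrame, steps: list[str]) -> bool:
    """True if the customer performed `steps` in chronological order."""
    seen = group.sort_values("timestamp")["event"].tolist()
    i = 0
    for event in seen:
        if i < len(steps) and event == steps[i]:
            i += 1
    return i == len(steps)

# Sequential segment: viewed product page, THEN clicked “Apply”.
steps = ["view_product_page", "click_apply"]
segment = [cid for cid, grp in events.groupby("customer_id") if in_sequence(grp, steps)]

print(segment)  # [1]  only customer 1 qualifies; customers 2 and 3 are excluded

Customers 2 and 3 would still be counted by Tom’s method, which is exactly where the inflation comes from.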

* If you are unsure what a sequential segment is, here’s an article for you:

C. To calculate the 3 conversion rates, he starts dividing the numbers derived from the sequential segments:

  1. % of customers who clicked “Apply” button after landing on product page:
    • View product page > Click apply button / View product page
    • = 4,233 / 12,773 = 33.14%
  2. % of customers who started application form after landing on product page:
    • View product page > Click apply button > Start application form / View product page
    • = 940 / 12,773 = 7.36%
  3. % of customers who completed application form after landing on product page:
    • View product page > Click apply button > Start application form > Complete application form / View product page
    • = 523 / 12,773 = 4.09%
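Plugging the sequential-segment counts from the case study into a quick Python check reproduces WAG’s three rates:

viewed_product_page = 12_773

# Counts derived from the sequential segments in the case study.
sequential_counts = {
    "Clicked “Apply” after viewing product page": 4_233,
    "Started application form": 940,
    "Completed application form": 523,
}

for label, count in sequential_counts.items():
    print(f"{label}: {count / viewed_product_page:.2%}")

# Clicked “Apply” after viewing product page: 33.14%
# Started application form: 7.36%
# Completed application form: 4.09%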

D. With these numbers, you, the business user, will conclude,

Product application flow is not optimal and there are opportunities for optimization:

  • Engagement rate (click apply button) is too low
    • Surface CTA button? Make button more prominent?
  • High drop-off from Click apply button > Start application step
    • Reduce number of steps? Simplify authentication method?

3. Bad reporting = result inflation

Let’s take a look at the numbers reported by both Tom and WAG.

Tom’s numbers were wildly inflated, giving business users a false sense of accomplishment.

Reason for discrepancy

Take the “Click apply button” step as an example.

WAG’s figure counts only users who clicked the Apply button after visiting the product page.

Tom’s figure counts every user who clicked the Apply button, regardless of where they came from.
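To see the difference in the numerator on toy data: assume, purely for illustration, that one customer reaches the Apply button from a bookmarked link without viewing the product page (the data and the traffic source are assumptions, and the ordering check from the sequential-segment sketch above is left out for brevity).

import pandas as pd

# Hypothetical events: customer 2 clicked “Apply” without a product page view.
events = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "event": ["view_product_page", "click_apply", "click_apply"],
})

clickers = set(events.loc[events["event"] == "click_apply", "customer_id"])
viewers = set(events.loc[events["event"] == "view_product_page", "customer_id"])

print(len(clickers))            # 2: Tom's numerator, every “Apply” click from any source
print(len(clickers & viewers))  # 1: WAG's numerator, clickers who also viewed the page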

Technical != data storytelling

Technical consultants like Tom typically have tons of experience with data collection and tracking implementation.

What they lack is the ability to put data into context and to understand the true meaning behind each piece of data collected.

Data misrepresentation

Unfortunately, data misrepresentation is more common than you think.

Data misrepresentation often leads to bad insights, which end up impacting business decision making.

Fallout chart tells it all

If we compare the results against a fallout chart, you can clearly see that the data was never meant to be reported the way Tom did it:

4. Conclusion

As a business user, if you are unsure whether your reports are accurate, do approach your data person with this new-found knowledge and sort things out.

As a data person, if you suspect you have been doing reporting wrong, do take some time to go through your existing reports and revisit those business requirements.

If you need help, feel free to reach out to me and I’ll be glad to assist.

About Zenny Tan Zhong Ming
Let's connect on LinkedIn ( https://www.linkedin.com/in/zenny-tan-zhong-ming ).
