Monday Meeting: A/B Testing



Agenda: A/B Testing


A/B testing is an experiment where two or more variants of an advertisement are shown to users at random. As the experiment runs, data is collected. Eventually, the data is analyzed to determine which variant performs better.

Why use it?

Two reasons.

Test assumptions: You have questions that need to be answered before you can move forward. Instead of spending all sorts of money on research to find out why, you create an A/B test so your customers show you what is more impactful.

Refinement: You want to know the difference between the minute details. For example, are your customers more likely to click on a green or a red button? Or, will your customers react differently to a Type A caption versus a Type B caption?

How to use it?

Pick something you want to test. Build a hypothesis. Test the dang thing! Once you have the data, see what conclusions you can draw. If you were specific in your test, you will have specific results.

For example:

Assumption: Customers visiting your website care more about the services you provide than your recent work.

Test: Your home landing page has two buttons, “services” and “recent work”. You count how many button clicks each one gets over a given period. Analyze the results. BOOM, you have your answer.

Answer: For my website, data shows people seem to care more about my services than my recent work.
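The analysis step above can be sketched in code. A minimal way to check that a difference in click counts is real (and not just chance) is a two-proportion z-test; the function name and click counts below are hypothetical, made up for illustration:

```python
import math

def two_proportion_z_test(clicks_a, visitors_a, clicks_b, visitors_b):
    """Compare two click-through rates with a two-proportion z-test."""
    p_a = clicks_a / visitors_a
    p_b = clicks_b / visitors_b
    # Pooled click rate under the null hypothesis (no real difference)
    p_pool = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: "services" got 180 clicks from 2,000 visitors,
# "recent work" got 120 clicks from 2,000 visitors
z, p = two_proportion_z_test(clicks_a=180, visitors_a=2000,
                             clicks_b=120, visitors_b=2000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (commonly below 0.05) means the gap between the two buttons is unlikely to be random noise, which is what lets you say the answer with confidence.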

TWO common mistakes

1) Testing SEVERAL variables at once. These tests are short and sweet, so be specific so the data answers your hypothesis directly. With several variables, you have to spend more time analyzing and may find the data is inconclusive.

2) Using the wrong stopping rule. Most people default to running a test for a set duration of time. But tests do NOT care about time; what matters is the number of data points collected, not how long the test ran.
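The "number of data points" idea can be made concrete with a standard sample-size calculation: decide up front how many visitors each variant needs, then run until you hit that number. This is a sketch using the textbook two-proportion formula; the function name and the baseline/lift numbers are hypothetical:

```python
import math
from statistics import NormalDist

def sample_size_per_variant(p_baseline, min_lift, alpha=0.05, power=0.80):
    """Visitors needed per variant to detect an absolute lift of `min_lift`
    over a baseline rate `p_baseline`, with a two-sided test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha=0.05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for power=0.80
    p_variant = p_baseline + min_lift
    # Sum of the variances of the two click rates
    variance = p_baseline * (1 - p_baseline) + p_variant * (1 - p_variant)
    n = ((z_alpha + z_beta) ** 2) * variance / (min_lift ** 2)
    return math.ceil(n)

# Hypothetical: baseline 6% click rate, want to detect a lift to 8%
print(sample_size_per_variant(p_baseline=0.06, min_lift=0.02))
```

Whether that target takes three days or three weeks of traffic to reach is irrelevant; the test ends when the data points are in, not when the calendar says so.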

My thoughts…

I love A/B testing and figuring out the psychology of a subtle change. This can be as simple as using a word like “wet” vs “moist” in a campaign, the appealing nature of “blue” vs “green” on a button, and so much more. Also, running these tests makes me feel like an undercover cop or a behavioral scientist. Seeing the data and coming to a conclusion is the fun part. Test your hypothesis and get an answer!