The foundational step in A/B testing is having a goal based on a measurable, reliable metric.

Before you go changing things, you need to know that there's a behavior you can measure accurately, and that you actually want to affect that behavior.

It's also worth noting that you should measure as many user goals as possible. That coverage lets you catch unforeseen negative effects on other goals caused by an A/B change.
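As a rough illustration (a minimal sketch in TypeScript, with hypothetical names like `trackGoal` and the goal labels), you can think of it as logging a primary goal plus guardrail goals per variant, so a lift in one metric can be checked against regressions in the others:

```typescript
// Minimal sketch (hypothetical names): log a primary goal plus guardrail
// goals per variant, then compute a conversion rate for any goal.

type Variant = "control" | "treatment";

interface GoalEvent {
  userId: string;
  variant: Variant;
  goal: string;        // e.g. "signup" (primary), "checkout", "support_ticket" (guardrails)
  timestamp: number;
}

const events: GoalEvent[] = [];

function trackGoal(userId: string, variant: Variant, goal: string): void {
  events.push({ userId, variant, goal, timestamp: Date.now() });
}

// Conversion rate for any goal, per variant: unique converters / exposed users.
function conversionRate(goal: string, variant: Variant, exposedUsers: Set<string>): number {
  const converters = new Set(
    events.filter(e => e.goal === goal && e.variant === variant).map(e => e.userId)
  );
  return exposedUsers.size === 0 ? 0 : converters.size / exposedUsers.size;
}
```

In practice an analytics or experimentation tool does this bookkeeping for you; the point is simply that every goal you care about has an event you can count per variant.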

Tools matter, and there's a decent A/B testing library for most languages and platforms these days, but the methodology behind them is easy to lose sight of.
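To make that concrete, here's a minimal sketch of one piece of methodology every library handles but that's easy to get wrong by hand: deterministic assignment, where a stable user id is hashed together with the experiment name so the same user always lands in the same bucket. The hash function and the 50/50 split below are illustrative assumptions, not any particular library's API:

```typescript
// Deterministic variant assignment: hash a stable user id with the
// experiment name so each user always gets the same variant.

function hashString(input: string): number {
  let hash = 0;
  for (let i = 0; i < input.length; i++) {
    hash = (hash * 31 + input.charCodeAt(i)) >>> 0; // keep it unsigned 32-bit
  }
  return hash;
}

function assignVariant(userId: string, experiment: string): "control" | "treatment" {
  const bucket = hashString(`${experiment}:${userId}`) % 100;
  return bucket < 50 ? "control" : "treatment";
}

// The same user always sees the same variant for a given experiment.
console.log(assignVariant("user-123", "navbar-cta-test"));
```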

 

I've been working in technical SEO for a year and a half now, and when it comes to A/B testing, we at KickAssGrowth work with the client's development team to communicate every change in our planned A/B tests.

We used to use VWO to set up A/B tests we knew we could implement ourselves, when it was just a matter of readjusting the layout; but when we needed to add a CTA or a popup to a page, we asked the developers to set everything up.

There was one incident where I broke the login and register functionality by editing the navbar's HTML inside VWO. The server side could no longer verify anyone, because some checksums kept mismatching (the site was always sending the same value attached to the buttons; I forget exactly what it was). After a day I deleted the entire A/B test and had to start over, ditching the whole navbar change...
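For illustration only (the exact mechanism in my case is a guess), here's a sketch of how that kind of breakage can happen: if a client-side testing tool replaces server-rendered markup with a static copy saved in the editor, any per-session value embedded in it, say a CSRF-style token, becomes a frozen constant, and the server's check fails for everyone in the variant. The selector and token name below are hypothetical:

```typescript
// Hypothetical experiment snippet: swap the navbar for static HTML that was
// captured from one session inside the editor.
const navbar = document.querySelector("#navbar");
if (navbar) {
  // The saved markup carries a single, frozen token value for all visitors,
  // instead of the per-session value the server originally rendered.
  navbar.innerHTML = `
    <form id="login" action="/login" method="post">
      <input type="hidden" name="token" value="FROZEN_TOKEN_FROM_EDITOR">
      <button type="submit">Log in</button>
    </form>`;
}
// Server side: the submitted token no longer matches the one issued for the
// session, so login and registration are rejected for every user in the variant.
```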
