
We grew signups from The Craft 50% year-on-year — yes, even as SEO traffic got tricky. Below, we share the A/B testing playbook that made it all possible.
by Karen Robinson, Joe Martin, and Ben Ice
I’m sure you’re no stranger to this type of team discussion — a healthy debate between your editorial, design, and marketing colleagues about the best way to structure your website and content. The goal? Catch attention and convert it.
Here at The Craft, we’re no different. So — how do we keep everyone happy, and ensure we’re putting out quality, beautiful articles that still meet our marketing goals, without being too shouty? How do we showcase the wonderful work people create with Shorthand, without being too salesy?
To put it more bluntly, how do we choose a winner when we’re arguing about a small piece of copy or design?
What difference did including a 'read time' make to our bounce rates? Find out below.
A/B testing over the last 12 months has helped us make more informed decisions and drive greater business impact from The Craft. Here’s how we got there — and how the lessons we learned can help you, too.
The Craft, Shorthand’s digital magazine, has a number of different goals, from showcasing the wonderful work people create with Shorthand to keeping readers engaged and converting them into signups. Those last two, engagement and conversion, are where our optimisation is most important, and measurable.
Each story on The Craft is a serious investment — we’re talking thousands of dollars in writing, design, and ongoing upkeep. So, it makes sense to ensure every article pulls its weight. That means not just bringing in traffic, but keeping readers hooked and nudging them to take the next step. When we get it right, the payoff is huge — some of our best performers deliver more than 500 signups a year, long after they were first published.
But it’s just as easy to back the wrong horse with a piece that doesn’t land or a design that doesn’t convert.
That’s where A/B testing comes in.
Simple additions like buttons and G2 ratings can make all the difference — but it's important to test before rolling out.
By experimenting with what works (and what doesn’t), we can fine-tune individual stories and roll out those wins across our whole library. Given how much we invest in The Craft — and how simple these tests are to run — it’s a no-brainer.
Diving into A/B testing has proven to be a timely move. With search traffic on the slide and algorithms doing their usual dance, relying on Google felt like a shaky strategy. By focusing on conversion instead of just clicks, we’ve made sure every visitor counts — turning readers into subscribers, customers, and long-term fans. In a world where traffic is never guaranteed, that shift has helped futureproof The Craft for whatever comes next.
Digital publishers love to think we know what works — great writing, clean UX, beautiful design. But when it comes to CTAs and user journeys, we’re often just making educated guesses. Testing lets us challenge our assumptions by comparing like for like — rolling out tweaks and variants across two (or three, or four…) otherwise identical pieces of online real estate, then tracking how user behaviour changes.
With clear goals, the right tools, and a little curiosity, you can spin up tests in no time — and start proving (or disproving) your hunches fast.
Take your website’s buttons, for instance. Why that colour? That wording? That spot on the page? Are you sure they couldn’t earn more clicks or signups if you mixed things up a little? In our case, a simple colour change boosted conversions by 9%.
Which button colour drove the most clicks? Find out in our quiz below.
And that’s just one example. A/B testing — also known as split testing — has helped us uncover all sorts of wins. Here’s how we did it.
When embarking on A/B testing, it’s important to be clear on what you want to improve — signups, time on page, click-throughs, conversions, or something else. A test without a clear goal is just noise.
For The Craft, our goal was simple but ambitious: lift our signup rate by 30%. That meant focusing on the moments where readers decide whether to stick around — the buttons, forms, and CTAs that turn casual visitors into subscribers. With that target in place, every design tweak and experiment had a clear purpose: getting more of our readers to take that next step.
With our goal in place, the next step was finding the right tool to make it happen. There’s no shortage of A/B testing platforms out there, but we wanted something powerful, straightforward, and sensibly priced. After comparing a few options, we landed on Convert — a platform that ticked all the boxes without unnecessary bells and whistles. With help from our developers, we got it integrated and our conversion actions properly defined, so every signup was tracked from day one. Getting that setup right meant we could spend our time testing ideas, not troubleshooting tech.
Installing Convert into our Shorthand stories was relatively straightforward. If you’re thinking about getting started with A/B testing your own Shorthand content, chat with our tech support team about integrating your chosen testing tool into your custom theme. And, if you’re looking for an affordable testing platform, take a look at Convert, VWO, or Optimizely.
The Craft is produced by a talented team — but when it came to fine-tuning our designs, layouts, and calls-to-action for better conversions, we knew we could use a fresh perspective.
Enter Melody Chan, a conversion optimisation specialist who took a detailed look under the hood. Melody ran a full audit of The Craft, analysing our top-performing posts, traffic data, design styles, and the ways we invite readers to sign up for Shorthand.
After completing her audit, Melody proposed ten initial optimisation tests based on the LIFT model — each designed to either increase urgency to act, boost relevance and clarity, or minimise distraction and anxiety.
The LIFT model demonstrated. Image courtesy of Conversion.
Various A/B test suggestions, prioritised with expected impact weighed up against amount of work to implement.
With ten potential tests on the table, the next challenge was deciding where to start. We rated each idea based on two factors — its expected impact and the effort required to set it up — then combined those scores to create a simple priority ranking. That gave us a clear roadmap for which experiments to tackle first.
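To make that scoring concrete, here’s a minimal sketch of one way to combine impact and effort into a priority ranking. The ideas, scores, and formula below are hypothetical, not the exact ones we used.

```python
# A minimal sketch of impact-versus-effort prioritisation for a backlog
# of test ideas. The ideas, scores, and formula below are hypothetical.

test_ideas = [
    # (name, expected impact 1-5, effort to implement 1-5)
    ("Change CTA button colour", 3, 1),
    ("Add a 'read time' estimate", 2, 1),
    ("Show a G2 rating near the signup form", 4, 3),
    ("Redesign the article footer", 5, 5),
]

def priority(impact: int, effort: int) -> float:
    """Higher expected impact and lower effort give a higher priority score."""
    return impact / effort

ranked = sorted(test_ideas, key=lambda idea: priority(idea[1], idea[2]), reverse=True)

for name, impact, effort in ranked:
    print(f"{name}: impact={impact}, effort={effort}, priority={priority(impact, effort):.2f}")
```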
For each test, we created duplicate Shorthand stories and made our tweaks to the variants, keeping the original story as the control. In most cases, we ran three versions in parallel: the original plus two modifications. From there, we set up split URL tests in Convert, which automatically directed equal amounts of traffic to each variant.
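Convert handled the traffic split for us, but if you’re curious how an even split can work under the hood, here’s a rough sketch of one common approach: hashing a visitor ID so each visitor consistently lands in the same variant. It illustrates the general technique only, not how Convert implements it, and the URLs are placeholders.

```python
import hashlib

# A sketch of deterministic visitor bucketing for a split URL test.
# The URLs below are placeholders; Convert (and other platforms)
# handle this assignment automatically.

VARIANT_URLS = [
    "https://example.com/story",     # control
    "https://example.com/story-v1",  # variant 1
    "https://example.com/story-v2",  # variant 2
]

def assign_variant(visitor_id: str) -> str:
    """Hash the visitor ID and map it to one of the variant URLs.

    The hash is stable, so a returning visitor always sees the same variant,
    and a large audience spreads roughly evenly across the buckets.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(VARIANT_URLS)
    return VARIANT_URLS[bucket]

print(assign_variant("visitor-12345"))
```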
Once everything was live, all that was left to do was watch the numbers roll in — and see which ideas actually moved the needle.
Once the tests were up and running, it was time to see what stuck. We monitored results in Convert, watching key metrics like signups and engagement to spot early trends — but we resisted the urge to call winners too soon. Each test needed enough traffic and time to reach statistical significance, so patience was part of the process.
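If you’d like a feel for what ‘reaching statistical significance’ involves, here’s a rough sketch of a two-proportion z-test comparing signup rates between a control and a variant. The visitor and signup numbers are invented, and platforms like Convert run this sort of analysis (and more sophisticated versions) for you.

```python
from math import sqrt
from statistics import NormalDist

# A sketch of a two-proportion z-test on signup conversion rates.
# The visitor and signup counts below are invented examples.

def z_test(control_visitors, control_signups, variant_visitors, variant_signups):
    p1 = control_signups / control_visitors
    p2 = variant_signups / variant_visitors
    # Pooled conversion rate under the null hypothesis of no difference.
    pooled = (control_signups + variant_signups) / (control_visitors + variant_visitors)
    se = sqrt(pooled * (1 - pooled) * (1 / control_visitors + 1 / variant_visitors))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p1, p2, z, p_value

p1, p2, z, p = z_test(control_visitors=8000, control_signups=160,
                      variant_visitors=8000, variant_signups=204)
print(f"Control: {p1:.2%}  Variant: {p2:.2%}  z={z:.2f}  p={p:.4f}")
```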
As data finally came in, some results were expected, while others completely surprised us. Small tweaks — such as changes in button colour — sometimes had outsized effects, while a few ideas we were confident in turned out to have little impact. But even those ‘misses’ were useful: every test taught us more about how readers interact with The Craft.
Those insights became the foundation for an ongoing testing strategy — one that continues to make The Craft sharper, smarter, and more effective with every round.
Dashboard results showing the success and statistical confidence of a test variant.
Testing button colours.
Once we knew what worked, we rolled those wins out across The Craft. Each test taught us something valuable — about copy, colour, placement, or timing — and applying those insights more broadly meant compounding improvements over time. A small lift on one story could translate into hundreds of extra signups once scaled across our library.
These days, testing is just part of how we work. We keep at least one A/B test running on The Craft at any given time, constantly refining how we engage and convert readers. And it’s paying off — a year on, we’ve hit our original goal of increasing signups by 30%, and last month alone we were up 50% year over year. The experiments never really stop, but that’s exactly the point — each one brings us closer to making The Craft the best it can be.
Here are some of the variants we A/B tested on The Craft. Test your knowledge — take a guess which variants performed better.
It took us a few rounds of testing before we landed on the right mix: realistic goals, the right candidate articles, and enough traffic and cadence to start seeing meaningful results. To give you an idea, several of our early tests produced no actionable insights at all.
So, prep your hypotheses and start testing. Even if those first results (or non-results) don’t teach you much, you’ll still learn how to test better — and that’s what gets you to the good stuff.
We were overambitious in our first round of testing — running five different experiments, each on its own individual Craft post. Every one of them came back inconclusive. There simply wasn’t enough traffic hitting each variant to produce meaningful results.
Now, we run one test at a time across multiple posts — giving us bigger audiences, clearer data, and far more useful insights.
Our successful testing format took 6-12 months to finesse, but is now motoring along fuelled by volume and frequency.
With the above lessons in mind, we now perform one test at a time, applied over a selection of 10 high-traffic articles. This guarantees we’ll get enough traffic through the tests to have a material result.
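For a rough sense of what ‘enough traffic’ means, here’s a sketch of a standard sample size estimate for detecting a given lift in conversion rate. The baseline rate and target lift below are hypothetical.

```python
from math import sqrt
from statistics import NormalDist

# A sketch of a per-variant sample size estimate for an A/B test on
# conversion rate, using the standard normal approximation.
# The baseline rate and target lift below are hypothetical examples.

def sample_size_per_variant(baseline_rate, relative_lift, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant to detect the given relative lift."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance threshold
    z_beta = NormalDist().inv_cdf(power)           # desired statistical power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return numerator / (p2 - p1) ** 2

# e.g. a 2% baseline signup rate and a hoped-for 20% relative lift
print(round(sample_size_per_variant(0.02, 0.20)))  # roughly 21,000 visitors per variant
```

Numbers like these are why pooling one test across ten high-traffic articles gets you to a meaningful result far faster than running it on a single post.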
The gains you uncover through testing may be incremental. Running more tests, more often, is how those small wins add up to noticeable improvements over time.
We run each test for 30 days, during which we meet as a team to plan the next one. This helps minimise downtime between tests, which we cap at seven days.
Now that we’ve landed on a solid testing format and a reliable set of ‘lab rat’ articles, speed to market has become a priority. It’s our way of futureproofing The Craft — making sure our content keeps delivering, even as search patterns do their little dance.