Copy tests that optimise micro-elements can make a powerful difference, but trying to test the relative performance of different language styles is much trickier.
Sure, we can often see how adding a word denoting urgency to a call-to-action (CTA) button leads to an uplift in conversions, or show that a benefit-led headline on a landing page outperforms a feature-led one.
Many such copy tests have a strong element of common sense. But when it comes to testing the flavour of language to use in, say, your product descriptions, things get harder.
Copy style or tone is notoriously difficult to define and replicate. For one thing, it’s hard to be sure that two completely different sets of copy doing the same job are each written to the same standard, on their own terms.
For another, it’s difficult to ensure that you’re testing the effect of language alone and not also other elements, such as design. And there are reasons to wonder whether the entire enterprise is even the right thing to be testing.
For me, the following two case studies, while interesting and useful, highlight some of these issues…
[…]