12 Tips for Obtaining and Using Customer Testimonials

One of the most powerful things you can put on your software sales website is a customer testimonial. Prospective customers who visit your website will trust you more if you have testimonials, because testimonials indicate that (1) other people are using your software, (2) at least some of those people are very happy with it, and (3) you listen to what customers are saying about your software.

So, here are 12 tips for using testimonials on your website.

Obtaining Testimonials

  • FIRST, above all else, your testimonials need to be from actual users, preferably ones who are paying you money. Anything else is deceptive (and in some places, illegal).
  • DON’T ask customers to “write you a testimonial.” Most won’t, and the ones that do will try to think like marketing people and will write things that sound forced and unnatural. You want testimonials that are genuine, so readers say “I believe that a real person wrote that, not a marketing drone.”
  • DO send a regular survey to your customers that asks open-ended questions like “What is the primary benefit you receive from <product>?” I recommend an annual survey, but depending on your market, something more frequent could be ok. (By the way, we highly recommend the simple survey at survey.io, which is where the question quoted above came from).
  • DO keep an eye on your communication with your users. Emails, tweets, and even bug reports can have customers saying very nice things about your software. When you get nice comments, make a note of them.
  • DO ask people if you can use their quote before publishing it. Send a short email asking if you can use what they said. If they say “no,” then don’t use it. (In 11 years of running a software business, I have never had a customer say “no” when asked if I could quote them on my website).
  • DO alter their testimonial so it makes sense. Don’t put words in their mouth, but it’s ok to make “editorial” changes to make their message more concise. For example, if the customer wrote “I love using your software, it’s a must have for every one of the people I know that’s a dentist,” you might change it to “I love using <ProductName>. It’s a must have for every dentist I know.”
  • DO tell the customer exactly what you intend to publish, including any wordsmithing you’ve done, and how you intend to display their name and any other personal details you intend to use.

Publishing Testimonials

Here are a few simple guidelines to help your testimonials have maximum impact when you publish them:

  • DO make the testimonial stand out from the other text on your website. You could make it a different color, put it in a box, or use a different font. The point is to convey the idea that this text is different from the other text, because someone else wrote it.
  • DO use testimonials as a form of “proof” of something you’re trying to convey on the page. For example, a page about how much time your software saves users would be enhanced by a testimonial that says “<ProductName> saves me a ton of time.”
  • DO use multiple testimonials together, if they all support the same basic idea. If you have three testimonials that talk about how much time your software saves, you can put them together to strengthen the “proof” mentioned above.
  • DO make the testimonial as personal as possible. Always use the author’s first name. Even better, use the author’s first and last name. Better yet, use their first and last name and their picture! How far to go with this will depend on the comfort level of your customers and your market. Personally, I use a first name with a last initial, and a bit saying “<ProductName> user since 2008,” or something similar. You want your reader to feel a connection to the author of the testimonial, so that they trust the testimonial, and by extension trust you.
  • Finally, DO deliver on the testimonial! This goes back to the top, about using genuine, honest quotes from real users. If something about your product changes that makes a testimonial no longer apply, stop using it. If a user had an exceptional experience that’s the opposite of what most of your users see, don’t quote them on that experience. (Have you seen those “results not typical” weight loss ads? It might generate interest, but it just leads to users who try your software and are disappointed. Don’t do that.)

Testimonials build both interest and trust in your message. Obtain them appropriately and use them wisely, and they’ll do great things for your business.

BONUS: Did you like the advice in this post? I’ve got three more tips on using testimonials (including the exact text of the email I send to customers when I ask if I can quote them) that I’m going to send exclusively to our mailing list next week. Join the mailing list to get more useful advice on software marketing (we won’t spam you, we promise).

10 Tips for Time-Crunched Marketing

I often find myself having just a few minutes before some scheduled engagement, and trying to find some way to be productive during that time. I’ve found that, as a programmer, having less than 30 minutes to devote to a task just isn’t adequate to get anything significant done, but when it comes to the marketing side of my businesses, I can usually find a good way to spend those few minutes. Here are some things I’ve found to be successful:

30 minute marketing

  1. Write a short blog post.
  2. Use Google blog search to find recent blog posts on topics that your customers read about. Make meaningful, useful comments on those posts (complete with links to your website, of course).
  3. Submit one of your webpages to criticue, then review 2-4 other websites. You’ll get 2-4 reviews of your own site, which you can use as a starting point for deciding what to A/B test.
  4. Brainstorm 10 ideas for future blog posts.

15 minute marketing

  1. Create a customer development survey on survey.io. Send a link to the survey to 50 random customers.
  2. If you haven’t already, set up a Facebook page and/or Twitter account for your business (separate from your personal one, unless you’re trying to build yourself as your brand).
  3. Post something interesting to your audience on Facebook or Twitter. It could be original content or a link to something interesting.

5 minute marketing

  1. Go to whatpageofsearchamion and search for something your customers would search for. See where your website is in the results. If you’re unhappy with the result, make a note to work on SEO for that phrase.
  2. Write down one idea for a blog post to write later.
  3. Find a tweet that’s interesting to your audience and retweet it.


5 Tips for A/B Testing Websites with Low Traffic

It seems that I’ve been seeing an unusually high number of blog posts on the same basic subject lately: The importance of letting your A/B test run long enough. Here’s the deal: It’s really important. It boils down to a statistics concept called “confidence interval.”

Confidence Interval Primer

Confidence interval is, roughly, a measure of how likely it is that your test result is actually meaningful rather than random noise.

In A/B testing packages like Google Content Experiments or Visual Website Optimizer, confidence interval is usually in a column called “Chance of Beating Original” or “Probability of Outperforming Original.” The funny thing about confidence intervals is that they have to be really high to actually have any meaning. A confidence interval of 80% is actually no better than a confidence interval of 50%. Statistics is kind of funny that way. In order to actually have a conclusive result in your A/B testing, you need to have a confidence interval of 95% or more.

I won’t go into the math of how that’s calculated. A/B testing software does that for us, and if you really want to know how it works, it’s easy enough to find on the internet.

But what does this have to do with how long your test runs? Well, confidence interval is related to the number of visits exposed to the experiment, and the conversion rates of each of the experiment pages. There are two ways to increase the confidence interval: Increase the number of visits, or increase the distance between the conversion rates.

For example, suppose a test runs as follows:

  • Variation A gets 45,135 visits, with 401 conversions, for a conversion rate of 0.89%
  • Variation B gets 46,012 visits, with 450 conversions, for a conversion rate of 0.98%

At first glance, it looks like variation B is the winner, since it has a higher conversion rate, but the confidence interval for this test is only 92%. This test is actually inconclusive! Statistically, there’s still a fair chance that tomorrow, the conversion rate will swing back the other direction and put variation A on top.

How can we get conclusive results?

After running the test a little longer, we get the following:

  • Variation A now has 86,143 visits, with 767 conversions. The conversion rate is still 0.89%
  • Variation B now has 92,151 visits, with 903 conversions. The conversion rate is still 0.98%

Now, however, the magic of statistics tells us that the confidence interval is 98%. With the same conversion rates, we now have a conclusive result. Variation B is the winner.

But what about the other scenario? What if the difference between conversion rates had come out differently? Suppose…

  • Variation A has 45,135 visits, with 401 conversions. The conversion rate is 0.89%
  • Variation B has 46,012 visits, with 471 conversions. The conversion rate is 1.02%

This is very similar to the first example above; variation B’s conversion rate is only 0.04 percentage points higher than before. BUT, this time the confidence interval is 98%, so we can say we have a conclusive result. Again, variation B is the winner!
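If you want to check numbers like those in the scenarios above yourself, the “chance of beating original” figure can be approximated with a one-sided two-proportion z-test. This is a sketch under a normal approximation; commercial testing tools may use different methods (some are Bayesian), so treat the output as illustrative:

```python
from math import erf, sqrt

def chance_of_beating(conv_a, visits_a, conv_b, visits_b):
    """Approximate P(variation B truly beats variation A) using a
    one-sided two-proportion z-test with pooled variance."""
    p_a = conv_a / visits_a
    p_b = conv_b / visits_b
    p_pool = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF of z

# The three scenarios from this post:
print(round(chance_of_beating(401, 45135, 450, 46012) * 100))  # 92 (inconclusive)
print(round(chance_of_beating(767, 86143, 903, 92151) * 100))  # 98 (conclusive)
print(round(chance_of_beating(401, 45135, 471, 46012) * 100))  # 98 (conclusive)
```

Note how the same 0.89% vs. 0.98% conversion rates go from inconclusive to conclusive purely by accumulating more visits.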

This is not new information, and there are many books, websites, and blogs that discuss A/B testing and confidence interval in greater detail. I used to read those and think “That would be nice, having 90,000 visits to A/B test on.” I’m running an A/B test on one of my websites right now that would take 3 years to get 90,000 visits! That’s way too long to wait for results.

Low traffic sites

So what do we do with our much more normal websites that get just a few hundred to a few thousand visits per month?

First option: Abandon A/B testing until you’ve got enough traffic to get some results.

Frankly, this is probably what you should do if you’re getting less than a few hundred visits each month. It’s not that A/B testing couldn’t eventually give you conclusive results; it’s that your time and energy are better spent getting more traffic to the website.

But once you’ve got a small but steady flow of traffic, here are 5 tips to get more conclusive A/B test results on a website with low traffic:

1. Test BIG things

The classic A/B testing example is testing the color of buttons to see which converts better. If you’re Amazon.com, and you get millions of views to your test every day, then you can test your button colors and try to eke out a 0.01% conversion increase. Remember, for tiny conversion differences, you need huge amounts of traffic. Instead, focus on things that are likely to yield very large differences in conversion rate. Put the call to action at the top of the page instead of the bottom. Test the big headline that might be causing people to lose interest and click “back” before they read any further. Test two completely different page layouts. Ask this question: “What might double my conversion rate?”
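A rough sample-size calculation shows why big changes matter so much. This sketch uses a standard normal-approximation formula with 95% significance and 80% power hard-coded; it is not what any particular testing tool uses, but it gives the right order of magnitude:

```python
from math import ceil

def visits_per_variation(p_base, p_new):
    """Rough visits needed per variation to detect a move from
    p_base to p_new (two-sided test, 95% significance, 80% power)."""
    z_alpha, z_beta = 1.96, 0.84  # z-scores for the significance/power above
    p_avg = (p_base + p_new) / 2
    d = abs(p_new - p_base)
    return ceil(2 * (z_alpha + z_beta) ** 2 * p_avg * (1 - p_avg) / d ** 2)

# A button-color-sized lift (1.0% -> 1.1%) needs enormous traffic:
print(visits_per_variation(0.010, 0.011))  # roughly 163,000 per variation
# A "test BIG things" doubling (1.0% -> 2.0%) is within reach:
print(visits_per_variation(0.010, 0.020))  # roughly 2,300 per variation
```

The required traffic grows with the square of the inverse of the difference you’re trying to detect, which is exactly why a low-traffic site should only test changes that could plausibly double conversions.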

2. Change the conversion

Conversion is often synonymous with “sale,” but it doesn’t have to be. Higher conversion rates generally mean a greater absolute difference between variations (because the difference increases proportionally), so if you track something that’s associated with sales but happens more often, you’ll get conclusive results faster.

For example, you might track something that leads directly to sales in fairly predictable ways, like free trial downloads, or viewing a demo page. You could also track something as simple as site interaction (a “conversion” could be any visitor who stays on site for more than 30 seconds, or who scrolls past a certain point on your long-copy page).

3. Buy more traffic

You can likely increase the traffic to your site temporarily by spending money on advertising. AdWords, Facebook, LinkedIn, and other pay-per-click advertising systems are generally accessible to people running lower-traffic websites, and if you’re willing to spend some money, you can drive a lot of relatively well-qualified traffic to your website pretty quickly. Chances are that when you stop the advertising campaign, the traffic will drop off, but if you take advantage of the opportunity, the money will have bought you a better website.

4. Don’t get lured in by multivariate testing

“Multivariate” testing is basically A/B testing with a twist: You provide multiple variations of several elements of the site, and the testing package tests all possible combinations until it finds one that’s significantly better than the others. For example, which converts better? The short headline with the long subtitle and the green button, or the long headline with the short subtitle with the yellow button, or the short headline with the short subtitle with the green button, or the… you get the idea.

Every time you add an element to a multivariate test, you exponentially increase the amount of traffic needed to get conclusive results. Multivariate testing is awesome, and for sites with a sufficiently large amount of traffic, it will yield faster results than iterative A/B testing. But for sites with fewer than tens of thousands of daily visits, you’re better off with plain old A/B testing.

5. Try qualitative testing instead

“Qualitative” testing is based on some perceived “quality,” as opposed to “quantitative” testing, which is based on cold, hard numbers. Qualitative testing is just a fancy term for asking people to tell you what they think. If you’re asking the right people, you can get very valuable information from qualitative tests.

UserTesting.com is a commercial service that gives you qualitative feedback about your website. For $39, you get a video of a visitor using your website and talking about what they’re thinking as they do it, and a written summary of their experience with your site. Personally, I’ve not used UserTesting.com, but I’ve heard very good things about it.

Criticue.com is a peer-reviewed qualitative testing system. You submit your website for reviews, and then you are asked to review other websites. For each website review you provide, you get a review of your site by someone else. I have used Criticue several times in the last few months, and found the reviews to be quite good. It’s also fun to look at other people’s websites from a critical standpoint and provide constructive criticism.

Qualitative testing is a very efficient way of getting info about your website. You get much more data from 1 visit than any A/B test would ever provide. HOWEVER…

You have to be careful about qualitative tests, because they can be wrong. A classic example is the “long copy sales page.” People say they don’t want to read all that, and that you should break it up into smaller chunks, and yet over and over again, A/B tests have shown that long copy pages tend to convert better (of course, you should test that on your market, because it isn’t always true, but you would be surprised at how often it is true). My advice: use qualitative testing to decide what A/B tests to run.

Good luck, and happy testing!