
May 17, 2017

Disproving Best Practices: The One- vs. Two-Column Form Test

Published in categories A/B testing, Daily, Landing Pages | Comments are closed

disproving-best-practices.png

A few months ago, I took the stage at Digital Summit Dallas to talk about blog conversion rate optimization (CRO). The session right before mine was led by Unbounce Co-Founder Oli Gardner — a household name for those of us in the CRO industry. Needless to say, it was a tough act to follow. 

In his session, “Frankenpage: Using A Million Little Pieces of Data to Reverse Engineer the Perfect Landing Page,” Oli shared lots of great data-backed tips for landing page optimization. In discussing best practices for conversion forms, he talked about how two-column forms weren’t ideal. 

What’s the Beef With Two-Column Forms?

Oli isn’t the only one to frown upon the use of two-column forms. Baymard Institute, a usability research company, published research on the topic a few years back, and ConversionXL Founder Peep Laja has also asserted that one-column forms perform better.

Peep’s colleague Ben Labay even published a study about the superiority of the one-column form over multi-column forms. The study showed that users complete the linear, single-column form an average of 15.4 seconds faster than the multi-column form. While speed is not directly tied to form completion, the data suggests that if the single-column form is faster to complete, fewer people will abandon it, garnering more conversions. It all boils down to user experience.

But Oli’s advice to avoid multi-column forms originally caught my attention because we had just redesigned HubSpot’s demo landing page, one of the most important landing pages on our website, and switched from a one-column to a two-column form in the process.

The thing that stuck out to me was that in switching to two columns, we had actually improved the conversion rate of our page by 57%. Now to be fair, the form wasn’t the only variable we manipulated in the redesign (we refreshed the design and made some copy tweaks as well), but it still made me wonder whether two-column forms were really all that bad.

So I put it to the test. 

The One- vs. Two-Column Form Test

Using HubSpot’s landing page A/B testing tools, I pitted the two-column form (the control) against the one-column form (the variant). Here’s how they looked …

Control (Two-Column Form)

demo-lp-control-two-column-form.png

Variant (One-Column Form)

demo-lp-variation-one-column-form.png

So “best practices” aside, which do you think performed better?

And the Winner Is …

not the one-column form. In fact, the two-column form converted 22% better than the one-column form, a statistically significant result at a 99% confidence level.

Surprised? I wasn’t. Just look at the length of that one-column form! Yes, HubSpot’s lead-capture forms are long (13 fields to be exact), but they’re long by design. Through our experience, we’ve learned that having more fields helps us better qualify our leads, and weed out unqualified ones.

But a 13-field form doesn’t exactly lend itself to a one-column design, which is why I think for us, the two-column form works better. The theory is that the one-column form, despite having the same number of fields, looks longer, so visitors are much more likely to get scared off before completing it.

Since we ran the test, we’ve actually switched to a kind of hybrid form, with elements of both a one- and two-column form, to make our two-column form a bit more user friendly. Our old two-column form is on the left, and our new hybrid form is on the right.

two-column-vs-hybrid-form.png

Questioning “Best Practices”

Any CRO worth their salt knows there’s really no such thing as best practices, and that you should test everything yourself (which, coincidentally enough, was a major theme in the talk I delivered after Oli’s).

In fact, Oli and Peep will be the first ones to tell you that while they may share certain CRO findings and trends from their experience, there are no sure things. That’s why testing things for yourself is so important. What works better for one site might not necessarily work better for yours; that’s fundamental to CRO.

And in my opinion, running those tests to figure out what works for you is what makes conversion rate optimization so much fun. Especially when the results challenge what the experts say 😉 



Apr 7, 2017

The Beginner's Guide to Conversion Rate Optimization (CRO)

Published in categories A/B testing, Daily, marketing automation, SEO | Comments are closed

beginners-guide-conversion-rate-optimization-compressed.jpg

Today, most marketing teams are structured to drive traffic towards websites, which then converts into leads for the sales team to close. Once this process starts to deliver results, marketers then seek to generate even more traffic, and hopefully even more success.

An oversimplification, but that’s the standard marketing playbook. Few marketing teams focus on getting more from existing traffic. That’s where conversion rate optimization (CRO) comes in.

In this blog post, we’ll teach you all about CRO — what it achieves, why you should do it, and how your team can execute it. We’ll explain how you can drive more results from your existing traffic so your content can work smarter, and not harder, for you.

What Is Conversion Rate Optimization (CRO)?

I’m glad you asked. Many websites are designed to convert website visitors into customers. These conversions occur all over the website — on the homepage, pricing page, blog, and landing pages — and all of these can be optimized for a higher number of conversions. The process of optimizing those conversions is exactly what CRO entails.

CRO is a huge, often untapped opportunity for marketing teams, and you might be surprised by the outsized impact you could deliver by fine-tuning your website for conversions.

When Is Conversion Rate Optimization (CRO) Right for Your Business?

Once your sales and marketing engine attracts website visitors who consistently convert into leads for your sales team, you should start thinking about CRO.

Most businesses have a finite demand for products and services, so it’s imperative that you make the most out of your existing website traffic. Tools like Google’s Global Market Finder can show you online search volume to give you an idea of your potential customer demand. Once you determine the threshold of your customer demand, it’s time to nail down how to get more out of your existing website traffic.

Below are three formulas to help you figure out how to tackle CRO at your company, and what goals to set:

  1. New revenue goal ÷ average sales price = # of new customers
  2. # of new customers ÷ lead-to-customer close rate (%) = lead goal
  3. (Leads generated ÷ website traffic) × 100 = conversion rate (%)

To help you understand the impact CRO could have on your business, here’s an example of the formulas in action.

If your website has 10,000 visitors per month who generate 100 leads and, subsequently, 10 customers each month, the website visitor-to-lead conversion rate would be 1%.

But what if you wanted to generate 20 customers each month? You could try to get 20,000 visitors to your website and hope that the quality of traffic doesn’t decrease. Or, you could get more leads from your existing traffic by optimizing your conversion rate.

If you increased the conversion rate from 1% to 2%, you’d double your leads and your customers.
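
To make the arithmetic concrete, here is a minimal Python sketch of the three formulas using the numbers from the example above; the average sales price and revenue goal are made-up figures for illustration only.

  # Illustrative figures only; swap in your own numbers.
  monthly_traffic = 10_000
  leads_generated = 100
  close_rate = 0.10            # 10 of 100 leads become customers
  avg_sales_price = 1_000      # hypothetical average sales price
  new_revenue_goal = 20_000    # hypothetical revenue target

  # 1) New revenue goal / average sales price = # of new customers
  customers_needed = new_revenue_goal / avg_sales_price        # 20

  # 2) # of new customers / lead-to-customer close rate = lead goal
  lead_goal = customers_needed / close_rate                    # 200

  # 3) (Leads generated / website traffic) x 100 = conversion rate (%)
  current_rate = leads_generated / monthly_traffic * 100       # 1.0%
  required_rate = lead_goal / monthly_traffic * 100            # 2.0%

  print(f"Current conversion rate:  {current_rate:.1f}%")
  print(f"Required conversion rate: {required_rate:.1f}%")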

The table below shows the impact of increasing your website’s conversion rate:

                            Company A   Company B   Company C
  Monthly website traffic   10,000      10,000      10,000
  % conversion rate         1%          2%          3%
  Leads generated           100         200         300
  # of new customers        10          20          30

The key point here? Trying to generate more website traffic isn’t necessarily the right approach. Think of it like a leaky bucket. Pouring more water into a leaky bucket won’t fix the root cause — you’ll just end up with a lot of waste. Conversion rate optimization is about getting more from what you have and making it work even better for you.

Ready to take the first steps towards CRO at your company? Check out the strategies below, and start testing.

8 Conversion Rate Optimization Strategies to Try

1) Create text-based CTAs within blog posts.

While it’s good practice to include a call-to-action (CTA) in your blog post, these sometimes fail to entice people to take the desired course of action. Banner blindness is a very real phenomenon as people become accustomed to ignoring banner-like information on websites. This lack of attention, coupled with the fact that website visitors don’t always read to the bottom of a blog post as they “snack” on content, means a new approach is required.

That’s where the text-based CTA comes in handy. Here at HubSpot, we ran a test with text-based CTAs — a standalone line of text linked to a landing page and styled as an H3 or an H4 — to see if they would convert more traffic into leads than regular CTAs at the bottom of a web page. Here’s one of ours below:

Manage and plan your social media content with the help of this free calendar template.

In HubSpot’s limited test of 10 blog posts, regular end-of-post banner CTAs contributed an average of just 6% of leads that the blog posts generated, whereas up to 93% of a post’s leads came from the anchor text CTA alone.

2) Include lead flows on your blog.

Another test you should consider is including lead flows on your blog. Essentially, these are high-converting pop-ups designed to attract attention and offer value. You can select from a slide-in box, drop-down banner or pop-up box, depending on your offer. We experimented with the slide-in box on the HubSpot blog, and it achieved a 192% higher clickthrough rate, and 27% more submissions than a regular CTA at the bottom of a blog post.

Head over to the HubSpot Academy to learn how to add lead flows to your blog posts. They can dramatically increase conversions on your website.

3) Run tests on your landing pages.

Landing pages are an important part of the modern marketer’s toolkit. A landing page is where a website visitor becomes a lead, or an existing lead engages more deeply with your brand. These pages play an important role on your website, so you should run A/B tests to get the most from them.

But what should you A/B test? We know that a high performing landing page can have a tremendous impact on a business, so at HubSpot, we make it easy to test variants and eke out more conversions. You can quickly and easily test website copy, content offer, image, form questions, and page design. Check out these tips for effective A/B testing and our A/B testing calculator.

4) Help leads to immediately become a marketing-qualified lead (MQL).

Sometimes, your website visitors want to get straight down to business and speak with a sales rep, rather than be nurtured by marketing offers. You can make it easy for them to take this action (and immediately become a marketing qualified lead) with a combination of thoughtful design and smart CTAs.

Compelling, clear copy has the ability to drive action and increase conversions for your business. But which actions do you want to encourage so visitors can become MQLs?

Here at HubSpot, we discovered that visitors who sign up for product demos convert at higher rates than visitors who sign up for free product trials, so we optimized our website and conversion paths for people booking a demo or a meeting with a sales rep. Admittedly, this depends on your product and sales process, but our best advice is to run a series of tests to find out what generates the most customers. Then, optimize for that process.

The key takeaway is to look for ways to remove friction from the sales process. That being said, if you make it easy for people to book a meeting with sales reps, we do recommend further qualification before the call takes place, so the sales rep can tailor the conversation.

5) Build workflows to enable your sales team.

There are a number of automated workflows you can create that your colleagues in sales will thank you for. For instance, did you know it’s possible to send emails on behalf of sales reps, so leads can book a meeting with them at the click of a button? Or that sales reps can receive an email notification when a lead takes a high intent action, such as viewing the pricing page on your website? And if you work in ecommerce, you can send an email to people who abandon their shopping cart.

All of this is possible with marketing automation. Want to learn more? Master marketing automation with our helpful guide.

6) Add messages to high-converting web pages.

With HubSpot’s messages tool, it’s now possible to chat with website visitors in real-time. To increase conversions, you should add messaging capabilities to high-performing web pages, such as pricing or product pages, so leads convert rather than leave.

You can also make chatting action-based. For example, if someone has spent more than a minute on the page, you may want to automatically offer to help and answer any questions they may have.

HubSpot’s messages tool is coming in the spring of 2017, but you can apply to join the beta program here.

7) Optimize high-performing blog posts.

If you’ve been blogging for more than a year, it’s likely you’ll have some blog posts that outperform others.

The same is true at HubSpot — in fact, the majority of our monthly blog views and leads come from posts published more than a month ago. Blog posts are a big opportunity for conversion rate optimization.

To get started, identify the blog posts with high levels of web traffic, but low conversion rates. It may be that the content offer you’re promoting isn’t aligned with the blog post’s content, or your CTA could be unclear.

In one instance, we added a press release content offer to a blog post about press releases and saw conversions for that post increase by 240%.

Additionally, you should look at blog posts with high conversion rates. You want to drive more qualified website traffic to those posts, and you can do that by optimizing the content for search engines or updating the content to ensure that it’s fresh and relevant. If you’re a HubSpot customer, you can drive traffic to these pages from LinkedIn and Facebook using the ads add-on.

8) Leverage retargeting to re-engage website visitors.

It doesn’t matter what your key conversion metric is: The cold, hard truth is that most people on your website don’t take the action you want them to. By leveraging retargeting (sometimes known as remarketing), you can re-engage people who’ve left your website.

Retargeting works by tracking visitors to your website and serving them online ads as they visit other sites around the web. This is particularly impactful when you retarget people who visit high-converting web pages.

The normal inbound rules still apply — you need well-crafted copy, an engaging image and a compelling offer for retargeting to work. If you’re a HubSpot customer, you should take a look at how the AdRoll integration can improve your conversion efforts.

How to Get Started with Conversion Rate Optimization (CRO)

We’ve shared a ton of information in this post, and at this point, you may be thinking, “where should I start?”

Here’s where the PIE framework comes in. Before starting a CRO project, we recommend prioritizing through the lens of PIE — rank each project based on its potential, importance, and ease. We used this framework at HubSpot with great results.

You should use this framework to answer the following questions for every strategy outlined in the previous section. Assign each strategy a score between one and 10 (with one being the lowest and 10 being the highest):

  1. How much total improvement can this project offer?
  2. How valuable will this improvement be?
  3. How easy or difficult will it be to implement this improvement? (Score easier projects higher.)

Once you’ve assigned a score for each strategy, add up the three numbers and divide the total by three; this gives a score that shows which project will have the greatest impact. Then, work on the projects with the highest scores first. The framework isn’t perfect, but it’s easy to understand, systematic, and a great way to communicate to the rest of your colleagues which CRO projects are being selected and why.
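
To make the scoring step concrete, here is a minimal Python sketch; the project names and scores are hypothetical and purely for illustration.

  # Hypothetical projects with 1-10 scores for potential, importance, and ease.
  projects = {
      "Text-based CTAs in blog posts": (8, 7, 9),
      "Lead flows on the blog":        (7, 6, 8),
      "Landing page A/B tests":        (9, 8, 5),
      "Retargeting campaign":          (6, 7, 4),
  }

  # PIE score = (potential + importance + ease) / 3
  pie_scores = {name: round(sum(s) / 3, 1) for name, s in projects.items()}

  # Work on the highest-scoring projects first.
  for name, score in sorted(pie_scores.items(), key=lambda kv: kv[1], reverse=True):
      print(f"{score:4}  {name}")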

Want to learn more about the PIE framework? Take a look at this explanation from WiderFunnel.

What’s next?

There are a lot of “best practices” out there, but ultimately, you need to find out what your customers respond to, and what drives results for your business. Here are three follow-up actions to get started with CRO today:

  1. Use the three formulas to start the CRO conversation.
  2. Leverage the PIE framework to help prioritize your strategy.
  3. Make CRO someone’s responsibility.

What CRO strategies does your business leverage? Share with us in the comments below.


Nov

22

2016

4 Common A/B Testing Mistakes (And How to Fix Them)

AB Testing Mistakes Carl.jpg

When you’re creating content for the web, it’s easy to make assumptions about what you think your audience might respond to — but that’s not necessarily the right mentality.

Enter A/B testing: one of the easiest and most popular forms of conversion rate optimization (CRO) testing known to marketers. And while many businesses have seen the value in using this type of validation to improve their decision making, others have tried it, only to be left with inconclusive results — which is frustrating, to say the least. Download our free introductory guide to A/B testing here.

The trouble is, small mistakes made during A/B testing can lead to round after round of incremental optimizations that fail to produce meaningful results. To combat that, I’ve outlined some of the most common A/B testing mistakes (as well as their remedies) below. These tips are designed to help you keep your testing plans on track so you can start converting more visitors into customers, so let’s dive in …

4 Common A/B Testing Mistakes (And How to Fix Them)

Problem #1: Your testing tool is faulty.

Popularity is a double-edged sword — it’s true for high schoolers and it’s true for A/B testing software.

The ubiquity of A/B testing has led to a wide range of awesome, low-cost software for users to choose from, but it’s not all of equal quality. Of course, differing tools offer differing functionality, but there can also be some more tricky differences between tools. And if you’re unaware of these differences, your A/B tests may be in trouble before you even get started.

For example, did you know that some testing software can significantly slow down your site? This decrease in speed can have a harmful impact on your site’s SEO and overall conversion rates.

In fact, on average, just one second of additional load time will result in an 11% decrease in page views and a 7% decline in conversions. This creates a nightmare scenario where the websites you were hoping to improve through A/B testing are actually hindered by your efforts.

It gets worse: Your choice of A/B testing software can actually impact the results of your tests, too. Entrepreneur and influencer Neil Patel found that the A/B testing software he was using showed significant differences between variants, but when he implemented the new page, conversions didn’t change. His problem turned out to be a faulty testing tool.

So with all these hidden pitfalls, what can you do to make sure your A/B testing software is working fine?

The Fix: Run an A/A test.

Prior to running an A/B test, you should run an A/A test with your software to ensure it’s working without impacting site speed and performance.

For the uninitiated, an A/A test is very similar to an A/B test. The difference is that in an A/A test both groups of users are shown the exact same page. That’s right, you need to literally test a page against itself. While this may seem silly at first, by running an A/A test you will be able to identify any distortionary effects caused by your testing software.

An A/A test is the one time you want your results to be boring. If you see conversion rates drop as soon as you start testing, then your tool is probably slowing down your site. If you see dramatic differences between the results for the two pages, then your software is likely faulty.
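
If you want to sanity-check the numbers behind an A/A test yourself, a simple two-sample significance test on the exported visit and conversion counts will do; the sketch below uses made-up figures and assumes both 'variants' served the exact same page.

  from scipy.stats import chi2_contingency

  # Hypothetical A/A results: both groups saw the identical page.
  visitors_a, conversions_a = 5_000, 150
  visitors_b, conversions_b = 5_000, 162

  # 2x2 table of converted vs. not converted for each group.
  table = [
      [conversions_a, visitors_a - conversions_a],
      [conversions_b, visitors_b - conversions_b],
  ]

  chi2, p_value, dof, expected = chi2_contingency(table)
  print(f"p-value: {p_value:.3f}")

  # In an A/A test you *want* a boring result: a large p-value here.
  # A consistently tiny p-value, or a visible drop in overall conversion
  # rate after installing the tool, suggests the testing software itself
  # is distorting your results.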

Problem #2: You stop testing at the first significant result.

This is the statistical equivalent of taking your ball and going home. Unfortunately, when it comes to A/B testing, stopping your test as soon as you see a statistically significant result isn’t just bad sportsmanship; it also produces completely invalid results.

Many tools encourage this behavior by allowing users to stop a test as soon as statistical significance has been hit. But if you want to drive real improvement to your site, you need to fight the urge to end your tests early. This may seem counterintuitive, but the more often you check your test for significant results, the more likely you are to see incorrect results.

The issue here is false positives: results that incorrectly show a difference between pages. The more often you check your results, the more likely you are to hit a result that has been thrown off by a false positive.

This isn’t an issue if you stay calm and don’t end your test early. However, if you end your test at the first sign of a significant result then you’ll likely fall victim to deceptive false positive outcomes.

Analytics firm Heap published the results of a simulation, which displays how ending your test early compromises your results.

Using standard significance testing, if the results of a 1,000-user test are checked once, there is a 5% chance of a false positive. If the tester checks the same group of users 10 times, the chance of a false positive balloons to 19.5%. If checked 100 times, that 5% chance increases roughly eightfold to 40.1%.

These are good numbers to remember next time you get excited about early promising results.
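
You can reproduce the "peeking" effect with a quick simulation. The sketch below runs many A/B tests where both variants are identical and counts how often any interim check falsely reaches p < 0.05; the parameters (1,000 users per variant, a 10% base conversion rate, 2,000 simulated tests) are assumptions chosen for illustration, but the pattern matches the Heap figures above: the more often you check, the more false positives you get.

  import numpy as np
  from scipy.stats import norm

  rng = np.random.default_rng(42)

  def false_positive_rate(n_users=1_000, base_rate=0.10, n_checks=10, n_sims=2_000):
      """Simulate A/B tests with two identical variants and report how often
      any interim check wrongly shows significance at p < 0.05."""
      checkpoints = np.linspace(n_users / n_checks, n_users, n_checks, dtype=int)
      hits = 0
      for _ in range(n_sims):
          a = rng.random(n_users) < base_rate   # conversions for variant A
          b = rng.random(n_users) < base_rate   # identical "variant" B
          for n in checkpoints:
              pooled = (a[:n].sum() + b[:n].sum()) / (2 * n)
              se = np.sqrt(2 * pooled * (1 - pooled) / n)
              if se == 0:
                  continue
              z = abs(a[:n].mean() - b[:n].mean()) / se
              if 2 * norm.sf(z) < 0.05:         # "significant" at this peek
                  hits += 1
                  break
      return hits / n_sims

  for checks in (1, 10, 100):
      print(f"{checks:3} checks -> false positive rate of about {false_positive_rate(n_checks=checks):.0%}")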

The Fix: Stick to a predetermined sample size.

To combat false positives, discipline is key. You should set a sample size in stone prior to running an A/B test and resist the urge to end your test early (no matter how promising your results look).

Don’t fret if you’re scratching your head on how large your sample needs to be. There are plenty of tools available online for calculating a minimum sample size. Some of the most popular are from Optimizely and VWO.

One last note on sample size: Keep in mind that you’ll need to pick a realistic number for your page. While we would all love to have millions of users to test on, most of us don’t have that luxury. I suggest making a rough estimate of how long you’ll need to run your test before hitting your target sample size.
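
If you'd rather sanity-check those calculators yourself, here is a rough sketch of the standard two-proportion sample size formula; the baseline conversion rate and minimum detectable effect below are placeholder assumptions you'd replace with your own numbers.

  from math import ceil
  from scipy.stats import norm

  def sample_size_per_variant(baseline_rate, min_detectable_effect,
                              alpha=0.05, power=0.80):
      """Approximate visitors needed per variant for a two-sided test."""
      p1 = baseline_rate
      p2 = baseline_rate + min_detectable_effect
      z_alpha = norm.ppf(1 - alpha / 2)   # e.g. 1.96 for 95% confidence
      z_beta = norm.ppf(power)            # e.g. 0.84 for 80% power
      variance = p1 * (1 - p1) + p2 * (1 - p2)
      return ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

  # Example: detect a lift from a 2% to a 3% conversion rate.
  n = sample_size_per_variant(baseline_rate=0.02, min_detectable_effect=0.01)
  print(f"About {n:,} visitors per variant")   # on the order of a few thousand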

Problem #3: You’re only focusing on conversions.

When you’re deep in the weeds of an A/B test, it’s easy to focus on the trees and miss the forest. Put more literally, in A/B testing, it is easy to concentrate only on conversions and lose sight of the long-term business results produced.

While adding new copy to your site may produce higher conversion rates, if the converted users are of lower quality then a higher conversion rate may actually create a negative result for the business.

It can be easy to fall victim to vanity metrics while A/B testing, yet these metrics will distract your focus away from the actual revenue-driving results. If you’re testing a call-to-action that leads to a landing page, you should not just focus on conversions to the landing page. Instead, measure the leads produced from the page and ideally try to tie those leads to the revenue they produce.

The Fix: Test a hypothesis.

Before you start your A/B test you should outline a hypothesis you wish to validate or disprove. By focusing this hypothesis on a KPI that drives actual business results, you’ll avoid being distracted by vanity metrics.

Your A/B test should be judged on its ability to affect this KPI, and not its impact on other associated figures. So if your goal is to increase sign-ups, always judge success by measuring sign-ups, not on clickthrough rates to the sign-up page.

When working to validate or disprove your hypothesis, don’t just throw out any results that aren’t statistically significant — use these results to inform your later tests, instead. For example, if a change to your page’s CTA showed a small, statistically insignificant improvement, then this could be a sign that you might be onto something. Try running further tests on your CTA and see if you can hit on one that produces a significant improvement.

Problem #4: You only test incremental changes.

The button color test may have ruined A/B testing, as this test’s popularity has made it the frame of reference for understanding how A/B testing should be utilized. But there’s more to the practice than that. In fact, while a large website might see a big return from adjusting something small like button color, for the vast majority of us, these small, incremental changes are not going to produce meaningful results.

A/B testing can force us to aim for miniscule improvements, but by focusing only on the incremental, we may be missing a much larger opportunity.

The Fix: Periodic radical testing.

A good rule of thumb? Periodically test radical changes to your page. (This practice has come to be known as radical testing.) If you’re seeing weak conversion rates, it’s probably a sign you should invest time in testing a radical change rather than incremental ones.

Think of your testing efforts like a poker game: you’ll need to bet big periodically if you want to see a big return.

But before you run off singing the praises of radical testing, be aware that it has some drawbacks. First, it requires more upfront labor than standard A/B testing, since you have to invest time drafting a major page redesign. Because of this time investment, I recommend conducting radical tests only periodically.

An additional pitfall of radical testing is that it makes it hard to pinpoint which factors are having the largest impact on your site. Radical testing can tell you whether a large page overhaul will affect your conversions, but it won’t let you pinpoint which individual changes are driving those results, so keep that in mind before you get started.

These are a few of the most common A/B testing mistakes, but there are many, many more. Share some of the missteps you’ve seen in the comments below.



May 9, 2016

6 Conversion Experts Answer 20 of Your Most Important CRO Questions [Live Google Hangout]

Blog_header_image_-_resized_1.jpg

Whether you’re new to marketing or decades into your career, conversion rate optimization is an ever-changing topic and necessary asset in your marketing playbook.

Looking to learn more about your audience? Want to manipulate your existing resources to improve their performance? How about growing your business by improving lead flow? Wouldn’t that be nice?

An effective CRO strategy can help you achieve all that — without forcing you to crank out a bunch of new content. 

In this live Google Hangout, these six experts will teach you the most up-to-date CRO strategies and how to use different methods to get results. With your help building the agenda, we’re going to play “20 Questions” with today’s top CRO experts and learn how to start, where to start, and when to stop testing and optimizing your marketing efforts for lead conversion. 

  • When: Wednesday 6/1 @ 2 p.m. ET // 5 p.m. GMT // 9 a.m. PT for one hour
  • Where: Live Google Hangout
  • Hashtag: #CROhangout

Want to learn more about conversion rate optimization? Click here to save your seat for this live event.

Meet the Conversion Experts

Rand Fishkin, Wizard of Moz

Rand Fishkin

Rand Fishkin uses the ludicrous title, Wizard of Moz. He’s founder and former CEO of Moz, board member at presentation software startup Haiku Deck, co-author of a pair of books on SEO, and co-founder of Inbound.org. Rand’s an unsaveable addict of all things content, search, and social.

Larry Kim, Founder & CTO, Wordstream

Larry Kim

Larry Kim founded WordStream in 2007. He bootstrapped the company by providing internet consulting services while funding and managing a team of engineers and marketers to develop and sell software for search engine marketing automation. Today he serves as company CTO and contributes to both the product and marketing teams.

Oli Gardner, Co-founder, Unbounce

Oli Gardner, Unbounce

Unbounce Co-Founder Oli Gardner has seen more landing pages than anyone on the planet. His disdain for marketers who send campaign traffic to their homepage is legendary. He is a prolific webinar guest and writer, and speaks internationally about Conversion-Centered Design where he is consistently ranked as the top speaker. 

Peep Laja, Founder, ConversionXL

Peep_Laja.jpg

Peep is an entrepreneur and conversion optimization expert with 10+ years of global experience. He has extensive experience across verticals: In the past he’s run a software company in Europe, an SEO agency in Panama, a real estate portal in Dubai, and worked for an international non-profit. 

Pamela Vaughan, Principal Optimization Marketing Manager, HubSpot

Pamela Vaughan

As HubSpot’s principal optimization marketer, Pam currently manages large-scale CRO and SEO projects (with an expertise in blog/content optimization) on the HubSpot marketing team’s new optimization team. Her team’s goal is to optimize and grow traffic and conversions from HubSpot’s various marketing assets.

Michael Aagard, Senior Conversion Optimizer, Unbounce

Michael Aagard

For seven years, Michael has spent about 60 hours a week testing and optimizing websites to gain a deeper understanding of what really works in Online Marketing and CRO. He’s helped a multitude of clients from all over the world make more money online. In July 2015 he quit his career and joined Unbounce as Senior Conversion Optimizer. 

Moderated by: Meghan Keaney Anderson, VP of Marketing, HubSpot

Meghan Keaney Anderson

As Vice President of Marketing at HubSpot, Meghan leads the content, product marketing, and customer marketing teams. Together with her teams, she’s responsible for the company’s blogs, podcast, and overall content strategy, as well as the company’s product launch and customer demand campaigns.



Mar 14, 2016

13 Case Studies That Prove the Power of Word Choice

Word_Choice_Conversion_Tests.jpg

Your website copy is responsible for more than just presenting your visitors with basic information. In fact, your words alone have the ability to influence how visitors feel about your brand, what they choose to click (or not click), and how your site ranks in search engines.

How can you ensure that it’s working the way you want it to? Testing. Sometimes the tiniest change in word choice can have a major impact on your conversion rates. While the difference between a checkout button that reads “Add to Cart” and one that reads “+Cart” might seem insignificant, it can significantly alter your website’s performance.

Still not convinced? Check out these 13 case studies that prove word choice does indeed matter when it comes to optimizing for conversions.

13 Case Studies That Prove Word Choice Matters for Conversions

1) Fab Increased Cart Adds by 49%

In order to optimize their sales, the online retail community Fab experimented with the “Add to Cart” button on their site. After changing the button from one that showed a picture of a shopping cart to one that spelled out the words “Add to Cart,” the website saw a 49% increase in cart adds.

fab.png

Source: Optimizely

Takeaway: In trying to increase conversions, the more obvious you can be, the better. Always make your desired action as easy as possible to achieve, and leave nothing to be inferred.

2) NuFACE Increased Sales by 90%

In an effort to increase sales, an anti-aging skin product company called NuFACE decided to offer free shipping on all orders over $75. The result was a 90% increase in sales.

Nuface_Variation.png

Source: VWO

Takeaway: The word “free” still works like magic. According to author Dan Ariely, it serves as a powerful emotional trigger that can cause us to instantly change our behavior. If you can find a legitimate way to include it on your website, you might just see a big change in conversions.

3) Monthly1K Increased Sales by 6.5%

Monthly1K — a software solution for entrepreneurs — wanted to increase the number of online courses they were selling. To do this, they experimented with a new headline that read: “How to Make a $1,000 a Month Business.” This replaced the old headline that read: “How to Make Your First Dollar.”

Monthly1K.png

Source: AppSumo

The result was nearly a 6.5% increase in sales.

Takeaway: Most people are optimists, and if you can promise them big returns on their investment, then invest they will. As long as you can back up these promises, don’t be afraid to use them to increase your sales.

4) GoCardless Increased Conversions by 139%

GoCardless, a UK-based direct debit provider, tested two different versions of their CTA, each differing by only one word. The first read “Request a demo,” while the second read “Watch a demo.”

gocardless.png

Source: GoCardless

Ultimately, the company found that the second version led to a 139% increase in conversions.

Takeaway: Some words have a connotation that can cause visitors to hesitate. The word “request” draws up images of having to fill out forms and wait for responses, while the word “watch” implies a far quicker and more direct process. Pay careful attention to the emotional connection of each word you use, especially when it comes to your CTA.

5) JCD Increased Conversions by 18%

No matter what you’re selling, writing engaging copy is vital. The iPhone repair service JCD, for instance, saw this first hand when they replaced their straightforward, factual web copy with an entertaining and humorous description of their services.

jcd.png

Source: Copyhackers

The result? Almost an 18% increase in conversions.

Takeaway: It’s important to include specific details, but be sure you’re writing them in a way that’s entertaining and engaging. Even giving your copy a small dose of personality can drastically increase the chances of it actually being read by a visitor.

6) Pink Pest Services Increased Conversions by 96%

The pest removal service, Pink Pest Services, decided to alter their advertisement so both the header and the copy below it were focused on their free quote, rather than having the header focus on the quote and the copy focus on a free report.

pink_pest_service.png

Source: Marketing Results

The result was a 96% increase in conversions.

Takeaway: Your copy needs to be consistent. Expounding on your first offer is far more effective than transitioning to a second offer under the same heading.

7) DaFlores Increased Sales by 27%

DaFlores is a company that sells and ships fresh flower arrangements through their website. In an effort to improve sales, the company added a sense of urgency by displaying the text “Order in the next [x] hours for delivery today.” In exchange, they saw a 27% increase in sales.

Takeaway: Oftentimes, urgency is the key to driving impulse purchases. By adding a sense of urgency to your website — whether through a limited-time sale or through guaranteed delivery, as DaFlores did — you can open up an opportunity to drastically increase your number of conversions.

8) Stride Increased Conversions by 112%

The folks at Stride decided to switch up their abandoned cart emails by focusing more on the customer and their needs, rather than the company. Notice the consistent use of “you” and “your” in the email copy below.

stride.png

Source: AWeber

As a result, they were able to rack up a 112% increase in conversions.

Takeaway: While it can be incredibly tempting to tout the accomplishments and strong points of your company, website, or services in your copy, you would, in general, be far better served by staying focused on the customer.

9) Raileasy Increased Email Opens by 31%

In this case, an online travel agency, Raileasy, wanted to improve the effectiveness of the emails sent to customers who abandoned their shopping carts. To do this, they used personalization to customize email subject lines based on the name of the destination the customer was shopping for.

RailEasy.png

Source: Which Test Won

As a result, Raileasy saw a 31% increase in email opens.

Takeaway: People are obviously interested in the item they added to their cart, or they never would have made it that far. Reminding them of that interest is a more effective way of re-engaging them than a generic abandoned cart email.

10) TextMagic Increased Conversions by 38%

The goal of one of TextMagic’s CTAs was to send people from the company’s homepage to its pricing page. The original text in their CTA read “Buy SMS Credits,” but when TextMagic changed this to read “View SMS Credits,” they saw nearly a 38% increase in conversions.

Takeaway: People are always a little wary of pressing buttons on the internet. If your text leads them to believe that pressing a button equals buying a product, they’re often less likely to follow through. Instead, ease hesitant customers through the process with more comforting language.

11) L’Axelle Increased Cart Adds by 93%

L’Axelle’s goal was to increase the number of underarm sweat pads being sold on their website. To do so, they changed their ad copy from “Feel fresh without sweat marks” to a more action-oriented phrase: “Put an end to sweat marks!”

laxelle.png

Source: Kissmetrics

The result was a 93% increase in items added to cart.

Takeaway: It’s called a call-to-action for a reason. By making your copy direct and action-oriented, you’ll leave your visitors feeling more inspired and compelled to follow through.

12) Betfair Increased Conversions by 7%

The online betting service, Betfair, tested six different persuasion tactics on their website, including reciprocity, scarcity, commitment and consistency, liking, authority, and social proof.

While each page that employed these specific tactics fared better than the control group, the top performer was the website that employed social proof by pointing out the number of “Likes” Betfair had on their Facebook page.

betfair.png

Source: VWO

This variation in particular enjoyed a 7% increase in conversions over the control.

Takeaway: Social proof is a powerful tactic. If people believe that there’s a buzz about your product or service, they’re more apt to want to join the party themselves. Don’t be quick to rule out other persuasion tactics, though. All six of the strategies tested proved effective for Betfair, and which one works best for you will depend on your specific website and goals. (Click here for tips on how to add social proof to your landing pages.)

13) Bloomspot Increased Conversions by 20%

Last up, Bloomspot took the surprisingly obvious step of making the text on their landing pages match the images on their landing pages. For example, if the image was of a restaurant, the text mentioned deals on top restaurants in the area.

bloomspot.png

Source: Kissmetrics

As a result, they saw a 20% boost in conversions.

Takeaway: Images and text are two different means of conveying a message. When both your images and your text convey the same message, you’re giving your visitors a double dose of your CTA and increasing the likelihood that they’ll convert.

As these case studies make clear, word choice matters — whether it’s in your CTAs, your headlines, or your body copy. Take these lessons to heart and use them to identify areas on your own site that may be falling victim to unclear or uninspired language.

Have another great case study to add to this list? Share your suggestions by leaving a comment below.


Feb 17, 2016

24 Conversion Rate Optimization Tools for Research, Feedback, Analytics & More

Conversion_Rate_Tools.jpg

Believe it or not, driving traffic to your website — albeit challenging — isn’t enough to sustain your business. In an effort to truly leverage that investment in traffic, marketers must use conversion rate optimization, or CRO, to convince said traffic to complete a desired action.

In some cases, these optimization techniques might be as basic as changing the color of a CTA. In other cases, there’s a lot more that can be improved. 

The list below outlines a ton of helpful tools for marketers who are looking to optimize their conversion rates. To help you understand which tools are used for what, we’ve also broken this list into a few major categories: lead capture tools, research tools, analytics tools, mouse tracking and heat maps, feedback tools, and experiment tools. 

From high-level changes like landing page and email design and inspiration to in-depth insights on how your visitors navigate through your content, these tools will help you improve the performance of your site.

Ready? Let’s get converting …

Lead Capture Tools

These are the tools that you will use to capture more leads on your site, thus improving your CRO. While most conversion-focused content has a built-in form or CTA, these tools act as additional lead capture mechanisms to boost the number of leads that convert on your content.

1) HelloBar

Price: Free for Basic, $12/mo for Pro, $83/mo for Enterprise

HelloBar is a lead capture tool that allows you to add a popup form to your website to grow your email list, promote your social pages, showcase a sale, or other lead generation strategies. The free version allows you to create one modal that’ll be shown to every tenth visitor. However, premium plans offer more advanced call-to-action options. 

HelloBar.png

2) SumoMe

Price: Free for Basic, $10/mo for Starter, $100/mo for Pro

SumoMe offers a suite of free tools to help you increase your site conversions. For lead capture, it offers a ‘Welcome Mat’ popup CTA, a ‘Smart Bar’ to increase email subscribers, a scroll-triggered box, and a ‘Contact Us’ form. Along with their Google Analytics research tools, the SumoMe suite helps you gain on-page insights and increase your email list.

SumoMe.png

3) Leadin

Price: Free!

This tool is like Google Analytics meets SumoMe meets a CRM. Sounds cool, right?

Leadin starts with a popup CTA, then syncs with your website’s existing forms to learn about your site visitors and their path through your pages. It gives you in-depth contact insights on both prospects and current contacts in your database. Leadin also pairs its contacts database with a dashboard that gives you a high-level view of which marketing efforts are paying off and converting, and which ones aren’t.

Leadin-3.png

Research Tools

Before you create any content, you’ll want to call upon these tools to draw inspiration and check out what other smart marketers have seen success with in the past.

4) BuzzSumo

Price: $99/mo for Pro, $299/mo for Agency, $999/mo for Enterprise

The best content is the content that gets shared and linked to the most, right? So what better way to gain preliminary insights than to compile all of the most shared content on your particular topic?

With BuzzSumo, all you have to do is enter the keyword or topic you’re focusing on and it will pull together all of the most shared and linked to content on that topic from the last day, week, month, or year. So if you’re trying to optimize the landing page for your new webinar on cat fashion, all you have to do is enter ‘cat fashion,’ and BuzzSumo will spit out all of the best articles, resources, videos, and more on the fascinating topic of cat couture.

You’ll then be able to dig in and explore some of the key elements that made these pages popular so that you can go back and incorporate them into your own content.

BuzzSumo-43.png

5) SimilarWeb

Price: $199/mo for Basic

Knowing where your website visitors came from can (and should) have a big impact on the type of content you create. With SimilarWeb, you can see where your traffic is coming from, which keywords are fueling your organic traffic, and what other sites are considered most similar to yours.

With this information, you’ll be able to optimize your content for your biggest traffic sources and dig in to see what competitor sites are doing to drive conversions.

similarweb.png

6) LandBook

Price: Free!

If you’re creating a landing page from scratch, getting started can be difficult. Luckily there’s LandBook: a free collection of the web’s best designed landing pages.

With LandBook, you can explore all of the ways that top companies are using elements like copy, positioning, layout, and design to drive conversions. Pick and choose your favorite elements from the LandBook database, and then incorporate them into your own landing page.

Land_Book.png

7) Really Good Emails

Price: Free!

Today, the average attention span of an adult is eight seconds, which means if you want to get your message across, you’d better know a thing or two about visual communication and design.

Don’t know a thing or two about either? Enter ReallyGoodEmails.

Similar to LandBook (see above), ReallyGoodEmails is a database of the web’s best designed emails from the world’s most innovative companies. Use this as a resource to see how you can design your email to get your message across in the best way possible, as fast as possible.

(Check out this post for even more resources where you can find great marketing examples.)

reallygoodemails.png

8) SubjectLine.com

Price: Free!

When sending email, the subject line can either make or break your performance. Before you choose which ones to ship, check them out using this awesome resource. SubjectLine.com has tested over three million subject lines and has a tool to evaluate your potential options. It gives a deliverability and marketing score, plus advice on improving.

Subject_Line-1.png

9) Headline Analyzer

Price: Free!

CoSchedule’s headline analyzer gives you a score of 1–100 to gauge how effective your title will be. The score is calculated based on word usage, grammar, vocabulary, which type of headline it is, as well as character and word count.

The tool also shows you what your headline looks like in Google search, in an email subject line, and more. While we still recommend that you always test your headlines, this tool serves as a great litmus test to find out generally how well your headline will perform.

coscheduleheadlinetool.png

Analytics Tools

These are the tools that you will use to measure and track your content’s performance. You can use them to fully analyze the dips, jumps, and fluctuations in your conversion rate.

10) Kissmetrics

Price: $200/mo for Starter, $700/mo for Basic, $2,000/mo for Pro

Kissmetrics is a complex tool that integrates with your email service provider to make it easy for you to analyze your audience and email them in specific cohorts. With Kissmetrics, you can learn the path that your customers have taken through your website, conduct A/B tests, build data sets (without SQL), and figure out the ROI from your campaigns. 

Kissmetrics-2.png

11) Google Analytics

Price: Free for Basic, Contact Sales for Premium

Google Analytics is a free way to track your website visitors. You can see how long it takes visitors to bounce from your pages, whether visitors complete goals from a certain path, and which sources are bringing people to your website. What’s great about Google Analytics is that it allows you to see which keywords people are searching to find your page, track which devices people use to reach your website, and uncover demographic data. However, there are no specific emails/contacts associated with your site visitors.

GoogleAnalytics-1.png

12) HubSpot Website Grader

Price: Free!

Website Grader is a great way to get a quick snapshot of a website’s overall performance. It gives insights on performance factors (including speed, page size, and page requests), mobile responsiveness and appearance, SEO (page titles, meta descriptions, headings, and site map), and security. From there, the tool devises a grade and provides suggestions on how to improve, which makes it easy to come up with some quick wins that’ll help you boost conversions. 

WebsiteGrader-1.png

Mouse Tracking & Heat Mapping Tools

These are the tools that you will use to see how people are interacting with your content, including how they scroll and where they click.

13) Hotjar

Price: Free for Basic, $29/mo for Pro, $89/mo for Business

Once you’ve nailed the basics like landing pages, CTAs, popups, and content, you’re ready for some more advanced conversion rate optimization.

Hotjar offers heat maps and screen recordings, which enable you to track how much of your page is being viewed, as well as how visitors are navigating your website. 

Hotjar-1.png

14) Clicktale

Price: Pricing is based on solution (contact sales for a demo)

Clicktale is similar to Hotjar: it also offers heat maps to help you determine the most valuable real estate on your pages, scroll depth (where is the “fold” on your website?), click tracking, and link analysis.

Using these tools, you’ll have the information you need to organize content, CTAs, and page design in a way that makes the most sense for engagement. 

Clicktale.png

15) Clicky

Price: Free for Basic, $9.99/mo for Pro, $14.99/mo for Pro Plus, $19.99/mo for Pro Platinum

Clicky gives you real-time analytics on your website visitors. It tells you where people are accessing your site from, how long they’ve stayed on each page, and how many visitors are actively online. The resource also offers heat maps and scroll tracking. 

Clicky.png

16) Crazy Egg

Price: $9/mo for Basic, $19/mo for Standard, $49/mo for Plus, $99/mo for Pro

Crazy Egg offers a full suite of heat maps and click tracking, with the additional functionality of being able to segment clicks by source and evaluate link effectiveness. The basic package is fairly inexpensive and gives great insights on how effective each page of your website is.  

Crazy_Egg.png

17) Heatmap.me

Price: Free for Basic, $100/mo for Premium, $200/mo for Enterprise

Heatmap.me is a great free option for anyone looking to start exploring heat maps, responsive web design tests, and real-time page statistics. Heatmap.me can also track dynamic elements on your site in the heat map tool — think: slider bars, photo galleries, and other interactive sections.

Heatmapme.png

Feedback Tools

These are the tools that you will use to engage and receive feedback from your visitors. Feedback tools include surveys, polls, messaging, and user testing programs. 

18) Intercom

Price: $49/mo for Basic (per product: Acquire, Engage, Learn, Support) + package plans

You can use chat tools to both acquire new customers and chat with existing customers. As a CRO tool, you can use Intercom to communicate with website prospects to learn if they need additional help, find out how their experience is going, and learn how you can improve. It also allows you to track leads and use a shared inbox with your team.

Intercom-1.png

19) Qualaroo

Price: $63/mo for Small Business, $199/mo for Professional, $499/mo for Enterprise

Using chat windows doesn’t just have to be for live chat. Qualaroo offers popups to collect live feedback from website viewers. With this information, you can tailor a site experience, target certain customers, and learn what issues people may be experiencing. This tool is extremely helpful at all stages of the funnel, and is especially utilized in the ecommerce space. 

Qualaroo-1.png

20) SurveyMonkey

Price: Free for Basic, $26/mo for Select (month to month), $25/mo for Gold (annual), $85/mo for Platinum

SurveyMonkey has a free option for those just starting out with survey research. You can use this tool to learn demographic information, discover which types of content your prospects and blog subscribers prefer, and get product feedback. 

surveymonkey-2.png

21) Five Second Test

Price: Free for Basic, $99/mo for Pro

UsabilityHub has an awesome community-fueled tool called 5 Second Test that allows users to upload a product, app experience, or design and have the community test it out. You get responses about recall, general feedback, and UI thoughts. This is a great way to get opinions, and they also offer click tests, preference tests, question tests, and navflow tests for other website and UI questions. 

fivesecondtest.png

Experiment Tools

These are the tools that you will use to manage, plan, and execute A/B and multivariate tests. Some of these tools will help you turn ideas into experiments, while others will help you create the variations and run the actual tests on your site.

22) Optimizely

Price: Free for Starter, $49/mo for Pay As You Go

Testing is hard: It’s hard to come up with a good control group, find a large enough sample, and determine whether your experiment is statistically significant. Luckily, Optimizely helps a lot with all of that … and then some. With Optimizely, you can conduct tests across all devices and platforms, then determine whether the results are statistically significant. The software offers A/B, multiple-page, and multivariate tests.

optimizely.png

23) ABTesting.net

Price: Free!

Act-On software offers a free A/B testing tool that walks you through the process of trying out your first A/B test of a site page. It helps explain the basics of A/B testing, and allows you to confidently publish your winning variation (if your content management system doesn’t already include an A/B testing tool).

actonabtest.png

24) Effective Experiments

Price: Free Trial + Contact Sales for Pricing

Effective Experiments is a concise way to track all of your experiments. If you have tons of Excel spreadsheets cross-referenced with Google Analytics data, you are probably going crazy trying to keep track of everything. This tool puts it all in one place and helps you determine statistical significance. 

effectiveexperiments.png

Hopefully you’re armed and ready to start improving conversion rates across your website and marketing efforts. These tools range from free and beginner-friendly to robust and more advanced. Start with the options that seem right for you, and upgrade to the more complex tools once you’ve mastered the basics.

What are your favorite CRO tools? Share anything we missed in the comments section below!

free ebook: optimizing landing pages

Jun

8

2015

How to Identify and Fix Friction on Your Landing Pages

Landing_Page_Friction.jpeg

You’ve spent a lot of energy (and budget) getting targeted traffic to your website.

Unfortunately, those visitors aren’t doing what you want them to do when they get there. 

But why?

Oftentimes this comes as a result of landing page friction — a barrier that prevents your visitors from completing the action you’d like them to take.

Whether your copy is too long, your button text isn’t compelling enough, or you’re lacking social proof, you need to start by identifying the points of friction before you can turn them around. 

To help you get a better grasp on what landing page friction looks like, when it can be used to benefit your business, and how to resolve it when it’s hurting your conversions, keep reading. 

When Negative Friction Is Good

A negative friction point, you say? Aren’t all friction points negative? I disagree — friction can be both good and bad. For example, a commonly cited source of negative friction on landing pages is the long submission form.

If there’s a lot of information to hand over when visitors arrive on the landing page, there is little incentive for them to complete your call-to-action. This state is often referred to as “psychological resistance.”

Do visitors feel like the transaction is unbalanced? Are they paying too much to get what you are offering?

Despite long submission forms being a common friction point on landing pages, we at HubSpot argue that sometimes a long form is a good thing. This is because visitors who commit to filling out a long form are commonly more interested — and often more qualified — than those who don’t. 

In other words, friction functions somewhat like a method of exclusion. Let’s take social media ads, for example. Targeted Facebook ads deliberately exclude certain demographics to avoid wasting money on those who would never be interested in actually buying. Down the funnel, this goes a long way towards improving the lead-to-customer rate. 

Essentially, both long forms and targeted Facebook ads aim to reduce the number of unqualified submissions by leveraging friction and exclusion to deflect those who wouldn’t be a good fit for your company. 

To give you another example, Chris Brogan of Owner Media Group goes against conventional marketing practices by charging registrants $20 to attend a webinar.

The-20-dollar-webinar.png

While I’m not privy to Brogan’s webinar goals, it would be logical to assume that any person willing to shell out $20 for a webinar is likely to be more qualified and engaged. Essentially, this approach employs friction to help him to weed out a lot of unqualified leads. 

Takeaway: What Can We Learn From These Insights?

  1. You need to identify the parts of your landing page that are preventing quality leads from continuing down the funnel.
  2. You need to figure out the parts of your landing page that are moving poor quality leads down the funnel.

How do you go about doing this?

The answer: good testing.

Identifying Friction: Two Real Sample Tests

At HubSpot, we are great believers in ongoing testing — even when something is performing really well for us. Much like the sales mantra “always be closing,” we hold ourselves to the motto “always be testing.”

When it comes to reducing negative friction, there are many things on a landing page that you can test to boost your conversion rate. Here are a few examples:

  • Persuasive copy
  • Social proof
  • Security
  • Referral source personalization
  • Imagery 
  • Benefits
  • Visual triggers (arrows, pointing, etc.)
  • Discounts or money-back guarantees

To give you a better idea of how an effective test is carried out, we’ve detailed two in-depth examples alongside their results. 

1) The “Indicated Reading Time” Test

Have you ever read an article online and noticed an indicated reading time at the top? Maybe it said five minutes, or seven, but either way it worked to set your expectations before you started reading, right?

I’m currently running an experiment to test the effectiveness of the “indicated reading time” inclusion by comparing a normal image on a landing page to an image including an estimated reading time for the offer. 

To illustrate my experiment, take a look at the image below. As you’ll notice, the first variation contains just the ebook, while the second shows both the ebook and the reading time: 

reading-time-test-mglive-blog.png

Here’s how the experiment has played out so far:

Background

The inspiration for this test stemmed from a feedback email we received from someone who had downloaded an ebook. He explained that the image on the landing page led him to believe that it represented the length of the offer. From our point of view, the image of a book simply indicated that the offer was an ebook, rather than a template or a recording. However, it was clear there was confusion. 

Problem

Could uncertainty about length be preventing people from downloading our ebook offers? Marketers are often strapped for time, so picking and choosing which pieces of content to read can be understandably difficult.

Hypothesis

While the word “ebook” often implies the length of an actual paper book, HubSpot ebooks are commonly under 25 pages. Trouble is, it didn’t seem as though our landing pages were communicating that. Since we were driving a lot of quality visitors to the page, failing to communicate the length of our content could result in losing their interest — possibly forever.

Test

We ran a 50-50 A/B split-test for people who visited the landing page for one of our ebooks. One variation employed a regular ebook cover image, and the other displayed the cover image as well as a clock with the indicated reading time.

Results

So far, we’ve seen a 6% increase in submissions at 98% certainty. (Looking for an easy-to-use calculator to check your A/B test results? We recommend Get Data Driven’s A/B test significance calculator.)
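If you want to sanity-check results like these yourself, the math behind most A/B significance calculators is a standard two-proportion z-test. Here’s a minimal Python sketch of that calculation; the visit and submission counts are made-up placeholders, not our actual test data:

from math import sqrt
from scipy.stats import norm

def ab_significance(conversions_a, visits_a, conversions_b, visits_b):
    # Two-proportion z-test: is variation B's conversion rate
    # significantly different from variation A's?
    rate_a = conversions_a / visits_a
    rate_b = conversions_b / visits_b
    pooled = (conversions_a + conversions_b) / (visits_a + visits_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (rate_b - rate_a) / std_err
    p_value = 2 * (1 - norm.cdf(abs(z)))  # two-sided test
    return rate_a, rate_b, p_value

# Hypothetical numbers for illustration only
rate_a, rate_b, p_value = ab_significance(4000, 100000, 4240, 100000)
print(f"A: {rate_a:.2%}  B: {rate_b:.2%}  p-value: {p_value:.3f}")
# The lift clears a 98% confidence bar when the p-value is below 0.02

Dedicated tools will handle the edge cases for you, but seeing the formula makes it clear why small samples rarely reach significance.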

Takeaway

After reviewing the results, it appears that we’re on to something. However, to ensure the validity of our results, we plan on replicating this experiment across a few more ebooks before declaring a winner. 

2) The “Form Redesign” Test (By: Yousaf Sekander)

Another great example of how to reduce friction points comes from Yousaf Sekander of RocketMill.

Looking for a way to increase conversions for one of his clients, Sekander conducted an A/B test to compare the original variation of their landing page against his optimized variation. Check out the image below to see the difference between the two pages:

RocketMill-test-example.png

To give you a better idea of how Sekander approached this test, I asked him to provide a run-through of his experiment. This is what he had to say:

Background

A client of ours approached us to find out why visitors to their website were not moving along their sales funnel. The critical point identified was that the conversion rate on their forms was low.”

Problem

We analyzed the inquiry form (see above) on Tchibo UK’s website and uncovered a handful of friction points. Although the form was actually quite simple, the format made it appear big and complicated. In addition to its overwhelming size, there was also no incentive for the prospects to fill it out. On top of the form complications, we also noticed that the scrolling navigation obscured the telephone number.”

Hypothesis

Our CRO campaign manager, Bertram Greenhough, came up with a few form design ideas to reduce the friction. First, we would simplify the form by reducing it to one column and place it within a contrasting pop-up. The purpose of this would be to reduce the attention ratio, and redirect the visitor’s focus to both the messaging and the CTAs. In addition, we’d add a compelling question header, a bulleted list of unique selling propositions, a phone number, and an image of the product to clarify what the visitor would be inquiring about.”

Test

To determine the influence of the form changes, we conducted an A/B test to compare both variations and identify which one converted better.”

Results

After analyzing the results, we found that the simplified form saw over a 200% increase in conversions.” 

The Simple Way to Get Started 

If, like me, you have tons of hypotheses you want to test, you’ll need to start with a framework that shows you what should get done first. I like to use the “PIE” system. This process requires you to create a simple Excel sheet, dump in all your ideas, and give each one a score out of 10 in each of the following categories: 

  • Potential. Are these types of pages your worst performers?
  • Importance. How crucial are these pages for visitor-to-lead?
  • Ease. Is it easy to set up? 

PIE-chart-600x370.png

Source: WiderFunnel

Averaging the three scores gives each idea its overall PIE score; the higher the score, the sooner you should test it. Once you’ve scored each potential hypothesis, you can set priorities and start testing. 
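If a spreadsheet feels clunky, the same scoring takes only a few lines of code. Here’s a minimal sketch of PIE prioritization in Python; the test ideas and ratings below are made-up examples, not real data:

# Each idea gets a 1-10 rating for Potential, Importance, and Ease.
test_ideas = {
    "Shorten the demo form": (8, 9, 6),
    "Add social proof to the pricing page": (6, 7, 9),
    "Rewrite the homepage headline": (7, 8, 4),
}

# The PIE score is the average of the three ratings; test the highest score first.
ranked = sorted(test_ideas.items(), key=lambda item: sum(item[1]) / 3, reverse=True)

for idea, ratings in ranked:
    print(f"{sum(ratings) / 3:.1f}  {idea}")

However you calculate it, the point is the same: score every idea the same way so the backlog ranks itself.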

At the end of the day, generating leads isn’t the easiest thing that marketers are tasked with, but it’s important nonetheless. Rather than allow landing page friction to negatively influence your conversion rates, don’t hesitate to explore different experiments such as the ones we detailed above. You never know until you test.  

Want to see a website through the eyes of a HubSpot marketer? We’re hosting an eight-minute live analysis of a website on June 11th at 15:00 BST / 10:00 EST. We’ll analyze the conversion potential of three different websites by looking at SEO, content, social, and design. Register here to watch.

Come along and watch us talk about websites like yours!

May

25

2015

5 Simple Ways to Optimize Your Website for Lead Generation

quick-lead-gen-wins.jpeg

Optimizing your website to generate leads is a no-brainer. But it’s not as simple as throwing a “click here” button on your home page and watching the leads pour in. (Unfortunately.)

Instead, marketers and designers need to take a more strategic approach. In this post, we’ll go over some quick ways you can optimize your website for lead generation that actually work.

To understand how to optimize our website, we’ll have to first gain a basic understanding of the lead generation process. What components are at play when a casual website visitor turns into a lead? Here’s a quick overview:

lead_generation_visualization.png

The lead generation process typically starts when a website visitor clicks on a call-to-action (CTA) located on one of your site pages or blog posts. That CTA leads them to a landing page, which includes a form used to collect the visitor’s information. Once the visitor fills out and submits the form, they are then led to a thank-you page. (Learn about this process in more detail in this post.)

Now that we’ve gone over the basics of lead generation, we can get down to the dirty details. Here are five simple ways to optimize your site for lead generation.

1) Figure out your current state of lead gen.

It’s important to benchmark your current state of lead generation before you begin so you can track your success and determine the areas where you most need improvement.

A great way to test out where you are is to try a tool like Marketing Grader, which evaluates your lead generation sources (like landing pages and CTAs), and then provides feedback on ways to improve your existing content.

You can also compare landing pages that are doing well with landing pages that aren’t doing as well. For example, let’s say that you get 1,000 visits to Landing Page A, and 10 of those people filled out the form and converted into leads. For Landing Page A, you would have a 1% conversion rate. Let’s say you have another landing page, Landing Page B, that gets 50 visitors to convert into leads for every 1,000 visits. That would be a 5% conversion rate — which is great! Your next steps could be to see how Landing Page A differs from Landing Page B, and optimize Landing Page A accordingly.
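To run that comparison across more than two pages at once, a few lines of code will do the trick. Here’s a quick Python sketch using the hypothetical numbers from the example above:

# Visits and leads per landing page (hypothetical example numbers)
landing_pages = {
    "Landing Page A": {"visits": 1000, "leads": 10},
    "Landing Page B": {"visits": 1000, "leads": 50},
}

best_rate = max(page["leads"] / page["visits"] for page in landing_pages.values())

for name, page in landing_pages.items():
    rate = page["leads"] / page["visits"]
    if rate == best_rate:
        print(f"{name}: {rate:.1%} conversion rate (your benchmark)")
    else:
        print(f"{name}: {rate:.1%} conversion rate ({best_rate - rate:.1%} behind the benchmark)")

The pages trailing the benchmark are your first optimization candidates.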

Finally, you could try running internal reports. Evaluate landing page visits, CTA clicks, and thank-you page shares to determine which offers are performing the best, and then create more like them.

2) Optimize each step of the lead gen process.

If your visitor searched “lawn care tips” and ended up on a blog post of yours called, “Ten Ways To Improve Your Lawn Care Regimen,” then you’d better not link that blog post to an offer for a snow clearing consultation. Make sure your offers are related to the page they’re on so you can capitalize on visitors’ interest in a particular subject.

As soon as a visitor lands on your website, you can start learning about their conversion path. This path starts when a visitor visits your site, and ends (hopefully) with them filling out a form and becoming a lead. However, sometimes a visitor’s path doesn’t end with the desired goal. In those cases, you can optimize the conversion path.

How? Take a page out of Surety Bonds’ book. They were struggling to convert visitors at the rate they wanted, so they decided to run an A/B split test (two versions of a landing page) with Unbounce to determine which tactics were performing better on each page. They ended up changing a link to a button, adding a form to their homepage, and asking different questions on their forms. The result? A 27% increase in lead generation. 

If you want to run an A/B test on a landing page, be sure to test the three key pieces of the lead gen process:

a) The Calls-to-Action

Use colors that contrast with your site. Keep it simple — and try a tool like Canva to create images easily, quickly, and for free. Read this blog post for ideas on the types of CTAs you can test on your blog, like the sliding CTA you see here:

Pop-up_CTA-1.gif

b) The Landing Pages

According to a HubSpot survey, companies with 30+ landing pages on their website generated 7X more leads than companies with 1 to 5 landing pages. 

For inspiration, here are 15 examples of well-designed landing pages you can learn from.

c) The Thank-You Pages

Oftentimes, it’s the landing pages that get all the love in the lead generation process. But the thank-you page, where the visitor is led to once they submit a form on the landing page and convert into a lead, shouldn’t be overlooked.

Along with saying thank you, be sure to include a link for your new lead to actually download the offer on your thank-you page. You can also include social sharing buttons and even a form for another, related offer, as in the example below:

    • HubSpot landing page

Bonus: Send a Kickback Email

Once a visitor converts into a lead and their information enters your database, you have the opportunity to send them a kickback email, i.e. a “thank-you” email.

In a HubSpot study comparing engagement rates of thank-you emails versus non-thank-you emails, kickback emails doubled the engagement rates (opens and clickthroughs) of standard marketing emails. Use kickback emails as opportunities to include super-specific calls-to-action and encourage sharing via email and social media.

3) Personalize your calls-to-action.

Dynamic content lets you cater the experience of visiting your website to each unique web visitor. People who land on your site will see images, buttons, and product options that are specifically tailored to their interests, the pages they’ve viewed, or the items they’ve purchased before.

Better yet, personalized calls-to-action convert 42% more visitors than basic calls-to-action. In other words, dynamic content and on-page personalization helps you generate more leads. 

How does it work? Here’s an example of what your homepage may look like to a stranger:

Smart Content

And here’s what it would look like to a customer:

Smart Content

(To get dynamic content (or “smart content”) on your site, you’ll need to use a tool like HubSpot’s Content Optimization System.)

4) Test, test, test.

We can’t stress this part of the process enough. A/B testing can do wonders for your clickthrough rates.

For example, when friendbuy tried a simple A/B test on their calls-to-action, they found a 211% improvement in clickthroughs on those calls-to-action. Something as simple as testing out the wording of your CTA, the layout of your landing page, or the images you’re using can have a huge impact, like the one friendbuy saw. (This free ebook has fantastic tips for getting started with A/B testing.)

5) Nurture your leads.

Remember: No lead is going to magically turn into a customer. Leads are only as good as your nurturing efforts.

Place leads into a workflow once they fill out a form on your landing page so they don’t forget about you, and deliver them valuable content that matches their interests. Lead nurturing should start with relevant follow-up emails that include great content. As you nurture them, learn as much as you can about them — and then tailor all future sends accordingly. 

Here’s an example of a lead nurturing email:

Lead Nurture Email

This email offers the recipient some great content, guides them down the funnel, and gets to the point. According to Forrester Research, companies that nurture their leads see 50% more sales-ready leads than their non-nurturing counterparts at a 33% lower cost. So get emailing!

What other tips do you have for optimizing your website for lead generation? Share them with us in the comments.

free ebook: optimizing landing pages

May

25

2015

The Biggest Pet Peeves of CRO Experts

cro-pet-peeves.jpeg

Conversion Rate Optimization (CRO) isn’t a widely known field, even among digital marketers. If you need a quick refresher, CRO is the process of creating an experience for your website visitors that’ll convert them into customers.

But this science of lead conversion is quickly gaining ground. After all, who doesn’t want more clicks, leads, and sales?

On International Conversion Rate Optimization Day back in April, some of the best CRO experts in the business came together for an “Ask Me Anything” discussion on inbound.org, where they answered questions about all things conversion rate optimization. One of the interesting topics they covered was the things that really tick them off in the world of CRO. And trust me when I say they didn’t hold back.

What really grinds these CRO experts’ gears? Here are 13 conversion rate optimization pet peeves, so you can be sure you aren’t committing any of them on your website.

13 Pet Peeves From CRO Experts

1) Over-Simplification

The world is not simple, yet it’s natural for people to oversimplify everything. Optimizers have to be better than that. There is no ‘people always prefer’ or ‘who would ever.'”

– Peep Laja (author, CRO specialist, & founder of ConversionXL)

(Read more from Laja here.)

2) Assumptions

You should [make it] very easy for the user to checkout. The buttons and headlines should tell people what to do next. Never make assumptions that you know what the customer should do.”

– Alex Harris (e-commerce conversion specialist)

(Read more from Harris here.)

… Send good cart abandonment emails (and A/B test them), minimize distractions during the checkout process, make it clear to the customer what’s happening in the process and when, try to avoid anything that makes it look like you’re springing surprise fees or clever accounting on the customer, and reinforce why they’re buying from you (painless pre-paid returns process, best in class quality, social proof of satisfied customers, etc. etc. — test what works best for your customers).”

– Jim Gray (marketing engineer, data scientist & founder of Ioseed)

(Read more from Gray here.)

3) “Click Here” on Calls-to-Action

I personally hate “click here” prefixes, and so do search engines. (It hurts SEO.) It begs the question: does your CTA not already look like a clickable button? For both headlines and CTAs, I use a variation of the aforementioned formula: “I’d like to…” [WHAT: Specific Action]; “Because I want to…” [WHY: Specific Value]. 

“It’s important to pair WHAT and WHY together. Sometimes this can be accomplished in one line. Two lines (headline + subhead, 2-line CTA, CTA + booster) are more often needed though. This shouldn’t be feared if it provides more clarity and value.”

– Angie Schottmuller (chief of conversion marketing at Unbounce)

(Read more from Schottmuller here.)

A simple formula to follow for button CTAs is ‘Action Verb’ + ‘Benefit.’”

– Bobby Hewitt (president and founder of Creative Thirst)

(Read more from Hewitt here.)

4) Ghost Buttons

Ghost buttons drive me crazy. It goes against usability. The concept is a designer’s fantasy trend that should die. The only time I find this tactic useful is when a client insists on having two CTAs on the page, and I basically want one to disappear. Ghosted buttons have ghost conversions.”

– Angie Schottmuller (chief of conversion marketing at Unbounce)

(Read more from Schottmuller here.)

5) Ego

“It can be really hard to let something go when you’ve sweated over it. If it loses, you have to have the courage to throw it away. The best way to do that is to celebrate the fact that you learned something from the failure.”

– Oli Gardner (co-founder of Unbounce)

(Read more from Gardner here.)

6) Unclear Call-to-Action Copy

It has to be abundantly clear what’s going to happen when someone clicks that button. What are they going to get? Are they scheduling a demo, or signing up for that demo right then and there? You can’t afford to leave people wondering, or they won’t click out of nervousness.”

– Joel Klettke (CRO copywriter)

(Read more from Klettke here.)

7) A “One-Size-Fits-All” Approach

I’ve seen case studies where including the word click increased… clicks. But like every case study, it isn’t a panacea and should be taken with a grain of salt. You can’t apply case study learnings directly; use them only as inspiration to generate your own related hypotheses.”

– Oli Gardner (co-founder of Unbounce)

(Read more from Gardner here.)

Everything you’ve read about button design is true, and false, and somewhere in between. If you truly believe that the best hypothesis and test you can come up with — the one that will deliver a 200% increase on conversions — is to change the button, then you should run A/B or multivariate tests against all of those options to see what works for your audience.

“The fact is, different audiences relate to different designs, language, reading levels, colors, and more. Averages across industries won’t help you here.”

– Stewart Rogers (director of marketing technology at VentureBeat Insight)

(Read more from Rogers here.)

8) Superlatives and Hyperboles

When it comes to using words like “amazing,” Peep Laja said it best:

Superlatives tend to lose against specifics (‘amazing pizza’ vs. ‘stone-oven baked pizza by an Italian master chef;’ ‘fastest pizza delivery’ vs. ‘delivery in 15 minutes’) 9 times out of 10. Instead of superlatives, offer lots of detail and specifics.”

– Peep Laja (author, CRO specialist, & founder of ConversionXL)

(Read more from Laja here.)

Instead of obsessing over individual words, think about your context and slash hyperbole wherever it stands. If the claims you are making are believable, hit on customer pain points and directly explain a benefit, then the verbiage you use to describe that benefit can be flexible, so long as it fits the context.”

– Joel Klettke (CRO copywriter)

(Read more from Klettke here.)

9) Buzzwords

I personally loathe ‘rockstar.’ I’ve used it. I’m embarrassed about it. But … when I see it on a page today, I instantly get that feeling that an old person is trying to sound young.”

– Joanna Wiebe (conversion copywriter)

(Read more from Wiebe here.)

10) Fluffy Language

A big hindrance on conversion rates and SEO alike is content that reads like generic fluff for the sake of targeting phrases.”

– Joel Klettke (CRO copywriter)

(Read more from Klettke here.)

11) Half-Baked Value Props

I hate when writers rely on old, tired [stuff] like, ‘We do X so that you can focus on what matters!’(…so.. what matters?); ‘We get to know our customers’ (everyone does); ‘We’re the highest quality’ (what does that even MEAN? Nobody wants high quality!).”

– Joel Klettke (CRO copywriter)

(Read more from Klettke here.)

12) Ignoring or Avoiding Data

In answering the question, “What’s your biggest pet peeve?”

When others pretend like the data doesn’t exist.”

– Tommy Walker (marketer at Shopify)

“Or worse, when others attempt to manipulate math for statistical significance to claim that the data qualifies as a valid test. Statistical significance is not the same as validity.”

– Angie Schottmuller (chief of conversion marketing at Unbounce)

(Read more from Walker and Schottmuller here.)

13) Businesses That Stop Testing

“Always be testing” was the rallying cry for this crowd. The takeaway? Keep on testing, even after you have wins. (If you’re not sure where to start, here’s a list of real-life CRO tests to try for yourself.)

Many thanks to all the CRO specialists who joined me in this inbound.org discussion.

What are your biggest CRO pet peeves? Share them with us in the comments.

free webinar: conversion rate optimization

May

18

2015

7 Conventional Landing Page Design Tactics You Should Still Test

 test-landing-pages.jpeg

Conventional wisdom is usually the safe play. Why take risks when there’s an established “right way” to do things?

In marketing, however, success often falls to those willing to buck trends and experiment.

When it comes to optimizing your landing pages, complacency and assumption are your worst enemies. It’s far too easy to defer to best practices instead of discovering what makes your unique audience click. Proven landing page techniques are commonly practiced for a reason, but what works for 90% of websites won’t automatically work for yours.

The genius of A/B testing your landing pages is that it allows you to get a little bit crazy without making a permanent error. You can be as unconventional as you want, and testers consistently find that extreme adjustments are required for extreme wins. If you’re using a template landing page designer, making these “big changes” can take less than a minute, so there’s no excuse to play it safe.

To get the big wins, you need to test even when the answer seems obvious. Here are seven landing page “best practices” you should be challenging on your site.

7 Landing Page Design “Best Practices” You Should Still Test For Yourself

1) Minimalistic Design

Conventional wisdom dictates that landing pages should remain as empty, calm and spacious as possible. No distractions. Three colors maximum. One font. Like this landing page from DropBox for Business:

dropbox-for-business.png

The issue with this hard-and-fast “rule”? Different audiences demand different stimuli at different times. A Swiss audience might react differently to sleek design than an audience based in rural America. Millennials are used to considerably more stimulation than an AARP crowd.

While simplicity is a “best practice” for a reason, like any practice, it should be tested specifically against your audience. Sometimes a more complex layout can drive more conversions.

2) Smiling Faces

A variety of studies (including several here on HubSpot) have demonstrated the effect a smiling face can have on your conversion rate. Human faces can create emotions within your visitors and help compel them to take action.

But images of human faces can also distract visitors. The human-free version of the HubSpot landing page actually converted 24% better than the version with a smiling face that you see below.

hubspot-product-demo.png

One of the biggest problems I see with smiley photos is incorrect implementation. Many businesses will use images that are pretty obviously stock photos — you know, the ones with painfully fake smiles — in order to capitalize on this “best practice.”

But, in some cases, stock photos can actually kill your conversion rate, so you’ll want to tread carefully here. High quality photos of real people are the best way to go, but again, they won’t work for every one of your landing pages. (Here’s a list of 10 sites for free, non-cheesy stock photos to get you started.)

3) Security Seal

An eTrust, PCI or BBB badge on your page supposedly assuages fears that your site visitors may be harboring, so they can feel free to move forward with your offer, knowing their information is secure. In the example below from bills.com, you’ll find several trust seals on the left-hand side.

bills.png

Consider, however, that people often associate these types of seals with web forms relating to financial transactions. If you’re not asking your visitor for money, and yet you place a trust seal next to your signup form, it might raise suspicion that credit card charges could soon appear.

In situations like these, your landing page may perform better without the icons. It’s not so much a case of poor principles, but more a misapplication of good principles. Building trust with your visitors is paramount, but you want to save trust verifiers for the right point in your conversion funnel.

For more on trust seals and whether your landing page needs one, read this blog post.

4) Offering Only Legitimate Service Packages

A lot of marketing is simply the extension and exploration of natural intuition. For the most part, this stuff makes sense. But there are times when the study of human response can throw us for a loop.

When creating a pricing page, it would make sense to add a variety of packages to most efficiently engage with demand. You want to offer several legitimate package options and optimize your price points for max revenue. That’s standard practice.

But what several marketers have found is that customers will actually convert at a higher rate with the inclusion of an irrational option no one would ever buy. The concept stems from our inclination to compare objects that are more similar. By including an obviously inferior option that’s similar to our desired sale package, we can influence users to make the desired purchase.

For example, Carter & Kingsley increased a client’s profits by 114% simply by adding a package no rational person would ever buy:

split-test-subscriptions.jpg


Unbounce did the same thing, increasing revenue by 233% with the inclusion of a made-up product:

unbounce-pricing-new.jpg

For some reason, this strategy can actually work — and it might just be able to boost conversions on your website.

5) Adding Social Proof

Social proof can actually be extremely powerful at building trust with your audience and leveraging the power of the crowd for the purpose of conversions. But like so many other “best practices,” social proof is only effective when applied within a specific set of parameters. The misapplication of social proof can significantly sabotage your conversion rate. Its power to destroy is directly proportional to its power to build.

For example, CalPont saw a big conversion boost after removing social share buttons from their page content.

calpont-products.png

Taloon.com increased CTA click-throughs by 11.9% after eliminating social share buttons.

visual-website-optimizer.png

The unifying factor here is that negative social proof is just as powerful at discouraging conversions as positive social proof is at increasing them.

In both of these examples, the pages tested tended to have a low share rate. Not all content is the type people want to circulate on social media. Accordingly, the visual display of low share counts actually worked against the company. It essentially told the customer, “No one wants this, and you shouldn’t either.”

So while social proof can be a big win in certain scenarios, it’s something worth testing, because it will likely work against you if you’re not careful.

6) Hiding The Price

Any salesperson will tell you to never reveal the price too early. The price is nothing more than a figure until you’ve been able to first establish the value of your product or service. Then, once you’ve built up the value in your readers’ minds, revealing the price makes it look much less intimidating — and can even be utilized as a major selling point in your pitch.

In a person-to-person sales presentation, this means saving the price for the close or pre-close. On your website, this traditionally looks like waiting until the bottom of the page to include your price, or saving it for after your customer clicks through to continue the sales process.

But, as we are highlighting in this article, standard best practices don’t always work. And that’s what SafeSoft Solutions found when they included their price above the fold in the middle of their homepage hero shot.

market-dialer.png

The only change they made to the landing page above was to add the $75 price sticker, and it increased conversions by 100%. What if you, too, are just one simple change away from doubling your sales? It’s worth testing out.

7) Using Video

In a sense, this principle is akin to “bacon makes everything better.” It may be true most of the time, but anyone who’s attended a creative dinner party and been served bacon-wrapped sushi knows better. (Sushi is good, and bacon is good. But together, they make for culinary revulsion.)

While video can be an effective (and visually appealing) tool for enhancing impact or conveying a difficult concept, when it comes to landing page conversions, it can also alienate, over-stimulate, or distract your audience’s attention from your primary call-to-action. Here’s an example of video on a landing page from Shopify:

shopify.png

In one case, testers found that replacing their homepage video with a much-maligned image-slider actually increased signups by 30%. Another business increased conversions by hiding its intro video within a modal box instead of featuring it within an embedded player.

Test, Rinse, Repeat

The “best practices” of landing page design can help you jump into marketing with a solid game plan. What works well for the majority of companies might do fine for yours. But once you’ve gained some context and experience, it’s time to take off the training wheels and start testing for your unique audience.

Every brand is different. Every landing page is different. Every website visitor is different. What fails for 90% of businesses might catapult yours into success. You’ll never know until you start challenging the cookie cutter practices and figuring out exactly what works for your audience.

free ebook: optimizing landing pages

Apr

6

2015

Rethinking CRO: How Remarketing Can Unlock Higher Conversion Rates

remarketing

If you’re serious about getting more conversions, you need to start thinking of conversion rate optimization (CRO) differently. CRO isn’t just about making small adjustments to a landing page to get 5% more conversions. If you’ve tried moving page elements around, tested variations of your copy, and optimized your form but still aren’t seeing meaningful conversion gains, don’t worry — all is not lost.

Both large and small optimizations can make a difference, but in my experience, it’s the radical optimization changes that have the highest potential to earn you more conversions. If you want to be part of the top 5-10% getting unbelievably high conversion rates, you have to be willing to try some crazy and sometimes counterintuitive things. You might find they pay off in a big way.

Before I get started, if you’d like to learn in detail about some of the most impactful CRO hacks out there, then click here to register for a webinar this Thursday, April 9, 2015 as part of HubSpot’s #CRODay celebrations. I’ll cover the massive changes you can make, and HubSpot’s Lanya Olmsted will share optimizations large and small that have impacted HubSpot’s conversion rates.

Rethinking Conversion Rate Optimization

International CRO Day (yes, it’s a thing) is a fantastic opportunity to take a step back from your current conversion optimization strategy and rethink your process. It’s far too easy to get stuck in the rut of trying the same optimizations over and over, hoping for different results each time.

As marketers, we’ve had it drilled into our heads that conversion rates of 3-7% or so are pretty good — so you might think 10% is pretty fantastic. But if you’re already there, how do you know whether there’s room for growth?

You might be surprised to learn that what we think of as “good” conversion rates are really just average, or in some cases even below average. In our analysis of thousands of AdWords accounts with $3 billion in annualized spend, we discovered that across industries and verticals, exceptional advertisers are converting at two to three times the average.

The median conversion rate is actually 2.35%. A full quarter of accounts have conversion rates of 1% or less.

search-conversion-rate-distribution

As you can see, though, the top 25% of accounts have a conversion rate of 5.31% or higher, about twice the average. The top 10% of accounts are converting at five times the median or more, with conversion rates of 11.45% and up.

Crazy, right? It’s not a fluke, and these advertisers aren’t just lucky. They’re consistently outperforming their competitors by three to five times or more.

How do they do it? And more importantly, how can you do it, too? Right now, I’ll share with you one of my favorite CRO hacks: remarketing. (I’ll cover nine more in Thursday’s CRO hacks webinar.)

Remarketing: One Key to Improving Conversion

Remarketing is one very effective, data-backed CRO hack. While growing in popularity, this advertising tactic is still surprisingly underutilized. The percentage of marketers investing more than 50% of their digital ad budget in remarketing doubled from 2013 to 2014, but that only brought it up to 14%.

And yet, over 90% of marketers surveyed in a study commissioned by AdRoll said remarketing is as effective or more effective than email, search, and other display campaigns.

What’s so great about remarketing? For starters, it gives you absolutely massive reach on the Google Display Network (over 92% of sites on the web, in fact). But other remarketing networks like Facebook and Twitter give you crazy precise targeting, as well. You can reach about 84% of the people you tag, 10-18 times per month.

reach-more-users

Image Credit: WordStream

Visitors to your website or app are tagged and can be remarketed to, which gets your brand and messaging back in front of them wherever they happen to be on the web. It’s a super effective way of reconnecting with people who have already expressed some kind of interest in your company, products, or services.

How Remarketing Moves the Needle on Conversion

Remarketing to existing site visitors obviously taps into the greater intent these people have already displayed. Beyond that though, remarketing is CRO. The fact is that the vast majority of your site visitors aren’t going to convert. Why give up on them the first time they hit your landing page and leave?

Your goal with remarketing is to remind people to complete the action you want them to complete — by getting in front of them with relevant messaging and offers while they’re doing things like watching YouTube videos, checking their email, searching online, checking their Facebook, and so on. If you can do that, then you can increase conversions.

Depending on how targeted you get with your audience lists and campaign types, remarketing can make a huge difference in your conversion rates. Custom printing company Storkie Express found that dynamic remarketing campaigns yielded conversion rates 203% higher than regular display ads and 119% higher than regular remarketing campaigns. Take a look at this data from PPC Hero:

ppc-hero-data

Image Credit: PPC Hero

One account had a conversion rate of 1.86% for their remarketing campaign, while the regular display campaign’s conversion rate was 1.19%.

Here are a few of the other benefits of adding remarketing to your CRO arsenal:

  • Increased brand exposure, keeping you top of mind among those already engaged in some way with your business.
  • Precise targeting, with the ability to exclude audience segments that you may have identified as less likely to convert or more likely to waste a click.
  • “Stickier” ads, as the CTR of remarketing ads stays high even as you might expect ad fatigue to set in. A user is still far more likely to engage with a remarketing ad even after having seen it six times before than they are with a brand new generic display ad.
  • Greater engagement, giving you the opportunity to have leads complete a series of increasingly lucrative small conversions (sign up for email, download an ebook, etc.) on their path-to-purchase, or to convert to a sale.
  • Better ROI, thanks to your ability to know your audience and speak more directly to each segment through targeted copy.

You can also count on lower costs-per-click, thanks to the higher CTR inherent to marketing to people already more motivated than the average ad viewer. Higher CTRs positively impact your Quality Score in AdWords and tell ad networks that your ads are highly relevant, which is typically rewarded with reduced costs per click.

In addition, returning visitors are more likely to be engaged with your site than new visitors, especially if you refine your audience to people who visited a certain product page, put an item in their cart, and so on.

Remarketing is one of my favorite CRO hacks, but it’s just one of the ten little-known and super effective tactics Lanya and I are sharing in our April 9th #CRODay webinar, so be sure to register.

We’ve run dozens of tests to prove these unconventional CRO hacks and can’t wait to walk participants through each one. We’re going to challenge everything you think you know about conversion and give you a radical new view of the conversion formula. You’re going to take away CRO tactics built on a solid foundation of marketing psychology and proven in extensive testing by HubSpot and WordStream. See you there!

free webinar: conversion rate optimization

Dec

11

2014

Running an Email A/B Test? How to Determine Your Sample Size & Testing Time Frame

ab_testing_calculations

Do you remember your first A/B test on email? I do. (Nerdy, I know.) I felt simultaneously thrilled and terrified because I knew I had to actually use some of what I learned in college stats for my job. 

I sat on the cusp of knowing just enough about statistics that it could be dangerous. For instance, I knew that you needed a big enough sample size to run the test on. I knew I needed to run the test long enough to get statistically significant results. I knew I could easily run one if I wanted, using HubSpot’s Email App.

… But that’s pretty much it. I wasn’t sure how big was “big enough” for sample sizes and how long was “long enough” for test durations — and Googling it gave me a variety of answers my college stats courses definitely didn’t prepare me for.

Turns out I wasn’t alone: Those are two of the most common A/B testing questions we get from customers. And the reason the typical answers from a Google search aren’t that helpful is that they talk about A/B testing in an ideal, theoretical, non-marketing world. So I figured I’d do the research to help answer this question for you in a practical way. By the end of this post, you’ll know how to determine the right sample size and time frame for your next email send.

Theory vs. Reality of Sample Size and Timing in Email A/B Tests

In theory, to determine a winner between Variation A and Variation B, you need to wait until you have enough results to see if there is a statistically significant difference between the two. Depending on your company, sample size, and how you execute the A/B test, getting statistically significant results could happen in hours or days or weeks — and you’ve just got to stick it out until you get those results. In theory, you should not restrict the time in which you’re gathering results.

For many A/B tests, waiting is no problem. Testing headline copy on a landing page? It’s cool to wait a month for results. Same goes with blog CTA creative — you’d be going for the long-term lead gen play, anyway. 

But on email, waiting can be a problem — for several practical reasons:

1) Each email send has a finite audience.

Unlike a landing page (where you can continue to gather new audience members over time), once you send an email A/B test off, that’s it — you can’t “add” more people to that A/B test. So you’ve got to figure out how to squeeze the most juice out of your emails. This will usually require you to send an A/B test to the smallest portion of your list needed to get statistically significant results, pick a winner, and then send the winning variation on to the rest of the list. 

2) Running an email marketing program means you’re juggling at least a few email sends per week. (In reality, probably way more than that.) 

If you spend too much time collecting results, you could miss out on sending your next email — which could have worse effects than if you sent a non-statistically-significant winner email on to one segment of your database. 

3) Email sends are often designed to be timely.

Your marketing emails are optimized to deliver at a certain time of day, whether they’re supporting the timing of a new campaign launch or landing in your recipients’ inboxes at a time they’d love to receive them. So if you wait for your results to reach statistical significance, you might miss out on being timely and relevant — which could defeat the purpose of your email send in the first place. 

That’s why email A/B testing programs have a “timing” setting built in: At the end of that time frame, if neither result is statistically significant, one variation (which you choose ahead of time) will be sent to the rest of your list. That way, you can still run A/B tests in email, but you can also work around your email marketing scheduling demands and ensure people are always getting timely content.

So to run A/B tests in email while still optimizing your sends for the best results, you’ve got to take both sample size and timing into account. Next up: how to actually figure out your sample size and timing using data.

How to Actually Determine Your Sample Size and Testing Time Frame

Alrighty, now on to the part you’ve been waiting for: how to actually calculate the sample size and timing you need for your next email A/B test. 

How to Calculate Your Email A/B Test’s Sample Size

Like I mentioned above, each email A/B test you send can only be sent to a finite audience — so you need to figure out how to maximize the results from that A/B test. To do that, you need to figure out the smallest portion of your total list needed to get statistically significant results. Here’s how you calculate it.

1) Assess whether you have enough contacts in your list to A/B a sample in the first place.

To A/B test a sample of your list, you need to have a decently large list size — at least 1,000 contacts. If you have fewer than that in your list, the proportion of your list that you need to A/B test to get statistically significant results gets larger and larger. 

For example, to get statistically significant results from a small list, you might have to test 85% or 95% of your list. And the untested remainder of your list would be so small that you might as well have just sent one email version to half of your list and the other version to the other half, then measured the difference. Your results might not be statistically significant at the end of it all, but at least you’re gathering learnings while you grow your list past 1,000 contacts. (If you want more tips on growing your email list so you can hit that 1,000-contact threshold, check out this blog post.) 

Note for HubSpot customers: 1,000 contacts is also our benchmark for running A/B tests on samples of email sends — if you have fewer than 1,000 contacts in your selected list, the A version of your test will automatically be sent to half of your list and the B will be sent to the other half.

2) Click here to open up this calculator.

Here’s what it looks like when you open it up:

ab_testing_calculator

3) Enter your email’s Confidence Level, Confidence Interval, and Population into the tool.

Yep, that’s a lot of stat jargon. Here’s what these terms translate to in your email:

Population: Your sample represents a larger group of people. This larger group is called your population.

In email, your population is the typical number of people in your list who get emails delivered to them — not the number of people you sent emails to. To calculate population, I’d look at the past three to five emails you’ve sent to this list, and average the total number of delivered emails. (Use the average when calculating sample size, as the total number of delivered emails will fluctuate.)
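
As a quick sketch of that averaging in Python (the delivered counts below are made up for illustration):

# Hypothetical delivered counts from your last few sends to this list
delivered = [958, 942, 951, 946]

# Average them and round; this is the "Population" you plug into the calculator
population = round(sum(delivered) / len(delivered))
print(population)  # 949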

Confidence Interval: You might have heard this called “margin of error.” Lots of surveys use it, including political polls. It’s the range within which you can expect the full population’s results to fall, based on the results from your sample. 

For example, in your emails, if you have an interval of 5 and 60% of your sample opens your variation, you can be confident that between 55% (60 minus 5) and 65% (60 plus 5) of your population would have opened that email. The bigger the interval you choose, the more certain you can be that the population’s true behavior falls within it. At the same time, larger intervals give you less definitive results. It’s a tradeoff you’ll have to make in your emails. 

For our purposes, it’s not worth getting too caught up in confidence intervals. When you’re just getting started with A/B tests, I’d recommend choosing a smaller interval (ex: around 5).  

Confidence Level: This tells you how sure you can be that your sample results lie within the above confidence interval. The lower the percentage, the less sure you can be about the results. The higher the percentage, the more people you’ll need in your sample, too. 

Note for HubSpot customers: The Email App automatically uses the 85% confidence level to determine a winner. Since that option isn’t available in this tool, I’d suggest choosing 95%. 

Example:

Let’s pretend we’re sending our first A/B test. Our list has 1,000 people in it and has a 95% deliverability rate. We want to be 95% confident our winning email metrics fall within a 5-point interval of our population metrics. 

Here’s what we’d put in the tool:

  • Population: 950
  • Confidence Level: 95%
  • Confidence Interval: 5

sample_size_calculations

4) Click “Calculate.”

5) Get your sample size. 

Ta-da! The calculator will spit out your sample size. In our example, our sample size is: 274.

This is the size one of your variations needs to be. So for your email send, if you have one control and one variation, you’ll need to double this number. If you had a control and two variations, you’d triple it. (And so on.)
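
If you’d rather script this than click through the calculator, here’s a minimal Python sketch. I’m assuming the standard normal-approximation approach (Cochran’s formula with a finite-population correction), which reproduces the 274 from the example above; the linked calculator may differ in the details.

import math

# z-scores for common confidence levels (normal approximation)
Z_SCORES = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}

def sample_size(population, confidence=0.95, interval=5, p=0.5):
    # Cochran's formula, then a finite-population correction.
    # `interval` is the confidence interval in percentage points.
    z = Z_SCORES[confidence]
    e = interval / 100
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)   # sample size for an "infinite" population
    return math.ceil(n0 / (1 + (n0 - 1) / population))

print(sample_size(950, confidence=0.95, interval=5))  # 274, matching the example above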

6) Depending on your email program, you may need to calculate the sample size’s percentage of the whole email.

HubSpot customers, I’m looking at you for this section. When you’re running an email A/B test, you’ll need to select the percentage of your list to send each variation to — not just the raw sample size. 

To do that, you need to divide the number in your sample by the total number of contacts in your list. Here’s what that math looks like, using the example numbers above:

274 / 1000 = 27.4%

This means that each sample (both your control AND your variation) needs to be sent to 27-28% of your audience — in other words, roughly 55% of your list in total.
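
Continuing the sketch above, the percentage math looks like this:

sample = 274        # sample size per variation (from the calculation above)
list_size = 1000    # total contacts in the list

per_variation = sample / list_size    # 0.274 -> send each version to ~27.4%
total_in_test = 2 * per_variation     # control + one variation -> ~54.8% of the list
print(f"{per_variation:.1%} per variation, {total_in_test:.1%} of the list in the test")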

email_ab_test_send

And that’s it! You should be ready to select your sending time. 

How to Choose the Right Time Frame for Your A/B Test

Okay, so this is where we get into the reality of email sending: You have to figure out how long to run your email A/B test before sending a (winning) version on to the rest of your list. Figuring out the timing aspect is a little less statistically driven, but you should definitely use past data to help you make better decisions. Here’s how you can do that.

If you don’t have timing restrictions on when to send the winning email to the rest of the list, head over to your analytics. 

Figure out when your email opens/clicks (or whatever your success metric is) start to drop off. Look at your past email sends to figure this out. For example, what percentage of total clicks did you get in your first day? If you found that you get 70% of your clicks in the first 24 hours, and then 5% each day after that, it’d make sense to cap your email A/B testing window at 24 hours, because it wouldn’t be worth delaying your results just to gather a little extra data. In this scenario, you’d keep your timing window to 24 hours, and at the end of those 24 hours, your email program should tell you whether it can determine a statistically significant winner.
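
Here’s a rough sketch of that drop-off check in Python, with made-up daily click counts standing in for your own analytics export:

# Hypothetical clicks per day after a past send (day 1, day 2, ...)
clicks_by_day = [70, 5, 5, 5, 5, 5, 5]

total = sum(clicks_by_day)
running = 0
for day, clicks in enumerate(clicks_by_day, start=1):
    running += clicks
    share = running / total
    print(f"Day {day}: {share:.0%} of all clicks received")
    if share >= 0.70:   # once ~70% of clicks are in, extra waiting adds little
        print(f"Cap the A/B test window at roughly day {day}")
        break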

Then, it’s up to you what to do next. If you have a large enough sample size and found a statistically significant winner at the end of the testing time frame, many email marketing programs will automatically and immediately send the winning variation. If you have a large enough sample size and there’s no statistically significant winner at the end of the testing time frame, email marketing tools might also allow you to automatically send a variation of your choice.
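
Your email tool makes this call for you, but if you ever want to sanity-check a result yourself, here’s a minimal sketch of one common method, a two-proportion z-test on open rates. It’s an illustration only; it isn’t necessarily the exact calculation your email program runs.

import math

def significant_winner(opens_a, sends_a, opens_b, sends_b, confidence=0.95):
    # Two-proportion z-test: is the difference in open rates statistically significant?
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided
    return p_value < (1 - confidence)

# e.g. 70 opens of 274 sends for A vs. 100 opens of 274 sends for B
print(significant_winner(70, 274, 100, 274))  # True at 95% confidence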

If you have a smaller sample size or are running a 50/50 A/B test, when to send the next email based on the initial email’s results is entirely up to you. 

If you have time restrictions on when to send the winning email to the rest of the list, figure out how late you can send the winner without it being untimely or affecting other email sends. 

For example, if you’ve sent an email out at 6 p.m. EST for a flash sale that ends at midnight EST, you wouldn’t want to determine an A/B test winner at 11 p.m. Instead, you’d want to send the email closer to 8 or 9 p.m. — that’ll give the people not involved in the A/B test enough time to act on your email.

And that’s pretty much it, folks. After doing these calculations and examining your data, you should be in a much better state to send email A/B tests — ones that are fairly statistically valid and help you actually move the needle in your email marketing.

email marketing planning template

Nov

18

2014

How an A/B Test of Landing Page Form Copy Improved Lead Quality

clean_data

When most people talk about getting quality lead information from forms, they usually talk about one tactic: changing the length of the form. The longer the form, the better quality the leads will be … right?

Truthfully, it’s not always that simple. For most businesses, changing the form length is a great way to get started with increasing lead quality — but at a certain point, you’re going to need to experiment with other form conversion optimization tactics to get better information about the people filling the form out. 

At HubSpot, we’ve had a long lead generation form on our website for a while, but it wasn’t quite getting us the results we needed to efficiently rotate the right leads to Sales. Below is what we tested to help improve the quality of our form submission data — all without adding a single form field.

The Problem

Like we mentioned above, we’ve always had a long lead generation form on our landing pages — we’ve wanted a decent amount of information to properly qualify incoming leads for Sales. Here’s the type of information we typically ask for:

  • Your name
  • Your email
  • Your company name
  • The number of employees at your company
  • The URL of your company’s website 
  • Your role
  • Your department
  • Which CRM you use
  • Your biggest Marketing and Sales challenge
  • Whether your company is a marketing agency (or sells marketing services)

The last bullet is one of the most crucial. We sell HubSpot both directly to businesses and through our Partner program, so we use this question to route new Partner leads to a specific team in Sales. To make sure that sales team always gets the right type of leads, anyone who says they are a marketing agency goes into a special queue. In that queue, the lead’s website is manually checked to confirm the lead is actually an agency, and if so, the lead is sent on to our Partner sales team. 

About a year ago, we realized that our manual checkers still had to do a ton of filtering to get the proper leads to the proper teams. We discovered that of the people who said “Yes” (i.e., they could be a Partner agency), 60% were in reality not an agency. So if 10 people said “Yes” on the form, only 4 of them would actually be a marketing agency. As you can imagine, that manual data scrubbing takes time and isn’t an efficient way to scale. So we decided to run a test on the form to see if we could help people better identify themselves as a Partner agency from the get-go.

The Methodology

The test revolved entirely around the form copy. Here’s what the original question said:

Control

SDavidson_Blog_Post-_Original_Form_Field 

In our experiment, we ran an A/B test on one of our landing pages comparing Treatment A and Treatment B against the control — the only difference between the landing page versions was this question. You can see below that we played with copy changes as well as visual presentation in the form field.

Treatment A

SDavidson_Blog_Post_Form_A_Variation

Treatment B

SDavidson_Blog_Post_Form_B_Variation

Next up is what we found.

What We Found

Both treatments showed a huge improvement in data accuracy over our original 60% false positive rate. Treatment A had a 26% false positive rate (meaning that of 100 people saying “Yes” on the form, 26 were NOT an agency), while Treatment B had a 22% false positive rate. After testing for statistical significance, we decided to replace this question on all our forms with Treatment B’s field format. Since replacing this form field, the false positive rate has gone down even further. 

While this may seem like a trivial test, it’s had a huge impact on our business. Now, instead of getting caught in this data-scrubbing bottleneck, more leads are going to the right sales reps faster. And the faster they can respond to a qualified lead, the more relevant conversation they can have with the lead. So this didn’t just help us get cleaner data at scale — it helped make our sales process more lovable. Talk about a win-win!

optimize marketing channels

Oct

1

2014

3 Real-Life Examples of Incredibly Successful A/B Tests

experiment_examples

Whether you’re looking to increase revenue, sign-ups, social shares, or engagement, A/B testing and optimization can help you get there. But for many marketers out there, the tough part about A/B testing is often finding the right test to drive the biggest impact — especially when you’re just getting started.

So, what’s the recipe for high-impact success? (more…)

Sep

24

2014

The Psychology of Color: How It Affects the Way We Buy [Infographic]

color-wheel-eye

Humans are visual creatures — so visual, in fact, that color plays a much bigger role in influencing what we purchase than we might think.

There’s a reason companies test the colors of things like advertisements, banner ads, and call-to-action (CTA) buttons. When we did a (more…)

Aug

14

2014

4 Fresh Ways to Squeeze More Conversions Out of Your Blog

Published by in category A/B testing, Blogging | Leave a Comment

183712392

This post originally appeared on the Insiders section of Inbound Hub. To read more content like this, subscribe to Insiders.

Allow me to make an assumption about you: At one time or another, you’ve done something because your friends were doing it. (more…)

Aug

13

2014

Landing Page Best Practices You Should Still Test For Yourself

test-tubes

Wouldn’t it be nice if people in your industry did some A/B tests, talked about or published their results, and then you could just replicate their tactics and get the same results? Think of how much bandwidth it would save you and your team!

But unfortunately, just because something (more…)

Aug

1

2014

Facebook, OkCupid, and the Ethics of Online Social Experiments

Published by in category A/B testing | Leave a Comment

guinea-pig

Recently, two different social websites admitted they ran experiments on users who had no idea they were being experimented on. Facebook ran tests to see if they could affect users’ emotions, and OkCupid ran tests mismatching users to see if they would interact with each other differently. (more…)

