
Jul 15, 2017

How We Generated 1 Million Facebook Video Views: A HubSpot Experiment

Published in: A/B testing, Daily, Social Media

[Image: social-media-experiment-coverage-compressed.jpg]

Gone are the days when social media publishing and engagement could be tacked onto the daily responsibilities of an intern — as were many of the first roles in social media.

Today’s growth-minded organizations need a team of people ideating, creating, publishing, and promoting content on social media to drive meaningful returns on investment — and this requires time, effort, and creativity.

Manage and plan your social media content with the help of this free calendar template.

Marketers are realizing this and dedicating more time, resources, and staff headcount to social media. In the 2017 State of Inbound report, more marketers said they planned to add social channels like Facebook, Instagram, and YouTube to their marketing efforts than in the previous year’s survey.

Here at HubSpot, we’re constantly evaluating and changing the way we create content on social media to adapt to the way people want to stay connected. As our audience’s preferences for social media content evolve each year, so does our strategy.

In our survey, nearly half of respondents said they wanted to see more social media content — and more videos, especially. So we’ve run a few experiments digging into what our audience wants to see, how they want to engage, and where they want to interact on social media.

We have our strategies, process, and results to share that might inform your strategy, but our greatest piece of advice for any social media team — no matter the size — is to always experiment. The social media space changes so much every month — it’s important to figure out what works, but it’s also important to remain agile so you can try new things whenever you can.

How We Increased Video Views 20X by Creating Native Social Media Content

Social media has changed.

That might read like the biggest understatement in the world, but hear me out.

As part of our previous social media strategy, our posts were connected to lead generation goals — and most had a strong tie to our brand and promoting our content. Now, our content is all about our audience — and not all about us. We needed to refocus and remember that our audience members are people, just like us. If we wouldn’t want to see a piece of content in our Facebook News Feeds, why would our audience? We wanted to test the effect of focusing our content on our audience — what they want to learn about, what their goals are, and even what struggles they face.

In short, we wanted to be more social, and less promotional.

This doesn’t mean we recommend doing away with sharing blog content or ebooks on social media entirely. After all, it’s hard to come up with new ideas for creating social media videos to share every day of the week. We’re just saying you shouldn’t post a link to a blog post or ebook on Facebook and call it a day. Instead, get inspired by the ideas and salient points, and repurpose your content into Facebook videos, Instagram albums, or Snapchat Stories. You can still use the good ideas — but use them to create native social media content that performs better for the medium.

If your current social media strategy sounds like our previous, all-about-us approach, don’t worry — read on to learn how we’ve changed things up.

1) Different Video Topics

The Goal:

We examined our audience and learned what they actually engaged with under our previous social media strategy. Then, we researched the broader social media and digital landscapes. We wanted to learn what marketers and salespeople were already engaging with and finding relevant, and how we could create content more specifically for them, instead of simply distributing our content on social channels to serve our own goals.

The Experiment:

We created a list of topics and headlines we believed our audience would respond well to, ones that were more lifestyle- and culture-oriented, and immediately started creating content. Then we brainstormed culturally relevant, popular topics and worked out how to present them creatively, in a way that made sense for our audience and their world.

Our audience is made up of marketers and salespeople who want to learn how to grow and get better at their jobs — so instead of using our social media channels to simply push out content we were producing on our blogs, offers, and external channels, we’ve started creating content specifically for our Facebook audience based on what we know about what they like (like this video about making coffee to improve productivity — two things busy people love):

The Results:

We went from an average of 50,000 video views per month to 1 million views in our first month performing these experiments. Our engagement rate also shot up as the content resonated with our audience and they started liking, commenting, and sharing our posts.

Take a look at our engagement rates from our previous social strategy (orange indicates reach, and pink indicates clicks):

[Image: social-strategy-v1.png]

And here’s what our views and engagement rate looked like under our new strategy:

[Image: social-strategy-v2.png]

Pretty big jump, huh?

Key Takeaway for Marketers:

When you start evaluating how to generate more Likes, comments, and shares from your Facebook audience, think about how you yourself use the platform. You might not be as interested in sharing a post that’s highly specific to one brand or organization, but you might engage with a post that’s highly relevant to you, right?

Conduct some detailed persona research, analyze your Facebook audience insights, and learn more about how they’re spending time on the platform instead of simply using Facebook as a means to only promote what you’re doing. Create social media content specifically for your audience, and you’ll get better results.

2) Different Video Design Devices

The Goal:

We wanted to test our videos to see if different designs and formats would lead to different engagement rates.

The Experiment:

We tested the effects of some new design devices. These included starting videos with a human face, putting title bars throughout the duration of our videos, ensuring there were captions throughout, adding a “Best with Sound on” animation, and providing a CTA with the goal of audience engagement.

  • Human face: We felt that people would respond best to a human face as they would feel connected. Here’s an example.
  • Title bars: People are scrolling so fast we wanted to grab their attention and make sure they knew what our video was about in that quick motion. It also helps if someone gets distracted during the video, they will always know the topic. Here’s an example.
  • Subtitles: Our historical data showed that about 95% of people watched our videos without the sound on. We wanted to meet them where they were at and make sure that our videos could still be watched in that format. Here’s an example.
  • “Best with Sound On” animations: Knowing our audience primarily watches with the sound off, we wanted to make sure our videos that benefitted from sound were noted as such. Here’s an example.
  • Call-to-action: To help facilitate how people could engage with our video, we ask them questions, or ask them to respond in some way to our video. Here’s an example.

The Results:

In January 2017, the average Facebook video earned 4,500 views. By May 2017, our average number of Facebook video views had increased to 56,000 per new video. Additionally, our average number of video engagements (Likes, comments, clicks, and shares) during this time period increased from 100 to 496. Finally, the percentage of Facebook viewers who watched our videos with the sound off decreased from 95% to between 60% and 70% per video.

Key Takeaway for Marketers:

The design devices we added are meant to make viewing easier for the audience, and they’re crafted based on how people actually use social media these days. We’re serving up videos the way our audience wants to see them, and the results show that it’s working. Use different devices and tools, such as the ones we tried, to make your videos easier to consume — no matter where or how your audience is watching.

3) Different Video Sizes

The Goal:

We wanted to create videos that are eye-catching and attract our audience’s attention in the Facebook News Feed, increasing our video views and engagement on the platform.

The Experiment:

We’ve started experimenting with different video sizes and formats to improve engagement and increase video views. The default Facebook video size has an aspect ratio of 16:9 (a rectangle), like the video below. When viewers click the video to watch on mobile devices (where roughly 90% of Facebook users access the platform), it only takes up part of the center of their screen in the News Feed:

[Image: facebook-aspect-ratio-vox1.png]

And when viewers click in to watch the video, it takes up so little space that Facebook queues up another video for them to watch next — or potentially navigate away to if they get bored with the original video:

[Image: facebook-aspect-ratio-vox2.png]

Conversely, when we post Facebook videos with a 1:1 aspect ratio, the video takes up 78% more space on the mobile News Feed — and it takes up more space when the viewer clicks to watch it, too.
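That 78% figure is consistent with simple geometry: at a fixed News Feed width w, a 16:9 video renders only 9w/16 ≈ 0.56w tall, while a 1:1 video renders the full w, and w ÷ (9w/16) = 16/9 ≈ 1.78, or roughly 78% more on-screen area.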

[Image: facebook-aspect-ratio-hubspot1.png] [Image: facebook-aspect-ratio-hubspot2.png]

The Results:

We haven’t experimented with mobile Facebook video sizes enough to report on it — yet. Luckily, our friends at Buffer have us covered — they partnered with Animoto to experiment with optimal Facebook video sizing for mobile earlier this year.

Buffer and Animoto found that square videos (1:1) outperformed landscape videos (16:9) in both views and engagement. Over the course of the experiment, square videos achieved 30-35% more views and 80-100% more engagement than the landscape format. You can dig into Buffer’s blog post for all of the findings — including the interesting note that landscape videos outperformed square videos for desktop users.

Since the odds are that most of your Facebook fans will access your content on mobile devices, filming videos in a square aspect ratio will most likely drive more engagement and more video views for your content on the platform. But, if you’re specifically targeting desktop users for a campaign or ad, landscape videos might be a better choice.

Key Takeaway for Marketers:

Film videos in 1:1 aspect ratios to take up as much space as possible on your audience’s mobile phones — because that’s where they’re interacting with you the most. Next, we’ll be trying to film in a 9:16 aspect ratio to take up the entire screen — like an Instagram or Snapchat Story — and we’ll keep you posted on how those perform in future posts.

TL;DR: Native Social Media Content

Overall, we averaged 50,000 total Facebook video views per month last year, and this year so far, we’re averaging about 1 million total video views per month. Some of these additional views may be attributable to Facebook ad spend on another video campaign (more on that in a subsequent post), and while paid ads and sponsored posts are part of the game, creating good content matters too: we think our overhaul of how we create Facebook videos was the primary growth lever.

Think about creating content for social media the way you think about optimizing blog posts with on-page SEO elements.

You know all about on-page SEO — how to create titles, headers, meta descriptions, and URL structures that help your blog posts and webpages rank in search engines. Think about these video devices and strategies like on-page SEO — but for social media.

Subtitles, title bars, animations, and video sizes all work together to make it easier for your audience to view your videos and interact with them the way they want to — primarily on mobile devices, and without turning up the volume. And of course, engaging topics get viewers interested in clicking and watching something on their News Feeds in the first place.

Next Steps

We’re going to keep iterating on what works, scrap what doesn’t, and brainstorm more ways to keep our audiences engaged and entertained by our content. The ultimate goal of social media is to be just that — social — and we want to hear from our audience, learn what they like and dislike, and keep creating cool stuff for them to enjoy.

What experiments is your social media team running? Share with us in the comments below.



Jun 24, 2017

We Asked Our Audience What They Really Think of PDF Ebooks: A HubSpot Experiment

Published in: A/B testing, Content Marketing, Daily

[Image: pdf-preferences-experiment-compressed.jpg]

I don’t know about you, but I barely print anything anymore.

Seriously, think about it — when’s the last time you had to type Command + P and print out a document? Between e-tickets, virtual payment options, and online signature tools, I think the last thing I printed out was the lease for my apartment.

So you can imagine my surprise when HubSpot’s audience started telling us they still like to print out our ebooks — which are often 20 or 30 pages in length — instead of viewing them on a web page.

In 2017 — during the era of self-driving cars, augmented and virtual reality, and artificial intelligence — our team here at HubSpot is constantly striving to test and implement the most modern techniques for content creation to provide cool, useful resources for our audience. But as it turns out, our perceptions of what our audience actually values when they download our content were a little … off.

In this post, I’ll dive into our hypothesis, how we tested it, and what we’re learning about our audience — and how they actually like to consume our content.

What We Do

I work on HubSpot’s Marketing Acquisition team creating content offers — such as our downloadable ebooks, guides, and templates — that our audience exchanges their contact information to download.

If you’re familiar with the inbound marketing methodology we’ve been teaching here at HubSpot for more than 10 years, I operate in the “Convert” stage of the process of helping new people discover and learn about HubSpot:

[Image: inbound_methodology_title-2.png]

When a person happens upon HubSpot for the first time online — via a blog post like this one, through social media, or by conducting a Google search — they might see a bold, brightly-colored call-to-action (CTA) encouraging them to learn more about a particular topic or product.

And in order to get that information — from an ebook, a guide, a template, a webinar, or an event — the person has to hand over their contact information. This ensures they can receive an emailed version of the content offer or event registration, and it also converts them from a visitor into a lead.

My job is to create content that visitors are so interested in learning more about that they exchange their phone number, email address, and professional background information. And to make sure we keep converting visitors into leads for the health of HubSpot’s business, I make sure that ebooks, guides, and events are helpful, fascinating, and ultimately educate our audience on how to do inbound marketing.

What We Wondered

For the most part, my team’s job has entailed creating PDFs that visitors can download once they submit a form with their contact information.

More specifically, this has meant creating a lot of PDFs.

And although people were filling out forms and downloading our content offers, we started wondering if we should offer them something different — something more cutting-edge — than a file format created back in 1993. And we wondered if changing the format of our content offers would change conversion rates, too.

We decided to run a survey — and a little test.

We wanted to know if the core persona we market these content offers to still liked PDFs and found them useful. And how better to find out than by creating an offer?

I created two different versions of the same content offer — one in PDF format, and one in web page format. Then, once someone downloaded the offer, we sent them a thank-you email asking which format they preferred, and why.

What We Learned

More than 3,000 individuals submitted their information to access the offer, and roughly 9% responded to our question, which gave us more than 300 responses to learn from.

And much to our surprise, 90% of the respondents preferred downloading a PDF to reading our content on a web page.

[Image: bethany-survey-pdf.png]

We gleaned a ton of valuable information about our core audience from this survey, and the qualitative feedback was incredibly helpful, too. Our key takeaways about format preferences were:

  1. Our core persona likes to print offers.

  2. People viewing our content want to be able to download it and come back to it later.

  3. People don’t think our web page offers look as good as PDFs.

  4. Some people are potentially defaulting to the format they know best.

  5. People liked having both print and online versions.

It’s incredibly helpful to learn what’s going on behind the decisions and choices our audience makes to inform future strategy when it comes to content creation. But this information leaves us with a challenge, too: How do we get our audience excited about content living on interactive web pages, too?

Content living on web pages can be crawled by Google to improve websites’ domain authority (and SEO superpowers) — and PDFs can’t be. So we’re making it our mission to keep offering our audience different options for consuming content the way they want to — while innovating and testing new ways to offer content our core persona is just as excited about in a web-based format.

I’ll be back with more details about that next experiment, but in the meantime, download one of our latest content offers, and let us know if you like the format in the comments.

What’s your opinion? PDF or web page? Share with us what you learned in the comments below.



May 17, 2017

Disproving Best Practices: The One- vs. Two-Column Form Test

Published in: A/B testing, Daily, Landing Pages

[Image: disproving-best-practices.png]

A few months ago, I took the stage at Digital Summit Dallas to talk about blog conversion rate optimization (CRO). The session right before mine was led by Unbounce Co-Founder Oli Gardner — a household name for those of us in the CRO industry. Needless to say, it was a tough act to follow. 

In his session, “Frankenpage: Using A Million Little Pieces of Data to Reverse Engineer the Perfect Landing Page,” Oli shared lots of great data-backed tips for landing page optimization. In discussing best practices for conversion forms, he talked about how two-column forms weren’t ideal. 

What’s the Beef With Two-Column Forms?

Oli isn’t the only one to frown upon the use of two-column forms. Baymard Institute, a usability research company, published research to the same effect a few years back, and ConversionXL Founder Peep Laja has also asserted that one-column forms perform better.

Peep’s colleague Ben Labay even published a study about the superiority of the one-column form over multi-column forms. The study showed that users complete the linear, single-column form an average of 15.4 seconds faster than the multi-column form. While speed is not directly tied to form completion, the data suggests that if the single-column form is faster to complete, fewer people will abandon it, garnering more conversions. It all boils down to user experience.

But Oli’s advice to avoid multi-column forms originally caught my attention because we had just redesigned HubSpot’s demo landing page, one of the most important landing pages on our website, and switched from a one-column to a two-column form in the process.

The thing that stuck out to me was that in switching to two columns, we had actually improved the conversion rate of our page by 57%. Now to be fair, the form wasn’t the only variable we manipulated in the redesign (we refreshed the design and made some copy tweaks as well), but it still made me wonder whether two-column forms were really all that bad.

So I put it to the test. 

The One- vs. Two-Column Form Test

Using HubSpot’s landing page A/B testing tools, I pitted the two-column form (the control) against the one-column form (the variant). Here’s how they looked …

Control (Two-Column Form)

[Image: demo-lp-control-two-column-form.png]

Variant (One-Column Form)

[Image: demo-lp-variation-one-column-form.png]

So “best practices” aside, which do you think performed better?

And the Winner Is …

… not the one-column form. In fact, the two-column form converted 22% better than the one-column form, a statistically significant result at a 99% confidence level.

Surprised? I wasn’t. Just look at the length of that one-column form! Yes, HubSpot’s lead-capture forms are long (13 fields to be exact), but they’re long by design. Through our experience, we’ve learned that having more fields helps us better qualify our leads, and weed out unqualified ones.

But a 13-field form doesn’t exactly lend itself to a one-column design, which is why I think for us, the two-column form works better. The theory is that the one-column form, despite having the same number of fields, looks longer, so visitors are much more likely to get scared off before completing it.
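If you’d like to sanity-check a winner like this yourself rather than take a testing tool’s word for it, a two-proportion z-test is one standard approach. Here’s a minimal sketch; the visitor and conversion counts are hypothetical, since we didn’t publish the raw sample sizes from our test:

```python
# A minimal two-proportion z-test. The counts below are hypothetical --
# the real sample sizes from the HubSpot test weren't published.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical: two-column converts 488 of 8,000 visitors (6.1%) vs.
# one-column at 400 of 8,000 (5.0%) -- a 22% relative lift.
z, p = two_proportion_z_test(488, 8000, 400, 8000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.01 -> significant at 99% confidence
```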

Since we ran the test, we’ve actually switched to a kind of hybrid form, with elements of both a one- and two-column form, to make our two-column form a bit more user friendly. Our old two-column form is on the left, and our new hybrid form is on the right.

[Image: two-column-vs-hybrid-form.png]

Questioning “Best Practices”

Any CRO practitioner worth their salt knows there’s really no such thing as best practices, and that you should test everything yourself (which, coincidentally enough, was a major theme in the talk I delivered after Oli’s).

In fact, Oli and Peep will be the first ones to tell you that while they may share certain CRO findings and trends from their experience, there are no sure things. That’s why testing things for yourself is so important. What works better for one site might not necessarily work better for yours; that’s fundamental to CRO.

And in my opinion, running those tests to figure out what works for you is what makes conversion rate optimization so much fun. Especially when the results challenge what the experts say 😉 



Apr 7, 2017

The Beginner's Guide to Conversion Rate Optimization (CRO)

Published in: A/B testing, Daily, marketing automation, SEO

[Image: beginners-guide-conversion-rate-optimization-compressed.jpg]

Today, most marketing teams are structured to drive traffic towards websites, which then converts into leads for the sales team to close. Once this process starts to deliver results, marketers then seek to generate even more traffic, and hopefully even more success.

An oversimplification, but that’s the standard marketing playbook. Few marketing teams focus on getting more from existing traffic. That’s where conversion rate optimization (CRO) comes in.

In this blog post, we’ll teach you all about CRO — what it achieves, why you should do it, and how your team can execute it. We’ll explain how you can drive more results from your existing traffic so your content can work smarter, and not harder, for you.

What Is Conversion Rate Optimization (CRO)?

I’m glad you asked. Many websites are designed to convert website visitors into customers. These conversions occur all over the website — on the homepage, pricing page, blog, and landing pages — and all of these can be optimized for a higher number of conversions. The process of optimizing those conversions is exactly what CRO entails.

CRO is a huge, often untapped opportunity for marketing teams, and you might be surprised by the outsized impact you could deliver by fine-tuning your website for conversions.

When Is Conversion Rate Optimization (CRO) Right for Your Business?

Once your sales and marketing engine attracts website visitors who consistently convert into leads for your sales team, you should start thinking about CRO.

Most businesses have a finite demand for products and services, so it’s imperative that you make the most out of your existing website traffic. Tools like Google’s Global Market Finder can show you online search volume to give you an idea of your potential customer demand. Once you determine the threshold of your customer demand, it’s time to nail down how to get more out of your existing website traffic.

Below are three formulas to help you figure out how to tackle CRO at your company, and what goals to set:

  1. New revenue goal ÷ average sales price = # of new customers
  2. # of new customers ÷ lead-to-customer close rate (%) = lead goal
  3. (Leads generated ÷ website traffic) × 100 = conversion rate (%)

To help you understand the impact CRO could have on your business, here’s an example of the formulas in action.

If your website has 10,000 visitors per month that generate 100 leads and, subsequently, 10 customers each month, the visitor-to-lead conversion rate would be 1%.

But what if you wanted to generate 20 customers each month? You could try to get 20,000 visitors to your website and hope that the quality of traffic doesn’t decrease. Or, you could get more leads from your existing traffic by optimizing your conversion rate.

If you increased the conversion rate from 1% to 2%, you’d double your leads and your customers.

The table below shows the impact of increasing your website’s conversion rate:

                           Company A    Company B    Company C
  Monthly website traffic  10,000       10,000       10,000
  % conversion rate        1%           2%           3%
  Leads generated          100          200          300
  # of new customers       10           20           30
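Here’s the same arithmetic as a quick script, using Company B’s numbers from the table above; the revenue goal and price are assumed values you’d swap for your own:

```python
# The three CRO goal-setting formulas in code, with Company B's numbers.
new_revenue_goal = 20_000    # assumed: $20,000 in new monthly revenue
average_sale_price = 1_000   # assumed: $1,000 average sales price
close_rate = 0.10            # assumed: 10% lead-to-customer close rate
monthly_traffic = 10_000     # monthly website visitors

new_customers = new_revenue_goal / average_sale_price   # formula 1 -> 20
lead_goal = new_customers / close_rate                  # formula 2 -> 200
conversion_rate = lead_goal / monthly_traffic * 100     # formula 3 -> 2%

print(f"Customers needed: {new_customers:.0f}")
print(f"Leads needed: {lead_goal:.0f}")
print(f"Required visitor-to-lead conversion rate: {conversion_rate:.1f}%")
```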

The key point here? Trying to generate more website traffic isn’t necessarily the right approach. Think of it like a leaky bucket. Pouring more water into a leaky bucket won’t fix the root cause — you’ll just end up with a lot of waste. Conversion rate optimization is about getting more from what you have and making it work even better for you.

Ready to take the first steps towards CRO at your company? Check out the strategies below, and start testing.

8 Conversion Rate Optimization Strategies to Try

1) Create text-based CTAs within blog posts.

While it’s good practice to include a call-to-action (CTA) in your blog post, these sometimes fail to entice people to take the desired course of action. Banner blindness is a very real phenomenon as people become accustomed to ignoring banner-like information on websites. This lack of attention, coupled with the fact that website visitors don’t always read to the bottom of a blog post as they “snack” on content, means a new approach is required.

That’s where the text-based CTA comes in handy. Here at HubSpot, we ran a test with text-based CTAs — a standalone line of text linked to a landing page and styled as an H3 or an H4 — to see if they would convert more traffic into leads than regular CTAs at the bottom of a web page. Here’s one of ours below:

Manage and plan your social media content with the help of this free calendar template.

In HubSpot’s limited test of 10 blog posts, regular end-of-post banner CTAs contributed an average of just 6% of leads that the blog posts generated, whereas up to 93% of a post’s leads came from the anchor text CTA alone.

2) Include lead flows on your blog.

Another test you should consider is including lead flows on your blog. Essentially, these are high-converting pop-ups designed to attract attention and offer value. You can select from a slide-in box, drop-down banner or pop-up box, depending on your offer. We experimented with the slide-in box on the HubSpot blog, and it achieved a 192% higher clickthrough rate, and 27% more submissions than a regular CTA at the bottom of a blog post.

Head over to the HubSpot Academy to learn how to add lead flows to your blog posts. They can dramatically increase conversions on your website.

3) Run tests on your landing pages.

Landing pages are an important part of the modern marketer’s toolkit. A landing page is where a website visitor becomes a lead, or an existing lead engages more deeply with your brand. These pages play an important role on your website, so you should run A/B tests to get the most from them.

But what should you A/B test? We know that a high performing landing page can have a tremendous impact on a business, so at HubSpot, we make it easy to test variants and eke out more conversions. You can quickly and easily test website copy, content offer, image, form questions, and page design. Check out these tips for effective A/B testing and our A/B testing calculator.

4) Help leads immediately become marketing-qualified leads (MQLs).

Sometimes, your website visitors want to get straight down to business and speak with a sales rep, rather than be nurtured by marketing offers. You can make it easy for them to take this action (and immediately become a marketing qualified lead) with a combination of thoughtful design and smart CTAs.

Compelling, clear copy has the ability to drive action and increase conversions for your business. But which actions do you want to encourage so visitors can become MQLs?

Here at HubSpot, we discovered that visitors who sign up for product demos convert at higher rates than visitors who sign up for free product trials, so we optimized our website and conversion paths for people booking a demo or a meeting with a sales rep. Admittedly, this depends on your product and sales process, but our best advice is to run a series of tests to find out what generates the most customers. Then, optimize for that process.

The key takeaway is to look for ways to remove friction from the sales process. That being said, if you make it easy for people to book a meeting with sales reps, we do recommend further qualification before the call takes place, so the sales rep can tailor the conversation.

5) Build workflows to enable your sales team.

There are a number of automated workflows you can create that your colleagues in sales will thank you for. For instance, did you know it’s possible to send emails on behalf of sales reps, so leads can book a meeting with them at the click of a button? Or that sales reps can receive an email notification when a lead takes a high intent action, such as viewing the pricing page on your website? And if you work in ecommerce, you can send an email to people who abandon their shopping cart.

All of this is possible with marketing automation. Want to learn more? Master marketing automation with our helpful guide.

6) Add messages to high-converting web pages.

With HubSpot’s messages tool, it’s now possible to chat with website visitors in real-time. To increase conversions, you should add messaging capabilities to high-performing web pages, such as pricing or product pages, so leads convert rather than leave.

You can also make chatting action-based. For example, if someone has spent more than a minute on the page, you may want to automatically offer to help and answer any questions they may have.

HubSpot’s messages tool is coming in the spring of 2017, but you can apply to join the beta program here.

7) Optimize high-performing blog posts.

If you’ve been blogging for more than a year, it’s likely you’ll have some blog posts that outperform others.

The same is true at HubSpot — in fact, the majority of our monthly blog views and leads come from posts published more than a month ago. Blog posts are a big opportunity for conversion rate optimization.

To get started, identify the blog posts with high levels of web traffic, but low conversion rates. It may be that the content offer you’re promoting isn’t aligned with the blog post’s content, or your CTA could be unclear.

In one instance, we added a press release content offer to a blog post about press releases and saw conversions for that post increase by 240%.

Additionally, you should look at blog posts with high conversion rates. You want to drive more qualified website traffic to those posts, and you can do that by optimizing the content for search engines or updating the content to ensure that it’s fresh and relevant. If you’re a HubSpot customer, you can drive traffic to these pages from LinkedIn and Facebook using the ads add-on.

8) Leverage retargeting to re-engage website visitors.

It doesn’t matter what your key conversion metric is: The cold, hard truth is that most people on your website don’t take the action you want them to. By leveraging retargeting (sometimes known as remarketing), you can re-engage people who’ve left your website.

Retargeting works by tracking visitors to your website and serving them online ads as they visit other sites around the web. This is particularly impactful when you retarget people who visit high-converting web pages.

The normal inbound rules still apply — you need well-crafted copy, an engaging image and a compelling offer for retargeting to work. If you’re a HubSpot customer, you should take a look at how the AdRoll integration can improve your conversion efforts.

How to Get Started with Conversion Rate Optimization (CRO)

We’ve shared a ton of information in this post, and at this point, you may be thinking, “Where should I start?”

Here’s where the PIE framework comes in. Before starting a CRO project, we recommend prioritizing through the lens of PIE — rank each project based on its potential, importance, and ease. We used this framework at HubSpot with great results.

Use this framework to answer the following questions for every strategy outlined in the previous section, assigning each strategy a score between one and 10 (with one being the lowest and 10 the highest):

  1. How much total improvement can this project offer?
  2. How valuable will this improvement be?
  3. How easy will it be to implement this improvement? (Score so that a higher number means easier.)

Once you’ve assigned a score for each strategy, add up the numbers and divide by three — this gives an overall score that shows which projects will have the greatest impact. Then, work on the projects with the highest scores first. The framework isn’t perfect, but it’s easy to understand, systematic, and a great way to communicate to the rest of your colleagues which CRO projects are being selected and why.
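As a concrete sketch, here’s what that scoring could look like in code; the project names and scores are hypothetical:

```python
# PIE prioritization sketch: rate each project 1-10 on potential,
# importance, and ease (higher ease = easier), then rank by the average.
# Project names and scores below are hypothetical examples.
projects = {
    "Text-based CTAs in blog posts": (8, 9, 7),
    "Lead flows on the blog":        (7, 6, 9),
    "Retargeting campaign":          (6, 7, 4),
}

def pie_score(ratings):
    potential, importance, ease = ratings
    return (potential + importance + ease) / 3

# Work on the highest-scoring projects first.
for name, ratings in sorted(projects.items(),
                            key=lambda kv: pie_score(kv[1]), reverse=True):
    print(f"{pie_score(ratings):.1f}  {name}")
```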

Want to learn more about the PIE framework? Take a look at this explanation from WiderFunnel.

What’s next?

There are a lot of “best practices” out there, but ultimately, you need to find out what your customers respond to, and what drives results for your business. Here are three follow-up actions to get started with CRO today:

  1. Use the three formulas to start the CRO conversation.
  2. Leverage the PIE framework to help prioritize your strategy.
  3. Make CRO someone’s responsibility.

What CRO strategies does your business leverage? Share with us in the comments below.



Nov 22, 2016

4 Common A/B Testing Mistakes (And How to Fix Them)

[Image: AB Testing Mistakes Carl.jpg]

When you’re creating content for the web, it’s easy to make assumptions about what you think your audience might respond to — but that’s not necessarily the right mentality.

Enter A/B testing: one of the easiest and most popular forms of conversion rate optimization (CRO) testing known to marketers. And while many businesses have seen the value in using this type of validation to improve their decision making, others have tried it, only to be left with inconclusive results — which is frustrating, to say the least. Download our free introductory guide to A/B testing here: <http://offers.hubspot.com/an-introduction-to-ab-testing/>

The trouble is, small mistakes made during A/B testing can lead to round after round of incremental optimizations that fail to produce meaningful results. To combat that, I’ve outlined some of the most common A/B testing mistakes (as well as their remedies) below. These tips are designed to help you keep your testing plans on track so you can start converting more visitors into customers, so let’s dive in …

4 Common A/B Testing Mistakes (And How to Fix Them)

Problem #1: Your testing tool is faulty.

Popularity is a double-edged sword — it’s true for high schoolers and it’s true for A/B testing software.

The ubiquity of A/B testing has led to a wide range of awesome, low-cost software for users to choose from, but it’s not all of equal quality. Of course, differing tools offer differing functionality, but there can also be some more tricky differences between tools. And if you’re unaware of these differences, your A/B tests may be in trouble before you even get started.

For example, did you know that some testing software can significantly slow down your site? This decreased speed can have a harmful impact on your site’s SEO and overall conversion rates.

In fact, on average, just one second of additional load time will result in an 11% decrease in page views, and 7% decline in conversions. This creates a nightmare scenario where the websites you were hoping to improve through A/B testing are actually hindered by your efforts.

It gets worse: Your selection of A/B testing software can actually impact the results of your tests, too. Entrepreneur and influencer Neil Patel found that the A/B testing software he was using showed significant differences between variants, but when he implemented the winning page, conversions didn’t change. His problem turned out to be a faulty testing tool.

So with all these hidden pitfalls, what can you do to make sure your A/B testing software is working fine?

The Fix: Run an A/A test.

Prior to running an A/B test, you should run an A/A test with your software to ensure it’s working without impacting site speed and performance.

For the uninitiated, an A/A test is very similar to an A/B test. The difference is that in an A/A test both groups of users are shown the exact same page. That’s right, you need to literally test a page against itself. While this may seem silly at first, by running an A/A test you will be able to identify any distortionary effects caused by your testing software.

An A/A test is the one time you want your results to be boring. If you see conversion rates drop as soon as you start testing, then your tool is probably slowing down your site. If you see dramatic differences between the results for the two pages, then your software is likely faulty.

Problem #2: You stop testing at the first significant result.

This is the statistical equivalent of taking your ball and going home. Unfortunately, when it comes to A/B testing, stopping your test as soon as you see a statistically significant result is not just bad sportsmanship; it also produces completely invalid results.

Many tools encourage this behavior by allowing users to stop a test as soon as statistical significance has been hit. But if you want to drive real improvement to your site, you need to fight the urge to end your tests early. This may seem counterintuitive, but the more often you check your test for significant results, the more likely you are to see incorrect results.

The issue here is false positives: these are results that incorrectly show a difference between pages. The more often you check your results, the more likely you will hit a result that has been thrown off by false positives.

This isn’t an issue if you stay calm and don’t end your test early. However, if you end your test at the first sign of a significant result then you’ll likely fall victim to deceptive false positive outcomes.

Analytics firm Heap published the results of a simulation, which displays how ending your test early compromises your results.

Using standard significance testing, results from a 1,000-user test that are checked only once carry a 5% chance of a false positive. If the tester checked the same group of users 10 times, the chance of a false positive balloons to 19.5%. If checked 100 times, that 5% chance increases eightfold, to 40.1%.

These are good numbers to remember next time you get excited about early promising results.
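If you want to see this effect for yourself, it’s easy to simulate. The sketch below runs repeated A/A tests (both variants share the same true conversion rate, so any “significant” result is a false positive) and peeks at a running z-test at several checkpoints. The parameters are illustrative assumptions, not Heap’s exact setup:

```python
# Simulating how "peeking" inflates false positives. Both variants have
# the same true conversion rate (an A/A test), so every significant
# result is, by construction, a false positive.
import numpy as np

rng = np.random.default_rng(0)

def false_positive_rate(n_users=1000, n_peeks=10, rate=0.10,
                        z_crit=1.96, n_trials=2000):
    checkpoints = np.linspace(n_users / n_peeks, n_users, n_peeks).astype(int)
    hits = 0
    for _ in range(n_trials):
        a = rng.random(n_users) < rate   # variant A conversions
        b = rng.random(n_users) < rate   # variant B, same true rate
        for n in checkpoints:
            pooled = (a[:n].sum() + b[:n].sum()) / (2 * n)
            se = np.sqrt(2 * pooled * (1 - pooled) / n)
            if se > 0 and abs(a[:n].mean() - b[:n].mean()) / se > z_crit:
                hits += 1   # an impatient tester would stop right here
                break
    return hits / n_trials

for peeks in (1, 10, 100):
    print(f"{peeks:3d} peeks -> ~{false_positive_rate(n_peeks=peeks):.0%} false positives")
```

With a single look, the rate stays near the nominal 5%; with repeated looks, it climbs sharply, which is the same effect behind Heap’s numbers.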

The Fix: Stick to a predetermined sample size.

To combat false positives, discipline is key. You should set a sample size in stone prior to running an A/B test and resist the urge to end your test early (no matter how promising your results look).

Don’t fret if you’re scratching your head on how large your sample needs to be. There are plenty of tools available online for calculating a minimum sample size. Some of the most popular are from Optimizely and VWO.

One last note on sample size: Keep in mind that you’ll need to pick a realistic number for your page. While we would all love to have millions of users to test on, most of us don’t have that luxury. I suggest making a rough estimate of how long you’ll need to run your test before hitting your target sample size.
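If you’d rather compute a rough number than use an online calculator, the standard normal-approximation formula for comparing two proportions gives a reasonable estimate. This is a textbook approximation, not the exact method Optimizely or VWO use, and the baseline rate and detectable lift below are assumptions:

```python
# Approximate minimum sample size per variant for a two-proportion test
# (normal approximation; alpha = 0.05 two-sided, 80% power by default).
from math import ceil
from statistics import NormalDist

def sample_size_per_variant(baseline, lift, alpha=0.05, power=0.80):
    """n per variant to detect an absolute `lift` over `baseline`."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96
    z_beta = NormalDist().inv_cdf(power)            # ~0.84
    p_bar = baseline + lift / 2                     # average of the two rates
    n = ((z_alpha + z_beta) ** 2 * 2 * p_bar * (1 - p_bar)) / lift ** 2
    return ceil(n)

# Assumed example: detect a lift from a 5% to a 6% conversion rate.
print(sample_size_per_variant(0.05, 0.01))  # roughly 8,000+ users per variant
```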

Problem #3: You’re only focusing on conversions.

When you’re deep in the weeds of an A/B test, it’s easy to focus on the trees and miss the forest. Put more literally, in A/B testing, it is easy to concentrate only on conversions and lose sight of the long-term business results produced.

While adding new copy to your site may produce higher conversion rates, if the converted users are of lower quality then a higher conversion rate may actually create a negative result for the business.

It can be easy to fall victim to vanity metrics while A/B testing, yet these metrics will distract your focus away from the actual revenue-driving results. If you’re testing a call-to-action that leads to a landing page, you should not just focus on conversions to the landing page. Instead, measure the leads produced from the page and ideally try to tie those leads to the revenue they produce.

The Fix: Test a hypothesis.

Before you start your A/B test you should outline a hypothesis you wish to validate or disprove. By focusing this hypothesis on a KPI that drives actual business results, you’ll avoid being distracted by vanity metrics.

Your A/B test should be judged on its ability to affect this KPI, and not its impact on other associated figures. So if your goal is to increase sign-ups, always judge success by measuring sign-ups, not on clickthrough rates to the sign-up page.

When working to validate or disprove your hypothesis, don’t just throw out any results that aren’t statistically significant — use these results to inform your later tests, instead. For example, if a change to your page’s CTA showed a small, statistically insignificant improvement, then this could be a sign that you might be onto something. Try running further tests on your CTA and see if you can hit on one that produces a significant improvement.

Problem #4: You only test incremental changes.

The button color test may have ruined A/B testing, as this test’s popularity has made it the frame of reference for understanding how A/B testing should be utilized. But there’s more to the practice than that. In fact, while a large website might see a big return from adjusting something small like button color, for the vast majority of us, these small, incremental changes are not going to produce meaningful results.

A/B testing can force us to aim for miniscule improvements, but by focusing only on the incremental, we may be missing a much larger opportunity.

The Fix: Periodic radical testing.

A good rule of thumb? Periodically test radical changes to your page. (This practice has since been coined radical testing.) If you’re seeing weak conversion rates, then it’s probably a sign you should invest time in testing out a radical change rather than incremental changes.

Think of your testing efforts like a poker game: you’ll need to periodically bet big if you want to see a big return.

But before you run off singing the praises of radical testing, be aware that it has some drawbacks. First, it requires more upfront labor than a standard A/B test. Radical testing requires that you invest time drafting a major page redesign. Because of this time investment, I recommend conducting radical tests only periodically.

An additional pitfall of radical testing is that it makes it hard to pinpoint which factors are having the largest impact on your site. Radical testing can tell you whether a large page overhaul will impact your conversions, but it won’t pinpoint which individual changes drove those results — so keep that in mind before you get started.

These are a few of the most common A/B testing mistakes, but there are many, many more. Share some of the missteps you’ve seen in the comments below.



May 9, 2016

6 Conversion Experts Answer 20 of Your Most Important CRO Questions [Live Google Hangout]

Blog_header_image_-_resized_1.jpg

Whether you’re new to marketing or decades into your career, conversion rate optimization is an ever-changing topic and necessary asset in your marketing playbook.

Looking to learn more about your audience? Want to get more out of your existing resources by improving their performance? How about growing your business by improving lead flow? Wouldn’t that be nice?

An effective CRO strategy can help you achieve all that — without forcing you to crank out a bunch of new content. 

In this live Google Hangout, these six experts will teach you the most up-to-date CRO strategies and how to use different methods to get results. With your help building the agenda, we’re going to play “20 Questions” with today’s top CRO experts and learn how to start, where to start, and when to stop testing and optimizing your marketing efforts for lead conversion. 

  • When: Wednesday 6/1 @ 2 p.m. ET // 6 p.m. GMT // 11 a.m. PT for one hour
  • Where: Live Google Hangout
  • Hashtag: #CROhangout

Want to learn more about conversion rate optimization? Click here to save your seat for this live event.

Meet the Conversion Experts

Rand Fishkin, Wizard of Moz

Rand Fishkin

Rand Fishkin uses the ludicrous title, Wizard of Moz. He’s founder and former CEO of Moz, board member at presentation software startup Haiku Deck, co-author of a pair of books on SEO, and co-founder of Inbound.org. Rand’s an unsaveable addict of all things content, search, and social.

Larry Kim, Founder & CTO, WordStream

Larry Kim

Larry Kim founded WordStream in 2007. He bootstrapped the company by providing internet consulting services while funding and managing a team of engineers and marketers to develop and sell software for search engine marketing automation. Today he serves as company CTO and is a contributor to both the product and marketing teams.

Oli Gardner, Co-founder, Unbounce

Oli Gardner, Unbounce

Unbounce Co-Founder Oli Gardner has seen more landing pages than anyone on the planet. His disdain for marketers who send campaign traffic to their homepage is legendary. He is a prolific webinar guest and writer, and speaks internationally about Conversion-Centered Design where he is consistently ranked as the top speaker. 

Peep Laja, Founder, ConversionXL

Peep_Laja.jpg

Peep is an entrepreneur and conversion optimization expert with 10+ years of global experience. He has extensive experience across verticals: In the past he’s run a software company in Europe, an SEO agency in Panama, a real estate portal in Dubai, and worked for an international non-profit. 

Pamela Vaughan, Principal Optimization Marketing Manager, HubSpot

Pamela Vaughan

As HubSpot’s principal optimization marketer, Pam currently manages large-scale projects relating to CRO and SEO (with an expertise in blog/content optimization) on the HubSpot marketing team’s new optimization team. Her team’s goal is to optimize and grow traffic and conversions from HubSpot’s various marketing assets.

Michael Aagard, Senior Conversion Optimizer, Unbounce

Michael Aagard

For seven years, Michael has spent about 60 hours a week testing and optimizing websites to gain a deeper understanding of what really works in online marketing and CRO. He’s helped a multitude of clients from all over the world make more money online. In July 2015 he quit his career and joined Unbounce as Senior Conversion Optimizer.

Moderated by: Meghan Keaney Anderson, VP of Marketing, HubSpot

Meghan Keaney Anderson

As Vice President of Marketing at HubSpot, Meghan leads the content, product marketing, and customer marketing teams. Together with her teams, she’s responsible for the company’s blogs, podcast, and overall content strategy, as well as the company’s product launch and customer demand campaigns.

join a Google Hangout With CRO experts


Mar

14

2016

13 Case Studies That Prove the Power of Word Choice

Word_Choice_Conversion_Tests.jpg

Your website copy is responsible for more than just presenting your visitors with basic information. In fact, your words alone have the ability to influence how visitors feel about your brand, what they choose to click (or not click), and how your site ranks in search engines.

How can you ensure that it’s working the way you want it to? Testing. Sometimes the tiniest change in word choice can have a major impact on your conversion rates. While the difference between a checkout button that reads “Add to Cart” rather than “+Cart” might seem insignificant, it can significantly alter your website’s performance.

Still not convinced? Check out these 13 case studies that prove word choice does indeed matter when it comes to optimizing for conversions.

13 Case Studies That Prove Word Choice Matters for Conversions

1) Fab Increased Cart Adds by 49%

In order to optimize their sales, the online retail community Fab experimented with the “Add to Cart” button on their site. After changing the button from one that showed a picture of a shopping cart to one that spelled out the words “Add to Cart,” the website saw a 49% increase in cart adds.

fab.png

Source: Optimizely

Takeaway: In trying to increase conversions, the more obvious you can be, the better. Always make your desired action as easy as possible to achieve, and leave nothing to be inferred.

2) NuFACE Increased Sales by 90%

In an effort to increase sales, an anti-aging skin product company called NuFACE decided to offer free shipping on all orders over $75. The result was a 90% increase in sales.

Nuface_Variation.png

Source: VWO

Takeaway: The word “free” still works like magic. According to author Dan Ariely, it serves as a powerful emotional trigger that can cause us to instantly change our behavior. If you can find a legitimate way to include it on your website, you might just see a big change in conversions.

3) Monthly1K Increased Sales by 6.5%

Monthly1K — a software solution for entrepreneurs — wanted to increase the number of online courses they were selling. To do this, they experimented with a new headline that read: “How to Make a $1,000 a Month Business.” This replaced the old headline that read: “How to Make Your First Dollar.”

Monthly1K.png

Source: AppSumo

The result was nearly a 6.5% increase in sales.

Takeaway: Most people are optimists, and if you can promise them big returns on their investment, then invest they will. As long as you can back up these promises, don’t be afraid to use them to increase your sales.

4) GoCardless Increased Conversions by 139%

A UK-based direct debit provider called GoCardless tested two versions of their CTA, differing by only one word. The first read “Request a demo,” while the second read “Watch a demo.”

gocardless.png

Source: GoCardless

Ultimately, the company found that the second version led to a 139% increase in conversions.

Takeaway: Some words have a connotation that can cause visitors to hesitate. The word “request” draws up images of having to fill out forms and wait for responses, while the word “watch” implies a far quicker and more direct process. Pay careful attention to the emotional connection of each word you use, especially when it comes to your CTA.

5) JCD Increased Conversions by 18%

No matter what you’re selling, writing engaging copy is vital. The iPhone repair service JCD, for instance, saw this firsthand when they replaced their straightforward, factual web copy with an entertaining and humorous description of their services.

jcd.png

Source: Copyhackers

The result? Almost an 18% increase in conversions.

Takeaway: It’s important to include specific details, but be sure you’re writing them in a way that’s entertaining and engaging. Even giving your copy a small dose of personality can drastically increase the chances of it actually being read by a visitor.

6) Pink Pest Services Increased Conversions by 96%

The pest removal service Pink Pest Services altered their advertisement so that both the header and the copy below it focused on their free quote, rather than having the header promote the quote while the copy promoted a free report.

pink_pest_service.png

Source: Marketing Results

The result was a 96% increase in conversions.

Takeaway: Your copy needs to be consistent. Expounding on your first offer is far more effective than transitioning to a second offer under the same heading.

7) DaFlores Increased Sales by 27%

DaFlores is a company that sells and ships fresh flower arrangements through their website. In an effort to improve sales, the company added a sense of urgency by displaying the text “Order in the next [x] hours for delivery today.” In exchange, they saw a 27% increase in sales.

Takeaway: Oftentimes, urgency is the key to driving impulse purchases. By adding a sense of urgency to your website — whether through a limited-time sale or through guaranteed delivery, as DaFlores did — you can drastically increase your number of conversions.

8) Stride Increased Conversions by 112%

The folks at Stride decided to switch up their abandoned cart emails by focusing more on the customer and their needs, rather than the company. Notice the consistent use of “you” and “your” in the email copy below.

stride.png

Source: AWeber

As a result, they were able to rack up a 112% increase in conversions.

Takeaway: While it can be incredibly tempting to tout the accomplishments and strong points of your company, website, or services in your copy, you’ll generally be far better served by staying focused on the customer.

9) Raileasy Increased Email Opens by 31%

In this case, the online travel agency Raileasy wanted to improve the effectiveness of the emails sent to customers who abandoned their shopping carts. To do this, they used personalization to customize email subject lines based on the name of the destination the customer was shopping for.

RailEasy.png

Source: Which Test Won

As a result, Raileasy saw a 31% increase in email opens.

Takeaway: People are obviously interested in the item they added to their cart, or they never would have made it that far. Reminding them of that interest is a more effective way of re-engaging them than a generic abandoned cart email.

10) TextMagic Increased Conversions by 38%

The goal of one of TextMagic’s CTAs was to send people from the company’s homepage to its pricing page. The original text in their CTA read “Buy SMS Credits,” but when TextMagic changed this to read “View SMS Credits,” they saw nearly a 38% increase in conversions.

Takeaway: People are always a little wary of pressing buttons on the internet. If your text leads them to believe that pressing a button equals buying a product, they’re often less likely to follow through. Instead, ease hesitant customers through the process with more comforting language.

11) L’Axelle Increased Cart Adds by 93%

L’Axelle‘s goal was to increase the number of underarm sweat pads being sold on their website. To do so, they changed their ad copy from “Feel fresh without sweat marks” to a more action-oriented phrase: “Put an end to sweat marks!”

laxelle.png

Source: Kissmetrics

The result was a 93% increase in items added to cart.

Takeaway: It’s called a call-to-action for a reason. By making your copy direct and action-oriented, you’ll leave your visitors feeling more inspired and compelled to follow through.

12) Betfair Increased Conversions by 7%

The online betting service, Betfair, tested six different persuasion tactics on their website, including reciprocity, scarcity, commitment and consistency, liking, authority, and social proof.

While each page that employed these specific tactics fared better than the control group, the top performer was the website that employed social proof by pointing out the number of “Likes” Betfair had on their Facebook page.

betfair.png

Source: VWO

This variation in particular enjoyed a 7% increase in conversions over the control.

Takeaway: Social proof is a powerful tactic. If people believe that there’s a buzz about your product or service, they’re more apt to want to join the party themselves. Don’t be quick to rule out other persuasion tactics, though. All six of the strategies tested proved effective for Betfair, and which one works best for you will depend on your specific website and goals. (Click here for tips on how to add social proof to your landing pages.)

13) Bloomspot Increased Conversions by 20%

Last up, Bloomspot took the surprisingly obvious step of making the text on their landing pages match the images on their landing pages. For example, if the image was of a restaurant, the text mentioned deals on top restaurants in the area.

bloomspot.png

Source: Kissmetrics

As a result, they saw a 20% boost in conversions.

Takeaway: Images and text are two different means of conveying a message. When both convey the same message, you’re giving your visitors a double dose of your CTA and increasing the likelihood that they’ll convert.

As these case studies make clear, word choice matters — whether it’s in your CTAs, your headlines, or your body copy. Take these lessons to heart and use them to identify areas on your own site that may be falling victim to unclear or uninspired language.

Have another great case study to add to this list? Share your suggestions by leaving a comment below.

free ebook: optimizing landing pages

Jun

8

2015

How to Identify and Fix Friction on Your Landing Pages

Landing_Page_Friction.jpeg

You’ve spent a lot of energy (and budget) getting targeted traffic to your website.

Unfortunately, those visitors aren’t doing what you want them to do when they get there. 

But why?

Oftentimes this is the result of landing page friction — a barrier that prevents your visitors from completing the action you’d like them to take.

Whether your copy is too long, your button text isn’t compelling enough, or you’re lacking social proof, you need to start by identifying the points of friction before you can turn them around. 

To help you get a better grasp on what landing page friction looks like, when it can be used to benefit your business, and how to resolve it when it’s hurting your conversions, keep reading. 

When Negative Friction Is Good

A negative friction point, you say? Aren’t all friction points negative? I disagree — friction can be both good and bad. For example, a commonly cited cause of negative friction on landing pages is long submission forms.

If there’s a lot of information to hand over when visitors arrive on the landing page, there is little incentive for them to complete your call-to-action. This state is often referred to as “psychological resistance.”

Do visitors feel like the transaction is unbalanced? Are they paying too much to get what you are offering?

Despite long submission forms being a common friction point on landing pages, we at HubSpot argue that sometimes a long form is a good thing. That’s because those who commit to filling out a long form are commonly more interested — and often more qualified — than visitors who don’t.

In other words, friction functions somewhat like a method of exclusion. Let’s take social media ads, for example. Targeted Facebook ads deliberately exclude certain demographics to avoid wasting money on those who would never be interested in actually buying. Down the funnel, this goes a long way towards improving the lead-to-customer rate. 

Essentially, both long forms and targeted Facebook ads aim to reduce the number of unqualified submissions by leveraging friction and exclusion to deflect those who wouldn’t be a good fit for your company. 

To give you another example, Chris Brogan of Owner Media Group goes against conventional marketing practices by charging registrants $20 to attend a webinar.

The-20-dollar-webinar.png

While I’m not privy to Brogan’s webinar goals, it’s logical to assume that any person willing to shell out $20 for a webinar is likely to be more qualified and engaged. Essentially, this approach employs friction to help him weed out a lot of unqualified leads.

Takeaway: What Can We Learn From These Insights?

  1. You need to identify the parts of your landing page that are preventing quality leads from continuing down the funnel.
  2. You need to figure out the parts of your landing page that are moving poor quality leads down the funnel.

How do you go about doing this?

The answer: good testing.

Identifying Friction: Two Real Sample Tests

At HubSpot, we are great believers in ongoing testing — even when something is performing really well for us. Much like the sales mantra “always be closing,” we hold ourselves to the motto “always be testing.”

When it comes to reducing negative friction, there are many things on a landing page that you can test to boost your conversion rate. Here are a few examples:

  • Persuasive copy
  • Social proof
  • Security
  • Referral source personalization
  • Imagery 
  • Benefits
  • Visual triggers (arrows, pointing, etc.)
  • Discounts or money-back guarantees

To give you a better idea of how an effective test is carried out, we’ve detailed two in-depth examples alongside their results. 

1) The “Indicated Reading Time” Test

Have you ever read an article online and noticed an indicated reading time at the top? Maybe it said five minutes, or seven, but either way it worked to set your expectations before you started reading, right?

I’m currently running an experiment to test the effectiveness of the “indicated reading time” inclusion by comparing a normal image on a landing page to an image including an estimated reading time for the offer. 

To illustrate my experiment, take a look at the image below. As you’ll notice, the first variation contains just the ebook, while the second shows both the ebook and the reading time: 

reading-time-test-mglive-blog.png

In terms of the process of the experiment, here’s how things have played out so far:

Background

The inspiration for this test stemmed from a feedback email we received from someone who had downloaded an ebook. He explained that the image on the landing page led him to believe that it represented the length of the offer. From our point of view, the image of a book simply indicated that the offer was an ebook rather than a template or recording; however, it was clear there was confusion.

Problem

Could uncertainty about length be preventing people from downloading our ebook offers? Marketers are often strapped for time, so picking and choosing which pieces of content to read can be understandably difficult.

Hypothesis

While the word “ebook” often implies the length of an actual paper book, HubSpot ebooks are commonly under 25 pages. Trouble is, it didn’t seem as though our landing pages were communicating that. Since we were driving a lot of quality visitors to the page, failing to communicate the length of our content could result in us losing their interest — possibly forever.

Test

We ran a 50-50 A/B split-test for people who visited the landing page for one of our ebooks. One variation employed a regular ebook cover image, and the other displayed the cover image as well as a clock with the indicated reading time.

Results

So far, we’ve seen a 6% increase in submissions at 98% certainty. (Looking for an easy-to-use calculator to check your A/B test results? We recommend Get Data Driven’s A/B test significance calculator.)
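If you’d like to sanity-check results like these yourself, the math behind most A/B significance calculators is a standard two-proportion z-test. Here’s a minimal Python sketch; the visitor and submission counts below are hypothetical, chosen only to mirror a roughly 6% lift at about 98% certainty:

```python
from math import erf, sqrt

def ab_confidence(visitors_a, conv_a, visitors_b, conv_b):
    """Two-tailed confidence that variations A and B truly differ,
    via a standard two-proportion z-test."""
    p_a, p_b = conv_a / visitors_a, conv_b / visitors_b
    # Pooled rate under the null hypothesis that A and B perform the same
    pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    return erf(abs(z) / sqrt(2))  # e.g., 0.98 means 98% certainty

# Hypothetical counts: a 10.0% vs. 10.6% submission rate (a 6% relative lift)
print(round(ab_confidence(28000, 2800, 28000, 2968), 2))  # ~0.98
```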

Takeaway

After reviewing the results, it appears that we’re on to something. However, to ensure the validity of our results, we plan on replicating this experiment across a few more ebooks before declaring a winner. 

2) The “Form Redesign” Test (By: Yousaf Sekander)

Another great example of how to reduce friction points comes from Yousaf Sekander of RocketMill.

Looking for a way to increase conversions for one of his clients, Sekander conducted an A/B test to compare the original variation of their landing page against his optimized variation. Check out the image below to see the difference between the two pages:

RocketMill-test-example.png

To give you a better idea of how Sekander approached this test, I asked him to provide a run-through of his experiment. This is what he had to say:

Background

“A client of ours approached us to find out why visitors to their website were not moving along their sales funnel. The critical point identified was that the conversion rate on their forms was low.”

Problem

“We analyzed the inquiry form (see above) on Tchibo UK’s website and uncovered a handful of friction points. Although the form was actually quite simple, the format made it appear big and complicated. In addition to its overwhelming size, there was also no incentive for the prospects to fill it out. On top of the form complications, we also noticed that the scrolling navigation obscured the telephone number.”

Hypothesis

“Our CRO campaign manager, Bertram Greenhough, came up with a few form design ideas to reduce the friction. First, we would simplify the form by reducing it to one column and place it within a contrasting pop-up. The purpose of this would be to reduce the attention ratio, and redirect the visitor’s focus to both the messaging and the CTAs. In addition, we’d add a compelling question header, a bulleted list of unique selling propositions, a phone number, and an image of the product to clarify what the visitor would be inquiring about.”

Test

“To determine the influence of the form changes, we conducted an A/B test to compare both variations and identify which one converted better.”

Results

“After analyzing the results, we found that the simplified form saw over a 200% increase in conversions.”

The Simple Way to Get Started 

If, like me, you have tons of hypotheses you want to test, you’ll need to start with a framework that shows you what should get done first. I like to use the “PIE” system. This process requires you to create a simple Excel sheet, dump in all your ideas, and give them a score out of 10 in each of the following categories:

  • Potential. Are these types of pages your worst performers?
  • Importance. How crucial are these pages for visitor-to-lead?
  • Ease. Is it easy to set up? 

PIE-chart-600x370.png

Source: WiderFunnel

Average the three scores for each idea; the highest-scoring hypotheses are the ones to begin with. Once you generate a score for each potential hypothesis, you can set priorities and start testing.
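If you’d rather script this than maintain a spreadsheet, the scoring is trivial to automate. Here’s a minimal Python sketch; the hypothesis names and scores below are invented for illustration:

```python
# Score each idea 1-10 on Potential, Importance, and Ease (PIE),
# then rank by the average of the three scores.
hypotheses = {
    "Shorten the ebook form":          (8, 7, 9),
    "Add reading time to cover image": (6, 8, 8),
    "Redesign the homepage hero":      (9, 9, 3),
}

ranked = sorted(hypotheses.items(), key=lambda kv: sum(kv[1]) / 3, reverse=True)
for name, scores in ranked:
    print(f"{sum(scores) / 3:.1f}  {name}")
# The highest average score is where you start testing.
```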

At the end of the day, generating leads isn’t the easiest thing that marketers are tasked with, but it’s important nonetheless. Rather than allow landing page friction to negatively influence your conversion rates, don’t hesitate to explore different experiments such as the ones we detailed above. You never know until you test.  

Want to see a website through the eyes of a HubSpot marketer? We’re hosting an eight-minute live analysis of a website on June 11th at 15:00 BST / 10:00 EST. We’ll analyze the conversion potential of three different websites by looking at SEO, content, social, and design. Register here to watch.

Come along and watch us talk about websites like yours!

May

25

2015

5 Simple Ways to Optimize Your Website for Lead Generation

quick-lead-gen-wins.jpeg

Optimizing your website to generate leads is a no-brainer. But it’s not as simple as throwing a “click here” button on your home page and watching the leads pour in. (Unfortunately.)

Instead, marketers and designers need to take a more strategic approach. In this post, we’ll go over some quick ways you can optimize your website for lead generation that actually work.

To understand how to optimize our website, we’ll have to first gain a basic understanding of the lead generation process. What components are at play when a casual website visitor turns into a lead? Here’s a quick overview:

lead_generation_visualization.png

The lead generation process typically starts when a website visitor clicks on a call-to-action (CTA) located on one of your site pages or blog posts. That CTA leads them to a landing page, which includes a form used to collect the visitor’s information. Once the visitor fills out and submits the form, they are then led to a thank-you page. (Learn about this process in more detail in this post.)

Now that we’ve gone over the basics of lead generation, we can get down to the dirty details. Here are five simple ways to optimize your site for lead generation.

1) Figure out your current state of lead gen.

It’s important to benchmark your current state of lead generation before you begin so you can track your success and determine the areas where you most need improvement.

A great way to test out where you are is to try a tool like Marketing Grader, which evaluates your lead generation sources (like landing pages and CTAs), and then provides feedback on ways to improve your existing content.

You can also compare landing pages that are doing well with landing pages that aren’t doing as well. For example, let’s say that you get 1,000 visits to Landing Page A, and 10 of those people filled out the form and converted into leads. For Landing Page A, you would have a 1% conversion rate. Let’s say you have another landing page, Landing Page B, that gets 50 visitors to convert into leads for every 1,000 visits. That would be a 5% conversion rate — which is great! Your next steps could be to see how Landing Page A differs from Landing Page B, and optimize Landing Page A accordingly.
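That comparison is simple enough to automate across all of your landing pages. Here’s a quick sketch of the math, using the hypothetical numbers above:

```python
# Conversion rate = leads / visits, computed per landing page
pages = {
    "Landing Page A": {"visits": 1000, "leads": 10},
    "Landing Page B": {"visits": 1000, "leads": 50},
}

for name, stats in pages.items():
    rate = stats["leads"] / stats["visits"] * 100
    print(f"{name}: {rate:.1f}% conversion rate")
# Landing Page A: 1.0%, Landing Page B: 5.0% -- study B, then optimize A
```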

Finally, you could try running internal reports. Evaluate landing page visits, CTA clicks, and thank-you page shares to determine which offers are performing the best, and then create more like them.

2) Optimize each step of the lead gen process.

If your visitor searched “lawn care tips” and ended up on a blog post of yours called, “Ten Ways To Improve Your Lawn Care Regimen,” then you’d better not link that blog post to an offer for a snow clearing consultation. Make sure your offers are related to the page they’re on so you can capitalize on visitors’ interest in a particular subject.

As soon as a visitor lands on your website, you can start learning about their conversion path. This path starts when a visitor visits your site, and ends (hopefully) with them filling out a form and becoming a lead. However, sometimes a visitor’s path doesn’t end with the desired goal. In those cases, you can optimize the conversion path.

How? Take a page out of Surety Bonds’ book. They were struggling to convert visitors at the rate they wanted, so they decided to run an A/B split test (two versions of a landing page) with Unbounce to determine which tactics were performing better on each page. They ended up changing a link to a button, adding a form to their homepage, and asking different questions on their forms. The result? A 27% increase in lead generation.

If you want to run an A/B test on a landing page, be sure to test the three key pieces of the lead gen process:

a) The Calls-to-Action

Use colors that contrast with your site. Keep it simple — and try a tool like Canva to create images easily, quickly, and for free. Read this blog post for ideas on types of CTAs you can test on your blog, like the sliding CTA you see here:

Pop-up_CTA-1.gif

b) The Landing Pages

According to a HubSpot survey, companies with 30+ landing pages on their website generated 7X more leads than companies with 1 to 5 landing pages.

For inspiration, here are 15 examples of well-designed landing pages you can learn from.

c) The Thank-You Pages

Oftentimes, it’s the landing pages that get all the love in the lead generation process. But the thank-you page, where the visitor is led to once they submit a form on the landing page and convert into a lead, shouldn’t be overlooked.

Along with saying thank you, be sure to include a link for your new lead to actually download the offer on your thank-you page. You can also include social sharing buttons and even a form for another, related offer, as in the example below:

HubSpot landing page

Bonus: Send a Kickback Email

Once a visitor converts into a lead and their information enters your database, you have the opportunity to send them a kickback email, i.e. a “thank-you” email.

In a study HubSpot did on the engagement rates of thank-you emails versus non-thank-you emails, kickback emails doubled the engagement rates (opens and clickthroughs) of standard marketing emails. Use kickback emails as opportunities to include super-specific calls-to-action and encourage sharing over email and social media.

3) Personalize your calls-to-action.

Dynamic content lets you cater the experience of visiting your website to each unique web visitor. People who land on your site will see images, buttons, and product options that are specifically tailored to their interests, the pages they’ve viewed, or items they’ve purchased before.

Better yet, personalized calls-to-action convert 42% more visitors than basic calls-to-action. In other words, dynamic content and on-page personalization helps you generate more leads. 

How does it work? Here’s an example of what your homepage may look like to a stranger:

Smart Content

And here’s what it would look like to a customer:

Smart Content

(To get dynamic content (or “smart content”) on your site, you’ll need to use a tool like HubSpot’s Content Optimization System.)

4) Test, test, test.

We can’t stress this part of the process enough. A/B testing can do wonders for your clickthrough rates.

For example, when friendbuy tried a simple A/B test on their calls-to-action, they found a 211% improvement in clickthroughs on those calls-to-action. Something as simple as testing out the wording of your CTA, the layout of your landing page, or the images you’re using can have a huge impact, like the one friendbuy saw. (This free ebook has fantastic tips for getting started with A/B testing.)

5) Nurture your leads.

Remember: No lead is going to magically turn into a customer. Leads are only as good as your nurturing efforts.

Place leads into a workflow once they fill out a form on your landing page so they don’t forget about you, and deliver them valuable content that matches their interests. Lead nurturing should start with relevant follow-up emails that include great content. As you nurture them, learn as much as you can about them — and then tailor all future sends accordingly.

Here’s an example of a lead nurturing email:

Lead Nurture Email

This email offers the recipient some great content, guides them down the funnel, and gets to the point. According to Forrester Research, companies that nurture their leads see 50% more sales-ready leads than their non-nurturing counterparts, at a 33% lower cost. So get emailing!

What other tips do you have for optimizing your website for lead generation? Share them with us in the comments.

free ebook: optimizing landing pages

May

25

2015

The Biggest Pet Peeves of CRO Experts

cro-pet-peeves.jpeg

Conversion Rate Optimization (CRO) isn’t a widely known field, even among digital marketers. If you need a quick refresher, CRO is the process of creating an experience for your website visitors that’ll convert them into customers.

But this science of lead conversion is quickly gaining ground. After all, who doesn’t want more clicks, leads, and sales?

On International Conversion Rate Optimization Day back in April, some of the best CRO experts in the business came together for an “Ask Me Anything” discussion on inbound.org, where they answered questions about all things conversion rate optimization. One of the interesting topics they covered was the things that really tick them off in the world of CRO. And trust me when I say they didn’t hold back.

What were some of the things that ground these CRO experts’ gears? Here are 13 conversion rate optimization pet peeves to make sure you’re avoiding on your website.

13 Pet Peeves From CRO Experts

1) Over-Simplification

“The world is not simple, yet it’s natural for people to oversimplify everything. Optimizers have to be better than that. There is no ‘people always prefer’ or ‘who would ever.’”

– Peep Laja (author, CRO specialist, & founder of ConversionXL)

(Read more from Laja here.)

2) Assumptions

“You should [make it] very easy for the user to check out. The buttons and headlines should tell people what to do next. Never make assumptions that you know what the customer should do.”

– Alex Harris (e-commerce conversion specialist)

(Read more from Harris here.)

“… Send good cart abandonment emails (and A/B test them), minimize distractions during the checkout process, make it clear to the customer what’s happening in the process and when, try to avoid anything that makes it look like you’re springing surprise fees or clever accounting on the customer, and reinforce why they’re buying from you (painless pre-paid returns process, best in class quality, social proof of satisfied customers, etc. etc. — test what works best for your customers).”

– Jim Gray (marketing engineer, data scientist & founder of Ioseed)

(Read more from Gray here.)

3) “Click Here” on Calls-to-Action

“I personally hate “click here” prefixes, and so do search engines. (It hurts SEO.) It begs the question: does your CTA not already look like a clickable button? For both headlines and CTAs, I use a variation of the aforementioned formula: “I’d like to…” [WHAT: Specific Action]; “Because I want to…” [WHY: Specific Value].

“It’s important to pair WHAT and WHY together. Sometimes this can be accomplished in one line. Two lines (headline + subhead, 2-line CTA, CTA + booster) are more often needed though. This shouldn’t be feared if it provides more clarity and value.”

– Angie Schottmuller (chief of conversion marketing at Unbounce)

(Read more from Schottmuller here.)

“A simple formula to follow for button CTAs is ‘Action Verb’ + ‘Benefit.’”

– Bobby Hewitt (president and founder of Creative Thirst)

(Read more from Hewitt here.)

4) Ghost Buttons

“Ghost buttons drive me crazy. It goes against usability. The concept is a designer’s fantasy trend that should die. The only time I find this tactic useful is when a client insists on having two CTAs on the page, and I basically want one to disappear. Ghosted buttons have ghost conversions.”

– Angie Schottmuller (chief of conversion marketing at Unbounce)

(Read more from Schottmuller here.)

5) Ego

“It can be really hard to let something go when you’ve sweated over it. If it loses, you have to have the courage to throw it away. The best way to do that is to celebrate the fact that you learned something from the failure.”

– Oli Gardner (co-founder of Unbounce)

(Read more from Gardner here.)

6) Unclear Call-to-Action Copy

“It has to be abundantly clear what’s going to happen when someone clicks that button. What are they going to get? Are they scheduling a demo, or signing up for that demo right then and there? You can’t afford to leave people wondering, or they won’t click out of nervousness.”

– Joel Klettke (CRO copywriter)

(Read more from Klettke here.)

7) A “One-Size-Fits-All” Approach

“I’ve seen case studies where including the word ‘click’ increased… clicks. But like every case study, it isn’t a panacea and should be taken with a grain of salt. You can’t apply case study learnings directly; you can only use them as inspiration to generate your own related hypotheses.”

– Oli Gardner (co-founder of Unbounce)

(Read more from Gardner here.)

“Everything you’ve read about button design is true, and false, and somewhere in between. If you truly believe that the best hypothesis and test you can come up with — the one that will deliver a 200% increase on conversions — is to change the button, then you should run A/B or multivariate tests against all of those options to see what works for your audience.

“The fact is, different audiences relate to different designs, language, reading levels, colors, and more. Averages across industries won’t help you here.”

– Stewart Rogers (director of marketing technology at VentureBeat Insight)

(Read more from Rogers here.)

8) Superlatives and Hyperboles

When it comes to using words like “amazing,” Peep Laja said it best:

“Superlatives tend to lose against specifics (‘amazing pizza’ vs. ‘stone-oven baked pizza by an Italian master chef;’ ‘fastest pizza delivery’ vs. ‘delivery in 15 minutes’) 9 times out of 10. Instead of superlatives, offer lots of detail and specifics.”

– Peep Laja (author, CRO specialist, & founder of ConversionXL)

(Read more from Laja here.)

“Instead of obsessing over individual words, think about your context and slash hyperbole wherever it stands. If the claims you are making are believable, hit on customer pain points, and directly explain a benefit, then the verbiage you use to describe that benefit can be flexible, so long as it fits the context.”

– Joel Klettke (CRO copywriter)

(Read more from Klettke here.)

9) Buzzwords

“I personally loathe ‘rockstar.’ I’ve used it. I’m embarrassed about it. But … when I see it on a page today, I instantly get that feeling that an old person is trying to sound young.”

– Joanna Wiebe (conversion copywriter)

(Read more from Wiebe here.)

10) Fluffy Language

“A big hindrance to conversion rates and SEO alike is content that reads like generic fluff for the sake of targeting phrases.”

– Joel Klettke (CRO copywriter)

(Read more from Klettke here.)

11) Half-Baked Value Props

“I hate when writers rely on old, tired [stuff] like, ‘We do X so that you can focus on what matters!’ (…so… what matters?); ‘We get to know our customers’ (everyone does); ‘We’re the highest quality’ (what does that even MEAN? Nobody wants high quality!).”

– Joel Klettke (CRO copywriter)

(Read more from Klettke here.)

12) Ignoring or Avoiding Data

In answering the question, “What’s your biggest pet peeve?”

“When others pretend like the data doesn’t exist.”

– Tommy Walker (marketer at Shopify)

“Or worse, when others attempt to manipulate math for statistical significance to claim that the data qualifies as a valid test. Statistical significance is not the same as validity.”

– Angie Schottmuller (chief of conversion marketing at Unbounce)

(Read more from Walker and Schottmuller here.)

13) Businesses That Stop Testing

“Always be testing” was the rallying cry for this crowd. The takeaway? Keep on testing, even after you have wins. (If you’re not sure where to start, here’s a list of real-life CRO tests to try for yourself.)

Many thanks to all the CRO specialists who joined me in this inbound.org discussion.

What are your biggest CRO pet peeves? Share them with us in the comments.

free webinar: conversion rate optimization

Apr

6

2015

Rethinking CRO: How Remarketing Can Unlock Higher Conversion Rates

remarketing

If you’re serious about driving more conversions, you need to start thinking about conversion rate optimization (CRO) differently. CRO isn’t just about making small adjustments to a landing page to get 5% more conversions. If you’ve moved page elements around, tested variations of your copy, and optimized your form but still aren’t seeing meaningful conversion gains, don’t worry — all is not lost.

Both large and small optimizations can make a difference, but in my experience, it’s the radical optimization changes that have the highest potential to earn you more conversions. If you want to be part of the top 5-10% getting unbelievably high conversion rates, you have to be willing to try some crazy and sometimes counterintuitive things. You might find they pay off in a big way.

Before I get started, if you’d like to learn in detail about some of the most impactful CRO hacks out there, then click here to register for a webinar this Thursday, April 9, 2015 as part of HubSpot’s #CRODay celebrations. I’ll cover the massive changes you can make, and HubSpot’s Lanya Olmsted will share optimizations large and small that have impacted HubSpot’s conversion rates.

Rethinking Conversion Rate Optimization

International CRO Day (yes, it’s a thing) is a fantastic opportunity to take a step back from your current conversion optimization strategy and rethink your process. It’s far too easy to get stuck in the rut of trying the same optimizations over and over, hoping for different results each time.

As marketers, we’ve had it drilled into our heads that conversion rates of 3-7% or so are pretty good — so you might think 10% is pretty fantastic. But if you’re already there, how do you know whether there’s room for growth?

You might be surprised to learn that what we think of as “good” conversion rates are really just average, or in some cases even below average. In our analysis of thousands of AdWords accounts with $3 billion in annualized spend, we discovered that across industries and verticals, exceptional advertisers are converting at two to three times the average.

The median conversion rate is actually 2.35%. A full quarter of accounts have conversion rates of 1% or less.

search-conversion-rate-distribution

As you can see, though, the top 25% of accounts have a conversion rate of 5.31% or higher, about twice the median. The top 10% of accounts are converting at five times the median or more — these accounts have average conversion rates of 11.45% or higher.

Crazy, right? It’s not a fluke, and these advertisers aren’t just lucky. They’re consistently outperforming their competitors by three to five times or more.

How do they do it? And more importantly, how can you do it, too? Right now, I’ll share with you one of my favorite CRO hacks: remarketing. (I’ll cover nine more in Thursday’s CRO hacks webinar.)

Remarketing: One Key to Improving Conversion

Remarketing is one very effective, data-backed CRO hack. While growing in popularity, this advertising tactic is still surprisingly underutilized. The percentage of marketers investing more than 50% of their digital ad budget in remarketing doubled from 2013 to 2014, but that only brought it up to 14%.

And yet, over 90% of marketers surveyed in a study commissioned by AdRoll said remarketing is as effective or more effective than email, search, and other display campaigns.

What’s so great about remarketing? For starters, it gives you absolutely massive reach on the Google Display Network (over 92% of sites on the web, in fact). But other remarketing networks like Facebook and Twitter give you crazy precise targeting, as well. You can reach about 84% of the people you tag, 10-18 times per month.

reach-more-users

Image Credit: WordStream

Visitors to your website or app are tagged and can be remarketed to, which gets your brand and messaging back in front of them wherever they happen to be on the web. It’s a super effective way of reconnecting with people who have already expressed some kind of interest in your company, products, or services.

How Remarketing Moves the Needle on Conversion

Remarketing to existing site visitors obviously taps into the greater intent these people have already displayed. Beyond that though, remarketing is CRO. The fact is that the vast majority of your site visitors aren’t going to convert. Why give up on them the first time they hit your landing page and leave?

Your goal with remarketing is to remind people to complete the action you want them to complete — by getting in front of them with relevant messaging and offers while they’re doing things like watching YouTube videos, checking their email, searching online, checking their Facebook, and so on. If you can do that, then you can increase conversions.

Depending on how targeted you get with your audience lists and campaign types, remarketing can make a huge difference in your conversion rates. Custom printing company Storkie Express found that dynamic remarketing campaigns yielded conversion rates 203% higher than regular display ads and 119% higher than regular remarketing campaigns. Take a look at this data from PPC Hero:

ppc-hero-data

Image Credit: PPC Hero

One account had a conversion rate of 1.86% for their remarketing campaign, while the regular display campaign’s conversion rate was 1.19%.

Here are a few of the other benefits of adding remarketing to your CRO arsenal:

  • Increased brand exposure, keeping you top of mind among those already engaged in some way with your business.
  • Precise targeting, with the ability to exclude audience segments that you may have identified as less likely to convert or more likely to waste a click.
  • “Stickier” ads, as the CTR of remarketing ads stays high even as you might expect ad fatigue to set in. A user is still far more likely to engage with a remarketing ad even after having seen it six times before than they are with a brand new generic display ad.
  • Greater engagement, giving you the opportunity to have leads complete a series of increasingly lucrative small conversions (sign up for email, download an ebook, etc.) on their path-to-purchase, or to convert to a sale.
  • Better ROI, thanks to your ability to know your audience and speak more directly to each segment through targeted copy.

You can also count on lower costs-per-click, thanks to the higher CTR inherent to marketing to people already more motivated than the average ad viewer. Higher CTRs positively impact your Quality Score in AdWords and tell ad networks that your ads are highly relevant, which is typically rewarded with reduced costs per click.

In addition, returning visitors are more likely to be engaged with your site than new visitors, especially if you refine your audience to people who visited a certain product page, put an item in their cart, and so on.

Remarketing is one of my favorite CRO hacks, but it’s just one of the ten little-known and super effective tactics Lanya and I are sharing in our April 9th #CRODay webinar, so be sure to register.

We’ve run dozens of tests to prove these unconventional CRO hacks and can’t wait to walk participants through each one. We’re going to challenge everything you think you know about conversion and give you a radical new view of the conversion formula. You’re going to take away CRO tactics built on a solid foundation of marketing psychology and proven in extensive testing by HubSpot and WordStream. See you there!

free webinar: conversion rate optimization

Dec

11

2014

Running an Email A/B Test? How to Determine Your Sample Size & Testing Time Frame

ab_testing_calculations

Do you remember your first A/B test on email? I do. (Nerdy, I know.) I felt simultaneously thrilled and terrified because I knew I had to actually use some of what I learned in college stats for my job. 

I sat on the cusp of knowing just enough about statistics to be dangerous. For instance, I knew that you needed a big enough sample size to run the test on. I knew I needed to run the test long enough to get statistically significant results. I knew I could easily run one if I wanted, using HubSpot’s Email App.

… But that’s pretty much it. I wasn’t sure how big was “big enough” for sample sizes and how long was “long enough” for test durations — and Googling it gave me a variety of answers my college stats courses definitely didn’t prepare me for.

Turns out I wasn’t alone: Those are two of the most common A/B testing questions we get from customers. And the reason the typical answers from a Google search aren’t that helpful is because they’re talking about A/B testing in an ideal, theoretical, non-marketing world. So, I figured I’d do the research to help answer this question for you in a practical way. At the end of this post, you should be able to know how to determine the right sample size and time frame for your next email send.

Theory vs. Reality of Sample Size and Timing in Email A/B Tests

In theory, to determine a winner between Variation A and Variation B, you need to wait until you have enough results to see if there is a statistically significant difference between the two. Depending on your company, sample size, and how you execute the A/B test, getting statistically significant results could happen in hours or days or weeks — and you’ve just got to stick it out until you get those results. In theory, you should not restrict the time in which you’re gathering results.

For many A/B tests, waiting is no problem. Testing headline copy on a landing page? It’s cool to wait a month for results. Same goes with blog CTA creative — you’d be going for the long-term lead gen play, anyway. 

But on email, waiting can be a problem — for several practical reasons:

1) Each email send has a finite audience.

Unlike a landing page (where you can continue to gather new audience members over time), once you send an email A/B test off, that’s it — you can’t “add” more people to that A/B test. So you’ve got to figure out how to squeeze the most juice out of your emails. This will usually require you to send an A/B test to the smallest portion of your list needed to get statistically significant results, pick a winner, and then send the winning variation on to the rest of the list.

2) Running an email marketing program means you’re juggling at least a few email sends per week. (In reality, probably way more than that.) 

If you spend too much time collecting results, you could miss out on sending your next email — which could have worse effects than if you sent a non-statistically-significant winner email on to one segment of your database. 

3) Email sends are often designed to be timely.

Your marketing emails are optimized to deliver at a certain time of day, whether your emails are supporting the timing of a new campaign launch and/or landing in your recipient’s inboxes at a time they’d love to receive it. So if you wait for your email to be fully statistically significant, you might miss out on being timely and relevant — which could defeat the purpose of your email send in the first place. 

That’s why email A/B testing programs have a “timing” setting built in: At the end of that time frame, if neither result is statistically significant, one variation (which you choose ahead of time) will be sent to the rest of your list. That way, you can still run A/B tests in email, but you can also work around your email marketing scheduling demands and ensure people are always getting timely content.

So to run A/B tests in email while still optimizing your sends for the best results, you’ve got to take both sample size and timing into account. Next up: how to actually figure out your sample size and timing using data.

How to Actually Determine Your Sample Size and Testing Time Frame

Alrighty, now on to the part you’ve been waiting for: how to actually calculate the sample size and timing you need for your next email A/B test. 

How to Calculate Your Email A/B Test’s Sample Size

Like I mentioned above, each email A/B test you send can only be sent to a finite audience — so you need to figure out how to maximize the results from that A/B test. To do that, you need to figure out the smallest portion of your total list needed to get statistically significant results. Here’s how you calculate it.

1) Assess whether you have enough contacts in your list to A/B a sample in the first place.

To A/B test a sample of your list, you need to have a decently large list size — at least 1,000 contacts. If you have fewer than that in your list, the proportion of your list that you need to A/B test to get statistically significant results gets larger and larger. 

For example, to get statistically significant results from a small list, you might have to test 85% or 95% of your list. At that point, the untested remainder of your list is so small that you might as well have sent one version to half of your list and another version to the other half, then measured the difference. Your results might not be statistically significant at the end of it all, but at least you’re gathering learnings while you grow your list past 1,000 contacts. (If you want more tips on growing your email list so you can hit that 1,000-contact threshold, check out this blog post.)

Note for HubSpot customers: 1,000 contacts is also our benchmark for running A/B tests on samples of email sends — if you have fewer than 1,000 contacts in your selected list, the A version of your test will automatically be sent to half of your list and the B will be sent to the other half.

2) Click here to open up this calculator.

Here’s what it looks like when you open it up:

ab_testing_calculator

3) Put in your email’s Confidence Level, Confidence Interval, and Population into the tool.

Yep, that’s a lot of stat jargon. Here’s what these terms translate to in your email:

Population: Your sample represents a larger group of people. This larger group is called your population.

In email, your population is the typical number of people in your list who get emails delivered to them — not the number of people you sent emails to. To calculate population, I’d look at the past three to five emails you’ve sent to this list, and average the total number of delivered emails. (Use the average when calculating sample size, as the total number of delivered emails will fluctuate.)

Confidence Interval: You might have heard this called “margin of error.” Lots of surveys use this, including political polls. It’s the range within which you can expect the full population’s results to fall once the winning email goes out to everyone.

For example, if you have an interval of 5 and 60% of your sample opens your variation, you can be confident that between 55% (60 minus 5) and 65% (60 plus 5) of the full population would have also opened that email. The bigger the interval you choose, the more certain you can be that the population’s true behavior falls within it. At the same time, large intervals give you less definitive results. It’s a tradeoff you’ll have to make in your emails.

For our purposes, it’s not worth getting too caught up in confidence intervals. When you’re just getting started with A/B tests, I’d recommend choosing a smaller interval (ex: around 5).  

Confidence Level: This tells you how sure you can be that your sample results lie within the above confidence interval. The lower the percentage, the less sure you can be about the results. The higher the percentage, the more people you’ll need in your sample, too. 

Note for HubSpot customers: The Email App automatically uses the 85% confidence level to determine a winner. Since that option isn’t available in this tool, I’d suggest choosing 95%. 

Example:

Let’s pretend we’re sending our first A/B test. Our list has 1,000 people in it and has a 95% deliverability rate. We want to be 95% confident our winning email metrics fall within a 5-point interval of our population metrics. 

Here’s what we’d put in the tool:

  • Population: 950
  • Confidence Level: 95%
  • Confidence Interval: 5

[Image: Sample size calculation inputs]

4) Click “Calculate.”

5) Get your sample size.

Ta-da! The calculator will spit out your sample size. In our example, our sample size is: 274.
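
If you're curious about the math behind the calculator, here's a minimal Python sketch of the standard sample size formula: a normal approximation with a finite population correction. We're assuming that's what the tool uses under the hood, but it does reproduce the 274 from our example.

```python
import math

def sample_size(population, confidence_level=0.95, interval=5):
    """Estimate the sample size for one variation, using the normal
    approximation with a finite population correction."""
    z_scores = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}  # common z-scores
    z = z_scores[confidence_level]
    p = 0.5                    # worst-case (most conservative) proportion
    e = interval / 100         # confidence interval as a proportion
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)   # size for an infinite population
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # finite correction

print(sample_size(950, 0.95, 5))  # 274 -- matches the calculator's output
```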

This is the size one of your variations needs to be. So for your email send, if you have one control and one variation, you’ll need to double this number. If you had a control and two variations, you’d triple it. (And so on.)

6) Depending on your email program, you may need to calculate the sample size’s percentage of the whole email.

HubSpot customers, I'm looking at you for this section. When you're running an email A/B test, you'll need to select the percentage of your list that each sample is sent to, not just the raw sample size.

To do that, you need to divide the number in your sample by the total number of contacts in your list. Here’s what that math looks like, using the example numbers above:

274 / 1000 = 27.4%

This means that each sample (both your control AND your variation) needs to be sent to 27-28% of your audience — in other words, roughly a total of 55% of your total list.
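
Picking up the numbers from the sketch above, here's that percentage math in Python:

```python
sample = 274      # from the sample size calculation above
list_size = 1000

pct_per_variation = sample / list_size * 100
print(f"{pct_per_variation:.1f}% per variation")          # 27.4%
print(f"{pct_per_variation * 2:.1f}% of the total list")  # 54.8% for control + variation
```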

[Image: Selecting the A/B test send percentage]

And that’s it! You should be ready to select your sending time. 

How to Choose the Right Time Frame for Your A/B Test

Okay, so this is where we get into the reality of email sending: You have to figure out how long to run your email A/B test before sending a (winning) version on to the rest of your list. Figuring out the timing aspect is a little less statistically driven, but you should definitely use past data to help you make better decisions. Here’s how you can do that.

If you don’t have timing restrictions on when to send the winning email to the rest of the list, head over to your analytics. 

Figure out when your email opens/clicks (or whatever your success metric is) start to drop off. Look at your past email sends to figure this out. For example, what percentage of total clicks did you get in your first day? If you found that you get 70% of your clicks in the first 24 hours, and then 5% each day after that, it'd make sense to cap your A/B testing window at 24 hours, because delaying your results just to gather a little extra data wouldn't be worth it. At the end of those 24 hours, your email program should let you know whether it can determine a statistically significant winner.
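
To make the drop-off analysis concrete, here's a small Python sketch. The send time and click timestamps are hypothetical; in practice you'd pull them from your email tool:

```python
from datetime import datetime, timedelta

# Hypothetical click timestamps from a past send, as hours after sending
send_time = datetime(2014, 11, 3, 9, 0)
clicks = [send_time + timedelta(hours=h)
          for h in [1, 2, 2, 3, 5, 8, 12, 20, 30, 49]]

# What share of all clicks arrived within the first 24 hours?
cutoff = send_time + timedelta(hours=24)
within_24h = sum(1 for c in clicks if c <= cutoff)
share = within_24h / len(clicks) * 100
print(f"{share:.0f}% of clicks came in the first 24 hours")  # 80%
```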

Then, it’s up to you what to do next. If you have a large enough sample size and found a statistically significant winner at the end of the testing time frame, many email marketing programs will automatically and immediately send the winning variation. If you have a large enough sample size and there’s no statistically significant winner at the end of the testing time frame, email marketing tools might also allow you to automatically send a variation of your choice.

If you have a smaller sample size or are running a 50/50 A/B test, when to send the next email based on the initial email’s results is entirely up to you. 

If you have time restrictions on when to send the winning email to the rest of the list, figure out how late you can send the winner without it being untimely or affecting other email sends. 

For example, if you’ve sent an email out at 6 p.m. EST for a flash sale that ends at midnight EST, you wouldn’t want to determine an A/B test winner at 11 p.m. Instead, you’d want to send the email closer to 8 or 9 p.m. — that’ll give the people not involved in the A/B test enough time to act on your email.

And that's pretty much it, folks. After doing these calculations and examining your data, you should be in a much better position to send email A/B tests: statistically sound ones that actually help you move the needle in your email marketing.


Nov

18

2014

How an A/B Test of Landing Page Form Copy Improved Lead Quality

When most people talk about getting quality lead information from forms, they usually talk about one tactic: changing the length of the form. The longer the form, the better quality the leads will be … right?

Truthfully, it’s not always that simple. For most businesses, changing the form length is a great way to get started with increasing lead quality — but at a certain point, you’re going to need to experiment with other form conversion optimization tactics to get better information about the people filling the form out. 

At HubSpot, we've had a long lead generation form on our website for a while, but it wasn't getting us the quality of data we needed to efficiently rotate the right leads to Sales. Below is what we tested to improve the quality of our form submission data, all without adding a single form field.

The Problem

Like we mentioned above, we’ve always had a long lead generation form on our landing pages — we’ve wanted a decent amount of information to properly qualify incoming leads for Sales. Here’s the type of information we typically ask for:

  • Your name
  • Your email
  • Your company name
  • The number of employees at your company
  • The URL of your company’s website 
  • Your role
  • Your department
  • Which CRM you use
  • Your biggest Marketing and Sales challenge
  • Whether your company is a marketing agency (or sells marketing services)

The last bullet is one of the most crucial questions we ask. We sell HubSpot both directly to businesses and through our Partner program, so we use this question to route new Partner leads to a certain team in Sales. To make sure that sales team always gets the right type of leads, anyone who says they're a marketing agency goes into a special queue. In that queue, we manually check the lead's website to confirm it actually is an agency and, if so, send it on to our Partner sales team.

About a year ago, we realized that our manual checkers still had to do a ton of filtering to get the proper leads to the proper teams. We discovered that of the people who said "Yes" (meaning they could be a Partner agency), 60% were in reality not an agency. So if 10 people said "Yes" on the form, only 4 of them would actually be a marketing agency. As you can imagine, that manual data scrubbing takes time and isn't an efficient way to scale. So we decided to run a test on the form to see if we could help people better identify themselves as a partner agency from the get-go.

The Methodology

The test revolved around the form copy. Here's what the original question said:

Control

[Image: Original form field (control)]

In our experiment, we ran an A/B test on one of our landing pages comparing Treatment A and Treatment B against the control — the only difference between the landing page versions was this question. You can see below that we played with copy changes as well as visual presentation in the form field.

Treatment A

[Image: Form field, Treatment A]

Treatment B

[Image: Form field, Treatment B]

Next up is what we found.

What We Found

Both treatments hugely improved data accuracy over our original 60% error rate. Treatment A had a 26% false positive rate (meaning that of 100 people saying "Yes" on the form, 26 were NOT an agency), while Treatment B had a 22% false positive rate. After testing for statistical significance, we decided to replace this question on all our forms with Treatment B's field format. Since replacing this form field, the false positive rate has gone down even further.
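
For the curious: "testing for statistical significance" on results like these typically means comparing two proportions. Here's a minimal Python sketch of a two-proportion z-test. The sample counts below are hypothetical, not our actual test numbers:

```python
import math

# Hypothetical counts: "Yes" submissions and how many were false positives
n_a, fp_a = 1000, 260   # Treatment A: 26% false positive rate
n_b, fp_b = 1000, 220   # Treatment B: 22% false positive rate

p_a, p_b = fp_a / n_a, fp_b / n_b
p_pool = (fp_a + fp_b) / (n_a + n_b)                         # pooled proportion
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
z = (p_a - p_b) / se
print(f"z = {z:.2f}")  # ~2.09; above 1.96 means significant at 95% confidence
```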

While this may seem like a trivial test, it’s had a huge impact on our business. Now, instead of getting caught in this data-scrubbing bottleneck, more leads are going to the right sales reps faster. And the faster they can respond to a qualified lead, the more relevant conversation they can have with the lead. So this didn’t just help us get cleaner data at scale — it helped make our sales process more lovable. Talk about a win-win!


Oct

1

2014

3 Real-Life Examples of Incredibly Successful A/B Tests

Whether you’re looking to increase revenue, sign-ups, social shares, or engagement, A/B testing and optimization can help you get there. But for many marketers out there, the tough part about A/B testing is often finding the right test to drive the biggest impact — especially when you’re just getting started.

So, what’s the recipe for high-impact success? (more…)

Sep

24

2014

The Psychology of Color: How It Affects the Way We Buy [Infographic]

Humans are visual creatures — so visual, in fact, that color plays a much bigger role in influencing what we purchase than we might think.

There’s a reason companies test the colors of things like advertisements, banner ads, and call-to-action (CTA) buttons. When we did a (more…)

Aug

14

2014

4 Fresh Ways to Squeeze More Conversions Out of Your Blog

Published by in category A/B testing, Blogging | Leave a Comment

This post originally appeared on the Insiders section of Inbound Hub. To read more content like this, subscribe to Insiders.

Allow me to make an assumption about you: At one time or another, you’ve done something because your friends were doing it. (more…)

Aug

1

2014

Facebook, OkCupid, and the Ethics of Online Social Experiments

Published by in category A/B testing | Leave a Comment

Recently, two different social websites admitted they ran experiments on users who had no idea they were being experimented on. Facebook ran tests to see if they could affect users’ emotions, and OkCupid ran tests mismatching users to see if they would interact with each other differently. (more…)

Jul

2

2014

How to Design Experiments for Your Website in 5 Easy Steps

Running experiments on your website, such as testing the copy of a headline or trying out a new color for your call-to-action buttons, is crucial to your online demand generation. These tests help you get definitive answers on what works and what doesn’t. But experimental design can seem over-complicated, overwhelming, and time-consuming.  (more…)