Ruben Gamez on how to ‘get’ inside your customer’s head, reduce churn and effectively test pricing

In 2009 Ruben started his own company, BidSketch. At its inception, BidSketch was proposal software aimed primarily at web designers and web developers. What began as a one-man show has grown substantially: to date, BidSketch has helped its customers close more than $1 billion in sales.

But Ruben’s journey was not an easy one. On the way to his first $1,000 there were multiple times when he wanted to give up on the whole idea completely. His initial research suggested that no one had any interest in the product, he wasted a whole month building a free tool that nobody used, and he even missed his launch date because of unreliable contractors (more on that here).

In fact, he had to hit the $1,000/month mark twice(!). One API call wiped out almost all the billing information he had about his customers. In a matter of seconds, his revenue dropped to zero. He had to email his customers, asking them to set up their paid accounts all over again. Unsurprisingly, a fair number of them did not return.

The upside is – throughout his journey Ruben has learnt a lot – and that’s why I am so excited to have had the opportunity to interview him. In 2012 he wrote a blog post, “What I learned from increasing my prices”, where he explains how research and testing allowed BidSketch to see one of the largest spikes in growth it has ever had. Since then his pricing page has evolved even further and that’s what we are about to dig into.

We cover pricing, small tests that he ran that resulted in substantial increases in conversion rate (and revenue), how Ruben used Jobs-to-be-Done interviews to decrease his customers’ churn rate, research and testing tools that helped him on his journey, and so much more.

I recommend you read his original article first (although it’s not required). I’ve learnt a lot and I am sure you will too.

Part 1: How to communicate the value of your product on the pricing page – while keeping things simple

A little bit of background history.

Here’s the first version of BidSketch pricing page (2010)

Here’s the version that resulted in one of the largest spikes in revenue (the one he talks about in his article).  This is 2012.

This is the version that we see today (in 2016)

Egor: First of all, I would like to understand the context behind your pricing page and how it evolved over the years, and then get into the nitty-gritty: which research questions you found most useful, any actionable tips you can share, and the things that delivered the most results.

The major difference that I can see is that you had Freelancer, Studio and Agency plans. Today, in 2016, it is split into Solo, Team and Business. How did that change happen? The first set seemed to be tailored to customer personas (web designers in particular), and this one seems more generic – applicable to everyone. Did you change it as you scaled, or was there another reason?

Ruben: Initially, when we had the premium and basic plans (when BidSketch first launched), it was for designers. By the time I did this pricing change, it was no longer for designers, but it still was for… you know, creatives.  I think at the time 80% or 90% were the categories of either web designers, marketing, freelancers, SEO, developers, people from companies in those categories. Persona-based pricing was a good fit for that.

Then, there was a point where we started getting more customers as we scaled, and that distribution started to change. We saw that it started to change through a few surveys, but beyond that, we also started to see it in cancellation feedback of people who were entering the trial period. More and more people were saying, ‘I don’t think this is for me. I don’t feel like it was made for my business. It seems as if it was made for designers or web developers’.

We started changing the product, for example, adding more templates for more businesses. That way we had a bunch of different signals in the app that spoke to those kinds of businesses. Then we started generalizing even more and adding more resources to appeal to them.

But we were still getting that feedback, and the last piece was the pricing page. We looked at the businesses that were cancelling, at their websites, and we talked to them. With some of them we did Jobs-to-be-Done interviews. It was like, ok, the pricing might be unclear: somebody goes to the pricing page, they see freelancer, studio, agency, and they are not that type of company.

For example, they could be from a SaaS company doing enterprise sales, and they would think, ‘hmm, this is not quite right’. So, we did a test to see if there would be an impact on conversions. In the previous test [the change that was carried out in 2012, see images above], where we changed general names [Basic and Premium] to Freelancer, Agency and Studio plan names, we got more customers. This time around, when we tested Business, Team and Solo, we got fewer trials, which was interesting, but we got slightly more customers at a bit of a higher price point.

Egor: That’s very interesting. First, you targeted specific segments, or even identities, and achieved an uplift; then you repositioned it with an appeal to a broader audience. It seems like the opposite of the technique that worked for you in the first place ended up closing more people.

Ruben: Right, you know. Business changes, market changes, competition, traffic you get, there are a lot of variables. It’s a good idea to retest, I do that sometimes – retest things that did not work before.

Egor: Another change that I can see is your plans are primarily limited by the number of users; and previously the limitations included proposals, clients and users [and storage].

The limitations you set for your plans are important, aren’t they?

They can act as an incentive for a client to upgrade.

The extent to which your product and its different features are used also affects your cost base. For example, if the number of proposals a customer can send closely correlates with your costs, and you make it unlimited, then your cost base could skyrocket [if customers start creating loads of proposals].

Probably, this is not the case, given that you removed it, but I am just trying to understand what your thinking was behind setting some of these limitations (for example, users) and removing others (proposals and clients). Is it primarily customer-research driven? Was it affected by considerations of costs and profit? What was your thought process?

Ruben: It was based off of a couple of things. One was we looked at the data when we had very simple plans, either a plan with one user or a plan with unlimited users [the very first plans Bidsketch had in 2010]. Looking at the data, we could see very clear groupings or break-points. They were not getting charged for those extra users, so we could see naturally how many people on one account used the product.

We saw a bunch of companies with two or three users. Then, I think the next break-point was 5, and the next one 8. Just based off of that data, it felt like a really good test. We also looked at the different types of companies that had these different numbers of users. That was one of the things we looked at, and the other thing was features.

Basically, since we were just leveraging users [as a limitation], we mainly looked at customising domains and team management [for different plans]. Team management does not really mean anything for people who are on the 1 user plan, but it’s there to make it feel a lot more different, like you’re getting a lot more value on a higher priced plan where you have more users. We could probably eliminate that row, and it would still be clear what the differences are [between these plans]. The reason why it’s there is to make it feel more different, it’s something to make it stand out more.

Overall, we used a combination of metrics and qualitative data. One limitation, users, was based off of our quant data. Ability to customise your own domain is something that was highly valued based off of our conversations with customers. We tried to do both; we looked at the data that we have and we tried to have conversations with customers to get clarity on that data, to make sure that what we think we are seeing is actually what we are seeing. That was the thinking behind it.

The proposals… I am trying to remember why [we had that limit in the first place]. I think the proposals limit was an attempt to have something else to push people towards the $29/month plan. That’s why we did it, and when we had this plan, most people signed up for the $29/month plan. Most people did not sign up for the $19/month plan, although it was cheaper.

So, I don’t think I ever really tested that before [specifically, the impact of the proposals limit]. At one point I wanted to simplify pricing, so we ran surveys and asked people what confused them. There were a few things that would come up in [the surveys], but one thing I just wondered about was, ‘Is the number of proposals actually doing anything?’.

Hiten Shah from KISSmetrics and CrazyEgg sometimes recommends doing what he calls sensitivity testing, which is just: remove something from the page and see if it was actually working. Instead of adding something or changing it, just take it off and see if it has any impact. So, we did that, and it did not get any worse, and it did not get any better. So, I dropped it, just because I like simple – simple is better.
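[My note: Ruben doesn’t go into the statistics behind calling a sensitivity test a wash (“not any worse, not any better”), but one common way to check whether removing an element moved the needle is a two-proportion z-test on the conversion rates of the two variants. Here’s a minimal sketch – the function and all the numbers are illustrative, not BidSketch’s actual data:

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates.
    |z| < 1.96 means no significant difference at the 95% level."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Sensitivity test: control keeps the proposals row, variant drops it.
z = two_proportion_z(conv_a=120, n_a=2400, conv_b=118, n_b=2400)
print(abs(z) < 1.96)  # True: removing the row made no measurable difference
```

If the statistic stays inside the ±1.96 band, the simpler page “did not get any worse”, which is exactly the license Ruben used to drop the element.]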

Egor: Was it the same for the ‘clients’ limitation?

Ruben: Well, we dropped the freelancer plan (the $19 plan) out of the main grid to add another plan. So, clients is a metric that we still limit on, but not on any of the plans that are on the grid. It’s limited on the link below the plan. Since the other plans on the grid are more expensive and we don’t limit clients on any of them, there is no need to have that.

Egor: Ah, I saw that. There is a link below that takes you to another plan. I read a case study where Joanna Wiebe from CopyHackers optimised CrazyEgg’s website, and I think they did a similar sensitivity test. They removed the Johnson box on the left, which is a navigation box, and I think they removed it in order to free up space, so that they could fit more content above the fold on that landing page.

You said that you tried to simplify the plans. As far as I understand, BidSketch has many more features than what is currently listed on the pricing page.

How did you… I am asking because, usually, when I look at enterprise SaaS at least, they have a huge list of different features. It just falls on you, and sometimes I start feeling overwhelmed.

Example of a pricing page with a loooong list of features

You can sense that the one with a larger list is meant to be more attractive for larger businesses, but to really find something for yourself… it’s hard, sometimes I can’t even make it through.

So, my question is: How did you come to that list of features that is currently listed on your pricing page? Your plans look very simplified and easy-to-digest.

Ruben: There was mainly… I think when we were working on the second version of these plans, we tested just having a bunch of features on the left hand-side, having them listed all out, more detailed [like the SurveyGizmo example above], and the simpler version won; it did better. So, that’s what moved us in that direction.

We also did some Qualaroo surveys. We found that yes, there can be value in showing what features are on each plan; even if a feature does not communicate the differences between plans, it is still valuable for users to know about it.

Even if you push everyone to see your tour page before they see the pricing page, not everyone will actively engage with it. They might just skip to pricing, and this is why I think it’s important to show the important features that are available on all the plans. But it does not have to be done the way a lot of people do it, with a column on the left-hand side and the pricing grid on the right.

We are doing it at the bottom before the sign-up button where we say, “All plans include templates, branding, and PDF export”.

Also, we don’t [show] all of the features that we have, only the ones that are most important to people – the ones we know from interviews and surveys are the most important, because people asked for them. So, it’s still limited, but it’s shown there. When you have them on the left-hand side, it just takes more energy; it adds visual noise and makes it harder to parse the pricing grid.

Part 2: How to apply Jobs-to-be-Done interviews to SaaS and finally ‘get’ your customers, build a better product and cut down churn by over 30%?

Egor: So, to simplify the plans you needed to limit the number of features you show. To do this, you did research and identified what customers found most valuable in your product. You said you used Qualaroo surveys and interviews – what exact questions did you find most useful when trying to understand what your customers value most?

Ruben: It’s two things. It’s seeing what they are using when they pay, and – never by directly asking them – finding out what they chose or why they chose it, for example, when they upgraded or decided to pay. This came up through Jobs-to-be-Done interviews, where we did ‘switch interviews.’

In those you focus on what happened; the steps that they took when they stopped using whatever it is that they were using previously and started paying for our product. In that, there is a point where they are evaluating and they are deciding and it’s pretty clear…

You ask them, ‘What did you do next? What didn’t you do next? Why did you do that? Ok, what were you thinking at this point? Did you have any concerns?’ What generally comes out is the decision they were making, the trade-offs they were weighing when they were buying, so then you get to see, ‘Aha!’

So, to them the branding part is not really that important, because it did not block their decision, it did not stop them from upgrading. But they were not sure about custom domains, so during their trial they did not upgrade or pay or start their plan early – even though they wanted to – until they set up DNS and their custom domain, etc.

So, there are a bunch of little stories like that, so that we can then see, ok, these were the themes that helped them to decide to pay and these are the ones that did not. So, again, we used a combination of that [qualitative data, specifically JTBD interviews] and quantitative data.

Egor: You mentioned switch interviews and Jobs-to-be-Done interviews. I have heard of Jobs-to-be-Done as a concept [when I read Clayton Christensen’s “How Will You Measure Your Life?”], but I have not heard of Jobs-to-be-Done interviews. Is it a standardised set of questions you use, do you prepare it yourself, is it some type of framework? Could you explain it to me?

Ruben: Yeah, sure. Generally we run switch interviews. It’s about capturing the story of the switching moment. So, instead of asking them, “Why did you sign up? How did you like it?”, or any things like that, you approach it in a different way.

Basically, people often don’t know on the surface why [they made a particular decision]. Or they would give you reasons that they think you want to hear, but instead with switch interviews you start by asking…

Well, you start in a lot of different ways, but the framework for asking these questions is to find out:

  • what they were using before
  • when they started to have problems or doubts with what they were using
  • why they started looking for something else
  • why they started to evaluate something else
  • why they started to evaluate it or sign up for it at that moment, on that day – instead of the day before or the day after – to really dig into it

You want them to walk you through every step of what happened in order to understand their thinking and their process, and ask, ‘What were you thinking here? Why did you do this? Why did you do that?’ instead of asking, ‘Why did you sign up?’. By going through that story, you find out – through their actions – the moment when they decided to buy, and what their thinking was.

[My note: Notice how the approach above is different from standard CRO questions such as, “What persuaded you to purchase from us today?”.

For those unfamiliar with the JTBD framework, think about what Ruben said before: customers often do not know the deep reasons why they signed up. So, if you just ask, ‘Why did you sign up?’, you will get a lot of surface answers, e.g. ‘I just needed to create proposals for my business.’ This is not very actionable.

Instead, with JTBD interviews you go through their story and ask them why they made certain decisions in the past that ultimately led to the final purchase decision. When people go back in time and start recalling the situations and context in which these decisions were made, more detailed memories start coming to the surface, and the real motives behind the purchase are revealed.

It didn’t click with me until I read Alan Klement’s book “When Coffee and Kale Compete” and tried conducting a JTBD interview myself, but the quickest way to get to your first “aha” moment with the JTBD framework is to listen to the JTBD Mattress interview].

Egor: So, when does it happen? Does it happen straight after someone converts into a purchase, or can it happen at any time?

Ruben: Well, it’s a SaaS product, there are two things. There’s a ton of friction when we ask for a credit card upfront for someone to sign up for a trial. So, that’s one thing. There has to be enough… enough momentum and something pushing them towards entering their credit card information to do that right at that moment. That’s one point and the other more important point is when they actually decided to buy.

Since it’s a SaaS product where we just bill them automatically on day 14, day 14 is not when they decided to buy. Maybe they forgot to cancel, and a month later they’re going to ask for a refund. Maybe they haven’t even set it up yet: ‘yeah, in a few months we will’.

Usually, it’s at some point during the trial, or at some point after they started paying. We often cover that with the question, ‘At what point did you know it was going to work for you?’. We walk through the whole story – ‘yeah, it’s working, we used it and it was really good’ – and then: ok, good, at what point did you realise that? Before that point you were trying it out, trying to see, and then at some point something happened, you saw something and you thought, ‘Yes, this is going to work’. That’s the buying moment.

Egor: So, you are trying to get them to narrate a story about themselves, as opposed to trying to make them rationalise why they made that purchase. Then you try to understand why they bought by listening to their story and analysing it yourself, rather than making them rationalise it for you. That’s very interesting. Did someone create switch interviews? Where did the format originate?

Ruben: Yeah, two guys from the Re-Wired Group who work closely with Clayton Christensen on implementing Jobs-to-be-Done interviews: Bob Moesta and Chris Spiek. They do these interviews with really big clients, and they put on Switch Workshops where they teach the concept.

So, we have also done cancellation interviews where people are switching away from our product to something else. Our product is the thing that they were using and they had a problem with, and eventually people started using something else.

Egor: When you say interviews, do you mean calling and talking through their story? What is the set up like?

Ruben: Yeah, these are like 30-45 minute interviews.

Egor: Is it difficult to recruit people for these interviews?

Ruben: For people who are paying, we try to do the switch interviews with people who have paid at least once or just made their payment for the next month. We do it then because we want it to still be fresh in their mind. We also want to make sure that they are actually paying [i.e. they did not just forget to cancel].

Egor: Is it difficult to get people to agree to these interviews? Do you use some type of incentive? What kind of email do you send?

Ruben: We have not had too much luck recruiting through email. So, generally we do not do that. We previously recruited through Qualaroo surveys inside the app or using Intercom inside the app, taking them through a survey and getting them an incentive.

Recruiting people who cancelled is much harder than recruiting people who just paid for your product, especially when you want to get them on the phone for that long. So, for people who cancelled, we made a cancel confirmation page with a message saying that their account has been cancelled, that we are sorry to see them go and that feedback is very important to us, and asking whether they would be willing to participate in a 30-45 minute interview to help us improve.

To show our appreciation, we’ll toss in a $100 Amazon gift card. It can work without the gift card – we have done that – but with the gift card it’s just so much faster. It’s a really big incentive. You generally need around 10 to 15 of these interviews. Since you don’t need many, it’s well worth it for us to pay $100 per person for the data we get.

Egor: That’s amazing! So, based on what I have heard so far, there are two types, switch interviews and cancellation interviews. What did you find most valuable? With switch interviews you are trying to understand what happened in someone’s life and led them to start paying for your product. With cancellation interviews, are you trying to understand why your value proposition suffers? What’s the main value of these interviews?

Ruben: We have exit surveys on the cancel form. It’s a required form where they tell us why they are cancelling. The vast majority of people just say, ‘Did not use it enough’, and then there is a percentage of people who say, ‘Well, this did not work or that did not work’. For people who cancel in the early months – the first month or two after paying – it’s generally onboarding stuff. They just did not finish setting up their account, they did not fully implement it, or they only used it once, things like that. It’s still a symptom, and the reason behind it varies. People who cancel after using the product for a while tend to fall into different categories.

So, the cancellation interviews were a way to get more insight into the feedback they were giving us. It felt kind of superficial – it was light, better than nothing on these cancel forms – but we wanted to see the stories behind it. In particular, the biggest category was ‘not using it enough’.

What do you mean by ‘I am not using it enough’? Why not? It’s not just not using it. There was a reason for it. In some cases, there was a big disconnect between what they expected and what they got.

Another thing that came up was the term ‘proposal’. There was a disconnect between what they understood as proposals and what the app offered them. It’s a proposal app, and once they sign up, they have in mind proposals to create and send. Then they start using it and think, ‘Ok, this thing is more thorough than what I currently use; it has a lot of features. The proposals I send are very simple.’

Well, taking a look at their “proposal”, it is not really a proposal – they are sending an estimate or a contract, but for a lot of people these are their sales proposals. These people were less likely to buy. So, as a result of these cancellation interviews, we set up examples and help documentation around those other types of documents.

There are several categories like that. Sometimes it’s a setup thing, just onboarding, and if it’s onboarding, you can fix it. But it’s much easier to uncover those reasons after doing interviews this way.

Egor: I see. Did you make any other product changes or marketing changes that came as a result of these interviews? And did you see any tangible results from these changes?

Ruben: Some of the pricing grid changes that we have already talked about – those interviews are where we got the insight: knowing which features to show on that page, on the left-hand side or at the bottom, and which features not to even bother showing. A lot of that insight came from the interviews. As far as pricing…

Egor: It does not necessarily have to be about pricing. Anything related to product or marketing…

Ruben: The ‘pause’ feature for their account, where they pay $5 a month is actually used and people come back and un-pause their account and start paying again.

Egor: Is it for people who are not using it actively, but want to stay?

Ruben: Right, with people who were cancelling it was kind of streaky. Especially if they are smaller, they would send out some proposals, then they would get something. They’d be busy with that project for several months and would not be using BidSketch. Then, we would bill them and we would bill them again. They would think, ‘I need to cancel, I am not using this’.

Then, in 2 more months they would start using it again. So, they would sign up for another trial and create another account and would not have the past history or anything like that. So, they would like to have had all their past history and not have to set everything up again. Just implementing that was a pretty good thing that came from that. It worked.

It’s used in the way that it was meant to be used. We monitored it, and we worried that people would just leave it there and not come back, but a lot of people did come back. So, that’s working well.

The other thing was yearly plans – being more aggressive with yearly because of that cycle. This is another thing that came from the Jobs-to-be-Done interviews: being able to change the evaluation period in the customer’s mind.

When someone is on a yearly plan, it’s about, ‘How much did I use it this year?’ It’s a very different question from the one you ask yourself every month, ‘Did I use it? Oh no, this month I did not.’ So, maybe I used it 20 times in a year, but it was all in 3 months or 4 months cycles throughout the year. The rest of the months were not used at all.

For somebody who is evaluating on the yearly basis that works. For somebody who is evaluating monthly, sometimes it’s worth it if they think about it in terms of their entire usage per year, but a lot of people don’t think that way. People literally think, ‘Oh, this is a second month I have not used it’.

Egor: So, what did you do? Did you just literally push more people on the yearly plans as opposed to offering monthly plans?

Ruben: Yeah, and pitching yearly plans through Intercom at around day 45, and adding a link to upgrade with a big discount to invoices. Basically, pitching them everywhere.

Half of the traffic that we get to the pricing page gets defaulted to yearly plans, and the other half to monthly with the option to pay upfront yearly. It’s a little ghetto, but it gives us the right amount of yearly paid accounts without sacrificing too much of the monthly revenue.
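[My note: the interview doesn’t say how BidSketch implements this split. A common way to get a stable 50/50 assignment is deterministic hash bucketing on a visitor id, so the same visitor always sees the same default. A sketch under that assumption – the function name and ids are hypothetical:

```python
import hashlib

def pricing_page_variant(visitor_id: str) -> str:
    """Stable 50/50 assignment: hash the visitor id so the same
    visitor sees the same default billing period on every visit."""
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % 100  # uniform-ish bucket in 0..99
    return "yearly-default" if bucket < 50 else "monthly-default"

# The assignment never flips between page loads for the same visitor:
assert pricing_page_variant("visitor-42") == pricing_page_variant("visitor-42")
```

Tools like Optimizely do this bucketing for you; the point of the sketch is that the split should be deterministic per visitor, not a fresh coin flip per page load.]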

Egor: Is it easy to convince people? Do you convince them with just a discount or do you build a bigger business case around it?

Ruben: Well… The discount does most of the work. Just having a generous discount, then pitching it at the right time for the people that do not default to it or initially take it. Some people don’t even know if this is going to work. They don’t feel secure enough with going for something yearly. That’s why… I found that about a 45 day mark is a good time for us to do that.

Egor: How did you come to that 45 day mark? Was it through experimentation/trial-and-error?

Ruben: Through a lot of the conversations we had, we could tell that by then – not everyone, many people are still unsure – but most people would know whether it’s going to work for them or not.

Egor: How did you come to your current discount? If I am correct, it’s 40%.

Ruben: 40% is for the middle plan, 26% for the other plans.

Egor: How did you come to this?

Ruben: We tested discounts. We started maybe at 10% or so, I don’t remember exactly what they were.

Egor: So, you started with discounts and then looked at how many people would move to a yearly plan? Was that the main KPI?

Ruben: Yes.

Egor: Ok, and then you just went up and up with your discount and looked at what the effect would be?

Ruben: That’s right.
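[My note: neither side spells out the arithmetic, but the trade-off behind ramping a yearly discount can be framed as a simple break-even calculation: a prepaid year at a discount versus the months a monthly subscriber is expected to stick around before churning. A simplified sketch – it ignores refunds and the time value of cash, and the retention figure is hypothetical:

```python
def breakeven_discount(expected_months_retained: float) -> float:
    """Largest yearly discount d at which a prepaid year still beats
    expected monthly revenue: solve 12 * (1 - d) = months for d."""
    return 1 - expected_months_retained / 12

# If the average monthly customer sticks around ~8 months, any discount
# up to ~33% still comes out ahead of billing monthly:
print(round(breakeven_discount(8) * 100))  # 33
```

This is why a discount that looks generous on the pricing page can still be profitable once churn on the monthly plan is taken into account.]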

Egor: I want to come back to the pause feature. There have been a number of times when I would have certainly paid a small fee. I think it’s very smart…

Ruben: It’s something that I think a lot of SaaS products could do. I’ve seen it done with a free pause; I don’t remember the exact products. I wanted to do a paid one because we don’t want to pause a bunch of accounts where people had no intention of coming back. If they are willing to pay at least $5/month, they see that there is real value in it for them. [In that case], they would be more likely to un-pause it at some point.

Egor: And do a lot of people come back?

Ruben: Yep.

Egor: So, it works.

Ruben: It seems to be working for us.

Egor: How does it work? When someone cancels BidSketch, does their account get deleted straight away, so they have to create a new one? What is the process? Is the data not saved anyway? What is the incentive for people to pause?

Ruben: If they were not to pause, if they were to cancel, then all their data gets deleted. If they were to come back, they would have to create a new account and recreate everything.

Egor: Are they being notified of that in advance? If I am cancelling, am I being told that all the data will be deleted?

Ruben: Yes, when people cancel, we explain to them that their data is going to be deleted. We make them tick an extra check-box. We explicitly prompt them during the cancellation flow, so they can choose to pause instead of cancelling.

Egor: I want to clarify something about these interviews. You seem to have mentioned 3 types: Jobs-to-be-Done interviews, switch interviews and cancellation interviews. Are these all separate types?

Ruben: It’s the same type; I just cycle through the names. But I would say there are 2 types: ‘switching to’ and ‘switching away from’ interviews. We have also done a lot of regular customer development interviews.

Egor: And what exactly do you mean by that?

Ruben: Just interviews that are generally shorter and more direct. We are not capturing the story of why they switched. They are for people who have already been using the product, usually when we are trying to get more insight into some data we have collected somewhere, or clarification around something.

We ask very specific questions about, for instance, the proposal thing – the term. What sort of documents are they sending through BidSketch? What do these documents contain? Are they meant to close a sale? Are they being sent through BidSketch, or through email or other apps? This is an example of us getting more data through short custdev interviews with very direct questions.

Egor: So, with switch and cancellation interviews, you are trying to understand the Jobs-to-be-Done. With regular ones, you are just trying to clarify any questions you have about a certain aspect of your existing data.

Ruben: Right.

Egor: And with Jobs-to-be-Done, what questions did you find the most useful?

Ruben: I actually have them in my blog post. There is a section in there, a cheat sheet with all the questions.

Part 3: What experiments did Ruben run on the pricing page? How did a quick copy change help him to increase the trial sign up rate? What tools does he use for tracking and testing?

Egor: Coming back to the original redesign of your pricing page, you said you looked at the data. How did you look at it? Did you use any tools, or did you just have it in your back-end?

Ruben: Both our back-end and Kissmetrics.

Egor: And what did you use for experimentation, for A/B testing?

Ruben: It was a combination of Optimizely and Kissmetrics. Optimizely is good for redirecting traffic and seeing the results on the page we are testing, and then we use Kissmetrics to see the impact throughout the funnel, on sign-ups, cancellations, etc.

Egor: So, tracking long-term effects.

Ruben: To make sure that, yes, it helped our conversions, but also did not negatively impact cancellations or anything else.

Egor: Now I want to dissect your current pricing page. As you can see, I numbered every element of your current pricing page.

A couple of things are going on here that I find interesting. The first thing you do is communicate your value proposition in the headline. Then, you seem to communicate not just the value of your product, but the value of the free trial itself. I looked at SumoMe’s pricing page today and they did not have any of those elements. How did you come to that?

Ruben: Number one used to be number two, based on some other page, I think it was Basecamp or something similar. At that time, I had not done a lot of testing around ‘get started in less than a minute’ or ‘get started quickly’. That seemed like a good idea, I had it on there, and I wanted to test something different, basically just to test the value proposition. So we tested that and it did a little bit better. We kept it, and I did not have number two at all.

Then there were questions we asked through Qualaroo surveys, asking people what was stopping them from signing up. That made me want to test number two underneath, and it helped a little bit.

We did not see really huge jumps in trials, most variations were just a little bit better, so we left number two in. I was actually kind of surprised by number two. I tested it, but I remember thinking, ‘yeah, it probably won’t do anything, but I just can’t think of anything better’, and in the test it actually worked. I thought, ‘Ha! They are reading that and it actually makes a difference to them!’

Egor: Was the impact just on free trial sign-ups, or did it translate into actual sales?

Ruben: Yeah, it did! And the order of the plans: we had them ordered differently, from small to big, and we tested that. Sign-ups mostly stayed the same, but the distribution was a little different. Our revenue per customer was a little higher, it got more people paying on the higher-tier plans.

Egor: It also seems to me that you are trying to communicate value through the tooltips for your features. For example, the explanation for Analytics is not tied to some metric, like the number of hits you have got or some other technical measure; it is more about how people would use it. If I were about to sign up for BidSketch, I would see immediate value in being able to track my clients. Was it a separate test, or did you just think this was a sensible thing to do?

Ruben: Yeah, I did not test that. It just made sense to do it the way we write up our features on the features page, the tour page, and anywhere else we explain them: trying to make it clear where the value is. Those have changed, and it’s been mostly about clarity, because through Qualaroo surveys on that page I have seen, from time to time, the questions that people have.

Also, in Crazy Egg I saw, ‘Yep, they are using them’: they are hovering over them, they are looking at them, but maybe I am not explaining things clearly enough, or they do not make enough sense.

Egor: I can also see that further down you are using social proof, and you also have an FAQ in order to, in my understanding, address some of the main objections. Was that tested separately, or was it just a sensible thing to add?

Ruben: You know, I have not tested the FAQ. It was added based on the questions we saw people asking. For example, when we asked them through Qualaroo why they didn’t sign up, or what was keeping them from taking on a plan… that’s what we used that area for.

And the social proof reflects the results that people who sign up talk about, the outcomes people want. Two of the testimonials are about saving time. Any time we have tested what people want, like closing more deals, saving more time or making more money, saving time on proposals always wins. That’s why those specific testimonials are there. That said, there is also interest in closing more sales, so we have one focused on that.

December 21st, 2016

About the Author


Egor Driagin

Egor Driagin is the Content Strategist at Conversion.com. His mission is to convert Conversion.com’s insider knowledge into actionable insights that you can start applying today.


 
 
