Eric Enge and Jonathan Mendez talk about Landing Page Optimization
Podcast Date: November 20, 2007
The following is a written transcript of the October 19, 2007 podcast between Jonathan Mendez and Eric Enge:
Eric Enge: Hi, I am Eric Enge, the President of Stone Temple Consulting. You can see our website at www.stonetemple.com. We are here today with Jonathan Mendez, Founder & Chief Strategy Officer for OTTO Digital, and we plan to talk about Landing Page Optimization. You can see the OTTO Digital website at www.ottodigital.com, that's O-T-T-O Digital.com. And, Jonathan also authors the Optimize & Prophesize blog at www.optimizeandprophesize.com. How are you doing today, Jonathan?
Jonathan Mendez: Great, Eric.
Eric Enge: That's good. So, let's get started, why don't you start by defining Landing Page Optimization?
Jonathan Mendez: Wonderful; and first, thanks for having me on the podcast. I would define Landing Page Optimization as using testing and targeting to provide measurable improvement in performance. And, I think we need to first define what a landing page is. I think that some people are not benefiting from a landing page optimization strategy, or doing any landing page optimization at all, because traditionally landing pages were seen as a paid search strategy.
People don't recognize their site pages as landing pages. This is because of the way that some publishers view their content. The fact of the matter is just about every page is a landing page; a person is coming to that page from somewhere. Knowing their source is an incredibly important tool, and really the first helpful step in landing page optimization. Knowing more than sources is even better; we can talk a little bit about that as we go on, but in the end I would say that landing page optimization is defined by creating experiences that deliver a higher degree of relevance towards the goals and intentions of your users.
Eric Enge: Right. So, one of the things you brought out is that somebody may arrive on the homepage of your site, and that makes it a landing page for sure. But then they take the next step and land on a new page. So, you've got to optimize through that experience if that's a common path.
Jonathan Mendez: Absolutely, and many times the homepage itself, as you mentioned, is the landing page. Especially if you think about all the people that come from brand search terms. I would recommend different strategies for someone coming from a brand term to a homepage than for someone who comes to the homepage from a more generic type of query. So, we have the philosophy that every page is a landing page, and I think when people start to think of it that way and start to work towards improving their results in that manner, they start to see some great success.
Eric Enge: Right. So, what are the ways that you go about doing landing page optimization?
Jonathan Mendez: The idea behind A/B testing is that we are testing a single element of the experience. Sometimes we test more than one variation of that element, so we might be doing an A/B/C test if we want to test two variations against the control, or what we sometimes call A/B/n testing; sometimes we've had five landing pages for a single ad group or keyword. But, the idea is that only a single element is changed, so when you think about split testing or A/B testing, you are only changing a single element. That said, the element can be the entire page; it can even be a series of pages.
Sometimes we do A/B testing of traffic flows, such as a conversion funnel. Other times though the element is a section of the page, so it can just be a specific area like a call to action or a headline. In my experience, testing entire pages is where we really see the largest improvements overall, and that's also why we usually start a comprehensive landing page optimization strategy with A/B tests of entire pages, where we can test wholesale changes of layouts and messages.
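The A/B/n setup Jonathan describes can be sketched in a few lines. This is a minimal illustration, not any vendor's actual implementation; the page names and visitor ids are hypothetical. Hashing the visitor id, rather than picking a variant at random on each request, keeps a returning visitor in the same variant for the life of the test.

```python
import hashlib

def assign_variant(visitor_id: str, variants: list) -> str:
    """Deterministically bucket a visitor into one of n variants.

    The same visitor_id always maps to the same variant, so repeat
    visits see a consistent experience during the test.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# A control plus two challengers: an A/B/C (A/B/n) test of whole pages.
pages = ["control.html", "variant_b.html", "variant_c.html"]
print(assign_variant("visitor-42", pages))
```

The same function covers a plain A/B test (two variants) or the five-landing-pages-per-ad-group case he mentions; only the list changes.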
Eric Enge: Right. What are examples of some kinds of things you might test this way?
Jonathan Mendez: So again, A/B tests we have had a lot of success just testing completely different pages. So, one of the key strategies in successful landing page optimization is what I call creative differentiation. You don't want to test things that are too similar to one another. And, so we'll test the landing pages that have a completely different look and feel, radically different from one another, both in the messaging and the layout or the imagery used. You can be quite successful doing those kinds of A/B tests.
The other A/B tests that are really successful many times are testing elements being present or not present. Many times we think about adding things to a page to improve the page performance, but a more reductive strategy many times has better results. So, start to think about what can you take away from the page and actually test removing things, having things again being present versus not present. You start to see some great results that way as well.
One I would call out would be navigation. Many times people think that, well, it's our landing page, it shouldn't have any navigation at all on it. Sometimes that's true, but through testing we found that many times navigation is actually helpful for improving conversion rates. So, I think something we should all keep in mind is that so much of what we find when we test is counterintuitive to what we thought. It makes us test even more and more, and again it's really important to look at the results and keep testing.
Eric Enge: Yeah. You've triggered three thoughts in my mind there, so I am going to try to remember them all here. One is I think too many people use their gut feel and come up with the best design, or close to the best design, for their intended purpose. And, so much of what you find when you do this kind of testing is counterintuitive; it's not what you expected. Your audience isn't what you thought it was, and you really need to not only be prepared for that, but you want to embrace that, because it describes the opportunity in this kind of testing.
Jonathan Mendez: Oh, absolutely. I would say one of the things we find again quite a bit is, as you mentioned, different people respond to different things. Within one particular test, when we start to look at the results and filter them by different segments, what we'll find is that even within one specific A/B test, if you segment the results, say, for people coming from Google versus Yahoo, or people who've been to the page a second time versus the first time.
What you'll see there is the results are different even within the one test. So, again based on source and behavior, relevance means different things to different people, and that's where we start to really craft again a holistic and comprehensive landing page optimization strategy. I would say that the core of a good landing page optimization strategy if possible, is to look at your results in a segmented way and start to deliver landing pages by segments that have a higher degree of relevance.
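The segmented read of one test that Jonathan describes can be sketched as follows. This is a hypothetical example, not his firm's tooling; the sources and visit log are invented for illustration, and a real analysis would also check statistical significance per segment.

```python
from collections import defaultdict

# Hypothetical visit log for one variant: (traffic_source, converted) pairs.
visits = [
    ("google", True), ("google", False), ("google", False),
    ("yahoo", True), ("yahoo", True), ("yahoo", False),
]

def conversion_by_segment(log):
    """Conversion rate per segment. The same page can win for one
    source and lose for another, which a blended total would hide."""
    totals = defaultdict(lambda: [0, 0])  # segment -> [conversions, visits]
    for segment, converted in log:
        totals[segment][0] += int(converted)
        totals[segment][1] += 1
    return {seg: conv / n for seg, (conv, n) in totals.items()}

print(conversion_by_segment(visits))
```

With this toy log, Google traffic converts at one in three while Yahoo traffic converts at two in three, exactly the kind of split that motivates serving different pages by segment.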
Eric Enge: Right. So, you also mentioned that a lot of people think you want to remove the navigation; that's their instinctive response, because they think of navigation as a distraction from the goal of the page. But, there are certainly groups of users out there who, if they come to a page and see that it doesn't have any navigation, find it a turn-off or a negative, because it doesn't look like a real business.
Jonathan Mendez: Absolutely, it doesn't look like a real business, and we found that especially for sites about financial services, for example, navigation could be tremendously helpful. As you mentioned, it gives a sense of confidence, a sense of trust to the landing page, and generally I like to err on the side of providing a better experience for the user, which that does. And again, this is why we test these things, but you are right; intuitively you would say, well, you don't want to take people to a different section of the site.
The fact of the matter is if your landing page is relevant, if the content of your page is relevant and the message addresses the goal of the user, the navigation won't be necessary. It will start to reinforce certain things, but if your page is done well and optimized, people won't be using it; they will be doing what you want them to do.
Eric Enge: Right. So, what about multivariate testing, how is it different and how do you handle different complex scenarios with it?
Jonathan Mendez: Sure. Well, multivariate testing is a really great optimization technique. It's probably the most interesting one, because it provides learning into the factor of influence that particular elements on the page have on overall performance. It allows you to test a large number of elements, and it provides you with the data that informs you of the best mix of all the different element variations that you are testing. So, MVT is really a test scenario where you are listening to your audience and learning, and because of all this learning it naturally leads to a lot of follow-on testing and a never-ending stream of test ideas. It really allows you to be iterative, and it's also highly addictive.
The multivariate testing algorithm that we use in the Offermatica tool is called the Taguchi algorithm. It's been tested and proven both digitally and manually over about fifty years. So, it wasn't invented for digital or online testing at all; it was actually, I think, first used in Japan to optimize assembly line production for automobiles. I guess we know how well that worked out for the Japanese auto industry.
Eric Enge: Right.
Jonathan Mendez: What it does is allow the creation of what we call Taguchi orthogonal arrays. Now, this lets us create what's called fractional factorial testing. What this means, simply, is that the number of tested combinations is reduced. So, the fractional factorial arrays allow us to get faster results without affecting the overall accuracy of the data. We use it, and we like it, because we get results in weeks with this type of testing rather than months, and months, and months. So, we need less data to get confidence.
The other way of doing multivariate testing is full factorial testing, which requires testing all the variations possible, and I am not saying there is anything wrong with that. I think it's obviously great if you can test every single factor together. However, many times that's not possible, because it takes so much data and such a long time to get results with those types of tests; you need a lot more data in order to get statistical confidence in the results.
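The gap between full and fractional factorial testing is easy to see with a concrete array. The sketch below uses the standard Taguchi L9 orthogonal array for four factors at three levels each; the array itself is textbook, but treating the four factors as page elements is this author's illustration, not Offermatica's internal design. A full factorial on the same elements would need 3^4 = 81 recipes; the L9 covers the main effects in nine, because every pair of columns contains every level combination equally often.

```python
from itertools import combinations, product

# Taguchi L9 orthogonal array: 4 factors (e.g. page elements),
# 3 levels each (levels 0..2), covered in only 9 test recipes.
L9 = [
    (0, 0, 0, 0), (0, 1, 1, 1), (0, 2, 2, 2),
    (1, 0, 1, 2), (1, 1, 2, 0), (1, 2, 0, 1),
    (2, 0, 2, 1), (2, 1, 0, 2), (2, 2, 1, 0),
]

def is_orthogonal(array, levels=3):
    """Check that every pair of columns contains each pair of levels
    equally often -- the defining property of an orthogonal array."""
    expected = len(array) // levels ** 2
    for a, b in combinations(range(len(array[0])), 2):
        pairs = [(row[a], row[b]) for row in array]
        for combo in product(range(levels), repeat=2):
            if pairs.count(combo) != expected:
                return False
    return True

full_factorial_recipes = 3 ** 4  # 81 recipes to test every combination
print(len(L9), full_factorial_recipes, is_orthogonal(L9))  # 9 81 True
```

Nine recipes instead of eighty-one is where the "weeks rather than months" claim comes from: the traffic needed scales with the number of recipes you must fill with visitors.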
This brings us to a very important point about multivariate testing, and that is the amount of traffic, or tested visits, that you are getting for your test. Even more so than with A/B testing, the type of multivariate test you run should be determined by the amount of traffic that you are going to be able to get into the test and the conversion rate that you estimate you will receive; then design the type of multivariate test you want to do around that data. Otherwise, you are going to spend a lot of time creating what may be a great test, but one that will never get results or statistical confidence.
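The traffic planning Jonathan recommends can be roughed out before a test launches. The sketch below uses the standard two-proportion sample-size approximation (two-sided 95% significance, 80% power, z-values hardcoded); the input numbers are hypothetical, and the result should be read as an order-of-magnitude planning figure, not a guarantee.

```python
from math import ceil, sqrt

def visitors_per_variant(p_base, rel_lift, alpha_z=1.96, power_z=0.84):
    """Approximate visitors needed per variant to detect a relative
    lift in conversion rate (95% two-sided significance, 80% power).

    Standard two-proportion sample-size formula; p_base is the
    baseline conversion rate, rel_lift the relative lift to detect.
    """
    p_test = p_base * (1 + rel_lift)
    delta = p_test - p_base
    pooled = (p_base + p_test) / 2
    n = ((alpha_z * sqrt(2 * pooled * (1 - pooled))
          + power_z * sqrt(p_base * (1 - p_base) + p_test * (1 - p_test))) ** 2
         / delta ** 2)
    return ceil(n)

# A 2% baseline conversion rate and a hoped-for 20% relative lift:
print(visitors_per_variant(0.02, 0.20))  # roughly 21,000 per variant
```

Multiply that by the number of recipes in the test and divide by daily traffic, and you can tell up front whether a given design will ever reach statistical confidence.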
Eric Enge: Right. So, if you are testing six or eight different elements on a page, that's a lot of variants if you were to do what you call full factorial testing, which is where you'd have to show all combinations of those elements. Then you'd have to have enough data to be able to draw a meaningful conclusion. That's, I think, as you said, where fractional factorial testing fits into the picture. As I understand it, there is a process where the algorithm decides which elements of the test it's going to treat as being similar in nature, and maybe it won't vary them all at once, right? And it starts testing scenarios which it thinks will be statistically significant, even though you are not testing each variant data-wise. Does that make sense?
Jonathan Mendez: Yeah, absolutely. That gets into the array that's created and forms the actual design of the test. That gets back to the number of pages that you are actually creating, and what different combinations need to be present on those pages.
Eric Enge: Right. Do you need to have a dynamic web application to do this stuff?
Jonathan Mendez: No; any page optimization can be done without advanced tools. Multivariate arrays go back for years; they have been done on spreadsheets. You can still do them on an Excel spreadsheet; I believe someone has a website somewhere where you can create an array online. A/B testing, of course, can be done across different domains, or different URLs, or using redirects. I was doing A/B testing, like many other folks, back in 1998, 1999, well before these types of software were available.
The software just makes it a lot easier to do: it's easier to create the tests, and it certainly helps with all the data collection. So, the tools that we have today, ranging from free tools from Google to other tools like Offermatica's, really allow you to do so much more and get incredible production. But, do you have to have those advanced applications and tools? No, absolutely not.
Eric Enge: Right. A really simple example of course is in your Google AdWords account, you implement two different ads and you just specify different landing pages.
Jonathan Mendez: Yup, absolutely.
Eric Enge: That's a very simple trick that anybody could do even with a static website.
Jonathan Mendez: Yeah.
Eric Enge: So, in the spirit of Danny Sullivan's SMX Advanced session called give-it-up, do you have a cool secret of landing page optimization you can share with the audience?
Jonathan Mendez: Well, in the spirit of Danny, I will share something. One very important thing that I always like to keep in mind is that the success you are going to have in your testing is usually determined prior to you even launching the test, in terms of the test designs that you go through and the creative that you come up with. So, the key to success is not necessarily the techniques or tools that you are using, but really the creative elements of the test.
When I see people fail, it's either that the test design was wrong for the amount of traffic they were testing or, even more often, that the creative they tested did not have enough differentiation. So, as I mentioned earlier, creative differentiation is really key. I would say the hidden secret is to be radical; take chances on the creative end. Test pages that look nothing like one another; try messaging that is radically different from one page to another.
What you find when you do that, when you take those chances, is somewhat counter to the wisdom of the crowds: a majority of users will be drawn much more to one than to another, and that really gets you a big win. When we look at instances where we've had improvements in conversion rate of 50%, or 150%, or 250%, it's always from testing elements that are diametrically opposed to one another, that are creatively much, much different from one another. So that's my secret of success.
Eric Enge: Excellent. If you just try and tweak small little things, you might get small little results, and you need to do more if you want to get big results.
Jonathan Mendez: Go big or go home.
Eric Enge: There you go. Okay, and how about a case study? Do you have a case study you could talk about?
Jonathan Mendez: Sure. Well, we recently did some work with a client, Share Builder. Share Builder is an interesting company; they are in the business of helping people build portfolios of stocks, and they have an interesting model. What we did for them was interesting, because we really looked to use a segmentation strategy. When we sat down with Share Builder, we really wanted to understand their paid search campaign. These were going to be landing pages from paid search that we were looking at, and they challenged us: they had a landing page that had never been beaten.
They had tried many different things, but what we did was break down their search traffic, because they had one landing page for everything. We broke their search campaign down and looked at the five ad groups that they were getting the majority of traffic from: ad groups around their brand terms, investing terms, beginner investing terms, stock trading terms, and buying stock terms.
We also took some traffic from display ads in finance verticals and created that as another segment. Then, thinking about the different goals and intentions of those different segments, we crafted landing pages that we believed spoke to those goals, providing more relevance.
Someone coming from a brand term knows a little something about Share Builder. Someone coming from a beginner investing term would certainly want certain messages and information that the branded person might not need. So, we ended up creating, I think, seven different landing pages for them around different themes. We also did some A/B testing, again segmented by ad group, and put some of the landing pages into other ad groups. For example, we looked at the buying stock pages and thought, well, maybe we would put one of those into the brand group to see how it would do.
The thinking behind that was to get learning on performance; they may or may not perform the best, but they would certainly tell us more about the content and messaging that might work in follow-up tests. We're always trying to think ahead to the next test and learn what will help us then. So, we created all these pages and ran a whole bunch of different A/B/n tests; each ad group had between four and six different landing pages that we drove traffic to. We'll take the branded one to really do a deep dive into.
What we had there was actually a 142% lift on one of the recipes. Getting back to what we spoke about earlier in terms of radical changes, the winning recipe actually used all the content from their existing landing page, the one they said could never be beaten; but we radically changed the presentation of it. We created a page that looked really nothing like what they had, and that one ended up being the winner. A huge win, again a 142% increase on their branded terms, which for them was huge, because that's where they were getting the vast majority of their traffic.
It speaks to the creative part of landing page optimization, which I think is probably underappreciated in terms of its importance to a successful strategy. We took that winner, but the other interesting thing was the three other pages that were tested versus the control: they each outperformed the control anywhere from 30% to 60%. We took the learning from that and used those different messages, incorporating them into a follow-up multivariate test.
We did a 4/3 multivariate test as a follow-up: taking four elements of the page and testing the control versus two variations of each. The elements we took on the page were the headline; the benefit statement (we had benefits on the left); more information about the company on the right; and navigation being present or not present, which, as we spoke about earlier, is a really good test element. So, a really nice mix of different messaging and things being present or not present, and the follow-up to the large lift was an additional 14% lift from the multivariate test.
What was also interesting there was what we call the element contribution report: which of the elements we tested had the largest factor of influence on conversion. It's an interesting point in this case study, because it came full circle to what we talked about earlier: having the navigation bar actually had a 47% influence on conversion rate. So, having navigation in this particular test worked out to be really, really beneficial.
Actually, I have a deeper dive into that particular test on my blog, if you do a search for it. I think it's really a good case study in terms of segmenting your audience, creating A/B test scenarios, and then following up on that learning with a multivariate test to get even better results and continued learning that can be applied elsewhere.
Eric Enge: Right. So, when you talked about a 142% lift just to get it clear on people's mind, you mean it was almost two and a half times the original conversion rate?
Jonathan Mendez: Yes. We think we do a pretty good job, but even those kinds of results don't happen every day. It's certainly nice to talk about them, but yes, that is correct.
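The arithmetic behind "a 142% lift is almost two and a half times the original" is worth making explicit. A relative lift of X% multiplies the base conversion rate by (1 + X/100); the 3% base rate below is a hypothetical number for illustration, not Share Builder's actual figure.

```python
def lifted_rate(base_rate_pct, lift_pct):
    """Apply a relative lift: a lift of X% multiplies the base
    conversion rate by (1 + X/100)."""
    return base_rate_pct * (1 + lift_pct / 100)

# A 142% lift on a hypothetical 3% conversion rate:
print(lifted_rate(3.0, 142))  # 7.26 -- i.e. 2.42x the original rate
```

So a 142% lift means the new rate is 2.42 times the old one, which is why "almost two and a half times" is the right reading, not "142% of the original".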
Eric Enge: What probably happened is that they did their own optimization, but they did it very iteratively rather than making radical changes.
Jonathan Mendez: Actually, they did multivariate testing before, and they did full factorial tests on that particular page. So, they did get some insight and created a good page before we even got there, which made it challenging for us to try to beat. It was one of the better landing pages I've seen. Usually when we come into a client, we know we are going to have an easy time of it, because the landing pages leave a lot to be desired. But, in this particular case, they had been testing for over a year and a half already when we came in.
They had a page that was really, really working for them, but they just weren't satisfied. They wanted to take it to a higher level, and I think that really speaks to testing. You want to keep testing, and these guys as a client had a tremendously great attitude towards testing; they understood the value of it. They created an infrastructure that allowed us to test, they were accepting of the kind of bold ideas and radical thinking that we presented to them, and they were able to reap the rewards of that.
So, one thing that we didn't talk about is that to be successful, you have to have the mindset in place, and that requires buying into the idea that nothing is sacred, everything needs to be tested and validated, and the data doesn't lie. All those things will breed success, but again, it's much easier said than done for many people.
Eric Enge: Right. You need to be willing to take chances, as you have said before, and sometimes for larger organizations that have invested a lot in their brand strategy that could be hard, but there is a lot to be gained. Can you provide some screenshots that we can include in the transcript for people?
Jonathan Mendez: Sure. I'd be happy to send those over to you.
Eric Enge: That would be great. Well, thanks for taking the time to talk to us today Jonathan.
Jonathan Mendez: My pleasure; thanks for having me.
About the Author
Eric Enge is the Founder and President of Stone Temple Consulting (STC). STC offers Internet marketing optimization services, including SEO, Social Media and PPC optimization, and its web site can be found at: http://www.stonetemple.com.