Enquisite’s Richard Zwicky Interviewed by Eric Enge


Richard Zwicky has been involved in search marketing for 10 years, starting in the late 1990s. He entered the industry by managing the online campaigns for his own successful e-tail operation, which quickly led to Metamend, a leading search engine optimization firm he co-founded in 2000. As CEO of Metamend, he managed and led optimization campaigns for web properties ranging from SOHOs to Fortune 500 sites.

He split Metamend and Enquisite into separate companies in 2006, as Enquisite's services are designed for use by any SEO or SEM. Today he leads Enquisite, which recently released its first products. Richard's work focuses on helping search marketers manage campaigns more easily and with greater success. He believes in long-term, successful campaigns that are built from the ground up and are never caught flat-footed by shifts in search engine algorithms or by regional variances in search user behavior.

Interview Transcript

Eric Enge: Can you tell us a little bit about Enquisite Optimizer?

Richard Zwicky: Enquisite Optimizer is built from the ground up for search marketers. I used to run a lot of campaigns, and it was always frustrating and time-consuming to get the right data out of existing web analytics. A lot of the time it was even impossible because, quite honestly, the focus in most analytics products isn't on organic search marketing campaigns; they focus far more on paid search. The legacy of analytics products goes back to the days when IT needed data about page load times and information like that, and they have continued to be built on that foundation.

We came at it from a completely different angle, and developed a new way of collecting, processing and reporting the data to help search marketers do the job more efficiently and deliver higher value to their customers from a variety of perspectives.

Our long tail analysis is one great example of that. People like Rand Fishkin were always looking for better data about the long tail of the campaigns they run. They want to visualize it, understand it and understand its shape, because a good site has a very standardized traffic shape and pattern for its tail, whereas tails from less well-optimized sites don't follow the standard form.

But unless you can visualize it, you can't understand it or know how to deal with it. So we built that element of the report around the long tail graph, with a choice of overlays, so you can also visualize where your traffic is coming from. We built it using the logic behind how you would actually run a campaign. Of course, different businesses have different needs. For example, a retailer that sells only within the US will not care about search traffic coming from anywhere else; they want to understand what is coming from the US. You should be able to analyze that and turn that data into action. The application is built so that you can segment and break out your traffic by the logic with which you actually went into business. You can segment geographically down to the zip code level.

As another example, a standard analytics package will tell you which phrases bring users to your site and which search engine sent those users. We make it possible to segment your visitors by which webpage they landed on, from what geography, and by a variety of other parameters. So you really get what you want, the way you want it. If you are an SEO firm, you might want to target the word organic and find out all the different ways that people are using organic to arrive at your website (long tail segmentation): show me all the strings that include the word organic, show me what that tail looks like, or show me everything that includes the term SEO.
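This kind of stem-based segmentation is easy to picture in code. The sketch below is purely illustrative, with invented data and function names; it is not Enquisite's actual API, just the underlying idea of filtering referral queries by a stem word and ranking them by volume.

```python
# Illustrative stem segmentation (not Enquisite's API).
# Referrals are (query, visit_count) pairs.
referrals = [
    ("organic seo services", 120),
    ("organic search optimization", 85),
    ("buy organic coffee", 40),
    ("seo consulting", 300),
    ("what is organic seo", 15),
]

def segment_by_stem(referrals, stem):
    """Return all referral queries containing the stem word, highest volume first."""
    matches = [(q, v) for q, v in referrals if stem in q.split()]
    return sorted(matches, key=lambda pair: pair[1], reverse=True)

organic_tail = segment_by_stem(referrals, "organic")
# Every query string containing "organic", sorted by traffic volume.
```

"Show me what that tail looks like" then becomes a matter of plotting `organic_tail` by volume.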

Eric Enge: So you can specify a stem, like you can in Wordtracker or KeywordDiscovery?

Richard Zwicky: Exactly, but it also provides analysis as well. Simply because you are getting traffic doesn’t mean that it is good traffic. One of the other challenges that marketers have is the ability to see what the traffic really looks like and to understand what part of that traffic is actually meaningful and relevant. If you are a retailer, you would care about conversion; if you are a publisher, you would care about page views and time on site; Enquisite Optimizer discovers and reports on what is optimal on a site-by-site basis. It compares all of your referral traffic to identify optimal patterns in terms of the user behavior and which traffic has the highest potential and which one has the lowest. Just because you are getting a lot of traffic for a term, doesn’t mean it is actually ever going to result in conversion.

There is a mathematical process that shows what normal converting traffic looks like and which other traffic matches up. You can actually target the right terms for the right pages and do a better job of shaping your traffic. This saves you from trying to do it yourself through trial and error, which might take months. With our system, within a couple of days you will start seeing patterns of what is normal and what is optimal. Then all of a sudden it's helping you make those decisions, so you can get on with optimizing and building out your campaign rather than sitting there trying to figure out what to do next.
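One simple way to picture that process is to compare each term's conversion rate against the site-wide baseline and flag terms that fall well below it. The data and threshold below are hypothetical, and this is only a toy version of the kind of analysis described, not Enquisite's actual model.

```python
# Hypothetical traffic data: term -> (visits, conversions).
traffic = {
    "blue suede shoes": (200, 12),
    "shoe pictures": (500, 1),
    "buy blue shoes": (150, 9),
}

total_visits = sum(v for v, _ in traffic.values())
total_conv = sum(c for _, c in traffic.values())
site_rate = total_conv / total_visits  # site-wide baseline conversion rate

def classify(term, threshold=0.5):
    """Flag a term as low-value if it converts at well under the site baseline."""
    visits, conv = traffic[term]
    rate = conv / visits
    return "optimal" if rate >= site_rate * threshold else "low-value"
```

Here `classify("shoe pictures")` comes back "low-value": lots of visits, almost no conversions, exactly the high-volume-but-irrelevant traffic Richard describes.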

Eric Enge: You mentioned a little earlier that the shape of the long tail curve behaves differently for sites that are less well-optimized. How is it different?

Richard Zwicky: It's actually quite interesting. Normally there are a very few search terms that bring large amounts of traffic, and a much larger number of terms that each bring relatively small amounts of traffic. As it turns out, the cumulative value of all the low volume terms is about 70 to 75% of your total traffic. In other words, the number of smaller traffic terms is so large that they cumulatively deliver more traffic than your high volume terms.
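The head/tail arithmetic here is worth making concrete. The numbers below are made up purely to illustrate the shape Richard describes: a couple of big head terms versus many small tail terms whose combined volume dominates.

```python
# Invented example of a well-optimized site's traffic distribution.
head = {"shoes": 900, "blue shoes": 600}                  # few high-volume terms
tail = {f"long tail term {i}": 25 for i in range(140)}    # many low-volume terms

head_total = sum(head.values())   # 1,500 visits from the head
tail_total = sum(tail.values())   # 3,500 visits from the tail
tail_share = tail_total / (head_total + tail_total)
# tail_share == 0.70 -- the tail delivers 70% of all traffic,
# matching the 70-75% figure for a well-optimized site.
```

For a poorly optimized site, shrink the number of tail terms and the same calculation drops well below 50%.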

But the tail in a poorly optimized site is constructed slightly differently, with much more of the search volume going to the high volume terms, and with a much smaller tail. If you target a term like blue shoes, you need to understand all the variations of the ways it may come up. You want to be able to capture things like blue tennis shoe, blue running shoes, blue canvas shoes, blue leather shoes, blue suede shoes.

That is the part of your site's referral tail that a lot of times people don't optimize properly against. But as soon as you start recognizing these terms, you are not just capturing those variations; you are also getting blue canvas and blue canvas deck shoes, which builds it out more and more. And that's what you see in a well-optimized campaign: a rich variation of terms, focused on certain themes, but all pointing back to the same core term you want to capture.

As people construct longer and longer search queries, they are getting more and more specific about what they are looking for. The reason is that they are highly motivated: they are looking for what they want and they want to get on with it, whether that's purchasing it, finding information about it, or acting on it. And when your tail is properly constructed, you are capturing all those variations through the optimization of your site, and you can actually see it reflected in the tail.

Eric Enge: So in a well-optimized site, you might have 70% of your traffic coming from a long tail, but on a poorly-optimized one, it might be the opposite.

Richard Zwicky: Yes, that’s a good way to put it.

Eric Enge: Being able to visualize your own long tail is huge, because so much depends on whether the site is optimized properly or not.

Richard Zwicky: There are still opportunities to grow and improve any site's long tail, even if it is already well optimized.

Eric Enge: The way you collect data is through JavaScript on the publisher's site?

Richard Zwicky: Yes, we provide every website operator with unique JavaScript for their site. They put it everywhere on their site, not just on particular pages, because we provide user behavior analyses to give them more information. What is nice about it is two-fold. The JavaScript is actually served off the Akamai network, so instead of having to log all the data to one central point, we can use the nearest server Akamai has, which makes it very responsive and very robust. This generally provides a load time of 12 milliseconds or less for anybody on a broadband connection anywhere on earth. And this means we don't miss data even if people start loading a webpage and click the first link to move on before the page has fully loaded. Because we have already captured the log information on their behavior, we can report on it and add value for our clients.

This also means that we don't have to resort to sampling to fill in gaps, which is a critical issue with some analytics. When we report, we know it's the accurate, comprehensive data we saw and that the customer actually received, as opposed to hypothesizing or extrapolating to complete a picture. You don't have to worry about whether 10% of the data is missing; that 10% can be crucial. In this case, nothing is missing. The only thing that would cause data to be missing is a visiting user who has disabled JavaScript in their browser. The other advantage we have is that we provide a single JavaScript tag for everywhere on your site. So when you want to analyze outcomes, conversions, actions or anything like that, you don't have to modify the script on a page-by-page basis. You configure it once, specify what events you want to track, and then go backward and look through all the data.

You actually collect everything with that one JavaScript tag. If you have had the tag in place for a year, and then you realize you want to do a new analysis of the data over the past year, you can do that. This is not easily doable in many web analytics packages. Additionally, the JavaScript tracks sessions across multiple visits so you are able to understand attribution over time, not just attribute all your sales to the last click. If somebody came back twenty times and finally made a purchase, you can see how they first got there and when they came back the second, third and nineteenth times. This way you can actually understand how all of your online marketing efforts start fitting together, and that’s incredibly valuable.
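The attribution point deserves a concrete illustration. The sketch below contrasts last-click and first-click credit for a single visitor's session history; the data and function names are invented for illustration and are not Enquisite's implementation.

```python
# A visitor's ordered referral history across multiple visits,
# ending in the visit where they finally purchased (invented data).
visits = ["organic:blue shoes", "email", "organic:blue suede shoes", "direct"]

def last_click(visits):
    """Credit the sale entirely to the final visit's source."""
    return visits[-1]

def first_click(visits):
    """Credit the sale to the source that first brought the visitor."""
    return visits[0]

# Last-click attribution credits "direct" and completely hides the
# organic search visit that originally brought the buyer to the site.
```

With the full history retained, you can also spread credit across all four touches instead of picking one endpoint, which is the "fitting together" Richard describes.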

Eric Enge: Let’s talk a little bit about Enquisite Campaign.

Richard Zwicky: Thanks! We are getting phenomenal feedback and response from the people who have been using Enquisite Campaign as beta testers in the lead up to launch, which occurred on May 19.

In the search marketing industry, by which I mean the paid side of search, the market has a very straightforward business model that everybody understands: you invest a certain amount in paid search, and your agency is compensated with a percentage of that amount for managing your spend. The more successful they are, the more you will spend in the future, and the agency succeeds as well.

On the organic side of the business, everybody is always negotiating for fixed-rate contracts. And this is fine, except for the fact that it’s very hard for anybody to understand the true nature of the opportunity or forecast what the contract should look like, or how much effort is really required to succeed.

Also, a fixed-rate contract is a disincentive to perform beyond a certain point. A lot of SEOs can deliver tremendous positive value, but they don't get paid more for finding that extra opportunity and driving all that new business to the customer. A fixed-rate contract can be almost counterproductive for them, because they are not able to leverage that opportunity as intended, or are limited by contract scope in how much value they can drive into a customer's business, thus also limiting their own profit.

Eric Enge: Right. The danger in fixed-rate consulting contracts is the consultant who is savvy enough to know when they've done enough to earn their fees. That's actually better than the consultant who isn't savvy enough to know, because they probably don't care about earning their fees; then some other client barks at them and they stop paying attention to you.

Richard Zwicky: That's correct. In the other model, you can still have a base fee for all the base work you are doing, but an incentive model allows you to stretch your goals, go for that opportunity and discover where the other opportunities are. Then you have a model where all of a sudden you are rewarded for going the extra mile, because what it basically means is that you are delivering added, unexpected and unforeseen value to the customer.

If the customer earns more sales, the consultant should win with the customer. Today you might run under a fixed-rate model and get an existing campaign running, but every time you come up for renewal there is a frustrating discussion about what value you delivered.

That discussion becomes obsolete with Enquisite Campaign. You can prove the value delivered, so it shouldn't be a question of whether you got your money's worth or not. It's more like, "Wow, I can see that not only did I get my money's worth, I got more than I ever expected. Definitely, I am renewing with you."

You need to be rewarded for the value of your work. In the present system of fixed-rate contracts there is no upside or incentive to go that extra mile, and that doesn't really reward most SEOs. A lot of large agencies are now having to focus on their SEO business models. They are struggling with how to compensate SEOs and build the right pricing models to sell to their clients. Now the ecosystem can run much more efficiently, so some of the larger agencies will go out and contract more and more SEOs in a much more efficient manner and help everybody win together. It's a win all around.

The client wins because they are getting value, and the agency wins because they get compensated for delivering that value. I mean, what more could you ask?

Eric Enge: Can you describe a little bit about how you collect the performance data and do the value calculation?

Richard Zwicky: To collect the data we use the same JavaScript that we use in all of our products, but the application actually begins by helping people determine what an opportunity really is, so you can tell whether you are focusing on something worthwhile. If a customer comes in and says they want to show up and get customers for a specific term, we are able to sit down with them and determine whether it is possible and worth the time, effort and investment.

Our system runs a really intensive series of calculations to determine how many people will search for the term over the next 30 days. Let's say the term is Blackberry. How many people are going to search for Blackberry over the next 30 days? And if you are placed in the top four, how many referrals can you expect to receive for that term?

Eric Enge: Are you using that classic AOL data for who clicks on number one or number two, or are you using your own data?

Richard Zwicky: We use our own data. We've done a lot of analysis work, and one of the beautiful things about having such a large data sample internally is that we are able to qualify, verify, validate and iterate the reference data as the marketplace changes. You are not always going to be number one, but if you are placed in the top four and bouncing around in there, what's a reasonable expectation of the referral traffic you could acquire? If you are number one all the time, you are going to exceed the numbers we lay out as potentially available, because we are doing a weighted average of how much traffic you will get if you are in the top four. We're also adding a slider so you can project "what if I only reach page 3?" type questions and answers.
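A weighted-average projection like the one Richard describes can be sketched in a few lines. The CTR figures and equal position weights below are invented placeholders, not Enquisite's measured reference data; only the shape of the calculation is the point.

```python
# Projected searches for the term over the next 30 days (hypothetical).
monthly_searches = 100_000

# Hypothetical click-through rate by ranking position (not real measurements).
ctrs = {1: 0.30, 2: 0.12, 3: 0.08, 4: 0.05}
# Assume the site "bounces around" the top four, spending equal time in each slot.
weights = {1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}

expected_ctr = sum(ctrs[p] * weights[p] for p in ctrs)
expected_referrals = monthly_searches * expected_ctr
# With these placeholder numbers, a top-four placement projects
# 13,750 referrals over the 30-day window.
```

A "what if I only reach page 3?" slider amounts to swapping in a different CTR table and reweighting.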

That's the first part of the platform. The second part is that we make it possible for you to build a campaign based around conversions if you want. Then the customer pays you as they make sales. We can also do it on a cost-per-click basis. Essentially, if you are only targeting Massachusetts for Blackberry phones, you can build a very targeted SEO campaign for that, and our system will do the calculation to determine what the fair market price for organic clicks will be.

To establish a fair market organic price, we take into account the differences between informational and transactional queries, the difference in conversion rates between paid and organic in each area, and the differences in user behavior within search results and on the website once visitors actually arrive there; in other words, how much of the organic traffic you get is actually good.

This helps establish where the market really should be, because you might want to be paid on a cost-per-click model, or a customer may want to pay you on that basis. In an affiliate model, they pay for every referral that comes through. But how do you define what the payment is? To date, there hasn't been a good model for defining that in organic search. With Enquisite Campaign, we have built one.
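One simplified way to sketch such a fair-market price is to start from the paid CPC for the term and adjust it by the ratio of organic to paid conversion rates. All figures below are hypothetical, and this captures only one of the several factors Richard lists (it ignores query intent and on-site behavior entirely).

```python
# Hypothetical market figures for a single term.
paid_cpc = 1.50            # going cost per paid click for the term
paid_conv_rate = 0.020     # conversion rate of paid search traffic
organic_conv_rate = 0.030  # conversion rate of organic search traffic

# If organic clicks convert 1.5x better than paid clicks, a click of
# equal value to a $1.50 paid click is worth proportionally more.
fair_organic_cpc = paid_cpc * (organic_conv_rate / paid_conv_rate)
# -> 2.25 under these placeholder numbers
```

A fuller model would multiply in further adjustment factors for informational versus transactional intent and for traffic quality, in the same pattern.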

Eric Enge: Thanks Richard!

Richard Zwicky: Yes, thank you Eric!
