Top 10 Bad SEO Ideas
By Eric Enge
The world of Search Engine Optimization is complicated for many reasons. For example, it is well known that the Google algorithm takes into account more than 200 factors in ranking a web page. In addition, search engines treat their algorithms as highly proprietary for two main reasons: (1) they don't want their competition to know what they are doing, and (2) they don't want web spammers to design sites to get rankings that they don't deserve.
Another reason the SEO world is so complicated is that it has changed dramatically over the past few years. What worked in 2007 stopped working in 2008. What worked in 2008 stopped working in 2009. The complexity of this environment, and the rapid pace of change, have led to many SEO myths. This article identifies the top 10 worst SEO ideas and explains why they don't work. Here is our top 10 list:
- Relying on keyword metatags: This deserves the number 1 spot, simply because it stopped working 3 years ago. Search engines rely almost solely on user-visible text on your site in order to determine its ranking. Text that is not user visible, such as the keyword metatag, stopped being significant years ago, because Spammers abused it so badly. So take the top few keywords that your page is focused on, plug them into the keywords metatag, and then forget about it.
Do implement a title metatag though, because it is user visible, and one of the most important things you can do on your page to improve its ranking. Do implement a description metatag, not because it will influence rankings (because it doesn't), but because some search engines (such as Yahoo) may use it as the description it shows in your search results under some circumstances.
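Putting the metatag advice above together, the head of a page might be assembled like the following sketch. The page title, description, and keywords here are invented for illustration, and the helper name is our own, not from the article:

```python
# A minimal sketch of the <head> markup this section recommends: a
# user-visible title, a description metatag for the search snippet, and
# a keywords metatag filled in once and then forgotten. All names and
# values below are illustrative, not taken from any real site.
def render_head(title, description, keywords):
    return (
        "<head>\n"
        f"  <title>{title}</title>\n"
        f'  <meta name="description" content="{description}">\n'
        f'  <meta name="keywords" content="{", ".join(keywords)}">\n'
        "</head>"
    )

print(render_head(
    "Cooking Schools in Boston",
    "Hands-on cooking classes for home cooks of every level.",
    ["cooking schools", "cooking classes", "culinary courses"],
))
```

The point of keeping this in one place is that the title is the element worth real effort; the keywords metatag gets filled in once and left alone.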
- Stuff keywords in invisible text: Definitely deserves the number 2 spot, because it can and will get your site banned. This includes text written in the same color as the background, or that is drawn way off the user visible page. These schemes are trivially recognized by search engines, and are treated as the act of a blatant Spammer. Don't do it. Ever. Learn The Art of Keyword Selection.
- Purchase Links: This practice is still incredibly popular, largely because there are many people who get away with doing it, and it helps them with their rankings. The problem is that it is in the strategic interest of the search engines to defeat this practice, and they are working hard to do so. Google uses three techniques to detect purchased links:
- Algorithms look for obvious patterns, such as the presence of words such as "Advertisers" or "Sponsors" near the link. Another thing they can look for is a grouping of unrelated links that don't fit the topic matter of the page where the links are found.
- Google has thousands of editors in Asia whose sole job is to review search results for quality. Part of what they are trained to do is detect purchased links and flag them.
- Google also accepts reports of purchased links and will send these for review by their team in Asia.
So what does Google do when a purchased link is detected? They flag it and make it useless from a site ranking perspective. In addition, if they detect flagrant link buying for ranking purposes, they can, and do, ban sites. Use the time more wisely. Take the same time you might have invested in finding links to buy, and find a link you deserve instead. It's much safer, and it will build your business for the long term.
- Hoard Page Rank: This is one of my favorites, because it's one that most webmasters don't understand yet. This is because it changed over the past year or two. The concept people have in their mind is that page rank is a key part of site rankings and linking to other sites "leaks page rank" from your site. However, the world has changed. Page rank is a minute factor in ranking these days. Establishing, and reinforcing, site relevance is a huge factor in your rankings. You can do this by linking to pages and sites that are relevant to yours. Do link to relevant content.
- Swap Links: Another oldie, but not goodie. Search engines want links to represent endorsements. Swapped links represent barter, and they are trivial to detect. Don't swap links for the purpose of building page rank. It's a waste of your time. However, do swap links with sites that are highly relevant to your business, if these sites would be valuable to your users. Building your relevance in ways that are good for visitors to your site is always good. Of course, if you can get these relevant sites to link to you without linking back, this is better still.
Read these articles for a Linking Overview and for Link Building Strategies.
- Implement duplicate content: There are many different ways that this can happen, but here are two of the most popular scenarios:
- Many businesses operate 2 or more sites that contain similar, or even identical, content. These different doorways may have been implemented as different business fronts to enable the business to pursue different methods for marketing their products or services.
- Many sites have multiple ways of navigating to the same content, yet the content is delivered on a different URL in each case. Usually the URL is a simple manifestation of the path the user used to get there. The site owner has no bad intent and views each URL as being the "same page".
The trouble with duplicate content is that search engines want to rank the same content only once. So if you have multiple URLs on one site with the same content, one of these is just a waste of the search engine's time. Here is a real case where you are "leaking page rank" - you are sending your own precious page rank to pages that will never rank.
You also need to think about your crawl budget. If the search engine comes to your site and is going to crawl 1000 pages today, and 400 of these are duplicate pages that will never rank, you wasted a significant percentage of your opportunity for the search engine to find good unique content and rank it.
And if you have implemented "doorway sites" you could be in bigger trouble. Search engines see this as Spamming, and you could get banned.
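The second scenario above, where navigation paths produce many URLs for one page, can be pictured as a simple normalization step: map every browse path for a piece of content to a single canonical URL. The paths and the mapping below are hypothetical, invented purely to illustrate the idea:

```python
# Sketch: collapse the navigation-path URLs described above into one
# canonical form, so each piece of content lives at exactly one URL.
# The paths and the CANONICAL mapping are invented for illustration.
from urllib.parse import urlsplit, urlunsplit

CANONICAL = {
    "/mens/shoes/item-42": "/products/item-42",
    "/sale/shoes/item-42": "/products/item-42",
}

def canonicalize(url):
    parts = urlsplit(url)
    path = CANONICAL.get(parts.path, parts.path)
    # Drop the query string and fragment too; tracking parameters
    # create still more duplicate URLs for the same page.
    return urlunsplit((parts.scheme, parts.netloc, path, "", ""))

print(canonicalize("https://example.com/mens/shoes/item-42?ref=nav"))
print(canonicalize("https://example.com/sale/shoes/item-42"))
```

Both calls produce the same URL, which is the whole point: the search engine sees one page, not three, and your crawl budget goes to unique content.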
- Use Session IDs on your URLs: Search engines make indexing decisions over a period of many months. Getting a new site to rank is a lengthy process. Because of this, search engines look for static pages. When they see parameters at the end of a URL, the search engine treats them as part of the URL.
If a search engine sees one Session ID when it crawls a page on your site today, and a different one when it crawls the same page next week, it thinks it has found two different pages. Neither version of the page will get ranked, and the search engine will view your site as unstable. Session IDs will kill your rankings. Put your parameters in a cookie. Live with the fact that 2% of the surfing public disables cookies.
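The fix described above can be sketched with nothing but the standard library: strip the session parameter so every crawl sees one stable URL, and move the session ID into a cookie. The parameter name `sid` is an assumption for illustration:

```python
# Sketch of the recommended fix: remove the session parameter from the
# URL (so crawlers always see the same address) and carry the session
# ID in a Set-Cookie header instead. "sid" is an illustrative name.
from http.cookies import SimpleCookie
from urllib.parse import parse_qs, urlencode, urlsplit, urlunsplit

def strip_session_id(url, param="sid"):
    parts = urlsplit(url)
    query = parse_qs(parts.query)
    session_id = query.pop(param, [None])[0]
    # Rebuild the URL without the session parameter.
    clean_url = urlunsplit(parts._replace(query=urlencode(query, doseq=True)))
    cookie = SimpleCookie()
    if session_id is not None:
        cookie[param] = session_id
    return clean_url, cookie.output(header="Set-Cookie:")

clean, set_cookie = strip_session_id("https://example.com/products?sid=abc123&page=2")
print(clean)       # https://example.com/products?page=2
print(set_cookie)  # Set-Cookie: sid=abc123
```

The crawler now sees `/products?page=2` on every visit, while the session still rides along in the cookie for the 98% of visitors who accept them.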
- Implement your site in Flash: Probably very pretty. But probably very useless from a search engine ranking perspective. Search engines can read and index Flash (try the following search: "cooking schools filetype:swf"), but you will not find any sites that rank high on competitive terms implemented in Flash. One basic reason for this is that search engines love text, and if you plan to implement a site with lots of text, Flash just does not make sense as the medium to use (movies are visual experiences, not reading experiences). You can read more about search engines and Flash here.
- Cloaking: This is the practice of showing different content to the crawlers than you show to the user. It's really easy to come up with legitimate ideas as to why you might want to do this. But it does not matter. It's an emotional issue with the search engines, and they do not accept responsibility for determining your intent. It's emotional because it was a very popular technique with the Spammers in years gone by.
Search engines periodically implement new bots that they send out for the explicit purpose of detecting cloaking. There is no known technique for cloaking against a bot whose name you do not yet know, coming from an IP address you do not yet know. These new bots easily detect a cloaking implementation.
When a search engine detects a site that is cloaking, there is an excellent chance that it will lead to the site being banned. Your intent in implementing cloaking does not matter. So don't do it. Solve your problem by another means.
So what's the bottom line? There are really two major things you need to do:
- Learn how to communicate to the search engine what your site is about. Many of the problems listed above relate to common practices that make the search engine's job harder, or even impossible. Learning how to build your site so that the search engine can easily determine the unique value of your site is an outstanding idea.
- Don't spend your time figuring out how to beat the search engine. It's just not a good place to be. You may even succeed in the short term. But if you do succeed in tricking them in the short term, the day will come when you wake up in the morning and a significant piece of your business has disappeared overnight. Not a good feeling at all.
Take the same energy you would have invested in the tricks and invest it in great content for your site, and in the type of marketing programs you would have implemented if the search engines did not exist.
About the Author
Eric Enge is the Founder and President of Stone Temple Consulting (STC). STC offers Internet marketing optimization services, including SEO, Social Media and PPC optimization, and its web site can be found at: https://www.stonetemple.com.