Search engines constantly
strive to advance their technology and algorithms
in order to provide the most relevant search results
for their users. Achieving effective results requires
the identification, and ultimately the complete
eradication, of manipulative search engine optimisation
tactics. As Internet marketers, it is up to us
to achieve high listings for our customers and
this requires that our search engine marketing
tactics change and grow with the new technology.
Unfortunately, many search engine optimisation
strategies that became obsolete long ago are still
being used by ill-informed or unscrupulous Internet
marketers and webmasters. In many cases, these
strategies are not only ineffective but are now
considered spam, and they can have dire consequences
for your rankings and even result in your web site
being banned permanently from the search engines.
The following is a list of what are now considered
within the professional Internet marketing world
to be the Top 10 worst search engine optimisation
tactics:
1. Doorway Pages (or Gateway Pages, Information
Pages, Ghost Pages, etc.)
These are generally multiple web pages that are
devoid of useful content but heavily optimised
for search engine rankings with each page being
created for a particular key phrase. The idea
was to fool the search engines into thinking that
these pages were highly relevant, so that they
would award top rankings under the targeted phrase.
When a surfer landed on the page, they were often
shown a "Click Here to Visit Our Web Site" link
that they had to click on to actually arrive at
the legitimate website. Isn't that what they were
trying to do when they ended up at this page?
Once among the most popular methods of attaining
multiple search engine placements, doorway pages
were widely used until 2000 by many Internet marketers
and webmasters. Since then, doorway pages have
become the most obvious form of spam that a search
engine can find, and the repercussions are dire
if the tactic is employed. Unfortunately, many
webmasters and marketers still employ this tactic
and then wonder why they suddenly drop from the
search engine results after being banned for
using this technique.
2. Invisible Text
Invisible text is used in a variety of ways in
an effort to increase the frequency of keywords
in the body text of a web page. Some methods are:
making text the same colour as the background
of the web page, hiding text behind layers, placing
text at the very bottom of over-sized pages, etc.
This tactic is particularly perilous as it is
obvious to search engine spiders. In 1999, search
engines began implementing automated methods of
detection and penalisation.
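To illustrate, here is a rough sketch of the
same-colour variant in the HTML of the era; the
keywords, colours, and content are invented for
this example and shown only as something to avoid:

  <body bgcolor="#FFFFFF">
    <!-- normal, visible page content -->
    <p>Welcome to our widget shop.</p>
    <!-- white text on a white background: invisible
         to surfers but readable by spiders -->
    <font color="#FFFFFF">cheap widgets cheap
    widgets buy cheap widgets online</font>
  </body>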
3. Content Misrepresentation
This is the practice of misleading search engines
into believing your web page is about topic 'A'
when it is in fact about topic 'B'. The tactic was
used primarily for the promotion of adult, gambling,
and other extremely competitive search markets.
Unfortunately this tactic is still in use by
unscrupulous webmasters and marketers. The fact is
that this tactic is the simplest for a search engine
to identify, and the result will be swift and
complete: indefinite banishment from the search
engine index.
The worst offence against the search engines is
to try to fool them.
4. Redirects
Redirects have some innocent uses (practical,
legal, etc.) but they are also used to mislead
search engines by making them believe that the
page they have indexed is highly relevant to a
particular search phrase. When a surfer visits
the page, however, they don't see the original
page but are redirected to an entirely different
page.
In most cases search engines have advanced enough
to see this technique being used and act accordingly.
In fact they usually ignore any page with a redirect
(assuming correctly that the content is useless)
while spidering the redirect destination instead,
i.e. the page that the surfer sees. Redirects,
unless blatantly spam-related, do not directly
result in deliberate ranking penalties; however,
they have no positive effect either.
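For reference, the most common form of the era
was the meta refresh; a sketch follows, with an
invented destination URL:

  <head>
    <!-- sends the surfer straight to another page;
         the zero-second delay is the classic sign -->
    <meta http-equiv="refresh"
          content="0; url=http://www.example.com/real.html">
  </head>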
5. Heading Tag Duplication
Heading Tags were created to highlight page headings
in order of importance; hence the range of tags
available: H1, H2, H3, etc. This duplication
technique involves placing more than one H1 tag
in a web page in order to boost a particular
keyword or phrase.
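A quick sketch of the difference, using invented
headings:

  <!-- Duplication: multiple H1s pushing one phrase -->
  <h1>Cheap Widgets</h1>
  <h1>Cheap Widgets Online</h1>

  <!-- Intended use: one H1, then lesser headings -->
  <h1>Cheap Widgets</h1>
  <h2>Our Range of Widgets</h2>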
This tactic is still very prevalent and likely
still works on some search engines; however, none
of the major search engines will respond well
to this technique as it has been identified as
a common manipulation.
6. Alt Tag Stuffing
Alt Tag stuffing is the act of adding unnecessary
or repetitive keywords into the Alt tag (the words
that appear when you hover over an image with
your mouse pointer).
The Alt Tag is meant to be a textual description
of the image it is attached to. There is nothing
wrong with tailoring the Alt tag to meet your
keyword goals IF the tag is still understandable
and if it appropriately describes the image. The
offence occurs when an Alt tag has obvious keyword
repetition/filler that a search engine can key
in on as spam.
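By way of example, and with an invented image and
keywords:

  <!-- Acceptable: understandable and describes
       the image -->
  <img src="widget.jpg" alt="Blue steel widget">

  <!-- Stuffing: repetition a spider can flag
       as spam -->
  <img src="widget.jpg"
       alt="widgets cheap widgets buy widgets widgets">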
7. Comment Tag Stuffing
Comment Tags are used to include useful design
comments in the background source code (HTML)
when creating a web page. These tags should be
used only for adding technical instructions or
reminders; however, these tags were often used
to artificially increase the keyword count for
particular search phrases.
At one time there was some argument that this
technique worked, but it has always been a "Black
Hat" search engine optimisation technique that
even then could result in placement penalties.
Nowadays this technique will not help an optimisation
campaign; at best it will be ignored, and at worst
it will produce a negative result.
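The difference is easy to see in a sketch (the
keyword phrases are invented):

  <!-- Legitimate: a design note for the developer -->
  <!-- navigation table begins here -->

  <!-- Stuffing: keywords dressed up as a comment -->
  <!-- cheap widgets, buy widgets, discount widgets,
       widgets online, widgets widgets widgets -->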
8. Over-Reliance on Meta Tags
'Meta Tags' is a broad term for the descriptive
tags that appear in most web pages and are used
to give search engines an idea of the page topic.
The most common are the description and keywords
tags.
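For reference, the two tags look like this; the
content values are invented for the example:

  <head>
    <title>Widget Supplies</title>
    <meta name="description"
          content="Hand-made widgets for the UK trade.">
    <meta name="keywords"
          content="widgets, widget parts, widget supplies">
  </head>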
At one time, now-extinct search engines such as
Infoseek relied a great deal on Meta Tags, and many
took advantage of this to manipulate rankings
with relative ease. In today's far more advanced
climate the search engines place only cautious
weight on Meta Tags, and in ranking decisions Metas
play only a fractional role. Some webmasters still
consider Meta Tags the 'be-all and end-all' of
ranking producers and forget to optimise the rest
of their web page for the search engines. With
this line of thinking they miss the fact that the
search engines place far more importance on the
body text (or visible text) of the web page. This
is a critical error that will ultimately lead to
low or insignificant rankings.
Note: An extremely common example of Meta Tag
over-reliance is the web site that has been designed
entirely graphically and is devoid (or nearly so)
of HTML text that a search engine can read. A web
page such as this has no body text to index, gives
the search engine very little to judge relevance
by, and ultimately earns poor rankings.
Over-reliance on Meta Tags does not produce
deliberate search engine penalties; however, the
simple act of ignoring other ranking principles
often means a lower ranking.
9. Duplicate Content
This tactic is blatant spam and is very common
today. Essentially the webmaster will create a
web site and then create duplicates of each page,
optimising them differently in order to obtain
varying placements. By doing this you saturate
the search engine databases with content that
essentially eats valuable bandwidth and hard
drive space.
Duplicate content is a dangerous game often played
by full-time marketers accustomed to trying to
attain placements in aggressive markets. Avoid
this tactic like the plague unless you are willing
to sustain serious ranking damage if you get
caught - which you likely will.
10. Automatic Submission
Automatic Submission is the use of software to
submit a website to the search engines automatically,
and often repeatedly.
At Enable UK, the word 'automated' is a disturbing
one when used in reference to search engine
optimisation and submission. The fact is that
automated campaigns are not as effective as manual
(by-hand) ones.
Automatic Submission Tools can only submit to
search engines that allow such submissions. These
search engines make the majority of their profit
from surfers like you viewing their advertising,
whether at their web site or via the emails you
will receive as a result of submitting to them.
Automated tools have also been known to repeatedly
submit sites, and sometimes each individual page
within a site. If a search engine is submitted to
too often, it will consider the submissions spam
and the website being submitted will not be listed.
The more established and popular search engines
do not allow automated submissions; in fact, the
submission companies continually upgrade their
software to try to subvert the search engines'
latest efforts to block their programs.
All in all, this leaves the submitter in an
unstable position where their submission may or
may not be ignored. The cardinal rule of search
engine submission: submit ONCE. It may take a while
(usually no more than 2 or 3 months) but the site
will get spidered at some point. If after a few
months the site is not listed, then resubmit. As
for the major engines like Google: be patient, and
definitely don't submit more than once if you can
help it.
I hope that this article has told you a lot of
things you already know and that you have not
already fallen into any of these traps. If you
are intending to outsource your Internet marketing
campaigns, be extremely wary of any search engine
optimisation company that suggests any of these
tactics. Some of these tactics may work in the
short term; however, such success is not only rare,
it is also a great way to get banned from the major
search engines completely.