COMPLETE WEB-SITE OPTIMIZATION FOR SEARCH ENGINES (part2)

Nov 2, 2003, 22:00

Pavel Lenshin

------------------------------------------------------------
copyright (c) Pavel Lenshin
------------------------------------------------------------

Source code optimization.

{title}...{/title}

This tag is a winner. It is the primary spot to include our
keywords for SE spiders, bots or crawlers ("spider"
hereafter). {title} tags are the best "dainty dish" for SE
spiders. They eat them like cakes, so make your title tags
tasty for them and about 65 characters long.
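
For example, using the curly-brace notation of this article
(the keywords here are hypothetical):

{title}Internet Marketing Ebooks - Free Web-Site Promotion Guides{/title}

That title is roughly 60 characters, keyword rich and still
perfectly readable to a human visitor.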

{meta name=description content="..."}

An important Meta tag. Very often the description you put
here is shown in the SE search results. In my personal
opinion, it plays a bigger marketing role in attracting
visitors than an actual optimization role.
The SEs' trust in the "description" tag, as well as in our
next tag, "keywords", has been greatly undermined by fraud
and unfair competition. Make it no more than 250 characters
long and, of course, include your targeted keywords as well.
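
A hedged example, again with made-up wording:

{meta name=description content="Free internet marketing ebooks, web-site promotion guides and SEO tips for small online businesses."}

The content stays well under 250 characters and works as a
small "ad" in the SE search results.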

{meta name=keywords content="..."}

Another advisable Meta tag. Fill it with all your targeted,
and untargeted but topic-related, key phrases separated by
commas. Note that highly popular stand-alone keywords like
"web-site", "internet", "business" etc. will do nothing but
increase the size of your web-page; I won't be mistaken if I
say that several million web-pages already carry them. Don't
overstuff your keywords either: spiders don't like to be
forced to eat what they don't want.
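
An illustrative sketch (the phrases are hypothetical):

{meta name=keywords content="internet marketing ebooks, web-site promotion, search engine optimization tips, free SEO guide"}

Note that every entry is a specific multi-word phrase rather
than a stand-alone word like "internet" or "business".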

{meta name=author content="..."} {meta name=copyright content="..."}
{meta name=language content="..."} etc.

Subsidiary Meta tags that are more likely to satisfy a
webmaster's ego than to bring any real help in rankings.

{h1}...{/h1} {h2}...{/h2} {h3}...{/h3}

In contrast to the previous tags, the importance of these,
let's call them, "body" tags has risen substantially for a
simple reason: they are readable by visitors, and it is much
harder to cheat an SE with them than with the Meta
description or keywords tags, where any webmaster may put
anything s/he wants.
Given that these tags define the headers of your web-page
from the SE spider's viewpoint, try to include your targeted
keywords in them.
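
For instance, a header in the same curly-brace notation (the
wording is hypothetical):

{h1}Internet Marketing Ebooks and Web-Site Promotion Guides{/h1}

The visitor sees a meaningful page header, and the spider
sees your targeted key phrases in a tag it trusts more than
the Meta tags.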

{img src="..." alt="..."}

"Alt" is just a comment for every image you insert into the
page. Use this knowledge at your advantage. Include your key
phrases where possible and safe. By "safe" I mean common
sense, don't input comment like "ebook package" into the
image of the button that leads to your partner, say, "Pizza
ordering" web-site.
On the contrary, if your web-site has graphical menu and
buttons, it is very wise to include "alt" comments according
to directions they lead to, i.e. "Home", "Services", "About
Us", "Contacts" etc. If for any reason visitors have their
browser with images turned off, they won't see any menu if
you haven't inserted "alt" comments.
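
A sketch of such a menu button (the file name is
hypothetical):

{img src="images/services.gif" alt="Services"}

With images turned off, the visitor still sees the word
"Services" where the button would be, and the spider gets a
readable label instead of a blank graphic.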

Content

Your informational content should be keyword/phrase rich,
just like your headers. In general, the more relevant key
phrases your text contains, the better your chances of being
"remarked" by an SE spider.

HTML text-formatting tags like bold {b}, italic {i} and
underline {u} may also carry some weight in SE placement.

Keyword density and frequency are other measures widely used
by SEs to rank web-pages. Don't overdo them, though.
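
As a small illustration (the sentence is made up),
emphasizing a key phrase once or twice is enough:

We offer {b}internet marketing ebooks{/b} for beginning webmasters.

Wrapping every single occurrence of the phrase in {b} tags
only signals over-optimization.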

Link popularity (page rank)

Another extremely important parameter for your listing
position nowadays. In general, the more links on third-party
web-sites point to your site, the better. Try to avoid "link
farms" or other "clubs" whose only aim is to artificially
increase your link popularity, though; these tactics may
simply result in penalization or an outright ban of your
web-site.

Link popularity without any doubt helps to increase
relevance for searched terms more often than not, but it
makes SEO an even more long-term target, because
establishing quality "incoming" links pointing to your site
is beyond your direct control.

To be brief, your task is to find web-sites that hold the
highest SE listing positions and/or page rank (determined
via the Google Toolbar) and negotiate a link to your site in
return for some service or product, or solicit a simple
exchange of links. As you see, this "manual" work is the
most time-consuming, but it pays off if you focus on getting
as many relevant links as possible.

You may also apply viral strategies by offering some free or
paid service that involves placing a link back to your site.

Google has developed its own link popularity evaluation
measure called PageRank. It is calculated based on a
constantly changing set of rules: the current rank of the
site the link to your page comes from, its relevance to your
web-site's topic, the presence of targeted words etc.
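
For reference, the formula from the original PageRank paper
gives the basic idea (a simplified picture; as said above,
the rules Google actually applies keep changing):

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

where T1...Tn are the pages linking to page A, C(T) is the
number of outgoing links on page T, and d is a damping
factor usually set around 0.85.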

Fake tactics

That is what I call them; they are used by webmasters the
same way some "marketers" use spam to promote their
businesses.

Unfortunately, ordinary internet users have no ability to
"ban" spammers the way SEs penalize those "smart"
webmasters. I don't recommend using any of these tactics,
even on someone's "advice".

They include excessive use of related and totally unrelated
keywords, comment tags, hidden layers, text in the same
color as the background, artificial link farms, numerous
entry pages etc.
This game simply won't be worth the candle if your web-site
ends up banned for good.

robots.txt file

A very important file every web-site should have. It allows
you to literally direct the SE spider to the "proper"
places, explaining what should be scanned and where, instead
of blindly waiting for your lucky day. With its help you can
also keep your confidential web-pages and/or directories
from being scanned and shown in SE search results. This is a
very important feature that many webmasters solve with
"tons" of JavaScript or even Perl coding instead of a
one-line entry in the robots.txt file that forbids scanning
of "download" or so-called "thank you" pages, or anything
else you want!

The general rules for creating a robots.txt file can be
found here: http://www.robotstxt.org/wc/robots.html
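
As a minimal sketch, a robots.txt that hides a hypothetical
"download" directory and a "thank you" page from every
spider would look like this:

User-agent: *
Disallow: /download/
Disallow: /thank-you.html

The first line addresses all robots, and each Disallow line
names a path they should not scan.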

Design & Layout issues

The next point is to have actual textual information. Simply
declaring that you have a content-rich web-site is not
enough; SEs need text to scan.

Clear, easy-to-follow links. If you have a Flash or Java
applet navigation menu, make sure to duplicate it somewhere
as plain HTML links as well. Many SE spiders cannot properly
follow web-pages created dynamically with ASP, Perl, PHP or
other languages. It is also clear that any web-pages the
administrator has blocked access to (no matter how) will be
left unnoticed. The same applies to HTML frame sites: what
frames actually do is complicate the way a web-site is
scanned, no more, no less. When I see a web-site made of
frames, it is as if the webmaster were telling me: "I want a
lower SE position."

Because of the excessive work spiders must do to scan as
many pages as possible, their scanning "accuracy", if we can
call it that, has dropped. They will hardly scan each and
every one of your pages from top to bottom; selective
scanning is far more likely. To ease this process, try to
arrange the most valuable information, including header tags
and text, at the very top of your web-pages. Having a "site
map" page with all the link connections of your site helps
not only your potential visitors but the SEs as well.

All link names inside your informational content should
contain your related keywords or phrases, not just "click
here" or "download here".

Avoid piling up JavaScript, cascading style sheet rules or
large numbers of image tags at the top of the page; they can
occupy more than a page of HTML source code with almost no
textual information. If you have JavaScript or CSS code,
save it in separate files that are loaded on request,
leaving only a single line of code in your HTML document.
This tactic is also very smart from the standpoint of
general web-page optimization and space saving.

Let the Internet market know your business better.