Search Engine Optimization, or S.E.O.

Feb 3, 2005, 21:38

Seamus Dolly




While it can be spelt a variety of ways, agreement after that can be difficult.

It is a business to some, and understandably, they extol their own theories.



However, search engine optimisation doesn't have to be complicated beyond the reach of the average site owner.



Its essence is simply to make your page as spider-friendly as possible, and to keep the density of your keywords and search phrases somewhere close to what the search engines accept.



Too high a density may be considered "spamming", and the threshold depends to some degree on the particular engine in question. The correct density is one that satisfies the engine that the keywords or phrases are repeated often enough not to be incidental. Logically, the phrase "false teeth", which is now within the body text of this article, should not cause a search engine to believe that is what the article is about. Remember, we are not dealing with a human editor; relevance must be established with software, which is less sympathetic to context, in its English meaning, than we are.
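Keyword density is easy enough to estimate yourself. The sketch below is a rough illustration only, assuming "density" means occurrences of the phrase as a share of the page's total word count; the function name and the sample text are invented for the example.

```python
import re

def keyword_density(body_text, phrase):
    """Rough density: words belonging to the phrase as a % of all words."""
    words = re.findall(r"[a-z']+", body_text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count every position where the whole phrase appears in sequence.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits * n / len(words)

# A toy page; being so short, its density is far higher than any
# real page should aim for.
text = ("Search engine optimisation helps pages. "
        "Good search engine optimisation is simple.")
print(round(keyword_density(text, "search engine optimisation"), 1))  # 54.5
```

Different engines will count density differently, but a check like this at least tells you whether a phrase is incidental or deliberate.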

Search engines can have different algorithms or indexing criteria.



S.E.O. must change as the indexing criteria change, so what is good today may have to be reconsidered tomorrow.



The view of many is to make the site/page easy to navigate, with respect to internal and external links. JavaScript can present a problem for some engines, and should perhaps be kept to a minimum. Some people will tell you that raw HTML is simpler to "read", spider-wise. Sure, it might be simpler, but JavaScript-rich pages are indexed nonetheless.



For anyone to guarantee that they can get you to number one is a little optimistic, as not everyone can practically or theoretically achieve such a goal.

Surely, anywhere on the first page of matches would not be a bad thing. Not all of us opt for the number one match, and those with any research experience will "skim" through the descriptions to help decide on the best match. Descriptions, should you be favoured by an engine, may tip the balance towards you.

Of course, this approach doesn't represent all surfers, so variables will always exist.



To achieve number one for a spurious or unusual term/word/phrase is relatively easy, and no great boast. Likewise for less unusual terms, or for keywords tied to rare products or less competitive markets.



Little or no search engine optimisation experience should be needed in such cases.



It is almost certainly true to say that competitive keywords and markets are really where the benefits of search engine optimisation come into play.



It is also true to say that where searches are confined, focused or country-specific, the task is somewhat easier than if the search were "web-wide". For example, if your product were rubber tyres and you only delivered within your own country, then web-wide results would be of less commercial benefit to you.



Of course, another variable would be if your country produced unusually high numbers of rubber tyres, in which case search engine optimisation would need extra consideration and input.



Generally speaking, though, any use of the search engines may convince you that the major players seem to dominate. However, that is not to say they cannot be toppled, so to speak. From an engine's relevance viewpoint, these sites may or may not be "tightly themed", but they often have relevance with respect to time. This is a bid by the engines to return results appropriate to the time we live in. The annals of history are not foremost on the minds of surfers/researchers, and therefore updated content carries some weight.

The events of history can be searched more specifically, where that is the desire of the user.



An issue for some people is to be indexed in the first place.

You can join the queue and wait, or get indexed through a spidered link on a site that is regularly indexed. One purpose of a spider/robot is to follow links, and this also ties in with suggestions that JavaScript links may obstruct or slow down the spidering process.



A simple way to get such a link is to contribute something/anything to the article directories. In return for your "textual" input, a link via your resource box can point to your domain.

It is generally agreed that spiders like text, or more importantly, new text.



A sensible defence of such a claim is that we don't especially want MATCHES for banners, images or anything uninformative. Such images, like the annals of history, can be searched specifically.

Another logical defence of such a claim is that we really are living in the INFORMATION AGE, and the media used to relay information are still predominantly verbal and textual, however delivered.





While it may be an overstatement to say that Search Engine Optimisation is to webpages as Neuro-Linguistic Programming is to humans, it may not be too ridiculous.



Remember:

1. You are dealing with software that is attempting to analyse like a human. It cannot do this as readily as organic intelligence.



2. You are dealing with software that is familiar with abusive human strategies of deception and has in-built countermeasures. There are humans behind the software, wise to the ways of cheats.



3. You must help it with your selected keywords/search phrases, which you deem to be relevant. Their densities will determine relevance on a textual level, whatever about a "site-relevance" level. Too much is word spamming, and not enough is irrelevant. Anywhere between two and five per cent wouldn't be too bad, though it depends on who you talk to, or indeed, listen to. Make more pages instead of trying to fit every conceivable keyword into one page. A lot of people do this at the beginning, which is understandable, but the body text just won't make any sense and any visitor will get a headache as surely as you will. Spare both parties.



4. Some engines insist on robot text (a robots.txt file), and they should probably be facilitated, even though some have a commercial interest in their insistence. Robot text is not difficult to learn or implement, and its name should not be a deterrent.
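As a rough illustration of how robot text works, the sketch below uses Python's standard `urllib.robotparser` to show how a well-behaved spider might honour a robots.txt file; the rules and the example URLs are invented for the demonstration.

```python
from urllib.robotparser import RobotFileParser

# An illustrative robots.txt: allow everything except one private directory.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A polite spider checks each URL against the rules before fetching it.
print(parser.can_fetch("*", "http://www.example.com/index.html"))      # True
print(parser.can_fetch("*", "http://www.example.com/private/x.html"))  # False
```

The same two-line file, placed at the root of your domain, is all most engines ask for; there is nothing to be intimidated by.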



5. Metatags are designed to assist as well, and are nothing to be feared but favoured, should you decide to use them.
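For anyone who has not used them, a typical set of metatags in a page's head looks like the fragment below; the title, description and keywords are invented purely for illustration.

```html
<head>
  <!-- Illustrative metatags; wording and keywords are examples only. -->
  <title>Search Engine Optimisation for the Average Site Owner</title>
  <meta name="description"
        content="Plain advice on search engine optimisation: spider-friendly pages, keyword density and getting indexed.">
  <meta name="keywords"
        content="search engine optimisation, SEO, keyword density">
</head>
```

Engines differ in how much weight, if any, they give each tag, but the description in particular may be what a searcher skims when deciding on the best match.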



6. You must get indexed to bear any optimised fruit. This should no longer be difficult either.