Search Engine–Friendly website HTML structure

Mar 9, 2011 07:22
james daksh

There are a number of HTML issues to explore, such as structural elements, copy prominence, tables, frames, and forms.

1-HTML Structural Elements
In general, HTML provides structural elements that may help a search engine understand the overall topicality of a document, as well as where its logical divisions and important parts are located.
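As an illustrative sketch (the page content here is hypothetical), heading tags and emphasis give an engine hints about topicality and logical divisions:

```html
<!-- Headings establish the document's topic hierarchy -->
<h1>Choosing a Search Engine-Friendly Layout</h1>
<p>Introductory copy that states the page's primary topic early.</p>

<h2>Why Source Order Matters</h2>
<p>Section copy follows; <strong>important phrases</strong> can be
emphasized, and the heading levels above mark the logical divisions
an engine can use to understand the page.</p>
```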

2-Tables
Many tables-based sites place their navigation in a left-hand column. Because of how tables are serialized in HTML, this pushes the primary content further down in the source document, and may therefore contribute to poorer rankings. If many navigational elements appear before the primary content, a search engine may be confused as to what the primary content of the page actually is, simply because the navigation comes first in the document. Search engines do try to detect repetitive elements, such as navigation placed physically before the primary content, and at least partially ignore them; modern engines also examine the actual displayed location of content rather than just its position in the source document. However, avoiding the situation entirely may improve the odds of proper indexing regardless.
There are three solutions for this:
- Instead of using a tables-based layout, use a pure CSS layout, where presentation order is independent of source order.
- Place the navigation on the right side of the page in a tables-based layout.
- Apply a technique that designers typically call the table trick, which uses an HTML sleight-of-hand to reverse the order of table cells in the document without reversing their visual presentation.

Even if your site uses tables, parts of a document can typically be rendered with CSS on a selective basis, so adopting it does not force you to abandon a tables-based layout completely. Repetitive elements such as navigation and repeated blocks are a good place to start, because tables tend to have a large footprint and converting them also shrinks the HTML.
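A minimal sketch of the pure-CSS approach described in the first solution: the navigation follows the primary content in the source, yet still renders as a left-hand column. The ids and widths are hypothetical.

```html
<style>
  /* The nav is lifted out of normal flow, so it can come AFTER the
     content in the source yet still display as a left-hand column. */
  body     { position: relative; }
  #content { margin-left: 180px; }
  #nav     { position: absolute; top: 0; left: 0; width: 160px; }
</style>

<div id="content">
  <h1>Primary content appears first in the HTML source</h1>
  <p>A spider encounters this copy before the navigation below.</p>
</div>
<div id="nav">
  <a href="/">Home</a>
  <a href="/articles">Articles</a>
</div>
```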

3-Frames
There have been so many problems with frames since their inception that it bewilders us as to why anyone would use them at all. Search engines have a lot of trouble spidering frames-based sites. A search engine cannot index a frames page within the context of its other associated frames. Only individual pages can be indexed. Even when individual pages are successfully indexed, because another frame is often used in tandem for navigation, a user may be sent to a bewildering page that contains no navigation. There is a workaround for that issue, but it creates still other problems. The noframes tag also attempts to address the problem, but it is an invisible on-page factor and mercilessly abused by spammers. Any site that uses frames is at such a disadvantage that we must simply recommend not using them at all.
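For reference, a sketch of the pattern in question (file names are hypothetical). The noframes element is the invisible on-page factor mentioned above: most users never see it, which is why spammers abused it and why engines distrust it.

```html
<frameset cols="200,*">
  <frame src="nav.html" name="nav">
  <frame src="content.html" name="main">
  <noframes>
    <!-- Rendered only by clients that do not support frames;
         historically stuffed with keywords, hence the distrust. -->
    <body>
      <p>Fallback copy and <a href="content.html">links</a> for
      browsers and spiders that do not render frames.</p>
    </body>
  </noframes>
</frameset>
```

Note also that when content.html is indexed and reached on its own, the user lands there without nav.html, which is the orphaned-page problem described above.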

4-Forms
A search engine spider will never submit a form, so any content that sits behind form navigation will not be visible to it. There is simply no way to make a spider fill out an arbitrary form. Even if a form consisted only of pull-downs, radio buttons, and checkboxes, so that the domain of inputs is defined by permutations of preset values, a spider could not know which combinations to submit; in practice this is not done. There are some reports that Google, in particular, does index content behind very simple forms, such as a single pull-down that directs the user to a particular web page. However, as with JavaScript links that happen to get spidered, I do not recommend depending on this behavior. As a corollary, if such a form points to content that should be excluded, it may be wise to exclude that content with an explicit exclusion mechanism, such as robots.txt or the robots meta tag. There is no magic solution for this problem, but there is a workaround: as long as your script is configured to accept its parameters from a GET request, you can place the URLs of particular form requests in a sitemap or elsewhere in the site. That way, when a form submission creates a dynamic URL, the same content is also reachable by a plain link.
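A sketch of that workaround, assuming a hypothetical script product.php that accepts its parameters via GET: the same dynamic URL a form submission produces is also published as a plain link, which a spider can follow.

```html
<!-- The form itself is invisible to spiders... -->
<form action="/product.php" method="get">
  <select name="category">
    <option value="widgets">Widgets</option>
    <option value="gadgets">Gadgets</option>
  </select>
  <input type="submit" value="Browse">
</form>

<!-- ...but the equivalent GET URLs can be linked directly,
     here or in a sitemap, so the content can be indexed. -->
<a href="/product.php?category=widgets">Widgets</a>
<a href="/product.php?category=gadgets">Gadgets</a>
```

Conversely, if such URLs point to content that should stay out of the index, a robots.txt rule like `Disallow: /product.php` (under `User-agent: *`) excludes them explicitly.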