
You used to have to submit each and every page of your site manually to each search engine. Happily, that is no longer the case.

Search engines follow links on your site or on other sites to find pages, which they then import into their databases. So a page on your site will eventually be discovered and indexed as long as at least one link from an already-indexed page leads to it.

However, a page or site being ‘indexed’ does not mean it will ever appear in any search results; it just means the search engine has its information on file. (You can check which of your pages a search engine has indexed by searching for site:yourdomain.com, for example.) To be an effective business website, the pages of your site must first be optimized for the product or service they represent, and then updated frequently so that new or changed information is indexed promptly and correctly.

An Effective Submission Strategy

First, determine how frequently search engines should be ‘encouraged’ to re-index your pages. On a real estate website, for instance, listings can change daily, so new pages are created and old ones deleted often. Second, make sure that not only the homepage but all of the other pages of the site are indexed as well.

The number of pages on a business site may be anywhere from a dozen to several thousand; a typical Silk Shorts, Inc. real estate site has from 50 to 2,000 pages. To avoid wasting resources or unnecessarily burdening a website’s server, search engines limit the number of pages on a site that they ‘crawl’ (index) in any one visit.

So, although a site may have hundreds of pages, only a few may be adequately indexed, and those few may not be the most important ones on the site. To avoid losing valuable search engine exposure, you should add special files that help the crawlers choose which pages to focus on. These are called ‘sitemaps’, and they come in different formats for different purposes:

  • An ‘HTML sitemap’ is an ordinary web page of links that helps human visitors find pages that may not be obvious in the regular navigation.
  • An ‘XML sitemap’ uses special coding to tell search engines where all of the pages on the site are located (their URLs), how important each one is relative to the others (a privacy statement has little value, so it would be marked low importance), and how frequently each page is expected to change; a sample appears after this list.
  • A ‘robots.txt’ file (not really a sitemap) tells search engines which directories and files you do not want indexed, which also saves indexing resources; a short example follows the sitemap sample below.
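
To make the XML format concrete, here is a minimal sketch of an XML sitemap in the standard sitemaps.org format; the domain and page URLs are hypothetical placeholders, not a real client site. The <priority> and <changefreq> entries express the relative importance and expected update frequency described above:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Homepage: highest priority, changes often -->
  <url>
    <loc>https://www.example-realty.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- A listing page: important while the listing is active -->
  <url>
    <loc>https://www.example-realty.com/listings/123-main-st</loc>
    <lastmod>2024-01-14</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <!-- Privacy statement: rarely changes, low importance -->
  <url>
    <loc>https://www.example-realty.com/privacy</loc>
    <changefreq>yearly</changefreq>
    <priority>0.1</priority>
  </url>
</urlset>
```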
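And here is an equally minimal robots.txt sketch, again with hypothetical directory names. The Disallow lines keep crawlers out of areas that should not be indexed, and the standard Sitemap line tells search engines where to find the XML sitemap:

```
# Applies to all crawlers
User-agent: *
# Keep administrative and draft areas out of the index
Disallow: /admin/
Disallow: /drafts/
# Point crawlers at the XML sitemap
Sitemap: https://www.example-realty.com/sitemap.xml
```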

With this information, search engines can spend their allotted crawl time on your important pages first and leave less important pages for a later visit. Silk Shorts, Inc. recreates the XML sitemaps weekly or monthly, as each individual site requires, and submits them to the search engines to achieve maximum re-indexing for the site. This makes it possible for you to stay in front of your customers and clients closing deals, while we stay in front of the computer building your business.