Sitemap

Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling - sitemaps.org

The Sitemap protocol format consists of XML tags. All data values in a Sitemap must be entity-escaped. The file itself must be UTF-8 encoded - sitemaps.org
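The entity escaping mentioned above can be done with the Python standard library; a minimal sketch, where the URL is a made-up example whose `&` must become `&amp;` before it goes into a `<loc>` tag:

```python
from xml.sax.saxutils import escape

# A hypothetical URL with a query string; the raw "&" is not valid
# inside an XML data value and must be entity-escaped.
raw = "https://www.example.com/view?id=1&category=books"

# escape() replaces &, <, and > with their XML entities.
escaped = escape(raw)
print(escaped)  # → https://www.example.com/view?id=1&amp;category=books
```

When writing the file itself, remember to open it with UTF-8 encoding, as the protocol requires.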

Here is an online tool to create sitemaps - xml-sitemaps.com

In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
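As a sketch of that simplest form, the following Python snippet builds a one-entry Sitemap with the standard library. The URL and metadata values are placeholders, not from any real site; the namespace is the one the protocol defines:

```python
import xml.etree.ElementTree as ET

# The official Sitemap protocol namespace.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """entries: dicts with a required 'loc' and the optional
    'lastmod', 'changefreq', and 'priority' metadata tags."""
    ET.register_namespace("", NS)  # emit a default xmlns, no prefix
    urlset = ET.Element(f"{{{NS}}}urlset")
    for entry in entries:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        for tag in ("loc", "lastmod", "changefreq", "priority"):
            if tag in entry:
                ET.SubElement(url, f"{{{NS}}}{tag}").text = entry[tag]
    # ElementTree entity-escapes the data values automatically.
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([{
    "loc": "https://www.example.com/",
    "lastmod": "2024-01-01",
    "changefreq": "monthly",
    "priority": "0.8",
}])
print(sitemap)
```

In a real deployment you would write the result to a `sitemap.xml` file with UTF-8 encoding and an XML declaration, then reference it from `robots.txt` or submit it to the search engines directly.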

Here is a directed acyclic graph I've created showing some of the paths a user may take from the Main Page.

Legend: red line - should not link; dotted line - page section (not really a link); light yellow box - end of the line (an article) - wikipedia

Web crawlers usually discover pages from links within the site and from other sites. Sitemaps supplement this data, allowing crawlers that support Sitemaps to pick up all URLs in the Sitemap and learn about those URLs using the associated metadata. Using the Sitemap protocol does not guarantee that web pages are included in search engines, but it provides hints that help web crawlers do a better job of crawling your site.