One of the most important factors for both SEO and User Experience (UX) is website navigation. A well-designed, well-organised site with easy navigation will not only improve your site’s UX, but will also help search engine bots crawl the site, find all of its content and make more efficient use of their crawl budgets.
Elements to consider when assessing and improving website navigation include:
• Site depth
• Site architecture
• URL structure
• Internal search
• Including a sitemap and using robots.txt
There are two fundamental types of site structure: deep and shallow. On deep-structured sites, content sits far from the homepage, and users need to click through multiple links to get from the homepage to the page they are looking for.
This can make it confusing for users to reach pages and revisit them. Deep content is also less likely to be reached and indexed by search engines during a crawl.
A shallow site structure lets people have access to the majority of the website’s content within a few clicks. This makes the site succinct and easy to use, and also makes the job of the search spider easy, as they are not searching through every nook and cranny for elusive pages.
Site architecture refers to the way a site’s content is organised. There is evidence to suggest Google prefers sites where content is arranged by topic area. This is sometimes called “siloing”, and it gives users (and search engine spiders) a more logical system for exploring the site. Google is quick to reward topical authority, so the more relevant content you can include under the same silo, the better.
This could take the form of adding new FAQ pages within a silo on a related topic rather than creating a separate FAQ section on the site where all FAQs across different topics are stored. This is also particularly beneficial if your site’s content is so broad that a deep site navigation system is all but unavoidable.
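As a rough sketch, a siloed arrangement might group URLs by topic like this (the domain and topic names are purely hypothetical):

```text
example.com/running/                  ← topic silo hub page
example.com/running/shoes/
example.com/running/training-plans/
example.com/running/faq/              ← FAQs kept inside the relevant silo
example.com/cycling/                  ← a separate silo for a separate topic
example.com/cycling/faq/
```

Keeping the FAQ pages inside each silo, rather than in one site-wide FAQ section, concentrates related content under a single topic path.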
It is vital that all of your URLs are logically named. Many content management systems suggest web addresses automatically, and some generate them in a numerical format based on how many pages came before. Making your URL relevant to the content of the page will improve its rankings, and improve UX by making it obvious to users what they will be reading when they click the link.
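For example, compare an auto-generated numeric address with a descriptive one (both URLs are hypothetical):

```text
https://example.com/?p=4823                   ← numeric, tells the user (and Google) nothing
https://example.com/guides/site-navigation    ← descriptive, relevant to the page's content
```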
Plenty of sites use specialist internal search forms, which index all of the pages of a specific website and let the user search for any page within it. They can be extremely useful for content-rich websites, where users want to find the answer or page they need quickly without having to work out which navigation path leads to it.
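At its simplest, an internal search box is just an HTML form that sends the user’s query to the site’s search page. A minimal sketch follows; the `/search` endpoint and `q` parameter are hypothetical and depend on your site’s search backend:

```html
<!-- Minimal internal-search form; "/search" and the "q" parameter name
     are assumptions and must match whatever handles search on your site -->
<form action="/search" method="get" role="search">
  <label for="site-search">Search this site</label>
  <input type="search" id="site-search" name="q">
  <button type="submit">Search</button>
</form>
```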
Include a sitemap and robots.txt
A sitemap is a list of all of the pages on your website, usually in either .html or .xml format. The former is a comprehensive navigation system for users, though if your site is shallow, it is not as essential as an .xml sitemap. These are submitted to Google Search Console and form the starting point for a search engine spider’s crawl of your site; your sitemap should also omit pages which you do not want spiders to crawl.
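A minimal .xml sitemap follows the sitemaps.org protocol; the URLs and date below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/guides/site-navigation</loc>
  </url>
</urlset>
```

Only pages you want crawled should appear in the file; `<lastmod>` is optional but helps crawlers prioritise recently updated pages.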
Robots.txt is a file uploaded to a website which tells crawlers how best to crawl the site. It highlights the location of the sitemap, and can also instruct crawlers to avoid certain pages or files, such as images or videos, backing up the exclusions from your sitemap.
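A simple robots.txt might look like the sketch below; the disallowed paths and sitemap URL are hypothetical and should be replaced with your own:

```text
# Apply these rules to all crawlers
User-agent: *

# Keep crawlers out of sections excluded from the sitemap (hypothetical paths)
Disallow: /admin/
Disallow: /search-results/

# Point crawlers to the XML sitemap
Sitemap: https://example.com/sitemap.xml
```

The file lives at the root of the domain (e.g. example.com/robots.txt), and the `Sitemap:` line is how it “highlights the location of the sitemap” mentioned above.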