There is a long-standing SEO misconception that webpages in the root of a domain will be ranked higher than webpages deep within the subfolder directory structure.

  • website.com/webpage.html is a webpage in the root of the domain
  • website.com/subfolder-1/webpage.html is one subfolder deep
  • website.com/subfolder-1/subfolder-2/webpage.html is two subfolders deep
  • website.com/subfolder-1/subfolder-2/subfolder-3/webpage.html is three subfolders deep
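
To make those depths concrete, here is a minimal sketch (in Python, my own illustration rather than anything from this article) that counts how many subfolders deep a URL sits:

```python
from urllib.parse import urlparse

def subfolder_depth(url: str) -> int:
    """Count how many subfolders deep a URL's path is.

    The webpage filename itself doesn't count, so
    https://website.com/webpage.html is depth 0 and
    https://website.com/subfolder-1/webpage.html is depth 1.
    """
    path = urlparse(url).path
    # Split the path into segments, dropping empties from leading/trailing
    # slashes, then ignore the final segment (the webpage itself).
    segments = [s for s in path.split("/") if s]
    return max(len(segments) - 1, 0)

print(subfolder_depth("https://website.com/webpage.html"))                          # 0
print(subfolder_depth("https://website.com/subfolder-1/subfolder-2/webpage.html"))  # 2
```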

Subfolder Depth SEO Myth

Some SEOs used to believe that if you went too deep with your directory subfolder structure (over four subfolders deep – website.com/subfolder-1/subfolder-2/subfolder-3/subfolder-4/webpage.html), those webpages wouldn’t be spidered at all by search engines: not true, BTW!

This SEO myth is an artefact of common internal navigation and linking structures, not of subfolder structure per se.

The subfolder structure tends to mimic the link or navigation structure of a website. If it takes at least four links (clicks/taps) to get from the home page to the deepest webpages of a large website, some links might be missed by search engine spiders, and so some deep content might never be spidered at all: basically, Googlebot never finds it.

Those deeply buried webpages will tend to have only one or a very small number of unimportant internal backlinks. Because links are so important to Google rankings, those webpages won’t be seen as important by Google and so might not be indexed: Google spiders most webpages it finds, but for various reasons decides some shouldn’t be included in the Google web search index.

Put another way, if you only send one unimportant internal link to a webpage on your website and Googlebot rarely follows it, why would Google believe it’s an important webpage and rank it highly, or even keep it in the Google web search index?

The reality is you can put your webpages as deep as you like, subfolder-wise (within reason), as long as they are linked correctly: aim for no more than two taps/clicks from the home page, three at the very most for large websites.
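
If you would rather measure click depth than guess, one rough approach (a sketch with a made-up internal link graph, not a tool from this article) is a breadth-first search from the home page over your internal links:

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to.
# In practice you'd build this from a crawl of your own website.
internal_links = {
    "/": ["/services/", "/blog/"],
    "/services/": ["/services/seo-audit/"],
    "/blog/": ["/blog/subfolder-depth-myth/"],
    "/services/seo-audit/": [],
    "/blog/subfolder-depth-myth/": ["/deep/page/nobody-links-to/"],
    "/deep/page/nobody-links-to/": [],
}

def click_depths(start: str = "/") -> dict:
    """Breadth-first search: how many clicks from the home page to each URL."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for linked in internal_links.get(page, []):
            if linked not in depths:  # first time reached = shortest click path
                depths[linked] = depths[page] + 1
                queue.append(linked)
    return depths

for url, depth in sorted(click_depths().items(), key=lambda item: item[1]):
    flag = "  <-- more than 2 clicks from home" if depth > 2 else ""
    print(f"{depth} clicks: {url}{flag}")
```

Note the flagged page: it is only one subfolder-style step beyond the blog post, but it is three clicks from the home page, which is the thing that actually matters.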

Google Spiders via Links NOT Subfolders

If you had a webpage four subfolders deep (website.com/subfolder-1/subfolder-2/subfolder-3/subfolder-4/webpage.html) and that webpage had a sitewide link (every webpage on the domain links to it), Googlebot will probably find and follow one of those links every time it visits the domain. Googlebot follows links somewhat randomly, so it’s basic probability: the more links there are to a particular webpage, the more chances Googlebot will find and follow one of them.

If there’s only one link to a webpage, from another webpage that itself has just one link pointing to it, Googlebot is highly unlikely to spider that webpage regularly.
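
As a toy model only (my assumption; Googlebot’s real crawl scheduling is far more complicated than this), if each individual link has some small chance p of being followed on a given crawl, a webpage with n links pointing at it gets discovered with probability roughly 1 - (1 - p)^n, which climbs quickly as n grows:

```python
def discovery_chance(links: int, p_per_link: float = 0.2) -> float:
    """Toy model: chance that at least one of `links` independent links is
    followed, assuming each link has a `p_per_link` chance of being crawled
    on this visit (both numbers are illustrative, not real Googlebot figures)."""
    return 1 - (1 - p_per_link) ** links

for n in (1, 5, 20, 100):
    print(f"{n:>3} internal links -> ~{discovery_chance(n):.0%} chance of being found this crawl")
```

With these made-up numbers one lonely link gives roughly a 20% chance per crawl, while a sitewide link (hundreds of links) makes discovery all but certain.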

If in doubt, use a Google XML sitemap, but don’t expect it to increase rankings: Google will spider and index webpages from an XML sitemap, but if Google can’t find other links to a particular webpage, it probably won’t rank it for anything.
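
For reference, here is a minimal sketch that writes a standard sitemaps.org XML sitemap (the format Google accepts); the URLs are placeholders, not real pages:

```python
from xml.etree.ElementTree import Element, SubElement, ElementTree

# Placeholder URLs - swap in your own, including the deep webpages
# Googlebot might not reach via internal links alone.
urls = [
    "https://website.com/webpage.html",
    "https://website.com/subfolder-1/subfolder-2/subfolder-3/subfolder-4/webpage.html",
]

urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in urls:
    loc = SubElement(SubElement(urlset, "url"), "loc")
    loc.text = url

# Writes sitemap.xml to the current directory; submit it via Google Search Console.
ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```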

Links Power All SEO

Never forget, backlinks power all SEO: No Links = No Google Search Engine Rankings.

You can have the best content on the planet, but if it’s not linked, Googlebot can’t find it to spider. And even if your one rubbish internal link was enough for Googlebot to find that really deep piece of amazing content, without some reasonable quality links Google won’t rank it highly.

David Law: Technical SEO Expert with 20+ years Online Business, SEO, Search Engine Marketing and Social Media Marketing experience... Creator of multiple WordPress SEO Themes and SEO Plugins. Interests: wildlife, walking, environmental issues, politics, economics, journalism, consumer rights.

Website - SEO Gold Services