Websites are constantly changing – pages are added, copy is edited, messaging about services and offerings is adjusted – the list goes on. On top of on-page optimization, there is an important technical side of things. As we know, crawlability and indexability are prerequisites for good rankings. We need to make sure the following happens:
- Search engines crawl the website
- Search engines understand the content on the pages
- Search engines index the pages
- Crawl path optimization is implemented
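A quick way to sanity-check the first point is to test important URLs against the site's robots.txt rules. Here's a minimal sketch using Python's standard library – the rules, domain, and paths are invented for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content (the paths are placeholders).
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Pages we want ranked must be fetchable by search engine bots.
print(parser.can_fetch("Googlebot", "https://example.com/services/"))    # True
print(parser.can_fetch("Googlebot", "https://example.com/admin/login"))  # False
```

In practice you'd also verify that the pages return 200 OK and aren't carrying a noindex meta tag, since robots.txt only controls crawling, not indexing.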
Identifying the Problem
Our SEO team performed a case study on a client in the hospitality/education space. The client had pages on their website that were perfectly accessible to search engines. However, many pages were not indexed, and even the indexed pages weren’t ranking all that well. “Why is this happening?” we asked. We analyzed the site structure, examining several things, including:
- Comparing the crawlable pages with the pages indexed by Google
- Looking at the length of content on the pages to help us identify weak content
- Observing the number of pageviews per web page over a longer period of time to see user behavior trends
- Reviewing how pages were generating organic traffic
- Reviewing ranking data to see what pages were ranking for different types of searches
Based on our findings, we identified the source of the issue. Well…sources of the issue:
- Several pages were competing against each other
- Some of the targeted keywords were not supported by quality content on a dedicated page
- Smaller bits of information regarding the same topic were scattered throughout the website
- Outdated documents and pages were still published live and not redirected to the updated ones
- The navigation structure wasn’t logical, and categories and subcategories were overlapping
Implementing the Fix
We started organizing the useful information and creating pages with high-quality content while ridding the site of weak content. We merged some pages, completely rewrote others, and implemented redirects from pages that were no longer usable.
We did this based on the site’s current (and potential) rankings, user engagement KPIs, conversion metrics, bounce and drop-off rates, and, maybe most importantly, keyword research.
We wanted the new pages and categories to be optimized for the demand we saw, and we wanted to create a structure that would help users find what they’re looking for. This meant we had to identify the users’ intentions based on the engagement metrics to ensure this project’s success.
While identifying user intent and performing additional keyword research was vital, the other side of our process involved relinking everything. We wanted to make sure no redirect chains were created and no internal links pointed to old URLs. Yes, we implemented the redirects, but it’s much better for an internal link to point directly at a page that returns 200 OK than at a page that redirects you to one.
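Checking for redirect chains is easy to script. Assuming the redirects are kept as a simple old-URL to new-URL mapping (the URLs below are made up for illustration), a sketch like this flattens every chain to a single hop:

```python
# Hypothetical redirect map: old URL -> new URL. Paths are placeholders.
redirects = {
    "/old-program": "/programs",       # merged page
    "/programs-2015": "/old-program",  # chain: /programs-2015 -> /old-program -> /programs
    "/contact-us": "/contact",
}

def resolve(url, redirect_map, max_hops=10):
    """Follow redirects until reaching a final (200 OK) destination URL."""
    seen = set()
    while url in redirect_map:
        if url in seen or len(seen) >= max_hops:
            raise ValueError(f"Redirect loop or overly long chain at {url}")
        seen.add(url)
        url = redirect_map[url]
    return url

# Flatten every entry so each old URL redirects directly to its final destination.
flattened = {src: resolve(dst, redirects) for src, dst in redirects.items()}
print(flattened["/programs-2015"])  # /programs – one hop instead of two
```

The same `resolve` helper can be run over every internal link found in a crawl to flag links that still point at old URLs.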
The finishing touch was to update the XML sitemap to have the new pages indexed.
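Regenerating the sitemap can be automated from the final URL list. A minimal sketch using Python's standard library (the domain and URLs are placeholders):

```python
import xml.etree.ElementTree as ET

# Hypothetical list of final, 200 OK URLs to submit for indexing.
urls = [
    "https://example.com/",
    "https://example.com/programs",
    "https://example.com/contact",
]

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
for u in urls:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = u

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```

Only pages that return 200 OK belong in the sitemap – listing redirected or removed URLs sends search engines mixed signals.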
Just to give a sense of the scale of the changes: the website in question had ~200 pages, 50 of which received the bigger, more important changes – so roughly 25% of the original site was affected.
We started this project in mid-July and finalized everything by the second week of August. The following is what we observed from a rankings and traffic data standpoint:
- July: The website was ranking for 775 keywords with an average KW position of 66. Meh.
- November: The website was ranking for 1,222 keywords – a roughly 58% increase from 775. Boom!
More importantly, the average ranking position for these keywords was 56.2. Not only did the number of ranked keywords jump by nearly 58%, but the average ranking position also improved by almost 10 spots.
This chart shows the difference in online visibility (the Y-axis shows the average ranking position, while the X-axis and the size of each bubble show the number of ranked keywords):
What about the SERPs? In July, the website ranked on the first page for 40 keywords, with an average ranking position of 6. In November, there were 57 keywords (an additional 17 for those who aren’t quick at math) ranking on the first page, while keeping an average position of 6!
We see an even bigger improvement when we look at the first 20 positions (or, the first two pages of Google’s SERPs). There were 104 keywords ranking with an average position of 12.6 in July. By November, we saw 135 keywords ranking, with an average position of 11.7.
With all of these improvements, it’s only natural to want to know how this affected the site’s organic traffic (spoiler alert: it’s good news!).
First, let’s see how the traffic was trending in 2016. The time frame we’re looking at is January 1st – November 30th, 2016.
Note: The little annotation/spike we’re seeing in August marks the last day of our finishing touches. The bigger changes were pushed live earlier.
Before you say anything, we know what you’re thinking: “Well, sure! This all looks very convincing so far, but are you sure seasonality wasn’t a factor for the additional boost in traffic?” On the contrary, friend:
When we compare January 1st – November 30th, 2016 to January 1st – November 30th, 2015, we see organic traffic dropping in September 2015. Whereas in 2016, it’s increasing.
To get a better understanding of the results, we should compare July 1st – November 30th, 2016 to July 1st – November 30th, 2015.
As you’ll notice, the site experienced a 52.72% increase in organic traffic. On top of that, new users increased by 73.03%.
By improving the site’s crawlability and thereby giving search engines a better understanding of the content on these pages (which lets the engines serve them as results for relevant searches), we saw a significant uptick in organic traffic to specific parent and child pages.
Due to tracking issues on the site during previous years, we weren’t able to make an apples-to-apples comparison of conversion rates. However, for the July 1st – November 30th, 2016 timeframe, we recorded a 2.66% overall conversion rate. Against an average industry conversion rate of 2%, we’re happy with our 2.66%.
The restructuring project was a success, and we were able to show the client significant improvements. Here are items to remember when evaluating your site’s structure:
- Make a plan for your pages and content
- Focus on one major topic per page
- Plan the structure like a tree
- The overall structure should be well-balanced
- Keep the number of levels as low as possible (nobody will click 25 times to get to a particular page)
- Make sure there is search demand for the category pages
And finally, always ask: “If users were looking for something specific, where would they click?”
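The "keep the number of levels low" advice can even be measured: a breadth-first search over the internal-link graph gives each page's click depth from the homepage. A toy sketch (the link graph and page names are invented):

```python
from collections import deque

# Toy internal-link graph: each page lists the pages it links to.
links = {
    "home": ["programs", "about", "contact"],
    "programs": ["program-a", "program-b"],
    "program-b": ["enroll"],
}

def click_depths(link_graph, start="home"):
    """Breadth-first search: minimum number of clicks from the homepage to each page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

print(click_depths(links)["enroll"])  # 3: home -> programs -> program-b -> enroll
```

If important pages come back with a depth of 5 or more, they're strong candidates for better internal linking or a flatter category structure.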
Mate is an SEO Strategist at IMI. He works with a multitude of clients across a plethora of verticals. If you would like a consultation from one of our SEO experts on your site’s structure and crawlability, contact us today.