I’ve been immersed in the SEO world for over five years now, but this was my first time attending MozCon, and I came away with plenty of interesting takeaways.
This was the best-organized conference I’ve ever attended. All the speakers presented in the same room, so you never had to choose between sessions; you got to hear everything on offer.
I also learned things to advance my career. Most conferences stick to high-level information, but at MozCon, you learn things that make you better as an SEO professional, not to mention the networking opportunities it presents.
My main takeaway was the discussion surrounding the state of search heading into 2017. Rand Fishkin opened with this topic, and it inspired me to dig deeper into what search looks like as we leave 2016 behind and head into a new year.
Left in the Dark in 2017?
In 2017, we can expect search to remain bigger than social in terms of driving traffic to websites. SEO has become a skill, not just a job title. If you look at how companies list their job openings, they aren’t necessarily putting SEO in the title, but they do require some SEO knowledge.
If you take a look on LinkedIn, you’ll see more and more marketing jobs listing SEO knowledge as a requirement rather than a bonus. SEO is a hot commodity when it comes to hiring; LinkedIn ranked it the number four skill to have in 2015, and in 2016 it still ranked in the top ten.
Despite what many believe, search is still a huge driver of traffic; in fact, it continues to dominate other forms of web marketing in search interest. However, Google is becoming stingier with data, and “dark traffic” (traffic that can’t be attributed to any specific channel) is a growing challenge for attribution.
In Rand’s presentation, he mentioned that 38% of his social traffic is “dark.” This will likely continue to be an issue as we launch into 2017.
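One common way to shrink the “dark” share of traffic is to tag the links you distribute with UTM parameters, so analytics tools can attribute visits to a channel instead of lumping them into direct traffic. A minimal sketch in Python (the source, medium, and campaign values here are purely illustrative):

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def add_utm_tags(url, source, medium, campaign):
    """Append UTM parameters to a URL so analytics can attribute
    the visit to a channel instead of counting it as dark traffic."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))  # preserve any existing params
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=urlencode(query)))

# Example: tagging a link before sharing it in a newsletter
tagged = add_utm_tags("https://example.com/post", "newsletter", "email", "jan-2017")
print(tagged)
```

Tagging won’t recover clicks from apps that strip referrers entirely, but it covers the links you control.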
Google’s Move to Machine Learning
Next up: machine learning. With the announcement of RankBrain this year, it’s clear Google is focusing on machine learning more than ever before. Everyone in the SEO world has seen this coming.
Google is highly focused on user intent when it comes to queries. It’s no longer about what the user actually searched; it’s about what they intended to find.
This makes analyzing the SERPs and the competition within them even more important, as you must not only optimize for the query you want to rank for but also satisfy the user intent behind it.
My suggestion to many clients is to satisfy this need by becoming an educator in the space. Don’t limit yourself to selling your products; educate people about them. If someone searches for a “small business loan,” for example, are they looking to take out a loan right away, or are they looking for more information about small business loans? Chances are it’s the latter.
Our industry will continue to hear about machine learning for many years to come, how it will continue to affect the SERPs, and how marketers will adapt to any changes.
“AdWords is Redacting Data Like It’s the CIA”
In the words of Rand, “AdWords is redacting data like it’s the CIA.” Anyone in SEO or even paid search knows the effects of this recent update. Unless you are spending tons of money with Google, you now see only a range for how many times a given keyword is searched per month.
This can be frustrating when you are trying to justify certain areas of spend and decide where to target your on-page strategies. It’s forcing many to turn to other tools, such as SEMrush and AWR Cloud, for keyword volumes.
Ten Blue Links Now Exist in Only 3% of Searches
This one is painful for anyone who’s been in the SEO game for a while. No longer can we expect to see nothing but organic results in the SERPs; in fact, fewer than 3% of all searches return just ten blue links.
To go along with that, we rarely even receive ten results on the first page of Google when doing a search. With the introduction of rich snippets, AMP results, news results, and the like, spotting ten blue links in a SERP is like spotting a snow leopard in the wild.
The problem heading into 2017 is that we all know the SERPs are covered with new features and rich snippets, yet very few SEOs actually optimize for them.
In fact, in Moz’s study, over 40% of SEOs said they optimize only for ten blue links, while 31% said they optimize just for images, news, and maps. Only 29% said they currently optimize for all SERP features. This will need to change in 2017 if you expect to stay competitive.
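Optimizing for SERP features typically starts with structured data markup on the page. As a purely illustrative sketch (the product name and rating figures here are hypothetical, not from the source), a review-style rich snippet can be expressed with schema.org JSON-LD embedded in the page’s HTML:

```html
<!-- Illustrative JSON-LD for a product review rich snippet -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>
```

Markup like this is what makes star ratings and similar enhancements eligible to appear alongside your listing; eligibility, of course, remains at Google’s discretion.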
Quality Is Becoming a Sitewide Metric
This one should be apparent, but a lot of SEOs pay attention only to the top-performing pages on their site. Typically, that means spending most of their time on the top 10% of pages and a little time on the middle performers.
All in all, most SEOs are optimizing only about 50% of the pages on their site and leaving the bottom performers behind. Sitewide quality matters big time to Google. If you want your site to perform with the best of the best, you’ll need to make sure the quality of every page on your site is top-notch.
I believe this goes along with the 10x content theory, which outlines why and how your content should be 10 times better than the next best competitor. While this is easier said than done, it’s something all SEOs should strive for in 2017.
RankBrain + Hummingbird Are Changing How Content Can Rank
This one should have SEOs and content creators ecstatic. More content can rank for a variety of queries than ever before. Here is the example Rand used in his presentation:
Because of RankBrain and Hummingbird, Google better understands semantics and the meanings of words. It is also getting better at using search intent and post-query user behavior to determine which results to display for any given query.
In the example, the user searches for “video game with Italian plumbers,” and Google knows the user intends to find Mario Bros. This also helps quality control for Google. These updates allow Google to understand that searches such as “best mobile phone,” “highest quality mobile phone,” and “top rated smartphone” have essentially the same intent, and they now return nearly identical results. Sometimes it’s not what we say but what we mean, and in search, Google knows the difference.
PPC Continues Leveling Off Trend
So, we all know paid search isn’t going anywhere. However, total paid search spending growth is leveling off: in Q2, growth in paid spending was down 19% compared to Q1, and CPCs fell by 6%, versus only 3% in Q1 (data via Merkle).
Advertisers are seeing decelerating spending growth on Google, but Bing and Yahoo Gemini were the larger drags on growth. I attribute this to fewer search ads appearing above the fold than there used to be.
This is making things more competitive, allowing only big spenders to compete for the top spot and driving out some of the smaller ones. I expect this trend to continue. To keep up, it seems Google is making it tougher to tell an ad from a regular search result.
Now there is barely a difference between an ad and an organic result; the only major distinction is the little green “Ad” box, which mirrors the color of the URL.
Thanks to Clickstream Data, We Know More About How Searchers Engage with Google
Clickstream data has given us the opportunity to know more about searchers than ever before:
- According to this data, 29% of all search queries are long-tail keywords. That means there is still a huge amount of long tail in search, and with voice search becoming more prevalent, this trend will only continue to grow in 2017.
- The average searcher now performs 3 queries per day on a desktop or laptop, and 1.19% of google.com U.S. searches result in an ad click.
- This means Google makes roughly 90% of its revenue on about 1% of the clicks that happen.
- 49% of clicks went to Google properties (Maps, YouTube, Ads, etc.).
- 40% of searches in 2016 resulted in no clicks at all – making optimizing for rich snippets that much more important.
- 25% of desktop queries come through Chrome Instant.
Here are the top sites receiving Google search traffic, from Rand’s presentation:
Together, these sites make up 22.8% of all Google search traffic referrals. None of these should come as a surprise, and I suspect we’ll see very similar data for 2017. More data on Google traffic referrals below:
- The top 100 websites account for 30.59% of all Google search traffic referrals.
- The top 1,000 websites account for 45.58%.
- The top 10,000 websites account for 62.66%.
Facebook’s referral traffic is even more skewed towards the top sites:
- The top 20 websites account for 16.39%.
- The top 100 account for 30.97%.
- The top 1,000 account for 59.4%.
- The top 10,000 account for 81%.
What Does It All Mean for 2017?
In conclusion, these are the trends for search as we head into 2017. 2016 has been a big year for search: We were introduced to RankBrain, Google’s journey into machine learning; ads became more blended in the search results; Penguin merged into the daily changing algorithm; local updates such as Pigeon and Possum happened; and more.
Search changes every day, so I expect even more change in 2017. The big things to watch will be voice search, the continued evolution of Google’s machine learning and the changes that make search even smarter, and the ever-growing need to optimize for rich snippets and the many new SERP features Google has rolled out.
Search is still the largest driver of traffic to any given website, so the need for search experts is only going to continue to grow. As long as there are websites, there will be a need for experts in the space to help them get found.