SEO Industry Updates – February 2019

Google Featured Snippets Stealing Traffic

Summary

A screenshot of a featured snippet result posted on Twitter by Cyrus Shepard, which appeared to hide publisher origin data, has reignited backlash against Google for “stealing publisher traffic.”

Google regularly tests new layouts and featured snippets in its search results. The issue marketers are raising is nothing new; it centers on the concern that, in some of these featured results, Google is “hiding” publisher data (source URL, brand, author, etc.).

Publishers and marketers are concerned that if Google uses their content to deliver a direct answer to consumers in search results, users will have no need to click through to their websites.

Key Takeaways

In this specific example, publisher source data is only visible if the user clicks the downward-facing arrow in the list.

  1. Responding to these concerns, Google’s search liaison, Danny Sullivan, said he would pass along the community’s concerns and the suggestion to display publisher links more prominently if such a feature becomes standard.
  2. Though it has gained far-reaching attention, this featured snippet type appears to still be a test. Google uses data from these smaller experiments to gauge user sentiment before finalizing new designs or user experiences for global rollout, so it is possible this feature never becomes a standard result type in search.
  3. As this is still just a test, it is unknown what impact a result of this nature could have on click-through rate. That said, it is still wise to optimize for featured snippets where applicable: they are a strong means of positioning your brand as “the answer” for a given query and are generally seen as valuable real estate in search for publishers.

Google Updates Search Performance Reports In Search Console

Summary

This month, Google announced additional updates to the new Search Console surrounding the reporting of domain properties:

Beginning at the end of March 2019, performance data in Search Console will be reported against the canonical URL for a page rather than the URL shown in search. This will have an effect on the data, as traffic previously attributed to a different property will now be shifted to the canonical URL.

Domain properties will be consolidated into a single property to view the site’s performance as a whole. For example, properties such as subdomains (support.domain.com) or alternate versions (www.domain.com) will be consolidated under one domain property (domain.com). This replaces property sets from the old version of Search Console, which Google previously announced it was discontinuing.

Google has already begun the process of consolidating various properties under one domain, but to have access to the information as a whole, users will need to verify their properties through DNS records.

Key Takeaways

To ensure your site is fully prepared for these updates, Google recommends taking the following steps:

Ensure your pages identify a canonical URL with <link rel="canonical" href="https://domain.com/">. This gives Google a clear understanding of the preferred version of each page on your site for efficient indexing and reduces the potential for duplicate-content issues.

If you are unsure whether your pages specify a canonical URL, you can use the URL Inspection tool built into Search Console. It shows whether and how a page has been indexed by Google, as well as the canonical URL Google has selected. (More information on the tool can be found through Google Webmasters.)
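
If you want to spot-check canonical tags on several pages outside of Search Console, a small script can help. The following is a minimal sketch using only the Python standard library; the page URLs are placeholders, and the parsing is intentionally simple, so treat it as a starting point rather than a full auditing tool.

    # Minimal sketch (standard library only): print the canonical URL, if any,
    # declared on each page. The URLs below are placeholders.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class CanonicalFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.canonical = None

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
                self.canonical = attrs.get("href")

    def find_canonical(url):
        html = urlopen(url).read().decode("utf-8", errors="replace")
        finder = CanonicalFinder()
        finder.feed(html)
        return finder.canonical

    for page in ["https://domain.com/", "https://domain.com/blog/"]:  # placeholders
        print(page, "->", find_canonical(page) or "no canonical tag found")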

Verify all versions of your site (http, https, www, non-www, m-dot). Even if all URLs on your site resolve to HTTPS and/or non-www, verifying every version gives you full access to Search Console data across your properties and lets you identify issues on any version should they arise.

Check that the users on your properties are correct, as they will only be able to see data for the properties they have been granted access to (e.g., http://domain.com/ and https://domain.com/ are separate properties).

Create a snapshot of your Search Console data to identify which URLs may have been consolidated. Google will provide access to both sets of data for a limited time before the transition is complete.

5 On-Page SEO Factors to Consider

Summary

SEO is an organic system that needs a strong foundation in five factors. With these factors in place, following SEO best practices becomes easier.

  1. Content Relevance

Brands need to keep up with their content and make sure it stays relevant. According to a study by Ahrefs, 91% of online content produces no traffic from Google; this is partly due to SEOs not keeping up with changes to Google’s algorithms. Google makes these changes to better understand users’ search patterns.

Google has added new features such as answer boxes, knowledge panels, and diversity in search results.

  • Creating deep content means giving users a new perspective on a topic and the information best suited to their needs. This type of content should run between roughly 2,250 and 2,500 words to provide the best results.
  • Organizing content means ensuring title tags and header tags are optimized, allowing users to view and access content with ease rather than having to hunt for it.
  2. User Engagement

Google Analytics can provide metrics that show how engaging a website is for users. With these metrics, marketers can make sure their sites and content are appealing. Here are a few metrics that help evaluate user engagement and provide insight into interactions on a website.

  • Pages per session and time on page are great metrics for figuring out which content is capturing users’ attention. The more engaged users are, the more pages they view and the longer they stay on the site. Together, these metrics tell marketers a story that helps them optimize content.
  • Bounce rate can also paint a picture of how engaging a site is for users. A high bounce rate is not always bad: it can mean the user found the information quickly and moved on. Other times it signals that the content is not what the user needs, or that the material is dull.
  • A low click-through rate could mean your meta description and title tag are not compelling enough to get users to visit the site. As noted above, make sure content is relevant to the site’s topics and written in a way that captures the user’s attention.
  3. Technical Structure

Technical SEO is the foundation of keyword rankings and directly affects the user’s engagement experience. When auditing a site, make sure it is crawlable, secure, and uses clean URLs.

  • Crawlability is essential for your homepage and the pages in your sitemap. Internal links need to be set up correctly so they do not send users to 404 pages or slow-loading pages.
  • Security: migrating to HTTPS is essential because it helps with Google rankings. Make sure content is consistent with what is being written and shared on the site.
  • Clean URLs require a clear path that resolves with a 200 status code. Be aware of where content links point and make sure they are not broken links or redirects (a quick status check is sketched below).
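
A simple way to audit for broken links and redirects is to request each URL without following redirects and inspect the status code. The following is a minimal sketch using only the Python standard library; the URL list is a placeholder and it checks one URL at a time, so treat it as a starting point rather than a full crawler.

    # Minimal sketch (standard library only): report the HTTP status of each URL
    # without following redirects, so 301/302 hops and 404s stay visible.
    import urllib.error
    import urllib.request

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # do not follow redirects; surface them as HTTPError

    opener = urllib.request.build_opener(NoRedirect)

    def status_of(url):
        try:
            return opener.open(url, timeout=10).status  # 200 is the goal
        except urllib.error.HTTPError as e:
            return e.code  # 3xx redirects and 4xx/5xx errors land here
        except urllib.error.URLError as e:
            return f"error: {e.reason}"

    for url in ["https://domain.com/", "http://domain.com/old-page"]:  # placeholders
        print(url, "->", status_of(url))
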
  4. Interlinking

Internal links allow users to move freely around a website from page to page. Make sure a site’s internal linking is continuously updated and kept consistent as the site changes.

  • Deep links connect orphaned pages to a category page, passing authority from one page to another and helping ensure each page gets indexed. This also helps users perform additional actions on a site and find other material they need on a topic or in a section.
  • Organized hierarchy: website topics need to be organized and well mapped out so users can find a topic with ease and engage with relevant content. Organizing content with a “top-down” approach allows search engines to crawl and index pages within buckets or clusters.
  5. Mobile Responsiveness

Two significant factors to remember are mobile-friendly design and fast page speeds. When designing new pages, it is important that they render and respond well on mobile platforms. Things to watch for include the dimensions of the device, how a page fits on that device’s screen, and how easy it is to navigate to other pages on the site. All of these help improve the user’s experience while searching.

Key Takeaways

Remember, SEO is an organic and holistic system. To keep it healthy, continue optimizing the factors above: make sure content is relevant, users engage with the content and website, technical issues are resolved, internal links work, and the site is mobile friendly.

Using The Kondo Method for SEO

Summary

Over the past few years, Netflix has been responsible for launching people’s careers. This past year Marie Kondo released her original series “Tidying Up with Marie Kondo,” and overnight she became a sensation and a household name. With her improving our lives and helping us “spark joy,” where else can her principles be applied? How about the SEO world? Below is how her six steps can be used to improve your current SEO strategy.

Commit yourself to tidying up (your site).

After creating website content and publishing it, you can’t just leave it alone and let it run free. Achieving organic lift takes time and commitment: perform an SEO audit, correct the errors you find, and make sure you are collecting the data you need.

Imagine your ideal lifestyle (once your site reaches its full potential).

Establish clear SEO strategies for clients and campaigns. Doing this makes it easier to stay focused on the primary goal at hand: successful organic performance.

Finish discarding (thin and low-quality content) first.

Discard old or irrelevant content. Make sure discarded pages are no longer listed in your sitemap, and remove items from old subdomains, subfolders, or PDFs.
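
A quick way to confirm that discarded pages are actually out of the sitemap is to parse the sitemap and compare it against the list of URLs you removed. The following is a minimal sketch using only the Python standard library; the sitemap location and discarded URLs are placeholders.

    # Minimal sketch (standard library only): flag discarded URLs that still
    # appear in the sitemap. The sitemap URL and page URLs are placeholders.
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "https://domain.com/sitemap.xml"
    DISCARDED = {"https://domain.com/old-page/", "https://domain.com/2016-promo.pdf"}

    xml_bytes = urllib.request.urlopen(SITEMAP_URL).read()
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    listed = {loc.text.strip()
              for loc in ET.fromstring(xml_bytes).findall(".//sm:loc", ns)
              if loc.text}

    for url in sorted(DISCARDED):
        print(url, "-> still in sitemap" if url in listed else "-> not in sitemap")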

Tidy by category, not by location (in the SERPS).

Once all old content and bad links are gone, optimize your categories. Start with keyword research, then make sure all meta titles, meta descriptions, header elements, and image alt text are in place. Next, perform a technical audit and remove any 404s. Verify internal links are in place and resolve any other issues that come up.

Follow the right order (as you prioritize SEO efforts).

Keep client limitations in mind and make sure your expertise is applied where it creates the most impact when optimizing content.

Ask yourself if it “sparks joy” … (yes, an increase in organic performance always “sparks joy”)

Ultimately, you should see results that “spark joy”; if that is not the case, you may need to reevaluate your strategies or tactics.

Key Takeaways

As SEO continues to grow, it is essential to have a well-organized game plan. It requires input from other teams, along with the audits and processes that need to take place. Using the six steps above makes it easier to formulate a game plan that will ultimately provide success and “spark joy.”

Moz Upgrades “Domain Authority” Metric

Summary

Domain Authority is a metric developed by Moz to help predict how well a website will rank. Websites are scored on a 1-100 scale, 100 being the strongest and most likely to rank in organic search results. This metric has been an industry standard for measuring & comparing the SEO value of websites.

As of March 5th, Moz has rolled out a new and improved “Domain Authority 2.0.”  From Moz’s announcement, here are some improvements we can expect from DA 2.0:

  • Better understanding of websites that haven’t ranked in the past
  • Better detection and devaluation of sites with spammy backlinks
  • More responsive to Google’s algorithm changes

Moz’s new Domain Authority algorithm integrates many factors, such as Spam Score (Moz’s machine-learning model that identifies common features among banned and penalized sites), in addition to analysis from the improved Moz Link Explorer (with data on more than 35 trillion links).

Key Takeaways

When measuring the success of websites, it’s important to remember Domain Authority is a relative metric, not absolute. Track the change in your site’s DA in relation to the DA of your competitors.

The sites most affected are those engaged in spammy link practices.

Most site owners should not be alarmed if their DA drops some. Moz’s Russ Jones says there is, “On average, a 6% reduction in Domain Authority for randomly selected domains from the web. Thus, if your Domain Authority drops a few points, you are well within the range of normal.”
