SEO Industry Updates – July 2019


Google Search Console Adds Two New Features

In the last few months, Google has been updating and refining reports and tools within Google Search Console (GSC). The changes are meant to help SEO professionals overcome challenges or obstacles they may currently be facing with campaigns or reporting. In addition to those updates, Google recently added two new features to GSC to help find and fix errors in code and structured data markup.

  • First New Feature – A search function has been added. This may seem minor, but it cuts down the time it takes to pinpoint a piece of code. 
  • Second New Feature – GSC now allows for code copying and editing. A feature like this is beneficial; it enables you to copy the bad code, adjust it, and then place it back into GSC to test.

Having these two new features helps reduce the time it takes to find and debug issues, freeing that time for improving strategy and working on other ways to optimize campaigns. Google is continuously trying to improve GSC, and feedback from SEO professionals helps it make positive changes. As the year is only half over, expect to see more improvements to the platform and more ways Google will enhance Google Search Console.
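
As an illustration of the workflow these two features support, here is a minimal sketch in Python, built around a hypothetical JSON-LD snippet, of the kind of structured data error you might pinpoint with the new search function, copy out, fix, and paste back in to retest:

    import json

    # Hypothetical Article markup with a typo ("nme" instead of "name"),
    # the kind of error GSC's new search function helps you pinpoint.
    raw_jsonld = """
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "nme": "SEO Industry Updates - July 2019",
      "datePublished": "2019-07-15"
    }
    """

    data = json.loads(raw_jsonld)

    # After copying the markup out of GSC, a quick check for expected
    # properties shows what to fix before pasting it back in to retest.
    expected = ["@context", "@type", "name"]
    missing = [key for key in expected if key not in data]
    print("Missing properties:", missing)  # -> Missing properties: ['name']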

June 2019 Core Update: Recovery & Advice

Last month, digital marketers and SEO strategists were informed that Google was rolling out a search ranking update, the June 2019 Core Update. Whether you’re dealing with a large, impactful algorithm update or a broad update like the June 2019 Core Update, Google has some advice:

“Here’s an update about updates — updates to our search algorithms. As explained before, each day, Google usually releases one or more changes designed to improve our results. Most have little noticeable change but help us continue to incrementally improve search….”

For this specific algorithm update, there is no one specific fix you can make to your site to adapt, and there’s a reason for this. John Mueller from Google stated in a webmaster hangout on June 14 that “it is not just one thing to fix but many fundamental issues you need to fix or improve on your site so Google can trust the site”.

With that being said, what exactly can be done to help with recovery after this June 2019 Core Update?

  • Google’s Danny Sullivan states that while the June 2019 Core Update is not like the Panda updates, Google’s Panda advice and tips remain very helpful to consider when thinking about how to improve content generally.
  • In his hangout, Mueller mentioned three main questions that could also come into play as recovery tactics:
    • Does the site look outdated?
    • Do people not recognize who your authors are?
    • Are the author photos not their real photos but stock images?

When Mueller was asked why there is nothing to fix when it comes to this specific core update, he had this to say:

“I think it’s a bit tricky because we’re not focusing on something very specific where we’d say like for example when we rolled out the speed update. That was something where we could talk about specifically, this is how we’re using mobile speed and this is how it affects your website, therefore you should focus on speed as well.”

During the hangout, Mueller offered some hope to web publishers and marketers by suggesting that Google may provide official guidance in the future:

“I know a lot of people have been asking for more advice, more specific advice so maybe there’s something that we can put together. We’ll see what we can do internally to put out a newer version of a blog post or kind of provide some more general information about some of the changes we’ve been thinking about there.”

To learn more about what John Mueller has to say about the June 2019 Core Update in general, as well as recovery tactics, give his webmaster hangout a listen: English Google Webmaster Central (office hours hangout).

Key Takeaways

Search Engine Journal summarized Google’s advice on the June 2019 Core Update. Even though we don’t have a clear picture of what to look out for or what to do during the recovery stage, there are some high-level takeaways to be familiar with.

  • Look at the Big Picture – Step back and understand that some of what you’re seeing may stem from factors outside your website. Users are evolving, user habits are changing, and as a result, so are Google search results.
  • Remember that Algorithms Evolve – Google’s algorithm is less about lining up keywords and more about solving user problems, understanding what the user wants out of a search query, and finding the right web pages that solve a specific search. Instead of focusing too much on keywords, focus more on providing quick and in-depth information to users.
  • Get a Third Party to Analyze Your Site – It always helps to get a fresh set of eyes on your site; an outside reviewer may find it easier to point out the bigger issues.

Google Cancels Support for robots.txt Noindex

Google has announced new updates to the way its crawlers will treat robots.txt files when crawling and indexing a site. After proposing a standardization of the Robots Exclusion Protocol, which has been used for the past 25 years to direct the actions of web crawlers, the webmaster team also announced the removal of support for the noindex directive in robots.txt files.

According to Google’s Gary Illyes, while the noindex directive in robots.txt has been helpful to some webmasters, the proportion of sites using it improperly has caused more harm than good for most webmasters, leading Google to remove support.

To give sites time to make any necessary updates, Google announced that the directive will officially be unsupported after September 1, 2019.

What Is robots.txt & Why Is This Important?

A robots.txt file signals to web crawlers which pages should be ignored when crawling and indexing a site. Prior to Google’s announced updates, the robots.txt file could also be used to block crawlers from indexing specific URLs via the noindex directive.
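
To make this concrete, here is a minimal sketch using Python’s standard-library robots.txt parser; the file contents are hypothetical. A standards-based parser honors the crawl directives but simply ignores the Google-specific noindex line, which is part of why support for it could be dropped:

    from urllib import robotparser

    # Hypothetical robots.txt contents. "Disallow" is part of the Robots
    # Exclusion Protocol; the "noindex" line is the Google-specific
    # directive losing support on September 1, 2019.
    robots_txt = """
    User-agent: *
    Disallow: /private/
    noindex: /old-page.html
    """

    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())

    # The standard Disallow rule is enforced...
    print(parser.can_fetch("Googlebot", "https://example.com/private/"))       # False
    # ...while the non-standard noindex line has no effect on crawling.
    print(parser.can_fetch("Googlebot", "https://example.com/old-page.html"))  # True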

In lieu of using noindex in robots.txt, Google has recommended the following alternatives for blocking pages from being crawled or indexed before the new changes go into effect in September:

  • Noindex in robots meta tags – the most effective way to remove URLs from the index while still allowing crawling
  • 404 and 410 HTTP status codes – both tell crawlers that the page does not exist, and such URLs are dropped from the index once crawled
  • Password protection – pages hidden behind a login are generally removed from the index
  • Disallow in robots.txt – pages that crawlers are blocked from crawling are generally kept out of the index
  • The Search Console Remove URL tool – a quick way to temporarily remove a URL from Google’s search results
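
For example, the first alternative, a robots meta tag, moves the noindex signal from robots.txt into the page itself. Here is a minimal sketch of detecting that tag with Python’s standard library (the sample page is hypothetical):

    from html.parser import HTMLParser

    # Looks for <meta name="robots" content="noindex">, the meta tag
    # Google recommends in place of noindex in robots.txt.
    class RobotsMetaFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.noindex = False

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if (tag == "meta"
                    and (attrs.get("name") or "").lower() == "robots"
                    and "noindex" in (attrs.get("content") or "").lower()):
                self.noindex = True

    page = '<html><head><meta name="robots" content="noindex"></head></html>'
    finder = RobotsMetaFinder()
    finder.feed(page)
    print(finder.noindex)  # True: the page may be crawled but asks not to be indexed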

For more information, read Google’s official announcement on their Webmaster Central Blog.

Google Search Console in 2019

A myriad of bugs has plagued Google Search Console over the past few months. Here is a brief rundown:

  • April 4th: SEOs noticed pages dropping out of Google’s index (both in search and as reported in Google Search Console)
  • April 6th: Google confirmed a bug had caused pages to be de-indexed
  • April 10th: Google announced it had fixed the indexing bug
  • April 15th: Google announced that although the indexing bug was fixed, the URL Inspection tool in Google Search Console might still show outdated information
  • April 29th: SEOs reported GSC data was finally up to date
  • May 1st: SEOs reported bugs in GSC’s Manual Actions report, with manual action notifications disappearing without any accompanying traffic improvements
  • June 24th: SEOs reported a 10-day delay in index coverage data in GSC
  • June 25th: Google confirmed the bug was fixed

Despite the recent bugs, Google Search Console in 2019 is much more robust and useful than in previous years.  Since the launch of the new Google Search Console in January 2018, Google has continued to make improvements.  Here are just a few reasons why GSC in 2019 is better than the old version: 

  • Updated interface that is intuitive and easy to use 
  • Ability to manage all subdomains within one GSC property, instead of creating separate properties for the www, non-www, http, and https versions of the site
  • The Performance report has 16 months of data, improved from only 3 months (see the sketch after this list for pulling that data via the Search Console API)
  • The Inspect URL report allows users to test any URL on the property’s domain to see its status within Google’s index. It reports when the page was last crawled, whether it is indexed, and whether it was crawled as “Googlebot desktop” or “Googlebot smartphone.” Also included are the HTTP status code, any issues with loading JavaScript resources, and rendered screenshots.
  • The Coverage report includes detailed information about indexing errors and the reasons certain pages were excluded from indexing (blocked by robots.txt, non-canonical, server error, etc.)
  • The Enhancement report identifies any issues with Mobile Usability or Structured Data
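
For teams that report on Performance data programmatically, here is a minimal sketch of pulling the expanded 16 months of data through the Search Console API, assuming the google-api-python-client and google-auth libraries; the credentials file and site URL are hypothetical:

    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
    # Hypothetical service account key with access to the GSC property.
    creds = service_account.Credentials.from_service_account_file(
        "service-account.json", scopes=SCOPES)

    service = build("webmasters", "v3", credentials=creds)

    # Query the Performance report; the API exposes the same data as the
    # UI, so date ranges can now reach back up to 16 months.
    response = service.searchanalytics().query(
        siteUrl="https://example.com/",  # hypothetical property
        body={
            "startDate": "2018-03-01",
            "endDate": "2019-07-01",
            "dimensions": ["query"],
            "rowLimit": 10,
        },
    ).execute()

    for row in response.get("rows", []):
        print(row["keys"][0], row["clicks"], row["impressions"])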

Although it seems like there’s a new GSC bug every week, it’s good to put things in perspective and acknowledge how much better the tool is than it used to be!  

