December 2018 SEO Industry Updates


Google Launches Q&A Structured Data for Question & Answer Pages

(image credit: Search Engine Land)


On December 3, 2018, Google announced that it is expanding its support for rich results for Question & Answer pages in Search and across the web. This structured data was tested a year ago and is now being rolled out more broadly.

When a user asks a question in Search, Google can show a carousel of the answers available for that question directly in the result snippet. In addition, Google will display a “Top Answer” label on the best answer, based on certain criteria.

To expand on this Q&A result format, Google has added new structured data for Q&A pages. Google states, “We have developed a new rich result type for question and answer sites.” So what exactly are these Q&A pages? They are web pages that contain data in a question-and-answer format. Marking up these pages helps Google generate a better snippet for your page.

What sites work with this Q&A Structured Data?

  • Google states the Q&A structured data works for sites that already have pages in a question-and-answer format, such as social news sites, expert forums, and help and support message boards.

How do sites implement this?
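Sites implement this by adding QAPage markup to their question pages, typically as JSON-LD in a `<script type="application/ld+json">` tag. Here is a minimal, illustrative sketch based on the schema.org QAPage type; the question text, counts, and URLs are placeholders, not values from Google's announcement:

```json
{
  "@context": "https://schema.org",
  "@type": "QAPage",
  "mainEntity": {
    "@type": "Question",
    "name": "How do I reset my router?",
    "text": "My router stopped responding. How can I reset it to factory settings?",
    "answerCount": 2,
    "upvoteCount": 12,
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Hold the reset button on the back for 10 seconds.",
      "upvoteCount": 8,
      "url": "https://example.com/question-1#answer-1"
    },
    "suggestedAnswer": [
      {
        "@type": "Answer",
        "text": "Unplug the router, wait 30 seconds, then plug it back in.",
        "upvoteCount": 4,
        "url": "https://example.com/question-1#answer-2"
      }
    ]
  }
}
```

As with any structured data, it's worth validating the markup with Google's structured data testing tools before deploying it site-wide.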

Why should we care?

  • These rich results help a listing stand out among the other search results. With your snippet displaying a carousel of answers to scroll through, this could drive a higher CTR than a site that only shows the standard blue link in a normal snippet.

Key Takeaways

  • Tested a year ago, Google's rich results for Question & Answer pages in Search are now being expanded.
  • When users type a question in Search, pages with Q&A structured data implemented can appear as a rich result with a carousel of answers, and may even display a “Top Answer” label on the best answer in the snippet.
  • This new SERP rollout could help drive a higher CTR than a snippet that only displays the standard blue link to an expanded answer.

Google Releases Speakable Markup for News Publishers Interested in Google Assistant

(image credit: Search Engine Land)


New markup and schema bring more news-specific content to the Google Assistant and Google Home devices. Announced back in July 2018, Google is working on a new markup, named speakable, aimed at news publishers looking to feature content in the Google Assistant. Speakable allows publishers to mark up the most relevant sections of their news articles to be read aloud by the Google Assistant, such as on Google Home devices. (Currently only available for English-language users.)

This Google markup is currently listed as “BETA” and is subject to change. Google states, “We are currently developing this feature and you may see changes in requirements or guidelines.”

This new speakable markup is now listed on schema.org, which notes a brief description of the markup:

Indicates sections of a Web page that are particularly ‘speakable’ in the sense of being highlighted as being especially appropriate for text-to-speech conversion. Other sections of a page may also be usefully spoken in particular circumstances; the ‘speakable’ property serves to indicate the parts most likely to be generally useful for speech.

In addition to this newest markup, there are a few guidelines, both technical and content-related, that should be followed.

Technical Guidelines:

Don’t add speakable structured data to content that could sound confusing in a voice-only format, for example:

  1. Datelines (the location where the story was reported)
  2. Photo captions
  3. Source attributions

Focus on key points when adding speakable structured data, rather than the entire article. This will allow listeners to get an idea of the story without having the readout cut off important details.

Content Guidelines:

Speakable structured data content should have concise headlines and summaries that provide users with useful information.

If the top of the story is included in the speakable structured data, you should rewrite this section to break up information into individual sentences. This way Google Assistant devices can read the information more clearly.

For optimal audio user experience, Google recommends 20-30 seconds of content per section of speakable structured data. (Roughly two to three sentences.)
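Putting the guidelines above together, an illustrative JSON-LD sketch might look like the following; the CSS selectors, name, and URL are placeholders rather than values Google prescribes:

```json
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "name": "NASA announces new mission",
  "speakable": {
    "@type": "SpeakableSpecification",
    "cssSelector": [".headline", ".article-summary"]
  },
  "url": "https://example.com/nasa-news"
}
```

The `cssSelector` array points the Assistant at the elements whose text should be read aloud; schema.org also documents `xpath` as an alternative way to identify those sections.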

An example of a Google Home device using this speakable structured data, according to Google, looks like this:

When people ask the Google Assistant — “Hey Google, what’s the latest news on NASA?” the Google Assistant responds with an excerpt from a news article and the name of the news organization. Then the Google Assistant asks if the user would like to hear another news article and also sends the relevant links to the user’s mobile device.

Key Takeaways

  • The new speakable structured data markup is aimed at news publishers who want their content featured through the Google Assistant.
  • Google is still developing this feature, and we may see changes as work continues.
  • This new markup is currently available on schema.org with a description, guidelines, and examples.
  • There are technical and content guidelines that news publishers should follow when using this new speakable markup.

Google Tests Comments For Live Events

(image via 9to5google)

Over the past few months, Google has been testing a feature in the search results that allows users to comment directly on live events. Currently, the events are limited to live sports games such as soccer matches.

Here’s how to test this feature & leave a comment (via Google):

  1. Open the Google app or go to Google Search.
  2. Search for a sports game (e.g., a Liverpool match).
  3. In the overview box, tap or click More.
  4. Tap or click Comments.
  5. Tap or click Add a public comment.
  6. Enter your comment.

Comments left on events will be attributed to your Google account and can be viewed in the profile settings.


Key Takeaways

  • As viewership of major sporting events has become more accessible on TV and online, social media has become an integral part of the viewing experience as it provides real-time commentary that anyone can join.
  • By experimenting with comments on live events, Google could be looking to compete with Twitter, Facebook, and Instagram stories in driving conversations online. Instead of highlighting tweets around searches for these events, Google can capitalize on searches by having comments integrated into the box score panel.

Google PageSpeed Insights Adds Lighthouse Integration

(image via Google Page Speed Insights)

As page load time has become more of a focus for webmasters, Google has provided a variety of tools to test the speed of a site, including PageSpeed Insights, Mobile PageSpeed Insights, and Lighthouse. In order to provide more in-depth information about the performance of a site, Google released an update to PageSpeed Insights that integrates Lighthouse.

Full PageSpeed Insights reports will now include scores for mobile and desktop pages reflecting:

  • Field Data: Real world load times from users visiting a site with the Chrome Browser
  • Lab Data: Simulated load times from mobile devices
  • Opportunities: Suggestions on how to improve speed metrics
  • Diagnostics: Additional information for web developers to analyze
  • Audit: Provides insights on how well your site is performing for various SEO best practices
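The same report data can also be retrieved programmatically through the PageSpeed Insights API. As a trimmed, illustrative sketch (field names reflect the Lighthouse-integrated response format and are not exhaustive), the JSON mirrors the report sections, with `loadingExperience` carrying the Field Data and `lighthouseResult` carrying the Lab Data, opportunities, diagnostics, and audits:

```json
{
  "id": "https://example.com/",
  "loadingExperience": {
    "metrics": {
      "FIRST_CONTENTFUL_PAINT_MS": { "percentile": 1200, "category": "FAST" }
    }
  },
  "lighthouseResult": {
    "categories": {
      "performance": { "score": 0.92 }
    },
    "audits": {}
  }
}
```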


Key Takeaways

  • Monitoring the speed of your site is important to ensure that it is fully optimized. Users who visit pages with longer load times are more likely to leave the page and return to their search results.
  • With Lighthouse now integrated into Page Speed Insights, webmasters have more information on potential causes for slow page load speeds & can make better decisions on optimizations such as reducing image files or removing unnecessary plugins.

Structured Data and Indexing API Now Available for Live Stream Videos

(image credit: Search Engine Land)


Live stream videos can appear in Google search results with a “live” label. The label displays over the video thumbnail in a carousel format. Examples of live stream content include sports events, award shows, influencer videos, and even video games.

This is made possible with the new live stream structured data. This data ultimately signals to Google when a video is live or when it will begin. More information on implementing the livestream structured data is explained by Google Developers.
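As a sketch of what that markup can look like, the livestream structured data nests a schema.org BroadcastEvent inside a VideoObject; the names, dates, and URLs below are placeholders for illustration only:

```json
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "Championship Final - Live",
  "description": "Live coverage of the championship final.",
  "thumbnailUrl": "https://example.com/thumbnail.jpg",
  "uploadDate": "2018-12-14T18:00:00+00:00",
  "publication": {
    "@type": "BroadcastEvent",
    "isLiveBroadcast": true,
    "startDate": "2018-12-14T19:00:00+00:00",
    "endDate": "2018-12-14T21:00:00+00:00"
  }
}
```

The `isLiveBroadcast`, `startDate`, and `endDate` fields are what signal to Google that the video is live, when it begins, and when it ends; see Google Developers for the full set of required and recommended properties.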

In conjunction with the new structured data, Google has made its Indexing API available for live stream content. The purpose is to expedite Google's indexing of livestream content by prompting a quick crawl.
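In practice, using the Indexing API amounts to POSTing a small JSON notification to the `urlNotifications:publish` endpoint whenever the livestream page is published or updated (the URL below is a placeholder):

```json
{
  "url": "https://example.com/live/championship-final",
  "type": "URL_UPDATED"
}
```

The request is sent to `https://indexing.googleapis.com/v3/urlNotifications:publish` with an authorized service account token; a `URL_DELETED` type exists for removing pages.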

Key Takeaways

  • Google can display “live” labels on live stream content when marked up appropriately.
  • New livestream structured data provides necessary information to Google, such as start and end dates.
  • Indexing API can now be used for livestreams in order for Google to crawl and index quickly.
