Google’s John Mueller Discusses June 2019 Update Recovery

Google’s John Mueller was asked in a Webmaster Hangout what to do if a site is suffering a traffic loss due to Google’s June 2019 broad core algorithm update. Mueller’s answer offered insight into what is happening.

Then Mueller provided hope that Google may offer further guidance on what to do.

Webmaster Asks: Is It a Content Issue?

The person asking the question states that they’re a news publisher. Because they deal in content, they ask whether the core update issue for them may be content related.

Here is the question:

“We’re a news publisher website, primarily focusing on the business finance vertical. We probably have been impacted by the June Core Update as we’ve seen a drastic traffic drop from the June 1st week.

Agreed that the update specifies that there are no fixes and no major changes that need to be made to lower the impact.

But for a publisher whose core area is content news, doesn’t it signal that it’s probably the content, the quality or the quantity which triggered Google’s algorithm to lower down the quality signal of the content being put up on the website which could have led to a drop of traffic?”

The questioner states that webmasters need more guidance:

“…it would really help if Google could come out and share some advice to webmasters and websites.

Not site specific, but category or vertical specific at least on how to take corrective measures and actions to mitigate the impact of core updates.

It would go a long way in helping websites who are now clueless as to what impacted them.”

Screenshot of Google’s John Mueller in a Webmaster Hangout, where he was asked how to recover from a Google broad core algorithm update.

Why There’s Nothing to Fix

John Mueller did not suggest fixing anything specific. He explained that there’s nothing specific to fix because a core update encompasses a broader range of factors.

Google’s John Mueller explains:

“I think it’s a bit tricky because we’re not focusing on something very specific where we’d say like for example when we rolled out the speed update.

That was something where we could talk about specifically, this is how we’re using mobile speed and this is how it affects your website, therefore you should focus on speed as well.”

Core Update, Relevance and Quality

John Mueller then discussed core updates within the context of relevance and quality updates. He did not say that core algorithm updates were specifically just about relevance or just about quality. He seemed to mention those two aspects as a way to show how these kinds of updates do not have specific fixes.

Here is how John Mueller explained it:

“With a lot of the relevance updates, a lot of the kind of quality updates, the core updates that we make, there is no specific thing where we’d be able to say you did this and you should have done that and therefore we’re showing things differently.”

John Mueller then gave an example of how changes that are external to a website can affect how Google ranks it.

This is what he said:

“Sometimes the web just evolved. Sometimes what users expect evolves and similarly, sometimes our algorithms are, the way that we try to determine relevance, they evolve as well.”

That may be the most a Googler has said so far to explain core algorithm updates.

It follows along with what I’ve been saying: factors such as how Google determines what makes a page relevant to a user can change. Some continue to focus on “quality” issues, fixing things like missing biographies or too much advertising on a page, but that kind of advice ignores relevance issues.

Mueller mentioned quality, but he also mentioned how users and the web evolve. That’s not a quality issue; those are factors external to a website that need to be considered.

Nothing to Fix

John Mueller related that there aren’t specific things to fix. But he suggested that it may be useful to understand how users see your site and how useful it is to them.

Here’s what John Mueller said:

“And with that, like you mentioned, you’ve probably seen the tweets from Search Liaison, there’s often nothing explicit that you can do to kind of change that.

What we do have is an older blog post from Amit Singhal which covers a lot of questions that you can ask yourself about the quality of your website. That’s something I always recommend going through. That’s something that I would also go through with people who are not associated with your website.”

John Mueller may have been citing a Webmaster Central blog post from 2011 titled More Guidance on Building High-quality Sites.

In it, the author provides a large number of questions a site owner should ask themselves about their content.

Here is a sample of the kinds of questions Google suggests you should ask yourself:

  • “Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Would you be comfortable giving your credit card information to this site?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the page provide substantial value when compared to other pages in search results?”

Ask a Third Party For a Critique

John Mueller then suggested that a third party that is unfamiliar with your site may be able to see issues that are not apparent to you.

What John Mueller said:

“So, often you as a site owner have an intimate relationship with your website; you know exactly that it’s perfect. But someone who is not associated with your website might look at your website and compare it to other websites and say, well, I don’t know if I could really trust your website because it looks outdated or because I don’t know who these people are who are writing about things.

All of these things play a small role and it’s not so much that there’s any technical thing that you can change in your line of HTML or server setting.

It’s more about the overall picture where users nowadays would look at it and say, well I don’t know if this is as relevant as it used to be because these vague things that I might be thinking about.

So that’s where I’d really try to get people who are un-associated with your website to give you feedback on that.”

John Mueller suggested asking in web communities, including the Webmaster Help Forums, to see how others view your site and whether they can spot problems.

One issue with that is that every community has specific points of view that sometimes don’t allow them to get past their biases to see what the real problem is. That’s not a criticism but an observation: opinions tend to vary.

Here’s what he said:

“…you can talk with other people who’ve seen a lot of websites and who can look at your websites and say well, I don’t know, the layout looks outdated or the authors are people that nobody knows or you have stock photo images instead of author photos. It’s like, why do you have that?

All of these things are not explicit elements that our algorithms would be trying to pinpoint but rather things that kind of combine to create a bigger picture.”

That’s good advice. Familiarity can blind a person to the problems in a website.

I know from experience that it’s not uncommon for a site owner who comes to me for help to be surprised that their site has content problems, is outdated in some way, or has room for improvement in how the content is written.

Sometimes they intuit that something is wrong but they can’t see it. I once had a site owner come to me with a negative SEO problem but the feedback I received directly from Google was that they were suffering from content issues related to Google’s Panda algorithm.

It was a shock for them to hear that their content was bad. But having it confirmed by Google made them better able to see that yes, there were problems with the content.

Google May Provide Additional Official Guidance

John Mueller appeared to be moved by the situation experienced by the web publisher. He offered more advice and insight into core algorithm updates than he had previously offered, going beyond the “nothing to fix” advice that, while true, is still felt to be unsatisfactory by many in the web community.

Mueller then offered hope by suggesting he would inquire about providing additional guidance for web publishers.

“I know a lot of people have been asking for more advice, more specific advice, so maybe there’s something that we can put together. We’ll see what we can do internally to put out a newer version of a blog post or kind of provide some more general information about some of the changes we’ve been thinking about there.”

Takeaway: See the Big Picture

The important takeaways are to be able to step back and see the big picture, which means:

Some issues are external to the website. For example, many fashion brands no longer publish blogs. An SEO recently attributed that to a failure in the content strategy. But that’s missing the big picture. The reason many fashion brands no longer publish blog posts is because users don’t consume that kind of content. They consume content on Instagram, Facebook, Twitter and other social media websites.

That’s an example of how users evolve and how it’s not a problem with your site, but rather a change in user habits that may be reflected in the kinds of pages that Google shows in the search results.

Takeaway: Algorithms Evolve

Google’s algorithm does not really match keywords to web pages. It’s about solving problems for users. Google is increasingly updating how it understands what users want when they type a query. Google is also updating how it understands the problems that a web page solves.

A website that focuses too much on keywords, and not enough on providing quick information to users who need it quickly and deep information to users who need depth, may find that Google’s algorithms no longer favor it. Not because the site is broken and needs fixing, but because it does not solve the problem for users in the way Google has determined users want it solved.

Takeaway: Have a Third Party Review Your Site

Lastly, it may be helpful to have a fresh set of eyes review your website. If that doesn’t provide insights, then someone with experience diagnosing relevance issues may be useful.

Read: June 2019 Broad Core Algo Update: It’s More than E-A-T

Read: What is a Broad Core Algorithm Update?

Watch: Webmaster Hangout


Google’s John Mueller Predicts Dynamic Rendering Won’t Be Needed in a Few Years

Google’s John Mueller predicts that dynamic rendering will only be a temporary workaround for helping web crawlers process JavaScript.

Eventually, all web crawlers will be able to process JavaScript, Mueller believes. So in a few years’ time relying on dynamic rendering may not be necessary.

Mueller made this prediction during a recent Google Webmaster Central hangout when a site owner asked if there’s any reason why they shouldn’t use dynamic rendering.

Here is the question that was submitted:

“We’re thinking of the option to start only serving server-side rendering for bots on some of our pages. Is this an accepted behavior by Google & friends nowadays? Or do you see any objections on why not to do this?”

In response, Mueller said dynamic rendering is definitely something that Google considers to be an acceptable solution. In the near future, however, sites won’t need to rely on it as much.

Googlebot can already process pretty much every type of JavaScript page, and Mueller suspects all other crawlers will follow suit.

Mueller says dynamic rendering is a temporary workaround until other crawlers catch up. Although “temporary” might mean a couple of years, he clarifies.

What makes this prediction particularly interesting is that dynamic rendering was only introduced last year at Google I/O 2018.

Now, a little over a year later, Mueller predicts this innovative solution for serving JavaScript to bots will only be needed for a few years.

It will be interesting to look back on this and see how Mueller’s prediction pans out.

Hear Mueller’s full response below, starting at the 18:38 mark:

[embedded content]

“So you can definitely do this, from our point of view. This is what we call, I believe, dynamic rendering, which is basically when you’re pre-rendering the pages for a specific set of users. Usually, that includes crawlers, social media user agents, all of those things that are basically not normal users that wouldn’t be able to process JavaScript.

That’s certainly something you could do. Sometimes it also makes sense to use server-side rendering for users as well. Sometimes you can significantly speed up the delivery of HTML pages to them. So it’s not something that I’d consider that you only need to do for bots, it’s probably worthwhile to check to see if there are ways you can leverage that same setup for users as well. Maybe you can, maybe that doesn’t make sense in this specific case.

In any case, from our side specifically, it’s something that you can do. I suspect over time, over the long run, it will be something that you’ll have to do less and less. Googlebot is able to crawl pretty much every JavaScript-type page nowadays. I suspect other user agents will follow up with that over time as well.

So I would see this as something kind of as a temporary workaround – where temporary might mean a couple of years – but it’s more of a time-limited workaround. At some point pretty much every relevant user agent will be able to process JavaScript.”
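For readers who want a concrete picture of the setup Mueller describes, here is a minimal sketch of dynamic rendering, assuming a TypeScript Express server. The bot list and the prerenderPage() helper are illustrative assumptions for this example, not anything Google prescribes.

```typescript
import express from "express";

// Illustrative list of crawler user agents; a real deployment would
// maintain a more complete set.
const BOT_PATTERN = /googlebot|bingbot|twitterbot|facebookexternalhit/i;

// Hypothetical helper: in practice this would call a headless browser
// (e.g., Puppeteer) or a prerendering service and return static HTML.
async function prerenderPage(url: string): Promise<string> {
  return `<html><body>Pre-rendered snapshot of ${url}</body></html>`;
}

const app = express();

app.get("*", async (req, res, next) => {
  const userAgent = req.get("user-agent") ?? "";
  if (BOT_PATTERN.test(userAgent)) {
    // Crawlers and social media agents get a static, pre-rendered snapshot.
    res.type("html").send(await prerenderPage(req.originalUrl));
  } else {
    // Regular users get the normal client-side JavaScript app.
    next();
  }
});

app.listen(3000);
```

As Mueller notes above, the same pre-rendered HTML can often be served to regular users as well, which may speed up delivery for them too.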

Google Rolls Out a New Look for Desktop Search Results

Google has widely rolled out a new design for desktop search results which features colorful icons in the top navigation bar.

Google was initially spotted testing the new look back in March. As of today, it appears that everyone has access to the new design.

Previously, the top navigation menu was just text, so this adds a little more character to the search results pages.

You can see in the examples below how the new icons light up with color when they’re selected.

Screenshots: Google Rolls Out a New Look for Desktop Search Results

Today’s changes to desktop search results are purely cosmetic, as everything still functions the same way.

There are no noticeable changes to how search results are presented underneath the top navigation bar.

Google: Self-Referencing Canonicals Are Not Critical

Google’s John Mueller recently stated that self-referencing canonical tags are not absolutely necessary, but they do help.

In Mueller’s words: “It’s a great practice to have a self-referencing canonical but it’s not critical.”

This topic came up during a recent Google Webmaster Central hangout when a site owner asked about the importance of using self-referencing canonicals.

Canonicals are typically used to link a non-canonical page to the canonical version, but they can also be used to link a page to itself.

Self-referencing canonicals are beneficial because URLs may get linked to with parameters and UTM tags.

When that happens, Google may pick up the URL with parameters as the canonical version. So a self-referencing canonical lets you specify which URL you want to have recognized as the canonical URL.

Google recommends using self-referencing canonicals as a best practice, but they’re not required in order for Google to pick up on the correct version of a URL.
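As a rough illustration (not from the hangout), here is a TypeScript sketch of the idea: strip the tracking parameters from a requested URL so that every tagged variant declares the same clean URL as canonical. The parameter list is an assumption for the example.

```typescript
// Tracking parameters to strip; adjust to whatever tags your links carry.
const TRACKING_PARAMS = [
  "utm_source",
  "utm_medium",
  "utm_campaign",
  "utm_term",
  "utm_content",
  "gclid",
];

// Build a self-referencing canonical link tag for the requested URL.
function canonicalLinkTag(requestUrl: string): string {
  const url = new URL(requestUrl);
  for (const param of TRACKING_PARAMS) {
    url.searchParams.delete(param);
  }
  return `<link rel="canonical" href="${url.toString()}">`;
}

// A visit via a tagged link still points at the clean URL:
console.log(canonicalLinkTag("https://example.com/article?utm_source=newsletter"));
// -> <link rel="canonical" href="https://example.com/article">
```

This way, even if Google discovers the tagged variant first, the page itself tells Google which URL should be treated as canonical.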

Hear Mueller’s full response in the video below, starting at the 28:53 mark:

[embedded content]

“It’s not critical to have a self-referencing canonical tag on a page, but it does make it easier for us to pick exactly the URL that you want to have chosen as canonical.

We use a number of factors to pick a canonical URL, and the rel-canonical does play a role in that.

So, in particular, things like URL parameters, or if the URL is tagged in any particular way – maybe you have links going to that page that are tagged for analytics, for example – then it might happen that we pick that tagged URL as a canonical.

And with the rel-canonical you’re telling us that you really, really want this URL that you’re specifying as the canonical…

So it’s a great practice to have a self-referencing canonical but it’s not critical. It’s not something that you must do, it’s just something that helps to make sure this markup is picked up properly.”

Google: We Don’t Evaluate a Site’s Authority

Google doesn’t specifically measure the authority of a website, according to webmaster trends analyst John Mueller.

This was stated in the most recent Google Webmaster Central hangout when Mueller was asked how a site can increase its authority.

The webmaster who asked the question says their site lost a significant amount of organic traffic following the June core algorithm update.

Somehow, the webmaster came to the conclusion that their site’s authority dropped by 50 percent as a result of the core update.

However, that would be impossible to determine, as Mueller says Google doesn’t have any kind of ‘authority’ metric.

“In general, Google doesn’t evaluate a site’s authority. So it’s not something where we would give you a score on authority and say this is the general score for authority on your website. That’s not something we would be applying here.”

Google’s quality rater guidelines have a section on evaluating authority, but earlier in the hangout Mueller mentioned quality raters do not evaluate sites on an individual basis.

In other words, quality raters are not looking at sites and assigning scores based on how authoritative the sites appear to be.

The people site owners should get to evaluate their sites are actual users, Mueller says. He recommends seeking feedback from current or potential users regarding their perception of a site’s authority.

Find out if real users feel like they can trust the content on a website. From there, you should be able to gather feedback on how to appear more authoritative.

Here is the full quote from Mueller:

“If you’re thinking about authority, if you’re thinking about the search quality raters, then that sounds like you’re kind of on the right track there. One of the other questions was also on expertise, authority, trustworthiness – that kind of goes in the same direction.

It’s something, from my point of view, where I would try to get more input from users and potential users. Really try to get the more hard feedback that’s sometimes hard to take where people can really tell you where they think – like comparing different sites in the same niche – where they see issues that you could be doing. Or where they look at your page and think I can’t really trust the content that’s on here.

It’s probably the case that you’ve already been doing a lot of these things really well, but maybe there are things you could be doing even better in that regard.”

Hear Mueller’s full response below, starting at the 23:47 mark:

[embedded content]

Google Recommends Using JavaScript “Responsibly”

Google’s Martin Splitt, a webmaster trends analyst, recommends reducing reliance on JavaScript in order to provide the best experience for users.

In addition, “responsible” use of JavaScript can also help ensure that a site’s content is not lagging behind in Google’s search index.

These points were brought up during the latest SEO Mythbusting video which focuses on web performance.

Joined by Ada Rose Cannon of Samsung, Splitt discussed a number of topics about web performance as it relates to SEO.

The discussion naturally led to the topic of JavaScript, as overuse of JS can seriously drag down the performance of a website.

Here are some highlights from the discussion.

JavaScript sites may be lagging behind

Overuse of JavaScript can be especially detrimental to sites that publish fresh content on a daily basis.

As a result of Google’s two-pass indexing process, fresh content on a JS-heavy site may not be indexed in search results for up to a week after it has been published.

When crawling a JS-heavy web page, Googlebot will first process the non-JS content delivered in the HTML and CSS.

The page then gets put into a queue and Googlebot will render and index the rest of the content when more resources are available.

Use dynamic rendering to avoid a delay in indexing

One way to get around the problem of indexing lag, other than using hybrid rendering or server-side rendering, is to utilize dynamic rendering.

Dynamic rendering provides Googlebot with a static rendered version of a page, which will help it get indexed faster.

Rely mostly on HTML and CSS, if possible

When it comes to crawling, indexing, and overall user experience, it’s best to rely primarily on HTML and CSS.

Splitt says HTML and CSS are more “resilient” than JavaScript because they degrade more gracefully.
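To make the point about graceful degradation concrete, here is a small, hypothetical TypeScript sketch of the pattern Splitt is advocating: the article content already ships in the server-sent HTML, and the script only layers a nice-to-have extra on top, so nothing critical is lost if the script never runs. The data-reading-time attribute is an assumed hook for the example.

```typescript
// The article text is already present in the HTML, so crawlers and users
// without JavaScript see all of the content. This script only adds a
// non-essential reading-time estimate on top of it.
document
  .querySelectorAll<HTMLElement>("article[data-reading-time]")
  .forEach((article) => {
    const words = (article.textContent ?? "").trim().split(/\s+/).length;
    const minutes = Math.ceil(words / 200); // assume ~200 words per minute
    article.insertAdjacentHTML(
      "beforeend",
      `<p class="reading-time">~${minutes} min read</p>`
    );
  });
```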

For further information, see the full video below:

[embedded content]

Google Search Console Now Shows 90 Days of Search & Discover Data

Google is making an adjustment to the performance report in Search Console, which will now display 90 full days of data by default.

This change will affect data shown for clicks from search results and clicks from the Discover feed.

Previously, Search Console displayed 28 days of data.

Screenshots: Google Search Console Now Shows 90 Days of Search & Discover Data

A small change to be sure, but a welcome one. As SEOs and marketers, we can never have too much data.

If you’re not seeing Discover data in Search Console, it’s likely because your site is not receiving a significant number of clicks and/or impressions from the Discover feed.

Discover typically surfaces newer content; however, it may also surface content from months or years ago if it’s still relevant.

The content shown in Discover is tailored to an individual’s search and browsing activity. As long as your content is indexed in search results, it is eligible to appear in Discover.

Discover only appears on mobile devices, so it primarily surfaces mobile-friendly web pages. It may help even more if the content is an AMP page.

Lastly, Discover is full of rich, visual content. So you can improve your chances of appearing in Discover by using images and videos in your articles.

Google’s John Mueller Answers Whether Author Bio is Necessary

On a Google Webmaster Hangout, Google’s John Mueller was asked if the author bio page was necessary in order to meet Google’s E-A-T guidelines. Mueller’s response downplayed the necessity of author bio pages as a technical issue and suggested it was a user experience issue.

Authorship Signals

The SEO industry believes that Google’s Quality Raters Guidelines describe how to rank better in Google. It is from those guidelines that the belief arose that naming the author and listing their biography and credentials is a technical requirement to check off the SEO ranking-signal list.

But Google has never said that authorship biographies were a ranking signal. And the Quality Raters Guidelines were never represented by Google as listing ranking related signals.

John Mueller’s answer does not recommend authorship as a ranking signal. Instead, he frames it as a user experience issue.

Expertise, authority and trustworthiness are important. But they are not the entire algorithm.

The Necessity of Author Biography Pages

The webmaster was concerned that their author biography pages were not being seen by Googlebot because they were noindexed. Noindex is a directive, typically set with a robots meta tag (<meta name="robots" content="noindex">), that tells search engines to exclude a web page from the search engine index.

The concern was that, because Google could not see the author biography pages, this would have a negative effect on rankings. But according to Google’s John Mueller, that’s not the case at all.

Here is the question:

“Can you speak to the necessity of E-A-T and author biography pages linked from an article?

….So, kind of the necessity of the author biography pages. Should we have the author’s credentials on the article itself or is linking to their biography by their byline good enough?

We have an issue where the author bio pages are meta noindex. Does it stop GoogleBot or Google Quality Raters from accessing the pages?”

John Mueller began by trying to define what the acronym E-A-T meant.

He raised his head and stared upward for a few seconds trying to recall what it meant.

Screenshot of Google’s John Mueller as he gazed upward, appearing to struggle for a moment to remember what the acronym E-A-T stood for.

John Mueller then went on to incorrectly recall what the acronym E-A-T stood for, repeatedly referring to the “A” as Authority. Google’s Quality Raters Guidelines consistently refer to E-A-T as Expertise, Authoritativeness and Trustworthiness. It’s authoritativeness, not authority.

Here is John Mueller’s response:

“So E-A-T is expertise, authority, trustworthiness, I think…

And it comes from our quality raters guidelines, which are basically the guidelines that we give the folks who help us to improve our algorithms overall.

So… first of all it’s worth keeping in mind that our quality raters help us to improve our algorithms overall. They do not review individual websites.

So it’s not something where you need to optimize your websites for access by quality raters.”

John Mueller Downplays Author Bio Pages

Mueller does not at any point indicate that the author biography page is an important SEO factor. There is no indication from him that it is important to show the author bio. Instead, he focuses on how it impacts site visitors.

Here is what Mueller said:

“With regards to author pages and expertise, authority and trustworthiness, that’s something where I’d recommend checking that out with your users and doing maybe a short user study, specifically for your set up, for the different set ups that you have, trying to figure out how you can best show that the people who are creating content for your website, they’re really great people, they’re people who know what they’re talking about, they have credentials or whatever is relevant within your field.”

Author Pages May Not be Required

Mueller then went on to state that author biographies are not a technical issue that needs to be addressed.

This contradicts a pervasive SEO belief. Many SEOs insist that failure to include an author biography could result in a loss of rankings.

This belief is seen in many posts on Google’s Webmaster Help Forums. For example, a “silver” level member of Google’s own Webmaster Help Forums cited authorship signals when trying to diagnose why a website lost rankings.

“The website has no information about the valid organization of the publisher.  …This contradicts to the following recommendations of Expertise, Authoritativeness, Trustworthiness – EAT of Google:
● Who (what individual, company, business, foundation, etc.) is responsible for the website.
● Who (what individual, company, business, foundation, etc.) created the content on the page.”

Google’s Quality Raters Guidelines were not produced to give insights into Google’s algorithm. Yet members of Google’s own Webmaster Help forums treat the information as if it holds insights into why a site may have lost rankings.

John Mueller advises that author biographies are not a technical requirement:

“So that’s less something I would focus on as a technical thing, like you need to do this, this and this, or this type of markup for these pages, but rather more as a quality thing, as a user experience thing where you can actually do user tests with your users directly.”

Author Biographies are Not a Ranking Signal?

There is no evidence that an author biography is a ranking signal. An author biography is so easy to fake that it makes sense not to use it as a ranking signal. So maybe it’s time to move beyond the endemic reductionist thinking that seeks to miniaturize Google’s algorithm to simple technical factors.

There is no evidence that author biographies are the critical ranking factor that many in the SEO industry claim them to be.

Watch the Webmaster Hangout here.

Google Announces Free Webmaster Conferences

Google announced that it will be hosting conferences worldwide. The goal is to bring search-related information directly to people in their own countries and in their own languages. Each event will be completely free.

Google promised to reveal the details of more Webmaster Conferences coming to communities around the world.

What Are the Webmaster Conferences?

The Webmaster Conferences are intended to match the needs of the webmasters located in the communities in which they are held.

According to Google:

“These events are primarily located where it’s difficult to access search conferences or information about Google Search, or where there’s a specific need for a Search event. For example, if we identify that a region has problems with hacked sites, we may organize an event focusing on that specific topic.”

Webmaster Conferences Coming to Europe and North America

Google noted that events are coming to Europe and North America. Dates and locations have not yet been announced.

According to Google’s announcement:

“We will also host web communities in Europe and North America later this year, so keep an eye out for the announcements!”

Webmaster Conference Japan, March & April 2019

Google hosted a Webmaster Conference in Okinawa and Fukuoka, Japan in March and April 2019. Gary Illyes and several Japanese Googlers attended.

Sessions at the Japanese Webmaster Conferences included:

  • The New Search Console & Helping The Long Tail Web
  • Google Image Search

Webmaster Conferences are Coming to India and Indonesia

Fifteen Webmaster Conferences are coming to India from June 17 to August 9, 2019. Two events are coming to Indonesia.

The complete list of conferences and registration forms for the Indian conferences are here.

According to the official Google announcement for the Indian events:

“We are hosting the Webmaster conferences 2019 (rebranded from Google Search conference) as Google deeply cares about the creation of local language content and discovery of this content. These conferences are a series of one day events across 15 cities in India…”

Read the official Google Webmaster Conferences announcement here.

Find a list of upcoming events here.

Webmaster Conference page here.


Google’s How News Works, aimed at clarifying news transparency

In May, Google announced the launch of a new website aimed at explaining how it serves and handles news across Google properties and platforms.

The site, How News Works, states Google’s mission as it relates to disseminating news in a non-biased manner. The site aggregates a variety of information about how Google crawls, indexes, and ranks news stories as well as how news can be personalized for the end user.

How News Works provides links to various resources within the Google news ecosystem all in one place and is part of The Google News Initiative.

What is The Google News Initiative?

The Google News Initiative (GNI) is Google’s effort to work with news industry professionals to “help journalism thrive in the digital age.” The GNI is driven and summarized by the GNI website which provides information about a variety of initiatives and approaches within Google including:

  • How to work with Google (e.g., partnership opportunities, training tools, funding opportunities)
  • A list of current partnerships and case studies
  • A collection of programs and funding opportunities for journalists and news organizations
  • A catalog of Google products relevant to journalists

Google attempts to work with the news industry in a variety of ways. For example, it provides funding opportunities to help journalists from around the world.

Google is now accepting applications (through mid-July) from North American and Latin American applicants to help fund projects that “drive digital innovation and develop new business models.” Applicants who meet Google’s specified criteria (and are selected) will be awarded up to $300,000 in funding (for U.S. applicants) or $250,000 (for Latin American applicants), with funding covering up to 70% of the total project cost.

The GNI website also provides users with a variety of training resources and tools. Journalists can learn how to partner with Google to test and deploy new technologies, such as the Washington Post’s participation in Google’s AMP (Accelerated Mobile Pages) program.

AMP is an open source initiative that Google launched in February 2016 with the goal of making mobile web pages faster.

AMP mirrors content on traditional web pages, but uses AMP HTML, an open source format architected in an ultra-light way to reduce latency for readers.

News transparency and accountability

The GNI’s How News Works website reinforces Google’s mission to “elevate trustworthy information.” The site explains how the news algorithm works and links to Google’s news content policies.

The content policy covers Google’s approach to accountability and transparency, its requirements for paid or promotional material, copyright, restricted content, privacy/personalization and more.

This new GNI resource, a subsection of the main GNI website, acts as a starting point for journalists and news organizations to delve into Google’s vast news infrastructure including video news on YouTube.

Since it can be difficult to ascertain if news is trustworthy and accurate, this latest initiative by Google is one way that journalists (and the general public) can gain an understanding of how news is elevated and indexed on Google properties.
