Blog

Keep up to date with all the latest news

Google's robots.txt parser is now open source

For 25 years, the Robots Exclusion Protocol (REP) was only a de-facto standard. This had frustrating implications sometimes. On the one hand, for webmasters, it meant uncertainty in corner cases, like when their text editor included BOM characters in their robots.txt files. On the other hand, for crawler and tool developers, it also brought uncertainty; for example, how should they deal with robots.txt files that are hundreds of megabytes in size?

Today, we announced that we're spearheading the effort to make the REP an internet standard. While this is an important step, it means extra work for developers who parse robots.txt files. We're here to help: we open sourced the C++ library that our production systems use for parsing and matching rules in robots.txt files. This library has been around for 20 years and contains pieces of code that were written in the 90s. Since then, the library has evolved; we learned a lot about how webmasters write robots.txt files and the corner cases we had to cover for, and we added what we learned over the years to the internet draft when it made sense.

We also included a testing tool, robots_main, in the open source package to help you test a few rules. Once built, the usage is very straightforward (see the example invocation below).

If you want to check out the library, head over to our GitHub repository for the robots.txt parser. We'd love to see what you can build using it! If you built something using the library, drop us a comment on Twitter, and if you have comments or questions about the library, find us on GitHub.

Posted by Edu Pereda, Lode Vandevenne, and Gary, Search Open Sourcing team
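The following is an illustrative sketch of that invocation, not a verbatim transcript of the tool; the file name and URL are placeholders, and the exact argument order and output wording may differ in the released binary:

```
# Check whether a user agent may fetch a URL under the rules in a robots.txt file
robots_main example_robots.txt FooBot https://example.com/private/page.html
```

The tool then reports whether that user agent and URL combination is allowed or disallowed by the given rules.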

Vlog #1: Ryan Clutter of Horizon Media on Google Core Updates & More SEO

In our first ever vlog as part of the Search Engine Roundtable vlog series, I interviewed Ryan Clutter (@RyanClutt). Ryan is the Senior SEO Analyst at Horizon Media, a full-service marketing agency with estimated billings of $8 billion and over 2,000 employees.

Note: Keep in mind that this is my first run at the vlog; I am still getting used to the camera and to being comfortable with this format. It should get a lot better over time. I did the editing myself on this video, but I think the next video, coming next Monday, will be edited by someone more professional. I know there are audio issues and other things I can improve, so please drop feedback in the comments on how to improve these.

In episode one I took you around the Horizon Media office, showed you some views outside of the office and then of the Freedom Tower in NYC. I then asked Ryan about his history in SEO, the types of projects he works on at Horizon and the types of clients he manages. We then talked about his day-to-day as the Senior SEO Analyst at Horizon Media and the tools he uses. Then we jumped into talking about Google algorithm updates, EAT, machine learning, the diversity update, the transparency around those updates and so on. We even talked about what he thinks is the future for our industry and SEO.

Here is the video:

You can subscribe to our YouTube channel by clicking here so you don't miss the next vlog where I interview Lily Ray of Path Interactive. I do have a nice lineup of interviews scheduled with SEOs from IBM, CBS and some of the top agencies, so you don't want to miss them - and I promise to continue to make these vlogs better over time. If you want to be interviewed, please fill out this form with your details.

Continue reading

Google Images Makes It Easier To Share GIFs

You all know I love GIFs and now Google is making it easier than ever to share GIFs that you find in Google Image search. I personally cannot replicate the feature on my iPhone on any browser but Google said it was here.

Google announced it on Twitter, saying, "Now this is something to get excited about. We're GIF-ing you a way to get your message across with a new 'Share GIFs' section on Google Images."

Here is what it looks like:

How will this help you as an SEO? It probably won't, outside of helping you relax and smile a bit after dealing with a Google update.

Forum discussion at Twitter.


Google: It May Be Worth Looking Into Compressing HTML & CSS

Google's John Mueller confirmed on Twitter that it may be worth looking into compressing your HTML and CSS. He said "Sometimes minifying HTML & CSS can reduce the size of the files, so it can certainly be worth looking into that," when he was asked if it can help with SEO or rankings.

Of course, it matters how bloated your pages are right now and whether Googlebot is having issues consuming the content on your pages or crawling all of the pages within your site. But overall, tweaking things so your pages load faster is a good thing, not just for SEO but more so for your users and conversions.

Here are those tweets:

Sometimes minifying HTML & CSS can reduce the size of the files, so it can certainly be worth looking into that.

— John (@JohnMu) June 30, 2019
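To make the idea concrete, here is a hypothetical before-and-after (the selector and styles are made up); the rule behaves identically, it just ships fewer bytes, and the same principle applies to HTML:

```css
/* Before minification: readable, but larger over the wire */
.article-header {
  margin-top: 16px;
  margin-bottom: 16px;
  color: #333333;
}

/* After minification: same behavior, fewer bytes */
.article-header{margin-top:16px;margin-bottom:16px;color:#333}
```

Minification is also separate from server-side compression such as gzip or Brotli; the two stack, so it can be worth looking at both.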

Forum discussion at Twitter.


Formalizing the Robots Exclusion Protocol Specification

For 25 years, the Robots Exclusion Protocol (REP) has been one of the most basic and critical components of the web. It allows website owners to exclude automated clients, for example web crawlers, from accessing their sites, either partially or completely.

In 1994, Martijn Koster (a webmaster himself) created the initial standard after crawlers were overwhelming his site. With more input from other webmasters, the REP was born, and it was adopted by search engines to help website owners manage their server resources more easily.

However, the REP was never turned into an official Internet standard, which means that developers have interpreted the protocol somewhat differently over the years. And since its inception, the REP hasn't been updated to cover today's corner cases. This is a challenging problem for website owners because the ambiguous de-facto standard made it difficult to write the rules correctly.

We wanted to help website owners and developers create amazing experiences on the internet instead of worrying about how to control crawlers. Together with the original author of the protocol, webmasters, and other search engines, we've documented how the REP is used on the modern web, and submitted it to the IETF.

The proposed REP draft reflects over 20 years of real-world experience of relying on robots.txt rules, used both by Googlebot and other major crawlers, as well as by about half a billion websites that rely on the REP. These fine-grained controls give the publisher the power to decide what they'd like to be crawled on their site and potentially shown to interested users. The draft doesn't change the rules created in 1994, but rather defines essentially all undefined scenarios for robots.txt parsing and matching, and extends it for the modern web. Notably:

- Any URI-based transfer protocol can use robots.txt. For example, it's not limited to HTTP anymore and can be used for FTP or CoAP as well.
- Developers must parse at least the first 500 kibibytes of a robots.txt file. Defining a maximum file size ensures that connections are not open for too long, alleviating unnecessary strain on servers.
- A new maximum caching time of 24 hours (or the cache directive value, if available) gives website owners the flexibility to update their robots.txt whenever they want, while crawlers don't overload websites with robots.txt requests. For example, in the case of HTTP, Cache-Control headers could be used for determining caching time.
- The specification now provisions that when a previously accessible robots.txt file becomes inaccessible due to server failures, known disallowed pages are not crawled for a reasonably long period of time.

Additionally, we've updated the augmented Backus–Naur form in the internet draft to better define the syntax of robots.txt, which is critical for developers to parse the lines.

RFC stands for Request for Comments, and we mean it: we uploaded the draft to the IETF to get feedback from developers who care about the basic building blocks of the internet. As we work to give web creators the controls they need to tell us how much information they want to make available to Googlebot, and by extension, eligible to appear in Search, we have to make sure we get this right.

If you'd like to drop us a comment, ask us questions, or just say hi, you can find us on Twitter and in our Webmaster Community, both offline and online.

Posted by Henner Zeller, Lizzi Harvey, and Gary
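For readers who haven't looked at one recently, the files the draft covers are plain-text groups of rules; a small, hypothetical example (made-up paths and crawler name):

```
# Rules for one specific crawler
User-agent: ExampleBot
Disallow: /private/
Allow: /private/public-summary.html

# Default rules for all other crawlers
User-agent: *
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml
```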

All Links are Not Created Equal: 20 New Graphics on Google's Valuation of Links

Posted by Cyrus-Shepard

Twenty-two years ago, the founders of Google invented PageRank, and forever changed the web. A few things that made PageRank dramatically different from existing ranking algorithms:

- Links on the web count as votes. Initially, all votes are equal.
- Pages which receive more votes become more important (and rank higher).
- More important pages cast more important votes.
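A minimal sketch of that voting idea, written in Python purely for illustration (this is the textbook power-iteration form over a made-up three-page web, not Google's production algorithm):

```python
# "Links as votes": every page starts with an equal share of importance,
# then repeatedly passes its importance along its outgoing links.
links = {          # hypothetical tiny web: page -> pages it links to
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}

damping = 0.85
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}    # initially, all votes are equal

for _ in range(50):                            # iterate until the scores settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)     # important pages cast bigger votes
        for target in outlinks:
            new_rank[target] += damping * share
    rank = new_rank

print(sorted(rank.items(), key=lambda kv: -kv[1]))  # most "important" pages first
```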

But Google didn't stop there: they innovated with anchor text, topic-modeling, content analysis, trust signals, user engagement, and more to deliver better and better results.

Links are no longer equal. Not by a long shot.

Rand Fishkin published the original version of this post in 2010—and to be honest, it rocked our world. Parts of his original have been heavily borrowed here, and Rand graciously consulted on this update.

Continue reading

How to Deliver JSON-LD Recommendations the Easy Way - Whiteboard Friday

Posted by sergeystefoglo

When you work with large clients whose sites comprise thousands (or hundreds of thousands) of pages, it's a daunting task to add the necessary markup. In today's Whiteboard Friday, we welcome Sergey Stefoglo to share his framework for delivering JSON-LD recommendations in a structured and straightforward way.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hello, Moz fans. My name is Serge. I'm a consultant at Distilled, and this is another edition of Whiteboard Friday. Today I want to take the next few minutes to talk to you about one of my processes for delivering JSON-LD recommendations.

Now, it's worth noting upfront that at Distilled we work with a lot of large clients that have a lot of pages on their websites: thousands, or even hundreds of thousands, of pages. So if you work at an agency that works with local businesses or smaller clients, this process may be a bit of overkill, but I hope you find some use in it regardless.
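For anyone new to the format, JSON-LD is structured data embedded in a script tag; a minimal, hypothetical example (not one of the recommendations discussed in the video) looks like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png"
}
</script>
```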

Continue reading

Interactive Content and the Future of Live TV

Fortnite's live performances and Netflix's new ventures are evidence that younger audiences' expectations of content are changing: They don't want to just watch it, they want to experience it. What will be the next way to grab mass audiences' precious time and attention? Read the full article at MarketingProfs

Using STAT: How to Uncover Additional Value in Your Keyword Data

Posted by TheMozTeam

Changing SERP features and near-daily Google updates mean that single keyword strategies are no longer viable. Brands have a lot to keep tabs on if they want to stay visible and keep that coveted top spot on the SERP.

That’s why we asked Laura Hampton, Head of Marketing at Impression, to share some of the ways her award-winning team leverages STAT to surface all kinds of insights to make informed decisions.

Snag her expert tips on how to uncover additional value in your keyword data — including how Impression’s web team uses STAT’s API to improve client reporting, how to spot quick wins with dynamic tags, and what new projects they have up their sleeves. Take it away, Laura!

Spotting quick wins 

We all remember the traditional CTR chart. It suggests that websites ranking in position one on the SERPs can expect roughly 30 percent of the clicks available, with position two getting around 12 percent, position three seeing six percent, and so on (disclaimer: these may not be the actual numbers but, let’s face it, this formula is way outdated at this point anyway).
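As a back-of-the-envelope illustration of how such a chart gets used (plugging in the rough, admittedly outdated figures above), a click estimate looks like this:

```python
# Rough click estimate from a traditional CTR-by-position chart.
# The percentages echo the approximate figures quoted above and are
# illustrative only; real CTR varies widely by query and SERP features.
ctr_by_position = {1: 0.30, 2: 0.12, 3: 0.06}

def estimated_clicks(monthly_searches: int, position: int) -> float:
    # Assume a small residual CTR for positions beyond the chart.
    return monthly_searches * ctr_by_position.get(position, 0.02)

print(estimated_clicks(10_000, 1))  # roughly 3,000 clicks at position one
```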

Continue reading

Link Building in 2019: Get by With a Little Help From Your Friends

Posted by kelseyreaves

Editor's note: This post first appeared in December of 2015, but because SEO (and Google) changes so quickly, we figured it was time for a refresh! 

The link building world is in a constant state of evolution. New tools are continually introduced to the market, with SEOs ready to discover what works best.

In 2015, I wrote an article for Moz about how our team switched over to a new email automation tool that drastically improved our overall outreach system — we increased our email reply rates by 187 percent in just one month. Which meant that our number of attainable backlinks also drastically increased. I wanted to see what's changed since I last wrote this post. Because in 2019, you need a lot more than new tools to excel in link building.

But first...

Looking back, it was pretty ingenious: Our link building program had automated almost every step in the outreach process. We were emailing hundreds of people a week, guest posting on numerous websites, and raking in 20–30 links per week. If you've been in the game long enough, you'll know that's an insane number of links.

Continue reading

How to Monitor Hreflang Performance With Dynamic Tags in STAT

Posted by TheMozTeam

This post was originally published on the STAT blog.

If you’re familiar with hreflang, you’ll know just how essential this teensy bit of code is to a successful international campaign. You’ll also know that it comes with a boatload of moving parts — multiple pages, sections, and subdomains per country.

That’s a lot of data to track. And if you aren’t hyper-organized, it’s easy to miss out on some big time insights.

Lucky for you, there’s a handy way to track your hreflang campaigns: all you need are a few dynamic tags in STAT. And even luckier for you, Dan Nutter, Technical SEO Specialist at twentysix, agreed to share his wisdom on this very subject.
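If you're rusty on what that teensy bit of code looks like, hreflang annotations simply point search engines at the right language or country version of a page; a minimal, hypothetical example:

```html
<!-- Hypothetical hreflang annotations for UK and US versions of the same page -->
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/" />
<link rel="alternate" hreflang="en-us" href="https://example.com/us/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Multiply that by every page, section, and subdomain per country and you can see why keeping track of it all is the hard part.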

Continue reading

The Easiest PR-Focused Link Building Tip in the Book - Whiteboard Friday

Posted by randfish

Focused on new link acquisition for your clients or company? Link building is always a slog, but Rand has a PR-focused tip that makes it much easier to find people and publications that'll cover and amplify you. Check it out in this week's edition of Whiteboard Friday!

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we are talking about the easiest link building tip in the book. It is PR-focused, meaning press and public relations focused, and I'll dive right in.

If you are trying to get some new links for your new client, for your website, or for your company, start with this process. 

Continue reading

The New Moz Local Is Here! Can't-Miss Highlights & How to Get Started

Posted by MiriamEllis

Last month we announced that the new Moz Local would be arriving soon. We’re so excited — it’s here! If you're a current Moz Local customer, you may have already been exploring the new and improved platform this week! If not, signing up now will get you access to all the new goodies we have in store for you.

With any major change to a tool you use, it can take a bit for you to adjust. That’s why I wanted to write up a quick look at some of the highlights of the product, and from there encourage you to dig into our additional resources.

What are some key features to dig into?

Full location data management

More than 90% of purchases happen in physical stores. The first objective of local SEO is ensuring that people searching online for what you offer:

- Encounter your business
- Access accurate information they can trust about it
- See the signals they're looking for to choose you for a transaction

Moz Local meets this reality with active and continuous synching of location data so that you can grow your authority, visibility, and the public trust by managing your standard business information across partnered data aggregators, apps, sites, and databases. This is software centered around real-time location data and profile management, providing updates as quickly as partners can support them. And, with your authorized connection to Google and Facebook, updates you make to your business data on these two powerhouse platforms are immediate. Moz Local helps you master the online consumer encounter.

Continue reading

The Ultimate Guide to Exploring Seattle This MozCon

Posted by Kirsten_Barkved

So, you’ve been debating for years about whether to attend MozCon and you’re finally ready to pull the trigger. Or, maybe you’re still not sure if MozCon is right for you and you’re wondering what the big deal is (a fair and reasonable thought).

Whether you’re still on the fence or looking to get hyped, here’s the spiel for why you should attend this year's MozCon. And if, after seeing our awesome agenda, you're in need more than our stellar line-up and amazing donuts to convince you, then look no further than this post.

We're less than four weeks away from MozCon, so we thought we'd dust off the old "things to do while in Seattle" list. So, if you’re attending or still doing research to see if the juice is worth the squeeze (how responsible of you!), here’s a sampling of the places you'll go whilst in Seattle for MozCon this July 15–17. 

Get your tickets before they're gone!

Continue reading

Did Google's Site Diversity Update Live Up to its Promise?

Posted by Dr-Pete

On June 6th, on the heels of a core update, Google announced that a site diversity update was also rolling out. This update offered a unique opportunity, because site diversity is something we can directly measure. Did Google deliver on their promise, or was this announcement mostly a PR play?

There are a lot of ways to measure site diversity, and we're going to dive pretty deep into the data. If you can't wait, here's the short answer — while Google technically improved site diversity, the update was narrowly targeted and we had to dig to find evidence of improvement.

How did average diversity improve?

Using the 10,000-keyword MozCast set, we looked at the average diversity across page-one SERPs. Put simply, we measured how many unique sub-domains were represented within each results page. Since page one of Google can have fewer than ten organic results, this was expressed as a percentage — specifically, the ratio of unique sub-domains to total organic results on the page. Here's the 30-day graph (May 19 – June 17):

A site diversity of 90 percent on a 10-result SERP would mean that nine out of ten sub-domains were unique, with one repeat. It's hard to see, but between June 6th and 7th, average diversity did improve marginally, from 90.23 percent to 90.72 percent (an improvement of 0.49 percentage points). If we zoom in quite a bit (10x) on the Y-axis, we can see the trend over time:
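To make the metric concrete, here is the calculation described above, sketched in Python with a made-up page-one SERP:

```python
# Diversity of one page-one SERP: unique sub-domains / total organic results.
serp_subdomains = [
    "www.example.com", "blog.example.com", "www.another.org",
    "www.example.com",   # repeat: this sub-domain appears twice
    "docs.vendor.io", "www.shop.net", "news.site.com",
    "www.another.org",   # another repeat
    "forum.place.dev", "www.last.co",
]

diversity = len(set(serp_subdomains)) / len(serp_subdomains)
print(f"{diversity:.0%}")  # 80% -- eight unique sub-domains across ten results
```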

Continue reading

Bye Bye Preferred Domain setting

As we progress with the migration to the new Search Console experience, we will be saying farewell to one of our settings: the preferred domain.

It's common for a website to have the same content on multiple URLs. For example, it might have the same content on http://example.com/ as on https://www.example.com/index.html. To make things easier, when our systems recognize that, we'll pick one URL as the "canonical" for Search. You can still tell us your preference in multiple ways if there's something specific you want us to pick (see the options below). But if you don't have a preference, we'll choose the best option we find. Note that with the deprecation we will no longer use any existing Search Console preferred domain configuration.

You can find detailed explanations on how to tell us your preference in the Consolidate duplicate URLs help center article. Here are some of the options available to you:

- Use a rel="canonical" link tag on HTML pages
- Use a rel="canonical" HTTP header
- Use a sitemap
- Use 301 redirects for retired URLs

Send us any feedback either through Twitter or our forum.

Posted by Daniel Waisberg, Search Advocate
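To illustrate the first two options with placeholder URLs, a canonical hint can live either in the page's HTML or in an HTTP response header:

```html
<!-- Option 1: rel="canonical" link tag in the page's <head> -->
<link rel="canonical" href="https://www.example.com/" />
```

The HTTP header equivalent is a Link header of the form Link: <https://www.example.com/>; rel="canonical", which is useful for non-HTML resources such as PDFs.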

5 Ways You Might Mess up When Running SEO Split Tests

Posted by sam.nemzer

SEO split testing is a relatively new concept, but it’s becoming an essential tool for any SEO who wants to call themselves data-driven. People have been familiar with A/B testing in the context of Conversion Rate Optimisation (CRO) for a long time, and applying those concepts to SEO is a logical next step if you want to be confident that what you’re spending your time on is actually going to lead to more traffic.

At Distilled, we’ve been in the fortunate position of working with our own SEO A/B testing tool, which we’ve been using to test SEO recommendations for the last three years. Throughout this time, we’ve been able to hone our technique in terms of how best to set up and measure SEO split tests.

In this post, I’ll outline five mistakes that we’ve fallen victim to over the course of three years of running SEO split tests, and that we commonly see others making.

What is SEO split testing?

Before diving into how it’s done wrong (and right), it’s worth stopping for a minute to explain what SEO split testing actually is.

Continue reading

Webmaster Conference: an event made for you

Over the years, we have attended hundreds of conferences, spoken to thousands of webmasters, and recorded hundreds of hours of video to help web creators find information about how to perform better in Google Search results. Now we'd like to go further and help those who aren't able to travel internationally access the same information. Today we're officially announcing the Webmaster Conference, a series of local events around the world.

These events are primarily located where it's difficult to access search conferences or information about Google Search, or where there's a specific need for a Search event. For example, if we identify that a region has problems with hacked sites, we may organize an event focusing on that specific topic. We want web creators to have equal opportunity in Google Search regardless of their language, financial status, gender, location, or any other attribute.

The conferences are always free and easily accessible in the region where they're organized, and, based on feedback from the local communities and analyses, they're tailored for the audience that signed up for the events. That means it doesn't matter how much you already know about Google Search; the event you attend will have takeaways tailored to you. The talks will be in the local language, through interpreters in the case of international speakers, and we'll do our best to also offer sign language interpretation if requested.

Webmaster Conference Okinawa

The structure of the event varies from region to region. For example, in Okinawa, Japan, we had a wonderful half-day event with novice and advanced web creators where we focused on how to perform better in Google Images. At Webmaster Conference India and Indonesia, that might change and we may focus more on how to create faster websites. We will also host web communities in Europe and North America later this year, so keep an eye out for the announcements!

We will continue attending external events as usual; we are doing these events to complement the existing ones. If you want to learn more about our upcoming events, visit the Webmaster Conference site, which we'll update monthly, and follow our blogs and @googlewmc on Twitter!

Posted by Takeaki Kanaya and Gary

A video series on SEO myths for web developers

We invited members of the SEO and web developer community to join us for a new video series called "SEO mythbusting". In this series, we discuss various topics around SEO from a developer's perspective, how we can work to make the "SEO black box" more transparent, and what technical SEO might look like as the web keeps evolving.

We already published a few episodes:

- Web developer's 101
- A look at Googlebot
- Microformats and structured data
- JavaScript and SEO

We have a few more episodes for you and we will launch the next episodes weekly on the Google Webmasters YouTube channel, so don't forget to subscribe to stay in the loop. You can also find all published episodes in this YouTube playlist.

We look forward to hearing your feedback, topic suggestions, and guest recommendations in the YouTube comments as well as our Twitter account!

Posted by Martin Splitt, friendly web fairy & series host, WTA team

Mobile-First Indexing by default for new domains

Over the years since announcing mobile-first indexing - Google's crawling of the web using a smartphone Googlebot - our analysis has shown that new websites are generally ready for this method of crawling. Accordingly, we're happy to announce that mobile-first indexing will be enabled by default for all new, previously unknown to Google Search, websites starting July 1, 2019. It's fantastic to see that new websites are now generally showing users - and search engines - the same content on both mobile and desktop devices!

You can continue to check for mobile-first indexing of your website by using the URL Inspection Tool in Search Console. By looking at a URL on your website there, you'll quickly see how it was last crawled and indexed. For older websites, we'll continue monitoring and evaluating pages for their readiness for mobile-first indexing, and will notify them through Search Console once they're seen as being ready. Since the default state for new websites will be mobile-first indexing, there's no need to send a notification.

Using the URL Inspection Tool to check the mobile-first indexing status

Our guidance on making all websites work well for mobile-first indexing continues to be relevant, for new and existing sites. For existing websites we determine their readiness for mobile-first indexing based on parity of content (including text, images, videos, links), structured data, and other meta-data (for example, titles and descriptions, robots meta tags). We recommend double-checking these factors when a website is launched or significantly redesigned.

While we continue to support responsive web design, dynamic serving, and separate mobile URLs for mobile websites, we recommend responsive web design for new websites. Because of issues and confusion we've seen from separate mobile URLs over the years, both from search engines and users, we recommend using a single URL for both desktop and mobile websites.

Continue reading
