In our first ever vlog as part of the Search Engine Roundtable vlog series, I interviewed Ryan Clutter (@RyanClutt). Ryan is the Senior SEO Analyst at Horizon Media, a full-service marketing agency with estimated billings of $8 billion and over 2,000 employees.
Note: Keep in mind that this is my first run at the vlog; I am still getting used to the camera and to being comfortable with this format. It should get a lot better over time. I did the editing on this video myself, but I think the next video, next Monday, will be edited by someone more professional. I know there are audio issues and other things I can improve, so please drop feedback in the comments on how to make these better.
In episode one I took you around the Horizon Media office, showed you some views outside the office, and then the Freedom Tower in NYC. I then asked Ryan about his history in SEO, the types of projects he works on at Horizon, and the types of clients he manages. We then talked about his day-to-day as the Senior SEO Analyst at Horizon Media and the tools he uses. Then we jumped into Google algorithm updates, E-A-T, machine learning, the diversity update, the transparency around those updates, and so on. We even talked about what he thinks is the future for our industry and SEO.
Here is the video:
You can subscribe to our YouTube channel by clicking here so you don't miss the next vlog where I interview Lily Ray of Path Interactive. I do have a nice lineup of interviews scheduled with SEOs from IBM, CBS and some of the top agencies, so you don't want to miss them - and I promise to continue to make these vlogs better over time. If you want to be interviewed, please fill out this form with your details.
You all know I love GIFs, and now Google is making it easier than ever to share GIFs that you find in Google Images. I personally cannot replicate the feature on my iPhone in any browser, but Google says it is live.
Google announced it on Twitter, saying, "Now this is something to get excited about. We're GIF-ing you a way to get your message across with a new 'Share GIFs' section on Google Images."
Here is what it looks like:
How will this help you as an SEO? I doubt it will, outside of helping you relax and smile a bit after dealing with a Google update.
Forum discussion at Twitter.
Google's John Mueller confirmed on Twitter that it may be worth looking into compressing your HTML and CSS. He said "Sometimes minifying HTML & CSS can reduce the size of the files, so it can certainly be worth looking into that," when he was asked if it can help with SEO or rankings.
Of course, it matters how bloated your pages are right now and whether Googlebot is having issues consuming the content on your pages, or consuming all the pages within your site. But overall, tweaking things so your pages load faster is a good thing, not just for SEO but even more so for your users and conversions.
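To make the minification idea concrete, here is a rough sketch in Python of what whitespace minification does to an HTML snippet. This is a naive illustration only, not what Google or any production minifier does; real tools (html-minifier, csso, and the like) are safer and far more thorough.

```python
import re

def naive_minify(markup: str) -> str:
    """Naive whitespace minifier for HTML/CSS (illustration only --
    use a real minifier such as html-minifier or csso in practice)."""
    minified = re.sub(r">\s+<", "><", markup)  # strip whitespace between tags
    minified = re.sub(r"\s+", " ", minified)   # collapse remaining runs of whitespace
    return minified.strip()

page = """
<html>
  <body>
    <h1>  Hello  </h1>
  </body>
</html>
"""
print(len(page), "->", len(naive_minify(page)))  # fewer bytes over the wire
```

The byte savings on a real template are usually modest, which matches John's hedged "can certainly be worth looking into" rather than a promise of a ranking boost.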
Here are those tweets:
Sometimes minifying HTML & CSS can reduce the size of the files, so it can certainly be worth looking into that.— John (@JohnMu) June 30, 2019
Forum discussion at Twitter.
Posted by Cyrus-Shepard
Twenty-two years ago, the founders of Google invented PageRank, and forever changed the web. A few things made PageRank dramatically different from existing ranking algorithms:

- Links on the web count as votes. Initially, all votes are equal.
- Pages which receive more votes become more important (and rank higher).
- More important pages cast more important votes.
But Google didn't stop there: they innovated with anchor text, topic-modeling, content analysis, trust signals, user engagement, and more to deliver better and better results.
Links are no longer equal. Not by a long shot.
Rand Fishkin published the original version of this post in 2010—and to be honest, it rocked our world. Parts of his original have been heavily borrowed here, and Rand graciously consulted on this update.
Posted by sergeystefoglo
When you work with large clients whose sites comprise thousands (or hundreds of thousands) of pages, it's a daunting task to add the necessary markup. In today's Whiteboard Friday, we welcome Sergey Stefoglo to share his framework for delivering JSON-LD recommendations in a structured and straightforward way.
Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription
Hello, Moz fans. My name is Serge. I'm a consultant at Distilled, and this is another edition of Whiteboard Friday. Today I want to take the next few minutes to talk to you about one of my processes for delivering JSON-LD recommendations.
Now it's worth noting upfront that at Distilled we work with a lot of large clients that have a lot of pages on their website, thousands, hundreds of thousands of pages. So if you work at an agency that works with local businesses or smaller clients, this process may be a bit overkill, but I hope you find some use in it regardless.
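Since the talk is about delivering JSON-LD recommendations at scale, a templated generator is one common way to roll markup out across thousands of pages. The sketch below is a hypothetical illustration (the field set and function name are mine, not from the talk); always check schema.org and Google's structured data documentation for the required properties of the type you're marking up.

```python
import json

def local_business_jsonld(name: str, url: str, telephone: str) -> str:
    """Build a minimal LocalBusiness JSON-LD block wrapped in the script tag
    that would be injected into each page's <head> (hypothetical sketch)."""
    data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "url": url,
        "telephone": telephone,
    }
    return '<script type="application/ld+json">%s</script>' % json.dumps(data)

print(local_business_jsonld("Example Shop", "https://example.com", "+1-555-0100"))
```

Because the markup is generated from structured fields rather than hand-written per page, a recommendation like "add telephone to all location pages" becomes a one-line template change instead of a hundred-thousand-page edit.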
Posted by TheMozTeam
Changing SERP features and near-daily Google updates mean that single keyword strategies are no longer viable. Brands have a lot to keep tabs on if they want to stay visible and keep that coveted top spot on the SERP.
That’s why we asked Laura Hampton, Head of Marketing at Impression, to share some of the ways her award-winning team leverages STAT to surface all kinds of insights to make informed decisions.
Snag her expert tips on how to uncover additional value in your keyword data — including how Impression’s web team uses STAT’s API to improve client reporting, how to spot quick wins with dynamic tags, and what new projects they have up their sleeves. Take it away, Laura!

Spotting quick wins
We all remember the traditional CTR chart. It suggests that websites ranking in position one on the SERPs can expect roughly 30 percent of the clicks available, with position two getting around 12 percent, position three seeing six percent, and so on (disclaimer: these may not be the actual numbers but, let’s face it, this formula is way outdated at this point anyway).
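For what it's worth, that traditional curve is easy to encode as a lookup when you want a back-of-the-envelope traffic estimate. The percentages below are the outdated ones quoted above, used purely for illustration; the fallback value for lower positions is my own assumption.

```python
# The "traditional" CTR curve quoted above (outdated; illustration only).
CTR_BY_POSITION = {1: 0.30, 2: 0.12, 3: 0.06}

def estimated_clicks(monthly_searches: int, position: int) -> int:
    """Estimate monthly clicks for a ranking position from a CTR lookup.
    Positions beyond the table fall back to an assumed 1 percent CTR."""
    return round(monthly_searches * CTR_BY_POSITION.get(position, 0.01))

print(estimated_clicks(10_000, 1))  # position one on a 10k-search keyword
```

The point of the formula-is-outdated disclaimer stands: SERP features siphon off clicks unevenly per query, so a single curve applied to every keyword will mislead you.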
Posted by kelseyreaves
Editor's note: This post first appeared in December of 2015, but because SEO (and Google) changes so quickly, we figured it was time for a refresh!
The link building world is in a constant state of evolution. New tools are continually introduced to the market, with SEOs ready to discover what works best.
In 2015, I wrote an article for Moz about how our team switched over to a new email automation tool that drastically improved our overall outreach system: we increased our email reply rates by 187 percent in just one month, which meant that our number of attainable backlinks also drastically increased. I wanted to see what's changed since I last wrote that post, because in 2019 you need a lot more than new tools to excel at link building.

But first...
Looking back, it was pretty ingenious: our link building program had automated almost every step of the outreach process. We were emailing hundreds of people a week, guest posting on numerous websites, and raking in 20–30 links per week. If you've been in the game long enough, you'll know that's an insane amount of links.
Posted by TheMozTeam
This post was originally published on the STAT blog.
If you’re familiar with hreflang, you’ll know just how essential this teensy bit of code is to a successful international campaign. You’ll also know that it comes with a boatload of moving parts — multiple pages, sections, and subdomains per country.
That’s a lot of data to track. And if you aren’t hyper-organized, it’s easy to miss out on some big time insights.
Lucky for you, there’s a handy way to track your hreflang campaigns: all you need are a few dynamic tags in STAT. And even luckier for you, Dan Nutter, Technical SEO Specialist at twentysix, agreed to share his wisdom on this very subject.
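One of the moving parts that makes hreflang tracking painful is the return-link requirement: every alternate URL must annotate back to the page that references it, or Google ignores the pair. Before worrying about rank tracking, a quick script can flag the broken pairs. This is a minimal sketch under an assumed input shape (a page-to-alternates mapping I've invented for illustration), not part of STAT or twentysix's workflow.

```python
def missing_return_links(hreflang_map: dict) -> list:
    """Find hreflang annotations that lack the required return link.
    hreflang_map: page URL -> {lang code: alternate URL} (assumed input shape)."""
    problems = []
    for page, alternates in hreflang_map.items():
        for lang, alt_url in alternates.items():
            # Each alternate must annotate back to the referencing page.
            if page not in hreflang_map.get(alt_url, {}).values():
                problems.append((page, alt_url))
    return problems

pages = {
    "https://example.com/en/": {"en": "https://example.com/en/",
                                "de": "https://example.com/de/"},
    "https://example.com/de/": {"de": "https://example.com/de/"},  # no en return link
}
print(missing_return_links(pages))
```

In this toy case the German page never links back to the English one, so the English-to-German annotation is reported as broken.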
Posted by randfish
Focused on new link acquisition for your clients or company? Link building is always a slog, but Rand has a PR-focused tip that makes it much easier to find people and publications that'll cover and amplify you. Check it out in this week's edition of Whiteboard Friday!
Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we are talking about the easiest link building tip in the book. It is PR-focused, meaning press and public relations focused, and I'll dive right in.
If you are trying to get some new links for your new client, for your website, or for your company, start with this process.
Posted by MiriamEllis
Last month we announced that the new Moz Local would be arriving soon. We’re so excited — it’s here! If you're a current Moz Local customer, you may have already been exploring the new and improved platform this week! If not, signing up now will get you access to all the new goodies we have in store for you.
With any major change to a tool you use, it can take a bit for you to adjust. That’s why I wanted to write up a quick look at some of the highlights of the product, and from there encourage you to dig into our additional resources.

What are some key features to dig into?

Full location data management
More than 90% of purchases happen in physical stores. The first object of local SEO is ensuring that people searching online for what you offer:

- Encounter your business
- Access accurate information they can trust about it
- See the signals they’re looking for to choose you for a transaction
Moz Local meets this reality with active and continuous syncing of location data so that you can grow your authority, visibility, and the public trust by managing your standard business information across partnered data aggregators, apps, sites, and databases. This is software centered around real-time location data and profile management, providing updates as quickly as partners can support them. And, with your authorized connection to Google and Facebook, updates you make to your business data on these two powerhouse platforms are immediate. Moz Local helps you master the online consumer encounter.
Posted by Kirsten_Barkved
So, you’ve been debating for years about whether to attend MozCon and you’re finally ready to pull the trigger. Or, maybe you’re still not sure if MozCon is right for you and you’re wondering what the big deal is (a fair and reasonable thought).
Whether you’re still on the fence or looking to get hyped, here’s the spiel on why you should attend this year's MozCon. And if, after seeing our awesome agenda, you need more than our stellar line-up and amazing donuts to convince you, then look no further than this post.
We're less than four weeks away from MozCon, so we thought we'd dust off the old "things to do while in Seattle" list. So, if you’re attending or still doing research to see if the juice is worth the squeeze (how responsible of you!), here’s a sampling of the places you'll go whilst in Seattle for MozCon this July 15–17.
Get your tickets before they're gone!
Posted by Dr-Pete
On June 6th, on the heels of a core update, Google announced that a site diversity update was also rolling out. This update offered a unique opportunity, because site diversity is something we can directly measure. Did Google deliver on their promise, or was this announcement mostly a PR play?
There are a lot of ways to measure site diversity, and we're going to dive pretty deep into the data. If you can't wait, here's the short answer — while Google technically improved site diversity, the update was narrowly targeted and we had to dig to find evidence of improvement.

How did average diversity improve?
Using the 10,000-keyword MozCast set, we looked at the average diversity across page-one SERPs. Put simply, we measured how many unique sub-domains were represented within each results page. Since page one of Google can have fewer than ten organic results, this was expressed as a percentage — specifically, the ratio of unique sub-domains to total organic results on the page. Here's the 30-day graph (May 19 – June 17):
A site diversity of 90 percent on a 10-result SERP would mean that nine out of ten sub-domains were unique, with one repeat. It's hard to see, but between June 6th and 7th, average diversity did improve marginally, from 90.23 percent to 90.72 percent (an improvement of 0.49 percentage points). If we zoom in quite a bit (10x) on the Y-axis, we can see the trend over time:
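The metric described above is straightforward to compute per SERP: count unique sub-domains and divide by the number of organic results. Here is a minimal sketch (the example URLs are invented for illustration, and this is not the MozCast codebase):

```python
from urllib.parse import urlparse

def serp_diversity(result_urls: list) -> float:
    """Ratio of unique sub-domains to total organic results on one SERP,
    matching the diversity metric described above."""
    subdomains = {urlparse(url).hostname for url in result_urls}
    return len(subdomains) / len(result_urls)

serp = [
    "https://a.example.com/page-1",
    "https://a.example.com/page-2",   # repeated sub-domain
    "https://b.example.com/",
    "https://news.example.org/story",
]
print(f"{serp_diversity(serp):.0%}")  # 3 unique hosts / 4 results = 75%
```

Averaging this ratio across all 10,000 keywords gives the daily number plotted in the graph.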
Posted by sam.nemzer
SEO split testing is a relatively new concept, but it’s becoming an essential tool for any SEO who wants to call themselves data-driven. People have been familiar with A/B testing in the context of Conversion Rate Optimisation (CRO) for a long time, and applying those concepts to SEO is a logical next step if you want to be confident that what you’re spending your time on is actually going to lead to more traffic.
At Distilled, we’ve been in the fortunate position of working with our own SEO A/B testing tool, which we’ve been using to test SEO recommendations for the last three years. Throughout this time, we’ve been able to hone our technique in terms of how best to set up and measure SEO split tests.
In this post, I’ll outline five mistakes that we’ve fallen victim to over the course of three years of running SEO split tests, and that we commonly see others making.

What is SEO split testing?
Before diving into how it’s done wrong (and right), it’s worth stopping for a minute to explain what SEO split testing actually is.
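At its core, an SEO split test divides a set of similar pages into a control group and a variant group, applies the change only to the variant, and compares organic traffic between the two. One common way to make that split stable across crawls is to hash each URL deterministically. The sketch below is my own minimal illustration of that bucketing idea, not Distilled's actual tool (which is proprietary):

```python
import hashlib

def bucket(url: str) -> str:
    """Deterministically assign a page to control or variant by hashing its
    URL, so each page lands in the same group on every run (a sketch only)."""
    digest = hashlib.sha256(url.encode("utf-8")).hexdigest()
    return "variant" if int(digest, 16) % 2 == 0 else "control"

# Hypothetical product pages split roughly 50/50 between the two groups.
urls = [f"https://example.com/product/{i}" for i in range(6)]
for u in urls:
    print(u, "->", bucket(u))
```

Hashing (rather than random assignment) matters because the groups must stay fixed for the duration of the test; re-randomizing on each measurement would contaminate both buckets.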
Over the years since announcing mobile-first indexing (Google's crawling of the web using a smartphone Googlebot), our analysis has shown that new websites are generally ready for this method of crawling. Accordingly, we're happy to announce that mobile-first indexing will be enabled by default for all new websites (those previously unknown to Google Search) starting July 1, 2019. It's fantastic to see that new websites are now generally showing users - and search engines - the same content on both mobile and desktop devices!
You can continue to check for mobile-first indexing of your website by using the URL Inspection Tool in Search Console. By inspecting a URL from your website there, you'll quickly see how it was last crawled and indexed. For older websites, we'll continue monitoring and evaluating pages for their readiness for mobile-first indexing, and will notify site owners through Search Console once their sites are seen as being ready. Since the default state for new websites will be mobile-first indexing, there's no need to send a notification.
Using the URL Inspection Tool to check the mobile-first indexing status
Our guidance on making all websites work well for mobile-first indexing continues to be relevant, for new and existing sites. For existing websites we determine their readiness for mobile-first indexing based on parity of content (including text, images, videos, links), structured data, and other meta-data (for example, titles and descriptions, robots meta tags). We recommend double-checking these factors when a website is launched or significantly redesigned.
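The parity factors listed above lend themselves to a simple automated check: extract the same fields from a desktop and a mobile render of each page and diff them. The sketch below assumes a hypothetical snapshot format of pre-extracted fields per page; it is an illustration of the idea, not a Google tool.

```python
PARITY_FIELDS = ("text", "images", "videos", "links",
                 "structured_data", "title", "description", "robots_meta")

def parity_issues(desktop: dict, mobile: dict) -> list:
    """Return the parity factors (from the guidance above) where the mobile
    snapshot of a page differs from the desktop one (assumed input shape)."""
    return [f for f in PARITY_FIELDS if desktop.get(f) != mobile.get(f)]

desktop = {"title": "Widgets", "text": "Full product copy...", "images": 12}
mobile = {"title": "Widgets", "text": "Short copy", "images": 8}
print(parity_issues(desktop, mobile))  # text and image parity differ
```

Running a check like this after a launch or redesign is one way to act on the "double-check these factors" recommendation before Googlebot finds the gaps for you.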
While we continue to support responsive web design, dynamic serving, and separate mobile URLs for mobile websites, we recommend responsive web design for new websites. Because of issues and confusion we've seen from separate mobile URLs over the years, both from search engines and users, we recommend using a single URL for both desktop and mobile websites.