Optimizing for Searcher Intent Explained in 7 Visuals

Posted by randfish

Ever get that spooky feeling that Google somehow knows exactly what you mean, even when you put a barely-coherent set of words in the search box? You’re not alone. The search giant has an uncanny ability to look past the literal keywords in the search query and apply behavioral, content, context, and temporal/historical signals to give you exactly the answer you want.

For marketers and SEOs, this poses a frustrating challenge. Do we still optimize for keywords? The answer is “sort of.” But I think I can show you how to best think about this in a few quick visuals, using a single search query.

First… A short story.

I sent a tweet over the weekend about an old Whiteboard Friday video. Emily Grossman, longtime friend, all-around marketing genius, and official-introducer-of-millennial-speak-to-GenXers-like-me replied.

Emily makes fun of Rand's mustache on Twitter

Ha ha Emily. I already made fun of my own mustache so…

Anywho, I searched Google for “soz.” Not because I didn’t know what it means. I can read between the lines. I’m hip. But, you know, sometimes a Gen-Xer wants to make sure.

The results confirmed my guess, but they also helped illustrate a point of frequent frustration I have when trying to explain modern vs. classic SEO. I threw together these seven visuals to illustrate.

There you have it, friends. Classic SEO ranking inputs still matter. They can still help. They’re often the difference between making it to the top 10 vs. having no shot. But too many SEOs get locked into the idea that rankings are made up of a combination of the “Old School Five”:

  1. Keyword use
  2. Links to the page
  3. Domain authority
  4. Anchor text
  5. Freshness

Don’t get me wrong — sometimes, these signals in a powerful enough combination can overwhelm Google’s other inputs. But those examples are getting harder to find.

The three big takeaways for every marketer should be:

  1. Google is working hard to keep searchers on Google. If you help them do that, they’ll often help you rank (whether this is a worthwhile endeavor or a Prisoner’s Dilemma is another matter)
  2. When trying to reverse why something ranks in Google, add the element of “how well does this solve the searcher’s query”
  3. If you’re trying to outrank a competitor, how you align your title, meta description, first few sentences of text, and content around what the searcher truly wants can make the difference… even if you don’t win on links 😉

Related: if you want to see how hard Google’s working to keep searchers on their site vs. clicking results, I’ve got some research on SparkToro showing precisely that.

P.S. I don’t actually believe in arbitrary birth year ranges for segmenting cohorts of people. The differences between two individuals born in 1981 can be vastly wider than those between two people born in 1979 and 1985. Boomer vs. Gen X vs. Millennial vs. Gen Z is crappy pseudoscience rooted in our unhealthy desire to categorize and pigeonhole others. Reject that ish.


Five ways SEOs can utilize data with insights, automation, and personalization


Constantly evolving search results driven by Google’s increasing implementation of AI are challenging SEOs to keep pace. Search is more dynamic, competitive, and faster than ever before.

Where SEOs used to focus almost exclusively on what Google and other search engines were looking for in their site structure, links, and content, digital marketing now revolves solidly around the needs and intent of consumers.

This past year was perhaps the most transformative in SEO, an industry expected to top $80 billion in spending by 2020. AI is creating entirely new engagement possibilities across multiple channels and devices. Consumers are choosing to find and interact with information via voice search, connected IoT appliances, and other devices. As a result, brands are being challenged to reimagine the entire customer journey and how they optimize content for search.

How do you even begin to prioritize when your to-do list and the data available to you are growing at such a rapid pace? The points below are intended to help you do just that.

From analysis to activation, data is key

SEO is becoming less a matter of simply optimizing for search. Today, SEO success hinges on our ability to seize every opportunity. Research from my company’s Future of Marketing and AI Study highlights current opportunities in five important areas.

1. Data cleanliness and structure

As the volume of data consumers produce in their searches and interactions increases, it’s critically important that SEOs properly tag and structure the information we want search engines to match to those queries. Google offers rich snippets and cards that enable you to expand and enhance your search results, making them more visually appealing while also adding functionality and opportunities to engage.

Example of structured data on Google

Google has experimented with a wide variety of rich results, and you can expect them to continue evolving. Therefore, it’s best practice to properly mark up all content so that when a rich search feature becomes available, your content is in place to capitalize on the opportunity.

You can use the Google Developers “Understand how structured data works” guide to get started, and test your structured data for syntax errors with Google’s Structured Data Testing Tool.
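
As a quick illustration, here is a minimal JSON-LD snippet (the format Google recommends for structured data) marking up a hypothetical product page; the product name, image URL, and price are placeholder values:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "image": "https://www.example.com/widget.jpg",
  "offers": {
    "@type": "Offer",
    "priceCurrency": "USD",
    "price": "19.99",
    "availability": "https://schema.org/InStock"
  }
}
</script>

With markup like this in place, a product page becomes eligible for rich results that show price and availability directly in the SERP.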

2. Increasingly automated actionable insights

While Google is using AI to interpret queries and understand results, marketers are deploying AI to analyze data, recognize patterns and deliver insights as output at rates humans simply cannot achieve. AI is helping SEOs in interpreting market trends, analyzing site performance, gathering and understanding competitor performance, and more.

It’s not just that we’re able to get insights faster, though. Many of the insights available to us now would have gone unnoticed if not for the in-depth analysis we can accomplish with AI.

Machines are helping us analyze different types of media to understand the content and context of millions of images at a time, and the capability goes beyond images and video. With Google Lens, for example, augmented reality will be used to glean query intent from objects rather than expressed words.

Opportunities for SEOs include:

  • Greater ability to define the opportunity space precisely in a competitive context and understand the underlying needs in a customer journey
  • Deploying longer-tail content informed by advanced search insights
  • Better content mapping to specific expressions of consumer intent across the buying journey

3. Real-time response and interactions

In a recent “State of Chatbots” report, researchers asked consumers to identify problems with traditional online experiences by posing the question, “What frustrations have you experienced in the past month?”

Screenshot of users' feedback on website usage experiences

As you can see, at least seven of the top consumer frustrations listed above can be solved with properly programmed chatbots. It’s no wonder that they also found that 69% of consumers prefer chatbots for quick communication with brands.

Search query and online behavior data can make smart bots so compelling and efficient at delivering on consumer needs that, in some cases, the visitor may not even realize they’re dealing with an automated tool. It’s a win for the consumer, who probably isn’t there for a social visit anyway, as well as for the brand, which gets to deliver an exceptional experience while improving operational efficiency.

SEOs have an opportunity to:

  • Facilitate more productive online store consumer experiences with smart chatbots.
  • Redesign websites to support visual and voice search.
  • Deploy deep learning, where possible, to empower machines to make decisions, and respond in real-time.

4. Smart automation

SEOs have been pretty ingenious at automating repetitive, time-consuming tasks such as pulling rankings reports, backlink monitoring, and keyword research. In fact, a lot of quality digital marketing software was born out of SEOs automating their own client work.

Now, AI is enabling us to make automation smarter by moving beyond simple task completion to prioritization, decision-making, and executing new tasks based on those data-backed decisions.

Survey on content development using AI

Content marketing is one area where AI can have a massive impact, and marketers are on board. We found that just four percent of respondents felt they were unlikely to use AI/deep learning in their content strategy in 2018, and over 42% had already implemented it.

In content marketing, AI can help us quickly analyze consumer behavior and data, in order to:

  • Identify content opportunities
  • Build optimized content
  • Promote the right content to the most motivated audience segments and individuals

5. Personalizations that drive business results

Personalization was identified as the top trend in marketing at the time of our survey, followed closely by AI (which certainly drives more accurate personalizations). In fact, you could argue that the top four trends, namely personalization, AI, voice search, and mobile optimization, are closely connected, if not overlapping in places.

Across emails, landing pages, paid advertising campaigns, and more, search insights are being injected into and utilized across multiple channels, helping us better connect content to consumer needs.

Each piece of content produced must be purposeful. It needs to be optimized for discovery, a process that begins in content planning as you identify where consumers are going to find and engage with each piece. Smart content is personalized in such a way that it meets a specific consumer’s need, but it must deliver on the monetary needs of the business, as well.

Check out these 5 steps for making your content smarter from a previous column for more.

How SEOs are uniquely positioned to drive smarter digital marketing forward

As marketing professionals with one foot in analysis and the other solidly planted in creative, SEOs have a unique opportunity to lead the smart utilization and activation of all manners of consumer data.

You understand the critical importance of clean data input (or intelligent systems that can clean and make sense of unstructured data) and of differentiating between first- and third-party data. You understand economies of scale in SEO and the value of building that scalability into systems from the ground up.

SEOs have long nurtured a deep understanding of how people search for and discover information, and how technology delivers it. Make the most of your current opportunities by picking the low-hanging fruit for quick wins. Focus your efforts on putting the scalable, smart systems in place that will allow you to anticipate consumer needs, react quickly, report on SEO appropriately, and convey business results to the stakeholders who will determine budgets in the future.

Jim Yu is the founder and CEO of leading enterprise SEO and content performance platform BrightEdge. He can be found on Twitter.


A new app to map and monitor the world’s freshwater supply

Today, on World Water Day, we’re proud to showcase a new platform enabling all countries to freely measure and monitor when and where water is changing: UN’s Water-Related Ecosystems, or sdg661.app. Released last week in Nairobi at the UN Environment Assembly (UNEA), the app provides statistics for every country’s annual surface water (like lakes and rivers). It also shows changes from 1984 through 2018 via interactive maps, graphs, and full-data downloads.

This project is only possible because of the unique partnership between three very different organizations. In 2016, the European Commission’s Joint Research Centre (JRC) and Google released the Global Surface Water Explorer in tandem with a publication in “Nature.” An algorithm developed by the JRC to map water was run on Google Earth Engine. The process took more than 10 million hours of computing time, spread across more than 10,000 computers in parallel, a feat that would have taken 600 years if run on a modern desktop computer. But the sheer magnitude of the high-resolution global data product tended to limit analysis to only the most tech-savvy users and countries.

The new app, created in partnership with United Nations Environment, aims to make this water data available to everyone. Working with member countries to understand their needs, it features smaller, more easily manageable tables and maps at national and water body levels. Countries can compare data with one another and, for the first time, gain greater understanding of the effects of water policy and infrastructure, like dams, diversions, and irrigation practices, on water bodies that are shared across borders.

Ask a Techspert: Why am I getting so many spam calls?

Editor’s Note: Do you ever feel like a fish out of water? Try being a tech novice and talking to an engineer at a place like Google. Ask a Techspert is a new series on the Keyword asking Googler experts to explain complicated technology for the rest of us. This isn’t meant to be comprehensive, but just enough to make you sound smart at a dinner party.

Growing up, I was taught to say “Schottenfels residence” when answering the phone. It was the polite way of doing things. When the phone rang, it was usually family, friends and, yes, the occasional telemarketer on the other side of the line. Then things changed. Personal calls moved to mobile phones, and the landline became the domain of robocalls. My cell was a sanctuary, free of the pesky automated dialers that plague the landlines of yore. Until recently.

Today, it feels like the only phone calls I get are spam calls. And I know I’m not alone. According to a recent Google survey, half of respondents received at least one spam call per day, and one third received two or more per day.

And people are answering those calls. More than one third of respondents worry that a call from an unknown number is a call about a loved one, and another third think it could be a call from a potential loved one, so they pick up. And almost everyone agrees: Spam calls are the worst. In fact, 75 percent of those surveyed think spam calls are more annoying than spam texts or emails.

So what’s the deal with spam calls? And how can we stop them from happening? For the latest edition of Ask a Techspert, I spoke to Paul Dunlop, the product manager for the Google Phone app, to better understand why, all of a sudden, spam calls are happening so frequently, and what tools, like Pixel’s Call Screen feature, you can use to avoid the headache.

Why spam calls are more common lately

According to Paul, voice-over IP (VoIP) is the culprit. These are phone calls made using the web instead of a traditional telephone line, and today they’re cheaper and easier than ever to use. “Using VoIP technology, spammers place phone calls over the Internet and imitate a different phone number,” Paul says. “It used to be that they had a fixed number, and you could block that number. Now with VoIP, spammers have the ability to imitate any phone number.” Paul says this became possible when companies, which wanted to call customers from call centers, made it so one general 1-800 number for a business showed up on caller IDs. So what started as a common-sense solution ended up becoming an easy loophole for spammers.

This is called spoofing, and there’s nothing in phone systems—the infrastructure of telephones—that can prevent spam callers from imitating numbers. “You can actually be spammed by your own phone number,” Paul says. “But the most common is neighborhood spam, using your area code and the first three digits of your phone number, which increases the likelihood you’ll answer.”

How Pixel can help you avoid picking up spam calls

Social listening 101: Six crucial keywords to track


Social listening is a tactic that’s no longer unheard of. Quite a number of brands use it these days, and even more are considering trying it out in the near future. However, for many, the step-by-step process of social listening remains unclear.

This article aims to answer the most burning questions about social listening:

  • What is a keyword?
  • Which keywords should you monitor?
  • How do you get relevant and comprehensive results instead of all the noise that the Internet is filled with?

What is a keyword?

As we know, social listening is a process that requires a social media listening/social media monitoring tool (e.g., Awario, Mention, Brandwatch). The first thing you do when you open the app is enter keywords to monitor.

Keywords are the words that best describe what you want to find on social media platforms and the web. A keyword can be one word (e.g., “Philips”), two words (e.g., “Aleh Barysevich”), four words (e.g., “search engine optimization tool”), and so on. Each of these examples is a single keyword. Once you’ve typed in your keyword(s), the tool will search for mentions of those keywords and collect them in a single place.

Screenshot of mentions for a specific keyword

Which keywords should you monitor?

You can monitor absolutely anything. You can monitor the keywords “Brexit” or “let’s dance” or “hello, is it me you’re looking for”. However, for marketing purposes, there are six main types of keywords that you are most likely to monitor. They are:

1. Brand/company
2. Competitors
3. Person
4. Campaign
5. Industry
6. URL

Now let’s go through each type together to make sure you understand the goals behind monitoring these keywords and how to get the most out of them.

1. Brand/Company

Monitoring your brand/your company is essential in most cases. While the goals of social listening can be very diverse (reputation management, brand awareness, influencer marketing, customer service), most of these goals require listening to what people say about your brand.

To make sure you don’t miss any valuable mentions, include common misspellings and abbreviations of your brand name as well.

If your brand name is a common word (e.g., “Apple” or “Orange”), make sure to choose a tool that gives you the option to add “negative” keywords. These would be keywords such as “apple tree”, “apple juice”, and “apple pie”. Excluding them from your search will help you get mentions of Apple the brand only. Any tool that has a boolean search option will also save you from tons of such irrelevant mentions.
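
For example, in a tool that supports boolean search, a query along these lines would keep the brand mentions and drop the fruit (a sketch only; the exact operator syntax varies from tool to tool):

"Apple" AND NOT ("apple tree" OR "apple juice" OR "apple pie")

The same pattern extends to any other noisy phrases you discover as mentions come in.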

2. Competitors

Pick a couple of your main competitors (or even just one), and enter their brand/company name as a separate project. There’s a good reason for that: questions and complaints directed at your competitors can be replied to by your social media manager first. They can explain why your brand is better or doesn’t have the specific problems that your competitor does. This is social selling, the process of finding hot leads on social media.

Most social media monitoring tools also let you compare how your brand is doing on social media against your competitor’s brand. This can be useful for tracking your progress and discovering new ideas.

For example, knowing which social networks, which locations, and what time slots get your competitor more attention could help you upgrade your social media strategy. Knowing how their campaigns, social media posts, and product releases perform could help you improve your own plans, and avoid some mishaps.

3. Person

The CEO of your company might not necessarily be the company’s face, or even a public persona at all. However, if reputation management is one of your goals, monitoring mentions of the CEO is important. Their actions on social media could easily attract attention and cause a social media crisis. You’ll also know straight away about any publications that mention your company’s CEO.

Same, of course, goes for any other people in the company.

4. Campaign

It’s crucial to monitor marketing (and other) campaigns as well as product launches. Reactions on social media happen very quickly. Only by monitoring such events in real time will you know straight away whether a campaign is going well, whether it’s working at all, and whether there are problems that you might not have noticed while creating it. The earlier you know how reality is unfolding, the better. To monitor a campaign, enter its name (if it has one), its slogan, and/or its hashtag as a keyword.

Example of how social media activities could go wrong

It’s important to understand that plenty of marketing campaigns have caused serious problems for the companies behind them, problems that could’ve been avoided with social media monitoring.

5. Industry

Not every industry lends itself to monitoring so-called “industry keywords”. However, if yours does, these are a source of endless opportunities, mostly in the realms of social selling, brand awareness, and influencer marketing.

For example, if your product is a productivity app, your keyword would be “productivity app”. Include a couple of synonyms, plus phrases such as “looking for” or “can anyone recommend”, and you’ll get mentions from people who are looking for a product like yours. Specify the language and the location to get more relevant results.

With a social media monitoring tool that finds influencers, you can go to the list of influencers that is built around your industry keywords and choose the ones to work with.

Example of finding influencers using social listening keywords

6. URL

Monitoring mentions of your brand that exclude your brand’s URL (which is possible with a social media monitoring tool) is important for SEO purposes. It’s a big part of link building. All you have to do is find mentions of your brand that don’t link to your site, reach out to the author, and ask for a link. In most cases, the authors won’t mind adding the link to your site.

Besides, you can monitor competitors’ URLs. This will give you a list of the sources they get links from. It’s only logical that if an author is interested in the niche and is willing to write about your competitor, they probably wouldn’t mind reviewing your product as well.

Conclusion

There’s a lot you can do with social media monitoring. All you have to do is start. Starting is the hardest part. Then, appetite, ideas, and knowledge come with eating. Hopefully, this article gave you a clear idea of where to start.

Aleh is the Founder and CMO at SEO PowerSuite and Awario. He can be found on Twitter.


Hot off the press: Talking media with Google News Lab’s director

When I was growing up, reading the news meant thumbing through the local paper every week on my way to the Sunday comics section. These days, staying up-to-date on world events looks a little different: I skim email newsletters, scroll through social media feeds, occasionally pick up a magazine, and of course, read Google News.

As newsrooms around the world keep up with these changes, there’s one team at Google thinking about how technology can help build the future of media: the News Lab. To mark the one-year anniversary of the Google News Initiative, I sat down with News Lab’s director and cofounder, Olivia Ma, for today’s She Word interview. Here’s what I learned—straight from the source—about why Olivia set out on this career path, how she stays focused in a world where the news never sleeps and what she’s reading outside of the office.

How do you explain your job at a dinner party?

As the mother of two young kids, I don’t make it to that many dinner parties these days. But if I find myself at a table filled with adults, I’d tell them this: I lead a team at Google called News Lab that works with newsrooms across the globe to help them navigate the transition to a digital future. 

In the early days of News Lab, we focused on training journalists to use our products that helped them tell stories, such as Google Trends and Google Earth. Now, we immerse ourselves in the needs of journalists, publishers and news consumers so that our engineering teams can build better products. Every day we work to answer the question: How can technology play a role in helping newsrooms grow their audiences and build sustainable businesses?

What initially drew you to journalism?  

My dad spent his career working as a journalist at publications like Newsweek, U.S. News and World Report and The Washington Post. As a kid, my class would visit his office to learn about how magazines and newspapers were printed—the old fashioned way, with ink and paper.

It wasn’t until college that I also caught the journalism bug, and I decided to dedicate my career to tackling the tricky challenges facing the news industry. By that time, my dad had started working at The Washington Post, where he helped transition the newspaper online. Up until he passed away in 2011, we’d talk about what we thought journalism would look like in the digital age. I’m honored to continue his legacy—albeit from a different vantage point.

The One-Hour Guide to SEO, Part 2: Keyword Research – Whiteboard Friday

Posted by randfish

Before doing any SEO work, it’s important to get a handle on your keyword research. Aside from helping to inform your strategy and structure your content, you’ll get to know the needs of your searchers, the search demand landscape of the SERPs, and what kind of competition you’re up against.

In the second part of the One-Hour Guide to SEO, the inimitable Rand Fishkin covers what you need to know about the keyword research process, from understanding its goals to building your own keyword universe map. Enjoy!


Video Transcription

Howdy, Moz fans. Welcome to another portion of our special edition of Whiteboard Friday, the One-Hour Guide to SEO. This is Part II – Keyword Research. Hopefully you’ve already seen our SEO strategy session from last week. What we want to do in keyword research is talk about why keyword research is required. Why do I have to do this task prior to doing any SEO work?

The answer is fairly simple. If you don’t know which words and phrases people type into Google or YouTube or Amazon or Bing, whatever search engine you’re optimizing for, you’re not going to be able to know how to structure your content. You won’t be able to get into the searcher’s head, to imagine and empathize with what they actually want from your content. You probably won’t do correct targeting, which means your competitors, who are doing keyword research, will be choosing wise words, terms, and phrases that searchers are actually looking for, while you may unfortunately be optimizing for words and phrases that no one is looking for, that not as many people are looking for, or that are much more difficult to rank for than what you can actually achieve.

The goals of keyword research

So let’s talk about some of the big-picture goals of keyword research. 

Understand the search demand landscape so you can craft more optimal SEO strategies

First off, we are trying to understand the search demand landscape so we can craft better SEO strategies. Let me just paint a picture for you.

I was helping a startup here in Seattle, Washington, a number of years ago — this was probably a couple of years ago — called Crowd Cow. Crowd Cow is an awesome company. They basically will deliver beef from small ranchers and small farms straight to your doorstep. I personally am a big fan of steak, and I don’t really love the quality of the stuff that I can get from the store. I don’t love the mass-produced sort of industry around beef. I think there are a lot of Americans who feel that way. So working with small ranchers directly, where they’re sending it straight from their farms, is kind of an awesome thing.

But when we looked at the SEO picture for Crowd Cow, for this company, what we saw was that there was more search demand for competitors of theirs, people like Omaha Steaks, which you might have heard of. There was more search demand for them than there was for “buy steak online,” “buy beef online,” and “buy rib eye online.” Even things like just “shop for steak” or “steak online,” these broad keyword phrases, the branded terms of their competition had more search demand than all of the specific keywords, the unbranded generic keywords put together.

That is a very different picture from a world like “soccer jerseys,” where I spent a little bit of keyword research time today looking, and basically the brand names in that field do not have nearly as much search volume as the generic terms for soccer jerseys and custom soccer jerseys and football clubs’ particular jerseys. Those generic terms have much more volume, which is a totally different kind of SEO that you’re doing. One is very, “Oh, we need to build our brand. We need to go out into this marketplace and create demand.” The other one is, “Hey, we need to serve existing demand already.”

So you’ve got to understand your search demand landscape so that you can present to your executive team and your marketing team or your client or whoever it is, hey, this is what the search demand landscape looks like, and here’s what we can actually do for you. Here’s how much demand there is. Here’s what we can serve today versus we need to grow our brand.

Create a list of terms and phrases that match your marketing goals and are achievable in rankings

The next goal of keyword research is to create a list of terms and phrases that we can then use to match our marketing goals and achieve rankings. We want to make sure that the keywords we promise to try and rank for actually have real demand, and that we can actually optimize for them and potentially rank for them. Where that’s not true, because they’re too difficult to rank for, or because organic results don’t really show up in those types of searches, we should go after paid or maps or images or videos or some other type of search result.

Prioritize keyword investments so you do the most important, high-ROI work first

We also want to prioritize those keyword investments so we’re doing the most important work, the highest ROI work in our SEO universe first. There’s no point spending hours and months going after a bunch of keywords that if we had just chosen these other ones, we could have achieved much better results in a shorter period of time.

Match keywords to pages on your site to find the gaps

Finally, we want to take all the keywords that matter to us and match them to the pages on our site. If we don’t have matches, we need to create that content. If we do have matches but they are suboptimal, not doing a great job of answering that searcher’s query, well, we need to do that work as well. If we have a page that matches but we haven’t done our keyword optimization, which we’ll talk a little bit more about in a future video, we’ve got to do that too.

Understand the different varieties of search results

So an important part of understanding how search engines work — we’re going to start down here and then we’ll come back up — is to have this understanding that when you perform a query on a mobile device or a desktop device, Google shows you a vast variety of results. Ten or fifteen years ago this was not the case. We searched 15 years ago for “soccer jerseys,” what did we get? Ten blue links. I think, unfortunately, in the minds of many search marketers and many people who are unfamiliar with SEO, they still think of it that way. How do I rank number one? The answer is, well, there are a lot of things “number one” can mean today, and we need to be careful about what we’re optimizing for.

So if I search for “soccer jersey,” I get these shopping results from Macy’s and soccer.com and all these other places. Google sort of has this sliding box of sponsored shopping results. Then they’ve got advertisements below that, notated with this tiny green ad box. Then below that, there are a couple of organic results, what we would call classic SEO, 10-blue-links-style organic results. There are two of those. Then there’s a box of maps results that show me local soccer stores in my region, which is a totally different kind of optimization, local SEO. So you need to make sure that you understand, and that you can convey to everyone on your team, that these different kinds of results mean different types of SEO.

Now I’ve done some work recently over the last few years with a company called Jumpshot. They collect clickstream data from millions of browsers around the world and millions of browsers here in the United States. So they are able to provide some broad overview numbers collectively across the billions of searches that are performed on Google every day in the United States.

Click-through rates differ between mobile and desktop

The click-through rates look something like this. For mobile devices, on average, paid results get 8.7% of all clicks, organic results get a little under 40% of all clicks, and zero-click searches, where a searcher performs a query but doesn’t click anything, either because Google essentially answers it right in the results or because the searcher is so unhappy with the potential results that they don’t bother clicking anything, account for 62%. So the vast majority of searches on mobile are no-click searches.

On desktop, it’s a very different story. It’s sort of inverted. So paid is 5.6%. I think people are a little savvier about which result they should be clicking on desktop. Organic is 65%, so much, much higher than mobile. Zero-click searches is 34%, so considerably lower.

There are a lot more clicks happening on a desktop device. That being said, right now we think it’s around 60–40, meaning 60% of queries on Google, at least, happen on mobile and 40% happen on desktop, somewhere in those ranges. It might be a little higher or a little lower.

The search demand curve

Another important and critical thing to understand about the keyword research universe and how we do keyword research is that there’s a sort of search demand curve. So for any given universe of keywords, there is essentially a small number, maybe a few to a few dozen keywords that have millions or hundreds of thousands of searches every month. Something like “soccer” or “Seattle Sounders,” those have tens or hundreds of thousands, even millions of searches every month in the United States.

But for something like “Sounders FC away jersey customizable,” there are very, very few searches per month, and yet there are millions, even billions of keywords like this.

The long-tail: millions of keyword terms and phrases, low number of monthly searches

When Sundar Pichai, Google’s current CEO, was testifying before Congress just a few months ago, he told Congress that around 20% of all searches that Google receives each day they have never seen before. No one has ever performed them in the history of the search engines. I think maybe that number is closer to 18%. But that is just a remarkable sum, and it tells you about what we call the long tail of search demand, essentially tons and tons of keywords, millions or billions of keywords that are only searched for 1 time per month, 5 times per month, 10 times per month.

The chunky middle: thousands or tens of thousands of keywords with ~50–100 searches per month

If you want to get into this next layer, what we call the chunky middle in the SEO world, this is where there are thousands or tens of thousands of keywords potentially in your universe, but they only have between say 50 and a few hundred searches per month.

The fat head: a very few keywords with hundreds of thousands or millions of searches

Then this fat head has only a few keywords. There’s only one keyword like “soccer” or “soccer jersey,” which is actually probably more like the chunky middle, but it has hundreds of thousands or millions of searches. The fat head is higher competition and broader intent.

Searcher intent and keyword competition

What do I mean by broader intent? That means when someone performs a search for “soccer,” you don’t know what they’re looking for. The likelihood that they want a customizable soccer jersey right that moment is very, very small. They’re probably looking for something much broader, and it’s hard to know exactly their intent.

However, as you drift down into the chunky middle and into the long tail, where there are more keywords but fewer searches for each keyword, your competition gets much lower. There are fewer people trying to compete and rank for those, because they don’t know to optimize for them, and there’s more specific intent. “Customizable Sounders FC away jersey” is very clear. I know exactly what I want. I want to order a customizable jersey from the Seattle Sounders away, the particular colors that the away jersey has, and I want to be able to put my logo on there or my name on the back of it, what have you. So super specific intent.

Build a map of your own keyword universe

As a result, you need to figure out what the map of your universe looks like so that you can present it, and you need to be able to build a list that looks something like this at the end of the keyword research process. We featured a screenshot from Moz’s Keyword Explorer, which is a tool that I really like to use and find super helpful whenever I’m helping companies. Even now that I have left Moz and been gone for a year, I still use Keyword Explorer because the volume data is so good and it puts all the stuff together. However, there are two or three other tools that a lot of people like: one from Ahrefs, which I think also has the name Keyword Explorer, and one from SEMrush, which I like, although some of the volume numbers, at least in the United States, are not as good as what I might hope for. There are a number of other tools that you could check out as well. A lot of people like Google Trends, which is totally free and interesting for some of that broad volume data.



So I might have terms like “soccer jersey,” “Sounders FC jersey”, and “custom soccer jersey Seattle Sounders.” Then I’ll have these columns: 

  • Volume, because I want to know how many people search for it; 
  • Difficulty, how hard will it be to rank. If it’s super difficult to rank and I have a brand-new website and I don’t have a lot of authority, well, maybe I should target some of these other ones first that are lower difficulty. 
  • Organic Click-through Rate, just like we talked about above. There are different levels of click-through rate, and the tools, at least Moz’s Keyword Explorer, use Jumpshot data on a per-keyword basis to estimate what percent of people will click the organic results. Should you optimize for it? Well, if the click-through rate is only 60%, pretend that instead of 100 searches, this keyword only has 60 available searches for your organic clicks. Ninety-five percent, though? Great, awesome, almost all of those monthly searches are available to you. (A small sketch of this math follows this list.)
  • Business Value, how useful is this to your business? 
  • Then I’ll set some type of priority. I might look at this list and say, “Hey, for my new soccer jersey website, this is the most important keyword. I want to go after “custom soccer jersey” for each team in the U.S., then I’ll go after team jerseys, and then I’ll go after “customizable away jerseys.” Maybe I’ll go after “soccer jerseys” last, because it’s just so competitive and so difficult to rank for. There’s a lot of volume, but the search intent is not as great, the business value to me is not as good, all those kinds of things.
  • Last, but not least, I want to know the types of searches that appear — organic, paid. Do images show up? Does shopping show up? Does video show up? Do maps results show up? If those other types of search results, like we talked about here, show up in there, I can do SEO to appear in those places too. That could yield, in certain keyword universes, a strategy that is very image centric or very video centric, which means I’ve got to do a lot of work on YouTube, or very map centric, which means I’ve got to do a lot of local SEO, or other kinds like this.
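
To make the click-through rate adjustment concrete, here’s a minimal Python sketch of the “available organic clicks” math from the list above. The keywords, volumes, and CTR figures are invented sample numbers, not real data from any tool:

# For each keyword, discount the raw monthly volume by the estimated
# organic click-through rate to get the clicks actually available to you.
keywords = [
    # (keyword, monthly search volume, estimated organic CTR)
    ("soccer jersey", 49500, 0.60),
    ("sounders fc jersey", 880, 0.95),
    ("custom soccer jersey seattle sounders", 70, 0.97),
]

for keyword, volume, organic_ctr in keywords:
    available_clicks = volume * organic_ctr
    print(f"{keyword}: ~{available_clicks:.0f} available organic clicks/month")

You would still weigh difficulty and business value against that number, by hand or in a spreadsheet, when setting the final priority.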

Once you build a keyword research list like this, you can begin the prioritization process and the true work of creating pages, mapping the pages you already have to the keywords that you’ve got, and optimizing in order to rank. We’ll talk about that in Part III next week. Take care.

Video transcription by Speechpad.com


Managed Google Play earns key certifications for security and privacy

With managed Google Play, organizations can build a customized and secure mobile application storefront for their teams, featuring public and private applications. Organizations’ employees can take advantage of the familiarity of a mobile app store to browse and download company-approved apps.

As with any enterprise-grade platform, it’s critical that the managed Google Play Store operates with the highest standards of privacy and security. Managed Google Play has been awarded three important industry designations that are marks of meeting the strict requirements for information security management practices.

Granted by the International Organization for Standardization, ISO 27001 certification demonstrates that a company meets stringent privacy and security standards when operating an Information Security Management System (ISMS). Additionally, managed Google Play received SOC 2 and 3 reports, which are benchmarks of strict data management and privacy controls. These designations and auditing procedures are developed by the American Institute of Certified Public Accountants (AICPA).

Meeting a high bar of security management standards

To earn the ISO 27001 certification, auditors from Ernst & Young performed a thorough audit of managed Google Play based on established privacy principles. The entire methodology of documentation and procedures for managing other companies’ data is reviewed during an audit, and must be made available for regular compliance review. Companies that use managed Google Play are assured their data is managed in compliance with this industry standard. Additionally, ISO 27001 certification is in line with GDPR compliance.

Secure data management

With SOC 2 and SOC 3 reports, the focus is on controls relevant to data security, availability, processing integrity, confidentiality and privacy, which are verified through auditing reports. In managed Google Play, the data and private applications that enter Google’s systems are administered according to strict protocols, including determinations for who can view them and under what conditions. Enterprises require and receive the assurance that their information is handled with the utmost confidentiality and that the integrity of their data is preserved. For many companies, the presence of an SOC 2 and 3 report is a requirement when selecting a specific service. These reports prove that a service company has met and is abiding by best practices set forth by AICPA to ensure data security.

Our ongoing commitment to enterprise security

With managed Google Play, companies’ private apps for internal use are protected with a set of verified information security management processes and policies to ensure intellectual property is secure. This framework includes managed Google Play accounts that are used by enterprise mobility management (EMM) partners to manage devices.

Our commitment is that Android will continue to be a leader in enterprise security. As your team works across devices and shares mission-critical data through applications hosted in managed Google Play, you have the assurance of a commitment to providing your enterprise the highest standards of security and privacy.

Build your next iOS and Android app with Flutter

Supernova, a design-to-code tool, recently announced support for exporting Sketch designs directly to Flutter, allowing users of this popular design and wire-framing tool to turn their ideas directly into code.

Fast apps on each platform

Rather than introducing a layer of abstraction between your code and the underlying operating system, Flutter apps are native apps—meaning they compile directly to native code for both iOS and Android devices.

Flutter’s programming language, Dart, is designed around the needs of apps that are created for global audiences. It’s easy to learn, contains a comprehensive set of libraries and packages that reduce the amount of code you have to write and is built for developer performance. When you’re ready to release your app, you can compile your code directly to the ARM machine code of your phone—meaning what you write is exactly what appears on the device—so you can harness the full power of your phone, rather than using a language like JavaScript that needs a separate engine to run.
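
As a quick sketch of what that release step looks like in practice (assuming the Flutter SDK is installed and you’re inside a Flutter project directory):

flutter build apk --release   # Android: Dart is compiled ahead-of-time to native ARM code
flutter build ios --release   # iOS: the same ahead-of-time compilation, producing a native binary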

Robots.txt best practice guide + examples


The robots.txt file is an often overlooked and sometimes forgotten part of a website and SEO.

But nonetheless, a robots.txt file is an important part of any SEO’s toolset, whether you are just starting out in the industry or you are a chiseled SEO veteran.

What is a robots.txt file?

A robots.txt file can be used for a variety of things, from letting search engines know where to locate your site’s sitemap, to telling them which pages to crawl and not crawl, to managing your site’s crawl budget.

You might be asking yourself, “Wait a minute, what is crawl budget?” Well, crawl budget is what Google uses to effectively crawl and index your site’s pages. As big as Google is, it still has only a limited number of resources available to crawl and index your site’s content.

If your site only has a few hundred URLs, then Google should be able to easily crawl and index your site’s pages.

However, if your site is big, like an ecommerce site, for example, with thousands of pages and lots of auto-generated URLs, then Google might not crawl all of those pages and you will miss out on lots of potential traffic and visibility.

This is where prioritizing what, when, and how much to crawl becomes important.

Google has stated that “having many low-value-add URLs can negatively affect a site’s crawling and indexing.” This is where having a robots.txt file can help with the factors affecting your site’s crawl budget.

You can use the file to help manage your site’s crawl budget by making sure that search engines are spending their time on your site as efficiently as possible (especially if you have a large site), crawling only the important pages and not wasting time on pages such as login, signup, or thank-you pages.

Why do you need robots.txt?

Before a robot such as Googlebot, Bingbot, etc. crawls a webpage, it will first check to see if there is in fact a robots.txt file and, if one exists, they will usually follow and respect the directions found within that file.

A robots.txt file can be a powerful tool in any SEO’s arsenal, as it’s a great way to control how search engine crawlers/bots access certain areas of your site. Keep in mind that you need to be sure you understand how the robots.txt file works, or you will find yourself accidentally disallowing Googlebot or another bot from crawling your entire site, keeping it from being found in the search results!

But when done properly you can control such things as:

  1. Blocking access to entire sections of your site (dev and staging environments, etc.)
  2. Keeping your site’s internal search results pages from being crawled, indexed, or showing up in search results
  3. Specifying the location of your sitemap or sitemaps
  4. Optimizing crawl budget by blocking access to low-value pages (login, thank-you, shopping cart pages, etc.)
  5. Preventing certain files on your website (images, PDFs, etc.) from being indexed

Robots.txt Examples

Below are a few examples of how you can use the robots.txt file on your own site.

Allowing all web crawlers/robots access to all your site’s content:

User-agent: *
Disallow:

Blocking all web crawlers/bots from all your site’s content:

User-agent: *
Disallow: /

You can see how easy it is to make a mistake when creating your site’s robots.txt, as the difference between allowing access to your entire site and blocking all of it is a single forward slash in the disallow directive (Disallow: /).

Blocking a specific web crawler/bot from a specific folder:

User-agent: Googlebot
Disallow: /example-subfolder/

Blocking web crawlers/bots from a specific page on your site:

User-agent: *
Disallow: /thankyou.html

Exclude all robots from part of the server:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/
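
And, tying back to item 5 in the list earlier, major crawlers such as Googlebot and Bingbot support pattern matching, so a rule like the following would keep PDF files from being crawled (a sketch; wildcard support varies by crawler, so test before relying on it):

User-agent: *
Disallow: /*.pdf$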

This is an example of what the robots.txt file on theverge.com looks like:

The example file can be viewed here: www.theverge.com/robots.txt

You can see how The Verge uses its robots.txt file to specifically call out Google’s news bot, “Googlebot-News,” to make sure that it doesn’t crawl certain directories on the site.

It’s important to remember that if you want to make sure that a bot doesn’t crawl certain pages or directories on your site, you call out those pages and/or directories in “Disallow” declarations in your robots.txt file, like in the above examples.

You can review how Google handles the robots.txt file in their robots.txt specifications guide. Google has a current maximum file size limit for the robots.txt file of 500KB, so it’s important to be mindful of the size of your site’s robots.txt file.

How to create a robots.txt file

Creating a robots.txt file for your site is a fairly simple process, but it’s also easy to make a mistake. Don’t let that discourage you from creating or modifying a robots file for your site. This article from Google walks you through the robots.txt file creation process and should help you get comfortable creating your very own robots.txt file.

Once you are comfortable with creating or modifying your site’s robots file, Google has another great article that explains how to test your site’s robots.txt file to see if it is set up correctly.

Checking if you have a robots.txt file

If you are new to the robots.txt file or are not sure if your site even has one, you can do a quick check. All you need to do is go to your site’s root domain and then add /robots.txt to the end of the URL. Example: www.yoursite.com/robots.txt

If nothing shows up, then you do not have a robots.txt file for your site. Now would be the perfect time to jump in and test out creating one.

Best Practices:

  1. Make sure all important pages are crawlable, and content that won’t provide any real value if found in search is blocked
  2. Don’t block your site’s JavaScript and CSS files
  3. Always do a quick check of your file to make sure nothing has changed by accident
  4. Use proper capitalization of directory, subdirectory, and file names
  5. Place the robots.txt file in your website’s root directory for it to be found
  6. The robots.txt file is case sensitive; the file must be named “robots.txt” (no other variations)
  7. Don’t use the robots.txt file to hide private user information, as it will still be visible
  8. Add your sitemap’s location to your robots.txt file (see the example below this list)
  9. Make sure that you are not blocking any content or sections of your website that you want crawled
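
For item 8, the sitemap reference is just one extra line; the URL below is a placeholder for your own sitemap’s location:

User-agent: *
Disallow:

Sitemap: https://www.yoursite.com/sitemap.xml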

Things to keep in mind:

If you have a subdomain or multiple subdomains on your site, then you will need to have a robots.txt file on each subdomain as well as on the main root domain. This would look something like this: store.yoursite.com/robots.txt and yoursite.com/robots.txt.

As mentioned above in the best practices section, it’s important to remember not to use the robots.txt file to prevent sensitive data, such as private user information, from being crawled and appearing in the search results.

The reason is that it’s possible other pages might be linking to that information, and if there’s a direct link it will bypass the robots.txt rules and that content may still get indexed. If you need to block your pages from truly being indexed in the search results, you should use a different method, like adding password protection or adding a noindex meta tag to those pages. Google cannot log in to a password-protected site/page, so it will not be able to crawl or index those pages.
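
For reference, the noindex meta tag is a single line placed in the <head> of the page you want kept out of the index:

<meta name="robots" content="noindex">

Note that for the tag to be seen at all, the page must remain crawlable, so don’t also disallow it in robots.txt.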

Conclusion

While you might be a little nervous if you have never worked on a robots.txt file before, rest assured it is fairly simple to use and set up. Once you get comfortable with the ins and outs of the robots file, you’ll be able to enhance your site’s SEO as well as help your site’s visitors and search engine bots.

By setting up your robots.txt file the right way, you will be helping search engine bots spend their crawl budgets wisely and ensuring that they aren’t wasting time and resources crawling pages that don’t need to be crawled. This will help them organize and display your site’s content in the SERPs in the best way possible, which in turn means you’ll have more visibility.

Keep in mind that it doesn’t necessarily take a whole lot of time and effort to set up your robots.txt file. For the most part, it’s a one-time setup that you can then make little tweaks and changes to, to help better sculpt your site.

I hope the practices, tips, and suggestions described in this article give you the confidence to go out and create or tweak your site’s robots.txt file, and at the same time guide you smoothly through the process.

Michael McManus is Earned Media (SEO) Practice Lead at iProspect.
