Five ways SEOs can utilize data with insights, automation, and personalization

Constantly evolving search results driven by Google’s increasing implementation of AI are challenging SEOs to keep pace. Search is more dynamic, competitive, and faster than ever before.

Where SEOs used to focus almost exclusively on what Google and other search engines were looking for in their site structure, links, and content, digital marketing now revolves solidly around the needs and intent of consumers.

This past year was perhaps the most transformative in SEO, an industry expected to top $80 billion in spending by 2020. AI is creating entirely new engagement possibilities across multiple channels and devices. Consumers are choosing to find and interact with information via voice search, connected IoT appliances, and other devices. As a result, brands are being challenged to reimagine the entire customer journey and how they optimize content for search.

How do you even begin to prioritize when your to-do list and the data available to you are growing at such a rapid pace? The five points below are intended to help you do exactly that.

From analysis to activation, data is key

SEO is becoming less a matter of simply optimizing for search. Today, SEO success hinges on our ability to seize every opportunity. Research from my company’s Future of Marketing and AI Study highlights current opportunities in five important areas.

1. Data cleanliness and structure

As the volume of data consumers are producing in their searches and interactions increases, it’s critically important that SEOs properly tag and structure the information we want search engines to match to those queries. Google offers rich snippets and cards that enable you to expand and enhance your search results, making them more visually appealing but also adding functionality and opportunities to engage.

Example of structured data on Google

Google has experimented with a wide variety of rich results, and you can expect them to continue evolving. Therefore, it’s best practice to properly mark up all content so that when a rich search feature becomes available, your content is in place to capitalize on the opportunity.

You can use the Google Developers “Understand how structured data works” guide to get started, and use Google’s structured data testing tools to check your markup for syntax errors.
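Google’s rich results are built on schema.org markup, most commonly embedded as JSON-LD in a `<script type="application/ld+json">` tag. As a minimal sketch of how you might generate Article markup programmatically (the headline, author, date, and image URL below are placeholder values, not from any real page):

```python
import json

def article_jsonld(headline, author, date_published, image_url):
    """Build a minimal schema.org Article object as JSON-LD."""
    return {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
        "image": [image_url],
    }

# Serialize for embedding in a <script type="application/ld+json"> tag.
markup = json.dumps(article_jsonld(
    "Five ways SEOs can utilize data",
    "Jim Yu",
    "2019-03-22",
    "https://example.com/cover.jpg",
), indent=2)
print(markup)
```

Generating markup from templates like this, rather than hand-writing it per page, is one way to make sure every content type is consistently tagged and ready when a new rich result becomes available.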

2. Increasingly automated actionable insights

While Google is using AI to interpret queries and understand results, marketers are deploying AI to analyze data, recognize patterns, and deliver insights at rates humans simply cannot achieve. AI is helping SEOs interpret market trends, analyze site performance, gather and understand competitor performance, and more.

It’s not just that we’re able to get insights faster, though. Many of the insights available to us now would have gone unnoticed if not for the in-depth analysis we can accomplish with AI.

Machines are helping us analyze different types of media to understand the content and context of millions of images at a time, and the capability goes beyond images and video. With Google Lens, for example, augmented reality will be used to glean query intent from objects rather than typed words.

Opportunities for SEOs include:

  • Greater ability to define the opportunity space more precisely in a competitive context
  • Understanding the underlying needs in a customer journey
  • Deploying longer-tail content informed by advanced search insights
  • Better content mapping to specific expressions of consumer intent across the buying journey

3. Real-time response and interactions

In a recent “State of Chatbots” report, researchers asked consumers to identify problems with traditional online experiences by posing the question, “What frustrations have you experienced in the past month?”

Screenshot of users' feedback on website usage experiences

As you can see, at least seven of the top consumer frustrations listed above can be solved with properly programmed chatbots. It’s no wonder that they also found that 69% of consumers prefer chatbots for quick communication with brands.

Search query and online behavior data can make smart bots so compelling and efficient at delivering on consumer needs that, in some cases, the visitor may not even realize they’re dealing with an automated tool. It’s a win for the consumer, who probably isn’t there for a social visit anyway, as well as for the brand, which gets to deliver an exceptional experience while improving operational efficiency.

SEOs have an opportunity to:

  • Facilitate more productive online store consumer experiences with smart chatbots.
  • Redesign websites to support visual and voice search.
  • Deploy deep learning, where possible, to empower machines to make decisions, and respond in real-time.

4. Smart automation

SEOs have been pretty ingenious at automating repetitive, time-consuming tasks such as pulling rankings reports, backlink monitoring, and keyword research. In fact, a lot of quality digital marketing software was born out of SEOs automating their own client work.

Now, AI is enabling us to make automation smarter by moving beyond simple task completion to prioritization, decision-making, and executing new tasks based on those data-backed decisions.

Survey on content development using AI

Content marketing is one area where AI can have a massive impact, and marketers are on board. We found that just 4% of respondents felt they were unlikely to use AI/deep learning in their content strategy in 2018, while over 42% had already implemented it.

In content marketing, AI can help us quickly analyze consumer behavior and data, in order to:

  • Identify content opportunities
  • Build optimized content
  • Promote the right content to the most motivated audience segments and individuals

5. Personalizations that drive business results

Personalization was identified as the top trend in marketing at the time of our survey, followed closely by AI (which certainly drives more accurate personalization). In fact, you could argue that the top four trends (personalization, AI, voice search, and mobile optimization) are closely connected, if not overlapping in places.

Across emails, landing pages, paid advertising campaigns, and more, search insights are being injected into and utilized across multiple channels, with the aim of better connecting content to consumer needs.

Each piece of content produced must be purposeful. It needs to be optimized for discovery, a process that begins in content planning as you identify where consumers are going to find and engage with each piece. Smart content is personalized in such a way that it meets a specific consumer’s need, but it must deliver on the monetary needs of the business, as well.

Check out these 5 steps for making your content smarter from a previous column for more.

How SEOs are uniquely positioned to drive smarter digital marketing forward

As marketing professionals with one foot in analysis and the other solidly planted in creative work, SEOs have a unique opportunity to lead the smart utilization and activation of all manner of consumer data.

You understand the critical importance of clean data input (or intelligent systems that can clean and make sense of unstructured data) and of differentiating between first- and third-party data. You understand economies of scale in SEO and the value of building that scalability into systems from the ground up.

SEOs have long nurtured a deep understanding of how people search for and discover information, and how technology delivers it. Make the most of your current opportunities by picking the low-hanging fruit for quick wins. Focus your efforts on putting in place the scalable, smart systems that will allow you to anticipate consumer needs, react quickly, report on SEO appropriately, and convey business results to the stakeholders who will determine budgets in the future.

Jim Yu is the founder and CEO of leading enterprise SEO and content performance platform BrightEdge. He can be found on Twitter.

A new app to map and monitor the world’s freshwater supply

Today, on World Water Day, we’re proud to showcase a new platform enabling all countries to freely measure and monitor when and where water is changing: UN’s Water-Related Ecosystems, or sdg661.app. Released last week in Nairobi at the UN Environment Assembly (UNEA), the app provides statistics for every country’s annual surface water (like lakes and rivers). It also shows changes from 1984 through 2018 via interactive maps, graphs, and full-data downloads.

This project is only possible because of the unique partnerships between three very different organizations. In 2016, the European Commission’s Joint Research Centre (JRC) and Google released the Global Surface Water Explorer in tandem with a publication in “Nature.” An algorithm developed by the JRC to map water was run on Google Earth Engine. The process took more than 10 million hours of computing time, spread across more than 10,000 computers in parallel, a feat that would have taken 600 years if run on a modern desktop computer. But the sheer magnitude of the high-resolution global data product tended to limit analysis to only the most tech-savvy users and countries.

The new app, created in partnership with United Nations Environment, aims to make this water data available to everyone. Working with member countries to understand their needs, it features smaller, more easily manageable tables and maps at national and water body levels. Countries can compare data with one another, and for the first time gain greater understanding of the effects of water policy, and infrastructure like dams, diversions, and irrigation practices on water bodies that are shared across borders.

Ask a Techspert: Why am I getting so many spam calls?

Editor’s Note: Do you ever feel like a fish out of water? Try being a tech novice and talking to an engineer at a place like Google. Ask a Techspert is a new series on the Keyword asking Googler experts to explain complicated technology for the rest of us. This isn’t meant to be comprehensive, but just enough to make you sound smart at a dinner party.

Growing up, I was taught to say “Schottenfels residence” when answering the phone. It was the polite way of doing things. When the phone rang, it was usually family, friends and, yes, the occasional telemarketer on the other side of the line. Then things changed. Personal calls moved to mobile phones, and the landline became the domain of robocalls. My cell was a sanctuary, free of the pesky automated dialers that plague the landlines of yore. Until recently.

Today, it feels like the only phone calls I get are spam calls. And I know I’m not alone. According to a recent Google survey, half of respondents received at least one spam call per day, and one third received two or more per day.

And people are answering those calls. More than one third of respondents worry that a call from an unknown number is a call about a loved one, and another third think it could be a call from a potential loved one, so they pick up. And almost everyone agrees: Spam calls are the worst. In fact, 75 percent of those surveyed think spam calls are more annoying than spam texts or emails.

So what’s the deal with spam calls? And how can we stop them from happening? For the latest edition of Ask a Techspert, I spoke to Paul Dunlop, the product manager for the Google Phone app, to better understand why, all of a sudden, spam calls are happening so frequently, and what tools, like Pixel’s Call Screen feature, you can use to avoid the headache.

Why spam calls are more common lately

According to Paul, voice-over IP (VoIP) is the culprit. These are phone calls made using the web instead of a traditional telephone line, and today they’re cheaper and easier than ever to use. “Using VoIP technology, spammers place phone calls over the Internet and imitate a different phone number,” Paul says. “It used to be that they had a fixed number, and you could block that number. Now with VoIP, spammers have the ability to imitate any phone number.” Paul says this became possible when companies, which wanted to call customers from call centers, made it so one general 1-800 number for a business showed up on caller IDs. So what started as a common-sense solution ended up becoming an easy loophole for spammers.

This is called spoofing, and there’s nothing in phone systems—the infrastructure of telephones—that can prevent spam callers from imitating numbers. “You can actually be spammed by your own phone number,” Paul says. “But the most common is neighborhood spam, using your area code and the first three digits of your phone number, which increases the likelihood you’ll answer.”

How Pixel can help you avoid picking up spam calls

Social listening 101: Six crucial keywords to track

Social listening is no longer an unheard-of tactic. Quite a number of brands use it these days, and even more are considering trying it out in the near future. However, for many, the step-by-step process of social listening remains unclear.

This article aims to answer the most burning questions about social listening:

  • What is a keyword?
  • Which keywords should you monitor?
  • How do you get relevant and comprehensive results instead of all the noise that the Internet is filled with?

What is a keyword?

As we know, social listening is a process that requires a social media listening/monitoring tool (e.g., Awario, Mention, Brandwatch). The first thing you do when you open such an app is enter the keywords you want to monitor.

Keywords are the words that best describe what you want to find on social media platforms and the web. A keyword can be one word (e.g., “Philips”), two words (e.g., “Aleh Barysevich”), four words (e.g., “search engine optimization tool”), and so on. Each of these examples is a single keyword. After you type in your keyword(s), the tool will search for mentions of those keywords and collect them in a single place.
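At its core, this collection step is substring matching over a stream of posts. The sketch below shows the idea in a few lines of Python; the sample posts are invented, and real tools add stemming, language detection, and deduplication on top of this:

```python
def find_mentions(posts, keywords):
    """Naively collect posts that mention any of the tracked keywords.

    Matching is case-insensitive substring search; production
    monitoring tools do this at much larger scale with smarter
    text processing, but the core idea is the same.
    """
    keywords = [k.lower() for k in keywords]
    return [p for p in posts if any(k in p.lower() for k in keywords)]

# Placeholder posts standing in for a social media stream.
posts = [
    "Just tried the new search engine optimization tool, impressed!",
    "Lovely weather in Minsk today.",
    "Can anyone recommend a good SEO tool?",
]
print(find_mentions(posts, ["search engine optimization tool", "SEO tool"]))
```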

Screenshot of mentions for a specific keyword

Which keywords should you monitor?

You can monitor absolutely anything. You can monitor the keywords “Brexit” or “let’s dance” or “hello, is it me you’re looking for”. However, for marketing purposes, there are six main types of keywords that you are most likely to monitor. They are:

1. Brand/company
2. Competitors
3. Person
4. Campaign
5. Industry
6. URL

Now let’s go through each type together to make sure you understand the goals behind monitoring these keywords and how to get the most out of them.

1. Brand/Company

Monitoring your brand/your company is essential in most cases. While the goals of social listening can be very diverse (reputation management, brand awareness, influencer marketing, customer service), most of these goals require listening to what people say about your brand.

To make sure you don’t miss any valuable mentions, include common misspellings and abbreviations of your brand name as well.

If your brand name is a common word (e.g., “Apple” or “Orange”), make sure to choose a tool that lets you add “negative” keywords. These would be keywords such as “apple tree”, “apple juice”, and “apple pie”. Excluding them from your search will help you get mentions of Apple the brand only. Any tool with a Boolean search option will also save you from tons of such irrelevant mentions.
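The negative-keyword mechanic described above amounts to keeping mentions that contain your keyword but none of the exclusions. A minimal Python sketch (the sample mentions are invented for illustration):

```python
def filter_mentions(mentions, keyword, negative_keywords):
    """Keep mentions of `keyword` that contain no negative keyword.

    This mimics the "negative keyword" option described above;
    real tools generalize it into full Boolean search operators.
    """
    keyword = keyword.lower()
    negatives = [n.lower() for n in negative_keywords]
    return [
        m for m in mentions
        if keyword in m.lower()
        and not any(n in m.lower() for n in negatives)
    ]

# Placeholder mentions: only the first is about Apple the brand.
mentions = [
    "The new Apple keynote was great",
    "Grandma's apple pie recipe",
    "Planting an apple tree this spring",
]
print(filter_mentions(mentions, "apple", ["apple pie", "apple tree", "apple juice"]))
```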

2. Competitors

Pick a couple of your main competitors (or even just one), and enter their brand/company name as a separate project. There’s a good reason for that: Questions and complaints directed at your competitors can be answered by your social media manager first. They can explain why your brand is better or doesn’t have the specific problems that your competitor does. This is social selling, the process of finding hot leads on social media.

Most social media monitoring tools also let you compare how your brand is doing on social media against your competitor’s brand. This can be useful for tracking your progress and discovering new ideas.

For example, knowing which social networks, which locations, and what time slots get your competitor more attention could help you upgrade your social media strategy. Knowing how their campaigns, social media posts, and product releases perform could help you improve your own plans, and avoid some mishaps.

3. Person

The CEO of your company might not necessarily be the company’s face, or even a public persona at all. However, if reputation management is one of your goals, monitoring mentions of the CEO is important. Their actions on social media could easily attract attention and cause a social media crisis. You’ll also know straight away about any publications that mention your company’s CEO.

The same, of course, goes for any other people in the company.

4. Campaign

It’s crucial to monitor marketing (and other) campaigns as well as product launches. Reactions on social media happen very quickly. Only by monitoring such events in real time will you know straight away whether a campaign is going well, whether it’s working at all, and whether there are problems you might not have noticed while creating it. The earlier you know how reality is unfolding, the better. To monitor a campaign, enter its name (if it has one), its slogan, and/or its hashtag as a keyword.

Example of how social media activities could go wrong

It’s important to understand that plenty of marketing campaigns have caused serious problems for companies, problems that could have been avoided with social media monitoring.

5. Industry

Not in every industry can you monitor the so-called “industry keywords”. However, if you can, these are the source of endless opportunities. Most of these are in the realms of social selling, brand awareness, and influencer marketing.

For example, if your product is a productivity app, your keyword would be “productivity app”. Include a couple of synonyms and phrases such as “looking for” or “can anyone recommend”, and you’ll get mentions from people who are looking for a product like yours. Specify the language and the location to get more relevant results.

With a social media monitoring tool that finds influencers, you can go to the list of influencers that is built around your industry keywords and choose the ones to work with.

Example of finding influencers using social listening keywords

6. URL

Monitoring mentions of your brand while excluding your brand’s URL (which is possible with a social media monitoring tool) is important for SEO purposes. It’s a big part of link building. All you have to do is find mentions of your brand that don’t link to your site, reach out to the author, and ask for a link. In most cases, authors won’t mind adding a link to your site.

Besides, you can monitor competitors’ URLs. This will give you a list of sources where they get links from. It’s only logical that if the author is interested in the niche and is willing to write about your competitor, they probably wouldn’t mind reviewing your product as well.

Conclusion

There’s a lot you can do with social media monitoring. All you have to do is start. Starting is the hardest part. Then, appetite, ideas, and knowledge come with eating. Hopefully, this article gave you a clear idea of where to start.

Aleh is the Founder and CMO at SEO PowerSuite and Awario. He can be found on Twitter.

Hot off the press: Talking media with Google News Lab’s director

When I was growing up, reading the news meant thumbing through the local paper every week on my way to the Sunday comics section. These days, staying up-to-date on world events looks a little different: I skim email newsletters, scroll through social media feeds, occasionally pick up a magazine, and of course, read Google News.

As newsrooms around the world keep up with these changes, there’s one team at Google thinking about how technology can help build the future of media: the News Lab. To mark the one-year anniversary of the Google News Initiative, I sat down with News Lab’s director and cofounder, Olivia Ma, for today’s She Word interview. Here’s what I learned, straight from the source, about why Olivia set out on this career path, how she stays focused in a world where the news never sleeps, and what she’s reading outside of the office.

How do you explain your job at a dinner party?

As the mother of two young kids, I don’t make it to that many dinner parties these days. But if I find myself at a table filled with adults, I’d tell them this: I lead a team at Google called News Lab that works with newsrooms across the globe to help them navigate the transition to a digital future. 

In the early days of News Lab, we focused on training journalists to use our products that helped them tell stories, such as Google Trends and Google Earth. Now, we immerse ourselves in the needs of journalists, publishers and news consumers so that our engineering teams can build better products. Every day we work to answer the question: How can technology play a role in helping newsrooms grow their audiences and build sustainable businesses?

What initially drew you to journalism?  

My dad spent his career working as a journalist at publications like Newsweek, U.S. News and World Report, and The Washington Post. As a kid, my class would visit his office to learn how magazines and newspapers were printed the old-fashioned way, with ink and paper.

It wasn’t until college that I caught the journalism bug myself, and I decided to dedicate my career to tackling the tricky challenges facing the news industry. By that time, my dad had started working at The Washington Post, where he helped transition the newspaper online. Up until he passed away in 2011, we’d talk about what we thought journalism would look like in the digital age. I’m honored to continue his legacy, albeit from a different vantage point.

Managed Google Play earns key certifications for security and privacy

With managed Google Play, organizations can build a customized and secure mobile application storefront for their teams, featuring public and private applications. Organizations’ employees can take advantage of the familiarity of a mobile app store to browse and download company-approved apps.

As with any enterprise-grade platform, it’s critical that the managed Google Play Store operates with the highest standards of privacy and security. Managed Google Play has been awarded three important industry designations that are marks of meeting the strict requirements for information security management practices.

Granted by the International Organization for Standardization, achieving ISO 27001 certification demonstrates that a company meets stringent privacy and security standards when operating an Information Security Management System (ISMS). Additionally, managed Google Play received SOC 2 and 3 reports, which are benchmarks of strict data management and privacy controls. These designations and auditing procedures are developed by the American Institute of Certified Public Accountants (AICPA).

Meeting a high bar of security management standards

To earn the ISO 27001 certification, auditors from Ernst & Young performed a thorough audit of managed Google Play based on established privacy principles. The entire methodology of documentation and procedures for managing other companies’ data is reviewed during an audit and must be made available for regular compliance review. Companies that use managed Google Play are assured their data is managed in compliance with this industry standard. Additionally, ISO 27001 certification is in line with GDPR compliance.

Secure data management

With SOC 2 and SOC 3 reports, the focus is on controls relevant to data security, availability, processing integrity, confidentiality and privacy, which are verified through auditing reports. In managed Google Play, the data and private applications that enter Google’s systems are administered according to strict protocols, including determinations for who can view them and under what conditions. Enterprises require and receive the assurance that their information is handled with the utmost confidentiality and that the integrity of their data is preserved. For many companies, the presence of an SOC 2 and 3 report is a requirement when selecting a specific service. These reports prove that a service company has met and is abiding by best practices set forth by AICPA to ensure data security.

Our ongoing commitment to enterprise security

With managed Google Play, companies’ private apps for internal use are protected with a set of verified information security management processes and policies to ensure intellectual property is secure. This framework includes managed Google Play accounts that are used by enterprise mobility management (EMM) partners to manage devices.

Our commitment is that Android will continue to be a leader in enterprise security. As your team works across devices and shares mission-critical data through applications hosted in managed Google Play, you have the assurance of a commitment to providing your enterprise the highest standards of security and privacy.

Build your next iOS and Android app with Flutter

And Supernova, a design-to-code tool, recently announced support for exporting Sketch designs directly to Flutter, allowing users of this popular design and wire-framing tool to turn their ideas directly into code.

Fast apps on each platform

Rather than introducing a layer of abstraction between your code and the underlying operating system, Flutter apps are native apps, meaning they compile directly to native code for both iOS and Android devices.

Flutter’s programming language, Dart, is designed around the needs of apps that are created for global audiences. It’s easy to learn, contains a comprehensive set of libraries and packages that reduce the amount of code you have to write and is built for developer performance. When you’re ready to release your app, you can compile your code directly to the ARM machine code of your phone—meaning what you write is exactly what appears on the device—so you can harness the full power of your phone, rather than using a language like JavaScript that needs a separate engine to run.

Robots.txt best practice guide + examples

The robots.txt file is an often overlooked and sometimes forgotten part of a website and SEO.

Nonetheless, a robots.txt file is an important part of any SEO’s toolset, whether you are just starting out in the industry or are a seasoned SEO veteran.

What is a robots.txt file?

A robots.txt file can be used for a variety of things, from letting search engines know where to locate your site’s sitemap, to telling them which pages to crawl and not to crawl, to managing your site’s crawl budget.

You might be asking yourself, “Wait a minute, what is crawl budget?” Crawl budget is what Google uses to effectively crawl and index your site’s pages. As big as Google is, it still has only a limited number of resources available to crawl and index your site’s content.

If your site only has a few hundred URLs then Google should be able to easily crawl and index your site’s pages.

However, if your site is big, like an ecommerce site, for example, and you have thousands of pages with lots of auto-generated URLs, then Google might not crawl all of those pages and you will miss out on lots of potential traffic and visibility.

This is where prioritizing what, when, and how much to crawl becomes important.

Google has stated that “having many low-value-add URLs can negatively affect a site’s crawling and indexing.” This is where having a robots.txt file can help with the factors affecting your site’s crawl budget.

You can use the file to help manage your site’s crawl budget by making sure that search engines spend their time on your site as efficiently as possible (especially important if you have a large site), crawling only the important pages and not wasting time on pages such as login, signup, or thank-you pages.

Why do you need robots.txt?

Before a robot such as Googlebot or Bingbot crawls a webpage, it will first check whether a robots.txt file exists and, if one does, it will usually follow and respect the directives found within that file.

A robots.txt file can be a powerful tool in any SEO’s arsenal, as it’s a great way to control how search engine crawlers/bots access certain areas of your site. Keep in mind that you need to be sure you understand how the robots.txt file works, or you may find yourself accidentally disallowing Googlebot, or any other bot, from crawling your entire site and keeping it out of the search results!

But when done properly you can control such things as:

  1. Blocking access to entire sections of your site (dev and staging environments, etc.)
  2. Keeping your site’s internal search results pages from being crawled, indexed, or shown in search results
  3. Specifying the location of your sitemap or sitemaps
  4. Optimizing crawl budget by blocking access to low-value pages (login, thank-you, and shopping cart pages, etc.)
  5. Preventing certain files on your website (images, PDFs, etc.) from being indexed

Robots.txt Examples

Below are a few examples of how you can use the robots.txt file on your own site.

Allowing all web crawlers/robots access to all of your site’s content:

User-agent: *
Disallow:

Blocking all web crawlers/bots from all of your site’s content:

User-agent: *
Disallow: /

You can see how easy it is to make a mistake when creating your site’s robots.txt, as the difference between allowing your entire site and blocking it from being seen is a single forward slash in the disallow directive (Disallow: /).

Blocking a specific web crawler/bot from a specific folder:

User-agent: Googlebot
Disallow: /example-subfolder/

Blocking web crawlers/bots from a specific page on your site:

User-agent: *
Disallow: /thankyou.html

Exclude all robots from part of the server:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/
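Before deploying rules like the ones above, you can sanity-check how a well-behaved crawler will interpret them using Python’s standard urllib.robotparser module; this is a quick way to catch an accidental Disallow: / before it goes live. The domain in this sketch is a placeholder:

```python
from urllib.robotparser import RobotFileParser

# Feed the parser the same directives as the example above and ask
# whether a generic crawler may fetch specific paths.
parser = RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /cgi-bin/",
    "Disallow: /tmp/",
    "Disallow: /junk/",
])

print(parser.can_fetch("*", "https://www.yoursite.com/cgi-bin/script.cgi"))  # False
print(parser.can_fetch("*", "https://www.yoursite.com/blog/post.html"))      # True
```

You can also point the parser at a live file with `parser.set_url("https://www.yoursite.com/robots.txt")` followed by `parser.read()`.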

This is an example of what the robots.txt file on theverge.com looks like:

The example file can be viewed here: www.theverge.com/robots.txt

You can see how The Verge uses its robots.txt file to specifically call out Google’s news bot, “Googlebot-News”, and make sure that it doesn’t crawl certain directories on the site.

It’s important to remember that if you want to make sure a bot doesn’t crawl certain pages or directories on your site, you call out those pages and/or directories in the “Disallow” declarations in your robots.txt file, as in the examples above.

You can review how Google handles the robots.txt file in its robots.txt specifications guide. Google currently enforces a maximum file size limit of 500KB for robots.txt, so it’s important to be mindful of the size of your site’s file.

How to create a robots.txt file

Creating a robots.txt file for your site is a fairly simple process, but it’s also easy to make a mistake. Don’t let that discourage you from creating or modifying a robots file for your site. This article from Google walks you through the robots.txt file creation process and should help you get comfortable creating your very own robots.txt file.

Once you are comfortable with creating or modifying your site's robots file, Google has another great article that explains how to test your site's robots.txt file to see if it is set up correctly.

Checking if you have a robots.txt file

If you are new to the robots.txt file or are not sure if your site even has one, you can do a quick check. All you need to do is go to your site's root domain and add /robots.txt to the end of the URL. Example: www.yoursite.com/robots.txt

If nothing shows up, then you do not have a robots.txt file for your site. Now would be the perfect time to jump in and test out creating one.
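If you'd rather script that check, a small Python sketch can do it (here "yoursite.com" is a placeholder; substitute your own domain):

```python
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin
from urllib.request import urlopen

def robots_url(site: str) -> str:
    """Return the conventional robots.txt location for a site's root domain."""
    return urljoin(site if site.endswith("/") else site + "/", "robots.txt")

def has_robots_txt(site: str) -> bool:
    """Fetch the robots.txt URL and report whether it exists (HTTP 200)."""
    try:
        with urlopen(robots_url(site), timeout=10) as resp:
            return resp.status == 200
    except (HTTPError, URLError):
        return False

print(robots_url("https://www.yoursite.com"))
# -> https://www.yoursite.com/robots.txt
```

Note that a 404 here simply means the file is missing; crawlers treat a missing robots.txt as permission to crawl everything.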

Best Practices:

  1. Make sure all important pages are crawlable, and that content which won't provide any real value if found in search is blocked.
  2. Don't block your site's JavaScript and CSS files.
  3. Always do a quick check of your file to make sure nothing has changed by accident.
  4. Use proper capitalization for directory, subdirectory, and file names.
  5. Place the robots.txt file in your website's root directory for it to be found.
  6. The robots.txt file is case sensitive; the file must be named "robots.txt" (no other variations).
  7. Don't use the robots.txt file to hide private user information, as it will still be visible.
  8. Add your sitemap's location to your robots.txt file.
  9. Make sure that you are not blocking any content or sections of your website that you want crawled.
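Putting several of these practices together, a minimal robots.txt that blocks a couple of low-value directories (the paths here are purely illustrative) and declares the sitemap's location might look like this:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/

Sitemap: https://www.yoursite.com/sitemap.xml
```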

Things to keep in mind:

If you have a subdomain or multiple subdomains on your site, then you will need a robots.txt file on each subdomain as well as on the main root domain, for example: store.yoursite.com/robots.txt and yoursite.com/robots.txt.

As mentioned in the "Best practices" section above, it's important to remember not to use the robots.txt file to prevent sensitive data, such as private user information, from being crawled and appearing in the search results.

The reason is that other pages might link to that information, and a direct link will bypass the robots.txt rules, so that content may still get indexed. If you need to truly block your pages from being indexed in the search results, you should use a different method, such as adding password protection or a noindex meta tag to those pages. Google cannot log in to a password-protected site/page, so it will not be able to crawl or index those pages.
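The noindex approach is a single tag placed in the HTML head of each page you want kept out of the index:

```html
<!-- Tells search engines not to include this page in their results -->
<meta name="robots" content="noindex">
```

Keep in mind that for crawlers to see this tag, the page must not be blocked in robots.txt; a bot that can't crawl the page can't read the noindex instruction.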

Conclusion

While you might be a little nervous if you have never worked on a robots.txt file before, rest assured it is fairly simple to use and set up. Once you get comfortable with the ins and outs of the robots file, you'll be able to enhance your site's SEO as well as help your site's visitors and search engine bots.

By setting up your robots.txt file the right way, you will be helping search engine bots spend their crawl budgets wisely and ensuring that they aren't wasting time and resources crawling pages that don't need to be crawled. This will help them organize and display your site's content in the SERPs in the best way possible, which in turn means you'll have more visibility.

Keep in mind that it doesn't necessarily take a whole lot of time and effort to set up your robots.txt file. For the most part, it's a one-time setup, followed by little tweaks and changes to help better sculpt your site.

I hope the practices, tips, and suggestions described in this article give you the confidence to go out and create or tweak your site's robots.txt file, and at the same time guide you smoothly through the process.

Michael McManus is Earned Media (SEO) Practice Lead at iProspect.

Related reading

Russia wants to cut itself off from the global internet. Here’s what that really means.

In the next two weeks, Russia is planning to attempt something no other country has tried before. It’s going to test whether it can disconnect from the rest of the world electronically while keeping the internet running for its citizens. This means it will have to reroute all its data internally, rather than relying on servers abroad.

The test is key to a proposed “sovereign internet” law currently working its way through Russia’s government. It looks likely to be eventually voted through and signed into law by President Vladimir Putin, though it has stalled in parliament for now.

Pulling an iron curtain down over the internet is a simple idea, but don’t be fooled: it’s a fiendishly difficult technical challenge to get right. It is also going to be very expensive. The project’s initial cost has been set at $38 million by Russia’s financial watchdog, but it’s likely to require far more funding than that. One of the authors of the plan has said it’ll be more like $304 million, Bloomberg reports, but even that figure, industry experts say, won’t be enough to get the system up and running, let alone maintain it.

Not only that, but it has already proved deeply unpopular with the general public. An estimated 15,000 people took to the streets in Moscow earlier this month to protest the law, one of the biggest demonstrations in years.

Operation disconnect

So how will Russia actually disconnect itself from the global internet? “It is unclear what the ‘disconnect test’ might entail,” says Andrew Sullivan, president and CEO of the Internet Society. All we know is that if it passes, the new law will require the nation’s internet service providers (ISPs) to use only exchange points inside the country that are approved by Russia’s telecoms regulator, Roskomnadzor.

These exchange points are where internet service providers connect with each other. It’s where their cabling meets at physical locations to exchange traffic. These locations are overseen by organizations known as internet exchange providers (IXPs). Russia’s largest IXP is in Moscow, connecting cities in Russia’s east but also Riga in neighboring Latvia.

MSK-IX, as this exchange point is known, is one of the world’s largest. It connects over 500 different ISPs and handles over 140 gigabits per second of throughput during peak hours on weekdays. There are six other internet exchange points in Russia, spanning most of its 11 time zones. Many ISPs also use exchanges that are physically located in neighboring countries or that are owned by foreign companies. These would now be off limits. Once this stage is completed, it would provide Russia with a literal, physical “on/off switch” to decide whether its internet is shielded from the outside world or kept open.

What’s in a name?

As well as rerouting its ISPs, Russia will also have to unplug from the global domain name system (DNS) so traffic cannot be rerouted through any exchange points that are not inside Russia.

The DNS is basically a phone book for the internet: when you type, for example, “google.com” into your browser, your computer uses the DNS to translate this domain name into an IP address, which identifies the correct server on the internet to send the request. If one server won’t respond to a request, another will step in. Traffic behaves rather like water—it will seek any gap it can to flow through.

“The creators of the DNS wanted to create a system able to work even when bits of it stopped working, regardless of whether the decision to break parts of it was deliberate or accidental,” says Brad Karp, a computer scientist at University College London. This in-built resilience in the underlying structure of the internet will make Russia’s plan even harder to carry out.

The actual mechanics of the DNS are operated by a wide variety of organizations, but a majority of the “root servers,” which are its foundational layer, are run by groups in the US. Russia sees this as a strategic weakness and wants to create its own alternative, setting up an entire new network of its own root servers.

“An alternate DNS can be used to create an alternate reality for the majority of Russian internet users,” says Ameet Naik, an expert on internet monitoring for the software company ThousandEyes. “Whoever controls this directory controls the internet.” Thus, if Russia can create its own DNS, it will have at least a semblance of control over the internet within its borders.

This won’t be easy, says Sullivan. It will involve configuring tens of thousands of systems, and it will be difficult, if not impossible, to identify all the different access points citizens use to get online (their laptops, smartphones, iPads, and so on). Some of them will be using servers abroad, such as Google’s Public DNS, which Russia simply won’t be able to replicate—so the connection will fail when a Russian user tries to access them.

If Russia can successfully set up its own DNS infrastructure across the country and compel its ISPs to use it, then Russian users are likely not to notice, unless they try to access a website that’s censored. For example, a user trying to connect to facebook.com could be redirected to vk.com, which is a Russian social-media service with an uncanny resemblance to Facebook. 

This coming test—no official date has been given—will show us whether the necessary preparation has been done. For the West, it’s important not to underestimate the Russian state’s will, or ability, to make sure it happens.

Resilience and control

The purpose, the Kremlin says, is to make Russia’s internet independent and easier to defend against attacks from abroad. To begin with, it could help Russia resist existing sanctions from the US and the EU, and any potential future measures. It also makes sense to make the internet inside your country accessible in the event it gets physically severed from the rest of the world. For example, in 2008 there were three separate instances of major damage to the internet’s physical cabling under the sea (blamed on ships’ anchors), which cut off access for users in the Middle East, India, and Singapore. If the affected countries had been able to reroute traffic, this disruption might have been avoided.

Many observers see the move as part of Russia’s long tradition of trying to control the flow of information between citizens. Russia has already passed legislation requiring search engines to delete some results, and in 2014 it obliged social networks to store Russian users’ data on servers inside the country. It has also banned encrypted messaging apps like Telegram. Just this week, Russia’s government signed into law two new vaguely worded bills that make it a crime to “disrespect the state” or spread “fake news” online. The new plan to reroute Russian traffic is an “escalation,” says Sergey Sanovich, a Russian researcher at Stanford who specializes in online censorship. “I’d say it’s a dangerous escalation,” he adds.

Photo: demonstrators shouting and holding signs during the Free Internet rally (Associated Press)

If so, it’s an escalation that has been a long time coming. The conversation between ISPs and the security services has been going on for more than two decades, according to Keir Giles, an expert on Russian security who works for the think tank Chatham House. Security officials in Russia have always seen the internet as more of a threat than an opportunity.

“Russia wants to be able to do this while insulating itself from the consequences, by preemptively cutting itself off from global infrastructure,” Giles says.

If Russia is seeking inspiration, it need only look east. China has been terrifically successful in shaping the online experience for its citizens to its advantage. However, China decided to exert a high degree of control over the development of the internet while it was at a nascent stage. Russia was preoccupied at that time with the collapse of the Soviet Union, so it is quite late to the party. China embedded the homegrown ISP and DNS infrastructure that Russia hopes to construct way back in the early 2000s. Trying to impose this architecture retrospectively is an awful lot harder. “China took control very early on, and decided that all traffic in and out must be controlled and regulated,” says Naik.

The fallout

In contrast, Russian businesses and citizens are firmly enmeshed in the global internet and use a lot more foreign services, such as Microsoft cloud tools, than Chinese people do. It’s not yet clear what impact the disconnection will have on these, but it’s possible that if the plug is pulled on external traffic routes, Russian citizens may lose access to them. While many cloud services can “mirror” their content in different regions, none of the major cloud services (Microsoft, Google or Amazon Web Services) have data centers based in Russia. Replicating these services within Russia’s borders is not trivial and would require significant investment and time, says Naik. The coming test might be intended to address this issue, according to Sullivan.

Another potential problem is that many Russian ISPs carry traffic on behalf of other companies or ISPs, with reciprocal arrangements that they carry traffic for Russian ISPs too. If it’s done incorrectly, Russia’s plan means a “whole bunch of the traffic going in and out of Russia will just fall into a black hole,” says Naik.

If the experiment goes wrong and large parts of the internet go down in Russia, it could cost the nation’s economy dearly (disconnecting from the internet has been incredibly costly for countries that have experienced it, deliberately or otherwise). That doesn’t mean the Kremlin won’t go ahead with it anyway, Giles believes.

If it happens, don’t expect Russians to hand over their internet rights freely: as in China, it’s likely that determined, tech-savvy citizens will be able to exploit any weaknesses in the system and circumvent it. For example, during protests in Turkey, people shared ways to access the global DNS directly, thus thwarting their government’s block on social-media websites.

One recent event that may have given Russia more impetus to push forward with the plan is the hacking by the US Cyber Command of the Internet Research Agency, the infamous Russian “troll factory” that allegedly used social media to sow division in the US during the 2016 election.

“The threat is real. The number of people who access antigovernment internet content is growing,” says Kirill Gusov, a journalist and political expert in Moscow. The government controls the media and television, but the internet remains beyond its grasp. “I’d not be surprised if the FSB [the successor to the KGB] approached Putin and reported on this attack, which coincided with their desire to suppress internet freedom because they are losing control over society,” he says.

Though it’s still not clear when, if ever, the law will become a reality, the Russian government isn’t known for being flexible or responsive to public pressure. It’s far more likely to be delayed than dead.

How to Clearly Articulate What You or Your Brand Do: Clarity Consultant Steve Woodruff on Marketing Smarts [Podcast]

“You can’t read the label of the jar you’re in.” (Or, for that matter, write that label from the inside). Those words of wisdom come courtesy of Steve Woodruff, a “clarity consultant” at Clarity Fuel.

People don’t buy what they don’t understand, which is why Steve’s work is so important: He specializes in connecting people with their purpose, their message, and with other people—in order to create new business opportunities.

I invited Steve to Marketing Smarts to talk about his book Clarity Wins: Get Heard. Get Referred, and to share tips on how you can let go of the jargon and make it easy for people to understand what you do, which audiences you serve, and what makes you different.

Essentially, I wanted to ask Steve how you can proactively write the label that appears on the jar you’re in!

Here are a few highlights from our conversation:

Clearly articulating what you do (and for whom you do it) is a differentiator, because so few companies can communicate that simply (01:02): “Many companies have a vision and a mission statement. That’s way up there at the 30,000-foot level. And then, often, they have the sales messaging, the marketing messaging, the various playbooks. But there’s this missing layer—what I call the ‘keystone layer’—the Articles of Clarity, where anybody from the top of the organization down and anybody on the outside can see in clear, plain, human speech exactly what the company stands for, what their audience is, what their offerings are, and what the message is. The vast majority of companies do not have a good, clear, simple way of articulating themselves.”

Create a “verbal business card,” a 15-second explanation of what you do (01:55): “It often comes down to the first ‘moment of truth,’ when somebody asks you in a networking meeting, ‘Hey, what do you do?’ Everybody’s been asked that question. What often comes out of people’s mouths is a bunch of jargon or something that really gives no clear idea of exactly what we do and who we do it for.

“What that means is, number 1, we lose the audience if, in 15 seconds, they don’t have a clue what we do. But that also undercuts what I call ‘the second moment of truth,’ which is, If that person understands me, I can refer them to the next person. But you’ve got to give me a verbal business card, a word package, so I can have in my mind a picture.

“Then, when I run into somebody and they say, ‘We’re real worried about the legal issues of social media,’ I go, ‘Ah, that’s Kerry Gorgone; let me make a referral.’ And as we all know, referrals are the best way to get business. So the point of this book is, How do we make people and businesses ‘referral-ready’? How do we make that first ‘moment of truth’ so effective that the second ‘moment of truth’ can happen?”

To create a “verbal business card,” simplify your message using snippets, stories, and symbols (04:05): “There’s three ways to boil down our message. You use snippets, which are very short, succinct, plain statements. You use stories, because the brain is hard-wired to absorb stories, so we have to tell people stories. People will remember them. And then: symbols.

“Symbols is when we talk about…Mercedes, Walmart—all the different things that use an existing memory hook. Because if I can get a concept in your head on an existing memory hook, you’re much more inclined to remember me and be able to pass on my name than if I say ‘here are the five different bullet points.’ Nobody’s going to remember five bullet points.”

For more information, visit SteveWoodruff.com or follow Steve on Twitter: @swoodruff, and be sure to get your copy of Clarity Wins: Get Heard. Get Referred.

Steve and I talked about much more, so be sure to listen to the entire show, which you can do above, or download the mp3 and listen at your convenience. Of course, you can also subscribe to the Marketing Smarts podcast in iTunes or via RSS and never miss an episode!

This episode brought to you by GoToWebinar:

GoToWebinar makes it easy to produce engaging online events. Whether you want to connect with your prospects, customers or employees, GoToWebinar has the tools and analytics you need. Start creating interactive and educational webinars your audience will love.

Music credit: Noam Weinstein.