  • Local SEO Spam Tactics Are Working: How You Can Fight Back

    Posted by Casey_Meraz

    For years, I’ve been saying that if you have a problem with spammers in local results, you can just wait it out. I mean, if Google cared about removing spam and punishing repeat spammers, we’d see them removed quickly and often, right?

    While there are instances where spam has been removed, these fixes seem to be neither fast, permanent, nor common. In fact, they seem few and far between. So today I’m changing my tune a bit to call more attention to the spam tactics people employ that violate Google My Business terms and yet continue to win in the SERPs.

    The problems are rampant and blatant. I’ve heard and seen many instances of legitimate businesses changing their names just to rank better and faster for their keywords.

    Another problem is that Google is shutting down MapMaker at the end of March. Edits will still be allowed, but they’ll need to be made through Google Maps.

    If Google is serious about rewarding brands in local search, they need to encourage it through their local search algorithms.

    For some people, it’s gotten so bad that they’re actually suing Google. On January 13, 2017, for instance, a group of fourteen locksmiths sued Google, Yahoo, and Bing over fake spam listings, as reported by Joy Hawkins.

    While some changes — like the Possum update — seemed to have a positive impact overall, root problems (such as multiple business listings) and many other issues still exist in the local search ecosystem.

    There are also technically non-spammy ways that users manipulate Google results. Let’s look at a couple of these examples.

    It’s not all spam. Businesses are going to great lengths to manipulate results while staying within the GMB guidelines.

    Let’s look at an example of a personal injury attorney in the Denver market. Recently, I came across these results when doing a search for trial attorneys:

    Look at the #2 result listing, entitled “Denver Trial Lawyers.” I originally thought this was spam and wanted to report it, but I had to do my due diligence first.

    To start, I needed to verify that the listing was actually spam by looking at the official business name. I pulled up their website and, to my surprise, the business name in the logo is actually “Denver Trial Lawyers.”

    This intrigued me, so I decided to see if they were using a deceptive logo to advertise the business name or if this was the actual business name.

    I checked out the Colorado Secretary of State’s website and did a little digging around. After a few minutes I found the legally registered trade name through their online search portal. The formation date of this entity was 7/31/2008, so they appear to have been planning on using the name for some time.

    I also reviewed their MapMaker listing history to see when this change was made and whether it reflected the trade name registration. I saw that on October 10, 2016 the business updated their MapMaker listing to reflect the new business name.

    After all of this, I decided to take this one step further and called the business. When I did, the auto-attendant answered with “Thank you for calling Denver Trial Lawyers,” indicating that this is their legitimate business name.

    I guess that, according to the Google My Business Guidelines, this can be considered OK. They state:

    “Your name should reflect your business’ real-world name, as used consistently on your storefront, website, stationery, and as known to customers. Accurately representing your business name helps customers find your business online.”

    But what does that mean for everyone else?

    Recently, Gyi Tsakalakis also shared this beautiful screenshot on Twitter of a SERP with three businesses using their keywords in the business name:

    It seems keyword-rich business names are becoming more and more prominent because people see that they’re working.

    To play devil’s advocate, there are also businesses that legitimately sport less-than-creative names, so where do you draw the line? (Note: I’ve been following some of the above businesses for years; I can confirm they’ve changed their business names to include keywords.)

    Here’s another example

    If you look closely, you’ll find more keyword- and location-stuffed business names popping up every day.

    Here’s an interesting case of a business (also located in Denver) that might have been trying to take advantage of Near Me searches, as pointed out by Matt Lacuesta:

    Do you think this business wanted to rank for Near Me searches in Denver? Maybe it’s just a coincidence. It’s funny, nonetheless.

    How are people actively manipulating local results?

    While there are many ways to manipulate a Google My Business result, today we’re going to focus on several tactics and identify the steps you can take to help fight back.

    Tactic #1: Spammy business names

    Probably the biggest problem in Google’s algorithm is the amount of weight they put on a business name. At a high level, it makes sense that they would treat this with a lot of authority. After all, if I’m looking for a brand name, I want to find that specific brand when I’m doing a search.

    The problem is that people quickly figured out that Google gives a massive priority to businesses with keywords or locations in their business names.

    In the example below, I did a search for “Fresno Personal Injury Lawyers” and was given an exact match result, as you can see in the #2 position:

    However, when I clicked through to the website, I found it was for a firm with a different name. In this case, they blatantly spammed their listing and have been floating by with nice rankings for quite some time.

    I reported their listing a couple of times and nothing was done until I was able to escalate this. It’s important to note that the account I used to edit this listing didn’t have a lot of authority. Once an authoritative account approved my edit, it went live.

    The spam listing below has the keyword and location in the business name.

    We reported this listing using the process outlined below, but sadly the business owner noticed and changed it back within hours.

    How can you fight back against spammy business names?

    Figuring out how to fight back against people manipulating results is now your job as an SEO. In the past, some in the industry have given the acronym “SEO” a bad name due to the manipulative practices they performed. Now it’s our job to earn ourselves a better name by helping to police these issues.

    Since Google MapMaker is now disappearing, you’ll need to make edits in Google Maps directly. This is also a bit of a problem, as there’s no room to leave comments for evidence.

    Here are the steps you should take to report a listing with incorrect information:

    1. Make sure you’re signed into Google
    2. Locate the business on maps.google.com
    3. Once the business is located, open it up and look for the “Suggest an edit” option:

    4. Once you select it, you’ll be able to choose the field you want to change:
    5. Make the necessary change and then hit submit! (Don’t worry — I didn’t make the change above.)

    Now, don’t expect anything to happen right away. It can take time for changes to take place. Also, the trust level of your profile seems to play a big role in how Google evaluates these changes. Getting approval from someone with a high level of trust can make your edits go live quickly.

    Make sure you check out all of these great tips from Joy Hawkins on The Ultimate Guide to Fighting Spam on Google Maps, as well.

    Tactic #2: Fake business listings

    Another issue that we see commonly with maps spam is fake business listings. These listings are completely false businesses that black-hat SEOs build just to rank and get more leads.

    Typically we see a lot of these in the locksmith niche — it’s full of people creating fake listings. This is one of the reasons Google started doing advanced verification for locksmiths and plumbers. You can read more about that on Mike Blumenthal’s blog.

    Joy Hawkins pointed out a handy tip for identifying these listings on her blog, saying:

    “Many spammers who create tons of fake listings answer their phone with something generic like ‘Hello, locksmith’ or ‘Hello, service.’”

    I did a quick search in Denver for a plumber and it wasn’t long before I found a listing with an exact match name. Using Joy’s tips, I called the number and it was disconnected. This seemed like an illegitimate listing to me.

    Thankfully, in this case, the business wasn’t ranking highly in the search results:

    When you run into these types of listings, you’ll want to take a similar approach to the one outlined above and report the issue.

    Tactic #3: Review spam

    Review spam can come in many different forms. It’s clear that Google’s putting a lot of attention into reviews by adding sorting features and making stars more prominent. I think Google knows they can do a better job with their reviews overall, and I hope we see them take it a little bit more seriously.

    Let’s look at a few different ways that review spam appears in search results.

    Self-reviews & competitor shaming

    Pretty much every business knows they need reviews, but they have trouble getting them. One way people get them is to leave reviews on their own business.

    Recently, we saw a pretty blatant example where someone left a positive five-star review for a law firm and then five other one-star reviews for all of their competitors. You can see this below:

    Although these types of reviews are blatantly unethical, they show up every day. Google’s review and photo policies ask that you:

    “Make sure that the reviews and photos on your business listing, or those that you leave at a business you’ve visited, are honest representations of the customer experience. Those that aren’t may be removed.”

    While I’d say that this does violate the policies, figuring out which rule applies best is a little tricky. It appears to be a conflict of interest, as defined by Google’s review guidelines below:

    In this particular case, a member of our staff, Dillon Brickhouse, reached out to Google to see what they would say.

    Unfortunately, Google told Dillon that since there was no text in the review, nothing could be done. They refused to edit the review.

    And, of course, this is not an isolated case. Tim Capper recently wrote an article — “Are Google My Business Guidelines & Spam Algos Working?” — in which he identified similar situations where nothing had been done.

    How can you fight back against review spam?

    Although there will still be cases where spammy reviews are ignored until Google steps up their game, there is a process you can try for getting bad reviews removed. In fact, Google published the exact steps on their review guidelines page here.

    You can view the steps and flag a review for removal using the method below:

    What can you do if the basics don’t work?

    There are a ton of different ways to spam local listings. What can you do if you’ve reported the issue and nothing changes?

    While edits may take up to six weeks to go live, the next step involves you getting more public about the issue. The key to the success of this approach is documentation. Take screenshots, record dates, and keep a file for each issue you’re fighting. That way you can address it head-on when you finally get the appropriate exposure.

    Depending on whether or not the listing is verified, you’ll want to try posting in different forums:

    Verified listings

    If the listing you’re having trouble with is a verified listing, you’ll want to make a public post about it in the Google My Business Community forum. When posting, make sure to provide all corresponding evidence, screenshots, etc. to make the case very clear to the moderators. There’s a Spam and Policy section on the forum where you can do this.

    Unverified listings

    However, some spam listings are not verified listings. In these cases, Joy Hawkins recommends that you engage with the Local Guides Connect Forum here.

    Key takeaways

    Sadly, there’s not a lot we can do outside of the basics of reporting results, but hopefully being more proactive about it and making some noise will encourage Google to take steps in the right direction.

    1. Start being more proactive about reporting listings and reviews that ignore the guidelines. Be sure to take screenshots and record evidence.
    2. If the listings still aren’t being fixed after some time, escalate them to the Google My Business Community forum.
    3. Read Joy Hawkins’ post on The Ultimate Guide to Fighting Spam in Google Maps from start to finish.
    4. Don’t spam local results. Seriously. It’s annoying. Continually follow and stay up-to-date on the Google My Business guidelines.
    5. Lastly, don’t think the edit you made is the final say or that it’ll stay around forever. The reality is that they could come back. During testing for this post, the listing for “Doug Allen Personal Injury Attorney Colorado Springs” came back within hours based on an owner edit.

    In the future, I’m personally looking forward to seeing some major changes from Google with regards to how they rank local results and how they monitor reviews. I would love to see local penalties become as serious as manual penalties.

    How do you think Google can fight this better? What are your suggestions? Let me know in the comments below.


  • Structuring URLs for Easy Data Gathering and Maximum Efficiency

    Posted by Dom-Woodman

    Imagine you work for an e-commerce company.

    Wouldn’t it be useful to know the total organic sessions and conversions to all of your products? Every week?

    If you have access to analytics for an e-commerce company, try to generate that report now. Give it 5 minutes.

    Done?

    Or did that quick question turn out to be deceptively complicated? Did you fall into a rabbit hole of scraping and estimations?

    Not being able to easily answer that question — and others like it — is costing you thousands every year.

    Let’s jump back a step

    Every online business, whether it’s a property portal or an e-commerce store, will likely have spent hours and hours agonizing over decisions about how their website should look, feel, and be constructed.

    The biggest decision is usually this: What will we build our website with? And from there flow hundreds of smaller decisions, all the way down to which categories to have on the blog.

    Each of these decisions will generate future costs and opportunities, shaping how the business operates.

    Somewhere in this process, a URL structure will be decided on. Hopefully it will be logical, but the context in which it’s created is different from how it ends up being used.

    As a business grows, the desire for more information and better analytics grows. We hire data analysts and pay agencies thousands of dollars to go out, gather this data, and wrangle it into a useful format so that smart business decisions can be made.

    It’s too late. You’ve already wasted £1000s a year.

    It’s already too late; by this point, you’ve already created hours and hours of extra work for the people who have to analyze your data and thousands will be wasted.

    All because no one structured the URLs with data gathering in mind.

    How about an example?

    Let’s go back to the problem we talked about at the start, but go through the whole story. An e-commerce company goes to an agency and asks them to get total organic sessions to all of their product pages. They want to measure performance over time.

    Now this company was very diligent when they made their site. They’d read Moz and hired an SEO agency when they designed their website and so they’d read this piece of advice: products need to sit at the root. (E.g. mysite.com/white-t-shirt.)

    Apparently a lot of websites read this piece of advice, because with minimal searching you can find plenty of sites whose ranking product pages do sit at the root: Appleyard Flowers, Game, Tesco Direct.

    At one level it makes sense: a product might be in multiple categories (LCD & 42” TVs, for example), so you want to avoid duplicate content. Plus, if you changed the categories, you wouldn’t want to have to redirect all the products.

    But from a data gathering point of view, this is awful. Why? There is now no way in Google Analytics to select all the products unless we had the foresight to set up something earlier, like a custom dimension or content grouping. There is nothing that separates the product URLs from any other URL we might have at the root.

    How could our hypothetical data analyst get the data at this point?

    They might have to crawl all the pages on the site so they can pick them out with an HTML footprint (a particular piece of HTML on a page that identifies the template), or get an internal list from whoever owns the data in the organization. Once they’ve got all the product URLs, they’ll then have to match this data to the Google Analytics in Excel, probably with a VLOOKUP or, if the data set is too large, a database.
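
    If the analyst scripted that matching step rather than doing it in Excel, it might look something like the minimal sketch below (Python with pandas; the file names and column names are assumptions for illustration):

    ```python
    import pandas as pd

    # Hypothetical exports: a crawl-derived list of product URLs and a GA
    # landing-page report. File and column names are assumptions.
    products = pd.read_csv("product_urls.csv")    # column: landing_page
    ga = pd.read_csv("ga_landing_pages.csv")      # columns: landing_page, sessions, conversions

    # The scripted equivalent of the Excel VLOOKUP: match GA rows to known product URLs.
    report = products.merge(ga, on="landing_page", how="left").fillna(0)

    print(report[["landing_page", "sessions", "conversions"]])
    ```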

    Shoot. This is starting to sound quite expensive.

    And of course, if you want to do this analysis regularly, that list will constantly change as the range of products being sold changes, so it will need to be a scheduled crawl or an automated report. If we go the scraping route, we’re either spending regular time in Screaming Frog (which can’t be scheduled) or paying for a cloud crawler that can be. If we go the other route, we need a dev to build us an internal automated report, once we can get the resource internally.

    Wow, now this is really expensive: a couple days’ worth of dev time, or a recurring job for your SEO consultant or data analyst each week.

    This could’ve been a couple of clicks on a default report.

    If we have the foresight to put all the products in a folder called /products/, this entire lengthy process becomes one step:

    Load the landing pages report in Google Analytics and filter for URLs beginning with /products/.
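
    For comparison, here’s that single step scripted (a minimal sketch; the export file and column names are assumptions):

    ```python
    import pandas as pd

    ga = pd.read_csv("ga_landing_pages.csv")  # assumed columns: landing_page, sessions, conversions

    # One line: everything under /products/ is a product page.
    products = ga[ga["landing_page"].str.startswith("/products/")]
    print(products[["sessions", "conversions"]].sum())
    ```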

    Congratulations — you’ve just cut a couple days off your agency fee, saved valuable dev time, or gained the ability to fire your second data analyst because your first is now so damn efficient (sorry, second analysts).

    As a data analyst or SEO consultant, you continually bump into these kinds of issues, which suck up time and turn quick tasks into endless chores.

    What is unique about a URL?

    For most analytics services, the URL is the main piece of information you can use to identify a page. Google Analytics, Google Search Console, and log files usually have access to nothing but the URL, and in some cases that’s all you’ll ever get. You can never change this.

    The vast majority of site analyses require working with templates and generalizing across groups of similar pages, and you need to be able to do this by URL.

    It’s crucial.

    There’s a Jeff Bezos saying that’s appropriate here:

    “There are two types of decisions. Type 1 decisions are not reversible, and you have to be very careful making them. Type 2 decisions are like walking through a door — if you don’t like the decision, you can always go back.”

    Setting URLs is very much a Type 1 decision. As anyone in SEO knows, you really don’t want to be constantly changing URLs; it causes a lot of problems, so when they’re being set up we need to take our time.

    How should you set up your URLs?

    How do you pick good URL patterns?

    First, let’s define a good pattern. A good pattern is something which we can use to easily select a template of URLs, ideally using contains rather than any complicated regex.

    This usually means we’re talking about adding folders, because they’re easiest to find with just a contains filter, e.g. /products/, /blogs/, etc.

    We also want to keep things human-readable when possible, so we need to bear that in mind when choosing our folders.

    So where should we add folders to our URLs?

    I always ask the following two questions:

    • Will I need to group the pages in this template together?
      • If a set of pages needs grouping I need to put them in the same folder, so we can identify this by URL.
    • Are there crucial sub-groupings for this set of pages? If there are, are they mutually exclusive and how often might they change?
      • If there are common groupings I may want to make, then I should consider putting this in the URL, unless those data groupings are liable to change.

    Let’s look at a couple examples.

    Firstly, back to our product example: let’s suppose we’re setting up product URLs for a fashion e-commerce store.

    Will I need to group the products together? Yes, almost certainly. There clearly needs to be a way of grouping in the URL. We should put them in a /product/ folder.

    Within this template, how might I need to group these URLs together? The most plausible grouping for products is the product category. Let’s take a black midi dress.

    What about putting “little black dress” or “midi” as a category? Well, are they mutually exclusive? Our dress could fit in the “little black dress” category and the “midi dress” category, so that’s probably not something we should add as a folder in the URL.

    What about moving up a level and using “dress” as a category? Now that is far more suitable, if we could reasonably split all our products into:

    • Dresses
    • Tops
    • Skirts
    • Trousers
    • Jeans

    And if we were happy with having jeans and trousers separate then this might indeed be an excellent fit that would allow us to easily measure the performance of each top-level category. These also seem relatively unlikely to change and, as long as we’re happy having this type of hierarchy at the top (as opposed to, say, “season,” for example), it makes a lot of sense.

    What are some common URL patterns people should use?

    Product pages

    We’ve banged on about this enough and gone through the example above. Stick your products in a /products/ folder.

    Articles

    Applying the same rules we talked about to articles, two things jump out. The first is top-level categorization.

    For example, adding in the following folders would allow you to easily measure the top-level performance of articles:

    • Travel
    • Sports
    • News

    You should, of course, be keeping them all in a /blog/ or /guides/ etc. folder too, because you won’t want to group just by category.

    Here’s an example of all 3:

    • A bad blog article URL: example.com/this-is-an-article-name/
    • A better blog article URL: example.com/blog/this-is-an-article-name/
    • An even better blog article URL: example.com/blog/sports/this-is-an-article-name

    The second, which obeys all our rules, is author grouping, which may be well-suited to editorial sites with a large number of authors whose performance they want stats on.

    Location grouping

    Many types of websites often have category pages per location. For example:

    • Cars for sale in Manchester - /for-sale/vehicles/manchester
    • Cars for sale in Birmingham - /for-sale/vehicles/birmingham

    However, there are many different levels of location granularity. For example, here are 4 different URLs, each a more specific location than the one above it (sorry to all our non-UK readers — just run with me here):

    • Cars for sale in Suffolk - /for-sale/vehicles/suffolk
    • Cars for sale in Ipswich - /for-sale/vehicles/ipswich
    • Cars for sale in Ipswich center - /for-sale/vehicles/ipswich-center
    • Cars for sale on Lancaster road - /for-sale/vehicles/lancaster-road

    Obviously every site will have different levels of location granularity, but a grouping often missing here is the level of granularity itself in the URL. For example:

    • Cars for sale in Suffolk - /for-sale/vehicles/county/suffolk
    • Cars for sale in Ipswich - /for-sale/vehicles/town/ipswich
    • Cars for sale in Ipswich center - /for-sale/vehicles/area/ipswich-center
    • Cars for sale on Lancaster road - /for-sale/vehicles/street/lancaster-road

    This could even just be numbers (although this is less ideal because it breaks our second rule):

    • Cars for sale in Suffolk - /for-sale/vehicles/04/suffolk
    • Cars for sale in Ipswich - /for-sale/vehicles/03/ipswich
    • Cars for sale in Ipswich center - /for-sale/vehicles/02/ipswich-center
    • Cars for sale on Lancaster road - /for-sale/vehicles/01/lancaster-road

    This makes it very easy to assess and measure the performance of each layer so you can understand if it’s necessary, or if perhaps you’ve aggregated too much.
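
    As a rough illustration of that measurement, a sketch like the following could roll sessions up per granularity layer once the layer lives in the URL (pandas again; the export format and column names are assumptions):

    ```python
    import pandas as pd

    ga = pd.read_csv("ga_landing_pages.csv")  # assumed columns: landing_page, sessions

    # With URLs shaped like /for-sale/vehicles/<layer>/<location>, the third
    # path segment names the granularity layer (county, town, area, street).
    ga["layer"] = ga["landing_page"].str.split("/").str[3]

    print(ga.groupby("layer")["sessions"].sum().sort_values(ascending=False))
    ```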

    What other good (or bad) examples of this has the community come across? Let’s hear it!


  • The 6 Values (and 4 Benefits) of Agile Marketing - Whiteboard Friday

    Posted by AgileJim

    You’ve probably heard of agile processes in regards to software development. But did you know those same key values can have a huge impact if applied to marketing, as well? Being adaptive, collaborative, and iterative are necessary skills when we live in a world where Google can pull the rug out from under us at a moment’s notice.

    In today’s Whiteboard Friday, we welcome guest host Jim Ewel, founder of AgileMarketing.net, as he describes what’s important in the agile marketing process and why incorporating it into your own work is beneficial.

    Click on the whiteboard image above to open a high-resolution version in a new tab!

    Video Transcription

    Hey, Moz fans, this is Jim Ewel. I’m the blogger behind AgileMarketing.net, the leading blog on agile marketing, and I’m here to talk to you today about agile marketing.

    Agile marketing is an approach to marketing that takes its inspiration from agile software development. Like agile software development, it has a set of values and it has a set of benefits, and we’re going to talk about those values and benefits today.

    6 Values of Agile Marketing

    Value number one: Responding to change over following a plan.

    It’s not that we don’t plan. It’s just that we don’t write 30- to 40-page marketing plans. Instead, every quarter, we write a one-page plan that specifies our goals, our aspirations to get everybody on the same page, and then every two to four weeks, we reset our priorities. We say, “This is what we’re going to get done during this two- to four-week period.”

    Value number two: Rapid iterations over “big bang” campaigns.

    In traditional marketing, we get together in a room and we say, “We’re going to run a campaign for three to six months to a year.”

    We hash out the idea of what we’re going to do for that campaign. Then we communicate it to the agency. They come up with creative. They review it with us. We go back and forth, and eventually we run that campaign for three to six months. And you know what happens at the end of that campaign? We always declare victory, because we’ve spent so much money and time on it that every time, we say, “It worked.”

    Well, we take a very different approach in agile marketing. We take an iterative approach. We start out with a little strategy. We meet for half an hour or an hour to figure out what we think might work. Then we figure out how to test it. We measure the results and, this is very important, we document the learning.

    If we test something out and it doesn’t work, that’s okay, because we’ve learned something. We’ve learned what doesn’t work. So then we iterate again and try something else, and we get that cycle going in a very effective way.

    Value number three: Testing and data over opinions and conventions

    Here, again, the importance is that we’re not following the highest-paid person’s opinion. No HiPPOs. It’s all about: “Did we test it? Do we have data? Do we have the right metrics?” It’s important to select the right metrics and not vanity metrics, which make us feel good, but don’t really result in an improvement to the business.

    Value number four: Many small experiments over a few big bets

    And I like to talk here about the 70:20:10 rule. The idea behind the 70:20:10 rule is that we spend 70% of our budget and 50% of our time on the things that we know work. We do it broadly across all our audiences.

    We then spend 20% of our budget and 25% of our time modifying the things that we know work and trying to improve them. Maybe we distribute it in a slightly different way, or we modify the content or what the page looks like. But, anyway, we’re trying to improve that content.

    And the last 10% of our budget and 25% of our time we spend on wild ideas, things where we fully expect that only about 2 or 3 out of 10 will really work. We focus that time on the creative, wild ideas that will become the future 70% and 20%.

    Value number five: Individuals and interactions over one-size-fits-all

    Now, I like to think about this in terms of one of the experiences that I have with SEO. I get a lot of requests for link building, and a lot of the requests that I get are form requests. They write me a little message that they’re writing to hundreds of other people, and I don’t pay any attention to those requests.

    I’m looking for somebody who really knows that I’m writing a blog about agile marketing, who’s interacting with me, who maybe says something about a post that I put on Agile Marketing, and those people are the ones that I’m going to give my business to, in effect, and I’m going to do some link building with them. Same thing applies to all of our marketing.

    Value number six: Collaboration over hierarchy and silos

    One of the key things in many marketing organizations is that different silos of the organization don’t seem to talk to each other. Maybe marketing isn’t talking to sales, or marketing hasn’t got the ear of senior management.

    Well, one of the things we do in agile marketing is we put some processes in place to make sure that all of those groups are collaborating. They’re setting the priorities together, and they’re reviewing the results together.

    4 Benefits of Agile Marketing

    As a result of these six values, there are four important benefits to agile marketing.

    I. The first is that you can get more done

    I’ve taught a lot of teams agile marketing, and, as a whole, they tell me that they get about 30% to 40% more done with agile marketing. I had one team tell me they got 400% more done, but that’s not typical. So they’re getting more done, and they’re getting more done because they’re not doing rework and they’re working on the right priorities.

    II. Getting the right things done

    Because you’re working with sales, you’re working with senior management to set the priorities, you’re making sure with agile marketing that you’re getting the right things done, and that’s important.

    III. Adapting to change

    Part of our life today in marketing is that things change. We know that Google is going to change their PageRank algorithm in 2017. We don’t know exactly how, but we know it’s going to happen, and we need to be able to adapt to that change quickly and accurately, and we put processes in place in agile marketing to make sure that happens.

    IV. Improved communications

    Improved communications both within the marketing team and, probably even more important, outside the marketing team to sales and senior management.

    By representing what we’re getting done on something like a Kanban board, everybody can see exactly what marketing is working on, where it’s at, and what they’re getting done.

    So that’s agile marketing in a nutshell. I’d love to hear your comments, and thanks for watching.

    Video transcription by Speechpad.com


  • Your Daily SEO Fix: Keywords, Concepts, Page Optimization, and Happy NAPs

    Posted by FeliciaCrawford

    Howdy, readers! We’re back with our last round of videos for this go of the Daily SEO Fix series. To recap, here are the other topics we’ve covered previously:

    Today we’ll be delving into more keyword and concept research, quick wins for on-page optimization, and a neat way to stay abreast of duplicates and inaccuracies in your local listings. We use Moz Pro, the MozBar, and Moz Local in this week’s fixes.


    Fix #1: Grouping and analyzing keywords by label to judge how well you’re targeting a concept

    The idea of “concepts over keywords” has been around for a little while now, but tracking rankings for a concept isn’t quite as straightforward as it is for keywords. In this fix, Kristina shows you how to label groups of keywords to track and sort their rankings in Moz Pro so you can easily see how you’re ranking for grouped terms, chopping and analyzing the data as you see fit.


    Fix #2: Adding alternate NAP details to uncover and clean up duplicate or inaccurate listings

    If you work in local SEO, you know how important it is for listings to have an accurate NAP (name, address, phone number). When those details change for a business, it can wreak absolute havoc and confuse potential searchers. Jordan walks you through adding alternate NAP details in Moz Local to make sure you uncover and clean up old and/or duplicate listings, making closure requests a breeze. (This Whiteboard Friday is an excellent explanation of why that’s really important; I like it so much that I link to it in the resources below, too. ;)

    Remember, you can always use the free Check Listing tool to see how your local listings and NAP are popping up on search engines:

    Is my NAP accurate?


    Fix #3: Research keywords and concepts to fuel content suggestions — on the fly

    You’re already spying on your competitors’ sites; you might as well do some keyword research at the same time, right? Chiaryn walks you through how to use MozBar to get keyword and content suggestions and discover how highly ranking competitor sites are using those terms. (Plus a cameo from Lettie Pickles, star of our 2015 Happy Holidays post!)


    Fix #4: Discover whether your pages are well-optimized as you browse — then fix them with these suggestions

    A fine accompaniment to your on-the-go keyword research is on-the-go on-page optimization. (Try saying that five times fast.) Janisha gives you the low-down on how to check whether a page is well-optimized for a keyword and identify which fixes you should make (and how to prioritize them) using the SEO tool bar.


    Further reading & fond farewells

    I’ve got a whole passel of links if you’re interested in reading more educational content around these topics. And by “reading,” I mean “watching,” because I really stacked the deck with Whiteboard Fridays this time. Here you are:

    And of course, if you need a better handle on all this SEO stuff and reading blog posts just doesn’t cut the mustard, we now offer classes that cover all the essentials.

    My sincere thanks to all of you tuning in to check out our Daily SEO Fix video series over the past couple of weeks — it’s been fun writing to you and hearing from you in the comments! Be sure to keep those ideas and questions comin’ — we’re listening.


  • How to Do a Content Audit [Updated for 2017]

    Posted by Everett

    This guide provides instructions on how to do a content audit using examples and screenshots from Screaming Frog, URL Profiler, Google Analytics (GA), and Excel, as those seem to be the most widely used and versatile tools for performing content audits.

    {Expand for more background}


    What is a content audit?

    A content audit for the purpose of SEO includes a full inventory of all indexable content on a domain, which is then analyzed using performance metrics from a variety of sources to determine which content to keep as-is, which to improve, and which to remove or consolidate.

    What is the purpose of a content audit?

    A content audit can have many purposes and desired outcomes. In terms of SEO, they are often used to determine the following:

    • How to escape a content-related search engine ranking filter or penalty
    • Content that requires copywriting/editing for improved quality
    • Content that needs to be updated and made more current
    • Content that should be consolidated due to overlapping topics
    • Content that should be removed from the site
    • The best way to prioritize the editing or removal of content
    • Content gap opportunities
    • Which content is ranking for which keywords
    • Which content should be ranking for which keywords
    • The strongest pages on a domain and how to leverage them
    • Undiscovered content marketing opportunities
    • Due diligence when buying/selling websites or onboarding new clients

    While each of these desired outcomes and insights is a valuable result of a content audit, I would define the overall “purpose” of one as follows:

    The purpose of a content audit for SEO is to improve the perceived trust and quality of a domain, while optimizing crawl budget and the flow of PageRank (PR) and other ranking signals throughout the site.

    Often, but not always, a big part of achieving these goals involves the removal of low-quality content from search engine indexes. I’ve been told people hate this word, but I prefer the “pruning” analogy to describe the concept.

    How & why “pruning” works

    {Expand for more on pruning}

    How to do a content audit

    Just like anything in SEO, from technical and on-page changes to site migrations, things can go horribly wrong when content audits aren’t conducted properly. The most common example would be removing URLs that have external links because link metrics weren’t analyzed as part of the audit. Another common mistake is confusing removal from search engine indexes with removal from the website.

    Content audits start with taking an inventory of all content available for indexation by search engines. This content is then analyzed against a variety of metrics and given one of three “Action” determinations. The “Details” of each Action are then expanded upon.

    The variety of combinations of options between the “Action” of WHAT to do and the “Details” of HOW (and sometimes why) to do it are as varied as the strategies, sites, and tactics themselves. Below are a few hypothetical examples:

    You now have a basic overview of how to perform a content audit. More specific instructions can be found below.

    The process can be roughly split into three distinct phases:

    1. Inventory & audit
    2. Analysis & recommendations
    3. Summary & reporting

    The inventory & audit phase

    Taking an inventory of all content, and related metrics, begins with crawling the site.

    One difference between crawling for content audits and technical audits:

    Technical SEO audit crawls are concerned with all crawlable content (among other things).

    Content audit crawls for the purpose of SEO are concerned with all indexable content.

    {Expand for more on crawlable vs. indexable content}

    All of this is changing rapidly, though. URLs as the unique identifier in Google’s index are probably going away. Yes, we’ll still have URLs, but not everything requires them. So far, the words “content” and “URL” have been mostly interchangeable. But some URLs contain an entire application’s worth of content. How to do a content audit in that world is something we’ll have to figure out soon, but only after Google figures out how to organize the web’s information in that same world. From the looks of things, we still have a year or two.

    Until then, the process below should handle most situations.

    Step 1: Crawl all indexable URLs

    A good place to start on most websites is a full Screaming Frog crawl. However, some indexable content might be missed this way. It is not recommended that you rely on a crawler as the source for all indexable URLs.

    In addition to the crawler, collect URLs from Google Analytics, Google Webmaster Tools, XML Sitemaps, and, if possible, from an internal database, such as an export of all product and category URLs on an eCommerce website. These can then be crawled in “list mode” separately, then added to your main list of URLs and deduplicated to produce a more comprehensive list of indexable URLs.

    Some URLs found via GA, XML sitemaps, and other non-crawl sources may not actually be “indexable.” These should be excluded. One strategy that works here is to combine and deduplicate all of the URL “lists,” and then perform a crawl in list mode. Once crawled, remove all URLs with robots meta or X-Robots noindex tags, as well as any URL returning error codes and those that are blocked by the robots.txt file, etc. At this point, you can safely add these URLs to the file containing indexable URLs from the crawl. Once again, deduplicate the list.
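
    If you prefer to script the combine-and-deduplicate step rather than do it by hand, a minimal sketch might look like this (Python with pandas; all file and column names are assumptions):

    ```python
    import pandas as pd

    # Combine URL lists from each source (assumed: one-column CSVs with a "url" header).
    sources = ["crawl.csv", "ga_landing_pages.csv", "gsc_urls.csv", "xml_sitemap.csv"]
    seed = pd.concat([pd.read_csv(f) for f in sources]).drop_duplicates(subset="url")
    seed.to_csv("list_mode_seed.csv", index=False)  # re-crawl this file in list mode

    # After the list-mode crawl, keep only URLs that are actually indexable.
    crawl = pd.read_csv("list_mode_results.csv")  # assumed columns: url, status_code, meta_robots
    indexable = crawl[
        (crawl["status_code"] == 200)
        & ~crawl["meta_robots"].fillna("").str.contains("noindex", case=False)
    ]
    indexable.to_csv("indexable_urls.csv", index=False)
    ```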

    Crawling roadblocks & new technologies

    Crawling very large websites

    First and foremost, you do not need to crawl every URL on the site. Be concerned with indexable content. This is not a technical SEO audit.

    {Expand for more about crawling very large websites}


    Crawling dynamic mobile sites

    This refers to a specific type of mobile setup in which there are two code bases — one for mobile and one for desktop — but only one URL. Thus, the content of a single URL may vary significantly depending on which type of device is visiting that URL. In such cases, you will essentially be performing two separate content audits. Proceed as usual for the desktop version. Below are instructions for crawling the mobile version.

    {Expand for more on crawling dynamic websites}

    Crawling and rendering JavaScript

    One of the many technical issues SEOs have been increasingly dealing with over the last couple of years is the proliferation of websites built on JavaScript frameworks and libraries like React.js, Ember.js, and Angular.js.

    {Expand for more on crawling Javascript websites}

    Step 2: Gather additional metrics

    Most crawlers will give you the URL and various on-page metrics and data, such as the titles, descriptions, meta tags, and word count. In addition to these, you’ll want to know about internal and external links, traffic, content uniqueness, and much more in order to make fully informed recommendations during the analysis portion of the content audit project.

    Your process may vary, but we generally try to pull in everything we need using as few sources as possible. URL Profiler is a great resource for this purpose, as it works well with Screaming Frog and integrates easily with all of the APIs we need.

    Once the Screaming Frog scan is complete (crawling only indexable content), export the “Internal All” file, which can then be used as the seed list in URL Profiler (combined with any additional indexable URLs found outside of the crawl via GSC, GA, and elsewhere).

    This is what my URL Profiler settings look like for a typical content audit of a small- or medium-sized site. Also, under “Accounts” I have connected via API keys to Moz and SEMrush.

    Once URL Profiler is finished, you should end up with something like this:

    Screaming Frog and URL Profiler: Between these two tools and the APIs they connect with, you may not need anything else at all in order to see the metrics below for every indexable URL on the domain.

    The risk of getting analytics data from a third-party tool

    We’ve noticed odd data mismatches and sampled data when using the method above on large, high-traffic websites. Our internal process involves exporting these reports directly from Google Analytics, sometimes incorporating Analytics Canvas to get the full, unsampled data from GA. Then VLookups are used in the spreadsheet to combine the data, with URL being the unique identifier.

    Metrics to pull for each URL:

    • Indexed or not?
      • If crawlers are set up properly, all URLs should be “indexable.”
      • A non-indexed URL is often a sign of an uncrawled or low-quality page.
    • Content uniqueness
      • Copyscape, Siteliner, and now URL Profiler can provide this data.
    • Traffic from organic search
      • Typically 90 days
      • Keep a consistent timeframe across all metrics.
    • Revenue and/or conversions
      • You could view this by “total,” or by segmenting to show only revenue from organic search on a per-page basis.
    • Publish date
      • If you can get this into Google Analytics as a custom dimension prior to fetching the GA data, it will help you discover stale content.
    • Internal links
      • Content audits provide the perfect opportunity to tighten up your internal linking strategy by ensuring the most important pages have the most internal links.
    • External links
    • Landing pages resulting in low time-on-site
      • Take this one with a grain of salt. If visitors found what they want because the content was good, that’s not a bad metric. A better proxy for this would be scroll depth, but that would probably require setting up a scroll-tracking “event.”
    • Landing pages resulting in Low Pages-Per-Visit
      • Just like with Time-On-Site, sometimes visitors find what they’re looking for on a single page. This is often true for high-quality content.
    • Response code
      • Typically, only URLs that return a 200 (OK) response code are indexable. You may not require this metric in the final data if that’s the case on your domain.
    • Canonical tag
      • Typically only URLs with a self-referencing rel=“canonical” tag should be considered “indexable.” You may not require this metric in the final data if that’s the case on your domain. (The sketch after this list shows one way to collect these signals.)
    • Page speed and mobile-friendliness
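
    To illustrate how a couple of these signals (response code, canonical tag, meta robots) can be collected for a single URL, here’s a toy sketch using Python’s requests and BeautifulSoup. In practice your crawler gathers these at scale; this just makes the checks concrete:

    ```python
    import requests
    from bs4 import BeautifulSoup

    def indexability_signals(url):
        """Fetch one URL and report the indexability signals discussed above."""
        resp = requests.get(url, timeout=10)
        soup = BeautifulSoup(resp.text, "html.parser")
        canonical = soup.find("link", attrs={"rel": "canonical"})
        robots = soup.find("meta", attrs={"name": "robots"})
        return {
            "url": url,
            "status_code": resp.status_code,
            "canonical": canonical.get("href") if canonical else None,
            "noindex": bool(robots and "noindex" in robots.get("content", "").lower()),
        }

    print(indexability_signals("https://example.com/"))
    ```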

    Before you begin analyzing the data, be sure to drastically improve your mental health and the performance of your machine by taking the opportunity to get rid of any data you don’t need. Here are a few things you might consider deleting right away (after making a copy of the full data set, of course).


    Things you don’t need when analyzing the data

    {Expand for more on removing unnecessary data}

    Hopefully by now you’ve made a significant dent in reducing the overall size of the file and time it takes to apply formatting and formula changes to the spreadsheet. It’s time to start diving into the data.

    The analysis & recommendations phase

    Here’s where the fun really begins. In a large organization, it’s tempting to have a junior SEO do all of the data-gathering up to this point. I find it useful to perform the crawl myself, as the process can be highly informative.

    Step 3: Put it all into a dashboard

    Even after removing unnecessary data, performance could still be a major issue, especially if working in Google Sheets. I prefer to do all of this in Excel, and only upload into Google Sheets once it’s ready for the client. If Excel is running slow, consider splitting up the URLs by directory or some other factor in order to work with multiple, smaller spreadsheets.

    Creating a dashboard can be as easy as adding two columns to the spreadsheet. The first new column, “Action,” should be limited to three options, as shown below. This makes filtering and sorting data much easier. The “Details” column can contain freeform text to provide more detailed instructions for implementation.

    Use Data Validation and a drop-down selector to limit Action options.
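
    If you build the dashboard file programmatically, the drop-down can be scripted as well. Here’s a sketch with openpyxl, assuming the Action column lives in column B and the three options are Keep, Improve, and Remove (the file name and cell range are assumptions):

    ```python
    from openpyxl import load_workbook
    from openpyxl.worksheet.datavalidation import DataValidation

    wb = load_workbook("content_audit.xlsx")  # assumed dashboard file
    ws = wb.active

    # Limit the Action column to three options via a drop-down list.
    dv = DataValidation(type="list", formula1='"Keep,Improve,Remove"', allow_blank=True)
    ws.add_data_validation(dv)
    dv.add("B2:B10000")  # assumed: "Action" lives in column B

    wb.save("content_audit.xlsx")
    ```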

    Step 4: Work the content audit dashboard

    All of the data you need should now be right in front of you. This step can’t be turned into a repeatable process for every content audit. From here on, the actual step-by-step process becomes much more open to interpretation and to your own experience. You may do some of these things and not others. You may do them a little differently. That’s all fine, as long as you’re working toward the goal of determining what to do, if anything, for each piece of content on the website.

    A good place to start would be to look for any content-related issues that might cause an algorithmic filter or manual penalty to be applied, thereby dragging down your rankings.

    Causes of content-related penalties

    These typically fall under three major categories: quality, duplication, and relevancy. Each category can be further broken down into a variety of issues, which are detailed below.

    {Expand to learn more about quality, duplication, and relevancy issues}

    It helps to sort the data in various ways to see what’s going on. Below are a few different things to look for if you’re having trouble getting started.

    {Expand to learn more about what to look for}

    Taking the hatchet to bloated websites

    For big sites, it’s best to use a hatchet-based approach as much as possible, and finish up with a scalpel in the end. Otherwise, you’ll spend way too much time on the project, which eats into the ROI.

    This is not a process that can be documented step-by-step. For the purpose of illustration, however, below are a few different examples of hatchet approaches and when to consider using them.

    {Expand for examples of hatchet approaches}

    As you can see from the many examples above, sorting by “Page Type” can be quite handy when applying the same Action and Details to an entire section of the website.

    After all of the tool set-up, data gathering, data cleanup, and analysis across dozens of metrics, what matters in the end is the Action to take and the Details that go with it.

    URL, Action, and Details: These three columns will be used by someone to implement your recommendations. Be clear and concise in your instructions, and don’t make decisions without reviewing all of the wonderful data-points you’ve collected.

    Here is a sample content audit spreadsheet to use as a template, or for ideas. It includes a few extra tabs specific to the way we used to do content audits at Inflow.

    WARNING!

    As Razvan Gavrilas pointed out in his post on Cognitive SEO from 2015, without doing the research above you risk pruning valuable content from search engine indexes. Be bold, but make highly informed decisions:

    Content audits allow SEOs to make informed decisions on which content to keep indexed “as-is,” which content to improve, and which to remove.

    The reporting phase

    The content audit dashboard is exactly what we need internally: a spreadsheet crammed with data that can be sliced and diced in so many useful ways that we can always go back to it for more insight and ideas. Some clients appreciate that as well, but most are going to find the greater benefit in our final content audit report, which includes a high-level overview of our recommendations.

    Counting actions from Column B

    It is useful to count the quantity of each Action along with total organic search traffic and/or revenue for each URL. This will help you (and the client) identify important metrics, such as total organic traffic for pages marked to be pruned. It will also make the final report much easier to build.
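
    One way to produce those counts is a quick roll-up per Action (a sketch assuming the dashboard column names shown; pandas can read the Excel dashboard directly):

    ```python
    import pandas as pd

    audit = pd.read_excel("content_audit.xlsx")  # assumed columns: URL, Action, Sessions, Revenue

    # Pages, organic traffic, and revenue rolled up per Action.
    summary = audit.groupby("Action").agg(
        pages=("URL", "count"),
        organic_sessions=("Sessions", "sum"),
        revenue=("Revenue", "sum"),
    )
    print(summary)  # e.g. total organic traffic for everything marked "Remove"
    ```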

    Step 5: Writing up the report

    Your analysis and recommendations should be delivered at the same time as the audit dashboard. It summarizes the findings, recommendations, and next steps from the audit, and should start with an executive summary.

    Here is a real example of an executive summary from one of Inflow’s content audit strategies:

    As a result of our comprehensive content audit, we are recommending the following, which will be covered in more detail below:

    Removal of about 624 pages from Google’s index by deletion or consolidation:

    • 203 Pages were marked for Removal with a 404 error (no redirect needed)
    • 110 Pages were marked for Removal with a 301 redirect to another page
    • 311 Pages were marked for Consolidation of content into other pages
      • Followed by a redirect to the page into which they were consolidated

    Rewriting or improving of 668 pages

    • 605 Product Pages are to be rewritten due to use of manufacturer product descriptions (duplicate content), these being prioritized from first to last within the Content Audit.
    • 63 “Other” pages to be rewritten due to low-quality or duplicate content.

    Keeping 226 pages as-is

    • No rewriting or improvements needed

    These changes reflect an immediate need to “improve or remove” content in order to avoid an obvious content-based penalty from Google (e.g. Panda) due to thin, low-quality and duplicate content, especially concerning Representative and Dealers pages with some added risk from Style pages.

    The content strategy should end with recommended next steps, including action items for the consultant and the client. Below is a real example from one of our documents.

    We recommend the following three projects in order of their urgency and/or potential ROI for the site:

    Project 1: Remove or consolidate all pages marked as “Remove”. Detailed instructions for each URL can be found in the “Details” column of the Content Audit Dashboard.

    Project 2: Copywriting to improve/rewrite content on Style pages. Ensure unique, robust content and proper keyword targeting.

    Project 3: Improve/rewrite all remaining pages marked as “Improve” in the Content Audit Dashboard. Detailed instructions for each URL can be found in the “Details” column.

    Content audit resources & further reading

    Understanding Mobile-First Indexing and the Long-Term Impact on SEO by Cindy Krum
    This thought-provoking post raises the question: How will we perform content inventories without URLs? It helps to know Google is dealing with the exact same problem on a much, much larger scale.

    Here is a spreadsheet template to help you calculate revenue and traffic changes before and after updating content.

    Expanding the Horizons of eCommerce Content Strategy by Dan Kern of Inflow
    An epic post about content strategies for eCommerce businesses, which includes several good examples of content on different types of pages targeted toward various stages in the buying cycle.

    The Content Inventory is Your Friend by Kristina Halvorson on BrainTraffic
    Praise for the life-changing powers of a good content audit inventory.

    Everything You Need to Perform Content Audits


  • The Step-By-Step Guide to Testing Voice Search Via PPC

    Posted by purna_v

    I was conned into my love of cooking by my husband.

    Never having set foot in the kitchen until the grand old age of 22, my husband (then boyfriend) — a former chef — said he’d teach me some simple recipes. I somewhat enjoyed the process but very much enjoyed the lavish praise he’d bestow upon me when eating whatever I whipped up.

    Highly encouraged that I seemingly had an innate culinary genius, I looked to grow my repertoire of recipes. As a novice, I found recipe books inspiring but confusing. For example, a recipe that called for cooked chicken made me wonder how on Earth I was meant to cook the chicken to get cooked chicken.

    Luckily, I discovered the life-changing power of fully illustrated, step-by-step recipes.

    Empowered by the clear direction they provided, I conquered cuisine after cuisine and have since turned into a confident cook. It took me only a few months to realize all that praise was simply a ruse to have me do most of the cooking. But by then I was hooked.

    When it comes to voice search, I’ve talked and written a lot about the subject over the past year. Each time, the question I get asked is “What’s the best way to start?”

    Today I’ll share with you an easy-to-follow, step-by-step guide to empower you to create your own voice search test. It’s sure to become one of your favorite recipes in coming months as conversational interfaces continue their rapid adoption rate.

    Testing voice search? But it’s not monetized.

    That’s correct. It’s not monetized yet. However, usage rates have been growing exponentially. Search engines are already reporting that:

    • One out of ten searches is done by voice (per Baidu)
    • Twenty percent of all mobile Android searches are voice (per Google)
    • Usage spans all age ranges, as we discovered at Cortana (which is owned by Microsoft, my employer).

    With Cortana being integrated into Windows 10, what we’re seeing is that age range demographics are now comparable to what eMarketer is reporting for overall smartphone usage. What this means: Using digital assistants is becoming more and more common. It’s no longer an edge case.

    More importantly, voice searches done on the search engines can often have PPC ads in the resultant SERPs — as you’ll see in my examples below.

    Why a PPC test?

    It’s easier to get started by testing voice search via PPC since you can get more detailed reporting across multiple levels.

    I would recommend taking a teeny-tiny budget — even $50 is often good enough — and putting it toward a voice search test. (Don’t fret, SEOs, I do have some tips in here for you as well.)

    Before we start…

    Here’s a quick reminder of how voice searches differ from text searches:

    1. Voice has longer queries
    2. Natural language means more question phrases
    3. Natural language reveals intent clearly
    4. Voice search has high local value
    5. And greatly impacts third-party listings

    You can read about it in more detail in my previous Moz article on the subject.


    Let’s get cooking!

    Step 1: See what, if any, voice activity exists for you currently

    Goal: Find out what voice-related activity exists by identifying Assumed Voice Queries.

    Estimated time needed: 30 min

    Tools needed: Search Query Reports (SQRs) and Excel

    A good place to start is by identifying how your audience is currently using voice to interact with you. In order to do this, we’ll need to look for what we can term “assumed voice queries.”

    Sidebar: What are Assumed Voice Queries?

    Since the search engines do not currently provide separate detailed reporting on voice queries, we can instead use the core characteristics of these queries to identify them. The subtle difference between keyboard search and voice search is “whom” people think they are interacting with.

    In the case of keyboard search, the search box clearly ties to a machine. Searchers input logical keywords they think will give them the best search results. They generally leave out filler words, such as “the,” “of,” “a,” and “and.” They also tend not to use question words; for example, “bicycle store,” rather than “what is a bicycle store?”

    But when a searcher uses voice search, he is not using a keyboard. It’s more like he’s talking to an actual human. You wouldn’t say to a person “bicycle store.” You might say: “Hey Cortana, what is the best place to buy a bicycle near me?”

    The key difference between text and voice search is that voice queries tend to be full thoughts, structured the way people speak — i.e., long-tail queries in natural language. Voice searches run approximately 4.2 words or longer on average, according to research from both Google and Microsoft Cortana.

    Thus, assumed voice queries would be keywords that fit in with these types of queries: longer and looking like natural language.

    Caveat: This isn’t going to be 100% accurate, of course, but it’s a good place to start for now.

    Even just eight months ago, things were fairly black and white: some clients had assumed voice queries, while others didn’t. Lately, however, most clients I look at have some element of assumed voice queries — indicative of how the market is growing.

    Okay, back to step 1

    a.) Start by downloading your search term report from within your Bing Ads or Google AdWords account. This is also commonly referred to as the search query report. You want to run this for at least the past 30 or 60 days (depending on volume). If you don’t have a PPC account, you can pull your search term report from Google Search Console or Bing Webmaster Tools.

    b.) Open it up in Excel, so we can get sorting.

    c.) Trim the columns to just the essentials. I usually keep only the search term and impression columns. For larger accounts, you may prefer to keep the campaign and ad group name columns as well.

    d.) Sort by query length to isolate the search queries that are 5+ words long — I’m going with 5 here simply to increase the odds that these are assumed voice queries. A simple Excel formula — taught to me by my colleague John Gagnon — can help count the number of words; a standard version is =LEN(TRIM(A1))-LEN(SUBSTITUTE(A1," ",""))+1. Replace A1 with the cell that actually holds your search term (if your terms start in C2, use C2 instead of A1), then drag the formula down the sheet.

    e.) Calculate and sort, first by query length and then by impressions to find the assumed voice search queries with the most impressions. The result? You’ll get your final list — success!
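
    If you’d rather script this step than work in Excel, here’s a minimal Python sketch of the same filter-and-sort. It assumes a hypothetical CSV export named search_terms.csv with “Search term” and “Impressions” columns — rename those to match your own report:

        import csv

        MIN_WORDS = 5  # same 5+ word threshold as above

        rows = []
        with open("search_terms.csv", newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                query = row["Search term"].strip()
                words = len(query.split())  # what the Excel formula computes
                impressions = int(row["Impressions"].replace(",", ""))
                if words >= MIN_WORDS:
                    rows.append((words, impressions, query))

        # Sort by query length, then by impressions, both descending
        rows.sort(reverse=True)

        for words, impressions, query in rows[:25]:
            print(f"{words:>2} words  {impressions:>8} imps  {query}")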


    Step 2: Extrapolate, theme, sort

    Goal: Find additional keywords that could be missing and organize the list based on intent.

    Estimated time needed: 45 min

    Tools needed: Keyword tools of choice and Excel

    Now that you can see the assumed voice queries, you’ll have handy insights into your customers’ motivations. You know what your audience is searching for and, just as important, what they are not searching for.

    Next, we need to build upon this list of keywords to find high-value potential queries we should add to our list. There are several helpful tools for this, such as Keyword Explorer and Answer the Public.

    a.) Go to the keyword research tool of your choice. In this example, I’ve used SEMRush. Notice how they provide data on organic and paid search for our subject area of “buy (a) bicycle.”

    b.) Next, let’s see what exists in question form. For any given subject area, the customer could have myriad questions along the spectrum of motivation. This output comes from a query on Answer the Public for “buy a bicycle,” showing the what, when, where, why, and how questions that actually express motivational intent.

    c.) These questions can now be sorted by degree of intent.

    • Is the searcher asking a fact-based question, looking for background information?
    • Are they farther along the process, looking at varieties of the product?
    • Are they approaching making a purchase decision, doing comparison shopping?
    • Are they ready to buy?

    Knowing the stage of the process the customer is in can help tailor relevant suggestions, since we can identify core themes and sort by intent. My brilliant colleague Julie Dilleman likes to prepare a chart to visualize the groupings more effectively.
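
    To make that grouping repeatable, here’s a rough Python sketch that buckets question-style queries into funnel stages. The bucket names and patterns are illustrative assumptions, not a fixed taxonomy — tune them to your own niche:

        # Illustrative intent heuristics -- adjust the patterns for your niche
        INTENT_BUCKETS = [
            ("ready to buy",         ["buy", "price", "near me", "deal", "cost"]),
            ("comparison shopping",  ["best", "vs", "compare", "review"]),
            ("looking at varieties", ["types of", "which", "kinds of"]),
            ("fact-finding",         ["what is", "who", "when", "why", "how"]),
        ]

        def classify(query: str) -> str:
            q = query.lower()
            for bucket, patterns in INTENT_BUCKETS:
                if any(p in q for p in patterns):
                    return bucket
            return "unclassified"

        for q in ["what is a hybrid bicycle",
                  "best bicycle for commuting vs road bike",
                  "where to buy a bicycle near me"]:
            print(f"{classify(q):<20} {q}")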

    d.) Use a research tool such as Bing Ads Intelligence or your demographic reports in Google Analytics to answer core questions related to these keywords, such as:

    • What’s the searcher age and gender breakdown for these queries?
    • Which device is dominating?
    • Which locations are most popular?

    These insights are eminently actionable in terms of bid modifications, as well as in guiding us to create effective ad copy.


    Step 3: Start optimizing campaigns

    Goal: Review competitive landscape and plan campaign optimizations.

    Estimated time needed: 75 min

    Tools needed: PPC account, NAP listings, Schema markup

    To get the lay of the land, we need to look at what shows up for these searches on the voice platforms with visual interfaces — i.e., the Search Engine Results Pages (SERPs) and digital personal assistants. Does the search provide map listings and reviews? Where is the data pulled from? Are ads showing?

    a.) Run searches across multiple platforms. In my example, I’m using Siri, the Google app, and Cortana on my desktop.

    Near me-type searches:

    These all had map listings in common — Apple Maps, Google Maps, and Bing Maps, respectively.

    Research-type queries:

    Siri got it wrong and led me to a store, while both Google and Bing provided me with SERPs to answer my question.

    Quick answer-type queries:

    While Siri pulled up multiple results from a Bing search, both Google and Cortana found what they considered the most helpful answer and read it aloud to me, while also providing the option to look at additional results.

    b.) Optimize your NAPs. Make sure you have listings that have an accurate name, address, phone number, and open hours on the top business listings such as Apple Maps, Google My Business, and Bing Places for Business.

    c.) Ensure you have proper Schema markup on your site. The more information you can provide to the search engines, the more effectively they can rank and display your site. Be sure to add in (a markup sketch follows this list):

    • Contact info
    • Reviews
    • Articles/Events/Content
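
    As a rough illustration of the markup involved, the sketch below builds a schema.org LocalBusiness JSON-LD block in Python. The business details are placeholders; the printed output would go inside a <script type="application/ld+json"> tag on your page:

        import json

        # Placeholder business details -- swap in your own NAP data
        local_business = {
            "@context": "https://schema.org",
            "@type": "LocalBusiness",
            "name": "Example Bicycle Shop",
            "telephone": "+1-555-555-0100",
            "openingHours": "Mo-Sa 09:00-18:00",
            "address": {
                "@type": "PostalAddress",
                "streetAddress": "123 Main St",
                "addressLocality": "Denver",
                "addressRegion": "CO",
                "postalCode": "80202",
            },
        }

        print(json.dumps(local_business, indent=2))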

    d.) Optimize your PPC campaigns.

    1. Choose a small handful of voice search queries from your list across different intents.
    2. Add to new ad groups under existing campaigns. This helps you to take advantage of historical quality score benefits.
    3. Adjust bid modifiers based on your research on age, gender, and device.
    4. Adjust bids based on intent — a simple bid calculation sketch follows this list. For example, the following keywords demonstrate completely different levels of purchase intent:
    • Do I need a hybrid or mountain bike? – More research-based.
    • Who invented the bicycle? – Zero purchase intent. Add this as a negative keyword.
    • When does bike store XYZ open today? – High likelihood to purchase. Bid up.
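
    As a back-of-the-napkin illustration of how those adjustments stack, here’s a small Python sketch. The modifier values are invented for the example — in practice, you’d set them inside your Bing Ads or AdWords interface based on your Step 2 research:

        BASE_BID = 1.00  # example starting max CPC, in dollars

        # Invented example modifiers, expressed the way ad platforms do (+/- %)
        device_modifier = 0.20  # +20% on mobile, where voice queries dominate
        intent_modifiers = {
            "research": -0.30,  # "do I need a hybrid or mountain bike?"
            "purchase": 0.40,   # "when does bike store XYZ open today?"
        }

        for intent, adjustment in intent_modifiers.items():
            final_bid = BASE_BID * (1 + device_modifier) * (1 + adjustment)
            print(f"{intent:<8} bid: ${final_bid:.2f}")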

    Step 4: Be the best answer

    Goal: Serve the right message at the right time in the right place.

    Estimated time needed: 60 min

    Tools needed: Creativity and Excel

    Make sure you have the relevant ad for the query. Relevance is critical — the results must be useful or they won’t be used.

    Do you have the right extensions to tailor toward the motivational intent noted above and the consumer’s ultimate goal? Make it easy for customers to get what they want without confusion.

    Voice searches cover a variety of different intents, so it’s important to ensure the ad in your test will align well with the intent of the query. Let’s consider this example:

    If the search query is “what’s the best waterproof digital camera under $500?” then your ad should only talk about waterproof digital cameras in the $500 range. Doing this makes the experience more seamless for the customer, since it reduces the selection steps along the way.

    A few additional tips and ideas:

    a.) Voice searches seem to frequently trigger product listing ads (PLAs) from the search engines, which makes sense, since the images make the results easier to sort through.

    If you haven’t already done so, look at setting up Shopping campaigns within your PPC accounts, even just for your top-selling products.

    b.) When full SERPs come up, be sure to use ad extensions to provide additional information to your target audience. Consider location, contact, conversion, and app information that’s relevant. Extensions make it easy for customers to find the info they need.

    c.) Check citations and reviews to ensure you’re showing up at your best. If reviews are unfavorable, consider implementing reputation management efforts.

    d.) Work to earn more featured snippets, since the search engines often will read them out as the top answer. Dr. Pete has some excellent tips in this Moz article.

    e.) Your helpful content will come to excellent use with voice search — promote it via PPC ads against the higher-funnel assumed voice queries to support your test.

    f.) Video has been getting much attention — and rightly so! Given the increased engagement it can provide, as well as its ability to stand out in the SERPs, consider offering video content (as extensions or regular content) for relevant assumed voice queries.


    Step 5: Analyze. Rinse. Repeat.

    Goal: Review performance and determine next steps.

    Estimated time needed: 60 min

    Tools needed: Analytics and Excel

    Here’s where the power of PPC can shine. We can review reporting across multiple dimensions to gauge how the test is performing.

    Quick note: It may take several weeks to gather enough data to run meaningful reports. Remember that voice search volume is small, though significant.

    a.) First, determine the right KPIs. For example:

    • Lower-funnel content will, of course, have the most conversion-specific goals that we’re used to.
    • Research-type queries will need to be measured by micro-conversions and different KPIs such as form fills, video views, and leads generated.

    b.) Pull the right reports — a short analysis sketch follows the list. Helpful reports include:

    • The keyword performance report will show you the impressions, clicks, CTR, quality score, conversions, and much more about each individual keyword within your campaigns. Use the keyword report to find out which keywords are triggering your ads, generating clicks, and leading to conversions. You can also identify keywords that are not performing well to determine whether you want to delete them.
    • Ad performance reports show you the impressions, clicks, spend, and conversions for each ad. Use this report to help you determine which ads are leading to the most clicks and conversions, and which are not performing. Remember, having underperforming ads in your campaigns can pull down the quality of your campaign.
    • Filter by device and by demographics. This combination — which devices dominate and who is converting — can help us adjust bids and create more effective ad copy.
    • Create a campaign report looking at your PLA performance. Make tweaks or major overhauls to close gaps versus your expectations.
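
    If you export these reports as CSVs, a few lines of pandas can surface what’s worth acting on. This sketch assumes a hypothetical keyword_report.csv with Keyword, Impressions, Clicks, and Conversions columns — rename them to match your platform’s export:

        import pandas as pd

        df = pd.read_csv("keyword_report.csv")  # hypothetical export

        # Derive the ratios we actually care about
        df["CTR"] = df["Clicks"] / df["Impressions"]
        df["ConvRate"] = df["Conversions"] / df["Clicks"].where(df["Clicks"] > 0)

        # Focus on assumed voice queries: 5+ words, as in Step 1
        voice = df[df["Keyword"].str.split().str.len() >= 5]

        print(voice.sort_values("ConvRate", ascending=False).head(10))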

    c.) Determine where you can personalize further. AgilOne research indicates that “more than 70% of consumers expect a personalized experience with the brands they interact with.”

    Carefully pairing the most relevant ad messaging with each assumed voice query is incredibly important here.


    Let’s recap

    Step 1. See what, if any, voice activity exists for you currently.

    Step 2. Extrapolate. Theme. Sort.

    Step 3. Start optimizing campaigns.

    Step 4. Be the best answer.

    Step 5. Analyze. Rinse. Repeat.

    Pretty do-able, right?

    It’s relatively simple and definitely affordable. Spend four or five hours completing your own voice search test — it can open up worlds of opportunity for your business. It’s best to start testing now, while there’s no fire under us and we can try things out in a low-risk environment. It’s an ideal way to get a leg up on the competition. Bon appétit!

    Have you tried some other tests to address voice search queries? Please do share in the comments below.


  • Helpful Tips for Doing Search in a Low-Volume Niche

    Posted by Jeremy_Gottlieb

    SEO — you know, that thing you do whereby everyone and their mother will find your site on the web. Easy, right? “Can you SEO this page for me?” or “We’re about to launch a webinar. Can you SEO-ify it, please?” I’m sure most of you reading this can probably relate to these types of questions and the ensuing pressure from bosses or clients. If you’re lucky, you work in a realm where there’s plenty of search volume to chase, featured snippets to occupy, and answer boxes to solve. But what about those who work in the low-search volume niches typically seen in B2B, or with companies pioneering a new product or service that no one really knows about yet (so they obviously can’t be searching for it)?

    This blog post is for you, the digital marketer who toils and struggles to drive search visibility where there hardly is any. Let’s get to work.

    Search, as I’ll refer to it here, includes both paid and organic. Neither of these may ultimately be the best channel for your organization, but after reading this post, hopefully you’ll be able to verify whether your search channels are humming along and working harmoniously, while leaving other sources of user acquisition to bear the brunt of the load. Three topics I will cover in this post are SEO, paid search, and CRO, but please keep in mind: these are not the only possible digital marketing actions that can be done for an organization in a low-search volume niche. This is just a glimpse into what may be possible, and hopefully it can spark inspiration for you or your client in ways you’d either forgotten about or hadn’t thought of. Whether you’re just starting out in digital marketing or you’ve been around for a while, I hope this will be able to provide some direction.

    1. SEO

    Sometimes I think of SEO as a skyscraper, though this may just be because I’m surrounded by them in Distilled’s New York City office (come join us!). In order to reach greater heights via SEO, you need to make sure the foundation of your building is in order — and by “foundation,” I mean the technical structure of your site. Things you’ll want to check include the following (a quick spot-check script follows the list):

    • Is the link profile clean?
    • Does the site have strong internal linking?
      • Do pages get created and then fall into a black hole?
    • Can search engines crawl the site?
      • Are there noindex tags, robots.txt rules, canonicals, or other directives hiding desired content from ranking?
    • Has the site been hacked?
    • Are there descriptive and unique title tags and meta descriptions?
    • Is tracking set up properly (e.g., Google Analytics)?
    • Does the site appear trustworthy and authoritative?
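
    A few of those checks can be spot-checked with a short script. The Python sketch below fetches a handful of pages and flags noindex directives and error responses — a quick sanity check against placeholder URLs, not a substitute for a full crawl:

        import re
        import urllib.request

        URLS = ["https://www.example.com/"]  # placeholder: your key pages

        for url in URLS:
            req = urllib.request.Request(url, headers={"User-Agent": "audit-check"})
            try:
                with urllib.request.urlopen(req, timeout=10) as resp:
                    status = resp.status
                    x_robots = resp.headers.get("X-Robots-Tag", "")
                    html = resp.read(200_000).decode("utf-8", errors="ignore")
            except Exception as exc:  # 4XX/5XX responses raise HTTPError
                print(f"{url}  FAILED: {exc}")
                continue

            noindex = bool(re.search(
                r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I))
            print(f"{url}  status={status}  noindex_meta={noindex}  "
                  f"x_robots={x_robots or 'none'}")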

    Targeting transactional queries

    Once the foundation is in order, it’s time to begin the keyword research. Establish which queries are most vital to the organization, how much search volume they have, and which ones are most likely to yield conversions, whatever that means to the organization. With your foundation in order, you can take the most important queries and try to match them to existing pages on the site, such as the homepage and key product/services pages. It may turn out that the queries an organization should be targeting don’t have pages available yet. That’s okay — you’ll just need to create them. I generally recommend that shorter-tail queries (two or three words) be targeted primarily by product or service pages, with longer queries handled either by those very pages or by a Q&A section and/or a blog. This is just one way to handle a hierarchy, and it avoids cluttering the navigation with hundreds of long-tail queries and content, though it is by no means a rule.

    Targeting higher-funnel queries

    Once the key queries have been locked down and the content plan created, we can move on to more informational queries. It’s very likely that these higher-funnel queries will require content that’s less sales-y and more informational, making desired conversions (like consultation signups) less likely from this crowd, at least on the first interaction. You’ll need to build strong content that answers users’ queries and establishes the organization as a thought leader and expert at all levels of a particular niche.

    Let’s say, for example, we’re responsible for driving traffic for an organization that allows people to invest in solar energy. Lots of people buy stocks and bonds and real estate, but how many invest in solar energy or power purchase agreements? Transactional-type queries, those most likely to provide us with customers, don’t get searched all that much.

    Now, let’s take a look at some longer-tail queries that are tangentially related to our main offering.

    These queries clearly have more search volume, but appear to be more informational. “CSR” (in the above example) most often means “corporate social responsibility,” a term frequently aligned with impact investing, where investments not only are expected to produce financial returns, but have a positive social effect as well. From these queries we’d be able to help provide proof to users and search engines that the organization is indeed an expert in the particular realm of solar energy and investing. Our desired audience may come to us with different initial intents, but we can begin to funnel people down the path towards eventually becoming clients.

    As will be discussed further in this post, the point here is to drive traffic organically, even if that very traffic is unlikely to convert. With optimizations to the content, we’ll be able to solicit emails and try to drive visitors further into the funnel, but first we just need to make sure that we’re enhancing our visibility and driving more unpaid traffic.

    Key tips:

    • Target transactional queries with pages optimized for the ideal conversion
    • Target informational queries and modify pages to push the user deeper into the funnel towards more transactional pages
      • If a blog is perceived as a waste of resources and useless traffic, it’s probably not being fully leveraged

    2. Paid search

    Oftentimes, organizations will use SEO and paid search for their user acquisition, but will silo the two channels so that they don’t work together. Simply put, this is a mistake. Using paid spend on Google AdWords or Bing Ads in conjunction with an organization’s SEO efforts will assist the company’s bottom line.

    Get your tracking right

    When beginning a paid campaign, it’s absolutely vital to set up tracking properly from the beginning. Do not skip this step. Without proper tracking, it will be impossible to tie conversions back to paid and organic and see how the channels relate. If you already have paid attribution set up, double-check that there’s no double counting from multiple GA tracking snippets, or — if you’re using a landing page generator like Unbounce or HubSpot — that you’ve added tracking on those platforms. Sometimes when using landing page tools (like HubSpot), you might elect to display an in-line thank-you section instead of redirecting someone to a separate URL. If you use an in-line thank-you, the URL will not change, which makes tracking more difficult in Google Analytics. This is not impossible to get around (event tracking can do the trick), but it’s something to keep in mind.
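
    One quick way to spot double counting is to check how many GA tracking IDs a page actually loads. Here’s a minimal Python sketch that fetches a page and counts distinct “UA-” property IDs in the raw HTML; it won’t see tags injected later by GTM, so treat it as a first-pass check only:

        import re
        import urllib.request

        def ga_property_ids(url):
            """Return distinct GA property IDs ('UA-XXXXXX-Y') found in the raw HTML."""
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="ignore")
            return set(re.findall(r"UA-\d{4,10}-\d+", html))

        ids = ga_property_ids("https://www.example.com/")  # placeholder URL
        if len(ids) > 1:
            print(f"Possible double counting -- multiple GA IDs: {ids}")
        else:
            print(f"GA IDs found: {ids or 'none'}")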

    Bid on your money keywords

    Without getting too fancy, a very important next step is to identify the transactional, important keywords — the ones that might be costly to buy, but that are worth the spend. Waiting for results from organic search, or for the different channels to successfully harmonize, may take longer than a boss or the C-suite is willing to wait, so getting results directly from traditional paid search requires a strong setup from the get-go.

    The magic of RLSA

    Remarketing Lists for Search Ads (RLSAs) allow organizations to remarket to specific people who have visited a specific page on their site, either by bidding on keywords one typically wouldn’t bid on, or by altering the bid up or down. This doesn’t create new traffic; it only shows ads to those who have visited your site in the past. The magic is that, done properly, you can achieve lower costs-per-click and more conversions, as the audience seeing these ads is already familiar with your brand.

    Let’s use, for example, the strategy of creating content around “what are alternative investments?” or “how to invest responsibly?”. These would be informational-level queries, representing topics people would like to investigate further. While the ideal scenario for our business would be that everyone would automatically want to invest with us, we know this isn’t likely to be the typical case. Instead, we’ll use organic search to earn traffic from less competitive, informational queries, and use RLSA to bid on queries that would ordinarily be too competitive for us, like “investing” or “how to start investing.” By using pixels and remarketing to anyone who visited our “what are alternative investments” page, we know that the person is more familiar with us and we can try to bid on broader queries that may have been either too expensive for us in the first place, or unlikely to generate conversions. In this case, because the user is already familiar with the brand, it can lead to higher click-through and conversion rates.

    Much has already been written about RLSA strategies, so there’s plenty of further reading if you’d like to dig deeper.

    Advanced remarketing

    Another option is to create more informational content for queries that are less competitive than some other terms, but that also aren’t as likely to convert visitors (i.e., most blog content). Let’s say that our blog captures email addresses, either through forms, popups, or some other means. With our captured emails, we’d be able to build an email list and submit it to AdWords, then target those people in Google Search, Gmail, and YouTube. We can target existing users (people matched to a submitted email) or people who are similar to that audience and share similar web habits. With this tool, we can expand our potential audience.

    If you were to run broad-match search ads against the general population (not one that had been cookied by your site), it would likely get very expensive very quickly and convert poorly. Using broad match with RLSAs is a smarter approach that mitigates the risk of complete budget destruction from people with little intent to convert, while still letting you see what people are searching for; it can be an extremely powerful tool for keyword discovery.

    By using broad match and RLSAs together, your organization will find out faster what people are actually searching for. Any keywords that cost money but aren’t relevant or aren’t converting can be added to a negative keyword list. Valuable ones should be added as exact match and, depending on the keyword, may be worth developing content for, so that traffic can be captured without paying for each individual click.
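
    To make that a routine, here’s a small pandas sketch that flags negative-keyword candidates from a search query export: terms that have accumulated clicks and spend without converting. The filename, column names, and thresholds are all assumptions to adapt:

        import pandas as pd

        df = pd.read_csv("search_query_report.csv")  # hypothetical export

        MIN_CLICKS = 20   # enough clicks to judge the term
        MIN_SPEND = 25.0  # dollars spent before we act

        candidates = df[
            (df["Clicks"] >= MIN_CLICKS)
            & (df["Cost"] >= MIN_SPEND)
            & (df["Conversions"] == 0)
        ]

        # Review by hand before adding these as negatives
        print(candidates.sort_values("Cost", ascending=False)
                        [["Search term", "Clicks", "Cost"]])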

    Key tips:

    • Make sure tracking is properly set up
    • Ensure you’re bidding on transactional queries
    • Landing pages MUST have a clear goal and be optimized for one desired conversion
    • RLSAs can be used for keyword discovery and may enable you to bid on more transactional, generally competitive keywords

    3. CRO

    It’s not uncommon for organizations operating in low-search volume niches to also have fairly long sales cycles. The endgame of what we’re trying to accomplish here is to drive people from an informational mindset to a transactional mindset. We’re operating under the assumption that there are few searches for the service or good we’re trying to provide, so we’re going to get people to our service or good via the backdoor. The way we’ll do this is by guiding people from content that speaks to an informational query to our conversion pages.

    To be clear, getting the ultimate conversion on our site might not require sending someone to a product page. It’s totally possible that someone may be interested in our ultimate goal after having landed on a tangentially-related page.

    Let’s use the example again of the solar energy investment company. We’ll say that our ultimate goal is to get people to open an account where they actually invest in a power purchase agreement (PPA). Understanding what a PPA is isn’t important, but what should be conveyed is that getting anyone to actually spend money and link a bank account to the site is not a simple task. There’s friction — people need to trust that they won’t be robbed, that their financial information will be protected, and that their money is actually going where they expect it to go. Knowing that there’s friction in the funnel, we’re likely going to need multiple points of engagement with the potential client and will need to provide information and trust signals along the way to answer their questions.

    Hunting microconversions

    That said, our first goal should be to optimize and provide high-quality landing pages for the person who searches “solar energy investment.” Once we handle that low-hanging fruit, we need to move on to the tangential queries, like “what are the advantages of solar energy?”. Within this page, we should frame the benefits of solar energy and use multiple calls-to-action or banners to persuade someone to learn more about how to invest in solar energy. It’s totally plausible that someone who searches for “what are the advantages of solar energy?” has no interest in investing whatsoever and will leave the page as soon as their question is answered. It’s also possible that they never even make it to the landing page, because the Google SERP has answered the question for them.

    We can’t be scared of this tactic just because Google is stealing content and placing the information within the search results. Featured snippets still have very high click-through rates (meaning users still visit that content), and we don’t know which queries will trigger featured snippets tomorrow or six months from now. All we can do is create the best content for users’ queries.

    For the visitors who are interested in the potential of solar energy investment, there are several ways that we can keep them engaged:

    1. Email capture popups
      • These can be triggered on time elapsed or on exit intent
    2. Static or sticky calls-to-action (for products, demos, or email capture), either within the content or adjacent to the text in right- or left-hand rails

    AMP to accelerate traffic growth

    Google’s Accelerated Mobile Pages (AMP) are one of my favorite SERP enhancements Google has made in the past few years. As a quick reminder, AMP pages are cached, streamlined HTML that loads crazy-fast on mobile. They also show a little lightning bolt icon in the SERPs; eventually, this may condition users to assume that any page without a lightning bolt will be slow. AMP pages don’t allow interstitials or popups, and they even have their own area within search results. Google is heavily investing in this space and is incentivizing publishers to do the same. Creating AMP variations of your organization’s content can be a strong play for driving more web traffic, but it comes with some potential pitfalls you should be aware of.

    Tracking

    AMP pages require their own Google Analytics tracking; it does not come standard. If you use a CMS or GTM setup that automatically places GA tracking code within the head, you will not automatically be covered on AMP pages. Make sure you set up tracking properly.

    No popups

    I just mentioned that email capture popups are a great way to ensure multiple points of engagement with users who otherwise may have visited a site only once. By capturing emails, you can do remarketing, send product emails, keep people apprised of updates from your organization, and create similar audiences, among other benefits. However, once you create AMP pages and they begin to replace your m-dot or responsive pages in mobile search results, your popups will no longer appear. While you won’t get the full functionality of popups, a suitable workaround is to add an in-line email capture form within your AMP content.

    When it comes to CRO for pages that receive organic traffic, it’s not the end of the world if a person doesn’t undertake an action; we’re not paying for them. Just by visiting our page, we can cookie them and remarket to them on search and other paid channels like Facebook and Twitter. We’ve extracted value from our visitors and they don’t even know it.

    On the other hand, when a visitor arrives via paid search, we need to do everything in our power to make sure that person takes a desired action. That desired action could be providing an email in exchange for a download, scheduling a consultation, purchasing a product, or providing other information. It bears repeating, though: if you’re paying for clicks and haven’t made a concerted effort to design your landing page so that users are most likely to take the desired action, you’re wasting money. I don’t claim there’s some silver bullet that will work across every single niche, audience, and product. A gated landing page may work best for some, while soliciting user information via a simple form might work best for others. The only way to know is to test and see how users interact.

    Key tips:

    • Some ultimate conversions have a lot of friction; don’t shy away from microconversions
    • If you already get traffic and it “doesn’t convert,” think critically about how it would be possible to re-engage with those users or what they might feel comfortable providing you with at their level of interest
    • AMP pages need separate GA tracking and do not allow popups

    Tying it all together

    Let’s recap this. When an organization cannot bank on a large enough search volume in its particular niche to provide the necessary runway for growth, it needs to think creatively about how to best harmonize organic and paid search channels. Truthfully, all organizations (regardless of the size of the search volume in their niche) should do this, but it’s particularly important in low-search volume niches because without it, growth is likely to be far slower and smaller than it could be.

    For the sake of argument, we assume that the product or service doesn’t have much popularity, so we need to expand into informational queries, the topics that one would search before they know that they could use the service or product.

    We need to ensure that we quickly and properly identify the transactional queries in our niche, and build pages that fulfill the intent of the user’s query. These pages should almost always have a call-to-action that allows people to take advantage of their interest immediately.

    However, we’re looking for growth, so we need to think even bigger. We need to provide content for the people who are searching for queries that demonstrate some sort of interest in our niche, but don’t necessarily know that they want our service or product. We build out those pages, populating them with content and resources that fulfill the user’s query, but also provide calls-to-action that capture emails and/or drive users further into the funnel. People who don’t realize that they want your product or service may not react well to hard sells and high barriers to entry. Asking for an email address can be far more palatable and keep the conversation going.

    If using AMP pages to gain more visibility, make sure that you have properly set up Google Analytics first and have added in email form captures at different points within the content, not just at the end — most of your readers won’t make it there. Depending on what our strategy is, we may also want to begin cookie-ing users for remarketing.

    When using paid search, as with organic search, we need to make sure that we’re properly targeting the transactional queries we need — the ones where people are most likely to take a desired action. By using RLSAs, we can also potentially bid on more generic, short-tail queries that might have yielded low conversion rates had we exposed them to the broader Internet at large, but that could prove very successful when shown only to people who have visited our site or specific pages. In addition to possibly converting at a higher rate than a regular paid search campaign, RLSAs can serve as a great keyword discovery tool without completely decimating your budget.

    In the vast majority of cases, traffic for traffic’s sake is useless. If your traffic doesn’t undertake the actions that you want them to, chances are it will be declared useless and investment into content creation may decrease. I’ve seen it happen. Your traffic does not need to convert via buying a product or scheduling a demo the first time they visit, but if you have microconversions (like email capture) set up, you’ll put yourself in a much better position to re-engage with your visitors, find new similar visitors, and drive more conversions.

    One last nugget of wisdom from Distilled’s own Head of PPC, Rich Cotton:

    The main benefit of one agency running PPC and SEO is communication; aligning marketing messages, sharing data, keeping a consistent user experience, making lines of communication for the client easier. By ensuring that your PPC and SEO teams are working together, PPC can fill gaps in SERP exposure for organic, test new copy, and share important keyword data that PPC still has control of.

    Rather than competing, an integrated approach to attribution models allows us to share the value driven across channels and work holistically for the benefit of the client, instead of fighting to prove that one channel was more effective than the other. Your marketing dollars will go where they are most needed, not be argued over by inter-agency politics.

  • Customizing the Payment Button Styles of the Stripe Payments Plugin


  • Ranking Multiple Domains to Own More SERP Real Estate - Whiteboard Friday

    Posted by randfish

    Is it better to rank higher in a single position frequently, or to own more of the SERP real estate consistently? The answer may vary. In today’s Whiteboard Friday, Rand presents four questions you should ask to determine whether this strategy could work for you, shares some high-profile success cases, and explores the best ways to go about ranking more than one site at a time.

    Click on the whiteboard image above to open a high-resolution version in a new tab!

    Video Transcription

    Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about ranking multiple domains so you can own a bunch of the SERP real estate and whether you should do that, how you should do that, and some ways to do that.

    I’ll show you an example, because I think that will help kick us off. You’re almost certainly familiar, if you’ve played around in the world of real estate SERPs, with Zillow and Trulia. Zillow started up here in Seattle. They bought Trulia a couple of years ago and have been doing amazingly well. In fact, I was speaking at a real estate conference in New York recently, and my God, I ran an example where I searched tons of cities plus “homes for sale,” “real estate,” or “houses,” and Zillow and Trulia, along with a couple of others, were in the top five for every single city I checked, no matter how big or small. Very, very impressive SEO.

    One of the things that a lot of SEOs have seen — not just with Zillow and Trulia, but with a few others like them — is that they own multiple listings in the SERPs, and so they dominate that real estate and get more clicks as a combined entity than they would if Zillow had, for example, redirected Trulia.com to Zillow when they bought Trulia. On Whiteboard Friday, at Moz, and across the SEO world, people often recommend that when you buy another domain or combine entities, you do 301 redirect, because it can help bring up the rankings.

    The reason Zillow did not do that — and I think wisely so — is that they already dominated these SERPs so well that they figured pushing Trulia’s rankings into their own and combining the two entities would, yes, probably move them from numbers two and three to number one in some places, but they already owned number one in a ton of these, and Trulia was almost always one, two, or three. Why not own all of that? Why not own 66% of the top three consistently, rather than number one a little more frequently? I think that was probably the right move for them.

    Questions to ask

    As a result, many SEOs asked themselves, “Should I do something similar? Should I buy other domains, or should I start other domains? Should I run multiple sites and try and rank for many different keyword phrases or a few keywords that I care very, very deeply about?” The answer is, well, before you do that, before you make any call, ask yourself these four questions. The answers to them will help you determine whether you should follow in these footsteps.

    1. Do I need to dominate multiple results for a keyword or set of keywords MORE than I need better global rankings or a larger set of keywords sending visits?

    So first off, do you need to dominate multiple results for a keyword or a small set of keywords more than you need to improve your global rankings? By global rankings, I mean all the keywords that your site could potentially rank for, or does rank for now — the larger set of keywords that send visits and traffic.

    You kind of have to weigh these two things. It’s either: do I want two out of the top three results to be mine for this one keyword, or do I want these ten keywords that I’m ranking for to broadly move up in the rankings? A lot of the time, that weighing will bias you to go, “Wait a minute, no — the opportunity is not in these few keywords where I could dominate multiple positions. It’s in moving up the global rankings and improving my ability to rank for any set of keywords.”

    Even Moz today does very well in the rankings for a lot of terms around SEO. But let’s say, for example, we were purchased by Search Engine Land, or we bought Search Engine Land. Granted, the two entities rank for many, many similar keywords, but we would probably not keep them separate. We would probably combine them, because the opportunity is still greater in combination than in dominating multiple results the way Zillow and Trulia do. That’s a pretty rare circumstance.

    2. Will I cannibalize link equity opportunities with multiple sites? Can I get enough link equity & authority signals to rank both?

    Second, are you going to cannibalize link equity opportunities with multiple sites, and do you have the ability to get enough equity and authority signals to rank both domains or all three or all four or whatever it is?

    A challenge that many SEOs encounter is that building links and building up the authority to rank is actually the toughest part of the SEO equation. Keyword targeting and ranking multiple domains are nice to have, but first you’ve got to build up a site that’s got enough link equity. If it is challenging to earn links, maybe the answer is, hey, we should combine all our efforts rather than splitting them. Remember, even though Zillow owns Trulia and the two belong to one company, the links between the two sites don’t do much to help each other rank. It was already the case, before Zillow bought them, that Trulia and Zillow ranked independently. The two sites offer different experiences and some different listings and all that kind of stuff.

    There are reasons why Google treats them separately and why Zillow keeps them separate. But that’s going to be really tough to replicate. If you’re a smaller business or a smaller website starting out, and you’re trying to decide where to put your link equity efforts, it might lean you a little more toward combining.

    3. Should I use my own domain(s), should I buy an existing site that ranks, or should I do barnacle SEO?

    Number three. Should you use your own domain if you decide that you need to have multiple domains ranking for a single keyword? A good example of this case scenario is reputation management for your own brand name or for maybe someone who works at your company, some particular product that you make, whatever it is, or you’re very, very focused and you know, “Hey, this one keyword matters more than everything else that we do.”

    Okay. Now the question would be: should you use your own domain or a new domain that you buy, register, and start building up? Should you buy an existing domain — something that already ranks — or should you do barnacle SEO? Mysite2.com would be the first option: registering a new domain, building it up from scratch, growing that brand, and building all the signals you’ll need.

    You could buy a competitor that’s already ranking in the search results, that already has equity and ranking ability. Or you could say, “Hey, we see that this Quora question is doing really well. Can we answer that question tremendously well?” Or, “We see that Medium can perform tremendously well here. You know what? We can write great posts on Medium.” “We see that LinkedIn does really well in this sector. Great. We can do some publishing on LinkedIn.” Or, “There’s a list of companies on this page. We can make sure that we’re the number-one listed company on that page.” Okay. That kind of barnacle SEO, we did a Whiteboard Friday about that a few months ago, and you can check that out too.

    4. Will my multi-domain strategy cost time/money that would be better spent on boosting my primary site’s marketing? Will those efforts cause brand dilution or sacrifice potential brand equity?

    And number four, last but not least, will your multi-site domain strategy cost you time and money that would be better spent on boosting your primary site’s marketing efforts? It is the case that you’re going to sacrifice something if you’re putting effort into a different website versus putting all your marketing efforts into one domain.

    Now, one reason people certainly do this is that they’re trying riskier tactics with the second site. Another is that they’ve already dominated the rankings as much as they want, or they’re trying to build up multiple properties so they can sell one off — they’re already very, very good at link building in this space and at growing equity and those sorts of things.

    But the other question you have to ask is: Will this cause brand dilution? Or is it going to sacrifice potential brand equity? One of the things that we’ve observed in the SEO world is that rankings alone do not make for revenue. It is absolutely the case that people are choosing which domains to click on and which domains to buy from and convert on based on the brand and their brand familiarity. When you’re building up a second site, you’ve got to be building up a second brand. So that’s an additional cost and effort.

    Now, I don’t want to rain on the entire parade here. Like we’ve said in a few of these, there are reasons why you might want to consider multiple domains and reasons why a multi-domain strategy can be effective for some folks. It’s just that I think it might be a little less often and should be undertaken with more care and attention to detail and to all these questions than what some folks might be doing when they buy a bunch of domains and hope that they can just dominate the top 10 right out of the gate.

    All right, everyone, look forward to your thoughts on multi-domain strategies, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

    Video transcription by Speechpad.com


  • Giving Away the Farm: Proposal Development for New SEO Agencies

    Posted by BrianChilds

    There’s a huge difference between making money from selling SEO and actually making a living — or making a difference, for that matter. A new marketing agency will quickly discover that surviving on $1,000 contracts is challenging. It takes time to learn the client and their customers, and poorly written contracts can lead to scope creep and dissatisfied clients.

    It’s common for agencies to look for ways to streamline operations to assist with scaling their business, but one area you don’t want to streamline is the proposal research process. I actually suggest going in the opposite direction: create proposals that give away the farm.

    Details matter, both to you and your prospective client

    I know what you’re thinking: Wait a minute! I don’t want to do a bunch of work for free!

    I too am really sensitive to the idea that a prospective client may attempt to be exploitative, but I think it’s a risk worth taking. Outlining the exact scope of services forces you to do in-depth research on your prospect’s website and business in order to describe in detail what you’re going to deliver. Finding tools and processes to scale the research process is great, but don’t skip it. Detailing your findings builds trust, establishes your team as a high-quality service provider, and will likely make you stand out amongst a landscape of standard-language proposals.

    Be exceptional. Here’s why I think this is particularly important for the proposal development process.

    Avoid scope creep & unrealistic expectations

    Just like the entrepreneur that doesn’t want to tell anyone their amazing idea without first obtaining an NDA, new SEO agencies may be inclined to obscure their deliverables in standard proposal language out of fear that their prospect will take their analysis and run. Generic proposal language is sometimes also used to reduce the time and effort involved in getting the contract out the door.

    This may result in two unintended outcomes:

    1. Lack of specific deliverables can lead to contract scope creep.
    2. Generic language can make you lazy, and you may end up walking into a minefield.

    Companies that are willing to invest larger sums of money in SEO tend to have higher expectations, and this cuts both ways. Putting in the work to craft a detailed proposal not only shows that you actually care about their business, but it also helps manage the contract’s inevitable growth when you’re successful.

    Misalignment of goals or timelines can sour a relationship quickly. Churn in your contracts is inevitable, but it’s much easier to increase your annual revenue by retaining a client for a few more months than trying to go out and find a replacement. Monetizing your work effectively and setting expectations is an excellent way to make sure the relationship is built on firm ground.

    Trust is key

    Trust is foundational to SEO: building trustworthy sites, creating valuable and trustworthy content, becoming a trusted resource for your community that’s worth linking to. Google rewards this kind of intent.

    Trust is an ethos; as an SEO, you’re a trust champion. You can build trust with a prospect by being transparent and providing overwhelming value in your proposal. Tell your clients exactly what they need to do based on what you discover in your research.

    This approach also greases the skids a little when approaching the prospect for the first time. Imagine the difference between a first touch with your prospect when you request a chance to discuss research you’ve compiled, versus a call to simply talk about general SEO value. By developing an approach that feels less like a sales process, you can navigate around the psychological tripwires that make people put up barriers or question your trustworthiness.

    This is also referred to as “consultative sales.” Some best practices that business owners typically respond well to are:

    • Competitive research. A common question businesses will ask about SEO relates to keywords: What are my competitors ranking for? What keywords have they optimized their homepage for? One thing I like to do is plug the industry leader’s website into Open Site Explorer and show what content is generating the most links. Exporting the Top Pages report from OSE makes for a great leave-behind.
    • Top questions people are asking. Research forum questions that relate to the industry or products your prospect sells. When people ask questions on Yahoo Answers or Quora, they’re often doing so because they can’t find a good answer using search. A couple of screenshots can spark a discussion around how your prospective client’s site can add value to those online discussions.

    Yes, by creating a more detailed proposal you do run the risk that your target company will walk away with the analysis. But if you suspect that the company is untrustworthy, then I’d advise walking away before even building the analysis in the first place; just try getting paid on time from an untrustworthy company.

    Insights can be worth more

    By creating a very transparent, “give away the farm”-type document, SEOs empower themselves to have important discussions prior to signing a contract. Things like:

    • What are the business goals this company wants to focus on?
    • Who are the people they want to attract?
    • What products or pages are they focused on?

    You’ll have to understand at least this much to set up appropriate targeting, so all the better to document this stuff beforehand. And remember, having these conversations is also an investment in your prospect’s time — and there’s some psychology around getting your target company to invest in you. It’s called “advancement” of the sale. By getting your prospect to agree to a small, clearly defined commitment, it pulls them further down the sales funnel.

    In the case of research, you may choose to ask the client for permission to conduct further research and report on it at a specified time in the future. You can use this as an opportunity to anchor a price for what that research would cost, which frames the scope of service prices later on.

    By giving away the farm, you’ll start off the relationship as a trusted advisor. And even if you don’t get the job to do the SEO work itself, it’s possible you can develop a retainer where you help your prospect manage digital marketing generally.

    Prepping the farm for sale

    It goes without saying, but making money from SEO requires having the right tools for the job. If you’re brand-new to the craft, I suggest practicing by auditing a small site. (Try using the site audit template we provide in the site audit bootcamp.) Get comfortable with the tools, imagine what you would prioritize, and maybe even do some free work for a site to test out how long it takes to complete relatively small tasks.

    Imagine you were going to approach that website and suggest changes. Ask yourself:

    • Who are they selling to?
    • What keywords and resources does this target user value?
    • What changes would you make that would improve search rank position for those terms?
    • What would you do first?
    • How long would it take? (In real human time, not starving-artist-who-never-sleeps time.)

    Some of the tools that I find most helpful are:

    • Moz Pro Campaigns > Custom Reports. This is an easy one. Create a Moz Pro campaign (campaigns are projects that analyze the SEO performance of a website over time) and then select “Custom Reports” in the top-right of the Campaign interface. Select the modules you want to include — site crawl and keyword rankings against potential competitors are good ones — and then offer to send this report to your prospect for free. It’s a lot harder for a customer to turn something off than it is to turn something on. Give away a custom report and then set up time to talk through the results on a weekly basis.
    • Builtwith.com. This free service allows you to investigate a number of attributes related to a website, including the marketing software installed. Much as I would with a WHOIS search, I use this to understand whether the prospect is overloaded with software or completely lacks any marketing automation. This can be helpful for suggesting tools that will improve their insights immediately. Who better to help them implement those tools or provide a discount than you?
    • Keyword Explorer > Lists. Create a list in Keyword Explorer and look for the prevalence of SERP features. This can tell you a lot about what kinds of content are valuable to their potential visitor. Do images show up a lot? What about videos? These could be opportunities for your customer.
    • MozBar. Use the Page Analysis tab in MozBar to assess some of the website’s most important pages. Check page load speed in the General Attributes section. Also see if they have enticing titles and descriptions.
    • Site crawl. If you don’t have Moz Pro, I recommend downloading Screaming Frog. It can crawl up to 500 pages on a site for free and then allow you to export the results into a .csv file. Look for anything that could be blocking traffic to the site or reducing the chance that pages are getting indexed, such as 4XX series errors or an overly complex robots.txt file. Remedying these can be quick wins that provide a lot of value. If you start a Moz Pro campaign, you can see how these issues are reduced over time. (A sketch of that triage follows below.)
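
    To make that triage concrete, here is a short sketch that flags error pages in a Screaming Frog export. The column headers (“Address,” “Status Code”) match recent Screaming Frog internal exports, but treat them as assumptions and confirm against your own .csv:

        import csv
        from collections import Counter

        errors = []
        status_counts = Counter()

        # Screaming Frog's "Internal" export, saved as internal_all.csv.
        with open("internal_all.csv", newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                code = (row.get("Status Code") or "").strip()
                status_counts[code] += 1
                if code.startswith(("4", "5")):  # 4XX client and 5XX server errors
                    errors.append((code, row.get("Address", "")))

        print("Status code distribution:", dict(status_counts))
        print(f"{len(errors)} pages returning errors:")
        for code, url in errors[:20]:
            print(code, url)

    Each flagged URL is a candidate quick win you can put in front of the prospect.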

    Want to learn how to add SEO to your existing portfolio of marketing services?

    Starting on April 4th, 2017, Moz is offering a 3-day training seminar on How to Add SEO to Your Agency. This class will be every Tuesday for 3 weeks and will cover some of the essentials for successfully bringing SEO into your portfolio.

    Sign up for the seminar!


  • Your Daily SEO Fix: Link Building & Ranking Zero

    Posted by FeliciaCrawford

    Last week, we shared a series of four short videos intended to help you accomplish some easy wins using Moz Pro: Your Daily SEO Fix: The Keyword Research Edition. Week Two (that’s this week!) is focused on link building, identifying opportunities to take over SERP features, and doing that all-important competitive research.

    This time around, we’re using a mix of Open Site Explorer, Fresh Web Explorer, and Moz Pro. Open Site Explorer has some free capabilities, so if you’d like to follow along…

    Open OSE in a new tab!

    If you’re a Moz Pro subscriber, crack open your campaigns and settle in. If you’d like to see what all the fuss is about without committing, you can dip your toes in with a free 30-day trial. And now that that’s out of the way, let’s get started!


    Fix #1: Link building & brand building via unlinked mentions

    “Moz” is an SEO software company, yes, but it’s also Morrissey’s nickname and short for “Mozambique.” All three of those things get mentioned around the web a bunch on any given day, so identifying link building opportunities for our site alone could get confusing quickly. Luckily, Jordan’s here to explain how to quickly find unlinked mentions of your site or brand using Open Site Explorer and keep those pesky Smiths references out of your results.


    Fix #2: Prioritizing and organizing your link building efforts

    Link building requires more than just finding opportunities, of course. April shows how you can prioritize your efforts by identifying the most valuable linking opportunities in Open Site Explorer, then dives into how you can cultivate a continuous stream of fresh related content ripe for a link-back with Fresh Web Explorer.


    Fix #3: Ranking in position zero with SERP features in Moz Pro

    If you have keywords that aren’t ranking in the first few results pages, don’t despair — there’s hope yet. There are tons of opportunities to rank above the first organic result with the prevalence of SERP features. In this video, Ellie shows how you can identify keywords that need some love, track SERP feature opportunities for them, filter your keywords to show only those that surface certain SERP features, and more.


    Fix #4: Gleaning insights from your competitors’ backlink profiles

    Remember April from Fix #2? She’s back and ready to show you how to get the skinny on your competitors’ juicy backlink profiles using both your Moz Pro campaign and Open Site Explorer.


    One step beyond

    That wraps up our latest week of fixes! We’ve got one last round coming at you next Thursday. As always, if you’re curious and want to follow along, you can try it all out firsthand by taking a free trial of Moz Pro. We also offer several SEO bootcamp courses that can get you started on fundamentals if this whole SEO thing is pretty new to you.

    If you’re looking for some more meaty info on these topics, I’ve put together a short list of light reading for you:

    Thanks for reading along, friends, and we’ll see you again for the last installment of the Daily SEO Fix series next week!


  • The State of Searcher Behavior Revealed Through 23 Remarkable Statistics

    Posted by randfish

    One of the marketing world’s greatest frustrations has long been the lack of data from Google and other search engines about the behavior of users on their platforms. Occasionally, Google will divulge a nugget of bland, hard-to-interpret information about how they process more than X billion queries, or how many videos were uploaded to YouTube, or how many people have found travel information on Google in the last year. But these numbers aren’t specific enough, aren’t well-sourced, and don’t provide enough detail to be truly useful for all the applications we have.

    Marketers need to know things like: How many searches happen each month across various platforms? Is Google losing market share to Amazon? Are people really starting more searches on YouTube than Bing? Is Google Images more or less popular than Google News? What percent of queries are phrased as questions? How many words are in the average query? Is it more or less on mobile?

    These kinds of specifics help us know where to put our efforts, how to sell our managers, teams, and clients on SEO investments, and, when we have this data over time, we can truly understand how this industry that shapes our livelihoods is changing. Until now, this data has been somewhere between hard and impossible to estimate. But, thanks to clickstream data providers like Jumpshot (which helps power Moz’s Keyword Explorer and many of our keyword-based metrics in Pro), we can get around Google’s secrecy and see the data for ourselves!

    Over the last 6 months, Russ Jones and I have been working with Jumpshot’s Randy Antin, who’s been absolutely amazing — answering our questions late at night, digging in with his team to get the numbers, and patiently waiting while Russ runs fancy T-Distributions on large datasets to make sure our estimates are as accurate as possible. If you need clickstream data of any kind, I can’t recommend them enough.

    If you’re wondering, “Wait… I think I know what clickstream data is, but you should probably tell me, Rand, just so I know that you know,” OK. :-) Clickstream monitoring means Jumpshot (and other companies like them — SimilarWeb, Clickstre.am, etc.) have software on the device that records all the pages visited in a browser session. They anonymize and aggregate this data (don’t worry, your searches and visits are not tied to you or to your device), then make parts of it available for research or use in products or through APIs. They’re not crawling Google or any other sites, but rather seeing the precise behavior of devices as people use them to surf or search the Internet.

    Clickstream data is awesomely powerful, but when it comes to estimating searcher behavior, we need scale. Thankfully, Jumpshot can deliver here, too. Their US panel of Internet users is in the millions (they don’t disclose the exact size, but it’s between 2 and 10 million), so we can trust these numbers to reliably paint a representative picture. That said, there may still be biases in the data — it could be that certain demographics of Internet users are more or less likely to be in Jumpshot’s panel, their mobile data is limited to Android (no iOS), and we know that some alternative kinds of searches aren’t captured by their methodology**. Still, there’s amazing stuff here, and it’s vastly more than we’ve been able to get any other way, so let’s dive in.

    23 Search Behavior Stats

    Methodology: All of the data was collected from Jumpshot’s multi-million user panel in October 2016. T-distribution scaling was applied to validate the estimates of overall searches across platforms. All other data is expressed as percentages. Jumpshot’s panel includes mobile and desktop devices in similar proportions, though no devices are iOS, so users on Macs, iPhones, and iPads are not included.

    #1: How many searches are *really* performed on Google.com each month?

    On the devices and types of queries Jumpshot can analyze, there were an average of 3.4 searches/day/searcher. Using the T-Distribution scaling analysis on various sample set sizes of Jumpshot’s data, Russ estimated that the most likely reality is that between 40–60 billion searches happen on Google.com in the US each month.

    Here’s more detail from Russ himself:

    “…All of the graphs are non-linear in shape, which indicates that as the samples get bigger we are approaching correct numbers but not in a simple % relationship… I have given 3 variations based on the estimated number of searches you think happen in the US annually. I have seen wildly different estimates from 20 billion to 100 billion, so I gave a couple of options. My gut is to go with the 40 billion numbers, especially since once we reach the 100MM line for 40 and 60B, there is little to no increase for 1 billion keywords, which would indicate we have reached a point where each new keyword is searched just 1 time.”

    How does that compare to numbers Google’s given? Well, in May of 2016, Google told Search Engine Land they “processed at least 2 trillion searches per year.” Using our Jumpshot-based estimates, and assuming October of 2016 was a reasonably average month for search demand, we’d get to 480–720 billion annual searches. That’s less than half of what Google claims, but Google’s number is WORLDWIDE! Jumpshot’s data here is only for the US. This suggests that, as Danny Sullivan pointed out in the SELand article, Google could well be handling much, much more than 2 trillion annual searches.
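
    The annualization is simple enough to sanity-check in a few lines; this just reproduces the arithmetic above against Google’s stated worldwide figure:

        # Reproducing the back-of-envelope math above.
        monthly_low, monthly_high = 40e9, 60e9        # estimated US searches/month
        annual_low, annual_high = monthly_low * 12, monthly_high * 12
        print(f"US annual estimate: {annual_low / 1e9:.0f}-{annual_high / 1e9:.0f} billion")

        google_worldwide = 2e12                       # "at least 2 trillion"/year, per Google
        print(f"Implied US share of worldwide searches: "
              f"{annual_low / google_worldwide:.0%}-{annual_high / google_worldwide:.0%}")

    A 480–720 billion US figure against a 2-trillion worldwide total implies a 24–36% US share; if the US actually accounts for less than that, the worldwide total must be well above 2 trillion.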

    Note that we believe our 40–60 billion/month number is actually too low. Why? Voice searches, searches in the Google app and Google Home, higher search use on iOS (all four of which Jumpshot can’t measure), October could be a lower-than-average month, some kinds of search partnerships, and automated searches that aren’t coming from human beings on their devices could all mean our numbers are undercounting Google’s actual US search traffic. In the future, we’ll be able to measure interesting things like growth or shrinkage of search demand as we compare October 2016 vs other months.

    #2: How long is the average Google search session?

    From the time of the initial query to the loading of the search results page and the selection of any results, plus any back button clicks to those SERPs and selection of new results, the all-in average was just under 1 minute. If that seems long, remember that some search sessions may be upwards of an hour (like when I research all the best ryokans in Japan before planning a trip — I probably clicked 7 pages deep into the SERPs and opened 30 or more individual pages). Those long sessions are dragging up that average.

    #3: What percent of users perform one or more searches on a given day?

    This one blew my mind! Of the millions of active, US web users Jumpshot monitored in October 2016, only 15% performed at least one search in a given day. 45% performed at least one query in a week, and 68% performed one or more queries that month. To me, that says there’s still a massive amount of search growth opportunity for Google. If they can make people more addicted to and more reliant on search, as well as shape the flow of information and the needs of people toward search engines, they are likely to have a lot more room to expand searches/searcher.

    #4: What percent of Google searches result in a click?

    Google is answering a lot of queries themselves. From searches like “Seattle Weather,” to more complicated ones like “books by Kurt Vonnegut” or “how to remove raspberry stains?”, Google is trying to save you that click — and it looks like they’re succeeding.

    66% of distinct search queries resulted in one or more clicks on Google’s results. That means 34% of searches get no clicks at all. If we look at all search queries (not just distinct ones), those numbers shift to a straight 60%/40% split. I wouldn’t be surprised to find that over time, we get closer and closer to Google solving half of search queries without a click. BTW — this is the all-in average, but I’ve broken down clicks vs. no-clicks on mobile vs. desktop in #19 below.

    #5: What percent of clicks on Google search results go to AdWords/paid listings?

    It’s less than I thought, but perhaps not surprising given how aggressive Google’s had to be with ad subtlety over the last few years. Of distinct search queries in Google, only 3.4% resulted in a click on an AdWords (paid) ad. If we expand that to all search queries, the number drops to 2.6%. Google’s making a massive amount of money on a small fraction of the searches that come into their engine. No wonder they need to get creative (or, perhaps more accurately, sneaky) with hiding the ad indicator in the SERPs.

    #6: What percent of clicks on Google search results go to Maps/local listings?

    This is not measuring searches and clicks that start directly from maps.google.com or from the Google Maps app on a mobile device. We’re talking here only about Google.com searches that result in a click on Google Maps. That number is 0.9% of Google search clicks, just under 1 in 100. We know from MozCast that local packs show up in ~15% of queries (though that may be biased by MozCast’s keyword corpus).

    #7: What percent of clicks on Google search results go to links in the Knowledge Graph?

    Knowledge panels are hugely popular in Google’s results — they show up in ~38% of MozCast’s dataset. But they’re not nearly as popular for search click activity, earning only ~0.5% of clicks.

    I’m not totally surprised by that. Knowledge panels are, IMO, more about providing quick answers and details to searchers than they are about drawing the click themselves. If you see Knowledge Panels in your SERPs, don’t panic too much that they’re taking away your CTR opportunity. This made me realize that Keyword Explorer is probably overestimating the degree to which Knowledge Panels remove organic CTR (e.g. Alice Springs, which has only a Knowledge Panel next to 10 blue links, has a CTR opportunity of 64).

    #8: What percent of clicks on Google search results go to image blocks?

    Images are one of the big shockers of this report overall (more on that later). While MozCast has image blocks in ~11% of Google results, Jumpshot’s data shows images earn 3% of all Google search clicks.

    I think this happens because people are naturally drawn to images and because Google uses click data to specifically show images that earn the most engagement. If you’re wondering why your perfectly optimized image isn’t ranking as well in Google Images as you hoped, we’ve got strong suspicions and some case studies suggesting it might be because your visual doesn’t draw the eye and the click the way others do.

    If Google only shows compelling images and only shows the image block in search results when they know there’s high demand for images (i.e. people search the web, then click the “image” tab at the top), then little wonder images earn strong clicks in Google’s results.

    #9: What percent of clicks on Google search results go to News/Top Stories results?

    Gah! We don’t know for now. This one was frustrating and couldn’t be gathered due to Google’s untimely switch from “News Results” to “Top Stories,” which happened partway through the data collection period. We hope to have this in the summer, when we’ll be collecting and comparing results again.

    #10: What percent of clicks on Google search results go to Twitter block results?

    I was expecting this one to be relatively small, and it is, though it slightly exceeded my expectations. MozCast has tweet blocks showing in ~7% of SERPs, and Jumpshot shows those tweets earning ~0.23% of all clicks.

    My guess is that the tweets do very well for a small set of search queries, and tend to be shown less (or shown lower in the results) over time if they don’t draw the click. As an example, search results for my name show the tweet block between organic position #1 and #2 (either my tweets are exciting or the rest of my results aren’t). Compare that to David Mihm, who tweeted only rarely for a long while and has only recently become more active — his tweets sit between positions #4 and #5. Or contrast with Dr. Pete, whose tweets are above the #1 spot!

    #11: What percent of clicks on Google search results go to YouTube?

    Technically, there are rare occasions when a video from another provider (usually Vimeo) can appear in Google’s SERPs directly. But more than 99% of videos in Google come from YouTube (which violates anti-competitive laws IMO, but since Google pays off so many elected representatives, it’s likely not an issue for them). Thus, we chose to study only YouTube rather than all video results.

    MozCast shows videos in 6.3% of results, just below tweets. In Jumpshot’s data, YouTube’s engagement massively over-performed its raw visibility, drawing 1.8% of all search clicks. Clearly, for those searches with video intent behind them, YouTube is delivering well.

    #12: What percent of clicks on Google search results go to personalized Gmail/Google Mail results?

    I had no guess at all on this one, and it’s rarely discussed in the SEO world because it’s relatively difficult to influence and fairly obscure. We don’t have tracking data via MozCast because these only show in personalized results for folks logged in to their Gmail accounts when searching, and Google chooses to only show them for certain kinds of queries.

    Jumpshot, however, thanks to clickstream tracking, can see that 0.16% of search clicks go to Gmail or Google Mail following a query, only a little under the number of clicks to tweets.

    #13: What percent of clicks on Google search results go to Google Shopping results?

    The Google Shopping ads have become pretty compelling — the visuals are solid, the advertisers are clearly spending lots of effort on CTR optimization, and the results, not surprisingly, reflect this.

    MozCast has Shopping results in 9% of queries, while clickstream data shows those results earning 0.55% of all search clicks.

    #14: What percent of Google searches result in a click on a Google property?

    Google has earned a reputation over the last few years of taking an immense amount of search traffic for themselves — from YouTube to Google Maps to Gmail to Google Books and the Google App Store on mobile, and even Google+, there’s a strong case to be made that Google’s eating into opportunity for 3rd parties with bets of their own that don’t have to play by the rules.

    Honestly, I’d have estimated this in the 20–30 percent range, so it surprised me to see that, from Jumpshot’s data, all Google properties earned only 11.8% of clicks from distinct searches (only 8.4% across all searches). That’s still significant, of course, and certainly bigger than it was 5 years ago, but given that we know Google’s search volume has more than doubled in the last 5 years, we have to be intellectually honest and say that there’s vastly more opportunity in the crowded-with-Google’s-own-properties results today than there was in the cleaner-but-lower-demand SERPs of 5 years ago.

    #15: What percent of all searches happen on any major search property in the US?

    I asked Jumpshot to compare 10 distinct web properties, add together all the searches they receive combined, and share the percent distribution. The results are FASCINATING!

    Here they are in order:

    1. Google.com 59.30%
    2. Google Images 26.79%
    3. YouTube.com 3.71%
    4. Yahoo! 2.47%
    5. Bing 2.25%
    6. Google Maps 2.09%
    7. Amazon.com 1.85%
    8. Facebook.com 0.69%
    9. DuckDuckGo 0.56%
    10. Google News 0.28%

    I’ve also created a pie chart to help illustrate the breakdown:

    If the Google Images data shocks you, you’re not alone. I was blown away by the popularity of image search. Part of me wonders if Halloween could be responsible. We should know more when we re-collect and re-analyze this data for the summer.

    Image search wasn’t the only surprise, though. Bing and Yahoo! combine for not even 1/10th of Google.com’s search volume. DuckDuckGo, despite their tiny footprint compared to Facebook, have almost as many searches as the social media giant. Amazon has almost as many searches as Bing. And YouTube.com’s searches are nearly twice the size of Bing’s (on web browsers only — remember that Jumpshot won’t capture searches in the YouTube app on mobile, tablet, or TV devices).

    For the future, I also want to look at data for Google Shopping, MSN, Pinterest, Twitter, LinkedIn, Gmail, Yandex, Baidu, and Reddit. My suspicion is that none of those have as many searches as those above, but I’d love to be surprised.

    BTW — if you’re questioning this data compared to Comscore or Nielsen, I’d just point out that Jumpshot’s panel is vastly larger, and their methodology is much cleaner and more accurate, too (at least, IMO). They don’t do things like group site searches on Microsoft-owned properties into Bing’s search share or try to statistically sample and merge methodologies, and whereas Comscore has a *global* panel of 2 million, Jumpshot’s *US-only* panel of devices is considerably larger.

    #16: What’s the distribution of search demand across keywords?

    Let’s go back to looking only at keyword searches on Google. Based on October’s searches, the top 1MM queries account for about 25% of all searches, the top 10MM queries account for about 45%, and the top 1BB queries account for close to 90%. Jumpshot’s kindly illustrated this for us:

    The long tail is still very long indeed, with a huge amount of search volume taking place in keywords outside the top 10 million most-searched-for queries. In fact, almost 25% of all search volume happens outside the top 100 million keywords!

    I illustrated this last summer with data from Russ’ analysis based on Clickstre.am data, and it matches up fairly well (though not exactly; Jumpshot’s panel is far larger).

    #17: How many words does the average desktop vs. mobile searcher use in their queries?

    According to Jumpshot, a typical searcher uses about 3 words in their search query. Desktop users have a slightly higher query length due to having a slightly higher share of queries of 6 words or more than mobile (16% for desktop vs. 14% for mobile).

    I was actually surprised to see how close desktop and mobile are. Clearly, there’s not as much separation in query formation as some folks in our space have estimated (myself included).

    #18: What percent of queries are phrased as questions?

    For this data, Jumpshot used any queries that started with the typical “Who,” “What,” “Where,” “When,” “Why,” and “How,” as well as “Am” (e.g. Am I registered to vote?) and “Is” (e.g. Is it going to rain tomorrow?). The data showed that ~8% of search queries are phrased as questions.
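
    That detection rule is mechanical enough to express in code. A sketch using exactly the prefix list described above (the sample queries are made up):

        # The question-detection rule described above, as code.
        QUESTION_WORDS = ("who", "what", "where", "when", "why", "how", "am", "is")

        def is_question(query: str) -> bool:
            words = query.strip().lower().split()
            return bool(words) and words[0] in QUESTION_WORDS

        queries = ["am i registered to vote", "seattle weather",
                   "how to remove raspberry stains"]
        share = sum(is_question(q) for q in queries) / len(queries)
        print(f"{share:.0%} of this toy sample are questions")  # Jumpshot found ~8% overall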

    #19: What is the difference in paid vs. organic CTR on mobile compared to desktop?

    This is one of those data points I’ve been longing for over many years. We’ve always suspected CTR on mobile is lower than on desktop, and now it’s confirmed.

    For mobile devices, 40.9% of Google searches result in an organic click, 2% in a paid click, and 57.1% in no click at all. For desktop devices, 62.2% of Google searches result in an organic click, 2.8% in a paid click, and 35% in no click. That’s a pretty big delta, and one that illustrates how much more opportunity there still is in SEO vs. PPC. SEO has ~20X more traffic opportunity than PPC on both mobile and desktop. If you’ve been arguing that mobile has killed SEO or that SERP features have killed SEO or, really, that anything at all has killed SEO, you should probably change that tune.
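
    The ~20X figure is just the ratio of the organic click share to the paid click share, which is quick to verify:

        # Organic vs. paid click shares from the Jumpshot data above (percent).
        mobile_organic, mobile_paid = 40.9, 2.0
        desktop_organic, desktop_paid = 62.2, 2.8

        print(f"Mobile:  organic/paid = {mobile_organic / mobile_paid:.1f}x")    # roughly 20x
        print(f"Desktop: organic/paid = {desktop_organic / desktop_paid:.1f}x")  # roughly 22x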

    #20: What percent of queries on Google result in the searcher changing their search terms without clicking any results?

    You search. You don’t find what you’re seeking. So, you change your search terms, or maybe you click on one of Google’s “Searches related to…” at the bottom of the page.

    I’ve long wondered how often this pattern occurs, and what percent of search queries lead not to an answer, but to another search altogether. The answer is shockingly big: a full 18% of searches lead to a change in the search query!

    No wonder Google has made related searches and “people also ask” such a big part of the search results in recent years.

    #21: What percent of Google queries lead to more than one click on the results?

    Some of us use ctrl+click to open up multiple tabs when searching. Others click one result, then click back and click another. Taken together, all the search behaviors that result in more than one click following a single search query in a session combine for 21% of searches.

    #22: What percent of Google queries result in pogo-sticking (i.e. the searcher clicks a result, then bounces back to the search results page and chooses a different result)?

    As SEOs, we know pogo-sticking is a bad thing for our sites, and that Google is likely using this data to reward pages that don’t get many pogo-stickers and nudge down those who do. Altogether, Jumpshot’s October data saw 8% of searches that followed this pattern of search > click > back to search > click a different result.
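
    For the curious, here is a toy sketch of how that pattern could be spotted in a simplified session log. The event format is invented for illustration; real clickstream data is far messier:

        # Spotting the pogo-stick pattern: search > click > back to the same
        # SERP > click a different result.
        def is_pogo_stick(events) -> bool:
            current_query = None
            first_click = None   # result clicked before the user came back
            returned = False     # has the user revisited the same SERP since clicking?
            for kind, value in events:
                if kind == "search":
                    if value == current_query and first_click is not None:
                        returned = True
                    else:  # a new query resets the pattern
                        current_query, first_click, returned = value, None, False
                elif kind == "click":
                    if returned and value != first_click:
                        return True  # chose a different result after returning
                    if first_click is None:
                        first_click = value
            return False

        session = [("search", "best ryokans japan"), ("click", "site-a.example"),
                   ("search", "best ryokans japan"), ("click", "site-b.example")]
        print(is_pogo_stick(session))  # True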

    Over time, if Google’s successful at their mission of successfully satisfying more searchers, we’d expect this to go down. We’ll watch that the next time we collect results and see what happens.

    #23: What percent of clicks on non-Google properties in the search results go to a domain in the top 100?

    Many of us in the search and web marketing world have been worried about whether search and SEO are becoming “winner-take-all” markets. Thus, we asked Jumpshot to look at the distribution of clicks to the 100 domains that received the most Google search traffic (excluding Google itself) vs. those outside the top 100.

    The results are somewhat relieving: 12.6% of all Google clicks go to the top 100 search-traffic-receiving domains. The other 87.4% are to sites in the chunky middle and long tail of the search-traffic curve.


    Phew! That’s an immense load of powerful data, and over time, as we measure and report on this with our Jumpshot partners, we’re looking forward to sharing trends and additional numbers, too.

    If you’ve got a question about searcher behavior or search/click patterns, please feel free to leave it in the comments. I’ll work with Russ and Randy to prioritize those requests and make the data available. It’s my goal to have updated numbers to share at this year’s MozCon in July.


    ** The following questions and responses from Jumpshot can illustrate some of the data and methodology’s limitations:

    Rand: What search sources, if any, might be missed by Jumpshot’s methodology?
    Jumpshot: We only looked at Google.com, except for the one question that asked specifically about Amazon, YouTube, DuckDuckGo, etc.

    Rand: Do you, for example, capture searches performed in all Google apps (maps, search app, Google phone native queries that go to the web, etc)?
    Jumpshot: Nothing in-app, but anything that opens a mobile browser — yes.

    Rand: Do you capture all voice searches?
    Jumpshot: If it triggers a web browser either on desktop or on mobile, then yes.

    Rand: Is Google Home included?
    Jumpshot: No.

    Rand: Are searches on incognito windows included?
    Jumpshot: Yes, it should be — since the plug-in is at the device level, we track any URL regardless.

    Rand: Would searches in certain types of browsers (desktop or mobile) not get counted?
    Jumpshot: From a browser perspective, no. But remember we have no iOS data so any browser being used on that platform will not be recorded.


  • Google Algorithmic Penalties Still Happen, Post-Penguin 4.0

    Posted by MichaelC-15022

    When Penguin 4.0 launched in September 2016, the story from Gary Illyes of Google was that Penguin now just devalued spammy links, rather than penalizing a site by adjusting the site’s ranking downward, AKA a penalty.

    Apparently for Penguin there is now “less need” for a disavow, according to a Facebook discussion between Gary Illyes and Barry Schwartz of Search Engine Land back in September. He suggested that webmasters can help Google find spammy sites by disavowing links they know are bad. He also mentioned that manual actions still happen — and so I think we can safely infer that the disavow file is still useful in manual penalty recovery.

    But algorithmic penalties DO still exist. A client of mine, who’d in the past built a lot of really spammy links to one of their sites, had me take a look at their backlinks about 10 days ago and build a disavow file. There was no manual penalty indicated in Search Console, but they didn’t rank at all for terms they were targeting — and they had a plenty strong backlink profile even after ignoring the spammy links.
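
    For anyone who hasn’t built one: a disavow file is just a plain-text list of “domain:” lines and individual URLs, with “#” comments allowed. Here is a minimal sketch of generating one; the domains and URL are invented placeholders, not the client’s actual links:

        # A minimal sketch of writing a disavow file in the format Google
        # Search Console accepts. These domains/URLs are invented placeholders.
        spammy_domains = ["spammy-link-network.example", "paid-directory.example"]
        spammy_urls = ["http://blog.example/low-quality-guest-post.html"]

        lines = ["# Links identified as spammy during the backlink audit"]
        lines += [f"domain:{d}" for d in spammy_domains]  # disavow a whole domain
        lines += spammy_urls                              # disavow a single URL

        with open("disavow.txt", "w") as f:
            f.write("\n".join(lines) + "\n")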

    I submitted the disavow file on March 2nd, 2017. Here’s the picture of what happened to their traffic:

    4 days after the disavow file submission, their traffic went from just a couple hundred visits/day from Google search to nearly 3,000.

    Penguin might no longer be handing out penalties, but clearly there are still algorithmic penalties handed out by Google. And clearly, the disavow file still works on these algorithmic penalties.

    Perhaps we just need to give them another animal name. (Personally, I like the Okapi… goes along with the black-and-white animal theme, and, like Google algorithmic penalties, hardly anyone knows they still exist.)

    Image courtesy Chester Zoo on Flickr.

    I look forward to animated comments from other SEOs and webmasters who might have been suspecting the same thing!


  • Rankings Correlation Study: Domain Authority vs. Branded Search Volume

    Posted by Tom.Capper

    A little over two weeks ago I had the pleasure of speaking at SearchLove San Diego. My presentation, Does Google Still Need Links, looked at the available evidence on how and to what extent Google is using links as a ranking factor in 2017, including the piece of research that I’m sharing here today.

    One of the main points of my presentation was to argue that while links still do represent a useful source of information for Google’s ranking algorithm, Google now has many other sources, most of which they would never have dreamed of back when PageRank was conceived as a proxy for the popularity and authority of websites nearly 20 years ago.

    Branded search volume is one such source of information, and one of the sources that is most accessible for us mere mortals, so I decided to take a deeper look at how it compared with a link-based metric. It also gives us some interesting insight into the KPIs we should be pursuing in our off-site marketing efforts — because brand awareness and link building are often conflicting goals.

    For clarity, by branded search volume, I mean the monthly regional search volume for the brand of a ranking site. For example, for the page https://www.walmart.com/cp/Gift-Cards/96894, this would be the US monthly search volume for the term “walmart” (as given by Google Keyword Planner). I’ve written more about how I put together this dataset and dealt with edge cases below.

    When picking my link-based metric for comparison, domain authority seemed a natural choice — it’s domain-level, which ought to be fair given that generally that’s the level of precision with which we can measure branded search volume, and it came out top in Moz’s study of domain-level link-based factors.

    A note on correlation studies

    Before I go any further, here’s a word of warning on correlation studies, including this one: They can easily miss the forest for the trees.

    For example, the fact that domain authority (or branded search volume, or anything else) is positively correlated with rankings could indicate that any or all of the following are likely:

    • Links cause sites to rank well
    • Ranking well causes sites to get links
    • Some third factor (e.g. reputation or age of site) causes sites to get both links and rankings

    That’s not to say that correlation studies are useless — but we should use them to inform our understanding and prompt further investigation, not as the last word on what is and isn’t a ranking factor.

    Methodology

    (Or skip straight to the results!)

    The Moz study referenced above used the 800 sample keywords that Google Keyword Planner provides for each of its 22 top-level categories, then looked at the top 50 results for each of these. After de-duplication, this results in 16,521 queries. Moz looked at only web results (no images, answer boxes, etc.), ignored queries with fewer than 25 results in total, and, as far as I can tell, used desktop rankings.

    I’ve taken a slightly different approach. I reached out to STAT to request a sample of ~5,000 non-branded keywords for the US market. Like Moz, I stripped out non-web results, but unlike Moz, I also stripped out anything with a baserank worse than 10 (baserank being STAT’s way of presenting the ranking of a search result when non-web results are excluded). You can see the STAT export here.

    Moz used Mean Spearman correlations, which is a process that involves ranking variables for each keyword, then taking the average correlation across all keywords. I’ve also chosen this method, and I’ll explain why using the below example:

    Keyword   | SERP Ranking Position | Ranking Site | Branded Search Volume of Ranking Site | Per Keyword Rank of Branded Search Volume
    Keyword A | 1                     | example1.com | 100,000                               | 1
    Keyword A | 2                     | example2.com | 10,000                                | 2
    Keyword A | 3                     | example3.com | 1,000                                 | 3
    Keyword A | 4                     | example4.com | 100                                   | 4
    Keyword A | 5                     | example5.com | 10                                    | 5

    For Keyword A, we have wildly varying branded search volumes in the top 5 search results. This means that search volume and rankings could never be particularly well-correlated, even though the results are perfectly sorted in order of search volume.

    Moz’s approach avoids this problem by comparing the ranking position (the 2nd column in the table) with the column on the far right of the table — how each site ranks for the given variable.

    In this case, correlating ranking directly with search volume would yield a correlation of (-)0.75. Correlating with ranked search volume yields a perfect correlation of 1.

    This process is then repeated for every keyword in the sample (I counted desktop and mobile versions of the same keyword as two keywords), then the average correlation is taken.
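
    To make the difference concrete, here is a small sketch reproducing the Keyword A example (it requires scipy; the numbers come straight from the table above):

        # Reproducing the Keyword A example above. Requires scipy (pip install scipy).
        from scipy.stats import pearsonr, spearmanr

        positions = [1, 2, 3, 4, 5]
        volumes = [100_000, 10_000, 1_000, 100, 10]

        # Correlating rankings directly with raw search volume:
        r_raw, _ = pearsonr(positions, volumes)
        print(f"Raw correlation: {r_raw:.2f}")        # ~ -0.76, the "(-)0.75" above

        # Rank the volumes per keyword (rank 1 = highest volume), then correlate:
        volume_ranks = [sorted(volumes, reverse=True).index(v) + 1 for v in volumes]
        r_ranked, _ = pearsonr(positions, volume_ranks)
        print(f"Ranked correlation: {r_ranked:.2f}")  # 1.00, a perfect ordering

        # spearmanr applies the rank transform for you (the sign flips because
        # higher volume means a lower rank number):
        rho, _ = spearmanr(positions, volumes)
        print(f"Spearman rho: {rho:.2f}")             # -1.00

    The study’s mean Spearman number is simply this per-keyword rank correlation averaged across every keyword in the sample.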

    Defining branded search volume

    Initially, I thought that pulling branded search volume for every site in the sample would be as simple as looking up the search volume for their domain minus its subdomain and TLD (e.g. “walmart” for https://www.walmart.com/cp/Gift-Cards/96894). However, this proved surprisingly deficient. Take these examples:

    Are the brands for these sites “cruise,” “wordpress,” and “sd,” respectively? Clearly not. To figure out what the branded search term was, I started by taking each potential candidate from the URL, e.g., for ecotalker.wordpress.com:

    • Ecotalker
    • Ecotalker wordpress
    • Wordpress.com
    • Wordpress

    I then worked out what the highest search volume term was for which the subdomain in question ranked first — which in this case is a tie between “Ecotalker” and “Ecotalker wordpress,” both of which show up as having zero volume.

    I’m leaning fairly heavily on Google’s synonym matching in search volume lookup here to catch any edge cases — for example, I’m confident that “ecotalker.wordpress” would show up with the same search volume as “ecotalker wordpress.”
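
    The candidate-generation step is mechanical enough to sketch. The volume lookup itself is omitted here, since that data came from STAT in bulk, so treat this as illustration only:

        from urllib.parse import urlparse

        def brand_candidates(url: str) -> list[str]:
            """Generate potential branded search terms from a URL, as described above."""
            host = (urlparse(url).hostname or "").replace("www.", "")
            parts = host.split(".")
            sub = parts[0]
            domain = parts[-2] if len(parts) >= 2 else parts[0]
            candidates = [sub]
            if sub != domain:  # subdomain sites get extra candidates
                candidates += [f"{sub} {domain}", f"{domain}.com", domain]
            return candidates

        print(brand_candidates("https://ecotalker.wordpress.com"))
        # ['ecotalker', 'ecotalker wordpress', 'wordpress.com', 'wordpress']

    From there, each candidate gets a search volume lookup, and the winner is the highest-volume term for which the subdomain ranks first.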

    You can see the resulting dataset of subdomains with their DA and branded search volume here.

    (Once again, I’ve used STAT to pull the search volumes in bulk.)

    The results: Brand awareness > links

    Here’s the main story: branded search volume is better correlated with rankings than domain authority is.

    However, there are a few other points of interest here. Firstly, neither of these variables has a particularly strong correlation with rankings — a perfect correlation would be 1, and I’m finding a correlation between domain authority and rankings of 0.071, and a correlation between branded search volume and rankings of 0.1. This is very low by the standards of the Moz study, which found a correlation of 0.26 between domain authority and rankings using the same statistical methods.

    I think the biggest difference that accounts for this is Moz’s use of 50 web results per query, compared to my use of 10. If true, this would imply that domain authority has much more to do with what it takes to get you onto the front page than it has to do with ranking in the top few results once you’re there.

    Another potential difference is in the types of keyword in the two samples. Moz’s study has a fairly even breakdown of keywords between the 0–10k, 10k–20k, 20k–50k, and 50k+ buckets:

    On the other hand, my keywords were more skewed towards the low end:

    However, this doesn’t seem to be the cause of my lower correlation numbers. Take a look at the correlations for rankings for high volume keywords (10k+) only in my dataset:

    Although the matchup between the two metrics gets a lot closer here, the overall correlations are still nowhere near as high as Moz’s, leading me to attribute that difference more to their use of 50 ranking positions than to the keywords themselves.

    It’s worth noting that my sample size of high volume queries is only 980.

    Regression analysis

    Another way of looking at the relationship between two variables is to ask how much of the variation in one is explained by the other. For example, the average rank of a page in our sample is 5.5. If we have a specific page that ranks at position 7, and a model that predicts it will rank at 6, we have explained 33% of its variation from the average rank (for that particular page).
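
    One caveat: the 33% example uses absolute deviations for intuition, while R-squared proper is computed over the whole sample with squared deviations. A toy sketch with made-up ranks (not the study’s data):

        # Making the variance-explained idea concrete. Toy numbers: actual ranks
        # vs. one hypothetical model's predictions.
        actual = [7, 4, 6, 5, 3]
        predicted = [6, 5, 6, 4, 4]
        mean_rank = sum(actual) / len(actual)

        ss_total = sum((a - mean_rank) ** 2 for a in actual)                # variation around the mean
        ss_residual = sum((a - p) ** 2 for a, p in zip(actual, predicted))  # error the model leaves over
        r_squared = 1 - ss_residual / ss_total
        print(f"R^2 = {r_squared:.2f}")  # 0.60: the model explains 60% of rank variation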

    Using the data above, I constructed a number of models to predict the rankings of pages in my sample, then charted the proportion of variance explained by those models below (you can read more about this metric, normally called the R-squared, here).

    Some explanations:

    • Branded Search Volume of the ranking site - as discussed above
    • Log(Branded Search Volume) - Taking the log of the branded search volume for a fairer comparison with domain authority, where, for example, a DA 40 site is much more than twice as well linked to as a DA 20 site.
    • Ranked Branded Search Volume - How this site’s branded search volume compares to that of other sites ranking for the same keyword, as discussed above

    Firstly, it’s worth noting that despite the very low R-squareds, all of the variables listed above were highly statistically significant — in the worst case scenario, within a one ten-millionth of a percent of being 100% significant. (In the best case scenario being a vigintillionth of a vigintillionth of a vigintillionth of a nonillionth of a percent away.)

    However, the really interesting thing here is that including ranked domain authority and ranked branded search volume in the same model explains barely any more variation than just ranked branded search volume on its own.

    To be clear: Nearly all of the variation in rankings that we can explain with reference to domain authority we could just as well explain with reference to branded search volume. On the other hand, the reverse is not true.

    If you’d like to look into this data some more, the full set is here.

    Nice data. Why should I care?

    There are two main takeaways here:

    1. If you care about your domain authority because it’s correlated with rankings, then you should care at least as much about your branded search volume.
    2. The correlation between links and rankings might sometimes be a bit of a red herring — it could be that links are themselves merely correlated with some third factor which better explains rankings.

    There are also a bunch of softer takeaways to be had here, particularly around how weak (if highly statistically significant) both sets of correlations were. This places even more emphasis on relevancy and intent, which presumably make up the rest of the picture.

    If you’re trying to produce content to build links, or if you find yourself reading a post or watching a presentation around this or any other link building techniques in the near future, there are some interesting questions here to add to those posed by Tomas Vaitulevicius back in November. In particular, if you’re producing content to gain links and brand awareness, it might not be very good at either, so you need to figure out what’s right for you and how to measure it.

    I’m not saying in any of this that “links are dead,” or anything of the sort — more that we ought to be a bit more critical about how, why, and when they’re important. In particular, I think that they might be of increasingly little importance on the first page of results for competitive terms, but I’d be interested in your thoughts in the comments below.

    I’d also love to see others conduct similar analysis. As with any research, cross-checking and replication studies are an important step in the process.

    Either way, I’ll be writing more around this topic in the near future, so watch this space!

  • Better Alternatives to "Expert Roundup"-Style Content - Whiteboard Friday

    Posted by randfish

    You may be tempted to publish that newest round of answers you’ve gotten from industry experts, but hold off — there’s a better way. In today’s Whiteboard Friday, Rand explains why expert roundups just aren’t the best use of your time and effort, and how to pivot your strategy to create similar content that’ll make the juice worth the squeeze.

    Click on the whiteboard image above to open a high-resolution version in a new tab!

    Video Transcription

    Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to look at some better alternatives to the expert roundup-style content that’s become extremely popular on the web. There are a few reasons why it’s popular. So let’s talk about why SEOs and content marketers do so many expert roundups, why this became a popular content format.

    Why do SEOs and content marketers even use “expert roundups?”

    Okay. It turns out if you’ve got a piece of content that’s like “75 Experts Share Their Favorite Constitutional Law Cases,” maybe you interviewed a bunch of constitutional law scholars and you put together this article, there’s a bunch of nice things that you actually do get from this, which is why people use this format, right?

    You kind of get automatic outreach, because if you talk to these people, you’ve had a connection with them. You’ve built a little bit of a relationship. There’s now something of an incentive to share for these folks and the potential for a link. All of those are sort of elements that people are looking for, well, that marketers are looking for from their content.

    The nice thing is you’ve got this long cadre of individuals who have contributed, and they create the content, which means you don’t have to, saving you a bunch of time and energy. They become your amplifier so you can kind of sit back and relax when it comes time to broadcast it out there. You just tell them it’s ready, and they go and push it. They lend your content credibility. So even if you don’t have any credibility with your brand or with your website, they deliver it for you. You don’t have to do that.

    There are a few big problems with this kind of content.

    Those are all really nice things. Don’t get me wrong. I understand why. But there are some big, big problems with expert roundup-style content.

    1. Like many easy-to-replicate tactics, expert roundups become WAY overdone.

    First one, like many of the easy-to-replicate tactics, expert roundups got spammed to death. They became way, way overdone. I get emails like this. “Dear Fishkin, I roundup. You write. Do this. Then share. Okay. Bye, Spammy McSpams-A-Lot.”

    Look, Mr. McSpams-A-Lot, I appreciate how often you think of me. I love that every day there are a couple of offers like this in my inbox. I try to contribute to less than one every two or three weeks and only the ones that look super credible and real interesting. But jeez, can you imagine if you are truly an expert, who can lend credibility and create lots of amplification, you’re getting overwhelmed with these kinds of requests, and people are probably getting very tired of reading them, especially in certain market segments where they’ve become way too overdone.

    2. It’s hard for searchers to get valuable, useful info via this format — and search engines don’t like it, either.

    But even if it’s the case that you can get all these experts to contribute and it’s not overdone in your market space, there are two other big problems. One, the content format is awful, awful for trying to get valuable and useful information. It rarely actually satisfies either searchers or engines.

    If you search for constitutional law cases and you see “75 Experts Share Their Favorite Constitutional Law Cases,” you might click. But my god, have you gone through those types of content? Have you tried to read a lot of those roundups? They are usually awful, just terrible.

    You might get a nugget here or there, but there’s a bunch of contributions that are multiple paragraphs long and try to include links back to wherever the expert is trying to get their links going. There’s a bunch of them that are short and meaningless. Many of them overlap.

    It’s annoying. It’s bad. It’s not well-curated. It’s not well-put together. There are exceptions. Sometimes people put real effort into them and they get good, but most of the time these are real bad things, and you rarely see them in the search results.

    BuzzSumo did a great analysis of content that gets shares and gets links and gets rankings. Guess what did not fall into it — expert roundups.

    3. Roundups don’t earn as many links, and the traffic spike from tweets is temporary.

    Number three. That’s number three. The links that the creators want from these roundups, that they’re hoping they’re going to get, it doesn’t end up there most of the time. What usually happens is you get a short traffic spike, some additional engagement, some additional activity on mostly Twitter, sometimes a little bit Facebook or LinkedIn, but it’s almost all social activity, and it’s a very brief spike.

    5 formats to try instead

    So what are some better alternatives? What are some things we can do? Well, I’ve got five for you.

    1. Surveys

    First off, if you’re going to be creating content that is around a roundup, why not do almost exactly the same process, but rather than asking a single question or a set of questions that people are replying to, ask them to fill out a short survey with a few data points, because then you can create awesome graphs and visuals, which have much stronger link earning potential. It’s the same outreach effort, but for much more compelling content that often does a better job of ranking, is often more newsworthy and link worthy. I really, really like surveys, and I think that they can work tremendously well if you can put them together right.

    2. Aggregations of public data

    Second, let’s say you go, “Oh, Rand, that would be great, but I want to survey people about this thing, and they won’t give me the information that I’m looking for.” Never fear. You can aggregate public data.

    So a lot of these pieces of information that may be interesting to your audience, that you could use to create cool visuals, the graphs and charts and all that kind of thing and trend lines, are actually available on the web. All you need to do is cite those sources, pull in that data, build it yourself, and then you can outreach to the people who are behind these companies or these organizations or these individuals, and then say, “Hey, I made this based on public data. Can you correct any errors?” Now you’ve got the outreach, which can lead to the incentive to share and to build a link. Very cool.

    3. Experiments and case studies

    So this is taking a much smaller group, saying, “I’m only going to work with this one person or these couple of people, or I’m going to do it myself. Here’s what Seattle’s most influential law firm found when they challenged 10 state laws.” Well, there you go. Now I’ve got an interesting, wholly formed case study. I only had to work with one expert, but chances are good that lots and lots of people will be interested in this. It’s also excellent for newsworthiness. It often can get lots of press coverage in whatever industry you’re in.

    4. Seeking out controversial counter-opinions on a topic

    Fourth, if you’re going to do a roundup-style thing and you’re going to collect multiple opinions, if you can find a few points or a single subject around which multiple experts have different opinions, that could be just two people, it could be four or five, it could be seven or eight, but you’re basically trying to create this controversy.

    You’re saying like, “Here are these people on this side of this issue. Here are these people on this side of this issue, Wil Reynolds versus Rand Fishkin on link building.” I think we did a presentation like that in Minneapolis last year or a couple years ago. It was super fun. Wil and I got up on stage, and we sort of debated with each other. There were no losers in that debate. It was great.

    This leverages the emotional response you’re seeking of conflict. It creates more engaging content by far, and there’s more incentive for the parties who participate to link and share, because they’re sort of showing off their opinion and trying to make counterpoints. You can get a lot of good things.

    5. Not just text!

    Number five. If you’ve decided, “You know what? None of these formats or any others work. I really, really want to do a roundup. I think it can work for me,” okay. But do me a favor and try something that is not just text, not just text.

    Muzli is a newsletter I subscribe to in the design world that does lots of roundup-style content, but the roundups are all visuals. They’re visuals. They’re like UI interactions and GIFs and animations and illustrations. I actually really love those. Those get great engagement, and they rank, by the way. They rank quite well. Many of the ones that they link to in the newsletter do well.

    You can do this with visuals. You can do it with data. You could do it with revenue numbers. You could do it with tools. You could do it with products, whatever it is.

    I would suggest thinking a little more broadly than, “Dear Fishkin, I roundup. You write.” I think that there’s a lot more opportunity outside of the pure expert roundup space, and I hope you’ll share your creative ideas with us and the successes you’ve seen.

    We look forward to seeing you again next week for another edition of Whiteboard Friday. Take care.

    Video transcription by Speechpad.com


  • Your Daily SEO Fix: The Keyword Research Edition

    Posted by FeliciaCrawford

    Back in 2015, we had an epiphany. Every day, via every channel, you — our readers, subscribers, community members, and social followers — would ask us really good questions. (You’re an incredibly intelligent, friendly, inquisitive bunch, you know that? It’s humbling.) A lot of those questions were about how to accomplish your SEO goals, and it got us thinking.

    Moz is an educational resource, it’s true, but we also offer a suite of tools (both free and paid) that can help you achieve those goals. Why not provide a space for those two things to converge? And thus, the idea of the Daily SEO Fix was born: quick 1–3 minute videos shared throughout the week that feature Mozzers describing how to solve problems using the tools we know best.

    It’s two years later now, and both our tools and our industry have evolved. Time to revisit this idea, no?

    Today’s series of Daily SEO Fixes feature our keyword research tool, Keyword Explorer. Perhaps you’ve heard us mention it a couple times — we sure like it, and we think it could help you, too. And you don’t have to be a subscriber to check this puppy out — anyone on the whole wide Internet can use it to research two queries a day for free. If you’re logged into your Moz community account, you get five free queries.

    Open Keyword Explorer in a new tab!

    Queue it up in another browser tab to follow along, if you’d like!*

    *Keep in mind that some features, such as lists, are only available when you’re also a Moz Pro Medium subscriber or above. If you’re bursting with curiosity, you can always check out the 30-day free trial, which features everything you’d see in a paid subscription… but for free. :)


    Fix #1: Nitty-gritty keyword research

    Let’s get down to brass tacks: your keyword research. Janisha’s here to walk you through…

    • Researching your keyword;
    • Determining whether it strikes the right balance of volume, difficulty, and opportunity;
    • Analyzing the SERPs quickly for your query to see what factors could be affecting your ranking opportunity;
    • Finding keyword suggestions ripe with promise; and
    • Organizing your newly discovered keywords into lists.

    Fix #2: Finding question keywords to boost your content & win featured snippets

    When you answer the questions searchers are actually asking, you’ve got way more opportunity to rank, earn qualified traffic to your site, and even win yourself a featured snippet or two. Brittani shows you how to broaden your page content by speaking to your audience’s most burning questions.


    Fix #3: Updating your keyword metrics on a whim

    If you’re hot on the trail of a good ranking, you don’t have the time or patience to wait for your metrics to update on their own. Kristina shows you how to get that sweet, sweet, up-to-date data after you’ve organized a list of related keywords in Keyword Explorer.


    Fix #4: Moving curated keyword lists to Moz Pro for long-term tracking

    If you’re interested in tracking the overall SEO progress of a site and digging into the nuts and bolts of your keyword data, you’ll want to pay attention. Kristina’s back to explain how to import your curated Keyword Explorer lists into a Moz Pro campaign to track long-term rankings for a specific site.


    That’s a wrap for Week 1!

    There you have it — four ways to level up your keyword research and knock some to-dos off your list. We’ll be back next Thursday with more fixes from a new group of Mozzers; keep an eye on our social channels for a sneak peek, and maybe try a free spin of Moz Pro if you’d like to follow along.

    Curious about what else you can do with Keyword Explorer? Here are some fab resources:

    And if you’re fairly new to the game or looking for ways to grow your team members’ SEO knowledge, be sure to check out our classes on introductory SEO, keyword research, site audits, link building, reporting, and more.

    See you next week, friends!





  • SEO Rankings Drop: A Step-by-Step Guide to Recovery

    Posted by KristinaKledzik

    A few weeks ago, rankings for pages on a key section of my site dropped an average of a full position in one day. I’ve been an SEO for 7 years now, but I still ran around like a chicken with my head cut off, panicked that I wouldn’t be able to figure out my mistake. There are so many things that could’ve gone wrong: Did I or my team unintentionally mess with internal link equity? Did we lose links? Did one of Google’s now-constant algorithm updates screw me over?

    Since the drop happened to a group of pages, I made the assumption it had to do with our site or page structure (it didn’t). I wasted a good day focused on technical SEO. Once I realized my error, I decided to put together a guide to make sure that next time, I’ll do my research effectively. And you, my friends, will reap the rewards.

    First, make sure there’s actually a rankings change

    Okay, I have to start with this: before you go down this rabbit hole of rankings changes, make sure there was actually a rankings change. Your rankings tracker may not have localized properly, or it may have picked up one of Google’s rankings experiments or personalized results.

    Find out:

    • Has organic traffic dropped to the affected page(s)?
      • We’re starting here because this is the most reliable data you have about your site. Google Search Console and rankings trackers are trying to look at what Google’s doing; your web analytics tool is just tracking user counts.
      • Compare organic traffic to the affected page(s) week-over-week both before and after the drop, making sure to compare similar days of the week (see the sketch after this list).
      • Is the drop more significant than most week-over-week changes?
      • Is the drop over a holiday weekend? Is there any reason search volume could’ve dropped?
    • Does Google Search Console show a similar rankings drop?
      • Use the Search Analytics section to see clicks, impressions, and average position for a given keyword, page, or combo.
      • Does GSC show a similar rankings drop to what you saw in your rankings tracker? (Make sure to run the report with the selected keyword(s).)
    • Does your rankings tracker show a sustained rankings drop?
      • I recommend tracking rankings daily for your important keywords, so you’ll know if the rankings drop is sustained within a few days.
      • If you’re looking for a tool recommendation, I’m loving Stat.
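
    If you’d rather script the week-over-week sanity check than eyeball it, below is a minimal sketch using pandas. It assumes a hypothetical CSV export from your analytics tool with “date” and “organic_sessions” columns; the filename and column names are placeholders, not a real analytics API.

    ```python
    # Week-over-week organic traffic check. The CSV layout ("date",
    # "organic_sessions") is a hypothetical analytics export.
    import pandas as pd

    df = pd.read_csv("organic_traffic.csv", parse_dates=["date"])
    df = df.set_index("date").sort_index()

    # Compare each day to the same weekday one week earlier, so Mondays
    # are compared to Mondays and Saturdays to Saturdays.
    df["wow_change_pct"] = df["organic_sessions"].pct_change(periods=7) * 100

    # Flag drops larger than typical week-over-week noise (2 std devs).
    threshold = 2 * df["wow_change_pct"].std()
    suspicious = df[df["wow_change_pct"] < -threshold]
    print(suspicious[["organic_sessions", "wow_change_pct"]])
    ```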

    If you’ve just seen a drop in your rankings tool and your traffic and GSC clicks are still up, keep an eye on things and try not to panic. I’ve seen too many natural fluctuations to go to my boss as soon as I see an issue.

    But if you’re seeing that there’s a rankings change, start going through this guide.

    Figure out what went wrong

    1. Did Google update their algorithm?

    Google rolls out algorithm updates constantly, at least one a day, and most of them silently. The good news is that there are leagues of SEOs dedicated to documenting those changes.

    • Are there any SEO articles or blogs talking about a change around the date you saw the change? Check out:
    • Do you have any SEO friends who have seen a change? Pro tip: Make friends with SEOs who run sites similar to yours, or in your industry. I can’t tell you how helpful it’s been to talk frankly about tests I’d like to run with SEOs who’ve run similar tests.

    If this is your issue…

    The bad news here is that if Google’s updated their algorithm, you’re going to have to change your approach to SEO in one way or another.

    Make sure you understand:

    Your next move is to put together a strategy to either pull yourself out of this penalty, or at the very least to protect your site from the next one.

    2. Did your site lose links?

    Pull the lost links report from Ahrefs or Majestic. They’re the most reputable link indexes out there, and they’re updated daily. (If you keep periodic exports, a sketch for diffing them follows the list below.)

    • Has there been a noticeable site-wide link drop?
    • Has there been a noticeable link drop to the page or group of pages you’ve seen a rankings change for?
    • Has there been a noticeable link drop to pages on your site that link to the page or group of pages you’ve seen a rankings change for?
      • Run Screaming Frog on your site to find which pages link internally to the affected pages. Check internal link counts for pages one link away from affected pages.
    • Has there been a noticeable link drop to inbound links to the page or group of pages you’ve seen a rankings change for?
      • Use Ahrefs or Majestic to find the sites that link to your affected pages.
        • Have any of them suffered recent link drops?
        • Have they recently updated their site? Did that change their URLs, navigation structure, or on-page content?
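
    If you do keep periodic backlink exports, a quick diff will surface linking pages that disappeared. Here’s a minimal sketch assuming two hypothetical CSV exports (one older, one current) that each contain a “source_url” column; the filenames and column name are placeholders for whatever your link tool actually exports.

    ```python
    # Diff two backlink exports to find linking pages that disappeared.
    # Filenames and the "source_url" column are placeholders.
    import csv

    def load_sources(path):
        with open(path, newline="", encoding="utf-8") as f:
            return {row["source_url"] for row in csv.DictReader(f)}

    old_links = load_sources("backlinks_last_month.csv")
    new_links = load_sources("backlinks_today.csv")

    lost = sorted(old_links - new_links)
    print(f"{len(lost)} linking pages lost since the last export:")
    for url in lost:
        print(url)
    ```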

    If this is your issue…

    The key here is to figure out who you lost links from and why, so you can try to regain or replace them.

    • Can you get the links back?
      • Do you have a relationship with the site owner who provided the links? Reaching out may help.
      • Were the links removed during a site update? Maybe it was accidental. Reach out and see if you can convince them to replace them.
      • Were the links removed and replaced with links to a different source? Investigate the new source — how can you make your links more appealing than theirs? Update your content and reach out to the linking site owner.
    • Can you convince your internal team to invest in new links to quickly replace the old ones?
      • Show your manager(s) how much a drop in link count affected your rankings and ask for the resources it’ll take to replace them.
      • This will be tricky if you were the one to build the now-lost links in the first place, so if you did, make sure you’ve put together a strategy to build longer-term ones next time.

    3. Did you change the affected page(s)?

    If you or your team changed the affected pages recently, Google may not think that they’re as relevant to the target keyword as they used to be.

    • Did you change the URL?
      • DO NOT CHANGE URLS. URLs act as unique identifiers for Google; a new URL means a new page, even if the content is the same.
    • Has the target keyword been removed from the page title, H1, or H2s?
    • Is the keyword density for the target keyword lower than it used to be?
    • Can Google read all of the content on the page?
      • Look at Google’s cache by searching for cache:www.yourdomain.com/your-page to see what Google sees.
    • Can Google access your site? Check Google Search Console for server and crawl reports.

    If this is your issue…

    Good news! You can probably revert your site and regain the traffic you’ve lost.

    • If you changed the URL, see if you can change it back. If not, make sure the old URL is 301 redirecting to the new URL (a quick check script follows this list).
    • If you changed the text on the page, try reverting it back to the old text. Wait until your rankings are back up, then try changing the text again, this time keeping keyword density in mind.
    • If Google can’t read all of the content on your page, THIS IS A BIG DEAL. Communicate that to your dev team. (I’ve found dev teams often undervalue the impact of SEO, but “Googlebot can’t read the page” is a pretty understandable, impactful problem.)
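
    To verify a redirect without trusting your browser’s cache, ask for the raw status codes. Here’s a minimal sketch using the requests library; both URLs are placeholders.

    ```python
    # Check that an old URL 301-redirects to the expected new URL.
    # Both URLs are placeholders; requests follows redirects by default.
    import requests

    old_url = "https://www.example.com/old-page"
    expected = "https://www.example.com/new-page"

    resp = requests.get(old_url, allow_redirects=True, timeout=10)
    for hop in resp.history:
        print(hop.status_code, hop.url)  # each hop in the redirect chain
    print("Final:", resp.status_code, resp.url)

    if resp.history and resp.history[0].status_code == 301 and resp.url == expected:
        print("OK: permanent redirect to the expected page.")
    else:
        print("Investigate: missing, temporary (302), or mis-targeted redirect.")
    ```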

    4. Did you change internal links to the affected page(s)?

    If you or your team added or removed internal links, that could change the way link equity flows through your site, changing Google’s perceived value of the pages on your site.

    • Did you or your team recently update site navigation anywhere? Some common locations to check:
      • Top navigation
      • Side navigation
      • Footer navigation
      • Suggested products
      • Suggested blog posts
    • Did you or your team recently update key pages on your site that link to target pages? Some pages to check:
      • Homepage
      • Top category pages
      • Linkbait blog posts or articles
    • Did you or your team recently update anchor text on links to target pages? Does it still include the target keyword?

    If this is your issue…

    Figure out how many internal links have been removed from pointing to your affected pages. If you have access to the old version of your site, run Screaming Frog (or a similar crawler) on the new and old versions of your site so you can compare inbound link counts (referred to as inlinks in SF). If you don’t have access to the old version of your site, take a couple of hours to compare navigation changes and mark down wherever the new layout may have hurt the affected pages.
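
    If you do have crawl exports from both versions of the site, the comparison itself is quick. A minimal sketch, assuming two hypothetical CSVs with “url” and “inlinks” columns (adjust the names to whatever your crawler exports):

    ```python
    # Compare internal inlink counts between an old and a new crawl.
    # "url" and "inlinks" columns are assumed; adjust to your export.
    import pandas as pd

    old = pd.read_csv("crawl_old.csv").set_index("url")["inlinks"]
    new = pd.read_csv("crawl_new.csv").set_index("url")["inlinks"]

    diff = pd.DataFrame({"old_inlinks": old, "new_inlinks": new}).fillna(0)
    diff["change"] = diff["new_inlinks"] - diff["old_inlinks"]

    # Pages that lost the most internal links are the prime suspects.
    print(diff.sort_values("change").head(20))
    ```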

    How you fix the problem depends on how much impact you have on the site structure. It’s best to fix the issue in the navigational structure of the site, but many of us SEOs are overruled by the UX team when it comes to primary navigation. If that’s the case for you, think about systematic ways to add links where you can control the content. Some common options:

    • In the product description
    • In blog posts
    • In the footer (since, as UX teams will generally admit, few people use the footer)

    Keep in mind that removing links and adding them back later, or from different places on the site, may not have the same effect as the original internal links. You’ll want to keep an eye on your rankings, and add more internal links than the affected pages lost, to make sure you regain your Google rankings.

    5. Google’s user feedback says you should rank differently.

    Google is using machine learning to determine rankings. That means they’re at least in part measuring the value of your pages based on their click-through rate from SERPs and how long visitors stay on your page before returning to Google.

    • Did you recently add a popup that is increasing bounce rate?
    • Is the page taking longer to load?
      • Check server response time. People are likely to give up if nothing happens for a few seconds (a quick timing sketch follows this list).
      • Check full page load. Have you added something that takes forever to load and is causing visitors to give up quickly?
    • Have you changed your page titles? Is that lowering CTR? (I optimized page titles in late November, and that one change moved the average rank of 500 pages up from 12 to 9. One would assume things can go in reverse.)
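
    For a rough read on server response time before you involve the dev team, you can time a handful of requests yourself. A minimal sketch; the URL is a placeholder, and this measures the server’s response only, not the full page render.

    ```python
    # Rough server-response timing (not full page load: no JS, no assets).
    # The URL is a placeholder; resp.elapsed covers request-to-response.
    import requests

    url = "https://www.example.com/affected-page"
    timings = []
    for _ in range(5):
        resp = requests.get(url, timeout=30)
        timings.append(resp.elapsed.total_seconds())

    print("Responses:", [f"{t:.2f}s" for t in timings])
    print(f"Average: {sum(timings) / len(timings):.2f}s")
    # For full page load, use browser dev tools or a tool like WebPageTest.
    ```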

    If this is your issue…

    • If the issue is a new popup, do your best to convince your marketing team to test a different type of popup. Some options:
      • Scroll popups
      • Timed popups
      • Exit popups
      • Stable banners at the top or bottom of the page (with a big CLICK ME button!)
    • If your page is taking longer to load, you’ll need the dev team. Put together the lost value from fewer SEO conversions now that you’ve lost some rankings and you’ll have a pretty strong case for dev time.
    • If you’ve changed your page titles, change them back, quick! Mark this test as a dud, and make sure you learn from it before you run your next test.

    6. Your competition made a change.

    You may have changed rank not because you did anything, but because your competition got stronger or weaker. Use your ranking tool to identify competitors that gained or lost the most from your rankings change. Use a tool like Versionista (paid, but worth it) or Wayback Machine (free, but spotty data) to find changes in your competitors’ sites.

    • Which competitors gained or lost the most as your site’s rankings changed?
    • Has that competition gained or lost inbound links? (Refer to #2 for detailed questions)
    • Has that competition changed their competing page? (Refer to #3 for detailed questions)
    • Has that competition changed their internal link structure? (Refer to #4 for detailed questions)
    • Has that competition started getting better click-through rates or dwell time to their pages from SERPs? (Refer to #5 for detailed questions)

    If this is your issue…

    You’re probably fuming, and your managers are probably fuming at you. But there’s a benefit to this: you can learn about what works from your competitors. They did the research and tested a change, and it paid off for them. Now you know the value! Imitate your competitor, but try to do it better than them this time — otherwise you’ll always be playing catch up.

    Now you know what to do

    You may still be panicking, but hopefully this post can guide you to some constructive solutions. I find that the best response to a drop in rankings is a good explanation and a plan.

    And, to the Moz community of other brilliant SEOs: comment below if you see something I’ve missed!





  • The Moz 2016 Annual Report

    Posted by SarahBird

    I have a longstanding tradition of boring Moz readers with our exhaustive annual reports (2012, 2013, 2014, 2015).

    If you’re avoiding sorting the recycling, going to the gym, or cleaning out your closet, I have got a *really* interesting post that needs your attention *right now*.

    (Yeah. I know it’s March. But check this out, I had pneumonia in Jan/Feb so my life slid sideways for a while.)

    Skip to your favorite parts:

    Part 1: TL;DR

    Part 2: Achievements unlocked

    Part 3: Oh hai, elephant. Oh hai, room.

    Part 4: More wood, fewer arrows

    Part 5: Performance (metrics vomit)

    Part 6: Inside Moz HQ

    Part 7: Looking ahead


    Part 1: TL;DR

    We closed out 2016 with more customers and revenue than 2015. Our core SEO products are on a roll with frequent, impactful launches.

    The year was not all butterflies and sunshine, though. Some of our initiatives failed to produce the results we needed. We made some tough calls (sunsetting some products and initiatives) and big changes (laying off a bunch of folks and reallocating resources). On a personal level, it was the most emotionally fraught time in my career.

    Thank the gods, our hard work is paying off. Moz ended the year cashflow, EBITDA, and net income profitable (on a monthly basis), and with more can-do spirit than in years past. In fact, in the month of December we added a million dollars cash to the business.

    We’re completely focused on our mission to simplify SEO for everyone through software, education, and community.


    Part 2: Achievements unlocked

    It blows my mind that we ended the year with over 36,000 customers from all over the world. We’ve got brands and agencies. We’ve got solopreneurs and Fortune 500s. We’ve got hundreds of thousands of people using the MozBar. A bunch of software companies integrate with our API. It’s humbling and awesome. We endeavor to be worthy of you!

    We were very busy last year. The pace and quality of development has never been better. The achievements captured below don’t even come close to listing everything. How many of these initiatives did you know about?


    Part 3: Oh hai, elephant. Oh hai, room.

    When a few really awful things happen, it can overshadow the great stuff you experience. That makes this a particularly hard annual report to write. 2016 was undoubtedly the most emotionally challenging year I’ve experienced at Moz.

    It became clear that some of our strategic hypotheses were wrong. Pulling the plug on those projects and asking people I care deeply about to leave the company was heartbreaking. That’s what happened in August 2016.

    As Tolstoy wrote, “Happy products are all alike; every unhappy product is unhappy in its own way.” The hard stuff happened. Rehashing what went wrong deserves a couple chapters in a book, not a couple lines in a blog post. It shook us up hard.

    And *yet*, I am determined not to let the hard stuff take away from the amazing, wonderful things we accomplished and experienced in 2016. There was a lot of good there, too.

    Smarter people than me have said that progress doesn’t happen in a straight line; it zigs and zags. I’m proud of Mozzers; they rise to challenges. They lean into change and find the opportunity in it. They turn their compassion and determination up to 11. When the going gets tough, the tough get going.

    I’ve learned a lot about Moz and myself over the last year. I’m taking all those learnings with me into the next phase of Moz’s growth. Onwards.


    Part 4: More wood, fewer arrows

    At the start of 2016, our hypothesis was that our customers and community would purchase several inbound marketing tools from Moz, including SEO, local SEO, social analytics, and content marketing. The upside was market expansion. The downside was fewer resources to go around, and a much more complex brand and acquisition funnel.

    By trimming our product lines, we could reallocate resources to initiatives showing more growth potential. We also simplified our mission, brand, and acquisition funnel.

    It feels really good to be focusing on what we love: search. We want to be the best place to learn and do SEO.

    Whenever someone wonders how to get found in search, we want them to go to Moz first. We aspire to be the best in the world at the core pillars of SEO: rankings, keywords, site audit and optimization, links, and location data management.

    SEO is dynamic and complex. By reducing our surface area, we can better achieve our goal of being the best. We’re putting more wood behind fewer arrows.


    Part 5: Performance (metrics vomit)

    Check out the infographic view of our data barf.

    We ended the year at ~$42.6 million in gross revenue, amounting to ~12% annual growth. We had hoped for better at the start of the year. Moz Pro is still our economic engine, and Local drives new revenue and cashflow.

    Gross profit margin increased a hair to 74%, despite Moz Local being a larger share of our overall business. Product-only gross profit margin is a smidge higher at 76%. Partner relationships generally drag down the profit margin on that product line.

    Our Cost of Revenue (COR) went up in raw numbers from the previous year, but it didn’t increase as much as revenue.

    Total Operating Expenses came to roughly $41 million. Excluding the cost of the restructure we initiated in August, the shape and scale of our major expenses have remained remarkably stable.

    We landed at -$5.5 million in EBITDA, which was disappointingly below our plan. We were on target for our budgeted expenses. As we fell behind our revenue goals, it became clear we’d need to right-size our expenses to match the revenue reality. Hence, we made painful cuts.

    I’m happy/relieved/overjoyed to report that we were EBITDA positive by September, cashflow positive by October, and net income positive by November. Words can’t express how completely terrible it would have been to go through what we all went through, and *not* have achieved our business goals.

    My mind was blown when we actually added a million in cash in December. I couldn’t have dared to dream that… Ha ha! They won’t all be like that! It was the confluence of a bunch of stuff, but man, it felt good.


    Part 6: Inside Moz HQ

    Thanks to you, dear reader, we have a thriving and opinionated community of marketers. It’s a great privilege to host so many great exchanges of ideas. Education and community are integral to our mission. After all, we were a blog before we were a tech company. Traffic continues to climb and social keeps us busy. We love to hear from you!

    We added a bunch of folks to the Moz Local, Moz.com, and Customer Success teams in the last half of the year. But our headcount is still lower than last year because we asked a lot of talented people to leave when we sunsetted a bunch of projects last August. We’re leaner, and gaining momentum.

    Moz is deeply committed to making tech a more inclusive industry. My vision is for Moz to be a place where people are constantly learning and doing their best work. We took a slight step back on our gender diversity gains in 2016. Ugh. We’re not doing much hiring in 2017, so it’s going to be challenging to make substantial progress. We made a slight improvement in the ratio of underrepresented minorities working at Moz, which is a positive boost.

    The tech industry has earned its reputation of being unwelcoming and myopic.

    Mozzers work hard to make Moz a place where anyone could thrive. Moz isn’t perfect; we’re human and we screw up sometimes. But we pick ourselves up, dust off, and try again. We continue our partnership with Ada Academy, and we’ve deepened our relationship with Year Up. One of my particular passions is partnering with programs that expose girls and young women to STEM careers, such as Ignite Worldwide, Techbridge, and BigSisters.

    I’m so proud of our charitable match program. We match Mozzer donations 150% up to $3k. Over the years, we’ve given over half a million dollars to charity; in 2016 alone, we gave $111,028 to charities. The ‘G’ in TAGFEE stands for ‘generous,’ and this is one of the ways we show it.

    One of our most beloved employee benefits is paid, PAID vacation. We give every employee up to $3,000 to spend on his or her vacation. This year, we spent over half a million dollars exploring the world and sucking the marrow out of life.


    Part 7: Looking ahead

    Dear reader, I don’t have to tell you that search has been critical for a long time.

    This juggernaut of a channel is becoming *even more* important with the proliferation of search interfaces and devices. Mobile liberated search from the desktop by bringing it into the physical world. Now, watches, home devices, and automobiles are making search ubiquitous. In a world of ambient search, SEO becomes even more important.

    SEO is more complicated and dynamic than in years past because the number of human interfaces, response types, and ranking signals keeps increasing. We here at Moz are wild about the complexity. We sink our teeth into it. It drives our mission: Simplify SEO for everyone through software, education, and community.

    We’re very excited about the feature and experience improvements coming ahead. Thank you, dear reader, for sharing your feedback, inspiring us, and cheering us on. We look forward to exploring the future of search together.





  • How to Add Your Site to Google Search Console

    “How to Add Your Site to Google Search Console” was first posted on Tips and Tricks HQ

  • How to Include Recipes in WordPress Posts

    “How to Include Recipes in WordPress Posts” was first posted on Tips and Tricks HQ

  • How to Backup the WordPress Database

    “How to Backup the WordPress Database” was first posted on Tips and Tricks HQ

  • Infinite "People Also Ask" Boxes: A Glimpse at Google's Deep Learning Edges

    Posted by BritneyMuller

    A glimpse into Google’s machine learning?

    You’ve likely seen the People Also Ask (Related Questions) boxes in SERPs. These accordion-like question and answer boxes are Google’s way of saying, “Hey, you beautiful searcher, you! These questions also relate to your search… maybe you’re interested in exploring these too? Kick off your shoes, stay a while!”

    However, few people have come across infinite PAAs. These occur when you expand a PAA question box to see 2 or 3 other related questions appear at the bottom. These infinite PAA lists can continue into the hundreds, and I’ve been lucky enough to come across 75+ of these gems!

    So, grab a coffee and buckle up! I’d like to take you on a journey of my infinite PAA research, discoveries, machine learning hypothesis, and how you can find PAA opportunities.

    Why PAAs should matter to you

    PAAs have seen 1,723% growth in SERPs since 7/31/15, per Mozcast!

    Compare that to featured snippets, which have seen only 328% growth over the same timeframe.

    Research has also shown that a single PAA can show up in 21 unique SERPs! How ‘bout dem apples?! PAA opportunities can take over some serious SERP real estate.

    My infinite PAA obsession

    These mini-FAQs within search results have fascinated me since Google started testing them in 2015. Then in November 2016, I discovered Google’s PAA dynamic testing:

    The above infinite PAA expanded into the hundreds! This became an obsession of mine as I began to notice them across multiple devices (for a variety of different searches) and coined them “PAA Black Holes.”

    I began saving data from these infinite PAAs to see if I could find any patterns, explore how Google might be pulling this data, and dive deeper into how the questions/topics changed as a result of my expanding question boxes, etc.

    After seeing a couple dozen infinite PAAs, I began to wonder if this was actually a test to implement in search, but several industry leaders assured me this was more likely a bug.

    They were wrong.

    Infinite People Also Ask boxes are live

    Now integrated into U.S. SERPs (sorry, foreign friends, but get ready for this to potentially migrate your way), you can play with these on desktop & mobile:

    Why does Google want people to spend more time on individual SERPs (instead of looking at several)? Could they charge more for advertisements on SERPs with these sticky, expansive PAAs? Might they eventually start putting ads in PAAs? These are the questions that follow me around like a shadow.

    To get a better idea of the rise of PAAs, here’s a timeline of my exploratory PAA research:

    PAA timeline

    April 17, 2015 - Google starts testing PAAs

    July 29, 2015 - Dr. Pete gets Google to confirm preferred “Related Questions” name

    Aug 15, 2015 - Google tests PAA Carousels on desktop

    Dec 30, 2015 - Related Questions (PAAs) grow +500% in 5 months

    Mar 11, 2016 - See another big uptick in Related Questions (PAAs) in Mozcast

    Nov 11, 2016 - Robin Rozhon notices PAA Black Hole

    Nov 23, 2016 - Brit notices PAA Black Hole

    Nov 29, 2016 - STAT Analytics publishes a research study on PAAs

    Dec 12, 2016 - Realized new PAA results would change based on expanded PAA

    Dec 14, 2016 - Further proof PAAs dynamically load based on what you click

    Dec 19, 2016 - Still seeing PAA Black Holes

    Dec 22, 2016 - Discovered a single PAA result (not a 3-pack)

    Jan 11, 2017 - Made a machine learning (TensorFlow) discovery and hypothesis!

    Jan 22, 2017 - Discovered a PAA Black Hole on a phone

    Jan 25, 2017 - Discovered a PAA Black Hole that maxed out at 9

    Feb 10, 2017 - PAA Black Holes go live!

    Feb 14, 2017 - Britney Muller is still oblivious to PAA Black Holes going live and continues to hypothesize how they are being populated via entity graph-based ML.


    3 big infinite PAA discoveries:

    #1 - Google caters to browsing patterns in real time

    It took me a while to grasp that I can manipulate the newly populated question boxes based on what I choose to expand.

    Below, I encourage more Vans-related PAAs by clicking “Can I put my vans in the washing machine?” Then, I encourage more “mildew”-related ones simply by clicking a “How do you get the mildew smell out of clothes” PAA above:

    Another example of this is when I clicked “organic SEO” at the very top of a 100+ PAA Black Hole (the gif would make you dizzy, so I took a screenshot instead). It altered my results from “how to clean leather” to “what is seo” and “what do you mean by organic search”:


    #2 - There are dynamic dead ends

    When I reach an exhaustive point in my PAA expansions (typically ~300+), Google will re-serve the first two PAAs, as if to say: “We aren’t sure what else to provide; are you interested in these again?”

    Here is an example of that happening: I go from “mitosis”-related PAAs (~300 PAAs deep) to a repeat of the first two PAAs: “What is Alexa ranking based on?” and “What is the use of backlinks?”:

    This reminds me of a story told by Google machine learning engineers: whenever an early ML model couldn’t identify a photograph, it would say a default ‘I don’t know’ answer of: “Men talking on cell phone.” It could have been a picture of an elephant dancing, and if the ML model wasn’t sure what it was, it would say “Men talking on cell phone.”

    My gut tells me that Google reverts back to the strongest edges (the first two PAAs) for your original query when it runs out of PAAs above a certain relational threshold.

    If you push past those limits again, it will then repeat the third and fourth PAAs, and so on.


    #3 - Expand & retract one question to explore the most closely related questions

    This not only provides you with the most relevant PAAs to the query you’re expanding and retracting, but if it’s in your wheelhouse, you can quickly discover other very relevant PAA opportunities.

    Here I keep expanding and retracting “What is the definition of SEO?”:

    Notice how “SEO” or “search engine optimization” is in every subsequent PAA!? This is no coincidence and has a lot to do with the entity graph.

    First, let’s better understand machine learning and why an entity-based, semi-supervised model is so relevant to search. I’ll then draw out what I think is happening with the above results (like a 5-year-old), and go over ways you can capture these opportunities! Woohoo!


    Training data’s role in machine learning

    Mixups are commonplace in machine learning, mostly due to a lack of quality training data.

    Well-labeled training data is typically the biggest component necessary in training an accurate ML model.

    Fairly recently, the voice search team at Google came across an overwhelming amount of EU voice data that was interpreted as “kdkdkdkd.” It was an obvious gap in their training data (who says “kdkdkdkd”?!), and the engineers had no idea what could be prompting that noise. Confused, they finally figured out that trains and subways were making the sound!

    It’s a silly example, but by adding “kdkdkdkd” = trains/subways to the training data, Google is now able to account for these pesky “kdkdkdkd” inclusions.


    Relational data to the rescue

    Because we don’t always have enough training data to properly train a ML model, we look to relational data for help.

    Example: If I showed you the following picture, you could gather a few things from it, right? Maybe that it appears to be a woman walking down a street, and that it’s probably fall, judging by her hat, scarf, and the leaves on the ground. But it’s hard to determine a whole lot else, right?

    What about now? Here are two other photos from the above photo’s timeline:

    Aha! She appears to be a U.S. traveler visiting London (with her Canon Ti3 camera). Now we have some regional, demographic, and product understanding. It’s not a whole lot of extra information, but it provides much more context for the original cryptic photo, right?

    Perhaps, if Google had integrated geo-relational data with their voice machine learning, they could have more quickly identified that these noises were occurring at the same geolocations. This is just an example; Google engineers are WAY smarter than me and have surely thought of much better solutions.


    Google leverages entity graphs similarly for search

    Google leverages relational data (in a very similar way to the above example) to form better understandings of digital objects and to help provide the most relevant search results.

    A kind of scary example of this is Google’s Expander: a large-scale ML platform built to “exploit relationships between data objects.”

    Machine learning is typically “supervised” (training data is provided, which is more common) or “unsupervised” (no training data). Expander, however, is “semi-supervised,” meaning that it’s bridging the gap between provided and not-provided data. ← SEO pun intended!

    Expander leverages a large, graph-based system to infer relationships between datasets. Ever wonder why you start getting ads about a product you started emailing your friend about?

    Expander is bridging the gap between platforms to better understand online data and is only going to get better.


    Relational entity graphs for search

    Here is a slide from a Google I/O 2016 talk that showcases a relational word graph for search results:

    Slide from Breakthroughs in Machine Learning Google I/O 2016 video.

    Solid edges represent stronger relationships between nodes than the dotted lines. The above example shows there is a strong relationship between “What are the traditions of halloween” and “halloween tradition,” which makes sense. People searching for either of those would each be satisfied by quality content about “halloween traditions.”

    Edge strength can also be determined by distributional similarity, lexical similarity, similarity based on word embeddings, etc.
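
    To make “similarity based on word embeddings” concrete, here’s a toy sketch computing cosine similarity between word vectors. The three-dimensional vectors are invented purely for illustration; real embeddings are learned and have hundreds of dimensions.

    ```python
    # Toy cosine similarity between (made-up) word vectors.
    # Real embeddings are learned and have hundreds of dimensions.
    import math

    def cosine_similarity(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(x * x for x in b))
        return dot / (norm_a * norm_b)

    halloween = [0.9, 0.1, 0.3]  # hypothetical vector for "halloween"
    tradition = [0.8, 0.2, 0.4]  # hypothetical vector for "tradition"
    motor_oil = [0.1, 0.9, 0.7]  # hypothetical vector for "motor oil"

    print(cosine_similarity(halloween, tradition))  # high: strong edge
    print(cosine_similarity(halloween, motor_oil))  # low: weak edge
    ```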


    Infinite PAA machine learning hypothesis:

    Google is providing additional PAAs based on the strongest relational edges to the expanded query.

    You can continue to see this occur in infinite PAA datasets. When a word with two lexical similarities overlaps the suggested PAAs, the topic changes because of it:

    The above topic change occurred through a series of small relational suggestions. A PAA above this screenshot was “What is SMO stands for?” (not a typo, just a neural network doing its best, people!), which led to “What is the meaning of SMO?”, then to “What is a smo brace?” (for ankles).

    This immediately made me think of the relational word graph and what I envision Google is doing:

    I hope my parents hang this on their fridge.

    My hypothesis is that the machine learning model computes that because I’m interested in “SMO,” I might also be interested in ankle brace “SMO.”

    There are ways for SEOs and digital marketers to leverage topical relevance and capture PAA opportunities.


    4 ways to optimize for machine learning & expand your topical reach for PAAs:

    Topical connections can always be made within your content, and by adding high-quality, topically related content, you can strengthen your content’s edges (and expand your SERP real estate). Here are some quick and easy ways to discover related topics:

    #1: Quickly discover Related Topics via MozBar

    MozBar is a free SEO browser add-on that allows you to do quick SEO analysis of web pages and SERPs. The On-Page Content Suggestions feature is a quick and simple way to find other topics related to your page.

    Step 1: Activate MozBar on the page you’re trying to expand your keyword reach with, and click Page Optimization:

    Step 2: Enter the word you’re trying to expand your keyword reach with:

    Step 3: Click On-Page Content Suggestions for your full list of related keyword topics.

    Step 4: Evaluate which related keywords can be incorporated naturally into your current on-page content. In this case, it would be beneficial to incorporate “seo tutorial,” “seo tools,” and “seo strategy” into the Beginner’s Guide to SEO.

    Step 5: Some keywords, like “seo services” and “search engine ranking,” may seem like an awkward add to the page but are still relevant to the products/services you offer. Try adding these topics to a better-fit page, creating a new page, or putting together a strong FAQ with other topically related questions.


    #2: Wikipedia page + SEOBook Keyword Density Checker*

    Let’s say you’re trying to expand your topical keywords in an industry you’re not very familiar with, like “roof repair.” You can use this free hack to pull in frequent and related topics (or script it yourself; see the sketch after these steps).

    Step 1: Find and copy the roof Wikipedia page URL.

    Step 2: Paste the URL into SEOBook’s Keyword Density Checker:

    Step 3: Hit submit and view the most commonly used words on the Wikipedia page:

    Step 4: You can dive even deeper (and often more topically related) by clicking on the “Links” tab to evaluate the anchor text of on-page Wikipedia links. If a subtopic is important enough, it will likely have another page to link to:

    Step 5: Use any appropriate keyword discoveries to create stronger topic-based content ideas.

    *This tactic was mentioned in Experts On The Wire episode on keyword research tools.
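
    If you’d rather script this hack than paste URLs into a form, the sketch below fetches a page and counts its most frequent words. It assumes the requests and beautifulsoup4 packages are installed; the stopword list is deliberately tiny, so expand it for real use.

    ```python
    # Rough keyword-frequency counter for any public page.
    # Requires: pip install requests beautifulsoup4
    import re
    from collections import Counter

    import requests
    from bs4 import BeautifulSoup

    url = "https://en.wikipedia.org/wiki/Roof"
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text().lower()

    words = re.findall(r"[a-z]{3,}", text)  # skip very short tokens
    stopwords = {"the", "and", "for", "that", "with", "from", "are", "this"}
    counts = Counter(w for w in words if w not in stopwords)

    for word, n in counts.most_common(25):
        print(f"{word}: {n}")
    ```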


    #3: Answer the Public

    Answer the Public is a great free resource to discover questions around a particular topic. Just remember to change your country if you’re not seeking results from the UK (the default).

    Step 1: Enter your keyword/topic and select your country:

    Step 2: Explore the visualization of questions people are asking about your keyword:

    Doesn’t this person look like they’re admiring themselves in a mirror (or taking a selfie)? A magnifying glass doesn’t work from that distance, people!

    Note: Not all questions will be relevant to your research, like “why roof of mouth hurts” and “why roof of mouth itches.”

    Step 3: Scroll back up to the top to export the data to CSV by clicking the big yellow button (top right corner):

    The magnifying glass looks much larger here… perhaps it would work at that distance?

    Step 4: Clean up the data and upload the queries to your favorite keyword research tool (Moz Keyword Explorer, SEMRush, Google Keyword Planner, etc.) to discover search volume and SERP feature data, like featured snippets, reviews, related questions (PAA boxes), etc.

    Note: Google’s Keyword Planner does not support SERP features data and provides vague, bucket-based search volume.


    #4: Keyword research “only questions”

    Moz Keyword Explorer provides an “only questions” filter to uncover potential PAA opportunities.

    Step 1: Enter your keyword into KWE:

    Step 2: Click Keyword Suggestions:

    Step 3: Filter by “are questions”:

    Pro tip: Find grouped question keyword opportunities by grouping keywords by “low lexical similarity” and ordering them from highest search volume to lowest:

    Step 4: Select keywords and add to a new or previous list:

    Step 5: Once in a list, KWE will tell you how many “related questions” (People Also Ask boxes) opportunities are within your list. In this case, we have 18:

    Step 6: Export your keyword list to a Campaign in Moz Pro:

    Step 7: Filter SERP Features by “Related Questions” to view PAA box opportunities:

    Step 8: Explore current PAA box opportunities and evaluate where you currently rank for “Related Questions” keywords. If you’re on page 1, you have a better chance of stealing a PAA box.

    Also evaluate what other SERP features are present on these SERPs. Here, Dr. Pete tells me that I might be able to get a reviews rich snippet for “gutter installation”. Thanks, Dr. Pete!

    Hopefully, this research can help energize you to do topical research of your own to grab some relevant PAAs! PAAs aren’t going away anytime soon and I’m so excited for us to learn more about them.

    Please share your PAA experiences, questions, or comments below.





  • Aren't 301s, 302s, and Canonicals All Basically the Same? - Whiteboard Friday

    Posted by Dr-Pete

    They say history repeats itself. In the case of the great 301 vs 302 vs rel=canonical debate, it repeats itself about every three months. In today’s Whiteboard Friday, Dr. Pete explains how bots and humans experience pages differently depending on which solution you use, why it matters, and how each choice may be treated by Google.

    Click on the whiteboard image above to open a high-resolution version in a new tab!

    Video Transcription

    Hey, Moz fans, it’s Dr. Pete, your friendly neighborhood marketing scientist here at Moz, and I want to talk today about an issue that comes up probably about every three months since the beginning of SEO history. It’s a question that looks something like this: Aren’t 301s, 302s, and canonicals all basically the same?

    So if you’re busy and you need the short answer, it’s, “No, they’re not.” But you may want the more nuanced approach. This popped up again about a week [month] ago, because John Mueller on the Webmaster Team at Google had posted about redirection for secure sites, and in it someone had said, “Oh, wait, 302s don’t pass PageRank.”

    John said, “No. That’s a myth. It’s incorrect that 302s don’t pass PR,” which is a very short answer to a very long, technical question. So SEOs, of course, jumped on that, and it turned into, “301s and 302s are the same, cats are dogs, cakes are pie, up is down.” We all did our freakout that happens four times a year.

    So I want to get into why this is a difficult question, why these things are important, why they are different, and why they’re different not just from a technical SEO perspective, but from the intent and why that matters.

    I’ve talked to John a little bit. I’m not going to put words in his mouth, but I think 95% of this will be approved, and if you want to ask him, that’s okay afterwards too.

    Why is this such a difficult question?

    So let’s talk a little bit about classic 301, 302. So a 301 redirect situation is what we call a permanent redirect. What we’re trying to accomplish is something like this. We have an old URL, URL A, and let’s say for example a couple years ago Moz moved our entire site from seomoz.org to moz.com. That was a permanent change, and so we wanted to tell Google two things and all bots and browsers:

    1. First, send people to the new URL, and, second,
    2. pass all the signals (equity, PR, ranking signals, authority, whatever you want to call them) to the new page as well.

    So people and bots should both end up on this new page.

    A classic 302 situation is something like a one-day sale. So what we’re saying is for some reason we have this main page with the product. We can’t put the sale information on that page. We need a new URL. Maybe it’s our CMS, maybe it’s a political thing, doesn’t matter. So we want to do a 302, a temporary redirect that says, “Hey, you know what? All the signals, all the ranking signals, the PR, for Google’s sake keep the old page. That’s the main one. But send people to this other page just for a couple of days, and then we’re going to take that away.”

    So these do two different things. One of these tells the bots, “Hey, this is the new home,” and the other one tells it, “Hey, stick around here. This is going to come back, but we want people to see the new thing.”
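
    If it helps to see the two redirects side by side in code, here’s a minimal sketch using Flask (a hypothetical setup with placeholder routes, not anything Moz actually runs). On the wire, the only difference is the status code, but that code is exactly the intent signal Google reads.

    ```python
    # Minimal Flask app contrasting a permanent (301) and a temporary
    # (302) redirect. Routes and target URLs are placeholders.
    from flask import Flask, redirect

    app = Flask(__name__)

    @app.route("/old-home")
    def moved_forever():
        # 301: "This page has moved for good; pass my ranking signals along."
        return redirect("https://moz.com/new-home", code=301)

    @app.route("/product")
    def one_day_sale():
        # 302: "Keep indexing /product, but send people to the sale for now."
        return redirect("/product-sale", code=302)

    if __name__ == "__main__":
        app.run()
    ```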

    So I think sometimes Google interprets our meaning and can change things around, and we get frustrated because we go, “Why are they doing that? Why don’t they just listen to our signals?”

    Why are these differentiations important?

    The problem is this. In the real world, we end up with things like this, we have page W that 301s to page T that 302s to page F and page F rel=canonicals back to page W, and Google reads this and says, “W, T, F.” What do we do?

    We sent bad signals. We’ve done something that just doesn’t make sense, and Google is forced to interpret us, and that’s a very difficult thing. We do a lot of strange things. We’ll set up 302s because that’s what’s in our CMS, that’s what’s easy in an Apache rewrite file. We forget to change it to a 301. Our devs don’t know the difference, and so we end up with a lot of ambiguous situations, a lot of mixed signals, and Google is trying to help us. Sometimes they don’t help us very well, but they just run into these problems a lot.

    In this case, the bots have no idea where to go. The people are going to end up on that last page, but the bots are going to have to choose, and they’re probably going to choose badly because our intent isn’t clear.

    How are 301s, 302s, and rel=canonical different?

    So there are a couple situations I want to cover, because I think they’re fairly common and I want to show that this is complex. Google can interpret, but there are some reasons and there’s some rhyme or reason.

    1. Long-term 302s may be treated as 301s.

    So the first one is that long-term 302s are probably going to be treated as 301s. They don’t make any sense. If you set up a 302 and you leave it for six months, Google is going to look at that and say, “You know what? I think you meant this to be permanent and you made a mistake. We’re going to pass ranking signals, and we’re going to send people to page B.” I think that generally makes sense.

    Some types of 302s just don’t make sense at all. So if you’re migrating from non-secure to secure, from HTTP to HTTPS and you set up a 302, that’s a signal that doesn’t quite make sense. Why would you temporarily migrate? This is probably a permanent choice, and so in that case, and this is actually what John was addressing in this post originally, in that case Google is probably going to look at that and say, “You know what? I think you meant 301s here,” and they’re going to pass signals to the secure version. We know they prefer that anyway, so they’re going to make that choice for you.

    If you’re confused about where the signals are going, then look at the page that’s ranking, because in most cases the page that Google chooses to rank is the one that’s getting the ranking signals. It’s the one that’s getting the PR and the authority.

    So if you have a case like this, a 302, and you leave it up permanently and you start to see that Page B is the one that’s being indexed and ranking, then Page B is probably the one that’s getting the ranking signals. So Google has interpreted this as a 301. If you leave a 302 up for six months and you see that Google is still taking people to Page A, then Page A is probably where the ranking signals are going.

    So that can give you an indicator of what their decision is. It’s a little hard to reverse that. But if you’ve left a 302 in place for six months, then I think you have to ask yourself, “What was my intent? What am I trying to accomplish here?”

    Part of the problem with this is that when we ask this question, “Aren’t 302s, 301s, canonicals all basically the same?” what we’re really implying is, “Aren’t they the same for SEO?” I think this is a legitimate but very dangerous question, because, yes, we need to know how the signals are passed and, yes, Google may pass ranking signals through any of these things. But for people they’re very different, and this is important.

    2. Rel=canonical is for bots, not people.


    So I want to talk about rel=canonical briefly because rel=canonical is a bit different. We have Page A and Page B again, and we’re going to canonical from Page A to Page B. What we’re basically saying with this is, “Look, I want you, the bots, to consider Page B to be the main page. You know, for some reason I have to have these near duplicates. I have to have these other copies. But this is the main one. This is what I want to rank. But I want people to stay on Page A.”

    So this is entirely different from a 301 where I want people and bots to go to Page B. That’s different from a 302, where I’m going to try to keep the bots where they are, but send people over here.

    So take it from a user perspective. In Q&A, people ask me all the time, “Well, I’ve heard that rel=canonical passes ranking signals. Which should I choose? Should I choose that or a 301? What’s better for SEO?”

    That’s true. We do think it generally passes ranking signals, but “What’s better for SEO?” is a bad question here, because these are completely different user experiences: either you’re going to want people to stay on Page A, or you’re going to want people to go to Page B.

    Why this matters, both for bots and for people

    So I just want you to keep in mind, when you look at these three things, it’s true that 302s can pass PR. But if you’re in a situation where you want a permanent redirect, you want people to go to Page B, you want bots to go to Page B, you want Page B to rank, use the right signal. Don’t confuse Google. They may make bad choices. Some of your 302s may be treated as 301s. It doesn’t make them the same, and a rel=canonical is a very, very different situation that essentially leaves people behind and sends bots ahead.

    So keep in mind what your use case actually is, keep in mind what your goals are, and don’t get over-focused on the ranking signals themselves or the SEO uses, because all of these three things have different purposes.

    So I hope that makes sense. If you have any questions or comments or you’ve seen anything weird actually happen on Google, please let us know and I’ll be happy to address that. And until then, we’ll see you next week.

    Video transcription by Speechpad.com





  • Mastering Google Search Operators in 67 Easy Steps

    Posted by Dr-Pete

    Any SEO worth their sustainably harvested pink Himalayan salt knows that Google offers a variety of advanced search operators – special commands that take you above and beyond regular text searches. Learning search operators is a bit like learning chess, though. It’s easy to memorize how each piece moves, but that’s about 1% of your path toward mastery. I know that the pointy-hat guy in chess moves diagonally, but that doesn’t mean I’m about to take on Kasparov or Deep Blue.

    Instead of just listing all of the operators and telling you what they do, I’d like to try something different. This post is a journey in 67 parts, split into five functional stories:

    1. Content Research
    2. Title Research
    3. Plagiarism Check
    4. Competitive Research
    5. Technical SEO/Audits

    You can skip around, but I’d suggest following the story from the beginning. When you’re done, you’ll understand not only what each operator does, but how to use it in real-world situations and mix-and-match it with other useful operators.


    I. Content Research

    Crafting original content in 2017 requires wading into the sea of content that’s already been created, and Google remains the most complete map of that sea. Advanced search operators are invaluable research tools for content marketers. Let’s walk through a sample content journey…

    1. Find all the content

    tesla

    Let’s say you’ve got a blog post to write about the inventor Nikola Tesla. You hop over to Google and search “tesla,” only to find a lot of results like this:

    Google has decided that Tesla Motors is the dominant intent for this phrase, which doesn’t help you very much for your current project.

    2. Narrow your search

    nikola tesla

    So, of course you add more keywords and narrow your search. Now you’re on the right track:

    Anyone who’s ever run a Google search understands this, but there’s an important point here that we often overlook. Whenever you string together more than one word in a Google search, Google connects them with a logical AND. This is true of both keywords and operators. If you combine operators, Google will assume that you meant AND and will try to meet all conditions.

    3. Mind special characters

    tesla ac/dc

    Let’s say you want to specifically find pages with the phrase “ac/dc”, so you try the search above:

    Notice the highlighted words – Google has returned anything matching “AC” and “DC” separately. In this case, they’ve treated the forward slash as the same as a space, which probably isn’t what you intended.

    4. Force exact match with quotes

    tesla “ac/dc”

    By putting quotation marks around a phrase, you can force an exact-match search. This requires Google to match the specific, full phrase – with all terms and in the order specified:

    This is a lot closer to what you probably expected. Notice the highlighting in the second result, where Google seems to have matched “AC-DC”. This is a lot closer than the previous attempt, but Google is still taking some liberties with the forward slash. Be sure to do a sanity check of results any time you use non-alphanumeric characters in a search.

    5. Force a logical OR

    tesla OR edison

    If you specifically want a logical OR between keywords or operators, use the “OR” operator. OR must be in all caps; alternatively, you can use the pipe symbol (|):

    Note that, in most cases, Google is still going to give priority to results that contain both terms. Specifying logical OR is most useful when two terms only co-occur rarely.

    6. Group terms with parentheses

    (tesla OR edison) alternating current

    Some operators, including OR, are more useful in complex searches. Here, we’re using parentheses to group “tesla OR edison” and then are adding “alternating current” as an AND condition:

    Requiring all three terms might be unnecessarily restrictive. By using both ANDs and ORs in the same search, we’re giving Google a bit more flexibility. Since you probably don’t want to memorize the precedence of all Google search operators, I highly recommend using parentheses whenever you’re in doubt.

    7. Exclude specific terms

    tesla -motors

    Maybe you want to know what other uses of “tesla” are out there, beyond Tesla Motors. You could use the (-) operator to tell Google to exclude any result with “motors” in it:

    Browsing these results, you can see quickly that Tesla is also a band and a unit of measurement. In addition, Tesla the company makes products other than cars. Keyword exclusions are also called “negative keywords” (thus the minus sign).

    8. Exclude multiple terms

    tesla -motors -car -battery

    Just like positive keywords, you can chain together negative keywords:

    Keep in mind that each minus sign should only be paired with a single keyword or operator.
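
    Note that the minus sign works on operators as well as keywords. For example, this hypothetical query excludes one keyword and one entire site (the “site:” operator is covered later in this post):

    tesla -motors -site:wikipedia.org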

    9. Exclude exact-match phrases

    tesla -motors -“rock n roll”

    You can exclude full phrases by using the (-) sign followed by the phrase in quotes:

    You can combine individual negative keywords with negative exact-match phrases as needed.

    10. Match broadly with wildcards

    tesla -motors “rock * roll”

    What if you specifically wanted to include more about the rock-n-roll band, but you didn’t care whether it was spelled “rock-n-roll,” “rock and roll,” or “rock & roll,” etc.? You can use the asterisk (*) operator as a wildcard to replace any single word:

    Wildcards behave most predictably within an exact-match phrase, allowing you to find near-matches when you can’t pin down your search to a single phrase. The (*) operator works only at the word level – there is no single-character wildcard operator.

    11. Find terms near each other

    tesla AROUND(3) edison

    Here’s a nifty one. Maybe you want to find results where “Tesla” and “Edison” not only appear in the document but are fairly close to each other. The AROUND(X) operator tells Google to only return results where the two words are within X words of each other:

    Phrases like “Tesla vs. Thomas Edison” show up as matches, but an article where the two men were mentioned in separate paragraphs wouldn’t.

    12. Find near exact-match phrases

    “nikola tesla” AROUND(2) “thomas alva edison”

    What if, for some reason, you really needed references to include full names? You can combine AROUND(X) with exact-match phrases (in quotes):

    AROUND(X) only works on the entities immediately preceding and following it, so be careful when combining it with other operators or phrases that aren’t exact-match. Note that AROUND(0) returns strange results – if you want to return two words only if they appear together, use an exact-match phrase instead.
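
    For example, in the hypothetical query below, the proximity condition applies only to the two exact-match phrases on either side of AROUND(5), while the “site:” operator (covered next) applies to the whole search:

    “nikola tesla” AROUND(5) “thomas edison” site:edu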

    13. Find content on specific sites

    nikola tesla site:pbs.org

    The “site:” operator is an advanced command that restricts your search to a single domain. We usually think of it as a technical SEO and audit tool, but it can also help you refine content searches. Let’s say you remembered reading an article on PBS about Tesla, but lost the URL:

    Typically, you’ll use “site:” with a root domain (i.e. leave subdomains, like “www”, off) to match as broadly as possible. Advanced operators like “site:” can be combined with each other and with keywords.

    14. Find content on specific TLDs

    nikola tesla site:edu

    You don’t have to include a full domain with “site:”. For example, let’s say you wanted to find any content about Nikola Tesla on a university website. You could search across all “.edu” domains (“.edu” is a top-level domain, or TLD):

    The “site:” operator will not work on a partial domain name. It only accepts full domains, root domains, or TLDs. You can use it on country-specific TLDs (ccTLDs), such as “co.uk” or “com.sg”.
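
    For example, to limit your search to UK commercial domains:

    nikola tesla site:co.uk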

    15. Find content on multiple TLDs

    nikola tesla (site:gov OR site:edu)

    Just as with keywords, you can combine “site:” operators with logical OR to search multiple domains:

    Often, it’s easier and a bit less confusing to run individual searches, but this example is just to illustrate that you can combine advanced operators in complex ways.

    16. Dealing with broad matches

    discount airfare

    Google is getting better at matching synonyms, which is usually a good thing, but it sometimes means that results are a lot broader than you might have expected:

    Here, a search for “discount airfare” is returning keywords like “cheapest flights,” “cheap flights,” “airfare deals,” and a variety of other combinations.

    17. Use exact-match to block synonyms

    “discount airfare”

    This is another situation where exact-match can help. It doesn’t just tell Google to use the full phrase, but it blocks Google from returning any kind of broad match, including synonyms:

    Obviously, the results may still contain synonyms (naturally written content often does), but using exact-match ensures that there will be at least one instance of “discount airfare” in each of the results you get back.

    18. Exact-match on a single word

    discount “airfare”

    This may seem counter-intuitive, but you can apply exact match to just one word. In this case, putting an exact match on “airfare” blocks Google from using synonyms just for that word:

    Here, Google is free to match on synonyms for “discount” (such as “cheapest”), but every result is forced to include “airfare.” Exact-match a single word when you want to rule out variations of that word.

    19. What to do when exact-match fails

    “orbi vs eero vs google wifi”

    The other day, I was searching for articles that specifically compared Orbi, Eero, and Google Wifi networking hardware. Something odd happened when I searched on the exact-match phrase:

    It’s not obvious from the search results themselves, but the first result doesn’t contain the phrase anywhere in the body of the text. On rare occasion, Google may match a phrase on secondary relevance factors, such as inbound link anchor text.

    20. Search only in the body text

    intext:“orbi vs eero vs google wifi”

    In these rare cases, you can use the “intext:” operator. This forces Google to find the text in the body of the document. Now, all of the top results clearly have an exact match in the content itself:

    Interestingly, the second result reveals what happened with our last search. A Reddit post featured an article from The Verge with an alternate title and used that title as the anchor text. Reddit apparently had enough authority to generate a match via the anchor text alone.

    21. Find a set of keywords in the text

    allintext: orbi eero google wifi

    What if you want to find a set of words, but they don’t need to be in an exact-match phrase? You could use a separate “intext:” operator for each word, or you could use “allintext:” which tells Google to apply “intext:” to all of the words following the operator:

    All of the results have the target keywords in the body text, in some combination or order. Be very careful about mixing “allintext:” (or any “allin…:” operator) with other commands, or you could end up with unexpected results. The “allintext:” operator will automatically try to process anything that follows it.
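
    If you do need to mix in other operators, the safer equivalent is a separate “intext:” operator per term:

    intext:orbi intext:eero intext:“google wifi”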

    (Special thanks to Michael Martinez for working through some “intext:” examples with me on Twitter, and to Google’s Gary Illyes for clarifying some of the details about how exactly “intext:” works.)


    II. Title Research

    You’ve done your content research, and now it’s time to pin down a title. You want to capture those clicks, but, of course, you don’t want to be unoriginal. Here are some search operator combos for title research.

    22. Check for a specific phrase

    “tesla vs edison”

    You’ve settled on using “Tesla vs. Edison” in your title, so let’s do a quick check on content with that exact-match phrase:

    You’ve pinned down Google to an exact-match phrase, but that phrase can occur anywhere in the text. How do we look for it in just the document title?

    23. Check for a phrase in the title

    intitle:“tesla vs edison”

    Use the “intitle:” operator to specify that a keyword or phrase (in quotes) has to occur in the document title:

    Be aware that sometimes Google may rewrite a display title in search results, so it’s possible to get a result back where the phrase doesn’t seem to match the title because Google has rewritten it.

    24. Check multiple keywords in title

    intitle:tesla intitle:vs intitle:edison

    If you want to check for multiple keywords in a title, but don’t want to restrict yourself to exact-match, you can string together multiple “intitle:” operators with single keywords:

    Of course, this can be a bit clunky. Luckily, there’s an easier way…

    25. Check multiple keywords easily

    allintitle: tesla vs edison

    Like “allintext:”, there’s an “allintitle:” operator. It applies “intitle:” to every keyword following it:

    This returns roughly the same results as #24, which doesn’t make for a very interesting screenshot, but is exactly what we want it to do. Again, be careful combining “allintitle:” with other operators, as it will try to consume everything following it.
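
    For example, if you later wanted to add a “site:” restriction (moz.com here is just an arbitrary illustration), the safer form is the chain of individual “intitle:” operators from #24:

    site:moz.com intitle:tesla intitle:vs intitle:edison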

    26. Check for titles with lists

    intitle:“top 10 facts” tesla

    Maybe you’ve got your heart set on a listicle, but you want to make sure it hasn’t been done to death. You can combine an “intitle:” operator with a general keyword search on a topic:

    These results are all pages that talk about Tesla but have “Top 10 Facts” in the title.

    27. Find lists and exact-match phrases

    intitle:“top 10 facts” “nikola tesla”

    Oops, we’re pulling in results about Tesla Motors again. Luckily, you can combine “intitle:” with exact-match phrases and other, more complex operator combos:

    This is much closer to what you probably had in mind, but the bad news is that the “Top 10” format does seem to have been overdone, even in the realm of Nikola Tesla.

    28. Check for Top X lists

    intitle:“top 7..9 facts” “nikola tesla”

    The range (..) operator lets you search for a specific range of numbers. Maybe you’re tired of Top 10, but don’t want too short a list. Let’s check out what Top 7, 8, and 9 lists are out there:

    This returned only four results, and they were all videos. So, at least you’re on the right track, originality-wise. Once you master search operators, you’ll eventually reach the mythical end of the Internet.

    29. Check the title for this post

    intitle:“search operators” “in * easy steps”

    Let’s put all of this to the test – how original is my title for this post? I’m not expecting an exact match to a post with 67 steps, but what about any post mentioning “Search Operators” in the title that also uses some variation of “in * easy steps” anywhere in the result?

    It looks like I did alright, from an originality standpoint. Of course, there are many ways to mix-and-match operators to find similar titles. Ultimately, you have to decide how you define “unique.”


    III. Plagiarism Check

    You’ve finally published that article, but you suspect someone else may have copied it and is taking your traffic. Advanced search operators can be great for hunting down plagiarism.

    30. Find articles with your exact title

    intitle:“duplicate content in a post-panda world”

    Use the “intitle:” operator with your exact-match title to easily spot whether someone has copied your entire article with no modifications. Here’s a search based on a post I wrote a couple of years back:

    Ok, you probably didn’t need to know about the original article, so let’s try again…

    31. Find title matches, excluding sites

    intitle:“duplicate content in a post-panda world” -site:moz.com

    Use (-) with the “site:” operator to exclude specific sites. In this case, we already know that the original title was posted on Moz.com:

    It turns out that two of these sites are just linking to the post in kind of a low-quality but not outright malicious way. What you really want to know is whether someone is copying the text wholesale…

    32. Find unique, exact-match text

    “they were frolicking in our entrails” -site:moz.com

    Another alternative is to run exact-match on a long, unique phrase. Luckily, this particular blog post has some pretty unique phrases. I’m going to keep the Moz.com exclusion:

    The first result is a harmless (if slightly odd) Facebook post, but the other two are full, copied-and-pasted duplicates of the original post.

    33. Find unique text only in the body

    intext:“they were frolicking in our entrails” -site:moz.com -site:facebook.com

    If you want to be completely sure that the unique text is in the body of the document, you can use the “intext:” operator. Here, I’ve added both “intext:” and a Facebook exclusion. Within reason, it’s ok to mix-and-match a variety of operators:

    Practically speaking, “intext:” often returns similar results to the exact-match phrase by itself. I typically use “intext:” only when I’m seeing strange results or want to make absolutely sure that I’m only looking at document body text.

    34. Find a quote you’re not sure about

    i would rather kiss a wookiee

    What if you’re looking for a long quote, but you can’t remember if you’re getting that quote quite right? We often equate exact-match with long searches, but sometimes it’s better to let Google go broad:

    Here, Google is helpfully reminding me that I’m a lousy Star Wars fan. I’ve even got an article about all the other people who are wrong about this, too.


    IV. Competitive Research

    In some cases, your research may be very focused on what kind of content the competition is creating. Google search operators can help you easily narrow down what your competitors are up to…

    35. Start with a basic search

    tesla announcements

    Let’s say you want to find out who’s publishing Tesla Motors announcements, so you start with the simplest query you can think of:

    You’re probably not looking for Tesla’s own announcements, so you do an exclusion…

    36. Exclude obvious sites

    tesla announcements -site:tesla.com

    You grab the handy “site:” operator and run a negative (-) on Tesla’s own site, resulting in:

    That’s a little better. These are all pretty familiar competitors if you’re in the news game.

    37. Target specific competitors

    tesla announcements site:nytimes.com

    Maybe you want to focus on just one competitor. You can use the “site:” operator for that, too:

    Obviously, this approach is going to work best for large competitors with a high volume of content.

    38. Target a specific subdomain

    tesla announcements site:wheels.blogs.nytimes.com

    Remember that you can use “site:” with a full subdomain. Maybe you just want to find out what The New York Times’ “Wheels” auto industry blog is posting about.

    You can, of course, exclude specific subdomains with “-site:” as well.

    39. Target a specific author on a site

    tesla announcements site:nytimes.com “neal e boudette”

    Maybe you’re interested in just a single author. There’s no reliable author search operator for organic results, but in most cases, just including the author’s name as exact-match text will do the trick:

    Make sure to pull up an article first to see how the author’s name is presented (middle initial, etc.).

    40. Target by keywords, site, and title

    tesla announcements site:nytimes.com intitle:earnings

    If you wanted Tesla announcements in the New York Times that only mention “Earnings” in the title, then you can mix-and-match operators as needed:

    Don’t be afraid to get creative. The Google index is a big, big place and there’s always more to be found, especially on very large sites.

    41. Find related competitors

    related:nytimes.com

    What if you wanted to branch out to other publications? By using the “related:” operator with a root domain, Google will show you other sites/domains like the one you specify:

    The “related:” operator is great when it works, but be warned that it only works for certain niches and typically for larger sites. It’s also one of the rare Google search operators that can’t be combined with other operators.

    42. Find content in a specific path

    tesla announcements site:fortune.com/2016

    If you want to drill down into a site, you can specify URL folders with the “site:” operator. Fortune, for example, is conveniently organized with year-based folders, so you can easily see just articles from 2016:

    Keep in mind that this only works for parts of the URL directly following the domain name. So, how do you search on text in other parts of the URL?

    43. Search broadly for a “folder”

    tesla announcements inurl:2016

    Luckily, Google also has an “inurl:” operator. By searching on a year, for example, you can find that year anywhere it happens to appear in the result URL:

    Keep in mind that the text you specify with “inurl:” can appear anywhere in the URL, not just at the folder level.

    44. Search by a specific date range

    tesla announcements daterange:2457663-2457754

    What if you really want to narrow down your date range? Google also has a “daterange:” operator which lets you pinpoint publication dates to the day, in theory. For example, here’s a search for Q4 of 2016:

    Unfortunately, in regular organic results, publication dates aren’t always accurate, and “daterange:” can, in practice, return some pretty strange results. You may have noticed, too, that that’s not your typical date format. The “daterange:” operator uses the Julian date format.
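
    If you’d rather not look up Julian dates by hand, the conversion is easy to script. Here’s a minimal sketch in Python (standard library only); the offset of 1721425 days converts Python’s date ordinals to whole Julian day numbers, and it reproduces the Q4 2016 range used above:

        from datetime import date

        def julian_day(d: date) -> int:
            # date.toordinal() counts days from 0001-01-01 in the proleptic
            # Gregorian calendar; adding 1721425 yields the Julian day number
            # (the whole-day count that "daterange:" expects).
            return d.toordinal() + 1721425

        print(julian_day(date(2016, 10, 1)))   # 2457663
        print(julian_day(date(2016, 12, 31)))  # 2457754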

    45. Search by broad date range

    tesla announcement 2015..2017

    If you don’t need your date range to be particularly precise, consider using the range (..) operator with a year on either side of it. As numbers go, years are generally unique enough to return reasonable results:

    Please note that this is not specifically a date search, but as cheats go, it’s not a bad one. Unfortunately, the range operator doesn’t always work properly when paired with “inurl:” and other advanced operators.

    46. Target just one type of file

    tesla announcements filetype:pdf

    The “filetype:” operator lets you specify an extension, such as PDF files. Let’s say you only want Tesla announcements that have been published as PDFs:

    Other file extensions to try are “doc” (Word), “xls” (Excel), “ppt” (PowerPoint), and “txt” (text files). You can also use “filetype:” to specify certain varieties of web pages, including “html”, “php”, “asp”, etc. Keep in mind that the file extension typically has to be listed in the URL, so these searches are not exhaustive.
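
    Since each “filetype:” instance takes a single extension, you can lean on the OR and grouping tricks from earlier to cover several at once – for example:

    tesla announcements (filetype:pdf OR filetype:doc)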

    47. Find sites linking to competitors

    link:nytimes.com tesla

    The “link:” operator lets you do competitive link research. For example, the search above looks for all documents relevant to Tesla that have links from The New York Times:

    Ok, so mostly this tells you that The New York Times links a lot to The New York Times. That’s probably not quite what you were looking for…

    48. Find links excluding the source

    link:nytimes.com -site:nytimes.com tesla

    Let’s combine “link:” with a negative (-) “site:” operator to remove links from The New York Times:

    Please note that Google has deprecated a lot of the functionality of the “link:” operator and the results it returns are just a sample (and, potentially, an unreliable sample). For in-depth competitive link research, we strongly recommend third-party tools, including our own Open Site Explorer.

    49. Search inside link anchor text

    inanchor:“tesla announcements”

    You can use the “inanchor:” operator to search inside linked text. So, for example, the search above looks for sites being linked to from sites using “tesla announcements” in the linked text. In other words, the results represent the targets of those links (not the sources):

    Please note that, like the “link:” operator, the “inanchor:” operator represents only a small sample of the index and is no longer actively supported by Google. Take its results with a grain of salt.

    50. Search multiple words in anchor text

    allinanchor: tesla announcements “model x”

    Like the other “allin…” varieties, “allinanchor:” applies to every word after it, looking for all of those words in the anchor text, but not as an exact-match:

    The three link-based operators (“link:”, “inanchor:”, “allinanchor:”) can be useful for your initial research, but do not expect them to return a full, accurate representation of all links to your site or your competitors.


    V. Technical SEO/Audits

    Advanced Google search operators can also be powerful tools for understanding how sites are indexed and for performing technical audits. Technical SEO is a complex subject, of course, but here are a few examples to get you started:

    51. Glimpse into a site’s index

    site:amazon.com

    It all starts with the “site:” operator, which, at its most basic level, can help you get a glimpse of how Google indexes a site. Here are a few results from Google’s index of Amazon.com:

    Please note that the result count here (and for any large-volume search) is at best an estimate. Given an estimate of 119,000,000 pages, though, we can be assured that the real number is massive. On the scale of any decent-sized site, you’re going to want to drill down…

    52. Filter out the “www” subdomain

    site:amazon.com -inurl:www

    To drill deep into a site’s index, the combination of “site:” with “inurl:” will quickly become your best friend. For example, maybe you want to see only pages on Amazon that aren’t under the “www” subdomain. You could use “site:” along with a negative match (-) on the “inurl:” operator:

    Even in the first few results, you can see a sampling of the other subdomains that Google is indexing. This can give you a good starting point for where to drill down next.

    53. Filter out multiple subdomains

    site:amazon.com -inurl:www -inurl:logistics -inurl:developer -inurl:kdp

    You can extend this concept pretty far, building successively on earlier searches to return narrower and narrower lists of pages. Here’s an example with four “-inurl:” operators:

    I’ve done this with over a dozen “inurl:” statements and am not aware of any fixed limit on how many operators you can combine in a single search. Most sites aren’t big enough to require those kinds of extremes, but it’s good to know that it’s possible if and when you need it.

    54. Focus on a single subdomain

    site:developer.amazon.com

    Alternatively, you can focus on a single subdomain. For this, I generally prefer to include the subdomain in the “site:” operator instead of using “inurl:”. Otherwise, you could find the text anywhere in the URL:

    You could extend this concept to dive deeper into any of the sub-folders returned here (“/ios”, “/ja”, etc.) and even combine a more specific “site:” operator with additional “inurl:” operators.

    55. Filter for non-secure pages

    site:amazon.com -inurl:https

    Interestingly, you can use “inurl:” to include or exclude secure (https:) pages:

    If you’re moving a site from “http:” to “https:”, this trick can help you make sure that new pages are being indexed properly and old pages are gradually disappearing from the index.

    56. Search for a URL parameter

    site:amazon.com inurl:field-keywords

    You can also use “inurl:” to target URL parameters on dynamic pages. For example, let’s say you want to see what kind of internal search pages Google is indexing on Amazon:

    Please note that there’s no way to restrict the match to just the parameter portion of the URL – Google may find the text anywhere in the URL. On the bright side, many URL parameters tend to have unique names.

    57. Search multiple URL attributes

    allinurl: amazon field-keywords nikon

    Much like “allintitle:” and “allintext:”, there’s an “allinurl:” operator. In this example, you’re looking for internal search pages on Amazon that have the word “Nikon” in the URL:

    Unfortunately, “allinurl:” suffers from two problems. One, you can’t reliably combine it with “site:”, which limits your options. Two, it tends to return strange results. For example, notice that the top results for my US search were from Amazon France. In most cases, I recommend using multiple “inurl:” statements instead.
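
    For example, the multiple-“inurl:” version of the search above would look like this:

    site:amazon.com inurl:field-keywords inurl:nikon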

    58. Find stray text files

    site:amazon.com filetype:txt -inurl:robots.txt

    You might be wondering if you left any stray documentation files lying around your site that happened to get picked up by Google. You can check using a combination of “site:” and “filetype:”:

    In this case, you want to exclude “robots.txt” (using “-inurl:”) because Amazon has dozens of Robots files. This combo is a good way to clean up files that have been accidentally left live on a site.

    59. Dig deep into duplicate content

    site:amazon.com “hot wheels 20 car gift pack”

    A site like Amazon has massive potential for internal duplicate content. By using the “site:” operator with exact match phrases, you can start to pin down near-duplicates:

    In this case, Google is still returning almost 1,000 results. Time to dig deeper…

    60. Dig through duplicate titles

    site:amazon.com intitle:“hot wheels 20 car gift pack”

    You can use “site:” plus “intitle:” to find pages on a site that may be exact duplicates:

    Believe it or not, Google still returns over 100 matching pages. Let’s keep at it…

    61. Find title duplicates with exclusions

    site:amazon.com intitle:“hot wheels 20 car gift pack” -inurl:review -inurl:reviews

    You dig in and notice that many of the results in #60 are review pages, with either “review” or “reviews” in the URL. So, you build on the previous search and add two exclusions:

    Voilà… you’re down to just a half-dozen results. You just leveled up in technical SEO.

    62. Find similar products with different counts

    site:amazon.com “hot wheels * car gift pack”

    Maybe you’re curious about other Hot Wheels gifts packs that represent similar products but not exactly the same one. You could replace “20” with the wildcard (*) operator:

    Unfortunately, wildcards don’t play well with the “intitle:” operator, so you’ll generally be restricted to exact-match phrases outside of advanced operators.

    63. Find similar products with exclusions

    site:amazon.com “hot wheels * car gift pack” -20

    Given all of the previous searches, you probably don’t need to know about the 20-packs, so you can add an exclusion on the number 20 (just treat it as a word with negative match):

    Looks like there’s a healthy number of 5-car gift packs as well. The plot thickens…

    64. Follow the rabbit hole to Wonderland

    site:amazon.com “hot wheels * car gift pack” -20 -5

    It’s time to take the red pill and find out just how deep this rabbit hole goes. You can keep adding exclusions and take out the 5-packs as well:

    Finally, you’re nearing the bottom. This process may seem a bit obsessive, but auditing large sites is a process of identifying potential problems and drilling down until you either pin down the issues or decide they aren’t worth worrying about. Once you master them, advanced search operators shine at drill-downs.

    65. Bonus: Show me the money!

    site:amazon.com “hot wheels” $19.95

    I woke up in a cold sweat at 2am realizing I had forgotten a search operator (sadly, while you may find it funny, this is not a joke). I warned earlier that special characters can produce weird results, but one that Google does recognize is the dollar sign ($):

    This isn’t really a site audit example, but it fits well with our Amazon story. Keep in mind that, while Google will honor the ($), the matching prices could appear anywhere on those pages – many Amazon pages list multiple prices. Still, it can be a useful tool to add to your arsenal.

    66. Find results in a price range

    site:amazon.com “hot wheels” $19..$20

    You can also combine a ($) search with the range operator (..) and search a range of prices. Let’s say you wanted to find any pages mentioning “Hot Wheels” and prices in the $19-20 range:

    While this tactic can definitely be useful for general product research, e-commerce sites can also use it in an audit to find pages with incorrect or outdated prices.

    67. Find other TLDs for your brand

    site:amazon.* -site:amazon.com

    This last tip could be either an audit trick or a way to track down the competition, depending on how you use it. Use the wildcard (*) in the top-level domain (TLD) to find any site with the same name, and then exclude the main site:

    For a large site, like Amazon, this could help you find other legitimate TLDs, including country-specific TLDs (ccTLDs). Alternatively, you could use this trick to find competitors who have registered your brand name under other TLDs.
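
    For your own brand, the same pattern applies – just substitute your domain name (the one below is a hypothetical placeholder):

    site:yourbrand.* -site:yourbrand.com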


    Wait, You’re Still Here?

    Congratulations for making it this far. I hope you’ve picked up at least a handful of useful tricks and the confidence to experiment. If you have favorites I’m missing, please feel free to share them in the comments. I’m sure there’s a good trick or ten I’ve never seen.

    If you need a quick reference, we’ve launched a new Search Operators reference and cheat sheet in the Learning Center. This resource reflects the current state of Google’s search operators, as best we know, including deprecated operators.


  • How to Create Content That Keeps Earning Links (Even After You Stop Promoting It)

    Posted by kerryjones

    Do your link building results look something like this?

    1. Start doing outreach
    2. Get links
    3. Stop doing outreach
    4. No more links

    Everyone talks about the long-term benefits of using content marketing as part of a link building strategy. But without the right type of content, your experience may be that you stop earning links as soon as you stop doing outreach.

    In this sense, you have to keep putting gas in the car for it to keep running (marketing “gas” = time, effort, and resources). But what if there was a way to fill up the car once, and that would give it enough momentum to run for months or even years?

    An example of this is a salary negotiations survey we published last year on Harvard Business Review. The study was picked up by TechCrunch months after we had finished actively promoting it. We didn’t reach out to TechCrunch. Rather, this writer presumably stumbled upon our content while doing research for his article.

    So what’s the key to long-term links? Content that acts as a source.

    The goal is to create something that people will find and link to when they’re in need of sources to cite in content they are creating. Writers constantly seek out sources that will back up their claims, strengthen an argument, or provide further context for readers. If your content can serve as a citation, you can be in a good position to earn a lot of passive links.

    Read on for information about which content types are most likely to satisfy people in need of sources and tips on how to execute these content types yourself.

    Original research and new data

    Content featuring new research can be extremely powerful for building authoritative links via a PR outreach strategy.

    A lot of the content we create for our clients falls under this category, but not every link our client campaigns earn is a direct result of us doing outreach.

    In many cases, a large number of the links our client research campaigns earn come from what we call syndication. This is what typically plays out when we get a client’s campaign featured on a popular, authoritative site (which is Site A in the following scenario):

    • Send content pitch to Site A.
    • Site A publishes article linking to content.
    • Site B sees content featured on Site A. Site B publishes article linking to content.
    • Site C sees content featured on Site A. Site C publishes article linking to content.
    • And so on…

    So, what does this have to do with long-term link earning? Once the content is strategically seeded on relevant sites using outreach and syndication, it is well-positioned to be found by other publishers.

    Site A’s content functions as the perfect citation for these additional publishers because it’s the original source of the newsworthy information, establishing it as the authority and thus making it more likely to be linked to. (This is what happened in the TechCrunch example I shared above.)

    Examples

    In a recent Experts on the Wire podcast, guest Andy Crestodina talked about the “missing stat.” According to Andy, most industries have “commonly asserted, but rarely supported” statements. These “stats” are begging for someone to conduct research that will confirm or debunk them. (Side note: this particular podcast episode inspired this post – definitely worth a listen!)

    To find examples of content that uncovers a missing stat in the wild, we can look right here on the Moz blog…

    Confirming industry assumptions

    When we did our native advertising versus content marketing study, we went into it with a hypothesis that many fellow marketers would agree with: Content marketing campaigns perform better than native advertising campaigns.

    This was a missing stat; there hadn’t been any studies done proving or debunking this assumption. Furthermore, there wasn’t any publicly available data about the average number of links acquired for content marketing campaigns. This was a concrete data point a lot of marketers (including us!) wanted to know since it would serve as a performance benchmark.

    As part of the study, we surveyed 30 content marketing agencies about how many links the average content marketing campaign earned, in addition to other questions related to pricing, client KPIs, and more.

    After the research was published here on Moz, we did some promotion to get our data featured on Harvard Business Review, Inc., and Marketing Land. This data is still being linked to and shared today without us actively promoting it, such as this mention on SEMRush’s blog and this mention on the Scoop It blog (pictured below).

    To date, it’s been featured on more than 80 root domains and earned dozens of co-citations. It’s worth noting that this has been about far more than acquiring high-quality links; this research has been extremely effective for driving new business to our agency, which it continues to do to this day.

    Debunking industry assumptions

    But research doesn’t always confirm presumptions. For example, Buzzsumo and Moz’s research collaboration examined a million online articles. A key finding of their research: There was no overall correlation between sharing and linking. This debunked a commonly held assumption among marketers that content that gets a lot of shares will earn a lot of links, and vice versa. To date, this post has received an impressive 403 links from 190 root domains (RDs) according to Open Site Explorer.

    How to use this strategy

    To find original research ideas, look at how many backlinks the top results have gotten for terms like:

    • [Industry topic] report
    • [Industry topic] study
    • [Industry topic] research

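    To compress those three searches into one, you can lean on Google’s OR and grouping operators – a hypothetical example for the content marketing niche:

    “content marketing” (study OR report OR research)
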
    Then, using the MozBar, evaluate what you see in the top SERPs:

    • Have the top results gotten a sizable number of backlinks? (This tells you if this type of research has potential to attract links.)
    • Is the top-ranking content outdated? Can you provide new information? (Try Rand’s tips on leveraging keywords + year.)
    • Is there a subtopic you could explore?

    Additionally, seeing what has already succeeded will allow you to determine two very important things: what can be updated and what can be improved upon. This is a great place to launch a brainstorm session for new data acquisition ideas.

    Industry trend and benchmark reports

    Sure, this content type overlaps with the original research and new data discussed above, but it merits its own section because of its specificity and high potential.

    If your vertical experiences significant change from one year, quarter, or month to the next, there may be an opportunity to create recurring reports that analyze the state of your industry. This is a great opportunity to engage all different kinds of brands within your industry while also showcasing your authority in the subject.

    How?

    People often like to take trends and add their own commentary as to why trends are occurring or how to make the most of a new, popular strategy. That means they’ll often link to your report to provide the context.

    And there’s an added promotional benefit: Once you begin regularly publishing and promoting this type of content, your industry will anticipate future releases.

    Examples

    HubSpot’s State of Inbound report, which features survey data from thousands of HubSpot customers, has been published annually for the last eight years. To date, the URL that hosts the report has links from 495 RDs.

    Content Marketing Institute and MarketingProfs have teamed up for the last seven years to release two annual content marketing benchmark reports. The most recent report on B2B content marketing has earned links from 130 RDs. To gather the data, CMI and MarketingProfs emailed a survey to a sample of marketers from their own email marketing lists as well as a few lists from partner companies.

    In addition to static reports, you can take this a step further and create something dynamic that is continually updated, like Indeed’s Job Trends Search (171 RDs) which pulls from their internal job listing data.

    How to use this strategy

    Where can you find fresh industry data? Here are a few suggestions:

    Survey your customers/clients

    You have a whole pool of people who have been involved in your industry, so why not ask them some questions to learn more about their thoughts, needs, fears, and experiences?

    Talking directly to customers and clients is a great way to cut through speculation and discover exactly what problems they’re facing and the solutions they’re seeking.

    Survey your industry

    There are most likely companies in your industry that aren’t direct competitors but have a wealth of insight to provide to the overall niche.

    For example, we at Fractl surveyed 1,300 publishers because we wanted to learn more about what they were looking for in content pitches. This knowledge is valuable to any content marketers involved in content promotions (including ourselves!).

    Ask yourself: What aspect of your industry might need some more clarification, and who can you reach out to for more information?

    Use your internal company data

    This is often the easiest and most effective option. You probably have a ton of interesting data based on your interactions with customers and clients that would benefit fellow professionals in your industry.

    Think about the internal data sets you have and consider how you can break them down to reveal trends in your niche while also providing actionable insights to readers.

    Curated resources

    Research can be one of the most time-consuming aspects of creating content. If someone has pulled together a substantial amount of information on the topic in one place, it can save anyone else writing about it a lot of time.

    If you’re willing to put in the work of digging up data and examples, curated resource content may be your key to evergreen link building. Let’s look at a few common applications of this style of content.

    Examples

    Collections of statistics and facts

    Don’t have the means to conduct your own research? Combining insightful data points from credible sources into one massive resource is also effective for long-term link attraction, especially if you keep updating your list with fresh data.

    HubSpot’s marketing statistics list has attracted links from 963 root domains. For someone looking for data points to cite, a list like this can be a gold mine. This comprehensive data collection features their original data plus data from external sources. It’s regularly updated with new data, and there’s even a call-to-action at the end of the list to submit new stats.

    Your list doesn’t need to be as broad as the HubSpot example, which covers a wide range of marketing topics. A curated list around a more granular topic can work, too, such as this page filled with mobile email statistics (550 RDs).

    Concrete examples

    Good writers help readers visualize what they’re writing about. To do this, you need to show concrete evidence of abstract ideas. As my 7th grade English teacher used to tell us: show, don’t tell.

    By grouping a bunch of relevant examples in a single resource, you can save someone a lot of time when they’re in need of examples to illustrate the points they make in their writing. I can write thousands of words about the idea of 10x content, but without showing examples of what it looks like in action, you’re probably going to have a hard time understanding it. Similarly, the bulk of time it took me to create this post was spent finding concrete examples of the types of content I refer to.

    The resource below showcases 50 examples of responsive design. Simple in its execution, the content features screenshots of each responsive website and a descriptive paragraph or two. It’s earned links from 184 RDs.

    Authority Nutrition’s list of 20 high-protein foods has links from 53 RDs. If I’m writing a nutrition article where I mention high-protein foods, linking to this page will save me from researching and listing out a handful of protein-rich foods.

    How to use this strategy

    The first step is to determine what kind of information would be valuable to have all in one place for other professionals in your industry to access.

    Oftentimes, it’s the same information that would be valuable for you.

    Here are some ways to brainstorm:

    • Explore your recent blog posts or other on-site content. What needed a lot of explaining? What topics did you wish you had more examples to link to? Take careful note of your own content needs while tackling your own work.
    • Examine comments on other industry articles and resources. What are people asking for? This is a gold mine for the needs of potential customers. You can take a similar approach on Reddit and Quora.
    • What works for other industries that you can apply to your own? Search for terms like the following to see what has been successful for other niches that you can apply to yours:
      • [Industry topic] examples
      • types of [industry topic]
      • list of [Industry topic]
      • [Industry topic] statistics OR stats
      • [Industry topic] facts

    No matter which way you choose to proceed, the time investment can help you garner many links down the line.

    Beginner content

    Every niche has a learning curve, with various words, concepts, and ideas being foreign to a beginner.

    Content that teaches noobs the ins and outs of your vertical has long-term linking potential. This type of content is popular for citations because it saves the writer from explaining things in their own words. Instead, they can link to the expert’s explanation.

    And the best part is you can tap your internal experts to provide great insights that can serve as the foundation for this type of content.

    Examples

    101 Content

    Moz’s Beginner’s Guide to SEO is a master class in how comprehensive beginner-level content becomes a link magnet. Not only does the guide have backlinks from more than 1,700 RDs, it also edges out the home page as the most-trafficked page on the site, according to SEMrush.

    “What is…?”

    Beginner content need not be as massive and thorough as the Moz guide to be linkable. It can be as simple as defining an industry term or concept.

    Moz’s meta description page, which has backlinks from 244 RDs, is a solid example of an authoritative yet simple answer to a “what is?” query.

    Another example is the first result in Google for the query “what is the Paleo diet,” which has 731 links from 228 RDs. It’s not a 10,000-word academic paper about the paleo diet. Rather, it’s a concise answer to the question. This page has served as an excellent source for anyone writing about the Paleo diet within the last several years.

    If a lot of adequate top-level, definition-style content already exists about topics related to your vertical, consider creating content around emerging terms and concepts that aren’t yet widely understood, but may soon be more mainstream.

    The perfect example of this? Creating a definitive explanation of content marketing before the entire world knew what content marketing meant. Case in point: Content Marketing Institute’s “What is Content Marketing?” page has amassed an impressive 12,462 links from 1,100 root domains.

    How to use this strategy

    Buzzsumo recently released a new tool called Bloomberry, which scours forums, including Reddit and Quora, for questions being asked about a keyword. You can search by time period (e.g., questions asked within the last 6 months, all-time results, etc.) and filter by source (e.g., only see questions asked on Reddit).

    Use Bloomberry to see what beginner questions are being asked about your keyword/topic. Keyword ideas include:

    • [Industry topic] definition
    • How does [industry topic] work
    • [Industry topic] guide
    • What is [industry topic]

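    If you’d rather approximate this kind of forum mining in Google itself, you can do so with the “site:” and OR search operators – a hypothetical example for a SaaS audience:

    “what is saas” (site:reddit.com OR site:quora.com)
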
    After doing the search, ask yourself:

    • What questions keep coming up?
    • How are these common questions being answered?

    Bloomberry is also useful for spotting research opportunities. Within the first few results for “SaaS” I found three potential research ideas.

    Pro tip: Return to these threads and provide an answer plus link to your content once it’s published.

    Yes, you still need to promote your content

    Don’t mistake this post as a call to stop actively doing outreach and promotion to earn links. Content promotion should serve as the push that gives your content the momentum to continue earning links. After you put in the hard work of getting your content featured on reputable sites with sizable audiences, you have strong potential to organically attract more links. And the more links your content has, the easier it will be for writers and publishers in need of sources to find it.

    What types of content do you think are best for earning citation links? I’d love to hear what’s worked for you – please share your experiences in the comments below.

  • 3 Tactics for Hyperlocal Keywords - Whiteboard Friday

    Posted by randfish

    Trying to target a small, specific region with your keywords can prove frustrating. While reaching a high-intent local audience is incredibly valuable, without volume data to inform your keyword research, you’ll find yourself hitting a wall. In this Whiteboard Friday, Rand shares how to uncover powerful, laser-focused keywords that will reach exactly the right people.

    Click on the whiteboard image above to open a high-resolution version in a new tab!

    Video Transcription

    Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about hyperlocal keyword research. Now, this is a big challenge, not only for hyperlocal-focused businesses, but also for all kinds of websites that are trying to target very small regions – and often many of them at once – with their keyword research, keyword targeting, and on-page optimization.

    The problem:

    So the problem tends to be that most keyword research tools – and this includes things like the Google AdWords tool, Moz’s Keyword Explorer, KeywordTool.io, Übersuggest, or any other tool you want to use – are relying on volume data.

    So what happens is when you see a bunch of keyword suggestions, you type in “Sequim,” for example, Sequim is a tiny town on Washington’s peninsula, so across the Puget Sound from where we are here in Seattle. Sequim has a population of like 6,500 people or something like that, so very tiny. So most searches related to Sequim have no volume data in any of these tools. As a result, you don’t see a lot of information about: How can I target these keywords? What are the right ones to go after? You don’t know whether a keyword has zero searches a month, or whether it has four searches a month, and those four searchers are exactly who you want to get in front of, and this is really problematic.

    There are three solutions that we’ve seen professional SEOs use and that some of us here at Moz use and the Moz Local team uses, and these can be real handy for you.

    Solution 1: Use keyword data for larger, similar regions

    So the first one is to basically replicate the data by using keyword information that comes from similar regions nearby. So let’s say, okay, here we are in Sequim, Washington, population 6,669. But Port Angeles is only a few miles away. I think maybe a couple dozen miles away. But its population is more like 20,000. So we’ve got four or five times the keyword volume for most searches probably. This is going to include some outlying areas. So now we can start to get data. Not everything is going to be zero searches per month, and we can probably backtrack that to figure out what Sequim’s data is going to be like.

    The same thing goes for Ruidoso versus Santa Fe. Ruidoso, almost 8,000. But Santa Fe’s population is almost 10 times larger at 70,000. Or Stowe, Vermont, 4,300, tiny, little town. Burlington is nearby, 10 times bigger at 42,000. Great. So now I can take these numbers and I can intuit what the relative volumes are, because the people of Burlington are probably similar in their search patterns to the people of Stowe. There are going to be a few differences, but for most types of local searches this will work.

    Solution 2: Let Google autosuggest help

    The second one: Google autosuggest can be really helpful here. Google Suggest doesn’t care whether there’s one search a month or just one search in the last year – they’ll still show you something. With zero searches in the last year, though, they won’t show you anything.

    But for example, when I search for “Sequim day,” I can intuit here, because of the ordering that Google Suggest shows me, that “Sequim day spa” is more popular than “day care.” Sequim, by the way, sounds like a lovely place to live if you are someone who enjoys few children and lots of spa time, apparently. Then, “day hikes.”

    So this technique doesn’t just work with Google itself. It’ll also work with Bing, with Google Maps, and with YouTube. Another suggestion on this one, you will see different results if you use a mobile device versus a desktop device. So you might want to change it up and try your mobile device. That can give you some different results.

    Solution 3: Use lexical or related SERP suggestions

    All right. Third tactic here, last one: you can use sort of two styles of keyword research. One is called lexical, which is basically the semantic relationships between words and phrases. The other one is related SERP suggestions, which is where a keyword research tool — Moz Keyword Explorer does this, SEMrush is very popular for this, and there are a few others — will basically show you search terms whose results overlap with yours: the search results that came up for “Sequim day care” also came up in searches for these other terms and phrases. So these are like SERPs for which your SERP also ranked.

    You can see, when I searched for “Sequim day care,” I did this in Keyword Explorer, because I happen to have a Moz Keyword Explorer subscription. It’s very nice of Moz to give me that. You can see that I used two kinds of suggestions. One is related to keywords with similar results, so that’s the related SERPs. The other one was based on closely related topics, like the semantic, lexical thing. “Sequim day care” has given me great stuff like “Banbury School Nursery,” a nearby town, “secondary schools in Banbury,” “Horton Day Nursery,” which is a nursery that’s actually near there, “Port Angeles childcare,” “children’s nursery.”

    So now I’m getting a bunch of keyword suggestions that can potentially be relevant and lead me down a path. By the way, when looking at closely related topics, what I did is I actually removed the term “Sequim,” because that was showing me a lot of things that are particular to that region. But if I search for “day care,” I can see lots of closely related topics, like day care center, childcare, school care, special needs children, preschool programs, and afterschool programs. So now I can take all of these and apply the name of the town and get these hyperlocal results.

    This is frustrating still. You don’t have nearly the data that you have for much more popular search terms. But this is a good way to start building that keyword list, targeting, experimenting, and testing out the on-page work that you’re going to need to do to rank for these terms. Then, you’ll start to see your traffic grow from these.

    Hyperlocal may be small, but it can be powerful, it can be very targeted, and it can bring you exactly the customers you’re looking for.

    So good luck with your targeting out there, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

    Video transcription by Speechpad.com


  • Local SEO & Beyond: Ranking Your Local Business in 2017

    Posted by Casey_Meraz

    In 2016, I predicted that ranking in the 3-pack was hard and it would continually get more competitive. I maintain that prediction for 2017, but I want to make one thing clear. If you haven’t done so, I believe local businesses should start to look outside of a local-SEO-3-Pack-ONLY focused strategy.

    While local SEO still presents a tremendous opportunity to grow your business, I’m going to look at some supplementary organic strategies you can take into your local marketing campaign, as well.

    In this post I’m going to address:

    • How local search has changed since last year
    • Why & how your overall focus may need to change in 2017
    • Actionable advice on how to rank better to get more local traffic & more business

    In local search success, one thing is clear

    The days of getting into the 3-pack with a one-trick-pony strategy are over. Every business wants the free traffic from Google’s local results, but earning it is getting harder every day. Not only are you fighting against all of your competitors trying to get the same rankings, but now you’re also fighting against even more ads.

    If you thought it was hard to get top placement today in the local pack, just consider that you’re also fighting against 4+ ads before customers even have the possibility of seeing your business.

    Today’s SERPs are ad-rich with 4 paid ads at the top, and now it’s not uncommon to find paid listings prioritized in local results. Just take a look at this example that Gyi Tsakalakis shared with me, showing one ad in the local pack on mobile ranking above the 3-pack results. Keep in mind, there are four other ads above this.

    If you’re on desktop and you click on one of the 3-pack results, you’re taken to the local finder. In the desktop search example below, once you make it to the local finder you’ll see two paid local results above the other businesses.

    Notice how only the companies participating in paid ads have stars. Do you think that gives them an advantage? I do.


    Don’t worry though, I’m not jaded by ads

    After all of that gloomy ad SERP talk, you’re probably getting a little depressed. Don’t. With every change there comes new opportunity, and we’ve seen many of our clients excel in search by focusing on multiple strategies that work for their business.

    Focusing on the local pack should still be a strong priority for you, even if you don’t have a pay-to-play budget for ads. Getting listed in the local finder can still result in easy wins — especially if you have the most reviews, as Google has very handy sorting options.

    If you have the highest rating score, you can easily get clicks when users decide to sort the results they see by the business rating. Below is an example of how users can easily sort by ratings.

    But what else can you do to compete effectively in your local market?


    Consider altering your local strategy

    Most businesses I speak with seem to have tunnel vision. They think it’s more important to rank in the local pack and, in some cases, even prioritize this over the real goal: more customers.

    Every day, I talk to new businesses and marketers that seem to have a single area of focus. While it’s not necessarily a bad thing to do one thing really well, the ones that are most successful are managing a variety of campaigns tied to their business goals.

    Instead of taking a single approach of focusing on just free local clicks, expand your horizon a bit and ask yourself this question: Where are my customers looking and how can I get in front of them?

    Sometimes taking a step back and looking at things from the 30,000-ft view is beneficial.


    You can start by examining the SERPs and asking yourself these questions:

    1. What websites, OTHER THAN MY OWN, have the most visibility for the topics and keywords I’m interested in?

    You can bet people are clicking on results other than your own website underneath the local results. Are they websites you can show up on? How do you increase that visibility?

    I think STAT has a great tracking tool for this. You simply set up the keywords you want to track and their Share of Voice feature shows who’s ranking where and what percentage of visibility they have in your specific market.

    In the example below, you can see the current leaders in a space I’m tracking. Notice how Findlaw & Yelp show up there. With a little further research I can find out if they have number 1–2 rankings (which they do) and determine whether I should put in place a strategy to rank there. This is called barnacle SEO.

    2. Are my customers using voice search?

    Maybe it’s just me, but I find it strange to talk to my computer. That being said, I have no reservations about talking to my phone — even when I’m in places I shouldn’t. Stone Temple recently published a great study on voice command search, which you can check out here.



    Some of the cool takeaways from that study were where people search from. It seems people are more likely to search from the privacy of their own home, but most mobile devices out there today have voice search integrated. I wonder how many people are doing this from their cars?

    This goes to show that local queries are not just about the 3-pack. While many people may ask their device “What’s the nearest pizza place?”, others may ask a variety of questions, like:



    • Where is the highest-rated pizza place nearby?
    • Who makes the best pizza in Denver?
    • What’s the closest pizza place near me?

    Don’t ignore voice search when thinking about your localized organic strategy. Voice is mobile and voice can sure be local. What localized searches would someone be interested in when looking for my business? What questions might they be asking that would drive them to my local business?

    3. Is my website optimized for “near me” searches?

    “Near me” searches have been on the rise over the past five years and I don’t expect that to stop. Sometimes customers are just looking for something close by. Google Trends data shows how this has changed in the past five years:

    Are you optimizing for a “near me” strategy for your business? Recently, the guys over at Local SEO Guide did a study of “near me” local SEO ranking factors. Optimizing for “near me” searches is important, and it falls right in line with some of the tactical advice we have for increasing your Google My Business rankings as well. More on that later.
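
    If you’d rather pull that Trends data programmatically than eyeball the charts, the unofficial pytrends library can fetch it. Here’s a minimal sketch; the keyword and geo are just examples, and since the library is unofficial, its API may change:

        from pytrends.request import TrendReq  # unofficial client: pip install pytrends

        pytrends = TrendReq(hl="en-US", tz=360)
        pytrends.build_payload(["plumber near me"], timeframe="today 5-y", geo="US")

        interest = pytrends.interest_over_time()  # weekly 0-100 interest index, as a DataFrame
        print(interest["plumber near me"].resample("A").mean())  # rough year-over-year trend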

    4. Should my business stay away from ads?

    Let’s start by looking at some facts. Google makes money off of their paid ads. According to an article from Adweek, “During the second quarter of 2016, Alphabet’s revenue hit $21.5 billion, a 21% year-over-year increase. Of that revenue, $19.1 billion came from Google’s advertising business, up from $16 billion a year ago.”


    This roughly translates to: “Ads aren’t going anywhere and Google is going to do whatever they can to put them in your face.” If you didn’t see the Home Service ad test with all ads that Mike Blumenthal pointed out, you can check it out below. Google is trying to find more creative ways to monetize local search.

    In case you haven’t heard it before, having both organic and paid listings rank highly increases your overall click-through rate.

    Although the last study I found was from Google in 2012, we’ve found that our clients have the most success when they rank strong organically, locally, and have paid placements. All of these things tie together. If potential customers are already searching for your business, you’ll see great results by being involved in all of these areas.

    While I’m not a fan of only taking a pay-to-play approach, you need to at least start considering it and testing it for your niche to see if it works for you. Combine it with your overall local and organic strategy.

    5. Are we ignoring the featured snippets?

    Searches with local intent can still trigger featured snippets. One example that I saw recently and really liked was the snowboard size chart example, which you can see below. In this example, someone who is interested in snowboards gets an answer box that showcases a company. If someone is doing this type of research, there’s a likelihood that they may wish to purchase a snowboard soon.

    Depending on your niche, there are plenty of opportunities to increase your local visibility by not ignoring featured snippets and creating content to rank there. Check out this Whiteboard Friday to learn more about how you can get featured snippets.

    Now that we’ve looked at some ways you can expand your strategies, let’s look at some tactical steps you can take to move the needle.


    Here’s how you can gain more visibility

    Now that you have an open mind, let’s take a look at the actionable things you can do to improve your overall visibility and rankings in locally centric campaigns. As much as I like to think local SEO is rocket science, it really isn’t. You really need to focus your attention on the things that are going to move the needle.

    I’m also going to assume you’ve already done the basics, like optimize your listing by filling out the profile 100%.

    Late last year, Local SEO Guide and Placescout did a great study that looked at 100+ variables from 30,000 businesses to determine which factors might have the most overall impact on local 3-pack rankings. If you have some spare time, I recommend checking it out. It verified that the signals we put the most effort into seem to have the greatest overall effect.

    I’m only going to dive into a few of those factors, but here are the things I would do to focus on a results-first strategy:

    Start with a solid website/foundation

    What good are rankings without conversions? The answer is they aren’t any good. If you’re always keeping your business goals in mind, start with the basics. If your website isn’t loading fast, you’re losing conversions and you may experience a reduced crawl budget.

    My #1 recommendation that affects all aspects of SEO and conversions is to start with a solid website. Ignoring this usually creates bigger problems later down the road and can negatively impact your overall rankings.

    Your website should be SEO-friendly and score in the 90s on Google’s PageSpeed Insights. You can also see how fast your website loads for users using tools like GTmetrix. Google seems to reduce the visibility of slower websites, so if you ignore the foundation, you’re going to have issues. Here are 6 tips you can use for a faster WordPress website.
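
    If you want to track that score automatically, PageSpeed Insights has a public API. Here’s a rough sketch, assuming the v5 runPagespeed endpoint and its Lighthouse-based response shape:

        import requests

        PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

        def performance_score(url, strategy="mobile"):
            """Fetch the Lighthouse performance score (0-100) for a URL."""
            resp = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=60)
            resp.raise_for_status()
            data = resp.json()
            return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

        print(performance_score("https://www.example.com"))  # aim for 90+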

    Crawl errors can also wreak havoc on your website. You should always strive to maintain a healthy site. Check up on your website using Google’s Search Console, and use Moz Pro to monitor your clients’ campaigns by actively tracking site health, crawl issues, and domain health over time. Higher scores and fewer errors should be your focus.

    Continue with a strong review generation strategy

    I’m sure many of you took a deep breath when earlier this month Google changed the review threshold to only 1 review. That’s right. In case you didn’t hear, Google is now giving all businesses a review score based on any number of reviews you have, as you can see in the example below:

    I know a lot of my colleagues were big fans of this, but I have mixed feelings, since Google isn’t taking any serious measures to reduce review spam or penalize manipulative businesses at this point.


    And don’t ignore the other benefits of reviews. Earlier I mentioned that users can sort by review stars; having more reviews will also increase your overall CTR. Plus, after talking to many local businesses, we’ve gotten a lot of feedback that consumers are actively using these scores more than ever.

    So, how do you get more reviews?

    Luckily, Google’s current Review and Photo Policies do not prohibit the direct solicitation of reviews at this point (unlike Yelp).

    Start by soliciting past customers on your list
    If you’re not already collecting customer information on your website or in-store, you’re behind the times and you need to start doing so immediately.

    I work mainly with attorneys. Working in that space, there are regulations we have to follow, and typically the number of clients is substantially less than a pizza joint. In pickles like this, where the volume is low, we can take a manual approach where we identify the happiest clients and reach out to them using this process. This particular process also creates happy employees. :)

    1. List creation: We start by screening the happiest clients. We then sort these by who has a Gmail account for priority’s sake.
    2. Outreach by phone: I don’t know why digital marketers are afraid of the phone, but we’ve had a lot of success calling our prior clients. We have the main point of contact from the business who worked with them call and ask how the service they received was. The caller informs them that they have a favor to ask and that their overall job performance is partially based on client feedback. They indicate they’re going to send a follow-up email if it’s OK with the customer.
    3. Send a follow-up email: We then use a Google review link generator, which creates an exact URL that opens the review box for the person if they’re logged into their Gmail account (see the sketch after this list).
    4. Reminder emails: Sometimes emails get lost. We follow up a few times to make sure the client leaves the review…
    5. You have a new review!
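
    About the review link in step 3: the generator is just assembling a known URL pattern around your Google place ID, so you can build these yourself. A minimal sketch, assuming the search.google.com writereview format (the place ID shown is the sample one from Google’s docs):

        from urllib.parse import urlencode

        def review_link(place_id):
            """Build a direct 'write a review' URL from a Google place ID."""
            return "https://search.google.com/local/writereview?" + urlencode({"placeid": place_id})

        # Place IDs come from Google's Place ID Finder tool or the Places API.
        print(review_link("ChIJN1t_tDeuEmsRUsoyG83frY4"))  # Google's sample place ID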

    The method above works great for low-volume businesses. If you’re a higher-volume business or have a lot of contacts, I recommend using a more automated service to prepare for future and ongoing reviews, as it’ll make the process a heck of a lot easier. Typically we use Get Five Stars or Infusionsoft integrations to complete this for our clients.

    If you run a good business that people like, you can see results like this. This is a local business which had 7 reviews in 2015. Look where they are now with a little automation asking happy customers to leave a review:

    Don’t ignore & don’t be afraid of links

    One thing Google has succeeded at is scaring people away from building manipulative links. In many areas, that went too far, resulting in people not pursuing links at all, dismissing their value as a ranking factor, and telling the world that links are dead.

    Well, I’m here to tell you that you need good links to your website. If you want to rank in competitive niches or in certain geographic areas, the anchor text can make a big difference. Multiple studies have shown the effectiveness of links to this very day, and their importance cannot be overlooked.

    Here’s which link tactic works best for each strategy:

    • Local SEO (3-pack): Links to your GMB-connected local landing page will help 3-pack rankings. City, state, and keyword-included anchor text is beneficial.
    • Featured snippets: Links to pages where you want to earn a featured snippet will boost the authority of those pages.
    • Paid ads: Links will not help your paid ads.
    • “Near me” searches: Links with city, state, or area anchor text will help you in “near me” searches.
    • Voice search: Links to FAQ pages or long-tail keyword content will help those pages rank better organically.
    • Barnacle SEO: Links to websites you don’t own can help them rank better. Focus on high-authority profiles or business listings.

    There are hundreds of ways to build links for your firm. You need to avoid paying for links and spammy tactics because they’re just going to hurt you. Focus on strong and sustainable strategies — if you want to do it right, there aren’t any shortcuts.

    Since there are so many great link building resources out there, I’ve linked to a few of my favorites below, where you can get tactical advice and start building links.

    For specific tactical link building strategies, check out these resources:

    If you participate in outreach or broken link building, check out this new post from Directive Consulting — “How We Increased Our Email Response Rate from ~8% to 34%” — to increase the effectiveness of your outreach.

    Get relevant & high-authority citations

    While citations have taken a dive in recent years as a major ranking factor, they still carry quite a bit of weight.

    Do you remember the example from earlier in this post, where we saw Findlaw and Yelp having strong visibility in the market? These websites get traffic, and if a potential customer is looking for you somewhere you’re not, that’s one touchpoint lost. You still need to prioritize quality over quantity. The days of chasing 1,000 citations are over, and have been for many years. If you have 1,000 citations, you probably have a lot of spam links pointing at your website. We don’t need those. What we do need are directories highly relevant to either our city or our niche.

    This post I wrote over 4 years ago is still pretty relevant on how you can find these citations and build them with consistency. Remember that high-authority citations can also be unstructured (not a typical business directory). They can also be very high-quality links if the site is authoritative and has fewer business listings. There are millions of listings on Yelp, but maybe less than one hundred on some other powerful, very niche-specific websites.

    Citation and link idea: What awards was your business eligible or nominated for?

    One way to get these is to consider awards where you can get an authoritative citation and link to your website. Take a look at the example below from a legal website. This site is a peanut compared to a directory like Yelp. Sure, it doesn’t carry nearly as much authority, but the link equity is more evenly distributed.


    Lastly, stay on point

    2017 is sure to be a volatile year for local search, but it’s important to stay on point. Spread your wings, open your mind, and diversify with strategies that are going to get your business more customers.

    Now it’s time to tell me what you think! Is something I didn’t mention working better for you? Where are you focusing your efforts in local search?





    from The Moz Blog http://tracking.feedpress.it/link/9375/5398147
    via IFTTT
  • Proximity to Searcher is the New #1 Local Search Ranking Factor

    Posted by Whitespark

    Have you noticed that a lot of local pack results don’t seem to make sense these days? Almost every time I search Google for a local search term, the pack results leave me wondering, “Why are these businesses ranking?”

    For example, take a look at the results I get for “plumbers”:


    (Searched in an incognito Chrome browser on PC in Edmonton)

    Here’s a quick summary of the basic local ranking factors for the businesses in this local pack:

    Notice that:

    • None of the businesses have claimed/verified their Google listing.
    • None of the businesses have any Google reviews.
    • Only one of the businesses even has a website!

    Surely, Google, there are more prominent businesses in Edmonton that deserve to rank for this term?

    Here’s the data table again with one additional point added: proximity to the searcher.

    These businesses are all so close to me that I could walk to them in about 8 to 15 minutes. Here’s a map of Edmonton with pins for my location and these 3 businesses. Just look at how close they are to my location:

    After analyzing dozens of queries that my colleagues and I searched for, I am going to make a bold statement:

    “Proximity to searcher is the new #1 ranking factor in local search results today.” - Darren Shaw

    For most local searches these days, proximity appears to be weighted more than links, website content, citations, and reviews in the local pack rankings. Google doesn’t seem to value the traditional local search ranking factors when determining which businesses to rank in the local pack. The main consideration seems to be: “Which businesses are closest to the searcher?” I have been noticing this trend for at least the last 8 months or so, and it seems to have intensified since the Possum update.

    Evidence of proximity-based local rankings

    Whitespark has team members that are scattered throughout Edmonton, so four of us ran a series of searches from our home offices to see how the results differ across the city.

    Here is a map showing where we are physically located in Edmonton:

    On desktop, Google doesn’t actually know exactly where we are. It guesstimates your location based on IP address, WiFi, and mobile data. You can figure out where Google thinks you’re located by doing the following:

    1. Open an incognito browser in Chrome.
    2. Go to maps.google.com.
    3. Search for a local business in your city.
    4. Click the “Directions” button.
    5. Enter “my location” into the top field.

    In order to give you directions, Maps will drop a circle on the spot where it thinks you’re located.

    Here’s where Google thinks I am located:

    As a team, at approximately the same time of day, all four of us searched the same 9 local queries in incognito browser windows and saved screenshots of our results.

    The search terms:

    Non-geo-modified terms (keyword):
    plumbers
    lawyers
    coffee shops

    Geo-modified terms (keyword + city):
    plumbers edmonton
    edmonton plumbers
    edmonton lawyers
    lawyers edmonton
    coffee shops edmonton
    edmonton coffee shops

    Below are the mapped results for 9 local queries that we each searched in incognito browsers. Rather than dumping 24 maps on the page, here they are in a Slideshare that you can click through:

    Proximity is the New Top Local Search Ranking Factor from Darren Shaw

    As you click through, you’ll see that each of us gets completely different results, and that these results are generally clustered around our locations.

    You can also see that proximity impacted non-geo-modified terms (“plumbers”) more than the results for geo-modified terms (“edmonton plumbers”). The differences we’re seeing are likely due to relevancy for the geo-modified term. So for instance, the websites may have more anchor text targeting the term “Edmonton plumbers,” or the overall content on the site has more references to Edmonton plumbers.

    How does proximity impact local organic results?

    Localized organic results are the blue links that list businesses, directories, etc., under the local pack. We’re seeing some very minor differences in the results, but relatively consistent local organic rankings across the city.

    Generally, localized organic results are consistent no matter where you’re located in a city, which is a strong indication that traditional ranking signals (links, reviews, citations, content, etc.) outweigh proximity when it comes to local organic results.

    Here are screenshots of the local organic results:

    Proximity is the New Top Local Search Ranking Factor from Darren Shaw

    Some observations

    1. Non-geo-modified searches (keyword only) can pull results from neighboring cities. In the new local packs, what matters is the radius around the searcher, not the city the searcher is in. This does not appear to hold for geo-modified terms — when you add a city to the search. This tells us that the #1 local search ranking factor from the Local Search Ranking Factors survey, “Physical address in city of search,” may no longer be as important as it once was.
    2. Results sometimes cluster together. Even though there may be businesses closer to the searcher, it seems like Google prefers to show you a group of businesses that are clustered together.
    3. Google would rather show a smaller pack than a 3-pack when there is a business that’s too far away from the searcher. For example: I only get a 2-pack of nearby businesses here, but I know there are at least 5 other businesses that match this search term:
    4. Probably obvious, but if there aren’t many businesses in the category, then Google will return a wider set of results from all over the city:

    Why is Google doing this?

    Why is Google giving so much ranking strength to proximity and reducing the impact of traditional local search ranking factors?

    To sell more ads, of course.

    I can think of three ways that this will increase ad revenue for Google:

    1. If it’s harder to get into the organically driven local packs, then businesses will need to pay to get into their fancy new paid local packs.
    2. Back in the day, there was one local pack per city/keyword combo (example: “edmonton plumbers”). Now there are thousands of local packs across the city. When they create a new pack every mile, they drastically increase their available “inventory” to sell ads on.
    3. When the results in the 3-pack aren’t giving you what you want, then a click into “more places” will bring up the Local Finder, where Google is already displaying ads:
    4. (Bonus) And have you noticed that the new local ad packs focus on “nearby”? The local ads and the local pack results are increasingly focused on how close the businesses are to your physical location.

    That said, I don’t think it’s only about the additional ad revenue. I think Google truly believes that returning closer businesses is a better user experience, and they’ve been working on improving their technology around this for quite some time.

    Way back in 2012, Whitespark’s Director of Local Search, Nyagoslav Zhekov, noted in that year’s Local Search Ranking Factors survey that proximity of the business location to the searcher was his top local ranking factor. He said:

    “What really matters, is where the searcher is physically located and how close the potentially relevant search results are. This ranking factor is getting further boost by the importance of local-mobile search, where it is undoubtedly #1. For desktop search the factor might not be as important (or not have any significance) if searcher’s location and the location for which the search is intended differ.”

    It is interesting to note that in today’s results, as we can see in the examples in this post, proximity is now a huge ranking factor on desktop as well. Google has been going “mobile-first” for years, and I’m starting to think that there is no difference in how they process mobile and desktop local results. You just see different results because Google can get a more precise location on mobile.

    Furthermore, Bill Slawski just published a post about a recently approved Google patent for determining the quality of locations based on travel time investment. The patent talks about using quality measures like reviews (both user and professional) AND travel time and distance from the searcher (time investment) to rank local businesses in search results.

    One excerpt from the patent:

    “The present disclosure is directed to methods and apparatus for determining the quality measure of a given location. In some implementations, the quality measure of a given location may be determined based on the time investment a user is willing to make to visit the given location. For example, the time investment for a given location may be based on comparison of one or more actual distance values to reach the given location to one or more anticipated distance values to reach the given location. The actual distance values are indicative of actual time of one or more users to reach the given location and the anticipated distance values are indicative of anticipated time to reach the given location.”

    The patent was filed in May 2013, so we can assume that Google may have been experimenting with this and incorporating it into local search for at least the past 3 to 4 years. In the past year, the dial seems to have been cranked up on this factor as Google gets more distance and travel data from Android users and from users of the Google Maps app on other mobile platforms.

    These results suck

    It seems to me that in most business categories, putting so much emphasis on proximity is a pretty poor way to rank results. I don’t care if a lawyer is close to me. I am looking to hire a lawyer that’s reputable, prominent in my city, and does good work. I’m perfectly happy to drive an extra 20 minutes to go to the office of a good lawyer. I’m also looking for the best pizza in town, not the cardboard they serve at the place down the street. The same applies for every business category I can think of, outside of maybe gas stations, emergency plumbers, or emergency locksmiths.

    In my opinion, this emphasis on proximity by Google seriously downgrades the quality of their local results. People are looking for the best businesses, not the closest businesses. If this is the new normal in Google’s local results, I expect that people will start turning to sites like Yelp, TripAdvisor, Avvo, Angie’s List, etc. when searching for businesses. I already have.

    So what about local rank tracking?

    Most local rank trackers set the location to the city, which is the equivalent of setting it to the centroid. It is very likely that the local pack and local finder results reported in your rank tracker will be different from what the business or client sees when they search. To get more accurate results, you should use a rank tracker that lets you set the location by zip/postal code (hint hint, Whitespark’s Local Rank Tracker).

    You should also realize that you’re never going to get local rank tracking reports that perfectly match with what the person sitting in the city sees. There are just too many variables to control for. The precise proximity to the searcher is one thing a rank tracker can’t exactly match, but you’ll also see differences based on device used, browser version, personalization, and even time of day as results can and do change by the hour.

    Use your rank tracking reports as a measure of general increases and decreases in local visibility, not as an exact match for what you would see if you were searching from within the city.

    How does this affect local SEO strategies?

    Local SEO is not dead. Far from it. It’s just more competitive now. The reach your business can have in local results is smaller than it used to be, which means you need to step up your local organic and optimization efforts.

    • Local search practitioners, if you’re seeing traffic and rankings going down in your local SEO reporting and you need to answer to your clients on this, you’re now armed with more info on how to answer these questions. It’s not you, it’s Google. They have reduced the radius that your business will be shown in the search results, so you’re going to be driving less traffic and leads from local pack results.
    • If you want your business to rank in the pack or local finder, you will need to crank up the dial on your optimization efforts.
    • Get on those local organic opportunities (content and links). There is less pack real estate for you now, but the localized organic results are still great city-wide opportunities. The local organic results are currently localized to the city, not the searcher location. We can see this in all the terms.
    • Look for outliers. Study the businesses that are getting pulled into the local rankings from a far distance from the searcher. What are they doing in terms of content, links, reviews, and mentions that helps them appear in a wider radius than other businesses?
    • Diversify your local optimization efforts beyond Google. Make sure you’re on Yelp, BBB, TripAdvisor, Avvo, Angie’s List, etc, and that your profiles are claimed, optimized, and enhanced with as much information as possible. Then, make sure you’re driving reviews on THESE sites rather than just Google. If the local pack results are crap, a lot of people will click Yelp’s 10 Best XYZ list, for example. You want to be on that list. The more reviews you get on these sites, the better you will rank in their internal search results, and as people desert Google for local business recommendations because of their low-quality results, you’ll be ready and waiting for them on the other sites.

    The tighter radius might mean less local search pie for the more dominant businesses in the city, but don’t despair. This opens up opportunities for more businesses to attract local search business from their local neighborhood, and there is still plenty of business to drive through local search if you step up your game.

    Have you also noticed hyper-localized local pack results? I would love to hear about your examples and thoughts in the comments.





    from The Moz Blog http://tracking.feedpress.it/link/9375/5387606
    via IFTTT
  • Branding Success: How to Use PPC to Amplify Your Brand

    Posted by purna_v

    Here’s a question for you:

    Do you think a brand can influence your behavior outside of purchase preference? Put another way, will seeing the North Face logo make you want to take up hiking in the snow?

    A few years ago, researchers at Duke University conducted an experiment with 341 students. Their goal? Studying what makes a brand powerful and how we’re influenced by brands. As part of this study, the students were asked to complete what they were told was a visual acuity test.

    During this test, either an Apple logo or IBM logo flashed on the screen for a second, so quickly that the students were unaware they had been exposed to the logo. The participants then completed a task designed to evaluate how creative they were, listing all the uses they could think of for a brick.

    Are you surprised that students exposed to the Apple logo came up with not just more uses, but more creative uses? The experiment was also done using the Disney Channel logo and the E! logo – and the students were tested on their degree of honesty and dishonesty. Which logo exposure led to more honesty? If you thought Disney, you’re right.

    This is evidence that subliminal brand exposure can cause people to act in specific ways. Branding matters.

    For those of us who work in paid search, this whole “branding” thing, with its unintuitive KPIs, can seem nebulous and not something for us to worry about. We PPC-ers have specific goals and KPIs, and it’s easy for us to be seen as only a bottom-funnel channel. But we’re far more powerful than that.

    Here’s the truth: Brand advertising via PPC does impact the bottom line.

    I’ll share three key ways to build a framework for branding:

    1. Make choosing you easier.
    2. Show your customers you care.
    3. Make it easy to be a loyal customer.

    Chances are you’re taking some of these steps already, which is fantastic. This framework can guide you to ensure you’re covering all the steps of the funnel. Let’s break down how PPC can support all three of these key points.

    1. Make choosing you easier

    Top brands understand their audiences really well. And what’s true of pretty much every audience right now is that we’re all looking for the fast fix. So if a brand can make it easy for us to find what we need, to get something done – that brand is going to win our hearts.

    Which is why getting your ad messaging right is critical.

    Something I notice repeatedly is that we’re so focused on that next advanced tactic or the newest feature that we neglect the simple basics. And that is how we get cracks in our foundation.

    Most accounts I look at perform brilliantly with the complex, but routinely make avoidable errors with the basics.

    Ad copy

    Ads are one of those places where the cracks aren’t just visible, they’re also costly. Let’s look at a few examples of ads with sitelink extensions.

    Example 1: What not to do

    What do you think of this ad?

    It’s a decent ad. It’s just not great. What’s hurting the ad is that the sitelinks are a broad – even random – mix of different paths and actions a person can take. We have a mix of product, social media, and spokesperson content. This is not likely to make anyone’s life easier.

    Even if I had been interested in the makeup, I might be distracted by the opportunity to meet Carrie Underwood, reducing the odds of a conversion. In trying to please too many different audiences, this ad doesn’t do a particularly strong job of pleasing anyone.

    Example 2: Sitelinks organized according to stage of interest

    Why not organize your sitelinks according to your customer’s stage of interest instead, like Clinique did here? This is brilliant.

    Clinique is acknowledging that some shoppers are here just to buy the makeup they always order – so “Shop Makeup” is the first sitelink offered. But other visitors have come to see what’s new, or to do research on the quality of Clinique skincare, and probably everyone is looking for that discount.

    Organizing sitelinks by your customer’s stage of interest also boosts brand by showing your customer that you care. We’ll talk more about that piece later.

    Example 3: Sitelinks organized according to customer’s need

    Here’s something smart: Organizing sitelinks according to what you already know your customers need.

    Harley Davidson knows that a potential customer coming to their website wants more than pretty pictures of the bike. They’re ready to schedule a test ride or even estimate payments, so these options are right at the top.

    They also understand that Harley Davidson is an aspirational product. I may want to estimate a payment or find information about my local dealer even before I know how to ride a bike. It’s part of the dream of joining the Harley lifestyle. They know this and make their customers’ lives easier by sharing links to learn-to-ride classes.

    Example 4: Give them multiple ways to choose you

    For brands targeting by geography and who have a local presence, including call extensions and location extensions is a must.

    As searches move from desktop to mobile, we know that local searches take the lead – and conversions on a local search happen within five hours of the search (source: Microsoft Internal research). Including call and location extensions helps shorten that conversion cycle.

    What I especially love about this ad is that it gives you two different buying options. You can visit the store at the physical address, or, if the searcher finds that out of the way, the ad entices them to shop Sephora online with a discount code. This increases the odds that the shopper will choose Sephora rather than visiting a more conveniently located competitor.

    Indirect brand terms

    When people are looking for your service but not necessarily your brand, you can still make their lives easier by sharing answers to questions they may have.

    Of course, you’re already showing up for branded searches or searches directly asking for your product. But what about being helpful to your customers by answering their questions with useful information? Bidding on these keywords is good for your brand.

    For example, Neutrogena is doing a great job at showing up for longer-tail keywords, and they’re also working to build the association between gentle makeup removers for sensitive skin and their brand.

    And here, Crest is doing a fantastic job in using their ad copy to make themselves stand out as experts. If anyone has questions about teeth whitening, they’re showing that they’re ready to answer them:

    This also helps you show up for long-tail queries, which are another increasingly critical aspect of voice search.

    2. Show your customers you care

    If you can anticipate issues and show up when your customers are venting, you win.

    Professor Andrew Ehrenberg of South Bank Business School says that people trust strong brands more. They forgive your mistakes more easily. They believe you will put things right.

    And what better way to show your customers you care than by anticipating their issues?

    Be there when they want to complain

    Where’s the first place you go when you want to look something up? Most likely a search engine. Showing up well in the SERPs can make a big difference.

    Let’s look at an example. I did a search for complaints related to Disney, a brand with a strong positive sentiment.

    Surprisingly, the SERPs were filled with complaint sites. What could help Disney here is running ads on these keywords with a message that they’re keen to make things right, along with the best number to call and chat.

    Wouldn’t that defuse the situation? Best of all, keywords like this would be very low-cost to bid on.

    What about showing up when potential customers are complaining about the competition? You could consider running ads for keywords related to complaints about your competition.

    I’d advise you to be careful with this approach since you want to come across as being helpful, not gloating. This strategy also may not lead to very many conversions – since the searcher is looking to complain and not to find alternative businesses – but given the low cost, it may be worth testing.

    Cross-channel wins

    As PPCs, we’re more powerful than even we give ourselves credit for. Our work can greatly help the PR and SEO teams. Here’s how.

    PR:

    As noted earlier, the search engine is the first place we go when we want to look up something.

    This is so impactful that, as reported in the New York Times, Microsoft scientists were able to analyze large samples of search engine queries and, in some cases, identify Internet users who were suffering from pancreatic cancer even before they had received a diagnosis of the disease.

    This all goes to show the power of search. We can also harness that power for reputation management.

    Broad-match bidding can help PR with brand protection. Looking through broad-match search term reports, a.k.a search query reports (SQRs), can help to spot trends like recalls or a rise in negative sentiment.

    PPCs can send the PR folks a branded SQR on a regular basis for them to scrub through to spot any concerning trends. This can help PPC stand out as a channel that protects and monitors brand sentiment.
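
    If scrubbing the SQR by hand gets tedious, a few lines of scripting can pre-flag the scary stuff. A rough sketch; the CSV column names here are hypothetical, so match them to your own export:

        import csv

        WATCHWORDS = ["recall", "complaint", "scam", "lawsuit", "broken"]  # tune these per brand

        def flagged_terms(sqr_csv):
            """Yield (search term, impressions) rows whose query contains a watchword."""
            with open(sqr_csv, newline="") as f:
                for row in csv.DictReader(f):
                    if any(w in row["Search term"].lower() for w in WATCHWORDS):
                        yield row["Search term"], row["Impressions"]

        for term, impressions in flagged_terms("branded_sqr.csv"):
            print(term, impressions)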

    SEO:

    Content marketing is a key way for brands to build loyalty, and PPC is an excellent way to get the content to the audience. Serving ads on key terms that support the content you have allows you to give your audience the info they really want.

    For example, if your SEO teams built a mortgage calculator as value-add content, then you could serve ads for queries such as “How much house can I afford?”
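
    As an aside, the calculator logic itself is tiny; the real work is the content and promotion around it. Here’s a toy sketch of the underlying math (all numbers made up), inverting the standard amortization formula:

        def affordable_principal(monthly_budget, annual_rate, years):
            """Invert the standard amortization formula: what loan does this payment support?"""
            r = annual_rate / 12          # monthly interest rate
            n = years * 12                # number of monthly payments
            return monthly_budget * ((1 + r) ** n - 1) / (r * (1 + r) ** n)

        print(round(affordable_principal(1500, 0.04, 30)))  # about 314,000 at 4% over 30 years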

    Taking this concept a step further, you can use high-value content to show up with ads that match the research stage of the customer’s interest. As PPCs, we’re often keen to simply show an ad that gets people to convert. But what if they’re not ready? Why should we either ignore them or show up with something that doesn’t match their goal?

    Take a look at these ads that show up for a research-stage query:

    The first ad from Sears – while very compelling – seems mismatched to the search query.

    Now look at the third ad in the list, offering 50 kitchen idea photos. This is a much better match to the query. If it were me searching, this is the ad I would have chosen to click on.

    What happens to the conversion?

    Well, the landing page of the “50 ideas” ad could feature some type of offer, like the one in the Sears ad, and here it would be much more welcome. In this way, we could use higher-funnel ads as lead gen, with KPIs such as content impressions, lead form fills, and micro-conversions.

    This is such a win-win-win strategy:

    • You’ve shown your customers you care for them and will be there for them
    • You’ve helped your colleagues get more exposure for their hard work
    • You’ve earned yourself cost-effective new leads and conversions.

    Boss move.

    Want more ideas? Wil Reynolds has some fantastic tips on how SEOs can use PPC to hit their goals.

    3. Make it easy to be a loyal customer

    Growing customer lifetime value is one of the most worthwhile things a brand can do. There are two clever ways to do this.

    Smarter remarketing

    You liked us enough to buy once – how would you like to buy again? Show your customers more of what they like over time and they’ll be more attuned to choosing your brand, provided you’ve served them well.

    What about remarketing based on how long it’s been since the purchase of a product?

    This tactic can be seen as helpful as opposed to overtly sales-y, building brand loyalty. Think of how Amazon does it with their emails suggesting other products or deals we may be interested in. As a result, we just keep going back to Amazon. Even if they don’t have the lowest price.

    For example, what if a sports nutrition company knew that most customers took three months to finish their box of protein shake powder? Then around the middle of month two, the company could run an ad like this to their list of buyers. It features an offer and shows up just at the right time.

    The customer will probably think they’ve lucked out to find a special offer just at the right time. We know that it’s not luck, it’s just smarter remarketing.
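
    Building that audience is just date arithmetic over your purchase log. A minimal sketch with hypothetical data, where the 40–55-day window assumes a roughly 90-day product cycle:

        from datetime import date

        # Hypothetical purchase log: (customer email, purchase date)
        purchases = [
            ("ana@example.com", date(2017, 1, 5)),
            ("ben@example.com", date(2017, 2, 20)),
        ]

        def remarketing_list(purchases, lo=40, hi=55, today=None):
            """Customers roughly halfway through a ~90-day product cycle."""
            today = today or date.today()
            return [email for email, bought in purchases if lo <= (today - bought).days <= hi]

        print(remarketing_list(purchases, today=date(2017, 3, 1)))  # -> ['ana@example.com']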

    Want more ideas? Check out Sam Noble’s Whiteboard Friday on how paid media can help drive loyalty and advocacy.

    Show up for the competition

    Remember when the iPhone 6s launched? Samsung ran very clever PPC ads during the launch of the iPhone 6s, and again when Apple was in the news about the phones bending.

    Samsung used humor – which, importantly, wasn’t mean-spirited – and got a lot of attention and goodwill, not to mention a ton of PR and social media attention. Great for their brand at the time!

    You can use the same tactic to run ads on competitors’ brand names with ads that showcase your USP. This works especially well for remarketing in paid search (or RLSA) campaigns.

    Here, Chevy capitalized on the search volume spike around the Tesla Model 3 announcement. They ran ads reminding users that their cars would be available in late 2016, with the unstated message that this was much sooner than the Tesla Model 3’s expected arrival.

    Give back

    Engaging with the customer is the best way to make it easy for them to be loyal to your brand. Enhance that by showing them you care about what they care about for added impact.

    Here’s one way to give back to your customer, and this particular effort is also a huge branding opportunity.

    I love how L’Oréal is associating themselves with empowering women – and most of their customers will like this as well. They’re giving back to their customers by honoring the women they care about. To create loyal customers, the best brands give back in meaningful ways.

    Wrapping up

    One of my favorite Seth Godin quotes is, “Marketing is no longer about the stuff that you make, but about the stories that you tell.”

    PPC is a wonderful channel to shape and create stories that will engage and delight your customers.

    And now we come full circle, to that place where we started, wondering how in the world PPC can impact brand. Your paid search campaigns are a chapter in your brand’s story, and you have an unlimited number of ways to write that chapter, and to contribute to the brand.

    Branding isn’t just for the birds. Have you found a way to use PPC to help grow your brand? I’d love to hear your ideas in the comments below.





    from The Moz Blog http://tracking.feedpress.it/link/9375/5373755
    via IFTTT
  • Strategic SEO Decisions to Make Before Website Design and Build

    Posted by Maryna_Samokhina

    The aim: This post highlights SEO areas that need to be addressed and decided on before the website brief is sent to designers and developers.

    Imagine a scenario: a client asks what they should do to improve their organic rankings. After a diligent tech audit, market analysis, and a conversion funnel review, you have to deliver some tough recommendations:

    “You have to redesign your site architecture,” or

    “You have to migrate your site altogether,” or even

    “You have to rethink your business model, because currently you are not providing any significant value.”

    This can happen when SEO is only seriously considered after the site and business are up and running. As a marketing grad, I can tell you that SEO was not on my syllabus alongside the other classic components of the marketing mix. It’s not hard to imagine even mentored and supported businesses overlooking this area.

    This post aims to highlight areas that need to be addressed along with your SWOT analysis and pricing models — the areas before you design and build your digital ‘place’:

    • Wider strategic areas
    • Technical areas to be discussed with developers.
    • Design areas to be discussed with designers.

    Note: This post is not meant to be a pre-launch checklist (hence areas like robots.txt, analytics, social, & title tags are completely omitted), but rather a list of SEO-affecting areas that will be hard to change after the website is built.

    Wider strategic questions that should be answered:

    1. How do we communicate our mission statement online?

    After you identify your classic marketing ‘value proposition,’ next comes working out how you communicate it online.

    Are terms describing the customer problem/your solution being searched for? Your value proposition might not have many searches; in this case, you need to create a brand association with the problem-solving for specific customer needs. (Other ways of getting traffic are discussed in: “How to Do SEO for Sites and Products with No Search Demand”).

    How competitive are these terms? You may find that the space is too competitive and that you’ll need to look into alternative or long-tail variations of your offering.

    2. Do we understand our customer segments?

    These are the questions that are a starting point in your research:

    • How large is our market? Is the potential audience growing or shrinking? (A tool to assist you: Google Trends.)
    • What are our key personas — their demographics, motivations, roles, and needs? (If you are short on time, Craig Bradford’s Persona Research in Under 5 Minutes shows how to draw insights using Twitter.)
    • How do they behave online and offline? What are their touch points beyond the site? (A detailed post on Content and the Marketing Funnel.)

    This understanding will allow you to build your site architecture around the stages your customers need to go through before completing their goal. Rand offers a useful framework for how to build killer content by mapping keywords. Ideally, this process should be performed in advance of the site build, to guide which pages you should have to target specific intents and keywords that signify them.

    3. Who are our digital competitors?

    Knowing who you’re competing against in the digital space should inform decisions like site architecture, user experience, and outreach. First, you want to identify who falls under the three main types of competitors:

    • Your search competitors: those who rank for the product/service you offer. They compete for the same keywords you’re targeting, but may cater to a completely different intent.
    • Your business competitors: those that are currently solving the customer problem you aim to solve.
    • Cross-industry competitors: those that solve your customer problem indirectly.

    After you come up with the list of competitors, analyze where each stands and how much operational resource it will take to get where they are:

    • What are our competitors’ size and performance?
    • How do they differentiate themselves?
    • How strong is their brand?
    • What does their link profile look like?
    • Are they doing anything different/interesting with their site architecture?

    Tools to assist you: Open Site Explorer, Majestic SEO, and Ahrefs for competitor link analysis, and SEMrush for identifying who is ranking for your targeted keywords.

    Technical areas to consider in order to avoid future migration/rebuild

    1. HTTP or HTTPS

    Decide whether you want to use HTTPS or HTTP. In most instances, the answer will be the former, considering that HTTPS is also one of Google’s ranking factors. The rule of thumb is that if you ever plan on accepting payments on your site, you need HTTPS on those pages at a minimum.
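
    Once HTTPS is live, it’s worth verifying that plain-HTTP requests actually land on the secure version. A quick sketch using Python’s requests library (the domain is a placeholder):

        import requests

        def redirects_to_https(domain):
            """Follow redirects from plain HTTP and check where you land."""
            resp = requests.get(f"http://{domain}/", allow_redirects=True, timeout=10)
            return resp.url.startswith("https://")

        print(redirects_to_https("example.com"))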

    2. Decide on a canonical version of your URLs

    Duplicate content issues may arise when Google can access the same piece of content via multiple URLs. Without one clear version, pages will compete with one another unnecessarily.

    In a developer’s eyes, a page is unique if it has a unique ID in the website’s database, while for search engines the URL is the unique identifier. Developers should be reminded that each piece of content should be accessible via only one URL.
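
    One way to make that reminder concrete is to agree on a canonicalization rule up front. Here’s a deliberately naive sketch of the idea; a real site would need to keep any query parameters that actually change the content:

        from urllib.parse import urlsplit, urlunsplit

        def canonicalize(url):
            """Collapse common duplicate variants (www, trailing slash, query strings) onto one URL."""
            scheme, netloc, path, query, fragment = urlsplit(url)
            netloc = netloc.lower()
            if netloc.startswith("www."):
                netloc = netloc[4:]
            path = path.rstrip("/") or "/"
            return urlunsplit(("https", netloc, path, "", ""))  # naive: drops every query string

        for u in ["http://WWW.Example.com/shoes/",
                  "https://example.com/shoes?utm_source=newsletter",
                  "https://www.example.com/shoes"]:
            print(canonicalize(u))  # all three print https://example.com/shoes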

    3. Site speed

    Developers are under pressure to deliver code on time and might neglect areas affecting page speed. Communicate the importance of page speed from the start, and build time into the brief for optimizing the site’s performance. (A three-part Site Speed for Dummies guide explains why we should care about this area.)

    4. Languages and locations

    If you are planning on targeting users from different countries, you need to decide whether your site would be multi-lingual, multi-regional, or both. Localized keyword research, hreflang considerations, and duplicate content are all issues better addressed before the site build.

    Using separate country-level domains gives an advantage of being able to target a country or language more closely. This approach is, however, reliant upon you having the resources to build and maintain infrastructure, write unique content, and promote each domain.

    If you plan to go down the route of multiple language/country combinations on a single site, typically the best approach is subfolders (e.g. example.com/uk, example.com/de). Subfolders can run from one platform/CMS, which means that development setup/maintenance is significantly lower.
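
    If you go the subfolder route, each locale version of a page should carry reciprocal hreflang annotations pointing at all the others. A small sketch that generates them (the locale-to-folder mapping is hypothetical):

        # Hypothetical locale-to-subfolder mapping for a single-site, subfolder setup.
        LOCALES = {"en-gb": "/uk", "de-de": "/de", "x-default": ""}

        def hreflang_tags(page_path, base="https://example.com"):
            """Emit the reciprocal hreflang tags each locale version of a page should carry."""
            return "\n".join(
                f'<link rel="alternate" hreflang="{lang}" href="{base}{folder}{page_path}" />'
                for lang, folder in LOCALES.items()
            )

        print(hreflang_tags("/pricing"))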

    5. Ease of editing and flexibility in a platform

    Google tends to update their recommendations and requirements all the time. Your platform needs to be flexible enough to make quick changes at scale on your site.

    Design areas to consider in order to avoid future redesign

    1. Architecture and internal linking

    An effective information architecture is critical if you want search engines to be able to find your content and serve it to users. If crawlers cannot access the content, they cannot rank it well. From a human point of view, information architecture is important so that users can easily find what they are looking for.

    Where possible, you should look to create a flat site structure that will keep pages no deeper than 4 clicks from the homepage. That allows search engines and users to find content in as few clicks as possible.
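
    To audit click depth on an existing site, a breadth-first crawl gives you the minimum number of clicks from the homepage to every discoverable page. A rough sketch for small sites, with no robots.txt handling or rate limiting, so treat it as illustrative only:

        from collections import deque
        from urllib.parse import urljoin, urlsplit

        import requests
        from bs4 import BeautifulSoup  # pip install beautifulsoup4

        def pages_beyond_depth(homepage, max_depth=4):
            """Breadth-first crawl; BFS depth equals the minimum clicks from the homepage."""
            host = urlsplit(homepage).netloc
            seen = {homepage: 0}          # URL -> click depth at first discovery
            queue = deque([homepage])
            too_deep = []
            while queue:
                url = queue.popleft()
                depth = seen[url]
                if depth > max_depth:     # flag it, and don't crawl any deeper from here
                    too_deep.append(url)
                    continue
                html = requests.get(url, timeout=10).text
                for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
                    link = urljoin(url, a["href"]).split("#")[0]
                    if urlsplit(link).netloc == host and link not in seen:
                        seen[link] = depth + 1
                        queue.append(link)
            return too_deep

        print(pages_beyond_depth("https://www.example.com/"))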

    Use keyword and competitor research to guide which pages you should have. However, the way pages should be grouped and connected should be user-focused. See how users map out relationships between your content using a card sorting technique — you don’t have to have website mockup or even products in order to do that. (This guide discusses in detail how to Improve Your Information Architecture With Card Sorting.)

    2. Content-first design

    Consider what types of content you will host. Will it be large guides/whitepapers, or a video library? Your content strategy needs to be mapped out at this point to understand what formats you will use, and hence what kind of functionality they will require. Knowing what content types you’ll be producing will help with designing page types and creating a more consistent user interface.

    3. Machine readability (Flash, JS, iFrame) and structured data

    Your web pages might use a variety of technologies, such as JavaScript, Flash, and Ajax, that can be hard for crawlers to understand. Although they may be necessary to provide a better user experience, you need to be aware of the issues these technologies can cause. In order to improve your site’s machine readability, mark up your pages with structured data, as described in more detail in the post: “How to Audit a Site for Structured Data Opportunities”.

    4. Responsive design

    As we see more variation in devices and their requirements, along with shifting behavior patterns of mobile device use, ‘mobile’ is becoming less of a separate channel and instead is becoming an underlying technology for accessing the web. Therefore, the long-term goal should be to create a seamless and consistent user experience across all devices. In the interest of this goal, responsive design and dynamic serving methods can assist with creating device-specific experiences.

    Closing thoughts

    As a business owner, or someone responsible for launching a site, you have a lot on your plate. It is probably not the best use of your time to go down the rabbit hole, reading about how to implement structured data and whether JSON-LD is better than Microdata. This post gives you important areas that you should keep in mind and address with those you are delegating them to — even if the scope of such delegation is doing research for you (“Give me pros and cons of HTTPS for my business”) rather than complete implementation/handling.

    I invite my fellow marketers to add other areas/issues you feel should be addressed at the initial planning stages in the comments below!





    from The Moz Blog http://tracking.feedpress.it/link/9375/5361106
    via IFTTT