Posted by DiTomaso

We base so much of our livelihood on good data, but managing that data properly is a task in and of itself. In this week's Whiteboard Friday, Dana DiTomaso shares why you need to keep your data clean and some of the top things to watch out for.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hi. My name is Dana DiTomaso. I am President and partner at Kick Point. We're a digital marketing agency, based in the frozen north of Edmonton, Alberta. So today I'm going to be talking to you about data hygiene.

What I mean by that is the stuff we see every single time we start working with a new client: this stuff is always messed up. Sometimes it's one of these four things, sometimes it's all four, and sometimes there are extra things. So I'm going to cover this stuff today in the hopes that perhaps the next time we get a profile from someone it is not quite as bad, or, if you look at these things and see how bad yours are, that you sit down and start cleaning them up.

1. Filters

So what we're going to start with first are filters. By filters, I'm talking about analytics here, specifically Google Analytics. When you go into the admin of Google Analytics, there's a section called Filters. There's a section on the left, which holds all the filters for everything in that account, and then there's a filters section for each view. Filters help you exclude or include specific traffic based on a set of parameters.

Filter out office, home office, and agency traffic

So usually what we'll find is one Analytics property for your website, and it has one view, the "All Website Data" view that Analytics gives you by default, but there are no filters. That means you're not excluding things like office traffic, your internal people visiting the website, or home office traffic. If you have a bunch of people who work from home, get their IP addresses and exclude them too, because you don't want your internal traffic mucking up things like conversions, especially if you're doing stuff like checking your own forms.

You haven't had a lead in a while and maybe you fill out the form to make sure it's working. You don't want that coming in as a conversion and then screwing up your data, especially if you're a low-volume website. If you have a million hits a day, then maybe this isn't a problem for you. But if you're like the rest of us and don't necessarily have that much traffic, something like this can be a big problem in terms of the volume of traffic you see. Then agency traffic as well.

So agencies, please make sure that you're filtering out your own traffic. Again things like your web developer, some contractor you worked with briefly, really make sure you're filtering out all that stuff because you don't want that polluting your main profile.
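As a rough sketch of what a GA IP-exclusion filter does, here's the matching logic in JavaScript. The IP addresses are documentation placeholders (RFC 5737), standing in for your real office, home-office, and agency IPs:

```javascript
// Hypothetical internal IPs to exclude; a GA filter would hold a
// similar regex in its "exclude by IP address" pattern field.
const internalIps = /^203\.0\.113\.(10|24|57)$/;

const isInternal = (ip) => internalIps.test(ip);

console.log(isInternal('203.0.113.24')); // true: office traffic, filtered out
console.log(isInternal('198.51.100.7')); // false: a real visitor, kept
```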

Create a test and staging view

The other thing that I recommend is creating what we call a test and staging view. Usually in our Analytics profiles, we'll have three different views. One we call master, and that's the view that has all these filters applied to it.

So you're only seeing the traffic that isn't you. It's the customers, people visiting your website, the real people, not your office people. Then the second view we call test and staging. So this is just your staging server, which is really nice. For example, if you have a different URL for your staging server, which you should, then you can just include that traffic. Then if you're making enhancements to the site or you upgraded your WordPress instance and you want to make sure that your goals are still firing correctly, you can do all that and see that it's working in the test and staging view without polluting your main view.

Test on a second property

That's really helpful. Then the third thing is to make sure to test on a second property. This is easy to do with Google Tag Manager. In most of our Google Tag Manager accounts, we'll have our usual analytics, and most of the stuff goes there. But if we're testing something new, like say the content consumption metric we started putting out this summer, then we set up a second Analytics property and send the new stuff we're trying over to that second property, not just a second view.

So you have two different Analytics properties. One is your main property. This is where all the regular stuff goes. Then you have a second property, which is where you test things out, and this is really helpful to make sure that you're not going to screw something up accidentally when you're trying out some crazy new thing like content consumption, which can totally happen and has definitely happened as we were testing the product. You don't want to pollute your main data with something different that you're trying out.

So send something to a second property. You do this for websites. You always have a staging and a live. So why wouldn't you do this for your analytics, where you have a staging and a live? So definitely consider setting up a second property.
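As a sketch of how that routing might look in gtag.js: the property IDs, event name, and `percent_viewed` parameter below are hypothetical, but `send_to` is the real gtag parameter that directs a hit to one configured property instead of all of them.

```javascript
// Stand-in for the dataLayer the real gtag snippet creates on the page.
const dataLayer = [];
function gtag() { dataLayer.push(arguments); }

gtag('config', 'G-MAIN000000'); // main property: all regular traffic
gtag('config', 'G-TEST000000'); // second property: experiments only

// An experimental event goes only to the test property:
gtag('event', 'content_consumption', {
  send_to: 'G-TEST000000',
  percent_viewed: 60,
});
```

Without `send_to`, the event would be broadcast to every configured property, which is exactly the pollution we're trying to avoid.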

2. Time zones

The next thing that we have a lot of problems with are time zones. Here's what happens.

Let's say your website is a basic install of WordPress and you didn't change the time zone, so it's set to UTC. That's the default in WordPress unless you change it. So now your website's data says it's in UTC. Then let's say your marketing team is on the East Coast, so they've got all of their tools set to Eastern time. Then your sales team is on the West Coast, so all of their tools are set to Pacific time.

So you can end up with a situation where, let's say, you've got a website using a form plugin for WordPress. When someone submits a form, it's recorded on your website, but that data also gets pushed over to your sales CRM. So now your website is saying that this number of leads came in on this day, because it's in UTC. But in Eastern time the day has already ended, or hasn't started yet, and Eastern is what your analytics tools are using when they record the number of leads.

But then the third wrinkle is that you have Salesforce or HubSpot or whatever your CRM is recording Pacific time. So that means you've got this huge gap of who knows when this stuff happened, and your data will never line up. This is incredibly frustrating, especially if you're trying to diagnose why, for example, you're submitting a form but not seeing the lead, or if you've got other data hygiene issues and can't match up the data. That's because you have different time zones.

So definitely check the time zones of every product you use: website, CRM, analytics, ads, all of it. If it has a time zone, pick one and stick with it. That's your canonical time zone. It will save you so many headaches down the road, trust me.
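To make the mismatch concrete, here's a small JavaScript sketch: one lead recorded at 01:30 UTC lands on different calendar days depending on each tool's time zone (the timestamp is made up for illustration).

```javascript
// A form submission at 01:30 UTC on March 2nd...
const lead = new Date(Date.UTC(2024, 2, 2, 1, 30));

// ...formatted as a date in each tool's time zone (en-CA gives YYYY-MM-DD).
const dateIn = (tz) =>
  new Intl.DateTimeFormat('en-CA', { timeZone: tz }).format(lead);

console.log(dateIn('UTC'));                 // 2024-03-02: the website's day
console.log(dateIn('America/New_York'));    // 2024-03-01: marketing's day
console.log(dateIn('America/Los_Angeles')); // 2024-03-01: sales' day
```

Same lead, two different days, and now your daily lead counts will never reconcile across tools.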

3. Attribution

The next thing is attribution. Attribution is a whole other lecture in and of itself, beyond what I'm talking about here today.

Different tools have different ways of showing attribution

But what I find frustrating about attribution is that every tool has its own little special way of doing it. Analytics is like the last non-direct click. That's great. Ads says, well, maybe we'll attribute it, maybe we won't. If you went to the site a week ago, maybe we'll call it a view-through conversion. Who knows what they're going to call it? Then Facebook has a completely different attribution window.

You can use a tool, such as Supermetrics, to change the attribution window. But if you don't understand what the default attribution window is in the first place, you're just going to make things harder for yourself. Then there's HubSpot, which says the very first touch is what matters, and so, of course, HubSpot will never agree with Analytics and so on. Every tool has its own little special sauce and how they do attribution. So pick a source of truth.
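To illustrate why the numbers never agree, here's a toy JavaScript comparison of two common models over the same hypothetical visitor journey:

```javascript
// One converting visitor's touchpoints, oldest first:
const touches = ['facebook', 'direct', 'google_ads', 'direct'];

// Analytics-style last non-direct click: the newest non-direct touch wins.
const lastNonDirect = [...touches].reverse().find((t) => t !== 'direct');

// HubSpot-style first touch: the oldest touch wins.
const firstTouch = touches[0];

console.log(lastNonDirect); // google_ads
console.log(firstTouch);    // facebook
```

Both models are defensible; they just credit different channels for the same conversion, which is why two tools looking at identical traffic will report different winners.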

Pick your source of truth

This is the best thing to do: just say, "You know what? I trust this tool the most." Then that is your source of truth. Do not try to get this source of truth to match up with that source of truth. You will go insane. You do still have to make sure the basics, like your time zones, are consistent, so get that all set first.

Be honest about limitations

But then after that, really it's just making sure that you're being honest about your limitations.

Know where things are necessarily going to fall down, and that's okay, but at least you've got this source of truth that you can trust. That's the most important thing with attribution. Make sure to spend the time and read how each tool handles attribution, so that when someone comes to you and says, "Well, I see that we got 300 visits from this ad campaign, but Facebook says we got 6,000. Why is that?" you have an answer.

That might be a little bit of an extreme example, but I've seen weirder things with Facebook attribution versus Analytics attribution. And I haven't even talked about tools like Mixpanel and Kissmetrics. Every tool has its own special way of recording attribution. It's never the same as anyone else's. We don't have an industry standard for how this stuff works, so make sure you understand these pieces.

4. Interactions

Then the last thing is what I call interactions. The biggest thing I find that people do wrong here is in Google Tag Manager: it gives you a lot of rope, which you can hang yourself with if you're not careful.

GTM interactive hits

One of the biggest things is what we call an interactive hit versus a non-interactive hit. So let's say in Google Tag Manager you have a scroll depth.

You want to see how far down the page people scroll. At 25%, 50%, 75%, and 100%, it will send off an alert and say this is how far down they scrolled on the page. Well, the thing is that you can also make that interactive. So if somebody scrolls down the page 25%, you can say, well, that's an interactive hit, which means that person is no longer bounced, because it's counting an interaction, which for your setup might be great.

Gaming bounce rate

But what I've seen are unscrupulous agencies who come in and say if the person scrolls 2% of the way down the page, now that's an interactive hit. Suddenly the client's bounce rate goes down from say 80% to 3%, and they think, "Wow, this agency is amazing." They're not amazing. They're lying. This is where Google Tag Manager can really manipulate your bounce rate. So be careful when you're using interactive hits.

Maybe it's totally fair that if someone is reading your content, they might just read that one page, hit the back button, and go back out. It's totally fair to count something like scroll depth, or a certain piece of the content entering the user's viewport, as interactive. But that doesn't mean that everything should be interactive. So dial it back on the interactions that you're using, or at least make smart decisions about the interactions you choose to use, because otherwise you're just gaming your bounce rate.
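In gtag.js terms (Universal Analytics syntax), the difference is a single flag. This sketch stubs the dataLayer so you can see what gets sent; the event name and `percent_scrolled` parameter are illustrative, while `non_interaction` is the real field that keeps a hit from cancelling a bounce.

```javascript
// Stand-in for the dataLayer the real gtag snippet creates on the page.
const dataLayer = [];
function gtag() { dataLayer.push(arguments); }

// A 25% scroll ping sent as a NON-interaction hit: it records engagement
// without zeroing out your bounce rate. Flip the flag to false and the
// same ping makes every scrolling visitor count as "not bounced".
gtag('event', 'scroll_depth', {
  percent_scrolled: 25,
  non_interaction: true,
});
```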

Goal setup

Then goal setup as well, that's a big problem. A lot of people have destination goals set up in Analytics by default, maybe because they don't know how to set up event-based goals. By destination goal, I mean you filled out the form, you got to a thank you page, and you're recording views of that thank you page as goals. Yes, that's one way to do it.

But the problem is that a lot of people, who aren't super great at interneting, will bookmark that page or they'll keep coming back to it again and again because maybe you put some really useful information on your thank you page, which is what you should do, except that means that people keep visiting it again and again without actually filling out the form. So now your conversion rate is all messed up because you're basing it on destination, not on the actual action of the form being submitted.
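A tiny sketch of why that inflates conversions, using a hypothetical hit log for one visitor: one real submission followed by two bookmarked return visits.

```javascript
const hits = [
  { page: '/thank-you', event: 'form_submit' }, // the actual submission
  { page: '/thank-you', event: null },          // bookmarked return visit
  { page: '/thank-you', event: null },          // another return visit
];

// Destination goal: every view of the thank-you page counts.
const destinationGoals = hits.filter((h) => h.page === '/thank-you').length;

// Event-based goal: only the form submission counts.
const eventGoals = hits.filter((h) => h.event === 'form_submit').length;

console.log(destinationGoals); // 3
console.log(eventGoals);       // 1
```

One lead, three "conversions" under the destination model, and your conversion rate is now triple what it should be.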

So be careful on how you set up goals, because that can also really game the way you're looking at your data.

Ad blockers

Ad blockers could account for anywhere from 2% to 10% of your audience, depending upon how technically sophisticated your visitors are. So you'll end up in situations where you have a form fill but no corresponding visit to match with that form fill.

It just goes into an attribution black hole. But they did fill out the form, so at least you got their data, but you have no idea where they came from. Again, that's going to be okay. So definitely think about the percentage of your visitors, based on you and your audience, who probably have an ad blocker installed and make sure you're comfortable with that level of error in your data. That's just the internet, and ad blockers are getting more and more popular.
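One rough way to size that error, assuming your CRM logs every form fill server-side while analytics only sees unblocked visitors (all numbers hypothetical):

```javascript
const crmFormFills = 100; // leads your CRM actually received
const trackedFills = 92;  // leads analytics saw a session for

// The rest fell into the attribution black hole.
const blockedShare = (crmFormFills - trackedFills) / crmFormFills;

console.log(`${(blockedShare * 100).toFixed(0)}% of leads have no known source`);
```

If that share is within the 2% to 10% band you'd expect for your audience, it's probably ad blockers rather than broken tracking.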

Stuff like Apple is changing the way that they do tracking. So definitely make sure that you understand these pieces and you're really thinking about that when you're looking at your data. Again, these numbers may never 100% match up. That's okay. You can't measure everything. Sorry.

Bonus: Audit!

Then the last thing I really want you to think about — this is the bonus tip — audit regularly.

So at least once a year, go through all the different stuff that I've covered in this video and make sure that nothing has changed or been updated, and that you don't have some secret, exciting new tracking code that somebody added and then forgot about, because you were trying out a trial of some product, tossed it on, and it's been running for a year even though the trial expired nine months ago. So definitely make sure that you're running only the stuff you should be running and doing an audit at least on a yearly basis.

If you're busy and you have a lot of different visitors to your website, it's a pretty high-volume property, maybe monthly or quarterly would be a better interval, but at least once a year go through and make sure that everything that's there is supposed to be there, because that will save you headaches when you look at trying to compare year-over-year and realize that something horrible has been going on for the last nine months and all of your data is trash. We really don't want to have that happen.

So I hope these tips are helpful. Get to know your data a little bit better. It will like you for it. Thanks.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


Posted by TheMozTeam

This post was originally published on the STAT blog.

Whether you’re tracking thousands or millions of keywords, if you expect to extract deep insights and trends just by looking at your keywords from a high level, you’re not getting the full story.

Smart segmentation is key to making sense of your data. And you’re probably already applying this outside of STAT. So now, we’re going to show you how to do it in STAT to uncover boatloads of insights that will help you make super data-driven decisions.

To show you what we mean, let’s take a look at a few ways we can set up a search intent project to uncover the kinds of insights we shared in our whitepaper, Using search intent to connect with consumers.

Before we jump in, there are a few things you should have down pat:

1. Picking a search intent that works for you

Search intent is the motivating force behind search and it can be:

  • Informational: The searcher has identified a need and is looking for information on the best solution, e.g. [blender], [food processor]
  • Commercial: The searcher has zeroed in on a solution and wants to compare options, e.g. [blender reviews], [best blenders]
  • Transactional: The searcher has narrowed their hunt down to a few best options, and is on the precipice of purchase, e.g. [affordable blenders], [blender cost]
    • Local (sub-category of transactional): The searcher plans to do or buy something locally, e.g. [blenders in dallas]
    • Navigational (sub-category of transactional): The searcher wants to locate a specific website, e.g. [Blendtec]

We left navigational intent out of our study because it’s brand-specific and we didn’t want to bias our data.

Our keyword set was a big list of retail products — from kitty pooper-scoopers to pricey speakers. We needed a straightforward way to imply search intent, so we added keyword modifiers to characterize each type of intent.

As always, different strokes for different folks: The modifiers you choose and the intent categories you look at may differ, but it’s important to map that all out before you get started.
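As a sketch, modifier-based intent mapping can be as simple as a few regexes. The modifier lists below are illustrative, not STAT's actual configuration:

```javascript
// Tag a keyword with an intent bucket based on its modifier.
const intentOf = (kw) => {
  if (/\b(best|top|reviews?)\b/.test(kw)) return 'commercial';
  if (/\b(affordable|cheap|cost|price)\b/.test(kw)) return 'transactional';
  return 'informational'; // no modifier: the bare product query
};

console.log(intentOf('best blenders'));       // commercial
console.log(intentOf('affordable blenders')); // transactional
console.log(intentOf('blender'));             // informational
```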

2. Identifying the SERP features you really want

For our whitepaper research, we pretty much tracked every feature under the sun, but you certainly don’t have to.

You might already know which features you want to target, the ones you want to keep an eye on, or questions you want to answer. For example, are shopping boxes taking up enough space to warrant a PPC strategy?

In this blog post, we’re going to really focus in on our most beloved SERP feature: featured snippets (called “answers” in STAT). And we’ll be using a sample project where we’re tracking 25,692 keywords against Amazon.com.

3. Using STAT’s segmentation tools

Setting up projects in STAT means making use of the segmentation tools. Here’s a quick rundown of what we used:

  • Standard tag: Best used to group your keywords into static themes — search intent, brand, product type, or modifier.
  • Dynamic tag: Like a smart playlist, automatically returns keywords that match certain criteria, like a given search volume, rank, or SERP feature appearance.
  • Data view: Houses any number of tags and shows how those tags perform as a group.

Learn more about tags and data views in the STAT Knowledge Base.

Now, on to the main event…

1. Use top-level search intent to find SERP feature opportunities

To kick things off, we’ll identify the SERP features that appear at each level of search intent by creating tags.

Our first step is to filter our keywords and create standard tags for our search intent keywords (read more about filtering keywords). Second, we create dynamic tags to track the appearance of specific SERP features within each search intent group. And our final step, to keep everything organized, is to place our tags in tidy little data views, according to search intent.

Here’s a peek at what that looks like in STAT:

What can we uncover?

Our standard tags (the blue tags) show how many keywords are in each search intent bucket: 2,940 commercial keywords. And our dynamic tags (the sunny yellow stars) show how many of those keywords return a SERP feature: 547 commercial keywords with a snippet.

This means we can quickly spot how much opportunity exists for each SERP feature by simply glancing at the tags. Boom!

By quickly crunching some numbers, we can see that snippets appear on 5 percent of our informational SERPs (27 out of 521), 19 percent of our commercial SERPs (547 out of 2,940), and 12 percent of our transactional SERPs (253 out of 2,058).
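Those percentages are just each dynamic tag’s keyword count divided by its standard tag’s count; as a quick sketch:

```javascript
// Share of SERPs in an intent bucket that return a featured snippet.
const snippetShare = (withSnippet, total) =>
  Math.round((withSnippet / total) * 100);

console.log(snippetShare(27, 521));   // 5  (informational)
console.log(snippetShare(547, 2940)); // 19 (commercial)
console.log(snippetShare(253, 2058)); // 12 (transactional)
```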

From this, we might conclude that optimizing our commercial intent keywords for featured snippets is the way to go since they appear to present the biggest opportunity. To confirm, let’s click on the commercial intent featured snippet tag to view the tag dashboard…

Voilà! There are loads of opportunities to gain a featured snippet.

Though, we should note that most of our keywords rank below where Google typically pulls the answer from. So, what we can see right away is that we need to make some serious ranking gains in order to stand a chance at grabbing those snippets.


2. Find SERP feature opportunities with intent modifiers

Now, let’s take a look at which SERP features appear most often for our different keyword modifiers.

To do this, we group our keywords by modifier and create a standard tag for each group. Then, we set up dynamic tags for our desired SERP features. Again, to keep track of all the things, we contained the tags in handy data views, grouped by search intent.

What can we uncover?

Because we saw that featured snippets appear most often for our commercial intent keywords, it’s time to drill on down and figure out precisely which modifiers within our commercial bucket are driving this trend.

Glancing quickly at the numbers in the tag titles in the image above, we can see that “best,” “reviews,” and “top” are responsible for the majority of the keywords that return a featured snippet:

  • 212 out of 294 of our “best” keywords (72%)
  • 109 out of 294 of our “reviews” keywords (37%)
  • 170 out of 294 of our “top” keywords (59%)

This shows us where our efforts are best spent optimizing.

By clicking on the “best — featured snippets” tag, we’re magically transported into the dashboard. Here, we see that our average ranking could use some TLC.


There is a lot of opportunity to snag a snippet here, but we (actually, Amazon, who we’re tracking these keywords against) don’t seem to be capitalizing on that potential as much as we could. Let’s drill down further to see which snippets we already own.

We know we’ve got content that has won snippets, so we can use that as a guideline for the other keywords that we want to target.


3. See which pages are ranking best by search intent

In our blog post How Google dishes out content by search intent, we looked at what type of pages — category pages, product pages, reviews — appear most frequently at each stage of a searcher’s intent.

What we found was that Google loves category pages, which are the engine’s top choice for retail keywords across all levels of search intent. Product pages weren’t far behind.

By creating dynamic tags for URL markers, or portions of your URL that identify product pages versus category pages, and segmenting those by intent, you too can get all this glorious data. That’s exactly what we did for our retail keywords.

What can we uncover?

Looking at the tags in the transactional page types data view, we can see that product pages are appearing far more frequently (526) than category pages (151).

When we glanced at the dashboard, we found that slightly more than half of the product pages were ranking on the first page (sah-weet!). That said, more than thirty percent appeared on page three and beyond. So despite the initial visual of “doing well”, there’s a lot of opportunity that Amazon could be capitalizing on.

We can also see this in the Daily Snapshot. In the image above, we compare category pages (left) to product pages (right), and we see that while fewer category pages are ranking, their ranks are significantly better. Amazon could take some of the lessons they’ve applied to their category pages to help their product pages out.

Wrapping it up

So what did we learn today?

  1. Smart segmentation starts with a well-crafted list of keywords, grouped into tags, and housed in data views.
  2. The more you segment, the more insights you’re gonna uncover.
  3. Rely on the dashboards in STAT to flag opportunities and tell you what’s good, yo!

Want to see it all in action? Get a tailored walkthrough of STAT here.

Or get your mitts on even more intent-based insights in our full whitepaper: Using search intent to connect with consumers.

Read on, readers!

More in our search intent series:



Posted by TheMozTeam

We teamed up with our friends at Duda, a website design scaling platform service, who asked their agency customers to divulge their most pressing SEO questions, quandaries, and concerns. Our in-house SEO experts, always down for a challenge, hunkered down to collaborate on providing them with answers. From Schema.org to voice search to local targeting, we're tackling real-world questions about organic search. Read on for digestible insights and further resources!

How do you optimize for international markets?

International sites can be multi-regional, multilingual, or both. The website setup will differ depending on that classification.

  • Multi-regional sites are those that target audiences from multiple countries. For example: a site that targets users in the U.S. and the U.K.
  • Multilingual sites are those that target speakers of multiple languages. For example, a site that targets both English and Spanish-speakers.

To geo-target sections of your site to different countries, you can use a country-specific domain (ccTLD) such as “.de” for Germany or subdomains/subdirectories on generic TLDs such as “example.com/de.”

For different language versions of your content, Google recommends using different URLs rather than using cookies to change the language of the content on the page. If you do this, make use of the hreflang tag to tell Google about alternate language versions of the page.
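As a sketch of what those alternate-URL annotations look like, here’s a snippet that generates hreflang link tags for a hypothetical page (example.com and the language list are placeholders):

```javascript
const alternates = {
  'en': 'https://example.com/page',        // English version
  'de': 'https://example.com/de/page',     // German version
  'x-default': 'https://example.com/page', // fallback for unmatched users
};

// Each language version's <head> should list every alternate, itself included.
const hreflangTags = Object.entries(alternates).map(
  ([lang, url]) => `<link rel="alternate" hreflang="${lang}" href="${url}" />`
);

console.log(hreflangTags.join('\n'));
```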

For more information on internationalization, visit Google’s “Managing multi-regional and multilingual sites” or Moz’s guide to international SEO.

How do we communicate to clients that SEO projects need ongoing maintenance work?

If your client is having difficulty understanding SEO as a continuous effort, rather than a one-and-done task, it can be helpful to highlight the changing nature of the web.

Say you created enough quality content and earned enough links to that content to earn yourself a spot at the top of page one. Because organic placement is earned and not paid for, you don’t have to keep paying to maintain that placement on page one. However, what happens when a competitor comes along with better content that has more links than your content? Because Google wants to surface the highest quality content, your page’s rankings will likely suffer in favor of this better page.

Maybe it’s not a competitor that depreciates your site’s rankings. Maybe new technology comes along and now your page is outdated or even broken in some areas.

Or how about pages that are ranking highly in search results, only to get crowded out by a featured snippet, a Knowledge Panel, Google Ads, or whatever the latest SERP feature is?

Set-it-and-forget-it is not an option. Your competitors are always on your heels, technology is always changing, and Google is constantly changing the search experience.

SEO specialists are here to ensure you stay at the forefront of all these changes because the cost of inaction is often the loss of previously earned organic visibility.

How do I see what subpages Google delivers on a search? (Such as when the main page shows an assortment of subpages below the result, via an indent.)

Sometimes, as part of a URL’s result snippet, Google will list additional subpages from that domain beneath the main title-URL-description. These are called organic sitelinks. Site owners have no control over when sitelinks appear or which URLs Google chooses to show, aside from deleting or noindexing a page from the site.

If you’re tracking keywords in a Moz Pro Campaign, you have the ability to see which SERP features (including sitelinks) your pages appear in.

The Moz Keyword Explorer research tool also allows you to view SERP features by keyword:

What are the best techniques for analyzing competitors?

One of the best ways to begin a competitor analysis is by identifying the URLs on your competitor’s site that you’re directly competing with. The idea of analyzing an entire website against your own can be overwhelming, so start with the areas of direct competition.

For example, if you’re targeting the keyword “best apple pie recipes,” identify the top ranking URL(s) for that particular query and evaluate them against your apple pie recipe page.

You should consider comparing qualities such as:

Moz also created the metrics Domain Authority (DA) and Page Authority (PA) to help website owners better understand their ranking ability compared to their competitors. For example, if your URL has a PA of 35 and your competitor’s URL has a PA of 40, it’s likely that their URL will rank more favorably in search results.

Competitor analysis is a great benchmarking tool and can give you great ideas for your own strategies, but remember, if your only strategy is emulation, the best you’ll ever be is the second-best version of your competitors!

As an SEO agency, can you put a backlink to your website on clients’ pages without getting a Google penalty? (Think the Google Penguin update.)

Many website design and digital marketing agencies add a link to their website in the footer of all their clients’ websites (usually via their logo or brand name). Google says in their quality guidelines that “creating links that weren’t editorially placed or vouched for by the site’s owner on a page, otherwise known as unnatural links, can be considered a violation of our guidelines” and they use the example of “widely distributed links in the footers or templates of various sites.” This does not mean that all such footer links are a violation of Google’s guidelines. What it does mean is that these links have to be vouched for by the site’s owner. For example, an agency cannot require this type of link on their clients’ websites as part of their terms of service or contract. You must allow your client the choice of using nofollow or removing the link.

The fourth update of the Google Penguin algorithm was rolled into Google’s core algorithm in September of 2016. This new “gentler” algorithm, described in the Google Algorithm Change History, devalues unnatural links, rather than penalizing sites, but link schemes that violate Google’s quality guidelines should still be avoided.

We’re working on a new website. How do we communicate the value of SEO to our customers?

When someone searches a word or phrase related to a business, good SEO ensures that the business’s website shows up prominently in the organic (non-ad) search results, that their result is informative and enticing enough to prompt searchers to click, and that the visitor has a positive experience with the website. In other words, good SEO helps a website get found, get chosen, and convert new business.

That’s done through activities that fall into three main categories:

  • Content: Website content should be written to address your audience’s needs at all stages of their purchase journey: from top-of-funnel, informational content to bottom-of-funnel, I-want-to-buy content. Search engine optimized content is really just content that is written around the topics your audience wants and in the formats they want it, with the purpose of converting or assisting conversions.
  • Links: Earning links to your web content from high-quality, relevant websites not only helps Google find your content, it signals that your site is trustworthy.
  • Accessibility: Ensuring that your website and its content can be found and understood by both search engines and people. A strong technical foundation also increases the likelihood that visitors to the website have a positive experience on any device.

Why is SEO valuable? Simply put, it’s one more place to get in front of people who need the products or services you offer. With 40–60 billion Google searches in the US every month, and more than 41% / 62% (mobile / desktop) of clicks going to organic, it’s an investment you can’t afford to ignore.

How do you optimize for voice search? Where do you find phrases used via tools like Google Analytics?

Google doesn’t yet separate out voice query data from text query data, but many queries don’t change drastically with the medium (speaking vs. typing the question), so the current keyword data we have can still be a valuable way to target voice searchers. It’s important here to draw the distinction between voice search (“Hey Google, where is the Space Needle?”) and voice commands (ex: “Hey Google, tell me about my day”) — the latter are not search queries but spoken tasks that certain voice assistant devices respond to, and they differ from what we’d type.

Voice assistant devices typically pull their answers to informational queries from their Knowledge Graph or from the top of organic search results, which is often a featured snippet. That’s why one of the best ways to go after voice queries is to capture featured snippets.

If you’re a local business, it’s also important to have your GMB data completely and accurately filled out, as this can influence the results Google surfaces for voice assistant queries like, “Hey Google, find me a pizza place near me that’s open now.”

Should my clients use a service such as Yext? Do they work? Is it worth it?

Automated listings management can be hugely helpful, but there are some genuine pain points with Yext, in particular. These include pricing (very expensive) and the fact that Yext charges customers to push their data to many directories that see little, if any, human use. Most importantly, local business owners need to understand that Yext is basically putting a paid layer of good data over the top of bad data — sweeping dirt under the carpet, you might say. Once you stop paying Yext, they pull up the carpet and there’s all your dirt again. By contrast, services like Moz Local (automated citation management) and Whitespark (manual citation management) correct your bad data at the source, rather than just putting a temporary paid Band-Aid over it. So, investigate all options and choose wisely.

How do I best target specific towns and cities my clients want to be found in outside of their physical location?

If you market a service area business (like a plumber), create a great website landing page with consumer-centric, helpful, unique content for each of your major service cities. Also very interesting for service area businesses is the fact that Google just changed its handling of setting the service radius in your Google My Business dashboard so that it reflects your true service area instead of your physical address. If you market a brick-and-mortar business that customers come to from other areas, it’s typically not useful to create content saying, “People drive to us from X!” Rather, build relationships with neighboring communities in the real world, reflect them on your social outreach, and, if they’re really of interest, reflect them on your website. Both service area businesses and bricks-and-mortar models may need to invest in PPC to increase visibility in all desired locations.

How often should I change page titles and meta descriptions to help local SEO?

While it’s good to experiment, don’t change your major tags just for the sake of busy work. Rather, if some societal trend changes the way people talk about something you offer, consider editing your titles and descriptions. For example, an auto dealership could realize that its consumers have started searching for “EVs” more than electric vehicles because society has become comfortable enough with these products to refer to them in shorthand. If keyword research and trend analysis indicate a shift like this, then it may be time to re-optimize elements of your website. Changing any part of your optimization is only going to help you rank better if it reflects how customers are searching.

Read more about title tags and metas:

Should you service clients within the same niche, since there can only be one #1?

If your keywords have no local intent, then taking on two clients competing for the same terms nationally could certainly be unethical. But this is a great question, because it presents the opportunity to absorb the fact that for any keyword for which Google perceives a local intent, there is no longer only one #1. For these search terms, both local and many organic results are personalized to the location of the searcher.

Your Mexican restaurant client in downtown isn’t really competing with your Mexican restaurant client uptown when a user searches for “best tacos.” Searchers’ results will change depending on where they are in the city when they search. So unless you’ve got two identical businesses within the same couple of blocks in a city, you can serve them both, working hard to find the USP of each client to help them shine bright in their particular setting for searchers in close proximity.

Is it better to have a one-page format or break it into 3–5 pages for a local service company that does not have lengthy content?

This question is looking for an easy way out of publishing when you’ve become a publisher. Every business with a website is a publisher, and there’s no good excuse for not having adequate content to create a landing page for each of your services, and a landing page for each of the cities you serve. I believe this question (and it’s a common one!) arises from businesses not being sure what to write about to differentiate their services in one location from their services in another. The services are the same, but what’s different is the location!

Publish text and video reviews from customers there, showcase your best projects there, offer tips specific to the geography and regulations there, interview service people, interview experts, sponsor teams and events in those service locations, etc. These things require an investment of time, but you’re in the publishing business now, so invest the time and get publishing! All a one-page website shows is a lack of commitment to customer service. For more on this, read Overcoming Your Fear of Local Landing Pages.

How much content do you need for SEO?

Intent, intent, intent! Google’s ranking signals are going to vary depending on the intent behind the query, and thank goodness for that! This is why you don’t need a 3,000-word article for your product page to rank, for example.

The answer to “how much content does my page need?” is “enough content for it to be complete and comprehensive,” which is a subjective factor that is going to differ from query to query.

Whether you write 300 words or 3,000 words isn’t the issue. It’s whether you completely and thoroughly addressed the page topic.

Check out these Whiteboard Fridays around content for SEO:


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


Posted by TheMozTeam

This week, we're taking a deep dive into search intent.

The STAT whitepaper looked at how SERP features respond to intent, and the bonus blog posts broke things down even further and examined how individual intent modifiers impact SERP features, the kind of content that Google serves at each stage of intent, and how you can set up your very own search intent projects.

Search intent is the new demographics, so it only made sense to get up close and personal with it. Of course, in order to bag all those juicy search intent tidbits, we needed a great intent-based keyword list. Here’s how you can get your hands on one of those.

Gather your core keywords

First, before you can even think about intent, you need to have a solid foundation of core keywords in place. These are the products, features, and/or services that you’ll build your search intent funnel around.

But goodness knows that keyword list-building is more of an art than a science, and even the greatest writers (hi, Homer) needed to invoke the muses (hey, Calliope) for inspiration, so if staring at your website isn’t getting the creative juices flowing, you can look to a few different places for help.

Snag some good suggestions from keyword research tools

Lots of folks like to use the Google Keyword Planner to help them get started. Ubersuggest and Yoast’s Google Suggest Expander will also help add keywords to your arsenal. And Answer The Public gives you all of that, and beautifully visualized to boot.
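Mechanically, building an intent-based list from your core keywords is just a cross-join of core terms and intent modifiers. Here's a minimal sketch in Python; the modifier groupings below are illustrative, not a standard taxonomy:

```python
# Sketch: expand a core keyword list with intent modifiers to seed an
# intent-based keyword list. The modifier patterns are illustrative only.

CORE_KEYWORDS = ["ankle boots", "chelsea boots"]

INTENT_MODIFIERS = {
    "informational": ["what are {kw}", "how to wear {kw}"],
    "commercial": ["best {kw}", "{kw} reviews"],
    "transactional": ["buy {kw}", "{kw} for sale"],
}

def expand_keywords(core, modifiers):
    """Return {intent: [expanded keyword, ...]} for every core keyword."""
    expanded = {}
    for intent, patterns in modifiers.items():
        expanded[intent] = [p.format(kw=kw) for kw in core for p in patterns]
    return expanded

if __name__ == "__main__":
    for intent, kws in expand_keywords(CORE_KEYWORDS, INTENT_MODIFIERS).items():
        print(intent, kws)
```

The output of a cross-join like this is a starting point to prune and enrich with the suggestion tools above, not a finished keyword list.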


Posted by Ben_Fisher

Google My Business (GMB) is one of the most powerful ways to improve a business’ local search engine optimization and online visibility. If you’re a local business, claiming your Google My Business profile is one of the first steps you should take to increase your company’s online presence.

As long as your local business meets Google’s guidelines, your Google My Business profile can help give your company FREE exposure on Google’s search engine. Not only can potential customers quickly see your business’ name, address and phone number, but they can also see photos of your business, read online reviews, find a description about your company, complete a transaction (like book an appointment) and see other information that grabs a searcher’s attention — all without them even visiting your website. That’s pretty powerful stuff!

Google My Business helps with local rankings

Not only is your GMB Profile easily visible to potential customers when they search on Google, but Google My Business is also a key Google local ranking factor. In fact, according to local ranking factor industry research, Google My Business “signals” is the most important ranking factor for local pack rankings. Google My Business signals had a significant increase in ranking importance between 2017 and 2018 — rising from 19% to 25%.

Claiming your Google My Business profile is your first step to local optimization — but many people mistakenly think that claiming it is enough. Optimizing your profile, and frequently logging into your Google My Business dashboard to make sure that no unwanted updates have been made, is vital to improving your rankings and ensuring the accuracy of your business profile.

Google My Business features that make your profile ROCK!

Google offers a variety of ways to optimize and enhance your Google My Business profile. You can add photos, videos, business hours, a description of your company, frequently asked questions and answers, communicate with customers via messages, allow customers to book appointments, respond to online reviews and more.

One of the most powerful ways to grab a searcher’s attention is by creating Google My Business Posts. GMB Posts are almost like mini-ads for your company, products, or services.

Google offers a variety of posts you can create to promote your business:

  • What's New
  • Event
  • Offer
  • Product

Posts also allow you to include a call to action (CTA) so you can better control what the visitor does after they view your post — creating the ultimate marketing experience. Current CTAs are:

  • Book
  • Order Online
  • Buy
  • Learn More
  • Sign Up
  • Get Offer
  • Call Now

Posts use a combination of images, text and a CTA to creatively show your message to potential customers. A Post shows in your GMB profile when someone searches for your business’ name on Google or views your business’ Google My Business profile on Google Maps.

Once you create a Post, you can even share it on your social media channels to get extra exposure.

Despite the name, Google My Business Posts are not actual social media posts. Typically the first 100 characters of the post are what shows up on screen (the rest is cut off and must be clicked on to be seen), so make sure the most important words are at the beginning of your post. Don’t use hashtags — they’re meaningless. It’s best if you can create new posts every seven days or so.
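Since only roughly the first 100 characters of a Post render before the cut-off, a tiny pre-publish check can catch a buried message. A sketch, where the 100-character limit mirrors the rule of thumb above rather than an official API value:

```python
# Sketch: check that a draft GMB Post's key message fits inside the
# ~100-character preview window. The limit is the article's rule of thumb,
# not an official Google value.

PREVIEW_CHARS = 100

def preview(post_text, limit=PREVIEW_CHARS):
    """Return the part of the post searchers see before the cut-off."""
    return post_text[:limit]

def message_fits(post_text, key_message, limit=PREVIEW_CHARS):
    """True if the key message appears entirely inside the visible preview."""
    return key_message in preview(post_text, limit)
```

A check like this is cheap to run over every draft post before it goes live.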

Google My Business Posts are a great way to show off your business in a unique way at the exact time when a searcher is looking at your business online.

But there’s a long-standing question: Are businesses actually creating GMB Posts to get their message across to potential customers? Let’s find out...

The big question: Are businesses actively using Google My Business Posts?

There has been a lot of discussion in the SEO industry about Google My Business Posts and their value: Do they help with SEO rankings? How effective are they? Do posts garner engagement? Does where the Posts appear on your GMB profile matter? How often should you post? Should you even create Google My Business Posts at all? Lots of questions, right?

As industry experts look at all of these angles, what do average, everyday business owners actually do when it comes to GMB Posts? Are real businesses creating posts? I set out to find the answer to this question using real data. Here are the details.

Google My Business Post case study: Just the facts

When I set out to discover if businesses were actively using GMB Posts for their companies’ Google My Business profiles, I first wanted to make sure I looked at data in competitive industries and markets. So I looked at a total of 2,000 Google My Business profiles that comprised the top 20 results in the Local Finder. I searched for highly competitive keyword phrases in the top ten cities (based on population density, according to Wikipedia).

For this case study, I also chose to look at service type businesses.

Here are the results.

Cities:

New York, Los Angeles, Chicago, Philadelphia, Dallas, San Jose, San Francisco, Washington DC, Houston, and Boston.

Keywords:

real estate agent, mortgage, travel agency, insurance or insurance agents, dentist, plastic surgeon, personal injury lawyer, plumber, veterinarian or vet, and locksmith

Surprise! Out of the industries researched, Personal Injury Lawyers and Locksmiths posted most often.

For the case study, I looked at the following:

  • How many businesses had an active Google My Business Post (i.e. have posted in the last seven days)
  • How many had previously made at least one post
  • How many have never created a post

Do businesses create Google My Business Posts?

Based on the businesses, cities, and keywords researched, I discovered that more than half of the businesses are actively creating Posts or have created Google My Business Posts in the past.

  • 17.5% of businesses had an active post in the last 7 days
  • 42.1% of businesses had previously made at least one post
  • 40.4% have never created a post

Highlight: A total of 59.60% of businesses have posted a Google My Business Post on their Google My Business profile.
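The highlighted figure is easy to sanity-check. A quick sketch; the raw counts here are reconstructed from the reported percentages and the 2,000-profile sample, not taken from the research spreadsheet itself:

```python
# Sketch: sanity-check the case-study percentages. Raw counts below are
# reconstructed from the reported percentages (17.5% / 42.1% / 40.4%) of the
# 2,000-profile sample, not pulled from the research spreadsheet.

TOTAL_PROFILES = 2000
active_last_7_days = 350   # 17.5%
posted_previously = 842    # 42.1%
never_posted = 808         # 40.4%

def pct(count, total=TOTAL_PROFILES):
    return round(100 * count / total, 1)

# The three groups should cover the whole sample...
assert active_last_7_days + posted_previously + never_posted == TOTAL_PROFILES
# ...and the "ever posted" share should match the highlighted 59.60% figure.
print(pct(active_last_7_days + posted_previously))  # 59.6
```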

NOTE: If you want to look at the raw numbers, you can check out the research document that outlines all the raw data. (Credit for the research spreadsheet template I used and inspiration to do this case study goes to SEO expert Phil Rozek.)

Do searchers engage with Google My Business Posts?

If a business takes the time to create Google My Business Posts, do searchers and potential customers actually take the time to look at your posts? And most importantly, do they take action and engage with your posts?

This chart represents nine random clients, their total post views over a 28-day period, and the corresponding total direct/branded impressions on their Google My Business profiles. When we look at the total number of direct/branded views alongside the number of views posts received, the number of views for posts appears to be higher. This means that a single user is more than likely viewing multiple posts.

This means that if you take the time to create a GMB Post and your marketing message is meaningful, you have a high chance of converting a potential searcher into a customer — or at least someone who is going to take the time to look at your marketing message. (How awesome is that?)

Do searchers click on Google My Business Posts?

So your GMB Posts show up in your Knowledge Panel when someone searches for your business on Google and Google Maps, but do searchers actually click on your post to read more?

When we compared post views across the various industries with their total direct/branded search views, we found that, on average, the post is clicked on almost 100% of the time!

Google My Business insights

When you log in to your Google My Business dashboard you can see firsthand how well your Posts are doing. Below is a side-by-side image of a business’ post views and their direct search impressions. By checking your GMB insights, you can find out how well your Google My Business posts are performing for your business!

GMB Posts are worth it

After looking at 2,000 GMB profiles, I discovered a lot of things. One thing is for sure. It's hard to tell on a week-by-week basis how many companies are using GMB Posts because posts “go dark” every seven business days (unless the Post is an event post with a start and end date.)

Also, Google recently moved Posts from the top of the Google My Business profile towards the bottom, so they don’t stand out as much as they did just a few months ago. This may mean that there’s less incentive for businesses to create posts.

However, what this case study does show us is that businesses in a competitive location and industry should use Google My Business optimization strategies and features like Posts if they want to get an edge on their competition.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


Posted by SamuelMangialavori

If you read the title of this blog and somehow, even only for a second, thought about the iconic movie “The Silence of the Lambs”, welcome to the club — you are not alone!

Despite the fact that the term “cannibalisation” does not sound very suitable for digital marketing, this core concept has been around for a long time. This term simply identifies the issue of having multiple pages competing for the same (or very similar) keywords/keyword clusters, hence the cannibalisation.

What do we mean by cannibalisation in SEO?

This unfortunate and often unnoticed problem harms the SEO potential of the pages involved. When more than one page has the same/similar keyword target, it creates “confusion” in the eyes of the search engine, resulting in a struggle to decide what page to rank for what term.

For instance, say my imaginary e-commerce website sells shoes online and I have created a dedicated category page that targets the term ‘ankle boots’: www.distilledshoes.com/boots/ankle-boots/

Knowing the importance of editorial content, over time I decide to create two blog posts that cover topics related to ankle boots off the back of keyword research: one post on how to wear ankle boots and another about the top 10 ways to wear ankle boots in 2019:

One month later, I realise that some of my blog pages are actually ranking for a few key terms that my e-commerce category page was initially visible for.

Now the question is: is this good or bad for my website?

Drum roll, please...and the answer is — It depends on the situation, the exact keywords, and the intent of the user when searching for a particular term.

Keyword cannibalisation is not black or white — there are multiple grey areas and we will try and go through several scenarios in this blog post. I recommend you spend 5 minutes checking this awesome Whiteboard Friday which covers the topic of search intent extremely well.

How serious of a problem is keyword cannibalisation?

Much more than you might think — almost every website that I have worked on in the past few years has some degree of cannibalisation that needs resolving. It is hard to estimate how much a single page might be held back by this issue, as it involves a group of pages whose potential is being limited. So, my suggestion is to treat this issue by analysing clusters of pages that have some degree of cannibalisation rather than single pages.

Where is it most common to find cannibalisation problems in SEO?

Normally, you can come across two main placements for cannibalisation:

1) At meta data level:

When two or more pages have meta data (title tags and headings mainly) which target the same or very similar keywords, cannibalisation occurs. This requires a less labour-intensive type of fix, as only meta data needs adjusting.

For example: my e-commerce site has three boots-related pages, which have the following meta data:

  • /boots/all: title tag “Women’s Boots - Ankle & Chelsea Boots | Distilled Shoes”; H1 “Women’s Ankle & Chelsea Boots”
  • /boots/ankle-boots/: title tag “Women’s Ankle Boots | Distilled Shoes”; H1 “Ankle Boots”
  • boots/chelsea-boots/: title tag “Women’s Chelsea Boots | Distilled Shoes”; H1 “Chelsea Boots”

These types of keyword cannibalisation often occur on e-commerce sites which have many category (or subcategory) pages with the intention to target specific keywords, such as the example above. Ideally, we would want a generic boots page to target generic boots-related terms, while the other two pages should focus on the specific types of boots we are selling on those pages: ankle and Chelsea.

Why not try the below instead?

  • /boots/all: new title tag “Women’s Boots - All Types of Winter Boots | Distilled Shoes”; new H1 “Women’s Winter Boots”
  • /boots/ankle-boots/: new title tag “Women’s Ankle Boots | Distilled Shoes”; new H1 “Ankle Boots”
  • boots/chelsea-boots/: new title tag “Women’s Chelsea Boots | Distilled Shoes”; new H1 “Chelsea Boots”


More often than not, we fail to differentiate our e-commerce site’s meta data to target the very specific subgroup of keywords that we should aim for — after all, this is the main point of having so many category pages, no? If you’re interested in the topic, here’s a blog post I wrote on the subject.

The fact that e-commerce pages tend to have very little text on them makes meta data very important, as it will be one of the main elements search engines look at to understand how one page differs from another.
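One way to screen for this kind of metadata-level overlap programmatically is to compare title-tag token sets. A rough sketch; the stopword list and the 0.6 threshold are arbitrary starting points, not recommended values:

```python
# Sketch: flag pairs of pages whose title tags overlap heavily, as a quick
# screen for metadata-level cannibalisation. The stopword list (brand and
# boilerplate terms) and the Jaccard threshold are illustrative only.

from itertools import combinations

STOPWORDS = {"womens", "women's", "|", "-", "&", "distilled", "shoes"}

def tokens(title):
    """Lowercase, split, and drop boilerplate/brand tokens."""
    return {w for w in title.lower().split() if w not in STOPWORDS}

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def find_overlaps(pages, threshold=0.6):
    """pages: {url: title_tag}. Returns [(url1, url2, score)] above threshold."""
    flagged = []
    for (u1, t1), (u2, t2) in combinations(pages.items(), 2):
        score = jaccard(tokens(t1), tokens(t2))
        if score >= threshold:
            flagged.append((u1, u2, round(score, 2)))
    return flagged
```

Run against a crawl export of URL-to-title-tag pairs, this surfaces candidate cannibal pairs for manual review rather than a definitive verdict.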

2) At page content level

When cannibalisation occurs at page content level (meaning two or more pages tend to cover very similar topics in their body content), it normally needs more work than the above example, since it requires the webmaster to first find all the competing pages and then decide on the best approach to tackle the issue.

For example: say my e-commerce has two blog pages which cover the following topics:

  • /blog/how-to-clean-leather-boots/: suggests how to take care of leather boots so they last longer
  • /blog/boots-cleaning-guide-2019/: shows a 121 guide on how to clean different types of boots

These types of keyword cannibalisation typically occur on editorial pages, or on transactional pages with a substantial amount of text.

It is fundamental to clarify something: SEO is often not the main driver when producing editorial content, as different teams are involved in producing content for social and engagement reasons, and rightly so. Especially in larger corporations, it is easy to underestimate how complex it is to find a balance between all departments and how easily things can be missed.

From a pure SEO standpoint, I can assure you that the two pages above are very likely to be subject to cannibalisation. Despite the fact they have different editorial angles, they will probably display some degree of duplicated content between them (more on this later).

In the eyes of a search engine, how different are these two blog posts, both of which aim to address a fairly similar intent? That is the main question you should ask yourself when going through this task. My suggestion is the following: Before investing time and resources into creating new pages, make the effort to review your existing content.

What are the types of cannibalisation in SEO?

Simply put, you could come across two main types:

1) Two or more landing pages on your website that are competing for the same keywords

For instance, it could be the case that, for the keyword "ankle boots", two of my pages are ranking at the same time:

  • Page A /boots/all: title tag “Women’s Boots - Ankle & Chelsea Boots | Distilled Shoes”; ranking at position 8 for “ankle boots”
  • Page B /boots/ankle-boots/: title tag “Women’s Ankle Boots | Distilled Shoes”; ranking at position 5 for “ankle boots”

Is this a real cannibalisation issue? The answer is both yes and no.

If multiple pages are ranking for the same term, it is because a search engine finds elements of both pages that it thinks respond to the query in some way — so technically speaking, they are potential ‘cannibals’.

Does it mean you need to panic and change everything on both pages? Surely not. It very much depends on the scenario and your objective.

Scenario 1

In the instances where both pages have really high rankings on the first page of the SERPs, this could work to your advantage: more space occupied means more traffic for your pages, so treat it as "good" cannibalisation.

If this is the case, I recommend you do the following:

  • Consider changing the meta descriptions to make them more enticing and unique from each other. You do not want both pages to show the same message and fail to impress the user.
  • In case you realise that amongst the two pages, the “secondary/non-intended page” is ranking higher (for example: Page A /boots/all ranks higher than Page B /boots/ankle-boots/ for the term ‘ankle boots’), you should check on Google Search Console (GSC) to see which page is getting the most clicks for that single term. Then, decide if it is worth altering other elements of your SEO to better address that particular keyword.

For instance, what would happen if I removed the term ankle boots from my /boots/all (Page A) title tag and page copy? If Google reacts by favouring my /boots/ankle-boots/ page instead (Page B), which may gain higher positions, then great! If not, the worst case scenario is you can revert the changes back and keep enjoying the two results on page one of the SERP.

  • Page A /boots/all: new title tag “Women’s Boots - Chelsea Boots & many more types | Distilled Shoes”; ranking for “ankle boots”: test and decide

Scenario 2

In the instances where page A has high rankings on page one of the SERPs and page B is nowhere to be seen (beyond the top 15–20 results), it is up to you to decide if this minor cannibalisation is worth your time and resources, as it may not be urgent.

If you decide that it is worth pursuing, I recommend you do the following:

  • Keep monitoring the keywords for which the two pages seem to show, in case Google might react differently in the future.
  • Come back to this minor cannibalisation point after you have addressed your most important issues.

Scenario 3

In the instances where both pages are ranking on page two or three of the SERPs, then it might be the case that your cannibalisation is holding one or both of them back.

If this is the case, I recommend you do the following:

  • Check on GSC to see which of your pages is getting the most clicks for that single keyword. You should also check on similar terms, since keywords on page two or three of the SERP will show very low clicks in GSC. Then, decide which page should be your primary focus — the one that is better suited from a content perspective — and be open to testing changes to the on-page SEO elements of both pages.
  • Review your title tags, headings, and page copies and try to find instances where both pages seem to overlap. If the degree of duplication between them is really high, it might be worth consolidating/canonicalising/redirecting one to the other (I'll touch on this below).
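To make the GSC check above concrete, here's a rough sketch that aggregates an exported (query, page, clicks) report and surfaces keywords where two or more pages are collecting clicks. The field names mirror a typical GSC export and are assumptions, not an official schema:

```python
# Sketch: find cannibalised queries in a Google Search Console export.
# Each row is assumed to be a dict with 'query', 'page', and 'clicks' keys.

from collections import defaultdict

def clicks_by_page(rows):
    """Return {query: {page: total_clicks}} aggregated from export rows."""
    totals = defaultdict(lambda: defaultdict(int))
    for row in rows:
        totals[row["query"]][row["page"]] += row["clicks"]
    return totals

def cannibalised_queries(rows):
    """Queries with 2+ ranking pages, each with its top-clicked page first."""
    report = {}
    for query, pages in clicks_by_page(rows).items():
        if len(pages) > 1:
            report[query] = sorted(pages.items(), key=lambda kv: -kv[1])
    return report
```

The top-clicked page per query is a reasonable default candidate for the "primary focus" page, but the content-suitability judgment above still belongs to a human.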

2) Two or more landing pages on your website that are flip-flopping for the same keyword

It could be the case that, for instance, two of my pages rank for the keyword “ankle boots” at different times, as Google seems to have a difficult time deciding which page to choose for the term.

  • Page A /boots/all: position 6 for “ankle boots” on 1st of January; not ranking on 5th of January
  • Page B /boots/ankle-boots/: not ranking for “ankle boots” on 1st of January; position 8 on 5th of January

This is a common issue that I am sure many of you have encountered, in which landing pages seem to be very volatile and rank for a group of keywords in a non-consistent manner. If this happens to you, try and find an answer to the following questions:

When did this flip-flopping start?

Pinpointing the right moment in time when this all began might help you understand how the problem originated in the first place. Maybe a canonical tag was added or went missing, maybe changes to your on-page elements or an algorithm update mixed things up?

How many pages flip-flop between each other for the same keyword?

The fewer pages subject to volatility, the better and easier to address. Try to identify which pages are involved and inspect all elements that might have triggered this instability.

How often do these pages flip-flop?

Try and find out how often the ranking page for a keyword has changed: the fewer times, the better. Cross reference the time of the changes with your knowledge of the site in case it might have been caused by other adjustments.

If the flip-flop has occurred only once and seems to have stopped, there is probably nothing to worry about, as it's likely a one-off volatility in the SERP. At the end of the day, we need to remember that Google runs tests and makes changes almost every day.

How to identify which pages are victims of cannibalisation

I will explain what tools I normally use to detect major cannibalisation fluxes, but I am sure there are several ways to reach the same results — if you want to share your tips, please do comment below!

Tools to deploy for Type 1 of cannibalisation: When two or more landing pages are competing for the same keyword

I know we all love tools that help you speed up long tasks, and one of my favourites is Ahrefs. I recommend using their fantastic method which will find your ‘cannibals’ in minutes.

Watch their five minute video here to see how to do it.

I am certain SEMrush, SEOMonitor, and other similar tools offer the same ability to retrieve that kind of data, maybe just not as fast as Ahrefs’ method listed above. If you do not have any tools at your disposal, Google Search Console and Google Sheets will be your friends, but it will be more of a manual process.

Tools to deploy for Type 2 of cannibalisation: When two or more landing pages are flip-flopping for the same keyword

Ideally, most rank tracking tools will be able to do this: discover when a keyword has changed ranking URL over time. Back in the day I used tracking tools like Linkdex and Pi Datametrics to do just this.

At Distilled, we use STAT, which displays this data under History, within the main Keyword tab — see screenshot below as example.

One caveat of these kinds of ranking tools is that this data is often accessible only by keyword and will require data analysis. This means it may take a bit of time to check all keywords involved in this cannibalisation, but the insights you'll glean are well worth the effort.
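If your rank tracker only gives you raw per-keyword history, the analysis itself is simple to script. A sketch; the (date, url) history shape is an assumption about your tracker's export, not STAT's actual format:

```python
# Sketch: quantify flip-flopping from daily rank-tracking history.
# history is assumed to be a date-sorted list of (date, ranking_url) tuples,
# with None meaning the keyword did not rank that day.

def flip_flops(history):
    """Return (distinct_ranking_urls, number_of_url_switches)."""
    urls = [u for _, u in history if u is not None]
    switches = sum(1 for a, b in zip(urls, urls[1:]) if a != b)
    return len(set(urls)), switches
```

A keyword with one distinct URL and zero switches is stable; multiple URLs with frequent switches is the flip-flop pattern described above and worth investigating first.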

Google Data Studio Dashboard

If you're looking for a speedier approach, you can build a Google Data Studio dashboard that connects to your GSC to provide data in real time, so you don’t have to check on your reports when you think there is a cannibalisation issue (credit to my colleague Dom).

Our example of a dashboard comprises two tables (see screenshots below):


The table above captures the full list of keyword offenders for the period of time selected. For instance, keyword 'X' at the top of the table has generated 13 organic clicks (total_clicks) from GSC over the period considered and changed ranking URL approximately 24 times (num_of_pages).

The second table (shown above) indicates the individual pages that have ranked for each keyword over the period of time selected. In this particular example, for our keyword X (which, as we know, has changed URLs 24 times in the period of time selected), the column path would show the list of individual URLs that have been flip-flopping.
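The num_of_pages figure in the first table is essentially a count of ranking-URL changes per keyword over time. If you wanted to reproduce it outside Data Studio, the calculation looks something like this (the daily history data below is hypothetical):

```python
def count_url_changes(history):
    """history: list of (date, ranking_url) tuples for one keyword,
    sorted by date. Counts how many times the ranking URL flipped."""
    return sum(
        1 for prev, curr in zip(history, history[1:]) if prev[1] != curr[1]
    )

# Hypothetical daily ranking history for a single keyword
history = [
    ("2019-01-01", "/blog/page-a"),
    ("2019-01-02", "/blog/page-b"),
    ("2019-01-03", "/blog/page-b"),
    ("2019-01-04", "/blog/page-a"),
]

print(count_url_changes(history))  # 2
```

Keywords with a high change count over a short window are your flip-flopping candidates.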

What solutions should I implement to tackle cannibalisation?

It is important to distinguish the different types of cannibalisation you may encounter and try to be flexible with solutions — not every fix will be the same.

I started touching on possible solutions when I was talking about the different types of cannibalisation, but let’s take a more holistic approach and explain what solutions are available.

301 redirection

Ask yourself this question: do I really need all the pages that I found cannibalising each other?

In several instances the answer is no, and if that is the case, 301 redirects are your friends.

For instance, you might have created a new (or very similar) version of the same article your site posted years ago, so you may consider redirecting one of them — generally speaking, the older URL might have more equity in the eyes of search engines and potentially would have attracted some backlinks over time.

Page URL and date of blog post:

  • Page A: blog/how-to-wear-ankle-boots (May 2016)
  • Page B: blog/how-to-wear-ankle-boots-in-2019 (December 2018)

Check if page A has backlinks and, if so, how many keywords it is ranking for (and how well it is ranking for those keywords).

What to do:

  • If page A has enough equity and visibility, do a 301 redirect from page B to page A, change all internal links (coming from the site to page B) to page A, and update the metadata of page A if necessary (for instance, adding the 2019 reference)
  • If not, do the opposite: complete a 301 redirect from page A to page B and change all internal links (coming from the site to page A) to page B.
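The decision between the two bullets above amounts to comparing each page's equity. Here's an illustrative sketch of that comparison in Python; the scoring weights and metric names are assumptions for the example, and in practice you would weight backlinks, rankings, and traffic according to your own priorities:

```python
def choose_redirect_target(page_a, page_b):
    """Return the page that should survive the 301, i.e. the one
    with more 'equity'. The weighting here is an assumption, not a rule."""
    def score(page):
        # Backlinks weighted more heavily than ranking keywords (illustrative)
        return page["backlinks"] * 2 + page["ranking_keywords"]
    return page_a if score(page_a) >= score(page_b) else page_b

# Hypothetical metrics for the two competing pages
page_a = {"url": "blog/how-to-wear-ankle-boots", "backlinks": 12, "ranking_keywords": 40}
page_b = {"url": "blog/how-to-wear-ankle-boots-in-2019", "backlinks": 1, "ranking_keywords": 15}

winner = choose_redirect_target(page_a, page_b)
print(winner["url"])  # blog/how-to-wear-ankle-boots
```

Whichever page loses the comparison gets the 301 pointing at the winner, along with the internal-link updates described above.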
Canonicalisation

In case you do need all the pages that are cannibalising for whatever reason (maybe PPC, social, or testing purposes, or maybe it is just because they are still relevant) then canonical tags are your friends. The main difference with a 301 redirect is that both pages will still exist, while the equity from page A will be transferred to page B.

Let's say you created a new article that covers a similar topic to an existing one (but with a different angle) and you find out that both pages are cannibalising each other. After a quick analysis, you may decide you want page B to be your "primary", so you can use a canonical tag on page A pointing to page B. Canonicalisation is the right choice when the content of the two pages is different enough that users should be able to see both, but not so different that search engines should treat them as separate results.

Page URL and date of blog post:

  • Page A: blog/how-to-wear-ankle-boots-with-skinny-jeans (December 2017)
  • Page B: blog/how-to-wear-high-ankle-boots (January 2019)

What to do:

  • Use a canonical tag from page A to page B. As a reinforcement to Google, you could also use a self-referencing canonical tag on page B.
  • After having assessed accessibility and internal link equity of both pages, you may want to change all/some internal links (coming from the site to page A) to page B if you deem it useful.
Pages re-optimisation

As already touched on, re-optimisation primarily applies to the metadata type of cannibalisation (Type 1 in this article). After identifying the pages whose metadata seems to overlap or target the same or highly similar keywords, you will need to decide which is your primary page for that keyword or keyword group and re-optimise the competing pages.

See the example earlier in the blog post to get a better idea.

Content consolidation

This type of solution involves consolidating a part or the entire content of a page into another. Once that has happened, it is down to you to decide if it is worth keeping the page you have stripped content from or just 301 redirect it to the other.

You would use consolidation as an option if you think the cannibalisation is a result of similar or duplicated content across multiple pages, which is more likely to be Type 2 cannibalisation, as stated earlier. It is essential to establish your primary page first so you can act on the competing internal pages. Content consolidation requires you to move the offending content into your primary page in order to stop the problem and improve your rankings.

For example, you might have created a new article that falls under a certain content theme (in this instance, boot cleaning). You then realise that a paragraph of your new page B touches on leather boots and how to take care of them, which is something you have already covered on page A. If both articles respond to similar intents (one targeting cleaning leather only, the other targeting cleaning boots in general), then it is worth consolidating the offending content from page B into page A and adding an internal link to page A in place of the paragraph that covers leather boots on page B.

Page URL and date of blog post:

  • Page A: blog/how-to-clean-leather-boots (December 2017)
  • Page B: /blog/boots-cleaning-guide-2019/ (January 2019)

What to do:

  • Find the offending part of content on page B, review it and consolidate the most compelling bits to page A
  • Replace the stripped content on page B with a direct internal link pointing to page A
  • Often, after consolidating one page's content into another, there is no longer a purpose for the page the content was stripped from, so it should simply be 301 redirected.
How can I avoid cannibalisation in the first place?

The best way to prevent cannibalisation from happening is a simple yet underrated task: keyword mapping. Implementing a correct mapping strategy for your site is a key part of your SEO, as important as your keyword research.

Carson Ward has written an awesome Moz blog post on the topic; I recommend you have a look.

Don’t take 'intent' for granted

Another way to avoid cannibalisation, and the last tip I want to share with you, involves something most of you are familiar with: search intent.

Most of the time, we take things for granted, assuming Google will behave in a certain way and show certain types of results. What I mean by this is: when you work on your keyword mapping, don't forget to check what kind of results search engines display before assuming a certain outcome. Often, even Google is not sure and will not always get intent right.

For instance, when searching for ‘shoes gift ideas’ and ‘gift ideas for shoe lovers’ I get two very different SERPs despite the fact that my intent is kind of the same: I am looking for ideas for a gift which involves shoes.


Posted by KameronJenkins

When your target is constantly moving, how can you keep your clients informed and happy?

Raise your hand if you’ve ever struggled to keep up with all the changes in our industry.

Go ahead, don’t be shy!

Even the most vigilant SEOs have been caught off guard by an algorithm update, changes to the SERP layout, or improvements to the tools we rely on.

It can be tiring trying to keep up with a constantly moving target, but it doesn’t even stop there. SEOs must also explain those developments to their clients and stakeholders.

Work at an agency? Your clients will want to know that you’re helping them stay relevant. During my agency years, I can’t tell you how many times clients emailed in with a link to an article on the topic of a new development asking, “Do we need to be worried about this? How can we use this for our SEO?” Keeping apprised of these changes and informing your client how it applies to them is a critical component of not just campaign success, but customer loyalty.

Work in-house? The main difference here is that your client is your boss. Whereas at an agency you might lose a client over communication lapses, in-house SEOs could lose their jobs. That’s obviously the worst-case scenario, but if you’re in a budget-conscious, SEO-immature company, failing to stay relevant and communicate those changes effectively could mean your boss stops seeing the value in your position.

Anticipating changes and mitigating anxiety

There are some changes we know about ahead of time.

For example, when Google announced the mobile-friendly update (remember #mobilegeddon?), they did so two months ahead of the actual rollout, and they had also been encouraging the use of mobile-friendly design long before that.

Google announced HTTPS as a ranking signal back in 2014 and had been advocating for a secure web long before that, but they didn’t start adding the “not secure” warning to all non-HTTPS pages in Chrome until July 2018.

Big changes usually warrant big announcements ahead of the rollout. You need time to prepare for changes like this and to use that time to prepare your clients and stakeholders as well. It’s why Moz put so much effort into educational materials around the rollout of the new DA.

But in order to mitigate the anxiety these changes can cause, we have to know about them. So where can we go to stay up-to-date?

If you’ve been in the SEO industry for any length of time, these sources likely won’t be new to you, but they’re some of the best ways to keep yourself informed:

If you know a change like this is coming, be proactive! Inform your clients of what the change is, how it affects them, and what you plan on doing about it.

For example:

Hey [client]! One of the metrics that we include in your reporting, Domain Authority (DA), will be changing next month, so we wanted to let you know what you can expect! Moz is changing how they calculate DA, and as a result, some DA scores may be higher or lower. Rest assured, we’ll be monitoring your DA score to see how it changes in relation to your competitors’ scores. Here are some helpful slides for more information on the update, or feel free to call us and we’ll be happy to walk you through it in more detail.

When you’re able to proactively communicate changes, clients and stakeholders have less cause to worry. They can see that you’re on top of things, and that their campaign is in good hands.

What about the changes you didn’t see coming?

Plenty of changes happen without warning. What are SEOs supposed to do then?

To answer that question, I think we need to back it all the way up to your client’s first day with your agency (or for in-housers, your first few days on the job).

Even with unexpected changes, preventative measures can help SEOs react to these changes in a way that doesn’t compromise the stability of their client or stakeholder relationship.

What are those preventative measures?

  • Give them a brief overview of how search works: Don’t venture too far into the weeds, but a basic overview of how crawling, indexing, and ranking work can help your clients understand the field they’re playing on.
  • Explain the volatile nature of search engines: Google makes changes to their algorithm daily! Not all of those are major, and you don’t want to scare your client into thinking that you’re flying totally blind, but they should at least know that change is a normal part of search.
  • Prepare them for unannounced changes: Let your client know that while there are some changes we can see coming, others roll out with no prior notice. This should prevent any upset caused by seeing changes they weren’t informed about.

By setting the stage with this information at the outset of your relationship, clients and stakeholders are more likely to trust that you’ve got a handle on things when changes do occur. Just make sure that you respond to unexpected changes the same way you would prepare your client for a planned change: tell them what the change was, how it affects them (if at all), and what you’re doing about it (if anything).

Your communication checklist

Whether you’re an SEO at an agency or in-house, you have a lot on your plate. Not only do you have to be a good SEO — you also serve as a sort of professional justifier. In other words, it’s not only about how well you did, but also how well you communicated what you did.

Like I said, it’s a lot. But hopefully I have something that can help.

I put together this list of tips you can use to guide your own client/stakeholder communication strategy. Every one of us is in a unique situation, so choose from the checklist accordingly, but my hope is that you can use this brain dump from my years in an agency and in-house to make the communication side of your job easier.

✓ Set the stage from the beginning

SEO can be a bumpy ride. Lay the foundation for your campaign by making sure your client understands the volatile nature of the industry and how you’ll respond to those changes. Doing so can foster trust and confidence, even amidst change.

✓ Never be defensive

Sometimes, clients will bring something to your attention before you’ve had a chance to see it, whether that be a traffic dip, a Google update, or otherwise. This can prompt a concerned “What’s going on?” or “Why didn’t I know about this?” Don’t try to spin this. Own up to the missed opportunity for communication and proceed to give the client the insight they need.

✓ Be proactive whenever possible

Aim to make missed communication opportunities the exception, not the rule. Being proactive means having your finger always on the pulse and intuitively knowing what needs to be shared with your client before they have to ask.

✓ Acknowledge unexpected changes quickly

If you encounter a change that you weren’t prepared for, let your client know right away — even if the news is negative. There’s always the temptation to avoid this in hopes your client never notices, but it’s much better to acknowledge it than look like you were hiding something or totally out of the loop. Acknowledge the change, explain why it happened, and let your client know what you’re doing about it.

✓ Always bring it back to the “so what?”

For the most part, your clients don’t have time to care about the finer points of SEO. When sharing these updates, don’t spend too long on the “what” before getting to the “how does this impact me?”

✓ Avoid jargon and simplify

SEO has a language all its own, but it’s best to keep that between SEOs and not let it bleed into our client communication. Simplify your language wherever possible. It can even be helpful to use illustrations from everyday life to drive your point home.

✓ Add reminders to reports

Communicate with your clients even when you’re not calling or emailing them! By adding explanations to your clients’ reports, you can assuage the fears that can often result from seeing fluctuations in the data.

✓ Keep updates actionable and relevant

Search changes constantly. That means there’s tons of news you could be sending to your client every day. Do you need to send it all? Not necessarily -- it’s best to keep updates relevant and actionable. Instead of “Hey there was an update [link to explainer post]” it’s much more relevant to say, “Hey, there was an update relevant to your industry and here’s what we’re planning on doing about it.”

✓ Put changes into perspective

As humans, it’s in our nature to make mountains out of molehills. As the SEO manager, you can prepare for these types of overreactions by always being ready to put a change into perspective (ex: “here’s how this does/doesn’t impact your leads and revenue”).

✓ Adapt your communication to your client’s preferences and the nature of the change

We all work with different types of clients and stakeholders. There are the “Can you call me?” clients, the “I have an idea” clients, the clients who never respond… you get the idea. The communication method that’s best for one client might not be well received by another. It’s also important to cater your communication method to the nature of the changes. Was there a big update? A phone call might be best. Small update? An email will probably suffice.

✓ Practice empathy

Above all else, let’s all strive to be more empathetic. Because we know SEO so well, it can be easier for us to take changes in stride, but think about your clients or your boss. SEO might as well be a black box to many business owners, so changes can be even scarier when you don’t know what’s going on and your business is at stake.

Putting it all into practice

If DA is one of your reporting metrics, or something your client/stakeholder pays attention to, then our March 5th update is the perfect opportunity to put all of this into practice.

We have a great DA 2.0 resource center for you so that you can prepare yourself, and those dependent on you, for the change.

Here’s what’s included:

  • An explainer video
  • A Q&A forum
  • A slide deck
  • A white paper

Russ Jones will also be hosting an entire webinar on this topic to help you understand these changes so you can speak intelligently about them to your clients and stakeholders. Join him on Thursday, February 21 at 10am PDT:

Save my spot!

Communicating with clients and stakeholders is a bit of an art form, but with empathy and preparedness, we can tackle any change that’s thrown our way.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


Posted by DiTomaso

Search results are sophisticated enough to show searchers not only the content they want, but in the format they want it. Being able to identify searcher intent and interest based off of ranking results can be a powerful driver of content strategy. In this week's Whiteboard Friday, we warmly welcome Dana DiTomaso as she describes her preferred tools and methods for developing a modern and effective content strategy.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hi, everyone. Welcome to Whiteboard Friday. My name is Dana DiTomaso. I'm President and partner of Kick Point, which is a digital marketing agency based way up in Edmonton, Alberta. Come visit sometime.

What I'm going to be talking about today is using STAT for content strategy. STAT, if you're not familiar with STAT Search Analytics, which is in my opinion the best ranking tool on the market and Moz is not paying me to say that, although they did pay for STAT, so now STAT is part of the Moz family of products. I really like STAT. I've been using it for quite some time. They are also Canadian. That may or may not influence my decision.

But one of the things that STAT does really well is it doesn't just show you where you're ranking, but it breaks down what type of rankings and where you should be thinking about rankings. Typically I find, especially if you've been working in this field for a long time, you might think about rankings and you still have in your mind the 10 blue links that we used to have forever ago, and that's so long gone. One of the things that's useful about using STAT rankings is you can figure out stuff that you should be pursuing other than, say, the written word, and I think that that's something really important again for marketers because a lot of us really enjoy reading stuff.

Consider all the ways searchers like to consume content

Maybe you're watching this video. Maybe you're reading the transcript. You might refer to the transcript later. A lot of us are readers. Not a lot of us are necessarily visual people, so sometimes we can forget stuff like video is really popular, or people really do prefer those places packs or whatever it might be. Thinking outside of yourself and thinking about how Google has decided to set up the search results can help you drive better content to your clients' and your own websites.

The biggest thing that I find that comes of this is you're really thinking about your audience a lot more because you do have to trust that Google maybe knows what it's doing when it presents certain types of results to people. It knows the intent of the keyword, and therefore it's presenting results that make sense for that intent. We can argue all day about whether or not answer boxes are awesome or terrible.

But from a visitor's perspective and a searcher's perspective, they like them. I think we need to just make sure that we're understanding where they might be showing up, and if we're playing by Google rules, people also ask is not necessarily going anywhere.

All that being said, how can we use ranking results to figure out our content strategy? The first thing about STAT, if you haven't used STAT before, again check it out, it's awesome.

Grouping keywords with Data Views

But one of the things that's really nice is you can do this thing called data views. In data views, you can group together parts of keywords. So you can do something called smart tags and say, "I want to tag everything that has a specific location name together."

Opportunities — where are you not showing up?

Let's say, for example, that you're working with a moving company and they are across Canada. So what I want to see here for opportunities are things like where I'm not ranking, where are there places boxes showing up that I am not in, or where are the people also ask boxes showing up that I am not involved in. This is a nice way to keep an eye on your competitors.

Locations

Then we'll also do locations. So we'll say everything in Vancouver, group this together. Everything in Winnipeg, group this together. Everything in Edmonton and Calgary and Toronto, group all that stuff together.

Attributes (best, good, top, free, etc.)

Then the third thing can be attributes. This is stuff like best, good, top, free, cheap, all those different things that people use to describe your product, because those are definitely intent keywords, and often they will drive very different types of results than things you might consider as your head phrases.

So, for example, looking at "movers in Calgary" will drive a very different result than "top movers in Calgary." In that case, you might get, say, a Yelp top 10 list. Or if you're looking for "cheapest mover in Calgary," again a different type of search result. So by grouping your keywords together by attributes, that can really help you determine how those types of keywords can be influenced by the type of search results that Google is putting out there.

Products / services

Then the last thing is products/services. So we'll take each product and service and group it together. One of the nice things about STAT is you can do something called smart tags. So we can, say, figure out every keyword that has the word "best" in it and put it together. Then if we ever add more keywords later that also have the word "best," they automatically go into that keyword group. It's really useful, especially if you are adding lots of keywords over time. I recommend starting by setting up some views that make sense.

You can just import everything your client is ranking for, and you can just take a look at the view of all these different keywords. But the problem is that there's so much data, when you're looking at that big set of keywords, that a lot of the useful stuff can really get lost in the noise. By segmenting it down to a really small level, you can start to understand that search for that specific type of term and how you fit in versus your competition.
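The smart-tag grouping described above (locations, attributes, and so on) boils down to matching words in each keyword against tag vocabularies. A rough Python equivalent of the idea (the vocabularies here are examples, not STAT's actual implementation):

```python
# Example tag vocabularies -- extend these with your own terms
LOCATIONS = {"vancouver", "winnipeg", "edmonton", "calgary", "toronto"}
ATTRIBUTES = {"best", "good", "top", "free", "cheap", "cheapest"}

def tag_keyword(keyword):
    """Assign smart-tag-style labels to a keyword based on the
    words it contains."""
    words = set(keyword.lower().split())
    tags = []
    if words & LOCATIONS:
        tags.append("location")
    if words & ATTRIBUTES:
        tags.append("attribute")
    return tags

print(tag_keyword("top movers in Calgary"))  # ['location', 'attribute']
print(tag_keyword("movers in Calgary"))      # ['location']
```

Because the tagging is rule-based, new keywords added later fall into the right groups automatically, which is exactly the benefit Dana describes.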

A deep dive into SERP features

So put that stuff into STAT, give it a little while, let it collect some data, and then you get into the good stuff, which is the SERP features. I'm covering just a tiny little bit of what STAT does. Again, they didn't pay me for this. But there's lots of other stuff that goes on in here. My personal favorite part is the SERP features.

Which features are increasing/decreasing both overall and for you?

So what I like here is that in SERP features it will tell you which features are increasing and decreasing overall and then what features are increasing and decreasing for you.

This is actually from a real set for one of our clients. For them, what they're seeing are big increases in places version 3, which is the three pack of places. Twitter box is increasing. I did not see that coming. Then AMP is increasing. So that says to me, okay, so I need to make sure that I'm thinking about places, and maybe this is a client who doesn't necessarily have a lot of local offices.

Maybe it's not someone you would think of as a local client. So why are there a lot more local properties popping up? Then you can dive in and say, "Okay, only show me the keywords that have places boxes." Then you can look at that and decide: Is it something where we haven't thought about local SEO before, but it's something where searchers are thinking about local SEO? So Google is giving them three pack local boxes, and maybe we should start thinking about can we rank in that box, or is that something we care about.

Again, not necessarily content strategy, but certainly your SEO strategy. The next thing is Twitter box, and this is something where you think Twitter is dead. No one is using Twitter. It's full of terrible people, and they tweet about politics all day. I never want to use it again, except maybe Google really wants to show more Twitter boxes. So again, looking at it and saying, "Is Twitter something where we need to start thinking about it from a content perspective? Do we need to start focusing our energies on Twitter?"

Maybe you abandoned it and now it's back. You have to start thinking, "Does this matter for the keywords?" Then AMP. So this is something where AMP is really tricky obviously. There have been studies where it said, "I implemented AMP, and I lost 70% of my traffic and everything was terrible." But if that's the case, why would we necessarily be seeing more AMP show up in search results if it isn't actually something that people find useful, particularly on mobile search?

Desktop vs mobile

One of the things actually that I didn't mention in the tagging is definitely look at desktop versus mobile, because you are going to see really different feature sets between desktop and mobile for these different types of keywords. Mobile may have a completely different intent for a type of search. If you're a restaurant, for example, people looking for reservations on a desktop might have different intent from I want a restaurant right now on mobile, for example, and you're standing next to it and maybe you're lost.

What kind of intent is behind the search results?

You really have to think about what that intent means for the type of search results that Google is going to present. So for AMP, then you have to look at it and say, "Well, is this newsworthy? Why is more AMP being shown?" Should we consider moving our news or blog or whatever you happen to call it into AMP so that we can start to show up for these search results in mobile? Is that a thing that Google is presenting now?

We can get mad about AMP all day, but how about instead if we actually be there? I don't want the comment section to turn into a whole AMP discussion, but I know there are obviously problems with AMP. But if it's being shown in the search results that searchers who should be finding you are seeing and you're not there, that's definitely something you need to think about for your content strategy and thinking, "Is AMP something that we need to pursue? Do we have to have more newsy content versus evergreen content?"

Build your content strategy around what searchers are looking for

Maybe your content strategy is really focused on posts that could be relevant for years, when in reality your searchers are looking for stuff that's relevant for them right now. So for example, things with movers, there's some sort of mover scandal. There's always a mover who ended up taking someone's stuff and locking it up forever, and they never gave it back to them. There's always a story like that in the news.

Maybe that's why it's AMP. Definitely investigate before you start to say, "AMP everything." Maybe it was just like a really bad day for movers, for example. Then you can see the decreases. So the decrease here is organic, which is that traditional 10 blue links. So obviously this new stuff that's coming in, like AMP, like Twitter, like places is displacing a lot of the organic results that used to be there before.

So instead you think, well, I can do organic all day, but if the results just aren't there, then I could be limiting the amount of traffic I could be getting to my website. Videos, for example, now it was really interesting for this particular client that videos is a decreasing SERP for them, because videos is actually a big part of their content strategy. So if we see that videos are decreasing, then we can take a step back and say, "Is it decreasing in the keywords that we care about? Why is it decreasing? Do we think this is a test or a longer-term trend?"

Historical data

What's nice about STAT is you can say "I want to see results for the last 7 days, 30 days, or 60 days." Once you get a year of data in there, you can look at the whole year and look at that trend and see is it something where we have to maybe rethink our video strategy? Maybe people don't like video for these phrases. Again, you could say, "But people do like video for these phrases." But Google, again, has access to more data than you do.

If Google has decided that for these search phrases video is not a thing they want to show anymore, then maybe people don't care about video the way that you thought they did. Sorry. So that could be something where you're thinking, well, maybe we need to change the type of content we create. Then the last one is carousel that showed up for this particular client. Carousel, there are ones where they show lots of different results.

I'm glad that's dropping because that actually kind of sucks. It's really hard to show up well there. So I think that's something to think about in the carousel as well. Maybe we're pleased that that's going away and then we don't have to fight it as much anymore. Then what you can see in the bottom half are what we call share of voice.

Share of voice

Share of voice is calculated based on your ranking and all of your competitors' ranking and the number of clicks that you're expected to get based on your ranking position.

So the number 1 position obviously gets more clicks than the number 100 position. So the share of voice is a percentage calculated based on how many of these types of items, types of SERP features that you own versus your competitors, as well as your position in these SERP features. So what I'm looking at here is share of voice and looking at organic, places, answers, and people also ask, for example.
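In other words, share of voice is a click-weighted sum over ranking positions. Here's a simplified version of that calculation; the CTR curve below is an assumption for illustration, not STAT's actual model:

```python
# Assumed click-through rates by ranking position (illustrative only)
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def share_of_voice(rankings, domain):
    """rankings: {keyword: {domain: position}}. Returns the percentage
    of expected clicks owned by `domain` across all keywords,
    weighted by ranking position."""
    ours = total = 0.0
    for positions in rankings.values():
        for d, pos in positions.items():
            weight = CTR_BY_POSITION.get(pos, 0.01)  # long tail gets a tiny weight
            total += weight
            if d == domain:
                ours += weight
    return 100 * ours / total if total else 0.0

# Hypothetical single-keyword example: we rank #1, a competitor ranks #2
rankings = {"movers calgary": {"example.com": 1, "competitor.com": 2}}
print(round(share_of_voice(rankings, "example.com"), 1))  # 66.7
```

The green/black split Dana describes is just this percentage versus everyone else's combined share.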

So what STAT will show you is the percentage of organic, and it's still, for this client — and obviously this is not an accurate chart, but this is vaguely accurate to what I saw in STAT — organic is still a big, beefy part of this client's search results. So let's not panic that it's decreasing. This is really where this context can come in. But then you can think, all right, so we know that we are doing "eeh" on organic.

Is it something where we think that we can gain more? So the green shows you your percentage that you own of this, and then the black is everyone else. Thinking realistically, you obviously cannot own 100% of all the search results all the time because Google wouldn't allow that. So instead thinking, what's a realistic thing? Are we topping out at the point now where we're going to have diminishing returns if we keep pushing on this?

Identify whether your content efforts support what you're seeing in STAT

Are we happy with how we're doing here? Maybe we need to turn our attention to something else, like answers for example. This particular client does really well on places. They own a lot of it. So for places, it's maintain, watch, don't worry about it that much anymore. Then that can drop off when we're thinking about content. We don't necessarily need to keep writing blog posts for things that are going to help us rank in the places pack, because it's not something that's going to influence that ranking any further.

We're already doing really well. But instead we can look at answers and people also ask, which for this particular client aren't doing that well. It is something that's there, and while it may not be one of the top increases, it's certainly an increase for this particular client. So what we're looking at is saying, "Well, you have all these great blog posts, but they're not really written with people also ask or answers in mind. So how about we go back and rewrite the stuff so that we can get more of these answer boxes?"

That can be the foundation of that content strategy. When you put your keywords into STAT and look at your specific keyword set, really look at the SERP features and determine what does this mean for me and the type of content I need to create, whether it's more images for example. Some clients, when you're looking at e-commerce sites, some of the results are really image heavy, or they can be product shopping or whatever it might be.

There are really specific different features, and I've only shown a tiny subset. STAT captures all of the different types of SERP features. So you can definitely look at anything if it's specific to your industry. If it's a feature, they've got it in here. So definitely take a look and see where are these opportunities. Remember, you can't have a 100% share of voice because other people are just going to show up there.

You just want to make sure that you're better than everybody else. Thanks.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


Posted by TheMozTeam

Google let it be known earlier this year that snippets were a-changin’. And true to their word, we’ve seen them make two major updates to the feature — all in an attempt to answer more of your questions.

We first took you on a deep dive of double featured snippets, and now we’re taking you for a ride on the carousel snippet. We’ll explore how it behaves in the wild and which of its snippets you can win.

For your safety, please remain seated and keep your hands, arms, feet, and legs inside the vehicle at all times!

What a carousel snippet is and how it works

This particular snippet holds the answers to many different questions and, as the name suggests, employs carousel-like behaviour in order to surface them all.

When you click one of the “IQ-bubbles” that run along the bottom of the snippet, JavaScript takes over and replaces the initial “parent” snippet with one that answers a brand new query. This query is a combination of your original search term and the text of the IQ-bubble.

So, if you searched [savings account rates] and clicked the “capital one” IQ-bubble, you’d be looking at a snippet for “savings account rates capital one.” That said, 72.06 percent of the time, natural language processing will step in here and produce something more sensible, like “capital one savings account rates.”

On the new snippet, the IQ-bubbles sit at the top, making room for the “Search for” link at the bottom. The link is the bubble snippet’s query and, when clicked, becomes the search query of a whole new SERP — a bit of fun borrowed from the “People also ask” box.

You can blame the ludicrous "IQ-bubble" name on Google — it's the class tag they gave it in the SERP HTML. We have heard them referred to as "refinement" bubbles or "related search" bubbles, but we don't like either because we've seen them both refine and relate. IQ-bubble it is.

There are now 6 times the number of snippets on a SERP

Back in April, we sifted through every SERP in STAT to see just how large the initial carousel rollout was. Turns out, it made a decent-sized first impression.

Appearing only in America, we discovered 40,977 desktop and mobile SERPs with carousel snippets, which makes up a hair over 9 percent of the US-en market. When we peeked again at the beginning of August, carousel snippets had grown by half but still had yet to reach non-US markets.

Since one IQ-bubble equals one snippet, we deemed it essential to count every single bubble we saw. All told, there were a dizzying 224,508 IQ-bubbles on our SERPs. This means that 41,000 keywords managed to produce over 220,000 extra featured snippets. We’ll give you a minute to pick your jaw up off the floor.

The lowest and most common number of bubbles we saw on a carousel snippet was three, and the highest was 10. The average number of bubbles per carousel snippet was 5.48 — an IQ of five if you round to the nearest whole bubble (they’re not that smart).
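The arithmetic behind that average is simple enough to reproduce, using the counts quoted above:

```python
# Reproducing the bubble arithmetic quoted above.
serps_with_carousels = 40977   # US desktop + mobile SERPs with a carousel snippet
total_bubbles = 224508         # total IQ-bubbles counted across those SERPs

avg_bubbles = total_bubbles / serps_with_carousels
print(round(avg_bubbles, 2))   # 5.48 bubbles per carousel snippet, on average
```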

Depending on whether you’re a glass-half-full or a glass-half-empty kind of person, this either makes for a lot of opportunity or a lot of competition, right at the top of the SERP.

Most bubble-snippet URLs are nowhere else on the SERP

When we’ve looked at “normal” snippets in the past, we’ve always been able to find the organic results that they’ve been sourced from. This wasn’t the case with carousel snippets — we could only find 10.76 percent of IQ-bubble URLs on the 100-result SERP. This left 89.24 percent unaccounted for, which is a metric heck-tonne of new results to contend with.

Concerned about the potential competitor implications of this, we decided to take a gander at ownership at the domain level.

Turns out things weren’t so bad. 63.05 percent of bubble snippets had come from sites that were already competing on the SERP — Google was just serving more varied content from them. It does mean, though, that there was a brand new competitor jumping onto the SERP 36.95 percent of the time. Which isn’t great.

Just remember: these new pages or competitors aren’t there to answer the original search query. Sometimes you’ll be able to expand your content in order to tackle those new topics and snag a bubble snippet, and sometimes they’ll be beyond your reach.

So, when IQ-bubble snippets do bother to source from the same SERP, what ranks do they prefer? Here we saw another big departure from what we’re used to.

Normally, 97.88 percent of snippets source from the first page, and 29.90 percent typically pull from rank three alone. With bubble snippets, only 36.58 percent of their URLs came from the top 10 ranks. And while the most popular rank position that bubble snippets pulled from was on the first page (also rank three), just under five percent of them did this.

We could apply the always helpful “just rank higher” rule here, but there appears to be plenty of exceptions to it. A top 10 spot just isn’t as essential to landing a bubble snippet as it is for a regular snippet.

We think this is due to relevancy: Because bubble snippet queries only relate to the original search term — they’re not attempting to answer it directly — it makes sense that their organic URLs wouldn’t rank particularly high on the SERP.

Multi-answer ownership is possible

Next we asked ourselves, can you own more than one answer on a carousel snippet? And the answer was a resounding: you most definitely can.

First we discovered that you can own both the parent snippet and a bubble snippet. We saw this occur on 16.71 percent of our carousel snippets.

Then we found that owning multiple bubbles is also a thing that can happen. Just over half (57.37 percent) of our carousel snippets had two or more IQ-bubbles that sourced from the same domain. And as many as 2.62 percent had a domain that owned every bubble present — and most of those were 10-bubble snippets!

Folks, it’s even possible for a single URL to own more than one IQ-bubble snippet, and it’s less rare than we’d have thought — 4.74 percent of bubble snippets in a carousel share a URL with a neighboring bubble.

This begs the same obvious question that finding two snippets on the SERP did: Is your content ready to pull multi-snippet duty?

"Search for" links don't tend to surface the same snippet on the new SERP

Since bubble snippets are technically providing answers to questions different from the original search term, we looked into what shows up when the bubble query is the keyword being searched.

Specifically, we wanted to see if, when we click the “Search for” link in a bubble snippet, the subsequent SERP 1) had a featured snippet and 2) had a featured snippet that matched the bubble snippet from whence it came.

To do this, we re-tracked our 40,977 SERPs and then tracked their 224,508 bubble “Search for” terms to ensure everything was happening at the same time.

The answers to our two pressing questions were thus:

  1. Strange, but true, even though the bubble query was snippet-worthy on the first, related SERP, it wasn’t always snippet-worthy on its own SERP. 18.72 percent of “Search for” links didn’t produce a featured snippet on the new SERP.
  2. Stranger still, 78.11 percent of the time, the bubble snippet and its snippet on the subsequent SERP weren’t a match — Google surfaced two different answers for the same question. In fact, the bubble URL only showed up in the top 20 results on the new SERP 31.68 percent of the time.

If we’re being honest, we’re not exactly sure what to make of all this. If you own the bubble snippet but not the snippet on the subsequent SERP, you’re clearly on Google’s radar for that keyword — but does that mean you’re next in line for full snippet status?

And if the roles are reversed, you own the snippet for the keyword outright but not when it’s in a bubble, is your snippet in jeopardy? Let us know what you think!

Paragraph and list formatting reign supreme (still!)

Last, and somewhat least, we took a look at the shape all these snippets were turning up in.

When it comes to the parent snippet, Heavens to Betsy if we weren’t surprised. For the first time ever, we saw an almost even split between paragraph and list formatting. Bubble snippets, on the other hand, went on to match the trend we’re used to seeing in regular ol’ snippets:


Posted by rjonesx.

Howdy Moz readers,

I'm Russ Jones, Principal Search Scientist at Moz, and I am excited to announce a fantastic upgrade coming next month to one of the most important metrics Moz offers: Domain Authority.

Domain Authority has become the industry standard for measuring the strength of a domain relative to ranking. We recognize that stability plays an important role in making Domain Authority valuable to our customers, so we wanted to make sure that the new Domain Authority brought meaningful changes to the table.

Learn more about the new DA

What’s changing?

What follows is an account of some of the technical changes behind the new Domain Authority and why they matter.

The training set:

Historically, we’ve relied on training Domain Authority against an unmanipulated, large set of search results. In fact, this has been the standard methodology across our industry. But we have found a way to improve upon it fundamentally, from the ground up, in a way that makes Domain Authority more reliable. In particular, the new Domain Authority is better at understanding sites which don't rank for any keywords at all than it has been in the past.

The training algorithm:

Rather than relying on a complex linear model, we’ve made the switch to a neural network. This offers several benefits including a much more nuanced model which can detect link manipulation.

The model factors:

We have greatly improved upon the ranking factors behind Domain Authority. In addition to looking at link counts, we’ve now been able to integrate our proprietary Spam Score and complex distributions of links based on quality and traffic, along with a bevy of other factors.

The backbone:

At the heart of Domain Authority is the industry's leading link index, our new Moz Link Explorer. With over 35 trillion links, our exceptional data turns the brilliant statistical work by Neil Martinsen-Burrell, Chas Williams, and so many more amazing Mozzers into a true industry-leading standard.

What does this mean?

These fundamental improvements to Domain Authority will deliver a better, more trustworthy metric than ever before. We can remove spam, improve correlations, and, most importantly, update Domain Authority relative to all the changes that Google makes.

It means that you will see some changes to Domain Authority when the launch occurs. We staked the model to our existing Domain Authority, which minimizes changes, but with all the improvements there will no doubt be some fluctuation in Domain Authority scores across the board.

What should we do?

Use DA as a relative metric, not an absolute one.

First, make sure that you use Domain Authority as a relative metric. Domain Authority is meaningless when it isn't compared to other sites. What matters isn't whether your site drops or increases — it's whether it drops or increases relative to your competitors. When we roll out the new Domain Authority, make sure you check your competitors' scores as well as your own, as they will likely fluctuate in a similar direction.
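A toy illustration of the point: what matters is the gap between you and your competitors, not the absolute number. All domains and scores below are invented.

```python
# Toy illustration: DA only means something relative to competitors.
# All domains and scores here are invented.
before = {"yoursite.com": 45, "rival.com": 50}
after  = {"yoursite.com": 42, "rival.com": 44}

def gap(scores):
    """Your DA minus the rival's DA; negative means they lead."""
    return scores["yoursite.com"] - scores["rival.com"]

# Your absolute score dropped by 3, but the gap to the rival
# narrowed from -5 to -2: relatively speaking, you improved.
print(gap(before), gap(after))  # -5 -2
```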

Know how to communicate changes with clients, colleagues, and stakeholders

Second, be prepared to communicate with your clients or webmasters about the changes and improvements to Domain Authority. While change is always disruptive, the new Domain Authority is better than ever and will allow them to make smarter decisions about search engine optimization strategies going forward.

Expect DA to keep pace with Google

Finally, expect that we will be continuing to improve Domain Authority. Just like Google makes hundreds of changes to their algorithm every year, we intend to make Domain Authority much more responsive to Google's changes. Even when Google makes fundamental algorithm updates like Penguin or Panda, you can feel confident that Moz's Domain Authority will be as relevant and useful as ever.

When is it happening?

We plan on rolling out the new Domain Authority on March 5th, 2019. We will have several more communications between now and then to help you and your clients best respond to the new Domain Authority, including a webinar on February 21st. We hope you’re as excited as we are and look forward to continuing to bring you the most reliable, cutting-edge metrics our industry has to offer.

Be sure to check out the resources we’ve prepared to help you acclimate to the change, including an educational whitepaper and a presentation you can download to share with your clients, team, and stakeholders:

Explore more resources here

