
Last updated on May 23, 2019 at 06:05 pm

Having the right tool is vital if your SEO strategy is going to dominate your niche. You can try to gain traffic and climb the search rankings without one, but much of that effort will go to waste. Ahrefs has proven its name in the industry, as evidenced by its place in most webmasters' vocabulary, and the crowd favorite keeps evolving, as you can see in its recent update to the Internal Backlink Audit.

Internal linking is one of the tasks most webmasters neglect to check, and that neglect can very well hurt your position in the SERPs. Google has said time and again that links are among the most important ingredients for ranking, which is why the right linking strategy includes an internal backlink audit.

How to Audit Backlinks Using Ahrefs

The advantage of an efficient tool such as Ahrefs is convenience. Since Google crawls websites through links, both internal and external, that alone should be reason enough to organize your internal linking. If you don't know it by now, internally linking to your content provides a road map and a hierarchy for it.

Ultimately, you lead users to your most important content while providing them with the best set of information. You will also spot broken or bad links that return a 404 error, and you can identify internal links that are not helping their parent pages gain momentum in rankings and traffic.

An overview of the external backlink profile has always been part of the Ahrefs skeleton. Now, internal backlinks can be checked with the tool as well. Under the Site Explorer tab in the Ahrefs toolbar, start by entering your URL and then clicking the Internal Backlinks option in the sidebar.

As you can see, your anchor link and all of your internal backlinks are grouped accordingly. If you spot a related page that you have failed to link to, start internally linking to it.

Another important feature of the Ahrefs internal backlinks option is its ability to let you categorize your link data according to link type. This further sharpens your linking audit since you can decide where to focus. It also makes on-site link optimization easier, since you can check whether you are redirecting to a page properly and whether you are using dofollow/nofollow tags correctly.

Aside from URL Rating, Referring Domains, External Links, Total Search Traffic, and the keywords a page is ranking for, you can also see how your content is grouped according to the message you want the user to receive. The new Ahrefs feature does this by showing the text snippets surrounding each link, so you can see how the linking page points back to your anchor. This will help you understand how to group your content accordingly.

Importance of Internal Backlink Audit for SEO

Internal linking should follow a structure, and the Ahrefs tool will help you build one so that your site delivers a smoother user experience and a solid SEO profile. Organizing your web pages according to their keywords or their purpose on the site is one of the best things you can do to dominate your niche. Let's face it: you can come up with thousands of strategies, but if you fail at the simple act of an internal link audit, all of that effort can come to nothing.

You will also miss out on the benefits a backlink audit brings to your SEO. These advantages include, but are not limited to:

  1. Utilizing anchor links to better aid user intent.
  2. User navigation becomes smoother because users would not run the risk of encountering a dead link or page.
  3. Content is displayed in a series, giving the signal that your site has a well-optimized structure.
  4. Highlights new links that point to relevant content.
  5. Link juice is passed between your web pages.

Internal linking passes on your site's true value. Beyond running a site crawl to collect the links that no longer benefit their pages, it is important to see how your pages relate to one another. When you do, content is delivered and used according to its purpose.

Using Internal Links for Your Content Marketing Strategy

Knowing all this, how are you going to use Ahrefs to your advantage? Linking deep into your site is one of the best ways to build your link architecture, so why not make a content marketing strategy out of it? You may have heard about topic clusters, and this is where the Internal Backlink Audit comes into play.

Whether you plan on repurposing content or creating new pieces, you can use this Ahrefs tool to do so. As you may well know, stuffing your internal pages with tons of links will hurt your SEO, so segment that content on your blog page or preferred content navigation page.

The common model below will give you a clear idea of how to use internal linking as a content strategy. It may seem obvious, but many webmasters tend to neglect this part of their sites.

Think about this: if your content is woven together to form a large body of information on your site, users will have something to look forward to, and you get more traffic and a better opportunity to rank in the SERPs.

Once you see that internal linking makes sense, you will have more cause to debunk the common misconception in SEO that internal links carry little weight compared to external backlinks. Internal linking improves link flow to individual pages, which helps them rank better. And since content is king, target this part of your site and watch the results improve.

Key Takeaway

Every website has internal links, but not all webmasters know how to use them. Internal links are silent players when it comes to boosting organic traffic, but once you are mindful enough to audit them, the results will be worth it. Keep in mind that a link audit is an important part of your regular on-site optimization efforts, and step up your game by using it to build a great content structure. What are your best link auditing techniques? Comment down below!


Last updated on May 21, 2019 at 05:43 pm

Maintaining a website and doing SEO means putting out content regularly. Whether it's an e-commerce site with thousands of products or a services site that publishes blog posts regularly, the number of pages on a website will inevitably grow over time.

Whether a website has 100 pages or 10,000 pages, internal linking is a crucial on-page SEO strategy. Linking one page to another helps visitors navigate through your website. Aside from that, it helps search engine bots crawl your website: the more a page is internally linked, the easier it is for a bot to reach, and the more frequently it gets crawled.

That means important pages on your website, such as landing pages, product categories, services, and blog posts, should be linked to frequently from one another.

However, in some cases pages are left out of this ecosystem. These are called orphaned pages.

What are Orphaned Pages?

Orphaned pages are pages of a website that are not internally linked, meaning they have zero links from other pages of your website. This makes it difficult for search engine bots to crawl and index them.

Orphaned pages occur for different reasons: old blog posts, products that are no longer sold, or services that are no longer offered. While some pages are orphaned on purpose, such as testing pages and tag pages, it is critical to check whether any orphaned pages are still relevant to your users.

Does it Affect My SEO?

The answer is both yes and no. The effect of orphaned pages on a website's rankings depends on how you look at it. If an orphaned page was created to be shown to users and has content that matters to them, it hurts your SEO: crawlers may never reach the page, so it won't appear in the search results and users won't find it.

However, if an orphaned page was created for purposes not related to users, such as testing functionality or a new website design, then you can leave it as it is.

How to Find Orphaned Pages using Screaming Frog

To find orphaned pages using Screaming Frog, you have to first make sure that your Google Analytics and Google Search Console accounts are connected.

To do that, under Configuration, scroll down to API access and connect Google Analytics and Google Search Console.

Once you have them connected, make sure that under the General tab of the API window, you select Crawl New URLs Discovered in Google Analytics.

After connecting your GA and GSC accounts, under Configuration, go to Spider, and check Crawl Linked XML Sitemaps. Then check the option Crawl These Sitemaps: and input the URL of your website’s sitemap.

After setting everything up, you can now start crawling your website. Once the crawl is finished, go to Crawl Analysis, click Configure, check the box beside Sitemaps, then start the analysis. It will analyze the crawl data and allow you to see the orphaned pages.

After the analysis, in the Overview under Sitemaps, you can now see all orphaned pages that were crawled by Screaming Frog.

How to Find Orphaned Pages using SEMRush

You could also find orphaned pages by setting up Site Audit in SEMRush. If you don’t have a website set up, create a new project first and let SEMRush crawl your website.

Once the project setup is complete, go to the Site Audit of your website, then go to Issues. Under the Notices tab, check whether the orphaned pages report is enabled.

If it hasn’t been enabled yet, connect your Google Analytics account in the Site Audit Settings. The process is similar to Screaming Frog. It will prompt you to log in with your Google Account, select the Profile, Property, and View of your selected Website and click Save.

Once you complete the setup, SEMRush will automatically collect data from Google Analytics. Unlike Screaming Frog, you don’t have to connect Google Search Console to get orphaned pages data in SEMRush.

After a few minutes, refresh your browser and check the Issues tab again. Click the dropdown menu Select an Issue and you will find Orphaned Pages (Google Analytics) under Notices.

Optimize or Scrap?

Once you have collected all the orphaned pages, it is up to you what to do with them. You could list them in a Google Sheet.

  • If a page is still relevant, label it as 'optimize' and find possible pages to link to it from.
  • If a page used to be relevant but no longer is, such as an old product or service, you can delete it and leave it as a 404. There is no need to redirect it if it doesn't carry any link value.
  • If a page is purposely left out, you can leave it as it is.

Here’s a sample template that you could use:
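As a rough sketch (the columns and example URLs here are just placeholders, not a prescribed format), the sheet could be as simple as:

URL | Still relevant? | Action (Optimize / Delete / Leave) | Pages to link from
/blog/old-but-useful-guide/ | Yes | Optimize | /blog/, /services/seo/
/products/discontinued-item/ | No | Delete (404) | n/a
/test-landing-page/ | Intentionally orphaned | Leave | n/a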

Key Takeaway

While orphaned pages can be harmless to your website's overall rankings and SEO value, they become a critical issue when important pages are left out. Include monitoring of orphaned pages in your regular website maintenance audit, and make sure your website has a healthy site structure and a good flow of link juice by internally linking pages to each other.


Last updated on May 16, 2019 at 05:09 pm

Website optimization is made up of different facets that each need to be optimized for your site to climb toward the top spot on the first page of the search results: onsite optimization covers the factors inside your website that need to be checked and optimized, offsite optimization covers the links that connect other websites to yours, and technical optimization covers everything technical (code and other factors that require IT expertise). All of these are important for your rankings and should not be disregarded. The challenge is to find the pain points across these facets and fix them.

A website is a delicate object that needs constant maintenance and care from webmasters and SEOs. Our job is to create the most optimized site that contains useful, authoritative, and high-quality content that is able to assist users in their search queries and help them find what they’re looking for. So, how do we do that? We audit the site to find the broken facets and fix them accordingly. Here’s how:

Check your Website Traffic

Traffic is a consequence of your SEO efforts. If you improve your search visibility for a high-volume keyword, you can be almost sure that your site's traffic count will also increase. However, a drop in otherwise steady traffic does not always mean that your search visibility dropped as well. Take a look at this example:

We do a weekly check of our traffic count, and once we saw the sudden drop, we knew something was wrong. The strange part was that we hadn't changed anything; I had just published a new post and the traffic suddenly fell. I won't go into how we investigated and fixed the cause of the drop, but it goes to show how important it is to check your traffic regularly in Google Analytics. If we hadn't done the regular checks, our traffic might have stayed that way until it became a crisis.

Make it a habit to check your traffic count regularly so that you stay on top of everything. I recommend doing this twice a week; four times a week would be even better. This is an important foundation of your site audit, and checking your traffic can never be disregarded. After checking your traffic, the next step is to:

Check your Google Search Console Coverage Report

Google Search Console is probably the most crucial tool for SEOs, and learning how to use it to its full extent is a must. As an SEO professional, you need the pages you want Google to index to show up in the search results, while the pages you don't want shown stay out of the SERPs.

Google Search Console's Coverage report is the best way to see how Google views your pages and to monitor which pages on your website are being indexed. You can also see crawl errors so you can fix them immediately.

Check your Submitted Sitemaps

Since your sitemap contains all the URLs you want Google to crawl, you should make sure that all of your sitemaps are submitted to and crawled by Google. You can submit multiple sitemaps in Search Console, but that decision should be based on the size of your website.

If you have fewer than a thousand pages, it is better to have just one sitemap. If your website is, say, a travel booking site with 50,000 pages or more, you can divide your sitemap into multiple sitemaps to better organize your URLs.

Take note that having more than one sitemap does not change crawling priorities. It is simply a good way of structuring your website and telling crawl bots which parts of your website are important.
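If you do split your sitemap, the usual approach is a sitemap index file that points to the individual sitemaps (the URLs below are placeholders); you can then submit the index file itself in Search Console:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.yourdomain.com/sitemap-posts.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.yourdomain.com/sitemap-products.xml</loc>
  </sitemap>
</sitemapindex>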

Submitted and Indexed

Under this report, you should see all the URLs that are in your sitemap. If you see URLs here that should not be indexed, such as testing pages or images, place a noindex tag on them to tell crawl bots that they should not be shown in the search results.

Indexed, Not Submitted in Sitemap

Most of the time, this report shows pages that you don't purposely want users to see in the search results. Even though you submitted a sitemap, Google can still crawl and index links that are not in your sitemap.

As much as possible, the number of pages indexed in this report should be kept to a minimum. If there are indexed pages here that you don't want users to see, place a noindex tag on them. If you see important pages here, it would be much better to add them to your sitemap.

Check for Crawl Anomalies

In Google Search Console under the Coverage Report, you could also see pages that were crawled by Google but were excluded from the search results. This could be because of the noindex tag, robots.txt file, or other errors that might cause crawl anomalies and make a page non-indexable by Google.

You should check this report as you might have important pages listed under these categories. For pages under 'Blocked by robots.txt' or 'Excluded by noindex tag', the fix should be as easy as removing them from the robots.txt file or removing the noindex meta tag.

For important pages under 'Crawl Anomaly', check them using the URL Inspection tool in Search Console to see more details on why Google is having problems crawling and indexing them.

If you don't see any important pages here, no further action is needed; you can let Search Console keep the unimportant pages there.

Check the SERPs

This is one of the most important things a lot of SEOs forget. Many people are so busy doing link building and strategizing their on-page SEO that they never check what their pages look like in the search results.

To do this, go to Google search and use the advanced search command “site:” to show all the results Google has for your website. This is a great way of knowing how users see you in the search results.
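For example, a query like the one below (swap in your own domain) lists every page Google currently has indexed for that site, and adding a keyword after the domain narrows it down further:

site:yourdomain.com
site:yourdomain.com internal linking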

Check for page titles that are too long or too short, or that contain misspellings or bad grammar. Though meta descriptions are no longer used as a ranking factor by Google, they remain a strong click-through-rate factor and should still be optimized, so make sure the meta descriptions of your pages are enticing to users.
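For reference, the meta description lives in the page's <head>; the copy below is placeholder text, and keeping it to roughly 150 to 160 characters helps prevent truncation in the SERPs:

<meta name="description" content="A short, enticing summary of what the page offers, written to earn the click.">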

Robots.txt Validation

The robots.txt file, also called the robots exclusion protocol or standard, is what search engines look at to determine which pages on your site they may crawl. Robots.txt is vital to your SEO audit, since a slight misconfiguration can cause a world of problems, and neglecting to check it during the audit only makes things worse.

Putting your robots.txt to the test and making sure that search engines can access your site properly is one of the best SEO practices in the industry. If your robots.txt file is missing, chances are all of your publicly available pages will be crawled and added to the index.

To start, just append /robots.txt to your URL (for example, https://www.yourdomain.com/robots.txt).

If you haven’t created a robots.txt file already, you can use a plain text editor like Notepad for Windows or TextEdit for Mac. It is important that you use these basic programs because using Office tools could insert additional code into the text.

Keep in mind, though, that you can use any program as long as the file is saved in the plain .txt format. Robots.txt can help you block URLs that are irrelevant for crawling; blocking pages such as author pages, administrator pages, or plugin directories, among others, helps search engines prioritize the more important pages on your site.
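As a rough sketch (the paths below are placeholders and will differ depending on your CMS), a few lines like these are enough to keep crawlers out of low-value sections while pointing them to your sitemap:

User-agent: *
Disallow: /author/
Disallow: /wp-admin/
Sitemap: https://www.yourdomain.com/sitemap.xml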

The search engine bot will see your robots.txt file first as it enters your site for crawling.

Of course, you shouldn't miss out on the robots crawling directives: User-agent, Disallow, Allow, and Sitemap.

Onsite Diagnosis and Fix

After diagnosing your site through the lens of the search engine (Google), it's time to check your website as an entity. The tool we've always used to check our sites' onsite status is Screaming Frog, and it has kept up as the websites we handle have grown larger over the months. You set the parameters, and it can even crawl and compile outbound links to let you know if you have broken links. Here's what the overview looks like:

Screaming Frog compiles all the different onsite factors and lists the errors that you might want or need to fix. The onsite factors Screaming Frog shows include:

  • Protocol – If your pages are HTTP or HTTPS
  • Response codes – From Pages blocked by Robots.txt to server errors, these are all compiled and displayed by Screaming Frog
  • URL – If your URLs contain underscores, uppercases, duplicates, etc.
  • Page Titles – Everything that you need to know about your pages’ title tags
  • Meta Description – Missing meta descriptions, duplicate meta descriptions, the length of your meta descriptions, etc.
  • Header Tags – Although Screaming Frog only compiles and displays H1s and H2s, these are already the most valuable aspect of your page structure.
  • Images – Missing alt text, image size, etc.
  • Canonicals
  • Pagination
  • And many more facets that are important for your SEO efforts.

Knowing the current state of your website’s pages is important since we are not perfect beings and we make the mistake of overlooking or forgetting to optimize one or two aspects of a page. So, use crawling tools like Screaming Frog to check the state of your pages.

Pagination Audit

Performing a pagination audit can affect your SEO because it deals heavily with how organized your pages are. The task is done with the end goal of organizing sequential pages and making sure they are all contextually connected. Not only is this helpful for site visitors, it also signals to search engines that your pages have continuity.

Pagination is implemented when you need to break content into multiple pages. This is especially useful for product listings on eCommerce websites or for a blog post series. Tying your content together signals to search engines that the set of pages belongs together and helps them assign the right indexing properties to the series.

How do you go about pagination? Simply place the rel="prev" and rel="next" attributes in the head of each page in the series, then audit the implementation with an SEO spider tool. While doing this, make sure the attributes serve their purpose, which is to establish a relationship between the interconnected URLs and direct users to the most relevant content they need.
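As a rough sketch (placeholder URLs), page 2 of a three-page series would declare both of its neighbors in its <head>, while the first and last pages carry only a rel="next" or rel="prev" tag respectively:

<!-- In the <head> of https://www.example.com/blog/page/2/ -->
<link rel="prev" href="https://www.example.com/blog/page/1/" />
<link rel="next" href="https://www.example.com/blog/page/3/" />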

A pagination audit should not be skipped, since it maximizes the content on your site and gives users a better experience when digesting these chunks of information. It is also very useful for improving navigation across the series.

XML Sitemap Check

XML sitemaps are especially useful because they list your site's most important pages, allowing search engines to crawl them all and better understand your website's structure. Webmasters use the XML sitemap to highlight the pages on their sites that are available for crawling; the file lists URLs together with additional metadata about each of them.
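As a minimal sketch (placeholder URLs and dates), a sitemap file follows the sitemaps.org protocol and looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yourdomain.com/</loc>
    <lastmod>2019-05-16</lastmod>
  </url>
  <url>
    <loc>https://www.yourdomain.com/blog/sample-post/</loc>
    <lastmod>2019-05-10</lastmod>
  </url>
</urlset>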

A proper SEO audit guide should always include an XML sitemap check, because it helps keep the user experience on a positive note. To make sure search engines find your XML sitemap, add it to your Google Search Console account: open the Sitemaps section and see if your sitemap is already listed there. If not, add it right away.

To check your sitemap for errors, use Screaming Frog. Open the tool, switch to List mode, then insert the URL of your sitemap.xml by choosing the Download Sitemap upload option. Screaming Frog will confirm the URLs found within the sitemap file. Start crawling, and once it's done, export the data to CSV or sort it by Status Code. This will highlight errors and other potential problems that you should fix immediately.

Geography/Location

Google places importance on delivering useful, relevant, and informative results to their users, so location is an important factor for the results that they display. If you search for “pest control Philippines”, Google will give you pest control companies in the Philippines – not pest control companies in Australia or any other part of the world.

A ccTLD plays a role in indicating which specific search market or location your site wants to rank in. Examples of ccTLDs are websites ending in .ph, .au, and so on, instead of the more neutral .com. If your website is example.ph, you can expect to rank on Google.com.ph and to have a hard time ranking on other country versions of the search engine, such as Google.com.au. If you have a neutral TLD (.com, .org, or .net), Google will determine the country where you can be displayed based on the content you publish on your site and the locations of your inbound links.

If you already have a target country in mind but use a neutral TLD like .com, you can set your website's target country manually in Google Search Console. Here's how:

Go to the old version of Google Search Console → Click on Search Traffic → Then click on International Targeting → Manually set your target country

This is what it should look like:

Note that if you have a ccTLD, you won’t be able to set your target country and this option is only available for websites that have a neutral TLD.

Structured Data Audit

We all know that Google is constantly improving its algorithms to better understand content, semantics, and the purpose of websites. This is where structured data shines, since it marks up specific entities on your pages to help Google and other search engines understand them better.
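As an illustrative sketch (the organization details are placeholders), structured data is most commonly added as JSON-LD markup in a page's <head> using the schema.org vocabulary:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Your Company Name",
  "url": "https://www.yourdomain.com",
  "logo": "https://www.yourdomain.com/logo.png",
  "sameAs": [
    "https://www.facebook.com/yourcompany",
    "https://twitter.com/yourcompany"
  ]
}
</script>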

The most common structured data vocabulary used by webmasters around the world comes from schema.org. After marking up the necessary entities, you can use Google's Structured Data Testing Tool to check whether your structured data has any errors. Here's what it looks like:


Last updated on May 14, 2019 at 05:20 pm

Aside from podcasts showing up in the SERPs, Google I/O brought more good news for SEOs around the world: Google is finally adding a page speed report to Google Search Console. This is an awesome addition to Search Console's constantly improving feature set and can definitely make our lives as SEOs easier.

We all know that page speed is an important factor, and we use a variety of tools to check and optimize our websites to make them faster and, consequently, give our visitors a better experience. However, details about specific pages don't come up, or at least we need to input them into a tool manually, one by one. So how can Search Console's upcoming page speed report make our lives easier?

Page Speed Report

Photo from Ilya Grigorik’s tweet

A lot of SEOs and digital marketers attended this year’s Google I/O and they shared some photos. Here’s what it looks like:

Tweet from Kenichi Suzuki

How the Page Speed Report Helps SEO

Much like other reports available in Google Search Console, users can use the speed report to check their site's speed performance. The report is divided into three groups: Slow, Average, and Fast. If it works like the Index Coverage report, it's safe to assume that we'll see the list of pages in each group and be able to analyze issues with the listed pages.

This helps us SEOs know which pages need our time and attention and how we can better optimize them. The speed report reduces the need for separate speed-checking and diagnostic tools, and it also helps us dig deeper into which pages Google considers "fast", "slow", and merely "average".

If you like to experiment, you can also use this valuable feature to learn whether the changes you made to specific pages on your website are harmful or beneficial to their speed. You can monitor performance in the report, and you'll know whether the experiment was successful within a few hours or days of implementation.

We can’t help but be excited about this announcement and Google has given us the chance to try it out before they roll it out. You can sign up here.

Key Takeaway

Google Search Console is one of the most important and vital tools we use as SEOs, and constant improvements to this awesome tool are always welcome. It's safe to assume this isn't the last update we'll see, and I'm excited to find out what other features Google plans to integrate into Search Console.

What do you think about the upcoming Speed Report? Share your thoughts below!


Last updated on May 9, 2019 at 05:07 pm

Podcasts have become one of the most popular forms of media content across the internet, with millions of podcast episodes available on a variety of platforms. With a constantly increasing number of listeners and shows, podcasts have become a viable platform for discussing a multitude of topics ranging from current news and events, music, politics, comedy, and so much more.

Despite the growing popularity of podcasts, one challenge that a lot of these shows encounter is the difficulty of promoting their podcasts outside of their platform. While there is a multitude of listicles, videos, and fellow podcasts that help get the word out, it is still challenging to make podcasts more searchable on search engines like Google. Unlike videos, where you can click on a snippet and view it instantly, podcast episodes rarely appear in the results, and the shows that do are mostly those linked to Google Podcasts.

This makes podcast SEO an area with so many challenges from the get-go. However, things are about to change, as Google has announced in I/O 2019 that podcasts will soon become indexed in SERPs. This announcement is not only good for the podcast community, but this also opens up more opportunities in promoting a different kind of content to the users. Here are some things that you should know about Google’s announcement, along with our thoughts and some tips that can help you optimize your podcasts for SEO.

How will it work?

Similar to how videos are being shown on Google’s desktop and mobile search results, podcasts will have their own media snippet, which also allows you to play or download the episode without having to switch to a new page. Using Google’s ability to search the audio in a podcast episode, you can even skip to the part of the episode which mentions the topic that you have searched for.

This works in the same way that voice search results would yield video snippets that have been timestamped to the relevant section. A lot of podcast users listen to their podcasts on the go and having the feature to download and listen to specific sections of an episode helps optimize the experience.

Our Thoughts

With podcasts becoming present in Google search results, promoting them becomes much easier, and the increased searchability makes them accessible to a wider audience. The added convenience makes the listening experience more efficient while helping more users discover new podcasts without having to browse and search through hosting platforms.

To an extent, podcast episodes have become as significant as standard and live video content, with some shows having loyal and dedicated listeners. Having podcasts appear in search results helps expand a show's online presence, which in turn helps its listener base grow. This update also helps Google Podcasts become one of the biggest podcast hosting platforms, helping it compete against the likes of Apple Podcasts and Spotify. The prospect of increased searchability, with users able to listen instantly, already makes Google's platform attractive for anyone looking to start their own podcast.

The platform also allows users to publish episodes that come from other hosting sites, which makes it even more convenient for established podcasts that have yet to use Google. With this amount of functionality and the increased potential to generate more traffic and plays, podcast SEO will only become more significant in the near future.

Podcast SEO Tips

While podcasts appearing on SERPs is still an update coming within the next few months, you can still optimize your podcasts and make them more searchable with a few handy tips.

Make sure every episode is transcribed

One of the ways to help search engines understand your podcast much better is through transcriptions. A transcription not only gives users a text version of your podcast episode, but also allows Google to understand the content, in the same way that video transcriptions help on YouTube, thus improving searchability. Based on the demonstration at Google I/O, Google appears to pick out the parts of an episode related to a keyword by reading audio transcriptions.

While podcast transcriptions can cost a significant amount, they are an investment worth making, as you are helping more listeners discover your content.

Migrate your episodes to Google Podcasts

Google Podcasts looks set to become a bigger presence in the podcast market as soon as the update rolls out, and fortunately, migrating your episodes to the platform is a simple process that only takes a couple of minutes. Copy the URL of your podcast's RSS feed to apply, verify that the account is yours, fill in the information about your podcast, and you're set.

While the option to view podcasts in SERPs and snippets hasn't fully rolled out yet, you can already see podcast episodes on Google Search, provided the shows have migrated their episodes to Google Podcasts. It may not be as dynamic as what the update promises, but it's a good first step toward optimizing your podcast for SEO.

Key Takeaway

Podcasts have become another form of popular media users access on a daily basis, as it is a platform that provides content that is just as compelling and diverse as what you can watch on video or read in blogs. The upcoming update is a huge step towards developing a proper Podcast SEO strategy, helping to establish their online presence along with growing their audience.

If you have questions and inquiries about podcast SEO or SEO in general, leave a comment below and let’s talk.


Last updated on May 7, 2019 at 03:43 pm

Through the years, SEO has experienced various changes – from the death of link schemes to mobile prioritization. However, one of the biggest innovations in the industry is the use of the Topic Cluster Model as the newest SEO strategy.

This strategy took off when Google launched RankBrain in 2015, an algorithm that connects a user's past searches with related topics and phrases in order to surface the best results.

You should know by now that ranking well in any Search Engine Results Page (SERP) means showing how each of your focus keywords is related to the others.

This is where topic clusters come into the picture.

What are Topic Clusters?

A topic cluster is a group of interlinked web pages constructed around a piece of pillar content that targets a broad topic. It is based on the idea that building search visibility around an entire topic is far better than ranking for a single specific keyword.

This strategy ultimately helps you develop an area of influence wherein the overall sum of searches for topically relevant long-tail keywords outweighs the sum of searches for a major keyword. This will definitely aid you in organizing the structure of your website and content. In addition, when a blog post in the cluster does well in rankings, the entire cluster also ranks well.

There are three components of the Topic Cluster Model, which are the following:

  • Pillar Content
  • Cluster Content
  • Hyperlinks

Now, let’s discuss these components more thoroughly.

  • Pillar Content

The pillar content is the core of the cluster since it is based on the broader topic. It is usually 3,000 to 5,000 words long, covering all aspects of the topic while still leaving ample room for separate posts to go deeper. Pillar content is great for people who are not yet familiar with a topic but want a comprehensive overview of it.

  • Cluster Content

This component covers the various pieces of cluster content that are directly connected to the pillar content. Unlike the pillar content, which tackles a broad topic, each piece of cluster content focuses on a specific keyword related to your broad keyword, discussing it in a more thorough and comprehensive manner. Each piece of cluster content also contains a link that brings readers back to the pillar content.

  • Hyperlinks

Of the three components, this is the most important, because the hyperlinks are what bind the pillar content to the cluster content.

To have a clearer view as to how the Topic Cluster Model works, here’s a photo of the three components put together:

(Image Source: HubSpot)

Simply put, the topic cluster model is a group of interlinked content under one specific topic, organized so that search engines can identify it more easily. It produces signals that prove your website's authority and expertise on the given topic, which increases your website's visibility and may lead to more traffic and conversions.

Topic Cluster Model: Its Importance and Advantages

Undoubtedly, keywords have been, and still are, the foundation of content creation. However, with constant technological innovation and improvement, user behavior has gradually shifted in how people interact with a given set of keywords.

Ever since digital assistants such as Siri and Alexa were introduced, they have become one of the most common ways of reaching search engine results pages (SERPs), faster and more efficient than manually typing a query.

Due to this change in user behavior, Google and other search engines have been modifying their systems to cater to topic-based content searches. Existing SEO strategies that could not adapt to the behavior change were ultimately made obsolete to make way for new and more effective strategies, such as the topic cluster model.

Though keywords are still important, targeting an entire topic is the way to go these days, mainly for the following reasons:

  • Search engines are better at understanding related ideas.

Searching for an exact keyword is still relevant. However, search algorithms these days are better at understanding multiple terms on the same topic.

  • Authoritative and trustworthy results are what Google and other search engines want to provide to their users.

In order to demonstrate authority to both people and bots, you must consistently create valuable and precise content about a certain topic. This is far better than producing a pile of disorganized content targeting keywords that are unrelated to one another.

That being said, thinking in topics instead of individual keywords is one of the things you should focus on, because of the following benefits:

  • This will keep visitors on your website

Having plenty of content related to your visitors' interests will keep them on your website longer.

  • When an article does well, all the interlinked articles also do well

As you create content around a certain topic, you gradually improve the search rankings of similar, related content that is already on your website. As a result, this can ultimately lead you to own multiple SERP positions.

  • It helps bring you more traffic

If your cluster ranks well, it will attract more visitors to your website, and they are more likely to stay on it, which will make your traffic and conversions soar.

Creating a Topic Cluster

Now that you know what the topic cluster model is and how important and beneficial it can be, we move on to the next step: creating your own topic cluster.

  • Select Your Topic

The first thing to do when building your own topic cluster is to figure out what topic you want to be an authority on; if you run a blog, it could be sports, games, or SEO. If you're dealing with a business website, try to stick to the business's niche as much as possible. Some things to note when choosing a topic for your pillar content:

  • Your topic should be the keystone of your business
  • It should be something that you want to rank for
  • It should be something you can cover in a single pillar page, yet broad enough for you to write numerous articles about it

Once you have chosen your pillar content topic, that's when you build content around it. By doing this, you help prospects steadily build trust in you and your brand.

  • Inspect Your Existing Content

After figuring out and choosing the topic you want to talk about, the next step is to do a content audit to see whether you have existing articles that you can link to your chosen pillar page. This lets you get the most out of your existing content while keeping your content creation efforts focused.

  • Optimize Your Keywords

Although topic clusters are the latest strategy in the SEO world today, this does not mean that you should neglect and forget about your target keywords. Optimizing your content and web pages around your keywords is still a must.

As soon as your topic clusters are made and you have your target keywords finalized, you can now update and optimize your pages and content in your cluster.

  • Link All Your Content Together

Once you have completed all your content, it is time to link the pieces to each other, as well as to the pillar page. At this stage, always remember to make your links two-way: your pillar page must link to each article in the cluster, and each cluster article must link back to the pillar page.

This is considered the most important step when crafting a topic cluster, since it clearly demonstrates to search engines such as Google that these pieces of content are connected and related to each other.
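As a rough illustration (the URLs are hypothetical), that two-way linking is just plain HTML anchors in both directions:

<!-- In a cluster article, e.g. /blog/what-is-a-noindex-tag/ -->
<p>This is covered more broadly in our <a href="/technical-seo-guide/">technical SEO guide</a>.</p>

<!-- In the pillar page, e.g. /technical-seo-guide/ -->
<p>Related reading: <a href="/blog/what-is-a-noindex-tag/">What is a noindex tag?</a></p>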

Once you have applied all these changes to your website, the only thing for you to do is to wait for the results in your site analytics. However, do note that it takes a month or two before you see and feel the strategy’s impact.

Once enough time has passed and the results are in, you'll be able to see which pages in the cluster performed best and which ones still need optimization to rank better.

Key Takeaway

Shifting to a new strategy is indeed intimidating, especially when your business already has an extensive content archive. However, if you can plan numerous well-crafted pieces of content on a topic and then stitch them all together, you are likely to succeed with this strategy.


Last updated on May 2, 2019 at 04:01 pm

Google's search tools are not only some of the most reliable SEO tools around, they are also free to use, allowing users to monitor traffic, conduct keyword research, and manage their business's online presence without having to pay a fee to access premium features.

While this has been the case for a majority of Google's tools for a long time, there has been discussion about one of them, Google My Business, going premium. As perhaps the most important local SEO tool available, this comes as big news to everyone using the platform, and that includes our team. We'll take a look at Google's plans for Google My Business and share our thoughts on the impact of the service going premium.

Going Premium?

Google's proposal for making Google My Business a paid tool comes from a survey released earlier this week. The survey asked questions such as how much users would be willing to spend on Google My Business if it goes premium, along with which features they would like to see included.

Along with these questions, they also showed some possible new features that might enhance the experience of using Google My Business. Some of these include:

  • A “Book” button that enables customers to set up appointments.
  • Automated responses to messages and reviews.
  • Google Customer Support
  • A background check which is presented to the customer as a way of building their trust.
  • The ability to display badges and verified licenses on your profile.
  • A promoted map pin on Google Maps.
  • Having your business shown near the top of Google Search Results.

These new features were also grouped into separate option packages, with Google asking respondents which option they found most viable for their needs. The options came with corresponding prices, the most expensive costing $60. With a host of new features on the way, this survey helps Google determine which features business owners would want in a premium service.

My thoughts

One of the best things about Google My Business is that it is able to provide businesses of various sizes a platform that helps them expand their online presence, allowing them to compete with fellow local businesses and established companies alike. Being a free tool, this allows any business to be able to promote their products and services on Google, allowing more users to discover what they offer.

One of the biggest challenges for Google My Business would be settling on a price point that justifies the number of new features and satisfies business owners. It also means that SEOs would have to increase their budget for tools, as Google's proposed monthly plan won't come cheap for the functionality it aims to offer. A lot of SMBs rely on Google My Business to improve their online visibility and local search presence precisely because it is a free tool that does not need a budget, which is why paying for it might put off some businesses looking to gain more organic traffic.

Another point of concern is that paying for premium also means appearing near the top of search rankings, which might affect the numerous businesses that have grown their traffic organically over a long period of time. It could also mean that organic traffic alone will no longer take you to the top of the search rankings without going premium. If there is one area Google needs to take another look at in their proposed set of updates, it would have to be this one.

While the cost of the premium service is yet to be determined, I believe one benefit of making Google My Business a paid tool is a more secure product: some of the proposed features include verified reviews, which would allow Google to check reviews before they are posted. Harmful reviews have caused numerous inconveniences for Google My Business users, and having the option to manage them makes this issue less of a problem than before. Generating revenue also allows Google to support the service in more ways, since they would have the budget to optimize it, which leads to an improved product.

Key Takeaway

Since being launched in 2014, Google My Business has constantly improved to become one of the best tools to promote your business and optimize your local SEO. With the plan of going premium, Google not only aims to gain revenue but also introduce new features that make it worth the investment for business owners and SEOs across the internet.

If you have questions and inquiries about Google My Business or SEO in general, leave a comment below and let’s talk.


Last updated on April 30, 2019 at 04:29 pm

The robots.txt file and the noindex meta tag are important for on-page SEO. They give you the power to tell Google which pages it should crawl and which pages it should index, that is, display in the search results.

Knowing how to use these two and when to use them is important for all SEOs since this involves a direct relationship between the websites we’re handling and the search engine crawlers. Being able to direct the search engine crawlers on where they should go and which pages they should include in the database is a massive advantage for us, and we can use that to make sure that only our website’s important pages are the ones that Google and other search engines crawl and index. But before we delve into the details of how and when to use these two, we must first know what they are and their specific functions.

What is a Robots.txt file?

The Robots Exclusion Protocol, more commonly known as robots.txt, is a file that tells web crawlers and robots such as Googlebot and Bingbot which pages of your website should not be crawled.

What is the use of a Robots.txt file?

The robots.txt file is only a crawling directive; it cannot control how fast a bot crawls your website or other aspects of bot behavior. It is simply a set of instructions telling bots which parts of your website should not be accessed.

You should also take note that while some bots respect the robots.txt file, others can ignore it. Some robots may exploit files on your website or even harvest information, so to fully block malicious robots, you should increase your site security or protect private pages with a password. If you have other questions about robots.txt, check out some frequently asked questions on robots here.

How to Create a Robots.txt File?

By default, a robots.txt file would look like this:
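A bare-bones version that lets every crawler access everything contains nothing more than:

User-agent: *
Disallow: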

You can create your own robots.txt file in any plain text editor and save it as a .txt file. You can block different URLs, such as your website's /categories or /author pages; blocking pages like these helps bots prioritize the more important pages on your website. The robots.txt file is a great way of managing your crawl budget.

Robots crawling directives:

  • User-agent – Specifies the crawl bot the rules apply to, e.g. Googlebot, Bingbot, or the crawlers of Ask and Yahoo. Here's a link to a directory of known web crawlers.
  • Disallow – Specifies that a URL, and all other URLs under it, should not be crawled.
  • Allow – Tells a bot that a page can be crawled even if its parent path is disallowed. It is respected by Googlebot and other major crawlers.
  • Sitemap – Specifies the location of your website's XML sitemap.
Proper usage of wildcards

In robots.txt, the wildcard, represented by the asterisk (*) symbol, stands for any sequence of characters.

A directive addressing all types of crawl bots:

User-agent: *

The wildcard can also be used to disallow every URL under a parent page:

User-agent: *
Disallow: /authors/*
Disallow: /categories/*

This blocks all page URLs under the main author page and the categories page. Note that a trailing wildcard also matches the parent paths themselves, so /authors/ and /categories/ are blocked as well; if you want the parent pages to remain crawlable, you need a more specific Allow rule for each of them.

A good example of a robots.txt file would look like this:

User-agent: *
Disallow: /testing-page/
Disallow: /account/
Disallow: /checkout/
Disallow: /cart/
Disallow: /products/page/*
Disallow: /wp/wp-admin/
Allow: /wp/wp-admin/admin-ajax.php

Sitemap: https://yourdomainhere.com/sitemap.xml

After editing your robots.txt file, upload it to the top-level (root) directory of your website so that when a bot visits your site to crawl it, it sees the robots.txt file first.

What is Noindex?

Noindex is a meta robots tag that tells search engines not to include a page in the search results.

How to Implement Noindex Meta Tag?

There are three ways to put a noindex tag on pages you don’t want search engines to index:

Meta Robots Tag

In the <head> section of the page, place the following code:

<meta name="robots" content="noindex">

The exact code varies depending on what you want to achieve. The tag above prevents all crawl bots from indexing the page. Alternatively, if you only want to noindex a page for a specific crawl bot, place the name of that bot in the meta name attribute.

To prevent Googlebot from indexing a page:

<meta name="googlebot" content="noindex">

To prevent Bingbot from indexing a page:

<meta name="bingbot" content="noindex">

You can also instruct bots to follow or not follow the links found on the page you noindexed.

To let bots follow links on the page:

<meta name="robots" content="noindex,follow">

To tell bots not to follow the links on the page:

<meta name="robots" content="noindex,nofollow">

X-Robots-Tag

The x-robots-tag allows you to control the indexing of a page in the HTTP response header of the page. The x-robots-tag is similar to the meta robots tag but it also allows you to tell search engines not to show specific file types in the search results such as images and other media files.

To do this, you need access to your website's .php files, .htaccess file, or server configuration. The directives used in the meta robots tag also apply to the x-robots-tag. Here's a great article about the X-Robots-Tag in HTTP headers.
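As a rough sketch (assuming an Apache server with mod_headers enabled), an .htaccess rule like this would keep PDF files out of the search results:

<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>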

Through YoastSEO

If you’re using YoastSEO in WordPress, there is no need for you to manually place these codes. Just go to the page or post you want to noindex, scroll down to the YoastSEO interface, go to the settings of the post by clicking the gear icon and then select “No” under “Allow Search Engines to Show this Post in Search Results?”

You can also apply a noindex tag sitewide for pages such as categories, tags, and author pages so you don't have to edit every individual page on your website. To do this, go to the Yoast plugin settings and then to Search Appearance. Selecting 'No' under 'Show Categories in Search Results' places a noindex tag on all category pages.

Best Practices

Many people are still confused by these two, and as an SEO it is critical to know the difference between them. Knowing the difference ensures that the pages you want users to see in the search results are the only ones that appear, and the pages you want bots to crawl are the only ones that get crawled.

  • If you want a page that has already been indexed by Google to be removed from the search results, make sure the page is not disallowed in the robots.txt file before you add the noindex tag; otherwise, Googlebot won't be able to see the tag on the page. Blocking a page before the noindex tag has been crawled can leave the page in the search results, but it would look like this:

  • Adding a sitemap directive to the robots.txt file is technically not required, but it is generally good practice.
  • After updating your robots.txt file, it is a good idea to check that none of your important pages are blocked from crawling, using the robots.txt Tester in Google Search Console.
  • Use the URL inspection tool in Google Search Console to see the indexing status of the page.
  • You can also check for unimportant pages being indexed by Google using the coverage report in Google Search Console. Another alternative would be using the ‘site:’ search command in Google to show you all pages that are being shown in the search results.

Adding Noindex in Robots.txt

There has been a lot of confusion in the SEO community recently about using noindex in robots.txt. Google has said over and over that it does not support this, yet a lot of people insist that it still works.

In a Twitter thread, Gary Illyes said:

“Technically, robots.txt is for crawling. The meta tags are for indexing. During indexing, they’d be applied at the same stage so there’s no good reason to have both of them.”

It is best to avoid doing this. While it may seem efficient, since you would not have to put a noindex tag on individual pages and could simply list them in the robots.txt file, it's better to treat these two things separately.

Blocked Pages Can Still be Indexed if Linked To

In an article by Search Engine Journal, they quoted John Mueller in a Google Hangouts Session. Here’s his statement:

“One thing to maybe keep in mind here is that if these pages are blocked by robots.txt, then it could theoretically happen that someone randomly links to one of these pages and if they do that, it could happen that we index this URL without any content because it’s blocked by robots.txt. So we wouldn’t know that you want to have these pages actually indexed.”

This statement is huge since it gives us a better understanding of how crawl bots and robots.txt work. It means that a page you blocked through robots.txt is not safe from indexing if someone links to it.

To make sure a page without useful content won’t accidentally appear in the search results, John Mueller suggests that it is still better to have a noindex meta tag on those pages even after you have blocked them from crawl bots with robots.txt.

For John Mueller’s full thoughts on this, check out the Google Webmaster Central office-hours hangout from 2018.

Key Takeaway

There are many SEO hacks out there, but you have to pick the ones that will give you the most benefit in the long run. Using your robots.txt file to your advantage will do more than increase your SEO visibility; it will also improve user experience. Robots.txt will stay significant, so stay on guard for updates that affect it.

Robots.txt should never be neglected, especially if you want to appear at your best in the SERPs. Brush up on these best practices whether you are a beginner in SEO or have already optimized many sites. Once you do, you’re going to see how it helps set you apart from the rest.

With that, comment down below on how you use meta robots tags. How are they working for you so far?

Last updated on April 25, 2019 at 05:58 pm

For the past few years, I’ve seen one question asked frequently across blogs, social media posts, and online forums:

“Is link building dead?”

I’ve seen this asked on various sites and channels, and the answers range from “Link building is a dead strategy that will not be effective in the next few years” to “Link building is still a viable strategy”. The mix of answers you get from these articles and posts can make you continue, scale down, or even stop your link building efforts altogether. If you ask me whether link building is dead, here’s our answer:

“Link building is not dead.”

Despite what a number of people say, link building is alive and is still one of the most effective SEO strategies around. Like every other SEO strategy, link building continues to evolve as digital marketing becomes increasingly diverse. Building links continues to be effective, as it helps generate traffic, improves the authority and reputation of your content, and establishes new networks between websites.

The success and effectiveness of your link building campaign can be assessed with the right tools, and one of the best tools you can use to track your links is the Google Search Console Link Report. With the new Search Console introducing an improved set of features, here’s how the Link Report works and how it helps you track your link building efforts.

How to access the Google Search Console Link Report

To access the GSC Link Report, all you have to do is open the property you want to take a look at and go to Links, which is found under the Security & Manual Actions section. From there, you can view reports such as external links, internal links, top linking sites, and top linking text. You also have the option to export the link data, which is handy for performance reports.
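As a quick illustration of what you can do with an export, here is a minimal Python sketch that summarizes an exported links CSV. The file name and the column headers are assumptions, so match them to whatever your actual export contains:

    # Minimal sketch: summarize an exported Search Console links CSV.
    # The file name and column names ("Target page", "Incoming links") are
    # assumptions - adjust them to the headers in your own export.
    import csv

    with open("gsc_top_linked_pages.csv", newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    # Sort pages by the number of incoming links, highest first.
    rows.sort(key=lambda row: int(row["Incoming links"]), reverse=True)

    for row in rows[:10]:
        print(row["Target page"], row["Incoming links"])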

External Links

This section shows the pages on your website that external sites link to. You can see which pages have the most incoming links and linking sites, and sort them by number. You also have the option to filter the pages, allowing you to search for specific target pages, look for pages with a high or low number of links, and filter by the number of linking sites.

These filters allow you to track specific pages, which is important when you are measuring your link building statistics and want to know whether or not your website is being linked to by external websites. This is a good indicator of your online presence and authority, as Google is able to recognize when you are being linked to by numerous quality websites. This is also a feature you can use to check for harmful or malicious links, especially if you encounter unusual numbers that might indicate a possible attack.

Internal Links

Internal linking is essential, as it connects your web pages to one another, allows users to discover content, and helps generate more traffic. The Internal Links report has the same data filters for tracking specific pages, along with the option to export the data.

Top Linking Sites

If you want to see which websites link to your website the most, you can check Top Linking Sites. This section contains the same filters as the previous sections, helping you identify which sites send the most links to your pages. This is an important metric to track for link building, as it helps you know whether your link building efforts on these external websites are successfully generating links and traffic to your website. It also shows you when new websites link to you, which helps you see whether or not you are being linked to by authoritative websites.

Top Linking Text

Top Linking Text is a list of the most frequently used anchor text in the links pointing to your website. This data is case-sensitive, and variations such as capitalization and pluralization are treated as different link text. Identifying link text is important, as it tells you which words are being used to point to your pages and whether or not the text being used is attracting clicks.

Analyzing your Link Building Efforts using External Link Report

While there are a lot of backlink checker tools available out there, like Ahrefs and Moz, you don’t need to look any further than Google Search Console to measure the effectiveness of your link building campaign. It’s fine to consider a link you acquired as good when it shows up in those tools, but nothing confirms it more than seeing it appear in Google Search Console.

After you launch your link building campaign, monitor the External Link Report from time to time to see whether the links you built appear. There is no set timeframe for how long it takes a link to show up in the report, because it depends on when Google crawls the website you acquired the link from.

One thing I noticed is the appearance of nofollow links. A lot of people say that blog commenting is a dead link building strategy because, 99.9% of the time, the link you’ll be getting is nofollow. I still leave comments on blogs that I follow from time to time, and I’ve noticed that these domains still appear on the External Link Report.

What does this mean for Link Building?

I think this means that, as an SEO, you should not obsess over whether a link you acquired is nofollow or dofollow. You should also not worry if the majority of your backlinks are nofollow links. Yes, nofollow links do not pass link juice, but it seems Google still takes note of these links. Remember, when a website links to yours, it’s like they are casting a vote telling Google that your website is trustworthy and a good source.

This also means that Link Building is definitely not dead. While it has been reiterated over and over that links should be acquired organically, link building strategies would not hurt as long as you’re not spamming websites.

Keep on sending out those outreach emails. Leave meaningful blog comments once in a while. Try to connect with other webmasters for guest posting. Your hard work and all the time that you spend in these strategies will be fruitful when you see those websites in your link report.

Key Takeaway

The Google Search Console Link Report is a great tool for analyzing the effectiveness of your link building campaign. It lets you view the link metrics of different web pages, see how comprehensive your campaign is, and spot areas where you can improve your efforts.

If you have questions and inquiries about link building and SEO in general, leave a comment below and let’s talk.

Last updated on April 23, 2019 at 05:43 pm

SEO has become an increasingly competitive industry where you can be on top one day and quickly fall the next if you’re not careful. Having the right strategy and a winning formula counts if you want to overcome your competitors. Analyzing the competition has become a standard approach to formulating the strategy that will take you to the top of the search rankings.

This is even more evident when it comes to content, as every brand aiming to improve its rankings and online presence is trying to craft quality content that satisfies Google’s standards while being genuinely helpful to users. With a plethora of topics to choose from, finding the right topics for your content can be challenging. This is why one of the best places to look for ideas is the competition itself, in the form of dead pages. Here’s how you can use these dead pages to your advantage to generate traffic and create link building opportunities on your end.

What are the objectives?

This strategy aims to accomplish two things: create quality content that generates organic traffic, and use this content to replace the dead links present in the target article. We will be using the skyscraper method to create quality content and reach out to article authors and webmasters to build links. This is not only a form of content marketing but is, more importantly, an effective link building technique that can win you the referring domains your competitors have lost.

How do you look for dead pages and links?

The first step in making this strategy work is to look for dead links and pages on your competitors’ websites. There are different ways to approach this, and one of the tools we’ll be using is Ahrefs Site Explorer, which lets us check for 404 pages on a website and look up their referring domains. Another tool we’ll use is Dead Link Checker, which lets us find the links on those referring domains and verify that they’re no longer working.

To start, enter the URL of the competitor website in Site Explorer to bring up the Overview. The next step is to click on Best by links, which is under the Pages category. Using this, you will be able to see pages with the most referring domains and clicks.

The next step is to set the HTTP code filter to 404 not found, which narrows the selection down to dead pages. Look for a page whose topic is relevant to your website.

The next step is to check the referring domains of the dead page and see which websites link to it. Select one of the sites you want to target and use Dead Link Checker to confirm that the link really isn’t working.
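If you want to double-check a handful of URLs yourself, a small script works just as well. This is a minimal sketch that only mirrors what Dead Link Checker confirms, namely the HTTP status of each link; the URLs below are hypothetical placeholders and the script assumes the requests package is installed:

    # Minimal sketch: check whether candidate links still resolve.
    import requests

    candidate_links = [
        "https://competitor-example.com/old-guide/",   # hypothetical dead page
        "https://competitor-example.com/live-guide/",  # hypothetical live page
    ]

    for url in candidate_links:
        try:
            # HEAD keeps the request light; fall back to GET if HEAD is rejected.
            response = requests.head(url, allow_redirects=True, timeout=10)
            if response.status_code == 405:
                response = requests.get(url, allow_redirects=True, timeout=10)
            status = response.status_code
        except requests.RequestException:
            status = "unreachable"
        print(f"{url} -> {status}")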

After checking the dead page, the next step is to use the Skyscraper Technique and create content that is more engaging and better designed than your competitor’s. You can make your content more comprehensive and even include infographics and videos that provide additional information readers will find useful.

For this example, we saw that our competitor’s dead page was about digital marketing and SEO for lawyers, a topic that has gained a significant amount of interest, which makes it a good topic for generating traffic. Handily, we have already created a page about SEO Strategies for Lawyers and Law Firms, a comprehensive guide that shows the benefits of SEO for lawyers and law firms, along with the kinds of services they can avail of.

Creating an outreach campaign

The next step is to pitch your content to the authors and webmasters of pages with dead links. In your email pitch, point out the dead link you discovered while reading the article, and mention that you have content that can replace it. Here’s a sample template that might help you promote your article:

Greetings,

I read your article about (article topic) and found it really helpful. While reading, I discovered that the link to (article title) is no longer working. Fortunately, I have written (your article title), which covers the same topic and is high-quality content that your readers would surely find useful. If you’re interested, you can use my article in place of the dead link, which would help our website while providing your readers with even more informative content.

Best Regards,

 SEO Hacker

Depending on how many sites you reach out to, your content can not only generate organic traffic but also become a great source of referring domains. You can use Dead Link Checker again to look for pages with more dead links, and treat the referring domains as a contact list of authors and webmasters to reach out to. To narrow the list down to high-quality sources, apply more filters and focus on dofollow links. Once you have gathered their email addresses, you are ready to deliver your pitch to all interested parties.

Once this is done, you have used your competitors’ dead pages against them and can continue to generate more traffic and better rankings for your website.

Key Takeaway

Having a dead link in any form of content means that an opportunity to generate traffic has been lost. When your competitors are the ones responsible for these dead links, it is best to capitalize on these opportunities and use this against them to gain a massive advantage.

If you have questions and inquiries about competitor analysis and SEO in general, leave a comment below and let’s talk.
