Table of Contents

+ Introduction to Site Speed for SEO
+ How Site Speed Can Affect Your Rankings
+ How to Measure Website Speed
+ What is Considered a Slow Website Loading Time?
+ How to Increase Website Speed
+ Conclusion

Introduction

When we talk about website optimization, a lot of people tend to focus on a site’s content, meta data, and other traditional on-page SEO factors. But for some time now there’s been something else that’s just as important: site speed. That’s how fast your site loads and responds to the user.

How Site Speed Can Impact SEO Rankings

Google announced in 2010 that site speed would be a factor they take into consideration for ranking purposes:

We’re including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests.

So if you want to stay competitive online, site speed is something you have to pay attention to. If you look into the nitty-gritty of what affects your site’s download and response time, you’ll realize that it’s basically all about creating a “clean” site.

You want to have a site that’s not riddled with messy code and images that aren’t optimized. What can you do to ensure your site is performing fast enough? Here is a quick outline of some of the easier things you can do.

If your site takes more than a few seconds to load, then your rankings, your bounce rate, and more may be adversely affected. But should that be your main focus right now? While site speed is something you should pay attention to, it may not be the top priority at the moment.

If you have poor content (or barely any content at all), if your site isn’t optimized in the first place, if it doesn’t have a well-organized navigation structure, if you have poor (or no) backlinks, or if you have an abundance of pages with duplicate content … then these issues probably take precedence. Bounce rate and performance will not be telling you the true story unless you start with a well-designed site that’s optimized.

How to Measure Site Speed

Site speed can be measured in a number of different ways, and each approach tells you something slightly different about your overall loading performance.

Document Complete-Based Page Loading Times

When you access a webpage, the information streams in gradually. You see words and images appear on the page at different times, and this is especially apparent on slow-loading websites. A webpage is considered loaded as “document complete” when it has loaded enough to allow a user to start clicking buttons or entering written text. It’s possible that not all of the content is fully loaded, but a user can begin to take action.

Full Render-Based Page Loading Times

On the other hand, it’s also possible to measure page loading times based on when the entire page is fully loaded. This loading speed is always longer than a “document complete” loading speed, but the gap between the two values can vary widely from one site to another.
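
If you’d like to see both numbers for your own pages, the sketch below uses Python with Selenium and the browser’s Navigation Timing API. It assumes Chrome and a matching chromedriver are installed; dedicated testing tools measure these events more rigorously, but the underlying timestamps are the same.

# A rough sketch, assuming Chrome + chromedriver and "pip install selenium".
from selenium import webdriver

driver = webdriver.Chrome()
driver.get("https://example.com")  # placeholder URL

# PerformanceTiming.toJSON() returns the raw millisecond timestamps.
timing = driver.execute_script("return window.performance.timing.toJSON()")
start = timing["navigationStart"]

# "Document complete" corresponds roughly to the DOMContentLoaded event...
dom_ready = (timing["domContentLoadedEventEnd"] - start) / 1000.0
# ...while "full render" corresponds roughly to the load event.
full_load = (timing["loadEventEnd"] - start) / 1000.0

print("Document complete: ~%.2f seconds" % dom_ready)
print("Fully loaded:      ~%.2f seconds" % full_load)
driver.quit()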

Time to First Byte

Finally, it’s also possible to measure your overall site speed by looking at the “time to first byte” (TTFB) metric, which is the amount of time it takes for a browser to download the first byte of information from an online source. Essentially, it measures whether or not there is any significant delay between the request for information and your web server’s response. Where page loading times generally depend on your site settings and the type and amount of content you have on your page, TTFB measurements are usually indicative of your server settings.
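
For a quick TTFB estimate from your own machine, the requests library’s elapsed attribute measures the time from sending the request until the response headers arrive, which is a reasonable approximation. A minimal sketch; the URL is a placeholder:

# Approximate TTFB check; "pip install requests".
import requests

response = requests.get("https://example.com", stream=True)
print("Approximate TTFB: %.3f seconds" % response.elapsed.total_seconds())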

Below are some more sources to help you get your site speed optimized:

Google PageSpeed Browser Plugin

Google Webmaster Tools

SEOmoz Browser Toolbar

Web Page Test

YSlow (Yahoo’s tool)

What is Considered a Slow Website Loading Speed?

Now that we know how site speed can be measured in different ways, we can come up with a ballpark for what are considered “good” or “bad” metrics. Like I mentioned earlier, Google doesn’t publish what types of site speeds it takes into consideration, or if there are any specific numbers it looks for, but we can make reasonable assumptions for target loading times based on other sites we’ve seen, and based on a recent analysis by Google.

According to this analysis, the average “full render” page loading time is roughly 7 seconds on desktop devices, with a median page loading time of roughly 3 seconds. On mobile devices, the average page loading time is more than 10 seconds, with a median of nearly 5. It’s difficult to compare individual sites against such broad metrics, especially with such a sharp rift between the median and mean values, but if your site loads slower than the average page, you can generally consider your site to be too slow.

According to Moz, the median TTFB figure for high ranking websites is roughly 0.4 seconds, with that same figure being closer to 0.6 seconds for lower-ranking websites. If your site’s TTFB is greater than 0.6 seconds, you have some room for improvement.

If you’re looking for a way to measure your own site speed to compare it against these metrics, try out WebPageTest. It’s a free tool that will allow you to perform multiple types of tests to measure your site’s performance.

How to Increase Website Loading Speed

WordPress, one of the most popular platforms for website creation, offers a number of different themes, templates, plugins, and widgets to give you a completely controllable user experience. For most businesses getting started on the web, this is a bit overwhelming, but it’s important to make the right decisions when setting up and maintaining your WordPress site. In order to give your users the best experience and improve your chances of ranking, use these strategies to improve your site’s load times:

1. Get an Efficient Host.

Hosting may not seem like a big deal, especially if this is your first site, but the type of hosting you have makes a big difference in your load times. For example, if you opt for shared hosting in order to save a bit of money, you could be setting yourself up for drastically slower load times, especially under peak conditions.

2. Reduce Your Images.

As you might imagine, high-definition images can be a major drag on your site’s load times. Each individual user must download these images when accessing a page that contains them, so if you can replace them with much smaller, faster-loading versions, you’ll instantly improve your load times. The WordPress plugin WP Smush.it is one tool that can automatically and efficiently compress all the images on your site, reducing their file sizes and load times.
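
If you’d rather script the compression yourself, here is a minimal sketch using the Pillow imaging library; the uploads directory and quality setting are assumptions you should adjust, and you should test on copies first, since this overwrites the originals:

# Bulk-recompress JPEGs; "pip install Pillow". Work on backups in production.
from pathlib import Path
from PIL import Image

for path in Path("wp-content/uploads").rglob("*.jpg"):  # example path
    img = Image.open(path)
    # quality=75 with optimized encoding trades a little fidelity for size.
    img.save(path, "JPEG", quality=75, optimize=True)
    print("Compressed", path)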

3. Choose an Efficient Theme.

The themes and frameworks available on WordPress are part of what has made the platform so popular, but not all themes are efficient. In many cases, it’s better to choose a simple, unadorned theme with lightning-fast load times than a bulkier theme you prefer from an aesthetic perspective. Don’t worry; there are tons of themes with light frameworks to choose from (including some of the defaults).

4. Clean Up Your Plugins.

Some plugins can be valuable to your load times, like the WP Smush.it tool we mentioned above. However, many plugins simply take up space and make your site bulkier to process. Check out P3, the Plugin Performance Profiler—it can quickly tell you how each of your plugins affects your overall load times and give you direction for which ones to keep and which ones to disable.

5. Zip Your Website Files.

Compressing your website files (typically with gzip, rather than a literal ZIP archive) allows for much faster transmission to your users’ browsers. Essentially, you’re reducing the amount of data that is transmitted without changing the final product displayed. Most servers and several plugins support compression, and any of them will suffice so long as they don’t otherwise interfere with your loading times.
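
To verify compression is actually being served, you can request a page with an Accept-Encoding header and inspect the response. A minimal check, assuming the requests library and a placeholder URL:

import requests

response = requests.get(
    "https://example.com",
    headers={"Accept-Encoding": "gzip, deflate"},
)
# "gzip" (or "deflate") here means compression is enabled.
print("Content-Encoding:", response.headers.get("Content-Encoding", "none"))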

6. Use a Caching Plugin—but Set It Up Properly.

Caching plugins tend to be free to download and install, and they’re relatively easy to use. By directing browsers to download files stored in a visitor’s cache instead of trying to download them from the server, you can cut significant loading time. It only works for repeat visitors, but it’s still valuable. Just avoid playing around with the advanced settings too much or you could interfere with its proper functioning.
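
One simple way to confirm your caching setup is working is to check the cache-related headers your pages send. A small sketch with the requests library, making no assumptions about any particular plugin; the URL is a placeholder:

import requests

response = requests.get("https://example.com")
for header in ("Cache-Control", "Expires", "ETag", "Last-Modified"):
    print(header + ":", response.headers.get(header, "(not set)"))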

7. Reorganize Your Homepage.

Maximize your layout for speed. Show segments of posts rather than full content, make the length of your homepage shorter, and remove any widgets and plugins you don’t need on the homepage (including social sharing widgets, which belong on individual posts rather than on the homepage directly).

8. Make Your Database More Efficient.

If you know what you’re doing, you can clean up your database manually, but the better way is to use the WP-Optimize plugin. With this plugin, you can quickly and easily establish settings that prevent the buildup of unnecessary information on your database. Since you’re storing information more efficiently, your page will end up loading faster on your users’ machines.

9. Control Image Loading.

You can selectively control which images load immediately for the user in order to reduce the total amount of information necessary for a user to download from the server. With the right setup, only images above the fold will load immediately, and the remaining images will load only when the user scrolls down accordingly. It’s possible to do this manually, but it’s easier to do it with a plugin.
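
One manual approach, sketched below with BeautifulSoup, is to add the browser-native loading="lazy" attribute to every image in a template; browsers then defer each image until the user scrolls near it. The filename is a placeholder:

# "pip install beautifulsoup4"; rewrites page.html in place.
from bs4 import BeautifulSoup

with open("page.html") as f:
    soup = BeautifulSoup(f, "html.parser")

for img in soup.find_all("img"):
    img["loading"] = "lazy"  # browsers defer these until near the viewport

with open("page.html", "w") as f:
    f.write(str(soup))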

10. Get Rid of Pingbacks and Trackbacks.

Pingbacks and trackbacks are notifications from external blogs that inform your blog that it’s been mentioned. Pingbacks and trackbacks automatically update the data contained in your post, thereby increasing the amount of data needed to load and increasing load times. Getting rid of pingbacks and trackbacks will preserve your backlinks but prevent the extra data from being stored on your site.

11. Eliminate Unused Post Drafts.

Drafts of old blog posts can weigh down your site more than you think. For example, if you revise a draft four times, you’ll have five total versions of your blog post sitting in your site’s database. You’ll never need to reference those earlier drafts, so update your database settings (perhaps using the WP-Optimize plugin we mentioned in point 8) to delete them and prevent unnecessary storage in the future.

12. Use Static HTML Instead of PHP When You Can.

This isn’t something everyone should do, but for those of you looking to cut load times dramatically, it’s an additional option. PHP is a convenient way to keep your site dynamic and easy to maintain, but executing it occupies server processes on every request. If you can replace it with a static HTML equivalent, it’s worth trying.

13. Take Advantage of a Content Delivery Network.

Content delivery networks (CDNs) provide the same data you would ordinarily need to transmit to your users—such as CSS files and images—but on closer servers to maximize user download speeds. There are many CDNs available, but most will require a subscription fee to use.

14. Aggregate Your CSS and JavaScript Files.

If you use several plugins, your site probably links to multiple CSS and JavaScript files on every page, which can interfere with loading times. Instead, use a plugin like Minify to combine all that information into a much more condensed, faster-downloading form.

15. Disable hotlinking.

Hotlinking is when other sites link directly to images hosted on your server, increasing your server load without necessarily increasing your traffic. Disabling hotlinking takes only a handful of steps and prevents that extra burden.
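
Once you’ve disabled it (typically via your server configuration or a plugin), you can confirm the protection works by requesting one of your images with a foreign Referer header. A hedged sketch with the requests library; the image URL is hypothetical:

import requests

response = requests.get(
    "https://example.com/wp-content/uploads/photo.jpg",  # hypothetical image
    headers={"Referer": "https://some-other-site.example/"},
)
# A 403 (or a placeholder image) suggests hotlink protection is active.
print(response.status_code)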

Obviously, load times aren’t the only factor Google uses in populating its search results. You still need to have a regular, high quality content marketing strategy, a social media presence, well-structured meta data, and a long-term backlinking strategy. However, if you put these tips to good use and decrease your WordPress site’s load time, you’ll be able to simultaneously improve your domain authority (and thus, your rankings) and give your users a better overall experience.

Conclusion

There are two factors you should bear in mind when analyzing your site speed and making preparations for the future:

  • Every site is unique. What’s considered “fast” for one type of site may not be considered “fast” for another type. For example, open Google’s homepage, then open CNN’s homepage. You’ll notice a huge difference, but both sites have a very high user experience rating. You shouldn’t make your site fast compared to the rest of the world—make it fast for the type of site it is.
  • The big picture is what’s important. Ultimately, lowering your site loading times by half a second is a positive change, but it’s not nearly as effective as improving user experience with bigger changes—like instituting a more intuitive navigation.

Still, if you’re concerned about your site speed and you want to make it better, you can only stand to benefit. Don’t obsess over site speed, but do whatever it takes to give your users a great experience.

The post Site Speed: How & Why to Increase Your Website Speed for SEO appeared first on SEO.co.

Table of Contents

+ Introduction
+ Benefits of Guest Blogging
+ How to Find Guest Blogging Opportunities
+ How to Establish Your First Guest Post
+ How to Tell if a Site Accepts Guest Posts
+ Guest Blogging Best Practices
+ Ways to Succeed at Guest Blogging
+ Why Quality Guest Blogging Will Always Be Effective
+ Can Sponsored Guest Blog Posts Hurt Your Rankings?
+ Guest Blogging Sites to Avoid
+ Conclusion

Introduction

Many business owners are passionate about their product or service, but assume they lack the expertise in the online marketing sphere to gain visibility and website traffic. It’s a common challenge for today’s businesses and entrepreneurs.

But in reality, you don’t need to be an SEO professional or online marketing expert to succeed with your online marketing initiative. In fact, if you have the skills to start a business, you already likely have all the skills necessary to exponentially increase your website traffic while building your brand equity for the future.

The secret to success in the online marketing world, and the solution for business owners struggling to generate website traffic, is to establish your brand (and yourself) as a credible authority within your niche, and the best tactic for doing so is called guest blogging.

Guest blogging has become the darling of the SEO industry, but that doesn’t mean only SEO professionals can do it. In fact, in the competitive world of online marketing, business owners need to do it, or risk being burned by the competitors who already are.

This article is meant for business owners who want an easy-to-follow, understandable guide to building their business online through guest blogging.

Why Guest Blogging, and What are the Alternatives?

It’s important to note that guest blogging is not the only tactic business owners should pursue. Online marketing is a diverse and rapidly-changing field that currently consists of tactics including (but not limited to):

  • PPC (paid search)
  • Paid ads
  • Social media marketing
  • Search engine optimization (SEO)
  • Content marketing
  • Email marketing

With the exception of paid search and paid ads, all the other tactics intertwine and affect each other. For instance, a strong social media marketing campaign will positively affect your organic search rankings, improving your SEO. And a strong content marketing campaign will provide fuel for social media marketing and SEO campaigns.

While paid search and ad campaigns can yield great ROI in the right situations, they usually amount to short-term gains with little or no long-term impact. A good SEO campaign, on the other hand, is like building equity in your business that lasts for the long haul. It’s similar to the difference between buying a house and building equity vs. just paying rent.

So, why do I so strongly advocate guest blogging? Because a properly-executed guest blogging campaign yields the strongest and safest ROI while simultaneously supporting your SEO, social media, and content marketing efforts. It builds the most valuable, long-term equity in your business, and, most importantly, requires nothing more than a computer and an Internet connection to execute. This means there are no excuses; if you’re reading this, you already have everything necessary to start a guest blogging campaign and grow your business online.

Don’t have time? Hire staff and assign them some of your duties to take things off your plate. Trust me, this is an initiative you should be making time for.

Guest blogging is relatively simple to get into, though the process may seem intimidating to those unfamiliar with the strategy. All you need to do is identify a publisher or blog that might be a good fit for your industry, reach out with a guest post or post idea, and hope to get published. The wider your network of guest posts and the more authoritative your sources are, the greater the effects you’ll see. The benefits are many and diverse.

Referral Traffic

One of the most immediate benefits of guest blogging is the referral traffic you’ll receive. Assuming you include at least one link pointing back to your root domain on each individual post, you’ll see an increase in referral traffic from those external sources. For example, if your guest post gets 1,000 views and 10 percent of those readers end up clicking on your link, you’ll end up with 100 free visitors to your site. Since those links (and posts) are permanent, your referral traffic will continue to increase and compound over time.

Brand Recognition and Reputation Building

One of the less measurable effects of guest blogging is the increased brand recognition and reputation you’ll receive. As people see your name and your brand popping up consistently across more publication outlets, they’ll start to regard you as a greater authority. This, in turn, will attract more people to your site and increase the likelihood that your new site visitors will eventually convert. You can even call attention on your homepage to the fact that you’ve been published on these external sites, strengthening your perceived reputation and credibility.

Link Building

One of the most popular reasons for guest posting has been the opportunity to build external links. Since its inception, Google’s search algorithm has used the number and quality of backlinks pointing back to domains as a go-to resource for determining that domain’s total authority. In essence, the stronger the backlink profile, the more authoritative the site will be, and the more authoritative a site is, the higher it will rank. Guest posting gives you the perfect opportunity to build high-quality links on external sites, giving you higher search rankings—so the theory goes.

Social Audience Building

Social media marketing is becoming increasingly important as more and more consumers rely on social media platforms for their communication needs. A larger social audience means greater influence, greater search ranks, and greater brand visibility, and including your social links on all your guest posts is a surefire way to increase your following. In a self-perpetuating relationship, a greater number of followers means more traffic for your posts, and greater traffic to your posts means a greater number of followers.

What’s Changed?

The benefits of guest blogging listed above have served as the justification for a guest blogging strategy since its rise in popularity several years ago. However, a number of changes in the market—including Google’s algorithm updates and a shift in consumer preferences—are influencing guest blogs’ effects.

Overcrowding

Guest blogging’s popularity has been a burden for those practicing it as an ongoing strategy. Because the demand for content is finite and consistent and the amount of content available is constantly growing, there’s been a slow but measurable oversaturation of guest content getting published. As a result, each guest post published today is slightly less valuable than an equivalent post published three years ago. As time goes on, this effect may become more severe, but for now, as long as you’re posting the best possible material you can, oversaturation can be overcome through sheer quality.

The Slow Death of Link Building

Thanks to Google’s Penguin algorithm and repeated assertions by Google that link building is not an effective strategy, many search marketers are shying away from link building altogether. Fortunately, link building is only a small part of what makes guest posting effective. Instead of using links, rely on brand mentions to pass authority—they’ll build your domain authority just as much, and they don’t carry a risk of penalty. Plus, you can still use links to increase your referral traffic—just use a nofollow tag if you want to mitigate your penalty risk.

Guest Blogging Benefits

There are many benefits of guest blogging:

  • Builds and improves Author Rank (editor’s note 4/29/16: Google Authorship is no longer a thing, but it’s not known if Author Rank still is)
  • Creates links to your website
  • Strengthens awareness of your brand
  • Aligns your brand with industry leaders
  • Builds your personal brand
  • Generates leads and traffic
  • Creates social signals

Here’s the breakdown on each of the benefits.

Builds and improves Author Rank: Author Rank is how Google calculates the credibility of the author of a particular page, affecting how well that page ranks in search. Credible, authoritative, trusted authors receive “bonus points” in the rankings for articles they write related to their niches of expertise. I believe Author Rank will grow significantly in importance in the ranking algorithm over the next few years.

Creates links to your website: Inbound links have the heaviest weight of all the ranking factors in Google and Bing. Inbound links are considered much like “votes” by one website for another. Links from more credible, trusted websites will be treated as more important votes, so it’s best to spend your efforts focusing on getting inbound links from authoritative publishers.

Aligns your brand with industry leaders: Aligning your business name and website with brands that Google already ranks at the top in search engines is the best way to become a part of Google’s inner trust circle. This results in higher rankings for your website, driving more traffic, leads, and sales.

Builds your personal brand: After a while, if you publish enough great content that your readers love, you’ll start to become an authority in your niche. Once you become a niche authority, this opens the doors for many more opportunities, such as:

  • Speaking opportunities at events (for which you can get paid and further build brand recognition)
  • Easier access to guest posting on more, higher-quality publishers in your industry
  • More leads from your target market
  • Higher quality website traffic

Generates leads and traffic: Give advice or solutions to problems, and you’ll come to be recognized as a trustworthy source for further help, resulting in leads and sales.

Creates social signals: Social signals include Tweets, Facebook Likes, LinkedIn shares, Google +1’s, and more. Together, social signals represent a quality signal to search engines, because pages that are shared and discussed more often in social media channels are usually higher-quality. They are growing fast in importance as one of Google’s many ranking factors, so it’s important to get lots of social activity associated with your brand in order to stand above the rest in search engine rankings.

How to Find Guest Blogging Opportunities

Guest blogging, as an SEO tactic, has long been considered an expensive, time-consuming endeavor. It’s also been considered one of the safest, most “white-hat” methods of link building in an SEO’s arsenal, but over the last several years it has largely been put on the back burner as most SEOs pursued more powerful (albeit riskier) tactics.

But with the rollout of Google Penguin, everything changed. Guest blogging services are cropping up everywhere (including here, at SEO.co) as the industry begins to realize that guest blogging, as a link building tactic, is one of the few safe havens left after Penguin demolished many of the lower-cost, higher quantity tactics that SEOs came to rely upon over the course of the past several years.

As the new darling of the SEO industry, the popularity of guest blogging is growing exponentially. But while many SEOs are just now learning about the benefits of guest blogging, many are still in the dark about how, exactly, to do it.

There are lots of great guides available on the Web that offer nuggets of information about guest blogging, but I haven’t been able to find any that really dig deep into the most difficult part of guest blogging: Actually finding the blogs to guest post on. This guide is meant to provide a thorough, step-by-step walk-through of exactly how to find guest blogging opportunities. And I’m going to show you how to do it by using one of my favorite internet marketing tools: Scrapebox.

Saddled with an unfortunate reputation for being a tool useful only for propagating blog comment spam, Scrapebox is actually one of the few internet marketing tools I use on a daily basis—and for only ethical, white-hat purposes.

What You’ll Need:

  • Scrapebox (download it here for a one-time fee of $57. TOTALLY worth it.)
  • Private proxies (Get them from Proxybonanza for a small monthly fee. I recommend going for the “Bonanza” package from the “Exclusive Proxies” section.) Note: That Proxybonanza link is an affiliate link. I’d really appreciate it if you’d buy through my link!

How are We Going to Use Scrapebox to Find Guest Blogging Opportunities?

Scrapebox will execute multiple search queries simultaneously in Google and Bing, automatically harvest all the results, and allow us to manipulate, augment, and export the data.

For example, let’s say you want to find good guest blogging opportunities for your website about canine epilepsy. To find other websites that rank well for the term (and similar terms) which might be good targets for a guest blog post, you’d want to examine the top 100 search results for the following search queries:

  • Dog seizures
  • Canine epilepsy
  • Canine seizures
  • Seizures in dogs

Without Scrapebox, you’d have to perform each of those searches manually (via Google.com), manually click through each of the top 10 pages, and copy/paste each URL into a spreadsheet for future follow-up. This process would easily take you at least an hour.

With Scrapebox, you supply the search queries, and it will perform the searches, collect the URLs of the top 100 results, and supply them to you in an Excel spreadsheet. Additionally, you can use Scrapebox to automatically find the PageRank of the domain of each search result, allowing you to filter out low-PR domains without having to manually visit them. Scrapebox also offers myriad other filtering options, such as the ability to ignore results from domains that would never accept a guest blog post, such as facebook.com, amazon.com, etc. All of the above processes can easily be completed in under 60 seconds.

Ready to take your link prospecting capabilities to a whole new level? Let’s get started.

Step 1: Load your proxies into Scrapebox

After obtaining your proxies, load them into a .txt file on your desktop in the following format:

IP:port:username:password
IP:port:username:password
IP:port:username:password

Here’s an example:

203.0.113.10:8080:jayson:awesomepassword
203.0.113.11:8080:jayson:awesomepassword
198.51.100.7:8080:jayson:awesomepassword
198.51.100.8:3128:jayson:awesomepassword
192.0.2.15:3128:jayson:awesomepassword

In Scrapebox, click “Load” under the “Select Engines & Proxies” area. Select the text file containing your proxies. Scrapebox should load them immediately, and look something like this:

Click “Manage” and then “Test Proxies” to test your proxies and ensure Scrapebox can successfully activate and use them.

Be sure that “Google” and “Use Proxies” are both checked.

Step 2: Choose a keyword that best represents your niche or vertical

For example, let’s say I’m trying to find guest blogging opportunities for my website about canine epilepsy. I would select “dogs” as my keyword. I could go for a more targeted approach and try “canine epilepsy” or “dog seizures” as my keyword, but I’m likely to find far fewer (albeit more targeted) prospects.

Step 3: Define your search queries.

Copy and paste the following search queries into a .txt document on your desktop, and replace each instance of [keyword] with your chosen keyword from Step 2.

Note: The following is my personal list of search queries that I use to identify guest blogging opportunities. Google limits queries to 32 words, which is why these are broken down into many chunks rather than one long query. Enjoy!

“submit blog post” OR “add blog post” OR “submit an article” OR “suggest a guest post” OR “send a guest post” “[keyword]”

“guest bloggers wanted” OR “contribute to our site” OR “become a contributor” OR “become * guest writer” “[keyword]”

“guest blogger” OR “blog for us” OR “write for us” OR “submit guest post” OR “submit a guest post” “[keyword]”

“become a guest blogger” OR “become a guest writer” OR “become guest writer” OR “become a contributor” “[keyword]”

“submit a guest post” OR “submit post” OR “write for us” OR “become an author” OR “guest column” OR “guest post” “[keyword]”

inurl:”submit” OR inurl:”write” OR inurl:”guest” OR inurl:”blog” OR inurl:”suggest” OR inurl:”contribute” “[keyword]”

inurl:”contributor” OR inurl:”writer” OR inurl:”become” OR inurl:”author” OR inurl:”post” “[keyword]”

site:twitter.com [keyword] “guest post” OR “guest blog” OR “guest author”
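
If you’d rather not do the find-and-replace by hand, a few lines of Python can generate the queries file for any keyword. A minimal sketch; the QUERIES list is truncated here, so paste in the full set of templates above:

QUERIES = [
    '"submit blog post" OR "add blog post" OR "submit an article" OR '
    '"suggest a guest post" OR "send a guest post" "[keyword]"',
    '"guest blogger" OR "blog for us" OR "write for us" OR '
    '"submit guest post" OR "submit a guest post" "[keyword]"',
    # ...add the remaining templates from the list above
]

keyword = "dogs"  # your keyword from Step 2
with open("queries.txt", "w") as f:
    for template in QUERIES:
        f.write(template.replace("[keyword]", keyword) + "\n")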

Step 4: Load Search Queries into Scrapebox.

In the “Harvester” section in Scrapebox, click “Import,” then “Import from file.” Select the file containing the search queries that you just created in Step 3. Scrapebox should then populate with the search queries, looking something like this:

Step 5: Update your blacklist.

Scrapebox has a “blacklist” which allows you to automatically filter out undesired search results. For example, I know that Facebook.com and Amazon.com will never accept a guest blog post, so I don’t want results from those domains appearing in my list.

To edit your blacklist, click “Black List” from the top navigation, then click “Edit local black list.”

After you start using Scrapebox and receiving output lists, you’ll begin to notice undesirable domains that often appear in search results. As you notice these, add them to your local blacklist so they never appear again. Here are a few good sites to add to begin with:

Amazon.com
Facebook.com
Tumblr.com
Linkedin.com
Yahoo.com
Squidoo.com
Hubpages.com

Step 6: Set Search Depth in Scrapebox

Next, define how many search results Scrapebox should harvest for each query. You can do this in the “Select Engines & Proxies” area, in the text field next to “Results.” I generally set it to 200 or 300.

Step 7: Start Harvesting

We’re now ready to start harvesting search results for our queries. Click “Start Harvesting” in the “URL’s Harvested” section.

Harvester in action

Finished harvesting

Step 8: Filter results by PageRank

You should now have a list of websites that Scrapebox harvested, which looks something like this:

The next step is to filter these results by PageRank, since we don’t want to waste our time reaching out to websites with a low PR. Scrapebox makes this super easy. Click “Check PageRank” then select “Get Domain PageRank.”

Next, click “Import/Export URL’s & PR.” Click “Export as Excel” and export the file to your desktop. Open the file on your desktop and re-save it if need be (sometimes the file is corrupt, but by re-saving it and deleting the older version, you can easily solve this).

Column A should contain a list of all the harvested URLs. Column B will contain the PageRank of each domain. Add column headers to column A (URL) and column B (PR).

Next, sort column B by PR, in order of largest to smallest. To do this, highlight column B by clicking on the column header, then click “Sort & Filter” in the “Home” tab in Excel. Then, click “Sort Largest to Smallest.”
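
If you prefer to skip Excel entirely, the same sort takes a few lines with pandas. A sketch, assuming "pip install pandas openpyxl" and a hypothetical export filename:

import pandas as pd

# The Scrapebox export has no header row, so supply column names.
df = pd.read_excel("harvested.xlsx", header=None, names=["URL", "PR"])
df = df.sort_values("PR", ascending=False)  # highest PageRank first
df.to_excel("harvested_sorted.xlsx", index=False)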

Table of Contents

+ Introduction
+ Optimizing Design for SEO
+ SEO Website Redesign Mistakes to Avoid
+ User Experience (UX) Considerations in Website Design
+ Launching a New Website Without Killing Your Rankings
+ Conclusion

Introduction

User experience is one of the most important aspects of any online business or online marketing strategy. Strictly defined, user experience is the sum total of a user’s impressions, feelings, and thoughts as he/she navigates your platform (for the purposes of this article, we’ll be referring to a website as the target for user experience design, though any number of interactive platforms could be a worthy substitute). Making a user happy, giving him/her easy, convenient directions, and eliciting feelings of comfort and familiarity are essential to a positive user experience.

So what’s the value of a great user experience? First and foremost, it gets a user to stay on your site longer, and increases the likelihood that he/she will come back. Second, it increases the likelihood that he/she will tell others about the experience, peripherally increasing your traffic. Last but not least, it can affect your standing in other areas of the Internet—for example, great user experiences are correlated with higher search engine rankings, and if your user experience ratings are high, you’ll be more likely to establish relationships with other major players in the industry.

Updating the design and structure of your site from time to time is a requirement of the age of online marketing. Designs get stale, technology evolves, and your customers are always looking for the next big thing, so eventually, whether it’s two years or six years down the line, you’ll have to rebuild your website from the ground up.

It’s an exciting opportunity for the entrepreneur enthusiastic about the future of the brand. It’s a new challenge and another project for the design and development team. But for the search marketer trying to maintain and build on their site’s current level of success, the whole process can be a nightmare. Pulling one version of your website down and putting another one up is like swiftly pulling a tablecloth out from under a set table without breaking or moving anything; it’s next to impossible unless you know what you’re doing.

Optimizing Website Design for SEO

During the process of your redesign, you’ll want to pay careful attention to the design factors that can impact your search rankings. If you’re an online marketer, it pays to know and use crucial SEO and web design best practices when you create websites and blogs.

It’s not that you need to know the technical details of HTML if you’re not a programmer. But you do need to understand how SEO and design work together to achieve maximum search engine exposure.

Most SEO is based on keyword-optimized content. However, factors such as file naming, image tagging, use of coding, and ensuring that your website loads quickly also contribute to search rankings.

Keyword-optimized file names

The search engines use several factors and sources to assess what a website’s content is all about. These include file and directory names.

Your site will get a much better ranking when all the elements, including file name, are keyword optimized. Assigning a keyword-based name to a file increases your chances of getting indexed properly without being penalized for re-using keywords.

Making the Most of Images

Keyword-optimizing the images and videos on your webpages is one of the most neglected tricks in skilled SEO.

Search spiders, or robots, are actually unable to determine what images or videos are all about unless they are accompanied by descriptive text and coding elements. This is where Alt tags come in very handy.

Alt tags tell the search engines what’s in the images. When you assign Alt tags to graphics and videos, be sure to use keywords so they show up properly in search results. This elevates a site’s keyword association, and pushes its relevancy for targeted keywords upward in search engine rankings.
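
Auditing an existing page for missing Alt text is easy to script. A hedged sketch using requests and BeautifulSoup; the URL is a placeholder:

import requests
from bs4 import BeautifulSoup

html = requests.get("https://example.com").text
soup = BeautifulSoup(html, "html.parser")

for img in soup.find_all("img"):
    if not img.get("alt"):  # flags images with no alt attribute, or an empty one
        print("Missing alt text:", img.get("src", "(no src)"))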

Internal linking structure

Including a sitemap makes it easier for search engine spiders to find their way around your site when they’re looking for something new to index.

It’s particularly helpful for search spiders to determine what your site is being optimized for if you use keywords in navigation links.

Coding to highlight keywords

Headers and titles can also place emphasis on specific keywords. When you create titles or headers around content, try to keep them short.

Avoid using very long titles and headers with repeated keywords, or your site could be flagged for keyword stuffing. The use of headers also helps search engines identify important sections of a website.

Navigation

The navigation is a critical component of your website because it directs your users where to go; if your users aren’t sure what to do when they get to your site, they’re going to bounce. As a result, many designers choose to make a visually impressive and intuitive navigation in the header of the site. This is great, but you’ll also need to ensure that your navigation is SEO-friendly. Otherwise, Google won’t be able to tell that your site has an intuitive navigation, and any benefit to your users could be undercut by the reduced traffic that comes with a lower search rank.

In order to make your navigation clear, you’ll need scannable text and clickable links corresponding to each page and section of your website. Even if your navigation depends on images and visual elements for a better user experience, it still needs to be grounded in a crawlable format.

You’ll also want to make sure to include a clear sitemap for search bots to crawl and understand your site. There are several types of sitemaps you can include for your site, including an HTML sitemap found directly on your site and an XML sitemap submitted directly to Google through Webmaster Tools (though this is not an explicit design change). Redundancy is not an issue; just make sure your sitemaps are visible and accurate.
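
Most CMS plugins will generate the XML sitemap for you, but the format itself is simple enough to produce by hand. A minimal sketch with placeholder URLs:

from xml.sax.saxutils import escape

urls = [
    "https://example.com/",
    "https://example.com/services/",
    "https://example.com/blog/",
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for url in urls:
    lines.append("  <url><loc>%s</loc></url>" % escape(url))
lines.append("</urlset>")

with open("sitemap.xml", "w") as f:
    f.write("\n".join(lines))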

Page Offerings

Some modern designs emphasize minimalism, which is valuable for a user experience—rather than getting bogged down by countless pages and potential destinations, users are simply presented with what they need. However, from an SEO perspective, you’ll want to have at least a solid foundation of page offerings related to your core products and services. Otherwise, you’ll have little hope of ranking for keywords specific to those offerings.

For example, if you offer three types of consulting services but you’ve consolidated your site to one streamlined homepage, you’ll miss out on the opportunity to highlight each of those consulting services in a separate, crawlable format. You don’t need to go crazy, but make sure you have a solid representation for all the major facets of your business.

Speed

Site speed is another important ranking factor that can be easily controlled during the design process. The faster a site is, the better experience a user will have, and the more Google will reward you, so do everything you can to keep your site running efficiently and quickly.

There are many ways you can do this, and any combination of them will help reduce your page loading times. You can remove any pieces of unwanted, unused, or no longer necessary code on the back end of the site. You can reduce the amount of JavaScript in the code, which can be painfully slow depending on how you’re using it. Most importantly, you can reduce the number of videos and images throughout the site and reduce the file size of those that remain.

Responsiveness

Your site needs to be optimized for mobile. At this stage of SEO and mobile development, that is a necessity. There are a handful of different methods you can use to make it mobile responsive, which you can address in your redesign if you haven’t already. These options include simply building out a separate mobile site, or converting your desktop site to a mobile version and setting up an automated means of toggling between the two based on the device accessing it.

However, the easiest and most efficient way to optimize your site for mobile is to make it responsive. With a responsive web design, your website layout will automatically flex and stack to accommodate any size screen that attempts to access it. It’s a one-time fix that makes your site perfectly compatible with any device or browser.

Social Elements

Social integration isn’t going to directly increase your rank on SERPs, but it will go a long way to tie your online presence together. Include social icons on your homepage and contact page, and if you can, include sharing functionality that allows users to share your content on their social profiles in as few clicks as possible. This will increase the likelihood of people following you, increase your social visibility through more social shares, and help keep your social and website presences in sync. A stronger social presence will lead to higher domain authority, so don’t miss the opportunity to grow it.

Written Content

As part of the growing trend of minimalism in web design, some designers avoid including written content in favor of more images and more white space. Both these visual elements are important to draw users’ eyes and keep their attention, but without written content, Google will see your site as virtually empty. Even if you want to keep your content as concise as possible, be sure to include enough crawlable content to let Google know what your company is and what you can offer people.

Google Webfonts

Talented designers are very picky about the fonts they include on a website—and some will argue in favor of the most beautiful fonts available. Unfortunately, Google has an easier time reading some fonts over others, and the prettiest fonts around may not be as crawlable as the more basic choices. If you’re concerned about your font’s compatibility with Google search bots, or if you’re looking for one to start with, Google has a helpful list of available fonts.

A Note on 301 Redirects

If you’re redesigning your site, odds are you’re going to have a new site structure, complete with new pages and new URLs. When you go to make the switch, Google will notice new pages coming up and old pages coming down, and that might have a major impact on your search rankings. While it’s not necessarily an element of web design, it’s critically important to set up 301 redirects for your old URLs if you want to avoid potential ranking drops. If you aren’t familiar with 301 redirects, Google has a helpful how-to here.

SEO Mistakes to Avoid in Website Design

When it comes to re-designing your website, you’re going to see a bit of volatility no matter what, but you can mitigate the effects by watching out for these three common vulnerabilities:

1. A Changing URL Structure.

The biggest problem you’re likely going to face as you update your website is a disconnection between your old URL structure and your new URL structure. In a perfect world, you would maintain an identical URL structure, thereby preventing the possibility of a discrepancy, but then you probably wouldn’t need to be updating your site in the first place.

There’s one critical danger here, which can have a rippling effect that permanently damages your domain authority and crashes your ranks. Your URLs have history with Google, and Google likes links with history. Its search engine algorithm has come to expect your site to be in a very specific structure and a very specific order, and when it goes to crawl your new site, if it doesn’t see what it expects to see, it triggers a red flag. Historical links, with lots of credibility, that suddenly disappear in favor of entirely new links can wreak havoc on your domain authority, putting you in the same position as a site for a brand that just launched.

The problem is compounded by external links. Naturally, you’ve built a number of links on external sites pointing to various internal pages of your domain in an effort to improve your authority. If any of those links become no longer relevant, the page rank those links pass will become useless, and you’ll have a profile full of dead links pointing to nowhere, further damaging your domain authority and possibly interfering with your inbound traffic.

Fortunately, there is a simple—but admittedly painstaking—strategy you can use to ensure this outcome doesn’t occur. First, set up a Webmaster Tools account and crawl your site, or use an alternative tool to generate a list of all the URLs found in your current website structure—and don’t forget about all your subdomains! Then do everything you can to keep that URL structure as similar as possible.

For any old links that do not have an immediate counterpart in the new site, or for links whose names have changed, you’ll want to set up 301 redirects. Fortunately, setting up 301 redirects is easy, and once they’re in place, any traffic that would encounter your old URL will be automatically pointed to the new one. This should prevent any damaging crawl errors from Google, and will definitely keep all your inbound external links accurate and functional.
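
If you’re on Apache, one low-effort way to produce the redirects in bulk is to keep the old-to-new mapping in a spreadsheet and generate the directives from it. A hedged sketch; the CSV filename and its two-column layout (old path, new path) are assumptions:

import csv

# redirects.csv rows look like: /old-page/,/new-page/
with open("redirects.csv") as f, open("redirects.conf", "w") as out:
    for old, new in csv.reader(f):
        out.write("Redirect 301 %s %s\n" % (old, new))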

2. Design and CMS Pains.

Unless you’re working for a very small business, your new website is going to be in the hands of many individuals from many teams and many different departments. Everybody is going to have their own perspective on what would be best for the site. Multiple opinions, collaborating together, can ultimately produce the best final product, but you can’t forget about Google’s perspective.

Graphic designers want to make the most visually appealing site possible, but there are some design principles that need to be balanced in order to fit with Google’s priorities and prevent a nosedive in your ranks. For example, most designers would prefer a site designed with minimalism, with only a handful of links in the navigation and as little onsite content as possible. However, Google likes to see lots of high-quality pages, and without ample onsite content, the search engine may find it difficult to understand your purpose. There is always room for a compromise, so work with your designers to find a good balance that works for both of your goals.

Similarly, other members of your team may have strong preferences when it comes to selecting a CMS, either due to price or personal opinions. There are hundreds of CMS options out there, with varying compatibilities and functionality with SEO. Be sure to do your research and vet your options when considering a transition.

3. The Same Old Mistakes.

Finally, and perhaps most importantly, transitioning to a new site is an opportunity to fix all the mistakes that were holding you back with the old site. Passing over this opportunity, or failing to give it its due attention, is a critical mistake and a vulnerability you cannot afford to neglect.

Throughout the planning and design process, run an audit of your current efforts on your current site. Where are you ranking? How much traffic are you getting? Where is that traffic going and how is it behaving? What problems are inherent in your navigation, and what gaps are there in your meta data?

Your first priority when designing a new site, from an SEO perspective, is your navigation. It needs to be simple and intuitive, but fleshed out enough so that any new visitor will know exactly where to go. It also needs to have strong anchor pages with keywords related to your business, and clear sitemaps for Google to read and understand your site. Second, you’ll need to examine which of your pages tend to attract or retain the most traffic, and look for ways to replicate their success in your other pages (in terms of design, content, and purpose). While not directly linked to the process of getting a new site up and running, this is also a good time to review your ongoing tactics and find ways to improve them.

Even though the process is rife with SEO vulnerabilities that could shake up your rankings or traffic flow, your website rebuild is an opportunity, first and foremost. Treat it as such, and you’ll be able to reap the benefits.

UX Considerations in Website Design & SEO

Without a solid UX design in place, your design and web strategy will immediately fall apart. That being said, it’s important to understand some core truths about UX design before plunging in:

1. It’s more than just aesthetic design choices.

First is a core misconception about UX design, and it has to do with that pesky word “design.” When people think about design, especially when it comes to web design, they think of aesthetic choices like coloration, layout, structure, and so on. While these are all important to user experience, UX design and web design are not intrinsically the same. Web design can have all kinds of motivations—for example, you could make the most beautiful site possible, or make a site that only cares about funneling people to conversion. A successfully designed site from a UX perspective might be beautiful and have elements of conversion optimization, but its primary focus is always the user’s interaction.

Plus, most UX design includes more than just the “aesthetic” part of design. There’s also sitemapping, branding, navigation, and similar subjects to consider.

2. Successful design choices aren’t always rational ones.

This is a hard concept to accept, because we’d like to imagine that the world always behaves rationally. But because user experience depends on the instincts, intuitions, first impressions, and emotions of the individual user (none of which are reliably predictable), some of the best UX design choices are inherently irrational.

Let me illustrate with a simple example. Imagine a navigation bar with a horizontal layout. It might make sense that your user’s eye is drawn to the leftmost item first, as most American users read from left to right. But you might discover in testing that the middle item is usually the first seen and first clicked. Depending on your priorities, this could demand a redesign. In UX, you can’t trust your instincts—you can only trust the tests.

3. It’s not easy.

On the surface, UX seems like it would be a fun, relatively easy experience. It might seem like designing an amusement park, adding new features that look fun and experimenting with different combinations until you find the perfect layout. But the reality is, UX is a hard, tedious, and arduous process. As we saw in my previous point, your instincts and beliefs are constantly called into question by real data, and in many cases, you’ll end up with a product you don’t subjectively “like” because it happens to work best.

The unpredictability of users makes the process even more difficult. You might find that a portion of your audience loves your site, but another portion hates it—what do you do then? The unfortunate answer is usually, start from scratch.

4. Users are everything.

In other ways, UX design is really quite simple. In theory, it can be reduced to a single process: find out what your users want and give it to them. Your users are the only thing that matters—it doesn’t matter what your company wants, what you want, or what the design award organizations want—if your users are happy, you’ve been successful. If they aren’t, you’ve failed. Conducting surveys and tests can help you uncover what people are actually thinking and feeling, but that human factor is still the simplest and yet most complicated part of the whole process.

5. It demands ongoing attention.

UX isn’t something you do once. It isn’t a phase of the web design and development process that you go through, settle on, and then ignore for the remainder of your online marketing campaigns. It is a..

Table of Contents

+ Introduction
+ The Big Picture of Building Your Online Brand
+ How to Optimize Brand Name for Search Engines
+ Brand Qualities That Resonate With Customers
+ Online Brand Identity Considerations
+ Brand Awareness KPIs to Track
+ How to Build Better Content for Better Branding
+ Questions to Ask to Find the Perfect Brand Voice
+ Using Brand Associations for SEO
+ How to Improve Brand Engagement Through Storytelling
+ How to Outrank Big Brands in SEO
+ Technical Components of Building Your Online Brand
+ Qualitative Brand Building
+ Modern SEO = Proper Branding
+ How to Build an Online Brand from Scratch
+ Conclusion

Introduction

You know what a brand is, so I’m not going to bore you with a standard definition. You might already have a brand, but are unhappy with it, you might be starting a company without a brand, or you might have a brand but simply know nothing about it.

In any of these scenarios, your brand requires attention. It’s one of the most important elements of your business since it permeates not only your corporate identity, but also every sales and marketing campaign you ever launch. If you’ve got a brand already, you can work on it by trying to understand its function (and maybe upgrade it to a more modern aesthetic), but otherwise, you have one, admittedly daunting option: building a brand from scratch.

This guide will walk you through this complex, yet stimulating process, helping you to find the perfect set of brand characteristics for your organization—and challenging you along the way.

Why is a brand important?

Before I dig into the details, let’s establish why a brand is important in the first place.

Take a look at these options.

(Image Source: The Benjamin)

Which one do you think tastes best? Second best? Unless you’re deliberately manipulating your answer, the stronger brands with the higher prices look as though they taste better. Yet, according to blind taste tests, there’s no inherent advantage one brand has over the other (for the record, Pepsi won consistently during the Pepsi challenge—but biases in the type of test used have been called into question).

The point is, a noteworthy brand will immediately seem like a better product, service, or business than one that is unknown, or objectively weaker. Strong, consistent brands have immediately better appeal, tend to encourage more customer loyalty, and end up performing better than their counterparts. If you can develop your brand enough, it will come to speak for itself in terms of quality—the way powerhouse brands like Coca-Cola, Apple, and Amazon have today.

So how can you build a “strong” brand on your own? That’s the purpose of this guide.

Simple tips to establish the right mentality

First, you need to set yourself in the right frame of mind. Building a brand isn’t a simple, easy, one-step process like choosing a gas station to refuel at. It requires an investment of time, effort, and in many cases, money. If you start with the right mentality, you’ll be prepared for all the challenges to come your way:

  • Don’t skimp. This is one of the biggest investments you’ll make for your company. You wouldn’t buy a house that was falling apart just because it was cheap, nor would you spend $100 for a car that probably wouldn’t get you anywhere. Branding is not the place for frugality, either financially or in terms of effort. Be prepared to give it your all.
  • Think it through. If you jump and run with the first idea that pops into your mind, you’ve done yourself a disservice. First drafts are always terrible, so take your time, sort through multiple ideas, and only walk away with what sticks.
  • Be ready for your brand to be everywhere. Brands aren’t just something you slap on the front door and push into the corner of your website; by necessity, they are present everywhere. They’re in your ads, in your social profiles, and even in your company’s office. Your brand will define you.
  • Get everyone on the same page. Because your brand is present everywhere, it’s important that every member of your team understands and accepts the rules of your brand. Any break in consistency could compromise its overall effectiveness.
  • Don’t separate yourself too much. As a founder or company owner, try not to make your brand too much of a separate entity—throw your own thoughts, values, opinions, and personality into the mix. It will make your brand seem more personal, which as you’ll see, is always a good thing.
  • Don’t be afraid to get help. Branding is a serious, intensive endeavor, and not all entrepreneurs or marketers are capable of doing it alone. If you find your own experience and capabilities are limited, don’t be afraid to reach out for help.

Now that you’re mentally prepared for the challenge, it’s time to start building a brand.

The Big Picture

First, I want to cover the “big picture” of SEO, because the “technical,” intimidating stuff is only a fraction of what’s actually involved in your search rankings. The goal of SEO is to increase your search visibility, which in turn will increase your site traffic.

Google ranks sites based on a combination of two broad categories: relevance and authority. Relevance is how closely the content of a page is going to meet a user’s needs and expectations; for example, if the user asks a question, Google wants to find a webpage that answers it. Authority is a measure of how trustworthy or authoritative the source of the content is.

Your tactics will usually involve building your authority, increasing your relevance for targeted queries, or both, across three main areas of optimization:

  • On-site optimization. On-site optimization is the process of making your site more visible, more authoritative, and easier for Google’s web crawlers to parse and understand. Many of these tweaks and strategies involve technical changes to your site, including adjustments to your backend code and other structural site changes.
  • Ongoing content marketing. Content marketing is the best way to build your authority and relevance on-site over time; you’ll have the chance to choose topics and optimize for keyword phrases your target audience will use, and simultaneously create content that proves your authoritativeness on the subject.
  • Off-site optimization (link building). Off-site optimization is a collection of tactics designed to promote your on-site content and improve your authority by building links to your site. The quantity and quality of links pointing to your site has a direct influence on how much authority your site is perceived to have.

How to Optimize Branding for Search Engines

Search engine optimization (SEO), to the outsider, is a frustrating, complicated mess. Google doesn’t publish how its algorithm works (though it does give us helpful hints), and there are hundreds of independent, technical variables that can determine how your site ranks.

If you don’t have experience with programming or website building, technical factors like meta titles, site structure, and XML sitemaps can seem intimidating and difficult to approach. And while it’s true that experience pays off—a novice won’t get the same results as someone with years of experience—the reality is that SEO is more learnable than you probably give it credit for.

I’ve put together this guide to help the technically challenged folks out there—the ones new to SEO, or those unfamiliar with coding and website structure—to illustrate the basics of SEO, and simplify some of the more complicated techniques and considerations you’ll need to get results.

Optimizing a brand name for search engines takes time and a lot of upfront work if you’re coming up with a new name or renaming an older product. The majority of the advice in this article will focus on a “brand name” as the name of your company or organization, but keep in mind that the same strategies can be applied to the branded name of a particular product or service to achieve the same ends.

That being said, take a look at the ways you can create a search-friendly brand name and populate that brand name in authority-rich ways around the web.

Creating a Unique Brand Name

First, your goal is to create a brand name that is both memorable and unique. The “unique” factor of the equation is important because it differentiates you from the competition. If you have a slightly modified version of a competitor’s brand name, your potential traffic could become confused if they see both in the SERPs, or even worse—mistake your competitor for you in a more general sense. The “memorable” factor is important to encourage more searches in general—for example, if someone hears your name from a friend and makes a note to search for you later, you’ll want to be sure your name is memorable enough to stick around.

For the sake of illustration, imagine a company with the name “Qwoxillyyon.” It’s definitely a unique name, but it’s also not memorable because it’s not catchy. On the other end of the spectrum, a name like “VitaSupps” is more memorable, but it’s not unique—it’s pieced together from names of existing companies in the supplement industry. The key here is to find a balance between those two qualities.

Don’t rush into your brand decision; this name is likely what you’re going to be stuck with for a long time, so spend some time really perfecting it.

Associating the Name With Your Industry

In addition to crafting a brand name that’s both memorable and unique, you’ll want to include some keywords, phrases, or even chains of letters that are related to your industry. Barring that, you’ll want to come up with a tagline or slogan for your brand that clearly defines what you do. There are two major search-related motivations for doing this. First, including industry-based language will make your brand more likely to appear in industry-related searches. Second, incoming searchers who see your brand name and/or tagline in search results will be more likely to click on your link and understand exactly what it is you do.

If you’re stuck on trying to figure out exactly what type of keywords to include, run an exercise that can help you determine the strongest possible identifying words in your industry. Forget about your brand for a second, and just work with your team to come up with a list of seven to ten words that most succinctly describe or are most associated with your business or line of work. See if you can work at least two of those words into your brand name, or the tagline associated with it. Doing so will increase your brand’s relevance to the industry and attract more total search traffic to your site.

Onsite Optimization

Once you have your brand name and tagline finalized, you’ll have to work them into your website in a way that maximizes your chances of showing up in relevant searches. In your title tags, the first few words should be the most important and most descriptive—so here, you’ll want to lead with a description of your business or your space. Do include your brand name, but try to place it closer to the end, perhaps segmented off with a vertical bar (|). Throughout the body copy of your site, make references to your brand in text and in the context of descriptions of who you are and what you do. Google will semantically learn to associate your brand with whatever types of terms and subjects you include it with.
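To make this concrete, here’s a minimal Python sketch of that title tag pattern. It’s purely illustrative: the brand name is the hypothetical “VitaSupps” from earlier, and the 60-character limit is a commonly cited SERP display cutoff, not an official Google rule.

```python
# Illustrative sketch (not an official Google rule): compose title tags with
# the descriptive phrase first and the brand name last, after a vertical bar.
BRAND = "VitaSupps"   # hypothetical brand name from the example above
DISPLAY_LIMIT = 60    # commonly cited SERP display cutoff; approximate

def build_title(page_topic: str, brand: str = BRAND) -> str:
    """Put the descriptive keywords first, the brand last."""
    title = f"{page_topic} | {brand}"
    if len(title) > DISPLAY_LIMIT:
        # Trim the topic, never the brand, so the brand survives truncation.
        room = DISPLAY_LIMIT - len(f" | {brand}")
        title = f"{page_topic[:room].rstrip()} | {brand}"
    return title

print(build_title("Whey Protein Isolate for Endurance Athletes"))
# Whey Protein Isolate for Endurance Athletes | VitaSupps
```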

Ongoing Management

As an ongoing process, include references to your brand on offsite sources. Google sees brand mentions in a way highly similar to the way it views offsite links—but with a much lower chance of getting penalized if you appear spammy. Post mentions of your brand in the context of relevant, appropriate responses on industry-related blogs and forums, as well as major publishing outlets, news sources, and .edu/.gov sources whenever you get a chance. Just be sure to stay consistent in your efforts and diversify your sources.

As with any search optimization strategy, the upfront work is important, but the true value only comes through an ongoing process of dedication, refinement, and improvement. The more time you invest in making your brand name strong and visible on the web, the more results you’re going to see. In short order, you’ll be dominating any searches related to your brand name directly, and in time, your brand name will help you rank higher even for searches peripherally related to your industry.

Brand Qualities That Resonate With Customers

Every business is unique, and every brand needs to stand apart as something original, especially in a competitive landscape. However, there are seven essential brand qualities that serve as prerequisites to capturing and keeping consumers’ attention:

1. Trustworthy.

Imagine two brands. One you consider trustworthy, and one you do not. If forced to make a purchasing decision between these two brands, which one would you choose? Trustworthiness should be an obvious quality to go after for your brand, but many companies neglect to prioritize it. You can improve your perceived trustworthiness by ensuring the accuracy and validity of each and every one of your posts. One misstated fact, false claim, or misleading piece of information can wreck your reputation for trustworthiness, so double check everything. Aside from that, just make sure you remain honest, and your reputation will naturally follow.

2. Authoritative.

Asserting yourself as an authority in a given space takes some time, and it requires you to emphasize your expertise in a given area. That doesn’t mean inflating your capabilities or lying about your status, but it does mean choosing your words carefully when describing your business. For example, including references to your certifications or your history can make you seem like more of an authority, as can calling out the fact that your content has been featured in major publications. It also helps if your company is mentioned or gains the approval of other influencers in the industry—so start networking!

3. Emotional.

Many companies choose the logical, conservative approach when it comes to communicating with their audiences. It’s less risky that way, but it also has a way of alienating your followers. People don’t want to deal with a faceless, bland corporation—not in any context, and not in any industry. If you want to seem more appealing and truly resonate with your potential customers, you need to inject your messages with a little more emotion. Show it off when you’re happy. If your company announces some bad news, show that it’s personally affecting you. Otherwise, you’ll come off as robotic.

4. Personable.

This goes along with the emotional element, as customers are more naturally drawn to brands that seem like people. What you really need to do is inject a bit of your own personality into the brand personality you intend to demonstrate. Add a bit of characteristic flair with some colloquial language, informal expressions, and a bit of direct humor. Doing so will make your brand seem more human and more approachable, and it’s going to lead to more people seeking you out for their needs. It also helps to show off the names and faces of your team—especially on social media.

5. Open.

Openness goes along with trustworthiness, but it is a distinct characteristic. People want to engage with brands that don’t hide anything from their customers. For example, when facing controversy, many large modern brands choose the route of ambiguity—hiding from or speaking in generalities about whatever subject is being hotly debated by their fans. This leads to a sense of distrust, or a sense that the brand doesn’t have the people’s best interests in mind. Instead, be open about anything and everything you can be. Develop a reputation for being willing to share information with your followers.

6. Helpful.

Obviously, helpful brands are going to get more attention than apathetic ones, but showcasing this trait is harder than it might seem. All you can really do is pay attention and look for opportunities where your brand can step in and do something valuable. Watch for people complaining about your products, and step in to try and resolve the situation. Find individuals with problems in forums and offer your own advice. Include tutorial or FAQ sections on your website, and go out of your way to ensure your customer service processes are unrivaled.

7. Passionate.

Finally, you’ll have to show off how passionate you are about your business. Corporations that are in it only to make profit come off as evil, intimidating, or otherwise alienating. Companies that appear to truly enjoy what they do and live and breathe that culture have a far better reputation, and tend to fare better in attracting new customers in their marketing programs. Shine a spotlight on individuals of your team, show off your latest and greatest accomplishments, and do whatever it takes to show you really care about the industry.

Identity Considerations for Your Online Brand

As you develop (or revise) your brand, you’ll need to consider and map out these seven elements:

1. Vision and Values.

These represent what is most important to your company. Your vision is the culmination of your goals and your central mission, while your values are the characteristics of your brand that will allow those goals to be met most efficiently.

For example, the vision of a nonprofit could be “ending hunger,” and the values could be a focus on education, community empowerment, and personal motivation. The vision can be expressed and reiterated subtly, while the values should become evident through your use of language and the presentation of your ideas. Let’s say this nonprofit decides to publish a newsletter. Obviously, they’ll want to reference the fact that their main goal is reducing hunger in the community, but each entry in the newsletter should be in line with the brand’s values of education, community empowerment, and personal motivation. A spotlight on an individual’s attempts to unite the community with an awareness program would fit in perfectly with the brand.

2. Formality and Informality.

You’ll have to decide where your brand voice falls on the spectrum of formality and informality. Formality usually requires strict adherence to grammatical rules, full and detailed sentences, and a straightforward, logical structure. Informality has no such structure, allowing more colloquial phrases, swear words, and unconventional structures to convey messages. Formality is often held in higher regard, earning more respect from readers, but it can also be seen as rigid or impersonal. Conversely, informality is much more conversational and approachable, but can be seen as immature or inexperienced as a result.

Consider your main demographics. As an example, if you run a chauffeur service, your clientele will be wealthier, better educated, and demanding of a formal experience—so incorporating a layer of formality into your brand will improve your reputation. Alternatively, if you’ve created a new phone messaging app that you hope teenagers will use, you can afford to be more informal with your communication.

3. Emotionality and Rationality.

This is another important spectrum to consider for your brand, and it might change slightly depending on your purpose and medium. Emotion-based communications try to persuade readers and followers by making an appeal to emotions. For example, a dog food company could create messaging that dramatizes your relationship with your pet and focuses on that bond to sell dog food. Logic-based communications, on the other hand, use logical and rational appeals. Using the same example of a dog food company, the company could emphasize the objective nutritional superiority of their dog food versus a competitor’s.

Every company will need to use both emotional and rational messaging to convey ideas, but some brands will gain value by using one more than the other.

4. Humor and Sarcasm.

Some branding experts use humor and informality interchangeably, since most jokes and humorous language can be classified as informal. However,..


It’s never that much of a surprise when Google comes out with some new feature or update for its search algorithm. In fact, the company has made a name for itself in always striving for a better, more amazing product. When Google announced recently that it was introducing a new feature to its search algorithm called RankBrain, not many people outside the SEO community seemed to care. Even after Google pointed out that the new RankBrain system had been watching, tweaking, and changing search results on its own for more than a month, few people noticed anything different in their search results. Accordingly, there hasn’t been much in the way of public discussion.

Despite this lack of public awareness, RankBrain is one of the most significant advancements to Google’s algorithm since the Panda update, and if you want to preserve and improve your SEO strategy for the coming years, you’ll need to understand its full potential.

The High-Level Breakdown of RankBrain

Before I get too deep into the tangible effects of the RankBrain system, let me explain the basics. RankBrain is a machine learning algorithm that works in conjunction with the Hummingbird Update to give better search results for user queries. The Hummingbird update included the then-new feature of “semantic search,” so rather than focusing on individual keywords in a user query, Google would be able to look at the entire phrase and user intention behind it. RankBrain takes this a step further by analyzing ambiguous, unclear, or otherwise indecipherable semantic user queries, learning from the experience, and applying that experience to future, similar queries.

For example, if you search for something like “what is the executive leader of the United States called?” you might get results about the position of “President.” On the other hand, if you search for something more ambiguous like “who is it that leads America in politics?” Google might struggle. With RankBrain, Google would be able to learn that the latter search query is actually just a less clear way of rephrasing the former search query, and would gradually shift search results to match. This is impressive because the majority of Google’s search algorithm updates, including Hummingbird, have been deliberately and painstakingly pre-programmed by human beings. RankBrain is going to learn, posit, and execute updates to itself over time, free of any human intervention.
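Google hasn’t published how RankBrain actually works, so treat the following as a toy illustration only: a crude Python sketch that matches an ambiguous query to a better-understood one via word-overlap (cosine) similarity. Real systems use far richer representations than raw word counts; the example queries are hypothetical.

```python
# Toy illustration only -- Google has not published RankBrain's internals.
# This sketches the general idea of mapping an ambiguous query onto a
# better-understood one by measuring shared vocabulary.
from collections import Counter
from math import sqrt

def cosine(a: str, b: str) -> float:
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = sqrt(sum(v * v for v in va.values())) * sqrt(sum(v * v for v in vb.values()))
    return dot / norm if norm else 0.0

known_queries = [
    "what is the executive leader of the united states called",
    "best pizza near me",
]
ambiguous = "who is it that leads america in politics"

# The leadership query wins on (weak) word overlap; a production system
# would use learned semantic representations, not raw token counts.
print(max(known_queries, key=lambda q: cosine(ambiguous, q)))
```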

Misconceptions of RankBrain

Though RankBrain is only a few weeks old at this point, there are a handful of key misconceptions about what it is and how it fits into our current understanding of search.

First, understand that RankBrain isn’t a formal algorithm update. Unlike Panda, Penguin, or other landmark algorithm changes, RankBrain isn’t shaking up the ranking factors that Google considers when sorting out which sites to list first in the SERPs—instead, RankBrain is something of a ranking signal of its own. It’s working in conjunction with the Hummingbird update (which is an algorithm update) to produce a better understanding of queries—not a different selection of results. Think of it as a query translator.

Second, know that RankBrain is similar to, but distinct from, the Knowledge Graph. Because RankBrain and the Knowledge Graph both analyze user queries and improve via machine learning over time, it’s easy to mix them up or assume that they work in the same way. However, RankBrain is focused on understanding queries, while the Knowledge Graph is focused on providing the best direct answers to certain queries. Think of it this way: RankBrain could serve to better understand your query, then pass it off to the Knowledge Graph for a proper and complete answer.

Users will continue using Google the way they always have, without any visual or experiential clues to suggest that anything different is happening behind the scenes. The end result is going to be better results for a greater number of queries, which is important, but shouldn’t provide any disruption to usual processes.

Will Search Change That Much?

The nature of RankBrain means that the average user isn’t going to notice much of a difference. The changes it makes will be tiny, gradual, and will only exist for long-form user queries. Still, you might notice yourself getting better results in certain areas—apparently, RankBrain has already affected millions of queries (which isn’t that impressive for a search engine that processes more than 40,000 queries every second).

What Comes AFTER Google RankBrain & What it Means for SEO?

Google’s latest update isn’t making waves because of how many ranking factors it changed, or how many queries it’s impacted, or even how detailed or effective it is. Though Google didn’t announce the update until the end of October, it had been running in the background for months. According to Google, it’s already helped to handle millions of queries, but if it’s so effective, why haven’t search marketers noticed it until now, and why is it so significant?

It all has to do with the type of update RankBrain is—a slowly building machine learning algorithm. In the SEO world, we’re used to large, manual pushes to the core Google ranking algorithm (and very rare algorithm replacements). Panda was the first example of this in 2011, followed by Penguin in 2012. After these twin heavy-hitters, search marketers buckled down, constantly on the lookout for the next major update to shake up rankings.

Two things have happened since then that have challenged our expectations: the first is that Google has broken up its “big” updates into much smaller, more manageable chunks. Part of this is to reduce the total impact of each update, and part of this is because they don’t have a lot to add. The second is the introduction of this machine learning algorithm, which changes the way Google’s algorithm will update in the future.

What Is Machine Learning?

Machine learning is exactly what it sounds like: a system that learns from experience and improves itself without being explicitly reprogrammed. RankBrain is closely associated with the Hummingbird update Google released back in 2013, which brought a semantic understanding to Google’s analysis of user queries. RankBrain works by analyzing complex or ambiguous user queries and finding ways to simplify them. It wasn’t pre-programmed for any specific courses of action; instead, it was programmed to experiment, learn, and essentially update itself.

So What Now?

If Google has a component of its algorithm that can update itself automatically, it’s a big step closer to a self-maintaining search engine. Theoretically, if it could apply machine learning algorithms to every aspect of its search algorithm, Google’s search engine would be able to gradually update itself over time, always improving, without ever requiring human intervention. Combined with the knowledge that Google hasn’t pushed a big update to its algorithm since 2013 (ignoring the smaller, more gradual refreshes of Panda and Penguin), there’s a great deal of ambiguity in Google’s next move. Obviously, it will want to continue improving and refining its core algorithm, but how is it going to do that?

The Case for Big Manual Updates

Even though big manual updates have tapered off, there’s still a possibility that more are to come. New technologies (on the scale of mobile devices) could disrupt the current search format, and advances in semantic understanding or user patterns could make a manual push necessary. Plus, even though machine learning is great in theory, it’s untested and unpredictable, necessitating some form of manual backup as a complement.

If manual pushes remain, search marketers will need to remain vigilant, always watching for the next big change. Historically, these pushes have come without warning or instruction, and have shaken up what were considered “best practices” in SEO before their release.

The Case for Gradual Manual Updates

Though it’s possible that more big manual pushes wait on the horizon, it’s more likely that Google will stick with the slow, gradual manual updates that it’s been using for the past few years. These pushes happen manually, so they protect against the total reliance on machine learning for algorithm advancement, but they’re less intensive and more flexible than their larger, more significant counterparts. This retains some control for Google, and since its algorithm is in excellent shape currently, these gradual pushes spare it some effort.

If gradual pushes remain the mainstay, there’s almost nothing you need to change. Most current best practices will remain in their current form, demanding that you continue producing good content, building offsite relationships, and so on. The tweaks will come so slowly and imperceptibly that your bottom line will be barely affected.

The Case for More Machine Learning Updates

Gradual manual updates don’t take much time, but they do still take time. In Google’s ideal world, everything would be fully automated—just look at its efforts in self-driving cars. The likeliest scenario is that Google will strive for more updates like RankBrain, eventually turning its full algorithm into a giant, self-regulating, self-updating behemoth.

Even if Google decides to opt for an all-machine-learning version of its algorithm, it will be some time before it can handle such a task. In the meantime, we’re likely to see a gradually shifting hybrid of machine learning and gradual manual updates. This will give you time to prepare for unknowns, and gradually introduce you to the machine learning algorithm of tomorrow.

Parting Thoughts

If you’re worried about what RankBrain will mean for SEO, don’t be. Because RankBrain is focused more on analyzing and mapping out user queries than on sorting out sites for potential rank, you won’t have to change much about your current strategy. Best practices are still best practices, and there aren’t any strange new ranking factors to learn and implement. Still, don’t be surprised if you see some ranking shakeups coming out gradually in the next few months as RankBrain scales upward. Because ambiguous long-tail phrases are the main targets here, if your strategy revolves around long-tail keywords, you might see a small hit in overall ranks (or a small boost, depending on the queries). But it’s nothing to be concerned about, and it doesn’t demand any significant changes to your existing approach.

It’s likely that more machine learning algorithms will arise in Google after RankBrain, eventually shifting the algorithm from one that updates manually and gradually to one that updates itself completely. It’s likely that big manual updates are pretty much done for, and that gradual manual updates will serve as a complement in the interim. This means that your strategy won’t require much adjusting, especially in the short term. Changes will happen so gradually, you’ll barely notice them, and by the time machine learning fully takes over Google, you’ll be well versed in its abilities (as long as you keep yourself in the loop).

Ultimately, RankBrain is an impressive and significant addition to Google’s algorithm, but it isn’t going to revolutionize the world of SEO the way algorithm updates like Panda and Penguin have. However, this is a major step for Google in the introduction of machine learning to its central processes. Keep watch for other, similar machine learning segments of its algorithm to be introduced, and stay on your toes to react to those changes. You never know what Google will come up with next.


Table of Contents

+ Introduction
+ Most Important Off-Site SEO Factors
+ Off-Site Issues that Impact SEO Rankings
+ Critical Off-Site Content & Link Building Considerations for Search Rankings
+ Off-Site Strategies to Improve Conversion Rates
+ Conclusion

Introduction

Here’s the thing: no one really knows the exact formula Google uses for ranking websites, and no one knows exactly how sites are ranked for various keywords. Google has set out some guidelines to live by, and through trial and error, top SEOs have uncovered a number of key factors for optimizing a site for search.

Most Important Off-Site SEO Factors

Let’s analyze six of the most important off-site SEO factors you should always pay attention to, based on the rules by which Google — and other search engines — want us to play the SEO game.

Creating backlinks
Backlinks are one of the foundations of SEO. Backlinks are outside links that point back to your site. In other words, they are links from other people’s websites. People choose to link back to your content because they have found it to be relevant and useful.

However, you can also create backlinks yourself by posting content that links back to your site via social media profiles and directories.

These days, one of the best and most powerful ways to create backlinks is to do guest posting on other authoritative and high-PR blogs.

Also, if you want to attract tons of high-quality backlinks to your site, it inevitably comes back to this: You have to create interesting, timely, and relevant content that people active in your niche would want to link to, and you have to do it on a consistent and continual basis.

That is the ultimate guarantee that links to your site will be created naturally.

Quality, quality, and quality
Earning links from sites with higher PageRank than yours will help lift your own PageRank.

However, you need to be careful when picking sites from which to link back. Before Google did a major overhaul on their algorithm, quantity seemed to be the dominating factor behind successful off-site SEO strategy. Today, quality trumps quantity.
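To see why a single link from a strong site can outweigh many weak ones, here’s a toy Python implementation of the classic published PageRank formula (a damped power iteration). This is the original academic formulation, not Google’s current algorithm, and the site names in the graph are made up for illustration.

```python
# Toy PageRank power iteration (the classic published formula, not Google's
# current algorithm): a link from a well-linked page passes more value than
# a link from an obscure one.
DAMPING = 0.85

def pagerank(links: dict, iterations: int = 50) -> dict:
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    pr = {p: 1 / len(pages) for p in pages}
    for _ in range(iterations):
        nxt = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, outs in links.items():
            for target in outs:
                nxt[target] += DAMPING * pr[page] / len(outs)
        pr = nxt
    return pr

graph = {
    "big_site": ["you"],       # your link from a well-linked site
    "tiny_site": ["you"],      # your link from an obscure site
    "fan1": ["big_site"], "fan2": ["big_site"], "fan3": ["big_site"],
    "you": ["big_site"],
}
for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")   # big_site's link contributes far more
```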

Quality backlinking calls for links from highly relevant, high-authority sites. You want to earn links from sites with a good reputation, ideally ones that relate to your niche.

Relevance
The more relevant the site you are linking from, the better. There’s little advantage in creating backlinks from home improvement sites if you work in the gadgets and electronics industry.

Diversify
Don’t just link from one source or one type of site. Link back from as many different kinds as possible. Create links from blogs, industry directories, article directories, forums, and social media properties.

The more you diversify, the more you’ll attract traffic from a variety of sources. This will increase your chances of winning a favorable ranking from Google.

Pace naturally
Creating several hundred links to a new site within days is a recipe for disaster. It raises red flags and runs the risk of being deemed unnatural. Keep the process natural by building several links at a regular pace; say, two to five per business day.
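As a rough illustration of that pacing, here’s a minimal Python sketch that spreads a backlog of planned links across business days. The two-to-five figure is the guideline above, not a hard rule, and the URLs are placeholders.

```python
# Minimal sketch: spread planned links over business days at a steady pace.
from datetime import date, timedelta

def schedule_links(links, per_day=3, start=None):
    """Yield (date, link) pairs at a steady pace, skipping weekends."""
    day = start or date.today()
    for i, link in enumerate(links):
        while day.weekday() >= 5:      # 5 = Saturday, 6 = Sunday
            day += timedelta(days=1)
        yield day, link
        if (i + 1) % per_day == 0:     # move on once the daily quota is met
            day += timedelta(days=1)

plan = [f"https://example{i}.com/guest-post" for i in range(8)]  # placeholders
for when, url in schedule_links(plan, per_day=3):
    print(when, url)
```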

Anchor texts
Here’s where Google Penguin has lowered the boom on many thousands of sites. The norm used to be for SEOs to create tons of exact-match anchor texts.

These days, Penguin wants to see variation, or you will pay a huge price and see a significant drop in rankings and traffic.

You can still use exact-match anchor texts, but keep them to a minimum. Use related terms for variations.

Off-site Issues that Can Affect Rankings

Almost every strategy under the SEO umbrella can be categorized as “onsite” or “offsite.” Onsite refers to all the site structuring, basic setup, and ongoing work you do on your domain, while offsite refers to anything that happens away from that domain. Strategies like guest posting, link building, and social media marketing all fall into the offsite category, and are critical if you want to rank for any cluster of keywords.

Depending on the size of your site and on how many people have access to it, odds are your onsite structure and content aren’t going to change frequently. Occasionally, you should run an onsite audit to ensure no new pages have gone untitled or no duplicate pages have been indexed, but unless there’s a serious performance issue with your site, it’s unlikely that an onsite hiccup can cause your rankings to fall. If you see unexpected volatility and your onsite SEO is in order, the only reasonable possibility is that something has gone wrong offsite.

There are five common offsite SEO hiccups that can interfere with your rankings, but fortunately, all of them have relatively easy fixes:

1. Low-quality source links.

If you’re experienced in SEO, you know the deal: offsite links are necessary for building authority, and building them on low-quality sources is easy, but those links can actively damage your reputation depending on the source. A rogue link pointing to your domain from a scam site or a virtually unknown publisher could drive your domain authority down and prevent you from gaining any positive momentum.

There are a few ways links like these could pop up. They could be remnants from an older strategy, or links you forgot you built. They could be links built by someone else on your team without your knowledge. They might have even been built without your company’s consent. In any case, you can find them using a link search tool like Moz’s Open Site Explorer, and usually get them taken down with a simple request to the webmaster in question.
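If you’d rather audit in bulk, here’s a hedged Python sketch that filters an exported backlink list for likely low-quality sources. The file name, column headers, and authority threshold are all assumptions; match them to whatever your link tool actually exports.

```python
# Hedged sketch: flag likely low-quality sources in a backlink export.
# Column names ("Source URL", "Domain Authority") and the threshold are
# assumptions -- adjust them to your tool's actual export format.
import csv

SUSPECT_DA = 10  # arbitrary illustrative threshold, not an official cutoff

with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        da = float(row.get("Domain Authority", 0) or 0)
        if da < SUSPECT_DA:
            print(f"Review for removal request: {row['Source URL']} (DA {da:.0f})")
```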

2. Heavy-handed or spammy links.

Just because your link is on a medium- to high-authority source doesn’t mean you’re out of the woods. Google’s Penguin update evaluates whether links appear natural or unnatural, and can penalize those that appear to have been built for the sole purpose of manipulating rank. For example, if your link appears randomly in an online forum thread about a topic completely unrelated to your industry, it could register as spam. If your link is embedded in keyword-dense anchor text with no legitimate purpose, it could register as spam.

These links are a little harder to detect, so you’ll have to be dutiful in your scan. Again, a search tool is useful here, but you’ll have to dig a little deeper and use your best judgment if you want to estimate the perceived “naturalness” of the link in question.

3. Excessive link exchanges with one source.

You might have posted heavily on your first guest posting opportunity, rather than seeking out newer sources. You might have two separate sites and link between them to boost each other’s rankings. Whatever the case, if you have too many links pointing to your site from one source, it can make Google think you’re trying to game the system. You can use a search tool to evaluate this, but chances are, you won’t need one. If you’re engaging in a link exchange like this, even an innocent one, you’ll need to supplement it with other outside sources and more nofollow links.

4. Inconsistent NAP entries in local citations.

This is a specific problem for local SEO, but no matter what your goals are, it’s worth fixing. Your NAP information refers to your company’s name, address, and phone number—the information Google thinks is most important to searchers. If this information exists on third party directories and review sites, but is inconsistent with the NAP info you have on your site, you could miss out on achieving a local rank. Work with these directories to ensure that all your information is accurate, complete, and up-to-date, and reach out to new directories to prevent the problem from recurring.
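Checking NAP consistency by hand gets tedious across dozens of directories. Here’s a minimal Python sketch that normalizes entries before comparing them, so formatting differences don’t hide (or masquerade as) real inconsistencies. The normalization rules and the sample business are illustrative only.

```python
# Minimal sketch: normalize NAP (name, address, phone) entries so that
# superficial formatting differences don't hide real inconsistencies.
import re

def normalize_nap(name: str, address: str, phone: str) -> tuple:
    digits = re.sub(r"\D", "", phone)[-10:]          # keep the last 10 digits
    addr = address.lower().replace("street", "st").replace("suite", "ste")
    addr = re.sub(r"\s+", " ", addr).strip()
    return (name.strip().lower(), addr, digits)

on_site = normalize_nap("Acme Plumbing", "123 Main Street, Suite 4", "(555) 010-2030")
citation = normalize_nap("ACME Plumbing", "123 Main St, Ste 4", "555-010-2030")

print("Consistent!" if on_site == citation else f"Mismatch: {on_site} vs {citation}")
```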

5. Incomplete or nonexistent social media integration.

This one comes with a clarification—few actions on social media directly influence your domain authority or rank. However, Google does index social media content, and your presence on these external profiles could give it more accurate information to index on your company (especially for queries with rich answers). Similarly, neglecting to include social media icons on your site stifles users’ abilities to share your content, restricting the social signals your company can earn and reducing your potential reach. To prevent this, claim all your major social media profiles, fill out your information completely, and make it easy for users to share your onsite content.

When you see your rankings shift, don’t panic. It’s something that happens even to the best and most experienced search marketers in the world. The key is to track down the source of the problem quickly and resolve the issue with surgical precision. Even if you don’t see volatility in your rankings, it’s a good idea to occasionally check your link profile, local citations, and social media profile statuses, as it’s easy to overlook common issues or slip up and inadvertently damage your own reputation. Still, if you work quickly and remain vigilant, there’s no reason these small offsite hiccups should be anything more than a minor, temporary inconvenience.

Creating Content for Building Off-Site Backlinks

There’s one critical fact you must consider above all others: Google still relies on offsite links to evaluate domain authority. That means even though links are more rigidly evaluated, they’re still an important factor for your SEO campaign. Building links isn’t the problem; instead, it all comes down to how you build them.

If you build links with the sole intention of artificially increasing your rank, you’re going to get penalized. If, however, you focus on building links with quality offsite content, you’ll be able to reap the benefits of a large-scale offsite link building strategy without facing the risks. The question then becomes: how do you write high-quality offsite content for link building?

Why Offsite Content Is Different Than Onsite Content

There are two demands for onsite content. First, you must write in a way that’s pleasing to your users—your goal is to show off your brand voice, entertain or inform your users, and make them want to come back for more. Second, you must write in a way that informs search engines about the nature of your business, using keywords and topic choices that convey accurate ideas to its hyper-intuitive crawlers.

Offsite content has a different set of goals. You’ll want to make sure your content is valuable, but making a striking impression isn’t quite as important. In fact, you may want to write in a style other than your brand voice, depending on your goals. For example, if you’re looking to guest post and build your reputation offsite, you should focus on maintaining a consistent brand and quality. However, if you’re merely looking for a vessel to build links, you can spend less time on polish and settle for a solid, standard production.

Assuming you’re trying to write content solely as a vessel for link building, there are several qualities you’ll need to consider.

Length

Your link building content needs to be substantive, but not over-the-top. Anything less than 300 words isn’t worth writing because it barely registers as a full article. Anything longer than 1,000 words is too much effort. As for the ideal range between those two extremes, that’s up to you. What type of content are you writing? How detailed do you need to be? The answers to these questions should point you in the right direction.

Topic

Because your offsite articles aren’t going to be directly posted on your main site, you have much more flexibility with the range of topics you offer. You won’t have to adhere to a certain theme or follow any particular protocols. However, you will need to select topics that are at least peripherally related to your industry. The goal here is to ensure that Google reads and categorizes your content appropriately; otherwise, it could get mixed signals about the nature of your business and your keyword rankings could become unpredictable.

Structure

Like with any piece of content online, your offsite link building content should be structured in a way that’s inviting to a reader. Include subsections, headings, bullet points, and stylistic differences that make it easy to navigate the greater article. This will make your article seem more valuable, and stray readers might eventually wander to your site, giving you some bonus referral traffic in addition to your domain authority building strategy.

Link Presence

The number and type of links you include in the body of your article both affect how Google crawls and interprets your material. If you include too many links, it could register as spam. If you include too few, you could waste your effort. If you include too many of the same link across multiple articles, your domain authority could suffer.

Unfortunately, there’s no single rule that dictates the best link types to include. Your best bet is to diversify your strategy, using as many different links as you can and varying your link frequency from few links to many links. On the whole, one link per 300 words is a good rule of thumb, but you should still diversify regardless.
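Here’s that rule of thumb expressed as a quick Python check. The one-per-300-words figure comes straight from the guideline above and is a heuristic, not a published threshold.

```python
# Quick sketch of the "one link per 300 words" rule of thumb described above.
def link_density_ok(word_count: int, link_count: int, words_per_link: int = 300) -> bool:
    """Flag articles whose links outpace roughly one per 300 words."""
    return link_count <= max(1, word_count // words_per_link)

print(link_density_ok(900, 3))   # True  -- three links in 900 words
print(link_density_ok(400, 4))   # False -- four links in 400 words reads as spam
```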

Quality

Your content needs to be well-written, no matter what. Google’s search bots can detect the unnatural use of language, so they can tell if you’ve outsourced your article writing to cheap, low-quality providers. Double check your content for spelling, syntax, or grammatical errors, and make sure all your facts are both accurate and cited. Just because your content is offsite doesn’t mean that Google won’t dock you for the quality of your work.

Frequency

Generally, it’s unwise to post too many new links at one time. Spread your link building article publications out over the course of weeks or months, regularly and consistently posting to ensure an even build. The number of articles and links you can get away with building depends in part on the size of your organization; too many external links too soon for a new business might seem out of the ordinary, while that same number for a long-established major corporation might not trigger any red flags.

While a content strategy is usually seen as the onsite portion of your SEO campaign, it’s also critically important for the success of your offsite strategy. Once you’ve mastered the process of writing, publishing, and syndicating linked offsite articles, you’ll be able to easily and steadily increase your domain authority without interfering with your other efforts.

Off-Site Strategies to Improve Conversion Rates

There’s one factor that keeps conversion rates from climbing indefinitely, however: there’s a finite number of onsite changes you can make before you start running out of ideas or grasping at straws. Making your call-to-action (CTA) more prominent and making the conversion process easier will greatly increase your conversion rate, but once you’ve burned through all the standard best practices, all you’re left with are experimental changes like button colors and wording tweaks, which can only increase your conversion rate by small degrees.

My solution to this is to look outward, rather than inward. Instead of hunting down every last onsite strategy useful for increasing conversion rates, start looking offsite. In particular, there are three offsite strategies I’ve found to be useful in maximizing your onsite conversion rate:

1. Work on Your Reputation.

The first strategy might be the most obvious, but it’s one that often goes neglected by busy business owners focusing on bottom-line revenues. Your brand’s online reputation is hard to accurately or objectively measure; you might use your number of social followers or your search ranking to get an indirect idea of how well you’re faring against the competition, but brand recognition and brand trust are more subjective factors, not tied to any one metric.

The more users recognize and trust your brand, the more likely they’ll be to convert. Whether you’re selling a funny T-shirt or enticing email subscribers with a free eBook, if a user encounters your CTA and thinks “oh, I know these guys!” he or she will be far more likely to pull the trigger. The way to build this trust and name recognition is through ongoing relationship management offsite.

There are a few ways to do this. First, work on making your social media profiles more prominent and more active, and don’t be afraid to reach out to new people (especially influencers). The more active you are on social media, the more people will learn to recognize you. Second, get your material published in higher circles. If you’re already published in local news outlets or niche industry forums, strive for something more national and visible to the average consumer. The more publication outlets you have under your belt, the more your name will come up (and the more trustworthy your brand will seem when it comes time to make a purchasing decision).

2. Pre-Qualify Your Leads.

Next, work on pre-qualifying your leads. This is the process of filtering out any inbound traffic that doesn’t have any chance of buying from you in favor of traffic that does. As you might suspect, there are a number of ways to do this, and most of them start offsite. One caveat to this: as you start filtering out irrelevant traffic, you’ll notice your traffic figures start to decline. As long as your conversion rates correspondingly increase, it shouldn’t concern you.

One of the easiest ways to pre-qualify leads is through highly targeted content. Whether you’re distributing your material through multiple external publishers or just syndicating your stuff on social media, put an extra emphasis on content that can only appeal to people late in the buying cycle, or those actually willing to buy from you. For example, if you sell bike tires, an article like “What’s the best bike tire for mountain bikes?” is much more targeted to interested buyers than one like “How to prepare a mountain bike for spring.”

You can also pre-qualify leads by segmenting your audiences on social media. On LinkedIn, this could mean getting involved with specific Groups more than others. On Twitter, this could mean creating custom lists based on your follower demographics. On Facebook, this could mean utilizing geo-targeting. How you pre-qualify your leads is up to you; what’s important is the increased relevance of your inbound traffic.

3. Optimize Your Calls for Traffic.

This is a strategy related to point two, since it involves increasing the relevance of your inbound audience. But rather than filtering out uninterested segments of your target audience, this strategy is all about increasing the trust and interest level of your existing followers.

Post a diversity of different calls for traffic, including discount offers, different types of content, sales, and links to internal pages, then analyze the behavior patterns of traffic coming to your site from each type of post. Think of this as an A/B test that occurs before your users are ever exposed to a CTA, with your CTA being consistent in both rounds. Eventually, you should notice a pattern of more users converting after coming to your site from specific types of posted content. Increase the prevalence of this type of content on your social circuits, and you should see a correspondingly higher conversion rate.
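As a minimal sketch of that analysis, here’s some Python that tallies conversion rates by the type of post that referred each visit. The data shape and the sample rows are assumptions; adapt them to whatever your analytics tool actually exports.

```python
# Minimal sketch: compare conversion rates by the type of post that
# referred the visit. The (post_type, converted) rows are hypothetical.
from collections import defaultdict

visits = [
    ("discount_offer", True), ("discount_offer", False),
    ("blog_link", False), ("blog_link", False),
    ("tutorial", True), ("tutorial", True), ("tutorial", False),
]

totals, wins = defaultdict(int), defaultdict(int)
for post_type, converted in visits:
    totals[post_type] += 1
    wins[post_type] += converted          # True counts as 1, False as 0

for post_type in totals:
    rate = wins[post_type] / totals[post_type]
    print(f"{post_type}: {rate:.0%} ({wins[post_type]}/{totals[post_type]})")
```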

Conclusion

These offsite strategies, when working in conjunction with proper onsite conversion optimization, can take your conversion rate to new heights. As with any conversion optimization strategy, data is your best friend here, so try out these strategies independently against a control group before you make a final determination of what’s effective and what’s not. Eventually, you’ll find the right combination of tactics for your business to maximize its lead and sales pipeline.


Table of Contents

+ Introduction to Mobile SEO
+ Why Mobile Friendly Isn’t Enough for a Great Mobile Experience
+ Differences Between Mobile & Desktop SEO
+ You’re Losing Money if You’re Not Mobile Optimized
+ Why Flash Mobile Sites are Bad for SEO
+ Biggest Myths in Mobile SEO
+ WordPress Plugins for a Mobile Friendly Website
+ How to Optimize a Site for Mobile Devices
+ How to Increase Mobile Site Speed
+ Mobile App Optimization
+ Conclusion

Intro to Mobile SEO

There have been a number of misconceptions and half-truths circulating about mobile optimization, mostly as extremist responses to major announcements by tech companies like Google, and a panic that’s set in thanks to the rising trends of mobile use in most demographic segments. Fortunately, once you understand it, mobile optimization is relatively simple, and your site might already be in the clear. Still, there’s an ongoing component to mobile optimization—striving for a perfection that can never be reached—so there’s always more to learn about the process.

What is mobile optimization?

Here’s the simplest definition of mobile optimization you’re liable to find on the Internet: mobile optimization is changing your site to be as usable and convenient as possible for users on mobile devices. Ten years ago, mobile devices didn’t exist (or at least, weren’t popular), so most sites were designed specifically for desktop screens. Mobile screens, like those on smartphones, offer a handful of unique elements that desktop-designed sites can’t address:

  • Smaller screen sizes make it harder to view full-size pages, especially when it comes to viewing images and reading text.
  • Finger-based interactions make small, precision buttons on desktop sites hard to manage.
  • The diversity of devices available makes it hard to present an all-in-one solution.
  • Mobile browser compatibility is not universal, and not all types of code show up for all browsers.

Mobile optimization strives to fix all these problems.

Why optimize for mobile?

You may be asking yourself what the benefits of mobile optimization are. After all, a good chunk of your user base is still accessing your site through desktop devices, and even those who aren’t can get most of the same experiences even on the un-optimized version of your site, right?

Consider these benefits of mobile optimization before neglecting the strategy altogether:

  • SEO. Google (and other search engines) are staunch supporters of “ideal” mobile experiences. They want every site online to be “mobile friendly,” and they’re taking action to make it happen by penalizing sites that aren’t optimized for mobile and rewarding sites that are. Just by optimizing your site for mobile, you’ll earn higher positions in Google search results, resulting in more traffic to your site. In addition, you’ll earn a little badge next to your site’s name, telling users that your site is, indeed, mobile-friendly:

(Image Source: Google)

  • User experience. Some users are going to access your site through desktop, but the impressions mobile users get from a site are substantial. If a first-time visitor on a mobile device sees your content not loading properly or has a poor experience, he/she may not come back. Even loyal customers who don’t have a great mobile experience could leave you in favor of a competitor who can offer such an experience. Both your customer satisfaction and your brand reputation are on the line here.
  • Rising importance. These benefits are fantastic today, but what you really have to consider is their future value. Mobile devices and mobile web browsing are poised to surge dramatically over the course of the next several years. The longer you wait, the more benefits you’ll miss out on, and the worse position you’ll be in for the coming years.

Let’s take a look at the factors shaping mobile user experiences, and how they relate to mobile optimization overall.

The Mobile Landscape

We’re in the middle of an era that revolves around mobile experiences, and it’s not going away anytime soon.

Rising trends in mobile use

It was May of 2015 when Google announced that mobile searches had overtaken desktop searches for the first time ever. Now, we’re on an ever-accelerating upward trajectory, with mobile use still growing and desktop use starting to look more and more obsolete.

(Image Source: SmartInsights/ComScore)

Why the steep growth? Mobile Internet access used to be nothing more than a novelty, to be used in rare circumstances by a fraction of the population. Coverage was limited, speeds were egregiously slow, devices were clumsy, and smartphones were only in the hands of the super tech-savvy. But slowly, tech giants have favored mobile use with innovative features like better touchscreens, voice-activated search, faster Internet, and better geographic positioning. Collectively, these improvements have led more users to rely on mobile devices, which in turn has prompted more tech companies to invest in mobile technology. It’s a self-perpetuating and exponential cycle with no end in sight.

Google’s response

Google is one of these forerunners of mobile technology, and they’re one of the biggest influencers of this steep rising trend in mobile use. The company unveiled its Voice Search product back in 2002, and local search started developing even before that, but they’ve been two major areas of development in the past decade. Voice search has become more intuitive, local search has been integrated with mobile, and most importantly, Google started giving ranking advantages to sites that ranked well on mobile devices. For a while, this was somewhat informal and unspoken, but back in April of last year, it took a massive leap forward.

Mobilegeddon v. 1.0

Announcing the update nearly two months in advance, Google proactively warned webmasters that on April 21, it would be launching a massive update to reward sites that had been properly optimized for mobile and penalize those that had not. This was a rare move for the company, as most of its search algorithm updates came as undocumented, unannounced surprises that the rest of us optimizers had to scramble to try and crack. Now, Google heads were telling us exactly what to expect—more or less.

The search community went on a rampage, dubbing the coming update “mobilegeddon” and using it as an opportunity to wrangle up business from webmasters who hadn’t yet updated their sites for mobile devices or didn’t know how to go about it. Some insisted that this reaction was overblown, and to a degree it was, but the impact of “mobilegeddon” was still significant.

(Image Source: SearchEngineLand)

It’s not impossible for non-mobile-friendly sites to rank today, and desktop searches weren’t hit as hard as mobile searches, but it’s still a significant difference to note. Without a mobile-friendly site, your SEO potential is seriously compromised—and that update is here to stay.

Why Mobile Friendly Isn’t Enough for a Great Mobile Experience

When Google released the so-called “Mobilegeddon” update, it penalized any site that didn’t appear properly on mobile devices. Though the update was made out to be more significant than it actually was, millions of business owners breathed a sigh of relief when they realized their sites weren’t hit.

But unfortunately, that sigh of relief has led to a degree of complacency. Because they survived the Mobilegeddon update, they believe their site to offer a great mobile user experience. However, this is a product of a misconception: that being “mobile friendly” in Google’s eyes means you’ve essentially won the battle for mobile supremacy. Unfortunately, this correlation doesn’t exist.

How Google Defines Mobile Friendly

People feared Mobilegeddon as some massive, unpredictable, and subjective algorithm with the potential to crush their sites. But Google was open and honest about its intentions and standards from the moment it announced the update, nearly two months before its rollout.

There are only a handful of standards that Google holds for mobile sites:

  • Loading and display. The page should be accessible to mobile devices and shouldn’t rely on technologies that aren’t compatible with them, like Flash. Horizontal scrolling is not acceptable.
  • Content availability. Words should be readable without zooming and images should load properly.
  • Touchable links. Links should be spaced so that users can tap them easily.

And if you’re ever in doubt, Google offers a convenient testing tool you can use to determine whether any page on your site is or isn’t mobile friendly.
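If you’d rather script that check, Google has exposed a Mobile-Friendly Test endpoint through its Search Console API family. The sketch below is hedged: the endpoint path, request shape, and response field reflect my best understanding, and the API key is a placeholder, so verify everything against Google’s current documentation before relying on it.

```python
# Hedged sketch: the endpoint name, request shape, and response field are
# assumptions based on Google's Search Console API family -- confirm them
# against current documentation before use.
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # hypothetical placeholder
ENDPOINT = (
    "https://searchconsole.googleapis.com/v1/"
    f"urlTestingTools/mobileFriendlyTest:run?key={API_KEY}"
)

body = json.dumps({"url": "https://www.example.com/"}).encode("utf-8")
req = urllib.request.Request(
    ENDPOINT, data=body, headers={"Content-Type": "application/json"}
)
with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

print(result.get("mobileFriendliness"))  # e.g. "MOBILE_FRIENDLY"
```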

All this seems relatively simple and straightforward, but it isn’t all it takes to give your users a great experience.

Site Speed

Mobile Internet speeds are generally slower than wired connections, and users are as impatient as ever. If your site takes more than a few seconds to load on a mobile device, a user is far more likely to bounce and never come back. It’s possible to have an aggravating site loading speed but still pass Google’s mobile friendly test, so if you want to offer the best possible mobile experience, you need to reduce those loading times by optimizing your site and eliminating unnecessary material.
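For a first-pass measurement, here’s a rough Python sketch that times a page fetch. It approximates time to first byte and total download only; a real mobile device adds rendering, script execution, and image decoding on top, so treat the numbers as a floor. The URL is a placeholder.

```python
# Rough sketch: time a page fetch to approximate what a mobile visitor
# waits through. Measures server response and transfer only -- rendering,
# scripts, and images add more on a real device.
import time
import urllib.request

URL = "https://www.example.com/"  # placeholder

start = time.perf_counter()
with urllib.request.urlopen(URL) as resp:
    first_byte = time.perf_counter() - start   # crude time-to-first-byte
    resp.read()                                # pull down the full body
total = time.perf_counter() - start

print(f"TTFB ~{first_byte:.2f}s, full download ~{total:.2f}s")
```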

Aesthetic Appeal

“Aesthetic appeal” is a subjective feature, but it’s also an important one. Good design features of a desktop site don’t always apply to a mobile site. For example, a “stacked” vertical layout suits mobile devices far better than it does a desktop display. Responsive sites add convenience for developers, but can also result in awkward-looking pages on a mobile display. Your main navigation can also appear strange if not designed and implemented properly. There are very few objectively “right” or “wrong” ways to design your site, so put it to the test with user experience trials to see what people actually think.

Multimedia Content

Removing all the images and videos on your site is not the best solution to reducing site loading times. People still love seeing visual content, and it’s going to help take your site to the next level. Being judicious about the format and placement of these pieces of content is vital to ensuring the best possible experience.

Full Functionality

Mobile users should have the “full” experience that your desktop users have. For example, if your desktop site has an e-commerce platform and a checkout process, your mobile users should be able to access it. Adding those extra functions can be tricky if you’re hosting separate versions of your site, but it’s necessary if you want to create a seamless experience.

Humans or Search Engines?

Making your site mobile friendly will stop you from getting penalized or ranked harshly in Google’s search, but as I’ve demonstrated, that isn’t enough to stay competitive in the mobile arena. But is optimizing your mobile user experience going to help your rankings or is it all about making your visitors happy?

The short answer is, improving the user experience can increase your rankings peripherally. You’ll have lower bounce rates, people will recognize your brand more, they’ll want to link to you more, and you’ll see a host of other benefits, all of which can incidentally increase your rank. As far as we know, there are no direct “bonus points” you can win from Google by having a “mobile friendlier” site as opposed to an average “mobile friendly” one.

That being said, your users are why you exist. Making them happy should always be your primary objective. Forget for a moment that mobile usability is a ranking factor in Google at all—if you could make your users happier by making a simple tweak to your site, wouldn’t you still?

It’s in your best interest to offer your mobile users the best possible experience you can. As new technologies like the Apple Watch and Google Glass start to herald an era of wearable technology, an adaptive mobile experience will become even more important. Figure out what your users need to be happy, and do your best to give it to them.

Differences Between Mobile & Desktop SEO

Mobile SEO has been in the spotlight for a few years now, as mobile traffic has risen dramatically to overtake desktop traffic in total volume. Google has added fuel to the fire by making mobile SEO an absolute must, launching its “Mobilegeddon” update earlier this year to penalize any webmasters who haven’t taken the time to optimize for mobile. Shortly after, Google’s John Mueller released a statement that to rank successfully in Google, a desktop site isn’t really necessary as long as you have a good mobile site—so what’s the deal? Are mobile and desktop SEO really that different, and if so, do you need both to succeed in SEO?

Why the Terms Are Differentiated

Why is there a “mobile” and “desktop” SEO in the first place? Why isn’t there just standard “SEO?”

In SEO, the terms “mobile” and “desktop” can actually apply in two different contexts. The one that people usually consider is a traditional desktop site versus a traditional mobile site—meaning that “desktop” and “mobile” refer to two kinds of sites, one of which is intended to function on desktop machines, and one of which is intended for mobile devices. This gets complicated because when these terms started appearing, most mobile sites were separately hosted versions of desktop sites (or else were found on subdomains via redirects). Now that responsive sites, which function seamlessly on both desktop and mobile devices, have emerged, this terminology gets fuzzy; a responsive site is typically considered a mobile site, merely because it performs well on mobile.

In this context, “mobile” is important because it refers to a site’s ability to be easily loaded and viewed on a mobile device. “Mobile-optimized” and “mobile compatible” are often used here as well. Because desktop sites were the norm for so long, it is assumed that all new sites are automatically “optimized for desktop,” and because mobile devices are smaller and more finicky than desktop devices, even new “mobile sites” (responsive or otherwise) rarely have trouble loading properly on desktop.

The second context for “mobile” and “desktop” is more specific to SEO itself—Google actually produces separate results based on whether a user is performing a search on desktop or a mobile device. A few years ago, this meant your mobile searches were far more likely to fetch results that were mobile-friendly, and you might see different layouts for your destination SERP. Today, thanks to Mobilegeddon and gradual aesthetic tweaks from Google, mobile and desktop results are pretty similar. Being optimized for mobile can actually help your rank even in desktop results.

With that explanation out of the way, we can start digging into what could qualify as “mobile SEO” versus “desktop SEO.”

General Best Practices

When it comes to general best practices for SEO, mobile and desktop SEO are practically identical. Your rankings in both types of SERPs depend on your domain authority, onsite content, availability and functionality for mobile sites, security, site speed, inbound links, social integration and shares, and an appropriate technical structure. With these in place, along with an ongoing content and audience development strategy, ranking in mobile and desktop results should be more or less the same.

A Single Site

As I mentioned above, the term “mobile SEO” only came about because desktop sites were so dominant, and because mobile sites used to be hosted or developed separately. Now that responsive sites have offered an all-in-one solution, there’s no reason why a modern webmaster would be concerned with separating the terms. All sites should be accessible on both types of devices no matter what kind of SEO strategy you want to follow. In this context, there isn’t a major differentiator between “desktop” and “mobile” SEO—even if desktop rankings are all you’re after, you still need a mobile-optimized site.

Unifying SERPs

Mobile and desktop SERPs are different, with different layouts and ranking structures, but they’re gradually growing to become more similar. Take, for instance, the new local 3-pack on desktop results, which emulates the traditionally mobile local 3-pack. Results are also becoming more similar as the months go on, reducing the need to differentiate the terms.

Mobile-Specific Strategies

Despite all the similarities and shared space that desktop and mobile SEO offer, there are still some mobile-specific optimization strategies that only help your mobile visibility:

  • Voice search. Mobile users are beginning to rely on voice search more than typed search, so long-tail and conversational keyword phrases are more important for mobile visibility than desktop.
  • Map optimization. Most people use Google and Apple maps exclusively on mobile devices. Ensuring that your information is available and accurate is crucial to gain more mobile visibility.
  • Local listings. Local results exist in desktop searches too, but they’re far more prominent and important in the mobile world.

Even so, adopting these strategies won’t offer any miraculous turning points for your strategy by themselves. They’re only effective if you already have a functional, responsive website (which works for both desktop and mobile optimization), and you’re adhering to all other SEO best practices across the board.
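
On the map and local-listing points above, much of the groundwork is simply making your name, address, and phone number machine-readable. One common approach is schema.org LocalBusiness markup in JSON-LD; here’s a minimal sketch in Python, with placeholder business details.

```python
# Build schema.org LocalBusiness markup as a JSON-LD snippet. Every business
# detail below is a placeholder -- substitute your own before publishing.
import json

business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Hardware Store",
    "telephone": "+1-555-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "WA",
        "postalCode": "98100",
    },
    "openingHours": "Mo-Sa 09:00-18:00",
}

# Paste the printed block into the page's <head>.
print('<script type="application/ld+json">')
print(json.dumps(business, indent=2))
print("</script>")
```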

You’re Losing Money if You’re Not Mobile Optimized

Roughly ¾ of mobile searches result in a conversion or other follow-up action

Google and the respected Nielsen group recently released findings on a massive mobile search study: If someone is searching for your service or product, he or she is most likely to purchase, call, visit your store, etc., within ONE HOUR.

About 17% of mobile searches ended in the user visiting the store or making a purchase, and another 25% of users visited the store’s site to get more information.

Pretty amazing, right? But guess what? If they search for what you have to offer and visit you, let’s just hope they land on a mobile-friendly version of your site, or they’re probably just going to click out of there and go to a competitor’s site that IS mobile-friendly.

When they find your site, the rest is all on you. It’s crucial that:
  • Your site is mobile-friendly
  • They can easily find whatever they’re looking for
  • It is also easy for visitors to take action: tap to call, make a purchase, etc.

I know it’s like beating you over the head with this, but again: You have to have a good, clear, mobile-friendly site that gives people what they were looking for.


Backlinks are a vital part of any SEO strategy because they serve as a roadmap for search engines. The more quality, authoritative links you have pointing back to your site, the more “authority” your site is deemed to have, and the more likely it is that you’ll rank for a given keyword or keyword phrase.

Over the years, this has become less of a mathematical problem and more of a relationship-based one; Google once calculated a site’s authority based on the sheer number of links, but due to spam and aggressive rank manipulation, it now only favors natural, relevant links.

What does that mean for you, the link builder?

It means you need to be careful where and how you post links. If you’re suspected of spamming unnatural links for the purposes of increasing your page rank, you could earn yourself a Google penalty. Since it’s still important to build links if you’re going to succeed in your search engine marketing campaign, you have to recognize which sources of links are valuable and which ones are dangerous.

For starters, make sure you avoid these 11 risky types of backlink sources and link building companies:

1. Article Directories.

Article directories used to be a big deal in the world of SEO. They were cheap, easy, useful ways to syndicate one article and generate hundreds of links pointing back to your site from other directory participants. Too many people took advantage of this, syndicating poorly written, keyword-stuffed fluff, and Google took notice.

Since then, Google’s algorithm has penalized backlinks that have been generated from low-quality article directories, and has scouted for any duplicated content on the web that could be evidence of such a practice. There’s nothing inherently wrong with guest posting, but avoid duplicating your content through article directories if you don’t want to earn yourself a penalty.

2. Link Farms.

Link farms, as their name suggests, are low-quality resources that exist solely to host links for other sites. They serve no purpose other than to increase traffic and page rank for those sites. This may sound almost benevolent, but since they exist only to manipulate page ranks, they’ve earned a smackdown from Google. You’ll likely never see a link farm on the first page of Google again, and for good reason—they do nothing to improve the online user experience or provide any meaningful information. If you happen to find one, don’t even think about using it to post links to your own site.

3. Exchangers.

First, let me say that relationship building is great in an online context. Finding relevant partners with whom you can share content is a great way to mutually build authority and share an audience. However, if you excessively exchange links with the same source, you could get some serious negative attention from Google. Diversity counts, so if too much of your backlink profile is dependent on one or a handful of other sources, it could look bad for all of you. Feel free to post on each other’s blogs, but don’t exchange a series of links with another source unless it’s only a small part of your overall strategy.

4. Link Wheels.

Link wheels are essentially the pyramid schemes of link building, and just like pyramid schemes, they are a waste of time and money. Link wheels are artificial means of passing link juice through a series of links that point to each other. To put it simply, link wheels attempt to fool search engines by building a pathway that passes authority from site to site. Like most linking schemes, on paper it seems like a good idea and it used to work marvelously. But Google is sharp, and they’re wise to all these tricks. Link juicing in itself isn’t a terrible strategy, but trying to build a link wheel will work against you.

5. Conversation Plugs.

Trying to slip your link into an irrelevant conversation, or posting your link at the end of your comment regardless of the circumstances is an example of a bad backlink strategy. While forum and blog comments are ordinarily great resources for quality link building, if your message is intended solely to build your authority with a backlink, you’re in for trouble. Instead, focus on finding ways to incorporate your links naturally into the conversation. Seek out threads that give you an opportunity to present yourself as an authority. Then, make sure your link points back to a relevant page—and not the same one over and over.

6. The “Too-Good-To-Be-True” Builders.

If it sounds too good to be true, it probably is, and that rule applies to link building companies as much as anything else. Some link building companies will go out of their way to promise the world to you, guaranteeing an insane amount of links in a ridiculous timeframe, or making a promise of a specific rank by a certain date. Nothing is guaranteed in the SEO world, so if a link building company is making hefty promises, it’s best to avoid them entirely.

7. The Dirt-Cheap Builders.

Going along with the “too good to be true” theme, if your link building company is offering you insanely low rates for their service, consider it a red flag. As a consumer, you should do your research and shop around for the best deal, but cheap link building usually means bad link building. Your link building strategy is an investment. If you buy a used car for $100, you can expect that car to break down on you in a relatively short timeframe. Frugality doesn’t always pay off.

8. Irrelevant Directories.

There are some directories that are beneficial for link building. Highly focused, niche-based directories try to organize and build relationships between companies in the same industry, and relevant links you post there will count positively toward your own authority. However, posting links in a directory that has nothing to do with your industry is a negative practice that should be avoided at all costs. Google knows what type of company you are; if you’re a hardware store posting in a restaurant directory, you might as well ask for a penalty directly.

9. The “Click Here” Types.

If you see a link building company advertising with a flashing, poorly designed “CLICK HERE NOW!!!” style banner ad on a website, that probably isn’t a good company. If a company is willing to resort to such low-quality cheap tactics for their advertising, they’re probably willing to pull a similar stunt in their link building process. Instead, look for a link building company with a solid reputation and attention to detail.

10. The Link Building Exclusives.

If a company “specializes” in link building, it might be a bad sign. Odds are, their “specialists” are busy posting hundreds of links to all kinds of sites, with little regard for the quality or relevance of the links. Link building is just one piece of the SEO puzzle, and if your link building company is exclusively working in link building, there’s a high chance they don’t fully understand the scope of modern SEO and they won’t give you the results your business deserves.

11. Non-Newsworthy Press Releases.

Ordinarily, press releases are a great source of quality brand links. They’re examples of well-written content that showcase your brand, point to your site, and can be accepted by some of the highest-ranking news authorities in the country. However, if your press releases are not newsworthy, your strategy can do more harm than good. Posting too many press releases for the sake of posting press releases is considered a type of spam; you’ll have duplicate content all over the web, and a series of boring, fluff-filled updates pointing back to your site. It’s a bad way to build backlinks, and can damage the reputation of your company.

Unfortunately, posting in the right places isn’t enough to guarantee that you’ll avoid a penalty. The context and frequency of your links is also important. Even when posting to a reputable site of high authority, you need to ensure that your link is valuable to readers and relevant to the conversation. Also, be careful not to post too many links to the same place in the same location, or you could get negative attention for spamming. Use Nofollow links to mitigate these risks, and encourage brand mentions whenever possible.
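
One way to act on that nofollow advice is to audit the links you’ve already placed. Here’s a hedged sketch, assuming Python with requests and BeautifulSoup, that lists which outbound links on a page carry rel="nofollow" and which don’t.

```python
# List a page's outbound links and whether each carries rel="nofollow".
# A sketch only -- deciding which links *should* be nofollowed is still a
# judgment call, and the URL below is a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

def audit_outbound_links(page_url):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    own_host = urlparse(page_url).netloc

    for anchor in soup.find_all("a", href=True):
        host = urlparse(anchor["href"]).netloc
        if host and host != own_host:              # outbound links only
            rel = anchor.get("rel") or []          # bs4 returns rel as a list
            status = "nofollow" if "nofollow" in rel else "followed"
            print(f"{status:9} {anchor['href']}")

audit_outbound_links("https://www.example.com/blog/some-post/")
```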

The biggest takeaway here is that there is no guaranteed safe haven for posting backlinks. It’s good to avoid the notorious low-quality culprits in the list above, but you still need to make sure all your links are spaced out, diversified, and relevant in order to see the best results. Quality link building takes time and patience, and unfortunately, there are no shortcuts.


Table of Contents

+ Introduction – What Are Spam Backlinks
+ How Spam Backlinks Can Harm SEO Rankings
+ How to Use Moz Spam Analysis to Test Your Links
+ Other Spam Backlink Checkers
+ How to Manually Remove Spam Backlinks
+ Additional Spam Backlink Fixes
+ Conclusion

Introduction to Spammy Backlinks

Anyone who’s engaged in link building for SEO in the past few years can tell you the biggest—and most important—concern of the strategy: getting penalized for posting spammy links. The era of quantity-based link evaluation has gone away completely thanks to revisions of Google’s Penguin update. The search engine giant can now tell easily whether your link is built “naturally,” with the intention to increase value to web users, or “unnaturally,” with the sole intention of increasing your rank.

What are spam backlinks?

SPAM is as old as the Internet itself, so most people will know what it is, at least in a general way, just from hearing the word in context. Still, not everyone is so Internet savvy, which is why it’s important to define exactly what a SPAM backlink is. To be succinct, SPAM backlinks are like fleas on the hide of your digital puppy (for the analogically challenged, the puppy is your, or indeed any, website). They take without giving anything in return, and are a major nuisance to anyone plagued by them.

SPAM backlinks usually manifest in the form of comments to blog posts, replies to forum threads, and so forth. The offending SPAM – which, by the way, is just another way of saying “junk” – will contain a backlink. Backlinks are like breadcrumbs or a signpost pointing from one site to someplace else (usually the spammer’s own site or its affiliates). The goal of such things is usually monetary, but it can vary. For instance, SPAM backlinks may also be created on behalf of one’s own website without the knowledge or consent of the owners or administrators. Wherever they point to in the end, these SPAM links will hog site resources, damage your site’s credibility (no one likes to read machine-generated gibberish!), and generally cause a lot of unwelcome maintenance and clean-up headaches for those affected.

Worst of all, if your website gets a reputation with Google for allowing, promoting, generating, or otherwise engaging in SPAM-based activities, they may just decide to de-index your website, which is a fancy way of saying “Now you see it, now you don’t.” De-indexing can obliterate years of hard work in a millisecond.

Not all links are created equal, however. Some backlinks pass on positive ranking juice, others pass on negative ranking juice, and still others are ignored by Google altogether. Toxic backlinks are backlinks that harm a website’s search engine optimization (SEO), or its ability to rank well in a Google search. Paid links, links received from link schemes, link wheels and blog networks, and links from porn, gaming or payday loan sites are all considered toxic.

How Spam Backlinks Can Harm SEO Rankings

The fact that old spam backlinks harm rankings is old news, and a number of link types still being built today do the same. Old spam backlinks also drive visitors away from your website because of the spam associated with you. If you have these backlinks pointing at your site, find them and get rid of them right away; they are most definitely hurting your rankings and pushing users away from your website and, ultimately, your business. While you’re cleaning out these old backlinks, complete some further backlink maintenance as well. Google is paying closer attention than ever, because its reputation depends on its ability to give users the highest quality results possible; you want your website on the right side of its ranking algorithm.

Google and Inbound Links from Bad Sites

Some people claim that Google will not move your site down in the rankings due to links from bad sites, and articles have been written dismissing the danger as a “myth.” Recent publications from top SEO experts confirm that it is not a myth – inbound links from bad sites can hurt your ranking just as much as old spam backlinks. You will need to know what these links are, how to find them, and how to get rid of them if you want your site to keep ranking well on Google.

What actually happens is that a black hat SEO operator links to your site, and Google “awards” you a mark against your website. It happens, much to everyone’s chagrin. Google can penalize sites with bad links right away, no matter how popular the site has become on the search engines. Unfortunately, no one knows with confidence how or why it happens, or how long the penalty will stay attached to your site. While the situation is quite uncommon, it still happens. It’s wise to get rid of these backlinks as soon as you find them to avoid being penalized by Google and dealing with all the unknowns.

Updated Bad Links

Old spammy backlinks are bad enough, but building new backlinks the wrong way drives another nail into the coffin. Links stuffed with keywords to the point of unreadability, and sites linking out to irrelevant websites, are still common, especially as competition rises. This is exactly why Google changed its algorithms to make things harder for these culprits. Badly built new backlinks can become the next bane of your site even after you’ve eliminated all those annoying old spammy backlinks holding it back in the rankings.

These links damage your website consistently. As one entrepreneur recently found out, these links can hide and pull your ranking down before you realize what is going on. Sometimes this activity isn’t discovered until you complete a full audit of the site. The catch is that incoming links remain essential to Google’s algorithm, so how does Google now view links?

At its most basic, Google looks at how many links are pointing to a website and uses that to determine how well the site will rank on its search engine. If the world were a fair place, a site with a greater number of incoming links would rank higher on the search engines and life would proceed as usual. However, this isn’t a fair world, thanks to those black hat SEO pirates.

Thankfully, however, quality is rising above quantity, and web searchers no longer need to weed through bad, possibly even harmful, links just because one particular site happens to host a greater number of them. Those offshore SEO companies that linked everything to everyone are no longer valid in the eyes of Google.

These Google changes happened in 2011, when Google began to realize what was really happening. Bad links were continuously hurting websites, and people were beginning to wander away from Google to keep their computers safe from spam. Gone were the days when links were a dime a dozen; quality needed to matter if you wanted to climb Google’s rankings.

Unfortunately, some webmasters and companies didn’t keep up with the trend fast enough. When the switch happened, it happened fast. Some companies didn’t have time to update their sites, didn’t know the links were there, or simply ignored the inevitable: that Google was going to penalize them for these links. The results were dramatic.

The fallout was terrible. Many websites fell in the rankings quickly. A number of businesses got caught in the crossfire and were left dramatically low on the search engines, where they had once been top-notch in their business categories. The scramble to fix the problem was on for many websites and SEO companies. Clearly, bad backlinks, old spam backlinks, and black hat SEO can upset your website and make your rankings completely tank.

Why is it Important to Remove Toxic Backlinks?

Toxic backlinks reduce the Page Rank of the sites they link to, so website owners want as few of them as possible. If Google notices that a site has a fair number of toxic backlinks, it will likely reduce the site’s Page Rank. If Google notices that a site has a very large number of toxic backlinks, it may exclude the site from its index altogether.

How Do I Know if I Have Toxic Backlinks?

There are several ways you can find out if your site has toxic backlinks. Here are four common ways.

  •  You Know Because You Created Them

If you created the toxic backlinks yourself through suspicious web activity such as paying for links or content spamming, then you already know the links exist.

  •  You Receive a Warning Notification from Google

It is possible that you didn’t create the toxic backlinks yourself. They may have been created by a scammer or a questionable SEO company you hired in the past. If you are genuinely unaware of the toxic backlinks, you may not find out about them until you receive a warning message from Google.

  • You Notice a Sudden Drop in Traffic

If your organic traffic levels suddenly dropped, particularly around May 22, 2013 when Penguin 2.0 took effect, toxic backlinks are likely the cause.

  • You Pull Up a List of Your Backlinks to Check

If you are nervous about the latest Penguin 2.0 update like many other website owners are, you may want to pull up a list of your backlinks just to check for any suspicious links. You may or may not find any, but it never hurts to make sure.

Where Can I Find a List of My Website’s Backlinks?

Google recommends that the first place you should look for a list of your website’s backlinks is your site’s Google Webmaster Tools. To do this, log into your Google Webmaster Tools account and select Traffic, then Links to Your Site, then More. Here you will find a sampling of the backlinks to your site. If you want a more complete list, however, you will need to pay for a backlink checking service such as Ahrefs, SEOmoz, or Majestic SEO.

How Do I Determine Which Backlinks are Toxic?

Take a look at each of your links to see where they are coming from. You will want to keep links that come from high-quality sites and get rid of links that come from low-quality or spammy sites. If your links come from any of the following types of sites, they are most likely toxic links and need to be removed.

  • Obviously spammy sites, porn sites, payday loan sites or gambling sites
  • Sites that are not indexed by Google
  • Sites with a virus or malware warning
  • Sites that have no Domain Authority
  • Sites with very new domain names
  • Sites with very little traffic
  • Link networks
  • Sites with an unusually large number of external links
  • Irrelevant sites

You will also need to remove all paid links and some site-wide links, such as those that show up in website footers and blogrolls. Not all links that come from sites such as these are toxic, however. It is important that you go through each of your backlinks manually so that you can be sure to delete all the negative ones while maintaining all of the positive ones.
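
Once you’ve exported a backlink list (most checkers can produce a CSV), a script can make a first pass using heuristics like those above. The sketch below is only a starting point: the column names and thresholds are assumptions about your export, and everything it flags still needs the manual review just described.

```python
# First-pass triage of an exported backlink CSV against heuristics like the
# ones above. The column names ("source_url", "domain_authority",
# "external_links") are assumptions about your export format -- adjust them
# to match your tool, and manually review anything this flags.
import csv

SPAMMY_WORDS = ("casino", "payday", "poker", "porn")

def triage(path):
    suspects = []
    with open(path, newline="") as handle:
        for row in csv.DictReader(handle):
            url = row["source_url"].lower()
            reasons = []
            if any(word in url for word in SPAMMY_WORDS):
                reasons.append("spammy keyword in URL")
            if float(row.get("domain_authority") or 0) < 10:
                reasons.append("little or no domain authority")
            if int(row.get("external_links") or 0) > 1000:
                reasons.append("unusually many external links")
            if reasons:
                suspects.append((row["source_url"], reasons))
    return suspects

for url, reasons in triage("backlinks.csv"):
    print(url, "->", "; ".join(reasons))
```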

How to Use Moz Spam Analysis to Test Links

Up until this point, determining whether a link is “natural” or “unnatural” has never been an exact science; it’s mostly come down to a guessing game. If you choose a reputable source and post a link you genuinely think is helpful to the conversation, then in theory, it should be considered a high-quality link. Still, it’s easy to doubt yourself and worry about whether or not Google is picking up on your link building attempts and considering them to be unnatural.

Fortunately, Moz just released a new tool that might help put an end to those speculative worries. Operating under the Open Site Explorer tool you’ve probably used to map out your backlink profile in the past, the new “Spam Score” is designed to objectively measure how natural or unnatural your link appears.

How the System Works

After a few thorough rounds of research, Moz data scientist Dr. Matt Peters eventually boiled down the deterministic qualities of an unnatural link to 17 factors, which he called “spam flags.” The more of these spam flags a link has, the more likely it is to be penalized and the less authority it’s going to pass.

Spam Score, the name for Moz’s objective measurement, is a calculation of how many spam flags a subdomain shows. At this time, it does not function at a page level, nor does it function at an overall root domain level, but this shouldn’t stop you from gaining some key insights into whether or not your link has been posted on a high-quality site. You can find the Spam Analysis tab under Open Site Explorer—right now, it’s only available for subscribers, but you can sign up for a free trial to access the feature or wait until Moz inevitably rolls out the feature for free to all users.

Once you’ve selected a specific subdomain, the system will evaluate it based on those 17 spam flags, and tell you how many of those spam flags it is demonstrating. Between zero and four flags means the site is low risk, between five and seven flags means it is a medium risk, and eight or more flags means it is a high risk. The 17 flags are as follows:

  • Low MozTrust or MozRank score—this is a calculation of overall domain authority.
  • Low site link diversity—the types of links pointing out aren’t diversified, which seems unnatural.
  • Abnormal ratio of followed to nofollowed domains—unusually high or low ratios make Google suspicious.
  • Low-quality content—if the content is thin or low-quality, it signifies a low-quality site.
  • Excessive external links—too many links pointing out could mark the site as a link directory.
  • High ratios of anchor text—improper anchor text use triggers a red flag.
  • Lack of contact information—without a phone number, address, or email address, the site could register as spammy.
  • Top-level domain associated with spam—if a subdomain is linked to a low-quality TLD, the subdomain becomes low quality by extension.
  • Numeral-containing subdomain—numerals are a bad idea for inclusion in a URL.
  • Few inbound links—if the site is large but contains few inbound links, its authority is weakened.
  • Abnormal ratio of followed to nofollowed subdomains—the rule about domains applies to subdomains as well.
  • Few branded links—a lack of branded anchor text in inbound links triggers a red flag.
  • Minimal site markup—if there is too much text in comparison to HTML and JavaScript, it looks bad.
  • Few internal links—without internal links, the quality of a site comes into question.
  • External links in main navigation—hosting external links in a main navigation or sidebar makes a site appear less authoritative.
  • Few pages—the number of pages on a subdomain plays into how authoritative it is.
  • Excessive length—if a subdomain’s length is higher than average, it appears as a red flag.

Even if you don’t use Moz’s automated tool, you can use these 17 spam factors to evaluate whether you should post links on a particular domain.
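
If you’d rather keep score by hand, the bucketing itself is trivial to automate. Here’s a minimal sketch that maps a manually assessed flag count to the risk bands described above; the flag judgments themselves are still yours to make.

```python
# Tally manually assessed spam flags for a subdomain and map the count to
# the risk bands described above: 0-4 low, 5-7 medium, 8+ high. The flag
# assessments themselves still have to come from your own review.
def risk_band(flag_count):
    if flag_count <= 4:
        return "low"
    if flag_count <= 7:
        return "medium"
    return "high"

observed = [
    "lack of contact information",
    "excessive external links",
    "few pages",
]
print(f"{len(observed)} flags -> {risk_band(len(observed))} risk")
```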

Best Spam Backlink Checkers

To get the most out of your link profile, you need a top-flight link checker. Below is a comparison of three of the best.

SEO Spyglass

SEO Spyglass is a unique type of backlink checker in that it is part of a larger suite of applications. It is, however, quite an effective tool in and of itself. If you are looking for a backlink checker that is easy to use with a simple layout and intuitive structure, then this might be the application for you.

When it comes to getting rid of spam backlinks, the Spyglass application will download your backlinks from many sources around the Internet into a proprietary database. You will be able to access these links from the software directly and receive real-time data on the links that have been created.

Within each of the listings, you will be able to see the page ranks, the anchor text, and many other features of the links that point to your website. You can easily export these reports into PDF format for the benefit of clients or for internal use in your company.

You will also be able to see whether your website links back to these sources, and the total number of links that services such as Alexa credit your website with. With these tools, you can limit your exposure to different aggregation sites depending on the kind of search optimization you want for yourself.

The Spyglass application is available for a one-time fee, which can be reduced significantly if you purchase the entire suite of applications it belongs to. It is also one of the few top-flight backlink checkers with a free trial available online to the general public.

Ahrefs.com

The overall functionality of the Ahrefs.com backlink checker is second to none. Even on the most basic of functions, the data includes the number of links that your site has, the number of IPs that the links are coming from, which pages are being linked to, and the anchor text that is being used. This data can be viewed in a number of different formats depending on the characteristics that you feel are most important. The program also has its own proprietary ranking structure that has proven to be quite useful when used in conjunction with the ranking structures of the major search engines.

Data is presented with many different, completely customizable characteristics, including dofollow links, nofollow links, sitewide versus non-sitewide links, redirect links, links that include an image, links from .gov and .edu sites, and any other determining characteristic that you can think of or create. The reports also include errors, warnings, and notices so that you can easily locate and remove any links that are outdated or misdirected.

The data is also presented in a highly attractive visual format that is easy to interpret, even if you have not had a great deal of experience with backlinks for search engine optimization. You can view links using line charts or numeric charts, depending on your preference.

This particular tool is expensive; however, there is a free option that allows you to pull a limited number of reports and results. The data is so extensive that even the less expensive packages will provide plenty of results for small businesses. This is definitely an enterprise-level program that is scalable for growing companies. The pricing is also per month rather than a one-time fee.

Be sure that you have the money set aside in your budget every month, because once you begin using this tool, it can quickly become an integral part of your online marketing strategy. Losing it after relying on it for even a short period can be devastating to your online strategy.

Raven Tools

Raven Tools is the reason that the much more well-known link tracking tool SEOMoz is not on this top-three list. The two programs have a great deal in common; however, Raven Tools addresses many of the weaknesses that SEOMoz has failed to patch during its long tenure as one of the most well-known programs on the market.

If you are familiar with the SEOMoz layout, then you will take warmly to the Raven Tools landscape. Aside from being a simple back link profiler, Raven Tools is actually a complete suite of online marketing tools that can manage your online campaigns from beginning to end.

The functionality of the reports is much wider in scope than either of the other two entries in this article; however, it may simply be too much for a company that does not need a full search engine optimization suite. As a matter of fact, Raven Tools is much better suited to a company with its own in-house IT, as much of the data you receive is only useful if it can be acted upon in real time.

The reports are easily customizable according to the data that you feel is most important within your online campaign. You can receive reports on the number of links that you have online, the ranking that the linking sites have as well as the effect that these links are creating on your own Page Rank. The tools will allow you to easily disassociate yourself from any website that is pulling down your search engine rankings.

One of the best features of Raven Tools is that it has a full month trial that is completely free and fully functional. You will be able to determine if this solution is the best back link profiler for you without any hangups in the functionality.

The blog on the Raven Tools website is also one of the most informative resources available from any search engine optimization company. You will always be kept up to date on the latest happenings in technology and how you can apply them to your current Raven Tools suite in order to gain a leg up on your competition. Raven Tools is also one of the best companies when it comes to creating updates based on the needs of its client base, updates that are profiled in its blog.

How to Manually Remove Spam..

Duplicate content is bad. Using the same content, either in total or partial form, on your website leads to a poor user experience, and triggers a red flag in Google’s search algorithm. In the old days of SEO, duplicate content was often used as a cheap trick to get more keywords and more content on your website, so Google evolved a system to weed out the spammers who violated best practices by doing this. Today, if you’re caught using duplicate content, your domain authority could suffer and your keyword rankings could drop.

Fortunately, Google is pretty fair about the issue. The company understands that the majority of duplicate content issues don’t come about as a malicious attempt to cheaply increase rank. In actuality, most instances of duplicate content are accidents or are overlooked by webmasters. Still, having too much repeated content on your site can be damaging, and it’s in your best interest to run a test to see if there is any duplication on your site.

Table of Contents

+ Introduction – Duplicate Content Defined
+ How Google Penalizes Duplicate Content
+ Syndication: Duplicate Content Across Domains
+ Duplicate Content on the Same Domain
+ How to Find & Fix Duplicate Content
+ Conclusion

Introduction

Ever since I started getting my feet wet in SEO, this question has swirled around forums and blogs. Somewhere, someone out there perpetuated the idea that having the same content on page A of your Website as page B of your Website would cause your site to be penalized in search engine rankings. This idea began to percolate in the internet marketing community because a bunch of spammers realized that when they had a piece of content (i.e., an article) that was getting a lot of search traffic, they could fill up every page of their Website with the same content in order to pull even more traffic from the search engines. Obviously, the same article blatantly duplicated across hundreds of pages within a single domain is a malicious attempt to gain search engine traffic without actually adding any value. Google caught on pretty quickly to this method and fixed its algorithms to detect duplicate content and display only one version of it in the search rankings. Websites that engaged in this blatant activity were de-indexed and cried up a river across forums and blogs throughout the internet marketing community. Thus was born the fear of the “duplicate content penalty.”

However, in the vast majority of cases, duplicate content is non-malicious and simply a product of whichever CMS (content management system) the Website happens to be running on. For example, WordPress (the industry-standard CMS) automatically creates “Category” and “tag” pages which list all blog posts within certain categories or tags. This creates multiple URLs within the domain that contain the same content. For example, this particular post will be on the root domain (www.jaysondemers.com, while it remains on the first page), the “single post” version (which you can find by clicking the title of the blog), and in the “Categories” and “Tags” pages. So that means this particular post will be duplicated 4 times on this domain. But am I doing that intentionally in order to get more search engine traffic? No! It’s simply a product of the automatic, behind-the-scenes work that my CMS (WordPress) is doing.

Google knows this, and they are not going to penalize me for it. Millions of Websites are running on WordPress and have the exact same thing happening. But what if I were to take this particular post and re-post it 100 times in a row on my blog? That would definitely send red flags when Google’s crawler sees it, and one of two things will happen at that point.

1) Google may decide to let me off with a “warning” and simply choose not to index 99 of my 100 duplicate posts, but keep one of them indexed. NOTE: This doesn’t mean my Website’s search rankings would be affected in any way.

2) Google may decide it’s such a blatant attempt at gaming the system that it completely de-indexes my entire Website from all search results. This means that, even if you searched directly for “jaysondemers.com” Google would find no results.

So, one of those two scenarios is guaranteed to happen. Which one it is depends on how egregious Google determines your blunder to be. In Google’s own words:

Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results. If your site suffers from duplicate content issues, and you don’t follow the advice listed above, we do a good job of choosing a version of the content to show in our search results.

This type of non-malicious duplication is fairly common, especially since many CMSs don’t handle this well by default. So when people say that having this type of duplicate content can affect your site, it’s not because you’re likely to be penalized; it’s simply due to the way that web sites and search engines work.

Most search engines strive for a certain level of variety; they want to show you ten different results on a search results page, not ten different URLs that all have the same content. To this end, Google tries to filter out duplicate documents so that users experience less redundancy.

So, what happens when a search engine crawler detects duplicate content? (from https://searchengineland.com/search-illustrated-how-a-search-engine-determines-duplicate-content-13980)

How Google Penalizes Duplicate Content

Google is fairly open about its duplicate content policies. According to their reports, if Google encounters two different versions of the same web page, or content that is appreciably similar to onsite content elsewhere, it will randomly select a “canonical” version to index. The example they give is this: imagine you have a standard web page and a printer-friendly version of that same web page, complete with identical content. Google would pick one of these pages at random to index, and completely ignore the other version. This doesn’t imply anything about suffering a penalty, but it’s in your best interest to make sure Google is properly indexing and organizing your site.

The real trouble comes in when Google suspects your content of being maliciously or manipulatively duplicated. Basically, if Google thinks your duplicated content was an effort to fool their ranking algorithm, you’ll face punitive action. It’s in your best interest to clear up any errors well in advance to prevent such a fate for your site.
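
One way to catch the most clear-cut cases before Google picks a canonical for you is to hash the response bodies of your own URLs and look for collisions. Here’s a minimal sketch, assuming Python with the requests library and a URL list you supply (your sitemap is a natural source); the URLs below are placeholders.

```python
# Surface exact duplicates among your own URLs by hashing response bodies.
# A sketch: near-duplicates (same article, different boilerplate) need
# fuzzier comparison than a straight hash, and the URLs are placeholders.
import hashlib
from collections import defaultdict

import requests

def find_exact_duplicates(urls):
    groups = defaultdict(list)
    for url in urls:
        body = requests.get(url, timeout=10).content
        groups[hashlib.sha256(body).hexdigest()].append(url)
    return [group for group in groups.values() if len(group) > 1]

pages = [
    "https://www.example.com/widgets/",
    "https://www.example.com/widgets/print/",  # printer-friendly twin
]
for group in find_exact_duplicates(pages):
    print("Duplicates:", group)
```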

Syndication: Duplicate Content Across Domains

Sometimes, the same piece of content can appear word-for-word across different URLs. Some examples of this include:

  • News articles (think Associated Press)
  • The same article from an article directory being picked up by different Webmasters
  • Webmasters submitting the same content to different article directories
  • Press releases being distributed across the Web
  • Product information from a manufacturer appearing across different e-commerce Websites

All these examples result from content syndication. The Web is full of syndicated content. One press release can create duplicate content across thousands of unique domains. But search engines strive to deliver a good user experience to searchers, and delivering a results page consisting of the same pieces of content would not make very many people happy. So what is a search engine supposed to do? Somehow, it has to decide which location of the content is the most relevant to show the searcher. So how does it do that? Straight from the big G:

When encountering such duplicate content on different sites, we look at various signals to determine which site is the original one, which usually works very well. This also means that you shouldn’t be very concerned about seeing negative effects on your site’s presence on Google if you notice someone scraping your content.

Well, Google, I beg to differ. Unfortunately, I don’t think you’re very good at deciding which site is the originator of the content. Neither does Michael Gray, who laments in his blog post “When Google Gets Duplicate Content Wrong” that Google often attributes his original content to other sites to which he syndicates his content. According to Michael:

However the problem is with Google, their ranking algo IMHO places too much of a bias on domain trust and authority.

And I agree with Michael. For much of my internet marketing career I have syndicated full articles to various article directories in order to expand the reach of my content while also using it as “SEO fuel” to get backlinks to my Websites. According to Google, as long as your syndicated versions contain a backlink to your original, this will help your case when Google decides which piece is the original. Here’s proof:

First, a video featuring Matt Cutts, a well-known blogger and search engine algorithm engineer for Google:

The discussion on syndication starts at about 2:25. At 2:54 he says you can tell people that you’re the “master of the content” by including a link from the syndicated piece back to your original piece.

More evidence:

In cases when you are syndicating your content but also want to make sure your site is identified as the original source, it’s useful to ask your syndication partners to include a link back to your original content.

And finally:

Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.
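
If you keep a list of where your pieces were syndicated, that advice is straightforward to verify mechanically. Here’s a hedged sketch, assuming Python with requests and BeautifulSoup, that checks each copy for a link back to the original and for a noindex robots meta tag; the URLs are placeholders.

```python
# Check each syndicated copy for a link back to the original and for a
# noindex robots meta tag, per the advice quoted above. The URLs are
# placeholders; assumes the requests and BeautifulSoup libraries.
import requests
from bs4 import BeautifulSoup

def check_syndicated_copy(copy_url, original_url):
    html = requests.get(copy_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    links_back = any(
        anchor.get("href", "").rstrip("/") == original_url.rstrip("/")
        for anchor in soup.find_all("a")
    )
    robots = soup.find("meta", attrs={"name": "robots"})
    noindex = bool(robots) and "noindex" in (robots.get("content") or "").lower()

    print(f"{copy_url}: links back={links_back}, noindex={noindex}")

check_syndicated_copy("https://syndication-partner.example/repost-of-my-article/",
                      "https://www.example.com/my-original-article/")
```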

Now, what I think is interesting from this last quote from Google is that they actually admit that the piece of content they choose may not be the right one. In my experience, it’s very likely not to pick the right one if the site that originated the content is relatively young or has a low PageRank. So this raises the next big issue:

How do I get ranked as the original source for the content I syndicate?

In a past life, I syndicated tons of my articles to EzineArticles only to see Google credit them with higher search results for my content, even when I made completely sure that Google had indexed my content at its original location prior to submitting it to Ezine. Vanessa Fox, who previously worked at Google and built Webmaster Central, attempts to tackle this question in her blog post, “Ranking as the Original Source for the Content you Syndicate.”

Unfortunately, she concludes that, basically, there’s nothing you can do to guarantee that you will. She suggests:

Create a different version of the content to syndicate than what you write for your own site. This method works best for things like product affiliate feeds. I don’t think it works as well for things like blog posts or other types of articles. Instead, you could do something like write a high level summary article for syndication and a blog post with details about that topic for your own site.

Rewriting a piece of content is not my definition of syndication. That’s just rewriting an article in different words and distributing it. Almost all information circulating on the Web has already been posted elsewhere anyway; even this blog post is composed of a ton of information that I found elsewhere on the internet. So to me, writing a new article that says the same thing in different words and distributing that to syndication partners isn’t really syndication of the original article. It’s syndication of a different article. So we’re still left with the question of the results of syndicating the exact same content that already appears on your Website: what are the effects of doing so? Can it harm my rankings in any way?

To me, this is the most important question surrounding duplicate content. Before I jump into that analysis, let’s consider an important foundational question.

Why would I want to syndicate the exact same content from my Website elsewhere?

The internet really operates on a simple economy of give-and-take. The two commodities that are exchanged are unique content and backlinks. Unique content is defined as content which Google does not identify as duplicate. There are various theories about where exactly Google draws the line in deciding whether content should be considered duplicate, but one figure I’ve heard tossed around a lot is 30%. Basically, according to the 30% theory, if Google identifies that more than 30% of a particular piece of content appears elsewhere across the internet, it’ll be categorized as duplicate. Now, I can’t attest to the accuracy of this figure, so take it for what it’s worth. There are also various duplicate-content-detection tools, such as Copyscape, which are designed to help Webmasters check whether their content has been stolen and duplicated across other domains. These are also good tools for determining whether your content is likely to be considered duplicate by Google. And that’s what really matters.
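
Whatever the true threshold, you can run a rough version of this comparison yourself with word shingles. The sketch below computes what share of one text’s five-word shingles appear in another; the 30% line is the unverified folk figure from above, not a confirmed Google number, and the file names are placeholders.

```python
# Estimate how much of one text appears in another using 5-word shingles.
# The 30% threshold is the unverified folk figure discussed above, not a
# confirmed Google number; file names are placeholders.
def shingles(text, n=5):
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

def overlap_share(candidate, reference):
    cand = shingles(candidate)
    return len(cand & shingles(reference)) / len(cand) if cand else 0.0

share = overlap_share(open("my_article.txt").read(), open("other_page.txt").read())
print(f"{share:.0%} of the article's shingles appear in the other page")
if share > 0.30:
    print("Above the rumored 30% line -- more likely to be treated as duplicate")
```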

But I’ve gotten a bit off track; let’s get back to the discussion of why you’d want to syndicate content. I mentioned the internet economy of backlinks and unique content. Unique content is desirable because it will be indexed by Google, giving that particular Website another instance of its “name in the hat,” so to speak. Basically, the more content a Website has indexed, the more chances it has of being returned in Google’s search results for relevant queries.

But what about backlinks? Backlinks are simply links from any other Website to your own. Search engines consider it a “vote” when one Website links to another. This vote is used to determine authority & relevance in Google’s search results. In fact, it’s thought that backlinks are the single most-important factor in determining how your Website should rank for a given query. There are a ton of factors that play into backlinks and how much their “vote” counts for, but I’ll get into that in a future blog post. For now, what you need to know is that backlinks are valuable because they improve your rankings in the search engines, and that means more traffic to your Website.

OK, so now we’ve covered the basic commodities of the micro-economy of the Web. This is important because when you syndicate your content, assuming you have included a backlink in it linking back to your original source, you get a backlink from each and every Website to which your content was syndicated. Awesome, right?

Maybe not. The first question is how highly Google values a backlink from a piece of content that is known to be duplicate content. Frankly, I don’t know. On the one hand, it’s easy to syndicate content to a bunch of auto-accept blogs if your sole goal is to get backlinks, and this says nothing about the quality of your content or how much the originator of the content should be rewarded. On the other hand, syndication can also be a great indicator of the quality of a particular piece of content. After all, why would it be syndicated so much if it weren’t really great?

In the end, Google probably has signals for how it answers these two questions, but the real answers are probably only known by the software engineers that coded the algorithm. Many folks try to boost the value of their syndicated content by engaging in content “spinning” which is perfectly legitimate as long as it’s not the garbage that’s often spouted out by automated software. I’ll go into more depth about content spinning in a later post. For now, we’re still trying to answer the question of whether syndicating content exactly as it appears on your own Website is a good idea or a bad idea. After careful testing I’ve come to the following conclusion:

.

…….

*drumroll*

……

*more drumroll*

…..

Maybe.

I know, I know. That’s not the answer you wanted. Allow me to explain.

I own over 50 domains, and I like to do a lot of testing across them. I spent a couple hours last night performing searches for my content that I had syndicated to various other blogs and directories. And what I found was both disappointing and encouraging.

The disappointing part was that, in many cases, my syndicated content outranked my own original content. Even when a site that ranked higher than mine for my own content had a backlink to my site, the originator of the content, it was as if Google completely ignored that backlink and still gave more credit to the other sites. In some cases, my own site’s version of the content was nowhere to be found, obviously falling into Google’s duplicate URL cluster and being filtered out of the search results. This means that by syndicating my content, I actually, in effect, got my own content de-indexed.

This is pretty much the worst possible scenario, but it happened. Sometimes, at least. And that’s the weird part; sometimes, my content was recognized as the original content and received the highest ranking. With other sites and pieces of content, it ranked second behind a high-authority site, usually EzineArticles. So I have to conclude the following:

When you syndicate your content, it might:

  • Cause your own, original content source (i.e., your Website) to be, in effect, de-indexed for that piece of content
  • Cause your site to rank highly for queries relevant to your content, but not highest
  • Cause your site to rank highest for your content

Well, that pretty much covers all the bases, doesn’t it? These are all the results I observed when looking at my own sites and the outcomes of syndicating articles that originated on them. Basically, I can conclude that Google just doesn’t always get it right. And Google doesn’t like to do anything with any sort of consistency. The last thing they want is for us SEOs to completely figure out their algorithm, because once that happens, the integrity of their search results would be destroyed as folks manipulate them all to hell.

The encouraging part was discovering that the backlinks from the syndicated content definitely helped my sites’ rankings for my target keywords. So there is definitely at least some value in backlinks originating from content that Google has labeled as “duplicate.”

So, the final question remains: Should I syndicate my content?

Let’s weigh the benefits and drawbacks of doing so:

Benefits of syndicating your content
  • Get backlinks from lots of sites
  • Expand your reach and brand awareness to highly trafficked sites
  • Get direct traffic via referrals from backlinks in your syndicated content
  • Much cheaper way of getting backlinks than writing brand-new content (or rewriting existing content) for distribution/syndication

Drawbacks of syndicating your content
  • The sites to which you syndicate might actually outrank you for your own content if they have higher authority than your site, even if you follow Google’s advice and include a backlink to the original source (see the sketch after this list for one partial safeguard)
  • Google might group the URL on which your content resides with the rest of the duplicates, hiding it from search engine results pages (effectively de-indexing it)
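That partial safeguard: besides the plain backlink, Google also supports a cross-domain rel="canonical" tag, which a cooperative syndication partner can place in the <head> of the duplicate copy to point Google at the original URL. It’s only a hint (Google can ignore it, and many syndication outlets won’t add it for you), but it directly targets the de-indexing drawback above. A minimal sketch, with a hypothetical URL:

    <!-- Placed in the <head> of the syndicated (duplicate) page.
         Suggests to Google which URL should be treated as the original.
         The URL below is a hypothetical example. -->
    <link rel="canonical" href="https://www.example.com/original-article/" />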

So, in the end, syndicating your content is risky. You can get the best of both worlds if Google decides your site is the originator: your content is rewarded with the top position in the search results, and you also get all the juicy backlinks that play into your overall rankings for specific keywords. But if Google gets it wrong (and it does, quite often, contrary to what they might think), you risk having your content never rank for relevant search queries.

And this really worries me, because I’ve always held the opinion that there’s nothing someone else can do to harm the rankings of a particular Website. After analyzing these results, I fear I’ve found a loophole in my own argument: if someone visits my Website, copies all my content, and syndicates it around the Web, it’s possible that the sites to which my content was syndicated will actually rank higher for it than my own site. Google tries to address this problem here as well as in the Matt Cutts video:

In most cases a webmaster has no influence on third parties that scrape and redistribute content without the webmaster’s consent. We realize that this is not the fault of the affected webmaster, which in turn means that identical content showing up on several sites in itself is not inherently regarded as a violation of our webmaster guidelines. This simply leads to further processes with the intent of determining the original source of the content—something Google is quite good at, as in most cases the original content can be correctly identified, resulting in no negative effects for the site that originated the content.

Again, unfortunately, I have to point out that in my own experience I’ve repeatedly seen my own content rank worse than the sites to which it was syndicated. So even though Google thinks it’s good at identifying the original source of content, my data suggest otherwise. In time, we can only hope that Google improves this.
