
I once attended a meeting with a client who (to cut a long story short) was questioning the point of doing SEO for his local businesses when all he could see in local search results was the local pack and adverts. He had a point.

For anyone who doesn’t know what a local pack is, here is an example for my local area below:

The local pack results are usually displayed when Google or Bing detects that the most relevant search result will be a local one. The problem for website owners who don’t feature in it is that most people probably won’t scroll down to click on anything below these results.

This can make doing SEO seem like a futile exercise when the local pack and Google ads combined can occupy nearly a quarter of the first page of search results.

As everyone in the search industry knows, the number of clicks declines dramatically the further down the page your website appears in those search results.

The chart below from advancedwebranking.com shows that a website occupying position 1 on the first page can expect to enjoy a 36.5% click through rate on average, compared to just 5% for position 5.

So if you are cracking open the champagne celebrating a page one position, don’t expect a torrent of visitors when what you’ll probably get is a trickle.

If your website is towards the bottom of page one in the organic listings, you will need to sit there and be very patient. Curiously, there is very little difference in click through rate between the bottom of page one and page two. The click through rate is even worse for mobile search results.

The good news is that the client I mentioned earlier was failing to see the bigger picture about SEO. SEO is as much about finding opportunities to gain those coveted top spots as it is about optimising purely for the organic search results.

One thing is certain though. If your local business isn’t appearing near the top of the search results pages you might have a problem.

So how do we solve this problem of getting your business into the local pack? Let’s unravel the mysteries with these 7 tips.

1. Display an address on your website and be consistent

Local optimisation starts with your website. Make sure you have your up-to-date business address displayed on your website. It is also useful to include a map of where your business is located. Importantly, the location and address must be consistent across all the other websites your business name appears on.
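A common way to reinforce that consistency is to mark the address up as structured data on every page. The sketch below is illustrative only (the business name, address and phone number are made up): a few lines of Python that generate a schema.org LocalBusiness JSON-LD block you could paste into a site template.

```python
import json

# Hypothetical NAP (name, address, phone) details -- swap in your own.
business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Plumbing Ltd",
    "telephone": "+44 1234 567890",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 High Street",
        "addressLocality": "Chester",
        "postalCode": "CH1 1AA",
        "addressCountry": "GB",
    },
}

# Emit a JSON-LD block that can be pasted into the site template so the same
# address appears on every page in machine-readable form.
snippet = '<script type="application/ld+json">\n{}\n</script>'.format(
    json.dumps(business, indent=2)
)
print(snippet)
```

Whatever address you publish here should match your Google and Bing business listings exactly.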

2. Create and optimise a Google/Bing Business Page

It’s easy and free to set up Google and Bing business pages. Simply follow the instructions on how to set up a page and pinpoint your business address on your local map. The listing will need to be verified, so any address provided should be accurate. Ideally this address should be within the boundaries of the town or city you are targeting. This isn’t always possible unless you buy or rent an office or use your personal address, but don’t be tempted to use a virtual address. Registering an address in a building occupied by several other businesses with the same address can make it less likely you’ll appear in the local pack.

3. Ensure your web pages are optimised for local search

Next up, you will need to optimise the pages of your website for local search results. If your website ranks prominently in organic search results, it stands a better chance of also appearing in the local pack. On-page optimisation includes meta titles and descriptions as well as the copy presented on your web pages.
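A quick sanity check is whether the target location actually appears in each page's title and meta description. The sketch below is a rough illustration only, with made-up URLs and a hypothetical target town; it fetches each page and does a simple string check.

```python
import re
import urllib.request

# Hypothetical pages and target locality -- adjust for your own site.
PAGES = ["https://www.example.com/", "https://www.example.com/services/"]
LOCATION = "Chester"

for url in PAGES:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")

    # Rough regex checks; a proper HTML parser is better for a real audit.
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    desc = re.search(
        r'<meta[^>]+name=["\']description["\'][^>]+content=["\'](.*?)["\']',
        html, re.I | re.S,
    )

    title_text = title.group(1) if title else ""
    desc_text = desc.group(1) if desc else ""

    print(url)
    print("  location in title:      ", LOCATION.lower() in title_text.lower())
    print("  location in description:", LOCATION.lower() in desc_text.lower())
```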

4. Set up social media pages for your local business and be active

While there is no proof that social signals will get you into the local pack, they should not be discounted. If they feature your business address then this will provide another signal that your business is a legitimate one with its base in a particular locality. In my experience, however, even popular social media pages on Facebook don’t guarantee a prominent ranking in the search results or local pack when they are not combined with all the other signals mentioned. The strength of social signals is somewhat overrated.

5. Seek links from local business directories

Directory sites such as Yell provide a good way to gain localised links and provide a further local signal. One thing to note is that directories should be good quality and relevant to your business. Anything else will be seen as spam. Local chamber of commerce websites are also worth looking into, but you will need to become a member to have your website featured, and this will involve a significant financial investment over time if you are just in it for a link.

6. Work on improving click through rates

You’re unlikely to be too concerned with click through rates if your website doesn’t already feature on page one, but there are suggestions in the SEO community that high click through rates correlate with higher rankings. As I mentioned earlier, higher rankings in local search can help your website get into the local pack.
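The arithmetic itself is simple: clicks divided by impressions. As a hedged sketch only, the snippet below assumes a hypothetical CSV export of query data (the column names are assumptions, not a fixed format) and flags queries whose snippets may need a rewrite.

```python
import csv

# Assumes a hypothetical "queries.csv" with query, clicks and impressions
# columns, e.g. exported from a search performance report.
with open("queries.csv", newline="") as f:
    rows = list(csv.DictReader(f))

for row in rows:
    clicks = int(row["clicks"])
    impressions = int(row["impressions"])
    ctr = 100.0 * clicks / impressions if impressions else 0.0
    flag = "  <-- low CTR, consider rewriting the title/description" if ctr < 2 else ""
    print(f"{row['query']:<40} {ctr:5.1f}%{flag}")
```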

7. Encourage people to review your business

If your business is receiving lots of reviews, this shows that customers are engaging with it. Also, once you are in the local pack, you probably want your business to stay there. One way to bolster your position is to gain lots of reviews via your Google Business listing. People can and do abuse this by hiring others to leave reviews for them, which is unethical and very much in the news at the moment. For those of us who are honest, why not encourage real customers to leave some positive feedback from time to time?

The post 7 Things You Need To Do To Optimise For The Local Pack appeared first on SEO Professor.


Google appears to be rolling out another algorithm update just over a month after the last core algorithm update. This means it is worth checking if you have lost or gained some keyword rankings this week. If it is the former then you may need to check the quality of your web content.
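One quick way to run that check is to compare this week's tracked positions with last week's. The sketch below is illustrative only and assumes two hypothetical CSV exports from whichever rank tracking tool you use.

```python
import csv

def load_positions(path):
    # Assumes a hypothetical CSV with "keyword" and "position" columns.
    with open(path, newline="") as f:
        return {row["keyword"]: int(row["position"]) for row in csv.DictReader(f)}

last_week = load_positions("rankings_last_week.csv")
this_week = load_positions("rankings_this_week.csv")

for keyword, old_pos in sorted(last_week.items()):
    new_pos = this_week.get(keyword)
    if new_pos is None:
        print(f"{keyword}: dropped out of the tracked results")
    elif new_pos != old_pos:
        direction = "up" if new_pos < old_pos else "down"
        print(f"{keyword}: {old_pos} -> {new_pos} ({direction})")
```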

Google hasn’t announced anything officially, but there is plenty of discussion taking place in the SEO community, and volatility in SERPs is being reported across all the major SEO tracking tools, including SEMrush and Rank Ranger, highlighted below:

So What’s Occurring With This Algorithm Update?

The update hasn’t created as much volatility as the one in March, but it is still a significant one and may have a wide-ranging impact.

This continues the pattern of Google making regular tweaks to its algorithm as it attempts to rid the search results of websites displaying poor quality content. This latest update is the third one of the year so far with the first occurring around January 20th and the second on March 8th.

It is believed that this latest update is, as with others to have taken place this year, aimed at maintaining the quality of results. So websites with dodgy thin content or content that simply duplicates what others are writing could feel the impact of this latest update.

The advice for website owners who wish to avoid penalties is to continue focusing on creating not only good content but the kind of content that attracts and keeps an audience on the page. It also means organising content so that it is easily accessible and fits into a well-structured website that loads fast on mobile as well as on desktop.

*It has since been confirmed that an update did take place in April. Rather than content quality, as was speculated on many forums, the update targeted content relevancy. So the advice, if you have suffered any rankings drop from this latest update, is to continue to work on the relevancy of your content to search queries. More information on Google’s latest core algorithm update can be found here.

The post Warning Signs Of New Google Algorithm Update appeared first on SEO Professor.


Job posts on websites are fairly common and can help drive traffic to a website even if those jobs sometimes only exist as bait to get people on a website or make a business look successful enough to be recruiting.

Well, Google is now going after companies that don’t remove their expired job listings from the index, according to this update on Search Engine Land.

No reason is given as to why Google is targeting job posts but the assumption is that expired job posts are not going to offer a great user experience for job hunters.

Google’s advice on job postings is as follows:

“Jobs that are no longer open for applications must be expired in one of the following ways. Failure to take timely action on expired jobs may result in a manual action.”

While this may not result in a sitewide penalty, it may result in your job schema markup not being visible in search results in the future.

Schema markup is a set of tags, often referred to as microdata, used on a website to improve the way search results are displayed in SERPs. Here is an example, taken from the Google Developers page, of how jobs are displayed in search results for sites that use schema markup.

In an effort to warn and advise site owners, Google has updated the advice in its Webmaster Guidelines on how to deal with expired job posts, including removing the JobPosting schema markup, adding a noindex meta tag, or removing the page so that it returns a 404.
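A rough way to spot offending pages is to look for JobPosting markup whose validThrough date has passed. The sketch below (Python, with a made-up URL) pulls JSON-LD blocks out of a page and flags any expired postings.

```python
import json
import re
import urllib.request
from datetime import datetime, timezone

# Hypothetical job page URL -- replace with your own listings.
URL = "https://www.example.com/jobs/senior-widget-engineer/"

html = urllib.request.urlopen(URL).read().decode("utf-8", errors="ignore")

# Pull out JSON-LD blocks; a real crawler would use a proper HTML parser.
blocks = re.findall(
    r'<script[^>]+type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
    html, re.I | re.S,
)

now = datetime.now(timezone.utc)
for block in blocks:
    try:
        data = json.loads(block)
    except json.JSONDecodeError:
        continue
    if data.get("@type") != "JobPosting":
        continue
    valid_through = data.get("validThrough")  # e.g. "2018-06-30T00:00:00+00:00"
    if not valid_through:
        continue
    expiry = datetime.fromisoformat(valid_through)
    if expiry.tzinfo is None:
        expiry = expiry.replace(tzinfo=timezone.utc)  # assume UTC if no offset given
    if expiry < now:
        print(f"Expired posting: {data.get('title')} (validThrough {valid_through})")
        print("  -> remove the JobPosting markup, add noindex, or return a 404")
```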

The post Google May Penalise Expired Job Posts appeared first on SEO Professor.


Because nobody knows much about what goes on behind the scenes at Google, anything that sounds bad like a 404 error is automatically viewed as damaging.

Fortunately for anyone tasked with trying to remove 404 errors, this is not necessarily the case, and removing them can be a waste of time and effort if those URLs no longer serve any kind of purpose.

When dealing with 404s, a lot depends on the history of the URLs returning those errors. So next time you log into Google’s Search Console and see that alarming chart showing 404 errors, take a step back and assess the causes of the errors before acting on them.

404 errors are not in themselves a bad thing in all cases. They are simply created when a page URL can no longer be found. This might happen, for example, on an e-commerce store where products are discontinued: the page is removed and a 404 is returned instead. Over time that page will eventually drop out of the search results altogether, which can either be good or bad depending on what was featured on the page.

When 404s are bad

It would be wrong to say 404s don’t cause any problems. If any of the following cases apply then you will need to do something about the error(s).

The page had lots of inbound links pointing to it

Inbound links are one of the most important ranking factors for a website. Particular web pages may accumulate lots of inbound links over time if they serve useful information to their audience. Blog posts, homepages and useful information pages are the ones most likely to attract links. If any of these pages starts returning a 404, all the rankings for that page may be lost, along with a substantial amount of traffic to the site.

A new page has been created covering the same topic

If you have created a new page on the website covering the same topic because a previous one needed updating, then the old page should be permanently 301 redirected to the new one to preserve any benefits the old page might have had.
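How you implement the redirect depends on your server or CMS. As one hedged example with hypothetical URL paths, here is what a permanent redirect looks like in a small Flask route:

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical paths: the old article URL permanently redirects to the
# updated page, so links and rankings pointing at the old URL are preserved.
@app.route("/blog/old-local-seo-guide/")
def old_guide():
    return redirect("/blog/updated-local-seo-guide/", code=301)

if __name__ == "__main__":
    app.run()
```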

Following a site migration

Site migrations and redesigns often open up the possibility of an increase in 404 errors. For example, a page may cover the same topic under a slightly different theme, meaning the old one is not redirected. This is why it is important to keep a record of all the old URLs of a website so that checks can be made following a site migration to ensure all URLs are 301 redirected to their new homes.

When It’s OK to Ignore 404s

As I mentioned earlier, there will be plenty of cases where 404s are returned, and this is fine if those URLs have no inbound links and no regular traffic coming in. For example, you may have had a page dedicated to a one-off event that has now passed. It would be a bit daft to redirect that URL somewhere irrelevant just to avoid having a 404 error.

Conclusion

When it comes to 404s, a degree of common sense is required.

404 errors will eventually fall away over time anyway once a page has expired, although they may still exist for a considerable amount of time. As long as you use Google Search Console and any of the other paid SEO tools to identify all URLs returning a 404, you should be able to handle any important errors that slip through the net.
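As a quick triage step, the sketch below (made-up URLs, standard library only) checks the status code returned by each recorded URL, so you can then review links and traffic for anything returning a 404 before deciding whether to redirect it or leave it alone.

```python
import urllib.error
import urllib.request

# Hypothetical list of URLs recorded before a migration or exported from a
# crawl error report.
URLS = [
    "https://www.example.com/old-page/",
    "https://www.example.com/discontinued-product/",
]

for url in URLS:
    try:
        with urllib.request.urlopen(url) as response:
            # urlopen follows redirects, so a 301 chain shows its final status.
            print(f"{url} -> {response.status}")
    except urllib.error.HTTPError as e:
        note = "  (check links and traffic before redirecting or ignoring)"
        print(f"{url} -> {e.code}{note if e.code == 404 else ''}")
```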

Avoiding harmful 404s is all about preparation. Ensure all important URLs are recorded prior to a site migration and analyse traffic and links to a page carefully before removing it from your website.

As John Mueller says in this video covering the subject, having 404s will not negatively impact the entire site.

Google Search Console: What should I do with old 404 errors? - YouTube

The post Will 404 Errors Harm My Rankings? appeared first on SEO Professor.


The one thing bugging me about SEO at the moment is search snippets.

While there can be no debate that search snippets are an important part of SEO, what is up for debate is their length now that Google has moved the goalposts in this area.

So why is it that many website owners appear to have been slow to catch on to the new character limit introduced towards the end of 2017?

Everyone in SEO knows that having search snippets set right can provide an easy way to gain an advantage over your competitors in the search results.

While meta descriptions, no matter what length they are, won’t have an impact on rankings, they are still crucial to getting people to click on the result.

Looking at the example below for SEO companies in Chester, most have either lengthened their meta descriptions or Google may be producing them dynamically (we’ll come to that later), with just one exception in these results:

I then looked up SEO companies in Manchester and I got these results:

All the results on this page have the old shortened version of their meta descriptions. It isn’t clear if the website owners intend to display these shorter descriptions or if they are simply not aware of the recent changes.

Either way, there is some confusion over whether we should manually lengthen meta descriptions so that we can occupy more real estate in the search results or just leave things as they are.

The current views on search snippet lengths are conflicting

On the one hand, we have the basic rule of advertising: the more space you get in front of people, the better it is for visibility, and that includes search results. On the other, we have Google’s advice.

What Does Google Have To Say?

Danny Sullivan’s advice is that website owners should not attempt to lengthen meta descriptions.

Yes. It’s not your imagination. Our snippets on Google have gotten slightly longer. And agree with @rustybrick — don’t go expanding your meta description tags. It’s more a dynamic process. https://t.co/O1UTyFeNfA
— Danny Sullivan (@dannysullivan) December 1, 2017

More on this here: https://twitter.com/dannysullivan/status/936780855019581440?ref_src=twsrc%5Etfw

But if it is a dynamic process and Google can set the length of search snippets dynamically, surely this would then eliminate the need to edit meta page titles and descriptions altogether. There wouldn’t be any point in trying – would there?

The evidence from our small sample of search results suggests otherwise and leads to the conclusion that if you don’t lengthen the meta descriptions yourself, your snippet ends up occupying less of the page than your competitors’, who may well be enjoying more click throughs than your result.

My personal view is that meta descriptions should be lengthened and not left as they are. The reason for this conclusion is that, on several of the more popular platforms I have worked with, including WordPress, the meta description limits have been extended to suit the new 320-character limit, up from the previous 165 characters.

If there were no manual control over meta descriptions, it would negate the purpose of WordPress plugins such as Yoast, and so far this hasn’t been the case.

Does a longer meta description result in more click throughs?

Another concern if you have shorter snippets in the search results is the risk of losing clicks to your competitors. For this we need to find something that supports the theory that longer snippets result in higher click through rates.

Fortunately, there has been some small-scale research supporting this in an interesting article on SEMrush. The experiment concludes that longer snippets result in a 36% improvement in click through rates.

Can Google set your meta descriptions automatically?

Another interesting article on the subject of meta descriptions, on Moz, highlights another experiment to find out whether it is worth setting meta description tags at all if Google has the ability to change them. The results show that 55% of queries from a large sample use the original meta description without any major changes.

The conclusion

The main takeaway from weighing up the current evidence is that it is probably wise to at least start experimenting with making meta descriptions 300 characters long, to see if it is indeed worth lengthening them, or at least to include more intro copy on your pages just in case.
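If you do experiment, a quick audit of your current description lengths is a sensible first step. The sketch below uses made-up URLs and a rough regex; the 200-character threshold is just an arbitrary prompt to review, not a rule.

```python
import re
import urllib.request

# Hypothetical pages to audit -- swap in your own URLs.
PAGES = ["https://www.example.com/", "https://www.example.com/services/"]

for url in PAGES:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    match = re.search(
        r'<meta[^>]+name=["\']description["\'][^>]+content=["\'](.*?)["\']',
        html, re.I | re.S,
    )
    desc = match.group(1).strip() if match else ""
    status = "missing" if not desc else f"{len(desc)} characters"
    hint = "  <-- room to lengthen" if desc and len(desc) < 200 else ""
    print(f"{url}: {status}{hint}")
```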

Leaving short meta description tags can and does run the risk of your snippets being shorter than those of rival sites in some cases, although how much of your snippet gets shown may depend on the level of competition in each location.

The post Should You Expand Search Snippets To Fit Google’s New Character Limit? appeared first on SEO Professor.

