With Part 1 out of the way, it’s time to get down to the good stuff that we live and breathe – SEO & UX.

How To Use Chrome Puppeteer to Fake Googlebot & Monitor Your Site, Tom Pool

It was really great to listen to Tom’s enthusiasm for this topic. It has given me dozens of ideas for how to use the approach, and I have already begun working on it with our tech team.

Taking a headless Chrome browser and using it to “fake” Googlebot is a really clever idea that allows you to monitor not only your own site but any site you want.

An obvious starting point is to automate checks of your clients’ sites and their competitors’ sites, so you can give regular, consistent guidance on the changes competitors have made and overlay that with organic performance data for actionable suggestions.

The more difficult Google makes it for programs to scrape its SERPs, the more this sort of approach will come to the fore and become commonplace across SEO teams.
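
To give a flavour of the idea, here is a minimal sketch of what a Googlebot-spoofing check might look like with Node.js and Puppeteer (the URL is purely illustrative):

const puppeteer = require('puppeteer');

(async () => {
  // Launch a headless Chrome instance.
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();

  // Present one of Google's published Googlebot smartphone user agent strings.
  await page.setUserAgent(
    'Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 ' +
    '(KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 ' +
    '(compatible; Googlebot/2.1; +http://www.google.com/bot.html)'
  );

  // Fetch the page as "Googlebot" and record what came back.
  const response = await page.goto('https://www.example.com/', { waitUntil: 'networkidle2' });
  console.log('Status:', response.status());
  console.log('Title:', await page.title());

  await browser.close();
})();

Run on a schedule against your own pages and a competitor’s, the output (status codes, titles, rendered content) can be diffed over time to flag changes worth investigating.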

View the slides here

How To Trim JS, CSS & External Stuff To Slim Down & Speed Up Your Site, Chris Simmance

I’ve worked alongside Chris very closely in the past at Optus Digital, and he’s someone I consider a good friend. I also witnessed the inception of Under2 and got to see a lot of the brilliant work that Chris, Shane and the rest of the team can produce first hand. I was very excited to hear this talk!

Key Points:

  1. The internet is slow and getting slower because of the growth in users on current infrastructure. The strain will only increase with the IoT, and we need to build better for the sake of the internet and its users (us!).
  2. Cue a hilarious video about “How the internet works”. You can watch this on his slides here.
  3. CNN has a really slow site compared to the BBC (a comparison featured a few times in other talks too – sort it out, CNN!) because of its background resources, ads and trackers.
  4. Off-site resources can be the root of most speed problems.
  5. Bloated assets like CSS (unused CSS) are also not considered enough. Test your site to see what visual elements are being loaded but not being used by the page.
  6. Images can be the biggest site speed pain. Use Base64 encoding, next gen formats and compression to get the best out of them.
  7. Stop using so many trackers if possible. Don’t use plugins if you can help it.
  8. Use caching, look into Varnish on your server, and make sure you are running the latest PHP version (a lot of people are still running the old, end-of-life version 5.6).
  9. Apache and Nginx servers are generally better for speed optimisation than an IIS server.
  10. Use a good front end framework.
How to use Programmatic to drive search results internationally, Gemma Houghton

I headed to the International stage to hear from Webcertain’s Marketing Director, Gemma Houghton. Gemma’s talk provided an insightful overview of Programmatic Advertising, looking at how you can optimise your campaigns to drive search results.

Gemma started off by explaining that Programmatic Advertising is an automated marketing method, based on real time bidding on advertising inventory, targeting specific customers in a specific context. According to Gemma it’s the preferred method of display advertising, targeting people instead of placements, devices and platforms. One of the many benefits is that you can run all campaigns across all platforms from one place and it’s great for reaching audiences at scale.

Here’s what I learnt:

  1. Programmatic helps us do our jobs better! It’s great for driving efficiency, as automating activity takes manual work away so you can focus on other things.
  2. We need to be thinking about audiences vs keywords – Gemma stresses that keywords still matter, but with Programmatic you aren’t focusing on the specific things people are searching for; you’re looking at the bigger picture and focusing on things that will capture people’s attention.
  3. Are the robots coming? Gemma highlights that this is a genuine concern for marketers in the industry, but she reassured us that the robots aren’t in fact coming and we very much still need people. People still need to put it all together in order to tell the machines what to do, and we need to monitor campaigns and keep an eye on things to ensure they are as effective as possible.
  4. Creative, creative, creative! Gemma says that the creative of your campaign is SO important but often gets overlooked. You always need to be thinking about the banners you use in your campaigns: they are the way you represent your brand and they are what will draw people into your campaign, so they need to look good and appeal to the user.
  5. When looking at international campaigns it’s important to consider culture; the imagery needs to represent the country your campaign is targeting. Make sure you’re using images that mean something to the people your ads are aimed at, so your ad doesn’t get ignored.
  6. Frequency capping – is your ad being seen by the same people on multiple sites? Gemma ended her talk by highlighting that you don’t want your ad to be everywhere, as repeatedly targeting the same people with the same messaging can alienate them.
Why UX is SEO’s Best Friend, Michelle Wilding-Baker and Luke Hay

The talk was split in two, with Michelle kicking us off from an SEO POV and Luke backing her up from a UX standpoint.

Michelle started her talk by outlining that, with Search becoming ever more user focused, it’s important to outline what it is users want – which can be summarised as follows:

  1. Fast Site & Page loads
  2. Relevant Content – meets users’ intent
  3. Optimised for all devices

We should focus on User Optimisation rather than just Search Engine Optimisation, as providing the best content for our users will enable us to rank better organically. Reaching rank #1 isn’t the hard bit anymore; maintaining it is, which can be tackled by providing not only the best content but also the smoothest user journey.

Luke supported this with 3 main takeaways:

  1. Conversion Persuaders & Blockers
  2. User Research & User Testing
  3. Embed a User Centred Culture

Luke advised all SEO specialists to work more closely with UX, data and dev specialists, and to focus on the key blockers: slow page loads, negative price perceptions, poor usability, and too much or wrong information. The reverse of these are key persuaders for conversion.

Both speakers highlighted the importance of conversions, with Michelle even claiming that “traffic in isolation is just vanity”. What we need to understand is the ‘why’ – what users actually care about. This is done by setting up clear conversion goals and running user testing with real users to understand what they want to get out of their search.

The post What we learnt at BrightonSEO: Part 2 appeared first on Edit.


BrightonSEO is our (second) favourite search event of the year because, of course, we have a bias. It’s packed full of practical, in-depth talks from real experts giving their advice on what they know best – search.

If you weren’t lucky enough to get one of the ballot tickets this April, then we have you covered: we’ve put together a rundown of the key takeaways from our favourite talks.

In this rundown we’ll cover some of the great presentations we went to on some often-overlooked aspects of SEO that should still be part of your strategy.

Why I adore Sitemaps, an ex-Google engineer’s love story, Fili Wiese

During this insightful session, Fili looked at one of the often overlooked basics of technical SEO, sitemaps.

As an agency, we do a lot of work on sitemaps, and there are some clients that are still not sure how much value they offer. Additionally, there are a few different types to get your head around, which can cause some confusion.

We started by learning that sitemaps aren’t there to manage crawl budget; they are there to manage crawl priority. Fili then looked at HTML and XML sitemaps and the differences between the two. The key point I took away is that XML sitemaps are the way to go; they allow you to manage additional resources such as images, videos and hreflang annotations.

With different types of content needing to be increasingly managed across regions, XML sitemaps are a valuable tool for site owners.

Why we should stop ignoring Bing, Julia Logan

Julia uses Bing as her primary search engine at work and at home. And she thinks we should too. Why? Well, healthy competition in the face of the Google monopoly is no bad thing. People change their search engines very rarely, which only strengthens Google’s position. If we were all inclined to try something new occasionally, then Google’s monopoly could be compromised.

Bing’s UX is also better than you might expect. Its interface scores better than Google’s in blind “taste tests”. Bing is also faster than Google, and Bing now lets you submit up to 10K URLs (far more than Google allows).

Bing’s grip on the market share is tightening. The rise in the use of voice search is occurring across multiple devices and Amazon, Xbox and IE voice searches are all powered by Bing. Could we see it overtaking Google one day? Would you want to help them on their way by switching your search engine today?

Black Friday SEO: where and when to start, quick wins and top tips, Alexandra Coutts

Alexandra drove home the message that even though it’s only April, brands should already be planning for this year’s Black Friday.

Her research has found that Google puts publishers before retailers in search results, meaning that retailers need to find other ways to appear in SERPs, such as PR with publishers that promote Black Friday deals. It is also worth noting that Black Friday doesn’t necessarily mean discounts, and that brands have real scope to decide what this retail holiday means for them.

Alexandra’s key advice was to ensure your sales page is built, suggesting that one sales page that can be adapted would work well, and remember that you can still capture customer data now to help drive traffic when November arrives. She also recommended making a list of products that are likely to be subject to Black Friday deals, and ensuring that everything is in place to help them sell well, such as good reviews and product images that effectively demonstrate the product.

Lastly, Alexandra stressed the importance of good SEO housekeeping, such as making your site fast and mobile first, as well as ensuring that your servers can take the increased load around Black Friday. Many customers will look at your site in the lead up to Black Friday, but won’t buy due to being in the research phase, so plan for conversion rates to drop off immediately before, but hold your nerve and reap the rewards this year.

The post What we learnt at BrightonSEO: Part 1 appeared first on Edit.

By Chris Coultas

Google advises website owners to “Focus on the user and all else will follow”. While there are many technical fixes that are important to improve SEO performance, Google continues to place increasing emphasis on the user’s experience.

As a search engine, Google aims to meet users’ needs quickly and easily. This encourages users to keep coming back to Google’s services, which is why ranking websites that also provide a positive user experience (UX) is vital.

So UX becomes a vital part of SEO, leaving website owners to find the best ways of creating a streamlined experience for users while also increasing rankings in search engine results pages. This may sound like an intimidating prospect, but making some simple improvements will benefit search rankings and customer satisfaction alike.

Here are 5 tips for creating an accessible user experience that’ll also boost your search rankings:

1. Optimise website navigation

A website’s navigation provides links that connect the pages of a website. Its ultimate purpose is to help users quickly find what they need from the website in question. A badly constructed navigation can create a frustrating experience for users, who will likely lose patience quickly and look for an alternative.

Search engine crawlers also use website navigations to find and index pages. The links in the navigation help search engines to frame information about the pages that will be visited, such as their purpose, expected content and how the pages are related. So, a poor navigation can make the process of indexing a website’s pages more difficult. If crawlers don’t find and index valuable pages then your rankings will be lower.

Your first priority should be simplifying your website’s navigation as much as possible. Offering too many choices and a sequence of cascading menus can be a confusing sight for users and make it more difficult for search engines and users to identify important pages. Instead, make careful choices about which pages should be added to the main navigation and whether these pages can be consolidated under fewer headings.

For SEO purposes, avoid using unnecessary JavaScript in a navigation if possible. Search engines struggle to parse out the mass of confusing code, which means it could take longer for pages in your navigation to be crawled and indexed. This means any changes made to benefit the website’s rankings could take longer for search engines to recognise, so a better option is to code with plain HTML instead.

Also, ensure that links in the navigation prioritise the pages with the highest traffic. This will increase the chances that they retain this strong traffic, and will also allow new users to easily find the pages that have proved to be most useful. Submenus should be arranged alphabetically so they can be easily scanned by a reader in the way most would be familiar with.

2. Include product reviews

Reviews for a product can reinforce a potential customer’s choice, or even remove any doubts they might have about this product. Some customers may only choose to purchase a product or service if they see a vote of confidence from other customers, particularly if the product is expensive or unfamiliar to them. Clearly, this could improve conversions on any website, but it also saves customers the trouble of looking for reviews elsewhere or abandoning a transaction prematurely.

The more reviews available for a product or service, the stronger this confirmation effect can be, as illustrated in a study by Bazaarvoice.

There is also a distinct SEO benefit to including product/service reviews; it’s a type of user-generated content that search engines factor into rankings, strengthening the authority a website holds and increasing its chances of ranking above others for queries. What’s more, a steady stream of customer reviews can show a search engine that your site is still active, providing a steady stream of new content that acts as proof of consistent user engagement.

Also, when optimising for mobile, ensure the navigation only takes up 1/5 of the screen’s space at most. This makes sure there’s plenty of room for other content above the fold, which lets users see as much information as possible before scrolling and confirms they have come to the right place.

In a similar vein, making sure menus aren’t too big for the screen they’re viewed on is vital. It not only looks more professional and less cluttered, but also avoids valuable links going unseen by many users because they haven’t scrolled down.

3. Properly utilise search function

A website search function allows visitors to quickly find information or products by entering queries into a search bar; the website then returns the closest matching pages for that query.

E-commerce websites are among the best candidates for a search function because they have a large number of pages, but a search function is also useful for information-delivery websites such as news or blog sites. While there are many ways to optimise a search function, one that is often underutilised is auto-suggest.
Auto-suggest is an effective way to steer the user journey by helping potential customers refine their search query and intuitively find what they need, particularly if they are looking for something with a complicated name.

Returning customers may also find the listing of previous and recommended search terms gives them more efficient experiences, which allows fewer opportunities for user journeys to be interrupted. This rewards the user for visiting the website more often and builds a bespoke experience for those customers each time they use the function.
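
As a rough front-end illustration of the auto-suggest idea (the #site-search and #suggestions elements and the /api/suggest endpoint are hypothetical), a debounced listener keeps requests to a minimum while still feeling instant:

// Wait briefly after each keystroke, then ask the (hypothetical) suggest endpoint for matches.
const input = document.querySelector('#site-search');
const list = document.querySelector('#suggestions');
let timer;

input.addEventListener('input', () => {
  clearTimeout(timer);
  timer = setTimeout(async () => {
    const query = input.value.trim();
    if (!query) {
      list.innerHTML = '';
      return;
    }
    const response = await fetch('/api/suggest?q=' + encodeURIComponent(query));
    const suggestions = await response.json(); // assumed to be an array of strings
    list.innerHTML = suggestions.map(term => `<li>${term}</li>`).join('');
  }, 200);
});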

A spelling correction function is also extremely useful for ensuring a clean path for the user to their desired product or service, keeping them from missing out on products because of a spelling error. Similarly, if searches fail to return any results, potential customers could quickly lose interest. However, if a recommended product based on that search could be presented, this may provide a good alternative for the customer. In short, if searches always return something for a user, there is always a chance they can convert, which is good for the user and positive for website metrics and profit.

If users consistently convert, this sends positive signals to search engines about the way a customer behaves on that website, resulting in higher rankings.

4. Use CTAs effectively

A call to action (CTA) is a valuable way to focus users on their journey to becoming a customer, encouraging them towards a transaction. Using a CTA above the fold on a website’s home and product pages can add a sense of urgency to that customer journey, giving users an immediate direction.

However, the difference between a good CTA and a great CTA could be a minor tweak. One way to make a CTA stand out is to make them original; some styles of CTAs are used across multiple industries, so finding one that’s less utilised while still being direct is far more noteworthy. To strengthen this even further, explicit benefits can be added to a CTA to make them more enticing, such as ‘purchase now to save…’, ‘sign up today and get the best daily tips’ or ‘buy now and get 15% off’.

CTAs are also one of the most versatile UX strategies that can be applied to websites serving almost any industry, from e-commerce to information. While they serve to direct a user forward in their customer journey they’ll also decrease bounce rate, which is something that search engines can be wary of when deciding a website’s ranking.

5. Think carefully before using a carousel

Carousels serve the purpose of saving space on a web page, allowing more than one piece of content to occupy the same area of the home page. They also allow a quick comparison of products and can be initially visually appealing.

However, they can also be a source of frustration among users, as many are unlikely to wait for the images to turn to one that meets their needs, causing them to click away and look for more interactive ways to engage with the website or to simply leave altogether.

An nd.edu study reinforces this point, indicating that less than 1% of users click the feature, and, of those that did, 89% clicked the first item.

Unfortunately, carousels can also have a negative effect on SEO beyond user engagement. They can slow page load speed, they often don’t translate well to mobile, and, if implemented in Flash, their contents can’t be crawled by Googlebot.

If you use a carousel, make it easy to interact with by ensuring its movement is user-initiated rather than automatic. This avoids situations where users cannot read the content fast enough before it moves on, or links are clicked incorrectly as the carousel is in motion. Text can also be kept short and clear, so that users don’t feel burdened with a lot to read on each slide, and the carousel stays visually appealing.
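
As a sketch of that user-initiated behaviour (the class names here are hypothetical), the slides only move when a visitor clicks the previous or next control:

const slides = document.querySelectorAll('.carousel .slide');
let current = 0;

// Show only the requested slide; nothing advances on a timer.
function show(index) {
  slides.forEach((slide, i) => slide.classList.toggle('is-active', i === index));
}

document.querySelector('.carousel .next').addEventListener('click', () => {
  current = (current + 1) % slides.length;
  show(current);
});

document.querySelector('.carousel .prev').addEventListener('click', () => {
  current = (current - 1 + slides.length) % slides.length;
  show(current);
});

show(current);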

Overall, carousels do offer some advantages but they need to be well-optimised and positioned to make sure that their inherent downsides are mitigated for UX and SEO purposes.

UX and SEO should work hand in hand, with an improvement to one ultimately benefitting the other. As the boundaries between both disciplines become less defined it pays to focus on the user – a good experience for them will likely improve your website’s SEO too.

The post How UX can improve your SEO appeared first on Edit.


Google has increasingly been talking about the importance of creating a good user experience.

Unfortunately for many webmasters, understanding how Google evaluates what a good user experience looks like, as well as knowing whether changes will have the desired effect, can put them off making such changes. In this blog post, I look back at some work we did with a client to improve their UX (and specifically, their site architecture) that had a big impact on their rankings.

At the beginning of 2017, I started working with a bullion ecommerce site. They were Europe-based but had struggled to gain traction in the more profitable UK market, which is where we came in. It quickly became apparent to us that the site was difficult to navigate around and didn’t always highlight important pages to users, so we performed a review of their site architecture and made the following recommendations:

  1. Reduce the number of category levels – A typical user journey might require users to land on the following pages to make a purchase: Home – Main Category – 1st Level Subcategory – 2nd Level Subcategory – Product – Basket. In reality, the main categories didn’t really rank for much, so this level of the site could be removed entirely; it added no value and just created more effort for the user.
  2. Make important, 2nd level subcategory pages more prominent – This was done by including links to these pages in the menu which allowed users to bypass another layer of the site if they knew what product they were looking for.
  3. Make 2nd level subcategory pages accessible on mobile – Previously, 2nd level subcategories were only accessible on a side-navigation that was only visible on the desktop version of the site.
  4. Stop filters from creating thousands of unnecessary subcategory pages – Rather than using parameters when applying filters to a category page, the site had individual category pages for each variation of the filters. As an example, the category page for Sovereign coins had 30 different versions of almost the same page depending on which filters were applied. All of these pages were indexable, creating various content issues and a lot of category pages to sift through.
  5. Base naming conventions of pages on terms users are searching for – As an example, most users search for bullion bars by weight rather than brand, but the client was categorising their bars by brand.
  6. Remove unnecessary subcategories – This included pages users rarely landed on or that had very few products. Removing them reduced content bloat and eliminated many low-quality pages, combining them into broader, better-quality category pages.
  7. Remove unimportant links from the main navigation – This placed greater focus on more important pages making them easier to find.
  8. Make informative content accessible from the main menu
  9. Remove unnecessary filters and filter options
The results

It was a pretty hefty project and took several months to complete these tasks, but the results were substantial.
In the UK, visibility increased by 184% YoY, while Germany saw an increase of 495% over the same period. Spain and Italy also began overtaking the UK’s performance, having started 2017 with practically no visibility at all.


Organic traffic increases over this period were also substantial with an increase of 119% YoY which led to a 207% growth in organic revenue.

User metrics also changed in a positive way over the course of the year. We found that pages per session were down 24%, from 4.87 to 3.71, while average session duration was also down 21%, from 3:44 to 2:57. At face value this might be viewed negatively, but when we consider what the project was trying to achieve (i.e. helping users find the product they want as quickly as possible and in as few steps as possible), we realise that the data demonstrates exactly that.

How do I optimise my site architecture for SEO?

Based on our learnings from this project (and others), as well as Google’s own guidelines, we’d recommend making the following considerations as you review your own site architecture:

  1. Ensure the user journey doesn’t require landing on unnecessary pages.
  2. Remove unnecessary levels in the site architecture.
  3. Ensure that category pages relate to terms users are searching for.
  4. Ensure that category pages have enough products to make them worth landing on, as Google views limited choice as a negative for the user and reduces rankings for such pages.
  5. Ensure that only filters and filter options are included that are helpful to avoid cluttering.
  6. Avoid content bloat.
  7. Ensure that pages accessible on desktop are also accessible on mobile.
  8. Remove links from the main nav that are unnecessary to make it easier for users to find more important content.
  9. Don’t make SEO your only consideration – For example, just because a category page doesn’t target keywords with much search volume doesn’t mean it isn’t important. Sale pages are a good example of this: lots of users click on these pages once they land on the site, but don’t search for them.
  10. Categories in the menu should be ordered by the amount of traffic they receive, and subcategories should be ordered alphabetically to make content discovery easier.

We regularly come across clients who need to change their site architecture, and we find that once the right changes have been made, performance across all channels improves.

So, if you haven’t already, why not give your site architecture a good spring clean and transform your site from average to awesome?!

The post The importance of Site Architecture in SEO appeared first on Edit.


Along with images, CSS and JavaScript are two of the most common causes of code bloat and slow-loading pages. Sometimes (although not often) this is unavoidable and the functionality and design of a page genuinely requires a lot of code; however, there is nearly always room for improvement. Users don’t like waiting for a page to load, and search engines know it. You should try to improve page speed however you can.

It is particularly frustrating to see pages where large portions of the scripts being called by a page are redundant and not actually being used for anything. This can happen for a number of reasons including those listed below:

  • Scripts being called in the head section of a template that are required for some pages but are being called for every page on the site.
  • Scripts and styles that are only used when a certain part of the page is interacted with, such as tabs, accordions or faceted navigation.
  • Cruft that has built up over time, where additional functions or styles have been added without reviewing existing code to see whether it could be modified to fit newer requirements (Dawn Anderson recently talked about the concept of generational cruft).
  • Installation of Plugins / addons for a CMS where only a small subset of the functionality offered is actually required.
  • Poor coding.

There are many other reasons for why unused code can build up, but these are some of the common ones. You can read more on this topic here

How do we Identify What is and Isn’t Being Used?

There are various methods for identifying unused code, including browser plugins, custom-built testing scripts and third-party libraries, but there is also a really simple and handy way to do this right within Chrome Developer Tools, which is the option we will be looking at today.

The first thing to do is open the page you want to examine in Chrome and press F12 to bring up Developer Tools. Once you have the Chrome Developer Tools window open, press CTRL + SHIFT + P to open up the filter bar shown below.

It shouldn’t matter which tab you are in (Elements, Console, Sources etc); the filter should still open. In the filter box, type coverage, which should bring up the options shown below.

You should select the first option, Drawer: Show Coverage, which should bring up a couple of panels as shown below.

In either of the panels click the reload button which should refresh the page and start populating the bottom panel with data as shown in the example below.

In addition to the URLs of all scripts loaded on the page (not shown here, but they would normally appear on the far left), we also get the type of script, the total size of the file and the unused bytes. The bar on the right-hand side shows the split between used and unused code, with the red portion giving a visual representation of the percentages.

At the bottom left of the screen you will also see the total percentage of code loaded on the page that isn’t being used.

As you can see from the example above, of 2.3 MB of CSS and JS, 1.5 MB (64%) is not being used on this particular page on initial load. By clicking on an individual row in the bottom pane, you can also see the specific blocks of code that aren’t being used in the top pane.

It is also possible to get this information using Headless Chrome and Puppeteer; you can find out more about this here. This is a more complex approach, but it is much more scalable if you need to get data in bulk. We will talk more about it in a future post.
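
As a taster of that more scalable route, here is a minimal sketch using Puppeteer’s coverage API (the URL is just an example):

const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();

  // Start collecting JS and CSS coverage before the page loads.
  await Promise.all([page.coverage.startJSCoverage(), page.coverage.startCSSCoverage()]);
  await page.goto('https://www.example.com/', { waitUntil: 'networkidle2' });
  const [jsCoverage, cssCoverage] = await Promise.all([
    page.coverage.stopJSCoverage(),
    page.coverage.stopCSSCoverage(),
  ]);

  // Each entry lists the ranges of the file that were actually executed or applied.
  for (const entry of [...jsCoverage, ...cssCoverage]) {
    const total = entry.text.length;
    const used = entry.ranges.reduce((sum, range) => sum + (range.end - range.start), 0);
    console.log(`${entry.url}: ${total - used} of ${total} bytes unused`);
  }

  await browser.close();
})();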

What do I do With This Information?

Whilst it’s exciting to think that huge savings can be so quickly and easily achieved, it’s important to remember some of the points mentioned earlier and not just dash off and tell the dev team to remove all of the highlighted code (hopefully they would say no if you did this anyway!).

To get an idea of the impact of scripts not being available, you can test on a per-page basis by blocking resources and then interacting with the page to identify UI issues. You can find more information on this here.

Below we will discuss some of the ways that you can use this information to improve coding efficiencies on your site.

Defer Non-Critical Scripts and Styles

Aside from removing completely redundant scripts, one of the other common reasons for identifying code that is not used on initial page load is to minimise render blocking code that doesn’t actually do anything until a user interacts with the page. Examples previously mentioned include things like accordions, tabs, search functionality etc.

Tools such as Lighthouse highlight both render-blocking scripts and unused CSS, but they return the file names rather than the specific parts of the script that are unused. However, it may well be that parts of a script are required for the initial render while others are only used upon interaction.

By looking at a more granular level, it may be possible to split code into the parts that must be parsed at the start of the page load process and the unused scripts and styles that can be deferred until after the above-the-fold content has loaded.
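
As a simple sketch of the deferral idea (the bundle name is hypothetical), a non-critical script can be injected only once the load event has fired, so it never competes with the initial render:

// Load interaction-only behaviour (tabs, accordions etc.) after the page has finished loading.
window.addEventListener('load', () => {
  const script = document.createElement('script');
  script.src = '/js/interaction-behaviour.js'; // hypothetical non-critical bundle
  script.async = true;
  document.body.appendChild(script);
});

For scripts referenced directly in the HTML, the defer attribute on the script tag achieves a similar effect with no extra code.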

Use Different Templates for Different Types of Pages

Many sites include references to all external scripts on every page of a site which is often the reason for unused code. As an example, an e-commerce site might be loading all of the scripts and styling for product listings, search results, shopping baskets etc on the home page and other unrelated pages.

In such cases, we recommend having different templates for each part of the site to ensure that only the required functionality and styles are loaded at any time. Another alternative may be to use Google Tag Manager to conditionally inject script references based on a selector on the page; however, this is often not the best approach, and a template-based approach is normally preferable.

Use Custom Builds of Script Libraries

Use of third-party libraries such as jQuery and Bootstrap is another frequent cause of code bloat. Sites often load the full library when they are only using a small subset of the functions available.

There are a number of options available for building custom versions of these libraries to minimise the amount of unused code that is included. You can find a Bootstrap customiser here, a jQuery builder here and a jQuery UI builder here.

You should also consider whether you really need to load an external library at all. The greatest savings can sometimes come from using simple custom solutions.

The post Identifying Unused CSS and JavaScript with Google Chrome Dev Tools appeared first on Edit.


I was asked recently what I see as the future of eCommerce and search. In short, it’s data.

The biggest mistake people make in search is thinking Google is one step ahead. In actual fact, Google updates its algorithms based on how people interact with it. So, in theory, by using data correctly you can out-Google Google.

The customer profile will be more important than ever

If you know what your customers look like, how they like to interact and what devices they use (and at what time), Google doesn’t stand a chance!

Data can also help with your forecasting – you can use it to estimate traffic to your site, predict what products will sell well, and factor in special occasions such as the holidays and seasonal events. This is something Amazon does really well using Amazon Web Services.

You need to segment your data and create user profiles, but when you’re collecting data, make sure you’re looking at the quality, not just the quantity, and focus on one campaign at a time.

This is the simplest way to get more out of your marketing spend in SEO and PPC. But don’t rely too heavily on automation.

Stephen Kenwright described Google’s relationship with automation like Isaac Asimov’s Laws of Robotics:

  • A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  • A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.
  • A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.

Translated, this is what Google’s Laws of Automation look like:

  • Google must do what’s right for Google
  • Google must do what’s right for a user except where this would conflict with the First Law.
  • Google must do what’s right for those who pay Google, as long as this does not conflict with the First or Second Law.

Know your customer, and Google is yours for the taking.

There will be a shift from authority to relevancy

The more relevant the link, the more traffic will flow through it. If you get 50+ links from random bloggers, does that weigh the same as a link from the BBC? No!

The BBC link wins every time. But, in the same line of thought, does it make sense to weigh a link from a travel site to a finance site more than a link from a blog that talks about money? It shouldn’t, right?

Well, depending on your DA, it does.

Currently, Google ranks on high authority links pretty much indiscriminately. But when you think about the traffic the travel site will be passing to the finance site, the bounce rate and CRO must be ridiculous. It won’t be long before Google won’t appreciate a diverse link profile as much as it will relevant links that pass higher amounts of convertible traffic. This will also help end the war between journalists and PRs.

So many journalists now are picking up on digital PR, fighting back against asking for a link. To me, this makes no sense – they’re getting free content, so why bite the hand that feeds?

If you make it so that a journalist’s piece won’t make sense without linking back to you, they’ll have no choice but to do so if they want to cover the story.

There will be a voice crack

People will realise that voice isn’t that big of a deal. Stats suggest that by 2020, roughly 50% of searches will be voice searches. What we don’t look at is what those searches actually look like.

At the moment, what is being counted as voice searches are simple action tasks like creating a to-do list, managing calendars, checking out sports scores, using a calculator, and asking for information about local places.

Nearly half of interactions with an assistant included both voice and touch input. Only 8.1% of voice assistant users actually bought products with their assistants and 90% of those people used Alexa. So, to look at improving your sales in voice, you need to look at selling on Amazon first.

Amazon’s hold over eCommerce will only grow

Nearly 62% of people start shopping for a product on Amazon before they look on Google. This has massively increased from 2015, when that number was only 45%. 90% of your customers will look at your site, and then check Amazon to see if they can find it cheaper!

Why is Amazon outstripping Google in the eCommerce sphere?

Amazon cares only about buyers – it doesn’t care if you spend thousands in advertising, it doesn’t care if you’re a massive brand, it doesn’t even care if you’re an Amazon own brand.

If users are buying your products over others, you’ll be rewarded with a higher ranking. And that’s why Amazon still and will keep beating Google (see laws stated above).

The start of the digital high-street

In the long run the high street will eventually have to merge with digital. If it doesn’t, the high street will die out. Google is already using AR and reviews in Google Maps – how long do you think it’ll be before it brings that to shopping?

But what’s the best way to bring digital into the shops? Bring all the benefits of eCommerce and all the benefits of the high street together.

When a customer walks into a store, CCTV could be used to pick up their biometrics and start creating a profile. As the customer shops, this profile could be enhanced with smaller cameras embedded in shelves, which could pick up the customer’s likes and dislikes around the shop, reading emotions through facial reactions to different products.

This same concept could feed into an interactive mirror in a changing room. The mirror might suggest certain things – for example, because the customer likes X they should buy Y. The customer should also be able to access the same level of information about the product as if they were buying it online, such as if the product is available in a different size.

Finally, customers should be able to buy by selecting which items they want and ‘check out’ there and then. It might sound like something out of Minority Report, but it’s not. AmazeRealise has already created Match, which does exactly that!

Welcome to in store search optimisation.

So, when you ask what we should expect for search and eCommerce:

  • Digital will meet the high street. If it doesn’t, the high street will die out, but if the high street does merge with the digital sphere, it will start the rise of in-store search optimisation.
  • Amazon’s hold over eCommerce will grow stronger and when the eCommerce world understands that, SEOs will start moving our focus from Google and Bing to Amazon.
  • Because of the rise of Amazon, the world will realise that, at the moment and for the foreseeable future, voice search isn’t that big of a deal, and certainly not a big money maker… Unless your customers own an Alexa and you sell on Amazon.
  • Back in the more traditional realms of link building, the more relevant the link, the more traffic will flow through it – and hopefully, this will end the war between journalists and PRs.

Finally, and most importantly, with all the right data about our customers, we can out-Google Google!
Think you’re prepared for this vision of the future? Contact me at daniel.saunders@edit.co.uk to share your thoughts.

The post What to expect when you’re expecting: The future of search and eCommerce appeared first on Edit.


Site speed has always been important in SEO. Not always directly for rankings, but those interested in achieving high conversion rates have always placed importance on good user experience, and how quickly a page loads is a key component of that.

With recent changes to Google’s algorithms making speed more of a ranking factor, performance is only going to become more important, particularly given the dominance of mobile devices.

There are many different options when it comes to performance testing. Free tools include WebPageTest.org, GTmetrix, Pingdom and many others, but one of the most popular performance analysis tools available at the moment is Lighthouse from Google.

Lighthouse is popular for several reasons, including the fact that it is developed by Google themselves, so you can get a good idea of what they consider to be important. The reports generated by Lighthouse are also very comprehensive and provide clear feedback on what the issues on a page are and how they should be addressed.

A downside with this tool (as with many other page speed testing tools) is that it can take a while to run, and the manual method of checking individual pages is laborious and doesn’t lend itself well to bulk testing.

Fortunately, it is possible to run Lighthouse via a command-line interface, which means that you can have it running in the background, checking URLs without requiring you to keep going back to process new URLs manually. We’ll be explaining this in detail throughout this article.

How Much Time Can You Save by Running Lighthouse on the Command Line?

In considering options for efficiency improvements, it’s always important to weigh up the potential time savings versus the level of effort required for implementation.

When running a Lighthouse report manually there are two time aspects to consider: how long it takes to set a report running (going to the page, opening Chrome Developer Tools, selecting audit options etc.) and the amount of time it takes for the report to actually run.

For the 25 reports from the Edit website that I created when writing this post, I went through the process both manually and using the command line to compare the two.

Time taken to run the reports manually

For each of the 25 URLs it took around 20 seconds to go to the URL, select the performance-only report and start it running. This equates to around 8 minutes in total.

For each URL, I ran the report five times and took an average of how long it took to complete each run. The overall average was 36.37 seconds for each report to run, so the completion time for all 25 works out to be around 15 minutes.

Combining these two times, the total amount of time either passive or active was 23 minutes. Let’s compare that to the automated method.

Time taken to run the reports via the command line

Rather than entering each URL separately, we simply populate a text file with a list of the URLs that were tested previously (this process will be explained later).

As with the manual process, I ran each of the 25 URLs through 5 runs and took an average which came out at 12.55 seconds, with the time taken to process all 25 reports being just over 5 minutes.

Overall, the command-line approach was around 78% faster, with a time saving of 18 minutes. Whilst this is a good time saving, it doesn’t sound earth-shattering, but as we start to scale up the number of URLs being tested, the time savings become much more appealing.
This is shown in the example table below based on the quoted run times for each report:

It’s often not necessary to run such large numbers of Lighthouse audits and we often just take a smaller sample of pages, covering different types to get an idea of common issues. But, if you do have to analyse a large number of URLs, then the prospect of being able to save a couple of days is going to be too good to turn down!

Another benefit of running reports from the command line is that, by default, the report is saved as an HTML file, whereas with the DevTools method, if you want something client-facing you also need to export the data into a JSON file and send that to the client along with a link to the Lighthouse audit viewer.

There are a couple of problems with this. First of all, the JSON files are huge and can’t typically be emailed in large numbers, so an upload solution is required. There is also the extra hurdle of having to drag the files into the viewer to be able to look at them.

An alternative here is to use the Chrome Lighthouse plugin, which offers the same export features as the command-line-generated report; however, it has similar processing times to the browser-based method.
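
Another route, for those comfortable with Node.js, is to call Lighthouse programmatically via its npm module together with the chrome-launcher package (both need to be installed locally; the URLs below are just examples). A minimal sketch:

const fs = require('fs');
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

(async () => {
  const urls = ['https://edit.co.uk/', 'https://edit.co.uk/blog/'];

  // One headless Chrome instance is reused for every audit.
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });

  for (const url of urls) {
    // Performance-only audit, HTML output.
    const result = await lighthouse(url, {
      port: chrome.port,
      output: 'html',
      onlyCategories: ['performance'],
    });
    const fileName = url.replace(/[^a-z0-9]/gi, '_') + '.html';
    fs.writeFileSync(fileName, result.report);
    console.log(`Saved ${fileName}`);
  }

  await chrome.kill();
})();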

One of the issues that many SEO agencies face is getting buy-in from internal dev teams to implement changes, and anything we can do to make their lives easier only improves the likelihood that our recommendations will be implemented – and this is one small step in achieving this.

Getting Started – What you need to run Lighthouse from the command line

There are only 3 things that we are going to need to have installed in order to be able to run Lighthouse from the command line:

  • Google Chrome
  • Node JS
  • Lighthouse
Google Chrome

The first thing to do is make sure you have Chrome installed. Even though we are going to be batch processing reports in the background the Chrome browser is still required to create headless instances.

Node.js

After we make sure that Chrome is installed, the next thing we need to do is download Node.js. Windows users should download the LTS version which at the time of writing is 8.11.3.

Mac OSX users should download the macOS installer here

Node.js is a JavaScript runtime environment that allows us to take advantage of the power of JavaScript outside of a browser window. You can read more about it here.

Node.js also has NPM bundled with it and we will be using this to install Lighthouse. NPM is a package manager for JavaScript applications and allows developers to share and distribute code providing access to a huge number of libraries and making installation quick and easy.

Lighthouse

Now that we have Chrome and Node.js installed the final step is to install Lighthouse so we can call it from the command line. This is a very quick and easy process.

Windows Users

Firstly, you need to open a Windows command prompt. You can typically do this by opening the run dialog box and typing cmd as shown below:

This should open up a command prompt window like the one shown in the screenshot below. There are various different methods for opening the command prompt depending on which version of windows you have.

This article shows different methods for Windows 7, 8.1 and 10. I am using Windows 10, but the methods are pretty much the same across all recent versions of Windows.

* Note: If you find the font size in the command prompt on Windows too small you can right click on the top bar and adjust the size of the text to make it more comfortable to read.

Now that we have a command prompt open, we can install the Lighthouse module by entering the command below and pressing enter:

npm install -g lighthouse

In the command above, we are using npm install to install the Lighthouse module from the NPM repository discussed previously.

The -g flag means that the module is installed globally and is not dependent on the directory that you are currently in. You can find out more about this here.

Finally, the lighthouse part tells npm which particular module you want to install. Once you have entered the above command, hit return and you should see the following appear, showing a progress bar.

As you can see from the final line, the whole process took around 25 seconds to complete, so it is not a long installation. If at any time you want to uninstall Lighthouse, you can simply run the same command in reverse: npm uninstall -g lighthouse

Mac Users

Open Terminal (an easy way, if you don’t have a shortcut on your taskbar, is Command + Spacebar, then type ‘Terminal’).

On the command line enter the following:

sudo npm install -g lighthouse (you will be required to enter your password)

Note – If you have any problems getting the npm command to run, you may need to open a new command prompt (hat tip to Mike G for that).

Your First Lighthouse Audit Report from the command line

We now have everything we need installed to generate reports from the command line, so let’s go ahead and do that!
First, we will run a basic report, and then we will look at some of the many options available for customising what is generated.

To run a basic report, simply enter the following command and hit enter:

lighthouse yoururl

* yoururl should be the URL that you want to test so in my case I would enter:

lighthouse https://edit.co.uk

After you hit enter you should initially see a screen that looks like this:

In this instance you will also see a Chrome window pop up, which will start modifying the display of the page in question (we will talk about how to avoid this in the next section).

After a short delay, the command prompt will start listing what Lighthouse is currently testing, and when it has finished it will show you a link to where your report has been generated.

For a basic run of the tool, the report will be generated in whichever folder you currently happen to be in. In my case I was in the root of my H:\ so that is where the report will be saved. By default, the filename should be the URL of the page tested and the date and time when it was generated.

And that’s it, you have (hopefully) run your first lighthouse report from the command line! So, let’s take a look at what you actually get…

The contents of your Lighthouse Report

Those reading this article are most likely familiar with Lighthouse reports, and what you get as standard when generating reports via the command line is pretty much the same. Depending on what categories you have specified when running the report, you will get up to 5 areas of analysis:

There are differences in some areas, for example the metrics used in the performance section. Here are the ones that are the same for both methods of generating the reports:

The browser-based report also has a “First Interactive” metric, while the command-line version has “First CPU Idle” and “First Contentful Paint”.

One of the most significant differences between the two methods of generating the report is the ability to pass the information on to a third party. With the browser-based approach you can export the data to a JSON file, but unfortunately the file generated is huge and needs to be interpreted with the audit viewer mentioned earlier.

With the command-line version, the report is natively generated in HTML and can immediately be forwarded on. You are also provided with a number of other options, as shown below:

This is one of the key benefits to generating reports in this way as you are able to immediately share the results without having to forward on both a large JSON file and instructions for how to view it.

Customizing your reports: Going beyond the basic Lighthouse CLI Report

As mentioned earlier, we have only looked at the basic default settings so far and this is just the tip of the iceberg when it comes to what you can do with this tool. In this section, we will look at some of the ways that you can customise how your reports are generated. We won’t cover all of the available options in this post but will look in more detail in future posts at what you can do.

To see a list of all of the configuration options you can run the following from the command prompt:

lighthouse --help

This will give you a list of all the available options as shown in the screenshot below:

Below we will run through some of these options and what they do. It’s useful to note that each flag should be prefixed with a double hyphen, e.g. --quiet, and that generally the flags don’t need to be in a specific order to work.

Quiet and headless

By default, when you request a report, a Chrome browser will open, and the command line will log each part of the process. If you want the report to run silently you can use the following:

lighthouse https://edit.co.uk --quiet --chrome-flags="--headless"

In the example above, the --quiet flag means that nothing is shown in the command prompt until the process has ended, and the chrome-flags value runs the process in a headless instance of Chrome so that a browser window isn’t opened while the report runs. The latter is particularly useful if you are running a batch of reports in the background, which we will cover later.

Once the process has finished, a report will be saved in the folder that you are in when you initiate the command with no notification.

View

If you want to know when your report has finished, you can add the --view flag so that your report opens in a browser window once it has finished running. An example in conjunction with the previous flags would be:

lighthouse https://edit.co.uk --quiet --chrome-flags="--headless" --view

Output

Running Lighthouse from the command line allows you to generate reports in three different formats: HTML, JSON and CSV. The default is HTML, but you can use any of the three, or even generate a report in all of them at once. An example of generating a CSV version of the report that we looked at above would be:

lighthouse https://edit.co.uk --output csv

To generate a report in both CSV and HTML formats, you could use:

lighthouse https://www.edit.co.uk --output csv --output html

Categories

As with running reports through the browser you can choose which categories you want to report on. If you wanted to only get performance data, you could use the following:

lighthouse https://www.edit.co.uk --only-categories performance

If you wanted to get both performance and SEO you could do:

lighthouse https://edit.co.uk --only-categories performance --only-categories seo
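
These flags can of course be combined. For example, a quiet, headless, performance-only audit saved in both HTML and CSV might look like this:

lighthouse https://edit.co.uk --quiet --chrome-flags="--headless" --only-categories performance --output html --output csv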

Running Batches of Lighthouse Reports on the Command Line

At this point you might be thinking this is all well and good, but it’s not really going to save me any time. So, we will now look at running batches of reports which is where you will hopefully start to see the benefit of this approach to running Lighthouse audits from the command line.

For this next part we are going to create two files, one to store our list of URLs and another to hold the script that will generate our reports. The first thing we need to do is choose where we are going to store our files and create a new folder.

There are a few differences here between the Windows and Mac processes which will be covered in the two sections below:

Windows Users

I am going to create a folder called demo in this location

C:\Users\mike.osolinski\Documents\Local\Lighthouse\demo but you can create a folder anywhere you want and name it whatever you want.

Important: Make sure that the path does not contain any spaces.

Once you have created a folder to hold your files create a text file called urls.txt. This is the file that we will store our list of pages to test in.

Open your text file and enter a list of URLs as shown below:

Save and close this file, then create another in the same folder called test.bat (the file name is arbitrary) and open it for editing in your favourite text editor. I use Notepad++, but anything will do really; you can just use basic Notepad if you prefer.

In test.bat you should add the following lines of code replacing C:\Users\mike.osolinski\Documents\Local\Lighthouse\demo\urls.txt with the path to your urls.txt file. We will run through what each line of code does below.


@echo off
for /f "delims=" %%a in (C:\Users\mike.osolinski\Documents\Local\Lighthouse\demo\urls.txt) DO (
ECHO Line is: %%a
lighthouse --quiet --chrome-flags="--headless" %%a
)

In the script above the first command we see is @echo off which relates to what output is in the command window. If we run the script as is for a single URL, we will see something like the below:

If we remove the @echo off line and re-run the script, then the output will be as follows:

As we can see when the @echo off line is removed the whole of the test.bat file is also echoed to the console. This isn’t a particularly big deal for us as the script is so small, but if you’re running very large scripts then you may want to avoid this.

The next line (see below) is extremely important and key to our script running properly. It’s a bit more involved than the first line, so we’ll break it down into separate parts.

for /f "delims=" %%a in (C:\Users\mike.osolinski\Documents\Local\Lighthouse\demo\urls.txt) DO (

First, with the opening for we are creating a for loop to read through each of the URLs in our text file. The /f switch indicates that we are going to loop over the contents of a file, and "delims=" controls how each line is split into tokens; leaving it empty means each whole line (in our case, one URL) is passed into the loop in one piece. You can read more about the options for setting the source data type and delimiters here.

Next, we need to declare a container variable to store each value, which is where %%a comes in: on each pass through the loop, %%a holds the current URL from the file, and that value is handed to the lighthouse command.
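
The original article goes on to cover the Mac process separately. As a rough equivalent for macOS or Linux users, a minimal shell-script sketch (assuming lighthouse is installed globally and urls.txt sits in the same folder as the script) might look like this:

#!/bin/bash
# Read urls.txt line by line and run a quiet, headless Lighthouse audit for each URL
while read -r url; do
  echo "Line is: $url"
  lighthouse --quiet --chrome-flags="--headless" "$url"
done < urls.txt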

Read Full Article

Most of us are aware that changes to your site pose a risk to your rankings – and a site migration is the single biggest thing you can do to risk it all in one go.

The term ‘migration’ is thrown around quite a lot, but I tend to define it as: “any significant change that would require a reassessment of a site by search engines.”

This can include everything from content and design changes through to alterations of URL structure, switching to HTTPS, or moving domain completely.

I advise against changing any more than two of the above at any one time, otherwise the risk increases significantly. You will be required to have a range of fall-backs in place, which will increase the required preparation time and the number of developers you need on hand in the weeks following the migration.

Over the last 18 months, I have been involved in dozens of site migrations at various different points, and the most challenging ones to fix usually have one (or more) of these issues:

1. Adding a 301 redirect to your historic robots.txt file

You’re migrating your site and your SEO says you need to redirect everything, right? Wrong!

Search engines that are used to finding a robots.txt file on your site will hold back from crawling the site if they find that the file now redirects. This is bad because it means all the redirects you have put so much thought into will not be followed by search engines: they aren’t visiting those pages, out of fear of crawling something they shouldn’t be.

Over time, the redirects will eventually be followed. But I have seen instances where this takes three to four months and the visibility of the site never reaches the height it would have if the redirect on the robots.txt was not there in the first place.
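
A quick way to sanity-check this after go-live (a minimal sketch, with old-domain.example standing in for your real hostname) is to request the old robots.txt directly and confirm it returns a 200 rather than a 301:

curl -I https://old-domain.example/robots.txt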

2. Lack of understanding of the existing value within the site

Knowing what you stand to lose and what you can afford to lose is hugely valuable. Not all pages are created equal, and knowing what’s important means you won’t be leaving behind any equity that was helping your current site to perform.

Make sure you understand:

  • What drives the revenue for your site
  • Where search engines see value
  • What pages have links
  • Which content engages users the most

This way, you will know what sections and pages are worth your time when it comes to redirecting or recreating.

It can also help generate buy-in when you ask for more development time or request the deployment to be delayed by a week so you can get everything sorted because you (the SEO) weren’t brought into discussions early enough (more on that later).

3. No awareness of historic domains redirecting into your existing one

I have seen several variations of this over the years, with businesses that owned multiple domains and unknowingly pointed one with a manual action at their main site (yes, that really happened).

The resulting fallout was significant, and removal of the manual action on the redirecting domain was only half the issue, as the main site was caught by Penguin as a result.

As recently as March 2018, I have seen businesses fail to check what they are redirecting into their site and visibility has gone backwards rather than achieving the hoped-for improvements.

This point and point 2 on this list are always caused by the next point.

4. Bringing an SEO into discussions too late

Businesses have gotten in touch with us after the damage is done, meaning we need to take action to reverse the issues. This isn’t always the case – I’ve also been involved in discussions from day one.

The latter is (obviously) preferable. While it’s highly likely that the SEO won’t have the most to say in those early meetings, over the course of several meetings you’ll likely notice that your SEO can steer other teams around potential issues (such as points 2 and 3), which otherwise would have been missed had they not been in the room.

It also helps to have someone to rein in those who want to change every detail, from URL structure to on-page content and metadata, without considering the organic impact.

5. Forgetting about Search Console access and consolidating disavow files

This isn’t necessarily something that is going to make your migration a success or a failure, but rather a word of caution. Just because your old site no longer exists, it doesn’t mean your old Search Console profile is redundant.

Not only will it have historic data that may come in handy for comparisons in the new site’s first year, but I have seen instances of Google communicating with webmasters through the original GSC profile and not the new one. This is more common when the site has moved from HTTP to HTTPS.

I have also seen sites with manual actions for spammy structured markup or security issues where messages persist within Search Console’s “manual actions” section in the HTTPS version of the site, despite a separate message confirming removal of the manual action.

Only when the reconsideration request or request for a review was submitted through the HTTP profile were the messages completely removed. Pedantic? Yes, but your client or boss will be thankful you removed any messages in the site’s Search Console that are bringing bad news from Google.

Finally, something that should be par for the course – move the disavow file over from your old domain to your new one, because that can prevent the depressing visibility graphs you will see if you make the mistakes in point 3.

The post 5 Reasons your site migration didn’t go as planned appeared first on Edit.

Read Full Article

SEO for eCommerce is completely different from traditional SEO. It requires skills across multiple disciplines. As an eCommerce SEO, you need an in-depth understanding of human psychology across a wide range of customer profiles, which is essential for conversion rate optimisation, analytics, social media, and selling.

To top it off, you also need to understand the economy, usability and user experience, technical SEO, as well as PPC. The best eCommerce SEO specialists also have deep insight and understanding of the digital and retail world and know how to bring them together.

If it sounds like a lot, that’s because it is. But don’t despair – we’ve broken it all down for you here, covering everything you need to know to improve your eCommerce SEO.

Get your keywords right and that’s half the job done!

Your keywords are the first thing you need to get right. Our top tip: don’t be too broad or too competitive.

Too broad, and you’ll end up with a ridiculous bounce rate and a poor conversion rate because people who click on your site won’t be able to find what they’re looking for. Too competitive and you’ll be up against sites with more authority and more trust, meaning it can take a really long time to achieve high rankings.

Include site search when conducting your keyword research to see what your customers are looking for. They might be looking for products you don’t have, which can give you ideas for expanding your retail offering. It’ll also allow you to look at what people are searching for compared to what’s selling, to help you spot potential issues and areas of improvement.

Tips on how to find the right keywords

First of all, you need to find your niche: what separates you from your competitors. Then you need to use the right tool to find those keywords (I use an active Google Ads account and SearchMetrics). Once you’ve found your keywords, refine them, find out how competitive they are, and prioritise them.

Make sure your site is optimised for those keywords. Use them in:

  • URLs
  • Titles
  • Headers and subheaders
  • Image file names
  • alt tags
  • Meta title and description

When creating content for your eCommerce site:

  • Avoid keyword cannibalisation: This is what happens when you try to rank for loads of keywords across several pages. Google picked up on this in 2008 and its Panda algorithm is programmed to punish it in real time.
  • Don’t keyword stuff: Use your blog posts for keywords that you can’t get into any other content. This helps develop awesome and unique content that your customers will read and share.
  • Write for people, not search engines: Google’s algorithm now rewards sites that create great content and penalises sites that keyword stuff or use other tactics that can be seen as manipulative.

Technical SEO: Clean your house before people come over

Do you clean up your house before inviting people over? Yes! So why don’t you do the same for your site?

If you don’t get your technical SEO right, none of your hard work will get indexed and you’ve just flushed away all that budget spent on outreach. Make some nice clean URLs, add detailed breadcrumbs, make the journey from viewing to purchasing as simple and as straightforward as possible, speed up your page loading times, check for 404 errors and 302 redirects and, for my sake and your own, make sure you’re on HTTPS!
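
If you want a quick spot-check on individual URLs while you audit (a minimal sketch; replace the example address with a real page on your site), curl can show you the raw status code a URL returns:

curl -s -o /dev/null -w "%{http_code}\n" https://www.example.com/old-product-page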

Part of this involves conducting periodic SEO analysis. SEO is not static, nor is your eCommerce website. Your site and code will change. A developer may, with the best intentions at heart, fix one problem but create another.

Design for mobile and desktop. Right now, there’s no question that Amazon rules eCommerce. It’s only recently that more people have started shopping on Amazon on their mobile rather than their desktop, and the balance is still pretty tight – make sure you are not missing out.

Schema.org markup is a nice, easy win – I’m sure you all know about it, but it wouldn’t be an eCommerce blog if I didn’t mention it. Use it wisely and you’ll get review stars next to your listings in the search results, which can improve your conversion rate.

Make sure you have quality content!

When Google first released the Panda algorithm, quality content became a top priority for any site. This is easy to fix – for example, product pages need to be more than just pages with a photo and blurbs about a product. Vue does this extremely well.

Amazon features customer product reviews, photos, and videos. To increase your SEO rankings, you also need to have custom product descriptions.

Things to always remember when implementing onsite content:

Duplicate content is the enemy

The downside is, many eCommerce stores have a large amount of duplicate content as a result of product descriptions and lists, which can get penalised.

Assess your site and look for ways to reduce the amount of redundant and duplicate content. Careful usage of the canonical tag can help avoid these problems as well.

Be wary of filters

I say be wary because, in terms of UX, they are great as they help your users find the products most relevant to them, pushing up your conversions.

The issue in terms of SEO is that most filtering systems generate unique URLs for every type of filter search, which means that one site could have thousands of indexed pages, all with duplicate content issues. As a result, it can make your site look like a content farm in the eyes of Google.

Make sure you blog

Creating blog content can assist in ranking your eCommerce business for additional keywords that might not have a place on your main website. Plus, you can capitalize on long-tail keywords.

Sell to humans, not search engines!

I read a really interesting blog by our very own Jen Derrick about this very subject, which will help you create product copy that speaks to both Google and your customers.

Another thing to remember – add a “psychology” layer to your content. Manufacturer’s descriptions are often dull and technical because they have to be. People, however, buy based on emotion! Use language that people will respond to, while still being clear about what you’re selling.

Images are really important as well – make sure to use ones that are high quality, and make sure product images aren’t too large, or they will slow your page speed. Don’t forget the importance of image search: add appropriate alternative text to all images and videos.
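
As one illustrative way to keep image weight down (a sketch, assuming you have Google’s libwebp command-line tools installed and a product.jpg image to hand), you could convert images to the WebP format, which is typically much smaller than an equivalent JPEG:

cwebp -q 80 product.jpg -o product.webp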

You need GA analysis to help prioritize areas for improvement

As Stephen Kenwright once said, “be like Amazon and beat Google at search.”

How do you do this? By using the right tools, you can see the keywords your customers are searching for on your site and use them to calculate the revenue they generate.

Make sure you track the clicks you want and don’t want and build dashboards containing actionable information. Better yet, speak to Emma Barnes because this is hard and you need someone who lives and breathes it to get it right.

Conduct relevant CRO

The best way to describe how to do well in CRO is to put the focus on the user. Test absolutely everything, collect information, and use that information to make sure you continue evolving.

Think like a customer and map out their journey! The downfall of ecommerce is that your customer can’t touch your products before they make a decision. Currently there’s no solution for this, unless you’re able to send out free samples or trial products.

Making the journey from viewing your product to purchasing as simple and as clean as possible can help to improve the customer experience, along with having a clear returns policy and shipping information.

Always test! Before, during, and after you launch any ecommerce business, you should invest in testing and analytics. Think like the customer and figure out what’s working, what’s not, and the why behind those answers.

Don’t rely solely on PPC (but also don’t forget it)

Most companies fall back on a pay per click (PPC) strategy to create visibility for their store. The truth is that PPC costs continue to go up and once you stop paying for placement, your online presence can disappear.

Some customers have an inherent distrust of sponsored links, banners, and other online ads, meaning we can’t solely rely on a PPC strategy – we need organic search too!

That said, PPC is a heavyweight in the ecommerce market, and it’s important to find a place that suits your brand and your customers. When implementing PPC:

  • Test ads that compel people to purchase
  • Sell in a unique way
  • Target purchase-intent keywords

Intertwine your email and social

Social media works well. Endorsements and reviews from happy users or customers will not only enhance your SEO campaign but also boost sales conversions.

Have social integrated into your site to build a community of happy buyers. The easiest way to do this? Fill in Google My Business, Facebook, and Twitter business forms.

Final thoughts on the future of eCommerce

What can we expect for SEO eCommerce in the not too distant future? Be prepared for the rise of voice search. The stats indicate that it’s not going to be long before more than half of all search is done without a screen.

So, when your customer asks Alexa, Siri, Google, or Cortana to add toothpaste to your shopping list, how do you make sure that it’s your toothpaste? What happens when all eCommerce is Amazon? Why is Dan just answering our questions with more questions? Because I’m hoping your naturally inquisitive nature will entice you to reach out and ask me these questions in person! Get in touch with me at daniel.saunders@edit.co.uk

The post A complete introduction to eCommerce SEO appeared first on Edit.

Read Full Article
