Follow Blueclaw's SEO & Digital Marketing Blog - E.. on Feedspot

Google has recently introduced three new metrics for its Shopping campaigns. These metrics show how your pricing compares with that of other advertisers for comparable products. We’ve found them incredibly useful for fine-tuning Shopping and wider direct-response marketing campaigns. This article explains what they are, how to access them and their potential uses.

What are benchmark pricing metrics and how do I view them?

Price benchmarks show, on average, how other merchants are pricing the same products that you sell. There are three metrics:

  • Average product price: This is the average price of a product when your ad showed or when your ad was competitive in an auction.
  • Benchmark product price: This is the average click-weighted price for a product across all merchants who advertise that product using Shopping ads.
  • Benchmark product price difference: This is the percentage difference between your product’s average product price and the associated benchmark product price. For product groups, the price difference is weighted based on your product’s traffic potential. More popular products will be weighted more than less popular products.

(Source: Google, 2019)
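To make the definitions concrete, here is a rough illustration of how the percentage difference and the traffic-weighted group figure could be computed. The prices and weights below are hypothetical and this is a sketch of the arithmetic described above, not Google's exact internals:

```python
def price_difference(avg_price: float, benchmark_price: float) -> float:
    """Percentage your average product price sits above (+) or below (-)
    the click-weighted benchmark price."""
    return (avg_price - benchmark_price) / benchmark_price * 100

def weighted_group_difference(products: list[tuple[float, float, float]]) -> float:
    """Traffic-weighted price difference for a product group.
    Each tuple is (avg_price, benchmark_price, traffic_weight)."""
    total_weight = sum(w for _, _, w in products)
    return sum(price_difference(p, b) * w for p, b, w in products) / total_weight

# Hypothetical example: a product priced at £55 against a £50 benchmark
# shows as +10% (you are 10% more expensive than the market).
print(round(price_difference(55, 50), 1))  # 10.0
```

A group containing one product priced above benchmark and one below can still net out close to zero, which is why the per-product view matters as much as the group figure.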

This data can be viewed by clicking into a Google Shopping campaign, navigating to the Products section in the left-hand navigation, and then adding the columns from the custom columns menu on the right-hand side of the interface.

Where To Find Benchmark Metrics

How do I use this data?

Google outlines four primary uses of the data:

  1. Bidding: Identify products that are price-competitive and adjust your bids accordingly, or bid down on less price-competitive products. Combine strategies by pricing attractively and raising bids for popular products.
  2. Diagnostics: Understand changes in performance caused by pricing alterations or changes in the competitive environment.
  3. Assortment: Inform assortment decisions by understanding demand at different price points.
  4. Pricing: Understand how your prices compare to other merchants’ to drive pricing decisions.

(Source: Google, 2019).

We think the scope of use for this data could be much broader. For example, why stop at bidding for your Shopping campaigns? Surely understanding which products you’re competitively priced for should be considered as part of your wider paid search bidding strategy?

Then there are wider channel uses: could this data influence which creative you use for Facebook Ads campaigns, for example? Is your price competitiveness impacting your wider channel performance (e.g. paid social, organic search, direct traffic etc.)? These are just a few of the broader potential uses for this data.

Bear in mind this will only show data for merchants in the same Google Shopping auctions as you, so it should not be a substitute for keeping an eye on competitor pricing!

Concluding Thoughts

As useful as this data is, price is only one of the four P’s of marketing (or seven P’s depending on which framework you use). Whilst price is undoubtedly important, other factors such as brand recognition, delivery options, checkout experience and seller reviews – to name but a few variables – will have a huge impact on your campaign performance. This means it’s not as simple as “we have the lowest price so why aren’t our shopper ads converting?!”

But there is no doubt this data has several uses as highlighted earlier. These metrics are also in beta so we can expect improvements as Google develops its benchmark reporting.

Hopefully this article has been of use to you and thanks for reading it! If you’d like to discuss anything to do with Google Ads and how to get the most out of it, then please feel free to get in touch.

Josh Colbeck

Head of Biddable

Email: josh.colbeck@blueclaw.co.uk

The post Google Shopping Benchmark Price Metrics appeared first on Blueclaw.


The quality of your on-site content is undoubtedly one of Google’s core ranking factors; Google’s Panda algorithm update heavily focused on rewarding sites with high-quality content and demoting sites with low-quality, thin content.

Google prefers information-rich pages written for the user and will penalise pages it thinks don’t provide enough value. Valuable, user-centric content is therefore essential to ranking well for relevant keywords, as well as aiding on-site user engagement and improving conversion rates.

Back in 2011, the same year that the Panda update was released, Google provided some guidance on which aspects need addressing for a website’s on-site content to be deemed “high-quality”.

To this day, especially since the E-A-T based algorithm update, answering these questions provided by Google with your on-site content is more important than ever.

In this blog post, I’m going to focus on the question that I think is arguably the most actionable and how you can address this to give your pages the best chance possible to rank highly in Google.

Does the page provide substantial value when compared to other pages in search results? (Google, 2011)

Evaluating websites which rank well for your target keyword(s) is the best place to start when putting together a content strategy to create a new page or update an existing one.

You should be looking at aspects such as:

  • The comprehensiveness and depth of information they provide around the topic
  • Types of content – Do they include a range of images and an informative video that provide extra context for instance?
  • The order of content – How have they structured their content? Do they meet the most important needs of the user first?
  • The level of portrayed expertise of the topic – Is it written by, or does it quote, an expert within the particular industry for example? 

For this example, I’m going to put myself in the shoes of Vodafone. Their iPhone X page currently ranks on the second page for “iPhone X Contracts”, which is pretty poor considering how big and relevant the brand is. It isn’t for a lack of backlinks either; the page has a total of 63 referring domains.

In comparison, Mobiles.co.uk, a lesser-known brand, ranks comfortably on the first page for “iPhone X Contracts”, with fewer referring domains than Vodafone (42).

However, in terms of content, Mobiles.co.uk’s page includes an array of useful, informative and relevant content features such as:

  • An expert review

The expert review here is great for users and for Google; it adds a level of expertise to the content while also being a very useful and informative feature for users in terms of helping them decide if the specific phone is best suited to them.

  • User-generated reviews

User generated reviews are great for keeping your content fresh and ranking for longer tail keywords, as well as being super valuable for users in terms of finding out the thoughts of “real” owners of the phone.

  • Unique copy describing key details of the phone

Mobiles.co.uk include a range of these feature sections (examples above). They’re extremely relevant, highlighting the key features for users as well as boosting the topical relevance of the page – something that Google looks for.

  • Related blog posts

Supplementary content in the form of related blog posts is a great way to keep users engaged on your site, while also providing a good opportunity for internal linking to relevant guides and blog posts.

Vodafone’s iPhone X page offers nothing of the sort, in terms of unique content at least; the main content features are copied and pasted from Apple’s iPhone X page (an issue in itself), with a very basic description of the phone’s specifications, as detailed below.

Competitor content analysis is just one aspect of making your content as high quality as possible, but at the least it provides a great foundation to work from. The next step is thinking about what you can do to make your content better than your competitors’. For any help with your on-site content, get in touch with Blueclaw’s SEO team.

The post Competitor analysis for an informed on-site content strategy appeared first on Blueclaw.


In the US, Mother’s Day is the time to celebrate someone that looks after and provides for you. And yet, most greeting card companies take such holidays as an opportunity to perpetuate sexist gender stereotypes. Mothers, in the world of these companies, are self-sacrificing, unthreatening and nurturing. Their main hobbies in life are ironing, tidying up after others and slaving away over the stove, while their husbands jovially play golf before enjoying a nice cold beer with their manly friends.

While mothers may provide, nurture and care for their families, they’re so much more than that. They are real people with real interests and achievements that extend beyond the realm of the 1950s housewife. And while they may only be greeting cards, gifting them to your mother only normalises the archaic attitudes that we’ve tried hard to stamp out.  

It’s not just Mother’s Day that’s fallen victim to these ridiculous stereotypes. In early April, a pair of amorous greeting cards went viral for their shameless perpetuation of the belief that women love to receive flowers in return for making hearty sandwiches for their manly man. And to add insult to injury, the envelopes were even colour-coordinated blue and pink to appeal to their respective sexes.   

The majority of Mother’s Day cards available today are plastered with garish designs of flowers and sparkly wreaths, alongside messages encouraging the poor women to relax and leave that inevitable pile of ironing until tomorrow – when it will still be there for her to delve into, of course.

There’s nothing wrong with children saying thank you to their mother for all the support and comfort they provide, but why, in 2019, are we still dividing these traits and tasks by gender? Times have changed, attitudes are evolving and it’s about time that we have some greeting cards that reflect what mothers really do.

Our talented design team have created six new realistic greeting card designs for multidimensional mothers in 2019, each highlighting the roles mothers really play.

Working Mum

While there’s nothing wrong with being a stay-at-home mum, greeting cards rarely acknowledge working mothers. If they do, job roles are limited to receptionists, cleaners and nurses. This greeting card gives a more accurate representation of the wide range of jobs working mothers do – i.e. any job a man can.

Single Mum

Single mothers have long had their strength and capability undervalued in favour of the single dad, who appears far more frequently in the greeting card world. This card salutes the mums who are doing the jobs of both parents and are absolutely crushing it.

Boss Mum

While the majority of Father’s Day cards praise dads for their business prowess, mums are rarely given any praise for the work they do outside of the home. This card is for the mothers who work hard to provide for their family.

DIY Mum

Father’s Day cards may make out that DIY is solely a man’s job, but few mothers have never assembled flat-pack furniture or painted a wall. This card is for the mothers who help maintain their home, and not just by hoovering or washing up.

A Real Stay-At-Home Mum

A stay-at-home mum does far more than just cook and clean. Whether it’s checking for monsters under the bed, spending hours on end keeping guests entertained or working hard to resolve family disputes, this card gives thanks to one of the most difficult jobs.

Active Mum

Golf has become the unofficial symbol of Father’s Day, along with pints, slippers, pipes and tools. But what about women who are equally passionate about the sport? With more focus being placed on women in sport than ever before, this card is for the mothers who have active interests.

If you’d like to show that you’re forward-thinking with a PPC or SEO strategy that promotes your values, get in touch with us today.

The post Mother’s Day: what’s really on the cards? appeared first on Blueclaw.


As an exciting and informal kick-off to this year’s Leeds Digital Festival, we’re very pleased to announce Blueclaw’s latest event – The SEO Mixer.

Taking place at The Adelphi on Tuesday, 23rd April, this free event is aimed at bringing the search professionals of Leeds together for a night of top-class SEO talks, great drinks, and networking in a fun and friendly environment.

The agenda thus far includes speakers from Digitaloft, Delete, Pendragon, and of course, Blueclaw, covering core topics such as onsite usability, hands-on SEO growth hacks, and the fabulous world of link-building.

Following each presentation, we’ve planned loads of time to drink and chat about the areas discussed, so come along to enjoy some food for thought, SEO based schmoozing, and a beer on us!

For more information, and to sign up for your free ticket, please visit the event page here.

The post The SEO Mixer: Thinking, Linking and Drinking appeared first on Blueclaw.


Before moving to Blueclaw, I had spent five enjoyable (and often very stressful and eventful) years working in sports journalism.

For somebody working on the front line as a reporter, having to forge working relationships with PR firms to get the best interviews and the best access was a crucial part of the job if I wanted to stay one step ahead of the competition.

I was fortunate to spend time both in the field and on the editorial desk of the national agency I worked for. It gave me a great grounding in how to deal with the huge variety in campaigns that would undoubtedly flood my inbox on a daily basis.

So often I would field calls and give the standard advice to ‘send a release over email’, without ever really having the intention of following it up, but every now and then one would slip through the net and catch the eye.

But what is the best way to attract the attention of a sports journalist, or even an editor on a regular basis?

Access

The number one attraction for a sports journalist is access. Whether that is the opportunity to speak to a current sportsman/woman or someone relevant to a particularly current topic, the chance to get a fresh perspective and fresh quotes is what drives a story forward.

That, ultimately, is the holy grail for editors around the world.

For example, while working in Scotland, I worked closely with an agency representing a betting company, which gave me access to a football manager who was out of work. By the end of my one-on-one interview, I had managed to get him to confirm his interest in a vacant managerial position in the Scottish Premiership. By the end of the week he was in formal talks to take over and was eventually hired.

There are many ways to get an exclusive, but as a journalist, fostering positive relationships with PR firms can put you first in line for the access that makes a splash and moves the story on. In return, you provide publicity to the original company, and everyone is a winner.

Keep it short and snappy

There was nothing that turned me off more than a lengthy, well-rehearsed pitch over the phone, or a press release so long-winded it could be released in hardback.

The world of journalism is a fast-moving business. The chances are that if you fail to grab the attention of your reader in your opening gambit your hard work is going straight in the bin.

So what is the best way to get someone hooked and eager to read more?

Just like writing an article, you have to hit them hard early on and keep them interested. Too many times I have felt my eyes glaze over and my brain switch off as number after number is quoted at me, supposedly proving the findings of some study. In other words: be straight to the point, show off your knowledge, and be personable.

The importance of data

In a world that is increasingly dominated by clickbait opinions and ‘fake news’, a solid piece of well sourced, in-depth research is worth its weight in gold.

At Blueclaw, producing compelling data-driven content is a huge part of what we do on a day-to-day basis.

Alongside our Data Insights team, we can spend time that journalists simply do not have gathering and analysing massive amounts of data that can back up or contradict a current trend, or offer findings for a journalist to expand on and take further.

One such campaign was the Champions League Price Index. Through swift and methodical research we were able to put together a definitive guide for English football fans on the cost of travelling to matches in Europe, including the price of a local beer, food and accommodation for the night.

I would never say there is one perfect way to crack the process. Sometimes it is just about being in the right place at the right time, sometimes it takes an exceptional bit of content to finally draw the eye.

The key is to stay up-to-date with the ever-changing world of sport (or, indeed, any industry), and try to stay ahead of the story.

If you’d like to learn more about our award-winning campaigns in the field of sport and many other sectors, please get in touch.

The post Good sports: How to infiltrate the sporting media appeared first on Blueclaw.


I have now been a PPC Executive at Blueclaw since October 2018. As it’s my first role after university, the world of Paid Media was completely new to me when I joined.

Here are five things I’ve learned so far:

1) Be Succinct

On competitive keywords, a square inch of the SERP often has a market price to rival that of a 5-bedroom townhouse in Kensington (ok maybe not quite) so it’s critical to squeeze the most out of the limited ad character allowances.

I have worked on accounts where CPC bids on terms can easily sneak up to the £50s, £60s, £70s and beyond, particularly in high-competition and high-value markets like betting and property. Because of this, it’s important to be as concise as possible and include the most crucial information and USPs to ensure clients see a return on their investment.

Having studied English Literature at university, this concept was initially quite foreign to me. Going from writing 9,000-word essays to 90-character description lines was something of a challenge, but also satisfying.

2) Always Ask ‘Is It Worth It?’

In a PPC role, it is virtually impossible to run out of ways to optimise the performance of an account. This could be by adding new keywords using Google’s Keyword Planner, running a Search Query Report to negative-match irrelevant search terms, or simply adjusting keyword bids. But working for an agency, time and effort come at a price for clients. It’s important to consider their best interests and ask ‘Will the client get a return on investment if I spend time doing this?’. If the answer is ‘probably not’, then your time is better spent elsewhere.


3) There’s Not Necessarily a Right Way of Doing Things

This follows on nicely from my last point. Paid search is a quickly evolving area of marketing, and best practice is sometimes a grey area. This leaves lots of opportunities to test different tactics, be it by switching up the bidding strategy or testing the effectiveness of Responsive Search Ads against traditional Expanded Text Ads.

Luckily, there is a ‘Campaign Draft & Experiment’ tab in Google Ads and ‘Experiments’ tab in Bing Ads that make experimentation easy to do. In these sections of the interface, you can create duplicate campaigns to A/B split test a whole range of things such as landing page URLs, ad copy and bidding strategy.

And if you’ve tried out one strategy that hasn’t impacted the performance metrics in the way you had hoped – don’t worry, now you know! Just try something else.
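When judging whether an experiment "worked", it helps to check whether the difference in results is statistically meaningful rather than noise. As a hedged sketch (the function name and the click/impression figures are hypothetical, and this is standard statistics rather than anything built into the Ads interfaces), a two-proportion z-test on CTR might look like:

```python
from math import sqrt, erf

def ab_significance(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-test on CTR between a control (a) and experiment (b).
    Returns (z, p) where p is the two-tailed p-value."""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    p_pool = (clicks_a + clicks_b) / (imps_a + imps_b)  # pooled CTR
    se = sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; two-tailed p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical split test: control 500 clicks / 10,000 impressions
# vs experiment 580 clicks / 10,000 impressions.
z, p = ab_significance(500, 10_000, 580, 10_000)
print(round(z, 2), round(p, 4))
```

Here p comes out below 0.05, so under the usual threshold the experiment's CTR lift would count as significant; with smaller samples the same lift often would not.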

4) Pivot Tables are Beautiful Things

Since working at Blueclaw I’ve developed a new-found appreciation for data. We pride ourselves on being a data-led agency; working within the PPC team, I analyse data daily. Pretty much everything I do should be informed by performance metrics.

One of the simplest ways to slice the thousands of puzzling rows of information downloaded from Google Ads or Bing Ads into something digestible is by using a Pivot Table. This feature in Excel reorganises and summarises selected columns and rows to show you the information you want to see.
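For readers who prefer code to spreadsheets, the same reorganise-and-summarise step a pivot table performs can be sketched in a few lines of Python. The campaign names and figures below are hypothetical stand-ins for an ad-platform export:

```python
from collections import defaultdict

# Hypothetical rows from a Google Ads export: (campaign, clicks, cost)
rows = [
    ("Brand",   120,  35.0),
    ("Generic",  80, 160.0),
    ("Brand",    60,  18.0),
    ("Generic",  40,  75.0),
]

# Pivot: one summary row per campaign, summing clicks and cost --
# the same grouping a pivot table does with rows and values fields.
pivot = defaultdict(lambda: {"clicks": 0, "cost": 0.0})
for campaign, clicks, cost in rows:
    pivot[campaign]["clicks"] += clicks
    pivot[campaign]["cost"] += cost

for campaign, totals in sorted(pivot.items()):
    avg_cpc = totals["cost"] / totals["clicks"]
    print(f"{campaign}: {totals['clicks']} clicks, £{totals['cost']:.2f}, CPC £{avg_cpc:.2f}")
```

In Excel the equivalent is dragging Campaign into Rows and Clicks/Cost into Values; the point in both cases is collapsing thousands of rows into a handful of comparable summaries.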


5) Define Your Goals (and Don’t Over-Promise)

There are countless metrics to measure the performance of a paid search account, from Impressions to Conversions to Cost-Per-Acquisition (CPA).

Prior to beginning work on an account, it’s important to pick one or two metrics (at most) that align with the client’s goals to effectively measure the success of your future blood, sweat and optimisations.

For example, on a brand awareness campaign, the metric of Impressions might be suitable, but on an account where the goal is to drive profit, harder metrics like Conversion Volume and CPA would need focus.
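As a trivial illustration of the kind of metric involved (the spend and conversion figures are hypothetical):

```python
def cpa(cost: float, conversions: int) -> float:
    """Cost-Per-Acquisition: total spend divided by conversions."""
    if conversions == 0:
        return float("inf")  # no conversions yet -- CPA is unbounded
    return cost / conversions

# Hypothetical month: £2,400 of spend driving 60 conversions -> £40 CPA.
print(cpa(2400, 60))  # 40.0
```

Doubling conversion volume at a fixed CPA doubles the required budget, which is exactly the kind of condition worth spelling out to a client up front.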

It’s also vital to manage client expectations. If a client asks to double their Conversion Volume within the next quarter, do your best to meet their request but also explain the expense required to make it happen. Often the situation the client asks for is possible, but not without conditions. Being clear from the get-go can avoid future disappointment and keep relationships strong.

If you’d like to find out more about our Paid Search services and the game-changing ROI we’ve delivered for our clients, please get in touch.

The post Five Things I’ve Learnt as a PPC Executive appeared first on Blueclaw.


With the recent developments in the JavaScript field, it’s tempting to adopt the new ‘modern’ approach to building websites. However, not understanding the repercussions of that decision can be devastating for your SEO.

To understand the SEO implications of using complex JavaScript frameworks, let’s look at the difference between the traditional CGI (Common Gateway Interface) approach – the website-building standard used since 1993 – and the modern client-side approach.

With traditional CGI deployment, the HTML is formulated before it’s presented to the client (web browser or crawler). The process differs slightly depending on the back-end frameworks used; however, the result is always fully or partially rendered on the server, which then sends it back to the browser. The advantage of this method is that the content received by the browser is mostly ready to use and, with proper optimisations, will load extremely fast (e.g. Amazon.co.uk). This approach uses JavaScript as UI support rather than as the main logic processor.

The modern JavaScript framework methodology (e.g. ReactJS) is to handle most of the data rendering on the client side (the browser); this can include routing (deciding how a page link gets handled by the web application). It works by delivering a basic HTML structure to the browser and initialising the JavaScript framework to handle the rendering. This requires a chain of requests to come back from the server successfully, which greatly increases the initial loading time. The selling point of this approach is that once everything has loaded, you can navigate quickly through the web application without fully reloading the page.

Here is an infographic that shows the support of different search engines for JavaScript frameworks:

Looking at the graphic we can assume that ReactJS is supported by the google bot? – No, it’s a lot more complicated than that.

To understand why, we need to have the knowledge of how crawlers get our URLs and index them.

Crawlers are simple programs that go to different URLs and process the HTML they receive to make decisions about rankings. They are not browsers and they do not process any JavaScript as far as we know it. The process of extraction of meaningful data from a markup language like HTML is called parsing. As it stands, there is no easy way to parse JavaScript as it is not a markup language but a scripting language, so it needs to be interpreted by the browser or NodeJS.

Therefore it is a huge effort to interpret Javascript as a browser simulation and a lot of additional server resources are required.

Have you ever wondered what the website code looks like, right clicked on the page and chosen view source? You can safely assume that the crawler can see all that’s displayed to you on that view unless special server rules are in place. The crawler will only read what is immediately returned from the server. So if you’re running a one page JavaScript app and all that is sent is some wrapping HTML then that is what will get indexed.

For more complex debugging try command (Linux or Mac): curl –user-agent “Googlebot/2.1 (+http://www.google.com/bot.html)” http://exmaple.com

You’re probably confused as the green ticks on the infographic suggest the support.

Yes, there is limited support but it’s not the Crawler interpreting the website. Google has developed a “Web Rendering Service” that is a separate piece of software and it will run at different times to the main Crawler.

The indexing process example:

1st week – crawl homepage (crawler) and schedule the “Web Rendering Service” to visit the page —> render homepage using “Web Rendering Service” and find all the links and relevant info (does not follow links on its own)

2nd week – crawl homepage (data from “Web Rendering Service” is used to crawl links previously not seen) —> render homepage using “Web Rendering Service” and find all the links and relevant info (does not follow links on its own)

As you can see those 2 software pieces don’t run together very well and the Crawler runs on its own schedule independently of what the “Web Rendering Service” is doing as well as it is the ultimate decision maker in the process of indexing your website. You can also notice that there is a minimum of 1 week lag in indexing pages returned by the “Web Rendering Service” which can be very undesirable for quickly changing content e.g. e-commerce shops, news website. If you have a large website, it could take an unreasonably long amount of time to index every page.

It’s also important to understand that the “Web Rendering Service” has a hard cut off point after 5 seconds of loading, which means that if your website loads for longer than 5 seconds the service will quit and not pass anything back to the crawler. This will cause your website to be invisible to Google.

Web Rendering Service” is an experimental technology and is flaky at best. You cannot expect proper indexing and for Google to see exactly what you see on your browser. It’s been reported that it requires most of JavaScript to be inline to be even considered for processing. As you can imagine it’s very expensive to run this service and it’s very unlikely that Google will increase the 5 seconds load limit in the future.

But not all hope is lost.

There are mechanisms that can make your website visible to the crawler:

  • Isomorphic JavaScript – render your JS manually using NodeJS on the server side
  • https://prerender.io/ – semi-automatic service that can be deployed using NodeJS on the server side
  • Anything that renders your HTML in full before it hits the browser

The main idea behind those mechanisms is to get the HTML rendered before it’s received by the browser like the CGI method I described at the beginning of this article where the server will serve pre-rendered HTML for the search engine and non-rendered for the standard user. We can confirm that this method works when your website loads more than 5 seconds and the “Web Rendering Service” sees nothing. However, we cannot confirm what the penalties SEO penalties can be applied if the Crawler and “Web Rendering Service” do not agree on the content seen. The user-agent detection is critical here and any small error can cost rankings or apply long term penalties.

So, should you use one page JavaScript frameworks for web application where you want to gain ranking?

It is a cool, trendy and new way of making websites, however, the tradeoffs in the area of SEO are too big at the present:

  • You may be penalised for the increased initial load time (crawler will load your website from the beginning every time – it doesn’t behave like a user)
  • Your website might completely disappear from all search engines
  • Unless your users spend hours on your website at one time then the increased initial load time will actually make the UX worse
  • Splitting the server logic between front-end and server can be a mess if the team doesn’t work well
  • “Web Rendering Service” uses chrome 41 which means your JavaScript needs to have been usable 3 years ago
  • Sharing crawlers like Bing, Facebook, Twitter will not render your JavaScript
  • Larger server resources are required to handle pre-rendering of the content and caching
  • Using pre-rendering services increases the danger of cloaking penalty
  • Great effort in debugging required
  • Auditing your website will be more expensive and complex
  • Even using everything at your disposal cannot guarantee correct indexing
Where it is safe to use:
  • When no ranking is required e.g. admin panel, not publicly accessible content

The post Will your one page JavaScript app get indexed? appeared first on Blueclaw.

Read Full Article
  • Show original
  • .
  • Share
  • .
  • Favorite
  • .
  • Email
  • .
  • Add Tags 

A few years ago, I was the editor of a student magazine. Any press releases I received that were giving a particularly hard sell to a product or service were directed into a folder named “Desperation”. Nothing I put in there was ever used.

There’s a reason for the pushy salesman to appear as a recurring comic device on the small screen. Whether they’re flogging used cars, phones, double glazing or wheelchairs, they’re never the hero of the piece.

Their lack of trustworthiness is what results in their downfall, and that’s because they’re always relatable to what people experience in reality. Their habit of not taking no for an answer, together with their statements’ pig-flying implausibility, is what puts their victims on the defensive. Nobody wants to feel pressured.

This, though, is one of the major tripwires for newcomers to public relations – salesy campaigns.

The red herring of publicity

It’s not uncommon for people outside the media to assume that PR is the lovechild of sales and advertising. A sort of pushy message, groaning with statements about how a new offering changes everything.

This is particularly common when business managers try it themselves, thinking “it must be simple, so we can do that”. In much the same way that someone in the Department for Transport must have said “yes, we can do that” when asked to reorganise the train timetables.

Invariably, efforts along these lines end up in the electronic abyss of journalists’ deleted email folders.

The key is to remember that the majority of news organisations make most of their income through advertising. If a journalist is sent something that appears to be a sneaky advert in disguise, they’ll delete it and leave you to buy ad space instead.

On top of that, they’re only ever going to publish information that’s genuinely useful to the majority of their readers – not the things that are useful to a business’s sales team.

This is why it’s so important to carefully consider how meaningful your message is to anybody outside your office. If it isn’t, don’t force it. Take a different approach.

Don’t shout – think laterally

It’s far better to, instead, show how knowledgeable your business and the experts it employs are. In many cases, this is best done by avoiding talking directly about your industry.

For example, in a recent campaign for car leasing specialist AMT, we created a Carnage Calculator that totals the value of damage James Bond has caused to all the vehicles he’s driven throughout the 24 films.

This was covered 85 times by different websites and publishers around the world. If, on the other hand, we’d done a campaign about how much James Bond would be quoted to take out a three-year lease on a BMW 5 Series, I suspect the figure might not have been as high.

This is an approach we know works for very different subjects. Our Pet Wingman experiment for Webbox, which tested exactly how much having a dog in your dating app picture affects your chances of finding love, didn’t have anything to do with pet food. Yet it exposed the brand to a huge international audience with 46 pieces of coverage.


Put yourself in others’ shoes

I’ve worked in PR for eight years now, and in that time I’ve seen suggestions for press releases and complete campaigns as saturated with hard-sell techniques as a sofa salesman’s patter.

In fact, they sink so completely that a search for published examples proved fruitless; a hunt for Shergar might have been more successful. This, in itself, speaks volumes. The only evidence of their existence is my memory of having laughed at them.

Think of creating a PR campaign – however small – directly around a product or service as similar to receiving a cold call from a PPI or injury claim service. Will you really consider it? Probably not, and neither will journalists.

But if you want to know more about how we can create campaigns to help you, or for more information about our services, get in touch with us.

The post Sales messages are cyanide for press campaigns appeared first on Blueclaw.


Making Your Own Wrappers For Vanilla JS Plugins In Vue

When it comes to developing interactive content assets – whether to amplify your brand, earn links to boost SEO or offer your site visitors a new experience – it’s important to work efficiently.

Vue has many great repos out there that can help you add new effects and functionality to your builds. Sometimes though, you’ll find that it’s not possible to achieve your particular use-case with what’s already available.

That might call for you to crack your knuckles and craft a hand-rolled solution, or you might find a third-party vanilla JS plugin that fits the bill. When you’ve found the perfect vanilla solution, it’s often worth creating your own wrapper component so it plays nicely with your project and can easily be reused the next time you need the same functionality.

To work through the basics of what that could look like, we’ll make a simple component that adds a parallax effect to background images and is easy to reuse in various contexts. It’s not meant to be exhaustive in terms of feature set – there are some good Vue-flavoured parallax systems out there already – but hopefully it will serve as a good illustration of what’s possible.

Here’s the CodePen demo where you can see the complete code and follow along.

So the vanilla JS plugin in our sights is ‘Just Another Parallax’ (Jarallax), a nice dependency-free option that takes the heavy lifting out of our parallax implementation. We’ll build it a reusable wrapper component that lets us interface with whichever plugin parameters we want and handle any events we need.

For our example, we will style the component to completely cover its offset parent. This will allow us to quickly use it in various contexts by setting its parent to position: relative; and then sizing/positioning as required. So all we need to do to use the component is include it in a template with an image URL prop (plus any other options that we want to set up and pass in) and wrap it in a parent positioning element.
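As a concrete sketch of that usage (the wrapper markup, sizing and image path here are illustrative assumptions, not part of the demo), the parent context might look like this:

```javascript
// Illustrative usage sketch: the stand-in below represents importing the real
// component, e.g. `import Parallax from './Parallax.js';`.
const Parallax = { name: 'Parallax' }; // stand-in for the real import

// The parent is position: relative and sized, so the Parallax component
// (styled to cover its offset parent) fills it completely.
const heroSection = {
  components: { Parallax },
  data() {
    return { heroImg: '/images/hero.jpg' }; // hypothetical image path
  },
  template: `
    <section style="position: relative; height: 60vh;">
      <Parallax :imgUrl="heroImg" :speed="0.5" type="scroll" />
    </section>
  `,
};
```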

So let’s dive in.

We’re going to end up with two main files; our Parallax component, and its parent. Normally I’d structure this using .vue single file component files, but for the sake of being able to follow along on the Codepen, I’ll refactor this slightly as two .js files and a context .html.

Here’s our index.js file:

import Parallax from './Parallax.js';

var app = new Vue({
  el: '#app',
  components: {
     Parallax,
  },
  data() {
     return {
       testImg: 'https://images.unsplash.com/photo-1548152050-75c6ee050f54?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=1500&h=1000&fit=crop&ixid=eyJhcHBfaWQiOjF9',
     };
  },
  mounted() {},
  methods: {},
  template: `
    <div >
      <Parallax
        :imgUrl="testImg"
        :speed="0.5"
        type="scroll"
      />
    </div>
  `,
})

This just provides a context for our new component. You can see that we’re feeding an image and basic config through props to our Parallax component. Our main concern is what’s going on inside the Parallax component itself.

So let’s start with some boilerplate to build up our Parallax component code in the Parallax.js file:

export default {
  name: 'Parallax',

  props: {},

  data() {
    return {};
  },

  computed: {},

  mounted() {},

  beforeDestroy() {},

  methods: {},
};

Next, we’ll add our markup with a class to hook styles on to and a ref for our script:

export default {
  name: 'Parallax',
  template: '<div  ref="jarallaxEl"></div>',
  
  props: {},

  data() {
    return {};
  },

  computed: {},

  mounted() {},

  beforeDestroy() {},

  methods: {},
};

Now we need to define our props, and they should look like this:

export default {
  name: 'Parallax',

  template: '<div  ref="jarallaxEl"></div>',
  
  props: {
    imgUrl: {
      type: String,
      required: true,
    },
    speed: {
      type: Number,
      default: 0.5, // between -1.0 - 2.0
      validator(val) {
        const condition = val >= -1 && val <= 2;
        if (!condition) console.warn('WARNING: "speed" prop requires a value between -1.0 and 2.0');
        return condition;
      },
    },
    type: {
      type: String,
      default: 'scroll', // scroll, scale, opacity, scroll-opacity, scale-opacity
      validator(val) {
        const vals = ['scroll', 'scale', 'opacity', 'scroll-opacity', 'scale-opacity'];
        const condition = vals.indexOf(val) > -1;
        if (!condition) console.warn(`WARNING: type prop must be one of the following strings: (${vals.join(', ')})`);
        return condition;
      },
    },
  },

  data() {
    return {};
  },

  computed: {},

  mounted() {},

  beforeDestroy() {},

  methods: {},
};

We need the ‘imgUrl’ prop so we can select which image is rendered, which makes this a genuinely repeatable component. The ‘speed’ and ‘type’ props are our interface to the Jarallax config. Jarallax offers quite a few config options (image size, position, repeat, etc.) and these could all be mapped via props, but in our case we’re only creating an interface for speed, type and image URL. We’ve also used custom prop validators to make sure the values supplied fall within the range supported by Jarallax.
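If you later wanted to expose more of the plugin’s configuration through the same pattern, the extra props could look something like this sketch (imgSize and imgPosition are Jarallax option names; the defaults shown are our own assumptions, so check the plugin’s docs before relying on them):

```javascript
// Sketch: additional props mapping on to Jarallax's image options.
// These would be merged into the `props` object of the Parallax component.
const extraProps = {
  imgSize: {
    type: String,
    default: 'cover', // any CSS background-size value
  },
  imgPosition: {
    type: String,
    default: '50% 50%', // any CSS background-position value
  },
};
```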

The next step is to add a way to assign the background image that’s coming in from the ‘imgUrl’ prop to our dom element. We could do this inline but I’ve elected to use a computed property to keep the markup a bit cleaner and allow additional prop-dependent styles to be added more easily:

export default {
  name: 'Parallax',

  template: '<div  ref="jarallaxEl" :style="elStyle"></div>',
  
  props: {
    imgUrl: {
      type: String,
      required: true,
    },
    speed: {
      type: Number,
      default: 0.5, // between -1.0 - 2.0
      validator(val) {
        const condition = val >= -1 && val <= 2;
        if (!condition) console.warn('WARNING: "speed" prop requires a value between -1.0 and 2.0');
        return condition;
      },
    },
    type: {
      type: String,
      default: 'scroll', // scroll, scale, opacity, scroll-opacity, scale-opacity
      validator(val) {
        const vals = ['scroll', 'scale', 'opacity', 'scroll-opacity', 'scale-opacity'];
        const condition = vals.indexOf(val) > -1;
        if (!condition) console.warn(`WARNING: type prop must be one of the following strings: (${vals.join(', ')})`);
        return condition;
      },
    },
  },

  data() {
    return {};
  },

  computed: {
    elStyle() {
      return {
        backgroundImage: `url(${this.imgUrl})`,
      };
    },
  },

  mounted() {},

  beforeDestroy() {},

  methods: {},
};

Next, we add a handle in data so that we can later trigger methods on Jarallax:

export default {
  name: 'Parallax',

  template: '<div  ref="jarallaxEl" :style="elStyle"></div>',
  
  props: {
    imgUrl: {
      type: String,
      required: true,
    },
    speed: {
      type: Number,
      default: 0.5, // between -1.0 - 2.0
      validator(val) {
        const condition = val >= -1 && val <= 2;
        if (!condition) console.warn('WARNING: "speed" prop requires a value between -1.0 and 2.0');
        return condition;
      },
    },
    type: {
      type: String,
      default: 'scroll', // scroll, scale, opacity, scroll-opacity, scale-opacity
      validator(val) {
        const vals = ['scroll', 'scale', 'opacity', 'scroll-opacity', 'scale-opacity'];
        const condition = vals.indexOf(val) > -1;
        if (!condition) console.warn(`WARNING: type prop must be one of the following strings: (${vals.join(', ')})`);
        return condition;
      },
    },
  },

  data() {
    return {
      jarallax: false,
    };
  },

  computed: {
    elStyle() {
      return {
        backgroundImage: `url(${this.imgUrl})`,
      };
    },
  },

  mounted() {},

  beforeDestroy() {},

  methods: {},
};

Now we populate that handle with the Jarallax object:

export default {
  name: 'Parallax',

  template: '<div  ref="jarallaxEl" :style="elStyle"></div>',
  
  props: {
    imgUrl: {
      type: String,
      required: true,
    },
    speed: {
      type: Number,
      default: 0.5, // between -1.0 - 2.0
      validator(val) {
        const condition = val >= -1 && val <= 2;
        if (!condition) console.warn('WARNING: "speed" prop requires a value between -1.0 and 2.0');
        return condition;
      },
    },
    type: {
      type: String,
      default: 'scroll', // scroll, scale, opacity, scroll-opacity, scale-opacity
      validator(val) {
        const vals = ['scroll', 'scale', 'opacity', 'scroll-opacity', 'scale-opacity'];
        const condition = vals.indexOf(val) > -1;
        if (!condition) console.warn(`WARNING: type prop must be one of the following strings: (${vals.join(', ')})`);
        return condition;
      },
    },
  },

  data() {
    return {
      jarallax: false,
    };
  },

  computed: {
    elStyle() {
      return {
        backgroundImage: `url(${this.imgUrl})`,
      };
    },
  },

  mounted() {
    this.jarallax = window.jarallax; // refactored slightly for codepen, previously catering for SSR
    this.init();
  },

  beforeDestroy() {},

  methods: {},
};

Note that usually, we would use something more like this (in this case using require to prevent issues with window references within the library in our Nuxt projects):

mounted() {
  const jarallax = require('jarallax'); // SSR
  this.jarallax = jarallax.jarallax;
  this.init();
},

But because of the CodePen environment, we’ve refactored slightly to the first snippet.
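Another option, if you’d rather not rely on require, is to guard on the environment yourself. A minimal sketch of that idea (our own convention, not part of Jarallax):

```javascript
// Resolve the jarallax function only when a DOM exists; during a server-side
// render there is no `window`, so we return null and skip initialisation.
function resolveJarallax() {
  if (typeof window === 'undefined') return null; // SSR pass: no DOM available
  return window.jarallax || null;
}
```

In mounted() you would then only call this.init() when resolveJarallax() returns a function.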

Here’s our companion cleanup method to release resources when our component is unmounted:

export default {
  name: 'Parallax',

  template: '<div  ref="jarallaxEl" :style="elStyle"></div>',
  
  props: {
    imgUrl: {
      type: String,
      required: true,
    },
    speed: {
      type: Number,
      default: 0.5, // between -1.0 - 2.0
      validator(val) {
        const condition = val >= -1 && val <= 2;
        if (!condition) console.warn('WARNING: "speed" prop requires a value between -1.0 and 2.0');
        return condition;
      },
    },
    type: {
      type: String,
      default: 'scroll', // scroll, scale, opacity, scroll-opacity, scale-opacity
      validator(val) {
        const vals = ['scroll', 'scale', 'opacity', 'scroll-opacity', 'scale-opacity'];
        const condition = vals.indexOf(val) > -1;
        if (!condition) console.warn(`WARNING: type prop must be one of the following strings: (${vals.join(', ')})`);
        return condition;
      },
    },
  },

  data() {
    return {
      jarallax: false,
    };
  },

  computed: {
    elStyle() {
      return {
        backgroundImage: `url(${this.imgUrl})`,
      };
    },
  },

  mounted() {
    this.jarallax = window.jarallax; // refactored slightly for codepen, previously catering for SSR
    this.init();
  },

  beforeDestroy() {
    this.jarallax(this.$refs.jarallaxEl, 'destroy');
  },

  methods: {},
};

Finally, we create the init method itself, passing our props through as the plugin’s configuration:

export default {
  name: 'Parallax',

  template: '<div  ref="jarallaxEl" :style="elStyle"></div>',
  
  props: {
    imgUrl: {
      type: String,
      required: true,
    },
    speed: {
      type: Number,
      default: 0.5, // between -1.0 - 2.0
      validator(val) {
        const condition = val >= -1 && val <= 2;
        if (!condition) console.warn('WARNING: "speed" prop requires a value between -1.0 and 2.0');
        return condition;
      },
    },
    type: {
      type: String,
      default: 'scroll', // scroll, scale, opacity, scroll-opacity, scale-opacity
      validator(val) {
        const vals = ['scroll', 'scale', 'opacity', 'scroll-opacity', 'scale-opacity'];
        const condition = vals.indexOf(val) > -1;
        if (!condition) console.warn(`WARNING: type prop must be one of the following strings: (${vals.join(', ')})`);
        return condition;
      },
    },
  },

  data() {
    return {
      jarallax: false,
    };
  },

  computed: {
    elStyle() {
      return {
        backgroundImage: `url(${this.imgUrl})`,
      };
    },
  },

  mounted() {
    this.jarallax = window.jarallax; // refactored slightly for codepen, previously catering for SSR
    this.init();
  },

  beforeDestroy() {
    this.jarallax(this.$refs.jarallaxEl, 'destroy');
  },

  methods: {
    init() {
      this.jarallax(this.$refs.jarallaxEl, {
        type: this.type,
        speed: this.speed,
      });
    },
  },
};

And this is all we need for our Parallax component.

Now, any time we need to parallax an image we can just import our component and pass in the image URL! And our component will handle the initialisation and teardown for us. We could also improve this example by adding watchers that re-init the plugin on prop change.
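As a rough sketch of that improvement (reinit() is a hypothetical helper of ours; the ‘destroy’ command is the same one used in beforeDestroy above):

```javascript
// Sketch: re-initialise the plugin whenever a config prop changes.
// These options would be merged into the Parallax component definition.
const reinitOptions = {
  watch: {
    speed() { this.reinit(); },
    type() { this.reinit(); },
  },
  methods: {
    reinit() {
      // Tear down the old Jarallax instance, then re-run init() so the
      // plugin picks up the current prop values.
      this.jarallax(this.$refs.jarallaxEl, 'destroy');
      this.init();
    },
  },
};
```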

So there you have it: we’ve found this technique can be a nice timesaver for any third-party library you might want to reuse.

The post Making Your Own Wrappers For Vanilla JS Plugins In Vue appeared first on Blueclaw.
