Impression: Award Winning Digital Marketing Agency
Impression is a digital marketing agency in Nottingham and London that delivers outstanding results through SEO, PR, PPC, content marketing and web design. SEO and the wider landscape of online marketing are always changing, and at Impression we keep up to date with and discuss the latest trends.
We’re thrilled to announce we have been shortlisted in 3 categories of the UK Agency Awards 2019.
The awards recognise excellence in agency teams across the country. They are entered by a wide range of highly respected agencies and provide an opportunity to showcase the great work done by those teams in the previous year.
This year, we’re honoured to be shortlisted for the following awards:
Best PR Campaign
The ‘best PR campaign’ award is given to the agency that is able to prove how their PR campaign has delivered tangible results through creative PR ideas.
The digital PR campaign for which we’re shortlisted is one we’ve run recently with our client Tonik Energy, a national green energy provider. Their focus on electric vehicles led us to create a campaign looking at how prepared the UK is for the EV revolution. To date, the campaign has earned over 130 new links to our client’s website, as well as coverage to be proud of across national, local and trade publications: topically relevant, high quality links, in high quantity, to support brand growth and SEO results.
Best Biddable Media Agency
The ‘biddable media agency’ category recognises agencies that provide paid media services (commonly referred to as PPC and comprising search ads, display ads, mobile ads, video ads, remarketing, social media advertising and more).
We’re really proud of the work our PPC team does and our entry into this category included examples of high ROAS campaigns across a variety of paid media channels.
Best Integrated Campaign
Integration is so important for any business looking to market themselves effectively in 2019 and beyond. The terms ‘omnichannel’ and ‘multichannel’ have become increasingly commonplace in recent years and focus on the experience users have of interactions with brands across multiple platforms; integration is therefore about how we as marketers facilitate those cross-channel experiences.
In order to be shortlisted in this category, agencies have to prove not only that they have expertise in and deliver fantastic results across multiple channels, but also that those channels work together seamlessly to generate even better results and better experiences for users.
There has been some chatter within the industry about a potential Google search algorithm update on June 27th. However, Google pre-announced June’s broad algorithm update and has committed to notifying site owners and SEOs of any major updates more proactively, so this one remains unconfirmed.
Google usually releases small, mostly unnoticeable changes to its algorithm each day to improve the search results, so it is unlikely that the fluctuation being discussed is a broad algorithm update focusing on one key area.
Similar algorithm chatter was also seen earlier in the month, around June 19th. The chatter seemed relatively random across industries, with sites reporting an average 10-15% drop in traffic. Many have hypothesised that an update may be in the pipeline, with these traffic tweaks the first sign – one to watch out for.
Search Industry Updates
At the beginning of July, Google announced that after 25 years of use, and adoption by over 500 million websites, the robots.txt protocol is now working toward becoming an internet standard. The Robots Exclusion Protocol (REP) has been regarded as one of the most fundamental and critical components of the internet, allowing site owners to tell crawlers whether they may access a site in full or only partially.
Because the REP was never made an official standard, developers have interpreted the protocol differently over the years. To create a unified approach, and to help website owners and developers create great user experiences across the internet, Google has documented how the REP is used on the modern web and submitted a draft specification to the IETF (Internet Engineering Task Force).
This is an important step that should be flagged with developers who parse robots.txt files. Under the draft, crawlers are expected to parse at least the first 500 kibibytes of a robots.txt file; defining a maximum file size in this way helps alleviate unnecessary strain on servers.
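To illustrate, here’s a minimal Python sketch using the standard library’s `urllib.robotparser`. The 500 KiB truncation and the example rules are our own illustration of the draft’s size limit, not Google’s code:

```python
from urllib import robotparser

MAX_ROBOTS_BYTES = 500 * 1024  # the 500 kibibyte limit discussed in the draft

def parse_robots(content: str) -> robotparser.RobotFileParser:
    # Truncate to the first 500 KiB before parsing; the draft allows
    # crawlers to ignore anything beyond this limit.
    truncated = content.encode("utf-8")[:MAX_ROBOTS_BYTES].decode("utf-8", "ignore")
    rp = robotparser.RobotFileParser()
    rp.parse(truncated.splitlines())
    return rp

rules = "User-agent: *\nDisallow: /private/\nAllow: /\n"
rp = parse_robots(rules)
print(rp.can_fetch("mybot", "https://example.com/private/page"))  # False
print(rp.can_fetch("mybot", "https://example.com/index.html"))    # True
```

In practice a crawler would fetch the file over HTTP first; the point here is simply that rules beyond the size cap can safely be dropped before parsing.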
If you’re unsure whether this affects your site, get in touch with the team or have a read of Google’s developer guidelines for further information.
Listing Restricting on the SERPs
On June 6, Google announced that they were updating the number of listings from the same site in the top search results.
The new change now means that it is highly unlikely that a user will find more than one result from the same website for the same search term. Google also announced that it would broadly consider subdomains as part of the root domain, except in cases where they are treated as separate sites for diversity purposes.
This is an important development, as it gives smaller businesses a greater chance of ranking on SERPs traditionally dominated by larger websites, such as Yelp or Quora.
Google Search Console Display Data
On June 12, Google announced that Search Console would now display data from search clicks and Google Discover over the past 90 days.
Search Console had previously only shown data from the previous 28 days. While this was a relatively minor industry update, the news was still welcomed by SEOs, many of whom now had a better overview of their site’s performance.
Google Taking Action on Large Websites
In a webmaster hangout session held on 1 July, John Mueller revealed that Google employees had discussed possibly taking action against large websites leasing out their subdomains to third parties. While refraining from calling the practice ‘spam’, Mueller did acknowledge that negative feedback had called the practice into question and that the search engine giant was searching for a solution.
He suggested that perhaps “the right approach is to find a way to figure out… what is the primary topic of this website and focus more on that and then kind of leave these other things on the side.” He did, however, say that there was “unlikely to be a simple answer that could be applied to every website”.
Have we missed anything out? Let us know in the comments.
Audience targeting has existed for as long as the concept of marketing itself. In order to be a successful company, a business needs to drive sales from a customer base in a profitable way and targeting the most interested parties is an obvious way to do that. But not only do we need to identify our regular customers whom we can build a business around, we also need to be able to spot the groups who can be wooed and persuaded over time to grow the customer base too.
So how does this age-old technique translate into the digital realm? How can businesses make the most of their online audiences, whether they be brand new website visitors or loyal social media fans? What is the best way to tap into what makes your users tick and to capitalise on this to drive sales and revenue? Well, I asked our team of expert marketers for their thoughts on some of the burning questions concerning audiences and personas in digital marketing, so read on to find out what they said!
How important is it to know your audience in digital marketing, and why?
As SEO and PPC specialists, we rely on accurate audience data to inform our campaigns, whether it’s on user location, age group, general interests or the buying journey. Knowing your audience also informs keyword targeting and content creation, which form the foundation of website optimisation.
In order to effectively reach your target audience, you need to understand their intent, what they need and how they relate themselves to your products or services. By knowing this, you’ll then be able to effectively tailor your marketing strategy specifically to fit your audience. However, if you’re unaware of your audience, users are likely to go to a competitor who better relates to their needs.
Without knowing your target audience, you run the risk of communicating in a way that does not resonate with the audience. This will lead to a lack of engagement and a campaign which flops.
When you’re running PPC campaigns, knowing your audience is vital unless you want to spend loads of money and not get results.
The success of your marketing plan hinges on how well you know your target market, whether you’re optimising your content for user intent to drive better SERPs visibility, crafting effective audience targeting strategies for your paid ads, or building audiences through top of the funnel PR campaigns.
It’s important to find the audience whose needs will be satisfied with your product, and market to them. Your PPC campaigns are unlikely to be very profitable if you put a +100% bid adjustment on a 65+ year old demographic when you’re trying to sell the new installation of FIFA.
What are your top tips for identifying your audience?
To gain a holistic perspective of an audience, it’s important to complete a segmentation, targeting and positioning (STP) analysis relevant to the business. This helps us fully understand where to target our organic efforts, resulting in the best possible outcome for our clients.
A quick and easy way to perform some top-level analysis is by looking into your Google Analytics data. However, I believe the best way to identify your target audience is to speak to members of your business who have the most interaction with your customers. They will have the best understanding of who your target audience really is.
Google provides us with high converting age groups, genders, locations, and in-market audiences to help us identify the audience that has converted well in the past and we need to focus on. Feedback from the client is also useful, as they are the experts.
My first step is always to speak to my client, as more often than not, the business knows their own audience at least fairly well and is able to articulate audience categories and demographics even if they can’t delve any deeper into motivations and intents. Next, I call on data to inform me. CRM data can also be really telling of not just who the audience is, but which areas of the audience are most likely to convert, which makes for more effective marketing prioritisation.
In the first instance, we can use common sense as, for example, a student isn’t likely to buy a coffee machine worth hundreds of pounds. We then check these ideas against data from Google Analytics and social media audiences to discover which user groups are engaging most with the website and social media channels.
How helpful do you find it to use personas?
When creating content, personas are incredibly useful, as you can gear terminology or visuals to different audiences. If you’ve got retired Doris who would love to buy some gifts for her grandchildren, you need to speak her language. Personas help you choose your words and decide what kind of content would be most appropriate.
By personifying your audience, you can create something that’s “real” and therefore build empathy for your audience. Creating visual graphics of your typical customer personas can provide a quick and easy reference for your team when they are making key decisions in how they design and communicate.
As marketers, our job is to understand what users need and how they search for it. Once we know this, we can build a marketing strategy fit for our target audience. It’s important to think about everything from the messaging in our ad copy all the way down to the locations and age groups we are targeting, ensuring that we’re reaching the right people at the right time.
Audience personas are a useful first step to knowing your audience. However, the personas may change over time, so they should be reviewed regularly to reflect updates in the industry and wider economy.
When we talk about the main differences between ‘digital PR’ and ‘traditional PR’, the common talking points usually refer to the processes of link building and campaign ideation, and the varying metrics (think DR v AVE, or referral traffic v ‘reach’) used to demonstrate the value of links and mentions to clients. However, despite the many differences in how we approach campaigns to generate the type of coverage we want, one thing remains universal: the content we create must be newsworthy.
I have seen many campaigns fail to realise their full potential simply because the content or asset created is not newsworthy. It’s therefore crucial that the hook remains front of mind right from the start of the ideation process, throughout the data analysis, and of course when drafting the final press release.
Yes, a piece of content telling me how much home insurance the family from Malcolm in the Middle would have been required to pay would have been interesting if I had watched the show and was thinking about purchasing home insurance – but that’s a very small minority. Journalists have similar goals in mind to us as PRs; they want their stories to reach as many people as possible through social shares and, ultimately, traffic to their sites. Just as we need a reason to create links to our clients’ sites, a journalist needs to create that trigger point where a reader feels the inclination to click on their article and read more.
But back to the purpose of this blog…one of the best ways to establish what makes a great news hook is to consume as much media as possible. Read about what journalists are writing about, what is topical in today’s news agenda and think about how any of this news could be relevant or connected to your brand or client. Write down a list of example headlines and ask yourself which article you (or your Mum, Nan or your mate down the pub) would be most likely to click on.
To give you an example of a couple of our recent ideation sessions, these led us to the dream headlines of “Why your hipster beard could be causing delays at UK airports” and “Which UK cities are most and least prepared for the rise of electric vehicles?“. The first one at least didn’t end up being the finished product but I’ll explain what happened with that one in another blog.
Once you have your dream story or headline in mind, you then need to think about how you’re going to prove your ‘hypothesis’ or claim – and this doesn’t necessarily need to come from information or data from yourselves or the client. We live in a world where data is freely available, be that through a quick Google search, FOI requests, or by simply reaching out to fellow PRs, industry experts, universities and so on.
Let’s go back to the latter example I mentioned earlier. We wanted to work out which UK cities were most and least prepared for the anticipated rise of electric vehicles, and to do this we decided to look specifically at charging points. But how do we get the data? We obviously didn’t have the time or budget to commission our own research to count the total number of charging points in towns and cities across the UK. However, after speaking to the team over at Zap-Map, we realised they had all the data we would need (to form part of our story, anyway).
Next, we needed to know where the charging points were most needed, to prove how and why some cities were behind or ahead of the curve. A quick Google search surfaced a previous FOI request to the DVLA, giving us the total number of full driving licence holders for each individual postcode area of the UK. With both of these data sets, some quick maths gave us the number of licence holders per charger for each location (as visualised below).
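As a sketch of that ‘quick maths’ – the figures below are purely illustrative, not the campaign’s real DVLA or Zap-Map numbers:

```python
# Illustrative figures only – the real campaign combined DVLA FOI data
# on full licence holders with charging point counts from Zap-Map.
licence_holders = {"Sunderland": 180_000, "Portsmouth": 140_000, "Dundee": 90_000}
charging_points = {"Sunderland": 120, "Portsmouth": 35, "Dundee": 75}

# Fewer licence holders per charger = better prepared for the EV revolution.
ratios = {city: licence_holders[city] / charging_points[city] for city in licence_holders}

for city, ratio in sorted(ratios.items(), key=lambda kv: kv[1]):
    print(f"{city}: {ratio:,.0f} licence holders per charger")
```

Dividing one public dataset by another is often all it takes to turn two spreadsheets into a rankable story.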
As you can see in the results, the ‘most prepared’ cities – where the ratio of licence holders to charging points was lowest – included Sunderland, Milton Keynes and Dundee: great news for their local areas and a strong positive story we could pursue when pulling together our press releases. It turned out Sunderland City Council were ecstatic with the findings and helped to push our news out even further! On the other hand, it was clear Portsmouth, Shrewsbury and Derby had some catching up to do, and we could let their local press know about this too.
We outreached the various versions of the press releases to national, regional and trade press and ended up securing 300+ links for our client across a number of high-profile titles including the i, Daily Express, MSN, The Scotsman and Yorkshire Evening Post. The campaign only took a few days to pull together, demonstrating how using existing, credible data can deliver maximum bang for buck when it comes to outreach. It’s less likely to be questioned by journalists, meaning it’s more likely to be used.
There is an abundance of data out there in the public domain ready to be used for digital PR campaigns, the main challenge is just crafting the story (or hook) that you’re going to shout about. You should be able to explain your concept or idea to someone in a few words or ultimately to a journalist in a couple of short sentences during a pitch email so make sure this is evident right from the start. Combine your story with the best existing data and reap the rewards of your hard work when you begin your outreach.
Think of your newshook(s) and visualise what kinds of publications would realistically run that story before you begin anything
Establish how you can prove your theory or claim in your headline
Demonstrate as much clout as possible in your press release; reach out to industry experts for additional data, comments or to help steer your story if needed
Ensure the final data and story are easy to understand and presented clearly and concisely: you should be able to explain your story in a couple of short sentences via email to a journalist
Build your media lists to fit the various sectors outlined in the data – this could relate to the various industries, regions or age brackets mentioned
If it doesn’t work first time, revisit the data and look at alternative angles and stories
Don’t be afraid of ‘re-outreaching’ a campaign a few months later if you feel it could fit with the constantly changing news agenda
How to use data to create the best PR news hooks was last modified: June 28th, 2019 by James Watkins
Right now, I’m alone in the Impression office. A number of my colleagues are on annual leave for Glastonbury, so I’m guessing that, right now, they’re sat in traffic with thousands of other people.
But give it a couple of hours, and the updates will begin. Photos of the campsite, videos of the music, random chatter about what a fantastic event it is… basically, a whole bunch of stuff to make me a tad jealous that they’re there and I’m not.
If you’re also facing a couple of days of jealousy-inducing content from colleagues and friends, worry not! I’ve pulled together 26 of my favourite things from around the web to entertain, inform and help keep you away from the Glasto-filled Facebook feed. Enjoy!
13 conference write ups
June isn’t just festival season; it also seems to be conference season! We’ve enjoyed not one but TWO awesome conferences this month. Here are some of our favourite session write ups from Outreach Conference and Search Leeds:
12 PR campaigns
PR is all about creative thinking combined with a deep understanding of what content travels well; we regularly share our favourite campaigns from around the web – here are some of our faves this month:
Virtual Flam – it’s not new, but with video continuing to grow in popularity, it’s a great example of the power of cool visuals
ePassport Delays – national coverage, bad news for beards (another of our campaigns)
1 cracking bit of news for the Impression team…
OK, so maybe this isn’t quite as ‘useful’ as the rest of the links shared here, but hopefully it is interesting and maybe even a bit inspirational…
Last week, Impression won TWO awards at the European Search Awards. One was for Best Use of PR in a Search Campaign, in recognition of that time we named a worm after a US president and achieved global coverage and massive ranking increases. The other was for Best SEO Campaign in the Health Sector in recognition of outstanding ranking, traffic and revenue increases in the highly competitive eyewear industry.
So there you have it! 26 things to read while you avoid Glastonbury updates. Have we missed anything here? Feel free to share your own interesting news, updates, thoughts and advice in the comments below.
26 Things to Read While Avoiding Glastonbury Updates was last modified: June 26th, 2019 by Laura Hampton
For businesses to remain competitive online, they must recognise that their biggest competitor is no longer an industry rival, but Google itself. In recent years, the search engine giant has progressively increased its market share in various heavily feed-based verticals, including hotel bookings, flights and, more recently, recruitment.
In 2017, Google introduced Google For Jobs, an enhanced search feature that aggregates listings from job boards and career sites and displays them prominently in Google Search. The image below shows job listings in the Nottingham area.
By introducing this new search feature, Google has quietly placed itself as the second largest job platform in the UK according to SEO tool Sistrix.
People searching for roles are now able to access job descriptions and even apply to positions from the Search Engine Results Page (SERP) itself. Google is able to pull in job roles from correctly marked-up schema data from job sites like Monster, LinkedIn and Glassdoor and give job seekers the option to filter by category, city, date posted and company type.
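Eligibility for these listings depends on schema.org JobPosting structured data being present on the job page. Below is a minimal sketch – every value is hypothetical, and only a handful of the available properties are shown – built here as a Python dictionary and serialised to the JSON-LD a page would embed:

```python
import json

# Hypothetical listing: the property names follow schema.org's JobPosting
# type, but the values are made up purely for illustration.
job_posting = {
    "@context": "https://schema.org/",
    "@type": "JobPosting",
    "title": "SEO Executive",
    "datePosted": "2019-06-01",
    "hiringOrganization": {"@type": "Organization", "name": "Example Agency"},
    "jobLocation": {
        "@type": "Place",
        "address": {"@type": "PostalAddress", "addressLocality": "Nottingham"},
    },
}

# This JSON would sit inside a <script type="application/ld+json"> tag.
print(json.dumps(job_posting, indent=2))
```

Sites that mark their listings up this way give Google a clean, structured feed of roles to aggregate, which is exactly how the SERP feature is populated.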
Google’s reasoning behind the update was that it sought to improve the candidate experience and counteract the saturation of job boards in its search results. However, the more cynical amongst us believe that Google was simply trying to protect and grow its market share. With LinkedIn and Indeed becoming increasingly dominant, Google may have introduced the search feature to discourage people from navigating to these sites directly and thereby withholding their data from the search engine giant.
Moreover, LinkedIn is owned by Microsoft, who use LinkedIn data to improve targeting options on Bing search ads. Google may have also taken this into consideration, as reducing LinkedIn’s market share consequently reduces the competitiveness of Bing, its primary competitor.
Google has also extended its reach into other industries, most notably holiday bookings. Google Flights was integrated into the Google search results in March 2018 and immediately took the SERP real estate that competitors such as Expedia and Kayak had previously enjoyed. The image below shows the search results for ‘flights to New York’, and it is immediately clear that Google wants users to click on Google-provided results, whether ads or Google Flights.
In March 2019, Google launched Google Hotels. Like Google Flights, this new product is integrated with Google’s search results and even Google Maps, allowing Google to increase its online market share of the travel industry. The image below shows the results for ‘hotels New York’.
With Google implementing these industry-changing features to their search results on a seemingly yearly basis, how long until we see a similar update in another feed-based vertical?
In my opinion, not long at all. And there’s a specific industry that I have in mind: real estate.
Why? The real estate industry holds many similarities with the verticals that Google has already targeted; it utilises local search, runs from scraped or feed-based content, and is easy to monetise.
The SERPs for real estate cater to local search, as shown in the image below for ‘flats to rent’.
Real estate revolves entirely around locations, and Google will serve users with the results it deems most relevant to either their geolocation or to geo-targeting keywords such as ‘flats to rent London’.
Google has a tendency to make big plays into spaces which are feed-based or easily scraped and aggregated, even if only through machine learning (ML). While third-party property sites like Rightmove and Zoopla provide existing sources of property data, Google could gather data straight from feeds provided by the industry’s property sales and lettings software, thereby ‘cutting out the middle man’.
There are several reasons why Google would do this. Firstly, it would allow estate agents to advertise their properties without paying the fees charged by the established property portals, consequently cutting their costs and increasing their exposure.
Secondly, being able to promote their properties at reduced costs would allow more agents to allocate money to paid advertising opportunities. Google Ads, Google’s online advertising platform, could be the perfect place for real estate agencies to spend their remaining budget. Given the sheer size of the real estate industry, which is currently valued at 250 billion euros in the UK alone, Google would be set to generate a huge amount of new revenue.
Google would not even necessarily need feeds, as they could simply scrape content from various websites and reward those websites with the best on-site schema markup. Nevertheless, feeds are likely to be the method of choice, as the dominance of Zoopla and Rightmove means that real estate already primarily operates in this way.
Additionally, many property companies already pay specialist web agencies to use platforms which come with Zoopla & Rightmove integrated into the codebase. If a property feed were to be introduced, these companies could eventually take a similar route with Google in the future.
Better User Experience
Similarly to job postings, real estate is an extremely saturated, feed-based market. Zoopla, Rightmove and On The Market are all dominant players in the industry, and many of their house and flat adverts are duplicated across the different property portals. This represents a very real reason for Google to step in and create a more streamlined service that will help users make a more informed decision.
Opportunity for display and sponsored listings
Google has made it no secret that on-SERP advertising will play a big part in the search engine’s future. It is estimated that Google generated $136 billion in revenue in 2018, $116 billion of which was made through its advertising platform. The search engine giant relies heavily on its adverts, which is why we’ve seen a large increase in the SERP real estate granted to them in recent years.
The image below shows how search ads have increased in size in just two years (Source: Sparktoro).
Google has not only increased the amount of text advertisers can use for their products, but has also made ads more subtle by switching from the yellow block to the current green version. It therefore makes sense that Google would look to build out further advertising channels.
We could see a similar business model for real estate SERPs, especially given the increased interest around property search terms in recent years.
Below is a mockup of what a future ‘Real Estate’ search listing could look like.
As you can see, users would have the ability to filter results by ‘Rent’ or ‘Buy’, as well as by the type of property and its price range. Given that this information is already available in property portals, Google should have no issue collecting and presenting it to users.
The launch of Google Jobs should not have come as a surprise to anyone, and neither should its potential foray into real estate. Google will want to diversify its income channels as much as possible, and any market that allows it to do so is under threat. It is no longer a matter of if they succeed, but when.
Is Google about to shake up real estate SEO? was last modified: June 25th, 2019 by Hugo Whittaker
We believe our wins can be attributed to our great team and our approach, which combines creativity with technology and data to achieve measurable, impactful results for our clients. Have a look at our photos below, or get in touch if you’d like to know how we can help you.
Impression Wins SEO and PR Awards at European Search Awards 2019 was last modified: June 24th, 2019 by Laura Hampton
Sophie is talking about how and why we should be using Google data, way beyond traditional keyword research.
Sophie is the Director of answerthepublic.com, a search query data visualisation tool which maps out the keyword predictions users see when they make a Google search, allowing us to answer our public/audience a lot better.
She starts by discussing the art of using search data to uncover motivations, attitudes and truths about audiences that are typically lacking in standard research methods. By ignoring these attitudes and behaviours that consumers display when they search, businesses are missing out on a wealth of incredibly valuable insights which come for free.
Jokingly, Sophie makes a Spice Girls reference, telling us that “consumers don’t tell marketers what they want, but they do tell Google what they want, what they really really want”. But it’s completely true: users treat Google like a confessional box, typing in some of their deepest, darkest thoughts and giving us marketers valuable insight into what goes on in a user’s mind when they turn to Google for a solution.
“Google searches are the most important dataset ever collected on the human psyche”
She makes the point that long-tail searches are grossly undervalued, and that there is huge potential in exploring them more deeply through different methods of keyword research.
Traditionally, keyword research has been performed to better understand what phrases searchers use to find content. The Google dataset we can access as marketers is huge; with users talking to Google as though talking to a friend, we can delve into an enormous amount of data, discovering users’ personalities and needs with zero research bias.
Sophie then asks the audience to think of all the things we’ve told Google in the past. Would you share those thoughts with your close friends, family, or partner?
We really do turn to Google for absolutely anything, and in most cases users will act as though they are directly talking to the search engine – sharing their deepest secrets, catastrophic life events or even desperately trying to resolve a kitchen nightmare.
She then shares more insight into Answer The Public and what the tool actually does for marketers. Essentially, it’s a clever insight tool that combines the suggested searches from Bing and Google and extracts them into a “keyword wheel”, doing the job of autocomplete suggestions for you.
Search listening tip – use spaces within Google autosuggest as wildcards to uncover rich attitudinal and behavioural insights.
Sophie then goes into search listening a little deeper, and how you can use it to explore tribalism in your audience’s world. She uses the example of looking into queries around “dresses like”: by doing this, you can gain valuable insight into the key personalities influencing particular products, e.g. “dresses like Meghan Markle’s”. This is also extremely useful for researching your competitor set.
Using another example around hairstyles, you can start to understand who the influencers are in the social world, e.g. “hairstyles like Kris Jenner’s” – allowing you to gain insight into what real women want, and what they are telling Google.
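If you want to experiment with the wildcard tip yourself, the sketch below builds a request URL for Google’s unofficial autosuggest endpoint. Note the hedging: this endpoint and its parameters are an undocumented convention widely used by SEO tools, not a supported API, and they may change without notice:

```python
from urllib.parse import urlencode

# Unofficial, undocumented suggest endpoint - not a supported Google API.
SUGGEST_URL = "https://suggestqueries.google.com/complete/search"

def suggest_url(phrase: str) -> str:
    # A trailing (or embedded) space in the phrase acts as the wildcard
    # slot: the returned completions fill the gap.
    return SUGGEST_URL + "?" + urlencode({"client": "firefox", "q": phrase})

print(suggest_url("dresses like "))
```

Fetching that URL returns a JSON array of completions, which is effectively the raw material tools like Answer The Public visualise.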
If you’re working with a new client or a business that doesn’t have any personas to work with, this is a great starting point. You can look at users, and how they relate themselves to your products and why they are looking for them, building personas from this.
By doing this, you can uncover labels, both literal and inferred, that your audience gives itself and then focus on these.
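That label-mining step can be sketched in a few lines of Python. The query sample and the “like …” pattern below are invented for illustration; in practice the queries might come from Search Console or a suggest-scraping run:

```python
import re
from collections import Counter

# Hypothetical query sample (not real data)
queries = [
    "dresses like meghan markle's",
    "dresses like kate middleton wears",
    "dresses like meghan markle's white dress",
    "hairstyles like kris jenner's",
]

def extract_labels(queries):
    """Count the two words following 'like ' in each query, surfacing
    the personalities and brands an audience compares itself to."""
    labels = Counter()
    for q in queries:
        m = re.search(r"like\s+(\w+\s+\w+)", q)
        if m:
            labels[m.group(1)] += 1
    return labels

print(extract_labels(queries).most_common(2))
```

The frequency counts give you a rough ranking of who or what your audience anchors itself to – a starting point for personas rather than a finished answer.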
Sophie then explains that you can really get a snapshot in time by using search listening, understanding the most pertinent questions and sentiment around important topics being searched for recently – whether societal or commercial.
To be a better marketer, you need to understand your audience, and to do this, you need to understand their intent and what they need. If you don't, they'll simply go to a competitor who understands them better and relates more closely to their needs.
At the end of the day, as marketers, we are the gatekeepers to a massive source of insight, so we need to really think about how we can make use of this search data.
To summarise, Sophie shares the key learnings from today’s talk:
Think about the entire purchase journey
Make use of wildcards and look out for tribalism
Look at searches to understand the most influential brands and personalities
Understand shifts in audience sentiment by comparing search content over time
Search Leeds: Sophie Coley – Search Listening: How and why we should be using Google data way beyond traditional keyword research was last modified: June 21st, 2019 by Paige Lambert
One of the hardest things digital marketers face at the moment is trying to understand digital media in an AI context – the purpose of Jon's talk today is to give us a starting point for implementing AI in our digital strategies.
Jon is going to give us an overview of how the big 3 – Google, Facebook, and Amazon – use AI. He will then show us that the future of Paid Media is automated. Finally, he is going to give us some tools we can use tomorrow.
A good starting point with machine learning is that a lot of it can be entirely useless – as digital marketers we need to find a way of focusing AI in practical ways for advertising. The best way to do this is to look to the strategies employed by the big 3.
What do Amazon, Facebook, and Google do which is useful to us as marketers?
Amazon offer unlimited movies, music streaming, and photo storage for Prime members through a huge number of their products. They are also a big player in voice search with products such as Alexa. Google have now entered this market as well.
The amount of information which the big three can collect on customers using these products, as well as search and site behaviour, can potentially give them a huge advantage over competitors when it comes to advertising and selling more generally.
This can be quite aggressive, and there are fears that these companies are storing more data than they let on.
This data is invaluable to these companies, as it lets them personalise advertising to people who are already interested in certain products. More data plus user intent leads to better personalisation, and therefore more ad revenue.
Automation and the future of Paid Media
Jon goes on to explain what machine learning promises for the future of PPC. He mentions just a few features which will be increasingly useful to us as digital marketers:
Ongoing audience expansion
Automated bidding strategies such as maximise conversions
Automated building of customised ads
He points out, however, that we have to be quite careful with how we use these services. Does AI have the potential to be the worst kind of black box? As marketers we need to be careful not to just let the AI services of the big 3 run wild without knowing what's going on. In order to use what's available effectively, we need to follow 3 steps:
Interpret how they’re using AI
Test new releases to verify which add value
Understand how to implement strategies using AI effectively
The process for judging whether AI is adding value should always be the same – does it make us better, brighter, and faster?
Strategies you can use tomorrow
As promised, Jon now details some opportunities – where can you start tomorrow with AI?
Automated bidding – programmed rules and scripts for bidding which platforms offer off the shelf.
Positional-based bidding – try this in combination with high-impression-share campaigns that are already performing well but where you want to hit the top spots. Set min/max rules to ensure that you don't overbid for the top spot
Target CPA/ROAS – these strategies reflect rules based on commercial objectives. Don't just set these arbitrarily across the whole account – tailor the rules to specific campaigns, product areas or service areas to make them more effective at reducing CPA/increasing ROAS
Impression share targeting – test on your most valuable segmented campaigns, such as those with high ROAS, in order to boost them further
Scripts – readily available online, scripts can automate time consuming tasks as well as helping us make more effective data-driven decisions.
Time-based heatmapping is a great place to start. Try to focus not just on traffic and CTR, but also look at margin, conversions, conversion value and so on. And don't just look at trends across the whole account if you're going to apply changes based on them – every campaign is different. The 24/7 bidding script can then be used to take action on the insights heatmapping provides
N-gram analysis – statistical keyword-fragment analysis, which is great for finding both positive and negative keywords to add to the account, e.g. high-converting phrases
Ad copy customisation – based on factors such as device and location, price and promotions, and so on. Use Google Sheets to build templates for this and have the ad platforms import the data
Micro moment targeting – target the moments that matter in the user journey.
Using off the shelf predictive programs like DataRobot, we can better understand people’s search habits and predict the next most likely action they will take. This can massively help advertisers by enabling us to target people at the right moment – when they’re ready to convert
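To make the rule-based bidding idea concrete – target CPA tailored per campaign, with the min/max safeguards Jon recommends – here is a hedged Python sketch. The campaign data, field names and thresholds are invented; real platforms expose this through their own rule engines and APIs:

```python
# Hypothetical campaign stats (not from any real account)
campaigns = [
    {"name": "Brand",    "cpa": 8.0,  "target_cpa": 10.0, "bid": 1.20},
    {"name": "Generic",  "cpa": 25.0, "target_cpa": 15.0, "bid": 0.90},
    {"name": "Shopping", "cpa": 14.5, "target_cpa": 15.0, "bid": 0.75},
]

def adjust_bid(c, step=0.10, floor=0.10, cap=3.00):
    """Nudge the bid up when the campaign beats its target CPA and down
    when it misses, clamped between a floor and a cap so a rule can
    never run away and overbid."""
    if c["cpa"] <= c["target_cpa"]:
        new = c["bid"] * (1 + step)
    else:
        new = c["bid"] * (1 - step)
    return round(min(max(new, floor), cap), 2)

for c in campaigns:
    print(c["name"], adjust_bid(c))
```

The point is the structure, not the numbers: each campaign carries its own target, so the rule is tailored rather than applied arbitrarily across the account.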
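The n-gram analysis mentioned above can be sketched simply: pool clicks and conversions by keyword fragment so strong and weak phrases stand out. The search-terms data below is invented for illustration:

```python
from collections import defaultdict

# Hypothetical search-terms report rows: (search term, clicks, conversions)
terms = [
    ("cheap red dresses", 40, 0),
    ("red dresses next day delivery", 25, 4),
    ("designer red dresses", 30, 5),
]

def ngram_stats(rows, n=2):
    """Aggregate clicks and conversions per n-gram so that high- and
    low-converting fragments surface as positive/negative keyword
    candidates."""
    stats = defaultdict(lambda: [0, 0])  # ngram -> [clicks, conversions]
    for term, clicks, convs in rows:
        words = term.split()
        for i in range(len(words) - n + 1):
            gram = " ".join(words[i:i + n])
            stats[gram][0] += clicks
            stats[gram][1] += convs
    return stats

stats = ngram_stats(terms)
print(stats["red dresses"])  # pooled across all three terms
```

A fragment like “cheap red” with plenty of clicks and no conversions would be a negative keyword candidate; a high-converting fragment a positive one.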
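The ad copy customisation workflow – a spreadsheet of variables feeding ad templates – boils down to simple string templating. The feed rows and template below are illustrative stand-ins for what you might maintain in a Google Sheet:

```python
# Hypothetical feed rows, like the columns of a Google Sheet the ad
# platform would import; the fields and copy are invented.
feed = [
    {"location": "Leeds", "device": "mobile", "price": "£49", "promo": "10% off"},
    {"location": "London", "device": "desktop", "price": "£59", "promo": "free delivery"},
]

template = "Dresses in {location} from {price} - {promo} today"

# One customised ad line per feed row
ads = [template.format(**row) for row in feed]
print(ads[0])
```

Keeping the variables in a sheet means non-developers can update prices and promotions without touching the ads themselves.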
Jon summarises his talk as follows:
Focus on the big three – what are Amazon, Facebook and Google doing?
Be cynical – don’t blindly accept what various platforms are offering as useful. Develop your own filter and test new features as they are rolled out.
Start with small, incremental opportunities and build out if results are positive.
Search Leeds: Jon Greenhalgh – A.I. in Paid Media: strategies you can use tomorrow was last modified: June 21st, 2019 by Luke Northbrooke
This is not a useful talk and it won’t have actionable insights. This is a 20 minute rant. Barry wants a beer.
First, let’s look at what it is.
It’s a client-side scripting language, which means that code is sent to the client, which compiles it and executes it in whichever way is required. The client (normally a browser) does the heavy lifting.
JS serves a lot of different purposes. It makes pages come to life, it tracks you, it serves you ads and all sorts of different things.
The beauty of JS is that it only needs to be written once and then it runs everywhere. You can write code and create experiences that are independent of the platform. This sort of makes sense. The web needed this to make it grow and do cool shit like Barry haunting you across the web with Sistrix ads.
The problem is that JS causes a lot of bloat – and data isn't free. Pages are getting bigger and making us use more data to access them. Since 2010, the average size of a web page has gone up by 1000%; the average page is now 1.6MB. And let's not forget that the UK has some of the priciest data plans in Europe.
JS ads mean you're paying twice: with your personal data and with your data plan. It also means you need a pretty decent phone, and those cost a lot of money.
The long and short is that we’re paying for the privilege of using websites.
The web is not getting faster, it’s getting slower.
The bells and whistles that we're enabling aren't even for our own benefit. They're there to make us buy shit we only think we want.
Google has data centres everywhere that are constantly crawling the web, and yet they’re complaining that they don’t have the power to render JS in real time all the time. It just costs too much.
Another big issue is privacy. The data that all that code is harvesting is being sent to advertisers or whatever and who knows how it’s being used. Well, we have a bit of an idea. Businesses have a good idea of who you are and what you want to buy. The issue is that it’s not just corporations using this information.
However much Barry complains about Google, he doesn’t believe they intend to be evil bastards. He worries more about governments and other companies that can just plug into that data. We seem to have collectively forgotten about Edward Snowden.
In the early stages of the web, organisations like the NSA realised they didn't need their own spying infrastructure – they just needed access to Google, and then they had everything they needed. Google is not just Google; it's also Android, YouTube and more. Everything you do on an Android phone is known by Google.
Let’s not forget about Cambridge Analytica. It doesn’t exist anymore but it was a proof of concept of what happens when you advertise with bad intentions.
If you take all that data, well, Brexit happens.
This shit happens because the technology actively encourages it. The web has allowed us to build systems that let us manipulate people on a massive scale and influence nations.
We also have the Hawthorne effect. This is a psychological phenomenon where, when you as a person are aware you’re being observed, you change your behaviour subconsciously. We all now know that we’re being watched and we’re altering our behaviour, whether you accept it or not.
Can we be truly free if we’re altering our behaviour because we know we’re being watched?
Are you a free person?
His key message is that technology is not neutral. It doesn’t exist in a vacuum, for its own sake. Tech encourages certain behaviour.
When the internal combustion engine was invented, it encouraged people to unintentionally kill one another by going really fast on the road. As a consequence, a lot of regulations came into force.
You cannot escape from it.
Does he want JS to disappear into a hole? Well, no. It exists and it’s here. No matter how shitty it is we have to live with it. And it’s even given us some benefit.
The web needed client-side code to enable the next level of technology. Something was going to fill the void if JS didn't. We almost had Flash-based code everywhere with ActionScript.
On top of that, JS can be done fast, lightweight and clean. You don’t need a massive JS library to have a good website, as long as your developer doesn’t load a whole library to do one thing. This is how you distinguish bad developers from good developers. Cherish good developers. They’re very, very valuable.
(At this point, Barry used a Magic: the Gathering card to illustrate the evil practices JS exposes, and it became my favourite slide deck of the day – as if the DnD reference wasn't enough.)
Because we’re aware of the evil that can be done with JS, people are now trying to work out how to make the web more ethical.
So should we get rid of it? No. We have to hang onto it and make it the best we can.
It’s the devil we know, rather than whatever might fill the void.