Things are moving quickly in the Google product ecosystem. One change that’s flown under the radar is the switch to parallel tracking on October 30, 2018. If you’ve been procrastinating on prepping for the change, don’t worry! We’ve got your back.

What Is Parallel Tracking?

Announced way back in 2017, parallel tracking is Google’s next advertising tool to speed up performance in the “mobile-first world,” alongside features such as Accelerated Mobile Pages (AMP) landing pages and the Mobile Speed Scorecard. With a one-second delay in load time decreasing conversions by up to 20%, every moment counts.

Parallel tracking changes the way click measurement happens when using 3rd-party tracking software that relies on redirects (Marin, Kenshoo, etc.) outside the Google ecosystem. If you don’t use one of those, you don’t need to do anything. Go optimize your accounts!

When using a tracking URL with linear or sequential tracking, a user clicks on an ad, is sent to an intermediate landing page, then redirected to the final destination. While this often happens so quickly most people can’t perceive it, Google insists that it can negatively impact conversion rates.

Sequential tracking

With parallel tracking, the user is taken directly to the final URL while the tracking URL is loaded in the background, similar to asynchronously loading scripts. The idea is that the user is taken directly (and quickly) to their ultimate destination, improving the user experience and increasing conversion rates.

Parallel Tracking

This is currently an optional feature for Search and Shopping campaigns, but starting October 30, 2018, advertisers will be automatically switched over to parallel tracking for all campaign types. You can see if you have parallel tracking turned on by going to All Campaigns > Settings > Account Settings and looking under “Tracking.”

What Do I Need to Do to Prepare?

Ultimately, you’ll need to talk with your 3rd-party vendor to ensure your tracking templates are compatible. Google has been working with the major players, so you’re likely covered.

Marin has already announced support for parallel tracking and an upcoming webinar to help users get prepped. And according to Kenshoo’s Twitter account, they’ve been working with Google since the change was announced, although they don’t have public documentation on what that looks like.

Here are some general best practices to keep in mind and get ahead of the switch:

HTTPS

Ensure the tracking server supports HTTPS. Also, make sure to include the HTTPS protocol in the tracking template URL to keep everything consistent and clean.
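For example, a tracking template for a third-party click tracker might look something like the line below, where tracker.example.com is a made-up domain and {lpurl} and {campaignid} are Google's ValueTrack placeholders:

    https://tracker.example.com/click?campaign={campaignid}&url={lpurl}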

Redirects

Make sure your tracking URLs use server-side redirects as opposed to on-page redirection through JavaScript; otherwise, the tracking sequence will stop.
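To make the distinction concrete, here's a minimal sketch of what a server-side redirect looks like – a rough illustration in Python with Flask rather than any particular vendor's implementation, with made-up route and parameter names:

    from flask import Flask, redirect, request

    app = Flask(__name__)

    @app.route("/click")
    def track_click():
        # Hypothetical tracking endpoint: record the click, then answer with an
        # HTTP 302 so the hop happens server-side instead of via on-page JavaScript.
        final_url = request.args.get("url", "https://www.example.com/")
        # log_click(request.args)  # placeholder for the vendor's own logging
        return redirect(final_url, code=302)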

Ad Changes

If you’re making changes to the ads themselves, the review process will be initiated and ad delivery will be paused until the process is complete. If the changes are at a higher level (ad group, campaign, account), the review process will be initiated in the background, allowing the ads to still serve unless flagged.

AdWords Editor

If using the AdWords Editor tool, double-check that you have the latest version. If you're not on the latest version, any uploaded changes may delete previously made URL or parallel tracking changes.

GCLID

Make sure the GCLID (Google Click Identifier) is still being added. We would recommend using auto-tagging to accomplish this.

Testing Campaigns Ahead of the Switch

Test individual campaigns (without opting into parallel tracking for the whole account) to make sure the templates are compatible before rolling the change out to the entire account. This setting can be found under Campaigns > Settings > Campaign URL options.

Under the Campaign URL options setting (you may have to expand Additional settings), create a custom parameter called {_beacon} and assign it the value true. You don’t need to use this custom parameter in any of your URLs.

If Google detects this parameter, all clicks under that campaign will get the parallel tracking treatment.

Verify parallel tracking is enabled by looking for &gb=1 in your tracking calls (this indicates a background call from the browser).

Additional Resources

Google has also provided several resources for partners to refer to, including an implementation checklist we recommend looking at before you make the switch. We’ve compiled them for you below:

The post Google Ads Parallel Tracking Guide appeared first on LunaMetrics.

We’ve written about this topic before, but it still seems to haunt us … sampling. If you’re using Google Analytics Standard, we feel your pain. Let’s talk about why sampling happens, and how to know if your reports are affected.

When Does Sampling Happen?

Sampling can happen when you apply segments, secondary dimensions, or filters to your Google Analytics reports. Whenever we modify our standard reports or create custom reports, Google has to query our data, which can lead to sampling. Basically, Google estimates the data based on a percentage of sessions. Jonathan wrote an awesome post that gets into the nitty-gritty of how accurate sampling is in Google Analytics.

How Accurate is Sampling in Google Analytics?
By: Jonathan Weber
Published: March 3, 2016

The threshold for sampling is 500k sessions at the property level for Google Analytics Standard, or 100M sessions at the view level for Analytics 360, for your date range. Because the Standard threshold applies at the property level, you can't use view filters to limit the number of sessions that count toward it.

How Do I Know If My Reports Are Being Sampled?

Google is constantly updating and improving the Google Analytics interface, and that includes sampling warnings. In the past, you may have noticed a warning banner in your reports that your data is based on a sampled number of sessions that are a percentage of total sessions.

The sampling warning has since received a makeover. We now have the sampling shield that changes color when our reports are being sampled. You can find the shield icon at the very top of your report, next to the title. If your shield is green, you're good! No sampling here.

But if your shield is yellow, hover over it to see your sampling levels.

Flow Reports

The Flow reports (User Flow, Behavior Flow, Events Flow & Goal Flow) are almost always heavily sampled, even if you're using Google Analytics 360. Sampling for these flow reports works differently from our other reports: flow reports are always sampled at 100k sessions at the property level. You'll find a warning banner at the top right of your Flow visualizations, under the date range.

Data Studio

If your Google Analytics reports are being sampled, you'll see this sampling in Data Studio, too. You can check for sampling in the bottom left-hand corner of your report. Normally, you will see a small footnote about when the report was run and a link to the Privacy Policy. If your report is being sampled, you'll also see a link that says "Show Sampling."

Clicking on this link will show you which widgets are being sampled and the sampling levels.

What Can I Do About Sampling?

Sampling can be a huge pain because it sometimes paints an inaccurate portrait of your data. If your sampling levels are a small percentage of your overall sessions, the data can be almost unusable.

If you're looking for a quick fix, try shortening your date range to fit within the sampling thresholds, or stick to the standard reports. If your data is still being sampled after shortening your date range, your company should consider Analytics 360.

You might also be able to find the data you’re looking for by using some of the built-in reports like Mobile Overview rather than applying segments. Check out our full list of ways to solve sampling.

Even Google Analytics 360 customers will still see sampling, but there are a number of 360-only options available to help address sampling, even when connecting to other platforms. Check out Unsampled Reporting with Google Analytics 360 for more details.

The post Are My Google Analytics Reports Being Sampled? appeared first on LunaMetrics.

Data quality issues can plague a Google Analytics account, making analysis difficult, time-consuming, or at its worst, leading to incorrect conclusions. There are many ways to affect the data that you collect and use in Google Analytics, and with recent improvements to Google Tag Manager, it’s even easier to have clean, readable data.

For each data challenge, you can typically address the problem either on the “sending” side or on the “reporting” side, with generally the same final result. Which one you choose is often a question of internal team dynamics – who has the technical know-how, the right access, and the time and focus to address these problems.

Why Should We Format & Filter?

Let’s take a step back to talk about why scrubbing, formatting, and filtering our data is so important in the first place.

Reason #1: Data Consistency

Data consistency is one of the most important components of successful, accurate analysis. Consistent capitalization, page URL structure, and data symmetry allow comparisons to be drawn and business objectives to be measured and examined. Formatting and filtering enable data consistency and should be among our closest allies in our quest for analytics accuracy!

Reason #2: Data Relevancy

Making the data make sense to more people is a worthwhile exercise. While undefined or (not set) might be accurate information within certain Google Analytics reports, it's not easy to understand why that value might appear. Cleaning these values up and putting in human-readable labels can help provide valuable information for future analysis.

Reason #3: Automation

We love automation. Really, we can't get enough – it makes life so simple! In the past, addressing data quality might require chaining together variables in Google Tag Manager, complicated filters in Google Analytics, or advanced ETL processes in a third-party visualization tool. Cleaning and storing information correctly, using the tools available to us, can save us time and improve efficiency.

Reason #4: It Doesn’t Require A Developer

I considered titling reason #4 “Because you can!” — because it’s true! Filtering and formatting ultimately allows us to clean up messy data that would otherwise skew our reports – and we can sometimes complete these steps without touching a single piece of code or enlisting the help of our trusty development team. Hooray for that!

“Sending” Changes vs “Reporting” Changes

Over the years, we’ve written more than a few posts on the benefits and wizardry of Google Analytics view filters. These live at the “reporting” step and are the last chance option to sift through the data sent to your Analytics account, retrieve the pieces you need, and ultimately ensure data quality for analysis and reporting.

Until now, view filters were one of the most automated ways to format data to lowercase values, remove those pesky undefined values, and clean up messy URLs or other data. In the past, most formatting & filtering work was completed within the Google Analytics admin window. Filters also help control what data is included or excluded from a particular GA View, though we’ve also made the case that this too can happen at the “sending” step, and be handled partly from Google Tag Manager. Check out our post: A Better Alternative To Exclude Filters in Google Analytics.

Until recently, cleaning up data in Google Tag Manager was a relatively challenging process – requiring nesting custom JavaScript variables or lookup tables to perform repetitive cleanup functions. Then, along came the Format Values option for User-Defined Variables in Google Tag Manager, which gives us greater control over cleaning our data before sending information to Google Analytics or other tools. Check out Simo Ahava’s post here: #GTMTips: Format Value Option In Google Tag Manager Variables.

It should be noted that good data doesn't require Google Tag Manager, and that on-page changes can also help classic JavaScript (analytics.js) or gtag.js implementations of Google Analytics. When possible, using naming conventions and standardization at the page level (e.g. classes, data attributes, or data layer variables) will help everything that follows. However, this isn't always an option, especially if you're relying on user input or anticipating eventual human error.

Fixing the Data First

When possible, fixing the data at its source, or as close to the source as possible, is often preferred. For us, that might mean using the features available in Google Tag Manager to clean up data before it ever gets processed by Google Analytics. While this potentially shifts the burden to a more technical point of contact, there are a few scenarios where this is especially useful.

While one of the most common uses for Google Tag Manager is to send data to Google Analytics, let’s not forget about the other places that we send data. Often, we have pieces of information that are sent to Google Analytics as well as third-party tags. These might include product/transaction info that is shared with third-party conversion tags, page or section level information shared with retargeting or recommendation engines, or copies of data sent to other analytics, CMS, or CRM platforms.

Additionally, with Google Analytics, data is collected at the Property level and then flows down into the various Views underneath that property. When cleanup occurs at the end of the collection process, at the View level, there can be issues with consistent usage of filters. New Views will have no filters applied, and it’s up to the individual to remember to add the existing data cleanup filters to the correct views.

These scenarios should lead you to prefer a “sending” side solution, when possible, fixing the data in Google Tag Manager or on the page so that it’s consistent in Google Analytics and across other platforms.

Fixing the Data Last


On the other hand, with the ease of Google Analytics view filters, it’s entirely reasonable to handle many of the cleanup items inside of Google Analytics. This can be especially helpful when you don’t have the access or resources to use Google Tag Manager.

Other scenarios may help sway your decision towards GA as well. Consider scenarios where many different data sources are flowing into a Google Analytics property. Perhaps you have multiple sites, apps, or offline data that is being sent to the same Google Analytics property. In this case, it may be easier to add one filter in GA instead of tracking down the implementation across many sites/technologies.

Events are a common area where you may want to encourage some standardization – lowercasing all the event dimensions with a view filter is an easy and quick change, and works across any event from any source. Compare that to the effort of remembering to lowercase all variables for all events set up through Google Tag Manager.

If you've followed best practices in Google Analytics and have a Test View created, it's also a fairly easy and standard process to test new filters on the test view and let them run for a period of time before moving them to your main reporting view. This testing process can be trickier to handle in Google Tag Manager, or may require a greater level of testing sophistication.

Tools For Cleaning Up Data

With those considerations in mind, let’s talk through the various ways to clean up data, using both Google Analytics and Google Tag Manager.

“Format Values” on Variables in Google Tag Manager

For issues where you need to uppercase or lowercase a variable’s value, this is a great new feature. As you create new user-defined variables in Google Tag Manager, look below for the Format Values option and use the built-in features to standardize the format before using it in Tags. This is great for any text fields, whether you’re pulling them from the data layer, form fields, or elsewhere.

You can also use this feature to clean up unwanted "undefined" or "(not set)" values – helping to make sure that missing data doesn't muddy your reports or, worse, block a hit from sending to Google Analytics. Consider a custom dimension for an Author field on a content website. If we're missing that specific piece of data, we can replace it with an appropriate default like "other" or "Author Not Set."
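Outside of GTM, the logic behind that option is nothing more than a default value. A quick Python sketch of the same idea, using a hypothetical author field:

    def clean_dimension(value, default="Author Not Set"):
        """Swap missing or placeholder values for a readable default."""
        if value is None or str(value).strip().lower() in ("", "undefined", "(not set)"):
            return default
        return value

    print(clean_dimension(None))          # Author Not Set
    print(clean_dimension("undefined"))   # Author Not Set
    print(clean_dimension("Jane Doe"))    # Jane Doe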

Filters in Google Analytics

Google Analytics filters are great and can cover a number of the issues we want to clean up – like lowercasing and uppercasing – and also give us Search and Replace and Advanced filters to fix common misspellings, pull information out, and move things around.
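Under the hood, Search and Replace and Advanced filters are driven by regular expressions. As a rough illustration of the kind of transformation a Search and Replace filter performs on a request URI (the paths below are made up):

    import re

    # Collapse inconsistently named blog folders into one canonical path.
    pattern = re.compile(r"^/(blogs|blog-posts)/", re.IGNORECASE)

    for request_uri in ["/Blogs/hello-world", "/blog-posts/faq", "/blog/existing-post"]:
        print(pattern.sub("/blog/", request_uri))
    # -> /blog/hello-world, /blog/faq, /blog/existing-post (last one unchanged)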

Basic Google Analytics Filters for Every Site
By: Zee Drakhshandeh
Published: December 10, 2015

Check out our section on Data Consistency for tips on how to handle:

  • Prepend Hostname to Request URI
  • Lowercase Hostname
  • Lowercase Request URI
  • Lowercase Search Term
  • Lowercase Campaign Dimensions
  • Remove Query String
  • Append Slash to Request URI
  • Search and Replace Filter

View Settings in Google Analytics

In addition to the common filters we can add, the View Settings in Google Analytics can also help us with a few common issues. This is where you can address extra query parameters, the default page name, and site search.

Again, with a challenge like query parameters, you may find this will work best in Google Analytics or in Google Tag Manager. We covered this debate as well in an earlier post.

Methods to Strip Queries from URLS in Google Analytics
By: Samantha Barnes
Published: April 17, 2015

When Should I Use GTM vs GA?

There's no one-size-fits-all approach to variable filtering and adjusting, but generally our rule of thumb is to create a strategy, document it, disseminate it among your team, and stick to it! For example, if you decide to use Google Tag Manager for lowercasing variables, make this your official process. Ensure that all users with access to GTM are trained on the tool, and add variable filters to your publishing checklist.

The post Cleaning Your Data with Google Tag Manager and Google Analytics appeared first on LunaMetrics.

Nonprofit organizations have tough jobs. They’re busy saving the world one creative idea at a time and often have few resources to devote to developing or managing a digital strategy.

Through all the work we do – consulting, training, and blogging – we want to help demystify digital marketing and empower teams to make efficient strategy decisions so they can spend time promoting good in the world. Our community outreach efforts range from give-back days and fundraising to, perhaps our favorite, sharing information with local organizations and nonprofits about our specialty areas. Learn how a few quick-hitting digital marketing wins could set up your organization for similar success.

Four Hours of Learning, Questions and Discussion

A few weeks ago, we invited 412FoodRescue to our offices for a half-day digital training session. We shared information about SEO, Paid Search and Google Analytics. Our goal was to equip the nonprofit team with best practices and provide quick wins to support their ever-expanding mission.

Having an in-person training allowed for a day that was about collaboration and team brainstorming instead of a one-sided lecture. As Becca from 412FoodRescue noted, “It was great to have an open learning environment where we could jump in and ask any question we needed.”

How 412FoodRescue Helps Pittsburghers

412FoodRescue is a three-year-old start-up that began–as the name suggests–in the "412": Pittsburgh, PA.

Their business model is simple, yet so effective.

They partner with local businesses like grocery stores and restaurants to “rescue” surplus food that’s about to go to waste. When they identify an over-abundance of food, they leverage their network of 1,300+ local volunteers to complete same-day deliveries. This is coordinated through their app where volunteers receive real-time notifications when there’s a local delivery in their area.

Think of it as Uber for browning bananas!

Since 2015, 412FoodRescue's volunteers have rescued 3.3 million pounds of food, generating 2.8 million meals. Their app is gaining national attention and they're looking to roll out their business model to other cities. Some of our employees are food rescuers, so it became a cause we wanted to contribute to digitally. If you're in the Pittsburgh area, you can sign up for their volunteer opportunities, too.

Optimizing Your Site to Find Volunteers

We began the day with SEO, sharing tips for writing unique, tightly focused title tags, meta descriptions and H1s. The goal of these fields is to reflect a page's theme. With a little keyword research, we workshopped title tags and meta descriptions for a few of their key pages.

Their site scored 70/100 on our 13-question "Mini SEO Checklist," which is a great score, so we spent time improving their on-page tags instead of delving into advanced technical items.

After using Google's Keyword Planner, we changed the "Volunteer" page title to "Volunteer Opportunities" and refreshed the meta description to include a stronger call to action. When we took a look at Google Search Console, we saw their CSA program–lovingly dubbed "Ugly CSA"–actually had search demand. We added "in Pittsburgh" to help the title be more locally relevant.

It's these small changes that can make a huge difference to a nonprofit site, especially when there's a critical need to be focused locally.

As a next step, the team plans to update metadata for their top 15 pages and think of future content opportunities as they conduct keyword research.

Does your business have a few key pages that could benefit from a little keyword research and some data analysis? Start small by taking cues from Google Search Console and optimizing pages by incorporating the keywords that users are already typing in to get to your pages.

Reaching a Larger Audience with Google Grants

Next up was paid search.

The team was curious to learn about creating better campaigns. Their agency had set up a Google AdWords Express account a few months ago, but we encouraged them to change their account to a “typical” Google Ads account and to sign up for a Google Grants account for additional bidding options, better management tools, and more detailed insights.

With a monthly Grants budget of $10,000, we found they could amplify their coverage by expanding their campaigns, defining important goals, choosing audience targeting and experimenting with ad formats. Having this capability will significantly change the way 412FoodRescue communicates with Pittsburghers via advertising.

Get Free Advertising For Your Nonprofit with Google Grants
By: Heather Post
Published: June 1, 2017

Together, we outlined a search campaign designed to encourage people in the Pittsburgh area to volunteer. We chose keywords, wrote ad copy and decided on the best targeting to get their ads in front of the right people.

The team was also excited to learn about remarketing opportunities that would allow them to communicate a different message to people who had already been to their site and had shown interest in their app. This option will be a quick win, especially in the volunteering space.

To complete the paid search portion of the training, we provided a template report in Data Studio to help 412 Food Rescue quickly analyze their Google Ads data to facilitate future marketing and advertising decisions.

Take a minute to think about what relevant ad copy might look like for your users. Are you trying to rally an army of volunteers or drum up additional donations? Crafting ad copy to speak to each group of users and applying the targeting options in Google Ads can ensure you’re delivering appropriate advertising.

Seeing the Big Picture with Google Analytics

We closed the day with an overview of Google Analytics. We saved the most complicated, most detailed project for last.

The goal here was to really empower the team on where to begin with their overall analytics strategy and what resources are available to learn more about GA. We provided them with a customized list of “homework” items to help plan out their overall analytics solution and links to great resources to learn how to accomplish those tasks.

We started with a discussion of 412FoodRescue's business objectives and how to define them. Their goals revolve around engagement, user interactions, and downloading the app to become a Food Rescuer. To plan out your own strategy, check out Sam's amazing post A Simple Start to a Powerful Analytics Strategy.

From there we went into the importance of a solid foundation and taking advantage of everything GA has to offer out of the box. We provided some recommendations of things to enable and update before starting to implement more customized features – for example, filtering out extra query parameters and setting up site search.

The true power of analytics comes from customization. Your needs are different from the needs of the person next door; it's impossible for Google to make an all-encompassing solution. Events and custom dimensions are the easiest tools to unlock more in-depth insights for your site or app. We walked through the site and talked about important events to start tracking now, such as clicks to download the app.

Finally, we ran through the Google Analytics interface together – answering reporting questions, showing the team our go-to reports, and making quick, easy updates to their settings, such as setting up a test view and updating filters.

Applying These Tactics to Your Nonprofit

Digital marketing can be overwhelming, but it can be approachable! The first step, like learning any new skill, is making it a priority.

As 412FoodRescue's CEO and co-founder Leah Lizarondo noted, "We learned so much from our session and also learned that there is so much more to sink our teeth into."

Setting aside time for keyword research and analyzing your existing site data can help you focus your marketing efforts and make sure you’re maximizing your limited resources.

Even a few hours can make a big impact. This is especially true for smaller organizations like nonprofits where time is valuable and resources are scarce. Our advice is to start small and scale what works.

Below are a few quick wins that you can apply to up your digital marketing:

  1. Conduct keyword research for each page to determine a priority term. Write an optimized title tag and meta description centered around that term. Ensure you're using your full character limit and include a descriptive, compelling call-to-action. Think about how your audiences might search – from volunteers and donors to those researching your mission. Learn how to write effective title tags.
  2. Write compelling ad copy for target audiences. Use strong call-to-action verbs to activate your audience. Terms like "Volunteer," "Donate," "Learn," or "Help" can speak to the specific audiences you're targeting. For ideas, view our call-to-action cheat sheet.
  3. Define and implement event tracking. The standard metrics that come with an analytics install are great, but to get in-depth knowledge of your users, implement customizations like events. Check out our event naming post to understand best practices for event tracking.
  4. Apply for a Google Ad Grant. If you are an eligible nonprofit, a Google Grant can go a long way in helping create awareness for your cause.

Our team here at LunaMetrics–Kristina, Megan and Jayna–so enjoyed our morning with 412FoodRescue. Our one regret is that the day ended too soon! We could have brainstormed for another couple of hours.

Leah, Sara and Becca are a team of tenacious learners with a passion to end food insecurity in Pittsburgh and we’re happy that we could be a small part of 412FoodRescue’s digital journey. We can’t wait to see what they’ll dream up next!

Local to Pittsburgh? Learn more about 412FoodRescue and join their mission to end food insecurity in our communities and neighborhoods.

The post A Day with 412FoodRescue: Tips for Managing Nonprofit Digital Marketing appeared first on LunaMetrics.

In the Google Marketing Platform, Audiences are how we pass collections of users between tools – like sharing a Google Analytics audience with Google Ads, Google Display & Video 360, or Google Optimize. While there are many ways to accomplish the same objective, using simple audience definitions in Google Analytics can improve your flexibility and accuracy when remarketing to users through Google Ads.

By keeping each audience definition modular and relying on tool-specific features, you can avoid situations that waste your money and annoy users.

What is a Google Ads Audience?

Remarketing audiences in Google Ads allow you to target specific users based on a set of criteria you get to define. These audiences can be created in Google Ads, or they can be created in and imported from Google Analytics.

You can then use multiple audience definitions when targeting a remarketing audience in Google Ads. For example, you could create 2 separate audiences in GA. Then in Google Ads, you could target a remarketing audience that includes everyone from the first audience and excludes everyone from the second.

This post by Michael explains in detail how to set up audiences in Google Analytics and how to import them into Google Ads.

Tell Me If This Sounds Familiar

Perhaps you’ve been in this situation: 1) You’re shopping for something online. 2) You buy that thing. 3) You proceed to get bombarded by ads for that thing. A quick search shows this is not a unique problem.

Oh great internet algorithm. I have already bought the thing. STOP SHOWING ME ADS FOR EXACTLY THE SAME THING. DO I NEED 2 THINGS? NO. So stop it!

— Tod Z (@TodZed) August 14, 2018

Why is it so impossible to add a “I bought this already, stop showing me ads for it” feature on the Internet?

— Karol Markowicz (@karol) June 6, 2018

How Does This Happen?

Anecdotally, I think most of us can recall a time when advertising has failed – which seems particularly infuriating on digital platforms, where we expect (or hope) for a better kind of personalization. While there are many reasons why search and display advertising can fail, one particularly manageable problem is how people define the audiences they target.

There are more advanced retargeting methods for ecommerce websites, like dynamic remarketing, but let’s go through an example where we might need to remarket to someone who visited a valuable page on our site – either a service or product that we’re trying to promote.

Say you want to remarket a specific product to users who have added that product to their cart but did not purchase it. You could include all of those criteria in a single audience definition and use it to create your remarketing audience in Google Ads:

The problem arises when a user comes back and ultimately purchases the product – whether they return later that day or two weeks from now, they’re still going to be in that remarketing audience because, at one time, they abandoned that product in their cart. Meeting the criteria of the audience adds them to the audience, but purchasing the product does not remove them from the list.

BUT IT CAN – as long as you have your remarketing audience set up correctly.

Streamlining Your Audience Definitions

The solution to the audience issue can be simple: rework your audience definition into something that isn’t so specific.

You can and will have audience definitions that involve multiple criteria, but you need to think through the definition to make sure you aren’t trapping users in a remarketing loop.

To remedy the situation I’ve created above, define one audience that includes all users who added that product to their cart:

Then, create a separate audience for users who purchased that product:

When creating your remarketing audience in Google Ads, you can include all users from the add to cart audience and exclude all users that are in the transaction audience.

Voilà: a remarketing audience that automatically removes users who have purchased that product.

Every Coin Has Two Sides

There’s a second principle that we should all adopt, and this applies to almost every problem we try to solve. When we attempt to target an ad to users by creating an audience to target, we need to answer both questions: Who should see this Ad? as well as Who should not see this Ad? These questions can help guide the audience creation and setup inside of Google Ads.

This applies to other challenges as well – when we're adding tagging to certain pages on our site or creating experiment targeting in Google Optimize, we have to answer similar questions or we'll end up overcounting our conversions or showing our experiments to too many people.

Complicated Audience Definitions are a Bad Idea, Cont.

If you aren’t convinced by the lone scenario above, we have a few more reasons complex audience definitions are a bad idea.

1. Your audiences don’t collect enough cookies to be useful.

If your audience applies to just 3 people, it is way too specific. You want (and need) to find the balance between targeting the most appropriate groups and collecting enough cookies.

Bigger audiences are better. Audiences with 1,000 cookies can be used anywhere. Less than that, and they can only be used in display.

2. You think you’re targeting one group of users when, really, you’re targeting this other group.

The example I’ve given above applies here. You were inadvertently keeping users who had purchased the product in that remarketing audience.

This can cause the data you're collecting (or at the very least your interpretation of that data) to be incorrect. It can also waste money.

3. You end up with a million audience definitions.

If your audience definition is specific, chances are you’re going to end up with a lot of those very specific audiences.

Keep in mind: once an audience has been created, it can’t be deleted – it can only be closed.

Additional Tips

Evaluate the data before you create the audience. Set up your audience as a segment in GA first to ensure you aren’t running into any of the issues above. You can either look at the segment preview or apply that segment to your GA reports and click around to see if the data the segment is pulling in makes sense.

“I think people also try to make audiences very specific because they forget that they are able to combine it with targeting native to the tools or other dynamic elements. Dynamic attributes instead of product-level audiences help to scale. Targeting in conjunction with keyword and topic targets helps to contextualize ads.”

Stephen Kapusta

Shoutout to my colleague Stephen Kapusta for contributing to this post!

The post Google Analytics Audiences Strategy: Keep It Simple appeared first on LunaMetrics.

Migrating or redesigning a site is a chance for new beginnings. New pages! New functionality! New systems!

But with new things comes new responsibilities. One of the first items to consider is your new URL structure. Taking time to discover the best URL structure for your site–and for SEO–will set you up for future success.

As an SEO overseeing a migration, your role will shift from an in-the-weeds analyst to a consultant and educator. You’ll be asked best practices and tasked with guiding teams through this unfamiliar world full of potential and new HTML tags. You’ll work with IT teams and developers you previously never knew existed.

Suddenly, you’ll be fielding questions about:

  • URL Structures
  • Which New Pages to Create/Delete
  • How to Organize the Site
  • Analytics

And the list goes on. If you've done your job up to this point, you'll have advocates on other teams asking, "But what about the SEO implications?" They may not understand just how SEO works, but you've scared them enough with mentions of algorithm updates and ranking drops that any project – no matter how small – must be signed off by the SEO team.

Congrats! You’ve made yourself invaluable and involved in every decision. Before you think this is a bad thing, you’re actually #blessed.

While these pre-migration discussions may dominate your schedule, it’s better to be involved upfront and tell the UI team that Google views a blue vs. orange button equally than to be brought in at the 11th hour and learn *gasp* that no one’s thought of canonical tags.

URL Structures

If your site is upgrading its platform, 99% of the time that means your URLs will change, too. Hopefully for the better. Gone are the days of CatID=12345 – hello, keyword-rich URLs!

This is your chance to establish ground rules. You’ll be able to create folders and character limits that will dictate URL structures for the foreseeable future.

Before You Dive In…

Google’s John Mueller has said to not change URLs just for SEO purposes. If you have the option to keep the same structures, don’t. touch. anything.

While it's tempting, those URLs have been accruing authority for years and may be linked to from other sites, and you introduce risk and volatility when using redirects. Plus, it's one extra thing you'll have to manage.

Sometimes, though, you might not have a choice.

If you’re doing a full redesign and the new back-end systems necessitate a URL change, do it wisely. If the company is rebranding and the new brand comes with a fresh URL, make it count.

Strategically approach this time and infuse SEO best practices into your shiny new URLs.

Use Normal Naming Conventions

While keywords in URLs inform visitors what a page is about, they don’t provide the same ranking boost they once did.

In a 2016 Google Webmaster Hangout, Mueller shared that keywords in URLs are “a very small ranking factor. It’s not something I’d really try to force.”

This means you should encourage teams naming URLs to name them something intuitive, but don’t stress about finding the exact right phrase. If you’re between /car-repair/ or /auto-repair/ – use either! Google gets it.

Shorter URLs are Better Than Longer Ones

To quote Occam’s Razor, “The simplest solution is always the best.” The same is true when creating URLs.

If you're considering whether /new-homes-for-sale/ or /new-homes/ is best, save yourself nine characters and opt for /new-homes/.

Other pages on your site will provide context, and you'll save characters by removing implied phrases. Shorter URLs are also easier to display on smaller screens for mobile searches.

What About Mobile?

After reading my first draft, my colleague asked that question. It’s what we all should be asking these days.

What about mobile URLs?

Responsive design is Google's recommended design pattern. Having your mobile and desktop pages responsively load on one www. domain is preferred, but you can handle separate m. and www. URLs by setting appropriate canonical and rel=alternate tags between the two.

If that’s how your site is configured, see Google’s guidelines on how to annotate.
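For reference, the standard annotation pairs the two versions of each page with something along these lines, with example.com standing in for your domains:

    <!-- On the www. (desktop) page -->
    <link rel="alternate" media="only screen and (max-width: 640px)"
          href="https://m.example.com/page">

    <!-- On the matching m. (mobile) page -->
    <link rel="canonical" href="https://www.example.com/page">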

URL structure is less important to mobile visitors from a visual standpoint. It's an element of your site that few phone visitors may even see. On my 5.7-inch XL phone, I see 28 characters in my Chrome browser.

When you think about it, that’s really small. It’s the equivalent of “https://www.examplesite.com/”. Everything beyond that is hidden.

With the common use of schema.org markup and Google getting better at understanding a site’s folder structure, mobile URLs also rarely show up in SERPs. All a user will see on their small screen is a list of folders.

If you have separate m. or www. URLs, apply the same best practices to both. Keep the URL structure the same between your domains to help with cross-device consistency. Since site users will rarely see your URLs and because Google doesn’t surface them anymore, don’t overthink mobile URL structure.

Subfolders Should be Used in Moderation

The number of subfolders to include can vary greatly; there is no one-size-fits-all recommendation. That’s because considering URL folders and the right number to expose gets tricky.

Our recommendation is to consider your specific implementation and the size of your site.

Let’s run through options.

For small sites, displaying 2-3 folders in a URL can provide users with additional context before even viewing the page. Your site is small and it’ll be easy to manage.

Perhaps you’re a restaurant with multiple locations and unique menus. You’d want to expose each of those locations within the URL to help customers know which location they’re viewing: /east-end/menu and /north-shore/menu. See how those URLs are better than /menu-1 or /menu-2?

Think of URLs as a way to tip off visitors about what they'll see before they get to your site.

For large sites – especially eCommerce ones – you’ll have more things to consider. You may have to decide whether to display parent categories (and how many) or perhaps you’ll want to name URLs according to their page template.

If you’re leaning towards a multi-folder URL path, you’ll want to consider what’s intuitive without being overkill. It’s very much a “trust-your-gut” situation.

Let's imagine you're a jeweler who sells gold solitaire engagement rings. You have a few options for the parent-child relationships in that landing page's URL:

  • /rings/engagement/solitaire/gold (4 folders)
  • /solitaire/gold (2 folders)
  • /rings/gold-solitaire-rings (2 folders)
  • /gold-solitaire-rings (1 folder)

All are viable options, but which one provides context and is most intuitive? That’s up to you.

You also may want to use a naming convention based on page templates.

If, for example, you have a page template that shows products, you may want to name all those pages… wait for it… /products/. This would apply to your necklaces, rings and earrings pages.

  • /products/necklaces
  • /products/rings
  • /products/earrings

So which option is best? I wish we could give you an answer, but this one will depend on your site. To help you decide, sketch out one section of your navigation and what the URLs might look like. Does it feel like overkill if you have 6 folders when 2 will do? Are your users going to get lost by seeing a page that’s 5 folders deep? Does each folder provide value to your customers by being exposed?

It’s Okay to Be A Little Selfish

Once you’ve considered what’s best for your visitors, it’s time to be a bit self-serving. Is the solution you’ve developed also best for YOU? How will your internal teams use those URLs in their reporting, monitoring and general navigation around the site?

If you’re an internal team or an agency working with one client, you will see these URLs every. single. day. of. your. life.

Would it help to filter your data to product detail pages if every page lived within a /p/ folder? (Probably, yes.)

Do you want to view only earring pages so having a parent directory of /earrings/ would help you easily sort the data? (Probably, yes here, too.) Those with a decent subdirectory structure will easily be able to roll up behavior metrics by subdirectory in Google Analytics with an underused feature called the Content Drilldown report.

When sharing the Store Locator URL with your Social team, is it easier for you to share /stores/store-locations/view-all or more simply, /stores/? The more characters in a URL, the easier it is to mess it up.

Create New URLs with Purpose

A migration or site redesign is your chance to reset naming conventions for URLs. While keywords in URLs no longer have an impact on how a page ranks, creating simple, intuitive URLs is a win-win-win all around.

They’ll:

  1. Help customers orient themselves on your site
  2. Provide context for users when clicked on from an external link
  3. Make life infinitely easier for reporting or daily SEO tasks

If your URLs need to change, embrace the opportunity and create a foundation that will benefit IT, Marketing and SEO teams for years to come.

The post Site Migrations & Creating New URL Structures appeared first on LunaMetrics.

Companies should be making constant tweaks to their businesses to get better, whether it’s a new marketing campaign or website redesign. And ultimately, we want to know whether or not these changes have an impact on their goals – does the new landing page drive more email signups? How about articles read? Has the PPC campaign increased transactions?

Answering these types of questions is not quite as simple as pulling numbers. We can’t just look at a snapshot in time; we have to put our data into context if we want to learn about the causal relationship between the changes and the numbers.

What Is Causality?

Causality is a phenomenon through which one thing (the cause), under certain conditions, gives rise to something else (the effect). However, there is a difference between “after this” and “because of this.” Thus it is important that when we study causality, we actually try to measure the influence of changes analytically.

How do we know that our new campaign is responsible for an increase in new customers? Can we be sure that our new checkout process redesign is responsible for more funnel dropouts? We’ve all heard the phrase “correlation is not causation” enough times that we don’t need to be reminded with an example. Rather I’d like to discuss some methods available to actually measure the impact of our marketing or design efforts.

Why Is It so Hard to Track?

When making changes in the real world, it is hard to account for all the different variables. Confounding variables such as seasonality, selection bias, geography, and many more, can make numbers imply one thing when the reality is different.

There are also other consequences of a new campaign or promotion, such as cannibalization or halo effects. Cannibalization is when the increase in one “thing” decreases a related “thing.” For example, your PPC ad campaign might increase paid traffic but decrease (cannibalize) your organic traffic. This situation is sub-optimal because you are now paying for traffic that you were already getting organically.

On the other hand, a halo effect is when the increase in one thing increases a related thing. For example, having a promotion on shampoo can also increase the sales of (non-discounted) conditioner. There are usually intricate relationships at play, and we want to be able to control as many of them as possible to get clear answers.

What Can We Do?

Whenever a change is made, we want to think about how we are going to see the effects. It’s important to treat these changes like experiments. Ideally, we would love to have a control group. A control group is a group in an experiment that does not receive “treatment,” and is then used as a benchmark to measure how the test subjects (the group that received the treatment) do. We would have only one change made (for example, change the wording on a CTA button or change the colors on a banner), and have a measurable goal in mind. Do we want to see increased purchases? Do we want a higher average time on page?

Essentially, we need to determine what our response variable is going to be.

Controlled Experiments

In a controlled experiment, we have a random sample of participants that see or experience a new version that we want to test, and everyone else experiences the old version with no changes. The participants who did not see the change are considered the control group, and serve as a baseline in order for us to make a comparison.

Google Optimize is a great tool to use in these cases. Read the following blog post for more on why you should run A/B tests:

Why Should I Run A/B Tests on My Website?
By: Becky West
Published: April 29, 2016

Google Optimize is a WYSIWYG editor that lets us run experiments on our websites and integrates natively with Google Analytics. Using Google Optimize, we get to try out design changes and select the percentage of visitors on whom to test them. It then gives us feedback on which A/B variation is performing better with respect to a pre-selected goal.

Once enough data is collected, Google will determine which A/B version is the winner.

Since the Optimize change is the only thing that is different between the experiment and control group, we can say scientifically that any change to our goals was caused by that change. Note that you will have to run the experiment for as long as it takes to get enough data, in order for a result to be statistically significant.
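Optimize handles the statistics for you, but purely as an illustration of why "enough data" matters, here's a simple two-proportion test on made-up conversion counts in Python:

    from statsmodels.stats.proportion import proportions_ztest

    # Made-up numbers: 10,000 sessions per variation, 230 vs. 275 conversions.
    conversions = [230, 275]
    sessions = [10000, 10000]

    z_stat, p_value = proportions_ztest(count=conversions, nobs=sessions)
    print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
    # A small p-value (commonly < 0.05) suggests the difference is unlikely to be
    # random noise; otherwise, keep the experiment running.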

With Optimize 360, we can target specific audiences and try multivariate tests instead of just A/B tests. It's important that our test and control groups come from the same population of users, to avoid selection bias or other confounding factors.

Data Science Solutions

Sometimes, a randomized controlled experiment is not an option. Maybe we’ve changed our entire checkout process. Maybe we’ve added pictures to some subset of our product detail pages, and we want to see the effect they’ve had on total transactions. Or maybe we’ve made a redesign to one of our international websites. With a measurable goal in mind, we want to see how these changes have impacted the goal. In such cases, we need to dive into some data science solutions to do a causal analysis.

The basic idea behind these models is that we use historical data as training data, forecast (predict) what we would expect to see if no change had occurred, and then compare it to what actually occurred with the change.

We can use a data science tool such as R or Python for these analyses and import our data either from BigQuery or Google Analytics (read more about data imports and exports here).

Getting Started with R and Google Analytics
By: Becky West
Published: June 2, 2016

Forecasting

Forecasting should be used when we have no control or comparison group. It is a good option for site-wide changes. It uses time series data – that is, data taken at regular intervals (daily, weekly, monthly, etc.) – to run the analysis. For a single-variable time series, we could import daily total transactions and then use the “forecast” package to help analyze the data from before the change occurred.

Once the data is analyzed, it gives a prediction, within a confidence interval (usually 95%), of what it would expect the future data points to be. Plotting the actual observed data on top of the predicted interval shows whether or not the change has had a significant impact.

The gray area represents the 95% confidence interval. Notice that as we get further away from the historical data, the range of the confidence interval grows.
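If you'd rather work in Python than R's forecast package, the same pre/post comparison can be sketched with statsmodels; the file and column names below are placeholders:

    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    # Hypothetical export: one row per day with a "transactions" column.
    df = pd.read_csv("daily_transactions.csv", parse_dates=["date"], index_col="date")
    change_date = "2018-09-01"  # placeholder launch date for the site change

    pre = df.loc[df.index < change_date, "transactions"]
    post = df.loc[df.index >= change_date, "transactions"]

    # Fit only on pre-change data, then forecast the post-change window.
    model = ARIMA(pre, order=(1, 1, 1)).fit()
    interval = model.get_forecast(steps=len(post)).conf_int(alpha=0.05)  # 95% interval

    # Count days where what actually happened falls outside the predicted range.
    lower, upper = interval.iloc[:, 0].to_numpy(), interval.iloc[:, 1].to_numpy()
    actual = post.to_numpy()
    outside = (actual < lower) | (actual > upper)
    print(f"{outside.sum()} of {len(actual)} post-change days fall outside the 95% interval")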

Causal Impact

This method is good for measuring changes that only a subset of our users encounter. Causal Impact considers the difference in historic patterns between the test and control groups and uses that difference to determine whether or not a measurable effect has been observed with the change. This implies that the control group needs to be somewhat similar or correlated to your test group. For example, if we’ve redesigned our Canadian website, but not our US or Mexican websites, then the web data from the Canadian site would be the test data, and the US and Mexican website data would be our control. (Notice how this is different from a controlled experiment!)

In a causal impact analysis, we define a predictive variable (our control group), a response variable (our test group), a pre-period (the historical data before the change occurred), and a post-period (starting at the time your change occurred). The CausalImpact R package can then produce a few graphs and a summary of the analysis. See this article on the CausalImpact R package. An analysis produced with example data is shown below, where the change occurred at 70 on the x-axis.
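If you're working in Python rather than R, ports of the same package exist – pycausalimpact, for example, mirrors the R interface. A sketch under that assumption, with made-up column names and the change landing at row 70 as in the example above:

    import pandas as pd
    from causalimpact import CausalImpact  # pycausalimpact, a Python port of the R package

    # First column = response (test group, e.g. the redesigned Canadian site),
    # remaining columns = control series (e.g. the US and Mexican sites).
    data = pd.read_csv("sessions_by_site.csv")[["ca_sessions", "us_sessions", "mx_sessions"]]

    pre_period = [0, 69]    # rows before the redesign
    post_period = [70, 99]  # rows after the redesign

    ci = CausalImpact(data, pre_period, post_period)
    print(ci.summary())  # estimated absolute and relative effect of the change
    ci.plot()            # produces the three panels described below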

The first panel shows the data and a counterfactual prediction for the post-treatment period. The second panel shows the difference between observed data and counterfactual predictions. This is the pointwise causal effect, as estimated by the model.

The third panel adds up the pointwise contributions from the second panel, resulting in a plot of the cumulative effect of the intervention. There are many R and Python packages to do forecasting and causal impact models.

Using a large amount of historical data is important because it can catch seasonal and other long-term trends, and take those into account when doing the statistical analysis. However, it is more important that we are feeding our model reliable data, and that our goal has been tracked consistently throughout that time.

These methods should help us analyze any changes or experiments that have been made and also help us think about how we can test any future changes that we make to our websites.

The post Seeing Causality in Google Analytics Data appeared first on LunaMetrics.
