Hey friends! I’m way over-invested in rare beer and underinvested in giving back to the SEO community (aren’t we all?). There has been a lot of tweetin’ and discussin’ about forum/comment links lately. Do they have zero value for SEO, or is the devil in the details?





Anyway, enough talk. If you can create a resource online showing how you used forum/comment links to boost the rankings of a site that averages over 1k visits a month from organic search traffic, I will send you one of these two rare beers from my collection:

A little background on the beers: these are from two fine SoCal breweries.

First, on the left, is Parables of Red from Casa Agria. Casa is a craft brewery located in Thousand Oaks, known for their fantastic hazy hops and robust wild program. They even did a collab with DeGarde, so they are the real deal. This beer is from their club program and was not released to the public.

On the right is a 2014 Black Tuesday from Orange County’s own The Bruery. The Bruery has been around since 2008 and has helped pioneer the high-ABV pastry stout game and the mainstreaming of wilds. This 2014 BT should be drinking spectacularly right now.

Any questions? Hit me up @danleibson on Twitter or leave a comment.
Must be 21 years of age or older.



Here’s a stupid little GMB Q&A thing I figured out yesterday that I thought you all might enjoy.

I was asked by mi nuevo amigo, Ruben Coll Molina of PA Digital in Spain, what event triggers the Q&A functionality on a GMB profile. Ruben had found that some of their SMB customers did not have the functionality. He sent me to this SERP for “nouvelle couquette,” a clothing store in Torrent, Spain. At the time, their GMB did not display the “Ask a Question” module, like this:

Ok, I doctored it. Of course I forgot to take a “before” screenshot, but trust me, I’m an SEO consultant…

Anyhow, I searched for “women’s clothing stores in Torrent, Spain,” got a local pack, then clicked on the “More Places” link and saw Nouvelle Couquette listed in the Google Maps Local Finder. This time it had the Q&A widget, but no questions had been asked:

On a hunch, using my best 7th grade Spanish, I asked a question:

Ruben answered:

A few seconds later we witnessed El Milagro de Las Preguntas y Respuestas:

Quien es mas macho?

The post How To Force Q&A On a GMB Page That Doesn’t Have It appeared first on Local SEO Guide.



  1. SMBs As a Group Will Continue To Not Get SEO
    A little over a year ago, my dentist moved his office but never thought about updating his listing in Google Maps, Apple Maps or his website. I showed him how to fix the issue, but this morning, on the way to my end-of-year teeth cleaning, I found that not only was his old location back on Apple Maps, but he had also decided to change his business name from Joseph A. Grasso DDS to San Ramon Valley Cosmetic & Family Dentistry (perhaps for SEO reasons?) without bothering to update either his GMB or Apple Maps listings, let alone his Facebook page or any other citations. I mentioned this to his receptionist. Her response was “Wow, I didn’t know anything about that stuff.” I envisioned my kids’ future tuition bills and sighed with relief.
  2. Voice Search Will Continue To Be YUGE, But So What?
    We keep seeing reports of how everyone is increasing their use of voice search to find information and buy stuff. Outside of being the default app for a specific type of query on the various assistants, the end result is still often position #1 or #0 in a Google SERP. For local businesses this means you’ll want to be #1 or #0 for relevant local queries, and if there’s an app (e.g. Apple Maps, Yelp, etc.) that shows up in that position, then you’ll want to be #1 in those apps. Kind of like the way local search has been working for years…
  3. Some Of Your Clients May Actually Ask For Bing SEO Help
    If people are asking Alexa a lot more questions, per the previous prediction, Microsoft’s recently announced Cortana integration with Alexa may lead to more Bing results surfacing via Alexa. So if a client has a data issue on Bing and the CEO happens to hear the kids looking for the business using Cortana on Alexa, you might get that urgent message for “Bing SEO ASAP!” OK, we know – we just needed an extra prediction to get the right number for an Instant Answer result…
  4. Google My Business Posts Will Be Where The Action Is
    Since the rollout of GMB Posts, we have been calling them “the biggest gift to SEO agencies in years.” The ability to add minimal content to a business’ GMB/Knowledge Panel that can attract clicks, most of which are from brand queries, and then show clients how these impact performance will be hard to resist for most agencies that are currently blogging for their clients and praying someone cares about their 250-500 words of cheaply written brilliance. Expect GMB Posts to be standard in most Local SEO packages, until of course Google deprecates them later this year. Bonus Prediction! And while we are on the subject of GMB, I expect to see a lot more functionality, and promotion thereof, poured into this service. I wouldn’t be surprised if we saw a Super Bowl ad this year that shows how a business uses all Google services (websites, GMB messages, Q&A, Local Service Ads, GMB Post videos, Reviews, etc.) to run its business and get customers.
  5. Retailers Will Invest More In Local SEO
    Google will continue to cannibalize SERPs with ads and “owned and operated” content such as GMB (which is basically training wheels for ads), making it easy for brands to increase their ad spend to eye-popping levels. Sooner or later, multi-location brands who are tired of having to re-buy their customers every week will realize that for about 1% of their Google Ads budget, they can make a serious dent in their organic traffic and revenue. Topics like rebuilding their store locators, rewriting location pages, local link building and GMB optimization (including feeding real-time inventory to GMB) will no longer cause the CMO’s eyes to glaze over.
  6. Links Will Still Be The Biggest Line Item In Your Local SEO Budget
    There are only so many ways you can publish the best content about how to hire a personal injury attorney, before and after bunion surgery photos, or local SEO predictions. I am sure there are some cases where E-A-T trumps links, but sooner or later in 2019 we will all need a link or two, or twenty…
  7. We Will See More Consolidation In Local Listings & Review Management
    While the business has become somewhat commodified, there is just too much value in owning the customer relationship attached to thousands of locations. Yext appears to be continuing its focus on high-value verticals (healthcare & financial services), expanding internationally to serve global brands and adding related functionality like Yext Brain. Over the past year, Uberall gobbled up NavAds and Reputation.com grabbed SIMPartners. Any big digital agency serving global multi-location brands will sooner or later want to own this functionality. Look for Asia to be a big growth area for these services.
    (7.5) And I Wouldn’t Be Surprised If One Or More Of The Review Management Startups Gets Acquired
    While online review management feels like something of a commodity, kind of like listings management, it’s also a great gateway drug for multi-location brands & SMBs to eventually buy more of your services. I recall Ted Paff, founder of CustomerLobby, once telling me “the value of review management is trending towards $0.” Of course, that was right before he sold CL to EverCommerce and took off to Nepal to find his Chi. Fast-growing companies offering review management and related services that are not trending towards $0 are prime targets. Keep an eye on Broadly, BirdEye, Podium, GatherUp, NearbyNow and others.
  8. Google Search Console Will Specify Local Pack Rankings In The Performance Report
    Yeah, right. But maybe, just maybe, we’ll get regex filtering?
  9. Apple Maps Will Continue To Be The Biggest Local Search Platform Everyone Ignores
    Apple made a big deal in 2018 about its new map platform and while it is exciting to have more vegetation detail, Apple still shows little sign of giving a shit about its business data. In the four years since Maps Connect launched, the functionality for businesses to control their Apple Maps profiles has barely changed. While I find Apple Maps generally fine to use (except for that time it led me straight into a dumpster in San Francisco), I still see plenty of people criticizing it. At some point perhaps Apple will realize that businesses and their agencies can help make Apple Maps much better. It would be great if we could get actual analytics, the ability to enhance profiles, true bulk account management, etc., but I am skeptical that will happen in 2019.
  10. Amazon Will Not Buy Yelp!
    But if the stock price goes below $25, it seems like there’s a private equity play here. cc: David Mihm.

So in 2019, Local SEO will pretty much look like this:

The post 10 Local SEO Predictions for 2019: Job Security For Local SEOs appeared first on Local SEO Guide.


Black Friday and Cyber Monday are closing in. For most retailers, Black Friday is one of the biggest revenue generators of the year, but it’s surprising (except perhaps to most SEOs) how many of them neglect the basics. If you fall into that category, you still have a few days to try to turn that lemon into lemonade with some very simple updates to your site.

If you search for “black friday sale near me” you will likely see a Local Pack like this:

Notice how Google calls out that these sites mention Black Friday sales, deals, etc.

While most retailers likely already have a Black Friday Sale page and mention it on their home page, two out of the three sites above, Macy’s and Walmart, also mention Black Friday on their store location pages. For example:

Macy’s Black Friday:
https://l.macys.com/stoneridge-shopping-center-in-pleasanton-ca

Walmart Black Friday:
https://www.walmart.com/store/2161/pleasanton-ca/details 

While Kohl’s shows that you don’t need the location pages to be optimized for Black Friday to rank for these queries, updating your location pages to target Black Friday & Cyber Monday queries in both the title tag and in the body copy should likely improve your chances of appearing in localized Black Friday SERPs.
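
For example, a location page tweak along these lines might be all it takes (a made-up store and city, just to show the idea):

  <title>Black Friday Deals & Store Hours | Acme Outlet, Pleasanton, CA</title>
  <h1>Black Friday & Cyber Monday Sales at Our Pleasanton Store</h1>
  <p>Visit our Pleasanton location Friday for doorbuster Black Friday deals, or shop our Cyber Monday sale online.</p>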

Even if your site is in code freeze, you (hopefully) should be able to make these updates and maybe next week you’ll find yourself with more than just some leftover turkey…

The post It’s Not Too Late To Localize Your Black Friday SEO Strategy appeared first on Local SEO Guide.


Justin O’Beirne has published another amazingly detailed analysis of Apple Maps and how it has developed compared to Google Maps. While I shall yield to Justin’s mastery of all things geospatial, I feel like he kind of punted on his analysis of Apple Maps’ business listings data.

O’Beirne observes that Apple Maps has few business listings for Markleeville, CA and then claims “all of the businesses shown on Apple’s Markleeville map seem to be coming from Yelp, Apple’s primary place data provider.”

While only Apple and Yelp know for sure, I am fairly certain Yelp is not Apple Maps’ “primary place data provider.” I imagine Yelp is Apple’s primary U.S. business review provider, and perhaps has a significant role in helping Apple verify a business is in a specific place with specific data, but there are several other business listings data providers that are likely providing the “primary” place data to Apple, not least Acxiom, Factual, Neustar Localeze and TomTom. These companies likely have significantly larger POI datasets than Yelp, while Yelp likely has the lead in newly created businesses in its popular categories. Clear Channel Broadcasting may also be a provider, although it is unclear exactly what data it is providing to Apple.

In his analysis of Apple’s lack of businesses in Markleeville, O’Beirne claims that “Apple Maps doesn’t have some of the businesses and places Google has.” This is possibly true, but not based on the data O’Beirne shows. Here’s his comparison of what Apple and Google show for a section of Markleeville:

I think O’Beirne is confusing Apple Maps not displaying these businesses in this view with not actually having them. Each of the highlighted businesses on Google Maps is on Apple Maps; they just don’t appear in the default view of this section:

I believe a lot of the Markleeville business data (surprisingly) comes from Factual, not Yelp.

O’Beirne also makes a point about a discrepancy between Apple Maps and Yelp regarding a single listing as evidence of a larger problem. O’Beirne states “there’s a place on Apple’s map with no Yelp listing at all: the “Alpine County District Attorney”. Even stranger, it appears to be a garage:” Then he shows the Apple Maps listing next to an image of a garage at the same location from Bing Maps:

The problem actually is that Apple Maps has the Alpine County DA location correct, but it also has a dupe listing in the wrong place:

I am not going to claim that Apple Maps has fewer dupe listings than Google Maps, but given the amount of crap we deal with for clients on Google Maps on a daily basis, I wouldn’t be surprised if this were the case. Regardless, the really odd thing is that had O’Beirne checked Google Maps, he would have seen its address for the Alpine County District Attorney’s office is 100% wrong:

I am not trying to dispute O’Beirne’s take that Apple Maps still has a long way to go and may never reach feature parity with Google Maps, but maybe Apple Maps is not in as bad a shape as he thinks.

The post Apple Maps May Not Be As Far Behind Google Maps As You Think… appeared first on Local SEO Guide.



Google just published an article on how to “Get Started With Dynamic Rendering.” If you are working on a site with a “modern framework” (e.g. Angular, React, or other tech with a lot of JavaScript features), you’ll want to bookmark that post. If reading is not your thing, a few weeks ago I put together Server Side Rendering For Dummies (& Non-Technical SEO Decision Makers), which boils down a lot of the Google techno-jargon into a single PowerPoint slide.

While that Google post has most of what you’ll need to get started with server side rendering, I’d like to focus on the Troubleshooting section – talk all you want about answering user questions, relevance, domain authority, etc. – if I had to define 2018 SEO with one word, it would be “troubleshooting.”

Google gives you most of what you need to troubleshoot prerendering problems in the “Verify your configuration” and “Troubleshooting” sections. Here’s what they say to do (edited for brevity):

Verify your configuration

Check a URL with the following tests:

  1. Test your mobile content with the Mobile-Friendly Test to make sure Google can see your content.
  2. Test your desktop content with Fetch as Google to make sure that the desktop content is also visible on the rendered page (the rendered page is how Googlebot sees your page).
  3. If you use structured data, test that your structured data renders properly with the Structured Data Testing Tool.
Troubleshooting

If your content is showing errors in the Mobile-Friendly Test or if it isn’t appearing in Google Search results, try to resolve the most common issues listed below.

Content is incomplete or looks different

What caused the issue: Your renderer might be misconfigured or your web application might be incompatible with your rendering solution. Sometimes timeouts can also cause content to not be rendered correctly.

High response times

What caused the issue: Using a headless browser to render pages on demand often causes high response times, which can cause crawlers to cancel the request and not index your content. High response times can also result in crawlers reducing their crawl-rate when crawling and indexing your content.

Structured data is missing

What caused the issue: Missing the structured data user agent, or not including JSON-LD script tags in the output can cause structured data errors.

We call these “Smoke Tests.” Here’s a little more nuance to server side rendering troubleshooting based on some real-world situations we’ve encountered.

  1. How To Test Server Side Rendering On A New Site Before It’s Launched
    It often is the case that SEOs get brought into the process well after a site has been built, but only a few days before it launches. We need a way to test the new site in Google without it competing with the old site in Google. For a variety of reasons we don’t want the entire new site to get crawled and indexed, but we want to know that Googlebot can index the content on a URL, that it can crawl internal links and that it can rank for relevant queries. Here’s how to do this:
    1. Create test URLs on the new site for each template (or use URLs that have already been built) and make sure they are linked from the home page.
    2. Add a robots.txt file that allows only these test URLs to be crawled.
      Here’s an example (the lines starting with # are just comments and are ignored by crawlers):
      User-Agent: Googlebot
      # Don't crawl the entire site
      Disallow: /
      # Allow Googlebot to crawl only the home page, even though the rest of the site is blocked above
      Allow: /$
      # Allow crawling of just the /test-directory/ URL
      Allow: /test-directory/$
      # Allow crawling of /test-directory/test-url
      Allow: /test-directory/test-url
      You can add as many URLs as you want to test – the more you test, the more certain you can be, but a handful is usually fine.
    3. Once the robots.txt is set up, verify the test site in Google Search Console.
    4. Use the Fetch as Google tool to fetch and render the home page and request crawling of all linked URLs. We will be testing here that Google can index all of the content on the home page and can crawl the links to find the test URLs. You can view how the content on the home page looks in the Fetch tool, but I wouldn’t necessarily trust it – we sometimes see this tool out of sync with what actually appears in Google.
    5. In a few minutes, at least the test home page should be indexed. Do exact-match searches for text that appears in the title tag and in the body of the home page. If the text is generic, you may have to include site:domain.com in your query to focus only on the test domain. You are looking for your test URL to show up in the results. This is a signal that at least Google can index and understand the content on your home page. This does not mean the page will rank well, but at least it now has a shot.
    6. If the test links are crawlable, you should soon see the test URLs linked from the home page show up in Google. Do the same tests. If they don’t show up within 24 hours, while this doesn’t necessarily mean the links aren’t crawlable, it’s at least a signal in that direction. You can also look at the text-only cache of the indexed test home page. If the links are crawlable, you should see them there.
    7. If you want to get more data, unblock more URLs in robots.txt and request more indexing.
    8. Once you have finished the test, request removal of the test domain in GSC via the Remove URLs tool.
    9. We often can get this process done in 24 hours, but we recommend clients give it a week in case we run into any issues.
    10. Pro-tip: If you are using Chrome and looking at a test URL for SEO content like title tag text, SEO extensions and viewing the source will often only show the “hooks” (e.g. {metaservice.metaTitle}) and not the actual text. Open Chrome Developer Tools and look in the Elements panel. The SEO stuff should be there.
  2. Do Not Block Googlebot on Your PreRender Server
    Believe it or not, we had a client do this. Someone was afraid that Googlebot was going to eat up a lot of bandwidth and cost them $. I guess they were less afraid of not making money to pay for that bandwidth.
  3. Do Not Throttle Googlebot on Your PreRender Server
    We convinced the same client to unblock Googlebot, but noticed in Google Search Console’s crawl report that pages crawled per day was very low. Again, someone was trying to save money in a way that guaranteed they would lose money. There may be some threshold at which you want to limit Googlebot’s crawling, but my sense is Googlebot is pretty good at figuring that out for you.
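
If you want to script the most basic of these smoke tests, here’s a rough sketch (assuming Node 18+ with built-in fetch; the test URL and expected strings are placeholders) that requests a page with Googlebot’s user agent and checks whether the title text, a known body phrase and any JSON-LD made it into the prerendered HTML. It only looks at the raw HTML response, so it complements, rather than replaces, the Mobile-Friendly Test and Fetch as Google:

  // smoke-test.ts - a rough sketch of the checks above, not production monitoring.
  // Assumes Node 18+ (built-in fetch); the URL and expected strings are placeholders.
  const GOOGLEBOT_UA =
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

  async function smokeTest(url: string, expectedTitle: string, expectedPhrase: string) {
    const res = await fetch(url, { headers: { "User-Agent": GOOGLEBOT_UA } });
    const html = await res.text();
    const checks: Record<string, boolean> = {
      "2xx status": res.ok,
      "title text present": html.includes(expectedTitle),
      "body phrase present": html.includes(expectedPhrase),
      "JSON-LD present": html.includes("application/ld+json"),
    };
    for (const [name, passed] of Object.entries(checks)) {
      console.log(`${passed ? "PASS" : "FAIL"} - ${name}`);
    }
  }

  // Hypothetical test URL and strings - swap in your own.
  smokeTest(
    "https://www.example.com/test-directory/test-url",
    "Example Test Page Title",
    "a phrase that only appears in the rendered content"
  ).catch(console.error);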

The post SEO Smoke Tests for Dynamic Rendering appeared first on Local SEO Guide.


In 2011, Google, Bing & Yahoo announced Schema.org, which got SEOs all excited to start marking up website content to turn it into “structured data.” The benefit would be that search engines would be more certain that a text string of numbers was in fact a phone number, or at least more certain that you wanted them to think it was a phone number. The search engines could then turn the structured data into eye-catching fripperies designed to seduce searchers into surrendering their clicks and revenue to your fabulously marked-up site (aka “Rich Results”).

It also could help your fridge talk to your Tesla.
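
If you’ve somehow avoided it until now, “marking up” that phone number looks something like this in JSON-LD (a made-up business, just to show the shape of the markup):

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Acme Plumbing",
    "telephone": "+1-925-555-0123",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "123 Main St",
      "addressLocality": "Pleasanton",
      "addressRegion": "CA",
      "postalCode": "94566"
    }
  }
  </script>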

So pretty much every SEO marked up their audits and conference presentations with recommendations to mark up all the things. LSG was no exception. And we have seen it work some nice SEO miracles.

There was the ecommerce site that lost all its product review stars until we reconfigured the markup. There was the yellow pages site that got a spammy structured data manual action for merging a partner’s review feed into its own. There is the software vendor and its clients that (still!) violate Google’s structured data guidelines and get away with it. There have been countless Knowledge Panels that have needed the tweaking one can only get from a perfectly implemented https://schema.org/logo.

But structured data is not a killer SEO strategy for all situations, and it’s important that SEOs and clients understand that often it’s more of a future-proofing game than an actual near-term traffic or money-generator. For example, let’s take this UGC site that generated about 22 million clicks from Google over the past three months and see how many clicks are reported as coming from “Rich Results” in Google Search Console:

So less than one-half of one-half of 1% of clicks came from a “Rich Result.” Not particularly impressive.

The good news is that Google is in fact using the structured markup. We can see evidence of it in the SERPs. But it’s likely the content of this site doesn’t lend itself to eye-popping featured snippets. For example, many of the Rich Results appear to just be bolded words that appear in the URL snippets in the SERPs, kind of like this:

It also may just take time before Google trusts your markup.

So before you drop everything and prioritize structured markup, you may want to consult Google’s Structured Data Gallery to get an idea of which types of content Google is pushing you to mark up. You also should check the SERPs to see what your competitors are doing in this area and how their marked-up content is being displayed. This should give you a good idea of what the potential is for your site.
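
If you want a quick-and-dirty look at what structured data a competitor’s page carries, a sketch like this will dump any JSON-LD blocks in the raw HTML (placeholder URL; markup injected by JavaScript won’t show up here, so the Structured Data Testing Tool is still the more reliable check):

  // peek-jsonld.ts - quick-and-dirty competitor markup peek, not a crawler.
  async function dumpJsonLd(url: string) {
    const html = await (await fetch(url)).text();
    const blocks =
      html.match(/<script[^>]*type="application\/ld\+json"[^>]*>([\s\S]*?)<\/script>/gi) ?? [];
    for (const block of blocks) {
      const json = block.replace(/<\/?script[^>]*>/gi, "");
      try {
        console.log(JSON.parse(json));
      } catch {
        console.log("Could not parse JSON-LD block:", json.slice(0, 200));
      }
    }
  }

  dumpJsonLd("https://www.example.com/competitor-page").catch(console.error); // placeholder URL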

And remember, “you can mark up anything, but you can’t mark up everything…” – Tony Robbins?

The post Structured Data Can = MehSEO appeared first on Local SEO Guide.



Two weeks ago a client reset its bot-blocker, unintentionally blocking Googlebot. We had SEORadar monitoring the site so we quickly discovered the problem and alerted the client. Unfortunately, by the time they fixed the bot-blocker settings, they had lost about 100,000 daily visitors from Google. Of course, the first thing they asked was:

How Long Will It Take Our Google Traffic To Recover From Blocking Googlebot?

While your mileage may vary, in this case the answer is about one week.

Here’s my theory on how this process works:

  1. You block Googlebot from crawling your site (the most common reasons I see are improper bot-blocking settings or adding a “Disallow: /” rule to the robots.txt file).
  2. Googlebot gets a 403 error when it tries to crawl the site, or it just stops crawling because of the robots.txt rule. After hitting the home page (or robots.txt) a few times, it gets the message and starts demoting the site’s URLs. Traffic drops dramatically within a few hours. In this case, the site saw a drop of about 50% within two hours and about 60% within 24 hours, which held for most of the time Googlebot was blocked.
  3. GSC showed that crawl rate dropped from about 400,000 URLs/day (it’s a 5MM URL site) to about 11,000 URLs/day. I haven’t investigated how Googlebot was able to crawl 11,000 blocked URLs yet. That’s for another post.
  4. When you unblock Googlebot, it starts to crawl again. In this case it immediately went back to its pre-block levels, but if you don’t have a strong domain, you may need to do something to spur crawling (aka “get links”).
  5. As Google recrawls previously inaccessible URLs, it starts reevaluating their rankings. As best I can tell these URLs were never excluded from Google’s index (the URLs still showed up in site: queries), but it does appear the content of their Google caches was deleted. So Google needs to “see” the page again and reapply its algorithms.
  6. On a big site, or a small site with weak backlinks, it may take several days/weeks for Googlebot to recrawl all of the URLs it had demoted. So the recovery pattern can be gradual. Here’s what it looked like for the site in question:

 

On the bright side, when you block Googlebot from your entire site, your avg time downloading a page metrics improve quite a bit pic.twitter.com/CGV3UItX0z

— Andrew Shotland (@localseoguide) August 18, 2018
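
On a more practical note, a quick scripted way to catch this kind of misconfiguration before the traffic graph does is to compare how the site responds to a normal browser user agent versus Googlebot’s, and to eyeball robots.txt for a blanket “Disallow: /”. Here’s a rough sketch (Node 18+, placeholder URL; a real change-monitoring tool like SEORadar is still the better answer):

  // block-check.ts - rough sketch; not a substitute for real change monitoring.
  const GOOGLEBOT_UA =
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";
  const BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)";

  async function statusFor(url: string, userAgent: string): Promise<number> {
    const res = await fetch(url, { headers: { "User-Agent": userAgent } });
    return res.status;
  }

  async function checkForBlock(siteUrl: string) {
    const [asBrowser, asGooglebot] = await Promise.all([
      statusFor(siteUrl, BROWSER_UA),
      statusFor(siteUrl, GOOGLEBOT_UA),
    ]);
    console.log(`Home page status: browser=${asBrowser}, googlebot=${asGooglebot}`);
    if (asBrowser < 400 && asGooglebot >= 400) {
      console.log("WARNING: site responds to browsers but blocks Googlebot (bot-blocker misconfiguration?)");
    }
    const robots = await (await fetch(new URL("/robots.txt", siteUrl))).text();
    if (/^\s*Disallow:\s*\/\s*$/im.test(robots)) {
      console.log("WARNING: robots.txt contains a blanket 'Disallow: /' rule");
    }
  }

  checkForBlock("https://www.example.com/").catch(console.error); // placeholder URL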


The post How Long Does It Take SEO Traffic To Recover From Blocking Googlebot? appeared first on Local SEO Guide.


Your engineering team just mentioned they are rolling out a new type of product landing page built with React or Angular or some other hipster tech name you may have seen out of the corner of your eye on your Twitter SEO feed. Your gut says this could be an SEO problem, and your gut is probably right. You search Google for “React SEO issues,” etc. and you get a lot of smart bloggers giving you way too much information on how this technology works, when you really just need a few bullet points that you can hand to an engineer so you can move on with making that sweet pitch deck for the C-Suite.

This one’s on me*:

  1. A lot of modern sites use “Single Page Applications” (SPAs), which have performance/UX benefits
  2. SPAs usually return an empty HTML file initially, which screws your SEO. Google is getting better at figuring this out, but I wouldn’t trust it.
  3. When you render the app on the server first (using pre-rendering/server-side rendering), the user (and bots) get a fully rendered HTML page, which = SEO (rough sketch below)
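
To make point 3 concrete, here is roughly what server side rendering looks like with React and Express (a toy sketch that assumes react, react-dom and express are installed; in practice the dev team will use whatever their framework provides, e.g. Next.js or Angular Universal, or a prerendering service):

  // server.tsx - toy server side rendering sketch, not how your team should actually ship it.
  import express from "express";
  import React from "react";
  import { renderToString } from "react-dom/server";

  // A stand-in for one of those new product landing pages.
  function ProductPage({ name }: { name: string }) {
    return <h1>{name}</h1>;
  }

  const app = express();

  app.get("/product/:name", (req, res) => {
    // The HTML arrives at the browser (and at Googlebot) already filled in,
    // instead of an empty <div id="root"></div> waiting for client-side JavaScript.
    const html = renderToString(<ProductPage name={req.params.name} />);
    res.send(
      `<!doctype html><html><head><title>${req.params.name}</title></head>` +
        `<body><div id="root">${html}</div></body></html>`
    );
  });

  app.listen(3000);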

*There is a lot of detail beneath the surface in terms of how to best implement this stuff, how to test and troubleshoot it, etc., but for now you just need the dev team to fully render the HTML on the server before it gets fetched, and you need them to think you are not totally clueless. Now go knock out that pitch deck, Killer…

The post Server Side Rendering For Dummies (& Non-Technical SEO Decision-Makers) appeared first on Local SEO Guide.


For fans of the long-running SEO Death series, this isn’t really a new issue. About two years ago I posited Is Your AdWords Campaign Hijacking Your SEO Traffic?

The answer of course was “yes.” Here’s the graph that illustrated the problem:

I am re-upping this because a new client just hired us after their organic traffic started tanking. But when you looked at the data in Google Search Console, only brand queries were tanking. And said tanking started pretty much when the paid traffic started increasing.

We did some back-of-the-envelope calculations on the SEMrush CPC data available for their brand queries and figured this client is spending an additional $20,000/day to buy $50,000 in revenue*. This isn’t such a bad deal until one considers that before the increase in paid search traffic, the client was spending ~$0 to achieve that revenue. So their margins just took a 40% haircut.

So, next time your organic traffic tanks, before you panic, try this:

  1. Look at the Performance report in Google Search Console for your site (for all variants)
  2. Filter the report by “Queries containing” a proxy for a brand query. It may not always be easy when you have a generic brand name, but try different phrases to see what captures the most traffic that is properly bucketed (see the rough API sketch after this list)
  3. If you see the traffic for queries containing your brand term going down while traffic for queries not containing your brand term is flat or going up, move on to step #4
  4. Look over at the paid search team. If they are high-fiving each other, we have a winner…
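
If you’d rather pull that brand vs. non-brand split programmatically, here’s a rough sketch against the Search Console API’s searchAnalytics.query method (it assumes the webmasters v3 endpoint and an OAuth access token with Search Console scope that you’ve already obtained; the site URL, date range and brand term are placeholders):

  // brand-split.ts - rough sketch of step 2 via the Search Console API.
  const ACCESS_TOKEN = process.env.GSC_TOKEN!; // an OAuth token you've already obtained
  const SITE = encodeURIComponent("https://www.example.com/"); // placeholder property

  async function clicksByDate(queryFilter: "contains" | "notContains") {
    const res = await fetch(
      `https://www.googleapis.com/webmasters/v3/sites/${SITE}/searchAnalytics/query`,
      {
        method: "POST",
        headers: {
          Authorization: `Bearer ${ACCESS_TOKEN}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify({
          startDate: "2018-06-01",
          endDate: "2018-08-31",
          dimensions: ["date"],
          dimensionFilterGroups: [
            {
              filters: [
                // "acme" is a placeholder brand term - use whatever captures your brand queries.
                { dimension: "query", operator: queryFilter, expression: "acme" },
              ],
            },
          ],
        }),
      }
    );
    return (await res.json()).rows ?? [];
  }

  // Brand clicks tanking while non-brand clicks stay flat = step 4.
  Promise.all([clicksByDate("contains"), clicksByDate("notContains")]).then(
    ([brand, nonBrand]) => console.log({ brand, nonBrand })
  );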

UPDATE
The site in question turned off most of the PPC ads targeting their brand a few days ago. On the first day after they did this, PPC revenue was down $90K. SEO revenue was up $170K.
 

*Data has been changed to protect the innocent

The post Paid Search Targeting Brand Keywords = SEO Death appeared first on Local SEO Guide.
