SEATTLE – Where Facebook ads were once a hotbed for Lookalike audiences and bottom-of-the-funnel conversion actions, it’s now a crowded space that enables ad blindness and brand apathy on a foundational level.

That’s the view of Susan Wenograd, VP of marketing strategy at Aimclear, who delivered a session on optimizing social ads at SMX Advanced last week.

What happened?

For starters, an oversaturation of ads served to the same users means that Lookalike audiences are no longer as effective as they once were. Add that to an over-reliance on conversion-focused campaigns and Facebook’s removal of third-party targeting, and you’ve got an ecosystem where brands are more fixated on lead counts than sustainable brand growth.

“Agencies that can’t strategize their way out of a paper bag will fail, as they should have long ago,” Wenograd said. In a flooded social landscape, the most successful brands will be the ones that integrate robust paid strategies and prioritize high-quality creative.

Shifting the campaign goal

If brands want to thrive in the next wave of social advertising, she added, then we (as marketers) need to return to our roots and examine conversion actions as only one piece of the larger picture. By shifting the initial campaign goal from CPA (conversions) to CPM (impressions), marketers can build a foundation for a more engaged, qualified audience pool with more efficient ad spend.

Once qualified leads begin engaging with a brand awareness campaign, advertisers can then use that rich data for retargeting. Building remarketing pools is a crucial component to converting leads, and a necessary undertaking for social advertisers planning to come out on top.

“Brands and agencies that can measure and understand triggers of brand lift will win the long game in expensive media arenas,” Wenograd said.

In going back-to-basics with engagement campaigns, brands should also be leveraging cross-channel data to maximize targeting opportunities. Implementing tagging structures and UTM parameters across all digital media can provide advertisers with streamlined access to tested segmentation data.
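A consistent tagging scheme is easy to generate programmatically rather than by hand. Below is a minimal TypeScript sketch; the parameter values are illustrative, not a prescribed naming taxonomy:

```typescript
// Build a UTM-tagged landing-page URL so clicks from each channel
// can be attributed consistently in analytics.
function buildUtmUrl(
  baseUrl: string,
  utm: { source: string; medium: string; campaign: string; content?: string }
): string {
  const url = new URL(baseUrl);
  url.searchParams.set("utm_source", utm.source);
  url.searchParams.set("utm_medium", utm.medium);
  url.searchParams.set("utm_campaign", utm.campaign);
  if (utm.content) url.searchParams.set("utm_content", utm.content);
  return url.toString();
}

// Example: the same campaign tagged for paid social.
console.log(
  buildUtmUrl("https://example.com/spring-sale", {
    source: "facebook",
    medium: "paid_social",
    campaign: "spring_sale_2019",
  })
);
```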

Creative at the center

But, of course, all this could mean nothing without compelling creative to back it up. Our rapidly evolving social footprint means that audiences expect high-quality visual assets, optimized for the format, screen or platform on which they're viewed. As Wenograd concluded in her session, “Brands and agencies that cannot execute and be nimble on creative will not survive in the Facebook and Instagram landscape.”

The post Facebook ads aren’t what they used to be, so it’s time for smarter social ad buying appeared first on Marketing Land.


Tomorrow is the one-year anniversary of the EU’s General Data Protection Regulation (GDPR). The landmark law created a unified, pan-European approach to privacy and data regulation. It was designed to protect EU citizens against non-consensual data collection by global tech companies and give individuals more control over their personal data. It also carries potentially severe penalties for violators.

Since being implemented last May, GDPR has impacted privacy debates around the world. It has also been an influence on California’s forthcoming CCPA, set to take effect next January. But has GDPR accomplished what it set out to do? Is it working?

For perspective, we asked Johnny Ryan, chief policy and industry relations officer at Brave Software. A long-time privacy advocate and vocal critic of industry data-collection practices, he was substantially responsible for the recently announced Irish investigation into potentially improper exposure of personal data in Google’s programmatic platform.

We invited him to reflect on the impact of GDPR on the digital ecosystem and how it has changed the lives of marketers. Most of the changes Ryan expects have yet to take place, as he discusses in the interview below.

ML: What have been the most significant effects of GDPR on marketers and brands?

JR: Marketers are now controllers, even when they do not realize that they are. This exposes them to legal hazards, and will ultimately cause them to be more careful about the targeting that is used in their campaigns.  In June the European Union’s highest court ruled that marketers are responsible for how data is used in marketing campaigns — even if they never directly touch the data.

The European Court of Justice ruled that a marketer’s use of Facebook for advertising “gives Facebook the opportunity to place cookies on the computer or another device of a person visiting its fan page, whether or not that person has a Facebook account.” In addition, the Court observed that the marketer “can ask for — and thereby request the processing of — demographic data relating to its target audience” such as age, sex, relationships, occupation, lifestyles, areas of interest, purchases and online purchasing habits, and geographical data. According to the Court, a marketer is therefore “a controller responsible for that processing.”

This applies to RTB: marketers are liable as “controllers” of the processing undertaken by the various adtech businesses involved in the RTB system on their behalf. RTB broadcasts personal data without security in hundreds of billions of bid requests every day. It is the most massive data breach ever recorded. Marketers now find themselves liable for it because of the adtech companies they or their agencies work with.

ML: What has changed in the day-to-day lives of marketers following GDPR?

JR: Most marketers are not aware of the risk that RTB companies expose them to. Otherwise, they would already have conducted data protection impact assessments (DPIAs), as required by Article 35 of the GDPR. DPIAs are required when AdTech is profiling and using intimate personal data (referred to as “special category personal data” in article 9) on a large scale to target people in the European market. The inescapable conclusion of any such assessment is that RTB is a “data protection free zone,” as The Economist indicated. This conclusion triggers Article 36 of the GDPR, requiring a marketer to alert a data protection regulator in an EU Member State about the risks it has uncovered.

ML: What changes have you observed in data collection practices?

JR: Change has yet to happen. As I told the Senate Judiciary Committee when I testified this week, we are at the very start of the application of the GDPR. But things are looking bleak for Google, Facebook, and the conventional RTB companies. They will be forced to reform.

ML: There seems to be a fair amount of non-compliance with GDPR. Why haven’t there been more fines or callouts of violators? 

JR: [This week] the Irish Data Protection Commission announced that it was launching a probe of Google DoubleClick/Authorized Buyers on suspicion of infringement. This, finally, marks the start of enforcement action that will force adtech to reform.

ML: Have there been any “unintended consequences” of GDPR? For example, some argue that it has strengthened the hand of dominant companies vs. smaller competitors. 

JR: First, let me dispel this idea that Google and Facebook benefit from the GDPR in the medium term. The GDPR is risk-based. That means Big Tech companies that create big risks get big scrutiny and potentially big penalties. Regulators are only starting to enforce the GDPR and it will take years to have full effect. But already, things are looking bleak for our colleagues at Google and Facebook. Their year-over-year growth has declined steadily in Europe since the GDPR took effect, despite a buoyant advertising market.

They face multiple investigations and it is very likely that they will be forced to change how they do business. Google’s consent has already been ruled invalid. Yes, of course things are even bleaker for other tracking companies that don’t have a search business to fall back on, as Google does.

Second, let me talk about the nonsense “consent” notices that currently despoil the Web. The IAB’s consent gambit was certainly an unintended consequence. However, these annoying and unlawful consent notices will become a rarity, if there is enforcement. Article 7(3) of the GDPR requires that an opt-in must be as easy to undo as it was to give in the first place, and that people can do so without detriment.

Once this is enforced, consent messages will become far less annoying in Europe – because if a company insists on harassing you to opt-in, and you finally click OK, it will be required to keep reminding you that you can opt back out again. In addition, most of the consent notices are for RTB companies whose processing is itself unlawful. So enforcement against Google and the IAB on RTB will prevent the majority of these notices.

ML: Finally, what does the experience of GDPR in Europe say about the implementation of CCPA in the US?

JR: Very little. Although its animating principles are noble, I think the CCPA is a pale imitation of the GDPR.

The post Is GDPR working? Brave’s Johnny Ryan says it’s starting to, and marketers must heed the risks appeared first on Marketing Land.


As expected, Google on Tuesday announced coming changes to the way its Chrome browser handles cookies and addresses fingerprinting at its annual I/O developer conference. New tools in Chrome will allow users to block or clear third-party cookies more easily, Google said. The company also announced a browser extension that will show more information about the parties involved in ad transactions and tracking.

Chrome’s new cookie handling. Google said “blunt approaches” to cookie blocking haven’t been effective for users because they treat all cookies alike — from first-party cookies used to keep users signed-in to sites to third-party cookies used for tracking — so it’s changing how cookies work in Chrome.

From a security standpoint, Google said this change will also help protect cookies from cross-site injection and data disclosure attacks by default. Eventually, Google said, Chrome will limit cross-site cookies to HTTPS connections.

In the coming months, developers will be required to specify explicitly which cookies are able to work across sites and potentially used to track users through a new mechanism based on the web’s SameSite cookie attribute. The SameSite attribute can be used to restrict cookies to first-party or same-site context.

In the weeds. Chrome 76 will include a new same-site-by-default-cookies flag, according to web.dev. Cookies without the SameSite attribute will not be available in a third-party context. Developers will need to declare cookies that need to be available on third-party sites to Chrome with SameSite=None. Google says this will allow Chrome users to clear cross-site cookies and leave single-domain cookies that are used for logins and site settings intact.
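Concretely, under the new behavior a cookie meant to work across sites has to say so explicitly. A minimal sketch using Node's built-in http module follows; the cookie names and values are hypothetical:

```typescript
import { createServer } from "http";

// Declare cookies explicitly so Chrome's same-site-by-default behavior
// treats each one as intended.
createServer((_req, res) => {
  res.setHeader("Set-Cookie", [
    // Needs to be readable in a third-party (cross-site) context:
    // must carry SameSite=None and be sent over HTTPS (Secure).
    "ad_id=abc123; SameSite=None; Secure; Path=/",
    // First-party only: under the new default, omitting SameSite is treated like Lax.
    "session=xyz789; SameSite=Lax; HttpOnly; Path=/",
  ]);
  res.end("ok");
}).listen(8080);
```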

Developers can start testing how the cookie-handling changes will affect their sites in the latest developer version of Chrome.

Cracking down on fingerprinting. The company also said it is taking further measures to restrict browser fingerprinting methods that are used as workarounds to keep tracking in place when users opt out of third-party cookies.

Google said Chrome plans to “aggressively restrict” browser fingerprinting and reduce the ways browsers can be passively fingerprinted. “Because fingerprinting is neither transparent nor under the user’s control, it results in tracking that doesn’t respect user choice,” said Google.

The company added that it doesn’t use fingerprinting for personalizing ads or allow fingerprinting data to be imported into its ad products.

User cookie controls. Google said it will provide users with more information about how sites are using cookies and give them simpler controls for managing cross-site cookies. The company didn’t say what these changes will look like in the Chrome interface, but said it will preview the features for users later this year.

Ad data browser extension. The company also announced it is developing an open-source browser extension that will show the names of ad tech players involved in an ad transaction as well as the companies with ad trackers attached to an ad. The extension will also show the factors used for personalization. That will be the same information Google shows when you click “Why this ad”.

Why we should care. The end of the digital advertising ecosystem’s reliance on cookies for tracking and attribution has been a long time coming. Cookies aren’t supported in mobile apps, and the mobile web and apps now account for the majority of ad spend. Google and Facebook have led a shift away from cookies toward relying on deterministic IDs of signed-in users.

Chrome is not a first mover in this realm, either. It’s following in Apple’s Intelligent Tracking Prevention (ITP) footsteps. The latest version, ITP 2.2, will limit cross-site cookie tracking of users in Safari to one day. Earlier this week, Microsoft announced its Chromium-based Edge browser will also have new tracking controls for third-party cookies.

For marketers, the full impact of these changes, and of how users respond to the new tools, likely won’t be seen for months, but they stand to have a significant effect on remarketing, analytics and attribution efforts. It’s also unclear if (or how much) Chrome’s new requirements will benefit Google, with its first-party relationships with billions of users, over other ad tech firms, as the Wall Street Journal has predicted.

The Chrome announcements come amid a broader PR campaign by Google aimed at would-be U.S. regulators. Google CEO Sundar Pichai published an op-ed in The New York Times Tuesday night titled “Privacy should not be a luxury good” in which he reiterated Google’s position that “a small subset of data helps serve ads that are relevant and that provide the revenue that keeps Google products free and accessible” and listed ways in which the company addresses user data. Pichai called for federal data privacy legislation in the vein of the EU’s GDPR. Google reportedly began lobbying for a “friendly” version of a federal law last summer.

The post Google’s Chrome will change cross-site cookie handling, ‘aggressively’ tackle fingerprinting appeared first on Marketing Land.


Quora’s Google Tag Manager integration.

Pinterest and Quora are now approved Google Tag Manager vendors, making it easy for marketers to manage their Pinterest and Quora Pixels via Google’s platform.

Why we should care

The native integrations for Quora and Pinterest make it much easier to set up those pixels in Google Tag Manager (GTM) to track ad campaign performance from those channels. No more having to create a custom HTML tag in GTM.

Within GTM, you can set up pixels from these channels to track user behaviors, such as viewing a piece of content or adding items to the cart, without having to alter the code base.
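As a rough illustration of how that typically works: the site pushes a custom event to the GTM data layer, and a GTM trigger fires the Pinterest or Quora tag whenever it sees that event. The event and field names below are illustrative, not a required schema:

```typescript
// Assumes the standard GTM container snippet is already on the page, which
// creates the global dataLayer array that tags and triggers read from.
const dataLayer: Record<string, unknown>[] =
  ((window as any).dataLayer = (window as any).dataLayer || []);

// Push a custom event; a GTM Custom Event trigger listening for
// "add_to_cart" could then fire the Pinterest or Quora tag with these values.
function trackAddToCart(sku: string, price: number): void {
  dataLayer.push({ event: "add_to_cart", item_sku: sku, item_price: price });
}

trackAddToCart("SKU-1042", 29.99);
```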

Currently, Pinterest and Quora’s Google Tag Manager integrations only support tracking from websites, not apps, according to Google’s supported tag manager list.

More on the news
  • Both Pinterest and Quora shared quick steps for adding each platform’s tags into your Google Tag Manager account: Pinterest instructions here. Quora instructions here.
  • Google Tag Manager currently supports more than 80 tags natively, including LinkedIn, Twitter, Adobe Analytics, Microsoft Ads and more.

The post Quora, Pinterest ads pixel integrations now available in Google Tag Manager appeared first on Marketing Land.


At a time when technology is enabling marketers to inch ever closer to one-to-one marketing, one of the industry’s most highly respected institutions suggests that this “Holy Grail” may not be all it’s cracked up to be. The Advertising Research Foundation (ARF) warns marketers that excessive targeting and retargeting can lead to lower-than-expected ROIs, a poor customer experience and potential damage to a brand’s reputation.

“The perspective of the ARF is that there’s always been targeting and you have to have it,” said Scott McDonald, CEO and president of the ARF. “But when you move farther out on the spectrum toward one-to-one marketing, you’re getting into a more problematic area where you have to ask yourself both about the trade-offs of the higher cost and the return on that cost.”

ARF is an independent science-based non-profit organization that acts as a neutral research body for questions on measurement and efficacy in advertising.

Personalization can work … for some targets

Some marketers do benefit from personalized, people-based marketing, admits McDonald. “If it’s a highly niche product, then you’ll need to pay more attention to targeting.”

And search marketing works, “because people are actively looking for something and they’re expressing interest in it in the moment.”

“But re-targeting, where you follow people around for weeks, is much more questionable,” McDonald said. “When marketers move into those modes of targeting, it has the potential of giving all the rest of the targeting a bad name because there’s a possibility of consumer backlash.”

Consider the risks vs the benefits

McDonald bemoaned the errors that can be found in third-party data, as well as the negative brand experience customers can have when bombarded with excessive targeting and re-targeting ads.

“There’s an economic issue as well because you’re going to pay more for a more specific target,” McDonald said. “And you may not be building a brand because you’re really targeting too narrow a group.”

McDonald also mentioned the “creep” factor, when users start to feel spied on by messages that follow them everywhere they go online.

“I think it’s conceivable that we’ve taken targeting too far in terms of effectiveness and ethics,” McDonald said. “And I think it’s a reasonable question to debate.”

Can we put the personalization genie back in the bottle?

Advertisers should definitely consider moving away from personalization and consider putting those dollars toward more traditional advertising, McDonald said. And for a wide variety of reasons, some brands are doing just that. Likely motivated by its frustration with digital advertising in general, CPG giant P&G upped its investment in AM/FM radio in 2018 — a move that spearheaded similar action from other brands.

“It’s been decades since [P&G] was advertising on AM/FM radio and they’ve got good results from it,” McDonald said. “If it’s not person-based and if you’re not getting into third-party databases, you’re not getting into all of this GDPR-compliance stuff. So yeah, I think there’s a good argument for re-evaluating and seeing whether people have been too quick to abandon tried-and-true marketing channels from the past.”

Why you should care

The rise in the use of customer data platforms (CDPs), which are used to identify customers across platforms and devices, is just one proof point of how invested marketers are in data-based, personalized marketing. But personalization comes with its costs. Not only does it raise efficacy and ethical questions, as ARF has articulated, but it is causing a content crisis as marketers struggle to keep up with the demand of producing ads tailored to each micro-target.

“Advertisers have to research what the impact is for the long term for their brand,” said Paul Donato, chief research officer of the ARF, adding that it’s hard to tell a brand’s story within the confines of a typical targeted digital ad.

“It’s very important that advertisers understand the tradeoffs between what they’re getting in a short promotional ad on a digital platform versus what they could do with more traditional media, like a :15 ad on radio or television. They need to understand what the sales return is for the digital ads, but then also balance that out with what the long-term impact on their brand is,” Donato said.

The post Why ARF thinks marketers should reconsider their personalization strategy appeared first on Marketing Land.


Identity resolution tools take simple analytics a step further by tying online behavior to a consumer’s unique identity, giving marketers the information they need to zero in on their target consumers with highly personalized, tailored offers that, in turn, lead to higher ROI.

Identity resolution has become increasingly important for marketers as people move across devices — mobile phones, desktops, connected TVs — throughout the day. Identity resolution can help marketers understand that Mobile User A is the same person as Desktop User B. Without that understanding, marketers aren’t able to control messaging to users as they progress through the customer journey on different devices and that’s where identity resolution can help.

It works by reconciling all available data points, which include those collected by first-, second- and/or third-parties. A composite is built that provides marketers with a cherished 360-degree view of a customer’s identity and user journey, and enables an insight-informed, data-driven “single-customer view”  — also known as people-based, or user-level, marketing.

Marketers use a number of tools and platforms to reconcile users’ identities, including simple customer relationship management (CRM) systems. In 2018, the martech landscape saw a proliferation of customer data platforms (CDPs): tools that track users’ omnichannel behavior across different devices, platforms and channels.

At the center: the identity graph

To identify individual customers, data is plotted against an identity graph. Consumers give consent along their path for various pieces of marketing technology to collect, process and analyze data such as device IDs, email addresses, phone numbers and cookie data, in addition to behavioral information such as purchases or website visits. That information is matched to other data in the graph using algorithms and patterns to create a likelihood, or probabilistic, match.

Over time, the systems use artificial intelligence (AI) and machine learning to get smarter and make better guesses at matches. When a user takes an action that requires them to verify their identity, such as paying with a credit card, that guess then becomes deterministic — a perfect match.
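As a toy illustration of the probabilistic-versus-deterministic distinction, here is a sketch with made-up signals and weights; real identity graphs use far richer models with learned parameters:

```typescript
// Signals that may link two device profiles to the same person.
interface MatchSignals {
  sameHashedEmail: boolean; // deterministic: the user verified identity on both devices
  sameHouseholdIp: boolean; // probabilistic hints follow
  overlappingHours: boolean;
  sharedSiteVisits: number; // count of sites both profiles visited
}

// Illustrative weights only; a production graph would learn these from data.
function matchConfidence(s: MatchSignals): number {
  if (s.sameHashedEmail) return 1.0; // deterministic match
  let score = 0;
  if (s.sameHouseholdIp) score += 0.4;
  if (s.overlappingHours) score += 0.2;
  score += Math.min(s.sharedSiteVisits, 5) * 0.05;
  return Math.min(score, 0.95); // probabilistic: never fully certain
}

console.log(
  matchConfidence({
    sameHashedEmail: false,
    sameHouseholdIp: true,
    overlappingHours: true,
    sharedSiteVisits: 3,
  })
); // 0.75: a likely, but unconfirmed, match
```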

It’s a mutually beneficial arrangement. In exchange for that information, brands provide customized experiences that are more relevant and useful to the consumer, a convenience that some studies say is worth it even for those who are concerned about privacy.

Walled gardens such as Facebook have their own identity graphs, as do data management vendors, leading some of the industry’s leading demand- and supply-side platforms to form the Advertising ID Consortium in 2017.

Data laws threaten the ID graph

Strict restrictions governing the use of personal data, such as Europe’s General Data Protection Regulation (GDPR), the soon-to-be implemented California Consumer Privacy Act (CCPA) and potential upcoming federal legislation, might throw a monkey wrench into companies’ ability to collect and use second- and third-party data.

Signal CEO Mike Sands, whose company provides such a solution, says that these laws provide “a strong incentive to invest heavily in first-party data that brands can own and operate with users’ consent.”

“Making a strategic pivot away from third-party data toward first-party data also puts brands on better footing in the fight against Amazon and other industry disruptors (e.g., direct-to-consumer upstarts) with vast troves of first-party customer insight,” Sands said, though he believes that second-party data still has a future in ID resolution.

“Second-party data is a different story,” Sands said. “Because it brings together first-party data, it is unique and high quality: not everyone can access it. While brands have chosen for decades to participate in data sharing agreements and data co-ops, emerging technologies are taking second-party data to the next level while addressing the privacy and security concerns that historically deterred many marketers from utilizing these other options.”

What’s the future of ID resolution?

Mara Chapin, digital and social media specialist at Massachusetts-based Market Mentors, says that privacy challenges could force marketers to rely on other customer analyses in addition to identity resolution.

“Due to the regulations over the past year and the reduction of second- and third-party data, a lot of the targeting options and interest-based information have been taken away from marketers when building out their marketing strategies,” Chapin said. “This has required us to learn more about consumers’ behaviors and work harder on our creative in order to accurately target the right markets and make sure the users we want to reach will be the ones who click and convert into a sale.”

“In the future,” Chapin said, “we’re going to start focusing more on user intent rather than geographic-, demographic- and interest-based targeting in order to build our audiences. Instead of having to know personal information that regulations are being built to protect, we will rely more on what people are looking for rather than what they like.”

Sands doesn’t think ID resolution is going anywhere.

“I do not envision identity resolution giving way to rival marketing solutions — if anything, it’s becoming a requisite step at all stages of the customer lifecycle, and I predict it will displace many technologies and philosophies that do not enable the same real-time, continuous engagement,” Sands said.

Chapin ultimately agreed that identity resolution is key to a good marketing strategy. “It’s important in order to reach the proper audience in your marketing messages, whether it’s through the latest digital offerings or in your traditional advertising placements,” Chapin said. “Knowing who your audience is and being able to segment your buys to those targets is the best way to use your budget and overall marketing strategies effectively.”

This story first appeared on MarTech Today. For more on marketing technology, click here.

The post What is identity resolution? appeared first on Marketing Land.


Understanding the steps a customer takes before converting can be just as valuable to marketers as the sale itself. Attribution models are used to assign credit to touchpoints in the customer journey.

For example, if a consumer bought an item after clicking on a display ad, it’s easy enough to credit that entire sale to that one display ad. But what if a consumer took a more complicated route to purchase? She might have initially clicked on the company’s display ad, then clicked on a social ad a week later, downloaded the company app, then visited the website from an organic search listing and converted in-store using a coupon in the mobile app. These days, that’s a relatively simple path to conversion.

Attribution aims to help marketers get a better picture of when and how various marketing channels contribute to conversion events. That information can then be used to inform future budget allocations.

Attribution models

Following are several of the most common attribution models.

  • Last-click attribution. With this model, all the credit goes to the customer’s last touchpoint before converting. This one-touch model doesn’t take into consideration any other engagements the user may have had with the company’s marketing efforts leading up to that last engagement.
  • First-click attribution. The other one-touch model, first-click attribution, gives 100 percent of the credit to the first action the customer took on their conversion journey. It ignores any subsequent engagements the customer may have had with other marketing efforts before converting.
  • Linear attribution. This multi-touch attribution model gives equal credit to each touchpoint along the user’s path.
  • Time decay attribution. This model gives the touchpoints that occurred closer to the time of the conversion more credit than touchpoints further back in time. The closer in time to the event, the more credit a touchpoint receives.
  • U-shaped attribution. The first and last engagement get the most credit, and the rest is assigned equally to the touchpoints that occurred in between. In Google Analytics, the first and last engagements are each given 40 percent of the credit and the other 20 percent is distributed equally across the middle interactions.

Algorithmic, or data-driven attribution. When attribution is handled algorithmically, there is no pre-determined set of rules for assigning credits as there is with each of the models listed above. It uses machine learning to analyze each touchpoint and create an attribution model based on that data. Vendors don’t typically share what their algorithms take into consideration when modeling and weighting touchpoints, which means the results can vary by provider. Google’s data-driven attribution is just one example of algorithmic attribution modeling.

Custom attribution. As the name suggests, with a custom option, you can create your own attribution model that uses your own set of rules for assigning credit to touchpoints on the conversion path.
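The rules-based models above differ only in how a fixed pot of credit is split across an ordered list of touchpoints. A minimal sketch of last-click, linear and U-shaped (40/20/40) weighting makes the differences concrete; the channel names are illustrative:

```typescript
// Split 100% of conversion credit across an ordered list of touchpoints.
function lastClick(touchpoints: string[]): Map<string, number> {
  const credit = new Map<string, number>();
  touchpoints.forEach((t) => credit.set(t, 0));
  credit.set(touchpoints[touchpoints.length - 1], 1.0);
  return credit;
}

function linear(touchpoints: string[]): Map<string, number> {
  const credit = new Map<string, number>();
  const share = 1 / touchpoints.length;
  touchpoints.forEach((t) => credit.set(t, (credit.get(t) ?? 0) + share));
  return credit;
}

// U-shaped: 40% to the first touch, 40% to the last, 20% spread over the middle.
function uShaped(touchpoints: string[]): Map<string, number> {
  if (touchpoints.length <= 2) return linear(touchpoints); // fall back to an even split
  const credit = new Map<string, number>();
  touchpoints.forEach((t) => credit.set(t, 0));
  const middleShare = 0.2 / (touchpoints.length - 2);
  touchpoints.forEach((t, i) => {
    const w = i === 0 || i === touchpoints.length - 1 ? 0.4 : middleShare;
    credit.set(t, (credit.get(t) ?? 0) + w);
  });
  return credit;
}

const path = ["display", "paid_social", "organic_search", "email"];
console.log(uShaped(path)); // display 0.4, paid_social 0.1, organic_search 0.1, email 0.4
```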

Benefits, limitations of attribution

So is that it? Pick a model and be done? Not quite.

Marketers face the ongoing challenge of being able to stitch all the various touchpoints available to their customers together for a grand view of attribution. There have been improvements, with greater ability to incorporate mobile usage, in-store visits and telephone calls into models, but perfection is elusive.

“Given the increasing fragmentation of platforms and the types of media that marketers have available to them, attribution has never been more important from a marketing measurement perspective,” says Simon Poulton, senior director of digital intelligence at digital marketing agency Wpromote. “Unfortunately, the nature of attribution is one where the goal posts are constantly being moved and just like an asymptote, we’ll never be able to reach the point of 100 percent attribution.”

Chris Mechanic, CEO and co-founder of digital agency Webmechanix, agreed. “Any attribution model is going to be messy,” Mechanic said. “Find one that makes some degree of sense and stick with it. Whether it’s first touch, last touch or blended, the really important thing is getting everybody [on a team] to buy into it and then stick with that over time.”

As marketers invest in more channels and digital mediums, getting a unified view of a customer’s journey is only getting harder. “This will become ever more complicated by increased investments in influencer marketing and Amazon where there are significant challenges in creating unified IDs,” Poulton said. “While many of the major players like Visual IQ and Neustar look to solve for some of these challenges via partnerships, we will still be faced with larger challenges on the horizon for out-of-home (OOH) media as one example.”

“In addition to the customer journey tracking that (Google’s and Facebook’s attribution platforms) provide, we’ll likely see the development of variance analysis solutions within the platforms that will enable marketers to better understand the existing impact of their strategies,” Poulton said. “At an overarching level, the key takeaway here is the convergence of data across platforms and the ability to understand interactions that occur across channels in both an impression and click capacity.”

The post What is attribution modeling? appeared first on Marketing Land.


Aram Zucker-Scharff, ad engineering director for the Washington Post’s research, experimentation and development team, lit up a tweetstorm Wednesday, calling just about every way that digital marketers measure and report on performance fake.

The anger is real. Zucker-Scharff’s comments touched a nerve in the adtech community, with his initial tweet racking up more than 6,000 likes, nearly 3,000 retweets and a host of comments and sub-threads. The rant came in response to a New York Magazine article, which described most corners of the Internet as largely “fake.”

The numbers are all fking fake, the metrics are bullshit, the agencies responsible for enforcing good practices are knowing bullshiters enforcing and profiting off all the fake numbers and none of the models make sense at scale of actual human users. https://t.co/sfmdrxGBNJ pic.twitter.com/thvicDEL29

— Aram Zucker-Scharff (@Chronotope) December 26, 2018

A house of cards. Frustrated by misleading metrics and what he says is a proliferation of “fake” ads, traffic, users and metrics, Zucker-Scharff said that the industry’s “scale business models are a house of cards on a base of imaginary numbers.”

Some of the fake audiences and traffic Zucker-Scharff mentioned are surely due to malicious hacking such as botnets, but some are due to simple deceptions, like those uncovered in an October revelation that Facebook had been inflating its video metrics.

Zucker-Scharff and other notables within the adtech community, such as Ross Maghielse, the Philadelphia Inquirer’s manager of audience development, and Jason Kint, CEO of the trade group Digital Content Next, chimed in with their own commentary and examples throughout the thread.

In her response to the thread, former Reddit CEO Ellen K. Pao said, “Everything is fake. Also, mobile user counts are fake. No one has figured out how to count logged-out mobile users, as I learned at reddit. Every time someone switches cell towers, it looks like another user and inflates company user metrics. And, if an unlogged-in user uses the site on multiple devices, each device counts as a unique user.”

Why you should care. It hasn’t been a good year for transparency. Facebook alone has spent most of the past year apologizing to marketers for making mistakes.

Zucker-Scharff said that the problem extends to every metric, including app installs, agency-reported metrics, and even the ads themselves, and whether they are trending. Throughout his Twitter thread, Zucker-Scharff provides links to other reported issues with metrics, including those provided by third-parties like comScore and IAB.

As an advertiser, you should know what you’re paying for. Digital advertising is still experiencing growing pains but at some point, the industry needs to grow up. Perhaps regular uproars from adtech practitioners like Zucker-Scharff will finally get ad platforms to stop issuing apologies and instead figure out a way to deliver usable and accurate metrics. The time has come.

The post Fake, fake, fake: Epic tweetstorm targets marketing’s metrics house of cards appeared first on Marketing Land.


Facebook is updating its policies around who can enable its Pixel and event sets, now requiring agencies to confirm their relationship with clients before leveraging either ad-targeting tool for client campaigns.

Why marketers should care

Facebook’s Pixel and offline event sets allow marketers to track activity outside of Facebook and then use that data to target ads to users on the platform. Often, it’s a brand’s ad agency or marketing partner that enables a Facebook Pixel or event sets to optimize campaign results. With this latest policy update from Facebook, agencies must now confirm their relationship with a business before leveraging these tools on the business’ behalf.
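For context on what the Pixel actually reports: once the base code is installed on a page, standard events are sent with fbq() calls along the lines of the sketch below (the pixel ID and parameter values are placeholders):

```typescript
// The Pixel base code pasted from Events Manager defines a global fbq()
// function; it is declared here only so this sketch compiles on its own.
declare const fbq: (method: string, ...args: unknown[]) => void;

// Initialize with the advertiser's pixel ID (placeholder value) and log a view.
fbq("init", "1234567890");
fbq("track", "PageView");

// A standard event with parameters; conversions that happen off the website
// are reported separately through offline event sets.
fbq("track", "Purchase", { value: 49.0, currency: "USD" });
```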

“In order to begin sharing pixel or offline event information with another business, you’ll need to define your business’ relationship with the business you want to share data with and review and reaffirm your compliance with our existing Business Tools Terms of Service,” writes Facebook on its business blog.

Brands will also be able to assign “advertising partners” when setting up a Pixel so that their agency will have access to use the retargeting tool.

According to Facebook, this move is an extension of recent, similar decisions aimed at making ads on the platform more transparent, such as removing third-party targeting and limiting access to Custom Audience lists.

More on Facebook Pixel and event sets
  • Any ad agencies or marketing firms initiating a new Facebook campaign that involves a Pixel or offline event set for a client will be asked to define their relationship with the client during the set up process.
  • For existing Pixels and offline event sets currently in use, Facebook will require businesses to update their information during the first half of 2019.
  • In August, Facebook gave a limited number of Groups access to its ads Pixel, offering the tool to admins who manage groups with 250 or more members.

The post Agencies must now confirm client relationships before enabling Facebook Pixel and event sets appeared first on Marketing Land.



The Advertising ID Consortium announced earlier this month that it will be ramping up its efforts toward adoption of standardized methods for targeting digital ads to users across devices. To do so, the group says it will work with other tech leaders to ensure that marketers are able to easily access user exposure data within their own platforms.

Exposure data refers to information about ad campaign measurement, such as where the ads were served and to whom. The group’s tracking solution now includes the following common domains or cookie IDs: the LiveRamp-powered Open Ad-ID, The Trade Desk’s Unified Open ID and the recently adopted DigiTrust ID.

The organization said it would work with its partners to provide:

  • Exposure data tied to device-based and people-based identifiers that can be accessed through a standard, privacy-conscious ID for use in measurement and analytics
  • People-based ad serving technology that will make advertising exposure data directly available to a marketer’s ad server
Why you should care

Marketers cannot effectively target their customers without the ability to connect user behavior across devices. Google and Facebook have their own user logins, but other publishers and ad exchanges use their own identifiers, making the process messy and slow.

To further complicate matters, in May, Google put restrictions on the use of its user login, the DoubleClick ID, telling media buyers who use its data transfer service that they will no longer be able to use the ID to generate reports, limiting them to Google’s own Ads Data Hub for those metrics.

The consortium favors an open identity solution outside the so-called walled gardens.

The road has been rocky for the organization, which has lost two of its three founding members since it was formed in 2017. In March, MediaMath pulled out in favor of DigiTrust’s vendor-independent approach, only to be brought back into the fray when DigiTrust integrated its solution with the consortium. MediaMath was replaced by Index Exchange. And just last week, new owner AT&T withdrew AppNexus, saying it wanted to concentrate on internal integration.

A standard, transparent ID would help marketers more seamlessly access data, accurately chart the customer journey and, ultimately, provide a more personalized experience to their customers. But with three IDs currently in use, it’s unclear how the consortium will accomplish that goal.

From a release announcing the move:

Today’s consumers spend time in an ever-growing number of digital channels, making it difficult for marketers to understand who has been exposed to their advertising, how many times, and where. Recent policy changes from some providers of ad serving technology will further complicate brands’ efforts to understand consumers’ omnichannel journey and provide more relevant, customized experiences for their customers. This creates the need for an open, transparent standard that anyone can leverage to provide or access advertising exposure data. The platforms participating in this initiative represent a majority of the ad volume on the open internet and have committed to sharing data on digital ad exposure with the entire ecosystem in a standardized format.

More about the news
  • The tech leaders include industry heavy hitters such as Criteo, OpenX, PubMatic, Thunder, IgnitionOne and AdRoll Group.
  • “Brand marketers are taking a closer look at their digital advertising supply chain and are proactively choosing to only work with partners that prioritize quality and can solve real needs,” said Jason Fairchild, co-founder, OpenX. “This new measurement solution will help quantify the value of open web inventory, and when evaluating who to work with, advertisers should look to partners that can offer this holistic view of measurement.”


The post Ad ID Consortium will work with tech firms to help marketers access critical exposure data appeared first on Marketing Land.
