When developing content for multiple locales, in multiple languages, companies struggle with the idea of translation versus transcreation. Transcreation is about recreating the content in a language and style that connects with the consumer in a meaningful and emotional way. Professor Nitish Singh and Frank Hartkopf, Head of European Content, Axonn Media, explore the importance of transcreation to create content which truly hits the sweet spot of the local consumer.
This Global Marketing Insight video is provided by Brand2Global, the conference for global marketers, September 28-29, 2016 in Silicon Valley. Brand2Global Conference is an annual event designed for professionals who drive global marketing and are responsible for international market share and revenue. If you’re a global marketing practitioner, this is the conference for you.
Artificial intelligence (AI)
A branch of computer science that focuses on the development of software agents, also known as cognitive technologies, capable of performing tasks that would normally require human intelligence, such as finding, interpreting, and manipulating visual and textual information.
Why is it important?
Artificial intelligence is producing cognitive technologies that are radically changing, and even automating, many traditional communication tasks. Technical communicators need to adapt accordingly.
Why does a technical communicator need to know this term?
Artificial intelligence (AI) has been advancing rapidly in recent years, and its impact, in the form of cognitive technologies, has been spreading. Here are some of the reasons why AI has become so important today:
Cognitive technologies have become much more practical, shifting the focus to performing human tasks rather than emulating human thought.
Massively scalable big data acquisition, storage, and processing infrastructure has become broadly accessible.
Decades of research and experimentation in AI, while not successful in emulating human thought, have been successful in improving problem-solving and learning algorithms.
A key area of application for AI is Natural Language Processing. Here, tasks commonly performed by people are being increasingly automated, or at the very least facilitated, by intelligent software applications. These tasks include text translation, summarization, validation, classification, interpretation, and even generation.
Another area of AI advancement is computer vision, where image and video processing automates the selection, interpretation, and manipulation of visual resources. Yet another important area of AI advancement is information discovery, where contextually aware applications help select relevant information resources for users based on real-time data.
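To make one of these NLP tasks concrete, here is a minimal, hypothetical sketch of automated text classification, using a naive keyword-scoring approach. The categories and keywords are invented for illustration; production systems use trained statistical models rather than hand-written word lists.

```python
# Toy keyword-based text classifier: an illustration of automated text
# classification, one of the NLP tasks described above.
# The categories and keyword sets are hypothetical examples.
CATEGORIES = {
    "installation": {"install", "setup", "download", "configure"},
    "troubleshooting": {"error", "crash", "fail", "fix"},
    "billing": {"invoice", "payment", "refund", "charge"},
}

def classify(text):
    """Return the category whose keywords best match the text."""
    words = set(text.lower().split())
    scores = {cat: len(words & kws) for cat, kws in CATEGORIES.items()}
    return max(scores, key=scores.get)

print(classify("please fix the crash error"))  # → troubleshooting
```

Even this crude version shows the shape of the task: turn unstructured text into a decision a machine can act on, freeing a person from triaging every message by hand.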
For technical communicators, these changes could not be more significant. More and more traditional communication tasks will be subject to automated support and even replacement. This means the focus for technical communicators will shift increasingly toward the human side of the equation, such as facilitating all-important cross-functional collaborations, the value of which will be increased, not diminished, by the advance of AI.
About Joe Gollner
Joe Gollner is the Managing Director of Gnostyx Research, which he founded to help organizations leverage content standards and technologies as the basis of scalable and sustainable content solutions. For over 25 years, he has championed content technologies as an indispensable mechanism to help organizations manage and leverage what they know.
Our fifth annual Information Development World conference will take place November 27-29 in Menlo Park, CA. The intimate three-day event focuses on helping attendees learn how to design a technical content resource center where prospective and existing customers alike can find the information they need about the products and services you offer, regardless of which department created that content.
Information Development World: What We’re Looking For
We’re looking for presenters to share case studies, lessons learned, and best practices for developing amazing customer support websites. If your company has done great work in this area, and you are able to measure the impact of those improvements on your business, we want to hear from you.
We’re planning to showcase companies that value customer experience and have worked hard to turn their online resource center into a useful one-stop shop for product and service information. Specifically, we’re interested in learning from companies that have brought together technical, training, and marketing content under one roof.
We’re also looking to hear from organizations that have implemented chatbots, live chat, voice interfaces, infographics, video documentation, customer stories, augmented reality, podcasts, downloadable tools, personalized content, and other innovative solutions.
Submit a Presentation Proposal
If you think your team has learned lessons it could share with others, we’d like to encourage you to submit a presentation proposal. Presenters are provided a free ticket to the conference (valued at $1995) in exchange for their participation.
Adapting content to make it more meaningful, appropriate, and effective
Like many of you, I have frequently had to explain what localization is, usually in the context of what I do for a living. I find localization easy to explain, often using the example of getting a mobile phone adapted for other countries. Because most people own one, they can imagine why adaptation makes sense. For good measure, I add that it is not only the language, but often local laws, different technical specifications, and customs that need to be considered.
As I get fired up and start explaining the difference between localization and internationalization, eyes quickly glaze over because, as in all languages, words are not just words; they represent concepts that, in turn, often rely on the comprehension of other concepts for a clear understanding. At this point, the value of a guide like The Language of Localization becomes apparent. Like any other profession, we have developed a jargon of our own. We took common words that have multiple meanings and defined them in the context of localization, or, if a word did not exist, we invented one.
A clearinghouse for localization terminology
This sounds a lot easier than it is because there is no central authority to do that work, no arbiter to clarify meaning. Instead, it was, and continues to be, a crowd-sourced exercise with no central repository to use for reference.
Enter Kit Brown-Hoekstra, Scott Abel, Richard Hamilton, and their merry band of subject matter experts. For the benefit of localization professionals, global marketers, and technical communicators, they compiled a “must-know” list of terms that define the common language of our industry.
I appreciate how thoroughly they tackled the task. Not only are the terms defined clearly and their importance explained, but in the essay, they explain why business professionals need to have this information.
Having clarity in the way we communicate about the many tasks and processes that make up life in localization benefits every aspect of our work. It helps us achieve good quality, speeds up time-to-market, and improves the cost-effectiveness of our collaboration. Everyone benefits, from translators to project managers to, maybe most important of all, end users.
We all owe thanks to the people behind this project for helping to develop a shared language for a still young industry. Along with other standards, it is a critical tool in meeting the challenges of an ever faster-paced, globalized world.
As new terms appear and existing ones change their meaning, I hope that the authors will issue new editions of this excellent guide.
Well done, and thank you on behalf of the localization community.
It’s intuitive to believe that visuals are more memorable than text. To a degree, science confirms this. Research shows that visuals improve recall because they help viewers process information faster and hold attention by being more engaging than text.
But there is such a thing as a forgettable visual. Think of all the information you encounter in a typical week. How much of it do you remember? We forget our lives almost as quickly as we live them, and visuals can still escape our memories.
In her July 19, 2017, Content Wrangler webinar, The Science behind Memorable Visuals, cognitive neuroscientist Dr. Carmen Simon talked about how to stay on people’s minds by applying science-based guidelines grounded in how the brain processes visuals.
A bestselling author and leading expert on using memory to influence decision-making, Carmen covered how to use visual thinking skills in four areas that are prevalent in business communication: facts, processes, data, and abstracts. She also covered how to use design elements (images, text, lines, shapes) to create interesting—and memorable—content.
Carmen provided universal visual design principles and explained how they influence viewer attention and retention, including how to create and select visuals that impact memory and how to avoid those that don’t.
Visual thinking is important because, when you use images correctly, you have the luxury of staying on your audience’s minds long-term. This helps you influence their decisions because people act in your favor based on what they remember, not on what they forget.
Read on for some highlights from Carmen’s talk. For the details, go to her webinar and listen to the whole hour’s worth for free.
Why does it matter which visuals are more memorable?
“All your audiences make decisions in your favor based on what they remember,” Carmen says. So all businesses need to ask themselves, What makes people remember your content where visuals are concerned? How do we “stay on people’s minds long enough for them to decide in our favor?”
While we can’t assume that people remember pictures better than text, pictures can influence someone’s memory significantly. People often find pictures more interesting than text. Also, the brain can process pictures faster. Finally, pictures generally hold our attention longer.
What are the elements of visual thinking for nondesigners?
Carmen breaks down elements of visual thinking into three groups:
Think in pictures
Use design elements
Universal design principles
Think in pictures
Facts: Most information we share with our audiences in the business world is based on facts, some objective reality we have to define. When you are sharing factual information, challenge yourself to communicate it with pictures. “The brain is always looking to conserve cognitive energy,” Carmen says. “Your brain is not like a computer; it’s looking to help you live to fight another day. It enjoys cognitive ease and will retain it.”
Processes: When you describe a process, you tell people “You need to move from point A to point B, and these are the steps.” Use arrows to indicate the order of the information in a sequence and a smooth flow to help the brain visualize and remember.
Numbers: “Challenge yourself to place data in a visual format so that you bring it to life and give it meaning.” She gives this memorable example:
Abstract terms: These days, when people are constantly multitasking, their brains are “often too cognitively lazy to go through the effort” of making sense of your abstract information (generalizations, theories, feelings, attitudes). So don’t “leave it to the audience to visualize the meaning.”
For example, Carmen asks, what image would you choose to represent the abstract meaning of “revenge”?
It makes sense to evoke emotion with a concrete image where appropriate since emotions are memorable. “The brain is mobilized by specifics.”
Note that words can build mental pictures, too. Mental pictures often come from metaphors, as in Carmen’s Johnson & Johnson example describing Band-Aids as bodyguards:
“Use metaphors to rescue text from content amnesia,” Carmen says. “What is memory but an association between two concepts?”
How much difference is there from one culture to another? Carmen says that the brain hasn’t changed much in the last 40,000 years. “Whether you’re from Portland or Pakistan,” she says, “most of our body receptors are visual. The brain is physiologically equipped to handle images.”
Pictures: “Attention is mandatory for memories to be built,” she says. At the same time, if the brain has too much visual stimulation, it doesn’t know what to focus on. So don’t use pictures gratuitously. Avoid visual clichés. “There is such a thing as a forgettable picture,” she says.
One of her pet peeves is generic images of people touching a tablet, and out of the screen come a bunch of magical icons or interconnected dots, implying “good things come out of these devices.” Don’t use these to make slides pretty, for example. These days, those images are meaningless. “If you’re using those images, it probably means that you don’t know the subject well enough.” Another example: People wearing business suits and using a laptop on the top of a mountain.
Avoid SGSs: stupid generic shots.
Text: Use vivid, concrete words that help the brain build images. For example, if you showed an otherwise forgettable photo of Mount Everest with this vivid description, the words alone would be memorable:
Lines: Lines are important in any communication, for example, in a slide or in a document. With simple lines, you can impact the way the brain processes information. Lines can create a mood, and they can separate or group information. You can tilt them, display them in multiple formations, make them wavy, or get creative in any number of ways to organize and draw attention to your content in ways that help people remember it.
Shapes: You can also “earn a spot in someone’s mind” with shapes. Here’s another of Carmen’s examples:
Remember when context-sensitive help was the revolutionary way to deliver the right content to the right people at the right time in the right way? Just a few years ago, many technical communication teams did nothing but create context-sensitive documentation for software products. They aimed to provide contextually relevant, helpful content based on what the customer was doing in the software at any given moment.
These forward-thinking teams deconstructed large technical documents into discrete chunks, which they then hooked into the product interface. Customers no longer had to paw through a fat user manual or poke around in an online portal to seek answers to their questions. With a click of the F1 (help) key, they got the information they needed on the screen right in front of them.
Oooh. Ahhh. Contextual relevance had arrived in the digital world.
Today, savvy consumers simply expect digital content to be contextually relevant. What’s more, “context” now means more than location in a user interface. “Context” includes many factors: user-profile data, geographic location, product model, version number, preferred language, time zone, interaction history, the device’s capabilities, and so on.
Providing contextually relevant content today is no trivial matter. It’s challenging, especially for teams that have not adopted advanced practices and tools for developing and managing information.
In his recent Content Wrangler webinar, The Fifth Element: How Structured Content Makes Chatbots Helpful, Alex Masycheff, structured-content expert and co-founder and CEO of Intuillion Ltd., discussed how emerging delivery technologies can take advantage of structured technical content to deliver contextually relevant content via conversational user interfaces, such as chatbots.
Alex delved into the following:
How chatbots improve context-sensitive assistance
Five elements of a helpful chatbot
When chatbots bring the greatest value
Why structured content is critical to chatbot success
Read on for some highlights from Alex’s talk. For the details, go to his webinar and listen to the whole hour’s worth for free.
Single source publishing today
Single source publishing has evolved since the early days of context-sensitive help. Content now adapts to a range of channels. People might access it through a customer portal, through a chatbot on Facebook that provides a conversational UI, or through an augmented-reality application that overlays a layer of information on physical objects.
Content may also have to adapt to align with business rules that determine how it gets processed. Depending on the user’s goals and preferences, access rights, and other criteria, a set of business rules can be applied, on the fly, to any content to make it deliverable to the user in a way that fits the situation.
Further, we’ve broadened our notion of context sensitivity. In the early days of context-sensitive help, context meant “the user’s location in the UI.” Today, the user context has many facets. Examples:
Skills and abilities
Five elements of chatbot helpfulness
Alex’s webinar title starts with “The Fifth Element” in reference to the movie The Fifth Element. In that movie, four stones represent various elements in nature. A fifth stone brings them together and activates their powers.
Alex’s fifth element—structured content—brings all the others together and activates their power to create human experiences that just might qualify (depending on the human) as helpful.
Here are his five elements:
User’s context
User’s intent
Entities of the user’s intent
Knowledge base
Structured content
Element 1: User’s context
The first requisite element of chatbot helpfulness is an ability to capture info about the user’s context. The system can capture some of the contextual info (for example, the user’s location and basic profile data) automatically. The chatbot then kicks into conversation mode to “unveil” other key bits of contextual info (the user’s goal and so on).
Chatbots can gather information about people’s context by asking questions. Based on the answers they get, they can then offer advice, as shown in this conversation between a chatbot and a maintenance engineer:
You could think of this robot as a chatty version of the old F1 key.
Element 2: User’s intent
To efficiently suss out the user’s intent—the thing someone wants to know or do in a given moment—a chatbot must keep the conversation within a narrow domain of information. Here’s an example of a domain that might support conversations between a chatbot and maintenance engineers:
While chatbot designers can’t control what the human will toss out (ever amused yourself by messing with Siri?), they can and must define the scope of the machine’s side of the conversation. Presuming that the person stays within that scope—by asking something like “Do I need to lubricate the XZ-135?”—the conversation has a chance of satisfying the user’s intent.
Element 3: Entities of the user’s intent
To understand the user’s intent, chatbots need info about the parameters, or entities, that make up the user’s intent. Here’s what such entities might look like for our maintenance conversation:
To find out which entities go with each intent—to “fill all the required slots,” Alex says—chatbots must ask questions. For example, in the earlier conversation, after the chatbot learns that the first entity is the XZ-135, it asks a question to fill in the slot for the next entity:
When the chatbot has filled all the entities of the user’s intent, it can proceed to offer help.
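The slot-filling loop described above can be sketched in a few lines. This is a hypothetical illustration, not Alex’s implementation; the intent name, slot names, and prompts are invented for the example.

```python
# Toy sketch of intent/entity "slot filling": the chatbot keeps asking
# questions until all required entities (slots) for the recognized intent
# are filled. Intent, slots, and prompts are hypothetical examples.
INTENTS = {
    "maintenance_procedure": {
        "required_slots": ["device_model", "procedure"],
        "prompts": {
            "device_model": "Which device model are you working on?",
            "procedure": "Which procedure do you need (e.g., lubrication)?",
        },
    },
}

def next_prompt(intent, filled):
    """Return the next question to ask, or None when all slots are filled."""
    spec = INTENTS[intent]
    for slot in spec["required_slots"]:
        if slot not in filled:
            return spec["prompts"][slot]
    return None  # all slots filled; the bot can now fetch content

# Simulated conversation state: the user has already named the device.
state = {"device_model": "XZ-135"}
print(next_prompt("maintenance_procedure", state))  # asks for the procedure
state["procedure"] = "lubrication"
print(next_prompt("maintenance_procedure", state))  # None → ready to answer
```

Real chatbot platforms add natural-language understanding on top of this loop, but the underlying bookkeeping—track the intent, fill its slots, then act—is the same.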
Element 4: Knowledge base
Chatbots pull their content from a knowledge base. As content professionals, our challenge is to organize that knowledge base so that the chatbot can find and deliver the content chunks that will satisfy users’ intents.
How do we make this happen? Here’s the critical behind-the-scenes insight: Just as we have learned to structure content in standalone modules (granules), so too must we structure CONTEXT.
Here’s how Alex illustrates a possible structure for context granules:
Creating a chatbot is a game of matching context granules with content granules. The chatbot pulls content from the knowledge base according to that matching.
Element 5: Structured content
Structured content—our fifth element—unites the other elements (context, intent, entities, and knowledge base) and, as Alex put it, “activates their powers.” Structured content is granular content. In other words, it’s made up of topics (or “chunks” or “units”) that can be “managed and processed independently,” Alex says.
Without structured content, he adds, a chatbot can’t create helpful experiences.
Here’s how Alex illustrates structured content:
To enable a chatbot to find and process the right topics at the right time, each topic must be associated with metadata that identifies applicable user contexts and user intents. Example:
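One way to picture this kind of metadata matching in code—the topic records, field names, and values here are all hypothetical, standing in for whatever metadata scheme a real knowledge base uses:

```python
# Hypothetical sketch: topics tagged with metadata, and a lookup that
# returns the topics matching the user's intent and context.
topics = [
    {"id": "t1", "title": "Lubricating the XZ-135",
     "intent": "maintenance_procedure", "model": "XZ-135", "role": "engineer"},
    {"id": "t2", "title": "XZ-135 safety overview",
     "intent": "safety_info", "model": "XZ-135", "role": "engineer"},
]

def find_topics(intent, context):
    """Return topics whose metadata matches the intent and every context key."""
    return [
        t for t in topics
        if t["intent"] == intent
        and all(t.get(k) == v for k, v in context.items())
    ]

matches = find_topics("maintenance_procedure", {"model": "XZ-135", "role": "engineer"})
print([t["id"] for t in matches])  # → ['t1']
```

The chatbot’s job then reduces to assembling the context and intent (Elements 1-3) and letting this lookup select the right granule.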
You might wonder why we need to bother with structure, why we can’t “just let artificial intelligence do the work.” Here’s why in Alex’s words: “We’re not there yet. Understanding human language is still a challenge.”
Watch the full webinar
For the rest of what Alex has to say on this topic—including his insights into the role of artificial intelligence, deep learning, speech recognition, image recognition, natural language processing, machine translation, metadata auto-identification, and scalability—watch the full webinar here.
If those predictions don’t boggle your mind, read them again.
Is your content team ready for that future? Most aren’t.
The good news is that with some engineering, chatbots can employ and extend an existing content repository. Intelligent content allows us to use single-source publishing to push content out to interactive channels, including those that involve chatbots and intelligent assistants.
Boost the ROI of existing content
Shorten sales cycles
Reduce customer-service costs
In his August 9, 2017, webinar in The Content Wrangler series, Building Chatbots with Intelligent Content, Cruce Saunders—founder and principal at [A] and author of Content Engineering for a Multi-Channel World—discussed chatbots as a new content-distribution channel that businesses can’t afford to ignore. Cruce covered basic chatbot content requirements, components and construction, and a future-proofing model that can make your content chatbot-ready.
Why is Cruce so passionate about this topic? “I’ve been in the content structure business for twenty-plus years working across lots of media,” he says. “I’m passionate because I believe that structured content is the path to a more intelligent world.”
Read on for some highlights from Cruce’s talk. For the details, go to his webinar and listen to the whole hour’s worth for free.
Here come the new technologies
Question-and-answer (Q&A) content is everywhere in our organizations. It’s in customer documentation, in frequently asked questions (FAQs), in knowledge bases, in online help—and now, increasingly, in chatbot interfaces.
We’re all in the habit of asking robots questions already. We search every day in text and, more and more, we’re using our voices. Some 60.5 million Americans now use a virtual assistant of some kind at least once a month. According to Gartner, chatbots will power 85% of all customer-service interactions by the year 2020.
Q&A content has been around for a long time in various forms. The new forms fall into three main types:
Assistant avatars (for example, Soul Machines’ Nadia)
These are simply new forms of delivering answers to questions. You might say, “The FAQ is back!”
Although today’s chat-related technologies are often implemented in simplistic and limited ways, they have the potential of making humans capable of doing smarter, better things, Cruce says. “Customers want an immediate way to interact with our content in a conversational way. Chatbots are answering.”
What organizations need to be doing today (and most are not)
We’re moving toward the conversational commerce of the next generation. And we’ll get there only if we can reuse the content we already have, says Cruce. Ideally, companies would publish their Q&A content out in multiple forms, including bots, from a single source. It only makes sense to set up a single repository for all Q&A content, following the principles of the unified content strategy: write it once, use it where needed.
It’s counterproductive—and quickly becomes expensive and messy—when companies create a whole new repository of Q&A content for bots.
Yet, all too often, that’s exactly what happens. Companies take an expedient approach rather than an intelligent approach. As a result, Cruce says, “duplication between content repositories is becoming a bigger and bigger problem for organizations that are answering lots of questions in lots of ways.”
We’re asked to copy and paste existing content into new repositories or platforms all the time, Cruce says.
“Stop! If we don’t start centralizing Q&A content, we will hit the Q&A apocalypse where everything is going to be out of date in various channels, and we’ll have a mishmash of customer experiences. We’ve got to say ‘no more’ to new content silos. We can’t allow our organizations to continue hiring people to move content from one repository to the next. It’s time to put our foot down.”
The goal—which will require the help of content strategists and content engineers to achieve—is to unify your Q&A content across all delivery channels and platforms. Yes, this is a challenging goal. But shying away from this effort has big consequences for the bottom line. “If we can’t keep our content lifecycle and our publishing infrastructure up to date,” Cruce says, “we’re going to accrue technical debt in the millions of dollars.”
We need to keep evolving our Q&A content to include voice. As recently reported in Forbes, by 2020 half of all searches will be voice searches. The voice-powered bot market is expected to grow from $1.6 billion in 2015 to $15.8 billion by 2021.
If you’re going to invest in a chatbot, you’re not buying a thing. You’re investing in a process that will change the way you work. The technology is secondary or even tertiary. “It’s uncomfortable. It’s hard. It takes work. Anybody who tells you it’s easy is selling a widget, a thing. Making the widgets sing with our content requires training, innovation, and change of the culture that supports those customer interactions.”
To make the new technologies and processes work, we must move toward intelligent content, including such elements as structure, schema, metadata, microdata, taxonomy, and content modeling. “Knowledge lives in containers and can make an impact only when those containers are connected with an audience.”
All this talk of chatbots may sound daunting, but there’s no avoiding the importance of these new options and the processes they’ll require us to develop. “Organizations should not play chicken with the future,” Cruce says. “Invest in engineering content now before competitors’ robots steal customer mindshare. This is clear to executives and C-suites everywhere.”
Enterprise Content Strategy: A Project Guide
Chapter 7. Publish and Measure Phases
Anyone who has written anything or aspires to be a writer knows that the word publish can bear a profound power. However, within a content strategy, publish functions as a mere step within a content lifecycle where content becomes exposed to an audience. Publish represents the culmination of several steps, and as a step itself, it lives within a larger content lifecycle. In a world where anyone can publish any content online via a blog, tweet, or personal website, the power of the term sometimes becomes lost. But make no mistake, publish does create finality in that the content will be seen, heard, read, and felt by an external audience.
The publish phase brings the content experience to life.
As soon as your content lives in the published or external realm and a consumer can access it, it travels down paths, journeys, and experiences over which you have little control. Tracking the path of your content, its use, and its exposure proves essential to its success.
An effective content strategy requires a performance-driven model, so measuring your content performance ensures a successful, sustainable content experience. By definition, successful content must resonate with a consumer and meet his or her needs. Only through constant evaluation will you know what works and what does not. An effective enterprise content strategy must include a well-defined metrics strategy. Metrics should reflect the strengths and weaknesses of the solution design and provide the impetus for content and solution optimization.
This chapter combines the publish and measure phases since the two go hand-in-hand. It defines measuring content performance, demonstrates how to create metrics, and provides information on reporting.
Let’s define a few key concepts to frame this effort.
Analytics: The capture and assessment of data, particularly with performance in mind. In the case of enterprise content strategy, analytics includes the measurement of content performance and the analysis of those measurements.
Metrics: Units of measurement. A metric can reflect any kind of measurement. This chapter provides the common metrics used to indicate the performance of content, such as the number of consumers who download an article.
Key performance indicator (KPI): A metric used to evaluate the performance of an organization’s objectives, for example, the number of products sold.
Conversion metrics: Measurement of a specific conversion, for example, when a content consumer completes a desired task. Typical conversion activities:
Purchase a product
Add an item to a shopping cart
Download a white paper
Share a video
Create a profile
Click to make a call on a smartphone
Register a product
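Once such conversion events are captured, the corresponding metric is a simple calculation. Here is a minimal sketch; the event names and sample data are invented for illustration, and a real analytics pipeline would pull these events from its tracking system.

```python
# Minimal sketch: computing a conversion metric from captured events.
# Event names and sample data are hypothetical.
events = [
    {"user": "u1", "action": "visit"},
    {"user": "u1", "action": "download_whitepaper"},
    {"user": "u2", "action": "visit"},
    {"user": "u3", "action": "visit"},
    {"user": "u3", "action": "purchase"},
]

def conversion_rate(events, conversion_action):
    """Share of visitors who completed the given conversion action."""
    visitors = {e["user"] for e in events if e["action"] == "visit"}
    converted = {e["user"] for e in events if e["action"] == conversion_action}
    return len(converted & visitors) / len(visitors) if visitors else 0.0

print(conversion_rate(events, "purchase"))  # 1 of 3 visitors purchased ≈ 0.33
```

The same pattern applies to any of the conversion activities listed above: define the event, count the users who complete it, and divide by the relevant audience.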
A successful metrics strategy begins in the assess, define, and design phases. During those phases, identify the metrics needed to ensure a successful experience so you know exactly what to evaluate after you publish.
Identify metrics early during technology implementation, because you may need to customize your technology solution to track the metrics you need. Some systems require programming or database changes to enable measurement, so identifying metrics early will help avoid delays.
Creating performance metrics
A successful metrics strategy starts with business goals and objectives. A business goal frames a general aspiration, from which you create specific, measurable objectives. Always start with a strategic intent for your experience and a goal. Let’s use a desktop website as an example. In this case, the strategic intent, goals, and objectives of a desktop website might look like this:
Strategic intent: Answer the question “why our company?” in a way that competitively differentiates us for the consumer, investor, career seeker, financial analyst, and media.
Goal: Become the premier website in the industry and the go-to source for all products, outperforming all other competitors in purchases, traffic, and brand perception.
Objectives:
Sell X number of products within X amount of time to X audiences.
Generate X number of articles in (names of media) over X time due to exceptional media experience in news and media section.
Increase overall website traffic by X percent by X time.
Increase the amount of socially shared content by X by X time.
Increase number of consumer profiles created by X over X time.
The strategic intent provides an umbrella strategy for the experience; the goal, a lofty aspiration; and the objectives, specific and measurable desired outcomes. During the plan, assess, and define phases, identify the key criteria for success. At that point, you should identify the strategic intent, goal, and objectives at a high level. Through the design phase, hone them all so each is specific to the solutions you create, down to the page, template, or even module level. Metrics will measure whether you meet each of these objectives.
To develop metrics, first look at an objective, and then extract a metric from that objective. Then define what success or finality of the metric means (for example, through analytics applications, dashboards, consumer surveys, conversion rates, or sales reports). Example:
Objective: Increase online sales by X% over X time with X consumers.
Metric: Number of website consumers who purchase a product within a given time period as measured by web analytics and sales data.
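As a minimal sketch, the metric above could be derived from your analytics and sales data along these lines (the function name and figures are invented for illustration):

```python
# Hypothetical sketch: computing the conversion metric above from raw
# analytics and sales figures for one reporting period.

def conversion_rate(visitors, purchasers):
    """Share of website consumers who purchased within the period."""
    if visitors == 0:
        return 0.0
    return purchasers / visitors

# Example period: web analytics report 12,000 unique consumers;
# sales data show 540 of them completed a purchase.
rate = conversion_rate(visitors=12_000, purchasers=540)
print(f"Conversion rate: {rate:.1%}")  # Conversion rate: 4.5%
```

Comparing this rate period over period tells you whether you are on track to meet the objective's X% target.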
Make the metrics as specific as possible by asking these questions:
For whom is the objective targeted? Customers, potential customers, analysts, career seekers, etc. You can also include a persona or segment.
When or how will we complete the objective? Example: within 6 months we will sell 20% more products.
How many consumers, products, downloads, pieces of shared content, etc., are we aiming for?
Where are we targeting the objective? Example: the geographical location, the channel, or a specific area on the site.
Why are we doing it? Example: to increase sales, to increase downloads, to increase shared content, to increase the number of content consumers.
Incorporate as many of the above points as you can within an objective to make it as specific as possible.
Adopt the SMART approach
You can also use the SMART approach to develop your objectives. The SMART approach generally applies to setting business goals and objectives, requiring objectives to be:
Specific
Measurable
Achievable
Relevant
Time-bound
Example: Increase the number of new visitors to the home page by 20% within the next 6 months.
From your objectives, you can glean what to measure. See Table 7.1, “Common metrics” for a list of common metrics.
Table 7.1 – Common metrics
User/consumer path and clickstream
Measures the path a user takes to complete a task. To use this metric, map out assumed user journeys or paths for the completion of specific tasks (for example, purchase an item or download a white paper). This metric helps you determine what a content consumer does within a journey and validate your assumed consumer journeys against the actual path a content consumer takes. For omnichannel experiences, measure this journey across multiple channels.
Length of visit
Captures how long a consumer stays within the experience. For example, how long does a content consumer stay on the website?
Depth of visit
Shows how far a consumer goes into an experience, such as a website. You can also look across channels to see which channels a content consumer engages and where and when.
Conversion
Measures the completion of a task. Many types of conversion metrics exist. You will want to measure the number of consumers, tally bounce and exit rates prior to conversion (noting where the exit happens), and review the journey taken to convert. For each conversion metric, create one or more user/consumer journeys.
External keyword search terms
Identifies which terms are used in search, both within your digital experience and through organic search (for example, Google.com, Bing.com). You may want to review both mobile and desktop experiences. Google Analytics or other tools can help track this information. Stay informed regarding changes to algorithms by major search engines, which can render this task difficult.
Onsite search keywords
Shows which key terms are used within your digital experience for search, as opposed to an external search engine. These indicate people’s interests. Note when a consumer jumps to use online search, often indicating that the consumer cannot find what he or she seeks via navigation. In addition to top search keywords, look at failed searches or searches that return no results. Also note when the consumer refines the search terms, and capture facet usage, if relevant. Preferred search terms (canine over dog) are another important metric.
Number of visits to convert
Identifies the number of times a consumer leaves and returns before converting. Where does the consumer go (if you can track it) upon leaving the experience?
Point of entry
Identifies where a consumer enters the experience or content. This metric may provide a starting point for the consumer journey. How does a content consumer get to the experience: via a keyword search? via a banner ad? via a competitor’s site?
Value of interaction
Calculates the total revenue generated from the visit. This metric can be itemized per visit or can account for all visits to the website by dividing the total revenue by the number of visitors.
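A quick sketch of the per-visit version of this calculation (the numbers are invented):

```python
# Value of interaction, averaged across all visits: total revenue
# divided by the number of visits in the period.

def value_per_visit(total_revenue, visits):
    """Average revenue generated per visit to the experience."""
    if visits == 0:
        return 0.0
    return total_revenue / visits

# Example: $250,000 in revenue across 50,000 visits.
print(value_per_visit(total_revenue=250_000.0, visits=50_000))  # 5.0
```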
Cost to convert
Demonstrates how much a conversion costs a business or an organization. This metric looks at internal spending and the total number of conversions as well as revenue of conversions when relevant.
Exit rate
Measures where a content consumer exits an experience. Note the length of time spent and which device the consumer uses prior to exiting. An exit does not necessarily correlate to a cause for concern; perhaps the visitor accomplished what he or she needed to do and, thus, left your experience satisfied.
Bounce rate
In contrast to exit rates, bounce rates inform you that a visitor reached your experience and left immediately. In other words, a consumer might reach a product-landing page through an external site and – without spending any time there or going further into the experience – “bounce” out of the website by going to a different URL. Track whenever this happens, as well as point of entry, length of time of visit, where the consumer went afterward, etc. This metric may help you detect under-performing content.
Frequency of visits
Indicates how often a consumer visits an experience. What does he or she do while within the experience? For consumers with profiles (users who are logged in), which features, functions, and content do they use?
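If you capture raw session data yourself, the bounce and exit rates described in Table 7.1 can be computed along these lines. This is an illustrative sketch only; the session structure is an assumption, not a real analytics API:

```python
# Simplified session logs: each session records the ordered pages viewed.
sessions = [
    {"pages": ["landing"]},                     # bounced on landing
    {"pages": ["landing", "product", "cart"]},  # exited on cart
    {"pages": ["home", "landing"]},             # exited on landing
    {"pages": ["landing", "product"]},          # exited on product
]

def bounce_rate(sessions, page):
    """Share of sessions that entered on `page` and left immediately."""
    entries = [s for s in sessions if s["pages"][0] == page]
    bounces = [s for s in entries if len(s["pages"]) == 1]
    return len(bounces) / len(entries) if entries else 0.0

def exit_rate(sessions, page):
    """Share of views of `page` that were the last page of a session."""
    views = sum(s["pages"].count(page) for s in sessions)
    exits = sum(1 for s in sessions if s["pages"][-1] == page)
    return exits / views if views else 0.0

print(bounce_rate(sessions, "landing"))  # 1 of 3 landing entries bounced
print(exit_rate(sessions, "landing"))    # 2 of 4 landing views were exits
```

In practice, an analytics package computes these for you; the sketch simply shows why the two rates can differ for the same page.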
In addition to the metrics in Table 7.1, “Common metrics,” you might need to capture social media metrics. Table 7.2, “Example social media metrics” provides some common social metrics:
Table 7.2 – Example social media metrics
Shared content
Tracks which content (for example, a product or video on Facebook, Twitter, Tumblr, Pinterest) is shared by whom and when. Look at how often a consumer re-shares the content (for example, by retweeting).
Share of voice
Captures how frequently social media mentions your experience, brand, or organization.
Referrals from social media
Indicates which social media refers visitors to your experience, for example, a link in Twitter that results in a visitor landing on an article on your website.
Sentiment
Tracks what others are writing about you in social media. Sentiment can be tracked with regard to perception of a brand, an experience such as a website, specific pieces of content such as a video, or even the experience with a product or service.
Repeat mentions
Indicates which consumers, and how many, continue to mention your experience or content, for example, repeat likes within Twitter, repeat shares of your content on Facebook, repeat mentions of your brand or organization, etc.
The metrics in Table 7.2, “Example social media metrics,” can all be obtained in various ways, including Google Analytics, Bing Analytics, social-tracking tools, and web analytics software. Additionally, many content management systems include this functionality, and dedicated applications track a variety of metrics. In many cases, you may require more than one application.
So far, I’ve covered metrics for digital experiences. Obviously, though, digital metrics do not capture all the objectives that an enterprise should measure. Let’s consider the following operational metrics, which can prove equally important for showing the value of content within your organization.
Reduction in cost to produce content: Measured by data supplied by business units, internal audits, and operational metrics dashboards
Reduction in cost associated with finding and leveraging content within an organization: Measured by user and consumer surveys, audits, and operational metrics dashboards
Reduction in localization cost due to improved processes and systems: Measured by audits and operational metrics dashboards
Cost per word (used in translation cost assessments): Measured by audits and operational metrics dashboards
Time saved authoring, maintaining, and optimizing content: Measured by user and consumer surveys, audits, and operational metrics dashboards
Increase in internal satisfaction with information and content: Measured by surveys and operational metrics dashboards
Decrease in content redundancy: Measured by user and consumer surveys, audits, and operational metrics dashboards
Reduction in cost due to content reuse: Measured by user and consumer surveys, audits, and operational metrics dashboards
Time saved in taking a product to market: Measured by user and consumer surveys, audits, and operational metrics dashboards
Decrease in employee attrition through improved employee tools, self-service tools, and resources (portals): Measured by user and consumer surveys, audits, and operational metrics dashboards
Content experience metrics
Finally, you should look at other evidence related to content experience. User/consumer/customer feedback, surveys, and user-testing tools can show how your content performs and why content consumers may or may not respond to it.
Additional content experience metrics:
Consistent brand experiences with all customer touchpoints (facilitated by content that is on-brand and effectively targeted across multichannel platforms): Measured by consumer surveys and audits
Retention of customers: Measured by customer databases, sales data, surveys, and audits
Acquisition of new customers: Measured by analytics, sales data, and audits
Optimized content quality (consistent, error-free content across channels): Measured by quality standard audits, customer feedback, and time-to-publish updates and modifications
Up-to-date, relevant content: Measured by quality standard audits, customer feedback, and time-to-publish updates and modifications
Efficacy of content related to its value proposition and key selling points: Measured by analytics, testing (for example, A/B testing or multivariate testing), customer feedback, audits, and sales data
Improved localized content with fewer errors and revisions: Measured by quality standard audits
Identifying the types of metrics to capture only provides you with partial success; what you do with the metrics is what really matters. Let’s discuss how to analyze metrics data and report on it.
Analyzing and reporting metrics
Metrics provide you with data that helps you draw conclusions about your content and its performance. But metrics, by and large, do not answer the question of why: they do not tell you why consumers do or do not view or share your content. To find out why, you must dig deeper.
Let’s first discuss when and where you should look to answer this question. If content performs well, that is, it’s meeting its objectives, then perhaps you will want to produce more content similar to it and make investments in its ongoing success.
When content fails to meet its objectives, you have a problem. Look at every place where content does not perform well. After you have a list of the problem areas – which can be anything from consumer journey to conversion to content not receiving any visitors at all – find the cause. For content not viewed at all, are consumers interested in the topic? Do they seek it out? Are issues in search or navigation preventing them from getting there in the first place? Have you received negative feedback on the content?
When something seems amiss, first check to see whether there are issues with the user experience. Then, see how the content performs elsewhere in the industry. Do competitors use the same content? If so, how does it differ from yours? Are there social metrics to indicate interest? You may need user testing to see why content fails to perform. In some cases, you might need to modify your objectives. Maybe content you consider important is not important to your audience.
As you determine the causes, build and maintain a list of resolutions.
Report any findings to the content team, perhaps using a dashboard. Present internal metrics that track efficiencies, costs, and the like quarterly. For metrics that track your content experience, determine how often you wish to review and present them. In many cases, you will want to analyze metrics monthly; in others, quarterly. Some larger e-commerce organizations track metrics hourly. Chapter 8, Optimize Phase, deals with how to optimize your content based on your findings.
Every strategic business decision has an impact on content. Entering a new market creates the need for localization. Images that work well in your home market may need to be altered or replaced to avoid offending members of new audiences. Mergers and acquisitions create a need for content updating and adaptation. Content obtained from others requires rebranding and new metadata tags, and it creates the need to train content creators on new tools and workflows. A new product feature may require your content team to update your content strategy.
Unfortunately, business strategists don’t always foresee the impact of their decisions on content. They can find themselves surprised by the amount of time and money required to tackle the challenges they introduced.
Who should be involved in making such strategic business decisions, and what process would they follow, ideally, to avoid such rude surprises? Ann Rockley and Charles Cooper of The Rockley Group have a model for you.
While your process may not have exactly eight stages, and while you don’t have to use exactly these labels for them, “You need everything that’s in here,” Charles says.
As you read this post, which sums up Charles’s take on this process, look for opportunities for your company to smooth out its process and to unite people in your organization around it.
Why put this kind of process in place?
The process that Charles maps out—although it may seem, to some, like a series of roadblocks—enables corporate leaders to make strategically sound decisions in a timely manner, guiding the company toward activities that forward the business and away from activities that don’t.
The process also increases the chances that strategists will take into account the full impact of their decisions on their company’s content. I’ve been involved in situations where that didn’t happen, and the scrambling that resulted wasn’t pretty.
Ideally, this process keeps a company’s whole body of content consistent and up to date. As a result, even as the business evolves, customers and content teams alike get to have the kind of satisfying content experiences you would expect from a brand you love.
“This process gives companies control without over-controlling,” Charles says. It does this by giving people in various departments a regular opportunity to talk and get aligned—a critical benefit for content teams that operate in silos.
“If you have one team doing videos and another doing podcasts, you want to keep them moving in the same direction—hitting the same talking points, delivering the same messaging—as they create the content.”
Who belongs on the decision-making team?
Charles suggests that companies appoint four or five people (referred to variously in this post as “decision makers,” “strategists,” “approvers,” and “the team”) who have a stake in both the short- and long-term repercussions of the decisions.
The appropriate project manager
The person who manages the project managers
Someone who understands the relevant content systems and teams
Someone from the legal team
Someone from the brand team
Someone from the region in question
The people you choose to involve may vary. Not every strategic decision needs to involve legal representation. And maybe your product or service isn’t international. It may even be hyper-local, requiring no consideration of regional concerns. Don’t force these roles onto your team. Identify the roles your company needs to support its business decisions, and choose people who can fill those roles.
Subject-matter experts (SMEs) don’t need to be on this team; the team reaches out to them as appropriate.
If the team is focused on a product, the product manager might be at the appropriate level. If the team deals with a number of products within a brand, the brand manager might be at the appropriate level, calling on individual product managers, as SMEs, between meetings. In some cases, those product managers might be invited to a team meeting to provide extra input.
The goal: Create a stable team that provides strategic consistency.
How much time does this process take?
Depending on the nature of the request, this decision-making process may take a concentrated hour, or days, or weeks.
The team may want to hold regular decision meetings—maybe weekly, maybe monthly—depending on the number of requests that roll in. Organizations with a large backlog of requests may want to meet frequently at first, perhaps weekly or biweekly for three to five months, Charles says, tapering to monthly meetings after that.
The conversations that happen between meetings don’t have to take long. A quick Google search or 30-minute conversation may yield a pivotal discovery. At the same time, some requests merit in-depth research.
Companies that rush their strategic decisions or follow inconsistent methods shortchange themselves; they may suffer expensive—and avoidable—consequences.
Don’t be a slave to consistency, though, Charles says. Flex as needed to support your business requirements. If you’ve settled down to a monthly meeting cadence, for example, and an important issue comes up, don’t wait for the next monthly meeting to address it.
“Use the process—gather information, distribute it for discussion, research the situation, and come together to decide—to support your business needs. Don’t force your business needs into a defined meeting schedule if that would cause more problems than it would solve.”
In short, this process takes however long it takes. Wise leaders give each phase its due.
A walk through the process
The following sections detail the stages of the decision-making process that Charles recommends for strategic leaders of any company.
Stage 1. Someone submits a request, following a defined method
Strategic ideas may come from any number of sources: people anywhere in the organization, customers, the public. The ideas may come in via email, web forms, tweets, phone calls, hallway conversations, meetings—any way that human beings communicate.
Somebody somewhere asks somebody at some company to do something different. Charles calls this input a “request” (aka an idea, a change, a suggestion).
To smooth out the infinite variability at this stage, he suggests that companies define and streamline the ways that people submit requests. “If you’ve got 50 ways of receiving requests across all your touchpoints, see if you can cut that down to a smaller number, perhaps to five,” Charles says. “Come up with a consistent approach.”
Stages 2 & 3. A sanity checker reviews the request, rejecting it if appropriate
When a request comes in, someone must determine whether the idea merits further assessment (Stage 2). Charles calls this the sanity check.
Your company may want to establish a separate group of people who do the sanity checking. Choose people who know enough to understand the requests, the audiences, and the business needs and who can spot ideas that should be immediately rejected (Stage 3) rather than waste everyone’s time by moving them on to the decision makers (Stage 4).
Sanity checkers may reject a request for several reasons:
It may not make business sense.
It may not be described clearly or fully.
It may not include enough information for the decision-making team to consider.
Where appropriate, the sanity checker returns the request to the requester, asking for whatever additional information is needed. From the requester’s point of view, this feedback loop takes the guesswork out of submitting a request. From the company’s point of view, it keeps underdeveloped requests from wasting the team’s time.
The goal: Improve requests so that they are more likely to be approved (and approved efficiently).
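As a rough illustration, the sanity check in Stages 2 and 3 could be modeled as a simple completeness validation. The required fields here mirror the packet contents described in Stage 4; they are assumptions, not a prescribed schema:

```python
# Hypothetical sketch of the sanity check: a request either passes on to
# the decision-making team or goes back to the requester with reasons.

REQUIRED_FIELDS = ("request", "rationale", "scope")

def sanity_check(submission):
    """Return (passed, reasons); a failing request is returned to the
    requester with the listed reasons rather than sent to the team."""
    reasons = [f"missing or empty: {field}"
               for field in REQUIRED_FIELDS
               if not submission.get(field, "").strip()]
    return (len(reasons) == 0, reasons)

# An underdeveloped request: no rationale, no scope.
ok, why = sanity_check({"request": "Add German locale", "rationale": ""})
print(ok, why)
```

In a real workflow, the checkers would also judge business sense and clarity, which no simple field check can capture; the sketch only shows the feedback loop's shape.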
Stage 4. The sanity checker distributes information about the request to the team
When a content request passes the sanity check, the sanity checker creates and distributes a packet of information to the members of the decision-making team. Like anything worth reading, this information must be fair, accurate, concise, and easy to understand, giving the strategists everything they need to make a good decision.
What does this packet contain? Whatever it takes to sell the idea.
The request. (Exactly what is being proposed?)
The rationale. (Why do this? What problem would be solved or what market advantage gained?)
The scope. (Would this require action locally, in certain regions, or worldwide? What departments would be affected and how?)
The consequences of rejecting the request. (If the company doesn’t do this, what’s likely to happen?)
The concerns. (If the company does this, what concerns might need to be considered?)
This packet must be distributed far enough in advance of the decision meeting (Stage 7) that the team can examine it and have the necessary conversations with SMEs (Stages 5 and 6).
Stages 5 & 6. Team members examine the request, getting input from SMEs as needed
In most companies, decision makers have too little time to examine requests (Stage 5) and then to reach out to SMEs for further input if needed (Stage 6). Sometimes companies skip these “incredibly important” stages altogether.
“It’s not unusual for five people to come into a meeting to approve a bunch of requests without having looked at any of them. They’ve had no time to understand the requests, no time to ask their own questions, no time to talk to people across the organization.”
In that situation, no one can make good decisions.
Although it sounds like a lot of work, this research often doesn’t take much time. A brief conversation with an expert or knowledgeable colleague (via phone, email, desk visit, hallway encounter, or an electronic exchange in a formal approval system) may be all it takes to address the questions.
“Ask SMEs what they think,” Charles says. Give them a chance to weigh in, especially on changes whose ramifications will resonate for years.
Charles uses the term “SME” to describe someone knowledgeable about anything—a product, a country’s culture, a group of customers—anything that the decision makers need to understand.
Budget sufficient time for Stages 5 and 6. To give strategic direction is to take the time required to understand and wonder about the requests you’re being asked to approve.
Stage 7. The team meets and decides whether to approve the request
At this penultimate stage, decision makers meet to decide which requests to approve and which to reject. In this meeting, no one is looking at the requests for the first time. People walk in having done their homework, ready to make informed, considered decisions.
Occasionally, in the course of the meeting, it becomes clear that more information is needed. Issues come up in the conversation. People go back, get the information, and make a decision at the next meeting or through email. No problem. But the goal of this meeting is to say yea or nay.
Stage 8. The team passes on the approved request to be implemented
After the decision makers approve a request (Stage 7), they let the appropriate teams know what they need to do to implement the request (Stage 8). Sometimes, though, especially when a process like this is just getting started, the decision makers don’t know whom to pass the approvals on to. “That education is crucial for the process to work,” Charles says.
Charles’s insights ring true for me. I’ve worked in situations where business decision makers underestimated the impact of their decisions on content teams. Stress abounds, and expenses soar. A process like the one described above could have made all the difference.
How about your company? To what extent do strategic decision makers anticipate the impact of their decisions on content teams across the company? What would you add to what Charles has to say?
There’s been a lot of talk about convergence between marketing and technical communication over the past few years. Most of the ideas being discussed are focused on finding ways to improve customer experience by unifying content production and distribution efforts, but few companies are actually making a concerted attempt to break down the silos that prevent collaboration. Adobe aims to change that.
This October 10-12, Adobe will attempt to bridge the gap between technical communication and marketing professionals by bringing together content creators from both camps to learn about structured content. The effort is called Adobe DITA World, an online event to which the software maker expects to attract over 1,000 marketing and technical communication experts from around the globe.
The three-day virtual confab aims to showcase experts in the fields of content management, content strategy, content engineering, translation, and localization in order to help “connect the dots” between marketing and technical communication content.
The Content Wrangler is pleased to be named the official media partner of Adobe DITA World. Founder and Chief Wrangler, Scott Abel, will serve as the opening keynote presenter. He’ll discuss the impact of cognitive content, artificial intelligence, and agentive technologies on technical communication and marketing.