
Hi, I’m Steve. I lead Developer Relations at Crunchy Data, and in this post I am going to show you how to use PostgreSQL and its gold-standard spatial extension, PostGIS, in combination with Tableau. Tableau is an awesome analytical platform, and its integration with PostGIS provides two key benefits. First, you can work with spatial data directly from the source, without pre-processing, using either a live or extract connection. Second, you can leverage the spatial capabilities of PostGIS to expand what is possible with Tableau. Let's dig in.

PostgreSQL is the hottest relational database management system (RDBMS) technology that just happens to be 30 years old. While giving you all the power and data cleanliness of relational databases, it also offers advanced capabilities such as full-text search; JSON storage, querying, and retrieval; and the ability to perform advanced spatial analysis, which happens to be our topic today. PostGIS is the extension for PostgreSQL that provides both simple and advanced geospatial capabilities.

To follow along with today's exercise, you can either use a Docker container from Crunchy Data, which has the spatial extension already enabled, or you can download and install PostgreSQL with PostGIS from the community download. The data we are using today can also be downloaded from GitHub. Today we will be working with U.S. county boundaries and U.S. storm events captured by the U.S. National Weather Service.

Set up your data source
  • Connect Tableau to PostgreSQL: For the purposes of this post, I am going to assume you have loaded the data into your database and know how to connect Tableau to PostgreSQL. For a refresher, learn how to connect Tableau to PostgreSQL here.
  • Make your first data source: Next, drag the county_geometry table (1) into the data source area.

    You will see in area #2 that Tableau has already recognized the spatial columns inside the table, Interior_pnt and the_Geom. As per its standard behavior, Tableau replaced the _ character in the column names with a space. You may want to extract and save the data to make rendering faster.

And with that we are ready to make our first map!

Create a simple Tableau map with PostgreSQL data

Making a map in Tableau is easy. We can add the spatial column to a blank worksheet and Tableau will automatically create a map.

For our first example we are going to look at the area of water in each county. To visualize this, we will make a choropleth map where we change the fill color of each polygon based on the water area. This is a classic geospatial technique for rapidly visualizing patterns in our data. You will need to disaggregate the polygons by adding one of the identifying dimensions, like county name or county FIPS code. You will also want to turn off aggregate measures so that you can interact with each polygon individually: go to the Analysis menu and uncheck Aggregate Measures.

Now drag the The Geom measure onto the Detail shelf and, voilà, you have a map of all the counties in the United States!

Wondering where the background map is coming from? There’s a great Tableau blog post about Tableau’s vector tiles.

Let’s go ahead and color the polygons by the area of water in the county. Drag the water area (“Awater”) column onto the Color shelf and you should see the whole map change color, but most of the county polygons are the same color.

If you look at the legend on the right, you can see that the range of Awater runs from 0 to more than 26,000,000,000. If we look at the histogram for Awater, we also see that it has a severely right-skewed distribution (note the x-axis is log scaled).

To accommodate this skew, we can create a custom color ramp that moves its center to something significantly smaller than the average. Now we have a nice map that shows all the counties that border the oceans and the Great Lakes, as well as the ones that contain many lakes and rivers. So far this is pretty standard mapping in Tableau. Let's move on to some more interesting use cases with PostGIS and Tableau.

Leverage Spatial SQL to gain sophisticated insights

With a PostgreSQL connection we have the ability to create a data source that is actually a SQL query. To illustrate a simple, but powerful example, let’s visualize the distance of every county from the geographic center of all 50 United States. The coordinates for this point are: 44.967244 Latitude and -103.771555 Longitude.

Add a new data source to your Tableau workbook that points to the same database. This time, rather than dragging over a table, drag the New Custom SQL Option over to the data area.

This action will bring up a box to drop in your SQL. Now I get to teach you some spatial SQL. Paste in the code and, while you wait for it to run, come back here to read more about how the query works. The code will take a little while to run because it is calculating the distance between every single county and the center point. This requires a full table scan and can’t take advantage of any spatial indices we have created. I HIGHLY recommend you extract this data.

SELECT id,
       county_name AS name,
       ST_Distance('POINT(-103.771555 44.967244)'::geography, the_geom) AS distance,
       the_geom AS boundary
FROM county_geometry

The ST_Distance function is where all the action happens. This PostGIS function calculates the distance between two spatial objects. First we take the center point of the U.S. and write it as a character string, 'POINT(-103.771555 44.967244)'. Then we cast that string to a geography type (a spatial entity). The :: operator in PostgreSQL is shorthand for casting one data type to another.

Now that we have one point, we calculate the distance between the center point and the polygon. Under the hood, PostGIS will calculate the distance between the point and the closest point of the polygon. Because we are using a PostGIS geography type, our resulting distance is in meters.
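As a quick standalone illustration of the cast (no table needed), here is a snippet you can run in psql. The second point is an arbitrary example coordinate, not part of our data set; because both arguments are geography, the result comes back in meters:

```sql
-- Illustration only: distance between two geography points.
-- The second point is a made-up example (roughly Minneapolis, MN).
SELECT ST_Distance(
    'POINT(-103.771555 44.967244)'::geography,  -- U.S. center point
    'POINT(-93.2650 44.9778)'::geography        -- example comparison point
) AS distance_meters;
```

If you cast to geometry instead of geography, ST_Distance would return a unitless planar distance in degrees, which is rarely what you want for real-world measurements.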

Now our resulting data source has the county_name, distance, and the original polygon boundaries. I found it useful to rename the data source to “Center Distance”.

Go ahead and create a new Worksheet. Like before, turn off aggregation, drag the boundary measure onto the Detail shelf, and finally, drag the distance measure onto the Color shelf. Again, because of the large range of values, we end up with not much visible color differentiation on the map. But if we use a custom color ramp for the coloring of the polygons...

We end up with a really nice map showing the center of the U.S. with subtle coloring to show how distance changes. Adjust the colors as you see fit.

Think about how complicated it would be to make this visualization in Tableau without the use of PostGIS. First, you would be writing a lot of custom calculations on the data. But not only that, you would have to download the full table from the database into the Tableau client before doing the calculations. So not only do you have the complexity of writing all those custom calculations, you also have to wait for all the data to come over the wire before seeing your results.

Execute Advanced Spatial Queries with PostGIS and Tableau

For the last example, we are going to solve a specific business problem. Let’s assume we work for an NGO or government agency responsible for disaster relief. We have a question: “which counties in Maine are the best for pre-staging emergency equipment?”

To answer this question, we buffer each storm event’s center point by 12.5 km (about 8 miles) and then select all the counties that intersect that buffered circle. We then use a grouping query to count the storm circles per county. To build this query we are going to use a PostgreSQL Common Table Expression (CTE). CTEs let us write subqueries with much cleaner syntax.

To build the query, first we write the query that does the buffer and intersection:
select geo.statefp, geo.county_name, geo.the_geom as boundary, se.locationid
from county_geometry as geo, se_locations as se
where geo.statefp = '23'
  and ST_Intersects(geo.the_geom, ST_Buffer(se.the_geom, 12500.0));

Then we do the aggregation on that query:
with all_counties as (
  select geo.statefp, geo.county_name, geo.the_geom as boundary, se.locationid
  from county_geometry as geo, se_locations as se
  where geo.statefp = '23'
    and ST_Intersects(geo.the_geom, ST_Buffer(se.the_geom, 12500.0))
)
select statefp, county_name, count(*), boundary
from all_counties
group by statefp, county_name, boundary

Again, paste this into a Custom SQL data source. I named mine "Best storm counties." This query will take a while to execute, even for one county, because we have to create a circle polygon for each storm event location, then check for intersection with each of the county polygons, and finally sum how many times each county had an intersection.

Turn off aggregation, drag boundary to detail, drag count to color, and if you want to see the county names, you can drag that to tool tips or labels. You should see something like this:

It looks like Aroostook and Kennebec would probably be our best choices (Penobscot actually has one more incident than Kennebec, but if we put something up north, it is better to put the second one farther south).

Again, spatial SQL helped us answer quite a complicated question. And if you wanted to generalize this to allow business leaders to change the state, such as TX or WA, Tableau makes it easy to add a parameter to the SQL query.
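As a hedged sketch of that idea, here is how the storm query could use a Tableau parameter in place of the hard-coded Maine FIPS code ('23'). This assumes you have created a string parameter named "State FIPS" and inserted it via the Insert Parameter menu in the Custom SQL dialog; Tableau substitutes the token with the parameter's current value before sending the query to PostgreSQL:

```sql
with all_counties as (
  select geo.statefp, geo.county_name, geo.the_geom as boundary, se.locationid
  from county_geometry as geo, se_locations as se
  -- <Parameters.State FIPS> is replaced by Tableau with the current
  -- parameter value (e.g. '48' for TX, '53' for WA) before execution
  where geo.statefp = <Parameters.State FIPS>
    and ST_Intersects(geo.the_geom, ST_Buffer(se.the_geom, 12500.0))
)
select statefp, county_name, count(*), boundary
from all_counties
group by statefp, county_name, boundary
```

With the parameter shown on the dashboard, a business user can re-run the whole analysis for any state without touching the SQL.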

Learn more

I hope this post showed you the power that comes from combining PostGIS with Tableau. You can ask very sophisticated spatial analytic questions with the ease of Tableau.

If you want to get more hands-on with PostgreSQL and PostGIS without installing anything on your machine, use the free tutorials available on the Crunchy Data Learning Site. And for more fun, check out the Tableau Community Forums to learn how to generate routes dynamically with PostgreSQL + pgRouting.

Thanks for reading and I look forward to seeing some of the great spatial visualizations that you'll produce in Tableau!


The Foreign Corrupt Practices Act of 1977 (FCPA) is a United States federal law known primarily for two of its main provisions, one that addresses accounting transparency requirements under the Securities Exchange Act, and another concerning bribery of foreign officials. FCPA provisions apply to any person or US entity doing business in foreign jurisdictions and are jointly enforced by the Securities and Exchange Commission (SEC) and the US Department of Justice (DOJ).

Given the high level of scrutiny and enforcement, companies that violate the FCPA can face substantial fines, investigative time, and reputation and brand damage due to non-compliance. As such, when developing leading compliance program practices, companies should establish appropriate monitoring, auditing, and ongoing evaluation of program effectiveness.

For teams tasked with monitoring and auditing (e.g., internal audit, compliance), using data analytics is key to improving the effectiveness of compliance programs. By using analytics to identify risk factors, flag anomalies, risk-rank transactions, and send proactive alerts, your compliance program can focus on early detection and prevention of corruption schemes.

Analytics considerations for anti-corruption programs
1) Identify corruption risk factors.

As part of the risk evaluation process, companies should evaluate corruption risks specific to their company or industry. After these risks are identified and prioritized, companies should next consider possible corruption schemes and scenarios and understand if mitigating controls already exist that would prevent or detect these corruption schemes.

The output of this exercise should be a listing of program areas where detailed testing or further analysis should be performed due to potential risk. Identifying areas of higher corruption risk is the first step in designing controls and analytics to help mitigate these risks.

2) Design analytics to identify corruption red flags.

Data analytics can be leveraged to identify risky transactions for additional testing. By understanding the underlying corruption schemes that could exist, companies can design analytics to profile transactions that may be at higher risk based on noted trends or anomalies in comparison to the rest of the transactions in a population.

For illustrative purposes, an example corruption scheme with channel or partner sales is when a product is sold to a reseller at a deep discount, who then uses the large margin on that sale to pay a bribe to the end customer. In this scenario, reviewing discounting outliers from partner sales transactions may indicate higher corruption risk that should be investigated. Risk factors such as the size of the transaction, region, typical employee or partner discounting patterns, etc. may reveal corruption red flags.

Companies can use analytics to quickly identify outliers and atypical discounting patterns based on these various risk factors. See the Sales Discounting Scatterplot dashboard below, which makes it quick and easy to view outliers and atypical patterns. Designing analytics this way allows companies to further focus their review on geographical jurisdictions with a lower Corruption Perceptions Index (CPI) score, where potential corruption and bribes are more prevalent.

Sales Discounting Scatterplot Dashboard: Using Tableau, sales data can be easily rendered, filtered, and analyzed. In the below example, deal discount percentage is plotted on the y-axis and date of deal close on the x-axis. Each circle represents an individual deal with circle size representing deal size. The following filters are utilized in the dashboard, each of which can be updated to help isolate risky transactions.

  • Date of Deal Close: What is the review period?
  • Deal Close Month: Are deals closed in certain months inherently riskier than others?
  • Discount Percentage: What discount level would require additional scrutiny?
  • Region and Sub-Region: Where are higher risk deals closed?
  • Number of Days from Deal Creation to Closure: If a large deal is created and closed quickly, was it reviewed and approved through the appropriate channels?

3) Risk rank transactions and perform testing.

After analytics are developed, companies should identify transactions that are anomalies or do not conform to expected patterns or other items in the data set. Consider assigning a risk ranking to these transactions and perform additional testing procedures. Using the example of sales transactions noted above, additional detail testing could involve inspecting the supporting sales documents, inquiry with the sales or partner representative, etc.
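To make the risk-ranking step concrete, here is a hedged SQL sketch of one way to flag discounting outliers for further testing. The table and column names (partner_deals, discount_pct, region) are assumptions for illustration, not a real schema:

```sql
-- Hypothetical sketch: rank partner deals by how far their discount
-- deviates from the regional norm (a simple z-score). Deals more than
-- three standard deviations above the mean are flagged for testing.
with regional_stats as (
  select region,
         avg(discount_pct)    as mean_discount,
         stddev(discount_pct) as sd_discount
  from partner_deals
  group by region
)
select d.deal_id,
       d.region,
       d.discount_pct,
       (d.discount_pct - s.mean_discount)
         / nullif(s.sd_discount, 0) as risk_score
from partner_deals d
join regional_stats s using (region)
where (d.discount_pct - s.mean_discount)
        / nullif(s.sd_discount, 0) > 3
order by risk_score desc;
```

In practice the score could blend several of the risk factors noted above (deal size, CPI score of the region, days from creation to closure) rather than discount alone.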

4) Use analytics to provide proactive alerting of high-risk transactions.

Share information learned with your compliance team to determine how to deploy analytics for ongoing monitoring at scale. How can corruption be prevented in the future? How can corruption be detected earlier in the sales cycle?

With Tableau, you can set up data-driven alerts. For example, let’s say you want your compliance team to be notified when a pipeline partner-led deal closes in a country with a low CPI score and exceeds a discount percentage of 50%. By establishing that threshold with an alert, you can automatically send notifications to the team tasked with monitoring these transactions, enabling them to take action based on data.

Data analytics is a powerful part of a larger anti-corruption program. Analytics can detect corruption red flags and prevent corruption schemes, helping you avoid potential fines, investigations, and immeasurable damage to your reputation. Applying these four analytics considerations to your compliance program helps you monitor and audit transactions in real time, improving anti-corruption efforts in an effective and powerful way.

To learn how to take a proactive approach to exposing risk and improving anti-corruption prevention, visit the Tableau Audit, Risk, and Compliance Analytics solutions page.


It’s a Friday morning and Tableau volunteers are helping sort produce at FamilyWorks Food Bank in Wallingford, Seattle. Several volunteers biked from Tableau headquarters, just a mile south of the food bank. FamilyWorks is an important community partner for many Tableau employees. In addition to regularly hosting Tableau volunteers, FamilyWorks has received Community Grants from Tableau Foundation for the last five years.

Since launching Community Grants in 2015, Tableau Foundation has granted over $800,000 to organizations, like FamilyWorks, that are working to make the communities where we have offices healthier, stronger, and more vibrant for everyone. Our goal is to partner with and support our local nonprofits. We are always looking for ways to make our processes more efficient, so that nonprofits can focus on what matters most. One of Tableau’s core values is to “keep it simple,” and we apply that philosophy to our Community Grants as well. We keep the application short and straightforward so that nonprofits don’t have to devote hours that they could otherwise spend serving clients. Additionally, our grant funding is general and unrestricted, which gives nonprofits flexibility in how they choose to use the resources.

“When Tableau first arrived in the neighborhood, they sought out local nonprofits like FamilyWorks to ask how they could best support the community. That was refreshing! Since that time, our partnership has grown in meaningful ways and is significant for us in our ability to offer nutritious food and essential resources—through their grants, group volunteering, technical assistance, and in-kind support. They have made a positive imprint in strengthening the community.” – Jake Weber, Executive Director, FamilyWorks

Volunteer groups of Tableau employees lead the Community Grant process from local offices. It’s an opportunity for employees to direct resources to organizations they are passionate about and to get involved in their communities. This is especially important for young professionals and employees who are new to a city. With so many great causes, it can be hard to decide where to spend your paid volunteer day off. Community Grants allow employees to see firsthand the good work and services accomplished by nonprofits in our communities.

This year we will donate nearly $275,000 of unrestricted funding in a record number of communities—Austin, Boston, Dublin, Kirkland, London, Palo Alto, Seattle, Singapore, Vancouver, and Washington DC. Applications are now open for nonprofits in these cities—so help us spread the word to the nonprofits in your community that matter most!

Apply now for a Tableau Foundation Community Grant

The deadline for applications is July 19, 2019. Don’t delay! You can find more information and apply here.

For more information about past grantees or about Tableau Foundation, our Living Annual Report (below) offers a transparent view into the organizations we’ve helped support.


Today, we kicked off our sixth year of Tableau Conference Europe with our Tableau customers, partners, and amazing community—to talk data culture and exciting new additions to the Tableau platform. It's all driven by our guiding mission to help people see and understand data—to decipher fact from fiction, to create a community of shared knowledge, and to help people move onto their next breakthrough, faster.

Parallels between lasting cultures and modern business

James Eiloart, Senior Vice President of Tableau EMEA, jump-started the conference by exploring parallels between lasting cultures throughout history and cultures of data within modern organizations. He explained three key traits of lasting cultures: they have a shared language, they have a way to share, and they are adaptive.

First, cultures can’t survive without a common language. From gestures to symbols to words, we use language to communicate and align on common goals. In a Data Culture, leaders must invest in creating a data-literate workforce where people use data as their common language to make an impact. James used the example of a commercial real estate firm, JLL, which attributed $35 million in benefits to creating a data-literate workforce.

Over the course of history, our ancestors came together to share stories, experiences and knowledge, and together they gathered new wisdom. We see this community of sharing echoed in Tableau communities, where people continue to inspire, support, and lift each other up. Community is the heart of an organization’s Data Culture, bringing people together under a shared mission.

Lasting cultures are adaptive, evolving to meet the changing needs of the environment. This means that cultures build the rules and guardrails needed to establish a balance between freedom and regulation. This is also true in modern organizations where an effective Data Culture means you have the right balance between governance and access.

Building a Data Culture at Credit Suisse

Homa Siddiqui, Head of Digital Transformation and Product Labs at Credit Suisse, shared some of the successes and challenges Credit Suisse has experienced in their journey to build a Data Culture. It started as a top-down mandate from senior leadership, but it quickly grew because of the excitement on the ground—all supported by buy-in and advocacy from leaders across the organization. Homa reinforced that data ubiquity is a must when it comes to the future of the banking industry and modern business in general.

Introducing Tableau Blueprint: Your step-by-step guide to building your Data Culture

James then announced the new Tableau Blueprint—your step-by-step guide to building the capabilities you need to cultivate a Data Culture and accelerate organizational change. We created Tableau Blueprint by bringing together best practices from tens of thousands of organizations around the world. Adam shared how we’ve found that technology is only one part of a successful Data Culture; focusing on your people to drive behavior change, ensuring they make decisions with data, is what will make your Data Culture successful. Learn more about Tableau Blueprint.

New product innovations: Explain Data, Project McKinley, Tableau Catalog, and more

Chief Product Officer Francois Ajenstat highlighted upcoming Tableau platform innovations and made a few announcements along the way, including the upcoming Tableau 2019.3 beta. Francois shared how we’re doubling down on innovation in three core areas.

Analytics for everyone: Ask Data and AI-powered Explain Data
Analytics for everyone means that every person within your organization feels empowered to make decisions based on data. Earlier this year, we released Ask Data, letting you ask questions of your data in a conversational and intuitive way. And now we’re announcing Explain Data, coming in Tableau 2019.3, which leverages the power of artificial intelligence to provide explanations for unexpected values in your data. With Explain Data, you can answer “why” faster and discover insights you may have never found before.

Analytics at scale: Project McKinley, encryption at rest
As more and more people use data within your organization, there’s a greater need to manage analytics at scale. We continue to develop new enterprise features that make this easier, including encryption at rest for extracts, enabling organizations to store sensitive data in the platform while minimizing the risk of access by unauthorized users. Francois also introduced Project McKinley, which provides enhanced security, manageability, and scalability for Tableau Server. With Project McKinley, it’s simpler to run large, mission-critical Tableau Server deployments, allowing you to react to the changing needs of your business and to save time by streamlining the management process.

Analytics built on trusted data with Tableau Catalog
A successful analytics environment must be built on trusted and accurate data. That’s why we’re excited to announce Tableau Catalog! As part of the Data Management Add-on, Tableau Catalog makes it easy to get a complete view of the data being used in Tableau. Data owners can automatically track information about the data (i.e., metadata), including user permissions, usage metrics, and lineage. There’s no need to set up indexing schedules, configure connectivity or reset permissions—it’s all done automatically.

Explain Data, Project McKinley and Tableau Catalog will be available in the Tableau 2019.3 beta, coming soon! Join our beta program to try these new capabilities.

There were so many jaw-dropping moments today and we’re excited for even more over the course of these next few days. We can’t wait to see what you accomplish with these new innovations.

Note: Many of the capabilities and products discussed in this blog post are not currently available and may not be ready on time or at all. We advise customers to make their purchase decisions based upon features currently available and upgrade to the latest version of Tableau to enjoy the latest capabilities.

Adapted from a conversation between Andy Cotgreave, Technical Evangelist at Tableau and co-author of The Big Book of Dashboards, and data storyteller RJ Andrews, creator of Info We Trust. Skip to the bottom of the article for more resources.

Historic data visualization still matters. Consider why these visualizations were created. Historic practitioners faced the same challenges we do today: data overwhelmed their capacity to manage it, and the audiences they had to inform lacked graphic, numeric, and statistical literacy. Historic viz was created to understand the world better.

Yet historic practitioners worked to meet these similar challenges under different constraints than ours. They had different data, different tools, and a different environment. The intersection of their similar pursuit with such different conditions produced many interesting designs. Their natural solutions are less likely to emerge in today’s digital environment, where we work under our own unique constraints. Certain creative paths open at different moments in history.

This tour presents eight examples of historic visualization. Each one is brimming with design lessons relevant to today’s practice. These lessons highlight intentional and thoughtful aspects of design—decisions made by real people to better convey information. We have the luxury of hundreds of years of people pioneering our field. It is best to take advantage of their hard work.

Watch the livestream and read on for more information:

Livestream: Design secrets from influential data visualizations - YouTube

Timeless Style

As viable now as the day it was created.

1. LINEAR CHRONOLOGY was published in 1821’s Chronology of Events. The chart shows time series of economic factors from 1770–1820. It is the last known chart created by William Playfair.

Look past the aged paper and elegant script. See how the bold color palette distinguishes different categories. Squint and you might mistake it for something made by Google. Look closer. Each line is not only distinguished by color, but double-encoded with line style.

Lines are labeled where they cross the vertical axis, and along their ensuing curves. Timeline labels mark only the last numeral of intervening years for a clean effect. Annotations at the top and bottom of the chart provide context for war years. Negative space at the end of the chart looks toward the future. This chart was in fact updated—recreated after Playfair’s death—with fresh data.

A brilliant yellow legend orients you to the chart's unique vertical axis. Each line shares the same numerals (0–120). But there is a different unit for each series: tens of millions, millions, pounds, and farthings. This creative design is not a best practice because it creates meaningless intersections. But knowing it might help you someday.

2. DIAGRAMME DES RECETTES ET DÉPENSES (Diagram of Income and Expenditure) was included in Émile Cheysson’s 1882 Album de Statistique Graphique. The album was part of an annual series published by the French Government. The albums are a gold mine of hundreds of examples of golden age visualization.

Before even translating this chart, we can admire its annotations and restrained color palette. Notice how the fine drop lines connect each mark to the baseline. It does not feel 140 years old.

Today, our data and tools bless us with incredible power. But we also suffer under our own particular digital constraints. One of the advantages that historic visualization had is that it was published on paper, allowing for a much larger canvas and higher resolution. Paper allows more colors. Historic constraints were different and that made different solutions possible.

Sublime Maps

Analytic maps demonstrate how exceptional historic work can be.

3. ACCÉLÉRATION DES TRAVERSÉES MARITIMES (Acceleration of Maritime Crossings) is part of Émile Cheysson’s 1888 Album de Statistique Graphique. This unique cartogram shows that the time to cross different seas is shrinking. It does this by making the geographic distance appear to shrink.

Across a tableau of views, the trip between Marseille and Corsica is the most interesting. See how travel time goes from 44 hours to only 15 hours between 1830 and 1887. How? In that time we transitioned from sail to steamships. Do not miss the intensifying browns across the set of Corsicas and beautiful blue waterlining.

Think about how you might create this today with modern tools. It would have to be a completely custom solution! When this map was created, doing something completely custom was not a barrier, because so much work was done by hand. Creative limits vanish when you are armed with only a pencil and a blank piece of paper.

4. THÉÂTRES DE PARIS (Theaters of Paris) is the last example from Émile Cheysson’s Album de Statistique (1890). Each fan diagram represents a different theater in Paris, with one wedge for each year. The total area of each wedge corresponds to that year's gross receipts. You can make macro comparisons between the overall size of each theater's diagram. Then, see micro comparisons across any individual theater’s operation.

The fan is a wonderful form for this data. Its area packs in larger values, allowing the display of many tiers of business. Its natural center orients us to each theater while still keeping the timeline’s left-to-right convention. Missing wedges indicate that some theaters opened during the period under examination. Further, there is a visual link between the shape of the fan and the theater audience: audiences sit in a fan configuration, oriented toward the stage, and to keep cool in a packed house (and signal social cues) they fan themselves.

Years with a Parisian Exposition Universelle (World Fair) attracted more visitors to the city, and theater business often went up correspondingly during the expo. Golden yellow highlights these expo years; they anchor each fan’s boundaries. Intervening red years sport a neat dashed summary line, whose average of ticket sales carefully omits the outlier exposition years. Dense annotation calls out values. Text scales with the size of each diagram, and year labels angle with the wedges. Magnifique!

Notice how the diagrams are powerfully salient without occluding their basemap. A neighboring bar chart provides extra context in a more standard form.

Design Details

Small flourishes can make the difference.

5. TERRITORIAL EXTENT AND POPULATION by Alexander von Humboldt (1811) may present the first stacked bar chart. Each pair of stacks represents an empire (Spanish, British, Turkish, Russian). The big, fat stack shows the territorial area of each empire’s different lands. The skinny bars represent the corresponding populations.

The block on top of the British Empire stack represents Britain’s American colonies, the recently liberated United States. Its block breaks off from its mother empire. Imagine the moment Humboldt decided to cock that little rectangle to represent the Revolution. This small design flourish gives the whole piece life.

6. INDEX NUMBERS was published in Willard Cope Brinton’s second book, Graphic Presentation (1939). The design detail of note is this chart’s unique baseline. Today it is bad practice to accentuate a non-zero baseline because it implies undue importance. Today’s standard practice has us style all non-zero lines the same as other grid lines because they all share the same significance.

Brinton goes further. He intentionally calls attention to the non-zero baseline by making it wavy, almost like a torn sheet of paper. Without this emphasis a reader might miss that there was a non-zero baseline. Doing it this way calls the viewer's attention to this aspect of the chart. It is a best practice that deserves revitalization.

Pictorial Elegance

Illustrations that inform.

7. HELMETS is by Bashford Dean (c. 1917), a zoology professor and the man who established the Metropolitan Museum of Art's Department of Arms and Armor. You can see his two areas of expertise combine in this Darwinian evolutionary tree of medieval helmet design. The branched structure looks like a biological evolutionary tree. It reminds us that technology morphs over time too.

The illustrations let us know at a glance that the topic we are dealing with is the age of knights. The dots that link the helmets suggest some logical relationship between them. Further inspection reveals that connected helmets look similar to one another. Imagine how much less engaging—and less informative—this piece would be if each helmet were represented by a circle instead of an illustration.

This chart prompts so many questions: why did helmets evolve? Why did certain branches wilt? Were they overly specialized? Did fashion change? Did tools change? Did metallurgy change? Did weapons change? Were individuals changing their helmets or were centralized designers controlling these decisions?

8. THE OCEAN SHRINKS was created by the Isotype Institute (c. 1943) for the book Only An Ocean Between. It shows how the Atlantic Ocean apparently shrank with faster transportation. It reminds us of Cheysson’s earlier travel-time cartogram.

The diagram and its styling appear simple, almost childish. The big idea hits you first: the world is getting smaller. Look closer. The use of color is sublime. See how the designers baked in all kinds of detail. Each wave represents a day of travel, and the most salient element, the red vehicle, changes form from boats to a plane. The city skylines grow (is that London with St. Paul's?). Yet these details do not distract from the big lesson.

Learn More

The canon of historic visualization is vast, and it can become overwhelming. As you explore the following resources, keep in mind their originating context. They may be static, but they are still interactive. Many examples were designed for a more patient audience. Take time, read the details, zoom in and out with your eyes. Historic viz was not designed for social media feeds, and its original canvases were not pocket-size screens. These works were often printed on sheets of paper larger than your biggest monitor. If you find a favorite piece, consider printing it out at the original size; enjoying it like this will help the viz come back to life. Give these works your attention and you will be rewarded handsomely.

Follow RJ @infowetrust and Andy @acotgreave on Twitter.



Location data can come in many forms and formats, but one of the most common formats is in separate fields containing latitude and longitude values. This data might reside in an Excel file, CSV, or any other table of data. These values could represent anything—the position of a physical location, object, Wi-Fi hotspot, wildlife, or literally anything else on the planet.

For people trying to understand their location data and to see spatial patterns, Tableau is adding two spatial functions that will make analyzing location data easier than ever. The spatial functions are called MakePoint and MakeLine. While their names are self-explanatory, permit me a few paragraphs to showcase what is now possible with maps in Tableau.

Use longitude/latitude fields in a spatial join to map police incidents using MakePoint

Let’s start with MakePoint, a function that allows you to use latitude and longitude fields in a spatial join. Spatial joins let you use points and polygons to join datasets based on their spatial relationship, and MakePoint is the function that can...make points! Another way to think about this function: it turns non-spatial data sources like text files and Excel tables into spatial data sources, letting you spatially aggregate your data.

In this example, I am going to explore the city of San Francisco’s open data for crime incidents by neighborhood. To get started, I visited data.sfgov.org and downloaded a neighborhood shapefile and the latest incident report data as a CSV file. The incident dataset includes latitude and longitude fields and a number of other categories like Police District or Incident Type. My questions: which neighborhood has the most reported incidents, and which neighborhoods have the highest risk of theft? Look at how easy it is to answer these questions now. After connecting to the shapefile, I can add a connection to the CSV and use MakePoint in a join calculation, which lets me use Intersects to complete the spatial join.
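Conceptually, an Intersects join tests each incident point against each neighborhood polygon and keeps the matches. Tableau (and PostGIS) do this for you at scale; the sketch below is only an illustration of the idea, with a made-up rectangle standing in for a neighborhood and invented incident coordinates:

```python
def point_in_polygon(lat, lon, polygon):
    """Ray-casting test: is (lat, lon) inside the polygon?

    polygon is a list of (lat, lon) vertices in order.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        y1, x1 = polygon[i]
        y2, x2 = polygon[(i + 1) % n]
        # Count edges crossed by a horizontal ray cast from the point.
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Made-up rectangle standing in for a neighborhood polygon,
# plus two made-up incident points (only the first falls inside).
neighborhood = [(37.748, -122.426), (37.748, -122.403),
                (37.765, -122.403), (37.765, -122.426)]
incidents = [(37.7599, -122.4148), (37.8044, -122.2712)]

# The "Intersects" join: keep incidents that fall inside the polygon.
joined = [p for p in incidents if point_in_polygon(*p, neighborhood)]
```

In Tableau the equivalent matching happens inside the join itself: MakePoint turns each row's latitude/longitude pair into a point, and Intersects pairs that point with whichever neighborhood polygon contains it.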

With the data joined, I can use Number of Records to visualize which neighborhood has the most crime incidents and use a filter to answer my question about theft.

Source: https://data.sfgov.org

Connecting origins and destinations for flight routes using MakeLine
The second function we added to Tableau is called MakeLine. This function is useful when you want to visualize the connection between two points on a map. If you have ever built an origin-destination map or worked with pick-up and drop-off datasets, MakeLine makes connecting the dots simple: you do not have to replicate rows of data or perform any pre-processing, so long as you have the latitude and longitude for the start and end of each record. A great use case for origins and destinations is flight routes. You will need the departure city or airport’s latitude and longitude, along with the latitude and longitude for the arrival city or airport. My dataset includes a number of routes between major cities in Asia; using MakeLine I can now visualize the flight routes.
After I create the calculated field, I can simply drop that field onto the canvas and we’re ready for take-off. MakeLine generates geodesic lines, which display as curved when connecting locations that are far apart on the earth. When working with data at a local or city scale, like pick-up and drop-off locations for a bicycle share, the lines will appear straight.
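Those curves appear because a geodesic follows the great circle, the shortest path over the earth's surface rather than a straight line in latitude/longitude space. A quick haversine sketch shows the great-circle distance for one long route (the airport coordinates are approximate and purely illustrative, not taken from the article's dataset):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle (geodesic) distance in kilometres between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Approximate coordinates for two airports on a long Asian route.
tokyo = (35.55, 139.78)      # Haneda, roughly
singapore = (1.36, 103.99)   # Changi, roughly

d = haversine_km(*tokyo, *singapore)  # about 5,300 km
```

Over a distance like this, the great-circle path deviates visibly from a straight line on the map, so the route renders as a curve; over a few city blocks the deviation is imperceptible, which is why bike-share lines look straight.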

Get more out of your location data to reveal hidden spatial patterns. Spatial calculations make geospatial analysis simple and remove the need for pre-processing and shaping of data. Spatial functions represent another building-block capability in Tableau, and we can’t wait to see what you build with them. Members of the Tableau community are already sharing some of their examples. Marc Reid built an interactive dashboard to look at flights from different origin cities.

The new #makepoint and #makeline spatial calcs in #Tableau pic.twitter.com/TZeczKzTzi

— Marc Reid (@Marc_DS5) May 28, 2019

Jeffery Shaffer took some community inspiration and built an exploration of ride-share data using set actions, vector maps, and spatial calculations.

I want to encourage everyone to stay tuned for the 2019.3 beta, as we expect to release more spatial functions for early testing. Until then, go forth and make beautiful maps!

Editor’s note: This is the second post in a series about visual design with data. Read the first here, and check back soon for the next!

Last time, we talked about the information communication triangle that connects the creator, the audience, and the information product. We pointed out that the success or failure of our work as creators is truly dependent on what happens on the side of the triangle over which we have no direct control: the relationship between the information and the audience.

Notice that we’re saying that we have no direct control over that relationship. While this is true—nothing we do can directly change that relationship—everything we do is an attempt to indirectly affect that relationship.

We do that by using our direct control over our information product—making it as appealing, accessible, interesting, and useful to the audience as possible.

And how do we do that? By using the things we know about the audience, based on our direct relationship with them. We consider their motivations, their prior knowledge, and their interests—as well as some basic human tendencies—to create something specifically built for them.

So when we create a product, it takes more than just following recipes, tutorials, or decision trees of chart choices. To create an effective information product, we cannot be merely technicians.

What we are not

We are not automatons, looking through Ye Olde Booke of Beste Practices to see what chart matches up with what data, and what color palette is the “right” color palette to use, based on the available criteria.

But neither are we merely artists (or, to use a modern recasting of the word, “creatives”), restlessly in pursuit of the magical aesthetic that will dazzle in a way no audience has ever been dazzled before.

We are not pure analysts either, assessing the data with an intense objectivity and providing findings with as little fanfare or editorial contextualizing as possible.

What we are

We are not creatives, but creators: builders of experiences specific to a time and place, to an audience, to a subject and context, to a scope, to a goal.

To be successful in doing this, we can’t position ourselves definitively above all else as aesthetes or analysts or technicians or psychologists or scientists or coders. Rather, we have to continually become more of all of these things, together and interwoven. This will give us the power to use all of the various skills, tactics, and techniques of these fields, in whatever proportions are appropriate for each individual information experience we create.

OK, pal, calm down. It’s just a chart.

There are those who would dismiss such a statement out of hand, who would claim that we are making grandiose, self-important claims about data visualization; that we are elevating the humble task of making charts (“Excel does this automatically”) to a level of importance far beyond what it deserves.

That is why there is so much bad design in data visualization.

To think that “it’s just a chart” or “it’s just a dashboard” is to mistake the tool for the goal, the medium for the message. And the tool doesn’t make the product—you do.

For instance, I can buy a sewing machine, but that doesn’t mean I automagically know how to make a suit. I can buy a table saw but I can’t build a house. I can even watch videos online telling me how to do these things, but I won’t know what about a suit makes it appropriate for the wearer or the situation. And I won’t know all the things about home construction that make a house legal, safe to live in, durable, comfortable, and so on.

Tools exist for skilled practitioners to express their vision more accurately—not to eliminate skilled practitioners from the creation equation entirely.

What do we mean when we say “design?”

We did mention one reason there’s so much bad design in data visualization. Another reason that “design” causes so many disagreements in data visualization is that different people in the conversations are using “design” to mean different things.

Design is a loaded term. It means many different things in many different contexts.

Maybe most commonly, in general usage, it’s a synonym for “aesthetics.” In some people’s definitions, design is a term solely reserved for how things look, while others might also include how things feel. Some might even go so far as to include look, feel, and “how this makes me feel.”

For an architect, it’s precision and science as well as aesthetics—a building’s design must include the engineering parameters that make that look and feel possible.

For an app developer, design includes the functionality of the product, the interactions and feedback that make the product usable, and even the motivations behind the way the product is used (e.g., “it’s designed to make you want to stay and play one more game”).

Technical designer. Sound designer. UI designer. Interior designer. Graphic designer. Product designer. Interactive designer. Multimedia designer. Publication designer. Set designer. Instructional designer. Studio designer. Network designer. Structural designer. Exhibit designer. Digital designer. Fashion designer. Editorial designer. Color designer.

All of these job titles exist, and yet they have wildly different responsibilities, concerns, and mandates. How can we realistically use one word, “design,” to cover all of these things?

A specific phrase for a specific idea

If we’re going to talk about what “design” means in the world of data visualization, we should find a way to be more specific in our language—to find a term that is harder to misconstrue and can be universally understood.

That’s why we’re using the term delta design.

Because our way of talking about design in data visualization, and our concept of “success” in data visualization, need to change.

The Delta Design Philosophy: four goals for success

Our general definition of success in delta design is to provide an information experience to an audience that materially benefits them.

More specifically, the delta design concept comprises four distinct goals necessary for an information experience to be considered successful.

  1. Attention and engagement
  2. Usability
  3. Comprehension
  4. Retention and recall

(In truth, there is a fifth, unstated goal, which is “actionable change,” but rather than a goal that we as creators can set out to achieve on its own, it will follow from the successful achievement of the four main delta design goals.)

More simply, delta design is about presenting your product to an audience in such a way that they can answer “yes” to all four of these questions:

  1. Am I interested?
  2. Can I use it?
  3. Do I understand it?
  4. And later on, do I remember it?

What am I supposed to do with this?

Here’s the important part about delta design: anybody can do it. It is the truest answer to the question so many dataviz creators ask: “How can I become a better designer?” We hope these posts will help to get you there.

Because delta design isn’t a magic trick or a fad diet. It’s a different philosophy of how to approach our dataviz work. It takes practice and focused effort and a new way of thinking about creating data visualizations.

There is no prerequisite, no bright-line test, no minimum level of aptitude in one specific skill area. Remember the various roles we mentioned earlier? Aesthetes, analysts, technicians, psychologists, scientists, and coders? All of us—those of us writing these words and those of us reading these words—are all of those roles, in different measures.

We will all solve the challenges in front of us using the skills we have, and the skills we eventually develop. Delta design is about reframing the challenge so that a successful solution leads to the best outcome for the audience. If we lean on aesthetics, and our colleague leans on psychology, but we get to comparable solutions, that’s perfectly fine.

In our next few articles, we will be talking about these four goals in more detail, including how you, as the creator, can reset your perspective and learn specific tactics to achieve these goals more easily and more frequently.

This is the second post in a series! Read the first post and stay tuned for the next!


Let’s set the record straight. Providing women and girls with equal access to education, health care, decent work, and representation in political and economic decision-making processes is important for the world. Societies have advanced their gender equality and women’s empowerment initiatives, but women and girls continue to suffer from discrimination and violence. And this needs to change.

Equal Measures 2030 (EM2030) is a joint effort of leading regional and global organizations from civil society and the development and private sectors working to connect data and evidence with advocacy and action on gender equality. This coalition includes partners like the Bill and Melinda Gates Foundation, KPMG, ONE Campaign, Asia-Pacific Resource and Research Centre for Women (Arrow), Data2X, Plan International, and others. Their mission is to ensure girls’ and women’s movements, advocates, and champions have access to the robust data and evidence they need, when they need it, in an easy-to-use format, so they can drive faster progress towards the gender equality commitments in the Sustainable Development Goals (SDGs). Data—especially about the lived realities of girls and women, about what is working and where we’re falling behind—has the power to guide and drive ambitious policy, hold governments accountable, and change the laws, policies, and budget decisions needed to eradicate gender inequality.

Equal Measures 2030 and Tableau Foundation join forces to visualize gender equality data in 129 countries

In 2015, world leaders from 193 countries committed to achieving gender equality by 2030 for every girl and every woman when they signed on to the United Nations Sustainable Development Goals (SDGs). On June 3 at the 2019 Women Deliver conference, Tableau Foundation partner EM2030 released the SDG Gender Index, the most comprehensive accountability tool available to explore gender equality in 129 countries aligned to the UN’s SDGs.

The Index, displayed through interactive data visualizations using Tableau technology, is now accessible on EM2030’s Gender Advocates Data Hub, an online platform designed for advocates working to encourage countries across the world to make faster progress on gender equality laws, policies, and budget decisions. There you can explore their data visualizations, compare country performances across regions, explore the SDGs based on thematic areas of interest, and read about the girls and women who are using data to drive action in their communities.

“We believe that data has real power when it is in the hands of advocates who are working to drive change on gender equality in their countries and on the issues they care about,” says Alison Holder, Director of Equal Measures 2030. “We know that data can drive accountability, bring light to hidden issues, spark debate, and tell the story of progress for girls and women.”

The Index took shape through EM2030’s work with partners across seven initial focus countries—Colombia, El Salvador, India, Indonesia, Kenya, Senegal, and Tanzania—as well as dialogue with thousands of other stakeholders worldwide. Its development drew on the findings of two formal surveys: one with more than 100 policymakers and the other with more than 600 gender equality advocates. Together, these surveys deepened EM2030’s understanding of the demand for gender-related data and of the challenges and opportunities in connecting such data with advocacy and action for gender equality.

The 2019 SDG Gender Index covers 14 of the 17 SDGs and measures 129 countries in five regions on 51 gender-equality issues, ranging from health and gender-based violence to climate change and decent work. The next iteration of the Index will launch in 2021, with regular updates every two years until 2030.

“All over the world, we’ve seen that the most effective advocates are those that use data to frame complex problems as local, accessible, and solvable,” says Neal Myrick, Global Head of Tableau Foundation. “Equal Measures 2030 has done a terrific job designing the SDG Gender Index and Gender Advocates Data Hub to really help local leaders make a meaningful difference in the lives of women and girls.”

Leveraging data and analytics in advocacy

During the Women Deliver 2019 Conference, EM2030 held training for national partners on how to use and interact with data from the Index to fuel storytelling and advocacy. They also held media training for over 100 journalists at Women Deliver, who have a huge role to play in developing stories, interest, and attention around the issues of gender equality. Partners will roll out training in their own countries, demonstrating how to use the data to inform and make a strong case for advocacy.

“We want our data to be truly accessible to advocates and gender equality champions, whether they’re in the United States or Kenya, and whether they’re pushing for equal pay, ending gender-based violence, ensuring financial inclusion or any other issue affecting the rights of girls and women,” says Holder. “Tableau is helping us move away from charts, tables, and static graphs to tap into technology so we can easily track progress and ensure governments deliver on their promise of reaching gender equality by 2030.”

We are honored to join with EM2030 on this journey towards greater use of data in advocacy. We look forward to helping them bring the power of accessible, easily understood, and compelling data to advance the rights of women and girls all over the world.


You all know those people—they shine with solutions and creative approaches in the Forums or on Twitter. They invite new voices into conversations at User Groups, and they engage and inspire on Tableau Public. They are evangelists and leaders, and they represent the spirit and voice of the Tableau Community.

In our community, we call these champions ambassadors, and we need your help selecting the new group. Each year we choose new leaders to join our existing ambassadors to further our support of the Tableau Community. We ask them to serve a one-year term (with the option for more) and in return offer fun perks, access to resources, sneak peeks into what’s coming from our dev team, and exclusive swag just for them.

By nominating an ambassador, you're not only recognizing your data heroes. You're also having a say in who leads the Tableau Community forward—who gets empowered with resources and insider knowledge to help create or continue amazing programs and initiatives.

Do you know someone who should be a Tableau Ambassador? We want to hear from you! Nominate them now.

Welcome Student Ambassadors

We have been working with our friends on the Tableau Academic Team to figure out how we can join forces to establish and support student champions. It’s important that we support their professional development while growing the awareness of Tableau through this core group of students at specific institutions.

We are excited to share that in July we are launching the Student Ambassador Program, which will align with all our other ambassador groups! In the inaugural year, we plan to pilot this program at target institutions for students who attend university in the following countries: United States, Canada, Singapore, Australia, and the UK.

Unlike our other ambassador programs, Student Ambassadors will have the opportunity to become data experts and influencers on their campuses. They will attend a Data Viz Bootcamp organized by the Tableau Academic Team and learn how to speak about Tableau and teach others in on-campus workshops. Learn more about the new Student Ambassador program here.

What does it take to become a Tableau Ambassador?

Demonstrate Leadership and Evangelism.
Tableau ambassadors represent the spirit and voice of the community. They keep us honest and push us to do more. They teach and share while fostering positive and supportive behavior within the community. They give credit where credit is due and help nurture inclusivity. They shine the spotlight on new voices and innovative ideas, allowing us to reach more people and help them see and understand their data.

Ambassadors are selected to serve a one-year term in one of five branches:

  • Social Media champions, led by Marissa Michelotti
  • Outstanding Tableau User Group leaders, led by Jordan Scott
  • Community Forums heroes, led by Patrick Byrne
  • Visionary Tableau Public creators, led by Jonni Walker
  • And our new Student Ambassadors, led by Midori Ng and KJ Kim

There are no firm guidelines for selection, but you can only nominate someone to one specific group. Please choose wisely. We do not select Tableau Ambassadors based on the number of submissions received. While we do read and track all nominations, we also use internal metrics, employee recommendations, and the needs of our global community to determine the new group.

Nominations will be open from June 11, 2019, through July 12, 2019, at 5:00 pm PST.

We encourage you to learn more about all the ambassadors on our website and follow them on Twitter.


Hello Tableau Community,

We have big news to share: Tableau has entered into an agreement to be acquired by Salesforce.

I know, shocker, right?!?

This emerged as a really special opportunity to accelerate our mission. We’ve been at this for a while, from those original days at Stanford when Chris, Christian, and Pat came together with the crazy idea that they could make data understandable by ordinary people. That turned into a revolution with visual analytics that are now used by millions of people worldwide.

At the same time, despite the success, we have only touched a fraction of the people who can benefit. The opportunity is much bigger now than we originally realized sixteen years ago. There is so much more to do. Joining Salesforce will help us accelerate and expand the change in the world we are so fanatical about.

Following the closing, we expect to be operating independently under the Tableau brand. The passionate Tableau team will still be here to support you. We’ll still wake up every day, thinking about what we can do to help even more people see and understand data. We’ll still come to work every day, striving to build analytics products you love to use. And we’ll still be big supporters and participants in the amazing Tableau Community.

There is no better representation of what makes Tableau unique than the Tableau Community. Beyond of course the product, the Tableau Community has been both why and how we’ve succeeded together. How we’ve grown our careers. How we’ve helped our companies be smarter and more productive. How we’ve supported the organizations that make the world a better place.

The core elements of the Tableau Community won’t change. We’ll keep engaging with each other via Tableau Public, Tableau Conferences, and Tableau User Groups. We’ll still work together with the Tableau Zen Masters and Tableau Ambassadors who give back and provide mentorship and guidance for the Tableau Community.

I know this is a lot to get your head around, especially for those who have been on this journey with us the longest. You might be thinking the Tableau Community is going to lose what makes it so special. I don’t see it that way. I see this as an opportunity to amplify the great work we’ve started together. Part of what makes this a great match is Salesforce’s own passion for community. More than 1.4 million people in the Trailblazer Community come together to learn, connect, have fun, and give back. Sound familiar? Don’t get me wrong, I’m not suggesting there could be anything out there quite like the Tableau Community. But these look like pretty great allies in finding even more ways to make Community amazing.

Yesterday, our mission was to help people see and understand data. Guess what – same mission tomorrow, but we get to go faster and have even more impact together. Looking forward to seeing you at TCE and TC!

Together, we are Extraordinary,
Francois Ajenstat, CPO

PS - Below is a legally required disclosure notice
Additional Information and Where to Find It: The exchange offer referenced in this communication has not yet commenced. This communication is for informational purposes only and is neither an offer to purchase nor a solicitation of an offer to sell shares, nor is it a substitute for any offer materials that the Company will file with the U.S. Securities and Exchange Commission (the “SEC”). At the time the exchange offer is commenced, Salesforce and Purchaser will file a tender offer statement on Schedule TO, Salesforce will file a registration statement on Form S-4 and the Company will file a Solicitation/Recommendation Statement on Schedule 14D-9 with the SEC with respect to the exchange offer. THE EXCHANGE OFFER MATERIALS (INCLUDING AN OFFER TO EXCHANGE, A RELATED LETTER OF TRANSMITTAL AND CERTAIN OTHER EXCHANGE OFFER DOCUMENTS) AND THE SOLICITATION/RECOMMENDATION STATEMENT WILL CONTAIN IMPORTANT INFORMATION. STOCKHOLDERS OF THE COMPANY ARE URGED TO READ THESE DOCUMENTS CAREFULLY WHEN THEY BECOME AVAILABLE BECAUSE THEY WILL CONTAIN IMPORTANT INFORMATION THAT HOLDERS OF THE COMPANY’S SECURITIES SHOULD CONSIDER BEFORE MAKING ANY DECISION REGARDING EXCHANGING THEIR SECURITIES. The Solicitation/Recommendation Statement, the Offer to Exchange, the related Letter of Transmittal and certain other exchange offer documents will be made available to all of the Company’s stockholders at no expense to them. The exchange offer materials and the Solicitation/Recommendation Statement will be made available for free on the SEC’s website at www.sec.gov. Copies of the documents filed with the SEC by Salesforce will be available free of charge under the Financials heading of the Investor Relations section of Salesforce’s website at www.Salesforce.com/investor. Copies of the documents filed with the SEC by the Company will be available free of charge under the SEC filings heading of the Investors section of the Company’s website at investors.tableau.com/.
Forward-Looking Statements: This communication contains forward-looking information related to the Company and the acquisition of the Company by Salesforce that involves substantial risks, uncertainties and assumptions that could cause actual results to differ materially from those expressed or implied by such statements. Forward-looking statements in this release include, among other things, statements about the potential benefits of the proposed transaction, the Company’s plans, objectives, expectations and intentions, the financial condition, results of operations and business of the Company, and the anticipated timing of closing of the proposed transaction. Risks and uncertainties include, among other things, risks related to the ability of the Company to consummate the proposed transaction on a timely basis or at all, including due to complexities resulting from the adoption of new accounting pronouncements and associated system implementations; Salesforce’s ability to successfully integrate the Company’s operations; Salesforce’s ability to implement its plan, forecasts and other expectations with respect to the Company’s business after the completion of the transaction and realize expected synergies; the satisfaction of the conditions precedent to consummation of the proposed transaction, including having a sufficient number of the Company’s shares being validly tendered into the exchange offer to meet the minimum condition; the Company’s ability to secure regulatory approvals on the terms expected in a timely manner or at all; the ability to realize the anticipated benefits of the proposed transaction, including the possibility that the expected benefits from the proposed transaction will not be realized or will not be realized within the expected time period; disruption from the transaction making it more difficult to maintain business and operational relationships; the negative side effects of the announcement or the consummation of the proposed 
transaction on the market price of the Company’s common stock or on the Company’s operating results; significant transaction costs; unknown liabilities; the risk of litigation and/or regulatory actions related to the proposed transaction; competitive factors, including new market entrants and changes in the competitive environment, pricing changes, sales cycle time and increased competition; customer demand for the Company’s products and services and customer response to the Company’s subscription offerings; ability to attract, integrate and retain qualified personnel; the Company’s ability to protect its intellectual property rights and develop its brand; the ability to develop new services and product features; other business effects, including the effects of industry, market, economic, political or regulatory conditions, including expenditure trends for business analytics and productivity tools; future exchange and interest rates; changes in tax and other laws, regulations, rates and policies, including those related to the provision of services on the Internet, those related to accessing the Internet and those addressing data privacy and import and export controls; future business combinations or disposals; and the uncertainties inherent in research and development.
Further information on these and other risks and uncertainties relating to the Company can be found in its reports on Forms 10-K, 10-Q and 8-K and in other filings the Company makes with the SEC from time to time and available at www.sec.gov. These documents are available under the SEC filings heading of the Investors section of the Company’s website at investors.tableau.com/.
The forward-looking statements included in this communication are made only as of the date hereof. The Company assumes no obligation and does not intend to update these forward-looking statements, except as required by law.