With more than 15 million members, XING is the leading online business network for professional contacts in German-speaking countries. XING has an advantage in providing local news and job information.

Because the services XING provides are based on acquiring, storing and manipulating high-quality, timely data, it’s essential for the company to be able to efficiently integrate the many sources feeding data into XING’s systems.

But XING had legacy systems that required hand-scripted data integration, which in turn made it hard to keep track of where data sets were generated. File formats also presented a challenge: much of the data was in Apache Avro, a format not usually supported by traditional data processing tools.

Handling a vast amount of data in a time-pressured event-streaming environment

XING evaluated several potential solutions for its data integration needs and selected Talend for multiple reasons: its open source approach, its wide range of connectors, its capabilities for metadata management and automated documentation, its fast adoption of emerging technologies, and its ability to enable the implementation of new use cases.

“Data analysis is key for the success of an online network. Talend helps us find in real-time the signals from our data to support decision-making process for a superior user experience.” Mustafa Engin Soezer, Senior Business Intelligence and Big Data Architect, XING SE

XING now uses Talend as the bridge between a 150TB MapR-DB NoSQL on-premise database and a 60TB Exasol database for analytics.

Connecting professionals to make them more productive and successful

Key benefits of XING’s new integration architecture include understanding the business better now that data is consolidated on one platform and being able to more efficiently run analytics and reports that support better decision-making.

“In addition,” says Soezer, “maintenance costs are reduced and productivity and efficiency have increased. Talend is helping us find insights and measure performance against KPIs. For example, we can now more quickly and accurately analyze data and extract metrics and KPIs that are used to drive business strategies all across XING. We also have better statistics on the number of daily and weekly active users, new job postings, the number of users who clicked on specific jobs, and more.”

More than 15 million users entrust their personal data to XING. XING, therefore, has a special responsibility to their customers, who all expect the social network to keep their data safe and to handle sensitive information confidentially. Talend is also helping XING to adhere to strict standards of corporate governance, data protection, and GDPR compliance.

“Online business networking is based on trust,” says Soezer. “So it’s critical for compliance to centralize and track metadata. With Talend, we’re centralizing all source and target systems, and we can analyze data to determine which data is relevant to which requirement. We can determine whether data is private or not. And we can take full control of our data as well as our metadata.”

The post How Social Media Network XING is Connecting Systems for Better Business Networking appeared first on Talend Real-Time Open Source Data Integration Software.


Today during Microsoft Build 2019, we announced that Talend Cloud, our Integration Platform as a Service (iPaaS), would soon be available on Microsoft Azure starting in Q3 2019. For those who have selected Azure as their cloud platform of choice, Talend Cloud running natively on Azure will provide enhanced connectivity options for modern data integration needs as well as better performance.

Cloud services are already (or will soon be) a critical piece of every organization’s digital business strategy. By moving to the cloud, companies are already reaping benefits such as faster provisioning, shorter time to market, flexibility and agility, instant scalability, and reduced overall IT and business costs, to name a few. Making the cloud data migration and integration experience simple is key to delivering trusted data at speed.

Let’s dive into a few more benefits of Talend Cloud on Microsoft Azure.

Accelerating Cloud Migration

Before we talk about building end-to-end integration, let’s first talk about getting your data to the cloud. Data migration into the cloud is perhaps one of the bigger hurdles to cloud adoption. If you’ve selected Azure as your cloud of choice, having a scalable iPaaS running on Azure is a must to get data into your cloud and start maximizing your investment quickly.

End-to-End Integration on Azure Cloud

What does Talend Cloud running natively on Azure mean for you? Put simply, a faster and easier integration experience. Whether you are loading data into SQL Data Warehouse or running it through HDInsight, Talend Cloud can help you create an end-to-end data experience faster by running natively on Azure.

Faster Time to Analytics and Business Transformation

Customers who use Talend Cloud services natively on Azure will experience faster extract, transform, and load times regardless of the data volume. Additionally, it will boost performance for customers using Azure services such as Azure SQL Data Warehouse, Azure Data Lake Store, Cosmos DB, HDInsight, Azure Databricks, and more.
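To make the load step concrete, here is a minimal sketch, outside of Talend, of what a bulk insert into Azure SQL Data Warehouse looks like at the ODBC level. It assumes the pyodbc driver; the server, credentials, and the stg_orders staging table are hypothetical placeholders. A Talend Cloud job would generate and schedule this kind of plumbing for you from a visual design.

import pyodbc

# Connect to a hypothetical Azure SQL Data Warehouse instance
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=yourserver.database.windows.net;"
    "DATABASE=yourdw;UID=loader;PWD=secret"
)
cursor = conn.cursor()
cursor.fast_executemany = True  # batch inserts instead of row-by-row round trips

# A tiny sample extract; in practice this would come from the upstream source
rows = [(1, "2019-05-06", 42.50), (2, "2019-05-06", 17.99)]
cursor.executemany(
    "INSERT INTO stg_orders (order_id, order_date, amount) VALUES (?, ?, ?)",
    rows,
)
conn.commit()
conn.close()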

Reduced Compliance and Operational Risks

Because the new data infrastructure offers an instance of Talend Cloud that is deployed on Azure, companies can maintain higher standards regarding their data stewardship, data privacy, and operational best practices.

What’s Next?
  • If you are a Talend customer, keep an eye out for the announcement of the general availability date of Talend Cloud on Azure in Q3 2019.
  • Not a current Talend Cloud customer? Test drive Talend Cloud free of charge or learn how Talend Cloud can help you connect your data from 900+ data sources to deliver big data cloud analytics instantly.
  • See three real-world use cases (data architectures included) of companies using Talend Cloud and Azure today by downloading this white paper.

The post Talend Cloud on Azure is Coming! Create A Faster, More Connected Cloud appeared first on Talend Real-Time Open Source Data Integration Software.


Today’s ever-increasing competitive market is forcing organizations to become more data-driven. To support key business objectives such as growth, profitability and customer satisfaction, businesses must digitally transform and become reliant on more and more data to make laser sharp decisions faster.

Per IDC, the global datasphere will more than quadruple, growing from 40 ZB in 2019 to a whopping 175 ZB by 2025 (Source: IDC Data Age 2025 white paper).

Mastering Data – One Pizza at a Time

Let’s look at how Domino’s Pizza transformed itself into becoming a digital company that sells pizza.

“We’ve become an e-commerce company that sells pizza. Talend  has helped us make that digital transformation.”

Dan Djuric, VP, Global Infrastructure and Enterprise Information Management, Domino’s Pizza, Inc.

Domino’s Pizza is the world’s largest pizza delivery chain. It operates around 15,000 pizza restaurants in more than 85 countries and 5,700 cities worldwide, and delivers more than 2 million pizzas daily.

In 2009, Domino’s Pizza was worth $500M. Today the business is worth $11B (20X in 10 years!). 

The story of their growth started six years ago, when they began coupling business transformation with digital transformation. They first reinvented their core pizza, and aggressively invested in building data analytics and a digital platform to reimagine their customer experience. Domino’s can now own their customer experience, optimize customer data and iterate quickly.

They also implemented a modern data platform that improves operational efficiency, optimizes marketing programs and empowers their franchisees with store specific analytics and best practices. 

Domino’s integrates customer data across multiple platforms—including mobile, social and email— into a modern data platform to increase efficiency and provide a more flexible customer ordering and delivery experience.

The company implemented a digital strategy by enabling customers to order pizzas on their favorite devices and apps, any way they want, anywhere. Domino’s knows each member of the household, their buying patterns and can send personalized promotions and proactively suggest orders.

Building Your Data-Driven Enterprise

According to Gartner, through 2020, integration tasks will consume 60% of the time and cost of building a digital platform.

The data value chain

To enable the business with data, you must solve two problems, speed and trust, at the same time and do it at scale.

The data must be timely, because digital transformation is all about speed, accelerating time to market – whether for real-time decision making or delivering personalized customer experiences. However, most companies are behind the curve. Per Forrester, only 40% of CIOs are delivering results against the speed required.

But speed isn’t enough, because the question remains: do you trust your data? For data to enable decision-making and deliver exceptional customer experiences, data integrity is required. This means delivering accurate data that provides a complete picture for making the right decision and has traceability: you know where the data is coming from.

This is also a major challenge for organizations. According to the Harvard Business Review, on average, 47% of data records are created with critical errors that impact work.

Companies that are digital leaders like Domino’s can rapidly combine new sources of data to produce insights for innovation or respond to a new threat or opportunity, because they can deliver accurate and complete data that the business can trust. And they do this at the speed required to compete and innovate. For these organizations, data has become a strategic differentiator.

To see how your team’s competencies match up to a digital leader’s, see this white paper from Gartner: Build a Data-Driven Enterprise.

The post The Secret Recipe for Digital Transformation? Speed & Trust at Scale appeared first on Talend Real-Time Open Source Data Integration Software.

Introduction

Among Talend’s blog posts are many outstanding ones on data governance, such as David Talaga’s “Life Might Be Like a Box of Chocolates, But Your Data Strategy Shouldn’t Be” which encourages us to know our data, and his two-part post on “6 Dos and Don’ts of Data Governance,” in which David offers steps to take and pitfalls to avoid when starting out on data governance. In “5 Key Considerations for Building a Data Governance Strategy,” my colleague Nitin Kudikala describes five data governance best practices and success factors.

I plan to add a few more posts to the category, focusing on specific Talend products that contribute to operationalizing data governance. Before I talk in detail about these tools and how they add value I feel it necessary to say something about data governance first, as a way of establishing a foundation on which to build.  

Appended to Rudyard Kipling’s children’s tale “The Elephant’s Child” (the story of how the elephant got his trunk) is this short poem:

I keep six honest serving-men
(They taught me all I knew);
Their names are What and Why and When
And How and Where and Who.

These questions, “What? Why? When? How? Where? and Who?” are fundamental to solution-seeking and information gathering. I’m going to use them to frame this introduction to data governance fundamentals. Part 1 covers the What, Why and Who. Part 2 will cover the When, Where, and most importantly, How. Keep in mind as you read that the “5 Ws and 1 H” are neither mutually exclusive nor as separate as their presentation suggests. They pop up together and continually throughout whatever journey to data governance you may take. Ready? Let’s get started!

The “What” of Data Governance

When talking with someone about a topic, it’s always worthwhile to confirm that all parties are aligned on what exactly they’re talking about, so let’s begin by establishing what data governance is. In a tie-in to the earlier poem, recall the parable of the blind men and the elephant, which originated in India (fig. 1).

Figure 1: Parable of the blind men and the elephant

There are many interpretations of this story, but the one that applies here is that context matters, and that the truth of what something is often turns out to be a blend of several observations and impressions.

<<ebook: Download our full Definitive Guide to Data Governance>>

What Data Governance is vs. What it is Not

To that end, I offer these definitions from noted luminaries within data governance, including our own product team. These four sources are the wellspring from which I drew most of my content:

  • DAMA’s DMBoK, via John Ladley: The exercise of authority and control (planning, monitoring, enforcement) over the management of data assets.
  • Steve Sarsfield: The means by which business users and technologists form a cross-functional team that collaborates on data management.
  • Bob Seiner: Formalizing and guiding existing behavior over the definition, production, and use of information assets.
  • Talend: A collection of processes, roles, policies, standards, and metrics that ensure the effective and efficient use of information in enabling an organization to achieve its goals.

Several important themes recur within these definitions:

  1. Data governance is not an IT-only function—it’s a partnership between the business and technologists. Indeed, John Ladley, a noted data governance consultant, advises against the CIO “owning” data governance.
  2. Data governance is a bit of a misnomer, as what’s really governed is the management of data. More on this momentarily.
  3. Data governance is cross-functional—it’s not done in isolation but rather pervades the entire organization. Once it’s live, no single group owns it.
  4. Data governance views data as assets—not in some nebulous figurative sense, but as real, tangible assets that must be formally managed.
  5. At its core, data governance is not about technology but rather about guiding behavior around how data is created, managed, and used. Its focus is on improving the processes that underlie those three events.
  6. Data governance is intended to be an enabling function, not exclusively a command-and-control one. Its purpose is to align data management practices with organizational goals, not be the data police.

Having established what data governance is, I’d like to say a few words about what it is not:

  1. It is not simply a project—it’s a program. What’s the difference, you may ask? According to the Project Management Institute, “A project has a finite duration and is focused on a deliverable, while a program is ongoing and is focused on delivering beneficial outcomes.” Projects have ROIs; programs are enablers—they’re required for other organizational initiatives to succeed. Let me hasten to add that the establishment, i.e., the “standing up,” of data governance is indeed a project, but its business-as-usual state is that of a program.
  2. It is not achieved through technology alone. It’s achieved through changes in organizational behavior. Technology plays a big part in getting there, but to succeed at data governance you need to establish and institutionalize the activities and behaviors the tool is supporting. It may seem strange that a person who works for a technology company is downplaying technology, but keep in mind my role as a customer success architect is to help ensure just that, and critical to the successful leveraging of any tool is the success of the program it supports.
  3. It is not a stand-alone department. As a program, it may have a Center of Excellence, but it’s not a distinct functional area. Instead, you’re putting functionality in place throughout the organization.
  4. It is not the same as data management. Management is concerned with execution, while governance is oversight—an audit and small-“c” control function. Data governance ensures the management is done right by establishing, maintaining, and enforcing standards of data management. Recall I mentioned earlier that data governance is a bit of a misnomer—what’s really governed is data management. Data governance is to data management what accounting is to finance: there’s the governed and the governor.
The “Why” of Data Governance

The benefits of governing data are many, but quite simply, those organizations that govern their data get more value from it.

Here are a few macro-level benefits of DG:

  • Data governance leads to trusted data. As Supreme Court Justice Louis Brandeis said, “sunlight is said to be the best of disinfectants.” By putting eyeballs on data, data governance enables better-quality data. When data quality goes up, trust in the data does too.
  • Data governance enables benefits at every management level by enabling and improving the processes around the creation, management, and use of data. Strategic benefits include aligning business needs with technology and data, better customer outcomes, and a better understanding of the organization’s competitive ecosystem. Tactical benefits include data silo-busting, i.e., greater data sharing and re-use, and timely access to critical data. Operational benefits include increased efficiencies and better coordination, cooperation, and communications among data stakeholders.
  • When deciding whether to do something, organizations commonly decide the value of that “something” by the extent to which it impacts the “Big-3”: revenue, costs, and risks. Governing data increases revenue, reduces costs and mitigates risk in manifold ways, but how it does so is highly specific to an organization. These particulars are identified when the business case for DG is developed, which I cover later.
The “Who” of Data Governance

There’s an old joke that asks, “How many psychiatrists does it take to change a lightbulb?” “Only one,” goes the answer, “but the lightbulb has to want to change.” Change—especially organizational change—is, of course, difficult, but with data governance, the juice is worth the squeeze.

I mentioned above that those organizations that govern their data get more out of it, so the glib answer to the “who?” question is “everyone.” Bob Seiner argues that “you only need data governance if there’s significant room for improvement in your data and the decisions it drives.” As you can imagine, he believes that description applies to most organizations.

Organizations that recognize data as a strategic enabler govern their data to ensure that data management responsibilities are aligned with business drivers. It’s also the case that organizations in heavily regulated industries, such as banking and financial services, are driven to implement data governance to ensure they’re doing the right things the right way. If you’re data-driven, you should govern that data.

Conclusion

In this post, the first of two on data governance fundamentals, I’ve discussed the what, why, and who of a governance program. What data governance is (and isn’t), why it’s worth doing, and who should govern their data. In part 2, I’ll conclude with the when, where, and how of data governance. Thanks for reading!

The post The Fundamentals of Data Governance – Part 1 appeared first on Talend Real-Time Open Source Data Integration Software.


PT Bank Danamon is Indonesia’s sixth-largest bank, with four million customers and a network that stretches across the archipelago.

Competition for customers in the Indonesian market is especially fierce, with 115 banks to choose from, according to recent government statistics. Finding a way to stand out is crucial. PT Bank Danamon has ambitious goals to more than double the number of customers making use of mobile and internet banking channels over the next year – up from 30 percent to 80 percent – and believes it can achieve that mark by harnessing the power of big data.

Differentiate on customer experience in a crowded marketplace

Though the bank has a lot of data, until recently it was held in around ten siloed data marts. Analysis was often performed within each silo, and it could be difficult to understand how particular decisions had been reached.

A shift by Indonesian banks towards real-time customer engagement became a catalyst for the PT Bank Danamon project. That meant overhauling the bank’s existing static digital channels to allow two-way communication with customers, and overhauling the data infrastructure underpinning those channels so that opportunities for engagement can be recognized and a personalized ‘next-best action’ recommended for each customer.

Talend, together with channel partner Artha Solutions, won the bank over. PT Bank Danamon liked that Talend could run on-premises or in the cloud. The new big data infrastructure consists of an on-premises Hadoop cluster and Talend, which ingests data from more than 40 source systems, including the core banking and credit card systems, and sets governance standards across the bank on how data is to be organized and used.

“Using data, we want to make sure once a customer opens a relationship with us, that our products and our bank remain top of mind.” Billie Setiawan, Head of Decision Management for Data & Analytics

It now takes half the time it previously did to produce reports, saving PT Bank Danamon additional time and money.

Real-time engagement drives change

The bank is using the big data platform for two initial use cases. Under the first use case, the platform is being used to build a 360-degree profile of customers in a bid to better understand their behavior and recommend products or services they might like based on propensity modeling.

The second use case for big data at the bank is detecting suspected fraudulent incidents faster. The fraud team can now quickly generate a list of customers “who the bank thinks are conducting fraudulent activity” for further investigation. The bank believes it can further improve fraud detection with big data, moving towards more proactive detection.

Further use cases are under development. For now, work is ongoing to bring all data users across the bank onto the new big data governed platform that gives IT administrators control over data, reports, roles and functionality permissions for all users. It will ultimately be a single source of truth for all data held by PT Bank Danamon. “Eventually, all our divisions will be accessing the same data from a single repository. From the data analytics team to finance, risk management, operations, and even our branch network, everyone will be using it,” Setiawan said.

The post How PT Bank Danamon Uses Data to Understand and Respond to Customers appeared first on Talend Real-Time Open Source Data Integration Software.


Gartner has recently released its 2019 Market Guide for Data Preparation ([1]), the fourth edition of a guide first published in the early days of the market, back in 2015, when Data Preparation was mostly intended to support self-service use cases. Compared to Magic Quadrants, the Market Guide series generally covers early, mature, or smaller markets, with less detailed information about competitive positioning between vendors but more information about the market itself and how it evolves over time.

While everyone’s first instinct with these kinds of documents might be to check the vendor profiles (where you’ll find Talend Data Preparation listed with a detailed profile), I would recommend focusing on the thought leadership and market analysis that the report provides. Customers should consider the commentary delivered by the authors, Ehtisham Zaidi and Sharat Menon, on how to successfully expand the reach and value of Data Preparation within their organization.

After digging into the report myself, I thought I’d share three takeaways that address our customers’ requirements in this exciting market.

Data Preparation Turns Data Management into a Team Sport

Self-Service was the trend that started the data preparation market. This happened at a time when business users had no efficient way to discover new data sources before they could get insights, even after they were empowered with modern data discovery tools such as Tableau or Power BI. They had to depend on IT… or alternatively create data silos by using tools like Microsoft Excel in an ungoverned way.

Data Preparation tools addressed these productivity challenges, in an area where reports have shown that data professionals and business analysts spend 80% of their time searching, preparing, and protecting data before they can actually turn it into insights. Data Preparation came to the rescue by enabling a larger audience to take on data integration and data quality management.

This was the challenge in the early days of the 21st century, but since that time data has turned into a bigger game. It is not only about personal productivity but also about creating a corporate culture for data-driven insights. Gartner’s Market Guide does a great job at highlighting that trend: as disciplines and tools are maturing, the main challenge is now to turn data preparation into a team sport where everybody in the business and IT can collaborate to reap the benefits of data.

As a result, what’s critical is operationalization: capturing what line-of-business users, business analysts, data scientists, or data engineers are doing ad hoc and turning it into an enterprise-ready asset that can run repeatedly in production in a governed way. Ultimately, this approach can benefit enterprise-wide initiatives such as data integration, analytics and Business Intelligence, data science, data warehousing, and data quality management.

Smarter people with smarter tools… and vice-versa

Gartner’s market report also highlights how tools are embedding the most modern technologies, such as data cataloging, pattern recognition, schema-on-read, and machine learning. This empowers less-skilled users to perform complex activities with their data, while automating tasks such as transformation, integration, reconciliation, and remediation as soon as they become repetitive.
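As a toy illustration of the pattern-recognition idea, the sketch below classifies a column by the dominant pattern of its values. Real data preparation tools use trained models and much richer semantic dictionaries; the regular expressions, the 80% threshold, and the sample values here are simply assumptions made for the example.

import re

PATTERNS = {
    "email":   re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "phone":   re.compile(r"^\+?[\d\s().-]{7,}$"),
    "zipcode": re.compile(r"^\d{5}(-\d{4})?$"),
}

def infer_semantic_type(values, threshold=0.8):
    """Return the semantic type matched by at least `threshold` of the non-empty values."""
    values = [v for v in values if v]  # ignore empty cells
    for name, pattern in PATTERNS.items():
        hits = sum(1 for v in values if pattern.match(v))
        if values and hits / len(values) >= threshold:
            return name
    return "unknown"

print(infer_semantic_type(["jane@example.com", "bob@example.org", ""]))  # -> email

Once a column is recognized as an email or phone number, a tool can suggest the matching standardization or masking function automatically, which is what makes these suggestions feel "smart" to the end user.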

What’s even more interesting is that Gartner connects this technology innovation to a market convergence, as stated in this prediction: “By 2024, machine-learning-augmented data preparation, data catalogs, data unification and data quality tools will converge into a consolidated modern enterprise information management platform”.

In fact, a misconception might have been to consider Data Preparation as a separate discipline geared towards a targeted audience of business users. Rather, it should be envisioned as a game-changing technology for information management because of its ability to enable potentially anyone to participate. Armed with innovative technologies, enterprises can organize their data value chain in a new collaborative way, a discipline that we at Talend call collaborative data management and that some analysts, including Gartner in the market guide, also refer to as DataOps.

Take Data Quality management as an example. Many companies are struggling to address their Data Quality issues because their approach relies too heavily on a small number of data quality experts from a central organization such as central IT or the office of the CDO. Although those experts can play a key role in orchestrating data quality profiling and remediation, they are not the people in the organization who know the data best. They need to delegate some of the data cleansing effort to colleagues who work closer to where the data is sourced. Empowering those people with simple data preparation tools makes data quality management much more efficient.

The value of the hybrid cloud

Gartner also heard growing customer demand for Data Preparation delivered through innovative Platform as a Service deployment models. What they highlight are requirements for much more sophisticated deployment models that go beyond basic SaaS. The report notes that “organizations need the flexibility to perform data preparations where it makes the best sense, without necessarily having to move data first”. They need a hybrid model to meet their constraints, both technical (such as pushing down the data preparation so that it runs where the data resides) and business (such as limiting cross-border data transfers for data privacy compliance).

This is a brilliant highlight, and one that we are seeing very concretely at Talend in the sophisticated hybrid-deployment requirements for our Data Preparation tool. Some of our cloud customers require running their preparations on-premises. Others want a cloud deployment, but with the ability to remotely access data behind the company’s firewall through our remote engines. Others want to operationalize their data preparations so they can run natively inside big data clusters.

Are you ready for Data Preparation? Why don’t you give it a try?

Enabling a wider audience to collaborate on data has been a major focus for Talend over the last three years. We introduced Talend Data Preparation in 2016 to address the needs of business analysts and line-of-business workers. One year later, we released Talend Data Stewardship, the brother in arms of Data Preparation for data certification and remediation. Both applications were delivered as part of Talend Cloud in 2017. In Fall 2018, we introduced a new application, Talend Data Catalog, to foster collaborative data governance, data curation, and search-based access to meaningful data.

And now we are launching Pipeline Designer. As we see more and more roles from central organizations or lines of business that want to collaborate on data, we want to empower those new data heroes with a whole set of applications on top of a unified platform. Those applications are designed for the needs of each of those roles in a governed way, from analysts to engineers, from developers to business users, and from architects to stewards.

2019 is an exciting year for Data Preparation and Data Stewardship. We added important smart features in the Spring release, for example extracting the parts of a name with machine learning, and splitting a field composed of several parts into its respective sub-parts based on semantic type definitions. We improved the data masking capabilities, a highly demanded set of functions now that GDPR, CCPA, and other regulations are raising the bar for privacy management. Stay tuned for other innovations coming this year that leverage machine learning, deliver more options for hybrid deployment and operationalization, or allow a wider range of data professionals and business users to collaborate on trusted data in a governed way.
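To give a feel for the two ideas mentioned above, here is a naive sketch of splitting a composite name field into sub-parts and masking a sensitive value. Talend's actual features rely on semantic types and machine learning; the splitting rule and masking format below are simple assumptions made for illustration.

def split_full_name(full_name):
    """Naively split 'First [Middle ...] Last' into sub-parts."""
    parts = full_name.split()
    return {
        "first": parts[0] if parts else "",
        "middle": " ".join(parts[1:-1]) if len(parts) > 2 else "",
        "last": parts[-1] if len(parts) > 1 else "",
    }

def mask_email(email):
    """Keep the first character and the domain; mask the rest of the local part."""
    local, _, domain = email.partition("@")
    return local[:1] + "*" * max(len(local) - 1, 0) + "@" + domain

print(split_full_name("Maria Anna Schmidt"))
# {'first': 'Maria', 'middle': 'Anna', 'last': 'Schmidt'}
print(mask_email("maria.schmidt@example.com"))
# m************@example.com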

Data Preparation is a hot topic. Do you want to know more? How about seeing these apps in action? Or simply give them a try.

The post 3 Key Takeaways from the 2019 Gartner Market Guide for Data Preparation appeared first on Talend Real-Time Open Source Data Integration Software.


This post is authored by Datagrate. Datagrate helps its clients launch, manage, and scale Talend application integration solutions, solving their most complex communication challenges.

When the Application Programming Interface (API) first came into existence, developers viewed it as a revolutionary approach to creating re-usable software fragments. Instead of creating new code from scratch for every new program, they could now use existing functionality to develop new features. Not only did this decrease the amount of time needed to deploy a program, it also meant they could leverage existing code which was already tried and tested.

Though the original concepts applied in software engineering do not have much in common with modern APIs, the fundamental idea has not changed much. Basically, developers use an existing code base to develop new programs. Advancements in technology have created countless practical use cases and opportunities for seasoned developers to set up competitive API frameworks.

Figure 1 Public API Growth – Source: programmableweb.com

The Role of the Internet in Shaping Current Trends

One major step forward came with the widespread adoption of the World Wide Web in the late 1990s. In addition to reusing APIs, software engineers were able to execute software remotely from any part of the world and get feedback in real time.

<<Download the Field Guide to Web APIs>>

As the web has continued to mature, so has the use of these resourceful tools. Since the start of the 21st century, more people have access to the internet. Moreover, the web has created a whole new market both for new and existing business entities. By virtue of these changes, people, devices, and business systems today need a standard for seamless communication. To address these and other needs, developers have created various API frameworks based on varying application scenarios.

Application Strategies for APIs

1. Creating an API to Solve Business Challenges

APIs are the connecting points that facilitate interaction between people, their devices and the digital systems they use. They are comparable to a waiter in a restaurant who connects the cook (software) to the diner (user), getting the order and communicating it so as to get food and deliver it.

Within an organization, one might want to get data from a given database and display it in a different application. An API endpoint will communicate with the database to draw out the required data and return it to the user. To illustrate this, consider how travel sites work. They gather information from the traveler about budgetary and cabin preferences as well as travel dates, among other details. They connect to airline APIs, which dip into airline databases and offer options for the trip.
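As a minimal sketch of that "waiter" role, the endpoint below takes the diner's request, queries the database (the kitchen), and returns the result. Flask and SQLite are assumed here purely for illustration; the /flights route and the flights table are invented and not part of any real airline API.

import sqlite3
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/flights")
def search_flights():
    """Return flights from a given origin within the traveler's budget."""
    origin = request.args.get("origin", "")
    budget = float(request.args.get("budget", "1e9"))
    conn = sqlite3.connect("airline.db")  # hypothetical local database
    rows = conn.execute(
        "SELECT flight_no, destination, price FROM flights "
        "WHERE origin = ? AND price <= ? ORDER BY price",
        (origin, budget),
    ).fetchall()
    conn.close()
    return jsonify(
        [{"flight_no": f, "destination": d, "price": p} for f, d, p in rows]
    )

if __name__ == "__main__":
    app.run(port=5000)

A travel site would call several such airline endpoints, for example GET /flights?origin=SYD&budget=500, and merge the responses before presenting options to the traveler.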

2. Creating an API as a Business Case

On the other hand, an API can in itself become a full business for your company. For instance, Twilio has built a “Communication API” that allows businesses to reach people over WhatsApp, SMS, or other channels. They sell their API to interested parties as a business in its own right.

Opportunities for API Implementation

With the above points in mind, take a look at the following enterprise strategies for APIs:

  • As an Information Platform – Companies with different data buckets, databases, or multiple cloud-based applications often need solid APIs. These enable them to collect data from different sources and expose it through a consolidated endpoint, giving the user a more holistic perspective of their business.
  • As a Product – This option requires the most experience, as it involves creating a unique product to fill a market gap. In this case, the design and purpose of the API is the core determining factor in the success or failure of the business. Its key benefit is competitive advantage, as the product will be new to the market.
  • As a Service – Modern enterprises usually have IT departments that include multiple systems and applications, each of which handles a specific role. Using an API to bundle the applications, each having a subset of features, is a great way to create a single all-inclusive service. Such services are reusable internally for the implementation of additional business needs.

API Integration in Practical Scenarios

In order to get inspiration for possible business applications of APIs, take a look at some of the ways the technology can be used in various business processes and sectors:

a)    eCommerce

eCommerce was among the earliest beneficiaries of API integration, and for good reason. The sector involves a high volume of transactions that require high speed. Third-party APIs are therefore a necessity and come in all conceivable forms. Credit card processing is one of the most common, but there is potential for numerous other innovations.

For instance, business entities can use exposed APIs to:

  • offer information about stock movement to partners
  • help in automatic transaction processing
  • allow access to customer portals

b)    Obtaining Customer Insights

For most business models, there are different departments, each of which has its own data pool. Notably, all such information is a valuable resource for understanding the customer. While CRM and ERP systems usually have more than a single database, this might be limited to financial or transactional aspects.

Using an API offers the opportunity to pull and combine all business data, presenting a comprehensive, real-time picture of the client. This combination of data and easy-to-digest presentation can give an entrepreneur a peek into different client personas. Armed with this information, they can tailor their approach to optimize the customer experience.
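As a small sketch of that "pull and combine" idea, the snippet below joins data from two hypothetical departmental APIs, a CRM service and an orders service, into a single customer view. The URLs, field names, and the "segment" column are assumptions, not real services.

import pandas as pd
import requests

# Pull raw records from two hypothetical internal APIs
crm = pd.DataFrame(requests.get("https://crm.example.com/api/customers").json())
orders = pd.DataFrame(requests.get("https://erp.example.com/api/orders").json())

# Aggregate orders to one row per customer, then join onto the CRM profile
spend = orders.groupby("customer_id", as_index=False)["amount"].sum()
customer_360 = crm.merge(spend, on="customer_id", how="left").fillna({"amount": 0})

print(customer_360[["customer_id", "segment", "amount"]].head())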

c)     IoT

Advances in technology have made it possible to computerize almost everything around us. Applying the API concept in this area facilitates real-time interaction between such devices and relevant parties. IoT devices connect such “things” as automobiles, thermostats, medical devices, and others within ecosystems. APIs come in handy by exposing the items as interfaces, allowing access to them through apps. By virtue of this, users gain control and can access the devices at any time and request live data.

d)    Data Governance

More and more organizations are looking for ways to share data across their enterprises. In cases where such organizations have a multi-shop model, the data being shared may vary from one environment to the next. For example, a business which offers both B2B and B2C solutions might have to offer the different customers different pricing information and other data.

An API can help in such situations if it is implemented as a layer on top of existing data management frameworks. It will act as a control center ensuring that the correct data flows to the correct customer.
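A toy version of that control-layer idea is sketched below: the same underlying product record, with the API deciding which pricing fields each caller may see. The segments and field names are invented for illustration only.

PRODUCT = {
    "sku": "A-100",
    "name": "Widget",
    "list_price": 19.99,       # what B2C customers see
    "wholesale_price": 12.50,  # restricted to B2B partners
}

VISIBLE_FIELDS = {
    "b2c": {"sku", "name", "list_price"},
    "b2b": {"sku", "name", "list_price", "wholesale_price"},
}

def product_view(record, segment):
    """Return only the fields the caller's segment is entitled to see."""
    allowed = VISIBLE_FIELDS.get(segment, {"sku", "name"})
    return {k: v for k, v in record.items() if k in allowed}

print(product_view(PRODUCT, "b2c"))  # no wholesale_price in the response
print(product_view(PRODUCT, "b2b"))  # full pricing for the partner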

e)     Real-Time Supplier Communication

For cloud application users, APIs are more of a necessity than an option. Some providers require the installation of a third-party endpoint to allow access to data. These applications have the objective of offering live updates, so the use of a reliable endpoint is key to ensuring real-time communication with suppliers. It could also mean getting updates faster than competitors, giving you an edge.

Figure 2 API Usage by Sector – Source: programmableweb.com

From Theory to Practice

Considering the above opportunities and practical use cases for APIs, creating a valuable application that solves a real market need seems achievable. One of the most important factors that determine the success (or lack thereof) has to do with identifying a business case for your potential API. A viable concept does not necessarily have to make headlines but needs to fulfill a specific need in an organization. The need should be the driver of the project.

Business changes such as acquisitions and mergers are a good example of situations that create a need. Changes of this sort create significant needs for new systems, data streams or applications.

To start with, a developer can create lightweight endpoints offering access to data or other functionality. With time, they can promote this to business partners, clients or other organizational departments.

In order to sell an API as a product or service, it is important to meet client expectations and deliver value for the investment. Some of the basic features that will make this possible are creating one that is easy to use, well-documented, reliable as an endpoint and future-proof.

Keep in mind, though, that many business operators consider reusability a major strength of a good API. As such, creating a flexible, generic API opens up a wide door of opportunity, as it offers increased output for the end user. While still on the matter of flexibility, remember that modern APIs have remote capability and can thus be used from any part of the world and on any device.

Possible Challenges in API Development

Whether you are setting up your very first custom API within your organization or creating an API ecosystem, there are 2 major challenges likely to stand in your way:

  • Business acceptance – The target users need to understand the benefits they are likely to gain from the API. Since APIs are intangible, this can be a serious challenge: they usually operate silently in the background as part of middleware. A good selling strategy to make the benefits obvious involves highlighting the flexibility, reusability, and generic design that your model offers. To enhance user acceptance, ensure that it offers the desired set of features and is not too complex or error-prone.
  • Technical maturity – Business operators often only notice the benefits of the model when it becomes unavailable. Much as this might seem like a desirable effect, operational reliability takes higher priority. Solid architecture and a reasonable level of experience will reduce the chances and frequency of system downtime. Ensure that it meets high quality standards before taking it to market.

Staying Ahead of the Competition

In recent times, there has been an increase in service providers looking to address business needs using various API designs. Companies no longer have to endure a one-size-fits-all approach, but rather, they use a best-of-breed approach. This means seeking out the best available software to meet their individual business challenges. For developers, software vendors and service partners, it underscores the need to ensure that applications offer flexible, comprehensive solutions which facilitate optimal integration.

<< Watch Next: Taming Complex and Hierarchical Data Structures>>

The post Time-Tested Insights on Creating Competitive API Programs appeared first on Talend Real-Time Open Source Data Integration Software.


We are in the era of the information economy. Now, more than ever, companies have the capability to optimize their processes through data and analytics. But while data analysis offers endless possibilities, there are still challenges in maintaining, integrating, and cleansing data to ensure that it empowers people to make decisions.

Bottom up or top down? Which is best?

As IT teams begin to tackle the data deluge, a question that is often asked is whether the problem should be approached from the bottom up or from the top down. There is no “one-size-fits-all” answer here, but all data teams need a high-level view of their data subject areas. Think of this high-level view as a map you create to define priorities and identify problem areas. This map will allow you to set up a phased approach to optimizing your most valuable data assets.

The high-level view, unfortunately, is not enough to turn your data into valuable assets. You also need to know the details of your data.

Getting the details from your data is where a data profile comes into play. The profile tells you what your data is from the technical perspective, while the high-level view (the enterprise information model) gives you the view from the business perspective. Real business value comes from combining both: a transversal, holistic view of your data assets that lets you zoom in or zoom out. The high-level view with technical details (even without the profiling) allows you to start with the most important phase of the digital transformation: discovery of your data assets.

Not Only Data Integration, But Data Integrity

With all the data travelling around in different types and sizes, integrating the data streams across various partners, apps and sources has become critical, but it’s more complex than ever.

Due to the size and variety of data being generated, not to mention ever-shortening go-to-market timelines, companies are looking for technology partners that can help them achieve this integration and integrity, either on-premises or in the cloud.

Talend is one of the companies determined to be that partner. Starting as an open source ETL tool, Talend has evolved into an enterprise-grade cloud data integration and data integrity platform. This vision becomes clear in the unified suite of applications it offers and in its focus on getting the foundation of your data initiatives right.

Talend strategically moves data management to the cloud to provide scalability, security, and agility. The recent acquisition of the Stitch Data platform and full support for the ‘made for the cloud’ data warehouse platform Snowflake make the offering even more complete.

Your 3 Step Plan to Trusted Data

Step 1: Discover and cleanse your data

A recent IDC study found that only 19% of data professionals’ time is spent analyzing information and delivering valuable business outcomes; 37% of their time goes to preparing data and 24% to protecting it. The challenge is to overcome these obstacles by bringing clarity, transparency, and accessibility to your data assets.

This discovery platform takes the form of an auto-profiling Data Catalog: it lets you profile your data, understand its quality, and build a confidence score that establishes trust with the business users of the data assets.

Thanks to the application of artificial intelligence and machine learning in data catalogs, data profiling can be offered as a self-service capability to power users.
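As a crude illustration of what auto-profiling computes, the sketch below derives per-column completeness and distinctness and folds them into a simple confidence score. A real data catalog does far more (semantic types, lineage, machine learning); the 50/50 weighting and the sample data are arbitrary assumptions.

import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "email": ["a@x.com", None, "c@x.com", "c@x.com"],
})

profile = pd.DataFrame({
    "completeness": df.notna().mean(),        # share of non-null cells per column
    "distinctness": df.nunique() / len(df),   # share of unique values per column
})
profile["confidence"] = 0.5 * profile["completeness"] + 0.5 * profile["distinctness"]
print(profile.round(2))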

Bringing transparency, understanding, and trust to the business brings out the value of the data assets.

Step 2: Organize Data You Can Trust and Empower People

According to the Gartner Magic Quadrant for Business Intelligence and Analytics Platforms, 2017: “By 2020, organizations that offer users access to a curated catalog of internal and external data will realize twice the business value from analytics investments than those that do not.”

An important phase in a successful data governance framework is establishing a single point of trust. From the technical perspective, this translates to collecting all the data sets together in a single point of control. The governance aspect is the capability to assign roles and responsibilities directly in that central point of control, which allows you to instantly operationalize your governance from the place the data originates.

Organizing your data assets goes hand in hand with business understanding of the data, transparency, and provenance. An end-to-end view of your data lineage ensures compliance and supports risk mitigation.

With the central compass in place and roles and responsibilities assigned, it’s time to empower people for data curation and remediation, where ongoing communication is of vital importance for the adoption of a data-driven strategy.

Step 3: Automate Your Data Pipelines & Enable Data Access

Different layers and technologies don’t make it any easier to keep our data flows and streams aligned, or to adapt to swift changes in business needs.

The needed transformations, data quality profiling, and reporting can largely be automated.

Start small and scale big. Part of this intelligence can now be achieved by applying machine learning and artificial intelligence. These algorithms take the cumbersome work out of the hands of analysts and can be scaled more easily. This automation gives analysts a faster understanding of the data and helps them build better insights, faster, in a given amount of time.

Putting data at the center of everything, implementing automation, and provisioning it all through a single platform is one of the key success factors in your digital transformation and in becoming a truly data-driven organization.

This article was written in collaboration with Talend and represents my views on data management and how they align with Talend’s vision and platform.

About the author Yves Mulkers:

Yves is an industry thought leader, analyst and practicing BI and analytics consultant, with a focus on data management and architecture. He runs a digital publication platform 7wData, where he shares stories on what you can do with data and how you should do it. 

7wData works together with major brands worldwide, on their B2B marketing strategy, online visibility and go to market strategy.

Yves is also an explorer of new technologies and keeps his finger on what’s happening with Big Data, Blockchain, cloud solutions, Artificial Intelligence, IoT, Augmented Reality / Virtual Reality, the future of work, and smart cities from an architecture point of view, helping businesses build value from their data.

The post Can You Trust Your Analytics Dashboard? 3 Steps To Build a Foundation of Trusted Data appeared first on Talend Real-Time Open Source Data Integration Software.


Founded in 2016 and launched in February 2017, PointsBet is a cutting-edge online bookmaker in Australia offering both traditional fixed odds markets and “PointsBetting,” where winnings or losses aren’t fixed but depend on how accurate the bet turns out to be.

Online betting platforms have long recognized the need for a strong, resilient IT infrastructure. Even a minor glitch during a major sporting event can be disastrous: the losses can run into millions of dollars. Good data systems are also needed to comply with the myriad regulations that online bookmakers are subject to, such as providing reports and information to authorities and racing organizations to meet license conditions.

Eyes on a more reliable Cloud data management solution

PointsBet standardized on Talend Cloud and Microsoft Azure due to their scalability to handle peak online gaming requests and their agility to quickly spin up new projects. Talend was a natural fit, with out-of-the-box native support for Azure Blob Storage, Cosmos DB, SQL Data Warehouse, Azure SQL Server, and native SQL Server, plus the flexibility to run workloads in the cloud or on-premises.

Talend Cloud was live in days, and for PointsBet that meant connecting everything to everything: extracting data from every part and system of the organization – transaction, betting, customer, and statistical systems – and providing a unified view of all the requested data. Using Talend, PointsBet accelerated development time from eight hours to one.

“Talend Cloud’s quick and successful introduction meant that we were able to comply with regulations, keep our promise, and launch into the United States as planned.” Maayan Dermer, Data Analytics & Business Intelligence Lead / Solution Architect

Entering the US market

As PointsBet starts launching its online sports betting products throughout the US, Talend Cloud plays a vital role in supporting PointsBet’s ability to quickly expand while ensuring the company maintains compliance with varying state regulations. This is important not only for compliance but also for gaining license permission to operate in new countries.

<<Read the Full Case Study>>

Using Talend, PointsBet managed to off-load many software components from its backend engineers. For example, one ETL process initially estimated at two weeks of work was done in 4-6 hours using Talend. In the future, Talend is set to be expanded to underpin the data needs of the entire Australian business. Once PointsBet expands into more US states, the company is likely to stand up another data warehouse “to consolidate all the data from all of our instances around the world. We’re looking to use Talend for that as well,” Mr. Dermer added.

[Video: PointsBet: Tuning up for the US market breakthrough]

The post PointsBet is Tuning Up for a US Market Breakthrough appeared first on Talend Real-Time Open Source Data Integration Software.
