
Power BI supports extensive Data Analysis Expressions (DAX) scripting to achieve complex business scenarios. It is good practice to maintain a document for each report to aid future development. The document can include screenshots of report pages, the purpose of the report, navigation lists, the tables and queries used, and a complete listing of the DAX formulas.

All this information is easy to capture except for the DAX formulas. Using DAX, users can create measures and columns in Power BI. However, there is no direct option to export the DAX formulas for documentation purposes. This blog covers exporting the measure and column definitions from Power BI into a CSV file.

Steps to export the DAX formulas

1.  Navigate to Model View.

2. Click File Menu -> Export as Power BI Template.

3. The exported Power BI template will have the extension .pbit. The .pbit file is an archive, so open it with an archive tool such as WinRAR.

4. The Power BI template contains a list of files, including DataModelSchema.

5. Open DataModelSchema in any JSON editor. There are plenty of JSON editors available online; the one used in this blog is available at https://jsoneditoronline.org/

6. Generally, JSON is represented as key-value pairs. Navigate to the tables key, which lists all the tables used in the Power BI report.
E.g.: in the highlighted section below, the key is tables and its value is the array enclosed in square brackets []

7. Each entry inside the tables key has the following structure. Columns and measures created in Power BI have separate key-value pairs. Here we have highlighted only the measures key-value pair.

8. After expanding the measures key, we can see the value of each measure; the value is the DAX formula.

9. Copy the value of the measures key.

10. Paste the copied value of the measures key into any JSON-to-CSV converter. You can refer to this website to learn more: https://json-csv.com/

11. The CSV file will have the following structure. This is based on the value of the measures key; it may differ for the columns key-value pair.
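For orientation, a measure entry inside DataModelSchema typically has a shape like the fragment below. The names are illustrative only, and the exact keys can vary between Power BI versions:

```json
{
  "model": {
    "tables": [
      {
        "name": "Sales",
        "columns": [ { "name": "Amount", "dataType": "double" } ],
        "measures": [
          { "name": "Total Sales", "expression": "SUM(Sales[Amount])" }
        ]
      }
    ]
  }
}
```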

With this, we have achieved exporting measure and column formulas from Power BI to a CSV file.
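The manual steps above can also be automated. Below is a minimal sketch, assuming Python; the helper names are our own, and the UTF-16-LE encoding of DataModelSchema and the exact JSON keys may vary between Power BI versions, so treat this as a starting point rather than a definitive implementation:

```python
import csv
import json
import zipfile

def _expr(obj):
    """DAX expressions are stored either as a string or as a list of lines."""
    e = obj.get("expression", "")
    return "\n".join(e) if isinstance(e, list) else e

def extract_formulas(schema):
    """Collect (table, kind, name, DAX) rows from a parsed DataModelSchema."""
    rows = []
    for table in schema.get("model", {}).get("tables", []):
        for measure in table.get("measures", []):
            rows.append((table["name"], "Measure", measure["name"], _expr(measure)))
        for column in table.get("columns", []):
            if "expression" in column:  # only calculated columns carry DAX
                rows.append((table["name"], "Column", column["name"], _expr(column)))
    return rows

def pbit_to_csv(pbit_path, csv_path):
    # A .pbit template is a ZIP archive; DataModelSchema is UTF-16-LE JSON.
    with zipfile.ZipFile(pbit_path) as z:
        schema = json.loads(z.read("DataModelSchema").decode("utf-16-le"))
    with open(csv_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["Table", "Type", "Name", "DAX"])
        writer.writerows(extract_formulas(schema))
```

Point `pbit_to_csv("MyReport.pbit", "formulas.csv")` at an exported template to produce the same CSV listing as the manual procedure.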

Know more about Microsoft Power BI services offerings from Visual BI solutions here.


The post Exporting Measure and Column formulas from Power BI appeared first on Visual BI Solutions.


Dynamic interaction is a key feature that helps users explore and understand the data behind any dashboard or story. The SAP Analytics Cloud Linked Analysis feature allows us to perform dynamic interaction between widgets. Earlier, each story page had a single linked set, through which selections could be passed down as filters. But from version 2019.11 onwards, each chart can have its own set of linked widgets. This blog shows how to fully utilize the Linked Analysis feature for better interaction.

Consider a scenario where you have a Sales Summary page that has various charts for Region, Category, Sub-category, Trend and Top 10 Products. Let us see various options used to enable dynamic interactivity.

Linked Analysis in Action

 

Widget-Specific Linked Analysis

Prior to version 2019.11, the option to enable Linked Analysis was available in the Toolbar, and there was only a single linked set per page. Now that Linked Analysis is specific to each widget, you can find the option in the Quick Menu of each widget.

Linked Analysis option in Quick Menu

Linked Analysis can be enabled for the following widgets.

    1. Chart
    2. Table
    3. Geo Map
    4. Input Control
Widget as a Story Filter

In the scenario mentioned above, the Donut Chart showing Sales per Region can be used to filter the entire story, making it easy for Regional Managers to analyze their data. In the Linked Analysis panel, the option ‘All widgets in the Story’ is enabled. Under Settings, there is also an option to override any existing cascading effects.

All widgets in the story option

Once the user filters a member in the Donut Chart a Story Filter is automatically added. The user can manually remove the Story Filter without affecting the Donut Chart. One limitation is that there is no option to enable the filter on data point selection if the widget is used as a Story Filter.

Story Filter added by default

If you want the selection to affect only the page, choose the option ‘All widgets on this Page’. To enable filtering on a selected data point, choose the option to use widgets as page filters.

All widgets on this page option

Linked Analysis for Input Control

In the Sales Summary scenario, the input control for choosing Years must not affect the Trend Chart. The option ‘Only selected widgets’ is enabled, and the Trend Chart is removed from the list of widgets that can be linked. This way, selections made in the Input Control will not affect the Trend Chart. Unlike charts, Input Controls have only two options in Linked Analysis: an Input Control can either affect the whole page or selected widgets.

Custom Linked Set

Filter on datapoint selection

In the bar chart ‘Sales per Category’, the common selection mode is choosing a single bar, so the option ‘Filter on data point selection’ is enabled. In the case of the scatterplot for sub-categories, the common selection mode is Lasso, so the data point selection option is left disabled; the user can instead lasso values in the scatterplot to filter the other linked widgets.

Linked Analysis of Category and Sub-Category chart

There is also an option to automatically connect newly created widgets while configuring Linked Analysis.

Please note that Linked Analysis is supported in both import and live data connections. If two models are used in a story, make use of the Link Dimensions option, which allows linked analysis across widgets from two different models.

Reach out to our team here to know more about SAP Analytics Cloud other offerings from Visual BI Solutions.


The post All About Linked Analysis in SAP Analytics Cloud appeared first on Visual BI Solutions.


In this blog, we will see how to embed a Google Map in an SAP Analytics Cloud application. We will primarily focus on pinning a selected member (e.g., a store) on a Google Map with the help of the R widget. We will also learn how to retrieve dimensions that are not displayed in the chart.

The scenario is to pin the selected store from the chart on to a Google Map. You can see the embedded Google Map in action below.

Google Maps in Action

Defining Chart Structure

Let us consider an example of Sales data. Include Sales (SALE_DOLLARS) measure and Store (STORENUMBER) dimension in a chart. The Store is set to display Description.

SAP Analytics Cloud Bar Chart Structure

Retrieving and passing the selected store to R Visualization

Add the following script to fetch the user selected store. Since you cannot directly fetch the latitude and longitude information of the selected store ID, the selected member is passed to R Visualization which can be used to extract latitude and longitude.

SAP Analytics Cloud bar chart – OnSelect event

The ID of the user-selected store is stored in a global variable store_id. The store ID is then assigned to the R variable StoreNum using the setNumber() API. Another global variable, flag, is used to ensure the onResultChange event is executed only after a user selection; this will be explained later.

The global variable store_id is defaulted to zero and passed to R Visualization in the onInitialization event so that the application runs without any error at the startup. The definition of global variable store_id is also mentioned in the snapshot below.

Initializing global variable store_id on application start up

Initializing R Visualization Structure

Select the necessary fields (i.e., Latitude, Longitude, Store, and Sales). You can visit this blog to get a detailed perspective on the R data frame and how to set it up in SAP Analytics Cloud. The dimension Store is set to display ID in the R data frame, as shown in the snapshot below.

R visualization data frame initialized to ID display property

As this R widget doesn’t visualize anything, you can hide it by disabling the option Show this item at view time from the Styling panel.

Extracting latitude and longitude using R script

Add the following R script to extract the latitude and longitude of the selected store.

R script for filtering and extracting dimensions based on user selection

Here the script filters the data frame by the StoreNum input parameter. The corresponding latitude and longitude values are then extracted from the data frame and stored in the R variables LAT and LONG. These R variables can later be retrieved to embed the Google Map.
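Since the script itself appears only as a screenshot in the original post, the filtering logic can be sketched in R roughly as follows. The data frame and column names (SalesData, STORENUMBER, LATITUDE, LONGITUDE) are assumptions for illustration; StoreNum is the input parameter set from the onSelect script:

```r
# Filter the R data frame down to the store selected in the bar chart.
# SalesData, STORENUMBER, LATITUDE and LONGITUDE are placeholder names.
selected <- SalesData[SalesData$STORENUMBER == StoreNum, ]

# Expose the coordinates so the onResultChange script can read them back.
LAT  <- selected$LATITUDE[1]
LONG <- selected$LONGITUDE[1]
```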

Embedding Google Map

Add a WebPage widget to the canvas which can be used to show the Google Map. Then add the following script in the onResultChange event of R Visualization to pass the latitude and longitude through URL.

Manipulating Google Maps link dynamically

The global variable flag confirms the script is executed only after user selection. Then the R variable values (latitude and longitude) are assigned to local variables. These local variables are then appended to the Google Map URL (https://maps.google.com/maps?q=) and set to the Webpage widget.

Now save and run the application. When you select a store, the store ID is passed to the R Visualization from the onSelect event. Once the latitude and longitude are extracted using R script, the onResultChange event is executed which sets the proper Google Map URL to the Webpage widget.

You can also plot an area or city in Google Maps (or others) applying the same logic. This blog is limited to pinning one coordinate at a time based on user selection. You can also expand the functionality to multi-select and pin multiple locations.

Reach out to us here today if you are interested in evaluating if SAP Analytics Cloud is right for you.


The post SAP Analytics Cloud – Embedding Google Maps by adopting R Visualization appeared first on Visual BI Solutions.


Power BI has brought in a new feature that lets us understand the dataset we are working with, helping us identify whether it is a Certified or a Promoted dataset.

This feature helps organizations understand which data is the real, trusted data that should be used in reports. The best example is financial data used by financial firms, or stock market data. A dataset may exist in any number of copies, but only one hosts the true data points without alterations.

This is where Power BI helps report users, based on their category, make use of the correct data to gain true insights.

Power BI achieves this through the concept of Endorsements.

To use this new feature, we need to navigate to the settings of the dataset.

Shared Datasets

Datasets can be shared among different users based on their permission or access level granted in the workspace. Users can be added manually by the superuser who creates the workspace, who can then assign each user a role in the workspace.

The new workspace experience is required for these datasets to be shared.

When we are creating a new workspace, we will need to set the permissions for the users. We can allow them to connect to the app’s underlying datasets via build permissions.

Build Permissions of Datasets in Power BI

Power BI also provides the user with Data Governance features such as the ability to provide access to datasets or limiting the users from using the datasets.

An admin or member for the workspace where the dataset resides can decide during app publishing on whether the users with permission for the app also get the ‘Build permission’ for the underlying datasets.

Control the use of datasets across workspaces:

Admins can control the flow of datasets within the organization. The Power BI admin sometimes can restrict the flow of information to other Power BI tenants.

Lineage Tracking

Power BI has introduced another new feature for its datasets, called Lineage View. This view helps us understand which reports consume which endorsed or non-endorsed datasets.

Dataset owners in Power BI will be able to see the downstream use of their shared datasets by other users through the related content pane, allowing them to plan changes.

We can also see the initial source of data from which these datasets are being created.

Limitations
  • Building a report on top of a dataset in a different workspace requires the new workspace experience at both ends: both the report's workspace and the dataset's workspace must be new-experience workspaces.
  • In a classic workspace, the dataset discovery experience only shows the datasets in that workspace.
  • You can create reports in app workspaces that are based on datasets in a different workspace. However, you can’t create an app for a workspace that contains those datasets.
  • Free users in Desktop only see datasets from My Workspace and from Premium-based workspaces.
  • If you want to add a report based on a shared dataset to an app, you must be a member of the dataset workspace. This is a known issue.
  • “Publish to the web” doesn’t work for a report based on a shared dataset. This is by design.
  • If two people are members of a workspace that is accessing a shared dataset, it’s possible that only one of them can see the related dataset in the workspace. Only people with at least Read access to the dataset can see the shared dataset.

Know more about Microsoft Power BI services offerings from Visual BI solutions here.


The post Understanding Certified and Shared Data Sets in Power BI appeared first on Visual BI Solutions.

Visual BI Solutions by Aishwarya Arumugasamy - 2w ago

Azure supports cross querying in Azure SQL Database through elastic queries. Elastic queries allow us to run Transact-SQL that works across multiple Azure SQL Databases, and can connect Microsoft tools like Excel and Power BI, as well as third-party tools like Tableau, to query data tiers spanning multiple databases. Through this feature, we can query large data tiers and visualize the results in business intelligence (BI) reporting tools.

Advantages of using Elastic Queries
  • Elastic queries support read-only querying of remote databases, and SQL server users can migrate applications by linking servers between an Azure SQL environment and on-premises
  • Elastic queries are available on both the standard tier and premium tier
  • We can execute stored procedures or remote functions using sp_execute_remote and push SQL parameters for execution on a remote database
  • Through elastic query, external tables can now refer to remote tables with a different table name or schema
  • Based on customer scenarios, elastic queries fall into the following partitioning strategies:
    • Vertical partitioning – Cross-database queries: A vertical elastic query is to query among vertically partitioned databases i.e., multiple databases that contain tables of different schema on different data sets. For instance, all tables for HR are on one database while all Finance tables are on another database. This partitioning helps one to query across or to build reports on top of tables in multiple databases
    • Horizontal Partitioning – Sharding: The process sharding is to distribute a huge volume of data having identical schema among different databases. For instance, this means distributing a huge amount of transaction table data among multiple databases for improved performance. To achieve this, elastic database tools are used where an elastic query is required to query or compile reports across multi shards
Elastic Queries in Vertical Partitioning

Data located in one SQL Database can be made available to other remote SQL Databases through elastic queries. The schema and structure of these databases can vary. This process is also known as scaling up.

Steps for implementation

Let’s assume that there are four databases, namely HR, Finance, Products, and CRM, and we will perform cross querying in Azure SQL Database. To execute the queries below, the user must have the ALTER ANY EXTERNAL DATA SOURCE permission, which is included in the ALTER DATABASE permission. These permissions are needed to refer to the underlying data source.

1. Create a database master key, i.e., a symmetric key used to protect the private keys of certificates and asymmetric keys within the HR database.

2. Create a database scoped credential. It is not mapped to a server login or database user; instead, the database uses it to access the external location whenever an operation requires access.

3. Create external data sources for the remote databases Finance, Products, and CRM with type RDBMS within the HR database. Here we create a data source for Finance, but as many data sources can be created as there are remote databases.

4. Create an external table for the elastic database query. For an external table, only the metadata is stored in SQL, along with basic statistics about the referenced table; no actual data is moved or stored in SQL Server. Here we create an external table for Finance using the data source created above.

Now we can access the remote Finance database from the HR database. Likewise, we can create data sources and external tables for the other databases.
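Since the original screenshots are not reproduced here, the four steps can be sketched in T-SQL roughly as follows. Server names, credentials, passwords, and table columns are placeholders, so adapt them to your environment:

```sql
-- Run inside the HR database.

-- 1. Master key that protects the credential secret.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

-- 2. Credential used to reach the remote database.
CREATE DATABASE SCOPED CREDENTIAL ElasticCred
WITH IDENTITY = 'remote_user', SECRET = '<password>';

-- 3. External data source pointing at the Finance database.
CREATE EXTERNAL DATA SOURCE FinanceSrc WITH (
    TYPE = RDBMS,
    LOCATION = 'myserver.database.windows.net',
    DATABASE_NAME = 'Finance',
    CREDENTIAL = ElasticCred
);

-- 4. External table: metadata only; queries are forwarded to Finance.
CREATE EXTERNAL TABLE dbo.Invoices (
    InvoiceId INT,
    Amount DECIMAL(18, 2)
) WITH (DATA_SOURCE = FinanceSrc);

-- Cross-database query executed from HR:
SELECT TOP 10 * FROM dbo.Invoices;
```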

Elastic Queries in Horizontal Partitioning

Database sharding is a technique to split a large database into smaller partitions across identically structured databases. These individual units, called shards, each reside in a separate database. This mechanism is also called scaling out, and it makes data maintenance easier.

Steps for implementation

As a prerequisite, you need to create a shard map manager along with multi shards, followed by insertion of data into the shards. For more information on the development of shards, please refer here.

Let’s take the CRM database as an example. Once the shard map manager has been set up:

1. Create the database master key and database scoped credential as shown in vertical partitioning; here, however, the databases should have the same structure

2. Create an external data source in the CRM database, passing the name of the shard map created in the shard map manager to SHARD_MAP_NAME

3. Create an external table in the CRM database for use with elastic database queries. This table contains only the metadata of the referenced table.

4. Now we can connect the CRM database to any third-party tool like Excel and query the data from the remote database, namely CRMdbsrc1.
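A rough T-SQL sketch of the sharding setup. All names are placeholders (ElasticCred is a database scoped credential created as in step 1), and the shard map itself must already exist in the shard map manager database:

```sql
-- Run inside the CRM head database, after master key and credential exist.

-- 2. External data source that points at the shard map manager.
CREATE EXTERNAL DATA SOURCE CrmShards WITH (
    TYPE = SHARD_MAP_MANAGER,
    LOCATION = 'myserver.database.windows.net',
    DATABASE_NAME = 'CrmShardMapDb',
    CREDENTIAL = ElasticCred,
    SHARD_MAP_NAME = 'CustomerShardMap'
);

-- 3. Sharded external table; rows are fetched from the individual shards.
CREATE EXTERNAL TABLE dbo.Customers (
    CustomerId INT,
    CustomerName NVARCHAR(100)
) WITH (
    DATA_SOURCE = CrmShards,
    DISTRIBUTION = SHARDED(CustomerId)
);

-- 4. Query spanning all shards:
SELECT COUNT(*) FROM dbo.Customers;
```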

Limitations of using Elastic Queries
    • While using elastic queries in the standard tier, performance over large databases will be slow
    • Elastic queries are not preferred for ETL when there is a large amount of data movement to a remote database
    • Only read-only operations over remote databases are supported

Learn more about Visual BI’s Microsoft Azure offerings here.


The post Cross Querying in Azure SQL Database appeared first on Visual BI Solutions.


We are excited to give you a sneak peek into the upcoming version of our Visual BI Extensions for SAP Lumira.

In case you are not familiar with our extensions yet, you can find some introductory material here: VBX Extensions. Our extensions are grouped into five main categories: Charts, Maps, Specialty, Utilities, and Selectors.

This month we will be releasing the new version VBX 2.4 and this blog offers you a preview of its highlights:

1. Charts

With this release, we are adding lots of new Charting capabilities, some of them are listed below:

Pyramid Chart

Activity Gauge

Lollipop Chart

Dot Plot Chart

2. Maps

The tooltip in the Marker Layer of the ESRI Map can be configured to show any VBX chart with the column dimensions and z-axis measures.

3. Specialty

Analytics: Presenting a new way to observe the usage of your Dashboards by different groups in your organization. This provides insights into how your dashboard is being consumed. It will be critical in improving your dashboard design.

Gantt Chart Enhancements: We have brought lots of new enhancements on top of the Gantt Chart, now you can define conditional formatting based on different levels of your data and bring it as part of the table.

4. Utilities

VBX Theme: All VBX components align with the Lumira application theme, changing the application's look and feel to match the Lumira theme. Just by changing the theme, we can alter the look and feel of the application with no added effort.

Dashboard before transformation

Here you can see how the same dashboard transforms to give a radical look and feel using the VBX Theme.

Dashboard after transformation

We are looking forward to your feedback on our latest release. As you can see, there are several new options that we are providing for SAP Lumira Designer 2.2 as well as SAP Lumira Designer 2.3. Do note that this is just the beginning of our roadmap, with enhancements coming every quarter, so stay tuned for more details in the coming months.

Know more about VBI Extensions (VBX) for SAP Lumira designer here.


The post Top 10 features: New version of Visual BI Extensions for SAP Lumira Designer appeared first on Visual BI Solutions.


Paginated Reports bring SQL Server Reporting Services to Power BI. They are called “paginated” because they tend to be well formatted and fit well on a page, which makes them easy to print and share.

1. Understanding Paginated Reports

Paginated reports are based on the Report Definition Language file in SQL Server Reporting Services. They are also called “Pixel Perfect” because you can control the report layout for each page. We can also load images and charts onto these reports. Paginated reports are best for scenarios that require a highly formatted, pixel-perfect output optimized for printing or PDF generation.

Paginated Report

2. Where to create Paginated Reports?

We can make use of Power BI Report Builder, a tool used to create paginated reports and have them published on the Power BI Service. This is a standalone tool from Power BI.
Alternatively, we can make use of SQL Server Reporting Services (SSRS). These reports are compatible with the Power BI Service. Power BI Service maintains backward compatibility.

3. Data Sources for Paginated Reports

Paginated Reports can connect to multiple data sources and do not have an underlying data model. Report Builder directly fetches and reads the data onto the reports from the server.

Currently, the below data sources are supported:

  • Azure SQL Database and Data Warehouse
  • Azure Analysis Services (via SSO)
  • SQL Server via a gateway
  • SQL Server Analysis Services via a gateway
  • Power BI Premium Datasets
  • Oracle
  • Teradata

There will be more additional sources added in the future.

4. Licensing for Paginated Reports in Power BI

We will need to either have a License purchased for Power BI Embedded or have a Power BI Premium – Capacity P1, P2 or P3. This is used to host the paginated reports onto the Power BI Service.

In order to use paginated reports in your Power BI Service, you will need to do the following: go to Settings, then Admin Portal -> Capacity Settings.

Scroll down to Workloads -> Paginated Reports and turn it ON. You will need to specify the memory (capacity) allotted to paginated reports on the Power BI Service.

Create a workspace and assign Dedicated Capacity by turning the toggle to ON.

5. Setting up subscriptions of the Reports to Users

We will need to click on the subscribe button which can be found on the top right of the Paginated Report. This will enable the option to send the Paginated Report as an email to users. You can then set the subscription frequency, body and header of the email to be sent with the pdf file of the paginated report.

6. Exporting options in Power BI Service

We can currently export Paginated Reports in multiple formats like Microsoft Excel, Microsoft Word, Microsoft PowerPoint, PDF, CSV, XML, and MHTML.

7. Limitations on Paginated Reports
  • Pinning report pages or visuals to Power BI dashboards is not supported. You can still pin visualizations to a Power BI dashboard from an on-premises paginated report on a Power BI Report Server or Reporting Services report server. See Pin Reporting Services items to Power BI dashboards for more information
  • Interactive features such as document maps and show/hide buttons are currently not possible
  • Subreports and drill through reports
  • Shared data sources and shared datasets
  • Visuals from Power BI reports
  • Custom Fonts
  • Bookmarks

Know more about Microsoft Power BI services offerings from Visual BI solutions here.


The post 7 Things to Know About Paginated Reports in Power BI appeared first on Visual BI Solutions.

Visual BI Solutions by Sathya Narayana Karthikeshwar - 3w ago

The Key Influencers visual in Power BI makes excellent use of machine learning and AI capabilities to derive insights about your data. Introduced in the February 2019 Power BI release, Key Influencers is Power BI's first Artificial Intelligence (AI) powered visualization.

With Key Influencers, business users can now gain further insights into their data by leveraging machine learning capabilities. This blog shows how to enable and use this feature in Power BI.

1. To enable this feature, go to Options -> Global -> Preview Features and select Key Influencers Visual.

Image 1: Enabling Key Influencer Visual in Power BI

2. The user may have to close and start Power BI Desktop again for this visual option to be shown on the Visualization panel

Understanding Key Influencers

The Key Influencer is an AI Visual within Power BI. This will have two tabs showcasing the visual’s usage.

1. Key Influencers tab: This section of the visual helps in understanding how the selected dimensions influence the selected measure. The Key Influencers tab displays a ranked list of the individual contributing factors that drive the selected condition. Let’s say we want to analyze Sales (in Dollars) based on Store location and Volume Sold (Gallons).

Image 3: Selection Criteria for Key Influencer

Now the AI behind this visual is triggered to derive insights from the current Analyze and Explain by selections. Power BI helps us understand the resulting visual by providing insight into the metric. For example, let’s say we want to check when ‘profit’ is high and how much ‘Volume (Gallons)’ needs to be sold.

Image 4: Key Influencer Chart Visual

We get a clear visual showcasing the scenario and where action needs to be taken. We can also check what level of volume sold results in reduced Sales. Key Influencers is an amazing feature that explains the factors impacting a metric.

2. Top Segments: This section shows the user segments where the expected profit is likely to be low or high. Segments are ranked based on the percentage of records where the condition is met.

Image 5: Segments in Key Influencer

When we select one particular segment, we will get a detailed insight into it.

Image 6: Detailed View of Segment Value

We can toggle between showing Key Influencers only or Segments only, using the options under the Analysis tab.

Image 7: Toggling Key Influencer Tabs

In a nutshell, Key Influencers is an innovative feature that uses machine learning and AI to explain what drives a metric or Key Performance Indicator (KPI). We expect Power BI to come up with even more such features in its upcoming releases.

Know more about Microsoft Power BI services offerings from Visual BI solutions here.


The post Usage of Key Influencer Visual in Power BI appeared first on Visual BI Solutions.


Dataflows are the initial data preparation that takes place in Power BI before a report is built. Power BI follows an ETL (Extract, Transform, Load) process to perform this function, and now brings the flexibility of self-service ETL through a simple interface and navigation. Dataflow creation is performed inside the Power Query functionality.

Image 1: Structure in Power BI Data and Reporting

Dataflows can easily be created by performing the steps below:

1. Navigate to your workspace and select Dataflows
2. Click +Create at the top right to bring up a dropdown
3. Select the option Dataflow

Image 2: Data Flows Creation

The following three options are visible when creating a dataflow:

1. Entities
2. Linked Entities
3. Common Data Model

1. Entities

An entity is a set of fields that are used to store data, much like a table within a database.

Image 3: Choose an option

Select an appropriate entity to start the dataflow creation. You will see a simple yet rich UI screen that helps you choose the data source connection you need. This applies to cloud sources, on-premises sources, or even a simple Excel sheet.

Image 4: Data Sources in Power Query

Now, select the data source that you need to connect to your data.

Image 5: Connectivity gateway for the data source

Choose the appropriate tables inside to fetch the data from.

Image 6: Tables in Data Source

Once the tables are selected, we can proceed to use the dataflow editor for ETL. This step is very similar to the initial Power Query cleansing in Power BI Desktop, but hosts much more advanced functionality to cleanse, refresh, and schedule your data.
Once you’ve created a dataflow, you can define its refresh frequency by setting a refresh schedule in the dataflow settings. You can also use the Advanced Editor to define your queries in the M language.
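As an illustration, a minimal M query of the kind you might write in the Advanced Editor. The server, database, schema, and table names here are placeholders:

```m
// Connect to a SQL source, navigate to a table, and keep only recent rows.
// "myserver", "SalesDb" and "Orders" are placeholder names.
let
    Source = Sql.Database("myserver.database.windows.net", "SalesDb"),
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    RecentOrders = Table.SelectRows(Orders, each [OrderDate] >= #date(2019, 1, 1))
in
    RecentOrders
```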

2. Linked Entities

Linked entities allow users to reuse data which exists in the lake, thereby allowing them to manage and organize projects with complex ETL processes, from one or multiple sources, while also letting analysts build on each other’s work.

3. Common Data Model

These are Microsoft-standardized schemas for your data. Once the cleansing process is finished, we can start mapping fields and leverage a Common Data Model. To leverage the Common Data Model with your dataflow, click on the ‘Map to Standard’ transformation in the ‘Edit Queries’ dialog.

Image 7: Mapping Fields

Any fields that do not get mapped to Common Data Model fields are set to null. You can then proceed to save your dataflows. Finalize your dataflows and create a ‘scheduled refresh’ for your data.

Image 8: Refresh Scheduling in Power BI

Now you can consume the dataflows directly in Power BI Desktop and use them for your reporting and analysis.

Know more about Microsoft Power BI services offerings from Visual BI solutions here.


The post Dataflow Creation and Usage in Power BI – The Self Service ETL appeared first on Visual BI Solutions.


The Legend of a Chart plays an important role in enhancing the Visualization appeal and Interactive ability of the Chart. Legends are no longer simply static indicators of the data series displayed in the chart but have grown in function to provide customizable selection options to a user viewing the chart.

The VBX suite of charts provides an option by default to de-select a data series by clicking on its legend entry. But some users might encounter situations where de-selecting a series on click of the legend is not desirable; they might want a different outcome, such as highlighting the data series of the clicked legend option and de-selecting (or greying out) the remaining data series. For such custom requirements, the VBX Script Box is the go-to component.

The following code snippet allows you to achieve the scenario as shown in the image above with the help of the VBX Script Box:
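The approach walked through in the explanation below can be sketched roughly as follows. This is a minimal sketch, not a documented VBX API: the `chart` handle, its Highcharts-style `series` array with `show()`/`hide()` methods, and the `legendItemClick` event array are assumptions based on how this post describes the VBX internals.

```javascript
// Pure helper: decide which series stay visible after a legend click.
// clickedIndex - index of the legend option just clicked
// prevIndex    - index of the previously clicked option (null if none)
// seriesCount  - total number of series in the chart
function seriesVisibilityAfterLegendClick(clickedIndex, prevIndex, seriesCount) {
  // 'isShow' is true when the same legend option is clicked twice in a row;
  // in that case every series is shown again.
  var isShow = clickedIndex === prevIndex;
  var visible = [];
  for (var i = 0; i < seriesCount; i++) {
    visible.push(isShow || i === clickedIndex);
  }
  return visible;
}

// Hypothetical wiring inside the VBX Script Box: re-write the default
// legendItemClick handler (index 0 of the event array) for every series.
// The 'chart' object and its members are assumed for illustration.
function attachLegendHandlers(chart) {
  var prevIndex = null;
  var length = chart.series.length; // number of legend options
  for (var s = 0; s < length; s++) {
    (function (clickedIndex) {
      chart.series[clickedIndex].legendItemClick[0] = function () {
        var visible = seriesVisibilityAfterLegendClick(clickedIndex, prevIndex, length);
        for (var i = 0; i < length; i++) {
          if (visible[i]) {
            chart.series[i].show();
          } else {
            chart.series[i].hide();
          }
        }
        // Remember the clicked option so a repeated click restores all series.
        prevIndex = (clickedIndex === prevIndex) ? null : clickedIndex;
      };
    })(s);
  }
}
```

Keeping the show/hide decision in a small pure function makes the handler easy to adapt, for example to grey out rather than hide the non-selected series.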

Explanation for code

1. All properties and data of the chart whose legend event is to be customized are stored in a jQuery object

2. The length variable stores the number of legend options available in the chart; we will use it as an iteration variable

3. Each legend option is stored as an array element, so the event associated with each legend option needs to be re-written; hence the first for-loop, where the legend on-click event is modified using a function. (Note: legendItemClick events are stored in an array, so we could also push events to this array, but for simplicity we re-write the first event, which is defined by default. Hence the function is assigned to legendItemClick[0])

4. The ‘isShow’ variable is a Boolean and is assigned true if the current series index is equal to the series index of the previously clicked legend option, i.e. if the same legend option is clicked twice consecutively

5. The second for-loop iterates over each series item; an if-else condition checks whether the series index of the current iteration equals the series index of the selected series item, in which case it shows the series, or else hides it. The ‘isShow’ check determines whether the same option has been clicked again, in which case all series are shown

Likewise, this event can be customized to change the selected series color, show/hide the series, highlight the series, etc., based on the user’s requirements, thus delivering the expected functionality to the BI user.

The VBX Script Box can be used to achieve more such customizations for any component available in Lumira Designer. Please follow the below links for a few other interesting scenarios:

Dashboard Hacking with VBX HTML Box for SAP Lumira Designer
Conditionally Format Dimension in VBX Advanced Table using VBX ScriptBox for Lumira Designer

Know more about Visual BI Extensions (VBX) for SAP Lumira Designer here.


The post Customizing Legend Selections using VBX Script Box appeared first on Visual BI Solutions.
