The Ministry of Electronics and Information Technology (MeitY) has been developing the common service centre (CSC) scheme to help connect people in remote parts of India. The Indian government plans to use AI and data analytics to improve services across sectors such as education, finance, and healthcare, among others. CSCs are facilities created under the Digital India programme.

According to third-party reports, the vision is to develop CSCs into a reliable, far-reaching IT-enabled network of citizen service points connecting the local population with government departments, business establishments, insurance companies, banks, and educational institutions.

A representative of CSC SPV said the collaboration will help apply these technologies to the delivery of services to citizens. Education, financial inclusion, and telemedicine are areas where innovative technologies can improve the quality of life of Indian citizens.

The Statistics Ministry and CSC SPV have also decided to use a mobile phone application to conduct the country’s 7th economic census, beginning in June, which will speed up data collection and analysis.

The census will provide insights into economic activities and ownership patterns of businesses across the country. According to a statement, the ministry has introduced geo-tagging, which will help map the distribution of economic activity by location.

The ministry said that more than 6,000 training workshops for CSC enumerators will take place within a month, starting with a state-level workshop in Madhya Pradesh and gradually spreading to district-level workshops across all states. There will be two rounds of supervision to ensure the census is accurate.

This initiative is expected to set a precedent for carrying out large-scale survey work of this kind across the entire country.

 

The post GoI to use AI, data analytics to improve e-services in rural sector appeared first on Fusion Analytics World.


Data is the new oil – you must have heard this one before. As we steer into new digital ecosystems, the role of data is becoming ever more critical. Data is no longer mere factual information but a vital element in key business decisions. According to Oxford Economics and SAP, 55% of businesses use data to drive their decisions. Consequently, the tools for collecting, processing, and analyzing data have also expanded in recent years.

While the collection of data is widely talked about, much less is said about data quality. Though data plays a vital role, its quality remains a troubling problem for many businesses. Worse, relying on inferior-quality data can skew your company’s results in a negative direction.

Fortunately, for those who make data quality a top priority, trustworthy data is within reach. All you need is a framework and some suggestions to make sense of all the data available. This guide will do just that: it will help you analyze and extract high-quality data from all your digital analytics efforts.

The 5 Dimensions of Data Quality in Data Analytics 

  1. Accuracy – Does your data reflect what is actually happening?
  2. Completeness – Is your data sufficient?
  3. Cleanliness – Is your data error-free?
  4. Timeliness – Is your data available when needed?
  5. Consistency – Is your data consistent across platforms?

Let’s discuss each of these dimensions in detail…  

  1. Accuracy of the Data Available: 

To understand this aspect of data quality, you must first understand what accuracy and precision are.

Accuracy: Accuracy refers to the degree to which the measurement is close to its true value.

Precision: Precision refers to the repeatability or reproducibility of a measurement.

The quality of your data analytics depends on these two factors. These are the figures that drive your marketing campaigns, key business decisions, and how you manage your digital presence. With so many decisions riding on them, accuracy and precision in digital analytics are non-negotiable.

But in recent years, as various tools have shipped with their own performance metrics, the reliability of those metrics has come into question. For example, can you trust the accuracy and authenticity of the metrics in your keyword planner tool?

Given this, some companies devise their own tools and techniques to measure data accurately and maintain quality. Yet this is risky, since accurate metric calculation requires in-depth knowledge of quality factors and of how to refine the data.

 

Two Important Factors to Consider While Laying the Foundation of Accurate Data Measurement

  • Calculations behind your KPIs:

Let’s assume your KPI is the number of sessions a visitor has on your website. If the visitor watches a video first and then starts browsing, this is recorded as either one session or two, depending on the analytics tool you use.

Measurement accuracy depends on how you have defined the KPIs you measure, and precision depends on your tool’s ability to capture the available data. Both require rich experience and specific knowledge of digital analytics. In such cases, working with a professional can help you choose better KPIs and measurement tools and improve both accuracy and precision.
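
As a minimal illustration of why the KPI definition matters (the event data and inactivity timeouts here are hypothetical), the sketch below shows how the same clickstream yields one or two sessions depending on how a "session" is defined:

```python
from datetime import datetime, timedelta

# Hypothetical clickstream for one visitor: (timestamp, event)
events = [
    (datetime(2019, 6, 1, 10, 0), "video_play"),
    (datetime(2019, 6, 1, 10, 40), "page_view"),   # 40 minutes after the video
    (datetime(2019, 6, 1, 10, 45), "page_view"),
]

def count_sessions(events, timeout_minutes):
    """Count sessions: a gap longer than the timeout starts a new session."""
    sessions = 1
    for (prev_ts, _), (ts, _) in zip(events, events[1:]):
        if ts - prev_ts > timedelta(minutes=timeout_minutes):
            sessions += 1
    return sessions

# The same data under two KPI definitions gives two different answers.
print(count_sessions(events, timeout_minutes=30))  # -> 2 sessions
print(count_sessions(events, timeout_minutes=60))  # -> 1 session
```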

  • Identify and Exclude Bot Traffic: 

For an accurate view of your website traffic, you should be able to identify and exclude bot traffic from your data. Google and Facebook have implemented various strategies to eradicate this behaviour, but identifying bots is difficult, and you may need help from professionals experienced in dealing with them. Your ability to exclude bots directly affects the quality of the digital data you collect.
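
A very basic first pass (a sketch only; real bot detection is far more involved, and the hit format here is hypothetical) is to drop hits whose user-agent string matches known crawler markers:

```python
# Minimal user-agent screen for obvious bots; assumes each hit is a dict
# with a "user_agent" key (hypothetical log format).
KNOWN_BOT_MARKERS = ("bot", "crawler", "spider", "slurp")

def is_probable_bot(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(marker in ua for marker in KNOWN_BOT_MARKERS)

hits = [
    {"user_agent": "Mozilla/5.0 (Windows NT 10.0) Chrome/74.0"},
    {"user_agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
]

human_hits = [h for h in hits if not is_probable_bot(h["user_agent"])]
print(len(human_hits))  # -> 1
```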

  2. Completeness

Is your data sufficient to make informed decisions? Is it collected from all platforms? Is any of it missing or corrupt?

To make sense of your data collection efforts, your data must be rich in both quality and quantity. The data gathered by your digital analytics tool must be complete, comprehensive, and sufficient for key business decisions.

Your analytics tool must capture data accurately and should also provide techniques for filtering out irrelevant information. When data is missing or corrupt, you may end up making decisions far removed from reality.

If you base your decisions on only 60% of website visits, you are not getting a clear picture.

If 20% of your links are broken and going unmeasured, you are not getting a clear picture.

Navigating by insufficient data can be genuinely dangerous for your business. Incomplete data can never produce quality decisions.
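
As a rough illustration (the records and field names are hypothetical), you can audit completeness by measuring what fraction of records carry each required field:

```python
# Hypothetical analytics records; None marks a value the tool failed to capture.
records = [
    {"page": "/home", "device": "mobile", "referrer": "google"},
    {"page": "/cart", "device": None, "referrer": "facebook"},
    {"page": "/checkout", "device": "desktop", "referrer": None},
]

REQUIRED_FIELDS = ("page", "device", "referrer")

for field in REQUIRED_FIELDS:
    captured = sum(1 for r in records if r.get(field) is not None)
    ratio = captured / len(records)
    flag = "" if ratio >= 0.95 else "  <- investigate"
    print(f"{field}: {ratio:.0%} complete{flag}")
```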

Three Things to Look Out For When Reviewing Your Data Completeness

  • Unreliable Data Collection Servers:

Your digital analytics solution should be able to capture, process, analyze, and measure every single interaction a user has with any of your digital properties, at any moment, all the time. During periods of heavy traffic, it becomes even more critical for your data collection tools to capture data without missing a beat.

For example, say you are running a digital marketing campaign to drive traffic to your website. Within a few hours, huge traffic starts arriving, but your data collection tool can’t handle the volume and fails beyond a certain limit. Not only do you miss a chunk of data, you also miss the opportunity to optimize your efforts and boost your ROI. You are left with insufficient data and an inconsistent campaign performance report.

What you can do is try out different tools (at least their free trial versions) and see which analytics tool suits your organization best. You can even hire digital professionals to look after your data efforts, but make sure they can commit to maximum availability of their data collection mechanisms.

  • Missing or Broken Tags

As you continuously update and enrich your website, you risk breaking your analytics tags. Broken tags are common on sites whose pages are constantly updated, and while they can be difficult to detect, even minor broken tags can have a huge impact on your data quality and completeness.

For example, suppose you run an e-commerce website selling travel accessories and try to analyze the path users take to reach your most successful products. With broken tags, your analytics tool fails to capture this data, or captures only part of it, painting an erroneous picture. You won’t see the most-travelled path and may wrongly dismiss it as ineffective.

Beyond implementing analytics tags, you should make verification and quality checks part of your analytics strategy. This will help you ensure that all tags are valid and capture the data your processing requires.
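
For instance (a sketch; the URLs and tag snippet are hypothetical, and this assumes the third-party `requests` library is installed), a small script can verify that each key page still serves your analytics snippet:

```python
import requests

# Hypothetical pages and the tag snippet they should all contain.
PAGES = [
    "https://example.com/",
    "https://example.com/products",
    "https://example.com/checkout",
]
TAG_SNIPPET = "analytics.example/tracker.js"

for url in PAGES:
    try:
        html = requests.get(url, timeout=10).text
        status = "OK" if TAG_SNIPPET in html else "MISSING TAG"
    except requests.RequestException as exc:
        status = f"FETCH FAILED ({exc})"
    print(f"{url}: {status}")
```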

  • Data Lacking Key Information 

Your data sets must be complete and able to paint a clear picture of exactly what you are looking for. It isn’t always the tool malfunctioning; sometimes the tool simply lacks the capability to capture and process the data.

For example, say you want to compare the number of conversions that happened on mobile devices; if your tool cannot capture that data, you may end up with a complete-looking data set that still lacks key metrics.

For a reliable view of your data, choose an analytics tool capable of capturing all the necessary information. Look for a solution that provides enriched data and flexible correlation possibilities.

  

  3. Cleanliness

In the previous section, we saw why completeness is necessary for data quality. But complete data alone doesn’t guarantee rich data quality: the data collected must also be clean enough to understand and measure.

Despite all your efforts to maintain clean data, there is always room for discrepancies. With each campaign, each tag, and each web page, you run the risk of capturing unclean data. It becomes tedious for your analysts to navigate unclean data every time you make changes, and unclear data values make your analysis inefficient.

Two Major Reasons for Capturing Dirty Data 

  • Incorrect Tagging: 

You have probably experienced situations where you forget to update tags after changing the website, or launch a marketing campaign only to realize you forgot to add an important tag. The situations vary, but the problem remains the same: your data is now unclean because of a minor mistake.

The solution is to catch these mistakes in the first place. Make a list of all the tags you need on the website and, if necessary, a list of the tags you don’t. Whether you use an integrated tag management system or a dedicated TMS, this is the step to follow.
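
As a toy illustration (the tag names are hypothetical), a set comparison between the tags you expect and the tags actually deployed surfaces both omissions and leftovers:

```python
# Hypothetical tag inventory audit.
expected_tags = {"pageview", "add_to_cart", "checkout", "campaign_summer"}
deployed_tags = {"pageview", "checkout", "campaign_spring"}  # found on the site

missing = expected_tags - deployed_tags
unexpected = deployed_tags - expected_tags

print("Missing tags:", sorted(missing))        # -> ['add_to_cart', 'campaign_summer']
print("Unexpected tags:", sorted(unexpected))  # -> ['campaign_spring']
```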

  • Inconsistent Formatting: 

When collecting data, short numeric strings are preferred over long text-based strings. Parameters expressed in different ways (e.g., language recorded as both “eng” and “English”) get collected in different columns, making the data a headache to navigate and forcing your analysts to reconcile it manually.

To ensure the data you collect is clean, define internal parameters for how your tags should be specified. There are also numerous tools that convert complex analytics data into a human-readable form for better extraction of insights.
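
For example (a sketch with a hypothetical mapping), a canonical lookup table collapses variant spellings into one value before the data is stored:

```python
# Hypothetical canonical mapping for the language parameter.
LANGUAGE_CANONICAL = {
    "eng": "en", "english": "en", "en": "en",
    "fra": "fr", "french": "fr", "fr": "fr",
}

def normalize_language(raw: str) -> str:
    """Collapse variant spellings to one canonical code; flag unknowns."""
    return LANGUAGE_CANONICAL.get(raw.strip().lower(), "unknown")

raw_values = ["English", "eng", "FR", "Deutsch"]
print([normalize_language(v) for v in raw_values])
# -> ['en', 'en', 'fr', 'unknown']
```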

  4. Timeliness

It’s important for data to be accurate, complete, and clean, but all these dimensions are useless if the data is not available on time. The real boon of the digital world is that it provides data in real time; if your tool doesn’t leverage this, you will end up making costly mistakes.

Timeliness is indeed everything. People react to your marketing campaigns quickly, and if you lack the data to make changes in real time, you lose revenue opportunities. To truly understand your customers’ needs, it isn’t enough to have data; you need it in real time: strategic data such as sales revenues and which products are performing better or worse at that exact moment. With real-time data, you can fix technical errors before they cost you opportunities.

Three Things To Follow While Implementing Timeliness 

  • Adapt Site Content and Messaging in Real Time:

You go to great lengths to capture customer information and see how customers interact with your website. With real-time data, you can not only understand your customers better but also react to their needs at critical moments.

For example, with real-time analytics you can segment social media data by geographical location and understand how different topics and content are trending in different places, then tailor your approach to that particular audience at that particular time.

You can also apply real-time analytics on your e-commerce website to target customers who have recently abandoned their carts and nudge them toward a purchase.

  • Detect Technical Issues Immediately: 

Real-time data has the merit of alerting you to technical issues on your website as they happen.

If there is a sudden drop in website traffic, or somebody hacks your website, your analytics tool should make the data and all necessary information readily available. This helps you identify significant roadblocks and keep your site functioning smoothly.
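
As a crude sketch (the traffic numbers, window size, and threshold are all hypothetical), you can compare the latest interval’s traffic against a recent baseline and alert on a sharp drop:

```python
# Hypothetical page views per 5-minute interval, most recent last.
traffic = [980, 1010, 995, 1020, 990, 310]

BASELINE_WINDOW = 5   # intervals used as the baseline
DROP_THRESHOLD = 0.5  # alert if traffic falls below 50% of the baseline

baseline = sum(traffic[-BASELINE_WINDOW - 1:-1]) / BASELINE_WINDOW
latest = traffic[-1]

if latest < DROP_THRESHOLD * baseline:
    print(f"ALERT: traffic {latest} is below 50% of baseline {baseline:.0f}")
```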

  • Optimize Marketing Efforts in Real Time:

When it comes to your marketing efforts, every penny wasted is a loss. Timely data is the key to smart optimization of your spending.

For example, you may invest in PPC ads to boost your e-commerce sales, only to learn that your social media efforts are actually paying off better. You can quickly switch off the PPC campaign and reallocate your efforts to social media.

  5. Consistency

Today, conversations between brands and customers happen across multiple platforms. A user may start their journey on social media but end up purchasing the product by clicking on a PPC ad. As companies try to collect data from different sources, they are burdened with different tools, resulting in fragmented, disparate data.

How can you trust your data if each source displays different metrics? Which source should be trusted? Is your point of reference accurate?

Only with a reliable tool can companies obtain consistent, trustworthy data from various platforms, including websites, mobile apps, and social media.

Two Tips To Ensure Consistency In Your Data 

  • Identify Your Point of Reference:

Using different tools to calculate your metrics can mean getting different results for the same metrics. Every digital tool has its own database and its own way of calculating analytics metrics; even a simple visit metric is calculated differently by different tools. Sometimes the same data set is even captured differently by the same tool, depending on its configuration.

You need to determine in advance which calculation and capture method suits your interests. That definition becomes your reference point for calculating that particular metric across all platforms.
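
As a minimal sketch (the metric names, event fields, and formulas are hypothetical), keeping one registry of canonical metric definitions ensures every report computes a metric the same way:

```python
# Hypothetical registry: one canonical definition per metric,
# imported by every report instead of re-deriving the formula.
def visits(events):
    """Canonical definition: count of distinct (visitor, session) pairs."""
    return len({(e["visitor_id"], e["session_id"]) for e in events})

def conversion_rate(events):
    """Canonical definition: purchases divided by visits."""
    purchases = sum(1 for e in events if e["type"] == "purchase")
    v = visits(events)
    return purchases / v if v else 0.0

METRICS = {"visits": visits, "conversion_rate": conversion_rate}

events = [
    {"visitor_id": "a", "session_id": 1, "type": "page_view"},
    {"visitor_id": "a", "session_id": 1, "type": "purchase"},
    {"visitor_id": "b", "session_id": 7, "type": "page_view"},
]
print(METRICS["visits"](events))           # -> 2
print(METRICS["conversion_rate"](events))  # -> 0.5
```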

  • Ensuring Consistency Across Devices:

Using different analytics tools to measure different metrics can leave you with fragmented data that is difficult to analyze. For example, a user may log in to your website from a desktop and later make a purchase on a mobile device. If you use two different tools to capture desktop and mobile conversions, the same user may be recorded as two different users, and when you attempt to weave all the data together, you will have an incorrect view of your efforts.

Opt for a robust analytics tool that can collect, process, and measure data in a single dashboard. A tool that consistently captures information from different platforms will save a lot of time, trouble, and risk of inaccuracy.
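
When a stable login ID is available, cross-device stitching is conceptually simple; the sketch below (hypothetical records) groups events from two capture tools under one user:

```python
from collections import defaultdict

# Hypothetical events from two capture tools, keyed by the same login ID.
events = [
    {"user_id": "u42", "device": "desktop", "action": "login"},
    {"user_id": "u42", "device": "mobile", "action": "purchase"},
    {"user_id": "u77", "device": "mobile", "action": "login"},
]

journeys = defaultdict(list)
for e in events:
    journeys[e["user_id"]].append((e["device"], e["action"]))

print(len(journeys))    # -> 2 distinct users, not 3 device sessions
print(journeys["u42"])  # -> [('desktop', 'login'), ('mobile', 'purchase')]
```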

Finding the best analytics tool is a tough task; you need to experiment with different data tools to find the one that suits your organization.

 Conclusion 

With new advancements in data capture methods, data will be available to everyone, and the digital tools you use can be leveraged by your competitors too. Quality data is what will set you apart. The five dimensions above can help you capture quality data and optimize your digital analytics efforts.

 

The post Understanding & Maintaining Data Quality in Digital Analytics appeared first on Fusion Analytics World.


In a conversation with Fusion Analytics World, Mr. Stephen Mallik, CEO of ArtiE, explained how ArtiE is changing the enterprise landscape with its unique voice-activated AI platform, managed natively from any device.

Introduce yourself. What is ArtiE and how did it all start?

I am Stephen Mallik, originally from India; I immigrated to the USA to pursue the American dream. I am an entrepreneur with well-rounded startup experience, from seed and angel investment to serving at C-level, and from bootstrap to exit. I have 20+ years in the IT industry, and this project brings it all together: my past experience in the industry, wisdom, passion, and an innate vision.

ArtiE stands for Artificial Intelligence for the Enterprise. We started with a simple question: we use keyboard, mouse, and touch to interact with software, so why not voice? ArtiE is a platform for developing voice-activated software that works natively on any device.

What do you think sets ArtiE’s offering apart from competitors in the AI category?

Voice-activated software is not yet the norm. We expect keyboard, mouse, and touch to work with any software because the tools used to build software inherently enable these interaction methods as a standard. For voice to be standard, software creation tools would need to support it out of the box. Since no such tool has been available, a developer has had to invent his own solution to achieve voice activation, which involves a lot of R&D, knowledge, and integration of various technologies, making a voice-activated solution almost impossible. ArtiE fills that gap: a software development tool a developer can use to quickly and easily create voice-activated software that works natively on any device.

What is your view of the Artificial Intelligence (AI) landscape?

We are just getting started and nowhere close to developing true AI capabilities. There are various AI solutions targeting specific problems, but most are available as APIs rather than as platforms for building turnkey solutions, and they still need high-end engineers and a lot of R&D. In the field of voice-activated software, we are making the change by bringing in a platform for creating turnkey voice-activated software: a platform that is quick and easy to learn.

What do you see as the key factors for the success of enterprise AI products, and what are today’s roadblocks?

AI in general is an emerging solution for the enterprise, and current solutions are heavily focused on data. Voice solutions are a nascent concept, and the adoption curve is just starting to catch up. The timing seems right for voice intelligence in the enterprise IT industry because of the wild success of consumer voice solutions like Alexa and Google Home.

How do you see the AI market shifting in the next several years?

The AI market will continue to evolve. Currently we don’t see big participation from enterprise software giants like Oracle and SAP. We hope this will change in the coming years and that they will launch market-leading mainstream AI products.

What are the different sectors where AI can actually add value to existing processes?

AI is an obvious fit for data because of advances in machine learning (ML). As specific AI technologies mature, we will start seeing impact across industries. Voice, for example, has had little effect on the industry so far because the technology wasn’t mature: we have had voice-to-text and text-to-voice capabilities for decades, but they weren’t usable. In the last few years, advances in NLP (natural language processing), NLU (natural language understanding), and deep learning techniques have made voice intelligence viable for business. Likewise, as other technologies mature, we will see wider adoption in the industry.

Are there any particular industries picking up on the opportunities faster?

We are just starting out and do not have enough data to substantiate that. We are currently seeing success with voice-based solutions in the signage, IoT, and website industries.

What impact is AI having on voice intelligence? How are developers benefitting?

ArtiE is a development platform for creating voice-activated software. Developers will be able to delight their customers by providing a better user experience via voice.

Because ArtiE makes voice-activated software development quick and easy, even junior developers will be able to build voice-activated software effortlessly.

What will happen in terms of job losses and skills as AI makes devices more intelligent?

We are creating a new way to interact with software using voice. This means UI creators now have to start thinking about how to design for a voice-centric user experience. We think it will create demand for a new breed of VUI (voice user interface) developers, so what we are developing will lead to more job creation.

So what are the key challenges you are facing in product deployment? Convincing clients about AI and its benefits?

AI development translates to high product development costs due to the nature of the R&D involved and the team skills necessary to advance the product. The challenge is to create a business model that can support such high development costs.

From a client-adoption perspective, a solution approach works better than a technology approach, which brings along all the complexity of packaging a solution for specific industries.

Can you talk about two or three AI case studies that were recently completed?

Signage industry: A growing percentage of signage solutions are interactive. Currently users rely on a touch interface; adding voice changes the user experience entirely. We are expecting accelerated growth in this segment.

Website industry: Websites are a way to brand and market your products and services. A voice user experience drastically improves the impact and perception of your brand. Voice interaction also improves user engagement immensely, since it involves sensory processes like talking and listening.

IoT devices: Almost all IoT devices on the market come with a companion mobile app, and ArtiE can be used to create these apps. ArtiE enables you to talk to the device and the device to talk back, which changes how the user perceives it: the device now feels like a robot. By using ArtiE to build the mobile apps for their IoT products, firms can change the perception of their devices and enhance their abilities.

What is next for you and ArtiE?

ArtiE’s true goal is to solve enterprise problems using all available AI methods and techniques. We will continue to make iterative progress and improve our capabilities in speed, data, availability, user experience, functionality, intelligence, connectivity, solutions, and accessibility.

It was great talking to you, Stephen!

Pleasure talking to you, Kalyan.

 

The post ArtiE: Enterprise Platform for Building Voice Intelligent Solutions appeared first on Fusion Analytics World.


India has long been envisioned as a prominent hub of innovation and rising talent. Yet the skill and accomplishments of the countrymen behind these innovations too often go unrecognized. Aegis School of Data Science, Cyber Security & Telecom, along with key innovators and industry veterans, realized the importance of recognizing Indian innovators whose work is changing the lives of millions.

In 2010, the Aegis Graham Bell Award was initiated to promote innovators and innovations in the Information & Communication Technology (ICT) domain, as a tribute to the father of telephony, Alexander Graham Bell. The Aegis Graham Bell Award 2018 is now inviting nominations for its 9th edition.

Mr. Bhupesh Daheria, CEO of Aegis School of Data Science, said, “After the success of the 8th edition of AGBA, we are very excited to open nominations for the 9th edition, AGBA 2018. This year, we are delighted to announce our collaboration with IMC for the Telecom and Mobile categories. We wish all the nominees the very best of luck, and we thank COAI for their continued support.”

This year, AGBA will host two jury rounds: one for the Telecom and Mobile categories and another for the Focus and Tech categories.

The Telecom and Mobile category jury round will be held from 24th to 28th September 2018 in New Delhi; the nomination deadline is Wednesday, 5th September 2018.

The Focus and Tech category jury round will be held from 12th to 16th November 2018 in Mumbai; the nomination deadline is Friday, 12th October 2018.

The Cellular Operators Association of India (COAI) has been a supporting partner since the inception of the Aegis Graham Bell Awards. For the 9th edition, innovators in the Telecom and Mobile categories will also be felicitated at the India Mobile Congress, organised by COAI and the Department of Telecommunications, Govt of India, from 25th to 27th October 2018 in New Delhi.

Mr. Rajan S. Mathews, DG COAI, said, “India Mobile Congress is proudly collaborating with Aegis Graham Bell Awards, which has done tremendous work towards promoting innovation in the ICT and TMT sectors. The AGBA is a prestigious accolade recognising the contribution of individuals, companies and institutions towards the development of technology in the telecom and mobile sectors. Such endeavours ensure that contributions towards innovation and development in the sector are duly recognised and appreciated.”

AGBA is inviting nominations for 2018 under the following Telecom and Mobile categories:

  • Innovative Telecom Product/Solution
  • Innovative Mobile App for Enterprise
  • Bringing Fortune to the Bottom of the Pyramid
  • Service Innovation
  • Green Telecom
  • OSS/BSS
  • Innovative Managed Services
  • Best Value Added Services
  • Innovative Mobile App for Consumer
  • Telecom Infra

AGBA is inviting nominations for 2018 under the following Focus and Tech categories:

  • Innovative Smart City solution
  • Digital India Initiative
  • Application for Social Good
  • Innovation in AR, VR, Mixed Reality
  • Innovation in Cloud
  • Data Science/Analytics/AI in BFSI
  • Data Science/Analytics/AI in Health
  • Data Science/Analytics/AI in Telecom
  • Data Science/Analytics/AI in Media and Entertainment
  • Business Intelligence, BigData, AI, DL, Analytics and ML
  • Innovation in IoT
  • Cyber Security
  • Blockchain
  • Innovative Education Solutions
  • Innovation in eCommerce
  • Innovation in Health
  • Innovative Marketing Solutions
  • Innovative Enterprise Solutions

About Aegis School of Business:

Aegis School of Business, Data Science, Cyber Security and Telecom was founded in 2002 with support from Bharti Airtel to develop cross-functional technology leaders. In 2015, Aegis and IBM collaborated to launch India’s first Post Graduate Program (PGP) in Data Science, Business Analytics and Big Data, followed in 2017 by a PGP in Cyber Security. These programs are jointly certified and delivered by Aegis School of Business in association with IBM, which has set up a high-end Business Analytics and Cloud Computing Lab on campus. Aegis has also partnered with NVIDIA for Deep Learning and applied AI courses. Aegis is the No. 1 school for Data Science and among the top 5 in Business Analytics. It takes up industry projects, research, and consulting assignments in data science under its initiatives ‘Data Science Delivered’ and ‘Data Science for Social Good’, helping organizations develop skills in data science, ML, DL, Big Data, Analytics, and more.

 

The post 9th edition of Aegis Graham Bell Award nominations open appeared first on Fusion Analytics World.

