We recently held our first webinar, where our CEO talked about MVP readiness. An MVP, or minimum viable product, is a lean solution used to fast-track performance data while balancing a tight timeline and budget. We’ve outlined some of the main points of the webinar for easy reference.
What is a minimum viable product?
An MVP is a product that has just enough functionality and design to go live for faster access to performance data. It provides a product foundation for organizations to launch quickly and see ROI as fast as possible, all under a smaller budget and shorter timeline.
What is the value?
When you’re short on time or money, a minimum viable product is a valuable investment. Instead of spending more resources to produce a fully complete product, you can choose to build an MVP and further customize it according to incoming data. The freedom to roll out better versions post-launch means more effective iterations that can lead to better conversions.
Starting with the data
Like any data project, you first have to assess and organize everything you have. This is the only part of the process that cannot be condensed because the data is everything to the product.
Do you have access to all the data needed for the MVP?
What format are the data sources?
How clean is the data?
How does new data get added?
Is there a security policy around the data?
What sort of insights are being pulled from the data?
It’s imperative you spend the time to make sure the data structure is accurate to ensure the final product performance (and data) is not compromised.
Defining the vision
Every build needs a vision. What can you build when you don’t know what you want or need?
Is the idea and vision approved by internal stakeholders?
Is there a clear understanding of who this vision is focusing on?
Are there identified use case scenarios?
How will the user flow through the product?
What tools are being used to bring the idea into reality with visual examples?
Is there a document defining logo usage, brand colors, typography, etc.?
Ready to Build
The final product will be one that performs well and can evolve as new data is collected. Even though an MVP requires less time to build, you still need a careful assessment of internal efforts.
Has a budget been allocated and approved by internal stakeholders?
Are there any internal security restrictions?
What resources are available internally?
Is the team new to working with each other? How experienced is everyone?
The Real MVP
Once the build is complete, you’re ready to launch your MVP. Keep in mind that this webinar only outlines the basics of how to approach your next data analytics MVP (minimum viable product) project; it won’t address every potential challenge, but it’s a solid starting point. And the MVP process doesn’t end here, since there’s always a new product journey after launch. Learn more about how MVPs can help your organization by contacting us.
Stay tuned for new resources, including this webinar!
By now, you probably know that data needs structure and context to uncover actionable insights, which is why data visualization is such a powerful tool. However, analysis isn’t a one-step process that presents all possible insights at once. There are times when you need further analytics to unpack key insights, especially when things aren’t going right. Projects can fail, people lie, resources can dwindle, and countless other complications can arise. So how do you begin to dive deeper into data that isn’t explicitly connected to your problem? By using data visualization as a first step, you can begin to explore new correlations.
Oftentimes, looking for insights means looking for the root of a problem. When looking for patterns, visualizations are a great way to represent important data points because they reveal connections.
With insights like these, organizations and groups can execute more efficient and effective operations.
Business performance is usually measured using KPIs that track items like revenue and sales. But we all know there are always external factors that affect business, and finding those connections isn’t easy. Consider social media, a veritable hub of constant activity. Today, you can even shop through Instagram, which means there are new KPIs and metrics to consider, as well as new correlations. The social media counterfeit market generates about $1.2 trillion a year, moving through common payment apps like WeChat Payment and PayPal. It’s in the best interest of real brands and businesses to investigate counterfeiting because it affects brand image. No matter how much you think you know, you never have the full picture. When more people buy cheap knock-offs, sales and public image change. How? Use deeper analytics to find the connection.
Managing an office, department, team, and even your own work can be difficult. That’s where task management apps come in. Apps like Asana provide an organizational interface for users to keep track of tasks. A savvy user will know how to optimize the different features to fit their needs. Here at Boost Labs, we’ve found another way to make the most of our Asana experience.
How we use Asana
Asana is a simple and powerful task management app that we use at Boost Labs. Many of our clients use it too and it can be easily implemented for collaborative meetings. We prefer an agile approach to any project, which means sprint planning, backlog grooming, daily scrums, sprint review, and sprint retrospectives. It’s a lot of meetings so it’s imperative we make the most of our time via effective task management.
As suggested by the Scrum framework, we estimate task points to signal the level of effort. To do that, we use a custom field in Asana to track our team tasks. Unfortunately, there was no easy way to total tasks and points in Asana across columns in the board view or even in a list view. To fix this, we created an Asana tasks and points counter Chrome extension.
Why is this important?
It saves time when calculating task points to measure the velocity of the sprint and maintain better communication with the client.
Imagine you have 40 cards on your board, and each card has a point value ranging from 1 to 21. To calculate all the points on the board, you have to add each one manually, and if an estimate changes or a new card is added, you have to do the calculation all over again. It’s also hard to answer simple questions like:
How many tasks do we have in this sprint?
How many tasks are in progress?
How many points do we already have in this sprint?
How many points have we already completed this sprint?
Asana can’t provide fast answers at a glance, but our Chrome plugin can.
What is it?
It’s a plugin for the Chrome desktop browser that automatically calculates points across all tasks on your screen. The extension counts each task and its assigned points, then displays the totals for each section in a list view or each column in a board view. Points assigned can be seen in the bottom left corner of tasks, and the sum of all points is shown at the top.
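The per-column math the extension automates is simple to sketch. Here is a minimal Python illustration of it; the card data and field names below are invented for the example, not pulled from Asana’s actual API:

```python
# Hypothetical board data: each card belongs to a column (section)
# and carries a point estimate from its custom field.
cards = [
    {"column": "To Do",       "points": 8},
    {"column": "In Progress", "points": 5},
    {"column": "In Progress", "points": 13},
    {"column": "Done",        "points": 3},
]

def summarize(cards):
    """Return {column: (task_count, point_total)}, like the per-column sums."""
    summary = {}
    for card in cards:
        count, total = summary.get(card["column"], (0, 0))
        summary[card["column"]] = (count + 1, total + card["points"])
    return summary

print(summarize(cards))
# → {'To Do': (1, 8), 'In Progress': (2, 18), 'Done': (1, 3)}
```

The extension does the equivalent aggregation over the cards rendered on screen and refreshes it as the board changes.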
How do I use it?
Create a custom points field in each task and add it to each view (Board and List views). Points must be a custom Asana drop-down property with numbers only, in grey. Calculations of tasks and points update automatically every 2 seconds. All you have to do is assign points.
In the data world, there are simply too many fancy phrases that get overused. “Big data” is one that is often used in an attempt to describe what our industry actually does. Which brings us to the point of this blog post: the phrase “Data Analytics Products.”
When you do a basic Google search on it, you end up with results that include solutions like Tableau, PowerBI, and Looker. These companies do in fact sell a product that is focused on Data Analytics. But, to us, that only solves part of the need.
Our definition of Data Analytics Products is software that is based on Data as the core ingredient (the heart). There are two ways you can look at a Data Analytics Product:
Dashboards: a software solution that delivers visual insights (aka dashboards via a BI platform).
Data Analytics Product: a custom software solution or product that recognizes existing workflows and enhances them by bringing data into the mix.
The ROI on both products is clear to see:
Dashboards focused on the right data story can deliver actionable insight opportunities, both internally and externally. The value isn’t focused just on the tech used to deliver the insight, but also the right data story using the right tool to represent itself. We consider this a one-way conversation the user has with the data provided.
A custom software solution leverages an organization’s existing data to help users navigate workflows. As users provide the solution with more information, the growing data behind the scenes offers prescriptive suggestions. This alone is ROI that defines growth strategies for most organizations.
One solution isn’t better than the other, as each serves a different need. Data is becoming a more quantifiable asset, causing some of the leading Private Equity firms to reexamine how to conduct company assessments with the Data they own. This need to “rethink” business evaluation formulas is a major indicator of what the future may hold.
The question really becomes: how well are you leveraging your data?
AWS, RDS, Azure, Oracle. You’ve probably heard of these services at some point and wondered what they can do for you. We keep hearing about the cloud and how great it is, but do you know why businesses are choosing it? The cloud is a complex system of IT infrastructure that handles the everyday difficulties of building and maintaining a server environment. Simply put, cloud computing is like a home away from home: it holds your data and environments so you don’t have to buy countless servers that run hot and keep IT support on standby all day. Any DIY hardware solution means money upfront for the equipment and people needed for development. Instead of dealing with the physicality of countless computers, servers, hard drives, VPN connections, and extra staff, you could have all the hardware and connections ready at any time.
APIs, migrations, business analytics, servers, security: all require expertise. Are you prepared to handle everything?
What is cloud computing?
Most businesses that accommodate large amounts of data need lots of computers, hard drives, and servers. We might not be able to see data, but we have to handle the physical space it occupies. A typical in-house operation will have servers for memory and operations. But what happens when the business starts to grow? More data means more memory, and more memory comes from hard drives. A single hard drive, server, and connection can only hold so much data, so you have to buy new servers as data volumes grow. To alleviate the stress of growing data, people and businesses turn to cloud operations. Data servers provide storage and computing capabilities while platform resources make it easy to build and connect applications, all without continuous management.
Within servers, every database and data stream needs infrastructure to facilitate processing. What is the data source? Where will data be stored? What platform or environment is needed? How are connections made? Cloud computing will establish the connections and data repositories you need for smooth data architecture. Things like data mining and database management become easier when you don’t have to spend time (and money) on running your own server farm.
How does it help business?
Can you buy all the resources you need in time for new data? Can you predict when you’ll need to expand? How long will it take to buy, build, and maintain the servers and connections you need? There are uncertainties with data that, if not addressed, will cost you time, money, and customers. When businesses start to grow, there’s an influx of data. This leads to data ingestion problems that affect processing and analytics. If there are no appropriate storage spaces, connections, or support, your business won’t be able to grow. Buying servers, space for the servers, people to support the servers, and more developers can cost a business anywhere from tens to hundreds of thousands of dollars.
When you’re not worrying about buying more equipment, you can focus on better data analytics and cost-effective scaling. It all stems from better storage and accessibility that doesn’t require time-consuming capital expenses and manpower. Just pay for the computational power you use and request more space as needed. Cloud computing can provide a ready environment for developers to create and manage applications without having to worry about servers and connections.
What do I need?
Reaching fully functional cloud services takes a few crucial steps. It takes time and the help of skilled developers to migrate and translate functions and configurations. Depending on your data, there might be a longer migration period or more translation needed to perfect the transition. It’s an analytics process nonetheless, one that requires data, testing, and validation, so consider budgets, timelines, and scope (like any other project). Growing data is unpredictable and unmanageable if you aren’t prepared before big data comes your way. Make the most of your data.
For more information about our data ingestion, data migration, or data lake services, contact us.
Excel spreadsheets have been around for more than 30 years and they’re still valuable. The original concept isn’t much different from what we use today; it just looks better and has a lot of new capabilities.
But aren’t Excel spreadsheets outdated? They’re manual, and there are better software programs.
Spreadsheets are still relevant and a great tool for learning about data. True, they’re not the only or the best-fitting solution for every data project, but Excel remains a reliable and affordable tool for analytics. It’s a foundational structure for intelligent data work because it deepens your understanding of the analytics process. Many industries and businesses continue to emphasize the importance of Excel skills because it remains an intelligent way to extract actionable insights. Revenue patterns, operations, marketing trends, and more can be analyzed through Excel spreadsheets, but the real advantage is the process.
What does Excel do?
It’s true that larger corporations have moved away from spreadsheets at big data scale; however, spreadsheets are still used for everyday work. In its most basic form, Excel holds data points in each cell. Raw data exports, dates of sales, SKUs, or units sold are entered (or imported) into a spreadsheet for easier viewing and organization. A successful Excel spreadsheet organizes raw data into a readable format that makes it easier to extract actionable insights. With more complex data, Excel lets you customize fields and functions that make calculations for you. Even with larger data sets, segmented data can be studied carefully and visualized without other software. You can determine hypothetical profit margins or department budgets. While Excel won’t build a full-scale data product on its own, it can present easy-to-read visualizations and accurate calculations.
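To make the “hypothetical profit margins” idea concrete, here is a small Python sketch that mirrors what a formula column in a spreadsheet does. All figures and SKU names below are invented for illustration:

```python
# Mirrors a simple sheet: each row is a record, and the "margin"
# column is a derived formula, like =(B2-C2)/B2 in Excel.
rows = [
    {"sku": "A-100", "revenue": 1200.0, "cost": 800.0},
    {"sku": "B-200", "revenue": 950.0,  "cost": 700.0},
]

for row in rows:
    # Profit margin = (revenue - cost) / revenue
    row["margin"] = (row["revenue"] - row["cost"]) / row["revenue"]

for row in rows:
    print(f"{row['sku']}: {row['margin']:.1%}")
# → A-100: 33.3%
# → B-200: 26.3%
```

The point of the exercise is the process: define the inputs, derive the metric, and present it in a readable form, exactly as you would in a spreadsheet.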
Say you’re thinking of becoming an analyst, or you need to handle data to build a report, but analytics isn’t the simplest process to grasp in a single sitting. Use a spreadsheet as a microcosm of a larger data project and ask the same questions:
What is the goal? Overview? What are the insights you need?
Where is the data coming from? What exports/imports need to be made?
Does the data need to be translated?
What roadblocks exist? Limitations?
How are you making conclusions? What post-analysis decisions need to be made?
A real big data project requires far more manpower, skill, and attention to detail, but Excel provides a great introduction to the context.
At Boost Labs, we use Excel spreadsheets for many internal applications. We use it to budget and timeline projects, create simple operational dashboards, and even keep track of lunch orders. All spreadsheets fulfill a specific need and are designed to work as a visual data aid. The process of consolidating data points and creating a cohesive narrative is the ultimate goal of any data analysis and Excel can help. Many organizations use Excel files to catalog data sets, import data, create data models, and more. In coming years, Excel is expected to change even more and handle a bigger range of data.
When we work with our clients, we rely on the data provided to us. It can be any kind of data and in many instances, spreadsheets are used to compile necessary data. It’s only after the data story and workflows are established that we can move onto design and development.
Keep in mind
Better analysis means better data products. Excel is a tool for data analytics, not always a complete solution. Use its different functions to explore the data for better insights. So get started with Excel spreadsheets and see what you can do with data.
At its best, information privacy is what keeps our personal data safe from hackers and thieves. But we know this isn’t what really happens. Many businesses routinely sell user data for profit. We’ve seen countless headlines about poorly handled data breaches, lawsuits, and illegally acquired data. So people are fighting for data transparency and their right to data accessibility. Here’s what you need to know.
The best combination of information privacy and data transparency balances the security of personal data and honest business practices. The public wants to know what happens to their data while gaining a deeper insight into operations. A lack of transparency and accountability drives mistrust and poor insights. We’re not saying you have to stop data collection or share every piece of data with the public, but you should be able to share insights that involve personal data and prove you aren’t being dishonest.
If our data is being sold, shouldn’t we get a piece of the action, a “data dividend” so to speak? Or at least be told what we want to know. Despite customers’ engagement in providing data, we don’t have access to the data that affects us. Because companies know privacy issues damage reputations, they often refrain from revealing data problems or keep quiet about secret deals. Intentionally withholding data that personally affects your audience is deceitful and disastrous.
Some information must still be safely handled and protected. Other information should be public knowledge to encourage engagement and regain public trust. Become a trustworthy organization by practicing honest data. It can be as simple as building a report or as in-depth as a custom platform and anything else in between. Whatever solution you choose should always revolve around data insights and honesty.
Domesticated pets have been around for centuries, and thanks to our improving understanding of animals, pet owners have better access to products and services that make pets’ lives better. Today, the range of pets includes furry friends, insects, reptiles, birds, fish, amphibians, and more. And we can thank data analytics for much of it. Pet care as an industry giant is relatively new, but it’s an exciting example of building market value via data analytics. More people are adding pets to their families than ever.
In the past 20 years, the pet industry has grown into a market worth more than $70 billion. That means pet owners are spending tens of billions of dollars every year on pet needs. Healthy food & treats, toys & enrichment, and behavioral studies mean better health, fun, and training. From buying specialty food to a heated cat house, the market is filled with so many modern luxuries we didn’t have in the past and it’s all thanks to marketing insights.
Data, market, more data
So how were businesses and disruptors able to build such a huge empire? Data analytics. The obvious value of data insights in business is profit or ROI and that’s true for most industries. For a relatively new market like pet care, there’s no precedent for a lot of products and animals so data collection is key. Collecting data on households, purchasing history, and other indicative statistics can be used to identify untouched opportunities.
It makes sense that such a large market could only emerge alongside technological growth. Buyers are more educated about pets and can easily purchase goods and services online. And ecommerce gives businesses far more consumer data than ever before, which leads to sales of automatic pet feeders, cameras, and even pet-safe plants to nibble on. And it’s all thanks to data insights.
Much like machine learning, natural language processing takes in new data and responds to it, like Siri does when you ask for the nearest restaurants. But unlike the other branches of AI, natural language processing is literally conversational. It’s one of the core foundations for those talking robots we keep seeing, such as Little Sophia, the younger “sister” of Sophia.
Hanson Robotics – Little Sophia
But unless you’re working on robotics AI, you should be using natural language processing as a way to optimize performance. Natural language processing is fundamental to customer engagement and retention. Most businesses aren’t utilizing NLP, subsequently missing out on profitable opportunities.
How does it work?
In a general scope, AI looks for patterns via algorithms (designed by humans) within incredibly large data sets. This smart analysis processes more data than any human could, with incredible accuracy (so long as the algorithms are carefully constructed). With NLP, the software finds meaning in patterns of speech by taking in textual data and then mirroring human conversation. It translates words by comparing them to its own memory database, identifying patterns, combinations, and context, and then producing a human-like response. In reality, AI is a very complex field and development requires a lot of testing, but consider it an investment.
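To make the match-against-known-patterns idea tangible, here is a deliberately tiny Python sketch. Real NLP systems use statistical models trained on huge corpora; this toy only shows the skeleton of matching input text against stored patterns and returning a response, and every keyword and reply in it is invented:

```python
# Toy pattern matcher: map known keywords to canned responses.
# Real NLP replaces this lookup with learned statistical models.
RESPONSES = {
    "refund":   "I can help you start a refund request.",
    "hours":    "We're open 9am-5pm, Monday through Friday.",
    "shipping": "Orders usually ship within two business days.",
}

def reply(message):
    """Return the first matching canned response, or a fallback."""
    words = message.lower().split()
    for keyword, response in RESPONSES.items():
        if keyword in words:
            return response
    return "Sorry, I didn't understand. Could you rephrase?"

print(reply("What are your hours today?"))
# → We're open 9am-5pm, Monday through Friday.
```

The gap between this sketch and production NLP (tokenization, context, learned embeddings) is exactly where the development and testing effort mentioned above goes.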
This doesn’t mean the technology is perfect, it just means that natural language processing is constantly evolving and adapting to complement business efforts. On the customer end, we simply contribute to the growth by providing data. Every interaction becomes a learning experience for AI that eventually leads to better user experience.
What does it do for business?
AI assistants have been around for what seems like ages in the tech world, but the technology has more potential outside the home. Any kind of conversation computation is a data goldmine for organizations. Textual data in large data sets is nearly impossible to analyze by hand, yet holds enormous value. The smart tactic is to use natural language processing to comb through everything. You get actionable insights that improve brand performance, marketing tactics, revenue, and more. Meanwhile, customer engagement increases because tools like chatbots provide fast responses and live data, not only improving user experience but also providing better contextual data for marketing. Customers also get better recommendations they’re more likely to act on.

These are only the most common NLP solutions currently in use, so they sound much simpler than a talking robot. However, the potential still stands. NLP creates a self-learning model that continues to evolve with every interaction, which means user experience, campaigning, advertising, and more can only get better at every turn. NLP can even analyze public sentiment by reading and picking up on literal and implied opinions in social media comments, articles, reviews, and any other written text. The science behind it is undoubtedly complex, but that’s why NLP yields equally detailed insights.
Explore NLP and see what it can do for you.
Today is National Burrito Day, a day about appreciating the delicious culinary creation and the surprisingly significant data insights it carries.
From conquistadores to the people of Sonora, the burrito has a rich Mexican history as well as strong American popularity. Starting in the 1950s, as Mexican cuisine moved further north, Americans were able to enjoy a brand-new culinary experience. Now it’s so popular that Chipotle markets itself as a “lifestyle brand” and has even changed the burrito landscape with a bowl. The chain now has more than 1,700 locations for Americans to enjoy.
The price of the burrito = important data analytics
The burrito also represents the importance of data analytics because of its cultural, economic, and even socio-political impacts.
The average price of a burrito, like that of other popular foods, is an indicator of purchasing power. From 2001 to 2016, the price of a burrito rose from $2.50 to $6.50, a 160% increase. However, official inflation over the same period was only 35%, meaning you would need $1.35 in 2016 to have the same purchasing power as $1 in 2001. Following the official inflation rate, a 2016 burrito should have cost $3.38, but instead cost nearly double that. It sounds funny to measure the value of a dollar against a burrito, but it’s an alarming indicator of an inflation imbalance. We aren’t earning enough, and our money isn’t worth enough, to accommodate the rapidly rising cost of living. But inflation isn’t the only burrito data insight to worry about.
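The arithmetic behind those figures is easy to reproduce. A quick Python check of the 160% increase and the $3.38 inflation-adjusted price:

```python
# Burrito prices and official cumulative inflation, 2001-2016,
# as cited in the text.
price_2001 = 2.50
price_2016 = 6.50
inflation = 0.35

# Actual price growth vs. what inflation alone would predict.
burrito_increase = (price_2016 - price_2001) / price_2001
expected_2016 = price_2001 * (1 + inflation)

print(f"Burrito price increase: {burrito_increase:.0%}")  # → 160%
print(f"Inflation-adjusted price: ${expected_2016:.2f}")  # → $3.38
```

The gap between $3.38 and $6.50 is the imbalance the paragraph above describes: the burrito nearly doubled what inflation alone would predict.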
Varying ingredient prices can usually be linked to agricultural and natural events, but they’re also connected to politics. President Trump has been pushing to close the US-Mexico border to keep immigrants out. However, data from the Department of Agriculture tells us that of all fruit and vegetable imports, nearly half come from Mexico. And about 100% of the avocados Americans eat are grown in Mexico, so if Trump gets his way, our food supply, businesses, and diets will change drastically, with other consequences rippling out.