AI and Automation Meet BI

Automation and AI will democratize BI, expand the user base of BI tools, and enable users to perform increasingly sophisticated analytics. New tools will reduce time to insight (TTI) by enabling data and business analysts to extract and transform data, uncover patterns, and produce more accurate forecasts, root-cause analyses, and simulations.

Organizations use a variety of BI tools to analyze structured data. These tools are used for ad-hoc analysis and for the dashboards and reports that are essential for decision making. The typical BI user is an analyst, and as BI and data management tools have evolved, analysts have been able to add advanced analytics (and even machine learning) to their toolboxes.

In this post, we describe a new set of BI tools that continue this trend. These new tools make it easier for analysts and business teams to analyze data and generate reports with minimal assistance from their IT counterparts. Accompanying improvements in ETL and data management systems expand the data sources BI users can draw on while further lessening the need for help from IT teams.

The initial set of companies we list offers mainly SaaS systems. Thus, companies that want to use these new BI solutions will need to move their data to public clouds.

BI solutions first appeared in the 1970s with early systems from companies like SAP and JD Edwards. The growth of data warehouses in the 1980s gave rise to a new set of solutions, including MicroStrategy, Cognos, and Business Objects. This early group of BI solutions (“BI 1.0”) was owned by the IT department, meaning that most users could not create reports and dashboards on their own. Users had to undergo extensive training to become proficient in using and administering these solutions. This generation of tools focused primarily on producing reports and dashboards.

The early 2000s accelerated BI development and saw a concentration of BI in the hands of IBM, Microsoft, SAP, MicroStrategy, and Oracle. This generation of BI systems let users perform ad-hoc analysis on top of pre-generated schemas. More precisely, users could create dashboards and “slice and dice” data across different dimensions and metrics.

The mid-aughts saw the rise of new solutions built for data analysts. This new set of “BI 2.0” tools – exemplified by Tableau and Qlik – placed a strong emphasis on visualization, interactive analysis, and ease of use. These companies also introduced a new way of interacting with data – visual pivoting – which combined pivot tables with charts and visualizations. Users might still rely on IT to connect their BI tool to a data warehouse or database, but they could also use these tools on datasets they controlled, such as spreadsheets or text files. Once connected to a data source, analysts could run ad-hoc analyses and create and publish dashboards and complex visualizations without involving their IT department. These tools made BI and interactive analytics ubiquitous.
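
To make the idea of pivoting concrete, here is a minimal sketch in Python with pandas of the kind of pivot an analyst might run; the sales data and column names are hypothetical, and BI 2.0 tools expose the same operation through drag-and-drop dimensions and measures rather than code.

```python
import pandas as pd

# Hypothetical sales data an analyst might load from a spreadsheet or CSV.
sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West", "East", "West"],
    "quarter": ["Q1", "Q2", "Q1", "Q2", "Q1", "Q2"],
    "revenue": [120, 150, 90, 200, 80, 170],
})

# Pivot: regions as rows, quarters as columns, summed revenue in the cells --
# the same slice-and-dice a BI 2.0 tool would render as an interactive chart.
pivot = sales.pivot_table(index="region", columns="quarter",
                          values="revenue", aggfunc="sum")
print(pivot)
```

Rendering this result as a chart that can be re-pivoted on the fly is what made the analysis accessible to non-programmers.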

Despite the advances in BI over the last 30 years, the core function of data analysts has remained largely the same. An analyst frequently starts with a hypothesis or a question and interrogates data to refine his or her understanding. This is an iterative process that might entail wading through a series of hypotheses – using a BI tool pointed at a high-dimensional data set – before settling on a reasonable answer. What if this process could be automated?

BI 3.0 tools attempt to address two significant issues: reliance on IT and manual analysis. To solve the first issue, data analysts need to be able to create their own data models, either with solutions that generate data warehouses automatically (e.g., ThoughtSpot, Hypersonix) or with solutions that abstract the ETL process (e.g., Fivetran, Matillion).
