Construction of a New Data Analytics Platform

CHALLENGE

There are many reasons why a company might want to rebuild its data platform on a greenfield basis: either no data warehouse exists in the company yet, or it is more efficient to build a new one than to adapt an existing one. Even extensions of existing solutions are essentially limited new implementations. Building a data warehouse requires time, capital, and know-how. ETL or ELT tools are usually used for this purpose, which means a manual, step-by-step, and therefore slow development process that can no longer keep pace with today's rapidly changing business requirements.

SOLUTION

With AnalyticsCreator, you can create a data analytics application such as a data warehouse with graphical support in no time at all, connect any data source, transfer the data to your cube or Microsoft Tabular model, and make the analysis models available to any BI/analytics front end. In this way, you can easily connect business requirements with your data sources and map the core business logic directly in AnalyticsCreator.
If something changes in the data source, in the logic, or in the process, the changes are automatically updated and published across all layers.

Benefits

Ultra Fast Prototyping: get results in hours, not in days

Time to market of your project: up to 10x shorter

Time & Cost Savings

Automatic documentation

Drastic reduction of project risk

Modeling instead of programming

Agile project methodology

On-Premises and Azure Cloud Operations

The code is freely usable without restrictions, also without AC

Advanced support for front ends: Power BI, Tableau, Qlik Sense, TM1

Analytics platform modernization

CHALLENGE

The current data warehouse is outdated and no longer meets the new, extended requirements that the data foundation should deliver to the analytical front end. Its performance may no longer be sufficient, adaptations are difficult to implement, and it is no longer future-proof. Additionally, the know-how about the existing data warehouse may no longer be readily available because the employees or consultants who built it have since left.

SOLUTION

AnalyticsCreator allows you to build on an existing DWH and make extensions and adjustments. If a good foundation is available, it is easy to build on top of it. Additionally, AnalyticsCreator's reverse-engineering methodology enables you to take code from an existing DWH application and integrate it into AC. This way, even more layers/areas can be included in the automation and thus support the expected change process even more extensively. Extending a manually developed DWH (i.e., one built with an ETL/ELT tool) can quickly consume time and resources. From our experience and various studies available on the web, one rule can be derived: the longer the lifecycle, the higher the costs.

Benefits

Protect existing investment in DWHs

Utilize state-of-the-art modeling capabilities

Use reverse engineering methods

Secure knowledge

Prepare the way to the cloud

Automatic documentation

GDPR support

Ultra Fast Prototyping: results in hours, not days

A data platform for Power BI

CHALLENGE

The challenge is to connect different data sources with Power BI. This involves processing large volumes of data, which can be complex. Additionally, the data needs to be historized in order to use the corresponding advanced analytics tools and methods, a tabular model needs to be built, and much more. Power BI is one of the most popular analytics front ends worldwide. In Power BI, you can design a multidimensional model for analytic applications and connect it to data sources in the cloud or on-premises. To achieve higher data quality and meaningful results, data is typically extracted from the source, cleansed, logically prepared, joined with correlating data, and aggregated. Additionally, model-based and technical performance optimizations are necessary. Another very relevant point is that historization of the source data should be implemented and stored in a data warehouse, which is not possible with Power BI alone. Only in this way are timely and correct analyses and reports possible.
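
To make the historization requirement concrete, the following is a minimal, hypothetical sketch of Type 2 historization (slowly changing dimensions), the technique a data warehouse layer typically uses to keep every past state of the source data queryable; the table and attribute names are illustrative only and are not AnalyticsCreator specifics.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Minimal sketch of Type 2 historization ("slowly changing dimensions"):
# every change in the source produces a new row with a validity interval,
# so any past point in time remains queryable. All names are hypothetical.

@dataclass
class CustomerVersion:
    customer_id: int
    city: str
    valid_from: date
    valid_to: Optional[date] = None  # None = currently valid version

def apply_snapshot(history: list, snapshot: dict, load_date: date) -> None:
    """Close changed versions and append new ones for today's source snapshot."""
    current = {v.customer_id: v for v in history if v.valid_to is None}
    for customer_id, city in snapshot.items():
        existing = current.get(customer_id)
        if existing is None:
            history.append(CustomerVersion(customer_id, city, load_date))
        elif existing.city != city:
            existing.valid_to = load_date  # close the old version
            history.append(CustomerVersion(customer_id, city, load_date))

history = []
apply_snapshot(history, {1: "Munich"}, date(2024, 1, 1))
apply_snapshot(history, {1: "Berlin"}, date(2024, 6, 1))  # customer moved
print(history)  # two versions of customer 1, each with its validity interval
```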

SOLUTION

With AnalyticsCreator, you can design the data model for your analytical Power BI application and automatically generate a multi-tier data warehouse with the appropriate loading strategy. In the process, the business logic is mapped in one place in AnalyticsCreator and implemented in a wide range of Microsoft technologies. The model can be deployed to Azure SQL DB, Azure Analysis Services, Power BI, or even on-premises technologies. AnalyticsCreator also lets you connect to an enormous number of data sources, deliver the data to other front-end technologies, and create an additional timeline that gives you access to any time slice of the data warehouse.

Benefits

More power for Power BI

Orchestration of the MS BI stack on-premises & in Azure

Best-practice data modeling for Power BI

High-performing delta loads

Enables slowly changing dimension concepts

Near real-time concepts

Single point of information

Data Vault 2.0 Automation

CHALLENGE

If you have many source systems, large amounts of data, or a complex data model that may change frequently, then Data Vault 2.0 modeling is ideal for building your Data Analytics Platform. Or you may simply have decided to build your Data Analytics Platform using the Data Vault 2.0 modeling approach. Either way, it is advisable to tackle such complex projects with an automation tool. Historization concepts, automated hash key generation, and a transformation from the Data Vault model into a dimensional model up to the front end (e.g. Power BI) need to be provided.
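
As a rough illustration of the hash key concept mentioned above (not AnalyticsCreator's internal implementation), a Data Vault 2.0 hub key is usually derived deterministically from the normalized business key, for example:

```python
import hashlib

# Sketch of Data Vault 2.0 hash key generation: the hub key is a deterministic
# hash of the normalized business key, so every load process can compute the
# same key independently. Delimiter and normalization rules are illustrative
# assumptions, not AnalyticsCreator specifics.

def hub_hash_key(*business_key_parts: str, delimiter: str = "||") -> str:
    normalized = delimiter.join(part.strip().upper() for part in business_key_parts)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest().upper()

print(hub_hash_key("4711"))        # single-part business key
print(hub_hash_key("4711", "DE"))  # composite business key (e.g. customer + country)
```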

SOLUTION

AnalyticsCreator includes a wizard that generates a first draft of a Data Vault model immediately after the source data is connected. The developer can customize it and save their own template for future drafts. AnalyticsCreator alerts the developer with hints if they deviate from the strict rules of DV 2.0, but does not restrict them. In addition, AnalyticsCreator offers the possibility to transition from the DV model to a dimensional model. The latter can lead directly to a cube or tabular technology or be transferred to a BI/analytics front end. We call this process a hybrid approach. It is also possible to use DV 2.0 achievements in a classic multidimensional model, thus combining the advantages of both worlds.

Benefits

Guided Data Vault 2.0 modeling

Support of any Data Vault paradigm

Any layer: raw vault, business vault, etc.

Use of all features for multidimensional modeling

Automation in the change process

Visual representation of the models

Clarity with very large data volumes and data models

Switch to a cloud Data Analytics Platform on Azure

CHALLENGE

There are numerous architectures and ideas on how to operate a data warehouse in the cloud. But which concept is the right one? More and more ERP and software providers offer operational solutions in the cloud. It therefore makes sense to connect these data sources and transfer them to your cloud data warehouse.

SOLUTION

With AnalyticsCreator, you initially model independently of whether your data will be stored in the cloud or on-premises. First and foremost, you need to take the business model into account. At the latest possible point in the deployment process, you decide whether the model is to be built or migrated on-premises or in the Azure cloud. For special hybrid architectures, you can specify which areas/layers are to be stored and loaded where.

Benefits

Rapid migration from on-premises to Azure

Support for Azure SQL DB, Azure Analysis Services, Azure Data Factory, Power BI

Modeling for different purposes

One model for different targets

Create a Data Analytics Platform from SAP Sources in the Microsoft BI Stack

CHALLENGE

Your source system is SAP. You may also want to make other data sources, such as social media data, cloud data, IoT data, etc., available to your users for analytical purposes in a central data analytics platform in the cloud or on-premises. For this, you usually require expert knowledge, a significant amount of time, and the right tools.

SOLUTION

AnalyticsCreator includes special components that allow you to access your SAP source, decode its metadata, and create an AnalyticsCreator metadata connector from it. In addition, predefined SAP metadata connectors are already available in AnalyticsCreator. From these, the AC wizard automatically creates a multi-layer data warehouse (DV 2.0 or dimensional), which can be customized. Microsoft Tabular and cube models, Power BI apps, Tableau, and Qlik Sense models are automatically generated from it. For ongoing operation, we recommend the data connector from Theobald Software, which can be delivered in the package on request.
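
For context, and independent of AnalyticsCreator or the Theobald connector, reading SAP table metadata "by hand" typically means calling SAP function modules via RFC; the sketch below assumes the open-source PyRFC library and an accessible SAP application server, with all connection parameters as placeholders.

```python
from pyrfc import Connection  # requires the SAP NetWeaver RFC SDK and PyRFC

# Generic sketch of reading SAP table metadata via RFC. Host, credentials and
# the table name (MARA, the material master) are placeholders.
conn = Connection(ashost="sap-host", sysnr="00", client="100",
                  user="RFC_USER", passwd="secret")

# DDIF_FIELDINFO_GET returns the data dictionary definition of a table:
# field names, data types, lengths and descriptions.
meta = conn.call("DDIF_FIELDINFO_GET", TABNAME="MARA")
for field in meta["DFIES_TAB"][:10]:
    print(field["FIELDNAME"], field["DATATYPE"], field["FIELDTEXT"])
```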

Benefits

Create a highly professional DWH up to 20 times faster

Extremely fast ROI

State-of-the-art architectures and methods are used

Transparency for business departments, BICC, and IT

Data lineage and automatic documentation

Collaboration features for large enterprises

Distributed development

No dependency on AC

SAP connectivity: ODP objects, DeltaQ, tables, HANA, CDS views

Theobald connector included

Data Lakehouse Automation

CHALLENGE

Model unstructured or semi-structured data in a data lake architecture and combine or supplement it with a conventional DWH approach. The overall view of the data model must be ensured, and a data dictionary must be made available to BI developers, analysts, and the business.

SOLUTION

With AnalyticsCreator, you can combine the advantages of a conventional DWH and a data lake. Data lake models or targets can be built and modeled with AnalyticsCreator.
AnalyticsCreator offers metadata-based data warehouse development. The definition of every source, table, transformation, or task is stored in the metadata database, which is open and available for further use. Users have full access to the metadata and can manipulate it directly without writing SQL scripts. The AnalyticsCreator metadata database has a very simple and user-friendly data structure that can also be used in a data lakehouse architecture.
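
As a purely hypothetical illustration of what "open metadata" enables, the sketch below queries an invented metadata table with ordinary SQL tooling; the table and column names do not document AnalyticsCreator's real metadata schema.

```python
import pyodbc

# Hypothetical example only: the table and column names below are invented for
# illustration and do NOT document AnalyticsCreator's real metadata schema.
# The point is that an open, relational metadata repository can be queried
# with ordinary SQL tooling, e.g. to list the transformations feeding a target.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
    "DATABASE=AC_Metadata;Trusted_Connection=yes;"
)
cursor = conn.cursor()
cursor.execute(
    "SELECT TransformationName, SourceTable, TargetTable "
    "FROM Transformations WHERE TargetTable = ?",
    "DWH.FactSales",
)
for name, source, target in cursor.fetchall():
    print(f"{name}: {source} -> {target}")
```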

Benefits

One model for different targets

Fast shift to the Azure cloud

Data lineage & automatic documentation

Data export

CHALLENGE

Sometimes it is necessary to use the data of a DWH elsewhere, be it in other analytical databases, blob storage, cloud storage, or simply text files, in order to process it further. Alternatively, you may want to write data calculated in the data warehouse back into your source database. For example, you could calculate the customer debt limit in the data warehouse or use AI functions provided by Microsoft to create predictions that you then want to reuse in your operational system.

SOLUTION

AnalyticsCreator generates SQL Server Integration Services (SSIS) packages and Azure Data Factory pipelines to export data from your data warehouse to external databases. AnalyticsCreator supports any OLE DB or ODBC driver for this export; the suitable driver writes the data into your target system. CSV, text files, and Azure Blob Storage are supported as well. When exporting to Azure Blob Storage, you can use CSV, Parquet, and Avro formats.
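
For illustration, the following is a generic Python sketch of one such export target, writing a result set to Azure Blob Storage in Parquet format; it is not the SSIS/ADF code that AnalyticsCreator generates, and the connection string, container, and data are placeholders.

```python
import io

import pandas as pd
from azure.storage.blob import BlobServiceClient

# Generic sketch: write a result set to Azure Blob Storage as a Parquet file.
# Not the SSIS/ADF code AnalyticsCreator generates; all values are placeholders.
df = pd.DataFrame({"CustomerId": [4711, 4712], "DebtLimit": [15000.0, 2500.0]})

buffer = io.BytesIO()
df.to_parquet(buffer, index=False)  # needs pyarrow (or fastparquet) installed
buffer.seek(0)

blob_service = BlobServiceClient.from_connection_string("<storage-connection-string>")
blob_client = blob_service.get_blob_client(container="dwh-exports",
                                           blob="customer_debt_limit.parquet")
blob_client.upload_blob(buffer, overwrite=True)
```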

Benefits

Use analytics data everywhere

Use Azure Blob Storage

Tableau push

CHALLENGE

When using Tableau as an analytics front-end technology, it is necessary to connect it with your data sources, such as a data warehouse or any source data in structured or unstructured form. Tableau offers data transformation capabilities that enable you to transform data in the way you want to use it in the front end. However, using Tableau without a DWH can result in significant quality degradation. The most time-consuming part of an analytics project is data preparation, which takes up approximately 60-80% of the project time. This is why automation makes so much sense.

SOLUTION

AnalyticsCreator manages the entire design, development, and deployment process for a data warehouse. The analytics model with KPIs is also contained within the data warehouse. This "last layer" is automatically pushed into a format suitable for Tableau. In Tableau, you only need to open the file/project and the complete model is available for analytical purposes. You can access the data in the data warehouse live or extract it from the data warehouse. Changes in the DWH are pushed to Tableau without any manual effort. Recognized data design principles call for keeping the entire data model in the warehouse; otherwise, you are taking a path with higher risks.
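
For orientation, publishing a prepared data source to Tableau Server programmatically can look like the following sketch, which uses the open tableauserverclient library and placeholder credentials; it illustrates the general mechanism, not AnalyticsCreator's internal push implementation.

```python
import tableauserverclient as TSC

# Hypothetical sketch of publishing a prepared data source (e.g. a Hyper
# extract of the analytics layer) to Tableau Server. Server URL, credentials,
# project id and file name are placeholders.
tableau_auth = TSC.TableauAuth("publisher", "secret", site_id="analytics")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(tableau_auth):
    datasource = TSC.DatasourceItem(project_id="<project-id>")
    server.datasources.publish(datasource, "sales_model.hyper",
                               TSC.Server.PublishMode.Overwrite)
```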

Benefits

Model creation for Tableau

One place of development

Deploy changes fast to Tableau

Azure Synapse Analytics

CHALLENGE

In some projects, there are significant challenges with query speed and large amounts of data. In these cases, it makes sense to use Microsoft's dedicated Synapse offering. However, there are also disadvantages that should be taken into consideration.

SOLUTION

In AnalyticsCreator, you have one holistic data model, and it doesn't matter whether you generate a database for Azure SQL or Azure Synapse. You can start modeling and decide on the target platform later, for example starting with Azure SQL and switching to Azure Synapse afterwards.

Benefits

Use the Synapse approach

One holistic data model for all approaches

Shift from SQL DB to Synapse