New construction of a Data Analytics platform
There are good reasons to build a data analytics platform on a greenfield site: either no data warehouse exists in the company yet, or it is more efficient to build a new one than to adapt an existing one. Even extensions of existing solutions are, in essence, limited new implementations. Building a data warehouse requires time, capital, and know-how. Mostly ETL or ELT tools are used for this purpose.
This means a manual, and therefore slow, step-by-step development process that can no longer keep pace with today's rapidly changing business requirements.
With AnalyticsCreator, you can create a data analytics application such as a data warehouse with graphical support in no time at all, connect any data sources, transfer the data to your cube or MS Tabular model and make the analysis models available to any BI/analytics front end. In this way, you can easily connect business requirements with your data sources and map the core business logic directly in AnalyticsCreator.
If something changes in the data source, in the logic, or in the process, the changes are automatically updated and published across all layers.
Analytics platform modernization
The current data warehouse is getting on in years and no longer meets the new, extended requirements that the data basis should deliver to the analytical front end. Its performance may no longer be sufficient, adaptations are difficult to implement, and the system is no longer future-proof.
Perhaps the know-how about the existing data warehouse is no longer readily available because the employees or consultants who built it are no longer available.
AnalyticsCreator enables you to build on an existing DWH and make extensions and adjustments. If a good foundation is available, it is easy to build on top of it. In addition, using AC's reverse-engineering methodology, it is possible to take code from an existing DWH application and integrate it into AC. In this way, additional layers and areas can be included in the automation, supporting the expected change process even more extensively.
The extension of a manually developed DWH (i.e., one built with an ETL/ELT tool) can quickly consume time and resources. From our experience and various studies available on the web, the following rule can be derived: the longer the lifecycle, the higher the costs.
A data platform for Power BI
Different data sources are to be connected with Power BI. This involves large volumes of data, the processing of which can be complex. In addition, the data is to be historized in order to be able to use the corresponding advanced analytics tools and methods. A tabular model is to be built. And much more.
Microsoft's Power BI is one of the most popular analytics front ends worldwide. Power BI offers the ability to design a multidimensional model for analytic applications and connect it to data sources in the cloud or on-premises.
To achieve higher data quality and meaningfulness, data is typically extracted from the source, cleansed, logically prepared, connected to correlating data, and aggregated. In addition, model-based and technical performance optimizations are necessary.
Another very relevant point: the historization of the source data should be implemented and stored in a data warehouse, which is not really possible with Power BI alone. Only in this way are correct point-in-time analyses and reports possible at all.
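To make the idea of historization concrete, here is a minimal sketch of Slowly Changing Dimension Type 2 versioning, the classic technique behind data warehouse historization. The table and column names are hypothetical illustrations, not AnalyticsCreator's actual generated code; the sketch uses Python's built-in sqlite3 as a stand-in database.

```python
import sqlite3

# In-memory stand-in for a historized DWH dimension table (hypothetical schema).
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT   -- '9999-12-31' marks the currently valid version
    )""")

def load_customer(customer_id, city, load_date):
    """SCD Type 2: close the current version if an attribute changed,
    then insert a new version, preserving the full history."""
    cur = con.execute(
        "SELECT city FROM dim_customer "
        "WHERE customer_id = ? AND valid_to = '9999-12-31'",
        (customer_id,))
    row = cur.fetchone()
    if row is not None and row[0] == city:
        return  # no change, nothing to historize
    if row is not None:
        con.execute(
            "UPDATE dim_customer SET valid_to = ? "
            "WHERE customer_id = ? AND valid_to = '9999-12-31'",
            (load_date, customer_id))
    con.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31')",
        (customer_id, city, load_date))

load_customer(1, "Munich", "2023-01-01")
load_customer(1, "Berlin", "2023-06-01")   # city changed -> new version
versions = con.execute(
    "SELECT city, valid_from, valid_to FROM dim_customer "
    "WHERE customer_id = 1 ORDER BY valid_from").fetchall()
print(versions)
```

A generated loading strategy applies this pattern to every historized source row, which is exactly the part that is tedious and error-prone to hand-code.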
With AnalyticsCreator you design the data model for your analytical Power BI application and automatically generate a multi-tier data warehouse with the appropriate loading strategy. In the process, the business logic is mapped in one place in AnalyticsCreator and implemented across a wide range of Microsoft technologies. The model is deployed to Azure SQL DB, Azure Analysis Services, Power BI, or even on-premises technologies.
AnalyticsCreator also gives you the ability to connect to an enormous number of data sources and deliver them to other front-end technologies. AnalyticsCreator lets you connect to all the data sources you want and create an additional timeline that allows you to access any time-slice of the data warehouse.
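What "accessing any time-slice" means can be sketched as a point-in-time query against a historized table: for a given report date, return the row versions that were valid on that date. Again, the schema is a hypothetical illustration, not AnalyticsCreator's generated structure.

```python
import sqlite3

# Hypothetical historized dimension with validity intervals.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE dim_customer "
            "(customer_id INTEGER, city TEXT, valid_from TEXT, valid_to TEXT)")
con.executemany("INSERT INTO dim_customer VALUES (?, ?, ?, ?)", [
    (1, "Munich", "2023-01-01", "2023-06-01"),
    (1, "Berlin", "2023-06-01", "9999-12-31"),
])

def as_of(report_date):
    """Time-slice query: which version of each row was valid on report_date?"""
    return con.execute(
        "SELECT customer_id, city FROM dim_customer "
        "WHERE valid_from <= ? AND ? < valid_to",
        (report_date, report_date)).fetchall()

march = as_of("2023-03-15")
july = as_of("2023-07-01")
print(march, july)
```

The same filter, parameterized by a timeline table, is what lets a front end report on the warehouse exactly as it looked at any past date.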
Data Vault 2.0 Automation
If you have many source systems, large amounts of data, and a complex data model that is constantly changing, the Data Vault 2.0 modeling approach is well suited to build your data analytics platform. Or you may have decided in general to build your data analytics platform with Data Vault 2.0. In either case, it is advisable to carry out such a complex project with an automation tool.
AnalyticsCreator includes a wizard that generates a first draft of a Data Vault model immediately after the source data is connected. The developer can customize this draft and save their own template for future drafts. AC alerts the developer with hints if they deviate from the strict rules of DV 2.0, but does not restrict them. In addition, AC offers the possibility to transition from the DV 2.0 model to a dimensional model. The latter can lead directly to a cube or tabular technology or be transferred to a BI/analytics front end.
We call this process a hybrid approach. In addition, it is possible to use DV 2.0 achievements in a classic multidimensional model, thus combining the advantages of both worlds.
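The core DV 2.0 structure behind this automation is the split into hubs (one row per business key, identified by a hash key) and satellites (historized descriptive attributes). The following is a minimal illustrative sketch with a hypothetical customer example, not AC's generated code:

```python
import hashlib
import sqlite3

def hash_key(*business_keys):
    """DV 2.0 hash key: a deterministic hash over the business key(s)."""
    joined = "|".join(str(k) for k in business_keys)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

con = sqlite3.connect(":memory:")
# Hub: exactly one row per distinct business key.
con.execute("CREATE TABLE hub_customer "
            "(hk TEXT PRIMARY KEY, customer_no TEXT, load_dts TEXT, record_source TEXT)")
# Satellite: descriptive attributes, historized per hub key.
con.execute("CREATE TABLE sat_customer (hk TEXT, load_dts TEXT, name TEXT, city TEXT)")

def load(customer_no, name, city, load_dts, source="CRM"):
    hk = hash_key(customer_no)
    # The hub row is inserted only once per business key.
    con.execute("INSERT OR IGNORE INTO hub_customer VALUES (?, ?, ?, ?)",
                (hk, customer_no, load_dts, source))
    # Every load adds a new satellite row, preserving history.
    con.execute("INSERT INTO sat_customer VALUES (?, ?, ?, ?)",
                (hk, load_dts, name, city))

load("C-100", "Acme GmbH", "Munich", "2023-01-01")
load("C-100", "Acme GmbH", "Berlin", "2023-06-01")  # attribute change -> new sat row

hubs = con.execute("SELECT COUNT(*) FROM hub_customer").fetchone()[0]
sats = con.execute("SELECT COUNT(*) FROM sat_customer").fetchone()[0]
print(hubs, sats)
```

Because hubs, links, and satellites follow such strict, repetitive rules, they are a natural target for generation from metadata; the dimensional model of the hybrid approach is then derived on top of these structures.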
Switch to a cloud Data Analytics Platform on Azure
There are numerous architectures and ideas on how to operate a data warehouse in the cloud. But which concept is the right one? More and more ERP and software providers offer operational solutions in the cloud. It therefore makes sense to connect these data sources and transfer them to your cloud data warehouse.
With AnalyticsCreator, you model independently of whether your data will be stored in the cloud or on-premises. First and foremost, you concentrate on the business model. At the latest in the deployment process, you determine whether the model is to be built or migrated on-premises or in the cloud on Azure.
For special hybrid architectures, it is necessary to specify which areas/layers are to be stored and loaded where.
Create a Data Analytics Platform from SAP Sources in the Microsoft BI Stack
Your source system is SAP. You may also want to make other data sources such as social media data, cloud data, IoT data, etc., available to your users for analytical purposes in a central data analytics platform in the cloud or on-premises.
This usually requires expert knowledge, a lot of time, and the right tools.
AnalyticsCreator includes special components that allow you to access your SAP source, decode its metadata, and create an AC metadata connector. In addition, predefined SAP metadata connectors are already available in the AC Cloud. From these, the AC wizard automatically creates a multi-layer data warehouse (Data Vault 2.0, dimensional, etc.), which can be customized.
Microsoft Tabular and cube models, Power BI apps, and Tableau and QlikSense models are automatically generated from it. For ongoing operation we recommend the data connector from Theobald Software, which can be delivered in the package on request.
Data Lakehouse Automation
Use the best of both worlds: data lakes and data warehouses. Raw data should be usable from several places in the cloud and combined with structured data as well. Metadata should be collected and stored in a data dictionary that is available for all data processing and analytical purposes.
AC offers metadata-based data warehouse development. The definition of every source, table, transformation, or task is stored in the metadata database, which is open and available for further use. Users have full access to the metadata and can always manipulate it directly using SQL scripts. The AC metadata database has a very simple and user-friendly data structure that can be used in a data lakehouse architecture.
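The principle of metadata-based development can be sketched in a few lines: every object is a row in a metadata store, and the loading code is generated from those rows instead of being hand-written. The schema below is a deliberately simplified, hypothetical illustration; AC's real metadata model is richer.

```python
import sqlite3

# Hypothetical, simplified metadata store: one row per transformation.
meta = sqlite3.connect(":memory:")
meta.execute("""
    CREATE TABLE meta_transformation (
        target_table TEXT, source_table TEXT, filter_expr TEXT)""")
meta.execute("INSERT INTO meta_transformation VALUES "
             "('stage.customer', 'src.customer', 'is_active = 1')")

def generate_load_sql(target_table):
    """Generate a loading statement from metadata instead of hand-coding it."""
    src, flt = meta.execute(
        "SELECT source_table, filter_expr FROM meta_transformation "
        "WHERE target_table = ?", (target_table,)).fetchone()
    return f"INSERT INTO {target_table} SELECT * FROM {src} WHERE {flt}"

sql = generate_load_sql("stage.customer")
print(sql)
```

Because the definitions live in ordinary tables, the same metadata can drive code generation, documentation, lineage, and a lakehouse data dictionary alike, and it remains open to direct SQL manipulation.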
Sometimes it is necessary to use the data of a DWH outside, be it in other analytical databases, BLOB storages, cloud storages, or simply text files, in order to process it further. Or you may want to write data calculated in the data warehouse back into your source database: for example, you calculate the customer debt limit in the data warehouse and write it back into the database containing the master data.
AC generates SSIS packages as well as Azure Data Factory pipelines to export data from your data warehouse to external databases. AC supports any OLEDB or ODBC driver for this export; the suitable driver writes the data into your target system. CSV, text files, and Azure Blob Storage are supported, too. When you export to Azure Blob Storage, you can use the CSV, Parquet, and Avro formats.
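Generated SSIS packages and Data Factory pipelines handle these exports natively; purely as an illustration of the simplest case, here is a stdlib-only Python sketch that dumps a warehouse table to CSV text, header included. The table and its schema are hypothetical.

```python
import csv
import io
import sqlite3

# Stand-in DWH table (hypothetical schema).
dwh = sqlite3.connect(":memory:")
dwh.execute("CREATE TABLE fact_sales (order_id INTEGER, amount REAL)")
dwh.executemany("INSERT INTO fact_sales VALUES (?, ?)", [(1, 19.90), (2, 5.00)])

def export_csv(table):
    """Export a DWH table as CSV text with a header row, the same shape
    an export pipeline would hand to a file target or blob container."""
    cur = dwh.execute(f"SELECT * FROM {table}")  # table name from trusted metadata
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(col[0] for col in cur.description)  # header from cursor metadata
    writer.writerows(cur)
    return buf.getvalue()

out = export_csv("fact_sales")
print(out)
```

For columnar formats such as Parquet or Avro the principle is the same; only the writer library and the target (e.g., a blob container) change.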
Tableau as an Analytics Front End
If you are using Tableau as an analytics front-end technology, you have to connect Tableau to your data sources, such as a data warehouse or any source data in structured or unstructured form. Tableau offers data transformation capabilities that enable you to transform data in the way you want to use it in the front end. The most time-consuming part of an analytics project is data preparation: approximately 60-80% of project time is spent on it.
AnalyticsCreator, of course, manages the whole design, development, and deployment process for a data warehouse. The analytics model with KPIs is contained there as well. This "last layer" is pushed automatically into the suitable format for Tableau. In Tableau you only have to open the file/project, and the complete model is there for analytical purposes. You then have direct access to the data in the data warehouse, or you pull the data out of the data warehouse. Changes in the DWH are pushed to Tableau without manual effort. Recognized data design principles require the entire data model in the warehouse; otherwise, you are taking a path with higher risks. Read more in our blog.
Azure Synapse Analytics
In some projects there are big challenges with query speed and large amounts of data; in this case, it makes sense to use Microsoft's dedicated Synapse version. There are also disadvantages that should be examined closely.
In AC you have one holistic data model, and it doesn't matter whether you generate a database for Azure SQL or Azure Synapse. You can start modeling and make your decision on the target platform later. You can start with Azure SQL first and then switch to Azure Synapse. Read more on the Microsoft website.