De-risking the complexity of data management in Reporting and Controlling
AnalyticsCreator automates the creation, transformation, and deployment of data warehouses by generating code from metadata, significantly reducing manual effort and improving speed. It enables teams to integrate multiple data sources, maintain full lineage, and avoid vendor lock-in through runtime-independent SQL. This results in faster delivery, better governance, and more flexible analytics architectures.
Questions
- How does AnalyticsCreator automate data warehouse development?
- What is the Data Warehouse Wizard?
- How do you integrate new data sources into an existing model?
- What layers exist in a modern data warehouse architecture?
- How does AnalyticsCreator handle historization?
- How does AnalyticsCreator reduce vendor lock-in?
Key takeaways
- Data warehouse automation removes repetitive manual work.
- Full data warehouse structures can be generated in minutes.
- Metadata-driven design enables consistency and governance.
- Historization and lineage are automatically handled.
- New data sources (e.g. CSV) can be integrated quickly.
- Deployment is automated and standardized.
- No runtime dependency → no vendor lock-in.
- Agile development (sprints) is fully supported.
- Data warehouse becomes scalable and maintainable.
Transcript
My name is Rosario Di Lorenzo, and I am responsible for the AnalyticsCreator global partner organization. Over the past 15 years, I have truly enjoyed helping partners around the world create and share their success stories. This is my passion.
Today, we have a very interesting topic, and of course, I’m not doing it alone.
I’m joined by Tobias from BI Automation, who will give you a short demonstration of the tool, and Hao from ESG for CFO, who will comment on the demo and provide insights from a methodology point of view.
AnalyticsCreator is a German company based in Munich and was founded in 2017. However, the software itself is much older. We are currently in the third generation and are proud to have helped promote data automation practices over the past 15 years.
We have more than 50 value-added partners and around 690 active data engineers and developers. Our mission is clear: to help developers reduce repetitive tasks. Data engineers have to connect many systems using different tools; the workload is not decreasing, and response times to business requests are not improving. We also see an increasing risk of vendor lock-in. We believe automation is the best way to regain control of data and reduce operational costs. AnalyticsCreator is highly adaptable: we focus on restoring engineering freedom and keeping SQL code free of runtime dependencies.
We connect the sources, create a data catalog, and automatically generate a data warehouse. Within two to three minutes, you get a full draft, including historization. You can then go into each stage and define the required transformations, and deployment is handled automatically.
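The core idea here, generating warehouse objects from metadata rather than hand-writing them, can be sketched in a few lines. The metadata layout and the `generate_ddl` helper below are hypothetical illustrations of the technique, not AnalyticsCreator's actual repository schema:

```python
# Minimal sketch of metadata-driven code generation: a table is described
# once as metadata, and the DDL is rendered from it. The metadata format
# and helper name are invented for illustration.

TABLE_METADATA = {
    "name": "dim_customer",
    "columns": [
        {"name": "customer_id", "type": "INT", "nullable": False},
        {"name": "customer_name", "type": "VARCHAR(100)", "nullable": True},
        {"name": "country", "type": "VARCHAR(50)", "nullable": True},
    ],
}

def generate_ddl(meta: dict) -> str:
    """Render a CREATE TABLE statement from a table-metadata record."""
    cols = ",\n  ".join(
        f"{c['name']} {c['type']}{'' if c['nullable'] else ' NOT NULL'}"
        for c in meta["columns"]
    )
    return f"CREATE TABLE {meta['name']} (\n  {cols}\n);"

print(generate_ddl(TABLE_METADATA))
```

Because every table, transformation, and load is rendered from the same metadata, a change made in one place regenerates consistently everywhere, which is what makes the two-to-three-minute draft possible.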
It starts with the source: data lands in a metadata-driven catalog, and the wizard creates all layers automatically. You can deploy to SQL, Azure, or other environments and visualize the data with any BI tool. Everything is stored in an open data repository, and importantly, we only handle metadata, not the actual data.
The automation engine generates code, documentation, and lineage, and developers can collaborate directly on the metadata, so customers can resolve modeling issues quickly. Engineering efficiency improves and total cost of ownership goes down. Because the generated code has no runtime dependency, it continues to work even without AnalyticsCreator.
Customers are overwhelmed by business requests, and manual coding and documentation slow them down. Teams are under pressure, data quality issues occur, and projects struggle to reach completion. The complexity is not always visible. Automation addresses these issues and restores control.
We are ranked highly in industry reports, and customers rate us strongly across multiple KPIs. Self-learning is fast: within two to three weeks, users are productive. The tool enhances developer knowledge rather than restricting it.
Traditional waterfall approaches no longer work; agile, iterative approaches are required, with development split into sprints. We recommend the CRISP-DM method, whose phases are:
- Business understanding
- Data understanding
- Data preparation
- Modeling
- Evaluation
- Deployment
After two to three sprints, the results are already usable. The roles include product owner, scrum master, and developer. The demo picks up after the first sprint, when the initial dashboard is shown. Then the requirements change, and the second sprint integrates a new data source.
The layers include:
- Source
- Staging
- Persisted staging, including historization
- Core, for modeling
- Data mart
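The layer chain above can be illustrated as a set of stacked views, each layer consuming only the one below it. The table and column names here are invented for the sketch (SQLite stands in for the target platform):

```python
# Sketch of a layered warehouse: source -> staging -> core -> data mart,
# each layer built as a view on the previous one. Names are illustrative.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Source layer: raw data as delivered by the operational system.
cur.execute("CREATE TABLE src_orders (order_id INT, amount REAL, status TEXT)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?, ?)",
                [(1, 100.0, "shipped"), (2, 50.0, "RETURNED")])

# Staging layer: light cleansing (normalize status casing).
cur.execute("""CREATE VIEW stg_orders AS
               SELECT order_id, amount, LOWER(status) AS status
               FROM src_orders""")

# Core layer: business rules (flag returned orders).
cur.execute("""CREATE VIEW core_orders AS
               SELECT order_id, amount,
                      CASE WHEN status = 'returned' THEN 1 ELSE 0 END AS is_returned
               FROM stg_orders""")

# Data mart layer: aggregates ready for the BI tool.
cur.execute("""CREATE VIEW mart_sales AS
               SELECT SUM(amount) AS total_sales,
                      SUM(CASE WHEN is_returned = 1 THEN amount ELSE 0 END)
                        AS returned_sales
               FROM core_orders""")

print(cur.execute("SELECT * FROM mart_sales").fetchone())  # (150.0, 50.0)
```

Keeping each rule in exactly one layer is what makes the structure maintainable: a change to a cleansing rule touches staging only, and every layer above picks it up automatically.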
Historization allows time travel: the data can be reconstructed as it stood at any point in the past. The data is then joined and prepared for analytics.
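A common way to implement historization is a type-2 slowly changing dimension, where changed rows are closed and new rows opened rather than overwritten. The column names (`valid_from`, `valid_to`) follow a widespread convention and are not necessarily AnalyticsCreator's exact naming:

```python
# Sketch of historization via validity intervals, which enables the
# "time travel" queries mentioned above. Schema and names are illustrative.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE ps_customer (
                 customer_id INT, country TEXT,
                 valid_from TEXT, valid_to TEXT)""")

# Customer 1 moved from DE to AT on 2024-06-01: the old row is closed,
# a new row is opened, and nothing is overwritten.
con.executemany("INSERT INTO ps_customer VALUES (?, ?, ?, ?)", [
    (1, "DE", "2023-01-01", "2024-05-31"),
    (1, "AT", "2024-06-01", "9999-12-31"),
])

def country_as_of(con, customer_id, date):
    """Point-in-time ('time travel') lookup against the history table."""
    row = con.execute(
        """SELECT country FROM ps_customer
           WHERE customer_id = ? AND valid_from <= ? AND valid_to >= ?""",
        (customer_id, date, date)).fetchone()
    return row[0] if row else None

print(country_as_of(con, 1, "2024-01-15"))  # DE
print(country_as_of(con, 1, "2024-07-01"))  # AT
```

The ISO date strings compare correctly as text, so the interval predicate works without date functions; a real implementation would also carry load timestamps and a current-row flag.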
Create a new connector, load the CSV file, and extract its structure automatically. Then run the Data Warehouse Wizard: the new source is automatically integrated into all layers. Join the CSV source to the fact table, create the SQL join, and define the "returned" flag. Finally, synchronize the data warehouse; the changes propagate automatically.
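The "extract the structure automatically" step amounts to schema inference: read the file, guess a type per column, and generate the staging table from the result. The inference rules below are deliberately simplified (a real tool samples many rows and knows many more types), and the table name `stg_returns` is invented:

```python
# Sketch of CSV integration: infer a schema from the file, then render a
# staging table from it. Type detection here is intentionally minimal.
import csv
import io

csv_text = """order_id,returned,return_date
1,true,2024-03-02
2,false,
"""

def infer_columns(csv_text):
    """Guess a SQL type per column from the header and first data row."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, first = rows[0], rows[1]
    types = []
    for name, value in zip(header, first):
        if value.isdigit():
            types.append((name, "INT"))
        elif value in ("true", "false"):
            types.append((name, "BIT"))
        else:
            types.append((name, "VARCHAR(50)"))
    return types

cols = infer_columns(csv_text)
ddl = ("CREATE TABLE stg_returns (\n  "
       + ",\n  ".join(f"{n} {t}" for n, t in cols) + "\n);")
print(ddl)
```

Once the inferred structure is in the metadata catalog, the same generation machinery that built the original model can propagate the new source through staging, historization, and the mart.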
Use the deployment package: the system generates SSIS packages, and you deploy with one click. Data loss protection is available, and the workflow package orchestrates execution. The dashboard updates automatically: the returns are now visible, the metrics are adjusted, and the new data source is fully integrated.
Many companies collect ESG data via Excel. AnalyticsCreator integrates this data into a single source of truth, ensuring reliable reporting and combining financial and ESG data. Automation generates the code, data governance improves, and the architecture remains scalable, so new sources can be added easily and continuous improvement is supported. The process is standardized, and development risk is reduced.