Deploy your Data Warehouse in Microsoft Fabric SQL using AnalyticsCreator
Questions
- Does AnalyticsCreator generate pipelines automatically?
- What is required to deploy to Fabric SQL?
- How does AnalyticsCreator use Azure Data Factory?
- What artifacts are generated during deployment?
- How is Power BI integrated in the deployment process?
- Do you need to manually code pipelines or SQL?
Key takeaways
- AnalyticsCreator creates a metadata repository (no data moved initially).
- The Data Warehouse Wizard generates full DWH structure across layers.
- Deployment includes automatic generation of DACPAC and Power BI artifacts.
- Azure Data Factory pipelines are generated automatically (ARM templates).
- Data loading is executed via workflow pipelines.
- Fabric SQL database is populated through automated orchestration.
- Power BI semantic model is generated and ready for use.
Transcript
Welcome to the AnalyticsCreator channel.
In this video, I will show you how to deploy your data warehouse directly to Microsoft Fabric using AnalyticsCreator. Let’s do it together.
First, we need to open AnalyticsCreator and create a new repository. I’ll call it “Northwind.”
AnalyticsCreator is now creating our metadata repository. This repository contains the structure we need to build our data warehouse.
Next, we need a data source. In this case, we are going to use the Northwind dataset.
Now we have our dataset connected to AnalyticsCreator.
Remember, AnalyticsCreator is a pure design-time tool, so nothing has been imported yet, and no data has been moved.
Now we have all the source tables, and we are going to import them into AnalyticsCreator.
We need to select all the required layers, for example, the transformation layer, historization layer, dimensions, and facts.
Let’s find the facts here. In this case, they are Order Details and Orders.
On this screen, we also have some options we can customize. Now, let’s finish.
AnalyticsCreator will create our calendar, apply predefined transformations, such as trimming text and converting null values to strings, and generate the full structure of our data warehouse.
We now have our data warehouse structure, starting from the sources, through the staging layer, the persistent staging layer, the core layer, and finally the data mart layer.
For this demo, I will not add any DAX measures here. I will publish this directly to Fabric SQL.
Let’s call it “Northwind” and create the deployment for Northwind.
One important point: we will use Azure Data Factory as the runtime, meaning the pipelines will run in Azure Data Factory while the data warehouse itself is published to Fabric SQL.
Creating a database in Fabric is very easy. Go to “New Item,” search for SQL Database, call it “Fabric,” and create it.
To deploy our data warehouse directly to Fabric, we need the connection strings.
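For orientation, a Fabric SQL connection string follows the familiar SQL Server format. The sketch below assembles one; the server and database names are placeholders, not values from the video, so copy the real ones from the database's connection settings in Fabric:

```python
# Sketch: assemble a SQL Server-style connection string for a Fabric SQL
# database. The server and database names here are placeholders; take the
# real values from the "Connection strings" info shown in the Fabric portal.
def fabric_connection_string(server: str, database: str) -> str:
    parts = [
        f"Server=tcp:{server},1433",
        f"Initial Catalog={database}",
        "Encrypt=True",                                 # Fabric requires TLS
        "Authentication=Active Directory Interactive",  # Entra ID sign-in
    ]
    return ";".join(parts) + ";"

conn = fabric_connection_string(
    "myworkspace.database.fabric.microsoft.com",  # placeholder server name
    "Northwind",
)
```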
Now we can directly generate and deploy the DACPAC. This is very useful because we do not need to handle the deployment manually.
AnalyticsCreator will automatically generate all scripts, the DACPAC, and the Power BI files, including the TMDL files.
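For context, a DACPAC is the standard deployment format consumed by Microsoft's SqlPackage tool. The sketch below only builds the publish command; the file path and connection string are placeholders, and SqlPackage itself must be installed separately:

```python
# Sketch: build the SqlPackage command line that publishes a DACPAC to a
# target database. Paths and the connection string are placeholders.
def sqlpackage_publish_cmd(dacpac_path: str, conn_str: str) -> list[str]:
    return [
        "SqlPackage",
        "/Action:Publish",                      # deploy the schema to the target
        f"/SourceFile:{dacpac_path}",           # DACPAC produced by AnalyticsCreator
        f"/TargetConnectionString:{conn_str}",  # Fabric SQL connection string
    ]

cmd = sqlpackage_publish_cmd("Northwind.dacpac", "Server=tcp:example,1433;")
# To actually execute it: subprocess.run(cmd, check=True)
```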
To populate the database with data, we create a new Azure Data Factory.
AnalyticsCreator has generated the package we need. Now it is only a matter of configuration.
We import the ARM template, which includes all 47 resources that were automatically created.
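An ARM template is plain JSON whose `resources` array lists every deployed item, which is where a count like the 47 resources comes from. The inline template below is a toy example for illustration, not the one AnalyticsCreator generates:

```python
import json

# Toy ARM template: the real generated template lists every Data Factory
# pipeline, dataset, and linked service as an entry in "resources".
template = json.loads("""
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {"type": "Microsoft.DataFactory/factories/pipelines", "name": "load_staging"},
    {"type": "Microsoft.DataFactory/factories/datasets", "name": "src_orders"}
  ]
}
""")

# Counting the entries tells you how many resources the import will create.
resource_count = len(template["resources"])
```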
As you can see, our pipeline has run successfully. All the data is now inside Fabric SQL.
Let’s check some tables. We can see that the data from our source has been populated in Fabric.
Now let’s look at the Power BI file generated by AnalyticsCreator.
We can see the Power BI semantic model, ready to use, with active relationships and all the tables selected for our star schema.
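The semantic model is delivered as TMDL files, a plain-text format for describing tabular models. A hand-written fragment in that style might look like the following; the table and column names are illustrative, not taken from the generated model:

```
table Orders
	column OrderID
		dataType: int64
		summarizeBy: none
	column OrderDate
		dataType: dateTime
```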
Now it is just a matter of refreshing it.