Understanding AnalyticsCreator

AnalyticsCreator is a metadata-driven design application for building and automating data warehouses and analytical models. Instead of manually implementing ETL and SQL logic, developers define metadata such as sources, keys, relationships, transformations, and loading behavior. AnalyticsCreator uses these definitions to generate database objects, pipelines, and semantic models.

How AnalyticsCreator Works

The workflow in AnalyticsCreator starts with a repository, continues with source metadata import, and then uses a wizard to generate a draft data warehouse model. That model is refined, synchronized into SQL objects, deployed to the target environment, and finally executed through generated workflows or pipelines.

  1. Create a repository
  2. Define or import connectors
  3. Import source metadata
  4. Run the data warehouse wizard
  5. Refine the generated model
  6. Synchronize the structure
  7. Deploy artifacts
  8. Execute workflows
  9. Consume data through data marts and semantic models

Repository and Metadata

Every AnalyticsCreator project is based on a repository. The repository is a SQL Server database that stores the full metadata definition of the data warehouse. This includes connectors, source objects, transformations, keys, relationships, deployment settings, and other object definitions. The repository is the design-time control layer and the source for all generated artifacts.

This means the target database is not modeled manually. Instead, AnalyticsCreator reads the repository metadata and generates the required SQL structures from it. Generated code can run independently after deployment because AnalyticsCreator is used as a design-time application, not as a runtime dependency.
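To make the "metadata in, SQL out" idea concrete, here is a minimal illustrative sketch. The metadata shape and generator below are invented for the example; AnalyticsCreator's actual repository schema and code generator are internal to the product.

```python
# Hypothetical metadata record for one staging table, and a generator
# that renders it into T-SQL DDL -- structure is derived from metadata,
# never hand-written against the target database.

table_meta = {
    "schema": "stage",
    "name": "Customer",
    "columns": [
        ("CustomerID", "INT", "NOT NULL"),
        ("Name", "NVARCHAR(200)", "NULL"),
        ("Country", "NVARCHAR(50)", "NULL"),
    ],
    "primary_key": ["CustomerID"],
}

def generate_create_table(meta: dict) -> str:
    """Render a CREATE TABLE statement from a metadata definition."""
    cols = [f"    [{n}] {t} {null}" for n, t, null in meta["columns"]]
    pk = ", ".join(f"[{c}]" for c in meta["primary_key"])
    cols.append(f"    CONSTRAINT [PK_{meta['name']}] PRIMARY KEY ({pk})")
    body = ",\n".join(cols)
    return f"CREATE TABLE [{meta['schema']}].[{meta['name']}] (\n{body}\n);"

print(generate_create_table(table_meta))
```

Because the DDL is derived entirely from the metadata record, changing the metadata and regenerating is the only way the target structure evolves, which is what makes the repository the single control layer.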

Connectors and Metadata Import

AnalyticsCreator connects to source systems such as SQL Server or SAP and imports structural metadata including tables, columns, keys, and references. In some scenarios, metadata can also be imported through metadata connectors, which makes it possible to model a data warehouse without an active connection to the live source system during design.

Imported metadata is stored in the repository and later used by the wizard to generate the draft warehouse model. At this stage, no warehouse data has been loaded; only structure and metadata are captured.
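The structure-only nature of the import can be sketched as follows. This example uses an in-memory SQLite database as a stand-in source; a real import would read SQL Server's INFORMATION_SCHEMA views or a SAP dictionary, and the resulting record shape here is an assumption for illustration.

```python
# Capture structural metadata (columns, types, key flags) from a source
# system -- analogous to a metadata import, which reads structure, not data.
import sqlite3

source = sqlite3.connect(":memory:")
source.execute(
    "CREATE TABLE Orders ("
    "OrderID INTEGER PRIMARY KEY, CustomerID INTEGER, Amount REAL)"
)

def import_table_metadata(conn, table: str) -> dict:
    """Read column names, declared types, and primary-key flags."""
    rows = conn.execute(f"PRAGMA table_info({table})").fetchall()
    return {
        "table": table,
        "columns": [
            {"name": r[1], "type": r[2], "pk": bool(r[5])} for r in rows
        ],
    }

meta = import_table_metadata(source, "Orders")
print(meta)
```

Note that no row of business data is ever read: the captured dictionary describes the table's shape, which is exactly what a design-time repository needs.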

The Wizard

The data warehouse wizard is the central acceleration mechanism in AnalyticsCreator. It analyzes source metadata and generates a draft warehouse model automatically. Depending on the selected approach, this can be a dimensional model, a Data Vault model, or a mixed approach. The wizard can create staging structures, historization layers, dimensions, facts, calendar dimensions, and default relationships based on detected metadata.

The generated model is not the end result. It is the baseline that developers refine and validate. The main engineering work happens after generation, when keys, joins, historization behavior, measures, and transformations are adjusted to fit the intended warehouse design.

Warehouse Layers

AnalyticsCreator supports a layered warehouse architecture from source to presentation. In a typical setup, this includes source objects, staging, persistent staging or historization, CORE transformations, data marts, and semantic or reporting layers. It can also generate analytical models for tools such as Power BI.

Persistent Staging

A key architectural concept is the persistent staging layer. Source data is first imported into staging structures and then stored persistently for further processing. This persistent layer is used for historization and for decoupling source extraction from downstream transformations. It allows data to be reprocessed without repeatedly reading the source system.

In dimensional scenarios, historized tables typically include surrogate keys together with valid-from and valid-to columns. In Data Vault and hybrid scenarios, additional hash-based keys and references can be generated in the staging layer as persisted calculated columns and then reused in later layers.
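The two mechanics mentioned above, hash-based keys and valid-from/valid-to historization, can be sketched in a few lines. The column names, the hash recipe, and the merge logic below are assumptions chosen for the example, not AnalyticsCreator's actual implementation.

```python
# A deterministic hash key over business-key columns (as a persisted
# calculated column would hold) plus a minimal SCD2-style historization:
# close the open version when attributes change, then insert a new one.
import hashlib
from datetime import date

def hash_key(*business_key_values) -> str:
    """Hash key over normalized, concatenated business-key values."""
    raw = "|".join(str(v).strip().upper() for v in business_key_values)
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()[:32]

def historize(current_rows, incoming_row, load_date):
    """Apply one incoming record to a historized table (list of dicts)."""
    key = incoming_row["hash_key"]
    for row in current_rows:
        if row["hash_key"] == key and row["valid_to"] is None:
            if row["attrs"] == incoming_row["attrs"]:
                return current_rows          # unchanged: keep open version
            row["valid_to"] = load_date      # changed: close previous version
    current_rows.append(
        {**incoming_row, "valid_from": load_date, "valid_to": None}
    )
    return current_rows

rows = []
rows = historize(rows, {"hash_key": hash_key("C-1001"),
                        "attrs": {"country": "DE"}}, date(2024, 1, 1))
rows = historize(rows, {"hash_key": hash_key("C-1001"),
                        "attrs": {"country": "AT"}}, date(2024, 6, 1))
```

After the second load, the table holds a closed version (valid_to = 2024-06-01) and one open version, which is the pattern that lets downstream layers reconstruct any point in time without re-reading the source.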

Transformations

Transformations in AnalyticsCreator are usually generated as SQL views based on metadata definitions. These definitions specify source tables, joins, selected columns, macros, and transformation rules. In many cases, the default generated view logic is sufficient as a starting point, but it can be refined through metadata rather than by rewriting generated SQL directly.

AnalyticsCreator also supports reusable macros for standard SQL logic, such as date-to-calendar-key conversion or hash key generation. This allows repeated logic to be defined once and reused consistently across the model.
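The macro idea can be illustrated with a small expansion sketch. The macro names, the placeholder syntax, and the generated T-SQL fragments below are invented for the example; the point is only that a snippet defined once is expanded consistently wherever a view needs it.

```python
# Define reusable SQL snippets once, expand them into generated view SQL.
MACROS = {
    "date_to_calendar_key": "CONVERT(INT, FORMAT({col}, 'yyyyMMdd'))",
    "hash_key": "CONVERT(CHAR(64), HASHBYTES('SHA2_256', {col}), 2)",
}

def expand_macro(name: str, **params) -> str:
    """Substitute parameters into a named macro template."""
    return MACROS[name].format(**params)

view_sql = (
    "SELECT\n"
    f"    {expand_macro('date_to_calendar_key', col='o.OrderDate')} AS OrderDateKey,\n"
    f"    {expand_macro('hash_key', col='o.CustomerNo')} AS CustomerHK\n"
    "FROM stage.Orders o"
)
print(view_sql)
```

If the calendar-key convention ever changes, only the macro definition changes and every generated view picks up the new logic on the next synchronization, which is the consistency benefit the text describes.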

Synchronization, Deployment, and Execution

These three steps are related but distinct and should not be confused.

Synchronization

Synchronization materializes the metadata model into SQL objects in the target database. This creates the database structure defined in AnalyticsCreator, such as tables, views, and procedures. It does not mean that business data has already been loaded.

Deployment

Deployment creates and distributes deployable artifacts for the selected target environment. These can include SQL database packages, SSIS packages, Azure Data Factory pipelines, and semantic models. Deployment prepares the environment but still does not imply that source data has already been processed.

Execution

Execution runs the generated workflows and pipelines. This is the step where source data is actually extracted, written to staging, historized where required, transformed into CORE structures, and exposed through data marts and semantic models. In Azure scenarios, this may happen through Azure Data Factory. In on-premises scenarios, this may happen through SSIS.
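The layer-by-layer ordering that execution must respect can be sketched as a dependency sort. The step names below are assumptions standing in for generated workflow activities; in practice the ordering is carried out by the generated ADF pipeline or SSIS package, not by Python.

```python
# Each layer loads only after the layers it depends on have finished.
DEPENDS_ON = {
    "staging": [],
    "historization": ["staging"],
    "core": ["historization"],
    "datamart": ["core"],
}

def execution_order(deps: dict) -> list:
    """Topologically sort workflow steps by their dependencies."""
    order, done = [], set()

    def visit(step):
        if step in done:
            return
        for upstream in deps[step]:
            visit(upstream)
        done.add(step)
        order.append(step)

    for step in deps:
        visit(step)
    return order

print(execution_order(DEPENDS_ON))  # staging runs first, data marts last
```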

Consumption

After execution, the data warehouse can be consumed through data marts and semantic models. These structures are intended for reporting and analytics, while lower layers such as staging and historization should remain implementation layers rather than direct reporting interfaces. AnalyticsCreator can generate tabular models and structures for tools such as Power BI.

Design Implications

  • The repository is the source of truth
  • Metadata drives generation, not manual SQL-first development
  • The wizard creates a baseline, not a final production model
  • Persistent staging is part of the architecture, not just a temporary landing area
  • Synchronization, deployment, and execution are separate steps
  • Consumption should happen from data marts or semantic models, not from staging layers

Key Takeaway

AnalyticsCreator works by storing warehouse definitions as metadata, generating SQL and orchestration artifacts from that metadata, and then deploying and executing those artifacts in the target environment.