Fivetran Log dbt Package (Docs)
- Generates a comprehensive data dictionary of your Fivetran Log data via the dbt docs site
- Produces staging models in the format described by this ERD, which clean, test, and prepare your Fivetran Log data from Fivetran's free connector, and generates analysis-ready end models.
- The above-mentioned models enable you to better understand how you are spending money in Fivetran under our consumption-based pricing model, as well as providing details about the performance and status of your Fivetran connectors and transformations. This is achieved by:
- Displaying consumption data at the table, connector, destination, and account levels
- Providing a history of measured free and paid monthly active rows (MAR), credit consumption, and the relationship between the two
- Creating a history of vital daily events for each connector
- Surfacing an audit log of records inserted, deleted, and updated in each table during connector syncs
Refer to the table below for a detailed view of all models materialized by default within this package. Additionally, check out our docs site for more details about these models.
model | description |
---|---|
fivetran_log__connector_status | Each record represents a connector loading data into a destination, enriched with data about the connector's data sync status. |
fivetran_log__transformation_status | Each record represents a transformation, enriched with data about the transformation's last sync and any tables whose new data triggers the transformation to run. |
fivetran_log__mar_table_history | Each record represents a table's free, paid, and total volume for a month, complete with data about its connector and destination. |
fivetran_log__usage_mar_destination_history | Table of each destination's usage and active volume, per month. Includes the usage per million MAR and MAR per usage. Usage either refers to a dollar or credit amount, depending on customer's pricing model. Read more about the relationship between usage and MAR here. |
fivetran_log__connector_daily_events | Each record represents a daily measurement of the API calls, schema changes, and record modifications made by a connector, starting from the date on which the connector was set up. |
fivetran_log__schema_changelog | Each record represents a schema change (altering/creating tables, creating schemas, and changing schema configurations) made to a connector and contains detailed information about the schema change event. |
fivetran_log__audit_table | Replaces the deprecated fivetran_audit table. Each record represents a table being written to during a connector sync. Contains timestamps related to the connector and table-level sync progress and the sum of records inserted/replaced, updated, and deleted in the table. |
- Connector: Have the Fivetran Log connector syncing data into your warehouse.
- Database support: This package has been tested on BigQuery, Snowflake, Redshift, Postgres, and Databricks. Ensure you are using one of these supported databases.
If you are using a Databricks destination with this package, you will need to add the below (or a variation of the below) dispatch configuration within your `dbt_project.yml`. This is required in order for the package to accurately search for macros within the `dbt-labs/spark_utils` and then the `dbt-labs/dbt_utils` packages, respectively.
```yml
dispatch:
  - macro_namespace: dbt_utils
    search_order: ['spark_utils', 'dbt_utils']
```
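For context, the `dispatch` block is a top-level key in `dbt_project.yml`. Below is a minimal illustrative sketch of where it sits; the project and profile names are placeholders, not part of this package:

```yml
# dbt_project.yml -- illustrative sketch only
name: 'my_dbt_project'    # hypothetical project name
version: '1.0.0'
profile: 'my_warehouse'   # hypothetical profile name

# the dispatch config sits at the top level of the project file
dispatch:
  - macro_namespace: dbt_utils
    search_order: ['spark_utils', 'dbt_utils']
```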
Include the following `fivetran_log` package version in your `packages.yml`.
Check dbt Hub for the latest installation instructions, or read the dbt docs for more information on installing packages.
```yml
packages:
  - package: fivetran/fivetran_log
    version: [">=0.7.0", "<0.8.0"]
```
By default, this package will run using your target database and the `fivetran_log` schema. If this is not where your Fivetran Log data is (perhaps your fivetran_log schema is `fivetran_log_fivetran`), add the following configuration to your root `dbt_project.yml` file:
```yml
vars:
  fivetran_log_database: your_database_name
  fivetran_log_schema: your_schema_name
```
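For example, if your Fivetran Log data landed in the `fivetran_log_fivetran` schema mentioned above, within a database named `analytics_warehouse` (a hypothetical database name used only for illustration), the variables would look like:

```yml
vars:
  fivetran_log_database: analytics_warehouse  # hypothetical database name
  fivetran_log_schema: fivetran_log_fivetran  # schema name from the example above
```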
If you have never created Fivetran-orchestrated basic SQL transformations, your source data will not contain the `transformation` and `trigger_table` tables. Moreover, if you have only created scheduled basic transformations that are not triggered by table syncs, your source data will not contain the `trigger_table` table (though it will contain `transformation`).

Additionally, if you do not leverage Fivetran RBAC, you will not have the `user`, `account_membership`, or `destination_membership` sources. To disable the corresponding functionality in the package, you must add the following variable(s) to your root `dbt_project.yml` file. By default, all variables are assumed to be `true`:
```yml
vars:
  fivetran_log_using_transformations: false # this will disable all transformation + trigger_table logic
  fivetran_log_using_triggers: false # this will disable only trigger_table logic
  fivetran_log_using_account_membership: false # this will disable only the account membership logic
  fivetran_log_using_destination_membership: false # this will disable only the destination membership logic
  fivetran_log_using_user: false # this will disable only the user logic
```
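As a worked example (a hypothetical scenario, not a recommendation): if you only use scheduled basic transformations that are never triggered by table syncs, and you do not use Fivetran RBAC, you would keep the transformation logic enabled but disable the trigger and user/membership logic:

```yml
vars:
  fivetran_log_using_triggers: false                # scheduled-only transformations, so no trigger_table source
  fivetran_log_using_user: false                    # no RBAC, so no user source
  fivetran_log_using_account_membership: false      # no RBAC, so no account_membership source
  fivetran_log_using_destination_membership: false  # no RBAC, so no destination_membership source
```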
Some users may wish to exclude Fivetran error and warning messages from the final `fivetran_log__connector_status` model due to the length of the message. To disable the `errors_since_last_completed_sync` and `warnings_since_last_completed_sync` fields in the final model, you may add the following variable to your root `dbt_project.yml` file. By default, this variable is assumed to be `true`:
```yml
vars:
  fivetran_log_using_sync_alert_messages: false # this will disable only the sync alert messages within the connector status model
```
By default, this package will build the Fivetran Log staging models within a schema titled (`<target_schema>` + `_stg_fivetran_log`) and the Fivetran Log final models within a schema titled (`<target_schema>` + `_fivetran_log`) in your target database. If this is not where you would like your Fivetran Log staging and final models to be written, add the following configuration to your root `dbt_project.yml` file:
```yml
models:
  fivetran_log:
    +schema: my_new_final_models_schema # leave blank for just the target_schema
    staging:
      +schema: my_new_staging_models_schema # leave blank for just the target_schema
```
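To illustrate how this resolves (assuming dbt's default `generate_schema_name` behavior and a hypothetical target schema of `analytics`), the sketch below would write final models to `analytics_fivetran_log_prod` and staging models to `analytics_stg_fivetran_log_prod`:

```yml
models:
  fivetran_log:
    +schema: fivetran_log_prod        # final models land in <target_schema>_fivetran_log_prod
    staging:
      +schema: stg_fivetran_log_prod  # staging models land in <target_schema>_stg_fivetran_log_prod
```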
If an individual source table has a different name than expected (see this project's `dbt_project.yml` variable declarations for expected names), provide the name of the table as it appears in your warehouse to the respective variable as identified below:
```yml
vars:
  fivetran_log_<default_table_name>_identifier: your_table_name
```
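For instance, if the `connector` source table were named `fivetran_connector` in your warehouse (a hypothetical rename, used only to show how the pattern above resolves), the variable would be:

```yml
vars:
  fivetran_log_connector_identifier: fivetran_connector # hypothetical table name as it appears in your warehouse
```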
Fivetran offers the ability for you to orchestrate your dbt project through Fivetran Transformations for dbt Core™. Refer to the linked docs for more information on how to set up your project for orchestration through Fivetran.
This dbt package is dependent on the following dbt packages. Please be aware that these dependencies are installed by default within this package. For more information on the packages below, refer to the dbt hub site. If you have any of these dependent packages in your own `packages.yml` file, we highly recommend you remove them to ensure there are no package version conflicts.
```yml
packages:
  - package: fivetran/fivetran_utils
    version: [">=0.4.0", "<0.5.0"]

  - package: dbt-labs/dbt_utils
    version: [">=1.0.0", "<2.0.0"]

  - package: dbt-labs/spark_utils
    version: [">=0.3.0", "<0.4.0"]
```
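In other words, because dbt installs package dependencies transitively, a minimal `packages.yml` for a project using this package only needs the `fivetran_log` entry shown earlier; the dependencies above do not need to be listed again:

```yml
packages:
  - package: fivetran/fivetran_log
    version: [">=0.7.0", "<0.8.0"]
  # fivetran_utils, dbt_utils, and spark_utils are installed automatically
  # when dbt deps resolves the fivetran_log package
```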
The Fivetran team maintaining this package only maintains the latest version of the package. We highly recommend you stay consistent with the latest version of the package and refer to the CHANGELOG and release notes for more information on changes across versions.
These dbt packages are developed by a small team of analytics engineers at Fivetran. However, the packages are made better by community contributions!
We highly encourage and welcome contributions to this package. Check out this post on the best workflow for contributing to a package!
- If you have any questions or want to reach out for help, please refer to the GitHub Issue section to find the right avenue of support for you.
- If you would like to provide feedback to the dbt package team at Fivetran, or would like to request a future dbt package to be developed, then feel free to fill out our Feedback Form.
- Have questions or want to be part of the community discourse? Create a post in the Fivetran community and our team along with the community can join in on the discussion!