dbt core
The Kensu collector for dbt core is designed to enhance data observability in environments such as Snowflake and BigQuery. By collecting metadata from dbt as well as from the data sources used by its models, it provides valuable insights into data lineage, schema changes, and metrics.

How it works

Step 1: Extraction of dbt artifacts

The Kensu collector retrieves the essential metadata generated by dbt through the already implemented dbt plugin. This includes:

- metadata related to the models and jobs
- a comprehensive list of data sources, such as Snowflake tables or BigQuery datasets
- detailed lineage, covering the input/output relationships among data sources

Step 2: Metadata extraction from data sources

- Data source schema extraction: for each table identified in the lineage, the collector retrieves schema details such as column names and data types, which helps in understanding the structure of the data being manipulated.
- Quality metrics collection: alongside schema information, the collector gathers a range of quality metrics for the data sources involved. These metrics may include column-level statistics such as null counts, unique value counts, and distribution summaries, providing a comprehensive view of data quality.

Step 3: Transmission to Kensu Core

After collection, all observations (lineage information, schema details, and quality metrics) are transmitted to Kensu Core, which processes and analyzes the metadata to offer actionable insights such as anomaly detection, data quality issues, and performance bottlenecks. This process works in tandem with an agent.

In the next pages, see how you can configure it with Configure dbt core collector with Elementary for Snowflake docid\ meo4pur1hus1hpholbzx1 or Elementary data for BigQuery. The sections below sketch what each of the three steps might look like in practice.
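To make Step 1 concrete, here is a minimal sketch of reading the artifacts a dbt run leaves behind. It assumes a standard dbt project whose artifacts are written to target/manifest.json; it only illustrates the kind of information extracted (models, data sources, lineage) and is not the collector's actual implementation.

```python
import json
from pathlib import Path

# Path to the artifacts dbt writes after a run; "target/" is the default
# output directory, adjust if your project overrides it.
MANIFEST_PATH = Path("target/manifest.json")

manifest = json.loads(MANIFEST_PATH.read_text())

# Models and sources are keyed by unique_id, e.g. "model.my_project.orders".
models = {
    uid: node
    for uid, node in manifest.get("nodes", {}).items()
    if node.get("resource_type") == "model"
}
sources = manifest.get("sources", {})

# parent_map lists, for each node, the upstream nodes it reads from,
# i.e. the input/output lineage between data sources and models.
lineage = {
    uid: parents
    for uid, parents in manifest.get("parent_map", {}).items()
    if uid in models
}

for uid, parents in lineage.items():
    node = models[uid]
    print(f"{node['database']}.{node['schema']}.{node['name']} <- {parents}")
```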
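For Step 2, the sketch below shows one way to derive schema details and simple column-level quality metrics with plain SQL over a DB-API connection (for example, the Snowflake Python connector). The function name profile_table, the chosen metrics, and the %s parameter style are illustrative assumptions; the collector may compute a different or richer set of metrics.

```python
from typing import Any

def profile_table(conn: Any, database: str, schema: str, table: str) -> dict:
    """Collect schema details and simple column-level quality metrics
    for one table, using plain SQL over a DB-API connection."""
    cur = conn.cursor()

    # Schema extraction: column names and data types from information_schema.
    # Parameter placeholders (%s here) depend on the driver in use.
    cur.execute(
        "select column_name, data_type "
        "from information_schema.columns "
        "where table_catalog = %s and table_schema = %s and table_name = %s",
        (database, schema, table),
    )
    columns = {name: {"data_type": dtype} for name, dtype in cur.fetchall()}

    # Quality metrics: row count, then null and distinct counts per column.
    fq_name = f"{database}.{schema}.{table}"
    cur.execute(f"select count(*) from {fq_name}")
    row_count = cur.fetchone()[0]

    for name, stats in columns.items():
        cur.execute(
            f"select count(*) - count({name}), count(distinct {name}) from {fq_name}"
        )
        nulls, distincts = cur.fetchone()
        stats.update({"null_count": nulls, "distinct_count": distincts})

    return {"table": fq_name, "row_count": row_count, "columns": columns}
```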
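For Step 3, the snippet below only illustrates the shape of the final hand-off. The endpoint URL, token, and payload layout are placeholders and not the Kensu API; in practice the agent takes care of transmitting the observations to Kensu Core.

```python
import json
import urllib.request

# Placeholder endpoint and token: illustrative assumptions only, not the real
# Kensu API. In practice the paired agent handles transmission to Kensu Core.
KENSU_INGESTION_URL = "https://kensu.example.com/api/ingest"
API_TOKEN = "replace-me"

# Example payload combining the three kinds of observations described above
# (values are made up for illustration).
observations = {
    "lineage": {"model.my_project.orders": ["source.my_project.raw.raw_orders"]},
    "schemas": {"ANALYTICS.PUBLIC.ORDERS": {"ORDER_ID": "NUMBER", "STATUS": "TEXT"}},
    "metrics": {"ANALYTICS.PUBLIC.ORDERS": {"row_count": 1200, "ORDER_ID.null_count": 0}},
}

request = urllib.request.Request(
    KENSU_INGESTION_URL,
    data=json.dumps(observations).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_TOKEN}",
    },
    method="POST",
)
with urllib.request.urlopen(request) as response:
    print(response.status)
```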