Integration
The Kensu integration ecosystem is composed of the following:
An agent is a specialized piece of software that is embedded directly into scripts or programs. An agent can be, for instance, a Spark JAR, a Python module, or an activity in Azure Data Factory.
Deploying an agent is straightforward: it is integrated into the program or script it observes, adding observability without changing the program's logic.
Examples of agents include the PySpark agent.
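The embedded-agent idea can be sketched in plain Python: a decorator wraps a pipeline step and records an observation as a side effect, while the step itself runs unchanged. The `Observation` class and the `observed` decorator below are illustrative assumptions, not part of any Kensu API.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Observation:
    operation: str
    rows: int

# Observations collected as the wrapped program runs.
OBSERVATIONS: List[Observation] = []

def observed(operation: str):
    """Decorator: run the wrapped data step, then record an observation."""
    def wrap(fn: Callable):
        def inner(*args, **kwargs):
            result = fn(*args, **kwargs)
            OBSERVATIONS.append(Observation(operation, len(result)))
            return result
        return inner
    return wrap

@observed("load_orders")
def load_orders():
    # In a real pipeline this would read from a file or database.
    return [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 25.5}]

orders = load_orders()
```

The key property this illustrates is that the observation is captured from inside the program itself, which is what distinguishes an agent from an external collector.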
A collector operates externally to the primary data tools. Its main function is to create data observations, and it employs different strategies to achieve this:
- Retrieve prepared information from data tools' interfaces (e.g., API)
- Consume and interpret raw information generated by the data tools (e.g., logs)
- Trigger processes in the data tools (e.g., sql metrics queries), using for instance Data Source Connections
Examples of collectors include the Azure Data Factory collector.
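The second strategy above, consuming raw logs, can be sketched as follows: a collector parses log lines emitted by a data tool into structured observations. The log format, field names, and `collect_from_logs` helper are assumptions for illustration only.

```python
import re
from typing import Dict, List

# Assumed log format: "<timestamp> JOB=<name> ROWS_WRITTEN=<count>"
LOG_LINE = re.compile(
    r"^(?P<ts>\S+) JOB=(?P<job>\S+) ROWS_WRITTEN=(?P<rows>\d+)$"
)

def collect_from_logs(lines: List[str]) -> List[Dict[str, object]]:
    """Turn matching log lines into observation dicts; skip the rest."""
    observations = []
    for line in lines:
        m = LOG_LINE.match(line)
        if m:
            observations.append({
                "timestamp": m.group("ts"),
                "job": m.group("job"),
                "rows_written": int(m.group("rows")),
            })
    return observations

logs = [
    "2024-05-01T10:00:00Z JOB=daily_load ROWS_WRITTEN=1532",
    "unrelated noise line",
    "2024-05-01T10:05:00Z JOB=cleanup ROWS_WRITTEN=87",
]
observations = collect_from_logs(logs)
```

Because the collector only reads what the tool already emits, it needs no changes to the tool itself, which is the defining trait of this integration style.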
A data source connection is a bridge that enables direct communication with database tables. When triggered by a collector, the data source connection queries the database to supplement the collector's data.
Examples of data sources accessed through DS connections include Snowflake and MS SQL.
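A minimal sketch of what such a connection does when triggered: it runs metric queries against the database and returns the results to the collector. Here `sqlite3` stands in for Snowflake or MS SQL, and the table, column, and `fetch_metrics` helper are made up for illustration.

```python
import sqlite3

def fetch_metrics(conn: sqlite3.Connection, table: str, column: str) -> dict:
    """Run simple metric queries (row count, null count) on one table."""
    cur = conn.cursor()
    total = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    nulls = cur.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]
    return {"row_count": total, "null_count": nulls}

# In-memory database standing in for the real data source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, 10.0), (2, None), (3, 7.5)],
)
metrics = fetch_metrics(conn, "orders", "amount")
```

This also shows why the connection is triggered on demand rather than running continuously: the metric queries put load on the database, so the collector decides when they are worth executing.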
Kensu can establish connections with various tools using APIs. This connection serves a dual purpose:
- Configuration: Kensu can access tools like Databricks to set up and configure specific agents, such as the Spark agent.
- Data Push: Kensu can also transmit information to tools, such as sending a list of data sources to a Data Catalog like Data Galaxy.
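The "data push" direction can be sketched as building a JSON payload of data sources for a catalog's REST API. The endpoint URL and payload shape below are assumptions; real catalogs such as Data Galaxy define their own schemas, and the request is constructed but deliberately not sent.

```python
import json
from urllib.request import Request

def build_catalog_push(endpoint: str, data_sources: list) -> Request:
    """Prepare (but do not send) a POST pushing data sources to a catalog."""
    payload = json.dumps({"dataSources": data_sources}).encode("utf-8")
    return Request(
        endpoint,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_catalog_push(
    "https://catalog.example.com/api/datasources",  # hypothetical endpoint
    [
        {"name": "orders", "type": "snowflake_table"},
        {"name": "customers", "type": "mssql_table"},
    ],
)
```

In practice the same API client would also carry authentication headers and handle retries; those concerns are omitted to keep the shape of the exchange visible.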