Configure the Agent
Set the dam.ingestion.auth.token property in the conf.ini file:
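A minimal fragment of conf.ini might look like the following (the token value is a placeholder — copy the real ingestion token from your Kensu workspace; any section headers or additional properties your installation requires are not shown here):

```ini
; conf.ini — the value below is a placeholder;
; use the ingestion token from your Kensu account.
dam.ingestion.auth.token=<your-ingestion-token>
```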
Here is the code before we add Kensu:
1️⃣ Initialize Kensu in your code
First, we create a Spark session, adding the Kensu JAR file.
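As a sketch, assuming PySpark and a local copy of the Kensu agent JAR (the JAR path and application name below are placeholders, not values from the original):

```python
from pyspark.sql import SparkSession

# Create a Spark session and register the Kensu agent JAR with the job.
# The path is a placeholder for wherever you downloaded the JAR.
spark = (
    SparkSession.builder
    .appName("kensu-demo")
    .config("spark.jars", "/path/to/kensu-spark-agent.jar")
    .getOrCreate()
)
```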
2️⃣ Create a Connection to Kensu
SparkSessionDAMWrapper creates a connection between the Spark job and Kensu so that events can be sent through the API.
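A hedged sketch of this step — the import path and the constructor signature of SparkSessionDAMWrapper are assumptions, so check the agent's documentation for the exact module and arguments:

```python
# Assumption: the module path below is illustrative only; consult the
# Kensu agent documentation for where SparkSessionDAMWrapper actually lives.
from kensu.pyspark import SparkSessionDAMWrapper  # hypothetical import path

# Wrap the existing Spark session so the job can send events to Kensu
# through the API (credentials are read from conf.ini).
spark = SparkSessionDAMWrapper(spark)
```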
3️⃣ Send metadata to Kensu
Then we combine the datasets with the join() function and save the resulting data model as a Parquet file; writing the file triggers the agent to send metadata, profiling information, and lineage to the Kensu cloud:
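The step above might look like this in code — a sketch in which the DataFrame names, join key, and output path are placeholders rather than values from the original job:

```python
# Join the two input DataFrames on a shared key (all names are placeholders).
model_df = customers_df.join(orders_df, on="customer_id", how="inner")

# Writing the result to Parquet is the point where the Kensu agent
# captures and sends metadata, profiling, and lineage for the dataset.
model_df.write.mode("overwrite").parquet("/data/output/data_model.parquet")
```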
Here is the complete code: