Kensu Documentation

Scala Spark
Configure the Agent

💻 Include Kensu in your program

Modify conf.ini File

Enter the kensu_ingestion_token in the conf.ini file:

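The snippet itself did not survive on this page; as a sketch, the relevant conf.ini entry might look like this (the section name is assumed from typical agent configuration files, and the token value is a placeholder you must replace with the one from your Kensu account):

```ini
; Kensu agent configuration — placeholder values only
[kensu]
kensu_ingestion_token=<your_kensu_ingestion_token>
```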

🕵️‍♀️ Explore the program

Here is the code before we add Kensu:

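The original listing is not shown here; as a sketch, a plain Spark job before any Kensu instrumentation might look like this (the object name, input files, and join key are hypothetical):

```scala
import org.apache.spark.sql.SparkSession

object MyPipeline {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("my-pipeline")
      .getOrCreate()

    // Two hypothetical input datasets
    val customers = spark.read.option("header", "true").csv("data/customers.csv")
    val orders    = spark.read.option("header", "true").csv("data/orders.csv")

    // Combine them and write the result as Parquet
    val report = customers.join(orders, Seq("customer_id"))
    report.write.mode("overwrite").parquet("data/report")

    spark.stop()
  }
}
```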

👨‍💻 Modify Program to use Kensu-Spark

1️⃣ Init Kensu in your code

First, we create a Spark session, adding the Kensu JAR file.

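A minimal sketch of this step, assuming the Kensu collector JAR has been downloaded locally (the JAR path is illustrative — use the file from your agent download; you can equally pass it with spark-submit's `--jars` option):

```scala
import org.apache.spark.sql.SparkSession

// Create the Spark session with the Kensu collector JAR on the classpath.
val spark = SparkSession.builder()
  .appName("my-pipeline")
  .config("spark.jars", "/path/to/kensu-spark-collector.jar")
  .getOrCreate()
```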

2️⃣ Create Connection to Kensu

SparkSessionDAMWrapper creates a connection between the Spark job and Kensu so that events can be sent through the API.

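As a sketch: only `SparkSessionDAMWrapper` is named in this guide, so the package path and the `track()` call below are assumptions — check the agent's README for the exact import and method signature:

```scala
// Assumed package path — verify against the Kensu agent distribution.
import io.kensu.sparkcollector.SparkSessionDAMWrapper._

// Wrapping the session opens the connection used to send events to
// Kensu through its API, using the settings from conf.ini.
spark.track()
```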

3️⃣ Send metadata to Kensu

Next, we combine the datasets with the join() function and save the result as a Parquet file. When the data is written, the agent sends metadata, profiling, and lineage to Kensu:

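A sketch of this step, reusing the hypothetical dataset names from the earlier example — once the session is tracked, ordinary Spark code is all that is needed:

```scala
// The agent observes this join and write, and sends metadata,
// profiling, and lineage for the resulting Parquet dataset to Kensu.
val report = customers.join(orders, Seq("customer_id"))
report.write.mode("overwrite").parquet("data/report")
```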

Complete Code

Here is the complete code:

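The full listing did not survive on this page; combining the steps above gives the following sketch. The Kensu package path, `track()` call, JAR location, and dataset names are all illustrative — substitute the values from your Kensu agent download:

```scala
import org.apache.spark.sql.SparkSession
// Assumed package path — verify against the Kensu agent distribution.
import io.kensu.sparkcollector.SparkSessionDAMWrapper._

object MyPipeline {
  def main(args: Array[String]): Unit = {
    // 1. Spark session with the Kensu collector JAR on the classpath
    val spark = SparkSession.builder()
      .appName("my-pipeline")
      .config("spark.jars", "/path/to/kensu-spark-collector.jar")
      .getOrCreate()

    // 2. Connect the job to Kensu (reads conf.ini, including the token)
    spark.track()

    // 3. Normal Spark logic — the agent observes the write and sends
    //    metadata, profiling, and lineage to Kensu
    val customers = spark.read.option("header", "true").csv("data/customers.csv")
    val orders    = spark.read.option("header", "true").csv("data/orders.csv")
    val report    = customers.join(orders, Seq("customer_id"))
    report.write.mode("overwrite").parquet("data/report")

    spark.stop()
  }
}
```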



Updated 07 Nov 2022