Running your first Databricks Kensu program
👨💻 Modify the Notebook to use Kensu
1️⃣ Init Kensu in your code
Add a new code cell at the beginning of the Notebook:
Scala
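The content of this code cell is not rendered in this export. A minimal sketch of what the init cell looks like, assuming the same `DatabricksCollector.track` entry point that appears in the configuration note below (the `conf.ini` path is an example and should match where you uploaded your configuration file):

```scala
// Attach the Kensu collector to the current Spark session.
// Settings are read from a conf.ini file stored on DBFS
// (example path — adjust to your own FileStore location).
io.kensu.sparkcollector.environments.DatabricksCollector.track(spark, "/dbfs/FileStore/conf/conf.ini")
```

After this cell runs, subsequent Spark reads and writes in the notebook are tracked and reported to Kensu.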
ℹ️ Customizing configuration for a single notebook
You can override settings from the conf.ini file for one particular notebook (i.e. conf.ini can be shared globally, with customizations applied to individual notebooks when needed) using the following code:
io.kensu.sparkcollector.environments.DatabricksCollector.track(spark, "/dbfs/FileStore/conf/conf.ini", "report_to_file" -> "True")
2️⃣ Run the application
Execute the existing program. You can find a test notebook here
3️⃣ Check the result
Now you can view the data in the Kensu App.
Initialization with Python code
If Scala is not convenient, you can initialize Kensu entirely from Python:
Python
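The Python cell is likewise not rendered in this export. One possible sketch, assuming the Scala collector shown above is reachable from PySpark through the py4j JVM gateway — the `spark._jvm` call pattern here is an assumption, not confirmed Kensu API, and the `conf.ini` path is an example:

```python
# Hypothetical sketch: invoke the Kensu Scala collector from PySpark
# via the py4j JVM gateway exposed by the active SparkSession.
# `spark` is the SparkSession provided by the Databricks notebook.
collector = spark._jvm.io.kensu.sparkcollector.environments.DatabricksCollector

# Pass the underlying JVM SparkSession and the DBFS path to conf.ini
# (example path — adjust to your own FileStore location).
collector.track(spark._jsparkSession, "/dbfs/FileStore/conf/conf.ini")
```

As in the Scala version, run this cell before any tracked reads or writes so the collector observes the whole notebook run.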