Rules
In this section you can learn how to work with Rules in Notebooks written in Scala.
We've moved! To improve customer experience, the Collibra Data Quality User Guide has moved to the Collibra Documentation Center as part of the Collibra Data Quality 2022.11 release. To ensure a seamless transition, dq-docs.collibra.com will remain accessible, but the DQ User Guide is now maintained exclusively in the Documentation Center.
| Code | Description |
|---|---|
| `val rule = new Rule()` | Instantiating a Rule object |
| `rule.setDataset(<DATASET>)` | Adding the name of the dataset |
| `rule.setRuleNm(<RULE_NAME>)` | Adding the name of the given rule |
| `rule.setRuleValue(<RULE_EXPRESSION>)` | Setting the simple RULE_EXPRESSION |
| `rule.setRuleType(<RULE_TYPE>)` | Setting the rule type |
| `rule.setPerc(<RULE_PERCENTAGE>)` | Setting the percentage |
| `rule.setPoints(<RULE_POINT>)` | Setting how many points will be deducted from the total for each percentage |
| `rule.setIsActive(<RULE_IS_ACTIVE>)` | Making the rule active or inactive |
| `rule.setUserNm(<RULE_OWNER_USERNAME>)` | Adding the owner |
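Putting the setters above together, here is a minimal sketch of building a rule. The dataset name, rule name, expression, owner, and the `"SQLG"` rule type are placeholder/assumed values for illustration only; substitute the values for your environment.

```scala
import com.owl.common.domain2.Rule

// All values below are placeholders for illustration.
val rule = new Rule()
rule.setDataset("example_dataset")   // dataset the rule runs against
rule.setRuleNm("balance_check")      // rule name shown in the DQ UI
rule.setRuleValue("balance > 0")     // simple rule expression
rule.setRuleType("SQLG")             // rule type (assumed value)
rule.setPerc(1.0)                    // percentage bucket
rule.setPoints(1)                    // points deducted per percentage
rule.setIsActive(1)                  // make the rule active
rule.setUserNm("admin")              // rule owner
```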
```scala
import com.owl.core.Owl
import com.owl.core.util.OwlUtils
import com.owl.common.options.OwlOptions
import com.owl.common.{Props, Utils}
import com.owl.common.domain2.Rule
import org.apache.spark.sql.functions._
```
Databricks and other notebook execution frameworks work with a managed/shared Spark session, therefore we recommend using this code snippet in your notebook to initialize the current Spark session properly.
```scala
//----- Init Spark ----- //
import org.apache.spark.sql.SparkSession

var sparkSession: SparkSession = _

def sparkInit(): Unit = {
  sparkSession = SparkSession.builder
    .master("local")
    .appName("test")
    .getOrCreate()
}
```
Don't call `spark.stop` in any of your notebooks, otherwise the execution engine will exit immediately!
Make sure the `OwlContext` is already created before using any method from `OwlUtils`!
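As a hedged sketch of the overall flow under these constraints: initialize the shared Spark session, build a DataFrame, create the `OwlContext`, and only then call `OwlUtils` methods. The dataset name, run ID, and data path are placeholders, and the exact `OwlUtils.OwlContext` usage should be verified against your DQ version's API.

```scala
import com.owl.common.options.OwlOptions
import com.owl.core.util.OwlUtils

sparkInit() // initialize the managed/shared Spark session first

val opt = new OwlOptions()
opt.dataset = "example_dataset" // placeholder dataset name
opt.runId = "2022-11-01"        // placeholder run date

// Any DataFrame to scan; the path is a placeholder.
val df = sparkSession.read.parquet("/path/to/data")

// Create the OwlContext before calling other OwlUtils methods.
val owl = OwlUtils.OwlContext(df, opt)
owl.register(opt)
owl.owlCheck()
```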