Rules
This section explains how to work with Rules in notebooks written in Scala.

Instantiation

| Code | Description |
| --- | --- |
| `val rule = new Rule()` | Instantiates a Rule object |
| `rule.setDataset(<DATASET>)` | Sets the name of the dataset |
| `rule.setRuleNm(<RULE_NAME>)` | Sets the name of the rule |
| `rule.setRuleValue(<RULE_EXPRESSION>)` | Sets the rule expression |
| `rule.setRuleType(<RULE_TYPE>)` | Sets the rule type |
| `rule.setPerc(<RULE_PERCENTAGE>)` | Sets the percentage |
| `rule.setPoints(<RULE_POINT>)` | Sets how many points are deducted from the total score for each percentage |
| `rule.setIsActive(<RULE_IS_ACTIVE>)` | Makes the rule active or inactive. Possible values: ACTIVE: `1` / `true`, INACTIVE: `0` / `false` |
| `rule.setUserNm(<RULE_OWNER_USERNAME>)` | Sets the rule owner |
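Putting the setters above together, a minimal sketch of building a rule might look like the following. All values are placeholders (dataset name, rule name, expression, type, and owner are illustrative assumptions, not values from this document):

```scala
import com.owl.common.domain2.Rule

// A minimal sketch; every value below is a placeholder.
val rule = new Rule()
rule.setDataset("example_ds")          // dataset the rule runs against
rule.setRuleNm("city_not_null")        // name of this rule
rule.setRuleValue("city is not null")  // rule expression
rule.setRuleType("SQLG")               // rule type (assumed example value)
rule.setPerc(1.0)                      // percentage
rule.setPoints(1)                      // points deducted from the total for each percentage
rule.setIsActive(1)                    // 1 / true = active, 0 / false = inactive
rule.setUserNm("admin")                // rule owner
```

This example requires the Owl client library on the notebook classpath and is not runnable on its own.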

Rule types

Requirements

Required imports

```scala
import com.owl.core.Owl
import com.owl.core.util.OwlUtils
import com.owl.common.options.OwlOptions
import com.owl.common.{Props, Utils}

import com.owl.common.domain2.Rule

import org.apache.spark.sql.functions._
```

SparkSession initialization

Databricks and other notebook execution frameworks work with a managed/shared Spark session, so we recommend using this code snippet in your notebook to initialize the current Spark session properly.
```scala
//----- Init Spark ----- //
import org.apache.spark.sql.SparkSession

// Holds the session so other cells can reuse it
var sparkSession: SparkSession = _

def sparkInit(): Unit = {
  sparkSession = SparkSession.builder
    .master("local")
    .appName("test")
    .getOrCreate()
}
```
Don't call spark.stop in any of your notebooks; otherwise the execution engine will exit immediately!
Make sure the OwlContext is already created before using any method from OwlUtils!
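As a sketch of the warning above, the OwlContext is typically created from a DataFrame and an OwlOptions object before any other OwlUtils call. The method and field names below follow the imports listed earlier but are assumptions from typical Owl client usage, and the dataset values are placeholders:

```scala
import com.owl.common.options.OwlOptions
import com.owl.core.util.OwlUtils

val opt = new OwlOptions()
opt.dataset = "example_ds"  // placeholder dataset name
opt.runId = "2021-09-01"    // placeholder run date

// Create the OwlContext from an existing DataFrame (df)
// before using any other OwlUtils method
val owl = OwlUtils.OwlContext(df, opt)
owl.register(opt)
```

This fragment assumes a DataFrame named `df` already exists in the notebook and that the Owl client library is on the classpath, so it is not runnable on its own.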