The image above depicts owl-web, owl-core, Postgres, and Orient all deployed on the same server. This can be an edge node of a Hadoop cluster, or any server able to submit spark-submit jobs to the Hadoop cluster. The same server can also have JDBC access to other database engines that Owl should quality-scan. Reading the depiction from left to right: the client uses a browser to connect to Owl's web application, which listens on port 9000 by default. The web application communicates with the two metastores (Postgres and Orient). A local owlcheck job can be run from the web application, or the owlcheck script can be launched natively from the CLI. owlcheck launches a job using Owl's built-in Spark local DQ engine; depending on the options supplied to the owlcheck command, the DQ job can scan a file or a database over JDBC.
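Since all four components share one server, a quick post-deployment sanity check is to verify their listening ports are reachable. The sketch below is a hypothetical helper, not part of Owl: port 9000 is the web application's default from the description above, while 5432 (Postgres) and 2480 (OrientDB HTTP) are those databases' standard defaults and may differ in your install.

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Hypothetical check against the single-server layout described above.
    # Ports for Postgres/OrientDB are assumed defaults; adjust as needed.
    for name, port in [("owl-web", 9000), ("postgres", 5432), ("orientdb", 2480)]:
        status = "open" if is_port_open("localhost", port) else "closed"
        print(f"{name:10s} port {port}: {status}")
```

Because the components are co-located, everything is probed on localhost; for a split deployment the same helper can be pointed at each component's host.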