Container & readme: Docker Hub
The requirements for enterprise-grade data quality solutions are:
This is achieved by adding a Web UI, deeply integrated with Kafka, that makes it comfortable to maintain rules and to test them against sample data.
Once the rules are configured and deployed, they are assigned to an input Kafka topic and executed on every incoming record.
The rule results are attached to each record in an additional field of that record and written to an output topic. Consumers can therefore read either the raw data or the cleansed data and make use of the audit information; if the audit information is persisted in the Lakehouse, BI tools can visualize the rule results.
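As an illustration of this flow, the sketch below shows a minimal Kafka Streams application that applies one rule to every record of an input topic and attaches the verdict in an extra field. The topic names `sales-input` and `sales-output`, the field name `__audit`, and the rule itself are hypothetical placeholders, not part of the actual product, where rules are defined through the Web UI.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class RuleApplier {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "rules-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> input = builder.stream("sales-input");
        // Apply the rule set to every incoming record and write the
        // annotated record to the output topic.
        input.mapValues(RuleApplier::applyRules).to("sales-output");

        new KafkaStreams(builder.build(), props).start();
    }

    // Evaluates a single hypothetical rule and stores the verdict in an
    // extra "__audit" field instead of modifying the payload itself.
    private static String applyRules(String json) {
        try {
            ObjectNode rec = (ObjectNode) MAPPER.readTree(json);
            ObjectNode audit = rec.putObject("__audit");
            boolean amountOk = rec.path("amount").asDouble() >= 0;
            audit.put("rule_amount_not_negative", amountOk ? "PASS" : "FAIL");
            return MAPPER.writeValueAsString(rec);
        } catch (Exception e) {
            return json; // pass the record through unchanged if it is not valid JSON
        }
    }
}
```

Because the verdict lives in a separate field, the original payload stays intact: a consumer that ignores `__audit` sees the raw data, while an auditing consumer can filter or aggregate on the PASS/FAIL results.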
This makes it possible to build applications such as
rtdi.io GmbH
Tallach 150
9182 St. Jakob im Rosental
Austria
UID ATU74541169
Contact
[email protected]