Tuesday, May 28, 2024

Dataddo Enhances Data Quality at the Pipeline Level


Dataddo introduces Data Quality Firewall, a powerful feature embedded at the pipeline level in its automated data integration platform.

Dataddo, an automated, end-to-end data integration platform, is debuting Data Quality Firewall, a new feature that ensures the accuracy of data the platform extracts and delivers to various storage destinations, including BigQuery, Snowflake, and S3.

Data Quality Firewall comes in response to the widespread challenges, and costs, of poor data quality. It joins Dataddo's existing suite of data quality features as a new layer in the company's multi-layered approach, embedded directly at the pipeline level.

“Our Firewall is the first line of defense for data quality,” said Petr Nemeth, CEO and founder of Dataddo. “By intercepting corrupt and non-compliant data at the earliest stage of integration, it primes any data-driven initiative for success.”

Data Quality Firewall offers the following capabilities:

  • Performance checks on null values, zero values, and anomalies
  • Individual configuration for each column
  • Granular quality control
  • Multi-mode operation for adhering to various fault tolerance standards
  • Easy configuration and testing of rules via a no-code interface
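
In practice, Dataddo configures these rules through its no-code interface, so the following is only a hypothetical sketch of what pipeline-level, per-column checks of this kind might look like. The `ColumnRule` and `apply_firewall` names, the z-score anomaly test, and the "block"/"warn" modes are illustrative assumptions, not Dataddo's actual implementation.

```python
# Hypothetical sketch of pipeline-level data quality checks; all names,
# thresholds, and modes are illustrative, not Dataddo's actual API.
from dataclasses import dataclass
from statistics import mean, stdev
from typing import Optional

@dataclass
class ColumnRule:
    """Quality rule configured individually for each column."""
    allow_nulls: bool = True
    allow_zeros: bool = True
    z_threshold: Optional[float] = None  # flag values this many std devs from the mean

def apply_firewall(rows, rules, mode="block"):
    """Run per-column rules over rows of dicts. In 'block' mode, raise on
    the first violation; in 'warn' mode, drop the offending row and continue."""
    # Pre-compute per-column mean/stdev for the anomaly (z-score) checks.
    stats = {}
    for col, rule in rules.items():
        if rule.z_threshold is not None:
            vals = [r[col] for r in rows if isinstance(r.get(col), (int, float))]
            if len(vals) >= 2:
                stats[col] = (mean(vals), stdev(vals))

    passed = []
    for row in rows:
        violation = None
        for col, rule in rules.items():
            value = row.get(col)
            if value is None and not rule.allow_nulls:
                violation = f"null value in column '{col}'"
            elif value == 0 and not rule.allow_zeros:
                violation = f"zero value in column '{col}'"
            elif col in stats and isinstance(value, (int, float)):
                mu, sigma = stats[col]
                if sigma and abs(value - mu) / sigma > rule.z_threshold:
                    violation = f"anomalous value {value} in column '{col}'"
            if violation:
                break
        if violation is None:
            passed.append(row)
        elif mode == "block":
            raise ValueError(violation)
    return passed
```

For example, with `rules = {"revenue": ColumnRule(allow_nulls=False)}`, "warn" mode would filter out rows with a null revenue while letting the rest of the load proceed, whereas "block" mode would halt the pipeline, loosely mirroring the multi-mode fault-tolerance idea above.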

This latest layer builds on Dataddo's other quality features, including automatic format harmonization for machine-readable output, various write modes, and the Data Quality Watcher. Additionally, the Dataddo platform supports database replication, reverse ETL, and direct integration of online services with dashboarding apps.

“The Data Quality Firewall plays into our overarching mission as an integration tool vendor: to enable movement of data from one end of an infrastructure to the other, with as little complexity as possible,” Nemeth concluded.
