
AQTIVA.AI, good data from the start

Aqtiva’s data quality platform leverages Big Data and AI to help you set up quality points in just a few clicks

Today, modern data ingestion and processing systems, such as data lakes, Big Data processing environments, and near-real-time (NRT) platforms, have drastically changed data quality requirements: it is no longer a valid option to profile data periodically at the database level. Instead, correct data quality must be ensured at the moment information is ingested and becomes available to decision-making systems. Data quality as an informational paradigm has given way to operational data quality, which becomes another assurance element in customers' ingestion and ETL/ELT processes.


Aqtiva is an operational data quality platform that applies data quality at ingestion/processing time, completely non-intrusively. It works with both NRT and batch processes, with traditional ETL technologies or integrated into Big Data environments through its Spark libraries, and it runs in both on-premises and cloud environments.

Aqtiva is primarily a SaaS solution, so the platform is available without any installation. If a customer requires a dedicated installation, any or all of its components can be deployed on-premises or in the cloud.


Aqtiva Management Platform


The Management Platform provides a visual interface for defining quality policies and for managing their design and deployment cycle. It is built on technology that integrates easily with data buses (such as Kafka), document databases, and SQL databases.


The platform covers the following functionalities:

  • Security and access-policy rule definition

  • Definition of data quality rules for the quality points the user chooses

  • Simulation capabilities to analyze the viability and fitness of the implemented policies

  • Automatic rule deployment, management, and code generation

  • Endpoints for connecting external tools, such as on-premises ETL


Aqtiva Quality Engine


The Aqtiva Quality Engine executes the rules defined by users in the Management Platform over massive amounts of data in real-time, batch, or NRT mode, and integrates with both on-premises and cloud environments. The Quality Engine is the heart of the solution, allowing flexible yet performance-tuned execution of quality rules: rules are defined in the Management Platform in a user-friendly way but executed on the most advanced data technology platforms.


The engine uses Aqtiva's own libraries, available for Spark and Python, to translate and execute the metalanguage of the Aqtiva Management Platform, with specific support for Spark (batch environments) and the cloud (especially AWS Lambda functions and Azure Functions).
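To illustrate the idea of translating declaratively defined rules into executable checks, here is a minimal Python sketch. The rule schema and function names are hypothetical, not Aqtiva's actual metalanguage or API; the point is only the pattern of compiling rule metadata into predicates and running them over records.

```python
# Hypothetical sketch: compile declarative quality rules (as a management
# platform might export them) into executable checks. Illustrative only;
# the schema and names are not Aqtiva's actual API.

def compile_rule(rule):
    """Translate a declarative rule into a predicate over a record."""
    field, kind = rule["field"], rule["type"]
    if kind == "not_null":
        return lambda rec: rec.get(field) is not None
    if kind == "range":
        lo, hi = rule["min"], rule["max"]
        return lambda rec: rec.get(field) is not None and lo <= rec[field] <= hi
    raise ValueError(f"unknown rule type: {kind}")

def run_quality_point(records, rules):
    """Apply every rule to every record and report pass counts per rule."""
    checks = {r["name"]: compile_rule(r) for r in rules}
    report = {}
    for name, check in checks.items():
        passed = sum(1 for rec in records if check(rec))
        report[name] = {"passed": passed, "total": len(records)}
    return report

rules = [
    {"name": "id_not_null", "field": "id", "type": "not_null"},
    {"name": "age_in_range", "field": "age", "type": "range", "min": 0, "max": 120},
]
records = [{"id": 1, "age": 34}, {"id": None, "age": 28}, {"id": 3, "age": 250}]
print(run_quality_point(records, rules))
# → {'id_not_null': {'passed': 2, 'total': 3}, 'age_in_range': {'passed': 2, 'total': 3}}
```

In a Spark deployment the same compiled predicates would be applied as DataFrame filters instead of a Python loop, which is what makes the batch path scale.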

Aqtiva Analytics Engine


The Analytics Engine goes a step further in data quality automation. Based on historical data, it recommends optimal quality settings and provides dynamic quality rules derived from behavioral data patterns. Users do not have to analyze their data sources themselves: Aqtiva analyzes them and recommends the most suitable set of quality rules.
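The recommendation idea can be sketched with a simple profiling heuristic: derive a range rule from a column's observed history plus a tolerance margin. This is a hypothetical illustration of the concept, not Aqtiva's actual recommendation algorithm; the margin, field names, and rule schema are all assumptions.

```python
# Hypothetical sketch of rule recommendation from historical data:
# profile a column's history and derive a range rule with a tolerance
# margin. The heuristic and schema are illustrative, not Aqtiva's.
import statistics

def recommend_range_rule(field, history, margin=0.1):
    """Suggest a range rule that covers observed values plus a margin."""
    lo, hi = min(history), max(history)
    spread = hi - lo
    return {
        "name": f"{field}_in_range",
        "field": field,
        "type": "range",
        "min": lo - margin * spread,
        "max": hi + margin * spread,
        "observed_mean": statistics.mean(history),
    }

history = [18, 22, 25, 31, 40, 47, 52, 60]
print(recommend_range_rule("age", history))
```

A production engine would use richer statistics (distributions, seasonality, drift) rather than a fixed min/max margin, but the shape of the output, a ready-to-deploy rule, is the same.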

Quality Governance & Real Time Monitoring


Aqtiva includes a custom dashboard and governance KPIs that provide real-time information about the quality of the data ingested into the customer's systems. It supports drill-down and custom quality ontology definitions, so detailed KPIs can be derived. Because this information is provided in real time, quality anomalies can be detected reactively and countermeasures taken early.
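The drill-down idea, per-rule results rolled up into dataset- and domain-level KPIs, can be sketched as follows. The two-level hierarchy and field names are hypothetical; a real quality ontology could have arbitrary depth.

```python
# Hypothetical sketch of rolling per-rule results up a quality ontology
# (domain -> dataset) so dashboard KPIs can be drilled down. The
# two-level hierarchy and field names are illustrative.
from collections import defaultdict

def rollup(results):
    """Aggregate (domain, dataset, passed, total) rows into nested KPIs."""
    kpis = defaultdict(lambda: {
        "passed": 0, "total": 0,
        "datasets": defaultdict(lambda: {"passed": 0, "total": 0}),
    })
    for domain, dataset, passed, total in results:
        kpis[domain]["passed"] += passed
        kpis[domain]["total"] += total
        kpis[domain]["datasets"][dataset]["passed"] += passed
        kpis[domain]["datasets"][dataset]["total"] += total
    return kpis

rows = [
    ("sales", "orders", 950, 1000),
    ("sales", "customers", 480, 500),
    ("finance", "invoices", 290, 300),
]
kpis = rollup(rows)
print(kpis["sales"]["passed"] / kpis["sales"]["total"])  # domain-level pass rate
```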


Aqtiva Data Collector & Integration


Aqtiva aims for easy integration with any platform or technology in the data ecosystem. Quality results generated by Aqtiva are stored in its internal storage by default, but connectors can be configured to send them to any customer system, so quality information can be integrated anywhere:


  • Event streams: Aqtiva can send data to any broker or bus, such as Kafka or Event Hubs, for further processing

  • Document databases: Aqtiva's native storage model is document-based, so connectors can be configured to send information to any document database

  • SQL databases: via JDBC, a connector can be configured to send information to any SQL database
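The connectors above share one pattern: a quality result kept in document shape can be serialized once and handed to an event-stream producer, stored in a document database, or flattened for a SQL insert. Here is a minimal sketch of that payload; the field names are hypothetical, not Aqtiva's actual event schema.

```python
# Hypothetical sketch of a connector payload: one rule execution result
# shaped as a document-style event, serializable for Kafka/Event Hubs,
# a document database, or a SQL insert. Field names are illustrative.
import json
from datetime import datetime, timezone

def build_quality_event(quality_point, rule_name, passed, total):
    """Shape one rule execution result as a document-style event."""
    return {
        "qualityPoint": quality_point,
        "rule": rule_name,
        "passed": passed,
        "total": total,
        "passRate": passed / total if total else None,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

event = build_quality_event("ingest-customers", "id_not_null", 980, 1000)
payload = json.dumps(event)  # e.g. hand this to a producer.send(topic, payload)
print(json.loads(payload)["passRate"])
# → 0.98
```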

Key Benefits

  • More than 10 times faster than using ETL tools or custom code

  • Homogenization of quality information

  • Business users (stakeholders) define quality policies, Aqtiva deploys them, and data engineers monitor

  • Real-time integration and one-click deployment

Key Functionality

  • Define Quality Rules and Policies without code, in a user-friendly interface

  • Automatic deployment of quality rules into existing ETL

  • Code integration with less than 3 lines of code

  • Full Cloud & On-prem integration

  • Real-time quality monitoring and alerting

  • SaaS or On-premise deployment



About Aqtiva Data Technologies


Aqtiva was created by Mática Partners and private investors to address the issues surrounding data quality. An error at the source can generate expensive errors at the destination. Aqtiva assists businesses in environments that process information in real time, where the owners of the information do not have time to verify its quality.

Based on machine learning, Big Data, and predictive analytics algorithms, Aqtiva provides data quality management at ingestion time, delivering a real-time data quality and audit mechanism for any information processing system.


Download: 20201007 AQTIVA WHITE PAPER V1.0 (PDF, 263 KB)
