
Ingest, Capture, Safeguard & Manage



Meaningful insights require good-quality data. Cognizant's experience shows that the vast majority of organizations spend a significant amount of time getting data ready for consumption.

BigDecisions takes a zero-code approach to this entire process, reducing preparation time by almost 60% and enabling organizations to manage their data more effectively. Users can quickly achieve their desired outcomes through five Data Management workbenches: Data Ingestion, Data Transformation, Data Privacy, Data Quality and Metadata Management.


Our BigDecisions Data Ingestion workbench is a universal tool for extracting large volumes of data from a variety of sources, including databases, REST APIs, FTP/SFTP servers and file servers, and loading it into Hadoop, relational and IoT-capable storage.

  • One-stop solution for data ingestion requirements on Hadoop.
  • Dramatically simplifies the complex data load process into Hadoop and its ecosystem components.
  • Highly extensible; does not dictate technology choices for users.
  • Implemented with simple shell scripts and Java/MapReduce programs.
  • Driven by configurations that often don’t require any coding to address data ingestion requirements.
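BigDecisions' internal configuration format is not public, so the following is only a rough, hypothetical sketch of what configuration-driven (zero-code) ingestion looks like in general. All names, keys and functions here are illustrative, not the product's: a declarative config names the source and target, and the framework dispatches to a built-in reader rather than requiring per-feed code.

```python
import csv
import io

# Hypothetical declarative config: what to ingest and where to land it.
CONFIG = {
    "source": {"type": "csv"},
    "target": {"type": "hdfs", "path": "/raw/customers"},
}

def read_csv(stream):
    """Built-in reader: parse a CSV feed into records."""
    return list(csv.DictReader(stream))

# The framework maps source types to readers; users never write this code.
READERS = {"csv": read_csv}

def ingest(config, stream):
    """Dispatch on configuration alone -- no custom coding per feed."""
    reader = READERS[config["source"]["type"]]
    records = reader(stream)
    # A real framework would now write records to config["target"];
    # this sketch simply returns them.
    return records

feed = io.StringIO("id,name\n1,Ada\n2,Grace\n")
records = ingest(CONFIG, feed)
print(len(records))
```

Adding a new source type in this model means registering one reader function, which is why a configuration-driven design stays extensible without dictating technology choices.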

The BigDecisions Data Transformation workbench enables engineers to integrate, transform and aggregate data from one or more source datasets to target datasets.

  • Graphical designer interface (drag & drop) for data engineers and developers
  • Comprehensive feature set includes:
    • Support for a variety of data feed formats and compression techniques
    • Drag-and-drop interface for various transformation functions and SQL override options
  • Supports load strategies into target datasets
  • Metadata capture, logging and auditing is available out-of-the-box
  • Export and import mappings across environments for code promotion
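To make the idea of a transformation mapping concrete, here is a minimal, hypothetical sketch (not BigDecisions code) of integrating and aggregating a source dataset into a target dataset; in the product, the equivalent logic would be assembled visually in the drag-and-drop designer rather than written by hand.

```python
from collections import defaultdict

# Source dataset: one record per order.
orders = [
    {"order_id": 1, "customer": "Ada", "amount": 120.0},
    {"order_id": 2, "customer": "Ada", "amount": 80.0},
    {"order_id": 3, "customer": "Grace", "amount": 50.0},
]

def aggregate_by_customer(records):
    """One 'mapping': group source records and aggregate into a target dataset."""
    totals = defaultdict(float)
    for r in records:
        totals[r["customer"]] += r["amount"]
    return [{"customer": c, "total": t} for c, t in sorted(totals.items())]

target = aggregate_by_customer(orders)
print(target)
```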

The BigDecisions Data Quality workbench enables self-service data quality governance. It also provides an intuitive user interface for data stewards to define and manage rules that govern the ingestion of source data feeds across relational and big data systems.

  • Profiles incoming data, identifies data quality issues, defines and refines multiple data quality rules.
    • Data Validation Rules: Preprocessing data or metadata-driven validation checks performed either at source-feed or source-attribute level.
    • Business Rules: Business rules-driven checks governing the data integrity and/or referential integrity of the data being ingested.
  • Automates translation of data quality rules into executable code components.
  • Provides enhanced DQ Dashboards with details of success records, error or warning records at Source System and Source Feed level.
  • Support for user-defined (custom) data quality rules.
  • Support for data validation on HDFS datasets, XML files on HDFS and Hive and flat-files hosted on remote servers.
  • Support for validation against predefined reference datasets (for example ISO codes) and XML Files on Hadoop.
  • Bulk upload for data quality rules and enablement of drill-down option in graph.
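As a rough, hypothetical sketch of how attribute-level validation rules and reference-data checks can drive the success/error counts shown on a DQ dashboard (none of the names below come from the product), each rule names an attribute and a check, and the engine partitions incoming records accordingly:

```python
# Hypothetical rule set: attribute-level validation plus a reference-data check.
rules = [
    {"attribute": "age", "check": lambda v: v.isdigit() and 0 < int(v) < 130},
    {"attribute": "country", "check": lambda v: v in {"US", "GB", "IN"}},
]

def validate(records, rules):
    """Partition records into successes and errors, noting which rules failed."""
    ok, errors = [], []
    for rec in records:
        failed = [r["attribute"] for r in rules if not r["check"](rec[r["attribute"]])]
        (errors if failed else ok).append((rec, failed))
    return ok, errors

feed = [
    {"age": "34", "country": "US"},
    {"age": "-1", "country": "ZZ"},
]
ok, errors = validate(feed, rules)
# The success/error counts per feed are what a DQ dashboard would surface.
print(len(ok), len(errors))
```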

Our BigDecisions Metadata workbench plays a vital role in the metadata management lifecycle for enterprises, addressing critical aspects of metadata governance, capture, integration, access and visibility, and maintenance and retirement. It also enables end-to-end data lineage.

  • Enables automated capture and browse functionality for various types of metadata.
  • Support for stewardship and governance: curation and publishing of various types of metadata into a searchable repository.
    • Data Dictionary: Physical and business metadata describing datasets.
    • Business Glossary: Curated definitions of business terms and KPIs.
    • Technical Metadata: Physical metadata for data processing and data consumption.
    • Data Lineage: End-to-end (graphical) lineage for data movement from system of origin to consumption layer(s).
  • Captures operational metadata and runtime statistics for data processing jobs.
  • Associate and link various types of metadata captured.
  • Import various types of metadata using bulk-upload templates.
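End-to-end lineage is commonly modeled as a graph of upstream dependencies. The sketch below is a generic, hypothetical illustration (the dataset names and structure are invented, not drawn from BigDecisions) of walking such a graph from a consumption-layer dataset back to its systems of origin:

```python
# Hypothetical lineage store: each dataset maps to its upstream sources.
lineage = {
    "report.sales_kpi": ["curated.sales"],
    "curated.sales": ["raw.orders", "raw.customers"],
    "raw.orders": [],
    "raw.customers": [],
}

def trace(dataset, graph):
    """Walk upstream edges to recover end-to-end lineage for one dataset."""
    seen = []
    def walk(node):
        for parent in graph.get(node, []):
            if parent not in seen:
                seen.append(parent)
                walk(parent)
    walk(dataset)
    return seen

print(trace("report.sales_kpi", lineage))
# ['curated.sales', 'raw.orders', 'raw.customers']
```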

The BigDecisions Data Privacy workbench enables data stewards and data administrators to set data privacy rules for ethical governance of personally identifiable information (PII) data.

  • A robust module for data stewards to specify PII attributes for ethical governance of data.
  • Self-service module for data stewards to configure PII rules for field-name match.
  • Provides options to encrypt, obfuscate and drop PII data from the data load process.
  • Set default treatments for specific PII attributes.
  • Tightly integrated with data ingestion framework to log PII usage for audits.
  • Prevents potential misuse of PII data ingested into the platform for analytical use.
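The encrypt/obfuscate/drop treatments can be pictured as a per-field rule lookup applied during the load. This is a minimal, hypothetical sketch (field names, rule format and the hash-based stand-in for encryption are all assumptions, not the product's implementation):

```python
import hashlib

# Hypothetical privacy rules: one treatment per PII attribute.
pii_rules = {"email": "obfuscate", "ssn": "drop", "name": "encrypt"}

def apply_privacy(record, rules):
    """Apply the configured treatment to each field before it lands."""
    out = {}
    for field, value in record.items():
        treatment = rules.get(field)
        if treatment == "drop":
            continue  # the value never reaches the platform
        elif treatment == "obfuscate":
            out[field] = value[0] + "***"  # mask all but the first character
        elif treatment == "encrypt":
            # One-way hash as a stand-in for encryption in this sketch.
            out[field] = hashlib.sha256(value.encode()).hexdigest()[:12]
        else:
            out[field] = value  # non-PII fields pass through unchanged
    return out

rec = {"name": "Ada Lovelace", "email": "ada@example.com",
       "ssn": "123-45-6789", "city": "London"}
print(apply_privacy(rec, pii_rules))
```

Because the treatment runs inside the ingestion path, every application of a rule can also be logged at that point, which is what makes the PII usage auditable.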


Please contact us for more information.
Learn more about Cognizant's AI & Analytics solutions.
