Enterprise Information Management

Enterprise Information Management (EIM) addresses the #1 issue most organizations face: the lack of integrated, high-quality and timely data for critical business decisions. As a result, they are unable to achieve the following:

  • Single version of master data ‘truth’ (customer, material, product, vendor)
  • Improved data quality
  • Timely access to critical data
  • Reduced time spent manually integrating data
  • Elimination of spreadsheet sprawl and data silos
  • Effective integration of Big Data sources

To address these issues, HCL provides a comprehensive array of EIM services:

Services

  • Reference Data Management – (a) Alignment and maintenance of key codes and terms used throughout the enterprise (b) Ontology definition, synonym cross-mapping, repository development and deployment of system access methods.
  • Master Data Management (MDM) – Definition, identification and management of master data objects used throughout the enterprise (product, customer, vendor, material, etc.). This includes definition of the MDM architectural style (consolidation, transactional or hybrid), data modeling and the stewardship process.
  • Data Governance – Development of an organizational charter, structure, operating procedures, communication plan, program initiation and management.
  • Data Management Center of Excellence (CoE) – Definition of organizational charter, structure, asset types and rules, and cross-enterprise service agreements.
  • Data Quality – (a) Data quality assessment, profiling and creation of a rules repository (b) Definition of enterprise standardization rules (c) Definition and implementation of a stewardship process (a minimal profiling sketch follows this list).
  • Data Architecture – (a) Definition of an architectural framework and design for the effective management of all types of data including structured, semi-structured and unstructured (b) Definition of standards for modeling, storage and data retention (c) Modeling for each layer of the enterprise architecture.
  • Data Integration (ETL) – (a) Definition of the integration architecture, including the use of ETL, federation and replication technologies (b) Definition of development standards and guidelines for the movement and integration of data (c) Definition of functional and technical specification and mapping standards (d) Development of functional and technical specifications (e) Development, testing and deployment of integration jobs (f) Performance monitoring and support of the integration environment.
  • Metadata Management – (a) Definition and management of data lineage from data source through transformation to report (b) Definition of business names, context and glossaries to drive consistent data use throughout the enterprise.
  • Big Data Integration – (a) Assessment and recommendation for Big Data integration within an existing landscape (b) Architecture and design of the Big Data platform (c) Implementation of the Big Data platform, including a center of excellence and operational best practices.
  • Data Migration & Consolidation – Data matching, transformation and movement to support system retirement and consolidation.
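
As a simple illustration of the profiling and rules work described under Data Quality above, the sketch below computes column-level completeness and distinct-value counts and checks rows against a small rules repository. It is a minimal sketch in Python; the column names, sample records and rules are illustrative assumptions, not part of HCL's tooling.

```python
# A minimal, hypothetical sketch of data-quality profiling and rule checking.
# Column names, sample rows and rules are illustrative assumptions only.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    column: str
    check: Callable[[object], bool]  # returns True when the value passes

def profile(rows: list[dict], columns: list[str]) -> dict:
    """Column-level completeness and distinct-value counts."""
    total = len(rows)
    report = {}
    for col in columns:
        non_null = [r.get(col) for r in rows if r.get(col) not in (None, "")]
        report[col] = {
            "completeness": len(non_null) / total if total else 0.0,
            "distinct_values": len(set(non_null)),
        }
    return report

def apply_rules(rows: list[dict], rules: list[Rule]) -> dict:
    """Count violations per rule, e.g. to feed a stewardship work queue."""
    violations = {rule.name: 0 for rule in rules}
    for row in rows:
        for rule in rules:
            if not rule.check(row.get(rule.column)):
                violations[rule.name] += 1
    return violations

if __name__ == "__main__":
    customers = [
        {"customer_id": "C001", "country": "US", "email": "a@example.com"},
        {"customer_id": "C002", "country": "", "email": "bad-address"},
        {"customer_id": "C003", "country": "DE", "email": None},
    ]
    rules = [
        Rule("country_present", "country", lambda v: bool(v)),
        Rule("email_has_at", "email", lambda v: isinstance(v, str) and "@" in v),
    ]
    print(profile(customers, ["customer_id", "country", "email"]))
    print(apply_rules(customers, rules))
```

Profiling output of this kind typically feeds the assessment and standardization steps named above, sizing the cleansing effort before stewardship begins.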

Service Accelerators

  • DMaaS – Data Management as a Service: a suite of cloud-based services for data integration, data migration, data quality and test data management.
  • ETL Feeds Factory - Lean integration model for lowering ETL costs by using code generation.
  • ETL Component Library
    • Process Director – Real-time ETL performance monitoring
    • Code Red – Automated ETL code evaluation
    • Data Concord – Automated dataset load reconciliation
    • Reference Manager – Reference data repository and management environment
  • MDX – Data quality framework including pre-defined business rules and data models
  • EIM Console – Central console for managing all data assets, including governing data, managing the data supply chain, fixing data quality issues and enabling data integration/replication/migration capabilities
  • Migration Validation of ETL platforms – Automates validation of ETL jobs pre- and post-migration by leveraging metadata and simulating runs
  • Data Quality ABC Framework – Audit, Balance and Control (ABC) process, model and structures for data validation and verification (a minimal sketch follows this list)
  • Data Quality Governance Framework – Process and organizational structure for establishing a governance process
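
To make the Audit, Balance and Control pattern concrete, here is a minimal, hypothetical balance check between a source extract and a target load: it compares row counts and an order-independent column checksum, then records an audit entry with a control status. The table shapes, column names and audit-entry format are assumptions for illustration, not the accelerator's actual design.

```python
# A minimal, hypothetical Audit/Balance/Control (ABC) balance check.
# Table shapes, column names and the audit-entry format are assumptions.
import hashlib
from datetime import datetime, timezone

def column_checksum(rows: list[dict], column: str) -> str:
    """Order-independent checksum over one column, for balance comparison."""
    digests = sorted(
        hashlib.sha256(str(r.get(column)).encode()).hexdigest() for r in rows
    )
    return hashlib.sha256("".join(digests).encode()).hexdigest()

def balance_check(source: list[dict], target: list[dict], column: str) -> dict:
    """Audit entry recording whether the target load balances against source."""
    entry = {
        "run_at": datetime.now(timezone.utc).isoformat(),
        "source_rows": len(source),
        "target_rows": len(target),
        "checksums_match": column_checksum(source, column)
                           == column_checksum(target, column),
    }
    # Control step: flag the load for stewardship review on any mismatch.
    entry["status"] = (
        "OK" if entry["source_rows"] == entry["target_rows"]
        and entry["checksums_match"] else "INVESTIGATE"
    )
    return entry

if __name__ == "__main__":
    src = [{"amount": 100}, {"amount": 250}]
    tgt = [{"amount": 100}, {"amount": 250}]
    print(balance_check(src, tgt, "amount"))  # status: OK
    tgt.append({"amount": 5})
    print(balance_check(src, tgt, "amount"))  # status: INVESTIGATE
```

The same count-and-checksum comparison also sketches the idea behind automated dataset load reconciliation and pre/post-migration job validation mentioned above.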

Key Partners

  • IBM WebSphere
  • IBM InfoSphere
  • Oracle
  • Informatica
  • Ab Initio
  • Talend
  • Pentaho
  • BusinessObjects Data Services (SAP)
  • Master Data Governance (SAP)
  • TIBCO

HCL Beyond the Contract

What Customers Say

“This week we successfully went live with the domestic rollout of our foreign exchange and money market systems. The initiative encompassed the integration of 120 systems involving more than 350 people across the organization, 12 technical working groups, and about 30 test teams.
The delivery of the system is a tremendous outcome for us as it continues to transform our capability. Thank you for working tirelessly to ensure that the initiative met its goals.”

- General Manager, a multinational bank

“Our sales force recommended a list of implementation/migration partners, including HCL. After a round of interviews, we decided to hire HCL. We were working against a very tight timeline and a lot of unknowns. HCL was able to articulate a migration plan that not only fit our schedule but removed a lot of risks. HCL brought in a team of experts. We are very happy to have worked with HCL and would do it again.”

- Director, Engineering Services, a global information services company