
Data Warehouse Architect Resume


Detroit, MI

SUMMARY:

  • Results-oriented Data Warehousing & Business Intelligence technology specialist with 15 years of dedicated experience in real-time data warehouse implementations across domains such as Healthcare, Retail Banking and Human Resources. Handled implementations across various business verticals for more than 13 years & performed the role of Data Warehouse Architect.
  • 15 years of experience in effectively designing and implementing complex & challenging Data warehousing projects. Specialized in Data Integration concepts & technology.
  • 10+ years of experience in Healthcare, 3+ years of experience in Retail Banking & 1+ year of experience in Human Resources Enterprise Data warehouses.
  • Progressive experience in the DW Technical Architect role on data warehousing projects across industries.
  • 10+ years of extensive experience in the Data Architecture & ETL methodologies using ERWin, Informatica, IBM InfoSphere & custom tools. 2+ years of experience in Business Intelligence Tools (Cognos).
  • 10+ years of experience in the design & development of data quality patterns for data applications.
  • Proven experience and solid understanding of relational databases Oracle, Teradata & SQL Server.
  • Has full life cycle implementation experience with several Data Warehouse tools including industry leading ETL and Reporting.
  • Extensive experience in developing Data Warehouse architecture patterns & models.
  • Expertise in publishing enterprise data standards such as a common techno-business vocabulary, naming conventions, and transformation rules.
  • Very skillful in defining technical standards and guidelines that pertain to data/information use, access and governance (including defining accountabilities in support of data quality mandates).
  • Expert in performance tuning of high-volume, terabyte-scale data loads and aggregations within optimized load windows.
  • Broad knowledge of enterprise warehouse metrics collection; identifies measures that impact performance and provides solutions.
  • Designed and implemented metadata repository and registry and the processes for domain-specific stakeholders (Data Stewards) to maintain their data entities.
  • Rich experience in reviewing corporate data sources for new business data/information and better sources of data feeds.
  • Assisted in the development and implementation of the BI/DW strategy.
  • Led and guided the Data Warehouse (DW) & Extract, Transform and Load (ETL) specialists.
  • Expert in Software Development Life Cycle (SDLC), involved in various phases like Requirements, Analysis/Design, Development and Testing.
  • Experienced Technical Team Lead handling globally distributed teams of 20 to 30 resources. Strong interpersonal, verbal and written communication skills.
  • Rich experience in translating Business requirements to understandable IT systems requirements and Implementing CMM Methods & Procedures across the organization.

TECHNICAL SKILLS:

TECHNOLOGY: Health Care (Claims, Provider); Retail & Investment Banking; Human Resources; Informatica 10.x/9.x/8.x (PowerCenter & PowerExchange) Enterprise, Big Data & Cloud Editions; IBM InfoSphere; Cognos; ERWin; Oracle SQL Data Modeler; Data Vault 2.0; Teradata 12; Oracle 10; Master Data Management; Metadata; Data Quality; Linux & Windows; Machine learning methods using Python

PROFESSIONAL EXPERIENCE:

Confidential

Data Warehouse Architect

Responsibilities:

  • Developed solution architecture to setup acquisition of inbound data from Medicaid systems which provide data related to Eligibility, Enrollment, Claims, Provider.
  • Designed BIDM EDW’s building blocks such as Data acquisition layer, Staging layer, Common staging, Federated ODS, Star Schema, User access & Business Intelligence layer.
  • Built data model blueprints for the agency’s critical Medicaid subject areas such as Claims.
  • Performed PoC to acquire data from systems like MMIS, CBMS, FDB, FinCore, NCPDP. Designed a scalable & efficient CAP (Common Acquisition Pattern).
  • Built high-level Data Integration design & patterns to handle functional areas such as enrollment of members & providers and claims processing (adjudication, adjustments, reversals).
  • Prepared data quality architecture to monitor, register & alert critical business data quality exceptions that the data provider has delivered.
  • Developed Operations support strategy & methods for ongoing EDW data absorption & management for successful business intelligence reporting.
  • Defined data modeling policy and standards which adhere to the common principles.
  • Defined & developed data governance guidelines with stakeholders from State Medicaid Agency.
  • Worked with the Business Analysts team and Customers to develop data domains and logical & physical data models for subject areas such as Enrollment, Provider and Claims using ERWin.
  • Developed metadata layer data model components & interdependencies between business & technical metadata related to ETL.
  • Documented an Interface Control Document (ICD) for each data exchange.
  • Developed an STM template which defines the Source-To-Target mapping with business mappings and transformations, handed off to ETL developers to build the ETL mappings.
  • Developed Star Schema consisting of Medicaid fact & dimension tables and user access layer to integrate business requirements such as PHI data restrictions.
  • Reviewed Data Architecture team’s deliverables such as logical & physical data models, STM, Oracle DDL and implementation plan.
  • Built reusable design patterns such as reconciliation, in-built data quality while loading data from source to destination.
  • Defined & developed ETL coding best practices using Informatica PowerCenter tool sets.
  • Instituted performance tuning methods in the most performance-critical areas, such as Claim header & line loading for claim types such as Professional, Pharmacy, Dental & Vision.
  • Reviewed ETL deliverables (unit test results, data quality dashboard, migration plan, implementation plan).
  • Defined interdependencies related to job scheduling in the PowerCenter scheduling tool during the implementation phase.
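The reusable reconciliation pattern described above can be sketched as a simple source-to-target balancing check. This is a minimal illustrative example, not the actual implementation: the table and column names (`stg_claims`, `dw_claims`, `claim_id`) are hypothetical, and in-memory SQLite stands in for the warehouse database.

```python
# Illustrative source-to-target reconciliation (balancing) sketch.
# Table/column names are hypothetical; a real load would run this
# against the warehouse (e.g. Oracle), not in-memory SQLite.
import sqlite3

def reconcile(conn, source_table, target_table, key_column):
    """Compare row counts and find keys present in source but missing in target."""
    cur = conn.cursor()
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    missing = cur.execute(
        f"SELECT {key_column} FROM {source_table} "
        f"EXCEPT SELECT {key_column} FROM {target_table}"
    ).fetchall()
    return {
        "source_count": src_count,
        "target_count": tgt_count,
        "missing_keys": [row[0] for row in missing],
        "balanced": src_count == tgt_count and not missing,
    }

# Tiny demonstration with stand-in staging and warehouse claim tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_claims (claim_id INTEGER);
    CREATE TABLE dw_claims (claim_id INTEGER);
    INSERT INTO stg_claims VALUES (1), (2), (3);
    INSERT INTO dw_claims VALUES (1), (2);
""")
result = reconcile(conn, "stg_claims", "dw_claims", "claim_id")
```

A production version would typically also register the exception in a data quality dashboard and alert operations, as the bullets above describe.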

Technologies Used: Oracle 12c Exadata, ERWin 9.x, Linux, Informatica 10.x (Enterprise, Big Data & Cloud Edition), Informatica MDM, Informatica Data Quality, Metadata Manager, Cognos, Address Doctor, Rally (CA).

Confidential, Detroit, MI

Data Integration Architect

Responsibilities:

  • Designed the logical and physical data model of the data mart.
  • Developed data strategy and associated policies.
  • Set up a metadata registry and enabled domain-specific stakeholders to maintain their own data elements.
  • Performed data profiling of the Enterprise Data Warehouse, which acts as the data source, to identify and analyze the integral source data components.
  • Designed the data flow architecture that loads the Member Data Mart.
  • Analyzed and baselined the history data requirements & incremental data needs.
  • Assessed time-period requirements: data load, response, fail-over and recovery.
  • Constructed the balancing mechanism between the ETL layers and the target schema.
  • Planned the implementation in terms of SDLC phases to target the business dates and planned out phased deliveries.

Technologies Used: Informatica 9.0, Oracle 10g, UNIX, SQL Assistant 13, SCM, Tivoli

Confidential

Data Integration Architect

Responsibilities:

  • Led the effort to establish, define, maintain and communicate data integration architecture for a large-scale direct sourcing project for the Enterprise Warehouse.
  • Designed & developed the dimensional model for the integrated Enterprise warehousing structure.
  • Worked closely with Subject matter experts and other key stakeholders to construct the ETL architecture, design, its data integration pattern, data flow and operational data elements.
  • Worked closely with solution/BI architects to establish a robust framework for data integration activities such as data abstraction, transformation and distribution - Batch/Intra-day/near real time/Real-time processing.
  • Defined the audit balancing methodologies between each layer and the consolidated balancing rule for source and target.
  • Created a technique to track the data flow by record level in terms of load log keys across the layers.
  • Developed proofs-of-concept and prototypes to help illustrate the typical direct sourcing of Claims data from legacy sources.
  • Designed the data cleansing modules using IDQ to integrate and standardize non-standard source data.
  • Directed the data modeling team on the data design including slowly changing dimensions.
  • Responsible for recommending DB normalization, partitioning, aggregation and conversion solutions.
  • Accountable for the solutions proposed on the data sourcing and feeds, ensuring the standards integrated across the board are followed.
  • Collaborated with the client and internal infrastructure team to translate the solution architecture into deployment architecture.
  • Worked with the team of architects, project managers, tech leads, modelers & developers.
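The slowly changing dimension design direction mentioned above can be illustrated with a Type 2 update, where a changed attribute expires the current row and opens a new version. This is a hedged sketch only: the field names (`member_id`, `plan`, `eff_date`, `end_date`, `current`) are hypothetical, and a real load would be set-based SQL or ETL-tool logic rather than row-by-row Python.

```python
# Minimal Type 2 slowly changing dimension sketch (illustrative only).
from datetime import date

def apply_scd2(dimension, incoming, load_date):
    """Expire the current row and append a new version when an attribute changes.

    dimension: list of dicts with keys member_id, plan, eff_date, end_date, current
    incoming:  dict with keys member_id, plan
    """
    for row in dimension:
        if row["member_id"] == incoming["member_id"] and row["current"]:
            if row["plan"] == incoming["plan"]:
                return dimension          # no change: keep current version
            row["end_date"] = load_date   # expire the old version
            row["current"] = False
            break
    dimension.append({
        "member_id": incoming["member_id"],
        "plan": incoming["plan"],
        "eff_date": load_date,
        "end_date": None,
        "current": True,
    })
    return dimension

# A member changes plans: the HMO row is closed and a PPO row is opened.
dim = [{"member_id": 42, "plan": "HMO", "eff_date": date(2011, 1, 1),
        "end_date": None, "current": True}]
dim = apply_scd2(dim, {"member_id": 42, "plan": "PPO"}, date(2012, 6, 1))
```

Keeping the expired version with its effective/end dates is what preserves history for point-in-time reporting.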

Technologies Used: Informatica 9.0 (PowerCenter & PowerExchange), Teradata 12, UNIX, SQL Assistant 13, ClearCase, WLM, IBM Optim

Confidential

ETL Architect & Technical Lead

Responsibilities:

  • Analyze the existing Lab Data Integration solution & Oracle pilot project to arrive at the final design for Consumer Integration. Build a prototype in the Teradata environment & showcase the percentage of consumers derived from the higher volume of members.
  • Design the ETL process for the identified functional requirements.
  • Prepare Technical Design Document & ETL Specifications. Participate in the Design review meetings.
  • Build the ETL process flow, review the code with Client Tech Lead, Unit Test & promote the code into the higher environments for testing team.
  • Load data into various test environments based on testing team requirements.
  • Work with the implementation team on the objects promotion and execution.

Technologies Used: Teradata 12, Informatica 9.0, UNIX, SQL Assistant 13, ClearCase

Confidential

ETL Lead

Responsibilities:

  • Gather requirements from the customer. Participate & contribute in the functional and technical specification phase.
  • Design and develop the ETL maps for loading the data mart tables. Unit testing of the ETL maps.
  • Translate the business requirements into technical specifications.
  • Support the data load activities on ETL on other applications.
  • Analyze the business related changes that need to be accommodated in the maps.
  • Validate the data loaded into the data mart tables against the source system.
  • Analyze the data quality issues during load. Perform gap analysis and support the Informatica 7.1 to 8.1 upgrade (object migration).
  • Convert the UNIX scripts into ETL maps.
  • Coordinate with the offshore team for work allocation and review, ensuring quality delivery to the customer. Manage a team with a head count of 4.

Technologies Used: Informatica Power Center 7.1/8.1, UNIX Scripts, Oracle database, Toad SQL utility tool, Putty, Mercury Quality Center, CVS/Perforce/Visual source safe

Confidential

Developer Lead

Responsibilities:

  • Support month-end activities (ETL and Reporting) in data transformation from the PeopleSoft ERP data source and validation against the source. Support the redesign of data marts.
  • Perform impact analysis of ETL maps during the PeopleSoft upgrade and database migrations.
  • Design and develop Business Intelligence reports through various COGNOS tools.
  • Refresh the reporting cubes for each subject area after the data load and validate the cubes for correctness.
  • Part of a team with a 15-member headcount.

Technologies Used: Informatica Power Mart 6.1, COGNOS Toolset EP Series 7 (Powerplay, Transformer, Impromptu, Upfront) and PeopleSoft EPM 8.8.
