ETL DataStage Lead Resume
Virginia
SUMMARY
- 10 years of experience in Enterprise Data Warehouse (EDW) implementation as an ETL and BI/DW Data Architect.
- 4+ years of experience working as an ETL Data Architect, building data warehouses and operational systems for very large corporations.
- Extensive experience building and maintaining data warehouse systems using IBM DataStage v8.5/v8.1.x/v7.5 (Administrator, Director, Manager, Designer).
- Extensively worked on building DataStage jobs to process and merge data using stages such as Transformer, Modify, Funnel, Join, Merge, Lookup, and Pivot.
- Experience in creating DDL scripts for Operational Data Store and ETL Source to Target mapping documents. Hands on experience in dimensional modeling - Star schema, Snowflake schema.
- Very good understanding of DataStage Parallel framework, Partition and pipeline parallelism techniques and APT Configuration file.
- Experienced in troubleshooting DataStage jobs, enhancing jobs, and addressing production issues such as performance tuning and enhancement.
- Extensively worked on Oracle 9i, Teradata 12 and 13.10, and IBM DB2 UDB 8.0/7.0.
- Experience in designing Job Batches and Job Sequences for scheduling Server and Parallel jobs using DataStage Director, UNIX scripts.
- Worked on defining dependency and scheduling DataStage jobs using enterprise-scheduling tools such as Autosys.
- Expertise in writing Teradata SQL, Oracle SQL, and stored procedures, and in performance tuning of Teradata queries.
- Worked with Teradata load utilities such as MultiLoad, FastLoad, BTEQ, and FastExport.
- Designed relational and dimensional models for OLTP and OLAP systems. Experience creating physical data models and managing metadata using the data modeling tools CA Erwin and ER Studio.
- Drove two Hadoop projects, seven data warehousing projects, and one compliance project for leading banks in the United States of America and Canada.
PROFESSIONAL EXPERIENCE
Confidential, Virginia
ETL DataStage Lead
Responsibilities:
- Played the role of Data Architect, working directly with business partners on data design and sourcing to leverage existing BI capabilities and scale applications for future advanced analytics.
- Independently designed end-to-end ETL jobs to read all Position and Account data from all regions of VW Global and populate the warehouse.
- Developed ETL jobs for the DataMart and wrote reconciliation scripts from the Infomart to the DataMart.
- Enhanced existing jobs per change requests (CRs).
- Wrote Oracle functions and triggers used across the application for reconciliation of customer data such as telephone numbers.
- Developed ETL outbound extracts for downstream systems such as the sandbox, and developed Autosys scripts for the outbound extracts.
- Provided production support for the jobs and scripts we developed.
- Supported the SIT/QA environments: resolved data-related issues, fixed defects raised by the QA team, migrated code between the SIT/QA/UAT environments, and prepared release notes for production.
- Worked with BAs to gain business knowledge and pin down the work that needed to be done.
- Performed basic validations on the source systems, accomplished intermediate tasks using Unix shell scripting, tested artifacts, and developed parallel jobs.
- Designed, developed, and implemented a metadata-driven framework with multi-time-zone source integration, staging, homogenization, error and audit processing, SCD-1, SCD-2, surrogate key generation, and variable-frequency loading with handling for late-arriving dimensions.
- Performed source system analysis, data analysis, analysis of integration between modules, and analysis of the business meaning of source entities and relationships.
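The basic source validations via Unix shell scripting mentioned above can be sketched roughly as follows. This is an illustrative example only, not the actual project code; the feed path, delimiter, and expected field count are hypothetical.

```shell
# Sketch of a pre-load validation for a pipe-delimited source feed.
# All names below (file path, field count) are hypothetical.
FEED=/tmp/positions_feed.dat
EXPECTED_FIELDS=4

# Create a tiny sample feed so this sketch is self-contained.
printf 'acct|region|position|amount\nA1|EU|LONG|100\nA2|NA|SHORT|250\n' > "$FEED"

# Reject an empty file outright.
[ -s "$FEED" ] || { echo "ERROR: empty feed"; exit 1; }

# Count records whose field count deviates from the expected layout.
BAD=$(awk -F'|' -v n="$EXPECTED_FIELDS" 'NF != n' "$FEED" | wc -l)
if [ "$BAD" -eq 0 ]; then
    echo "VALIDATION OK"
else
    echo "VALIDATION FAILED: $BAD malformed lines"
    exit 1
fi
```

In practice a script like this would run ahead of the DataStage parallel jobs so that malformed feeds are rejected before they reach staging.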
Confidential
Senior ETL DataStage Consultant
Responsibilities:
- Interviewed stakeholders to understand the business requirements and scope of the projects, directing data teams toward future-ready architecture designs.
- Identified repeated data transformations and processes to narrow the process streams, such as batch processing and message processing.
- Conceptualized and designed the solution framework for the use case and defined KPIs for measuring ROI, using event data analysis to keep a check on business metrics, including the performance of the marketing team.
- Responsible for migrating EDW applications from the old system to the new system, while incorporating minor enhancements to the applications.
- Analyzed existing applications: their high-level design; the UNIX, database, and ETL components to be converted; and downstream and upstream application dependencies.
- Identified potential dependency issues and suggested a road map for migrating applications that minimized dependency conflicts and workarounds.
- Tuned Teradata queries as part of the migration and resolved Teradata formatting issues surfaced by data comparison.
- Documented findings and mentored the team in code conversion and testing, while also participating in both.
- Converted application code, including UNIX and Autosys script changes to point to the new system and environment; provided post-production support and stabilization.
- Coordinated with various teams to perform and validate production parallel runs between the old and new applications, followed by sign-off.
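The script-repointing step in a migration like the one above can be sketched as a simple search-and-replace with a safety check. This is an illustrative example only; the host names and file path are hypothetical.

```shell
# Sketch: repoint environment references in a migrated job script
# from the old host to the new one. Host names and path are hypothetical.
OLD_HOST=etl-old01
NEW_HOST=etl-new01
SCRIPT=/tmp/load_positions.sh

# Create a sample script that still references the old host.
printf '#!/bin/sh\nscp extract.dat %s:/data/in/\n' "$OLD_HOST" > "$SCRIPT"

# Back up, then rewrite every reference.
cp "$SCRIPT" "$SCRIPT.bak"
sed "s/$OLD_HOST/$NEW_HOST/g" "$SCRIPT.bak" > "$SCRIPT"

# Confirm no stale references remain before the parallel-run phase.
if grep -q "$OLD_HOST" "$SCRIPT"; then
    echo "MIGRATION INCOMPLETE"
else
    echo "REPOINTED TO $NEW_HOST"
fi
```

The backup copy and the final grep check matter more than the sed itself: a missed reference is exactly the kind of dependency conflict the parallel runs are meant to catch.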
Confidential, Cincinnati, Ohio
Big Data Architect/Supervisor, Analyst & Designer
Responsibilities:
- Implemented Account Based Costing (ABC) techniques and surfaced actions decision makers could take to increase throughput or reduce spending, converting the savings into increased profits.
- Oversaw a team of 35 and ensured timely delivery with the highest software quality. Acted as a backbone for the bank's baseline team and identified tasks that could be automated.
- Developed and managed functional architecture design requirements with the vendor where required, and flagged data quality inconsistencies in the application architecture along with methods to improve them.
- Created Data Flow Diagrams (DFDs) and recommended data models suited to the environment, with the overall technology landscape in scope.
- Identified 970 applications through data lineage analysis for FSLDM migration.
- Estimated projects (Hadoop and data warehouse), prepared Requests for Proposal (RFPs) and SoWs, and brainstormed architectural design recommendations.
- Created requirement documents for bringing data from various sources into the Operational and MOSAIC Data Warehouse models; provisioned OLTP and OLAP data models.
- Created logical and physical data models in ER Studio for the operational and data warehouse systems, had the designs reviewed and approved by the Data Architect Manager, then published and distributed the changed/new models to stakeholders and teams.
- Streamlined the design process to produce better designs and introduced additional processes to raise design standards.
- Interacted with the reporting and ETL teams to walk them through the design so all teams were up to speed on the project.
- Created DDL scripts for changes to the Teradata database, worked with the DBA to create the tables, validated the results, and supported production implementation.
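DDL preparation of the kind described above is often scripted from a column specification. The sketch below is illustrative only; the table name, columns, and file paths are hypothetical, and real Teradata DDL would carry many more options.

```shell
# Sketch: generate a Teradata-style CREATE TABLE DDL from a simple
# "name type" column spec. Table name, columns, paths are hypothetical.
SPEC=/tmp/customer_cols.txt
DDL=/tmp/customer_ddl.sql

cat > "$SPEC" <<'EOF'
cust_id INTEGER
cust_name VARCHAR(100)
open_dt DATE
EOF

{
    echo "CREATE MULTISET TABLE edw.customer ("
    # Join "name type" pairs into comma-separated column definitions.
    awk '{ printf "%s    %s %s", (NR == 1 ? "" : ",\n"), $1, $2 } END { print "" }' "$SPEC"
    echo ") PRIMARY INDEX (cust_id);"
} > "$DDL"

cat "$DDL"
```

Generating the DDL from a spec keeps the script under version control and makes the DBA review diff-friendly when a change request touches many tables.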
Confidential
Solution Architect/Subject Matter Expert/Business
Responsibilities:
- Defined, developed, and implemented all related changes to policies, processes, governance, systems, data, and reporting in support of meeting the BCBS 239 principles (RDARR - Compliance).
- Developed and managed the product development plan, managed delivery to the plan, and identified risks in data replication strategies.
- The project involved loading network data used by the network operations team, around 200 billion rows every day; also involved in database migration (Migration CoE).
- Analyzed and designed client requirements from release to release, actively participated in tuning long-running queries, built the physical model using the ERwin tool, and was involved in sizing the database.
- Interacted with all upstream and downstream teams to gather the right information and provide accurate, efficient solutions.
- Interacted with the reporting and ETL teams to walk them through the design so all teams were up to speed on the project.
- Maintained dynamic and static partitions on all environments.
- Led, designed, developed, implemented, and acted as SME providing end-to-end support for the Creditor Insurance Critical Illness program.
- Solid experience with scheduling (Zeke/Autosys/crontab) and troubleshooting production issues across all components of IBM DataStage, including code migration across Dev and Test environments.
Confidential
Software Engineer
Responsibilities:
- Designed, built, and tested parallel jobs in DataStage and QualityStage.
- Performed basic validations on the source systems and accomplished intermediate tasks using Unix shell scripting.
- Designed, developed, and implemented various approaches to a metadata-driven ETL framework with multi-time-zone source integration, staging, homogenization, error and audit processing, SCD-1, SCD-2, surrogate key generation, and variable-frequency loading with handling for late-arriving dimensions.
- Solid experience with scheduling (Zeke/Autosys) and troubleshooting production issues across all components of IBM DataStage, including code migration across Dev, Test, and Prod environments.
- Tools & Technologies: Oracle 9i SQL, DataStage 8.1, Autosys scheduling of parallel jobs.