
Lead ETL Developer (DataStage | Informatica | Business Objects) Resume


Tampa, FL

SUMMARY

  • Scaled Agile/Agile/SDLC: Over 12 years of IT experience in Data Warehouse application development using ETL and OLAP tools, spanning business requirements analysis, application design, data modeling, data profiling, preparation of functional/technical specifications and test cases, coding, testing, implementation of Data Warehouse/ODS/Data Mart solutions, project documentation, and problem analysis and resolution in the Banking, Finance, Business Intelligence, and Insurance (Property & Casualty) domains
  • ETL Informatica: Strong experience in Informatica PowerCenter 9.x/8.x/7.x, PowerExchange, OLAP, and OLTP. Expert-level mastery in designing and developing complex mappings to extract data from diverse sources including flat files, RDBMS tables, and XML files, using the Informatica Designer components (Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, Transformation Developer) along with Workflow Manager and Workflow Monitor. Worked on Informatica Cloud to create Source/Target SFDC connections and to monitor and synchronize data in Salesforce
  • ETL DataStage: Experience in designing and developing jobs using IBM InfoSphere DataStage v9.1 (DataStage Designer, DataStage Director, and DataStage Administrator), including Environment Variables, Parameter Sets, and Shared and Local Containers
  • Big Data: Strong knowledge of Apache Hadoop - Distributed File System (HDFS) and MapReduce architecture, Partitioners, Combiners, and Comparators. Designed and developed jobs to load and read Hive tables. Hands-on experience with Pig, Flume, and Sqoop
  • Data Analysis: Experience in data design and analysis, user/business requirement gathering and analysis, data cleansing, data transformations, data relationships, source systems analysis, and reporting analysis
  • Data Modeling: Experienced in Dimensional Data Modeling using DB Designer and Oracle Designer: OLTP, OLAP, Star Schema/Snowflake modeling, fact and dimension tables, and logical and physical data modeling. Created Slowly Changing Dimension (SCD) Type 1/2/3 mappings
  • Methodologies: Thorough knowledge of Data Warehouse methodologies (Dimensional Modeling - Ralph Kimball, Inmon), ODS (E-R modeling), EDW, and metadata repositories. Skilled in conversion and migration of data from legacy systems to Oracle 10g/DB2 UDB/Netezza/SQL Server using Informatica and DataStage processes
  • RDBMS: Extensive work experience with Netezza, DB2 UDB LUW, Oracle 10g/9i/8i, SQL Server 2012, SQL*Plus, and Teradata
  • Scripting: Developed batch jobs using UNIX shell scripts (csh, bash, ksh) to automate dynamic parameter preparation, trigger Informatica workflows and DataStage jobs, and push and pull data between servers (a minimal sketch follows this list)
  • On-site: Worked at client sites on Business Analyst engagement, business application user engagement, business requirement analysis, design, preparation of functional specifications, onshore-offshore coordination, UAT, project delivery, implementation, and post-implementation support
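
A minimal ksh sketch of the batch-driver pattern described above. All service, domain, folder, project, job, and path names here are hypothetical placeholders, not taken from any actual project:

    #!/bin/ksh
    # Nightly driver (illustrative): build a run-date parameter file,
    # then trigger an Informatica workflow and a DataStage job.
    RUN_DATE=$(date +%Y%m%d)
    PARAM_FILE=/opt/etl/params/wf_daily_load_${RUN_DATE}.parm

    # Generate the Informatica session parameter file dynamically for this run
    echo "[FINANCE.WF:wf_daily_load.ST:s_m_daily_load]"       >  "$PARAM_FILE"
    echo "\$\$RUN_DATE=$RUN_DATE"                             >> "$PARAM_FILE"
    echo "\$InputFile_src=/data/inbound/txn_${RUN_DATE}.dat"  >> "$PARAM_FILE"

    # Start the workflow and wait for completion (pmcmd returns non-zero on failure)
    pmcmd startworkflow -sv INT_SVC -d DOM_PROD -u "$INFA_USER" -p "$INFA_PWD" \
        -f FINANCE -paramfile "$PARAM_FILE" -wait wf_daily_load || exit 1

    # Run a DataStage job with a runtime parameter and report its status
    dsjob -run -jobstatus -param pRunDate="$RUN_DATE" DW_PROJ jb_load_dim_customer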

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter v9.6.1/9.5/9.1/8.x/7.x, Informatica PowerExchange 8.x/7.x (Repository Manager, Designer, Server Manager, Workflow Manager, Workflow Monitor); IBM InfoSphere DataStage and QualityStage v9.1 (Designer, Director, Administrator & Monitor); Informatica Cloud (Monitor & Synchronize)

RDBMS: Netezza, DB2 UDB LUW, Oracle, MS SQL Server 2012, Teradata

Big Data: Apache Hadoop, HDFS, MapReduce, Hive, Pig, Flume and Sqoop

Data Modeling: Dimensional Data Modeling (Star Schema, Snowflake, Facts, Dimensions), Physical and Logical Data Modeling, Entities, Attributes, ER Diagrams, DB Designer, MS Visio

Scripting: UNIX Shell Scripts (csh, bash, ksh)

Programming: COBOL, C, C++, SQL, JCL.

Tools: TOAD, SQuirreL SQL Client, SQL*Plus, SQL*Loader

Environment: UNIX, z/OS, Linux, Windows 98/2000/XP/NT/7

Scheduling: IBM Tivoli Workload Scheduler (TWS), Control-M, CA-7

Reporting Tools: SAP Business Objects, SAS, Easytrieve

PROFESSIONAL EXPERIENCE

Confidential, Tampa, FL

Lead ETL Developer (DataStage | Informatica | Business Objects)

Responsibilities:

  • Performed Project Setup in DataStage Designer: created the required database connections to Netezza, SQL Server, and DB2 UDB, created Environment Variables and Parameter Sets to enable the project team to develop and test jobs, and provided instructions and guidance
  • Performed Project Setup on the Linux server using the ETL Framework and set up the project scripting environment
  • Set up reusable utilities such as generic extractors
  • Designed and developed reusable loaders using Sequencer jobs with Runtime Column Propagation (RCP) that can load data from a DataStage Dataset into any type of database target, with added Truncate-and-Reload functionality and Pre/Post SQL support
  • Designed and developed Parallel jobs using Funnel, Aggregator, Lookup (including Range Lookup), Transformer (with stage variables), Sort, Remove Duplicates, and Filter stages, along with Shared Containers, Parameter Sets, and Environment Variables
  • Designed and developed Informatica mappings that used cross-database joins (within a single appliance), synonyms, and views to perform Pushdown Optimization (PDO); performed source, target, and full pushdown so that the entire logic ran in the database server without bringing any data into the Informatica server
  • Created basic WebI reports in SAP Business Objects, including input controls with drop-downs at different granularity levels, promoted them to QA and Production, and handed them over to the business team to manage; the source tables for these reports are loaded using the ETL tools (Informatica/DataStage)
  • Involved in Requirements Acceleration Center to gather requirements.
  • Analyzed requirements and produced high-level and detailed designs
  • Preparation of Analysis & Design (A&D) document, Unit Test Plan/Results (UTP/R)
  • Designed and modeled database tables as needed for the project, had them created with appropriate security privileges, and promoted them to test and production through the change management process
  • Designed and developed jobs to load and read Hadoop Hive tables, debugged and analyzed job logs, wrote HiveQL to create and read Hive tables, and scheduled and monitored Hive jobs (see the sketch after this list)
  • Involved in all activities of the Scaled Agile methodology, including increment planning, iteration planning, iteration and increment retrospectives, and features, stories, and tasks throughout the project
  • Executed and scheduled workflows using the Informatica Cloud tool to load data to and from SFDC (Source/Target)
  • Created new Informatica Mappings, Workflows, Parameter Files, and UNIX Shell Scripts to invoke workflows; worked on Mapping Parameters and Session Parameters; used the Debugger and breakpoints to view transformation output and debug mappings
  • Developed several complex Mappings, Mapplets, and Reusable Transformations to facilitate one-time, weekly, monthly, and daily loading of data
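
A short illustrative sketch of the Hive load/read pattern from the responsibilities above; the schema, table, and HDFS path names are hypothetical:

    #!/bin/ksh
    # Create an external Hive table over files landed by the ETL jobs,
    # then read it back for a simple row-count check.
    hive -e "CREATE EXTERNAL TABLE IF NOT EXISTS stg.policy_txn (
                 policy_id STRING,
                 txn_dt    STRING,
                 txn_amt   DECIMAL(18,2))
             ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
             STORED AS TEXTFILE
             LOCATION '/data/stg/policy_txn'"

    hive -e "SELECT COUNT(*) FROM stg.policy_txn"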

Environment: IBM InfoSphere DataStage and QualityStage v9.1, Informatica PowerCenter v9.6, Informatica Cloud, Netezza, MS SQL Server, UNIX, SQuirreL, Control-M, SAP Business Objects, and Hadoop Hive

Confidential, Irving, TX

Lead ETL Developer

Responsibilities:

  • Led a team of Informatica-DB2 application developers and assisted on projects
  • Worked alongside team members when demand and/or pending deadlines required it
  • Business rules and field mappings from the source file to the various header/detail tables were provided in the technical specification, along with a mapping document from internal business
  • Created new Informatica Mappings, Workflows, Parameter Files, and UNIX Shell Scripts to invoke workflows
  • Designed mappings with transformations such as Aggregator, Filter, Joiner, Expression, Lookup (Connected and Unconnected), Update Strategy, Sorter, Normalizer, and Sequence Generator
  • Extracted and transformed data from various sources such as flat files, DB2, and XML files and loaded it into the target DB2 UDB LUW database and flat files (see the DB2 load sketch after this list)
  • Used the Workflow Manager to create Workflows and Tasks and to partition the sessions
  • Prepared test data, performed Unit and Integration Testing, and updated Unit Test Results (UTR) accordingly
  • Developed several complex Mappings, Reusable Sessions, and UDFs to facilitate one-time, weekly, monthly, and daily loading of data
  • Ensured code review (Informatica Mappings, Workflows, Parameter Files, and UNIX Shell Scripts) and review of test results
  • Supported IT infrastructure upgrades and migrations
  • Used database and Informatica experience to reduce maintenance costs and improve performance
  • Standardized coding techniques
  • Developed and maintained strong relationships with on-site, off-site, and off-shore resources to meet development and implementation deadlines
  • Interacted well with other IT groups and internal clients
  • Participated in training, orientation, and growth of team members
  • Documented new and existing Informatica claim processes
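
A minimal sketch of the flat-file-to-DB2 load pattern referenced above, using the DB2 command line processor; the database, schema, and file names are hypothetical:

    #!/bin/ksh
    # Connect, bulk-load a pipe-delimited extract, and disconnect.
    db2 connect to CLAIMSDB user "$DB2_USER" using "$DB2_PWD" || exit 1

    # LOAD is the fast path for large delimited files; NONRECOVERABLE
    # skips load logging (the table should be backed up afterwards)
    db2 "LOAD FROM /data/inbound/claim_detail.del OF DEL
         MODIFIED BY coldel| INSERT INTO CLMS.CLAIM_DETAIL
         NONRECOVERABLE"

    db2 connect reset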

Environment: Informatica PowerCenter v9.5/9.1/8.6.1, DB2 UDB LUW, UNIX, TOAD, IBM Tivoli Workload Scheduler (TWS), CA-SCM

Confidential, TX

Senior ETL Developer

Responsibilities:

  • Performed impact analysis, created gap analysis documents identifying the gaps between the AS-IS and TO-BE systems (i.e., Legacy and Enterprise Auto/Property SDS/ADS), and proposed unavailable fields for addition to the new Financial Data Model
  • Created AS-IS and TO-BE source-to-target sheets with corresponding business rules for each target field
  • Prepared the Analysis & Design (A&D) document and Unit Test Plan (UTP)
  • Created new Informatica Mappings, Workflows, Parameter Files, and UNIX Shell Scripts to invoke workflows
  • Designed mappings with transformations such as Aggregator, Filter, Joiner, Expression, Lookup (Connected and Unconnected), Update Strategy, Sorter, Normalizer, and Sequence Generator
  • Worked on Mapping Parameters and Session Parameters
  • Extracted and transformed data from various sources such as flat files, DB2, and XML files and loaded it into the target SQL Server data warehouse and flat files
  • Used the Workflow Manager to create Workflows and Tasks and to partition the sessions
  • Used the Debugger and breakpoints to view transformation output and debug mappings
  • Prepared test data, performed Unit and Integration Testing, and updated Unit Test Results (UTR) accordingly
  • Ensured code review (Informatica Mappings, Workflows, Parameter Files, and UNIX Shell Scripts) and review of test results
  • Migrated new Informatica Mappings, Workflows, Parameter Files, and UNIX Shell Scripts from the Development environment to the Testing environment (see the pmrep sketch after this list) and assisted the System Testing team with System/Release Testing
  • Developed several complex Mappings, Mapplets, and Reusable Transformations to facilitate one-time, weekly, monthly, and daily loading of data
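
A sketch of the Development-to-Test migration using the pmrep command line; the repository, domain, folder, and object names are hypothetical, and the import control file is assumed to already exist:

    #!/bin/ksh
    # Export a workflow with its dependent objects from the DEV repository
    pmrep connect -r REPO_DEV -d DOM_DEV -n "$INFA_USER" -x "$INFA_PWD"
    pmrep objectexport -n wf_policy_load -o workflow -f FINANCE \
        -m -s -b -r -u /tmp/wf_policy_load.xml

    # Import it into the TEST repository via a prepared control file
    pmrep connect -r REPO_TEST -d DOM_TEST -n "$INFA_USER" -x "$INFA_PWD"
    pmrep objectimport -i /tmp/wf_policy_load.xml -c /opt/etl/impcntl_test.xml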

Environment: Informatica PowerCenter v9.1/8.6.1, IBM DB2 UDB, UNIX, SQuirreL, Control-M, and Mainframes

Confidential, TX

Senior ETL Developer

Responsibilities:

  • Prepared impact analysis, Analysis & Design (A&D), and Unit Test Plan (UTP) documents
  • Developed a detailed ETL plan for loading the staging area from the DB2 database and then the Data Warehouse from the staging area
  • The ETL plan included populating the dimension and fact tables, sequencing the loads to maintain referential integrity, and data cleansing
  • Worked on the Designer tools: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer
  • Developed Informatica Mappings, Sessions, and Workflows using transformations (reusable and standalone)
  • Modified the existing ETL mappings to load data from the core to the target database and data warehouse, and from Staging to the Reporting databases
  • Used Slowly Changing Dimensions Type I and Type II for some of the mappings (see the SCD Type 2 sketch after this list)
  • Used the Update Strategy expressions DD_INSERT, DD_UPDATE, DD_DELETE, and DD_REJECT to insert, update, delete, and reject rows based on the requirements
  • Performed Unit and Integration Testing and updated Unit Test Results (UTR) accordingly
  • Ensured code review and review of test results
  • Developed Teradata database objects and wrote SQL scripts according to business rules
  • Used the Workflow Manager and Workflow Monitor to run and monitor jobs
  • Provided induction training to new joiners
  • Mentored and monitored the project team on both technical and domain/business topics
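
A simplified SQL sketch of the SCD Type 2 pattern mentioned above, run through the DB2 command line processor; the schema, table, and column names are hypothetical, and only one tracked attribute is shown:

    #!/bin/ksh
    db2 connect to DWHDB user "$DB2_USER" using "$DB2_PWD" || exit 1

    # Step 1: expire the current dimension row when a tracked attribute changed
    db2 "UPDATE DWH.DIM_CUSTOMER d
         SET eff_end_dt = CURRENT DATE, curr_flag = 'N'
         WHERE curr_flag = 'Y'
           AND EXISTS (SELECT 1 FROM STG.CUSTOMER s
                       WHERE s.cust_id = d.cust_id
                         AND s.addr_line1 <> d.addr_line1)"

    # Step 2: insert a new current version; because step 1 already expired
    # changed rows, this picks up both changed and brand-new customers
    db2 "INSERT INTO DWH.DIM_CUSTOMER
             (cust_id, addr_line1, eff_start_dt, eff_end_dt, curr_flag)
         SELECT s.cust_id, s.addr_line1, CURRENT DATE, DATE('9999-12-31'), 'Y'
         FROM STG.CUSTOMER s
         LEFT JOIN DWH.DIM_CUSTOMER d
                ON d.cust_id = s.cust_id AND d.curr_flag = 'Y'
         WHERE d.cust_id IS NULL"

    db2 connect reset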

Environment: Informatica PowerCenter 8.6.1, IBM DB2 UDB, UNIX, SQuirreL, Teradata utilities, Control-M, Mainframes

Confidential, CA

ETL Developer

Responsibilities:

  • Thoroughly analyzed both the existing Credit Card Services and Retail Services Data Warehouse systems and created the functional specifications for a single combined system called CRS
  • Created the System Architecture Design (SAD), which involved analyzing source-to-target mappings for all fields in the new Data Warehouse under each subject area: CS and RS business rules for common fields were integrated, the source fields for CS and RS data were identified, CS-only and RS-only fields were added to the mapping sheet, and the ETL mapping document was produced
  • Created Technical Specifications, test strategies, test cases, and test data
  • Designed and developed complex Aggregator, Joiner, Router, Lookup (Connected and Unconnected), Filter, Normalizer (for COBOL source files from the Mainframe system), and Update Strategy transformation rules (business rules)
  • Partitioned sessions for concurrent loading of data into the partitioned target tables
  • Designed and developed the end-to-end ETL process from various source systems to the staging area (SSOT) and from staging to the Data Warehouse
  • Performed unit and integration testing of the designed mappings, Workflows, and UNIX shell scripts
  • Wrote SQL overrides in Source Qualifier and Lookup transformations according to business requirements (see the parameter-file sketch after this list)
  • Implemented the Slowly Changing Dimensions methodology and developed mappings to keep track of historical data
  • Supported the Testing team during System Integration Testing (Phase 1 and Phase 2) and User Acceptance Testing in different test regions
  • Designed mappings and workflows to build a historical database covering November 2006 through September 2010
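
An illustrative sketch tying a shell-generated parameter file to the Source Qualifier override mentioned above; the folder, workflow, connection, and parameter names are hypothetical. The override in the mapping would reference the mapping parameter, e.g. WHERE post_dt > TO_DATE('$$LAST_EXTRACT_DT','YYYY-MM-DD'):

    #!/bin/ksh
    # Read the last successful extract date from a state file
    LAST_EXTRACT_DT=$(cat /opt/etl/state/crs_last_extract.dat)
    PARM=/opt/etl/params/wf_crs_extract.parm

    # Write the session parameter file: $$LAST_EXTRACT_DT feeds the Source
    # Qualifier SQL override, $DBConnection_src the relational connection
    echo "[CRS.WF:wf_crs_extract.ST:s_m_crs_extract]" >  "$PARM"
    echo "\$\$LAST_EXTRACT_DT=$LAST_EXTRACT_DT"       >> "$PARM"
    echo "\$DBConnection_src=ORA_CRS_PROD"            >> "$PARM"

    pmcmd startworkflow -sv INT_SVC -d DOM_PROD -u "$INFA_USER" -p "$INFA_PWD" \
        -f CRS -paramfile "$PARM" -wait wf_crs_extract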

Environment: Informatica PowerCenter v8.6/7.1, Oracle 9i, UNIX, TOAD

Confidential, IL

Mainframe Developer

Responsibilities:

  • Estimated and performed feasibility analysis on every service request, and accordingly negotiated and prioritized client requests in discussion with the Onsite Coordinator.
  • Performed end-to-end project/task requirement analysis in consultation with clients and business users, working closely with them to finalize all requirements.
  • Played a major role in project design and in the preparation of project design documents such as Functional Specifications.
  • Prepared Technical Specifications and test cases.
  • Coded new programs and modified existing ones in COBOL-DB2, JCL, and Easytrieve.
  • Performed Unit and System Integration Testing.
  • Ensured code review and review of test results.
  • Delivered projects by walking Clients through the project components.
  • Implementation: set up all configuration items and schedule documents, performed scheduling, moved the components to production, ran the required one-time audits, and ensured smooth production execution of the project.
  • Provided post-implementation production support for every service request.
  • Provided induction training to new joiners.

Environment: IBM Mainframes z/OS, JCL, COBOL, DB2, Easytrieve, Changeman, File-Aid, File-Aid DB2, VSAM, Expeditor, CA-7, VALITY, CODE-1, SYNCSORT, IDCAMS

Confidential

Mainframe Developer

Responsibilities:

  • Requirement analysis
  • Preparation of Functional & Technical Specifications.
  • Coding of Online COBOL-CICS-DB2 Programs
  • Create Unit Test Cases and Implementation Strategies.
  • Ensure code review and review of test results
  • Unit and Integration Testing in Development CICS Region

Environment: IBM Mainframes z/OS, COBOL, DB2, Changeman, File-Aid, File-Aid DB2, VSAM, Expeditor, CA-7, VALITY, CODE-1, SYNCSORT, IDCAMS
