
ETL Development Lead Resume


SUMMARY:

  • Over 8 years of experience in Informatica PowerCenter and Mainframe development.
  • 6+ years of strong ETL experience using Informatica PowerCenter (Mapping Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Transformation Developer, Mapplet Designer, and Repository Manager).
  • Good exposure to SQL optimization and performance tuning using Explain Plan.
  • Thorough understanding of the Software Development Life Cycle (SDLC), including requirements analysis, system analysis, design, development, documentation, implementation, and post-implementation review.
  • 2+ years of experience as a Mainframe Developer working with COBOL/DB2/JCL/REXX.
  • Strong in SQL, T-SQL, PL/SQL, SQL*Loader, SQL*Plus, MS SQL, and Pro*C.
  • Extensive knowledge of PowerCenter components such as PowerCenter Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Thorough knowledge of creating ETL processes to load data from different data sources, and a good understanding of Informatica installation.
  • Experienced in developing mappings and transformations using PowerCenter Designer and executing them through Workflow Manager from multiple sources to multiple targets.
  • Knowledge in Full Life Cycle development of Data Warehousing with experience in ensuring quality and compliance to coding standards.
  • Good experience with Data Cleansing, Data Analysis, Data Profiling and necessary Test plans to ensure successful execution of the data loading processes
  • Used IBM DataStage data quality tools to assist with data cleansing for migration.
  • Hands on Experience using Push Down Optimization (PDO).
  • Strong at developing complex mappings with custom transformations and XML sources.
  • Experience in developing Stored Procedures, Functions, Views and Triggers, SQL queries using SQL Server and Oracle PL/SQL.
  • Extensive development, support and maintenance experience working in all phases of the ETL Life Cycle.
  • Involved in Testing, Test Plan Preparation and Process Improvement for the ETL developments with good exposure to development, testing, debugging, implementation, documentation, user & production support.
  • Worked directly with non-IT business analysts throughout the development cycle and provided production support for ETL.
  • Involved in understanding client Requirements, Analysis of Functional specification, Technical specification preparation and Review of Technical Specification.
  • Well versed with Datastage 8.1 ETL Tool.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
  • Experience with industry Software development methodologies like Waterfall, Agile within the software development life cycle
  • Excellent communication skills and ability to work effectively and efficiently in teams and individually.
  • Identified bugs and enhanced existing mappings by analyzing data flow, evaluating transformations, and fixing defects so they conform to business needs; redesigned existing mappings to improve performance.
  • Used DataStage Manager to implement the Import and Export Interfaces.

TECHNICAL SKILLS:

  • Informatica PowerCenter 9.x/8.x, Oracle 10g, SSRS, SQL Server, DB2, MS Access 2000, SQL, PL/SQL, Transact-SQL, UNIX Shell Scripting, Java, DOS
  • Windows 98/2000/XP/Vista/7/8, UNIX, TOAD for Oracle/Data Analysts, HP Quality Center, HP Service Manager, DataStage, COBOL/JCL/REXX

PROFESSIONAL EXPERIENCE:

Confidential

ETL Development Lead

Responsibilities:

  • Lead the offshore team and coordinated its activities with the onshore team.
  • Involved in requirements gathering and data modeling; designed technical, functional, and ETL design documents.
  • Implemented mappings to extract data from various sources (VSAM files, mainframe DB2, Oracle, XML files) and load it into Oracle and Teradata using Teradata utilities such as MLoad, TPump, and FastExport.
  • Designed and implemented slowly changing dimension mappings to maintain history.
  • Used PowerExchange to capture the changes in the source data.
  • Used the Teradata FastLoad utility for bulk loading, and the TPump and MLoad utilities for smaller and larger data volumes respectively.
  • Implemented ETL Balancing Process to compare and balance data directly from source and warehouse tables for reconciliation purposes.
  • Worked with XML sources and targets.
  • Created and worked with generic stored procedures for tasks such as truncating data from stage tables, inserting records into the control table, and generating parameter files.
  • Provided architecture oversight to various project teams; held a key leadership role in the architecture redesign, data governance, security, new development, and major enhancements.
  • As release architect, managed various release activities, including environment planning, release planning and dependencies, project dependencies, and architecture reviews.
  • Designed and developed Staging and Error tables to identify and isolate duplicates and unusable data from source systems.
  • Simplified the development and maintenance of ETL by creating Mapplets, Re-usable Transformations to prevent redundancy.
  • Wrote and implemented generic UNIX and FTP scripts for tasks such as running workflows, archiving files, executing SQL commands and procedures, and moving inbound/outbound files.
  • Designed and executed test scripts to validate end-to-end business scenarios.
  • Used session partitions, dynamic cache memory, and index cache to improve the performance of ETL jobs.
  • Resolved complex technical and functional issues/bugs identified during implementation, testing and post production phases.
  • Identified & documented data integration issues and other data quality issues like duplicate data, non-conformed data, and unclean data.
  • Assisted team members in functional and Integration testing.
  • Automated and scheduled Workflows, UNIX scripts and other jobs for the daily, weekly, monthly data loads using Autosys Scheduler.
  • Created standard reports using Business Objects features like Combined Queries, Drill Down, Drill Up, Cross Tab, Master Detail etc.
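The ETL balancing process above compared source and warehouse row counts for reconciliation. A minimal sketch of such a check, assuming the source and target counts have already been extracted (e.g. via sqlplus or bteq) and passed in as arguments; all names and counts here are hypothetical:

```shell
#!/bin/sh
# Balancing check sketch: compare a source row count against the
# warehouse row count and report whether the load reconciles.
balance_check() {
    src_count=$1
    tgt_count=$2
    if [ "$src_count" -eq "$tgt_count" ]; then
        echo "BALANCED: source=$src_count target=$tgt_count"
    else
        echo "OUT OF BALANCE: source=$src_count target=$tgt_count"
        return 1
    fi
}

# Example usage with illustrative counts:
balance_check 1042 1042
```

In practice the non-zero return code would fail the wrapping workflow so the mismatch is caught before downstream loads run.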

Environment: Informatica PowerCenter 9.1/8.6, Oracle 11g/10g, PL/SQL, Teradata 13.1, Autosys, TOAD 9.x, Tableau, Oracle Financials, Shell Scripting, Oracle SQL*Loader, OBIEE, SSIS, Sun Solaris UNIX, Windows XP.

Confidential

ETL Developer

Responsibilities:

  • Interacted with business users and analysts to create functional specifications.
  • Reviewed and created designs required to support all reporting needs.
  • Obtained signoff on report and dashboard templates.
  • Optimized SSRS reports through aggressive data scoping, judicious use of aggregate tables and materialized views, and caching techniques.
  • Created SQL Server reports, handled sub-reports, and defined queries to generate drill-down and drill-through reports using SSRS 2005/2008.
  • Created SSIS packages to export and import data from CSV files, text files, and Excel spreadsheets.
  • Used IBM WebSphere DataStage for data quality; set up rules and best practices and managed project resources.
  • Developed reports and intelligent dashboards for the Global Sales team.
  • Performed performance tuning of slow-running reports and designed performance-enhancing structures on the database.
  • Created Teradata external loader connections such as MLoad Upsert/Update and FastLoad while loading data into target tables in the Teradata database.
  • Built POCs for implementing Informatica scheduling on the UC4 job automation tool and for making ETL loads flexible and restartable.
  • Coordinated with the database administration team to ensure database changes were executed correctly in staging and production instances before loads could start.
  • Used nested stored procedures with complex control flow logic to feed SSRS reports.
  • Reverse-engineered the data model using Erwin.
  • Involved in creating staging tables/mappings and setting the trust level (HDD) in Siperian for the Customer Master system.
  • Used the metadata of the Informatica repository tables.
  • Modeled the OBIEE RPD to use Informatica repository tables to generate reports on ETL loads.
  • Developed metrics to track the current status of loads, current and historic performance, and throughput, to identify long-running jobs, and to view performance trends for individual Informatica sessions.
  • Created agents using OBIEE Delivers to send emails on ETL load failures and for long-running jobs.
  • Used statistical functions like regression to view the performance trends.
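The long-running-job check above can be sketched as a small shell helper, assuming session runtimes have already been exported from the Informatica repository (e.g. from the REP_SESS_LOG view) into a delimited file; the file layout, session names, and threshold below are hypothetical:

```shell
#!/bin/sh
# Flag sessions whose runtime exceeds a threshold (in seconds).
# Input format (hypothetical): session_name|runtime_seconds
flag_long_running() {
    threshold=$1
    file=$2
    awk -F'|' -v t="$threshold" \
        '$2 + 0 > t + 0 { print $1 " ran " $2 "s (threshold " t "s)" }' "$file"
}

# Example usage with a sample extract:
cat > /tmp/sess_runtimes.txt <<'EOF'
s_m_load_customers|420
s_m_load_orders|1980
s_m_load_products|75
EOF
flag_long_running 900 /tmp/sess_runtimes.txt
```

The flagged list can then feed an alerting agent or a daily load-status report.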

Environment: Oracle Business Intelligence Suite (OBIEE) 11.1.6/11.1.5, Informatica PowerCenter 9.5.1, Informatica Multidomain MDM 9.1, SSIS, SSRS, Informatica B2B DX/DT v8.0, Oracle 10g/11g, BI Publisher, TOAD, Teradata v12, SQL, SQL Server 2000/2005, RedGate, Windows 7

Confidential

ETL Developer

Responsibilities:

  • Analyzed source data coming from different systems and worked with business users and developers to develop the model.
  • Involved in dimensional modeling to design and develop the star schema, using Erwin to design fact and dimension tables.
  • Extracted, Transformed and Loaded OLTP data into the Staging area and Data Warehouse using Informatica mappings and complex transformations (Aggregator, Joiner, Lookup (Connected & Unconnected), Source Qualifier, Filter, Update Strategy, Stored Procedure, Router, and Expression).
  • Developed a number of complex Informatica mappings, mapplets, and reusable transformations for different types of tests in customer information and for monthly and yearly data loads.
  • Used Workflow Manager for workflow and session management, database connection management, and scheduling jobs to run in batch.
  • Fixed invalid mappings, tested stored procedures and functions, and performed unit and integration testing of Informatica sessions, batches, and target data.
  • Extracted data from various sources like IMS Data Flat Files and Oracle.
  • Extensively Used Environment SQL commands in workflows prior to extracting the data in the ETL tool.
  • Wrote complex SQL scripts to avoid Informatica joiners and Look-ups to improve the performance, as the volume of the data was heavy.
  • Created Sessions, reusable Worklets and Batches in Workflow Manager and Scheduled the batches and sessions at specified frequency.
  • Used SQL * Loader for Bulk loading.
  • Created Stored Procedures for data transformation purpose.
  • Monitored the sessions using Workflow Monitor.
  • Used stored procedures to create a standard Time dimension, drop and create indexes before and after loading data into the targets.
  • Worked with Teradata BTEQ utilities for ETL development and design.
  • Created customized MLoad scripts on the UNIX platform for Teradata loads.
  • Removed bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.
  • Captured data error records corrected and loaded into target system.
  • Created Mappings, Mapplets and Transformations, which remove any duplicate records in source.
  • Implemented efficient and effective performance tuning procedures and performed benchmarking; these sessions set a baseline against which improvements were measured.
  • Tuned source and target systems based on performance details; once source and target were optimized, sessions were rerun to determine the impact of the changes.
  • Used Calculations, Variables, Break points, Drill down, Slice and Dice and Alerts for creating Business Objects reports.
  • Configured workflows with an Email task to send mail with the session log on session failure and for target failed rows.
  • Used Server Manager to create schedules, monitor sessions, and send error messages to the concerned personnel in case of process failures.
  • Designed and developed the logic for handling slowly changing dimension table loads by flagging records using the Update Strategy transformation.
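The SQL*Loader bulk loading mentioned above is driven by a control file. A minimal illustrative sketch, in which the data file, staging table, and columns are all hypothetical:

```
LOAD DATA
INFILE 'customers.dat'
APPEND
INTO TABLE stg_customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  cust_id,
  cust_name,
  load_dt  SYSDATE
)
```

A control file like this would typically be invoked as `sqlldr userid=user/pass control=customers.ctl`, with direct-path loading enabled for large volumes.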

Environment: Informatica 9.1, Oracle 9i, Teradata V2R6, UNIX shell scripting, SQL Server 2008, SQL, PL/SQL

Confidential

Mainframe Developer

Responsibilities:

  • Understood and captured requirements.
  • Extracted business rules, entity-relationship models, process models, and cross-references in the application.
  • Involved in the preparation of high level/low level design documents.
  • Involved in the Analysis, Development and testing phases of the project.
  • Involved in the construction of new COBOL and DB2 programs as per specifications and standards provided by Citi Cards.
  • Involved in quality checks against program standards using the ASA tool for technical issues during testing.
  • Created a dedicated region for testing.
  • Converted VSAM files to DB2.
  • Involved in unit testing and User Acceptance Testing (UAT), ensuring completeness of the testing.
  • Involved in the Resolution of UAT issues.
  • Modeled the UAT regions.
  • Ran a series of programs in a chain to simulate testing as it takes place in the production environment.
  • Adhered to IQMS (Integrated Quality Management System) by following quality procedures such as IQA (Internal Quality Assurance), EQA (External Quality Assurance), and FI (Final Inspection) for all deliverables.
  • Adhered to IPMS by entering efforts and logging defects.
  • Adhered to SPEED by achieving goals periodically.

Environment: IBM Mainframe, OS/390, JCL, COBOL, SQL, DB2, VSAM, File-AID, SPUFI, ChangeMan
