Sr. ETL Developer Resume
Raleigh, NC
SUMMARY
- Over 8 years of extensive experience in Information Technology with special emphasis on Data Warehousing and ETL processes using Talend & Informatica Power Center 9.x/8.x/7.x/6.x.
- Extensive knowledge of all phases of the SDLC (Requirements, Analysis, Design, Development, Testing, Deployment and Maintenance).
- Excellent knowledge of OLTP and OLAP systems and of developing database schemas - Star and Snowflake schemas (Dimensions and Facts).
- Excellent experience in Extraction, Transformation and Loading of data from heterogeneous systems into the warehouse and downstream applications using ETL tools - Informatica Power Center 9.x/8.x/7.x, Talend Open Studio and Talend Integration Suite.
- Expertise in developing complex mappings using various transformations like Unconnected and Connected lookups, Router, Filter, Expression, Rank, Aggregator, Joiner, Union, Update Strategy and Stored procedure.
- Hands-on experience in several key areas of Enterprise Data Warehousing such as Change Data Capture (CDC) and Slowly Changing Dimensions (SCD Type I and Type II); a simplified SQL sketch of the SCD Type II pattern follows this summary.
- Excellent experience working on Informatica Data Masking and Data Validation Option (DVO).
- Strong experience in performance tuning, debugging and error handling of mappings, sources and targets for better performance; identified and fixed bottlenecks.
- Extensive experience in creating reusable Mapplets and Worklets, and in using Event Wait, Decision, Email and Command tasks in Workflows.
- Extensively worked on integrating various data sources - Oracle, SQL Server, DB2, Teradata, XML files, mainframe data and flat files.
- Strong knowledge of writing efficient and complex SQL queries in Oracle 11g/10g/9i and PL/SQL for stored procedures, triggers, indexes, cursors and functions.
- Solid understanding of and work experience with Teradata BTEQ scripts and the T-SQL tool for querying and retrieving data.
- Hands-on experience with Teradata utilities like MLOAD, FLOAD, TPUMP, BTEQ and FASTEXPORT.
- Excellent knowledge of various test stages with expertise in System, Integration and User Acceptance Testing along with Unit Testing.
- Experienced with Control-M and crontab scheduling tools.
- Used data modeling tools such as Visio and Erwin to design database models.
- Hands-on experience with UNIX shell scripts for Informatica pre- and post-session operations.
- Experienced with Production Support and Peregrine services.
- Ability to work well within a team and with IT staff, and to provide necessary support to executive staff and clients.
- Highly skilled in time, task and data management, with good communication skills.
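A minimal SQL sketch of the SCD Type II pattern referenced above, assuming a hypothetical DIM_CUSTOMER dimension (with EFF_DATE, END_DATE and CURRENT_FLAG columns) and an STG_CUSTOMER staging table; in practice this logic was built with Informatica Lookup and Update Strategy transformations rather than hand-written SQL.

```sql
-- Expire the current version of any customer whose tracked attributes changed
UPDATE dim_customer d
   SET d.end_date     = SYSDATE,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.customer_name <> d.customer_name OR s.city <> d.city));

-- Insert a new current version for changed and brand-new customers
INSERT INTO dim_customer
       (customer_key, customer_id, customer_name, city, eff_date, end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.city, SYSDATE, NULL, 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
```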
TECHNICAL SKILLS
ETL Tools: Informatica Power Center 9.x/8.x/7.x, Talend Open Studio, Talend Integration Suite 5.x/4.x
Databases: Oracle 12c/11g/10g/9i, SQL Server 2008, IBM DB2, Teradata, MS Access, MS Excel.
RDBMS Tools: Toad, SQL Developer, SQL*Loader, SQL*Plus, Teradata SQL Assistant
Programming Skills: SQL, PL/SQL, UNIX shell scripting.
Data Modeling Tools: MS Visio, Erwin
Operating Systems: UNIX (Solaris, AIX), Windows 98/2000/2007/XP/NT.
PROFESSIONAL EXPERIENCE
Confidential, Minneapolis, MN
ETL Developer
Responsibilities:
- Participated in all phases of the development life cycle, with extensive involvement in definition and design meetings and functional and technical walkthroughs.
- Developed Technical Design documents (LLD & HLD) for ETL strategy and Oracle dump approaches for both Initial and Incremental Loads.
- Designed and built ETL mappings and workflows to load data from different sources - Oracle, Sybase, SQL Server, and fixed-width as well as delimited flat files.
- Implemented Variables and Parameters in the mappings.
- Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
- Created Implicit, local and global Context variables in the job.
- Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.
- Worked on Context variables and defined contexts for database connections and file paths, allowing jobs to be migrated easily across environments in a project.
- Worked on various Talend components such as tMap, tFilterRow, tAggregateRow, tFileExist, tFileCopy, tFileList, tDie etc.
- Developed a stored procedure to automate the testing process, easing QA efforts and reducing test timelines for data comparison across 500 tables (see the SQL sketch at the end of this role).
- Worked with data analysts to analyze data and define an extraction strategy for processing catalog updates.
- Wrote UNIX shell scripts to SFTP and archive files, as well as pre-validation scripts.
- Documented mapping specifications, unit test plans, test results and deployment plans.
- Analyzed change requests, performed impact analysis and estimated the level of effort.
- Worked on advanced concepts like concurrent running of workflows, session-level partitioning techniques and pushdown optimization.
- Automated SFTP process by exchanging SSH keys between UNIX servers.
- Scheduled workflows for daily and monthly loads through Control-M using pmcmd and UNIX shell scripts.
- Involved in production deployment activities and creation of the deployment guide for migration of the code to production; also prepared production run books.
- Coordinated offshore and onsite teams efficiently for multiple projects.
Environment: Informatica Power Center 9.0 & 9.1, Oracle 11g, SQL Developer, SQL Navigator, Toad 10.6, PL/SQL, SQL Server, Talend, Sybase, Clipper DB, Progress DB, Control-M, Windows XP, UNIX, PuTTY & WinSCP.
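A simplified PL/SQL sketch of the kind of data-comparison helper behind the QA stored procedure mentioned above; the table-pair parameters and the QA_COMPARE_LOG table are assumptions for illustration, and the production procedure additionally looped over a driver list of roughly 500 table pairs.

```sql
-- Hypothetical helper: count rows that differ between a source and target table
-- and record the result in a QA log table (qa_compare_log is assumed to exist).
CREATE OR REPLACE PROCEDURE compare_tables (p_src IN VARCHAR2, p_tgt IN VARCHAR2) IS
  v_diff_rows NUMBER;
BEGIN
  EXECUTE IMMEDIATE
    'SELECT COUNT(*) FROM ( '
    || '(SELECT * FROM ' || p_src || ' MINUS SELECT * FROM ' || p_tgt || ') '
    || 'UNION ALL '
    || '(SELECT * FROM ' || p_tgt || ' MINUS SELECT * FROM ' || p_src || ') )'
    INTO v_diff_rows;

  INSERT INTO qa_compare_log (src_table, tgt_table, diff_rows, run_ts)
  VALUES (p_src, p_tgt, v_diff_rows, SYSTIMESTAMP);
  COMMIT;
END compare_tables;
/
```

A small driver block can then call the procedure for each source/target pair listed in a control table.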
Confidential, Conway, AR
Sr. ETL Developer
Responsibilities:
- Worked closely with Business Analysts to review the business specifications of the project and to gather the ETL requirements.
- Developed complex Talend ETL jobs to migrate the data from flat files to database.
- Implemented custom error handling in Talend jobs and also worked on different methods of logging.
- Designed and coded ETL/Talend jobs to process data into Confidential databases.
- Created Talend jobs to load data into various Oracle tables; utilized Oracle stored procedures and wrote small pieces of Java code to capture globalMap variables and use them in the jobs.
- Involved in preparing data model for the new tables in an existing subject area.
- Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Rank, Sequence Generator and Update Strategy for data integration.
- Created reusable mappings, Worklets and Workflows using various tasks like Email, Event-Wait, Event-Raise, Timer, Scheduler, Control, Decision and Session.
- Successfully loaded data from various source systems like Oracle, DB2, flat files and XML files into the staging tables and then into the Confidential database.
- Implemented Change Data Capture (CDC) logic for high-volume tables (see the SQL sketch at the end of this role).
- Created pass-through partitions for better data loads.
- Redesigned ETL mappings to improve performance and data quality, and used the Debugger with breakpoints across instances to verify the accuracy of data.
- Worked with Informatica Administrators to set up project folders in development, test and production environments, to migrate objects from version 8.6.1 to 9.0, and to get the connection strings created.
- Troubleshot long-running sessions and fixed the issues.
- Developed UNIX shell scripts using the pmcmd and pmrep utilities and scheduled ETL loads to automate pre-session and post-session processes.
- Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.
- Performed unit testing and system testing to validate data loads into the Confidential database.
- Used Informatica version control to check in all versions of the objects.
- Scheduled Informatica jobs through Control-M.
Environment: Informatica Power Center 9.0 & 8.6.1, Oracle 11g, Talend, SQL Developer, PL/SQL, DB2, Control-M, Windows XP, UNIX, PuTTY & WinSCP.
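A minimal illustration of the watermark-style CDC extract used for high-volume tables; the ETL_CONTROL table, SRC_ORDERS source and column names are assumptions for the sketch, and in Power Center the equivalent filter typically lives in the Source Qualifier SQL override driven by mapping variables.

```sql
-- Pull only the rows changed since the last successful load
SELECT o.order_id,
       o.customer_id,
       o.order_amount,
       o.last_update_ts
  FROM src_orders o
 WHERE o.last_update_ts > (SELECT c.last_load_ts
                             FROM etl_control c
                            WHERE c.table_name = 'SRC_ORDERS');

-- After the session succeeds, advance the watermark for the next run
UPDATE etl_control
   SET last_load_ts = SYSTIMESTAMP
 WHERE table_name = 'SRC_ORDERS';
```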
Confidential, Raleigh, NC
Sr. ETL Developer
Responsibilities:
- Understood business needs and implemented the same into a functional data warehouse design.
- Involved in creating Logical and Physical design of databases using Visio.
- Actively involved in requirements gathering and complete life cycle of the project.
- Developed Logical and physical database designs for the OLAP systems.
- Worked on building the data warehouse using the Bill Inmon and Ralph Kimball methodologies.
- Extensively worked on Informatica client tools - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Worklet Designer and Task Developer.
- Developed new and maintained existing Informatica mappings and workflows based on specifications.
- Created Mapplets, reusable transformations and Worklets, and used them in different mappings and workflows.
- Performed the truncate-and-load process using a Stored Procedure transformation and a load control table (see the SQL sketch at the end of this role).
- Worked extensively on Slowly Changing Dimensions, i.e. Type 1 & Type 2.
- Performed incremental aggregation to load incremental data into aggregate tables.
- Created labels and queries to migrate Informatica objects between repositories.
- Worked on Teradata BTEQ scripts and the T-SQL tool for querying and retrieving data.
- Worked with MLOAD, FLOAD & FastExport scripts for inserting data into and exporting data from the Teradata database.
- Worked on production issues like bug fixing, bottlenecks, data validation and report errors.
- Performed performance tuning of sources, targets, mappings and sessions by identifying bottlenecks, and used the Debugger to debug complex mappings and fix them.
- Prepared test Scenarios and Test cases in HP Quality Center and involved in unit testing of mappings, system testing and user acceptance testing.
- Scheduled Informatica jobs through crontab in UNIX as per the business requirements.
- Worked with the reporting team to help them understand the user requirements for the reports and the measures on them.
Environment: Informatica Power Center 8.6.1, Oracle 10g, Teradata, SQL*Plus, PL/SQL, Toad, UNIX, Visio, HP Quality Center, crontab.
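A rough Teradata SQL sketch of the truncate-and-load step driven by a load control table; the STG_SALES_DAILY, LND_SALES_DAILY and LOAD_CONTROL names are hypothetical, and in the mappings this was invoked through a Stored Procedure transformation rather than run by hand.

```sql
-- Empty the staging table before the full reload (Teradata fast-path delete)
DELETE FROM stg_sales_daily ALL;

-- Reload the staging table from the landing layer
INSERT INTO stg_sales_daily (sale_id, store_id, sale_dt, sale_amt)
SELECT sale_id, store_id, sale_dt, sale_amt
  FROM lnd_sales_daily;

-- Record the run in the load control table
UPDATE load_control
   SET last_load_ts = CURRENT_TIMESTAMP,
       load_status  = 'COMPLETE'
 WHERE table_name = 'STG_SALES_DAILY';
```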
Confidential, Cary, NC
ETL Developer
Responsibilities:
- Created dimension tables and fact tables based on the warehouse design.
- Participated in building the data warehouse, which included design of the data mart using a Star Schema.
- Loaded datamart tables using OLTP transactional sales data.
- Utilized the FASTLOAD, MLOAD, TPUMP and TPT utilities for Teradata to load data into the landing, core and semantic layers.
- Involved in the extraction, transformation and loading of data from source flat files and RDBMS tables to Confidential tables.
- Utilized pushdown optimization feature in Informatica with Teradata to improve performance and load efficiency.
- Extracted data from flat files, an Oracle database and Teradata, and applied business logic to load it into the central Teradata database.
- Used Pipeline Partitioning feature in the sessions to reduce the load time.
- Scheduled workflows in Control M to pull data from the source databases at weekly intervals.
- Wrote SQL joins and subqueries and performed tracing and performance tuning for faster-running queries in SQL Server (see the SQL sketch at the end of this role).
- Used various performance enhancement techniques to improve the performance of the sessions and workflows.
- Performance tuning on sources, targets and mappings.
- Worked extensively with complex mappings using Expression, Aggregator, Filter, Lookup and Stored Procedure transformations to develop and feed the data warehouse.
- Created reusable transformations and Mapplets and used them in mappings.
- Responsible for monitoring all the sessions that are running, scheduled, completed and failed. Debugged the mapping of the failed session.
- Worked on DB2 views, triggers, stored procedures and SQL transactions.
- Used shell scripts to automate pre-session and post-session processes.
- Performed data manipulation using basic functions and Informatica Transformations.
- Involved in unit testing, Integration testing and User acceptance testing of the mappings.
Environment: Informatica Power Center 8.1.1, Oracle 10g, SQL Server 2008, DB2, SQL Developer, Tivoli, UNIX, Windows 2008.
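An illustrative before/after of the kind of query rewrite applied while tuning SQL Server queries in this role; the dbo.customers and dbo.orders tables are hypothetical stand-ins, and the point is replacing a per-row correlated subquery with a single pre-aggregated join.

```sql
-- Before: correlated subquery re-evaluated for every outer row
SELECT c.customer_id, c.customer_name
  FROM dbo.customers AS c
 WHERE c.last_order_amt < (SELECT AVG(o.order_amt)
                             FROM dbo.orders AS o
                            WHERE o.customer_id = c.customer_id);

-- After: aggregate once, then join on the derived table
SELECT c.customer_id, c.customer_name
  FROM dbo.customers AS c
  JOIN (SELECT customer_id, AVG(order_amt) AS avg_amt
          FROM dbo.orders
         GROUP BY customer_id) AS a
    ON a.customer_id = c.customer_id
 WHERE c.last_order_amt < a.avg_amt;
```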
Confidential, Atlanta, GA
ETL Developer
Responsibilities:
- Responsible for Design and Development of ETL mappings according to the requirement.
- Analyzed the specifications and identified the source data that needed to be moved to the data warehouse.
- Involved in the requirements definition and analysis in support of data warehousing efforts.
- Configured Informatica with different source systems.
- Applied various Transformations on the source data to populate the data to data marts.
- Analyzed source data structures and performed the logical mappings.
- Identified data ownership/responsibility and data sizing, designed the ETL logic, and read the structure of multiple databases and queried the data to ensure uniformity and accuracy.
- Involved in Logical and physical database design using Erwin.
- Worked on dimensional modeling to Design and develop Star Schemas, identifying Fact and Dimension Tables for providing a unified view to ensure consistent decision making.
- Created Informatica mappings with PL/SQL procedures/functions to build business rules for loading data (see the PL/SQL sketch at the end of this role).
- Created UNIX scripts and used them in Informatica Command tasks.
- Developed mappings using Informatica Designer to load data from various source systems to Confidential database.
- Set up Batches and Sessions to Schedule Loads at regular intervals in Informatica.
- Responsible for tuning ETL mappings to optimize load and query Performance.
Environment: Informatica Power Center, Windows 2000, Oracle 9i, PL/SQL, SQL Developer, UNIX shell scripting, Erwin.
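A small PL/SQL sketch of the kind of business-rule function that supported these mappings; the tiering rule, thresholds and names are hypothetical stand-ins for the client-specific rules, which were typically invoked from a Stored Procedure transformation or a Source Qualifier override.

```sql
-- Hypothetical rule: classify an account tier from rolling 12-month revenue
CREATE OR REPLACE FUNCTION f_account_tier (p_revenue IN NUMBER)
  RETURN VARCHAR2
IS
BEGIN
  RETURN CASE
           WHEN p_revenue >= 1000000 THEN 'PLATINUM'
           WHEN p_revenue >=  250000 THEN 'GOLD'
           WHEN p_revenue >=   50000 THEN 'SILVER'
           ELSE 'STANDARD'
         END;
END f_account_tier;
/

-- Example use inside a source query feeding a mapping
SELECT a.account_id,
       f_account_tier(a.rolling_12m_revenue) AS account_tier
  FROM accounts a;
```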