
ETL Developer Resume


Los Angeles, CA

PROFESSIONAL SUMMARY:

  • Highly motivated IT professional with 10 years of experience in the IT industry, including more than 8 years in business requirement analysis, application design, data modeling, development, implementation, and testing of data warehouse and database business systems for the Healthcare, Pharmaceutical, Financial, Insurance, Retail, and Telecom industries
  • 7+ years of experience in the design, development, and support of ETL methodology for data transformations using Informatica PowerCenter 9.x/8.x/7.x/6.x
  • Experienced in Informatica PowerConnect for Mainframe and AS/400 to connect to most major data sources
  • Experienced in PowerConnect for Salesforce, SAP/PeopleSoft/Siebel, and MQ Series
  • Highly Experienced in integrating legacy ERP and other application data into the data warehouse
  • 7+ years of data integration experience covering EDI, ETL, data warehousing, and data conversion: source and target analysis, data profiling, data cleansing, ETL mappings, ETL workflows, and performance tuning
  • Expertise in Informatica/Oracle tuning, with 7+ years of experience converting PL/SQL programs to PowerCenter mappings and sessions
  • Strong experience in designing and developing Business Intelligence solutions in Data Warehouse/Decision Support Systems using ETL tools, Informatica PowerCenter 9.x/8.x/7.x/6.x, OLTP, OLAP.
  • Excellent understanding and experience in implementing Best Practices of Data Warehousing Concepts
  • Involved in the full development life cycle of data warehousing; expertise in enhancements/bug fixes, performance tuning, troubleshooting, impact analysis, and research
  • Extensively worked in Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, and Workflow Monitor to develop complex mappings, Mapplets, reusable transformations, session tasks, and workflows in Informatica PowerCenter.
  • Extensive Experience in Change Data Capture (CDC), Batch Processing, Real Time Integration.
  • Experienced ETL Team Lead and a part of the ETL Production Support team. Led both onsite and offshore teams for development, testing and support.
  • Databases
  • Over 8 years of experience with various data sources, including DB2 (10.x/9.x/8.x), Oracle (11g/10g/9i), Teradata (14/13/12), MS Access, and MS Excel
  • 8 Years of Experience with ORACLE databases, in various capacities of development
  • Experienced in Extraction & Migration of data from heterogeneous data sources like Flat Files, MS Excel, MS Access, MS SQL Server to ORACLE, Teradata
  • Experienced in optimizing and tuning SQL queries and PL/SQL applications using utilities such as EXPLAIN PLAN, TKPROF, and optimizer hints (see the sketch after this list)
  • Extensively worked on PL/SQL Object Types, Collections, Ref Cursors, Pipeline functions, Compound triggers, Materialized Views and Table Partitioning
  • Experienced in using utilities like Toad, SQL loader and PL/SQL developer
  • Over 5 years of experience on Teradata development and OLAP operation with Teradata database
  • Highly experienced in using Teradata utilities such as SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, and Teradata Parallel Transporter (TPT)
  • Experienced in design, enhancement, and development of applications for OLTP, OLAP, using dimension modeling with Teradata and Oracle
  • Experience with integrating Teradata utilities to carry out the ETL jobs
  • Oracle Certified Professional (May 2000)
  • Data Modeling
  • Extensive experience in analyzing dimensional data models, Star Schema/Snowflake models, fact and dimension tables, and physical and logical data models
  • Excellent command of logical and physical entity-relationship data modeling using tools such as ERwin
  • Excellent knowledge in designing data marts using Ralph Kimball's and Bill Inmon's dimensional data mart modeling techniques.
  • Other
  • Experience as a DWH lead for offshore and onshore teams on multiple projects
  • Experience with UNIX shell scripts for, among other things, automatically running and aborting sessions and creating parameter files
  • Good knowledge of various shells and tools such as awk and sed
  • Experience in scheduling and monitoring jobs using AutoSys, Crontab and Tivoli
  • Excellence in individual and team performance
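
As a concrete illustration of the Oracle query-tuning workflow mentioned above, a minimal EXPLAIN PLAN check might look like the following sketch; the table names and predicate are hypothetical.

    -- Capture the optimizer's plan for a candidate query
    EXPLAIN PLAN FOR
      SELECT c.customer_id, SUM(o.order_amt)
        FROM orders o
        JOIN customers c ON c.customer_id = o.customer_id
       WHERE o.order_dt >= DATE '2015-01-01'
       GROUP BY c.customer_id;

    -- Display the captured plan (DBMS_XPLAN is available from Oracle 9i on)
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

From here, tuning typically means checking the plan for unwanted full scans or join orders and steering the optimizer with indexes or hints.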

TECHNICAL SKILLS:

Data Warehousing: Informatica PowerCenter 9.x/8.x/7.x/6.x, Repository Server Administrator Console, PowerExchange

Database & Related: DB2 9/8, Oracle 9i/10g/11g, SQL Server 2000/2005/2008, Teradata 14/13/12

Data Modeling Tools: ERwin 7.x, MS Visio

Operating Systems: AIX, Linux, UNIX (Solaris, FreeBSD), Windows Server 2003, Windows XP/NT

Languages: SQL, PL/SQL, XML, XSD, HTML, C++

Applications: Teradata SQL Assistant, TOAD, SQL*Loader, Microsoft Office, BTEQ, Performance Monitor (PMON), Teradata Viewpoint, FastLoad, MultiLoad, FastExport, TPump, Teradata Parallel Transporter (TPT)

Testing: LoadRunner, QTP, manual testing; excellent knowledge of UAT

PROFESSIONAL EXPERIENCE:

Confidential, Los Angeles, CA

ETL Developer

Responsibilities:

  • Analyzed the requirements and framed the business logic for the ETL process.
  • Involved in the ETL design and its documentation.
  • Profiled source data and performed data cleansing using PowerCenter.
  • Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system
  • Developed mappings in PowerCenter to load data, including facts and dimensions, from various sources into the Data Warehouse, using transformations such as Source Qualifier, Java, Expression, Lookup, Aggregator, Update Strategy, and Joiner.
  • Developed mappings using complex flat files to generate Reports for the downstream systems
  • Developed complex ETL to dynamically build queries based on the rules provided by the end user
  • Developed reusable Mapplets and Transformations.
  • Designed and developed PowerCenter mappings and workflows; identified and removed bottlenecks to improve the performance of mappings and workflows
  • Reviewed existing code and led efforts to tweak and tune the performance of existing Informatica processes
  • Scheduled sessions to extract, transform, and load data into the warehouse database per business requirements.
  • Used Jira to plan and prioritize work within the onsite/offshore team and to track work and tasks under development
  • Wrote stored procedures to calculate metric values for the fact tables and called them from PowerCenter 9.6
  • Used an audit table to track and reconcile loads (see the sketch after this list)
  • Used Control-M for code migration and workflow scheduling
  • Wrote shell scripts to automate workflows and table loads.
  • Developed unit/assembly test cases and UNIX shell scripts to run along with daily/weekly/monthly batches to reduce or eliminate manual testing effort.
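
A minimal sketch of the audit pattern referenced above; the table and column names are hypothetical, and each workflow writes one row per run so that loads can be reconciled afterwards.

    -- Hypothetical audit table populated by a post-load session
    CREATE TABLE etl_audit (
      audit_id     INTEGER      NOT NULL,
      workflow_nm  VARCHAR(100) NOT NULL,
      src_row_cnt  INTEGER,
      tgt_row_cnt  INTEGER,
      load_status  CHAR(1),      -- 'S' = success, 'F' = failure
      load_start   TIMESTAMP,
      load_end     TIMESTAMP
    );

    -- One row inserted after the main load completes
    INSERT INTO etl_audit (audit_id, workflow_nm, src_row_cnt, tgt_row_cnt,
                           load_status, load_start, load_end)
    VALUES (1, 'wf_load_claims_fact', 150000, 150000, 'S',
            CURRENT_TIMESTAMP, CURRENT_TIMESTAMP);

Comparing src_row_cnt with tgt_row_cnt per run gives a quick first check that no rows were dropped between source and target.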

Environment: Informatica PowerCenter 9.6, DB2, SQuirreL SQL, Solaris 10.2, Control-M scheduler

Confidential, Warren, NJ

Lead ETL Developer

Responsibilities:

  • Led a team of 6 developers in designing and implementing Business Intelligence for The Smart Staffing Dashboard
  • Overseeing the ETL Document Design Process
  • Enforcing Agile Software Methodology for code development
  • Scrum Master Role for daily scrums for code development and progress of the project
  • Conducted code reviews and brainstorming sessions with business vendors to ensure correct data
  • Developer Role
  • Developed techniques to calculate each fact and load it into the data warehouse so that the UI application shows real-time data
  • Monitored and maintained real-time data warehouse fact and dimension values at each level (national/region/area/sub-region/store/location etc.)
  • Implemented SCD Type 2 mappings to load the history of each store (see the sketch after this list)
  • Wrote SQL queries to test the data in each table
  • Wrote PL/SQL stored procedure to load missing dimension and fact values.
  • Modified existing data models to analyze real-time data from retail stores across the nation using a Snowflake Schema
  • Wrote PL/SQL code to capture and load real-time data from stores in various time zones
  • Extensively worked on Performance Tuning of Session and ETL logic.
  • Analyzed session logs to understand and rectify failures and bottlenecks as part of performance tuning
  • Developed code to detect and process late arriving Sales Transaction facts & Dimensions
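
A minimal SQL sketch of the SCD Type 2 pattern referenced above. The Informatica mapping implements the same logic with Lookup and Update Strategy transformations; STORE_DIM, STORE_STG, and their columns are hypothetical.

    -- Step 1: close the current version of any store whose tracked
    -- attributes changed in the staged feed
    UPDATE store_dim d
       SET d.eff_end_dt = SYSDATE,
           d.current_flg = 'N'
     WHERE d.current_flg = 'Y'
       AND EXISTS (SELECT 1
                     FROM store_stg s
                    WHERE s.store_id = d.store_id
                      AND (s.region_nm <> d.region_nm
                           OR s.manager_nm <> d.manager_nm));

    -- Step 2: insert a new current version for changed and brand-new stores
    INSERT INTO store_dim (store_key, store_id, region_nm, manager_nm,
                           eff_start_dt, eff_end_dt, current_flg)
    SELECT store_dim_seq.NEXTVAL, s.store_id, s.region_nm, s.manager_nm,
           SYSDATE, DATE '9999-12-31', 'Y'
      FROM store_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM store_dim d
                        WHERE d.store_id = s.store_id
                          AND d.current_flg = 'Y'
                          AND d.region_nm = s.region_nm
                          AND d.manager_nm = s.manager_nm);

Unchanged stores match an open current row and are skipped; changed and brand-new stores get a fresh row with an open-ended effective date.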

Environment: Informatica Power Center 9.5, Oracle 11g, Toad, Solaris 10.2, Windows 7, Windows 2008 Server R2

Confidential, Basking Ridge, NJ

Lead ETL Data Integration to Data Warehouse Specialist

Responsibilities:

  • Implemented code changes to source international and domestic data from the Salesforce application
  • Generated ABAP programs and saved the mappings.
  • Created an upgrade plan to migrate all internal applications populating the data marts and data warehouse to a new version.
  • Implemented the upgrade plan, including testing of all production mappings, sessions, and workflows
  • Modified mappings, sessions, and workflows as needed to get them to work as designed in the upgraded environment
  • Created test plans to test mappings
  • Solved unique challenges requiring database comparison and data-issue analysis (see the sketch after this list).
  • Extensively worked on Performance Tuning of Session and ETL logic.
  • Worked on Informatica and Database partitions to optimize the performance
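
A minimal sketch of the kind of reconciliation query used to validate that a table carried over identically into the upgraded environment; the schemas, table, and database link are hypothetical.

    -- Rows present before the upgrade but missing (or changed) afterwards
    SELECT order_id, order_dt, order_amt
      FROM dw_old.sales_fact@pre_upgrade_db
    MINUS
    SELECT order_id, order_dt, order_amt
      FROM dw_new.sales_fact;

    -- Row counts per load date as a fast coarse check
    SELECT load_dt, COUNT(*) FROM dw_old.sales_fact@pre_upgrade_db GROUP BY load_dt
    MINUS
    SELECT load_dt, COUNT(*) FROM dw_new.sales_fact GROUP BY load_dt;

Running the MINUS in both directions (old-minus-new and new-minus-old) and getting empty results indicates the upgraded load matches production.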

Environment: Oracle 11g, Salesforce connector for Informatica, Informatica PowerCenter 9.1, Cognos, Toad, Solaris 10.1, Windows 7

Confidential, Hoboken, NJ

ETL Developer

Responsibilities:

  • Discussed the Business Plans and Canned reports with the business analysts
  • Worked very closely with clients and vendors to understand and implement requirements
  • Coordinated offshore and onshore meetings to design and develop ETL specs for the requirements
  • Conducted meetings with the clients at various stages of the project and coordinated with the off-shore team on a daily basis
  • Created a complete Logical and Physical Data Model under a Star Schema by Analyzing Dimensions and Measures from the reports.
  • Extensively used Forward and Reverse Engineering Methods utilizing the available Data Dictionaries
  • Created HLD and LLD for ETL Implementation from Source to Stage and Stage to Dimensions and Facts
  • Created Mappings for Staging and Archiving Data using Informatica Power Center 9.0 by the Truncate and Load Method.
  • Used SCD Type-II to load Facts and Dimensions to the Data Mart
  • Generated a dynamic parameter file from a data refresh table to implement the CDC logic for loading incremental data (see the sketch after this list)
  • Created Error Table to capture Errors and Exceptions generated while loading Facts and Dimensions
  • Created Load History Table to Track Loading of various mappings
  • Devised Reconciliation processes through SQL script queries on the Source and the Target
  • Used different transformations (Expression, Router, Lookup, Filter, Normalizer, Source Qualifier, etc.).
  • Worked on various workflow tasks: Command Tasks, Decision & Assignment Tasks etc. to schedule workflows.
  • Implemented DB2 database partitioning in Informatica and configured the workflows to use database key-range partitioning.
  • Worked closely with Business Objects Team and implemented the Universe Design from the Dimensional Model
  • Extensively used pmcmd to start and control workflows from scheduled (cron) jobs
  • Migrated the code from dev to Test environment
  • Working closely with QA team and involved in integration testing
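
A minimal sketch of the refresh-table pattern referenced above. A pre-session script turns the query output into an Informatica parameter file; ETL_REFRESH_CTL, $$LAST_EXTRACT_DT, and the mapping name are hypothetical.

    -- Control table: one row per mapping, tracking the last successful extract
    CREATE TABLE etl_refresh_ctl (
      mapping_nm      VARCHAR2(100) PRIMARY KEY,
      last_extract_dt DATE NOT NULL
    );

    -- Emitted into the parameter file before each run
    SELECT '$$LAST_EXTRACT_DT=' ||
           TO_CHAR(last_extract_dt, 'YYYY-MM-DD HH24:MI:SS')
      FROM etl_refresh_ctl
     WHERE mapping_nm = 'm_load_sales';

    -- Source-qualifier filter that consumes the parameter (incremental pull):
    --   src.updated_dt > TO_DATE('$$LAST_EXTRACT_DT', 'YYYY-MM-DD HH24:MI:SS')

    -- After a successful load, advance the watermark
    UPDATE etl_refresh_ctl
       SET last_extract_dt = SYSDATE
     WHERE mapping_nm = 'm_load_sales';

Because the watermark only moves after a successful run, a failed load is simply re-extracted from the previous watermark on the next run.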

Environment: Oracle 10g client, ERwin 7.3, UNIX, Informatica 8.6

Confidential, New Jersey

Informatica Developer

Responsibilities:

  • Analyzed business process and gathered core business requirements.
  • Interacted with business analysts and end users and identified the potential bottlenecks.
  • Used Agile software development methodology.
  • Used Rally for user stories and downloaded the mapping specs and requirements.
  • Involved in Design & Development of the Business Warehouse model.
  • Implemented Dimension Model (logical and physical data modeling) in the existing architecture using ERWin
  • Designed ETL processes to populate the fact and dimension tables with staging table data.
  • Used Informatica Power Center Designer to extract data from different data sources.
  • Created Mappings and Mapplets to load data from source systems into data warehouse.
  • Created Different Transformations such as Joiner, Look-up, Rank, Expressions, Aggregator and Sequence Generator to implement the business rules
  • Identified & Implemented Slowly Changing Dimension methodology for accessing the full history of accounts and transaction information
  • Worked on various workflow tasks: Command, Decision & Assignment tasks, etc. to schedule workflows.
  • Used session partitioning for concurrent loading of data into the target tables to improve performance.
  • Optimized the performance of the mappings by various tests on sources, targets and transformations.
  • Performed incremental aggregation to load incremental data into aggregate tables (see the sketch after this list).
  • Created Workflows using Workflow Manager for different tasks like sending email notifications, timer that triggers when an event occurs, and sessions to run a mapping.
  • Responsible for enhancing the systems by implementing Informatica Standards and mapping, workflow standards.
  • Migrated code using Repository Manager, with static and dynamic deployment groups for migration to the QA, staging, and production environments.
  • Scheduling of jobs using Autosys
  • Involved in Creation, Backup, Restoring, and Performance Tuning of Repository
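
A SQL sketch of the incremental-aggregation idea referenced above. PowerCenter's Incremental Aggregation session option maintains an aggregate cache; the database-side equivalent shown here folds a daily delta into the aggregate with a MERGE. Table and column names are hypothetical.

    MERGE INTO sales_agg a
    USING (SELECT product_id,
                  TRUNC(sale_dt) AS sale_dt,
                  SUM(sale_amt)  AS sale_amt,
                  COUNT(*)       AS sale_cnt
             FROM sales_delta
            GROUP BY product_id, TRUNC(sale_dt)) d
       ON (a.product_id = d.product_id AND a.sale_dt = d.sale_dt)
     WHEN MATCHED THEN UPDATE
          SET a.sale_amt = a.sale_amt + d.sale_amt,
              a.sale_cnt = a.sale_cnt + d.sale_cnt
     WHEN NOT MATCHED THEN INSERT (product_id, sale_dt, sale_amt, sale_cnt)
          VALUES (d.product_id, d.sale_dt, d.sale_amt, d.sale_cnt);

Only the delta is scanned on each run, so the aggregate refresh cost tracks the size of the increment rather than the full history.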

Environment: Informatica 9.0.1, Informatica Scheduler, Cognos 8.1, SQL Developer, ERwin 7.3, Oracle 10g, Windows Server 2008 R2

Confidential, Atlanta, GA

Informatica Developer

Responsibilities:

  • Analyzed the ETL logic to create the OrderTrack Application
  • Using ERwin, modified the existing data model and created logical and physical data models per the requirements.
  • Created the High Level and the Low Level Design documents according to the requirements
  • Created the ETL logic document and dependency mapping sheet.
  • Attended the weekly meetings with the BAs and Java developers to update the status of the application design
  • Interacted with the data modeler team and created the data models according to the requirements.
  • Created Objects in shared folder of development area in Informatica Power Center.
  • Created Mappings to load data from various sources to stage area tables
  • Developed mappings using fixed-width and delimited flat files.
  • Created change data capture mappings using flat files.
  • Developed the Informatica workflows/sessions to extract, transform and load the data into Target Tables.
  • Performance Tuning of Sources, Targets, mappings and SQL queries in transformations.
  • Implemented Type 1 and Type 2 mappings and insert/update strategies to load the stage tables
  • Created Mapping Variables in parameter files to perform ETL process
  • Wrote pre-SQL and post-SQL queries in the source qualifier to maintain the flag used to insert and update data (see the sketch after this list)
  • Wrote complex SQL queries and PL/SQL procedures to perform database operations according to business requirements.
  • Implemented delta loading from the staging area to the OrderTrack environment using a timestamp variable and maintained change data capture in Informatica
  • Wrote the Test cases and Test data to test the created ETL mapping for each schema
  • Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems. Prepared Test Data/Unit Testing /Integrated testing and generated reports and documented various test cases.
  • Tuned Performance of Informatica Sessions for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Wrote UNIX scripts to maintain the workflows and to email customers and session log files to the development team in case of failures.
  • Document all ETL related work per company's methodology.
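
A minimal sketch of the pre-/post-SQL flag pattern referenced above; STG_ORDERS and PROCESS_FLAG are hypothetical names.

    -- Pre-SQL (source qualifier): claim unprocessed rows for this run
    UPDATE stg_orders SET process_flag = 'P' WHERE process_flag = 'N';

    -- Main source query: read only the rows claimed above
    SELECT order_id, order_dt, order_amt
      FROM stg_orders
     WHERE process_flag = 'P';

    -- Post-SQL: mark the claimed rows done once the session succeeds
    UPDATE stg_orders SET process_flag = 'Y' WHERE process_flag = 'P';

The three-state flag ('N' new, 'P' in process, 'Y' done) means a failed session can be rerun safely: the 'P' rows are simply picked up again.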

Environment: Informatica PowerCenter 8.6.1, Oracle 10g, TOAD, Control-M, AIX 6.1, Windows XP Professional, RHEL 5.6

Confidential, Warren, NJ

Informatica Developer

Responsibilities:

  • Interacted with Business Analyst team and analyzed the requirements to translate into Technical Specifications.
  • Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.
  • As per the requirement, performed Initial loading, monthly loading and weekly cleanup update processing.
  • Wrote pre- and post-SQL commands in session properties to manage constraints, which improved performance (see the sketch after this list).
  • Performed Pushdown Optimization to increase the read and write throughput
  • Wrote complex SQL queries and PL/SQL procedures to perform database operations according to business requirements.
  • Performed defect fixes in various environments to ensure the proper delivery of the developed jobs into the production environment.
  • Document all ETL related work per company's methodology.
  • Synchronized workflows with touch files to ensure proper delivery of data from source to stage to repository to mart.
  • Scheduled the workflows using AutoSys
  • Involved in system end to end testing and performance tuning.
  • Prepared estimates, tracked every task, and strictly adhered to the estimated deadlines.
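
A sketch of the constraint-management pre-/post-SQL referenced above, shown in Oracle dialect for readability (this project ran on DB2 UDB, which has an equivalent mechanism); the table and constraint names are hypothetical.

    -- Pre-SQL: suspend the FK check so the bulk load is not validated row by row
    ALTER TABLE claims_fact DISABLE CONSTRAINT fk_claims_member;

    -- ... session loads claims_fact ...

    -- Post-SQL: re-enable (and validate) the constraint after the load
    ALTER TABLE claims_fact ENABLE CONSTRAINT fk_claims_member;

Re-enabling validates the loaded rows in a single pass, which is typically cheaper than per-row checking during the load.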

Environment: Informatica PowerCenter 8.6, Tivoli, DB2 UDB 8.0, AutoSys, UNIX

Confidential

Software Engineer

Responsibilities:

  • Involved in analysis, solution and low level design.
  • Imported data into the database using SQL*Loader.
  • Created PL/SQL packages, procedures and triggers to populate the data and calculate the values required for the report.
  • Created shell scripts to run various jobs, and invoke PL/SQL procedures and java programs.
  • Created programs in PL/SQL for making different kinds of reports (Excel, XML, and CSV) based on the local configuration file.
  • Used built-in Oracle UTL packages to send the reports to the configured email addresses (see the sketch below).
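
A minimal sketch of sending a report notification from PL/SQL with the UTL_SMTP package (available in Oracle 8i); the mail host and addresses are hypothetical.

    DECLARE
      c UTL_SMTP.CONNECTION;
    BEGIN
      -- Connect to the configured SMTP relay
      c := UTL_SMTP.OPEN_CONNECTION('mailhost.example.com', 25);
      UTL_SMTP.HELO(c, 'example.com');
      UTL_SMTP.MAIL(c, 'etl@example.com');
      UTL_SMTP.RCPT(c, 'reports@example.com');
      -- Headers and body separated by a blank line
      UTL_SMTP.DATA(c, 'Subject: Daily extract report' || UTL_TCP.CRLF ||
                       UTL_TCP.CRLF ||
                       'The daily report has been generated.');
      UTL_SMTP.QUIT(c);
    END;
    /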

Environment: Oracle 8i, PL/SQL, SQL*Loader, Core Java, HTML, Red Hat Linux.
