Informatica/ETL Developer Resume
Brea, CA
SUMMARY
- 8+ years of technical and functional experience in data warehouse implementations using Informatica PowerCenter 9.x/8.x/7.x, Oracle 11g/10g/9i, Netezza, Teradata, and MS SQL Server 2008/2005.
- Experience in ETL architecture and all phases of the software development life cycle (SDLC), including development under Agile methodology.
- Experience in multiple domains, including the healthcare, retail, and pharmaceutical industries.
- Good knowledge of the Ralph Kimball and Bill Inmon methodologies.
- Extensively worked in Informatica Repository Manager, Designer, Workflow Manager, Workflow Monitor, and repository databases.
- Experience in Master Data Management in the healthcare domain with HL7 messages and HIPAA transactions, using MDM applications such as the Hub Console.
- Expertise in Oracle and Netezza databases and the tools that support both.
- Worked extensively with complex mappings; expertise in implementing all types of Slowly Changing Dimension (SCD) logic.
- Experience with complex mappings using Source Qualifier, Joiner, Filter, Aggregator, Expression, Router, Lookup, Update Strategy, Sequence Generator, Rank, Stored Procedure, XML Generator, and XML Parser transformations.
- Experience in creating sessions, workflows, and worklets using Workflow Manager tools such as Task Developer, Worklet Designer, and Workflow Designer.
- Experience in optimizing query and session performance and fine-tuning mappings.
- Created reusable transformations and mapplets in the Designer using Transformation Developer and Mapplet Designer.
- Experience with pre-session and post-session SQL commands to drop indexes on the target before a session runs and recreate them when the session completes (see the first sketch below).
- Extensively monitored scheduled, running, completed, and failed sessions; debugged failed mappings and developed error-handling methods.
- Experience working with OHADI.
- Experience with Teradata utilities such as MultiLoad (mload) and FastLoad (fload).
- Involved in ETL architecture and data modeling using Erwin.
- Experience working with BI Reporting tools Cognos and Business Objects.
- Extensively used SQL Developer, Toad, Aginity Workbench, and Management Studio.
- Extensively used SQL*Plus to create cursors, triggers, functions, and procedures.
- Expertise in UNIX shell scripting, cron jobs, and pmcmd (the Informatica command-line utility) to start, schedule, and control workflows and tasks (see the second sketch below).
- Expertise in writing functions and procedures in PL/SQL.
- Extensively used WinSCP, FileZilla, and PuTTY.
- Expertise in data analysis for different domains.
- Experience working with AutoSys, Tidal, and Tivoli for automated job scheduling.
- Experience in cloud computing, Big Data, and Hadoop.
- Experience with SOAP- and REST-based web services.
- Experience in administrative work in Repository Manager: creating users, user groups, and folders; configuring security; and granting permissions on folders and to users.
- Experience in creating deployment documents, unit test plans, and requests for change.
- Experience in Unit Testing and UAT.
- Excellent communication and interpersonal skills; a team player with strong customer interaction.
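A minimal sketch of the index drop/recreate pattern mentioned above. The table, index, and connection variables are hypothetical; statements like these can be tested from the shell with SQL*Plus before being pasted into a session's pre-/post-session SQL properties.

```bash
#!/bin/sh
# Hypothetical example: drop a target index before a bulk load and
# recreate it afterward. Names and credentials are placeholders.

# Pre-session SQL: drop the index so the load avoids index maintenance.
sqlplus -s "$ORA_USER/$ORA_PASS@$ORA_TNS" <<'EOF'
DROP INDEX idx_sales_fact_cust;
EOF

# ... the Informatica session loads the target table here ...

# Post-session SQL: recreate the index once the load completes.
sqlplus -s "$ORA_USER/$ORA_PASS@$ORA_TNS" <<'EOF'
CREATE INDEX idx_sales_fact_cust ON sales_fact (customer_key);
EOF
```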
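A minimal sketch of driving a workflow with pmcmd from a shell script, as referenced above; the domain, service, folder, and workflow names are made up.

```bash
#!/bin/sh
# Hypothetical example: start a workflow with pmcmd and check the result.
DOMAIN=Domain_ETL
INTSVC=IS_ETL
FOLDER=EDW_LOADS
WF=wf_daily_load

# -wait blocks until the workflow finishes; pmcmd's exit code then
# reflects success or failure, which makes cron scheduling simple.
pmcmd startworkflow -sv "$INTSVC" -d "$DOMAIN" \
    -u "$INFA_USER" -p "$INFA_PASS" \
    -f "$FOLDER" -wait "$WF" || { echo "Workflow $WF failed" >&2; exit 1; }

# Pull run details for the audit log.
pmcmd getworkflowdetails -sv "$INTSVC" -d "$DOMAIN" \
    -u "$INFA_USER" -p "$INFA_PASS" \
    -f "$FOLDER" "$WF"
```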
TECHNICAL SKILLS
ETL & Reporting Tools: Informatica PowerCenter 9.x/8.x/7.x (Repository Manager, Designer, Workflow Monitor, Workflow Manager), PowerCenter Real Time Edition, Informatica Data Quality, Master Data Management, B2B, DT Studio, Cognos
Data Modeling Tools: Erwin 5.1/4.1, dimensional modeling, star schema, snowflake schema, physical and logical data modeling, Microsoft Visio
Databases and related tools: Oracle 11g/10g/9i, SQL Server 2008/2005, Netezza, SQL Developer, Teradata, Aginity Workbench, Management Studio, TOAD, DT Studio, DB Visualizer.
Languages and scripting: SQL, PL/SQL, UNIX shell scripting, Perl, SQL*Plus.
Other: MS Word, MS Excel, MS Office 2003/2007/2010, AutoSys, FileZilla, PuTTY, WinSCP, OHADI, Tidal, Tivoli, SOAP, REST
Operating Systems: UNIX, Linux, Windows 7/XP/2000/98
PROFESSIONAL EXPERIENCE
Confidential, Brea, CA
Informatica/ETL Developer
Responsibilities:
- Produced the high-level design for Private Passenger Auto (PPA) and Non-PPA.
- Created the mapping document for the PPA and Non-PPA facts and dimensions.
- Designed the flow of the different source systems into the EDW.
- Redesigned the source systems Guidewire, Nextgen, Phoenix East, and Phoenix West to convert the code from SCD Type 1 to SCD Type 2.
- The EDW was a snowflake-schema-based dimensional model.
- Created the mappings for facts and dimensions.
- Created the workflows for daily, weekly, and monthly loads.
- Helped create mapping specs for better code processing.
- Extracted relational and delimited text-file sources to the local environment and developed the code in the Dev environment.
- Checked the file formats of the source flat files using UNIX shell scripts, ensuring input files matched the specified format (see the sketch below).
- Created the mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, and Java transformation.
- Used the Debugger to troubleshoot existing mappings.
- Used SCD Type 1 to load the dimensions in Netezza and then the fact tables of the data warehouse.
- Used truncate-and-load to populate Netezza tables.
- Created Netezza SQL scripts to verify that tables loaded correctly.
- Created Reusable sessions in Task Developer.
- Used Informatica scheduler to run the workflows in Dev Environment.
- Worked on the Informatica PowerCenter tools: Source Analyzer, Mapping Designer, Mapplet Designer, Transformations, Repository Manager, and Workflow Manager.
- Extensively used SQL Developer to verify that data was properly loaded into target systems by writing SQL queries.
- Created and scheduled sessions and jobs (on demand, run on time, and run only once) using Workflow Manager.
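A minimal sketch of the flat-file format check described above; the directory, delimiter, and expected column count are assumptions.

```bash
#!/bin/sh
# Hypothetical example: verify each incoming pipe-delimited file has the
# expected number of fields on every line before the load session runs.
EXPECTED_COLS=12
SRC_DIR=/data/inbound

for f in "$SRC_DIR"/*.dat; do
    # Count lines whose field count deviates from the spec.
    bad=$(awk -F'|' -v n="$EXPECTED_COLS" 'NF != n { c++ } END { print c+0 }' "$f")
    if [ "$bad" -ne 0 ]; then
        echo "ERROR: $f has $bad malformed line(s); moving to reject" >&2
        mv "$f" "$SRC_DIR/reject/"
    fi
done
```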
Environment: Informatica PowerCenter 9.1, Netezza, Oracle 11g, DB Visualizer, PuTTY, WinSCP, UNIX shell scripting, Perl, PL/SQL, Aginity Workbench, Tivoli.
Confidential, Pittsburgh, Pennsylvania
Informatica/ETL Developer
Responsibilities:
- Interacted with the business analysts to understand the business and gather technical requirements.
- Interacted with the architect to understand the high-level design.
- Helped create mapping specs for better code processing.
- Extracted relational and delimited text-file sources to the local environment and developed the code in the Dev environment.
- Checked the file formats of the source flat files using UNIX shell scripts, ensuring input files matched the specified format.
- Created the mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, and Java transformation.
- Created a mapplet for the error-handling process to deal with null records.
- Developed code for the landing environment, then staging, and finally an incremental load to populate the target tables of the atomic model.
- Developed code to move data to the MDM environment and loaded the target tables using truncate-and-load.
- Used the MDM Hub Console to verify trusted data.
- Used DT Studio for B2B exchange of HL7 messages into our source landing area.
- Worked on PowerCenter Real Time Edition to run real-time workflows.
- Configured session properties for real-time workflows.
- Loaded historical data for a specific period of time into the atomic model.
- Developed code for backload and incremental loads.
- Extracted HL7 messages from queues.
- Loaded the normalized model from the atomic model with the help of OHADI.
- This normalized model was used as a staging area for the dimensional model, a Netezza-based data warehouse.
- Worked in Aginity Workbench to process all the Netezza relational objects.
- Performed data analysis and datatype analysis on Netezza objects in Aginity Workbench.
- Created mappings to backload and to load clean data into the Netezza staging environment.
- Used SCD Type 1 to load the dimensions in Netezza and then the fact tables of the data warehouse.
- Used truncate-and-load to populate Netezza tables.
- Made SOAP-based web service calls using the Java transformation.
- Created Netezza SQL scripts to verify that tables loaded correctly (see the sketch below).
- Created Reusable sessions in Task Developer.
- Used Informatica scheduler to run the workflows in Dev Environment.
- Worked on the Informatica PowerCenter tools: Source Analyzer, Mapping Designer, Mapplet Designer, Transformations, Repository Manager, and Workflow Manager.
- Extensively used SQL Developer to verify that data was properly loaded into target systems by writing SQL queries.
- Created and scheduled sessions and jobs (on demand, run on time, and run only once) using Workflow Manager.
- Used the Debugger to troubleshoot existing mappings.
- Performed unit and integration tests for the processes created in Informatica; all test cases were documented for process improvement.
- Created test scripts and unit test plan documents.
- Created deployment documents for migration of code from Dev to Test and from Test to Production.
- Worked with the testing team to test and fix the code.
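A minimal sketch of the kind of Netezza verification script used after a load; the host, database, table, and column names are hypothetical.

```bash
#!/bin/sh
# Hypothetical example: post-load checks against Netezza with nzsql.
nzsql -host "$NZ_HOST" -d EDW -u "$NZ_USER" -pw "$NZ_PASS" <<'EOF'
-- Row count for today's load, to reconcile against the source extract
SELECT COUNT(*) FROM stg_member WHERE load_dt = CURRENT_DATE;

-- Business keys should never be NULL after the error-handling mapplet
SELECT COUNT(*) AS null_keys FROM stg_member WHERE member_id IS NULL;
EOF
```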
Environment: Informatica PowerCenter 9.1, Informatica PowerCenter Real Time Edition, B2B DT Studio, MDM, Oracle 11g, SQL Developer, PuTTY, WinSCP, UNIX shell scripting, SOAP, PL/SQL, Netezza, Aginity Workbench.
Confidential, AZ
Informatica/ETL Developer
Responsibilities:
- Involved in requirement gathering from the business users.
- Data was extracted from multiple sources which came in on a daily, weekly and monthly basis.
- Incoming data was loaded to the landing area and from there to staging area.
- Handled errors between the landing and staging areas by flagging them.
- Involved in the extraction, transformation, and loading of data from flat files and SQL Server to target tables.
- Worked on the IDQ Analyzer for data validation of the source systems.
- Created error-handling mapplets to flag error and null records.
- Created reusable worklets for the error-handling process, which ran with every load to stage.
- Implemented Slowly Changing Dimensions Type 1 and Type 2 to capture the historical and latest data.
- For every source system, made the web service call with the Java transformation to check for updated records.
- Wrote SQL for every project, as the error-handling process was part of each one.
- Worked with the Teradata MultiLoad (mload) utility to apply inserts and updates at the block level (see the sketch below).
- Worked on the Informatica PowerCenter tools: Source Analyzer, Mapping Designer, Mapplet Designer, Transformations, Repository Manager, and Workflow Manager.
- Used Informatica Scheduler to run the workflows.
- Configured Sessions and workflow properties for different sources.
- Involved in documentation and production support.
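A minimal sketch of a MultiLoad run; the logon, log table, layout, and file path are hypothetical, and a real script would also define error tables and checkpoints.

```bash
#!/bin/sh
# Hypothetical example: block-level upsert into a stage table with mload.
mload <<'EOF'
.LOGTABLE etl_wrk.claims_ml_log;
.LOGON tdprod/etl_user,etl_pass;
.BEGIN MLOAD TABLES stage.claims;
.LAYOUT claim_layout;
  .FIELD claim_id   * VARCHAR(18);
  .FIELD claim_amt  * VARCHAR(18);
.DML LABEL upsert_claim
  DO INSERT FOR MISSING UPDATE ROWS;
  UPDATE stage.claims SET claim_amt = :claim_amt WHERE claim_id = :claim_id;
  INSERT INTO stage.claims (claim_id, claim_amt) VALUES (:claim_id, :claim_amt);
.IMPORT INFILE /data/claims.dat
  FORMAT VARTEXT '|'
  LAYOUT claim_layout
  APPLY upsert_claim;
.END MLOAD;
.LOGOFF;
EOF
```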
Environment: Informatica PowerCenter 9.1, IDQ, Oracle 11g, TOAD 9.5, Teradata 13, UNIX shell scripting, Management Studio, PL/SQL, FileZilla, Cognos 8.
Confidential, Grand Rapids, MI
Informatica/ETL Developer
Responsibilities:
- Worked with the customers and business analysts for requirements gathering to understand the business and design of the solution.
- Dealt with cancer patient data to create a clinical data mart.
- Involved in the architecture of the data model with Big Data.
- Designed complex mappings in Informatica PowerCenter that process the source data to populate the data mart tables.
- The data mart was loaded on a weekly basis before the production go-live.
- Extensively used transformations such as Lookup, Update Strategy, Expression, Aggregator, Normalizer, Filter, Router, Stored Procedure, Sequence Generator, Union, and Rank.
- Worked with mapping parameters and variables; created multiple mapplets and reusable transformations and used them appropriately in different mappings.
- Created events and tasks in the workflows using Workflow Manager.
- Followed Informatica best practices and Confidential standards for the EDW at various levels of the SDLC.
- The EDW was a Netezza-based star schema.
- Worked with REST-based web services to get the required data.
- Extensively used Slowly Changing Dimension techniques, SCD Type 1 and Type 2, to load dimensions (see the sketch below).
- Aginity Workbench was used extensively to run queries on Netezza databases in the Dev environment.
- Loaded Netezza tables with the old data, updated data, and many clean-data flat files.
- Used workflow manager for session management, database connection management and scheduling of jobs.
- Performed unit tests for the processes created in Informatica; all test cases were documented for process improvement.
- Involved in performance tuning of the database and Informatica; improved performance by identifying and rectifying bottlenecks.
- Assisted the team in the development of design standards and codes for effective ETL procedure development and implementation.
- Worked with different MDM applications such as the Hub Console.
- Created batch groups using Master Data Management.
- Used the Debugger in Informatica Designer to resolve data issues.
- Created the data model for patient-related data.
- Checked the file formats of the source flat files using UNIX shell scripts, ensuring input files matched the specified format.
- Worked with Hadoop for a short period of time on the reconciliation project.
- Identified several bugs in existing mappings by analyzing the data flow and evaluating transformations, and fixed them.
- Supported day-to-day issues as well as new pieces of work.
- Identified recurring issues/alerts and conducted root cause analysis.
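A minimal sketch of the set-based equivalent of the SCD Type 2 logic, useful for verifying what the Lookup/Update Strategy mappings produce; the table and column names (including the change-detection hash) are hypothetical.

```bash
#!/bin/sh
# Hypothetical example: expire the current dimension row on change, then
# insert the new version, using Netezza SQL.
nzsql -host "$NZ_HOST" -d EDW -u "$NZ_USER" -pw "$NZ_PASS" <<'EOF'
-- Close out current rows whose attributes changed in the staged feed
UPDATE dim_patient
   SET eff_end_dt = CURRENT_DATE - 1, current_flag = 'N'
  FROM stg_patient s
 WHERE dim_patient.patient_id = s.patient_id
   AND dim_patient.current_flag = 'Y'
   AND dim_patient.attr_hash <> s.attr_hash;

-- Insert new versions of changed rows plus brand-new patients
INSERT INTO dim_patient
       (patient_id, patient_name, attr_hash, eff_start_dt, eff_end_dt, current_flag)
SELECT s.patient_id, s.patient_name, s.attr_hash,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
  FROM stg_patient s
  LEFT JOIN dim_patient d
    ON d.patient_id = s.patient_id AND d.current_flag = 'Y'
 WHERE d.patient_id IS NULL;
EOF
```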
Environment: Informatica PowerCenter 9.1, MDM, Oracle 11g, SQL Developer, REST, Netezza, Aginity Workbench, UNIX, AutoSys, XML sources, PuTTY, FileZilla, Erwin 5.1.
Confidential, St. Louis, MO
Informatica/ETL Developer
Responsibilities:
- Involved in understanding the business requirements and translating them into technical solutions.
- Prepared design documents and interacted with the data modelers to understand the data model and design.
- Created new mappings and updated old mappings according to changes in business logic.
- Involved in migrating the project from UAT to Production.
- Set up the local Informatica environment on the client machines, which included connectivity and access to the data sources and setting up the relational connectivity variables in Workflow Manager.
- Generated sequence numbers using Informatica logic without using the sequence generator.
- Used SQL override to perform certain tasks essential for the business.
- Worked with flat files, XML files, and SQL Server tables as targets.
- Implemented performance tuning in mappings and sources.
- Used mapplets and reusable transformations to prevent redundant transformation logic and to promote modularity.
- Used Teradata utilities such as MultiLoad and FastLoad.
- Created scripts in Teradata using the BTEQ platform.
- Created and modified PL/SQL stored procedures for data retrieval.
- Defined a target load order plan for loading data into the target tables.
- Worked in an Agile environment.
- The Oracle Enterprise Resource Planning (ERP) application was used as a source for the EDW.
- Used mapping variables to achieve certain goals like creating file names dynamically.
- Responsibilities included designing and developing complex Informatica mappings including Type-II slowly changing dimensions.
- Worked to enhance the dimensional model using Erwin.
- Used Pipeline Partitioning feature in the sessions to reduce the load time.
- Performed unit testing of mappings and testing of various conditions.
- Developed UNIX shell scripts for customizing the delivery of flat files to the customer (see the sketch below).
- Involved in the documentation of the ETL process with information about the various mappings, the order of execution for them and the dependencies.
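A minimal sketch of a flat-file delivery script; the paths, host, and naming convention are hypothetical.

```bash
#!/bin/sh
# Hypothetical example: date-stamp each outbound extract, compress it,
# and push it to the customer's drop zone over sftp.
STAMP=$(date +%Y%m%d)
OUT_DIR=/data/outbound
TARGET=custftp.example.com

for f in "$OUT_DIR"/*.dat; do
    base=$(basename "$f" .dat)
    gzip -c "$f" > "$OUT_DIR/${base}_${STAMP}.dat.gz"
    # A production job would also verify the transfer and archive the file.
    sftp "etl@$TARGET" <<EOF
put $OUT_DIR/${base}_${STAMP}.dat.gz /dropzone/
EOF
done
```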
Environment: Informatica PowerCenter 8.6, Oracle ERP, SQL, PL/SQL, UNIX shell scripting, Cognos, Windows XP, SQL Server 2005, SAP R3, FileZilla, Teradata, Erwin 4.1.
Confidential, Bridgewater, NJ
Informatica/ETL Developer
Responsibilities:
- Involved in gathering requirements and performing scope analysis.
- Involved in requirements gathering, functional/technical specification, Designing and development of End-to-end ETL process for Sales Data Warehouse.
- Worked on Informatica PowerCenter for ETL (extraction, transformation, and loading) of data from heterogeneous source systems and flat files, including fixed-length as well as delimited files.
- Designed the Universe in the Designer module of Business Objects 6.5.
- Detected and resolved data cardinalities and loops, checked the integrity of Universes, and created lists of values.
- Resolved ordinary loops and chasm and fan traps.
- Edited joins, resolved the loops using aliases and contexts in the universe, and exported it to the repository.
- Created the classes, dimension objects, and measure objects needed for reporting.
- Created custom hierarchies for Multi-dimensional analysis.
- Set up Aggregate Awareness in universe to retrieve data quickly.
- Created web intelligence reports.
- Created the reports using Business Objects functionalities like Multiple Data Providers, Prompts, Slice & Dice, and Drill Down.
- Involved in Documentation and Production Support.
Environment: Informatica PowerCenter 8.5, Business Objects 6.x, DB2, Oracle 9i, PL SQL, Toad, Windows 2000.
Confidential
Database Developer
Responsibilities:
- Used different types of transformations according to user requirements in Informatica.
- Used Informatica Designer to Create Mappings, Mapplets, Transformations, and Target tables. Extensively used Server Manager for creating and scheduling various sessions.
- Extensively worked on all phases of Software Test Life Cycle - requirement analysis, Estimation, Estimation approval from client, Design, Walkthroughs and approval of test artifacts from client team.
- Created test scenarios and test cases based on the Project Requirements Document (PRD).
- Created Mappings to fetch data from XML files to get reference data.
- Responsible for monitoring all sessions that were scheduled, running, completed, or failed. Involved in debugging failed mappings using the Debugger to validate the mappings and gain troubleshooting information about data and error conditions.