
Sr Informatica Developer Resume


Irvine, California

SUMMARY:

  • 8 years of IT experience in Software Analysis, Design, and Development for various software applications in client-server environments, providing Business Intelligence solutions in Data Warehousing for Decision Support Systems, OLAP, and OLTP application development.
  • 7+ years of ETL experience on Data Warehouse, Data Mart, Data Integration, and Data Conversion projects using Informatica PowerCenter 9.0.1/8.6/7.1.
  • Extensively used enterprise Data Warehousing ETL methodologies supporting data extraction, transformation, and loading in corporate-wide ETL solutions using Informatica PowerCenter.
  • Expertise in different schemas (Star and Snowflake) to fit reporting, query, and business analysis requirements.
  • Expertise in Informatica Mappings, Mapplets, Sessions, Workflows, and Worklets for data loads.
  • Expertise in Exception Handling mappings for Data Quality, Data Cleansing, and Data Validation.
  • Extensively worked with Informatica Mapping Variables, Mapping Parameters, and Parameter Files.
  • Strong experience developing complex mappings using varied transformations such as Unconnected and Connected Lookup, Router, Aggregator, Sorter, Rank, Joiner, Stored Procedure, and Update Strategy.
  • Developed Slowly Changing Dimension (SCD) mappings for Type 1, Type 2, and Type 3 (version, flag, and timestamp).
  • Experience developing Incremental Aggregation mappings to update values in aggregation tables.
  • Experience troubleshooting and implementing performance tuning at various levels of the ETL process: Source, Target, Mapping, Session, and System.
  • Extensively worked on code migration across Development, System Test, UAT, and Production environments.
  • Worked with database development tools such as TOAD, SQL Developer, and Rapid SQL; knowledge of other ETL products such as SSIS and DataStage.
  • Extensive experience writing PL/SQL procedures, triggers, functions, and views for back-end development.
  • Experience in UNIX shell (Korn shell, Bourne shell, C shell) scripting. Developed shell scripts for feed file integrity checks and FTP activities using commands such as ftp, scp, find, grep, and cut, along with pipes, redirection, and utilities such as awk and sed.
  • Proficient in developing, debugging, optimizing, and performance tuning Oracle.
  • Extensively worked Level 3 production support issues, resolving and fixing bugs using session logs and workflow logs via Splunk. Very flexible; able to work independently as well as in a team environment.
  • Involved in preparing HLDs, LLDs, test cases, test plans, cutover plans, and ETL estimations.
  • Extensive knowledge of Business Intelligence/reporting tools: Crystal Reports, OBIEE, Business Objects.
  • Good knowledge of ER data modeling tools such as Erwin, along with MS Office Suite and Visio.
  • Good exposure to the Software Development Life Cycle (SDLC) and OOAD techniques.
  • Extensively worked with scheduling tools such as Control-M and AutoSys.
  • Knowledge of full-lifecycle Data Warehouse implementations in the Insurance, Financial, Networking, and Pharmaceutical verticals.
  • Team player with the ability to work independently and manage/lead ETL teams.
  • Strong technical, communication, time-management, and interpersonal skills.
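
The feed-file integrity checks mentioned above typically follow a simple pattern; the sketch below assumes a hypothetical pipe-delimited layout with `HDR`/`TRL` header and trailer lines, which is a common convention rather than any specific client's format.

```shell
#!/bin/sh
# Sketch of a feed-file integrity check: verifies the file exists and is
# non-empty, then compares the data-record count against the count declared
# in a trailer line of the form "TRL|<count>". The HDR/TRL layout is an
# assumed convention for illustration.

check_feed() {
    feed="$1"
    [ -s "$feed" ] || { echo "ERROR: $feed missing or empty"; return 1; }

    # Records other than header (HDR) and trailer (TRL) lines
    data_count=$(grep -cv -e '^HDR' -e '^TRL' "$feed")
    # Count declared in the trailer, e.g. "TRL|3"
    trl_count=$(awk -F'|' '/^TRL/ {print $2}' "$feed")

    if [ "$data_count" -eq "$trl_count" ]; then
        echo "OK: $feed ($data_count records)"
    else
        echo "ERROR: $feed trailer says $trl_count, found $data_count"
        return 1
    fi
}
```

A check like this would run before the load session so that a truncated feed fails fast instead of loading partially.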

TECHNICAL SKILLS:

ETL & BI Tools: Informatica PowerCenter 9.0.1/8.x/7.x, Data Integration 4.0, SSIS & SSRS, Cognos 8, PowerExchange 9.0.1, OBIEE, Crystal Reports.

Database: Oracle 11g/10g/9i, SQL Server 2005/2008, Teradata V2R5.01, Netezza, DB2, Vertica 5.0, MDM.

Data Modeling: Dimensional data modeling (Star and Snowflake schemas, Fact and Dimension tables), Conceptual, Physical, and Logical data modeling, Erwin.

Programming Languages: C, C++, Java, Oracle PL/SQL, UNIX, LINUX.

Others: HP Quality Center, JIRA v6.4.11, Splunk 6.2.1, BMC Remedy, AutoSys, SecureCRT 7.1, Perforce P4V, TOAD, Rapid SQL 8.6.0, MS Excel, Notepad++, Microsoft Outlook, Word, and PowerPoint.

PROFESSIONAL EXPERIENCE:

Confidential, Irvine, California

Sr Informatica Developer

Responsibilities:

  • Analyzed requirements and prepared functional specifications in discussions with business user groups; translated business requirements and documented source-to-target mappings and ETL specifications.
  • Designed and developed complex Informatica mappings and workflows to load files from Charles Schwab, EDJ, Morgan Stanley, and LPL into the MSSBI and SMART staging databases.
  • Generated monthly feeds and extracts from SMART to Wells Fargo and UBS using Informatica PowerCenter 9.0.1.
  • Extracted data through mappings from different sources, such as relational database tables, flat files, and Oracle tables, into the target data warehouse.
  • Designed ETL mapping processes to load data from staging (STG) to integration (INT) tables, using Lookup, Expression, Sequence Generator, Update Strategy, Aggregator, Router, and Stored Procedure transformations to implement complex logic.
  • Implemented SCD Type 1 and SCD Type 2 (update else insert) to load data from integration tables into data warehouse tables using Informatica mappings.
  • Created data bridging among different applications using DW environments such as SQL Server and Hadoop.
  • Worked on historical data conversions from different source systems: SMART to MSSBI and ORION to MSSBI.
  • Experienced in methodologies such as SDLC and Agile methodologies such as Scrum.
  • Actively contributed comments in user-story review meetings within an Agile Scrum environment.
  • Extensively worked on SQL overrides in the Source Qualifier transformation for better performance.
  • Designed and developed stored procedures using PL/SQL and tuned SQL queries for better performance.
  • Used the Incremental Aggregation technique to load data into aggregation tables for improved performance.
  • Created and used reusable transformations and Mapplets in Informatica PowerCenter.
  • Created UNIX shell scripts for file validation and watcher/delete programs.
  • Provided on-call production support, using Splunk logs to fix and resolve production issues/failures in a timely fashion and communicating status to the business.
  • Created AutoSys jobs to trigger UNIX shell scripts, Informatica workflows, and batches using the AutoSys scheduler.
  • Created and resolved incidents in the BMC Remedy management tool for reporting existing issues.
  • Tuned the performance of complex Informatica mappings and the database; used the Informatica Debugger to test and fix mappings.
  • Worked with cross-functional teams to resolve issues.
  • Worked on various lookup caches: static, dynamic, and persistent.
  • Created business rules using IDQ that could be dynamically associated across different runs of the application.
  • Involved in defect analysis for the UAT environment along with users to understand the data and make any needed code modifications.
  • Prepared the recovery process for workflow failures caused by database or network issues.
  • Conducted thorough code reviews, ensuring outcomes were in line with objectives and that all processes and standards were followed.

Environment: Informatica PowerCenter 9.0.1, Oracle 11.5, PL/SQL, UNIX shell scripting, LINUX, HP Quality Center, AutoSys, Rapid SQL 8.6.1, JIRA v6.4.11, SecureCRT 7.1, Splunk 6.2.1, Perforce P4V, Windows XP.
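
An AutoSys job triggering an Informatica workflow usually calls a small shell wrapper around `pmcmd`, as described above. The sketch below uses the standard PowerCenter `pmcmd startworkflow` flags; the service, domain, and folder names are hypothetical placeholders, and a `DRY_RUN` mode is added here purely to make the wrapper testable.

```shell
#!/bin/sh
# Sketch of a wrapper an AutoSys job might invoke to start a PowerCenter
# workflow via pmcmd. IS_DEV / Domain_DEV / ETL_DAILY are placeholder names.
# -uv/-pv pass credentials via environment variable *names* rather than
# embedding them on the command line.

INFA_SERVICE="${INFA_SERVICE:-IS_DEV}"
INFA_DOMAIN="${INFA_DOMAIN:-Domain_DEV}"
INFA_FOLDER="${INFA_FOLDER:-ETL_DAILY}"

start_workflow() {
    wf="$1"
    cmd="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -uv INFA_USER -pv INFA_PASS -f $INFA_FOLDER -wait $wf"

    # DRY_RUN=1 prints the command instead of executing it (no server needed)
    if [ "${DRY_RUN:-0}" = "1" ]; then
        echo "$cmd"
        return 0
    fi

    $cmd
    rc=$?
    [ $rc -eq 0 ] || echo "ERROR: workflow $wf failed with rc=$rc" >&2
    return $rc
}
```

With `-wait`, the wrapper's exit code reflects the workflow result, so the AutoSys job status tracks the load itself.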

Confidential, Miami, FL

Informatica ETL developer

Responsibilities:

  • Actively involved in gathering requirements and acquiring application knowledge from the Business.
  • Developed mappings using Informatica PowerCenter Designer to transform and load data from Oracle source systems to staging, and from staging to the Teradata target database.
  • Created data manipulation and definition scripts using the Teradata BTEQ utility.
  • As part of an Agile project, actively contributed to planning sessions and ETL-related story creation.
  • Utilized the Target Load Plan and constraint-based loading features.
  • Designed ETL processes using Informatica to load data from Oracle to the target Teradata data warehouse.
  • Created load scripts using the Teradata FastLoad, FastExport, and MultiLoad utilities.
  • Developed Slowly Changing Dimension mappings for Type 1 SCD and Type 2 SCD.
  • Extensively worked with various lookup caches: static, dynamic, and persistent.
  • Used the Debugger to test mappings and fix bugs.
  • Scheduled various workflow tasks using the Informatica scheduler.
  • Used PowerExchange to source copybook definitions and then row-test the data from data files.
  • Created Mapplets and reusable transformations for use in different mappings.
  • Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Developed and improved standards and procedures supporting data quality development, testing, production, and operational oversight of Informatica Data Quality (IDQ 8.6) processes.
  • Utilized Informatica Data Quality (IDQ 8.6) to create and modify jobs measuring the completeness and accuracy of data flowing into the EDW.

Environment: Informatica PowerCenter 9.0.1, Informatica IDQ 8.6, Oracle 11g, PL/SQL, Erwin, Control-M, SQL Server 2008, DB2 8.1, Teradata, UDB, XML, AutoSys, UNIX, LINUX.
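
The BTEQ load scripts mentioned above commonly pair a shell wrapper with a generated BTEQ input file. The sketch below shows only the usual `.LOGON`/`.QUIT` structure; the database, table, and logon values are illustrative placeholders (in practice the logon string would come from a secured file, not be hard-coded).

```shell
#!/bin/sh
# Sketch: generate a BTEQ script that copies a staging table into a target
# table. tdprod/etl_user and the table names are hypothetical placeholders.
# Running the result requires the Teradata client: bteq < "$out"

make_bteq_script() {
    src="$1"; tgt="$2"; out="$3"
    cat > "$out" <<EOF
.LOGON tdprod/etl_user,etl_password;
.SET ERROROUT STDOUT;

INSERT INTO $tgt
SELECT * FROM $src;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.QUIT 0;
EOF
}
```

The `.IF ERRORCODE <> 0 THEN .QUIT 8` line is what lets the calling scheduler distinguish a failed load from a clean one.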

Confidential, Torrance, California

Informatica ETL Developer

Responsibilities:

  • Involved in defect analysis calls for the UAT environment along with users to understand the data and make any needed code modifications. Created SQL queries to validate data in both source and target databases.
  • Performed root cause analysis and resolved complex issues.
  • Prepared the recovery process for workflow failures caused by database or network issues.
  • Worked with the Informatica 9.1 PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Designed and customized data models for a data warehouse supporting data from multiple sources in real time.
  • Analyzed data in Teradata tables and checked the validity of scripts, verifying that data was fetched as expected.
  • Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse.
  • Worked on OBIEE report, agent, dashboard, and prompt development.
  • Developed processes for Teradata and RDBMS utilities such as MLoad and FastLoad (Teradata).
  • Built a BI solution on the Pentaho platform using Pentaho Reports, Pentaho Data Integrator, Pentaho Analyzer, and Mondrian cubes.
  • Involved in dimensional modeling (Star Schema) of the data warehouse; used Erwin to design the business process, dimensions, and measured facts.
  • Designed and developed advanced reusable UNIX shell scripts for ETL auditing, error handling, and automation.
  • Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Developed mappings to load staging tables and then dimensions and facts.
  • Worked on workflow tasks such as Session, Event Raise, Event Wait, Decision, Email, Command, Worklet, Assignment, and Timer, as well as workflow scheduling.
  • Implemented Slowly Changing Dimension Type 1 and Type 2 for change data capture (CDC) using version control.
  • Performed unit testing and code reviews, migrated code to UAT and PROD, and handled development and production support.

Environment: Informatica PowerCenter 9.0.1/8.6.1, Oracle 11g/10g, PL/SQL, Erwin, Control-M, SQL Server 2008, DB2 8.1, Teradata, UDB, XML, AutoSys, UNIX, LINUX, and VSAM.
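
Reusable ETL auditing and error-handling scripts of the kind described above usually reduce to one logging helper plus an exit trap that records final job status. This is a generic sketch of that pattern; the log location and job-name convention are assumptions, not project specifics.

```shell
#!/bin/sh
# Sketch of a reusable ETL audit/error-handling pattern: every step logs
# through one helper, and an EXIT trap records COMPLETED or FAILED with
# the final return code. Log directory and job name are illustrative.

JOB_NAME="${JOB_NAME:-etl_job}"
LOG_DIR="${LOG_DIR:-/tmp}"
LOG_FILE="$LOG_DIR/${JOB_NAME}.log"

log() {
    # Timestamped, job-tagged audit line
    echo "$(date '+%Y-%m-%d %H:%M:%S') [$JOB_NAME] $*" >> "$LOG_FILE"
}

finish() {
    rc=$?
    if [ $rc -eq 0 ]; then
        log "COMPLETED rc=0"
    else
        log "FAILED rc=$rc"
    fi
}
trap finish EXIT

log "START"
# ... ETL steps would run here: file checks, pmcmd calls, SQL loads ...
```

Because the trap fires on any exit path, a failed step cannot leave the audit log without a terminal status line.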

Confidential, Richmond, VA

ETL Developer

Responsibilities:

  • Responsible for Requirement Gathering Analysis and End user Meetings.
  • Part of Upgrade Team from Informatica 7.1 to Informatica 8.6.
  • Responsible for Business Requirement Documents (BRDs) and converting functional requirements into technical specifications.
  • Extracted data from various heterogeneous sources like DB2 and Flat Files.
  • Responsible for Production Support and Issue Resolutions using Session Logs, and Workflow Logs.
  • Extensively used the Source Qualifier transformation and most of its features, such as filter, sorter, and SQL override.
  • Used the PowerCenter Web Services Hub to build data integration functionality and expose it as web services.
  • Created mappings using Informatica Designer to load data from sources such as EDI files, Oracle 10g, and the Oracle Applications database.
  • Extensively used various active and passive transformations such as Filter, Router, Expression, Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator, Rank, and Aggregator.
  • Extensively used the LTRIM, RTRIM, ISNULL, IS_DATE, and TO_DATE functions in Expression transformations for data cleansing in the staging area.
  • Responsible for Performance Tuning at the Source level, Target level, Mapping Level and Session Level.
  • Developed Slowly Changing Dimension Mappings for Type 1 SCD and Type 2 SCD.
  • Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
  • Used Update Strategy to insert, delete, update and reject the items based on the requirement.
  • Worked with the index cache and data cache in caching transformations such as Rank, Lookup, Joiner, and Aggregator.
  • Worked with session logs and workflow logs for error handling and troubleshooting in all environments.

Environment: Informatica PowerCenter 8.6/7.x, UNIX Shell Scripting, SQL, PL/SQL, OBIEE, Oracle 10g, TOAD, SQL*Loader.
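
The staging-area cleansing described above (LTRIM/RTRIM plus null defaults in an Expression transformation) has a close shell/awk analogue, sketched here on a hypothetical pipe-delimited extract; the field layout and the `UNKNOWN` default are illustrative choices.

```shell
#!/bin/sh
# Sketch: the trim/null-default cleansing an Informatica Expression
# transformation would perform, expressed as an awk filter over a
# pipe-delimited stream. The UNKNOWN default mimics an ISNULL-style rule.

cleanse() {
    awk -F'|' 'BEGIN { OFS = "|" }
    {
        for (i = 1; i <= NF; i++) {
            # LTRIM/RTRIM: strip leading and trailing whitespace
            gsub(/^[ \t]+|[ \t]+$/, "", $i)
            # Null handling: default empty fields
            if ($i == "") $i = "UNKNOWN"
        }
        print
    }'
}
```

Running a fixed-rule pass like this before the load keeps the Informatica mapping's Expression logic focused on genuine business rules rather than whitespace repair.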

Confidential

SQL Developer

Responsibilities:

  • Involved in creating, manipulating, and supporting SQL Server databases.
  • Developed the data model and the physical and logical database design.
  • Created tables, constraints, indexes, sequences, procedures, and triggers.
  • Implemented stored procedures and triggers for business rules.
  • Participated in integrating the front end with the SQL Server back end.
  • Created user-defined functions, stored procedures, and triggers using DML and DDL.
  • Used Data Transformation Services (DTS) to import and export data between servers.
  • Involved in performance tuning of T-SQL queries.

Environment: MS SQL Server 2005, T-SQL, DTS.
