
Informatica Developer Resume Profile


Alpharetta, GA

PROFESSIONAL SUMMARY

  • Around 8 years of technical and functional experience in data warehouse implementation and ETL methodology using Informatica PowerCenter 9.0.1/8.6/8.1/7.1, Teradata 12/13.10/14, Oracle 10g/9i/8i, and MS SQL Server 2008/2005/2000 in the Finance, Health Insurance, and Pharmacy domains.
  • Expertise in Informatica PowerCenter 7.x/8.x/9.1/9.5 client tools, including Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager, and Workflow Monitor.
  • Designed and developed complex mappings using varied transformation logic, including Unconnected and Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy transformations, both active and passive.
  • Expertise in the design and implementation of Slowly Changing Dimensions (SCD) Type 1, Type 2, and Type 3.
  • Expertise in RDBMS, data warehouse architecture, and modeling. Thorough understanding of and experience in data warehouse and data mart design, Star schema, Snowflake schema, Slowly Changing Dimensions (SCD) Type 1, Type 2, and Type 3, and normalization and denormalization concepts and principles.
  • Experience in working with Mainframe files, COBOL files, XML, and Flat Files.
  • Extensive experience in ETL (Extract, Transform, Load), data integration, and data warehousing using Informatica PowerCenter and Oracle PL/SQL technologies.
  • Extensive knowledge of Business Intelligence and data warehousing concepts, with emphasis on ETL and the System Development Life Cycle (SDLC).
  • Working knowledge of data warehousing concepts like Star schema and Snowflake schema, data marts, and the Kimball methodology used in relational, dimensional, and multidimensional data modeling.
  • Extensive knowledge on Data Profiling using Informatica Developer tool.
  • Implemented Slowly Changing Dimension Type 1, Type 2, and Type 3 methodologies for accessing the full history of account and transaction information. Designed and developed Change Data Capture (CDC) solutions for the project, which capture and analyze changes from daily feeds to maintain history tables.
  • Very strong skills on project management, requirement analysis, business analysis, database modeling, design and analysis, issue co-ordination and development with Teradata/Oracle/SQL Server based Relational Databases.
  • Proficient in Teradata TD12.0/TD13.10/14 database design (conceptual and physical), query optimization, and performance tuning.
  • Strong hands-on experience using Teradata utilities (FastExport, MultiLoad, FastLoad, TPump, BTEQ, and QueryMan).
  • Familiar with creating secondary indexes and join indexes in Teradata.
  • Expertise in different loading modes, including normal and bulk loading, and their challenges. Involved in initial loads, incremental loads, daily loads, and monthly loads.
  • Expert in troubleshooting, debugging, and improving performance at different stages: database, workflows, mappings, repository, and monitor.
  • Involved in Informatica administration, such as creating folders and users and change management, and in moving code from DEV to TEST and PROD using deployment groups in Informatica Repository Manager.
  • Experience in handling different data sources ranging from flat files, Excel, Oracle, SQL Server, Teradata, DB2 databases, XML files.
  • Expertise in tuning the performance of Mappings and sessions in Informatica and determining the performance bottlenecks.
  • Proficient in applying Performance tuning concepts to Informatica Mappings, Session Properties and Databases.
  • Experienced with mentoring Teradata Development teams, data modeling, program code development, test plan development, datasets creation, testing and result documentation, analyzing defects, bug fixing.
  • Hands on experience in handling data from various source systems such as Flat Files, XML Source, Oracle, MS SQL Server, IBM DB2, Teradata and Excel Files
  • Excellent communication skills and experienced in client interaction while providing technical support and knowledge transfer.
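
The SCD Type 2 handling cited above can be illustrated outside Informatica; below is a minimal Python sketch of the expire-and-insert versioning logic, with all key and column names hypothetical:

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended "current row" marker

def apply_scd2(dim_rows, incoming, today, key="cust_id"):
    """Close the current dimension row and insert a new version when
    a tracked attribute changes (SCD Type 2). dim_rows is a list of
    dicts carrying eff_from/eff_to; incoming maps key -> attributes."""
    out = list(dim_rows)
    current = {r[key]: r for r in out if r["eff_to"] == HIGH_DATE}
    for k, attrs in incoming.items():
        cur = current.get(k)
        if cur is not None and all(cur[a] == v for a, v in attrs.items()):
            continue  # no change: keep the current version as-is
        if cur is not None:
            cur["eff_to"] = today  # expire the old version
        out.append({key: k, **attrs, "eff_from": today, "eff_to": HIGH_DATE})
    return out
```

In PowerCenter this is typically built with a Lookup on the dimension's current rows, an Expression comparing attributes, and an Update Strategy; the sketch only shows the row-versioning rule itself.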

TECHNICAL EXPERTISE

Hadoop Development

Data Warehousing ETL

Data Transfer and Data Migration

Data Processing Development

Data Modeling

Query Optimization

Technical and User Documentation

Performance Tuning

TECHNICAL SKILLS

Databases

Oracle 10g/9i/8i, Teradata 14/13.10/13/12/V2R6.2/V2R5, DB2, MS-SQL Server, MS-Access.

DB Tools/Utilities:

Teradata SQL Assistant, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Teradata Manager, Teradata Query Manager, Teradata Administrator, PMON, SQL*Loader, TOAD 8.0.

SAS

SAS 8/9, SAS/BASE SAS/SQL, SAS/GRAPH, SAS/STAT, SAS/MACRO, SAS/ODS, SAS/ACCESS, SAS/QC, SAS/CONNECT, SAS/INTRNET, SAS/LAB, SAS/IML

ETL Tools

Informatica PowerCenter 9.5/9.1/8.x/7.x/6.x/5.1, Informatica PowerExchange, Ab Initio GDE 1.15/1.14/1.13, Co-Op 2.15/2.14/2.13, EME, PL/SQL.

Data Modeling

Erwin 7.3/9, ER Studio, Sybase Power Designer, Logical/Physical/Dimensional, Star/Snowflake/Extended-star schema, OLAP.

Scheduling Tools

Autosys, Tivoli Maestro.

Version Control Tools

Clear Case.

Operating Systems

Sun Solaris 5.0/2.6/2.7/2.8/8.0, Linux, Windows, UNIX.

PROFESSIONAL EXPERIENCE

Confidential

Role: Informatica Developer

Responsibilities:

  • Interacted with business team to understand business needs and to gather requirements.
  • Prepared requirements document in order to achieve business goals and to meet end user expectations.
  • Created mapping documents for source-to-stage and stage-to-target mappings.
  • Involved in creating data models using Erwin.
  • Designed mappings to include restart logic.
  • Created Source and Target Definitions, Reusable transformations, Mapplets and Worklets.
  • Created Mappings and used transformations like Source Qualifier, Filter, Update Strategy, Lookup, Router, Joiner, Normalizer, Aggregator, Sequence Generator and Address validator.
  • Involved in tuning the mappings, sessions and the Source Qualifier query.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Managed all technical aspects of the ETL mapping process with other team members.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions, and incremental loading, and unit tested the mappings.
  • Created sessions and workflows to run with the logic embedded in the mappings.
  • Extensively used SQL, PL/SQL code to develop custom ETL solutions and load data into data warehouse system.
  • Transformed the logical data model into physical data model on Oracle 11g data warehouse environment to give optimal performance for ETL jobs as well as BI reports.
  • Provided the dimensional data model to give optimal performance by changing the Primary Index of tables and applying various performance tuning techniques on Oracle 11g data warehouse environment.
  • Created and modified several UNIX shell Scripts according to the changing needs of the project and client requirements.
  • Wrote Unix Shell Scripts to process the data received from source system on daily basis.
  • Involved in the continuous enhancements and fixing of production problems.
  • Developed PL/SQL triggers and master tables for automatic creation of primary keys.
  • Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.
  • Created and reviewed scripts to create new tables, views, queries for new enhancement in the application using TOAD.
  • Created indexes on the tables for faster retrieval of the data to enhance database performance.
  • Involved in data loading using PL/SQL and SQL Loader calling UNIX scripts to download and manipulate files.
  • Performed SQL and PL/SQL tuning and Application tuning using various tools like EXPLAIN PLAN, SQL TRACE, TKPROF and AUTOTRACE.
  • Extensively involved in using hints to direct the optimizer to choose an optimum query execution plan.
  • Used Bulk Collections for better performance and easy retrieval of data, by reducing context switching between SQL and PL/SQL engines.
  • Created PL/SQL scripts to extract the data from the operational database into simple flat text files using the UTL_FILE package.
  • Partitioned the fact tables and materialized views to enhance the performance.
  • Created records, tables, and collections (nested tables and arrays) to improve query performance by reducing context switching.
  • Extensively used the advanced features of PL/SQL like Records, Tables, Object types and Dynamic SQL.
  • Handled errors using Exception Handling extensively for the ease of debugging and displaying the error messages in the application.
  • Developed and modified UNIX shell scripts as part of the ETL process.
  • Participated in discussions with team members on production issues and resolved them.
  • Monitored data quality, generated weekly/monthly/yearly statistics reports on production process success/failure rates for causal analysis as part of maintenance, and enhanced existing production ETL scripts.
  • Managed Scheduling of Tasks to run any time without any operator intervention.
  • Leveraged workflow manager for session management, database connection management and scheduling of jobs.
  • Worked with Pre and Post Session SQL commands to drop and recreate the indexes on data warehouse using Source Qualifier Transformation of Informatica Power center.
  • Implemented pipeline partitioning concepts like hash-key, round-robin, key-range, and pass-through techniques in mapping transformations.
  • Used Autosys for scheduling and created various UNIX shell scripts for scheduling data cleansing scripts and the loading process.
  • Maintained the batch processes using UNIX shell scripts.
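
The pipeline partitioning techniques in the responsibilities above differ in how rows are assigned to parallel partitions. As a hedged illustration (plain Python, not Informatica APIs), here are the two simplest assignment rules, round-robin and hash-key:

```python
from zlib import crc32

def round_robin(rows, n):
    """Distribute rows evenly across n partitions, ignoring content."""
    parts = [[] for _ in range(n)]
    for i, row in enumerate(rows):
        parts[i % n].append(row)
    return parts

def hash_key(rows, n, key):
    """Send all rows sharing a key value to the same partition, so
    key-dependent operations (e.g. per-partition aggregation) stay correct."""
    parts = [[] for _ in range(n)]
    for row in rows:
        parts[crc32(str(row[key]).encode()) % n].append(row)
    return parts
```

Round-robin balances volume but scatters equal keys; hash-key keeps equal keys together, which is what an Aggregator downstream of the partition point needs.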

Environment: Informatica PowerCenter 9.5.1, Informatica Developer 9.5.1, UNIX, Oracle 10g, fixed-width files, TOAD, Harvest SCM, Windows XP, MS Office Suite.

Confidential

Role: Informatica Developer

Responsibilities:

  • Analyzing, designing, and developing ETL strategies and processes, writing ETL specifications, Informatica development and administration, and mentoring other team members.
  • Developed mapping parameters and variables to support SQL override.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, SQL, and Lookup (flat file and database) to develop robust mappings in the Informatica Designer.
  • Implemented Pushdown Optimization (PDO) to resolve performance issues in complex mappings whose numerous transformations were degrading session performance.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Moving data from source systems to different schemas based on the dimension and fact tables, using Slowly Changing Dimensions Type 2 and Type 1.
  • Mostly worked on dimensional data modeling, Star schema, and Snowflake schema modeling.
  • Worked on the various enhancements activities, involved in process improvement.
  • Used Informatica client tools Source Analyzer, Warehouse designer, Mapping designer, Transformation Developer, WorkFlow Manager, Workflow Monitor.
  • Worked on Change Data Capture (CDC) using CHKSUM to handle changes in the data when no flag or date column is present to represent the changed row.
  • Worked on reusable code known as "tie-outs" to maintain data consistency: after ETL loading completes, it compares the source and target to validate that no data was lost during the ETL process.
  • Worked on PowerExchange bulk data movement processes using the PowerExchange Change Data Capture (CDC) method, PowerExchange Navigator, and PowerExchange bulk data movement. PowerExchange CDC can retrieve updates at user-defined intervals or in near real time.
  • Worked independently on the critical milestone of the project interfaces by designing completely parameterized code to be used across the interfaces, and delivered on time in spite of several hurdles, including requirement changes, business rule changes, source data issues, and complex business functionality.
  • Analyzed the source systems to detect the data patterns and designed ETL strategy to process the data.
  • Created and modified several UNIX shell Scripts according to the changing needs of the project and client requirements.
  • Wrote Unix Shell Scripts to process the data received from source system on daily basis.
  • Involved in the continuous enhancements and fixing of production problems.
  • Created indexes on the tables for faster retrieval of the data to enhance database performance.
  • Involved in data loading using PL/SQL and SQL Loader calling UNIX scripts to download and manipulate files.
  • Performed SQL and PL/SQL tuning and Application tuning using various tools like EXPLAIN PLAN, SQL TRACE, TKPROF and AUTOTRACE.
  • Extensively involved in using hints to direct the optimizer to choose an optimum query execution plan.
  • Created PL/SQL scripts to extract the data from the operational database into simple flat text files using UTL FILE package.
  • Supported the development and production support group in identifying and resolving production issues.
  • Was instrumental in understanding the functional specifications and helped the rest of the team to understand the same.
  • Involved in all phases of development: Analysis, Design, Coding, Unit Testing, System Testing, and UAT.
  • Created and executed test cases, test scripts, and test summary reports.
  • Provided support and training to the team regarding technical and functional aspects of the project.
  • Implemented the DBMOVER configuration file, dbmover.cfg, to configure the operation of various PowerExchange tasks as well as their communication with other PowerExchange tasks.
  • Applied different types of monitoring for PowerExchange processes, such as dtlcacon, dtllst, Oracle SCN, lag scripts, and a heartbeat table to monitor the lag.
  • Created and managed different PowerExchange directories, like the condense files directory, checkpoint directory, etc.
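
The CHKSUM-based CDC approach noted above reduces to comparing per-row checksums between loads when no flag or date column exists. A minimal Python sketch, with hypothetical key and column names:

```python
import hashlib

def row_checksum(row, cols):
    """Deterministic checksum over the tracked columns of a row."""
    joined = "|".join(str(row[c]) for c in cols)
    return hashlib.md5(joined.encode()).hexdigest()

def detect_changes(previous, incoming, key, cols):
    """Classify incoming rows as inserts or updates by comparing
    checksums against the previous load's snapshot."""
    prev_sums = {r[key]: row_checksum(r, cols) for r in previous}
    inserts, updates = [], []
    for row in incoming:
        old = prev_sums.get(row[key])
        if old is None:
            inserts.append(row)          # key never seen before
        elif old != row_checksum(row, cols):
            updates.append(row)          # same key, changed attributes
    return inserts, updates
```

In practice the previous load's checksums would be persisted in a staging table rather than recomputed, but the comparison rule is the same.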

Environment: Informatica 9.1 (Designer, Repository Manager, Workflow Manager, Workflow Monitor), Informatica 8.x, Oracle 10g, UNIX, Citrix, TOAD, PuTTY, PL/SQL Developer, PowerExchange 9.2.1/8.6.1

Confidential

Role: Informatica Developer

Responsibilities:

  • The team's prime responsibility was to develop a Patient Care Data Warehouse (PCDW). This data warehouse is used to access detailed clinical data on a single platform and to facilitate enterprise-wide data analysis within the health care environment.
  • Designed and deployed the overall ETL strategy, including CDC, SCDs, partition management, materialized views, and other complex mapping logic.
  • Migrated from Informatica PowerCenter Repository version 7.1 to 8.1 and applied patches over 8.1.
  • Implemented appropriate Error handling logic, data load methods and capturing invalid data from the source system for further data cleanup.
  • Involved in logical and physical database design using the ERWIN tool.
  • Developed shared objects in Informatica for source/target/lookup transformations; developed complex mappings, sessions, workflows, worklets, and database connections.
  • Entered metadata descriptions at both the transformation and port level in complex mappings.
  • Designed the ETL process and scheduled the stage and mart loads for the data mart.
  • Debugging and Troubleshooting Informatica Mappings.
  • SQL Query tuning including Explain Plan, using Optimizer Hints, Indexes and Histograms.
  • Involved in Postproduction support and enhancements.
  • Analyzed the Specifications and involved in identifying the source data needs to be moved to data warehouse
  • Involved in Project scheduling and Project Estimations.
  • Worked closely with the Requirements Manager and business Analysts on the requirements collection process.
  • Provide technical support to the IT staff in identifying and resolving problems. Act as a liaison with internal teams to research, analyze and propose solutions to technical, operational and test scenarios.
  • Assisted the tester in developing the test cases and reviewed them.
  • Have thorough experience on Defect Management.
  • Supervised the Environment migration process from Development to Test.
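
The error-handling pattern described above — capturing invalid source rows for later cleanup — can be sketched as a simple valid/reject router. This is an illustration with hypothetical rule names, not the project's actual code:

```python
def split_valid_invalid(rows, validators):
    """Route rows that fail any validation rule to a reject set,
    tagging each reject with the failed rules for later cleanup."""
    valid, rejects = [], []
    for row in rows:
        failed = [name for name, check in validators.items() if not check(row)]
        if failed:
            rejects.append({"row": row, "failed_rules": failed})
        else:
            valid.append(row)
    return valid, rejects
```

In PowerCenter the equivalent is usually an Expression that flags the row plus a Router sending failures to an error table with the rule name attached.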

Environment: Informatica 8.5.1, Oracle 10g, Informatica PowerCenter 8.1.1, Erwin 4.5, Oracle Applications, flat files, PL/SQL, TOAD 9.0, SQL, UNIX, mainframe JCL jobs, Quality Center 9.0

Confidential

Role: Informatica Developer

Responsibilities:

  • Working on different tasks in Workflows like sessions, events raise, event wait, e-mail, command, worklets and scheduling of the workflow.
  • Creating sessions, configuring workflows to extract data from various sources, transforming data, and loading into enterprise data warehouse.
  • Running and monitoring daily scheduled jobs using Workload Manager to support EDW (Enterprise Data Warehouse) loads for history as well as incremental data.
  • Design, development, and documentation of the ETL (Extract, Transform, Load) strategy to populate the Data Warehouse from the various source systems.
  • Prepared data marts on policy data, policy coverage, claims data, client data and risk codes.
  • Extensively used Informatica PowerCenter 8.6 to create and manipulate source definitions, target definitions, mappings, mapplets, transformations, re-usable transformations, etc.
  • Involved in the design and development of complex ETL mappings and stored procedures in an optimized manner. Used PowerExchange for mainframe sources.
  • Involved in loading data from source tables to the ODS (Operational Data Store) and XML files using transformation and cleansing logic in Informatica.
  • Based on the logic, used various transformation like Source Qualifier, Normalizer, Expression, Filter, Router, Update strategy, Sorter, Lookup, Aggregator, Joiner, XML, Stored procedure transformations in the mapping.
  • Involved in performance tuning of mappings, transformations and workflow sessions to optimize session performance.
  • Developed Informatica SCD Type 1, Type 2, and Type 3 mappings and tuned them for better performance. Extensively used almost all of the transformations of Informatica, including complex Lookups, Stored Procedures, Update Strategy, mapplets, and others.
  • Snowflake Schema was mainly used with Geography, Customer, Product, and Time as basic dimensions.
  • Creating Test cases for Unit Test, System, Integration Test and UAT to check the data quality.
  • Investigating failed jobs and writing SQL to debug data load issues in Production.
  • Writing SQL Scripts to extract the data from Database and for Testing Purposes.
  • Interacting with the Source Team and Business to get the Validation of the data.
  • Supported the code after postproduction deployment.
  • Familiar with Agile software methodologies scrum .
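
The incremental (history vs. delta) loads mentioned above are commonly driven by a persisted high-water mark. A minimal sketch, assuming a monotonically increasing timestamp column; the column name is hypothetical:

```python
def incremental_extract(source_rows, last_watermark, ts_col="updated_at"):
    """Pull only rows modified since the previous load, and return the
    new high-water mark to persist for the next run."""
    delta = [r for r in source_rows if r[ts_col] > last_watermark]
    new_watermark = max((r[ts_col] for r in delta), default=last_watermark)
    return delta, new_watermark
```

In a mapping this is the Source Qualifier filter driven by a mapping variable (e.g. SETMAXVARIABLE) rather than Python, but the watermark handshake is the same.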

Environment: Informatica PowerCenter 8.6.1/8.1.1, Oracle Business Intelligence (OBIEE), Oracle 10g/9i, MS SQL Server 2008, TOAD for SQL Server, flat files, PL/SQL, Windows 2000, XML.

Confidential

Role: Database Developer

Responsibilities:

  • Analyzed the Data Distribution and Reviewed the Index choices
  • Developed Informatica mappings to identify SKUs by full name and product description combination, writing conditional statements and using transformations such as Expression, Aggregator, Update Strategy, Lookup, Router, etc.
  • Extensively worked on Filter, Router, Sequence Generator, Look Ups, Update Strategy, Joiner, Source Qualifier, Expression, Sorter and Aggregator Transformations in Informatica.
  • Debugged the Informatica mappings and validated the data in the target tables once it was loaded by the mappings.
  • Transformed and loaded data into Enterprise Data Warehouse tables using Informatica from the legacy systems, loading the data into targets through the ETL process by scheduling the workflows.
  • Developed Informatica objects (mappings, sessions, workflows) based on the design documents.
  • Developed Informatica SCD Type 1, Type 2, and Type 3 mappings and tuned them for better performance.
  • Created Web forms which are used by many departments.
  • Developed wrapper shell scripts for calling Informatica workflows using the pmcmd command, and created shell scripts to fine-tune the ETL flow of the Informatica workflows.
  • Experience in managing different PowerExchange directories, like the condense files directory, checkpoint directory, etc.
  • Used reusable transformation objects such as mapplets to avoid duplication of metadata, reducing development time.
  • Migrated the code into QA testing and supported the QA team and UAT users.
  • Worked with PowerCenter versioning (check-in, check-out), querying to retrieve specific objects, and maintaining the history of objects.
  • Created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
  • Developed PL/SQL triggers and master tables for automatic creation of primary keys.
  • Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.
  • Created and reviewed scripts to create new tables, views, queries for new enhancement in the application using TOAD.
  • Developed many stored procedures that act as data source for all the web-based and reporting applications here in this project.
  • Wrote many ad hoc queries daily to get the data needed by management.
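
The pmcmd wrapper scripts mentioned above generally just assemble a startworkflow call and surface its exit code. A hedged Python sketch: the service, domain, user, folder, and workflow names below are placeholders, and flag spelling should be verified against your PowerCenter version's pmcmd reference:

```python
import shutil
import subprocess

def build_pmcmd_args(service, domain, user, folder, workflow):
    """Assemble a pmcmd startworkflow invocation (argument values
    are hypothetical placeholders; password handling is omitted)."""
    return ["pmcmd", "startworkflow",
            "-sv", service, "-d", domain, "-u", user,
            "-f", folder, "-wait", workflow]

def run_workflow(args):
    """Run pmcmd if it is installed; return its exit code, or None
    when pmcmd is not on PATH (e.g. outside an Informatica host)."""
    if shutil.which(args[0]) is None:
        return None
    return subprocess.call(args)
```

A production wrapper would also redirect the pmcmd log and map non-zero exit codes to scheduler alerts, as the resume's shell scripts presumably did.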

Environment: Informatica Power Center 8.6.1, Oracle 11g, MS SQL Server 2008, SQL, PL/SQL, T-SQL, SQL Plus, TOAD, Erwin, Unix, Oracle Applications 11i, Sun Solaris

Confidential

Role: ETL Developer

Responsibilities:

  • Gathered user requirements, performed source system analysis, and established mappings between source and target attributes. Performed source data analysis and design documentation.
  • Parsed high-level design spec to simple ETL coding and mapping standards.
  • Designed and developed ETL Mappings using Informatica to extract data from Mainframe DB2 tables, flat files and Oracle, and to load the data into the target database.
  • Extensively used various transformations like Source Qualifier, Joiner, Aggregator, Update Strategy, Lookup, Rank, and Filter.
  • Created PL/SQL procedures to populate base data mart aggregation structure. Analyzed and fine-tuned PL/SQL scripts.
  • Developed ETL into data mart for phase-II data elements from staging.
  • Created complex Informatica mappings to load the data mart and monitored them. The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, and Sequence Generator.

  • Defined Target Load Order Plan and Constraint based loading for loading data correctly into different Target Tables.
  • Administered Informatica PowerCenter, including migrations, repository backups/restores, and upgrades.
  • Administered user privileges, groups, and folders, including creation, update, and deletion. Migrated new and changed Informatica objects across environments using folder-to-folder and deployment group methods.
  • Used predefined shell scripts to run jobs and data loads.
  • Worked on SQL tools like TOAD to run SQL Queries to validate the data.
  • Created, updated and maintained ETL technical documentation.
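
Constraint-based loading and the Target Load Order Plan mentioned above amount to ordering targets so parent tables load before their foreign-key children. A minimal sketch using Python's standard-library topological sort; the table names are hypothetical:

```python
from graphlib import TopologicalSorter

def target_load_order(fk_deps):
    """Given child -> {parent tables} foreign-key dependencies, return
    a load order in which every parent precedes its children."""
    return list(TopologicalSorter(fk_deps).static_order())
```

PowerCenter resolves this per-session from the target definitions' key relationships; the sketch just makes the ordering rule explicit.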
