
Sr. Informatica Developer Resume


CO

SUMMARY

  • Over 7 years of progressive IT Experience as Oracle/ETL Developer in Relational, Transactional and Object database systems and Windows & UNIX operating systems.
  • Expertise in all areas of the Software Development Life Cycle including System Study, Analysis, Coding, Testing and Documentation.
  • Extensive experience in integrating heterogeneous data sources such as SQL Server, Oracle, Teradata, flat files, Excel and XML files, and loading the data into data warehouses and data marts using PowerCenter.
  • Worked extensively in creating complex mappings using transformations such as Connected and Unconnected Lookup, Source Qualifier, Joiner, Rank, Sorter, Router, Filter, Expression, Aggregator, Update Strategy and Normalizer.
  • Worked on performance tuning, indexing and partitioning techniques at the source, target, mapping and session levels in Informatica.
  • Identified bugs in existing mappings by analyzing the data flow and evaluating transformations using Informatica Debugger.
  • Experienced in Code Migration, Version control, scheduling tools, Auditing, shared folders, and Data Cleansing.
  • Expertise in using DDL, DQL, DML and DCL statements to create tables, indexes, views, sequences, synonyms and materialized views.
  • Expertise in enforcing data integrity using foreign key, unique and check constraints.
  • Expertise in writing SQL in large-scale relational database environments (complex joins, queries across multiple databases) and in developing advanced PL/SQL programs using procedures, functions, packages, cursors and triggers to fulfill business requirements.
  • Expertise in writing Oracle system-defined and user-defined exceptions.
  • Expertise in using BULK COLLECT along with collections and records to improve performance in PL/SQL blocks.
  • Strong experience in Oracle Performance tuning of complex SQL queries by using EXPLAIN PLAN and using Oracle Hints to improve performance.
  • Experience in retrieving data from external data files and loading them into Oracle tables using tools such as UTL_FILE, SQL*Loader and external tables.
  • Experience in Extracting, Transforming and Loading the data into data warehouse system.
  • Experience with Star Schema Modeling and knowledge of Snowflake Dimensional modeling.
  • Experience in developing UNIX Korn shell scripts for batch scheduling, with good knowledge of Perl data references and data structures.
  • Expertise in following the Agile process in application development.
  • Extensive experience in Relational and Dimensional Data modeling for creating Logical and Physical Design of Database and ER Diagrams using multiple data modeling tools.
  • Good communication and interpersonal skills, ability to learn quickly, with good analytical reasoning and adaptive to new and challenging technological environment.
  • Strong Team working spirit, relationship management and presentation skills.
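The BULK COLLECT technique mentioned above can be sketched as follows; the staging and dimension table names are hypothetical, chosen only to illustrate the batched fetch-and-insert pattern:

```sql
DECLARE
  -- Hypothetical source cursor; table and column names are illustrative only.
  CURSOR c_src IS SELECT cust_id, cust_name FROM stg_customers;
  TYPE t_src_tab IS TABLE OF c_src%ROWTYPE;
  l_rows t_src_tab;
BEGIN
  OPEN c_src;
  LOOP
    -- Fetch in batches to bound memory use and cut context switches
    FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000;
    EXIT WHEN l_rows.COUNT = 0;
    -- FORALL sends the whole batch to the SQL engine in one round trip
    FORALL i IN 1 .. l_rows.COUNT
      INSERT INTO dim_customers (cust_id, cust_name)
      VALUES (l_rows(i).cust_id, l_rows(i).cust_name);
  END LOOP;
  CLOSE c_src;
  COMMIT;
END;
/
```

The LIMIT clause is what keeps PGA memory bounded on large tables; without it, BULK COLLECT pulls the entire result set into the collection at once.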

TECHNICAL SKILLS

Databases: Oracle 11g/10g/9i, MS SQL Server 2013/2012/2008 R2, DB2.

Scripting: SQL, PL/SQL, UNIX Shell Scripting, PERL.

Languages: C, C++, Java, J2EE, XML, HTML, JavaScript.

ETL/Query Tools: Informatica PowerCenter 9.x/8.x, Teradata SQL Assistant (V 14.01.0.02, V 13.10.0.03), TOAD, SQL Navigator, SQL*Plus, SQL Developer, SQL*Loader.

Operating Systems: Windows 2008/7/XP, LINUX, UNIX.

Other Tools: Erwin, Putty, WinSCP, Robot Scheduler, Wherescape Red, HP ALM Quality Center, MS Office 2013/07/03.

PROFESSIONAL EXPERIENCE

Confidential, CO

Sr. Informatica Developer

Responsibilities:

  • Interacted with stakeholders, gathered requirements, assigned priorities to each, and analyzed the impact on existing business processes.
  • Documented business processes and translated them into functional specifications.
  • Elicited requirements using interviews, document analysis, requirements workshops, business process descriptions, use cases, scenarios, business analysis, and task and workflow analysis.
  • Understood business requirements and worked with the functional team to define the scope of work.
  • Used session logs, workflow logs and debugger to debug the session and analyze the problem associated with the mappings and generic scripts.
  • Analyzed requirements from the users and created, reviewed the specifications for the ETL.
  • Designed Incremental strategy, Created Reusable Transformations, Mapplets, Mappings, Sessions and Workflows.
  • Used Session parameters, Mapping variable/parameters and created Parameter files for imparting flexible runs of workflows based on changing variable values.
  • Created complex ETL mappings to load data using transformations such as Source Qualifier, Sorter, Aggregator, Expression, Joiner, Dynamic Lookup, Connected and Unconnected Lookup, Filter, Sequence Generator, Router and Update Strategy.
  • Performed performance tuning and pushdown optimization for batch and reporting applications using the Teradata EXPLAIN facility.
  • Captured metadata, capacity plans and performance metrics for all queries created; defined the archival strategy and provided guidance for performance tuning.
  • Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems.
  • Implemented Slowly Changing Dimensions (SCD Types 1 and 2).
  • Worked on existing mapping for the performance tuning to reduce the total ETL process time.
  • Extensively worked on Robot Scheduler and created automated group jobs.
  • Used SFTP mechanism to transfer the data files between Production server and External servers.
  • Fixed performance issues and worked on query optimization.
  • Interacted with third-party teams such as the DBA and migration teams to migrate code to QA and PROD.
  • Supported production and performance issues.
  • Involved in unit and system testing to verify that data loaded into the targets from the different source systems was accurate and met user requirements.
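A session parameter file of the kind described above typically takes the form below; the folder, workflow, session and variable names are illustrative only:

```
[DW_FOLDER.WF:wf_daily_load.ST:s_m_load_customers]
$$LOAD_DATE=2015-06-30
$$SRC_SYSTEM=TERADATA
$DBConnection_Source=TD_PROD
$PMSessionLogFile=s_m_load_customers.log
```

Pointing the session at a regenerated file like this lets the same workflow run against different dates or source systems without touching the mapping.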

Environment: Informatica PowerCenter 9.1, Informatica PowerExchange, Oracle, WhereScape RED, Teradata V2R5, Mainframe flat files, UNIX Shell Scripting, Robot Scheduler, PuTTY, WinSCP, HP ALM Quality Center.

Confidential, Charlotte, NC

Sr.ETL Developer

Responsibilities:

  • Involved in complete Data Warehouse Life Cycle from requirements gathering to end user support.
  • Involved in the Analysis of Physical Data Model for ETL Source to Target mapping and the process flow diagrams for all the business functions.
  • Wrote Procedures, Functions and Packages to Extract, Transform and Load (ETL) data from Operational Systems to the Data Warehouse.
  • Converted Data Transformation Services (DTS) application to SQL Server Integrated Services (SSIS).
  • Created PL/SQL packages, procedures and functions that extensively used PL/SQL cursors, user defined object types, exception handling.
  • Monitored database performance and took various steps to improve it using Microsoft SQL Server Enterprise Manager.
  • Wrote PL/SQL - Triggers and Stored Procedures for data validation and to create and drop indexes.
  • Improved Database performance by optimizing back end queries, PL/SQL tuning and implementing performance improvements by analyzing indexes and partitioning tables.
  • Developed SQL Loader scripts for bulk data loads as part of daily data extraction.
  • Wrote complex SQL queries and PL/SQL procedures to extract data from various source tables of the data warehouse.
  • Proactively participated in SQL tuning using ANALYZE, DBMS_STATS, EXPLAIN PLAN, SQL Trace and TKPROF.
  • Wrote complex stored procedures using dynamic SQL to populate temp tables from fact and dimension tables for reporting purposes.
  • Performed code impact analysis and scheduling with WhereScape RED.
  • Worked on Performance tuning by using Explain Plan and various Hints.
  • Used Collections, Bulk Collects to improve performance by minimizing the number of context switches between the PL/SQL and SQL engines.
  • Customized UNIX scripts as required for preprocessing steps and to validate input and output data elements.
  • Created/modified various Packages, Stored Procedure & Functions as part of the data validation and cleansing process.
  • Tested the ETL process in testing environment and created test cases.
  • Wrote UNIX shell scripts for development and DBA activities.
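A SQL*Loader control file for a daily bulk load of the kind described above might look like this; the file, table and column names are hypothetical:

```
-- Loads a comma-delimited daily extract into a staging table
LOAD DATA
INFILE 'daily_extract.dat'
APPEND
INTO TABLE stg_daily_sales
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( sale_id,
  sale_date  DATE "YYYY-MM-DD",
  amount,
  region     CHAR(20) )
```

A wrapper shell script would invoke it with something like `sqlldr control=daily_sales.ctl log=daily_sales.log`, supplying credentials from the environment rather than hard-coding them.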

Environment: MS SQL Server 2012, Flat files, Sequential files, WhereScape RED, SQL, PL/SQL, TOAD, SQL*Loader, SQL*Plus, DMExpress (Syncsort), UNIX, StarTeam.

Confidential, Herndon, VA

Informatica/ETL Developer

Responsibilities:

  • Extensively used Informatica PowerCenter for extracting, transforming and loading data from sources including Oracle, DB2 and flat files. Collaborated with the EDW team on high-level design documents for the extract, transform, validate and load process, including data dictionaries, metadata descriptions, file layouts and flow diagrams.
  • Participated in User meetings, gathering requirements, discussing the issues to be resolved.
  • Worked with business analysts, application developers, production teams and cross-functional units to identify business needs and discuss solution options.
  • Translated user inputs into ETL design documents.
  • Participated in design reviews of the data model and Informatica mapping design.
  • Worked on dimensional modeling to Design and develop STAR Schemas, identifying Fact and Dimension Tables for providing a unified view to ensure consistent decision making.
  • Integrated data from diverse source systems including Salesforce, Siebel, MS SQL Server and flat files using Informatica PowerCenter.
  • Accessed native database APIs through SQL in PowerExchange to provide high-performance extraction, conversion and filtering of data without intermediate staging or programming.
  • Performed match/merge and ran match rules to check the effectiveness on data.
  • Created mappings using various Transformations like Aggregator, Expression, Filter, Router, Joiner, Lookup, Update strategy, Source Qualifier, Sequence generator, Stored Procedure and Normalizer.
  • Extensively used debugger for troubleshooting issues and checking session stats and error Logs.
  • Performed validation and testing of Informatica mappings against the pre-defined ETL design standards.
  • Implemented performance tuning at the database and Informatica levels; reduced load time by using partitions and running sessions concurrently.
  • Involved in writing SQL, PL/SQL, Stored Procedures, Triggers and Packages in Data Warehouse environments that employ Oracle.
  • Developed interfaces using UNIX shell scripts to automate bulk load, update and batch processes; scheduled sessions and batches on the server.
  • Wrote shell scripts to automate exporting data into flat files for backup and deleting data from staging tables for the given time period.
  • Extensive experience in performance tuning, identifying bottlenecks and resolving them to improve performance at the database, Informatica mapping and session levels.
  • Involved in the optimization of SQL queries, which resulted in substantial performance improvement for the conversion processes.
  • Worked on performance testing within an Oracle environment, performing PL/SQL code development, building high-performing Oracle queries and tuning existing queries.
  • Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
  • Involved in the migration of objects in all phases (DEV, INT and PRD) of the project and trained developers to maintain the system in production.
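The query-tuning work described above typically starts from the optimizer plan; a sketch against hypothetical fact and dimension tables:

```sql
-- Capture the optimizer's plan for a candidate query
EXPLAIN PLAN FOR
  SELECT c.cust_name, SUM(f.amount)
  FROM   fact_sales f
  JOIN   dim_customers c ON c.cust_key = f.cust_key
  WHERE  f.sale_date >= DATE '2014-01-01'
  GROUP  BY c.cust_name;

-- Render the plan (join order, access paths, estimated rows)
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```

Comparing the reported access paths before and after adding an index or hint is the usual way to confirm that a change actually helped rather than guessing from wall-clock time alone.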

Environment: Informatica PowerCenter 8.1.1, Oracle 11g, Flat files, Erwin, UNIX Shell Scripting, WhereScape RED, TOAD, Autosys, Windows 2003 Server, PuTTY, WinSCP, HP ALM Quality Center.

Confidential, New York, NY

Informatica Developer

Responsibilities:

  • Involved in the design and development of business requirements in liaison with business users and technical teams. Analyzed application requirements, provided recommended designs, and studied the current system to understand the existing data structures.
  • Participated actively in end user meetings and collected requirements.
  • Used Informatica PowerCenter 8.1 for extraction, transformation and loading (ETL) of data into the data warehouse.
  • Designed, developed, and documented several mappings to extract the data from Flat files and Relational sources.
  • Developed Mappings including transformations: Aggregator, Joiner, Lookup (Connected & Unconnected), Filter, Update Strategy, Stored Procedure, Router, Expression, SQ and Sorter.
  • Developed reusable transformations, mappings and mapplets conforming to the business rules.
  • Created workflows and tested mappings and workflows in development and production environment.
  • Used Debugger to test the mappings and fix the bugs.
  • Actively involved in performance improvements of mapping and sessions and fine-tuned all transformations.
  • Developed interfaces using UNIX shell scripts to automate bulk load, update and batch processes; scheduled sessions and batches on the server.
  • Wrote shell scripts to automate exporting data into flat files for backup and deleting data from staging tables for the given time period.
  • Extensive experience in identifying bottlenecks and resolving them to improve performance at the database, Informatica mapping and session levels.
  • Involved in the optimization of SQL queries, which resulted in substantial performance improvement for the conversion processes.
  • Involved in enhancement and maintenance activities of the data warehouse, including performance tuning and rewriting stored procedures for code enhancements.
  • Performed Informatica code migration across development, QA and production, and fixed mapping and workflow problems.
  • Performance tuning of existing stored procedures, functions, views and SQL queries.

Environment: Informatica 8.6, UNIX Scripting, Toad, Oracle 10g, Business Objects 6.5, Control-M, Data Analyzer

Confidential, Richmond, VA

Oracle Developer

Responsibilities:

  • Extensively worked on performance tuning using explain plans and various hints.
  • Worked on table partitioning (range and list) and deployed local indexes on partitioned tables.
  • Involved in Extraction, Transformation and Loading (ETL) of the data in the data warehouse for Oracle 9i database.
  • Involved in extracting data from heterogeneous databases residing on Mainframe and UNIX systems.
  • Used Oracle 9i features such as the MERGE statement for various transformations.
  • Used UNIX Shell scripts (KORN) for Extracting and Parsing data from the files.
  • Used stored procedures in Oracle for retrieving the data from the database for solving complex queries.
  • Wrote Unix Shell Scripts, undertook Code Optimization and Performance tuning of the application.
  • Created stored procedures, functions and packages in Oracle 9i using SQL and PL/SQL for the audit trail.
  • Involved in writing complex scripts for Data Transformation, ETL process (Extract, Transform & Load)
  • Analyzed business documents and internal and external source systems to develop the Health Insurance Claims data warehouse logical and physical models (facts and dimensions) and the business process model.
  • Performed the uploading and downloading flat files from UNIX server using FTP.
  • Involved in debugging and tuning PL/SQL code, tuning queries, and optimization for the Oracle database.
  • Reviewed data transformation rules and provided technical suggestions on data transformation logic and pseudo code (PL/SQL).
  • Worked on performance tuning using EXPLAIN PLAN and hints, and worked on partitioning tables using the range method.
  • Worked on partitioning and indexing concepts, with local and global indexes on partitioned tables.
  • Created test scripts for the generated final reports.
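The Oracle MERGE feature mentioned above combines the insert and update paths of a load into one statement; a sketch against hypothetical staging and dimension tables:

```sql
-- Upsert: update existing customers, insert new ones, in a single pass
MERGE INTO dim_customers d
USING stg_customers s
   ON (d.cust_id = s.cust_id)
WHEN MATCHED THEN
  UPDATE SET d.cust_name  = s.cust_name,
             d.updated_dt = SYSDATE
WHEN NOT MATCHED THEN
  INSERT (cust_id, cust_name, created_dt)
  VALUES (s.cust_id, s.cust_name, SYSDATE);
```

In Oracle 9i both the WHEN MATCHED and WHEN NOT MATCHED clauses are required; later releases allow either one to be omitted.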

Environment: Oracle 9i/10g, SQL*Plus, PL/SQL, SQL*Loader, XML, Korn Shell Scripts, SQL Navigator, Windows XP, MS Excel, TOAD.
