
Team Lead Onsite Resume Profile


Summary:

  • 8 years of IT experience in software analysis, design, development, and support of client-server applications, providing Business Intelligence solutions in Data Warehousing for Decision Support Systems, OLAP, and OLTP application development.
  • Strong experience in Data Warehouse, Data Mart, Data Integration, and Data Conversion projects, with ETL experience using Informatica Power Center 9.x/8.x/7.x/6.x.
  • Clear understanding of Data Warehousing concepts, with emphasis on Project Management, ETL, and the Software Development Life Cycle, including requirement analysis, design, development, testing, implementation, and support.
  • Extensively worked with Informatica Mapping Variables, Mapping Parameters and Parameter Files.
  • Expertise in Exception Handling mappings for Data Quality, Data Cleansing, and Data Validation.
  • Expertise in Informatica Mappings, Mapplets, Sessions, Workflows, and Worklets for data loads.
  • Expertise in Star and Snowflake schemas to fit reporting, query, and business analysis requirements.
  • Experience in developing Reusable components for using them across the projects.
  • Developed slowly changing dimension mappings using type-I, type-II, and type-III methods.
  • Extensively used enterprise Data Warehousing ETL methodologies to support data extraction, transformation, and loading in a corporate-wide ETL solution using Informatica Power Center.
  • Experience in Trouble shooting and implementing Performance tuning at various levels such as Source, Target, Mapping, Session and System in ETL Process.
  • Worked with SQL/PL-SQL to write complex SQL queries, Stored Procedures, Triggers, Functions, and PL/SQL packages.
  • Worked on the Address Validator using Discrete and Hybrid techniques to calculate mailability scores and match codes for cleansing address attributes.
  • Used Consolidation transformation to extract only survivor records.
  • Used techniques such as the Hamming Distance, Edit Distance, Bigram, and Reverse Hamming Distance algorithms to perform matching and identify duplicate records.
  • Experience working with the Standardizer, Parser, Match, Merge, and Consolidation transformations in IDQ.
  • Experience in creating Reference tables using Informatica Analyst tool.
  • Strong Data Analysis and Data Profiling background using Informatica Analyst, Informatica Data Explorer (IDE), and Informatica Data Quality (IDQ).
  • Experience in working with Informatica Power exchange for Mainframe Sources.
  • Experience in writing Unix shell scripts.
  • Extensively used third-party schedulers, including Control-M and TWS.
  • Experience writing test plans and test cases, and working with Quality Center to execute test plans and fix bugs and defects raised during testing.
  • Good knowledge of reporting tools such as Cognos 8 and Business Objects 6.5.
  • Knowledge of SLAs, SOWs, and IT Service Management; experience working on tickets and issue resolution using HPSD, Jira, ManageNow, and ServiceNow.
  • Knowledge of Informatica Master Data Management (MDM).
  • Knowledge of MDM tables, delta detection, defining trust levels, validation rules, match and merge setup, queries and packages, Batch Viewer, and Batch Groups.
  • Knowledge of Security Access Manager, Hierarchy Manager, and Data Director.
  • Good communication skills; hardworking and result-oriented, both individually and in a team.
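As a rough illustration of the matching measures named above (Hamming distance, edit distance, bigram), here is a plain-Python sketch. Inside Informatica these are built-in options of the Match transformation, so the code below is explanatory only, not the tool's implementation:

```python
def hamming(a: str, b: str) -> int:
    """Count positions where the two strings differ."""
    # Pad the shorter string so trailing characters count as mismatches.
    width = max(len(a), len(b))
    a, b = a.ljust(width), b.ljust(width)
    return sum(x != y for x, y in zip(a, b))

def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via row-by-row dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        curr = [i]
        for j, y in enumerate(b, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (x != y)))  # substitution
        prev = curr
    return prev[-1]

def bigram_similarity(a: str, b: str) -> float:
    """Share of character bigrams the two strings have in common."""
    ga = {a[i:i + 2] for i in range(len(a) - 1)}
    gb = {b[i:i + 2] for i in range(len(b) - 1)}
    if not ga or not gb:
        return 0.0
    return 2 * len(ga & gb) / (len(ga) + len(gb))
```

A lower distance (or higher bigram similarity) between two candidate records means they are more likely duplicates; in practice a score threshold decides whether the pair goes to the merge/consolidation step.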

Expertise:

  • ETL Data Integration Tools: Informatica Power Center 9.x/8.x/7.x/6.x, Power Exchange, Informatica Developer, IDE, IDQ, Informatica Analyst tool
  • RDBMS: Oracle, SQL Server, MS Access, DB2
  • Data Modelling Tools: Erwin Data Modeller
  • OLAP Tools: Cognos 8 (Report Studio, Analysis Studio, Framework Manager), Business Objects 6.5 (Desktop, Web Intelligence, InfoView)
  • Scheduling Tools: Control-M, Tivoli Workload Scheduler (TWS)
  • IDE: Toad, SQL Developer, SQL Server Management Studio, Erwin 4.5/4.0, WinSCP, PuTTY
Professional Summary:

  • Working as an Assistant Consultant at TATA Consultancy Services from May 2010 to date.
  • Worked as a Programmer Analyst at Franklin Templeton Investments from August 2009 to April 2010.
  • Worked as a Developer at Mahindra Satyam (formerly Satyam Computer Services) from March 2006 to August 2009.

Projects:

Confidential

Sr. ETL Developer

Roles Responsibilities:

  • Participated in the high-level and detailed level design documentation of the ETL system and mapping of business rules.
  • Converted functional specification documents into technical design documents.
  • Worked with BAs in preparing functional specifications and participated in user meetings.
  • Analyzed the source data, coordinated with business and tech teams in developing the relational model, and participated in the design team and user requirement gathering.
  • Used most of the transformations, such as Source Qualifier, Expression, Aggregator, Connected/Unconnected Lookup, Filter, Router, Sequence Generator, Sorter, Joiner, and Update Strategy, to develop mappings involving complex business logic.
  • Created customized user-defined functions and reused them across the project.
  • Created mapping parameters and session parameters.
  • Worked with Stored Procedures for pre- and post-load session tasks.
  • Created session tasks, event wait, event raise, and command tasks in the Workflow Manager and deployed them across the DEV, QA, UAT, and PROD repositories.
  • Generated email notifications through scripts run in post-session tasks.
  • Used Informatica debugger to debug any issues faced in the mapping.
  • Used Workflow manager for Workflow, Session Management, and database connection management and scheduled jobs using Tivoli Work Scheduler.
  • Analyzed the bottle necks in the ETL process and tuned the mappings for better performance.
  • Involved in unit testing and system integration testing, and prepared Unit Test Plan (UTP) and System Test Plan (STP) documents.
  • Reviewed test plans, test cases and participated in user acceptance test.

Environment: Informatica Power Center 9.1, Power Exchange, Oracle 11g, Linux, TWS
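The mapping and session parameters mentioned above are typically supplied through a PowerCenter parameter file, keyed by folder, workflow, and session, with `$` entries for session-level connections and `$$` entries for mapping variables. A minimal sketch; the folder, workflow, session, and parameter names here are hypothetical:

```
[DWH_Folder.WF:wf_daily_load.ST:s_m_load_customer]
$DBConnection_Src=ORA_SRC_DEV
$DBConnection_Tgt=ORA_DWH_DEV
$$LoadDate=2010-01-31
$$SourceSystem=CRM
```

Keeping connections and dates in the parameter file lets the same workflow be promoted across DEV, QA, UAT, and PROD with only the file changing.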

Confidential

Team Lead Onsite/Offshore

Roles Responsibilities:

  • Worked cooperatively with the team members to identify and resolve various issues relating to Informatica and other database related issues.
  • Participated in the high-level and detailed level design documentation of the ETL system and mapping of business rules.
  • Interacted with business representatives to analyze and define business and functional specifications.
  • Involved in designing ER models (logical/physical) for the Oracle database to store data retrieved from other sources, including legacy systems.
  • Analyzed the source data, coordinated with Business and Tech teams in developing Relational Model and participated in the Design team and user requirement gathering.
  • Used IDQ transformations like Parser, Standardizer, Match and Consolidation transformations for cleansing of data and loaded into stage tables.
  • Worked on Power Center client tools like Source Analyzer, Warehouse Designer, Mapping Designer and Transformations Developer to develop ETL mappings to load data into target database.
  • Used most of the transformations, such as Source Qualifier, Expression, Aggregator, Connected/Unconnected Lookup, Filter, Router, Sequence Generator, Sorter, Joiner, and Update Strategy, to develop mappings involving complex business logic.
  • Created mapping parameters and session parameters.
  • Used the Workflow Manager for workflow and session management and database connection management, and Tivoli for scheduling jobs to be run in the batch process.
  • Worked on the Address Validator using Discrete and Hybrid techniques to calculate mailability scores and match codes for cleansing address attributes.
  • Used the Consolidation transformation to extract only survivor records.
  • Used techniques such as the Hamming Distance, Edit Distance, Bigram, and Reverse Hamming Distance algorithms to perform matching and identify duplicate records.
  • Cleansed and standardized data using IDQ transformations such as Standardizer, Parser, Match, Merge, and Consolidation.
  • Involved in unit testing and system integration testing, and prepared Unit Test Plan (UTP) and System Test Plan (STP) documents.
  • Conducted daily Scrum meetings to track status, resolved issues the team faced, and distributed tasks.
  • Did peer-to-peer code reviews and provided feedback.
  • Reviewed test plans and test cases and participated in user acceptance test.
  • Worked on Change Requests (CRs) while integration testing and UAT were in progress.

Environment: Informatica power center 8.6, Informatica IDQ, Oracle10g, TWS, Linux
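The "survivor records" extracted by the Consolidation transformation above come from a survivorship rule applied within each group of matched duplicates. A simplified Python sketch using one common strategy, "most complete record wins, most recently updated breaks ties"; the field names and rule are illustrative, not IDQ's actual implementation:

```python
# Simplified survivorship: within each duplicate group, keep the record
# with the most populated fields; 'updated' (a sortable date string)
# breaks ties. The real Consolidation transformation offers several
# configurable strategies; this hypothetical rule is just one of them.

def completeness(record: dict) -> int:
    """Number of non-empty attributes in the record."""
    return sum(1 for v in record.values() if v not in (None, ""))

def pick_survivors(records: list, group_key: str = "match_group") -> list:
    groups = {}
    for rec in records:
        groups.setdefault(rec[group_key], []).append(rec)
    # One survivor per duplicate group.
    return [max(group, key=lambda r: (completeness(r), r.get("updated", "")))
            for group in groups.values()]
```

Downstream, only the survivors are loaded into the target, so each real-world entity appears once.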

Confidential

Sr. ETL Developer

Roles Responsibilities:

  • Worked with Connected and Unconnected Stored Procedures for pre- and post-load sessions.
  • Worked with various Informatica transformations, such as Joiner, Expression, Lookup, Aggregator, Filter, Update Strategy, Stored Procedure, Router, and Normalizer.
  • Performed impact analysis, identifying gaps and code changes to meet new and changing business requirements.
  • Prepared technical specifications for the development of Informatica Extraction, Transformation, and Loading (ETL) mappings to load data into various tables in Data Marts, and defined the ETL standards.
  • Worked with BAs in preparing functional specifications and participated in user meetings.
  • Created session tasks, event wait, event raise, and command tasks, worklets, etc., in the Workflow Manager and deployed them across the DEV, QA, UAT, and PROD repositories.
  • Generated email notifications through scripts run in post-session tasks.
  • Worked with the testing team on test case and test plan development, fixed defects identified during the testing phase, and coordinated the overall testing process of the project.
  • Tracked defects in the Mercury Quality Center tool and updated status in a timely manner during the entire testing life cycle.
  • Created test plans and test strategies, set up the test environment, and coordinated with business analysts and business users to create test data for extensive testing.
  • Worked on the Address Validator using Discrete and Hybrid techniques to calculate mailability scores and match codes for cleansing address attributes.
  • Used the Consolidation transformation to extract only survivor records.
  • Used techniques such as the Hamming Distance, Edit Distance, Bigram, and Reverse Hamming Distance algorithms to perform matching and identify duplicate records.
  • Cleansed and standardized data using IDQ transformations such as Standardizer, Parser, Match, Merge, and Consolidation.

Environment: Informatica Power Center 8.1, Power Exchange, Informatica IDQ, Oracle 10g, Unix, Control-M
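The cleansing that precedes matching can be pictured as token-level standardization against a reference table. A toy Python sketch; the abbreviation table here is hypothetical, whereas the real Standardizer and Address Validator transformations draw on licensed postal reference data:

```python
import re

# Illustrative stand-in for address standardization: expand common
# street-type abbreviations before matching, so "St." and "STREET"
# compare equal. This tiny table is purely for illustration.
STREET_TYPES = {"ST": "STREET", "AVE": "AVENUE", "RD": "ROAD", "BLVD": "BOULEVARD"}

def standardize_address(line: str) -> str:
    """Uppercase, strip trailing periods, and expand known abbreviations."""
    tokens = re.split(r"\s+", line.strip().upper())
    tokens = [STREET_TYPES.get(t.rstrip("."), t.rstrip(".")) for t in tokens]
    return " ".join(tokens)
```

Standardizing both sides of a comparison this way is what makes the distance-based matching meaningful; without it, trivial formatting differences dominate the scores.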

Confidential

Programmer Analyst

Roles Responsibilities:

  • Analyzed load issues and implemented fixes to resolve them.
  • Analyzed ETL mappings and made changes to optimize loads and include required new logic.
  • Created technical specification documents.
  • Used most of the transformations, such as Source Qualifier, Expression, Aggregator, Connected/Unconnected Lookup, Filter, Router, Sequence Generator, Sorter, Joiner, and Update Strategy, to develop mappings involving complex business logic.
  • Used the Workflow Manager for workflow and session management and database connection management, and scheduled batch jobs using Control-M.
  • Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for performance in pre- and post-session management.
  • Attended technical review meetings.
  • Involved in unit testing and system testing, and prepared Unit Test Plan (UTP) and System Test Plan (STP) documents.

Environment: Informatica 8.1, UNIX, Oracle 10g, Control-M
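The pre- and post-session index management mentioned above follows a standard bulk-load pattern: drop the target's indexes before the load and rebuild them afterwards, since maintaining indexes row by row is far slower than one rebuild. A small Python sketch that generates the bracketing DDL; the table and index names are hypothetical:

```python
# Generate the DDL that brackets a bulk load: drop indexes in the
# pre-session step, recreate them in the post-session step. In the
# projects above this lived in shell scripts / PL/SQL procedures;
# this generator is just an illustration of the pattern.

def index_ddl(table: str, indexes: dict):
    """Return (pre_load, post_load) DDL lists for the given {name: columns} map."""
    pre = [f"DROP INDEX {name}" for name in indexes]
    post = [f"CREATE INDEX {name} ON {table} ({', '.join(cols)})"
            for name, cols in indexes.items()]
    return pre, post
```

The session then runs the `pre` statements as a pre-session task, performs the load, and runs the `post` statements as a post-session task.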

Confidential

Developer

Roles Responsibilities:

  • Worked on Power Center client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.
  • Used most of the transformations, such as Source Qualifier, Expression, Aggregator, Connected/Unconnected Lookup, Filter, Router, Sequence Generator, Sorter, Joiner, and Update Strategy, to develop mappings involving complex business logic.
  • Used the Workflow Manager for workflow and session management, database connection management, and scheduling of jobs to be run in the batch process.
  • Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for performance in pre- and post-session management.
  • Used shortcuts to reuse objects without creating multiple objects in the repository and to inherit changes made to the source automatically.
  • Created mapping parameters and session parameters.
  • Well versed with components like Framework Manager, Report Studio, Analysis studio, Query Studio, Cognos Connection.
  • Involved in creating list reports with multiple queries, using techniques such as drill-through.
  • Created cross-tab reports and chart reports.
  • Generated reports using Cognos Report Studio.
  • Created reports using value prompts and select & search prompts to filter data in Cognos Report Studio.
  • Created list reports, applying cascading prompts to restrict data fetch.
  • Developed various reports with conditional formatting to apply styles.
  • Created reports with various types of filters, such as detail filters, summary filters, and group filters.
  • Designed reports with master-detail relationships.
  • Designed and made changes to the Cognos package.
  • Involved in developing complex reports, such as master-detail reports and hyperlinked reports.
  • Involved in unit testing and system integration testing, and prepared Unit Test Plan (UTP) and System Test Plan (STP) documents.
  • Designed technical specification documents for all developed mappings.
  • Worked on Change Requests (CRs) while integration testing and UAT were in progress.

Environment: Informatica 7.1, UNIX and Oracle 9i, Cognos 8.
