Data Integration ETL Lead Developer Resume Profile
SUMMARY:
- Ten plus years of Data Warehousing experience using Informatica Power Center 8.x/7.x/6.2/6.1/5.1.2, Power Mart 5.0, Repository Admin Console, Repository Manager, Designer, Workflow Manager, Workflow Monitor, ETL, data marts, OLAP and OLTP, including two plus years of Power Exchange 5.2.2 experience.
- Excellent team player; self-motivated with solid communication skills.
- Experience in building conceptual, logical and physical data models using ERWIN.
- Extensive experience in writing SQL and PL/SQL programs (stored procedures, triggers, functions) for back-end development.
- Excellent inter-personal and communication skills and ability to work individually as well as in a team.
- Highly motivated self-starter with ability to handle multiple projects.
TECHNICAL SKILLS:
PROFESSIONAL EXPERIENCE:
Confidential
Informatica Lead
Credit Suisse Group is a leading global financial services company offering clients financial advice across Investment Banking, Private Banking, Asset Management and Shared Services.
Confidential
This project addresses liquidity risk reporting capabilities, enabling the Credit Suisse Treasury department to monitor and regulate global collateralized and uncollateralized securities.
Responsibilities
- Involved in requirements gathering, design and development of the treasury risk reporting data mart. Technical lead for onsite and offshore teams in India and Poland.
- Extensively used Informatica PowerCenter to design and develop mappings with embedded business logic.
- Developed stored procedures to purge tables depending on the object maintenance criteria.
- Purged tables and their partitions by current business day.
- Created OBIEE Dashboard reports as per business requirements.
- Identified load issues, resolved them at the root cause, and directed/guided the offshore development team through the required steps.
- Worked extensively with Unix Scripts. Developed file watcher and batch control Unix scripts.
- Performed Unix file system maintenance, Oracle performance tuning and tablespace maintenance, and hardware/software server upgrades including procurement.
- Implemented controls for data load jobs to keep track of load duration and success / failure of data jobs.
- Used PADA for Informatica deployments.
- Used Control M to schedule data load jobs.
- Primary contact for production support and issues.
- Ran the production batch load process for every region on a day-to-day basis.
- Implemented the functionality to rerun a workflow from the point of failure to ease production support and maintenance.
- Responsible for upgrading transmission protocols from FTP to FTPS.
- Responsible for the Power Center 8.6.1 to Power Center 9.5.1 upgrade.
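The file watcher scripts mentioned above can be sketched as a small Unix shell routine; the directory, file pattern, timeout and workflow hand-off here are illustrative assumptions, not the actual production values:

```shell
#!/bin/sh
# Minimal sketch of a file watcher: poll a directory until a file matching
# a pattern appears, then hand off to the batch load. All names, paths and
# timeouts here are illustrative, not the real production values.
wait_for_file() {
    watch_dir=$1
    pattern=$2
    max_wait=${3:-3600}   # seconds before giving up
    interval=${4:-30}     # seconds between polls
    elapsed=0
    while [ "$elapsed" -le "$max_wait" ]; do
        for f in "$watch_dir"/$pattern; do
            # an unmatched glob expands to itself, so confirm a real file
            if [ -f "$f" ]; then
                echo "$f"
                return 0
            fi
        done
        sleep "$interval"
        elapsed=$((elapsed + interval))
    done
    return 1   # timed out
}

# Demo: drop a file into a temp directory and watch for it.
dir=$(mktemp -d)
touch "$dir/trades_20240101.dat"
found=$(wait_for_file "$dir" 'trades_*.dat' 5 1)
echo "found: $found"
rm -rf "$dir"
```

In production the returned path would feed the batch control step, for example a pmcmd call that starts the load workflow.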
Environment: Informatica Power Center 8.6.1/9.5.1, Oracle 11g, Unix, OBIEE 11g, SharePoint and PADA.
Confidential
Data Integration ETL Lead Developer
The Federal Reserve System, the central bank of the United States, conducts the nation's monetary policy, supervises and regulates banks, and provides a variety of financial services to the U.S. Government and to the nation's financial firms. The major components of the Federal Reserve System are the Board of Governors in Washington D.C. and the 12 Federal Reserve Banks spread across the country.
Security Master Consolidation FRN Implementation
There are two phases to the project: consolidating Security Master data from the OPICS live trading system and the external vendor Broadridge into the IDS (Integrated Data Store), and accepting new FRN (Floating Rate Note) securities from Treasury. Data is staged and validated in landing and staging environments before it is finally stored in IDS.
OPICS Interface
An interface that facilitates data transfer from IDS to the OPICS system to store new/updated Treasury FRN index rates. The system notifies the business upon data exceptions and rate fluctuations.
Responsibilities
- Involved in the requirement gathering, analysis, design, development and developer unit testing.
- Created queries to analyze performance while extracting data from the Broadridge external treasury database system.
- Adopted Agile methodologies for development and process improvement.
- Extensively used Power Center to design multiple mappings with embedded business logic.
- Extensively used reusable objects such as user-defined functions, expressions, mapplets, worklets, parameters and variables across data marts.
- Used MD5 hashing to detect updates and inserts on tables.
- Developed dynamic / runtime queries and expressions using parameters.
- Worked extensively with mapping parameters and variables.
- Implemented Audit Balance Control (ABC) to keep track of all files loaded and the duration of each load.
- Developed stored procedures to perform business specific calculations.
- Maintained a batch table that records incremental extraction dates/times for each table for use in the next run.
- Developed complex view for downstream applications with embedded business logic.
- Used Oracle Analytical functions for business calculations.
- Performed Code Review / Functional reviews for the code developed by the team members.
- Implemented the functionality to rerun a workflow from the point of failure to ease production support process.
- Implemented notifications to the business upon exceptions and rate fluctuations.
- Involved in creating Data dictionary and Data flow diagrams.
- Created technical design document from business requirement document.
- Created testing and deployment documents saved in share point.
- Supported QA team for Integration testing.
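MD5-based change detection of the kind listed above can be sketched in shell; the "key|column|column" sample rows below are invented for illustration. The idea is to hash the non-key columns of each row and compare against the previous run's hashes: a changed hash for an existing key marks an update, a new key marks an insert.

```shell
#!/bin/sh
# Sketch of MD5 change detection (sample data is invented): hash the
# non-key columns of each "key|col|col" row, then compare this run's
# hashes with the previous run's to classify updates vs. inserts.
hash_rows() {
    while IFS='|' read -r key rest; do
        h=$(printf '%s' "$rest" | md5sum | cut -d' ' -f1)
        echo "$key $h"
    done
}

prev=$(mktemp)
curr=$(mktemp)

# Previous snapshot vs. today's extract (SEC001 changed, SEC003 is new).
printf '%s\n' 'SEC001|4.25|AAA' 'SEC002|3.10|BB' | hash_rows | sort > "$prev"
printf '%s\n' 'SEC001|4.30|AAA' 'SEC002|3.10|BB' 'SEC003|2.00|A' \
    | hash_rows | sort > "$curr"

# Keys present in both runs but with different hashes are updates.
updates=$(join "$prev" "$curr" | awk '$2 != $3 {print $1}')
# Keys present only in the current run are inserts.
inserts=$(join -v2 "$prev" "$curr" | cut -d' ' -f1)

echo "updates: $updates"
echo "inserts: $inserts"
rm -f "$prev" "$curr"
```

Inside PowerCenter the same comparison is typically done with an MD5() expression against a lookup on the target; the shell version just makes the mechanics visible.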
Environment: Informatica Power Center 9.1, Oracle 11g, SQL Developer, Unix, SharePoint and QC.
Confidential
Sr. ETL Lead Data Warehouse Developer / Informatica Administrator
The School Construction Authority (SCA) was established by the New York State Legislature in December 1988 to build new public schools and manage the design, construction and renovation of capital projects in New York City.
Project Tracking System PTS
The PTS system tracks and maintains the day-to-day transactions of a project based on its activities at the LLW, Bundle and Package levels. The earlier version of PTS ran on Sybase and is being moved to Oracle; Informatica is used for the migration, cleansing and transformation of data from Sybase to Oracle.
SCA Data Warehouse
- For the first time, SCA is integrating all systems residing on different platforms into one platform to avoid data redundancy and to create ad-hoc and business user reports according to business user requirements. Information regarding LLW (Low Level Workitem), Bundle and Package is brought into the ODS (Operational Data Store) as part of Phase I. The lowest level of grain is the LLW.
- Data from different sources, including CIP (Capital Improvement Project), CMS (Contractor Management System), Project and the PPM (Primavera Project Management) scheduler, is scrubbed, cleansed and transformed before being stored in the ODS environment.
Labor Compliance Management System - LCMS
- The Labor Law Compliance Management System (LCMS) is a web-based application used to electronically collect, track, store and analyze certified payrolls for all contractors working on SCA projects. Employers are required by law to pay their employees prevailing wages, and the LCMS enables the Labor Law Compliance Unit to easily track non-compliance.
- The scope of this project is to provide information about the contractor, subcontractors, company details, the school location where the work is being undertaken, project manager details, chief project officer information, the senior project officer responsible for the project, details of trades undertaken by vendors, and new employment of contractors.
MIS
The MIS system captures information related to schools from DSF (District School Facility). DSF updates school address information and captures any changes to school addresses; SCA maintains the school address information as updated by DSF.
Confidential
Responsibilities
- Prepared a list of technical tasks and effort estimation for Project Manager.
- Created technical document from BRD.
- Involved in creating Data dictionary, Data flow diagrams, conceptual/logical/physical data models using Erwin.
- Assigned tasks to subordinates.
- Performed Performance Tuning on the SQL Queries, Caching, Mappings and Sessions with techniques like reusable objects, mapplets, reusable Caches etc.
- Designed and developed SCD Type I and Type II mappings to implement full/incremental load strategies.
- Developed packages and stored procedures to refresh, clean up, truncate and purge tables using the Stored Procedure transformation.
- Resolved defects when a HEAT ticket was assigned.
- Used PVCS and TFS for version control.
- Cleansed and transformed data according to the business requirements before loading into target Oracle database.
- Responsible for source files processing, extracting, transforming, loading, bad data processing, monitoring, partitioning and purging to maintain the history in ODS system and email notification about the jobs and source files status using Informatica PowerCenter.
- Created database objects such as tables, synonyms, B-tree and bitmap indexes, and partitions.
- Used different transformation like Lookup, Update Strategy, Joiner, Union, Filter, Router, Aggregator, Sorter, Sequence Generator, XML Generator.
- Created complex SQL Overrides to read data from different tables.
- Used DBLinks to access to different source systems.
- Modeled the star schema in the OBIEE business layer. Created pre-defined reports and dashboards for users. Implemented the data warehouse to work with OBIEE.
- Developed complex joins in OBIEE. Created dimensional, hierarchical tables in OBIEE.
- Integrated data from different systems, such as CIP (Capital Improvement Project) residing on Oracle, CMS (Contractor Management System) residing on Sybase and EXP (Expedition) residing on Oracle, into the target Oracle database.
- Used transformations such as XML Source Qualifier, XML Generator, XML Target, Web Service Consumer (with cookie port), Web Service Consumer Source Qualifier, Union, Filter, Router and Update Strategy.
- Consumed a web service to feed data in XML format into a vendor system.
- Developed batch file to convert pipe delimited flat file to an Excel sheet.
- Developed a Stored Procedure transformation to compare existing data with new data and update records.
- Used an XML Source Qualifier transformation imported from an .XML file, an Expression transformation to transform data, and fixed-width flat files.
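The pipe-delimited-to-Excel batch file described above can be sketched in shell by rewriting the extract as CSV, which Excel opens directly. The columns below are invented sample data; a real job would also need CSV quoting for fields containing commas or quotes.

```shell
#!/bin/sh
# Sketch: convert a pipe-delimited extract to CSV for Excel. Sample
# columns are invented; real fields containing commas or quotes would
# need proper CSV quoting (e.g. via awk or a CSV-aware tool).
in=$(mktemp)
out=$(mktemp)

printf '%s\n' \
    'LLW_ID|BUNDLE|PACKAGE' \
    '1001|B-17|PKG-3' \
    '1002|B-18|PKG-4' > "$in"

tr '|' ',' < "$in" > "$out"
cat "$out"
```

Saving the output with a .csv extension is enough for Excel to open it with columns split correctly.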
Environment: Informatica Power Center 8.6, XML, fixed-width flat files, Oracle 10g, Windows Server, Sybase 12, MS SQL Server 2000.
Confidential
ETL Data Conversion Specialist
Conversion of Finance modules to SAP IS-Media.
Transformed delimited data files are loaded into SAP IS-Media via the Legacy System Migration Workbench (LSMW) and custom development.
Responsibilities
- Modules involved: Material Management, Sales and Distribution, and Finance.
- Liaised with the SAP functional teams and legacy data owners to ensure data integrity, relationships, completeness and cleanliness of the data.
- Loaded delimited data files into SAP IS-Media using a combination of the Legacy System Migration Workbench (LSMW) and custom development.
- Worked extensively with Sales and Distribution Data of Advertising and Circulation.
- Worked extensively with Material Management Data.
- Worked extensively with financial data (Accounts Receivable and Credits).
- Formulated ETL data loads for Advertising and Circulation conversions, including master data, transactional data, opening balances and historical data.
- Analyzed the functional specifications provided by the data architect and created technical specification documents for all the mappings.
- Analyzed source system, source to target data and transformation analysis.
- Designed Data Mart defining Entities, Attributes and relationships between them.
- Created automated scripts, defined storage parameters for the objects in the database.
- Used Group1 software to cleanse US and Canadian postal addresses into USPS-compatible addresses.
- Designed and developed Informatica Mappings to load data from Source systems to Operational Data Source and then to Data Mart to Scrub, Cleanse and transform the data before it is loaded to SAP IS-Media system.
- Created Mapplets for re-usability across data marts in different Mappings.
- Created complex mappings using Lookup, Unconnected Lookup, Sorter, Aggregator, Union, Rank, Normalizer, Update Strategy and Router transformations to populate target tables efficiently.
- Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache.
- Created and maintained several custom ad-hoc reports for the client by querying the database.
- Populated error tables as part of the ETL process to capture the records that failed the migration.
- Created detailed test cases and objectives for the mappings; supported User Acceptance Testing.
- Used Unix scripts to schedule workflows by calling pmcmd commands.
- Wrote Unix scripts to dynamically create mapping parameter files, check whether a source file exists before starting a workflow, and clear session log files older than 30 days.
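The parameter-file and log-cleanup scripts above can be sketched as follows; the folder name, workflow name, parameter names and file names are placeholders, not the actual project values:

```shell
#!/bin/sh
# Sketch of the parameter-file and cleanup scripts: build an Informatica
# mapping parameter file at runtime, then clear old session logs. The
# folder, workflow and file names below are placeholders.
PARAM_FILE=$(mktemp)
LOG_DIR=$(mktemp -d)
RUN_DATE=$(date +%Y-%m-%d)

# Generate the parameter file for today's run.
cat > "$PARAM_FILE" <<EOF
[FOLDER_NAME.WF:wf_daily_load]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_FILE=sales_$RUN_DATE.dat
EOF
cat "$PARAM_FILE"

# Clear session logs older than 30 days (the fresh temp dir has none).
find "$LOG_DIR" -name '*.log' -type f -mtime +30 -exec rm -f {} \;
rm -rf "$LOG_DIR"
```

Generating the file just before the pmcmd call keeps run dates and source file names in one place instead of hard-coding them in the session.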
Environment: Informatica Power Center 8, Oracle 11i, PL/SQL, DB2, Rapid SQL, SQL Server, COBOL, Mainframe, flat files, Power Exchange 5.2.2, Group1, Windows NT, TOAD and Unix.
Confidential
ETL Developer
This warehouse is to generate data to analyze the sales of various products. The product data is categorized depending on the product group and product family. This warehouse is also used to analyze the usage of products at different time of the year. The data is stored in a legacy system on Oracle 9i database.
Responsibilities:
- Involved in designing the Data mart and entire data warehousing life cycle designing.
- Interacted with Business users and Business analyst to gather requirements and business logic.
- Used Informatica Power Center for extraction, transformation and loading (ETL) of data in the data warehouse.
- Worked on Informatica Designer, Repository Manager, Workflow Manager, Workflow Monitor and Repository Server Administration Console.
- Developed and documented data Mappings/Transformations and Informatica sessions.
- Responsible for definition, development and testing of processes/programs necessary to extract data from client's operational databases, transform, cleanse data and load it into data marts.
- Good knowledge of attributes, derivations, dimensions and measures in Informatica.
- Used the Informatica Designer, Source Analyzer, Warehouse Designer and Mapping Designer.
- Used update strategy to effectively migrate data from source to target.
- Populated error tables as part of the ETL process for the records that failed the migration.
- Developed stored procedures using PL/SQL to implement the business logic.
Environment: Informatica Power Center 7.1.2, Business Objects 6.5, Oracle 9i, PL/SQL, Windows NT, ERWIN, TOAD and SQL navigator.
Confidential
ETL Developer
Responsibilities
- Responsible for definition, development and testing of the processes/programs necessary to extract data from the client's operational database, transform and cleanse the data, and load it into the data warehouse using Informatica Power Center.
- Created mappings in Power Center Designer using Aggregate, Expression, Filter, Sequence Generator, Update Strategy, Rank, Joiner and Stored procedure transformations.
- Developed PL/SQL stored procedures and functions to handle data outside of Informatica ETL process.
- Responsible for tuning ETL procedures and optimizing star schemas, load performance and query performance.
Environment: ETL, Informatica PowerCenter 5.1, Oracle 7.x, Erwin, Business Objects and Windows NT.
Confidential
Oracle Developer
Responsibilities:
- Worked extensively with Oracle stored procedures, triggers, views and DDL commands.
- Created PL/SQL packages containing logically related functions and procedures.
- Responsible for requirement gathering from end users.
Environment: Oracle 7.X, SQL, PL/SQL, Windows NT.