Senior Informatica Developer Resume Profile

Atlanta, GA

Summary:

  • 11+ years of IT experience in the analysis, design, development, testing, and implementation of business application systems for the healthcare, financial, telecom, and timeshare sectors.
  • Strong experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using data warehouse/data mart design, ETL, OLAP, BI, and client/server applications.
  • Strong experience with the Ralph Kimball and Bill Inmon data modeling methodologies.
  • Strong experience working with the ETL tools Informatica and SSIS.
  • Strong data warehousing ETL experience using Informatica PowerCenter 9.5/9.1/8.6.1/8.5/8.1/7.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
  • Experience working with the data quality tools Informatica IDQ 9.1 and Informatica MDM 9.1.
  • Expertise in data warehouse/data mart, ODS, OLTP, and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
  • Strong experience in dimensional modeling using star and snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER/Studio.
  • Expertise in working with various sources such as Oracle 11g/10g/9i/8i, SQL Server 2008/2005, DB2 8.0/7.0, UDB, Netezza, Teradata, flat files, XML, COBOL, and mainframe.
  • Extensive experience in developing stored procedures, functions, views, triggers, and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
  • Utilized AUTOTRACE and EXPLAIN PLAN for monitoring SQL query performance.
  • Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning of mappings and sessions.
  • Created pre-session, post-session, pre-SQL, and post-SQL commands in Informatica.
  • Worked with Informatica transformations such as Aggregator, Lookup, Joiner, Filter, Router, Update Strategy, Transaction Control, Union, Normalizer, and SQL in ETL development.
  • Worked with Event Wait and Event Raise tasks, mapplets, and reusable transformations.
  • Worked extensively on NZLOAD and NZSQL scripts to read and write data with the Netezza database.
  • Worked with parameter files for ease of managing connections across Dev/QA/Prod environments.
  • Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
  • Extensive experience in writing UNIX shell scripts and automating ETL processes with UNIX shell scripting.
  • Proficient in integrating various data sources, including Oracle 11g/10g/9i, MS SQL Server, DB2, Teradata, VSAM files, and flat files, into the staging area, ODS, data warehouse, and data marts.
  • Experience in using automation and scheduling tools such as Autosys, Tidal, Control-M, and Tivoli Maestro scripts.
  • Experience working with Informatica PowerExchange.
  • Worked extensively with slowly changing dimensions (SCD Type 1 and Type 2).
  • Excellent interpersonal and communication skills; experienced in working with senior-level managers, business users, and developers across multiple disciplines.
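
The parameter-file technique listed above keeps mappings portable across Dev/QA/Prod by externalizing connections and paths. A minimal sketch of a PowerCenter parameter file follows; the folder, workflow, session, and connection names are hypothetical:

```
[Global]
$$LOAD_DT=SYSDATE

[EDW_Folder.WF:wf_load_customers.ST:s_m_load_customers]
$DBConnection_Source=ORA_SRC_DEV
$DBConnection_Target=ORA_TGT_DEV
$$SourceFileDir=/data/dev/inbound
```

Promoting to QA or Prod then means swapping in the environment-specific parameter file at the workflow or session level, with no change to the mapping itself.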

Technical Skills:

Data Warehousing: Informatica PowerCenter 9.5/9.1.0/8.6.1/7.1/6.2 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica PowerConnect, DTS, SQL Server Integration Services (SSIS), Web Services, Informatica MDM, Informatica IDQ

Data Modeling: ERwin, Toad

Databases: Oracle 11g/10g/9i/8i/7.x, SQL Server 2005/2008, MS Access, Excel, Salesforce.com

Business Intelligence: Hyperion, OBIEE, Cognos (Impromptu, Transformer, PowerPlay Reports, Scheduler, IWR, PPWR, Upfront, Access Manager), Business Objects XI/6.5 (Supervisor, Designer, Reporter), SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), Crystal Reports 9/10/11

Languages: SQL, T-SQL, ANSI SQL, PL/SQL, UNIX shell scripting, Visual Basic, SQL*Plus 3.3/8.0

Tools: Toad, SQL*Loader, Crystal Reports 9.0/8.x/7/6/5/4

Operating Systems: Windows Server 2008/2003, Windows XP/NT, UNIX, HP-UX, AIX 4.2/4.3, Sun Solaris 8

Analytical Tools: SQL Server Analysis Services (SSAS), Performance Point Server 2007, ProClarity Analytics Server

SQL Server Tools: Query Analyzer, SQL Server Profiler, SQL Server Mail Service, Enterprise Manager, SQL Server Agent, DTS, BCP, Microsoft Visual Studio

Versioning: Team Foundation Server (TFS), Visual SourceSafe (VSS)

Professional Experience

Confidential

Role: Lead/Senior Informatica Developer

  • Cox Communications Inc., the third-largest cable provider in the nation, is noted for its high-capacity, reliable broadband delivery network as well as its ability to provide superior customer service.
  • The enterprise data warehouse is built to understand and analyze customer satisfaction, daily work orders processed, and customers across different subject areas (Business/Residential/Medicaid). Informatica is the ETL tool used to pull data from various sources such as ICOMS, UNICA, and MEDALLIA. OBIEE reports with dashboards for the different subject areas are used for business analysis.

Responsibilities:

  • Worked as lead for the projects, involved in all phases of the SDLC.
  • Worked with the data architect in designing the data mart, defining, designing, and building facts and dimensions using a star schema model.
  • Developed Informatica workflows/worklets/sessions associated with mappings across various sources such as XML, COBOL, flat files, Web Services, and Salesforce.
  • Worked with SCD Type 1, Type 2, and Type 3 to maintain history in dimension tables.
  • Worked with cleanse, parse, standardization, validation, and scorecard transformations.
  • Worked with the Source Qualifier, Update Strategy, XML, SQL, Web Services, and Java transformations, and with connected and unconnected Lookups.
  • Worked with pushdown optimization to improve performance.
  • Worked on UNIX shell scripting for file delivery to a third-party vendor through SFTP, including the encryption and decryption process.
  • Worked with different scheduling tools: Tidal, Tivoli, Control-M, and Autosys.
  • Documented the ETL process for each project and provided knowledge transfer (KT) to the offshore team.
  • Worked closely with DBAs, application, database, and ETL developers, and change control management for migrating developed mappings across Dev, QA, and Prod environments.
  • Provided production support for the Informatica process; troubleshot and debugged errors.
  • Worked with the Informatica IDQ Data Analyst and Developer tools, applying various data profiling techniques to cleanse and match/remove duplicate data.
  • Worked extensively with Netezza scripts to load data from flat files into the Netezza database, using NZSQL scripts and NZLOAD commands.
  • Used FIFOs (named pipes) to stream flat files from the source for faster loads into the Netezza database.
  • Worked with Informatica PowerExchange and Informatica Cloud to integrate Salesforce and load data from Salesforce to an Oracle database.
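
The named-pipe (FIFO) load pattern described above lets nzload start consuming rows while the extract is still writing. A minimal sketch follows; the host, database, table, and file names are placeholders, and the nzload invocation is only echoed rather than executed, since it needs a live Netezza system:

```shell
#!/bin/sh
# Sketch: stream a flat file into Netezza through a named pipe so the
# loader and the extract run concurrently instead of serially.
PIPE=/tmp/cust_load.pipe
SRC=/data/inbound/customers.dat              # hypothetical extract file
mkfifo -m 600 "$PIPE" 2>/dev/null || true

# Build the nzload command; shown in dry-run form here.
NZLOAD_CMD="nzload -host nzhost -db EDW -t CUSTOMER_STG -df $PIPE -delim '|' -lf cust.log -bf cust.bad"
echo "$NZLOAD_CMD" > /tmp/nzload_cmd.txt
cat /tmp/nzload_cmd.txt

# In a real run: start $NZLOAD_CMD in the background, then
# cat "$SRC" > "$PIPE" and wait for both to finish.
rm -f "$PIPE"
```

The same pattern works for any loader that reads a data file sequentially, since the pipe looks like an ordinary file to nzload.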

Environment: Informatica PowerCenter 9.5/9.1.0, flat files, mainframe files, T-SQL, Oracle 11g, Quest Toad 9.1, UNIX shell scripting, Windows 2000/2003, SQL Server 2005/2008, Teradata, Netezza, Aginity Workbench.

Confidential

Role: Lead/Senior Informatica Developer

The main objective of the project is to integrate AT&T Wireless customers into the Cingular systems by developing and maintaining a Customer Information Data Mart. It included extracting data from different source systems such as relational databases, flat files, Excel files, and application databases like Access. AT&T Wireless information such as orders, billing, Remedy, and communications data is integrated with Cingular Wireless information and stored in the data mart. Also worked on projects for Enabler, where data is loaded in phases from source to stage and then from stage to the EDW, which resides in Teradata.

Responsibilities:

  • Led design, development, and implementation of ETL projects end to end.
  • Responsible for ETL technical design discussions; prepared the ETL high-level technical design document.
  • Involved in the analysis of source-to-target mappings provided by data analysts; prepared functional and technical design documents.
  • Involved in designing a star schema based data model with dimensions and facts.
  • Interacted with the onsite and offshore teams to assign development tasks and held weekly status calls with the offshore team.
  • Extracted data from flat files, GoldenGate, Oracle, and SQL Server using Informatica ETL mappings and loaded it into the data mart.
  • Used the Informatica IDQ Data Analyst and Developer tools for data profiling, matching/removing duplicate data, fixing bad data, and fixing NULL values.
  • Created quality rules and development and implementation patterns with cleanse, parse, standardization, validation, and scorecard transformations.
  • Created complex Informatica mappings using Unconnected Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, Lookup, and Router transformations to extract, transform, and load the data mart area.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data from various source systems.
  • Developed BTEQ scripts to import and export data.
  • Developed CTL scripts to load data to and from the Teradata database.
  • Worked extensively on shell scripting for file management.
  • Created reusable transformations/mapplets and used them across various mappings.
  • Wrote complex PL/SQL scripts, functions, procedures, and packages.
  • Developed Informatica workflows/worklets/sessions associated with the mappings using Workflow Manager.
  • Worked extensively with Netezza scripts to load data from flat files into the Netezza database, using NZSQL scripts and NZLOAD commands.
  • Created Tivoli Maestro jobs to schedule Informatica workflows.
  • Performance-tuned Informatica code using standard Informatica tuning steps.
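
A BTEQ import of the kind described above is typically driven by a small script using the `.IMPORT`/`USING` pattern. The sketch below generates such a script to a file rather than submitting it, since running it requires a live Teradata system; the logon string, table, and column names are placeholders:

```shell
#!/bin/sh
# Sketch: generate a BTEQ script that imports a pipe-delimited flat
# file into a Teradata staging table via .IMPORT VARTEXT.
cat > /tmp/load_stage.bteq <<'EOF'
.LOGON tdprod/etl_user,password;
.IMPORT VARTEXT '|' FILE=/data/inbound/orders.dat;
.QUIET ON
.REPEAT *
USING (order_id VARCHAR(18), order_dt VARCHAR(10), amount VARCHAR(12))
INSERT INTO STG.ORDERS_STG (order_id, order_dt, amount)
VALUES (:order_id, :order_dt, :amount);
.QUIT;
EOF
# In a real run: bteq < /tmp/load_stage.bteq > load_stage.log 2>&1
cat /tmp/load_stage.bteq
```

With VARTEXT imports every field arrives as VARCHAR, so any casting to dates or numbers happens in the INSERT or downstream; for high-volume loads the FastLoad/MultiLoad/TPump utilities listed above replace this row-at-a-time pattern.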

Environment: Informatica PowerCenter 9.5/9.1.0/8.6.1, flat files, mainframe files, Oracle 11g, Quest Toad 9.1, UNIX shell scripting, Windows 2000/2003, SQL Server 2005/2008, T-SQL, Teradata, Netezza, Aginity Workbench.

Confidential

Role: Lead/Senior Informatica Developer

Confidential has built a solid reputation as a preferred custom-solutions and consulting partner for clients in healthcare, entertainment, finance, insurance, state and local government, retail, gaming, and manufacturing.

Domain: Healthcare (Dialysis Treatment)

Responsibilities:

  • Created Informatica mappings using transformations such as XML, Source Qualifier, Expression, Lookup, Stored Procedure, Aggregator, Update Strategy, Joiner, Normalizer, Union, Filter, and Router in Informatica Designer.
  • Worked with the FastLoad, MultiLoad, and TPump utilities to load data into Teradata.
  • Worked extensively with the Teradata database using BTEQ scripts.
  • Involved in all phases of the SDLC: design, code, test, and deploy ETL components of the data warehouse and integrated data mart.
  • Used NZSQL scripts and NZLOAD commands to load data; worked extensively with Netezza scripts to load data from flat files into the Netezza database.
  • Created Web Services mappings for consumer and provider; used the Web Services Consumer transformation and XML Parser to parse incoming data.
  • Created and edited custom objects and custom fields in Salesforce and checked field-level security.
  • Worked on SFDC session log error files to investigate errors and debug issues.
  • Worked on Informatica Cloud to create source/target SFDC connections and to monitor and synchronize data in SFDC.
  • Worked with Informatica PowerExchange as well as Informatica Cloud to load data into Salesforce.com.
  • Involved in analyzing, defining, and documenting data requirements by interacting with the client and the Salesforce team about the Salesforce objects.
  • Worked with cleanse, parse, standardization, validation, and scorecard transformations.
  • Worked with the Informatica IDQ Data Analyst and Developer tools, applying various data profiling techniques to cleanse and match/remove duplicate data.
  • Worked as Informatica lead for ETL projects to design and develop Informatica mappings.
  • Created pre-session, post-session, pre-SQL, and post-SQL commands in Informatica.
  • Used UNIX scripts for file management as well as in the FTP process.
  • Worked closely with DBAs, application, database, and ETL developers, and change control management for migrating developed mappings to Prod.
  • Provided production support for the Informatica process; troubleshot and debugged errors.

Environment: Informatica PowerCenter 9.5/9.1.0/8.6.1, Informatica Data Quality 9.1.0/9.5.1, flat files, mainframe files, Oracle 11g, Netezza, Quest Toad 9.1, UNIX shell scripting, Windows 2000/2003, SQL Server 2005/2008, Salesforce.com, Web Services.

Confidential

Role: Lead/Senior Informatica Developer

Client Description:

Confidential Vacation Ownership develops, markets, and sells vacation ownership interests and provides consumer financing to owners through its three primary consumer brands: Wyndham Vacation Resorts, WorldMark by Wyndham, and Wyndham Vacation Resorts Asia Pacific.

Responsibilities:

  • Worked with Informatica Cloud to create source/target connections and to monitor and synchronize data in SFDC.
  • Worked with Informatica Cloud to create source and target objects and developed source-to-target mappings.
  • Involved in analyzing, defining, and documenting data requirements by interacting with the client and the Salesforce team about the Salesforce objects.
  • Involved in extracting, transforming, and loading data (Accounts, Contracts, Reservations, and Owner Interactions tables) from various source systems to Salesforce.com, as well as a reverse data feed from Salesforce for CRM telesales.
  • Interacted with and assigned development work to offshore developers, guiding them in implementing logic and troubleshooting the issues they encountered.
  • Responsible for project estimates, design documents, and resource utilization and allocation.
  • Worked with the offshore/onsite team, led the project, and assigned tasks appropriately to team members.
  • Created Web Services mappings for consumer and provider; used the Web Services Consumer transformation and XML Parser to parse incoming data.
  • Used the Informatica IDQ Data Analyst and Developer tools for data profiling, matching/removing duplicate data, fixing bad data, and fixing NULL values.
  • Created Informatica mappings using transformations such as XML, Source Qualifier, Expression, Lookup, Stored Procedure, Aggregator, Update Strategy, Joiner, Normalizer, Union, Filter, and Router in Informatica Designer.
  • Worked with the data loading utilities BTEQ, TPump, FastLoad, and MultiLoad to load data into the Teradata database.
  • Involved in all phases of the SDLC: design, code, test, and deploy ETL components of the data warehouse and integrated data mart.
  • Created quality rules and development and implementation patterns with cleanse, parse, standardization, validation, and scorecard transformations.
  • Worked on shell scripting for file management and in the FTP process.
  • Worked closely with DBAs, application, database, and ETL developers, and change control management for migrating developed mappings to Prod.
  • Provided production support for the Informatica process; troubleshot and debugged errors.

Environment: Informatica PowerCenter 9.1.0/8.6.1, Informatica Data Quality 9.1.0, flat files, mainframe files, Oracle 11g, Netezza, Quest Toad 9.1, UNIX shell scripting, Windows 2000/2003, SQL Server 2005/2008, Salesforce.com, Web Services, Teradata.

Confidential

Role: Lead/Senior Informatica Developer

Confidential Financial/Bayview Asset Management is a mortgage investment company focused on providing capital and servicing solutions to banks and financial companies.

Responsibilities:

  • Performed performance tuning by identifying bottlenecks in Informatica mappings and sessions, and by using EXPLAIN PLAN in Oracle through TOAD.
  • Created complex mappings in PowerCenter Designer 8.6 using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations.
  • Extensively used TOAD to analyze the queries in the existing mappings to better understand the business logic implemented.
  • Responsible for mentoring developers and reviewing code for mappings developed by other developers.
  • Fixed existing issues by introducing data cleansing techniques into the mappings rather than cleaning the source data files manually.
  • Analyzed the existing mappings and understood the data flow process.
  • Led and interacted with business analysts to understand the business requirements; involved in analyzing requirements to refine transformations.
  • Wrote SQL and PL/SQL stored procedures, triggers, and cursors to implement business rules and transformations.
  • Created pre-session, post-session, pre-SQL, and post-SQL commands for email notifications with the Email task, and to update target tables after data is loaded.
  • Created and used tasks such as Email, Command, and Control in Informatica Workflow Manager and monitored jobs in Workflow Monitor.
  • Developed UNIX scripts for file management, such as zipping and unzipping files.
  • Developed automated and scheduled load processes using the Tidal scheduler.
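
The EXPLAIN PLAN based tuning mentioned above follows a standard Oracle pattern: capture the plan for a candidate statement, then read it with DBMS_XPLAN. The sketch below writes the SQL*Plus commands to a script file rather than running them, since they need a live Oracle session; the tables and predicates are illustrative only:

```shell
#!/bin/sh
# Sketch: generate the SQL*Plus commands used to inspect an Oracle
# execution plan for a candidate statement.
cat > /tmp/explain_check.sql <<'EOF'
-- Option 1: AUTOTRACE shows the plan without fetching rows
SET AUTOTRACE TRACEONLY EXPLAIN
SELECT o.order_id, c.cust_name
  FROM orders o JOIN customers c ON c.cust_id = o.cust_id
 WHERE o.order_dt >= DATE '2014-01-01';
SET AUTOTRACE OFF

-- Option 2: EXPLAIN PLAN plus DBMS_XPLAN for a formatted plan
EXPLAIN PLAN FOR
SELECT COUNT(*) FROM orders WHERE order_dt >= DATE '2014-01-01';
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
EOF
# In a real run: sqlplus -s user/pass@db @/tmp/explain_check.sql
cat /tmp/explain_check.sql
```

Full-table scans or unexpected join orders in the plan usually point at missing indexes or stale statistics, which is where the mapping/session-level tuning above picks up.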

Environment: Informatica PowerCenter 8.6.1, flat files, mainframe files, Oracle 11g, Quest Toad 9.1, UNIX shell scripting, Windows 2000/2003, SQL Server 2005/2008.

Confidential

Role: Informatica Developer

The purpose of the project is to design a data warehouse for the company containing all employee information and contact details, and to maintain records of employee salaries, benefits, and packages. Informatica PowerCenter is the ETL tool used to pull data from the front-end application, which is stored in different sources such as Oracle, SQL Server, and flat files.

Responsibilities:

  • Worked closely with the business analyst to understand the various source data.
  • Involved in designing logical and physical models for staging and transition of the data.
  • Involved in designing a star schema based data model with dimensions and facts.
  • Designed the ETL mapping document to map source data elements to the target based on the star-schema dimensional model.
  • Designed and developed Informatica mappings for data load and data cleansing.
  • Created stored procedures, functions, and triggers as per the business requirements.
  • Used Update Strategy and Lookup transformations to implement the Change Data Capture (CDC) process.
  • Partitioned sources to improve session performance.
  • Developed several complex mappings, mapplets, and reusable transformations to facilitate one-time and monthly loading of data.
  • Utilized the Aggregator, Joiner, Router, Lookup, and Update Strategy transformations to model various standardized business processes.
  • Worked with the scheduler to schedule Informatica sessions on a daily basis and to send an email after the completion of loading.
  • Created design documents and performed unit testing on the mappings.
  • Created complex SCD Type 1 and Type 2 mappings using Dynamic Lookup, Joiner, Router, Union, Expression, and Update Strategy transformations.
  • Worked on identifying bottlenecks in sources, targets, and mappings and improving performance.
  • Extensively used Workflow Manager to create connections, sessions, tasks, and workflows.
  • Performance-tuned stored procedures, transformations, mappings, sessions, and SQL queries.
  • Worked on database triggers, stored procedures, functions, and database constraints.
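
The SCD Type 2 logic referenced above comes down to two statements: expire the current dimension row when a tracked attribute changes, then insert the new version as current. A sketch in Oracle-style SQL follows, written to a script file since it needs a live database; the table and column names are illustrative, and change detection beyond a single attribute is omitted for brevity:

```shell
#!/bin/sh
# Sketch: generate the two statements behind a typical SCD Type 2 load.
cat > /tmp/scd2.sql <<'EOF'
-- 1. Expire the current version where a tracked attribute changed
UPDATE dim_customer d
   SET d.eff_end_dt = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1 FROM stg_customer s
                WHERE s.cust_id = d.cust_id
                  AND s.address <> d.address);

-- 2. Insert the new version as the current, open-ended row
INSERT INTO dim_customer
      (cust_key, cust_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.address,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s;
EOF
cat /tmp/scd2.sql
```

In the Informatica mappings above, the same effect is achieved with a Dynamic Lookup to detect changes and an Update Strategy transformation routing rows to DD_UPDATE or DD_INSERT.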

Environment: Informatica 7.1/6.x, Oracle 10g, SQL Server 2005, Autosys, Business Objects 6.5

Confidential

Role: Software Programmer/Oracle Developer

The purpose of the project is to store information about all the clients to whom the company provides services. Oracle is the primary database used to maintain the data: information pulled from front-end applications, flat files, etc. is stored in correspondingly created Oracle tables and views, and reports are generated from this data to analyze the business.

Responsibilities:

  • Involved in creating functional and program specification documents.
  • PL/SQL development and implementation.
  • Extensive performance tuning (SQL tuning, PL/SQL tuning).
  • Involved in ETL development using native Oracle tools (SQL*Loader, Oracle PL/SQL).
  • Involved in the creation of partitioned tables and indexes.
  • Involved in the creation and modification of packages, stored procedures, and triggers.
  • Involved in writing complex SQL queries to implement the business requirements.
  • Involved in loading data into the database using SQL*Loader.
  • Performed data migration using PL/SQL stored procedures.
  • Involved in data modeling using ERwin.
  • Created stored procedures using EXECUTE IMMEDIATE and REF CURSORs (Native Dynamic SQL).
  • Involved in cleaning and maintaining migrated data.
  • Used PL/SQL collections extensively for high performance of stored procedures.
  • Involved in index monitoring to identify unused indexes.
  • Involved in analyzing schemas, tables, and indexes as part of optimization.
  • Used Data Pump to refresh the development and test database environments.
  • Worked with autonomous transactions in triggers and functions in order to include logging.
  • Involved in creating UNIX shell scripts for automating various routine database tasks.
  • Made use of AUTOTRACE and EXPLAIN PLAN for monitoring individual query performance.
  • Used TOAD and SQL*Plus for PL/SQL development.
  • Created namespaces, query subjects, calculated fields, filters, joins, and packages in Cognos Framework Manager.
  • Created user classes and users in Access Manager.
  • Installed ReportNet in an IIS, Windows, and Oracle environment.
  • Created standard and ad hoc reports using Query Studio and Report Studio.
  • Created user groups and roles to implement security in ReportNet.
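
The SQL*Loader work listed above is driven by a control file describing the input layout and target table. A minimal sketch follows; the file, table, and column names are hypothetical:

```
LOAD DATA
INFILE '/data/inbound/clients.dat'
BADFILE 'clients.bad'
APPEND
INTO TABLE clients_stg
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
(client_id, client_name, city, signup_dt DATE 'YYYY-MM-DD')
```

It would be invoked along the lines of `sqlldr userid=user/pass control=clients.ctl log=clients.log`; rows that fail conversion land in the BADFILE for cleanup.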

Environment: Oracle 9i, HP-UX 11.0, SQL Developer, TOAD, Cognos ReportNet (Report Studio, Query Studio, Cognos Connection, Framework Manager), Cognos Series 7 (Impromptu Administrator, PowerPlay Enterprise Server, PowerPlay Transformer, Access Manager), Microsoft IIS

Confidential

Role: Software Programmer/Oracle Developer

The project is based on automation machines used for residential and commercial purposes. The day-to-day production activity of these machines has to be stored in a database. Oracle is used as the main database for storing business activity information, and reports are generated from this data.

Responsibilities:

  • Involved in creating functional and program specification documents.
  • PL/SQL development and implementation.
  • Extensive performance tuning (SQL tuning, PL/SQL tuning).
  • Involved in ETL development using native Oracle tools (SQL*Loader, Oracle PL/SQL).
  • Involved in the creation of partitioned tables and indexes.
  • Involved in the creation and modification of packages, stored procedures, and triggers.
  • Involved in writing complex SQL queries to implement the business requirements.
  • Involved in loading data into the database using SQL*Loader.
  • Performed data migration using PL/SQL stored procedures.
  • Involved in data modeling using ERwin.
  • Created stored procedures using EXECUTE IMMEDIATE and REF CURSORs (Native Dynamic SQL).
  • Involved in cleaning and maintaining migrated data.
  • Used PL/SQL collections extensively for high performance of stored procedures.
  • Involved in index monitoring to identify unused indexes.
  • Involved in analyzing schemas, tables, and indexes as part of optimization.
  • Used Data Pump to refresh the development and test database environments.
  • Worked with autonomous transactions in triggers and functions in order to include logging.
  • Involved in creating UNIX shell scripts for automating various routine database tasks.
  • Made use of AUTOTRACE and EXPLAIN PLAN for monitoring individual query performance.
  • Used TOAD and SQL*Plus for PL/SQL development.
  • Created namespaces, query subjects, calculated fields, filters, joins, and packages in Cognos Framework Manager.
  • Created user classes and users in Access Manager.
  • Installed ReportNet in an IIS, Windows, and Oracle environment.
  • Created standard and ad hoc reports using Query Studio and Report Studio.
  • Created user groups and roles to implement security in ReportNet.

Environment: Oracle 9i, HP-UX 11.0, SQL Developer, TOAD, Cognos ReportNet (Report Studio, Query Studio, Cognos Connection, Framework Manager), Cognos Series 7 (Impromptu Administrator, PowerPlay Enterprise Server, PowerPlay Transformer, Access Manager), Microsoft IIS
