
Informatica Developer Resume Profile


SUMMARY

  • Over ten years of software experience in client/server application design, development and implementation.
  • Performed maintenance and upgrades.
  • Strong knowledge in OLAP systems, Kimball and Inmon methodology and models, dimensional modeling using star schema and snowflake schema.
  • Worked on Oracle 8i/9i/10g/11g, Teradata V2R6, SQL Server 2008/2012 and DB2 databases.
  • Extensive experience in Extraction, Transformation and Loading (ETL) of data from various sources into data warehouses and data marts using Informatica PowerCenter tools.
  • Extensively worked on Informatica Designer, Workflow Manager and Informatica Web Services.
  • Experienced in integrating various data sources like Oracle, Teradata, SQL Server and flat files into the data warehouse.
  • Experienced in data cleansing, data profiling and data analysis.
  • Excellent functional and technical knowledge in addition to customization experience.
  • Excellent knowledge and experience in technical design and documentation.
  • Extensively used SQL and PL/SQL to write Stored Procedures, Functions, Packages and Cursors.
  • Expertise in different data load strategies and scenarios such as historical dimensions, surrogate keys and summary facts.
  • Expertise in developing and maintaining overall test methodologies and strategy, documenting test plans, test cases and editing/executing test cases and test scripts.
  • Experienced in Perl and UNIX shell scripting (Korn, Bourne) on UNIX and Linux platforms.
  • Proficient experience in performance tuning SQL and PL/SQL scripts.
  • Exposure to development, testing, debugging, implementation, documentation, user training and production support.
  • Excellent analytical, programming, written and verbal communication skills with the ability to interact with individuals at all levels.

SKILLS

  • ETL Tools: Informatica PowerCenter 9.5.1/8.6.1/7.1.3/6.1, Informatica Data Quality 9.5.1
  • Databases: Oracle 11g/10g/9i/8i, SQL Server 2008/2012, Greenplum, DB2, MS Access, Teradata V2R4/V2R6
  • Database Utilities: Toad for Oracle 10g, SQL*Loader, Embarcadero Rapid SQL, AQT, SQL Developer
  • Languages: C, C++, Java, K-Shell, SQL, PL/SQL, Perl
  • Web Technologies: HTML, DHTML, XML
  • Operating Systems: Windows 9X/NT/2000/XP/Vista/7/2K8, MS-DOS, HP-UX, UNIX, IBM AIX 4.3/4.2

EXPERIENCE

Confidential

Wachovia Securities

  • Wachovia Corporation (NYSE: WB) is the fourth largest bank in the US and one of the largest providers of financial services to retail, brokerage and corporate customers nationwide.
  • Wachovia had assets of USD 521 billion and stockholders' equity of USD 48 billion as of December 31, 2005. Its four core businesses (the general bank, capital management, wealth management, and the corporate and investment bank) serve thirteen million household and business relationships primarily through 3,131 financial centers and 719 retail brokerage offices in fifteen states and Washington, D.C. Wachovia has close to 100,000 employees. Currently, my role is Informatica PowerCenter tech lead, creating designs to load data into applications such as the brokerage data warehouse, operational data store and advisory compensation.
  • Extracted source definitions from various databases like Oracle, DB2 and SQL Server into the Informatica PowerCenter repository.
  • Worked on dimensional modeling: designed and developed star schemas and identified fact and dimension tables.
  • Developed various transformations like Source Qualifier, Sorter transformation, Joiner transformation, Update Strategy, Lookup transformation, Expressions and Sequence Generator for loading the data into target table.
  • Worked with different operational data sources such as Oracle, SQL Server, legacy systems, Excel and flat files, and used Informatica to extract data into the data warehouse.
  • Identified and tracked the slowly changing dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
  • Used Server Manager for session management, database connection management and scheduling jobs to run in the batch process.
  • Created dimensional model star schemas using the Kimball methodology.
  • Developed a number of complex Informatica mappings, mapplets and reusable transformations for different health plan systems to facilitate daily, monthly and yearly data loads.
  • Involved in fixing invalid mappings, testing of Stored Procedures and Functions, unit and integration testing of Informatica Sessions, Batches and the Target Data.
  • Wrote documentation to describe program development, logic, coding, testing, changes and corrections.
  • Recovered the failed Sessions. Installed and configured Informatica.
  • Optimized the mappings by changing the logic and reduced running time.
  • Finished the tasks within the allocated time for every release, on time and on target.

Environment: Informatica PowerCenter 9.5.1/9.1.0, Informatica Data Quality 9.5.1, UNIX, Oracle 10g, DB2/AIX64 10.1.4, SQL Server 2012, Flat Files, XML Files, SQL and PL/SQL, Toad for Oracle 10g.

Confidential

Travel Industry

  • Orbitz Worldwide is currently creating a customer data infrastructure program. A key initiative for the firm is to develop a unified view of its customers across the organization. The solution will utilize Greenplum database technology, Oracle, Informatica and Toad to build a technology platform that will enable Orbitz Worldwide to target customers through highly customized marketing campaigns. The multi-year effort will deploy functionality in several phases, including new data sourcing and building a new clickstream data mart.
  • With the deployment of the new data sources phase, business users will be able to identify customers who have previously used the website and are open to purchasing products based on their usage history. Pursuing such targeted cross-sell and deals campaigns is expected to bring incremental revenue to Orbitz Worldwide. The clickstream data mart phase of the program will enable tracking of individual campaign performance by capturing click data through URL parameterization and pixel tracking technologies.
  • Involved in the complete software development life cycle, including analysis, design, development and testing, developing new systems to improve the functionality of existing ones.
  • Assisted in converting user requirements into high level and detailed technical specifications.
  • Developed ETL mappings using Informatica, wrote UNIX scripts to perform data cleansing and manipulation, and built Opswise scheduling tasks to batch-execute processes based on time and workflow dependencies.
  • Documented test plans for testing individual components, performed integration testing of the system and helped create user acceptance test criteria prior to production deployments. Utilized Test Director tools to document and execute test cases as appropriate.
  • Handled user requests for data extractions using SQL*Plus analysis tools.
  • Facilitated user training and system appreciation sessions; created MS Excel spreadsheets and MS PowerPoint presentations to highlight system features and showcase sample data.
  • Scheduled production system deployments and supported the system through initial stages to ensure stability of business processes that depended on the computer system.

Environment: Informatica PowerCenter 9.1.0, Greenplum, Flat Files, Oracle, SQL Server 2008, PL/SQL, SQL dbx, PGAdmin3, Opswise Scheduler.

Confidential

Investments

  • The CME stats application deals with compliance and statistical data related to volume, open interest and prices of all contracts on various products. Traders trade contracts (futures, options, forwards, etc.) on a daily basis, and all of this data is maintained in key tables such as daily bulletin and contract daily data. The data is sourced from the clearing sources, transformed through Informatica PowerCenter and maintained in the STATS database.
  • Interacted with business partners to understand the application's business, gathered and analyzed requirements, and captured them in technical design documents.
  • Worked closely with data population developers, multiple business units and a data solutions Engineer to identify key information.
  • Designed complex Informatica mappings with transformations such as Aggregator, Expression, Joiner, Filter, Source Qualifier, Union Transformation, connected/unconnected Lookups, Update Strategy, Stored Procedures, Web services, HTTP, Java transformation and Router Transformation to transform and load the data from and to flat files and relational tables.
  • Used Informatica Web Services to support the feeds. Created and used WSDLs.
  • Involved in developing the technical documents from functional specifications.
  • Identified and tracked the slowly changing dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
  • Created and configured workflows, worklets, and Sessions to transport the data to target using Informatica workflow manager. Also created various tasks such as command tasks, email tasks, timer tasks, event wait and event raise tasks in the workflow manager.
  • Involved in writing UNIX shell scripts in KSH to transfer the files between multiple systems and to load data from files to database.
  • Imported data from Excel workbooks through Informatica. Also sent daily feeds through FTP.
  • Involved in analysis, meeting with business partners in the agile development team.
  • Migrated objects through all project phases (DEV, QA and PRD) per Change Requests (CRs), and trained developers to maintain the system in production.
  • Involved in tuning the Informatica objects to increase the performance of the Informatica loads.
  • Performed tuning by identifying and eliminating bottlenecks to increase performance.
  • Performed data quality analysis to validate the input data based on the cleansing rules.
  • Worked on database connections, SQL joins, loops, aliases, views and aggregate conditions, and wrote various PL/SQL procedures, functions and triggers to process business logic in the database. Tuned SQL queries for better performance.
  • Performed unit testing on the Informatica codes using the Informatica debugger and by manually checking through SQL queries.
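As a rough illustration of the KSH file-transfer and load work described above, a step that archives an inbound feed file and builds a SQL*Loader command might look like the following sketch. The directory, file and control-file names are hypothetical placeholders, not values from the actual project.

```shell
#!/bin/ksh
# Hypothetical sketch of a ksh step: archive an inbound feed file with a
# date stamp, then emit the SQL*Loader command that would load it.
# All paths and the control-file name are illustrative placeholders.
SRC_DIR=${SRC_DIR:-/tmp/inbound}
ARCH_DIR=${ARCH_DIR:-/tmp/archive}
stamp=$(date +%Y%m%d)

stage_file() {
  f="$1"
  mkdir -p "$ARCH_DIR" || return 1
  cp "$f" "$ARCH_DIR/$(basename "$f").$stamp" || return 1   # keep a dated copy
  # Build (but do not run) the load command; a real job would execute it
  # and check the sqlldr exit code before continuing the batch.
  echo "sqlldr control=stats_load.ctl data=$f log=$f.log"
}

mkdir -p "$SRC_DIR"
echo "sample row" > "$SRC_DIR/daily_bulletin.dat"
stage_file "$SRC_DIR/daily_bulletin.dat"
```

A production version would also verify row counts from the sqlldr log and transfer the dated archive copy to downstream systems over FTP.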

Environment: Informatica PowerCenter 8.6.1, Informatica Web Services, UNIX Shell, Oracle 10g/11g, Flat Files, XML Files, SQL and PL/SQL, SQL Assistant, Toad, Oracle Adminserver, SQL Developer, HP Quality Center, UC4 Scheduler, Linux.

Confidential

  • The Nationwide Investments database, known as HUB, has many subscribers (such as the ALFA and LMS source systems) and publishers (such as the PAM and SMF source systems). This project aims to load the investments subscriber data into HUB using a Common Interface Module (CIM) built with Informatica PowerCenter 8.6.1 and Perl scripts. The batch execution engine is driven by Perl, and the Informatica workflows are scheduled through Maestro. Data related to securities, positions, transactions and holdings is loaded to flat files using CIM models, and these flat files are loaded into HUB, the data warehouse, using the Eagle Pace application. Eagle Pace provides uploading and downloading utilities; its uploader loads the flat files into HUB, which is used as a source for a Netezza data mart from which reports are derived using Business Objects.
  • Interacted with business partners to understand the application's business, gathered and analyzed requirements, and captured them in technical design documents.
  • Worked closely with data population developers, multiple business units and a data solutions engineer to identify key information.
  • Designed complex Informatica mappings with transformations such as Aggregator, Expression, Joiner, Filter, Source Qualifier, Union Transformation, connected/unconnected Lookups, Update Strategy, Stored Procedure, Web services, HTTP, Java Transformation and Router Transformation to transform and load the data from and to flat files and relational tables.
  • Used Informatica Web Services to support the feeds. Created and used WSDLs.
  • Involved in developing the technical documents from functional specifications.
  • Identified and tracked the slowly changing dimensions, heterogeneous sources and determined the hierarchies in dimensions.
  • Created and configured Workflows, Worklets, and Sessions to transport the data to target using Informatica Workflow Manager. Also created various tasks such as command task, email task, timer tasks, event wait and event raise tasks in the workflow manager.
  • Involved in writing UNIX shell scripts in KSH to transfer the files between multiple systems and to load data from files to database.
  • Imported data from Excel workbooks through Informatica. Also sent daily feeds through FTP.
  • Involved in analysis, meeting with business partners in agile development teams.
  • Migrated objects through all project phases (DEV, QA and PRD) per Change Requests (CRs), and trained developers to maintain the systems in production.
  • Involved in tuning the Informatica objects to increase the performance of the Informatica loads.
  • Performed tuning by identifying and eliminating bottlenecks to increase performance.
  • Performed data quality analysis to validate the input data based on the cleansing rules.
  • Worked on database connections, SQL joins, loops, aliases, views and aggregate conditions, and wrote various PL/SQL procedures, functions and triggers to process business logic in the database. Tuned SQL queries for better performance.
  • Performed unit testing on the Informatica codes using the Informatica debugger and by manually checking through SQL queries.

Environment: Informatica PowerCenter 8.6.1, Informatica Web Services, UNIX Shell, Oracle 10g/11g, Flat Files, XML Files, SQL and PL/SQL, SQL Assistant, Teradata V2R4/V2R6, Toad, Oracle Adminserver, SQL Developer, HP Quality Center, Maestro Scheduler, Linux.

Confidential

Wachovia Securities

  • Wachovia Corporation (NYSE: WB) is the fourth largest bank in the US and one of the largest providers of financial services to retail, brokerage and corporate customers nationwide. Wachovia had assets of USD 521 billion and stockholders' equity of USD 48 billion as of December 31, 2005. Its four core businesses (the general bank, capital management, wealth management, and the corporate and investment bank) serve 13 million household and business relationships primarily through 3,131 financial centers and 719 retail brokerage offices in 15 states and Washington, D.C. Wachovia has close to 100,000 employees. During this project, I played the role of an Informatica PowerCenter tech lead, creating designs to load data into applications such as the brokerage data warehouse, operational data store and advisory compensation.
  • Extracted source definitions from various databases like Oracle, DB2 and SQL Server into the Informatica PowerCenter repository.
  • Worked on dimensional modeling to design and develop star schemas, identified fact and dimension tables.
  • Developed various transformations like Source Qualifier, Sorter transformation, Joiner transformation, Update Strategy, Lookup transformation, Expressions and Sequence Generator for loading the data into the target table.
  • Worked with different operational data sources such as Oracle, SQL Server, legacy systems, Excel and flat files, and used Informatica to extract data into the data warehouse.
  • Identified and tracked the slowly changing dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
  • Used Server Manager for session management, database connection management and scheduling jobs to run in the batch process.
  • Created dimensional model star schemas using the Kimball methodology.
  • Developed a number of complex Informatica mappings, mapplets and reusable transformations for different health plan systems to facilitate daily, monthly and yearly data loads.
  • Involved in fixing invalid mappings, testing stored procedures and functions, and unit and integration testing of Informatica sessions, batches and the target data.
  • Wrote documentation to describe program development, logic, coding, testing, changes and corrections.
  • Recovered failed sessions. Installed and configured Informatica.
  • Optimized the mappings by changing the logic and reduced running time.
  • Finished the tasks within the allocated time for every release, on time and on target.

Environment: Informatica PowerCenter 8.6.1/7.1.3, IBM AIX 4.3/4.2, Oracle 10g, Flat Files, XML Files, SQL and PL/SQL, Toad for Oracle10g.

Informatica Developer

  • This project aimed to migrate from the Nucleus to the Entegrate system. This migration to a new trading system had multiple ripple effects across the business, and hence each of the existing solutions needed to source its data from the new system. During this project, I played the role of technical lead as well as developer on Informatica PowerCenter.
  • Developed and maintained ETL maps to Extract, Transform and Load data from COBOL, xml files into the Enterprise Data warehouse.
  • Converted mainframe data (VSAM files) to ASCII through Informatica.
  • Created complex mappings in Power Center Designer using Source Qualifier, XML Source Qualifier, Aggregate, Expression, Filter, Sequence Generator, Normalizer Transformation, Lookup, Update Strategy, Rank, Joiner, and Stored procedure transformations.
  • Created data conversion mappings to make the data available to all internal applications like PLDR, Libra, A2 and Prism.
  • Interacted with users/analysts to understand the application's business, gathered requirements and captured them in the technical design.
  • Worked closely with data population developers, multiple business units and a data solutions engineer to identify key information.
  • Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system.
  • Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
  • Developed ETL procedures to ensure conformity, compliance with standards and lack of redundancy, translating business rules and functionality requirements into ETL procedures.
  • Used mapping parameters and variables to implement object orientation techniques and facilitate code reusability.
  • Involved in performance tuning of SQL Queries, Sources, Targets and Informatica Sessions for large data files by increasing block size, data cache size, sequence buffer length and target based commit intervals.
  • Used Debugger to test the mappings and fixed the bugs.
  • Created sessions and configured workflows to extract data from various sources, transform it and load it into the data warehouse. Involved in testing DB2 SQL queries.
  • Developed UNIX shell scripts to move Staging files to DB2 database using DB2 bulk loads.
  • Developed UNIX shell script newpmcmd.sh to run the PMCMD commands in UNIX environment to run the workflows using the Autosys scripts.
  • Created, launched scheduled Workflows/sessions by developing the Autosys Jobs by generating the JIL files in UNIX Environment.
  • Involved in migrating the mappings and workflows from Development to Testing and then to Production environments. Improved ETL Performance by using the Informatica Pipeline Partitioning.
  • Created Partitions through the session wizard in the Workflow Manager to increase the performance.
  • Performed Unit Testing, System Testing and Integration testing on large Data Warehouse.
  • Maximized throughput by concurrently processing specified partitions using pass-through, key-range and round-robin partitioning along the data transformation pipeline.
  • Prepared documentation describing development of mappings, coding, testing and design of Autosys Jobs.
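The newpmcmd.sh wrapper mentioned above can be sketched roughly as follows; the integration service, domain and credential variable names are assumptions chosen for illustration, not values from the original project.

```shell
#!/bin/sh
# Hypothetical sketch of a pmcmd wrapper in the spirit of newpmcmd.sh.
# INT_SERVICE, DOMAIN and the PM_USER/PM_PASS variable names are placeholders.
INT_SERVICE=${INT_SERVICE:-INT_SVC_DEV}
DOMAIN=${DOMAIN:-Domain_Dev}

build_pmcmd() {
  folder="$1"; workflow="$2"
  # -wait blocks until the workflow completes, so the Autosys job calling
  # this wrapper sees the workflow's real success/failure exit code.
  echo "pmcmd startworkflow -sv $INT_SERVICE -d $DOMAIN" \
       "-uv PM_USER -pv PM_PASS -f $folder -wait $workflow"
}

# Build (but do not execute) the command for one workflow.
build_pmcmd ETL_DAILY wf_stage_load
```

Keeping the command construction in one function means the Autosys JIL only needs to pass a folder and workflow name, and credential handling stays in one place via environment variables.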

Environment: Informatica PowerCenter 6.x, UNIX, Oracle 9i, Flat Files, XML Files, SQL and PL/SQL, Business Objects XI R2, Toad for Oracle 9i, DB2, UNIX K-shell scripts.

ETL/Informatica Developer

  • This project aimed to migrate the spine mappings for the source systems 'ATLAS', 'PIMS' and 'RCPA' from Ab Initio to Informatica. During this project, I played the role of a developer in the development and unit testing of complex and medium mappings.
  • Extraction, transformation and loading of the data using Informatica.
  • Developed mappings using mapping designer.
  • Developed transformations using the transformation developer.
  • Developed various transformations like Source Qualifier, Sorter, Joiner, Update Strategy, Lookup, Expression and Sequence Generator for loading data into the target tables. Involved in extracting data from Oracle, DB2 and flat files.
  • Created and ran Sessions using workflow manager and monitored using the workflow monitor.
  • Defined target load order plan for loading data correctly into different target tables.
  • Defined mapping variables and parameters to capture load history, row counts and to log errors of a Session run.
  • Developed wrapper workflow with command tasks to control execution of workflows in loop.
  • Implemented performance tuning techniques by identifying and resolving the bottlenecks in Source, Target, Transformations, Mappings and Sessions to improve performances.
  • Developed complex Informatica load maps and error maps for the extraction, transformation and loading (ETL) of data per the business requirements.
  • Developed reusable Mapplets and reusable transformations for standard business units to be used in multiple mappings.
  • Analyzed and created indexes and hints in the database to improve query performance.
  • Performed performance tuning of the mappings and recovered failed sessions. Understood the functional requirements. Created dimensional model star schemas using the Kimball methodology.

Environment: Informatica Power Center 6.1, Business Objects 6.1, Oracle 9i, SQL, PL/SQL, UNIX shell scripts.

ETL/Informatica Developer

  • The objective of the project was to convert a traditional software application into a more flexible and scalable business analysis tool. The project involved reporting for the vehicle-parts procurement business of DC. It also involved maintaining history for the procurement objects to facilitate business analysis and data mining, and integrating all the systems in use to create a single repository for procurement reporting.
  • Involved in installation of Informatica Power Center 6.
  • Designed new database tables to meet business information needs. Designed mapping documents, which serve as a guideline for ETL coding. Prepared functional specifications and scope documents.
  • Prepared mapping documents for loading the data from data warehouse to i2 demand manager.
  • Involved in creating the high-level and low-level design of the ETL process.
  • Defined target load order plans for loading data into different target tables.
  • Designed and developed various kinds of mappings using transformations like Expression, Aggregator, Stored Procedure, Lookup, Filter, Joiner, Rank, Router and Update Strategy.
  • Used Debuggers to test the mappings and fixed the bugs.
  • Prepared migration documents to move the mappings from development to testing and then to production repositories.
  • Prepared ETL standards, naming conventions and wrote ETL flow documentation for stage and ODS.
  • Automated job processing and established automatic email notifications to the concerned persons.
  • Used FTP connections to transfer flat files between i2 demand manager, BPCS, Siebel and SCP.
  • Worked on the database connections, SQL Joins, Views in the database level.
  • Involved in unit, integration, system, and performance testing levels.

Environment: Informatica Power Center 6, Oracle 9i, PL/SQL, Excel, Windows XP.

ETL/Informatica Developer

  • This project involved the migration of 18,000 desktops from the Windows NT operating system to Windows XP at various locations in the United States of America. During this project, I developed simple to medium mappings to load the respective data into the data warehouse.
  • Created medium mappings in Power Center Designer using Aggregate, Expression, Filter, Sequence Generator, Update Strategy, Rank, Joiner, and Stored Procedure transformations.
  • Worked with database connections, SQL Joins, Cardinalities, Loops, Aliases, Views, Aggregate conditions, parsing of objects and hierarchies.
  • Involved in developing several canned reports. Generating reports involved continuous interaction with the users/analysts to gain business knowledge of the company, and interaction with the DBAs to understand the tables and joins needed to get the data for the reports.
  • Tested reports for query optimization and performance before posting them to the repository. Most of the reports involved multiple data providers, free-hand SQL, pie charts, alerts and formulae.
  • Created and developed stored procedures, packages and triggers for Oracle 8i databases using PL/SQL.

Environment: Informatica Power Center 6.5, Designer, UNIX, shell scripting, Toad, Oracle 8i, MS SQL Server 2000, PL/SQL, Windows NT, Windows XP.
