Informatica Developer Resume
Summary:
- 5+ years of experience in the IT industry with an in-depth understanding of the software development life cycle: analysis, software development, system testing, bug fixing, documentation and implementation.
- 4+ years of experience as an ETL Developer building ETL processes for data warehouse/data migration projects using the Informatica PowerCenter 8.x/9.x, PowerExchange and SSIS (SQL Server Integration Services) ETL tools.
- Experience in the Life Insurance and Mortgage Insurance domains.
- Design and development of Business Intelligence solutions using Informatica PowerCenter and SSIS within Business Intelligence Development Studio.
- Extensive knowledge of the PowerCenter components: PowerCenter Designer, Repository Manager, Workflow Manager and Workflow Monitor.
- Thorough knowledge of creating ETL processes to load data from different data sources, and a good understanding of Informatica installation.
- Experienced in developing mappings and transformations in PowerCenter Designer and executing them through Workflow Manager from multiple sources to multiple targets.
- Experienced in creating Slowly Changing Dimension (SCD) Type 1 and Type 2 implementations for data warehousing projects with fact and dimension tables.
- Experience in integrating data from multiple relational databases (Oracle, SQL Server, IBM DB2 and Sybase) as well as flat files.
- Excellent knowledge and experience in creating source-to-target mappings, edit rules and validation, transformations, and business rules.
- Good understanding of data modeling (dimensional and relational) concepts such as dimension and fact tables, the Star Schema model and the Snowflake Schema model.
- Involved in the complete lifecycle of projects, including writing test scripts and performing unit testing by writing SQL against the database and validating against end-user reports.
- Good knowledge of data migration from SQL Server to DB2 using SSIS, handling challenges such as data integrity issues, data inconsistencies and high data volumes.
- Extensive knowledge of data analysis, data requirement analysis and data mapping for ETL processes, and of the Autosys scheduler.
- Involved in implementation activities and Support activities post implementation.
- Experienced in creating effective test data and developing thorough unit test cases to ensure successful execution.
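The SCD Type 2 approach mentioned above can be illustrated with a minimal, self-contained sketch. SQLite stands in for the warehouse database here, and the `dim_customer` table, its columns and the sample values are hypothetical, not taken from any project described in this resume:

```python
import sqlite3

# Minimal sketch of SCD Type 2 dimension maintenance; table and
# column names are illustrative assumptions only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        cust_key   INTEGER PRIMARY KEY AUTOINCREMENT,
        cust_id    INTEGER,   -- natural key from the source system
        city       TEXT,      -- the tracked attribute
        eff_date   TEXT,
        end_date   TEXT,      -- NULL while the row is current
        is_current INTEGER
    )
""")

def apply_scd2(cust_id, city, load_date):
    """Expire the current row if the tracked attribute changed, then insert a new version."""
    row = cur.execute(
        "SELECT cust_key, city FROM dim_customer "
        "WHERE cust_id = ? AND is_current = 1", (cust_id,)).fetchone()
    if row and row[1] == city:
        return  # no change, keep the current version
    if row:
        cur.execute(
            "UPDATE dim_customer SET end_date = ?, is_current = 0 "
            "WHERE cust_key = ?", (load_date, row[0]))
    cur.execute(
        "INSERT INTO dim_customer (cust_id, city, eff_date, end_date, is_current) "
        "VALUES (?, ?, ?, NULL, 1)", (cust_id, city, load_date))

apply_scd2(101, "Milwaukee", "2008-05-01")  # initial load
apply_scd2(101, "Auburn", "2012-09-01")     # attribute changed -> new version
history = cur.execute(
    "SELECT city, is_current FROM dim_customer WHERE cust_id = 101 "
    "ORDER BY cust_key").fetchall()
print(history)  # → [('Milwaukee', 0), ('Auburn', 1)]
```

In PowerCenter the same effect is typically achieved with a Lookup on the dimension plus an Update Strategy transformation routing rows to insert or update.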
TECHNICAL SKILLS:
Design Knowledge : Datawarehouse
Languages : C, XML, SQL, Korn Shell Scripting, T-SQL (Sybase), IBM Mainframe JCL, PL/I
BI Tools : Informatica PowerCenter 8.x/9.x, PowerExchange, Microsoft SSIS
Domain Knowledge : Life Insurance, Mortgage Insurance
Operating Systems : Windows 98/2000/XP and UNIX
Databases : Oracle 10g, Sybase, IBM DB2, MS SQL Server 2005
Job Scheduler : Autosys
Version Control : Serena PVCS Dimensions
Tools : Toad 9.6, Embarcadero Rapid SQL
Defect Tracking Tool : HP Quality Center
Management Tools : HP Service Center, MS Office 2007, MS Visio 2007
Data Modelling : Erwin
Professional Experience:
Informatica Developer Sept 2012 – Present
Confidential, Auburn, IN
The project involves migrating data from legacy systems to SAP in collaboration with the SAP team: understanding the requirements, build, testing and implementation. The project employs an iterative model, delivering the solution through multiple releases.
Responsibilities:
- Interaction with the SAP team to understand the migration specifications and underlying Legacy system.
- Evaluate the data migration solution to determine the scope of the overall effort.
- Create the Informatica mappings using Power Exchange to perform the data migration from Legacy systems to SAP.
- Run the mappings to test the migration and validate the test results on SAP.
- Test the mappings and validate the results to ensure the migration performs according to the defined specifications.
- Prepare a validation process for the input data to avoid costly overruns by understanding and addressing data issues early.
- Document the steps in detailed mapping documents.
- Prepare the code migration document and work with the release team to migrate the code from Development to UAT and Production servers.
- Production support, including correction of the migrated data according to application needs.
Environment: Informatica Power Exchange, SQL Server 2005, SAP IDocs
Informatica Developer May 2008 – Oct 2011
Confidential, Milwaukee
The Retirement Market Initiative (RMI) project identified the specific client segments that could be considered RMI clients, developed targeted solutions, and built a data mart for each of those segments.
The project aimed to launch a new initiative, Retirement Market, which helps financial representatives market retirement products to clients close to retirement. The project pulled data from various OLTP application sources, selectively extracted, transformed and loaded it into the Sybase/DB2 data warehouse using Informatica PowerCenter 8.1.1, and then moved data from the warehouse into an OLAP server to provide Business Intelligence analysis services.
Responsibilities:
- Correlate the business and technical aspects and come up with a design for the mappings.
- Translate the design spec into simple ETL coding and mapping standards. Worked with the Informatica PowerCenter tools: Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer and Transformation Designer.
- Created Informatica workflows and sessions, ran the sessions to load the targets, and performed debugging and unit testing of the mappings.
- Integrated the workflows with Autosys jobs to schedule the mapping runs.
- Performed Informatica code migrations between Informatica repositories.
- Designed and developed Informatica mappings to load data from source systems into the ODS, then from the data warehouse into the data mart, creating the dimension and fact tables.
- Extensively used Power Center to design multiple mappings with embedded business logic.
- Created mappings with transformations such as Lookup, Joiner, Rank, Source Qualifier, Aggregator, Expression, Filter, Update Strategy, Normalizer, Router, Union and Sorter, plus mapplets, in the Informatica Designer to populate target tables efficiently per business need.
- Shared knowledge with end users and clients, and documented the design, the development process, the process flow and the schedule of each mapping/job.
- Created UNIX shell scripts for triggering/automating the execution of the Informatica mappings.
- Design/develop the UNIX scripts and deployment in production using APLUS tool.
- Create the schema for the new mappings; define the data types, constraints, indexes in the database.
- Created mappings using mapping parameters and variables, and session-level parameters such as connection strings.
- Worked on Command tasks, Event Wait tasks, Event Raise tasks and Timer tasks to implement business logic.
- Designed and developed UNIX scripts for creating, dropping tables which are used for scheduling the jobs.
- Maintained metadata, naming standards and warehouse standards for future application development.
- Used the Autosys scheduler to trigger the UNIX scripts, which in turn execute the Informatica sessions.
- Created database objects such as stored procedures, views and stored functions.
- Involved in writing test plan, Unit, System integration, user testing and Implementation of the modules.
- Created the Change Data Capture (CDC) logic for the mapping for the daily load.
- Created mappings with parameters and variables.
- Worked on various lookup caches: static, dynamic and persistent.
- Optimized SQL overrides to filter unwanted data and improve session performance.
- Developed User Manual and involved in training the User.
- Perform thorough end to end system testing of the functionality.
- Assist in User Acceptance testing and ensure that all issues are resolved and an official sign off is obtained from the users regarding the same.
- Monitor the scheduled batch jobs for the execution of the workflows for any issues/failure or unusual behaviour.
- Performed root cause analysis for any failures, and designed and implemented solutions to prevent such failures in the future.
- Conduct a thorough code review and ensure that the outcome is in line with the objective and all the processes and standards are followed.
- Review the detail level design and ensure that the IT standards are followed.
- Implementation of Code Fixes and Changes in Production.
- Prepare a detailed implementation plan and checklist.
- Ensure that all steps in implementation checklist have been verified.
- Perform checkouts of the components after the production implementation.
- Facilitate and control the implementation of approved changes efficiently and with acceptable risk to the existing and new IT services supporting vital business functions.
- Create change management records for components implemented in production. Follow all the processes and standards for change control as specified by the clients Change management policy.
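The timestamp-based Change Data Capture logic for a daily load, as described above, can be sketched roughly as follows. SQLite stands in for the Sybase/DB2 databases, and all table, column and job names are illustrative assumptions:

```python
import sqlite3

# Illustrative sketch of watermark-based CDC for a daily load; the
# control table and all names here are hypothetical, not project-specific.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_policy (policy_id INTEGER, premium REAL, upd_ts TEXT);
    CREATE TABLE tgt_policy (policy_id INTEGER, premium REAL);
    CREATE TABLE etl_control (job_name TEXT PRIMARY KEY, last_run_ts TEXT);
    INSERT INTO etl_control VALUES ('policy_load', '2011-01-01');
    INSERT INTO src_policy VALUES (1, 100.0, '2010-12-15');  -- loaded on a prior run
    INSERT INTO src_policy VALUES (2, 250.0, '2011-01-05');  -- changed since last run
""")

def run_daily_load(job_name, run_ts):
    """Pull only rows changed since the last successful run, then advance the watermark."""
    last_ts = cur.execute(
        "SELECT last_run_ts FROM etl_control WHERE job_name = ?",
        (job_name,)).fetchone()[0]
    changed = cur.execute(
        "SELECT policy_id, premium FROM src_policy WHERE upd_ts > ?",
        (last_ts,)).fetchall()
    cur.executemany("INSERT INTO tgt_policy VALUES (?, ?)", changed)
    cur.execute("UPDATE etl_control SET last_run_ts = ? WHERE job_name = ?",
                (run_ts, job_name))
    return len(changed)

loaded = run_daily_load('policy_load', '2011-01-06')
print(loaded)  # → 1  (only the row changed after the watermark)
```

In PowerCenter this is commonly done with a mapping variable or parameter file holding the last-run timestamp, referenced in the Source Qualifier filter.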
Environment: Informatica Power Center 8.1.1/9.0.1, Sybase, IBM DB2, Oracle 10g, Toad, Visio, UNIX, Windows 2007.
Client – Confidential, Milwaukee August 2007 – April 2008
Role: Data Retention -Developer
PeopleSoft General Ledger handles the financial accounting of Northwestern Mutual. The core PeopleSoft GL tables hold millions of records and were growing dramatically each day. The historic data kept in the databases was causing performance issues and was no longer accessed. Data Retention was a re-engineering project in which I developed a new automated process for purging historic data from the General Ledger system. I was later also involved in production support of the General Ledger components developed.
Responsibilities:
- Identify the PeopleSoft General Ledger system tables for the data retention activities.
- Analysis and understanding of the core business logic behind the tables identified for retention process.
- Create a detailed design document on the Retention/purge process for the GL tables.
- Create the system appreciation document with the available options/proposals for the data retention activities.
- Deliver the designed proposals to the end users and obtain sign-off on the chosen proposal.
- Analyze and understand the business requirements, and design the database for migrating the data from the SQL database.
- Conduct review meetings for design reviews and code reviews.
- Develop the Sybase stored procedures with the Business logic, UNIX scripts for executing the procedures.
- Create test cases to test the Retention/Purge process for the tables identified.
- Execute the test cases and validate the results to make sure that the intended records were purged and all the business logic was applied when the data was purged.
- Document the implementation activities and checklist.
- Perform the testing in multiple test databases to ensure the smooth flow of the process.
- Coordinated with different teams like QA and testing teams, DBA’s for the project execution.
- Worked closely with the QA team during the testing phase and fixed bugs that were reported.
- Deployed the code to different environment in coordination with the engineering team and DBA’s.
- Migrate the UNIX scripts and stored procedures to the production environment with the APLUS tool.
- Implement the components created and validate the results after the first run.
- Production support of the components developed for the General Ledger purge.
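A batched retention purge of the kind described above might look roughly like this. This is an illustrative Python/SQLite sketch rather than the actual Sybase stored procedures; the table name, cutoff date and batch size are assumptions:

```python
import sqlite3

# Hedged sketch of a batched retention purge; all names and values
# here are illustrative, not from the actual GL system.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE gl_journal (jrnl_id INTEGER PRIMARY KEY, post_date TEXT)")
cur.executemany("INSERT INTO gl_journal VALUES (?, ?)",
                [(i, '2001-06-30' if i <= 5 else '2007-06-30') for i in range(1, 9)])

def purge_before(cutoff_date, batch_size=2):
    """Delete rows older than the cutoff in small batches to limit lock time and log growth."""
    total = 0
    while True:
        cur.execute(
            "DELETE FROM gl_journal WHERE jrnl_id IN ("
            "  SELECT jrnl_id FROM gl_journal WHERE post_date < ? LIMIT ?)",
            (cutoff_date, batch_size))
        if cur.rowcount == 0:
            break
        total += cur.rowcount
        conn.commit()  # commit per batch, mirroring a looped stored procedure
    return total

purged = purge_before('2005-01-01')
remaining = cur.execute("SELECT COUNT(*) FROM gl_journal").fetchone()[0]
print(purged, remaining)  # → 5 3
```

Batching the deletes and committing per batch is the usual way to keep a purge from blocking OLTP activity or filling the transaction log.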
Environment: Sybase Database, UNIX scripts, Sybase stored procedure, Autosys.
Client – Confidential, California August 2006 – July 2007
Role: SSIS Developer
Investigation Data Migration is a complete restructuring of the current SQL databases. The Investigations MS Access application is being revamped into a new web-based application, and along with this a more reliable and stable database was designed in DB2. Investigation Data Migration is an end-to-end project focused on migrating the existing data to the DB2 database, enabling smooth operation of the new web-based Investigations application. The project involves solution modelling, design, build, testing of the built code and implementation of the solution. The project employs an iterative model, implemented through multiple releases.
Responsibilities:
- Identify the tables for migration required for providing the data for the new web based application.
- Analyze and understand the business requirements, and design the database for migrating the data from the SQL database.
- Analyzed user requirements for the system and identified business rules for the data migration.
- Evaluate the data migration solution to determine the scope of the overall effort.
- Evaluated factors such as the number and complexity of legacy systems, type of migration, cost estimation for the full project lifecycle, legacy data quality, amount of data and history to be converted, target application architecture (i.e. schema, tables, data types), and resource bandwidth and availability.
- Understand the existing SQL database structure and get the bad data corrected.
- Cleansing of the incorrect, redundant and outdated records.
- Create the SSIS data mappings to perform the one-time data migration from the SQL database to the new database designed in DB2.
- Tested and validated to ensure the migration performs according to the defined specifications.
- Loaded the data into the new application environment.
- Create test cases to ensure the Data integrity, data correctness and the record count of the data migrated.
- Avoid costly project overruns by understanding and addressing data migration issues early.
- Cleansed the data by categorizing and standardizing it.
- Transformed the data by applying business logic in order to adhere to target requirements.
- Documented the steps in detailed mapping documents so that an ETL developer could apply the logic without having to worry about the reason for the transformations.
- Assist the QA team to perform the testing of the migrated data.
- Worked with DBA to set up development, test, stage and production environments.
- Prepared code migration document and worked with release team in migrating the code from Development to UAT, Production Servers.
- Reviewed code, design and test plans as appropriate throughout project lifecycle to assure IT compliance.
- Production support, including correction of the migrated data according to application needs.
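The record-count and data-integrity validation described above can be sketched as follows. SQLite stands in for SQL Server/DB2, and the table and column names are hypothetical:

```python
import sqlite3

# Illustrative post-migration validation, assuming source and target
# tables share the same key; all names here are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_case (case_id INTEGER PRIMARY KEY, status TEXT);
    CREATE TABLE tgt_case (case_id INTEGER PRIMARY KEY, status TEXT);
    INSERT INTO src_case VALUES (1, 'OPEN'), (2, 'CLOSED'), (3, 'OPEN');
    INSERT INTO tgt_case VALUES (1, 'OPEN'), (2, 'CLOSED'), (3, 'OPEN');
""")

def validate_migration():
    """Compare row counts, then look for keys present in the source but missing from the target."""
    src_count = cur.execute("SELECT COUNT(*) FROM src_case").fetchone()[0]
    tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_case").fetchone()[0]
    missing = cur.execute(
        "SELECT case_id FROM src_case "
        "EXCEPT SELECT case_id FROM tgt_case").fetchall()
    return src_count == tgt_count and not missing

ok = validate_migration()
print(ok)  # → True when counts match and no source keys are missing
```

The same count-and-key reconciliation can be run after each migration release to confirm nothing was dropped before sign-off.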
Environment: SQL Server Integration Services, IBM DB2, SQL database, MS Access
EDUCATION:
- BE Electronics & Communication Engineering