
Informatica Developer Resume


Minneapolis, MN

SUMMARY

  • Around 7 years in information technology, with wide-ranging experience in software development, project management, data integration and Informatica Data Quality (IDQ).
  • Strong data warehousing experience using Informatica PowerCenter 10.1/9.6/9.0/8.6/8.0/7 (Source Analyzer, Repository Manager, Mapping Designer, Mapplets, Transformations, Workflow Manager, Task Developer), IDQ 10.1/9.5, Power Connect for ETL, OLTP, OLAP, data mining and scheduling tools. Worked with heterogeneous source systems such as flat files, RDBMS and data warehouses, and with parameterized variables.
  • Involved in driving complex discussions with business analysts and systems architects to determine solutions.
  • Prepared deployment documents and assisted the deployment team during migration.
  • Converted business rules into technical specifications for the ETL processes that populate the fact and dimension tables of the data warehouse.
  • Data modeling knowledge in dimensional data modeling, star schema, snowflake schema, fact and dimension tables, physical and logical data modeling and de-normalization techniques.
  • Created mappings for various sources such as DB2, Oracle and flat files to load and integrate data into the warehouse.
  • Scheduled jobs and automated workloads using the Autosys, $Universe and Informatica schedulers.
  • Created IDQ mappings to standardize and validate data.
  • Developed UNIX shell scripts and scheduled them using scheduling tools.
  • Expertise in data warehousing with a variety of data manipulation skills, including data migration, data modeling, data profiling, data cleansing and data validation.
  • Worked in a production support role, processing daily, weekly and monthly loads in DataStage based on business requirements.
  • Expertise in OLTP/OLAP system study, analysis and E-R modeling, developing database schemas such as star schema and snowflake schema used in relational, dimensional and multidimensional modeling.
  • Scheduled the ETL jobs daily, weekly and monthly in Autosys based on business requirements.
  • Experience in integrating heterogeneous data sources such as Oracle, SQL Server 2005/2008, DB2, XML files and flat files into the staging area, as well as homogeneous sources.
  • Performed Unit testing for the mappings developed.
  • Experience in dimensional modeling (Erwin), star/snowflake schemas, fact/dimension tables, business process analysis, production support, data cleansing, data analysis and performance tuning of sources, targets, mappings, sessions and SQL.
  • Cleansed, labeled and fixed data gaps with IDQ.
  • Experienced in UNIX work environment, File Transfers, Job Scheduling and Error Handling.
  • Scheduled sessions and batches on the Informatica server using Informatica Workflow Manager.
  • Performed application systems development tasks, which included working with users to define system needs and analyzing, designing and developing applications to meet those needs.
  • Provided technical support for very complex, highly critical mapping designs. Planned and coordinated tests, resolved production problems and identified opportunities to improve performance.
  • Tuned ETL job performance by optimizing the SQL used in transformations and fine-tuning the database (see the sketch after this list).
  • Determined alternate solutions with risk analysis and identified opportunities to improve availability and advance business initiatives.
  • Hands on experience in resolving the production issues in 24/7 environment.
  • Worked on deploying the product in development environment and exposing the application to rigorous testing via functional, performance and user acceptance testing in test environments.
  • Reviewed the code developed by co-team members.
  • Experience in the Sales, Retail, Financial and Banking domains.
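
The following is a minimal, illustrative sketch of the kind of source-qualifier SQL override used for the performance tuning described above. The SALES_STG and CUSTOMER_DIM tables and their columns are hypothetical; the point is that pushing the join and filter down to Oracle reduces the rows flowing through the mapping.

    -- Hypothetical source qualifier override (Oracle): filter and join at the source
    -- so fewer rows reach the downstream transformations.
    SELECT s.sale_id,
           s.customer_id,
           s.sale_amt,
           c.customer_name
    FROM   sales_stg s
           JOIN customer_dim c
             ON c.customer_id = s.customer_id
    WHERE  s.load_date >= TRUNC(SYSDATE) - 1   -- assumed incremental window
      AND  s.sale_amt IS NOT NULL;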

TECHNICAL SKILLS

Data Warehousing: Informatica Power Center 10.2/9.6/9.5/9.0/8.6, Informatica Analyst, OLAP, OLTP, ETL, Informatica Data Quality 10.1/9.5, Data Cleansing, Data Profiling.

Tools & Utilities: Informatica PowerCenter, Toad, $Universe, Autosys, PuTTY, SQL Developer, Tableau

Databases: Oracle 11g/10g/9i, MS SQL Server 2008/2005

Operating Systems: Linux, UNIX, Windows, Sun Solaris 2.x

Languages/Web: SQL, PL/SQL, UNIX, C

PROFESSIONAL EXPERIENCE

Confidential, Minneapolis, MN

Informatica Developer

Tools: Informatica 10.1/9.5.1, Toad, Informatica Data Quality, $Universe Scheduler

Operating system: Windows XP/Win 7/Win 8, LINUX and UNIX

Database: SQL Server, Oracle

Languages: SQL/PLSQL

Responsibilities:

  • Extracted data from relational databases and flat files using ETL processes.
  • Used Informatica Version Control for checking in all versions of the objects used in creating the mappings, workflows to keep track of the changes in the development, test and production environment.
  • Developed UNIX shell scripts and scheduled them using scheduling tools.
  • Performed thorough validation of the source data and of the loads into the dimension and fact tables.
  • Developed new Informatica mappings and workflows based on specifications.
  • Extensively used XML transformation to generate target XML files.
  • Used parameters and variables for incremental loads and CDC.
  • Created mappings to load different sources sent by IMS and CDC (flat files, Oracle tables) into the target Oracle database.
  • Used Session parameters, Mapping variable/parameters and created Parameter files for imparting flexible runs of workflows based on changing variable values.
  • Implemented Type 2 Slowly Changing Dimensions to maintain historical data in the sales data mart as per the business requirements (see the sketch after this list).
  • Worked on developing metric dashboards in Tableau at varying levels of complexity.
  • Created IDQ mappings to standardize and validate data.
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Cleansed, labeled and fixed data gaps with IDQ.
  • Developed and tested all the backend programs, Error Handling Strategies and update processes.
  • Built mappings and workflows in a cost-effective manner.
  • Worked on all phases of data warehouse development life-cycle, ETL design and implementation, and support of new and existing applications.
  • Involved in analyzing business requirements and translating requirements into functional and technical design specifications.
  • Used Debugger wizard to remove bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.
  • Captured erroneous data records, corrected them and loaded them into the target system.
  • Implemented efficient and effective performance tuning procedures.
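
A minimal SQL sketch of the Type 2 Slowly Changing Dimension logic referenced above. The CUSTOMER_DIM and CUSTOMER_STG tables, their columns and the sequence are hypothetical, and in the project the logic was built in PowerCenter mappings (Lookup, Expression and Update Strategy transformations); the SQL only illustrates the expire-and-insert pattern.

    -- Step 1 (illustrative): expire the current version of any changed row
    UPDATE customer_dim d
    SET    d.eff_end_dt   = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
    WHERE  d.current_flag = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   customer_stg s
                   WHERE  s.customer_id   = d.customer_id
                     AND  s.customer_addr <> d.customer_addr);

    -- Step 2 (illustrative): insert a new open-ended version for new or changed rows
    INSERT INTO customer_dim (customer_key, customer_id, customer_addr,
                              eff_start_dt, eff_end_dt, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.customer_addr,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   customer_stg s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   customer_dim d
                       WHERE  d.customer_id   = s.customer_id
                         AND  d.current_flag  = 'Y'
                         AND  d.customer_addr = s.customer_addr);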

Confidential, New Jersey

Informatica Developer

Tools: Informatica 10.1/9.5.1, Informatica Data Quality 10.1/9.5, Toad, Autosys, Tableau

Operating system: Windows XP/Win 7/Win 8, LINUX and UNIX.

Database: Oracle 9i, Oracle 10g, Teradata

Responsibilities:

  • Gathering requirements from the Data Modelers and understanding the design from the technical analysis provided.
  • Analyzing the technical documentation and concluding the design of the ETL loads.
  • Extracting data from relational databases and flat files using ETL processes.
  • Developing the ETL mappings according to the loads for Stage and Reporting.
  • Worked on data profiling and the development of various data quality rules using Informatica Data Quality (see the sketch after this list).
  • Building mappings and workflows in a cost-effective manner.
  • Worked on all phases of data warehouse development life-cycle, ETL design and implementation, and support of new and existing applications.
  • Designed, developed and implemented Informatica Developer mappings for data cleansing using Address Validator, Labeler, Association, Parser, Expression, Filter, Router, Lookup transformations, etc.
  • Imported mapplets and mappings from Informatica developer (IDQ) to Power Center.
  • Created generic rules in Informatica Developer as per the business requirements.
  • Worked with IDQ toolkit, Analysis, data cleansing, data matching, data conversion, reporting and monitoring capabilities.
  • Involved in analyzing business requirements and translating requirements into functional and technical design specifications.
  • Designed complex UNIX scripts and automated them to run the workflows daily, weekly and monthly.
  • Used Debugger in Informatica Power Center Designer to check the errors in mapping.
  • Used Debugger wizard to remove bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.
  • Supported true ad hoc querying in Tableau, with predictable and consistent performance for all queries and no requirement for known query workloads or precomputed aggregates or summaries.
  • Captured erroneous data records, corrected them and loaded them into the target system.
  • Implemented efficient and effective performance tuning procedures.
  • Used the Address Validator transformation in IDQ to take partial addresses and populate the full address in the target table.
  • Resolving performance issues by identifying bottlenecks at various levels and tuning them for better performance.
  • Resolving production issues for any scheduled job failures and reporting issues to the concerned teams.
  • Documenting the cutover plans, design documents and runbooks for the production support teams.
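
A minimal sketch of the kind of completeness and uniqueness checks that sit behind the data quality rules mentioned above. The profiling itself was done with Informatica Data Quality; the query below is only an illustration, and the CUSTOMER_STG table and its columns are hypothetical.

    -- Illustrative profiling query: row counts, null counts and fill rates for key columns
    SELECT COUNT(*)                                             AS total_rows,
           COUNT(customer_id)                                   AS customer_id_not_null,
           COUNT(DISTINCT customer_id)                          AS customer_id_distinct,
           SUM(CASE WHEN postal_code IS NULL THEN 1 ELSE 0 END) AS postal_code_nulls,
           ROUND(100 * COUNT(email) / COUNT(*), 2)              AS email_fill_pct
    FROM   customer_stg;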

Confidential

Informatica Developer

Tools: Informatica 8.6.1, CA-7 Scheduling tool

Operating system: Windows XP, UNIX

Database: Oracle 10g, IBM DB2, SQL Server

Responsibilities:

  • Gathered Business requirements from concerned users and analyzed the Functional Requirement Documents (FRD) and drafted the Technical Specifications based on the Informatica ETL standards and also conducted Impact and feasibility analysis.
  • Used Informatica Power Center for extraction, transformation and loading (ETL) of data in the data warehouse.
  • Created various Informatica jobs using transformations such as Source Qualifier, Normalizer, Union, Expression, Filter, Connected Lookup, Unconnected Lookup, Update Strategy, Sorter, Aggregator, Router, and Sequence Generator etc.
  • Used Informatica Debugger for debugging mappings.
  • Well versed in loading data from source systems such as flat files, Oracle and DB2 into targets such as Oracle tables and flat files.
  • Designed Reusable Transformations using the Transformation Developer & created Mapping Parameters and Variables.
  • Extracted data from sets of flat files through direct or indirect file methods.
  • Created various Informatica tasks such as Sessions, Email, Decision, Control, Command, Timer, Assignment, Event Wait and Event Raise.
  • Developed error logging and error handling methodologies (see the sketch after this list).
  • Implemented a global parameter file to centralize all the parameters and variables for database connections, file paths, etc.
  • Tuned Informatica mappings for better performance using various strategies based upon the analysis done.
  • Involved in unit testing and system testing to check that the data extracted from different source systems was loaded into the targets accurately, according to the user requirements.
  • Created UNIX shell wrapper scripts to run the workflows and FTP the target files to the mainframe server.
  • Installed Informatica PowerCenter 9.1 and configured it in Development, QA and Production.
  • Created deployment groups for deploying the mapping from one environment to another.
  • Responsible for Release migration in between different environments.
  • Provided 24/7 on-call support to resolve Informatica-related issues on the production system and supported the ETL applications.
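
A minimal PL/SQL sketch of the error logging and handling approach noted above. In the project the error handling was implemented in Informatica sessions and reusable objects; the block below, with its hypothetical SALES_STG, SALES_FACT and ETL_ERROR_LOG tables, only illustrates the capture-log-reraise pattern.

    -- Illustrative error-handling block: log any load failure, then re-raise it
    DECLARE
      v_code  NUMBER;
      v_msg   VARCHAR2(4000);
    BEGIN
      INSERT INTO sales_fact (sale_id, customer_key, sale_amt, load_date)
      SELECT sale_id, customer_key, sale_amt, SYSDATE
      FROM   sales_stg;
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        -- SQLCODE/SQLERRM must be captured in variables before use in SQL
        v_code := SQLCODE;
        v_msg  := SUBSTR(SQLERRM, 1, 4000);
        ROLLBACK;
        INSERT INTO etl_error_log (job_name, error_code, error_msg, logged_at)
        VALUES ('LOAD_SALES_FACT', v_code, v_msg, SYSTIMESTAMP);
        COMMIT;
        RAISE;
    END;
    /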

Confidential

Informatica Developer

Tools: Informatica 8.1.1, Toad, Informatica Data Quality

Operating system: Windows XP/Win 7/Win 8, LINUX and UNIX

Database: SQL Server, Oracle

Languages: SQL/PLSQL

Responsibilities:

  • Developed different mappings using transformations such as Aggregator, Lookup, Expression, Update Strategy, Joiner, Router, etc., to load the data into staging tables and then to the target.
  • Created ETL mappings using Informatica Power Center to move Data from multiple sources like Flat files, Oracle into a common target area such as Staging, Data Warehouse and Data Marts.
  • Involved in source code review of the mappings created by my team members.
  • Involved in preparing the deployment document and assisted the deployment team during migration.
  • Used Workflow Manager for Creating, Validating, Testing and running the sequential and concurrent Sessions.
  • Created unit test cases for the Informatica mappings (see the sketch after this list).
  • Developed the automated and scheduled load processes using the Control-M scheduler.
  • Created folders using Informatica PowerCenter Repository Manager to store source, target, transformation, mapplet and mapping metadata.
  • Implemented Type 2 Slowly Changing Dimensions to maintain the historical data in Data mart as per the business requirements.
  • Created sessions, sequential and concurrent batches for the mappings using Workflow Manager.
  • Involved in unit testing of the mappings I developed.
  • Involved in Designing and monitoring the jobs using Control-M tool based on user needs.
  • Created, scheduled and monitored the sessions and batches on a run-on-command or run-on-time basis using Informatica PowerCenter Workflow Manager.
  • Discussed the technical approach to development with the team lead.
  • Reviewed the code developed by fellow team members.
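
A minimal sketch of the kind of reconciliation query used in the unit test cases above, assuming hypothetical SALES_STG (staging) and SALES_FACT (target) tables; real test cases also covered individual transformation rules and rejected records.

    -- Illustrative unit-test check: compare source vs. target row counts and totals
    SELECT (SELECT COUNT(*)      FROM sales_stg)  AS src_rows,
           (SELECT COUNT(*)      FROM sales_fact) AS tgt_rows,
           (SELECT SUM(sale_amt) FROM sales_stg)  AS src_amt,
           (SELECT SUM(sale_amt) FROM sales_fact) AS tgt_amt
    FROM   dual;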
