Sr. Informatica Developer Resume
New York
SUMMARY
- 9+ years of professional experience in Information Technology, with extensive experience in the analysis, design, development, deployment, testing, enhancement, and maintenance of Business Intelligence solutions using Informatica PowerCenter and Informatica Data Quality across the Banking, Rebate, and Insurance business verticals.
- Strong work experience in Informatica Data Quality 9.6 (IDQ) - Informatica Developer and Informatica Analyst tools for analysis, data cleansing, rule creation, and data standardization.
- Extensive experience with data Extraction, Transformation, and Loading (ETL) from heterogeneous data sources across multiple relational databases such as Oracle, SQL Server, DB2, and Teradata, and worked on integrating data from VSAM files, flat files (fixed-width and delimited CSV), and XML files into a common reporting and analytical data model.
- Worked with Informatica Power Exchange to extract mainframe data by creating a data map from a COBOL copybook, and to handle change data capture (CDC) from the Oracle database.
- Knowledge of data warehousing concepts and principles (Kimball/Inmon) - Star Schema and Snowflake schema modelling.
- Designed and implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2.
- Experience in all stages of the Software Development Life Cycle (SDLC) and its methodologies, such as Waterfall and Agile.
- Worked on profiling and analyzing source data using IDQ to determine its accuracy and completeness, clarifying the structure, relationships, content, and derivation rules of the data, and implemented data quality solutions such as standardization and matching of source data.
- Gained experience in IBM DB2/UDB database.
- Developed various mappings using different transformations like Joiner, Lookup (Connected and Unconnected), Source Qualifier, Router, Filter, Expression, Rank, Normalizer, Sequence Generator, Transaction Control, Union, Aggregator, Update Strategy, Java, XML, Data Masking, Mapplet etc. and worked with Informatica mapping variables / parameters and session variables.
- Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Command, Control, Decision, Session etc. in the workflow manager.
- Worked with Oracle, PL/SQL Stored Procedures, Functions, Indexes and Triggers and involved in Query Optimization and worked on Teradata SQL, Stored Procedures and Utilities.
- Involved in the creation of SQL queries to generate reports, answers and dashboards using different reporting tools like OBIEE, SSRS etc.
- Scheduled ETL loads using utilities like CA7 scheduler, Autosys, and Control-M, and used the Harvest tool to maintain code changes.
- Strong in UNIX/Linux shell scripting and extensively involved in writing FTP and NDM/Connect Direct programs to transfer files between Windows and UNIX.
- Experience in Identifying Bottlenecks in ETL Processes and Performance tuning of the production applications using Database Tuning, Partitioning, Index Usage, Session partitioning, Load strategies, commit intervals, transformation tuning and implementing Pushdown optimization.
- Experience in Analyzing, Creating, Updating and Maintaining of project documents, including business requirements, functional and non-functional requirements, functional design (HLD and LLD) and data mapping.
- Good interpersonal, communication, and presentation skills, and a desire to work in dynamic and challenging environments.
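The SCD Type 2 loads mentioned above follow a standard pattern: when a tracked attribute changes, the current dimension row is expired and a new current version is inserted. A minimal sketch in Python with SQLite (the table and column names here are hypothetical, since the actual Informatica mappings are not shown in this resume):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        eff_date    TEXT,
        end_date    TEXT,
        is_current  INTEGER
    )
""")
# Existing current row for customer 101.
cur.execute("INSERT INTO dim_customer VALUES (101, 'Detroit', '2015-01-01', '9999-12-31', 1)")

def scd2_upsert(cur, customer_id, city, load_date):
    """Expire the current row if the tracked attribute changed, then insert a new version."""
    row = cur.execute(
        "SELECT city FROM dim_customer WHERE customer_id = ? AND is_current = 1",
        (customer_id,),
    ).fetchone()
    if row is not None and row[0] == city:
        return  # no change, nothing to do
    if row is not None:
        cur.execute(
            "UPDATE dim_customer SET end_date = ?, is_current = 0 "
            "WHERE customer_id = ? AND is_current = 1",
            (load_date, customer_id),
        )
    cur.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
        (customer_id, city, load_date),
    )

scd2_upsert(cur, 101, "Chicago", "2016-06-01")  # attribute changed -> new version
versions = cur.execute(
    "SELECT city, is_current FROM dim_customer WHERE customer_id = 101 ORDER BY eff_date"
).fetchall()
print(versions)  # [('Detroit', 0), ('Chicago', 1)]
```

A Type 1 load would instead overwrite the attribute in place, keeping no history.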
TECHNICAL SKILLS
ETL tools: Informatica PowerCenter 9.6/8.5.0 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica Data Quality, and Informatica Power Exchange
Databases: MS SQL Server 2005/2008, Oracle 10g/11g, Teradata, IBM DB2
Operating Systems: Unix (Sun Solaris, AIX), Windows XP/ Vista/7/8/10, Linux, and Mainframe
Reporting Tools: SSRS, OBIEE
Data Modelling Tools: MS Visio, Erwin
Languages: SQL, PL/SQL, UNIX/Linux Shell scripts, FTP and NDM scripts, COBOL, C, C++, Java
Scheduling Tools: CA7 scheduler, Autosys, CA ESP Scheduler, Informatica Scheduler
Methodologies: Data Warehousing Design, Data Modelling, Logical and Physical Database Design
Testing Tools: Quality Center and Application Lifecycle Management (ALM)
Special Tools: Toad, Putty, WinSCP, B2B Data Transformation Studio, Harvest, ServiceNow, PTM, MS Office 2007/2010, Excel, PowerPoint.
PROFESSIONAL EXPERIENCE
Confidential, New York
Sr. Informatica Developer
Responsibilities:
- Studied the existing vendor system extensively and was involved in all phases of the System Development Life Cycle (SDLC) using the Agile Scrum model.
- Served as the on-site Informatica Lead with the client: gathered business requirements, clarified them with the business team, passed them to the offshore development team, and led the project through design, development, unit testing, and deployment.
- Designed new dimensional data model after understanding of existing processes with the help of DBA team.
- Prepared the DDLs for the staging/work tables and coordinated with the DBA to create the development environment and data models.
- Performed data analysis, data cleansing, data matching, exception handling, reporting, and monitoring using Informatica Data Quality (IDQ).
- Provided relevant outputs and results from the data quality procedures, including any ongoing procedures that run after the project's end.
- Involved in the design and implementation of Informatica Data Quality (IDQ v9.1) applications, standards, data profiling, and best practices covering process sequence, dictionaries, data quality lifecycles, naming conventions, and version control.
- Gained experience with complex quality rule and index design, development, and implementation patterns covering cleansing, parsing, standardization, validation, scorecards, exceptions, notifications, and reporting, with both ETL and real-time considerations.
- Documented at a functional level how the procedures work within the data quality applications.
- Created standards for the Mapplets and code migration from development through testing.
- Developed and implemented additional data quality monitoring based on requirements gathered from business units and maintained the business playbook which lists all re-useable business rule assets.
- Provided Data Quality Scorecards, Dashboards and Ad-Hoc reporting.
- Defined and deployed data quality programs and tools on enterprise data quality projects.
- Acquired strong experience in data profiling and data quality rules development using Informatica Data Quality tools.
- Performed data profiling, data mapping, data validation, data manipulation, data analysis, use case, test cases, etc.
- Acquired strong experience in translating business problems into actionable data quality initiatives.
- Used Informatica Data Quality 9.6.0 (IDQ) for analysis, data cleansing, rule creation, and data standardization.
- Profiled and analyzed source data using Informatica IDQ to determine its accuracy and completeness, clarifying the structure, relationships, content, and derivation rules of the data.
- Worked on Informatica Power Exchange for extracting Mainframe data by creating a Data map from Copybook.
- Extensively worked with all the client components of Informatica like Repository Manager, Designer, Workflow Manager, Workflow Monitor.
- Developed ETL design using various transformations like Source Qualifier, Aggregator, Sorter, Joiner, Lookup, Stored Procedure, Router, Filter, Transaction Control, Sequence Generator, Expression, JAVA, and XML as per necessity for source-to-target data mappings and to load the target table.
- Implemented Slowly Changing Dimensions (SCD) phenomenon using Informatica ETL mapping to load SCD Type1 and Type2 tables.
- Worked on Teradata utilities including BTEQ, FastLoad, MultiLoad (MLoad), and TPump, implemented SQL and PL/SQL stored procedures, functions, indexes, and triggers, and was involved in query optimization.
- Implemented the solution to transfer all the transactions and reports present in the existing system to the new system on the first day of migration as part of the initial load.
- Created NDM scripts to transfer/extract flat files from other servers.
- Designed an error-handling strategy for trapping errors in a mapping and routing them to an error table.
- Involved in both Unit and system testing of Implemented ETL components and UNIX scripts and uploaded the Test results into ALM.
- Responsible for deliverables with reviews conducted, and created code documentation.
- Scheduled Production Jobs in CA7 Scheduler through CA7 DOCs creation.
- Deployed the developed components through Harvest packages to maintain code changes.
- Verified the deployed components after migration and Monitored the batch for the first few days as part of the validation.
- Provided regular status on all work activities, ensuring management stayed updated on all issues and risks.
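The error-handling strategy described above, in which rows failing validation are trapped and routed to an error table rather than the target, can be sketched as follows. This is a hypothetical Python illustration of the pattern (the record layout, field names, and validation rules are made up; the actual implementation was an Informatica mapping):

```python
# Incoming records, some of which fail validation.
records = [
    {"acct_id": "1001", "amount": "250.00"},
    {"acct_id": "",     "amount": "99.50"},   # missing key    -> error table
    {"acct_id": "1003", "amount": "oops"},    # bad amount     -> error table
]

target_rows, error_rows = [], []
for rec in records:
    errors = []
    if not rec["acct_id"]:
        errors.append("missing acct_id")
    try:
        amount = float(rec["amount"])
    except ValueError:
        errors.append("non-numeric amount")
    if errors:
        # Route the failed row, plus a description, to the error table.
        error_rows.append({**rec, "error_desc": "; ".join(errors)})
    else:
        target_rows.append({"acct_id": rec["acct_id"], "amount": amount})

print(len(target_rows), len(error_rows))  # 1 2
```

In the mapping itself this routing is typically done with a Router transformation feeding an error target, with the error description built in an upstream Expression.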
Environment: Informatica PowerCenter 9.5, Informatica Power Exchange 9.5, Informatica Data Quality (IDQ) 9.5, SQL, PL/SQL, UNIX, ALM, Oracle 11g, Teradata 13.x, Mainframe, NDM, WinSCP, Windows, Toad, CA7 scheduler, MS Visio, ServiceNow, Harvest.
Confidential, Michigan
Sr. Informatica Developer
Responsibilities:
- Interacted with the Business users to identify the process metrics and various key dimensions and measures and worked on the implementation of the project using Agile approach.
- Involved in building the data integration of Enrollment, Eligibility, and Claims information from multiple vendor sources into the warehouse, and generated extracts from there for analysis.
- Populated the data from various sources into the target tables comprising rebate information.
- Worked with Informatica Power Exchange to handle change data capture (CDC) data from the source and loaded it into the schema containing the target tables.
- Developed complex mappings to populate and incrementally load the source data to the staging area using Joiner, Sorter, Connected Lookup, Router, Filter, Update Strategy, and Expression transformations in the Informatica Designer, taking performance into consideration.
- Created Workflows and used various tasks like Email, Assignment, Decision and Session in the workflow manager.
- Worked with heterogeneous sources, extracting data from an Oracle database and flat files and loading it into a relational Teradata warehouse.
- Performed FTP to extract the flat files from other servers.
- Involved in the development of PL/SQL Stored Procedures, Functions, Indexes and Triggers to process business data.
- Acquired knowledge of Teradata utilities including FastLoad, MultiLoad (MLoad), and TPump.
- Contributed to performance tuning of the existing project from source to target level, and debugged invalid mappings using breakpoints, testing stored procedures, functions, sessions, and batches to check the target data.
- Developed and implemented additional data quality monitoring based on requirements gathered from business units.
- Managed and prioritized a request queue for new data quality monitoring requests over new data elements, new data quality rules, and maintenance to existing rules.
- Provided a high level of data quality awareness within the Data Service Delivery Unit assigned.
- Provided reports for data matching, consult on pattern analysis and provided insight to data quality errors.
- Supported data cleansing activities and spreadsheets that are utilized for Winshuttle load scripts.
- Troubleshot long-running sessions and fixed the issues.
- Worked extensively with variables and parameters in both mapping and session level to pass the value and to maintain the source and target environment information.
- Involved in Unit Testing of the code with the testing team and resolved the issues encountered before migrating into production.
- Produced technical documentation for the changes/enhancements in accordance with SDLC guidelines, managed the whole deployment process and post-deployment validation in the production environment, and scheduled the jobs in production using Autosys.
- Analyzed design specifications, development, and technical and user documentation and requirements surrounding the EDW technology components.
- Established a technology vision that integrated the various source systems into the target EDW infrastructure
- Provided regular status on all work activities, ensuring management stayed updated on all issues and risks.
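The CDC handling above amounts to applying a stream of change records, each tagged with an operation code, to the target. A minimal Python sketch of that apply step (the keys, columns, and operation codes here are hypothetical illustrations, not the actual PowerExchange output format):

```python
# Current state of the target table, keyed by business key.
target = {"M-01": {"rebate_pct": 2.0}}

# Change records captured from the source: (operation, key, row).
# I = insert, U = update, D = delete.
changes = [
    ("I", "M-02", {"rebate_pct": 1.5}),
    ("U", "M-01", {"rebate_pct": 2.5}),
    ("D", "M-02", None),
]

for op, key, row in changes:
    if op == "I":
        target[key] = row
    elif op == "U":
        target[key].update(row)
    elif op == "D":
        target.pop(key, None)

print(target)  # {'M-01': {'rebate_pct': 2.5}}
```

Applying changes in capture order matters: the same key may be inserted, updated, and deleted within one batch, as M-02 is here.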
Environment: Informatica Power Center 9.5, Informatica Power Exchange, Oracle, Teradata, SQL, Flat Files, Toad, WinSCP, UNIX / Windows, Erwin, Korn Shell, Quality Center, Autosys, FTP
Confidential
Informatica Developer
Responsibilities:
- Created sessions, database connections and batches using the Informatica Workflow Manager.
- Performed Unit testing on the Informatica code by running in the debugger and writing simple test scripts in the database, thereby tuning it by identifying and eliminating the bottlenecks for the optimum performance.
- Developed common routine mappings and made the usage of mapping variables, mapping parameters and variable functions.
- Extensively worked with a Netezza database to implement data cleanup, performance tuning techniques.
- Gathered and prepared analysis based on information from internal and external sources to evaluate and demonstrate program effectiveness and efficiency.
- Created and Configured Workflows, Worklets, and Sessions to load the data to target warehouse Netezza tables using Informatica Workflow Manager.
- Created various UNIX shell scripts for Job automation of data loads.
- Developed various command tasks to automate the pre-session jobs and worked on performance tuning to improve the load.
- Developed/modified the PL/SQL Procedures and Functions to enhance the reusability of the code to be used later in various applications.
- Configured the mappings to handle the updates to preserve the existing records using the Update Strategy transformation.
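The Update Strategy behavior in the last bullet is essentially an upsert: incoming rows update existing target records by key and insert new ones, while records absent from the feed are preserved. A minimal sketch in Python with SQLite (table and column names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE tgt (id INTEGER PRIMARY KEY, val TEXT)")
cur.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, "a"), (2, "b")])

incoming = [(2, "b2"), (3, "c")]  # 2 exists -> update, 3 is new -> insert
for key, val in incoming:
    # Try the update first; fall back to an insert if no row matched.
    updated = cur.execute("UPDATE tgt SET val = ? WHERE id = ?", (val, key)).rowcount
    if updated == 0:
        cur.execute("INSERT INTO tgt VALUES (?, ?)", (key, val))

rows = cur.execute("SELECT id, val FROM tgt ORDER BY id").fetchall()
print(rows)  # [(1, 'a'), (2, 'b2'), (3, 'c')]
```

Row 1 is untouched because it was not in the incoming feed, which is the "preserve existing records" part of the behavior.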
Environment: Informatica PowerCenter 9.1, Oracle, shell scripting, SQL Developer, COBOL, UNIX, Netezza, Erwin, TOAD, SQL*Plus, SQL*Loader, Autosys, Teradata
Confidential
Informatica Developer
Responsibilities:
- Extensively interacted with the client, business users, and reporting team for requirement gathering, and assisted in designing the Functional Requirement Document.
- Created several mappings in the Informatica Designer 8.5 by using several transformations like Source Qualifier, Joiner, Aggregator, Rank, Expression, Sequence Generator, Lookup (Connected and Unconnected), Update Strategy, Transaction Control, Java, Normalizer, Filter, Router etc.
- Performed various levels of data analysis, data cleansing, data profiling, and data validation using different transformations in the Informatica Designer.
- Extensively worked with the Informatica Workflow Manager and Workflow Monitor to schedule, run, debug, and test the application in development and to obtain performance statistics.
- Created workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Command, Control, Decision, and Session in the Workflow Manager.
- Prepared ETL mapping documents for every mapping and Data Migration document for smooth transfer of a project from development to testing environment and then to production environment.
- Involved in unit testing and system testing to check that the data loaded into the targets, extracted from different source systems, was accurate according to the user requirements.
- Migrated repository objects, services, and scripts from the development environment to the production environment, gaining extensive experience in troubleshooting and solving migration and production issues.
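The load-accuracy checks used in the unit and system testing above are commonly implemented as source-to-target reconciliations: row counts, key sets, and column sums compared between the source extract and the loaded target. A hypothetical sketch of such a check (data and column layout are made up for illustration):

```python
# Source extract and loaded target, as (key, amount) rows.
source_rows = [("A", 10.0), ("B", 20.5), ("C", 5.25)]
target_rows = [("A", 10.0), ("B", 20.5), ("C", 5.25)]

def reconcile(source, target, amount_idx=1):
    """Return a dict of check-name -> pass/fail for the two row sets."""
    return {
        "row_count": len(source) == len(target),
        "amount_sum": abs(sum(r[amount_idx] for r in source)
                          - sum(r[amount_idx] for r in target)) < 1e-9,
        "key_set": {r[0] for r in source} == {r[0] for r in target},
    }

results = reconcile(source_rows, target_rows)
print(results)  # {'row_count': True, 'amount_sum': True, 'key_set': True}
```

Checks like these catch dropped rows, duplicated keys, and truncated or mistyped numeric columns before the load is signed off.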
Environment: Windows XP/NT, Informatica Power Center 8.5, Informatica Power Exchange 8.5, SSRS, MS SQL server, PL/SQL, Flat Files, MS Visio, Linux, Mainframe, ALM, FTP, PTM, Informatica Scheduler.