
Oracle And Informatica Developer Resume


AL

SUMMARY

  • Over 8 years of IT experience in the analysis, design, development, testing, and implementation of Data Warehouse applications using Informatica.
  • Proficient in the design and development of data warehouses using Informatica Power Center 9.6/9.5/9.1/9.0.1/8.6/8.5/8.1.1/7.x/6.x/5.x.
  • Good experience with Data Warehouse concepts such as dimension tables, fact tables, slowly changing dimensions, data marts, and dimensional modeling schemas.
  • Expertise in data modeling, development and enhancement.
  • Experience with relational databases such as Oracle, MS SQL Server, Sybase, and MS Access, and with SQL, PL/SQL, TOAD, SQL*Plus, and Linux/UNIX environments.
  • Strong experience in developing data warehouses with the ETL tool Informatica Power Center 9.6/9.5/9.1/9.0.1/8.6/8.1.1/7.x/6.x/5.x.
  • Strong experience in data modeling, including dimensional modeling and E-R modeling, and with OLTP and OLAP systems in data analysis. Very familiar with SCD1 and SCD2 in snowflake and star schemas.
  • Familiar with the SDLC (Software Development Life Cycle) phases of requirements, analysis, design, testing, and deployment for Informatica Power Center projects.
  • Strong expertise in relational database management systems such as Oracle, DB2, MS Access, Teradata, and SQL Server.
  • Worked extensively with complex mappings using different transformations like Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Unconnected / Connected Lookups, Aggregators, and Re-usable transformations.
  • Very good knowledge of UNIX commands, shell scripting, and Awk programming.
  • Good knowledge of Hive for data summarization, querying, and analysis.
  • Expertise in implementing complex business rules by creating robust Mapplets, Sessions, Mappings, and Workflows using Informatica Power Center.
  • Good knowledge of Informatica Master Data Management (MDM) and Informatica Data Quality (IDQ).
  • Extensive experience in extraction, transformation and loading of data from heterogeneous source systems like flat files, Excel, Oracle.
  • Designed data quality plans using IDQ and developed several Informatica mappings.
  • Designed reference data and data quality rules using IDQ and cleansed data in the Informatica Data Quality 9.1 environment.
  • Created mappings, profiles, and scorecards in the IDQ Developer and Analyst tools as per requirements.
  • Performed duplicate analysis using Match and Consolidation transformations.
  • Applied business rules to identify bad records and reported them to the business.
  • Good knowledge of deploying Informatica Power Center Metadata Manager, including metadata connectors for data integration visibility, advanced search and browse of the metadata catalog, data lineage, and visibility into data objects, rules, transformations, and reference data.
  • Experience in Extraction, Transformation and Loading (ETL) data from various data sources into Data Marts and Data Warehouse using Informatica power center components (Repository Manager, Designer, Workflow Manager, Workflow Monitor and Informatica Administration Console).
  • Strong experience in developing Sessions/Tasks, Worklets, and Workflows using the Workflow Manager tools - Task Developer and Workflow & Worklet Designer.
  • Experience in performance tuning of Informatica mappings and sessions to improve performance for the large volume projects.
  • Experience in debugging mappings, identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Good experience in writing UNIX shell scripts, SQL scripts for development, automation of ETL process, error handling and auditing purposes.
  • Designed and developed applications using Informatica, UNIX scripting, SQL, and Autosys.
  • Highly proficient in UNIX shell scripting and in administering the Autosys job scheduler on UNIX machines.
  • Wrote various Autosys JIL scripts - box and command job definitions - to run Linux scripts in production.
  • In-depth knowledge of handling flat files, COBOL files, and XML files.
  • Excellent analytical and problem-solving skills with a strong technical background.
  • Efficient team player with excellent communication and interpersonal skills.
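The Autosys JIL scripting mentioned above can be illustrated with a minimal sketch; the job names, owner, machine, script path, and schedule below are hypothetical stand-ins, not the actual production definitions.

```
/* Hypothetical JIL sketch: a box job holding one command job
   that runs a Linux script on a nightly schedule. */
insert_job: etl_daily_box
job_type: b
owner: etluser
date_conditions: 1
days_of_week: all
start_times: "02:00"

insert_job: etl_load_cmd
job_type: c
box_name: etl_daily_box
command: /opt/etl/scripts/run_load.sh
machine: etl_prod01
std_out_file: /var/log/etl/etl_load.out
std_err_file: /var/log/etl/etl_load.err
```

The box job carries the schedule; the command job inside it inherits the box's start conditions, which is the usual reason for the box-plus-commands pattern.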

TECHNICAL SKILLS

Data Warehousing: Informatica Power Center 9.x/8.x/7.x/6.x/5.x (Designer, Repository Manager, Workflow Manager, Workflow Monitor), Informatica Data Quality.

Databases: Oracle, MS SQL Server, DB2, Teradata.

GUI Tools: TOAD, SQL*Plus, PL/SQL Developer, SQL Developer, PowerBuilder.

Languages/Web: HTML, XML, UNIX shell scripting, MS Excel.

Operating systems: Windows, Linux

PROFESSIONAL EXPERIENCE

Confidential, AL

Informatica Developer

Responsibilities:

  • Understood client business requirements and documented them.
  • Prepared mapping documents based on client requirement specifications.
  • Prepared ETL design documents and functional & technical specifications.
  • Created Informatica mappings; extensively used connected and unconnected Lookups, Aggregator and Rank transformations, Mapplets, source filters in Source Qualifiers, and Routers to manage data flow into multiple targets.
  • Created Data Marts and loaded the data using Informatica Tool.
  • Applied rules and policies using the ILM (Information Lifecycle Management) workbench for the Data Masking transformation and loaded data into targets.
  • Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
  • Troubleshot errors in ILM jobs with the Informatica offshore team; followed and maintained policies and guidelines for data movement, adhering to client standards, using the ILM tool.
  • Developed validation and cleanse functions and message bundles, and tested user exits built by user exit developers for Address, Address Unit, Company Site, and Other Party Site in MDM Hub 9.5 and the MDM IDD 9.5 development tool.
  • Implemented methods to validate that data supplied by external sources was loaded correctly into the awards database.
  • While validating that data was loaded correctly, examined external data for signals indicating the source might be incorrect.
  • Created a physical schema (data mart) and loaded data into it using the NSI scheduling tool.
  • Extensively worked with Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer and Mapplet Designer.
  • Worked on data warehousing concepts and design with a good understanding of ETL and reporting processes.
  • Participated in cross-application integration testing and system testing, and worked with team members in the defect resolution process.
  • Ensured that all timelines for loading and validating data were met by comparing against host (mainframe) files.
  • Ensured smooth functioning of the development and QA environments.
  • Worked with heterogeneous sources including relational sources and flat files.
  • Worked with the data modeler to understand the data warehouse architecture and mapping documents.
  • Designed mappings for complex business logic provided by the data modeler, including dimension and fact tables.
  • Designed complex mapping logic to implement SCD1 and SCD2 dimensions and worked on critical dimensional models, which structure and organize data in a uniform manner with constraints placed within the structure - a core concept of data modeling.
  • Extensively worked on Informatica IDE/IDQ.
  • Used IDQ’s standardized plans for address and name cleanup.
  • Worked on IDQ file configuration on users’ machines and resolved issues.
  • Used IDQ for initial data profiling and for removing duplicate data.
  • Involved in the designing of Dimensional Model and created Star Schema using E/R studio.
  • Extensively worked with Data Analyst and Data Modeler to design and to understand the structures of dimensions and fact tables and Technical Specification Document.
  • OLTP systems provide source data to the data warehouse, which in turn supports OLAP data analysis.
  • Interacted with front-end users to present the proof of concept and to gather the team’s deliverables.
  • Worked on deploying Metadata Manager, including metadata connectors for data integration visibility, advanced search and browse of the metadata catalog, data lineage, and visibility into data objects, rules, transformations, and reference data.
  • Researched and resolved production issues and investigated data discarded during workflow runs.
  • Extracted data from the relational source VCMG (Oracle) and flat files, built mappings per company requirements, and loaded the data into Oracle tables.
  • Used lookup transformation, Aggregator transformation, Filter transformation, Update strategy and Router transformations.
  • Extensively used Informatica functions such as LTRIM, RTRIM, IIF, DECODE, ISNULL, TO_DATE, and DATE_COMPARE in transformations.
  • Also used user-defined functions, declared once globally and reused across various mappings.
  • Ensured the integrity, availability, and performance of DB2 database systems by providing technical support and maintenance; maintained database security and disaster recovery procedures; troubleshot and maintained multiple databases; and resolved many database issues in an accurate and timely fashion.
  • Worked extensively in Informatica Designer and Workflow Manager to create sessions and workflows, monitor results, and validate them against requirements.
  • Designed reference data and data quality rules using IDQ and cleansed data in the Informatica Data Quality 9.1 environment.
  • Extensively monitored jobs to detect and fix bugs and to track performance.
  • Used Workflow Manager to create, schedule, and monitor workflows and to send messages in case of process failures.
  • Involved in Performance Tuning of sources, targets, mappings, sessions and data loads by increasing data cache size, sequence buffer length and target based commit interval.
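The SCD1/SCD2 dimension loads described above follow a standard type-2 pattern, which can be sketched in SQL; `dim_customer`, `stg_customer`, the sequence, and the columns are illustrative stand-ins, not the actual project schema.

```sql
-- Hypothetical SCD2 sketch (Oracle syntax).
-- Step 1: expire the current version when a tracked attribute has changed.
UPDATE dim_customer d
   SET d.eff_end_date = SYSDATE,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND s.address <> d.address);

-- Step 2: insert a new current row for changed and brand-new customers
-- (both now lack a current_flag = 'Y' row after Step 1).
INSERT INTO dim_customer
       (customer_key, customer_id, address, eff_start_date, eff_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, SYSDATE, NULL, 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
```

In Power Center the same logic is typically expressed with a Lookup on the dimension plus an Update Strategy transformation flagging rows for update versus insert.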

Environment: Oracle 10g, DB2, PL/SQL, Informatica Power Center, IDQ.

Confidential, IA

Oracle And Informatica Developer

Responsibilities:

  • Analyzed, designed, and implemented Informatica mappings and PL/SQL stored procedures to extract, transform, and load data from external systems and the Visilogplus internal application to GL, and FTPed output to various downstream legacy systems.
  • Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
  • Worked with the data modeler to understand the data warehouse architecture.
  • Transformed business requirements into technical specifications and then into source-to-target mapping documents.
  • Designed and developed complex Informatica mappings, Mapplets, reusable transformations, tasks, sessions, and workflows for daily, weekly, and monthly processes to load heterogeneous data into the Oracle data warehouse, out to external vendors, into data marts, and then to downstream systems.
  • Extensively designed, developed, and tested complex Informatica mappings and mapplets to load data from external flat files and other databases into the Oracle data warehouse and various data marts.
  • Extensively worked on workflow manager and workflow monitor to create, schedule, monitor workflows, worklets, sessions, tasks etc.
  • Executed and processed (migrated) Informatica change-control code from the Development to QA and then to Production environments.
  • Created, Updated and Scheduled Autosys jobs based on user requirements in UAT and Production environments.
  • Created multiple power center mappings (20+) and power center workflows to accomplish data transformation and load process.
  • Used various complex Power Center transformations - Lookup, Joiner, Expression, Router, Update Strategy, Source Qualifier, Aggregator, SQL, Filter, Sequence Generator, and Normalizer - to accomplish the mapping design.
  • Re-designed multiple existing Power Center mappings to implement change request (CR) representing the updated business logic.
  • Developed MDM solutions for workflows, de-duplication, validation, etc., and facilitated data load and syndication.
  • Worked on deploying metadata manager like extensive metadata connectors for data integration visibility, advanced search and browse of metadata catalog, data lineage and visibility into data objects, rules, transformations and reference data.
  • Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.
  • Profiled customer data and identified various patterns of the phone numbers to be included in IDQ plans.
  • Designed complex mapping logic to implement SCD1 and SCD2 dimensions and worked on critical dimensional models, which structure and organize data in a uniform manner with constraints placed within the structure - a core concept of data modeling.
  • OLTP systems provide source data to the data warehouse, which in turn supports OLAP data analysis.
  • Created User Defined Functions (UDFs) and reusable Mapplets and Transformations to simplify maintenance process and improve the productivity.
  • Installed and configured Informatica Power Exchange for CDC and Informatica Data Quality (IDQ).
  • Created custom plans for product-name discrepancy checks using IDQ and incorporated the plan as a Mapplet into Power Center.
  • Experienced in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Performed Unit Testing and Integration Testing of Mappings and Workflows.
  • Wrote UNIX shell scripts and administered the Autosys job scheduler on UNIX machines.
  • Designed data quality plans using IDQ and developed several Informatica mappings.
  • Designed reference data and data quality rules using IDQ and cleansed data in the Informatica Data Quality 9.1 environment.
  • Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Mapplets and Transformation objects.
  • Validated and fine-tuned the ETL logic coded into existing Power Center Mappings, leading to improved performance.
  • Maintained technical documentation.
  • Designed the ETL processes using Informatica to load data from Oracle, Flat Files and Excel files to staging database and from staging to target Oracle Data Warehouse database.
  • Designed and developed the logic for handling slowly changing dimension table loads by flagging records with the Update Strategy transformation to populate the desired targets.
  • Performed data analysis and data profiling using SQL and Informatica Data Explorer on various sources systems including Oracle and Teradata.
  • Experienced with job scheduling tools such as Autosys and Control-M, including job setup and detailed monitoring.
  • Wrote Teradata SQL queries and created tables and views following Teradata best practices.
  • Involved in performance tuning and optimization of Informatica mappings and sessions using features like partition and data/index cache to manage very large volume of data.
  • Documented ETL test plans, test cases, test scripts, test procedures, assumptions and validations based on design specifications for unit testing, expected results, preparing test data and loading for testing, error handling and analysis.
  • Used Debugger in troubleshooting the existing mappings.
  • Also used Teradata sources to maintain very large databases.
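The UNIX shell scripting for ETL automation, error handling, and auditing described above can be sketched as a minimal wrapper; the step names, log path, and placeholder commands (`true` standing in for the real extract/load calls, e.g. a `pmcmd` invocation) are illustrative assumptions, not the actual project scripts.

```shell
#!/bin/sh
# Hypothetical ETL wrapper sketch: run each step, write start/end audit
# lines to a log, and abort the chain on the first failure.
LOG="${LOG:-/tmp/etl_audit.log}"
: > "$LOG"                       # start a fresh audit log for this run

run_step() {
    name=$1; shift
    printf '%s START %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$name" >> "$LOG"
    "$@"; rc=$?
    printf '%s END %s rc=%s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$name" "$rc" >> "$LOG"
    return $rc
}

# "true" is a stand-in for the real extract/load commands.
run_step extract true || exit 1
run_step load true || exit 1
echo "ETL chain completed"
```

A failing step leaves its nonzero `rc=` in the audit log and stops the chain, which is what a scheduler like Autosys keys off to mark the job failed.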

Environment: Informatica, IDQ, Oracle 10g, Teradata 12.0, Autosys, Windows, SQL Server 2005, TOAD.

Confidential

ETL informatica developer

Responsibilities:

  • Designed ETLs using Informatica Power Center 9.1.0 to extract data from the functional data marts and aggregate the metrics into the SQL Server database later used to generate executive dashboards.
  • Work with Data modeler to understand the architecture of Data warehouse and mapping documents
  • Extracted data from various sources, such as flat files and Oracle staging tables, using Informatica mappings.
  • Worked extensively on different types of transformations like Source Qualifier, Expression, Filter, Aggregator, Joiner, Rank, Update Strategy and Lookup.
  • Used Informatica features to implement Type II changes in slowly changing dimension tables.
  • Built ETL processes using Informatica to load data from heterogeneous sources into the target Oracle data warehouse database.
  • Analyzed source data coming from various databases and files.
  • Identified integration issues across data source systems (Oracle Apps, IMS data, legacy systems, files) and proposed feasible integration solutions.
  • Created Oracle PL/SQL Stored Procedures, Packages, Triggers, Cursors and backup-recovery for the various tables.
  • Leveraged Explain Plan to improve the performance of SQL queries and PL/SQL Stored procedures.
  • Identifying and tracking the slowly changing dimensions (SCD), used CDC logic (Change Data Capture) for the SCD tables loading in Oracle.
  • Designed complex mapping logic to implement SCD1 and SCD2 dimensions and worked on critical dimensional models, which structure and organize data in a uniform manner with constraints placed within the structure - a core concept of data modeling.
  • OLTP systems provide source data to the data warehouse, which in turn supports OLAP data analysis.
  • Ensured the integrity, availability, and performance of DB2 database systems by providing technical support and maintenance; maintained database security and disaster recovery procedures; troubleshot and maintained multiple databases; and resolved many database issues in an accurate and timely fashion.
  • Extracted data from Oracle, flat file, Excel, XML, and COBOL sources and applied complex Joiner, Expression, Aggregator, Lookup, Stored Procedure, Filter, Router, and Update Strategy transformations to load data into the target systems.
  • Installed and configured Informatica Power Exchange for CDC and Informatica Data Quality (IDQ).
  • Created custom plans for product-name discrepancy checks using IDQ and incorporated the plan as a Mapplet into Power Center.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Designed data quality plans using IDQ and developed several Informatica mappings.
  • Designed reference data and data quality rules using IDQ and cleansed data in the Informatica Data Quality 9.1 environment.
  • Fixing invalid Mappings, Debugging the mappings in designer, Unit and Integration Testing of Informatica Sessions, Worklets, Workflows, and Target Data.
  • Created reusable Tasks, Sessions, reusable Worklets and workflows in Workflow manager.
  • Provided user training and production support.
  • Fine-tuned SQL statements to improve database performance.
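The Explain Plan work mentioned above can be sketched with standard Oracle syntax; the staging table and query below are hypothetical, not the actual project objects.

```sql
-- Hypothetical tuning sketch: capture the optimizer's plan for a slow
-- query, then read it to decide whether an index or rewrite is needed.
EXPLAIN PLAN FOR
SELECT gl_account, SUM(amount)
  FROM stg_gl_postings
 WHERE posting_date >= TRUNC(SYSDATE) - 7
 GROUP BY gl_account;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```

A full table scan on the date filter in the displayed plan would typically prompt an index on `posting_date` or a partition-pruning strategy.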

Environment: Informatica Power Center 8.6.1, MDM and IDQ, Oracle 10g, DB2, SQL*Loader, TOAD.

Confidential, CT

ETL Informatica developer

Responsibilities:

  • Worked with the Systems Analyst to review the FRD and obtained design approval on the ADS.
  • Attended daily stand-up calls to discuss outstanding issues and daily progress.
  • Completed code reviews with the lead/architect and shared unit test results with the SA, QA, and DEV teams.
  • Worked with fixed-width and semicolon-delimited flat files, as the group and eligibility extracts are written to flat files.
  • Strong experience in health care claims processing, billing, and payments.
  • Used Confidential's application to set up group and subscriber data in the FACETS application.
  • Updated SQL overrides in Source Qualifiers according to the business requirements.
  • Performed data manipulations using various Informatica transformations, such as Expression, Router, Lookup, Aggregator, Filter, Update Strategy, and Stored Procedure.
  • Created and used different tasks, such as Command, Event Wait, and Email.
  • Worked on performance bottlenecks to improve session and workflow performance using hints, pre-SQL, and post-SQL.
  • Created Deployment Groups to move the ETL code from RS Dev to RS TST, RS UAT and RS PROD.
  • Created Harvest packages (Confidential's homegrown version-control tool) and promoted the DG along with the other objects (PARM file, DDL, DML changes).
  • Worked with the offshore DEV and QA team.
  • Updated the project documentation in Project SharePoint folder.

Environment: Informatica Power Center 9.1, Oracle 11g, flat files, Facets 4.51, TOAD for Oracle 11.6, ESP Scheduler, Visio, SQL, UNIX, UNIX shell scripting, Windows XP

Confidential

Informatica Developer

Responsibilities:

  • Troubleshot and diagnosed complex technical problems and resolved end-user issues.
  • Took responsibility for fixing defects.
  • Loaded history data (around 450 million records) and addressed any issues found.
  • Checked daily jobs in Control-M and analyzed failures.
  • Helped deploy code from the Development environment to the Test and Production environments.
  • Helped team members resolve their problems and ensured defects were closed on time.
  • Participated in giving estimates for the project.

Environment: Informatica Power Center 9.1, Visio, SQL, UNIX, UNIX shell scripting, Windows XP
