
Sr. Informatica Developer Resume


SUMMARY

TOTAL IT: Nine-plus (9+) years of total IT experience in Business Requirements Analysis, Application Design, Data Modeling, Development, Implementation, Production Support and Testing of Data Warehousing and Database business systems
DATA WAREHOUSING: Eight (8) years of Data Warehousing ETL experience implementing Data Warehousing applications, including Analysis, Architecture, Design, Development, Administration, Support, Upgrade and Troubleshooting, using Informatica PowerCenter/PowerMart 9.0/8.6/7.1/6.2/5.1/4.7, ETL, Data Marts, OLAP, OLTP, CA AutoSys, BMC Control-M and IBM Maestro.

  • Performed DW Strategy, Road Map and Assessment.
  • Reviewed Business, Technical and Data Architecture to develop the Data Warehouses.
  • Responsible for Design, Architecture and Deployment of ODS, DW and Datamarts
  • Facilitate JAD Sessions, responsible for Analysis and Technical Architecture.
  • Experience in Data warehousing concepts with emphasis on ETL and life cycle development. Created, executed and managed ETL processes
  • Understood and implemented Data Extraction, Data Transformation, Data Loading, Data Conversions and Data Analysis.
  • Responsible for ETL Analysis and Design
  • Designed and developed efficient Error Handling method and implemented throughout the ETL
  • Assisted in Production support
  • Hands on experience in tuning ETL, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings/jobs, and sessions.

DATA MODELING / ANALYSIS: Five-plus (5+) years of experience in Data Warehouse concepts and the principles of the Ralph Kimball and Bill Inmon methodologies – Star Schema, Snowflake, SCD, surrogate keys, normalization/denormalization, Fact and Dimension tables, and physical and logical data models using Erwin 7.2/4.5/3.x for Data Marts and the Staging database. Performed extensive data cleansing and analysis using IDQ, IDE, Trillium and First Logic.

  • Requirements Gathering, Source Data Analysis
  • Source to Target Mapping
  • Data warehouse design and architecture
  • Dimensional Data Modeling experience using ERwin 7.2/7.1/4.0/3.5.5/3.5.2/3.x
  • Star Schema/Snowflake modeling, FACT & Dimensions tables
  • Physical & logical data modeling
  • Extensively followed Ralph Kimball and Bill Inmon methodologies.
  • Develop Logical and Physical data models
  • Designed and Customized data models for Data warehouse
  • Architecture design by effective data modeling, implementing database standards and processes.
  • Performed Data Profiling.
  • Responsible for Business Analysis and Requirements Gathering
  • OLAP & OLTP reporting and analysis
  • Created Oracle PL/SQL queries and Stored Procedures. Worked on data loads using Oracle PL/SQL and SQL *Loader from legacy systems and uploaded into Oracle
  • Created PL/SQL triggers and Cursors.
  • Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing. Writing PL/SQL code using the technical and functional specifications.
  • Implemented Explain Plan and SQL Trace for improving the performance of SQL queries.
  • Extensively worked with Oracle products, tools and utilities like SQL*Loader, PL/SQL, SQL*Plus, OEM. Experience in Oracle backend operations for creating stored procedures, indexes, packages, triggers, external tables, materialized views, partitioned objects, and writing queries for efficient retrieval of data. Assisted the work lead in Timely delivery of projects, planning and administration.
  • Extensive Data Warehouse and OLTP experience using Informatica PowerCenter for designing and developing transformations, packages and sessions, and for scheduling and configuring jobs.
  • Eight (8) years of BI experience in OBIEE, Business Objects 12/XI/6.5/6.0/5.1/5.0 and Cognos Series 7.0/6.0.
  • Eight (8) years of database experience using Oracle 11gR2/11gR1/10gR2/10gR1/9i/8i, SQL, PL/SQL, SQL*Loader, Stored Procedures, TOAD, Explain Plan, TKPROF, Functions, Ref Cursors, Constraints, Triggers, Indexes (B-tree and Bitmap), Views, Materialized Views, Database Links, Export/Import utilities, Developer 2000, Oracle Report Writer, Sybase Server 12.0/11.x, DB2 8.0/7.0 (DB2 LOOK, DB2 MOVE, DB2 REORG), MS SQL Server 2005/2000/7.0/6.0, MS Access 7.0/2000 and Teradata V2R6/V2R5/V2R4.
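As an illustration of the SQL*Loader work noted above, a minimal control file for loading a delimited flat file into a staging table might look like the following. The file, table and column names are hypothetical placeholders, not taken from any of these projects.

```
LOAD DATA
INFILE 'members.dat'
APPEND
INTO TABLE stg_member
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(member_id,
 first_name,
 last_name,
 enroll_date DATE 'YYYY-MM-DD')
```

A control file like this would be run with `sqlldr userid=... control=members.ctl log=members.log`.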

EDUCATION & CERTIFICATIONS

  • Bachelor's in Engineering
  • Brainbench certified in Oracle, SQL and Data Warehousing concepts

PROFESSIONAL SUMMARY

Confidential, NY APR’11 – CURRENT SR. INFORMATICA DEVELOPER

The main purpose of the project was to convert the history and the present data from Teradata (eMedNY) to Oracle (MDW). The Medicaid Data Warehouse (MDW) is the replacement for the eMedNY Medicaid data warehouse which accepts data from the eMedNY Medicaid online transactional processing (OLTP) system, external agency transactional and reporting systems, and external vendor databases. This data is currently transformed and loaded into a Teradata database designed to support analysis, research and reporting. The process that converts this operational data into analytical data is called Extract, Transformation, and Load (ETL).

Responsibilities:

  • Collected requirements from Business Users, analyzed and prepared the technical specifications
  • Used ETL tools Informatica 8.6.3/9.0.1 to extract data from different source systems, and cleanse, transform and load data into databases.
  • Developed mappings using Power Center -Designer for data transformation as per the technical requirements
  • Designed mappings for different subject areas: CLAIMS, DENIED CLAIMS, PROCEDURE, PROVIDER, REFERENCE, MEMBER, WMS
  • Extensively used transformations like Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, Union and Stored Procedure. Knowledge in the use of SQL and Java transformations
  • Developed SCD Type 1 and SCD Type 2 mappings to capture new changes while maintaining historical information.
  • Used Update Strategy DD_INSERT, DD_UPDATE, DD_DELETE and DD_REJECT to insert, update, delete and reject items based on the requirement.
  • Worked extensively with Mapping Parameters, Mapping Variables and Parameter files for Incremental Loading
  • Built PL/SQL procedures and functions as part of custom transformations
  • Designed workflows to execute the mappings in an orderly fashion, and called PL/SQL procedures and other functions from the workflows
  • Used Reusable Transformations and Reusable Mapplets for different validations.
  • Setting up batches and sessions to schedule the loads at required frequency using Power Center server manager
  • Involved in Version control of the jobs to keep track of the changes in the Development Environment.
  • Designed views and materialized views to create accumulated data
  • Extensively worked on UNIT TESTING and created different unit test case documents for different subject areas.
  • Created and scheduled Sessions, Jobs based on demand, run on time and run only once using Workflow Manager.
  • Used debugger to debug the mapping and fixed the bugs
  • Developed UNIX shell scripts to schedule the jobs for running
  • Troubleshooting issues and assisted in production support.
  • Tuned and optimized mappings to reduce ETL run times thereby ensuring the mappings ran within the designated load window.
  • Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
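The UNIX job-scheduling scripts mentioned above typically drive Informatica workflows through the `pmcmd` command-line utility. A minimal sketch follows; the service, domain, user, folder and workflow names are hypothetical placeholders, and the script prints the command rather than executing it, since `pmcmd` exists only on an Informatica server.

```shell
#!/bin/sh
# Sketch of a wrapper script that launches an Informatica workflow.
# All names below are hypothetical, not taken from this project.
INFA_SERVICE="IS_DEV"        # Integration Service
INFA_DOMAIN="Domain_DEV"     # Informatica domain
INFA_USER="etl_user"
FOLDER="CLAIMS"
WORKFLOW="wf_load_claims"

# Build the pmcmd invocation; -uv reads the password from the named
# environment variable so it never appears on the command line, and
# -wait blocks until the workflow completes.
PMCMD="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -u $INFA_USER -uv INFA_PASSWD -f $FOLDER -wait $WORKFLOW"

# Dry run: print the command instead of executing it.
echo "$PMCMD"
```

In production, the `echo` would be replaced by executing the command and checking its exit status before kicking off dependent jobs.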

Environment: Informatica 8.6.3/9.0.1, SQL Server 2005/2000, PL/SQL, Teradata, UNIX, Virtual Windows XP, Windows 7, IBM Rational ClearCase, OBIEE, SQL Developer 2005

Confidential, PLAINSBORO, NJ AUG’08 – MAR’11 SR. INFORMATICA DEVELOPER

The main purpose of the project was to build a Datamart on Sales and Marketing to enhance the business strategy and operations for the Sales and Marketing Department.

ETL / OTHER RESPONSIBILITIES:

  • Designed, developed Informatica mappings, enabling the Extract, Transform and Loading of the data into target tables.
  • Designed and deployed ETL job workflows with exception handling and a reporting strategy.
  • Implemented Type2 Dimension logic in mappings using Informatica Designer
  • Unit tested the mapping logic for correctness.
  • Designed and implemented change data capture and web services.
  • Extensively utilized the Debugger utility to test the mappings.
  • Provided support for the implemented code for defects and enhancements.
  • Extensively used PowerCenter to design multiple mappings with embedded business logic.
  • Created shell scripts to run the workflows through UNIX using pmcmd command. Used Autosys for scheduling.
  • Tuned performance of Informatica mappings using components like Parameter files, Variables and Dynamic Cache.
  • Analyzed IMS Rx Data using IMS tools such as Xponent and Xponent PlanTrak, regarding segmentation & profiling.
  • Analyzed the IMS DDD (Drug Distribution Data) and Rx data - Xponent and Xponent Plantrak
  • Applied appropriate field level validations like date validations for cleansing the data.
  • Used Workflow Manager and Server Manager for creating, validating, testing and running the batches and sessions, and for scheduling them to run at specified times.
  • Maintained Development, Test and Production repositories using Repository Manager; also used Repository Manager to maintain metadata, security, backups and locks.
  • Used Debugger to troubleshoot the mappings.
  • Performance tuning of databases and mappings.
  • Responsible for Error Handling and bug fixing.
  • Tuning the Informatica Mappings for optimum performance. Replaced filters with Routers and SQL overrides for heterogeneous sources.
  • Worked extensively with Oracle SQL and PL/SQL Coding.
  • Used Business Objects, OBIEE for Reporting and Data Mining to generate Monthly/Quarterly/Yearly reports.
  • Performed unit / integrated testing. Assisted in UAT.
  • Responsible for production support.
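The incremental loads driven by mapping parameters, variables and parameter files (noted above) rely on Informatica's parameter file format. A minimal example is sketched below; the folder, workflow, session, parameter and connection names are hypothetical placeholders.

```
[SALES_MART.WF:wf_load_sales.ST:s_m_load_sales]
$$LAST_EXTRACT_DATE=2011-03-01
$DBConnection_Source=ORA_SALES_SRC
$DBConnection_Target=ORA_DM_TGT
```

The mapping filters source rows against `$$LAST_EXTRACT_DATE`, and the file is updated after each successful run so only new or changed rows are picked up next time.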

DATA MODELING / DATA ANALYSIS RESPONSIBILITIES:

  • Requirements Gathering and Business Analysis. Project coordination, End User meetings.
  • Translated the business requirements into technical specifications to build the Enterprise data warehouse.
  • Involved in system study and analysis for the logical/physical data model, thereby defining the strategy for implementing a Star Schema with Fact and Dimension tables.
  • Responsible for Data Modeling. Created Conceptual, Logical and Physical models for Staging, Transition and Production Warehouses using Erwin 7.2.
  • Data Analysis and Source to Target Mapping
  • Build Enterprise Data Warehouse to store historical data coming from multiple operational systems.
  • Build Data Mart using dimensional modeling, create FACT, Dimension tables in Star Schema and maintain existing (Type1) data.
  • Used the Erwin report template to publish the data dictionary and maintain the data modeler checklist; created views, materialized views, tables, constraints, bitmap indexes, primary keys and foreign keys, and assigned tablespaces.
  • Generated DDL statements from the data model using Forward Engineering, and generated the data model from DDL statements using Reverse Engineering.
  • Used Complete Compare to discover discrepancies in data types and columns between the Oracle database and the physical model.
  • Developed Logical and Physical Data Model using Erwin, followed Star Schema to build the Datamart.
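To illustrate the kind of DDL that Forward Engineering emits from a star-schema model, a minimal Oracle sketch follows. The table and column names are hypothetical, not from this Data Mart.

```sql
-- Dimension with a surrogate key, as in the Type 1 dimensions above.
CREATE TABLE dim_product (
  product_key   NUMBER(10)    PRIMARY KEY,   -- surrogate key
  product_code  VARCHAR2(20)  NOT NULL,      -- natural/business key
  product_name  VARCHAR2(100)
);

-- Fact table referencing the dimension.
CREATE TABLE fact_sales (
  product_key  NUMBER(10)   NOT NULL REFERENCES dim_product (product_key),
  sale_date    DATE         NOT NULL,
  qty          NUMBER(10),
  amount       NUMBER(12,2)
);
```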

Environment: Informatica 9.0/8.6, Erwin 7.2/4.5, Oracle 10g/11g, Autosys, SQL Server 2005/2000, IMS Data (Xponent, PlanTrak, DDD), Toad, SQL, PL/SQL, Unix Shell Scripting, Business Objects XI/6.5, OBIEE.

Confidential,NY OCT ‘06 – AUG ‘08

SR. Informatica Developer

Starwood is one of the world’s largest hotel and leisure companies. It conducts its hotel and leisure business both directly and through its subsidiaries. Its brand names include St. Regis, The Luxury Collection, W®, Westin®, Le Meridien®, Sheraton®, Four Points®, Aloft and Element. Through its brands, Starwood is well represented in most major markets in the world. Its operations are grouped into two business segments: hotels, and vacation ownership and residential operations. This project was implemented to maintain a customer database analyzed for direct marketing campaigns.

ETL / OTHER RESPONSIBILITIES:

  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica PowerCenter 8.6.1/8.1.1
  • Extensively used Informatica Client tools – Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Informatica Repository Manager and Informatica Workflow Manager.
  • Worked on multiple disparate data sources, ranging from flat files and XML files to Oracle and SQL Server databases, to load the data into the Oracle database.
  • Created and modified source database tables per the design requirements, applied constraints to maintain complete referential integrity, created indexes for performance, and loaded the data into Oracle database tables.
  • Designed and developed Mappings using Mapping Designer to load the data from various sources using different transformations like Aggregator, Lookup (connected and unconnected), Filter, Router, Rank, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter, Sequence Generator and Update Strategy transformations.
  • Developed mappings with reusable transformations and mapplets conforming to the business rules. Developed and wrote procedures for moving the data from the source systems to Staging and on to the Data Warehouse and Data Mart.
  • Integrated various sources into the Staging area of the Data Warehouse for integrating and cleansing data.
  • Performed Source to Target Mapping.
  • Installed, Maintained and Documented the Informatica setup on multiple environments.
  • Designed, Developed, Deployed and implemented ETL mappings using Informatica
  • Responsible for Design, Development, Administer and automation of ETL processes Informatica.
  • Created Sessions, Workflows and Worklets for data loads using Workflow Manager
  • Created events and tasks in the work flows using workflow manager
  • Monitored Workflows and Sessions using Workflow Monitor.
  • Executed sessions, sequential and concurrent batches for proper execution of mappings.
  • Executed Pre and Post session commands on Source and Target database using Shell Scripting.
  • Used mapping parameters and variables at mapping and session levels to tune the performance of Mappings.
  • Migrated Workflows, Mappings, and other repository objects from Development to QA and then to production.
  • Created Informatica sessions in workflow manager to load the data from staging to Target database.
  • Created mappings to incorporate Incremental loads.
  • Created reusable Mailing alerts, events, Tasks, Sessions, reusable worklets and workflows in Workflow manager.
  • Followed proper naming conventions for the transformations and other repository objects.
  • Re-designed existing Informatica mappings to make them compliant with best practices, which included eliminating unnecessary source lookups.
  • Responsible for performance tuning at all levels of the Data warehouse. Implemented performance tuning by using lookup caches, using tables with fewer rows as the master table in joiner transformations, dropped indexes and re-created them after loading data to targets and increased the commit interval.
  • Generated various reports using Data Analyzer showing key performance metrics such as member enrollments, franchisee satisfaction, consumer hotel stay counts, revenue etc.
  • Worked on Performance Tuning of various sources, targets, mappings and sessions to identify and remove Bottle Necks.
  • Improved workflow performance by shifting filters as close as possible to the source and selecting tables with fewer rows as the master during joins.
  • Prepared ETL mapping documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment and to transfer knowledge to production support and other team members.
  • Involved in Unit testing, User Acceptance Testing and System Testing to verify the accuracy and completeness of the ETL process.
  • Used Control-M for job scheduling.
  • Worked cooperatively with the team members to identify and resolve various issues relating to Informatica and databases
  • Developed new shell scripts to move data from source systems to staging and from staging to the Data Warehouse.
  • Created UNIX scripts to archive log files into archive directory in order to free up space on the server.
  • Identified and fixed bottlenecks and tuned the complex Informatica mappings for optimized performance.
  • Worked with the team to ensure the quality of coordinated business functions.
  • Prepared, documented unit test plans and resolved issues escalated from system testing.
  • Validated the BID data in production after moving the code to Production.
  • Performed SQL Querying, PL/SQL Coding and Unix Coding.
  • Coordinated with source system owners, performed data migration and monitored day-to-day ETL progress; handled Data Warehouse target schema design (Star Schema) and maintenance.
  • Used Explain Plan and TKPROF tools for performance tuning. Developed Views and Materialized Views.
  • Optimized/tuned mappings for better performance and efficiency. Wrote UNIX shell scripts to run the sessions.
  • Created different types of reports like Slice and Dice, Drill Down, Master Detail using Business Objects.
  • Used Qlikview for reporting
  • Performed Unit & Regression Testing of the application
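The log-archiving work described above can be sketched as a small shell script that compresses session logs into an archive directory to free space on the server. The paths and log name below are hypothetical placeholders, and a demo log file is created so the sketch is self-contained; a real version would restrict the loop to logs older than some retention window (e.g. `find "$LOG_DIR" -maxdepth 1 -name '*.log' -mtime +7`).

```shell
#!/bin/sh
# Sketch: gzip session logs into an archive directory, then remove
# the originals. Paths are hypothetical, not from this project.
LOG_DIR="/tmp/infa_demo_logs"
ARCH_DIR="$LOG_DIR/archive"
mkdir -p "$ARCH_DIR"                       # also creates LOG_DIR

touch "$LOG_DIR/s_m_load_demo.log"         # stand-in for a real session log

for f in "$LOG_DIR"/*.log; do
  [ -f "$f" ] || continue                  # skip if the glob matched nothing
  # Compress into the archive directory, then delete the original
  # only if compression succeeded.
  gzip -c "$f" > "$ARCH_DIR/$(basename "$f").gz" && rm -f "$f"
done
```

A cron entry would typically run a script like this nightly, after the load window closes.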
