
Sr. Dw Informatica Resume


Philadelphia, PA

S U M M A R Y

  • Nine plus (9+) years of total IT experience in Business Requirements Analysis, Application Design, Data Modeling, Development, Implementation and Testing of Data Warehousing and Database business systems.
  • Seven plus (7+) years of Data Warehousing ETL experience implementing Data Warehousing applications including Analysis, Architecture, Design, Development and Support using Informatica Power Center / Power Mart 9.0/8.6/8.1/7.1x/7.0/6.2/5.1/4.7 (Source Analyzer, Repository Manager, Server Manager, Mapping/Mapplet/Transformation/Warehouse Designers, Workflow Manager, Workflow Monitor, Data Analyzer, Metadata Manager), Informatica Power Analyzer, Power Exchange, SuperGlue, Informatica Power Connect, Informatica Power Plug, ETL, Datamart, OLAP, OLTP, CA Autosys, BMC Control-M, IBM Maestro. Data cleansing experience with FirstLogic (ACE, DataRight, Merge/Purge), Trillium 7.0 (Converter, Poster, Geo-Coder, Matcher), Java, UNIX, Tidal, WebLogic, Web 2.0, HTML, Siebel.
  • Extensive Data Warehouse experience using Informatica Power Center for designing and developing transformations, mapping, sessions and for scheduling and configuring workflows.
  • Seven plus (7+) years of Dimensional Data Modeling and Relational Data Modeling experience using Star Join Schema/Snowflake modeling, FACT & Dimension tables, Physical & Logical data modeling, ERWIN 7.2/4.5/4.0/3.5/3.2, Oracle Designer and Sybase Power Designer. Proficient with Ralph Kimball and Bill Inmon methodologies. Thorough understanding of database concepts and experience in E-R modeling, normalization of tables, and in using development tools like ERWIN 7.2/4.5/4.0/3.5 for forward/reverse engineering.
  • Seven plus (7+) years of BI experience in OBIEE, Business Objects XI/6.5/6.0/5.1/5.0 (Designer 5.0 and Developer Suite, Broadcast Agent, Supervisor, Designer, Info View & Set Analyzer 2.0, Universe Developer, Supervisor, Web Intelligence 2.6/2.7), Cognos Series 7.0/6.0, QlikView 9.0/8.0/7.0.
  • Nine plus (9+) years of database experience using Oracle 11g/10g/9i/8i, SQL, PL/SQL, SQL*Loader, Stored Procedures, TOAD, Explain Plan, TKPROF, Functions, Ref Cursors, Constraints, Triggers, Indexes (B-tree and Bitmap), Views, Materialized Views, Database Links, Export/Import utilities, Developer 2000, Oracle Report Writer, Sybase Server 12.0/11.x, DB2 8.0/7.0 (DB2 LOOK, DB2 MOVE, DB2 REORG), MS SQL Server 2008/2005/2000/7.0/6.0, MS Access 7.0/2000, Teradata V2R6/V2R5.
  • Extensively worked with ETL tools to extract data from various sources including Oracle, flat files and Teradata. Experience in performance tuning of sources, targets, mappings and sessions. Worked with Oracle Stored Procedures and experienced in loading data into Data Warehouses/Data Marts using Informatica and SQL*Loader. Extensive expertise with error handling.
  • Strong knowledge of Software Development Life Cycle (SDLC) including Requirement analysis, Design, Development, Testing, Support and Implementation. Provided End User Training and Support.
  • Experienced in TOAD to test, modify and analyze data, create indexes and compare data from different schemas. Experienced in Shell Scripting and PL/SQL procedures.

E D U C A T I O N & C E R T I F I C A T I O N S

Bachelor of Engineering.
Brain Bench Certified in Informatica.
Brain Bench Certified in Data Warehousing Concepts.

T E C H N I C A L S U M M A R Y

Data Warehousing

Informatica Power Center / Informatica Power Mart 9.0/8.6/8.1/8.0/7.1/6.2/5.1/4.7, Informatica Power Analyzer, Power Exchange, SuperGlue, Informatica Power Connect, Informatica Power Plug, ETL, ETI, Datamart, OLAP, OLTP, CA Autosys, BMC Control-M, Tidal, IBM Maestro. Data cleansing experience with FirstLogic (ACE, DataRight, Merge/Purge), Trillium 7.0 (Converter, Poster, Geo-Coder, Matcher), Java, UNIX, Tidal, WebLogic, Web 2.0, HTML, Siebel.

Dimensional Data Modeling

Dimensional Data Modeling, Data Modeling, Star Join Schema Modeling, Snow-Flake Modeling, FACT and Dimension Tables, Physical and Logical Data Modeling, Erwin 7.2/4.5/4.0/3.5, Oracle Designer and Sybase Power Designer.

Business / Data Analysis

Functional Requirements Gathering, User Interviews, Business Requirements Gathering, Process Flow Diagrams, Data Flow Diagrams, MS Project, MS Access, MS Office.

Databases

Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005/2000/7.x/6.0, Teradata V2R6/V2R5/V2R4, IBM DB2 8.0/7.0, Sybase 12.x/11.x, MS Access.

BI Tools

Business Objects XI/6i/6.0/5.1/5.0 (Designer 5.0, Developer Suite, Broadcast Agent, Supervisor, Designer, Info View & Set Analyzer 2.0, Universe Developer, Supervisor, Web Intelligence 2.6/2.7, Business Miner, Business Objects SDK, WebI SDK), Cognos Series 7.0, OBIEE, QlikView.

Programming/GUI

SQL, PL/SQL, SQL*Plus, Transact-SQL, ANSI SQL, C, C++, Unix Shell Scripting, HTML, DHTML, XML.

Tools

Developer 2000, Data Reports, SQL*Loader, Toad, MS Access Reports, Visual Basic 6.0/5.0, SQL Navigator.

Environment

Sun Solaris 2.6/2.7/2.8, HP-UX 10.2/9.0, IBM AIX 4.2/4.3, MS-DOS 6.22, Windows 95/98/NT/2000.

P R O F E S S I O N A L  E X P E R I E N C E

Confidential, PHILADELPHIA, PA Oct’09 – Present

Sr. DW INFORMATICA DEVELOPER

Worked on the Hospital Customer Information Datamart, built to integrate customer data for various analysis and reporting purposes.

Responsibilities:

  • Requirements Gathering and Business Analysis.
  • Developed Logical and Physical Data Model using Erwin, followed Star Schema to build the Datamart.
  • Parsing high-level design spec to simple ETL coding and mapping standards.
  • Analyzing the source data coming from Oracle, DB2, SQLSERVER AND FLAT FILES and working with business users and developers to develop the Model.
  • Responsible for Data Analysis, Data Profiling, Data cleansing using IDE/IDQ/Trillium.
  • Performed Data Analysis, Data Cleansing and Data Profiling.
  • Used IDQ, IDE, SQL Queries and EXCEL to perform Data Cleansing and Analysis.
  • Used various data cleansing tools to perform Data Profiling and Data Standardization as per the business rules.
  • Performed Data Cleansing and Profiling; the following are some of the categories used:
    • Frequency of occurrences in a field, including blanks, zeros.
    • Shapes of data in a field, such as xxx-xx-xxxx for SSN.
    • Distribution of business addresses vs residential addresses.
    • Data values, statistics, frequencies and ranges.
    • Mismatches and inconsistencies between metadata and actual data content.
  • Performed Data Profiling to enhance consistency of Patient Data.
  • Performed Data Standardization which helped in adding business value to data.
  • Data Integration and cleansing, Address matching.
  • Removal of duplicates, match/merge, parsing data was done in Data Cleansing.
  • Worked closely with project manager and team lead to develop the transformation logic to be used in Informatica.
  • Analyzing the source data and deciding on appropriate extraction, transformation and loading strategy.
  • Used transformation logic to cleanse the data.
  • Extensively worked on all the transformations like Filter, Aggregator, Expression, Router, Lookup, Update Strategy, Sequence generator, Rank, Union, Joiner, Source Qualifier, etc.,
  • Developed slowly changing dimensions mapping to accommodate the passive mode phase.
  • Developed number of Complex Informatica Mappings, Mapplets and Reusable Transformations for weekly loading of Data.
  • Developed several Mappings and Mapplets using corresponding Sources, Targets and Transformations.
  • Worked with Informatica PowerCenter – Source Analyzer, Warehouse designer, Mapping Designer & Mapplet and Transformation Developer for creating Mappings.
  • Used Informatica Workflow Manager for creating and running the Workflows and Sessions and scheduling them to run at specified times.
  • Optimized Informatica mappings for performance at transformation level, mapping level and Session level.
  • Worked on Data Extraction, Data Transformation, Data Loading, Data Conversions and Data Analysis.
  • Used Tidal and the Informatica scheduler for scheduling the jobs.
  • Extensively designed Data mapping using filters, Expressions, Update Strategy Transformations in Power Center Designer.
  • Created target load order group mappings and scheduled them for Daily Loads.
  • Preparation of Unit Test Plans and verification of functions specifications and review of deliverables.
  • Migrated Mappings, Sessions, Workflows and Common Objects from Development to Test and to Production.
  • Designed Informatica transformations for loading from various sources like flat files ODBC sources.
  • Worked with different Informatica tuning issues and fine-tuned the transformations to make them more efficient in terms of performance. Performed error handling.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Scheduling and Loading data process and monitoring the ETL process.
  • Used various data Sources like Flat Files, Relational Tables.
  • Extensively used UNIX Commands within Informatica for Pre Session and Post Session Data Loading Process.
  • Worked on loading of data from several flat files sources.
  • Worked on exporting data to flat files.
  • Performed Unit testing.
  • Assisted in handling production support issues.
  • Did design and code reviews for the EDQ Production Support project.
  • Assisted in fixing and modifying mappings in production support.
  • Used Cognos for reporting. Used Qlikview for internal reporting.
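The profiling categories listed above (frequency of blanks and zeros, value "shapes") can be sketched as a small shell check. The file layout, column names and values below are invented for illustration, not taken from the project.

```shell
#!/bin/sh
# Hypothetical profiling sketch: count blank SSNs, check value "shape"
# (xxx-xx-xxxx for SSN) and flag zero/blank zips in a delimited extract.
cat > patients.csv <<'EOF'
id,ssn,zip
1,123-45-6789,19104
2,,19103
3,000-00-0000,0
EOF

summary=$(awk -F',' 'NR > 1 {
    total++
    if ($2 == "") blank_ssn++                               # blank frequency
    if ($2 ~ /^[0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9][0-9][0-9]$/) shaped_ssn++
    if ($3 == "0" || $3 == "") bad_zip++                    # zero/blank zips
} END {
    printf "rows=%d blank_ssn=%d shaped_ssn=%d bad_zip=%d", total, blank_ssn, shaped_ssn, bad_zip
}' patients.csv)
rm -f patients.csv
echo "$summary"    # rows=3 blank_ssn=1 shaped_ssn=2 bad_zip=1
```

In practice these counts would feed a profiling report rather than stdout; the shape check is the same pattern-match idea the IDQ/IDE profiles apply.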

HL7 Responsibilities:

  • Created HL7 interfaces to communicate with the Hospital Information System through HL7-formatted messages. These messages were imported and exported to exchange relevant clinical information.
  • HL7 interfaces were designed to send and receive real-time messages, TCP/IP protocol was also used.
  • Incoming HL7 messages were saved into the local database in the form of flat files.
  • Created HL7 interfaces through TCP/IP socket connection (one for each direction) for both imports and exports. For the import, the HL7 interface is a server on a configurable port.
  • All incoming HL7 messages are acknowledged, and any messages that are not processed are rejected with a negative-acknowledgement ACK HL7 message.
  • All outgoing HL7 messages were expected to be acknowledged.
  • Used HL7 Delimiter definitions (Segment Terminator, Field Separator, Component Separator, Subcomponent separator, Repetition Separator, Escape Separator) for identifying and Separating HL7 data.
  • Followed standard HL7 encoding rules to identify data types.
  • Dealt with HL7 message type ADT (events A01, A02, A03, A11, A29, etc.) for integrating Admission, Discharge and Transfer information of patients.
  • Dealt with HL7 message type ACK for acknowledgement of Inbound and Outbound HL7 messages.
  • Dealt with HL7 message type OMG for dealing with Clinical Orders.
  • Dealt with several other HL7 message types such as ORU, ORM and DFT, and segments such as OBR and OBX.
  • Dealt with several segments and fields within HL7 message types for both HL7 inbound workflow and HL7 Outbound workflow.
  • Dealt with HL7 components and subcomponents.
  • Extracted HL7 messages from EPIC Systems.
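The delimiter handling described above can be sketched in a few lines of shell; the message content here is invented, and only the standard delimiters (segment terminator, | field separator, ^ component separator) are assumed.

```shell
#!/bin/sh
# Hypothetical sketch: split an HL7 v2 message into segments and fields
# using the standard delimiters. The message below is invented.
msg='MSH|^~\&|EPIC|HOSP|DW|HOSP|202301010000||ADT^A01|MSG0001|P|2.3
PID|1||12345^^^HOSP^MR||DOE^JOHN'

# MSH-9 carries the message type, PID-3 the patient identifier.
# (In MSH the field separator itself counts as MSH-1, so awk's $9 is MSH-9.)
mtype=$(printf '%s\n' "$msg" | awk -F'|' '$1 == "MSH" { print $9 }')
mrn=$(printf '%s\n' "$msg"   | awk -F'|' '$1 == "PID" { split($4, c, "^"); print c[1] }')
echo "type=$mtype mrn=$mrn"    # type=ADT^A01 mrn=12345
```

A real inbound interface would read such messages from the TCP/IP socket, persist them as flat files as described above, and reply with an ACK.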

Environments: Informatica Power Center 8.6/8.1, Oracle 11g/10g, AS/400, DB2 8.0, SQL Server 2008/2005, Tidal, Reflection FTP, Cognos 7.0, T-SQL, UNIX (AIX 5), Visio 2003, Erwin 7.2/4.5, Teradata V2R6/V2R5, Power Connect/Power Exchange, Power Analyzer, SQL, PL/SQL, Unix Shell Scripting, HL7, HIPAA, OWBB, ODI, QlikView 9.0/8.0.

Confidential, PISCATAWAY, NJ Jun’08 – Oct’09

Sr. DW INFORMATICA DEVELOPER

The main purpose of this project was to build a Datamart for claims processing.
Responsibilities:

  • Requirement Gathering and Business Analysis.
  • Developed Logical and Physical Data Model using Erwin, followed Star Schema to build the Datamart.
  • Parsing high-level design spec to simple ETL coding and mapping standards.
  • Coordinating with source system owners, day-to-day ETL progress monitoring.
  • Project coordination, End User meetings.
  • Research Sources and identify Reusable Components for Mapplets and Reusable Transformations.
  • Developed various Mappings and Transformations using Informatica Designer.
  • Created partitions for parallel processing of data.
  • Extensively worked with Informatica tools.
  • Used Trillium 7.0 and its components (Geo-Coder, Parser, Converter, and Matcher) to cleanse the data.
  • Developed transformation logic and designed various complex Mappings and Mapplets using the Designer.
  • Worked with Union, Lookup, Aggregator, Expression, Router, Filter, Update Strategy, Joiner, Source Qualifier, Sequence Generator, Stored Procedure and XML Source Qualifier transformations.
  • Designed complex mappings involving target load order and constraint based loading.
  • Designed Informatica transformations for loading data from various sources like flat files/OCI/ODBC.
  • Coordinated with source system owners for proper data feeds and defaults for incoming data.
  • Populated data into staging tables from flat files, XML sources and Relational Sources.
  • Install Informatica PowerCenter on Client and Server Machines and Configure the Informatica Server and Register Server.
  • Maintain and Document the Informatica Power Center setup on UNIX, Windows environment.
  • Migrate Repository from Informatica PowerCenter 7.1 to 8.
  • Worked on Informatica PowerCenter tools – Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets, Workflow Manager, Workflow Monitor, Data Analyzer, Metadata Manager and Reusable Transformations.

  • Applied appropriate field level validations like date validations for cleansing the data.
  • Using Informatica Designer designed Mappings, which populated the Data into the Target Star Schema, which is on Oracle Instance.
  • Using Workflow Manager, Server Manager for creating, Validating, Testing and running the Batches and Sessions and scheduling them to run at specified time.
  • Maintained Development, Test and Production repositories using Repository Manager; also used Repository Manager to maintain metadata, security, backups and locks.
  • Used PMCMD for calling batches and sessions from the command line and embedded the same in UNIX shell scripts.
  • Used Autosys to schedule the jobs. Wrote UNIX shell scripts.
  • Extensively Used SQL and PL/SQL Scripts and worked in the both UNIX and Windows Environment.
  • Assisted in the process of preparation of the document Informatica Transformation Development Standards. This document describes the general guidelines for Informatica developers, the naming conventions to be used in the Transformations and also development and production environment structures.
  • Used Debugger to troubleshoot the mappings.
  • Performance tuning of databases and mappings.
  • Responsible for Error Handling and bug fixing.
  • Implemented Slowly Changing Dimensions Type 1, Type 2 and Type 3.
  • Worked with different Informatica tuning issues and fine-tuned the transformations.
  • Set up batches of worklets and sessions to schedule the loads at the required frequency.
  • Bench Mark testing of Informatica PowerCenter mappings to calculate the lead times for batch processing and tuning the mappings.
  • Tuning the Informatica Mappings for optimum performance. Replaced filters with Routers and SQL Overrides for heterogeneous sources.
  • Used Business Objects, BO Query Designer and Report Designer for reporting and data mining to generate quarterly/monthly/yearly reports. Used QlikView for internal sales reporting.
  • Created Universes for Sales & marketing. Created reports/ Pie Charts based on the end user reporting requirements.
  • Performed Unit/ Integrated testing. Assisted in UAT.
  • Responsible for end user training and support.
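The PMCMD usage mentioned above is typically embedded in a scheduler wrapper along these lines. The integration service, domain, folder and workflow names are invented, and the command is only printed here so the sketch stays self-contained.

```shell
#!/bin/sh
# Hypothetical wrapper sketch: build the pmcmd call a scheduled shell job
# would run, then check its return code. All names below are invented;
# the command is printed rather than executed for self-containment.
INT_SVC=IS_DW
DOMAIN=Domain_DW
FOLDER=CLAIMS
WORKFLOW=wf_daily_claims_load

CMD="pmcmd startworkflow -sv $INT_SVC -d $DOMAIN -f $FOLDER -wait $WORKFLOW"
echo "$CMD"

# A real wrapper would execute the command and propagate its exit status
# so the scheduler (Autosys, per above) can hold dependent jobs:
#   $CMD; rc=$?
#   [ $rc -ne 0 ] && { echo "workflow failed rc=$rc" >&2; exit $rc; }
```

The `-wait` flag makes pmcmd block until the workflow finishes, which is what lets the shell script's exit code reflect the load's success or failure.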

HEALTH CARE CLAIMS RELATED RESPONSIBILITIES:

  • Extensively worked on HIPAA compliance. Worked with HIPAA rules and regulations and HIPAA transaction sets.
  • Extensively worked on Claims, Enrollment, Eligibility Verification for Members and Providers, benefits setup and backend payment cycle.
  • Extensively worked with the suite of Healthcare Applications such as Membership, Provider, Eligibility, Group, Product, Enrollment, Provider, Claims, Correspondence and Letters, Facilities, Customer Service and Health Care Regulatory standards.
  • Worked with business/ functional unit to assist in the development, documentation and analysis of functional and technical requirements of FACETS.
  • Extensively worked on different claims like ICD-9 codes relates to diagnosis, procedure and service related programs, other claims like Procedure Modifier Codes like CPT (Current Procedural Terminology) and HCPCS codes.

Environments: Informatica Power Center 8.1/8.6, Workflow Manager, Workflow Monitor, Repository Manager, Server Manager, Data Analyzer, Metadata Manager, Mapping Designer, AIX 4.3.3, Oracle 10g/9i on HP-UX, Trillium 7.0 (Converter, Geo-Coder, Parser, Matcher) , Autosys, SQL Server 2005/2000 on Unix, DB2 8.0 on Mainframes, Toad, Erwin 7.2/4.5, Power Connect/ Power Exchange, Power Analyzer, SQL, PL/SQL, Unix Shell Scripting, Business Objects XI/6.5, Designer, BO Developer Suite ( Broadcast Agent, Supervisor, Designer, Info View), Universe Developer, Supervisor, Web Intelligence 2.6, Windows XP 2003.

Confidential, PHILADELPHIA, PA Jun’07-May’08

Sr. ETL INFORMATICA DEVELOPER/ANALYST

The main purpose of this project was to design and build a Datamart for the Sales and Marketing department to enhance its business strategy and operations. The Datamart compiled data fed in from various source departments, external data and various platforms of the company, and was used to generate reports for Sales Growth, Sales Forecasts, Market Share, Compensation Packages, Sales Force Alignment & Deployment, Sales Force Size Analysis, Exploratory Analysis, Compensation Analysis, Product/Competitor Analysis, Cluster & Segmentation Analysis, and Physician & Account Targeting.
Responsibilities:

  • Responsible for Business Analysis and Requirements Collection.
  • Responsible for Data Analysis, Data Profiling and Data Cleansing using FirstLogic.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Responsible for Dimensional Data Modeling and populating the business rules using mappings into the Repository for Meta Data Management.
  • Created the (ER) Entity Relationship diagrams & maintained corresponding documentation for corporate data dictionary with all attributes, table names and constraints.
  • Extensively used ERwin for data modeling and Dimensional Data Modeling.
  • Designed Sources to Targets mapping from primarily Flat files to Oracle using Informatica PowerCenter.
  • Data Quality Analysis to determine cleansing requirements.
  • Coordinated with source system owners, day-to-day ETL progress monitoring, and Data Warehouse target schema (Star Schema) design and maintenance.
  • Worked on Informatica Power Center transformations – Union, Source Qualifier, Lookup (Connected and Unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank and Router. Upgraded Informatica 7.1 to Informatica 8.1.
  • Used various kinds of transformations to implement simple and complex business logic: Stored Procedure, Connected & Unconnected Lookups, Router, Expression, Source Qualifier, Aggregator, Filter, Sequence Generator, etc.
  • Analyzed IMS Rx Data using IMS tools such as Xponent and Xponent Plantrak, regarding segmentation & profiling.
  • Analyzed the IMS DDD (Drug Distribution Data) and Rx data – Xponent and Xponent Plantrak.
  • Created and Configured Workflows, Worklets and Session to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
  • Mapplets and Reusable Transformations were used to prevent redundancy of transformation usage and maintainability.
  • Troubleshot problems by checking session and error logs. Also used the Debugger for complex issues.
  • Configured the sessions using workflow manager to have multiple partitions on Source data and to improve performance.
  • Created Stored Procedures for data transformation purpose.
  • Generated PL/SQL and Shell scripts for scheduling periodic load processes.
  • Extensively worked on the Database Triggers, Functions and Database Constraints.
  • Tuning Informatica Mappings and Sessions for optimum performance.
  • Designed and Developed the Informatica workflows/sessions to extract, transform and load the data into Target.
  • Created database triggers for Data Security.
  • Wrote SQL, PL/SQL codes, stored procedures and packages.
  • Developed Informatica Mappings/Sessions to populate the Data Warehouse and Data Mart.
  • Error checking and testing of the ETL procedures and programs using Informatica session log.
  • Designed and developed UNIX scripts for creating, dropping tables which are used for scheduling the jobs.
  • Wrote pre-session shell scripts to check session mode (enable/disable) before running/scheduling batches.
  • Used Business Objects to create various reports.
  • Used Autosys for scheduling.
  • Wrote pre-session shell scripts to check status (failure/success) after completion of all batches.
  • Maintain Development, Test and Production mapping migration using Repository Manager. Also use Repository Manager to maintain the metadata, security and reporting.
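The pre-session status checks described above amount to reading the previous batch's status and failing fast so the scheduler holds dependent jobs. A minimal sketch, with an invented status-file name and a stand-in status file created so the example is self-contained:

```shell
#!/bin/sh
# Hypothetical pre-session check: read the previous batch's status file
# and exit non-zero on failure so Autosys holds downstream jobs.
# The file name and contents are invented for illustration.
statusfile=/tmp/prev_batch_status.$$
echo "SUCCESS" > "$statusfile"       # stand-in for the real status file

status=$(cat "$statusfile")
rm -f "$statusfile"

if [ "$status" != "SUCCESS" ]; then
    echo "previous batch failed: $status" >&2
    exit 1
fi
echo "previous batch ok"
```

The same pattern, with a different file, covers the session-mode (enable/disable) check run before scheduling.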

Environments: Informatica Power Center 8.1/7.1, Informatica Power Connect/Power Exchange, Power Analyzer/ Data Analyzer (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager), Superglue, ETL, Erwin 4.5/4.0, Flat files, Oracle 10g/9i, IMS Data (Xponent, plantrak, DDD), First Logic, MS SQL Server 2005/2000, PL/SQL, Cognos, Autosys, Shell Programming, SQL* Loader, IBM DB2 8.0, Toad, Excel and Unix scripting , Sun Solaris, Windows NT.

Confidential, DALLAS, TX Jan’06 – Dec’06

Sr.INFORMATICA DEVELOPER

The primary objective in building this credit card Datamart was to process customers’ credit card applications using data coming from the customer Datamart, Credit Bureau reports and credit scores for approval or rejection of the application, and to decide the credit amount without posing risk to the organization. This Datamart was also built for ranking existing and potential customers and identifying risky customers based on their financial history and account information. This classification was used to target customers with new credit offers.

Responsibilities:

  • Business Requirements Gathering and Analysis.
  • Analyzing the source data coming from Oracle and working with business users and developers to develop the Model.
  • Worked on Dimensional modeling to Design and develop STAR schemas.
  • Used Erwin to identify Fact and Dimension tables.
  • Worked closely with user decision makers to develop the transformation logic to be used in Informatica.
  • Analyzing the source data and deciding on appropriate extraction, transformation and loading strategy.
  • Used First Logic and its components (ACE, DataRight, merge/Purge) to cleanse the data.
  • Extensively worked on all the transformations like Filter, Aggregator, Expression, Router, Lookup , Update Strategy, Sequence Generator, Rank, Union, Joiner, Source Qualifier etc.,
  • Identifying and tracking the slowly changing dimensions, heterogeneous Sources and determining the hierarchies in dimensions.
  • Developing number of Complex Informatica Mappings, Mapplets and Reusable Transformations for weekly Loading of Data.
  • Developing several Mappings and Mapplets using corresponding Source, Targets and Transformations.
  • Worked with Informatica PowerCenter 6.2/7.1 – Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet and Transformation Developer for creating Mappings.
  • Using Informatica Workflow Manager for Creating, running the Workflows and Sessions and scheduling them to run at specified time.
  • Dimensional Database Modeling using Erwin 4.0 and Involved in Star Schema Modeling.
  • Using Informatica Repository Manager to maintain the metadata, Security, Folder Management.
  • Optimized Informatica mappings for Performance at transformation level, mapping level and session level.
  • Extensively involved in creating database procedures, functions and triggers using PL/SQL.
  • Worked on Data Extraction, Data Transformations, Data Loading, Data Conversions and Data Analysis.
  • Used Maestro for scheduling the jobs.
  • Extensively involved in writing Stored Procedures and calling the same through PowerCenter Stored Procedure Transformation.
  • Extensively designed data mappings using Filter, Expression and Update Strategy transformations in PowerCenter Designer.
  • Created Nested Batches and scheduled them for Daily Loads.
  • Installing Informatica PowerCenter Client and Server and Database connectivity using ODBC to Various Sources and Repositories.
  • Preparation of Unit Test Plans and verification of functional specifications and review of deliverables.
  • Migrated Mappings, Sessions, Workflows and common Objects from Development to Test and to Production.
  • Designed Informatica transformations for loading data from various sources like flat files ODBC Sources.
  • Worked closely with Software Developers to isolate, track and troubleshoot defects.
  • Worked with different Informatica tuning issues and fine-tuned the transformations to make them more efficient in terms of performance. Performed error handling.
  • Setting the Error Logic for streamlining and automating the data loads for cleansing and trapping incorrect data on staging servers and then loading it to the data warehouse.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Scheduling and Loading data process and monitoring the ETL Process.
  • Used various data sources like Flat Files, Relational Tables.
  • Extensively used Stored Procedures for Pre-Session and Post-Session Data Loading Process.
  • Assisted in creating reports using Cognos.
  • Wrote Unix Shell Scripts, SQL Commands, PL/SQL stored procedures.
  • Worked on loading of data from several flat files sources using Teradata MLOAD & FLOAD.
  • Transfer of large volumes of data using Teradata Fast Load, Multi Load and T-Pump.
  • Worked on exporting data to flat files using Teradata FEXPORT.
  • Assisted in UAT Testing, Performed Unit & Integration testing.
  • Assisted in handling production support issues.
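The slowly-changing-dimension tracking noted above reduces to comparing incoming attribute values against the current dimension rows; a minimal sketch of that Type 2 change detection, with invented files and columns:

```shell
#!/bin/sh
# Hypothetical SCD Type 2 sketch: when an incoming attribute differs from
# the current dimension row, the old row is expired and a new version
# inserted; brand-new keys are simply inserted. Data below is invented.
cat > dim_current.txt <<'EOF'
101,GOLD
102,SILVER
EOF
cat > incoming.txt <<'EOF'
101,PLATINUM
102,SILVER
103,BRONZE
EOF

changes=$(awk -F',' '
    NR == FNR { cur[$1] = $2; next }                 # load current rows
    !($1 in cur)                 { print "INSERT " $1 }          # new key
    ($1 in cur) && cur[$1] != $2 { print "EXPIRE+INSERT " $1 }   # changed
' dim_current.txt incoming.txt)
rm -f dim_current.txt incoming.txt
echo "$changes"
```

In the actual mappings this comparison is done with a Lookup against the dimension and an Update Strategy deciding insert vs. update; the awk version just makes the decision logic visible.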

Environment: Informatica Power center 6.2/7.1 (Designer, Repository Manager, Workflow Manager, Workflow Monitor, PMCMD) on windows 2000 Box, Power Connect, Power Analyzer, Superglue, Erwin 4.0, Oracle 10g/9i, Teradata V2R5/V2R4, DB2 8.0, SQL Server 2000 (Enterprise Manager, Query Analyzer), Maestro, First Logic (ACE, DataRight, Merge/Purge), Cognos 7.0, T-SQL, Solaris 9.0, Unix.

Confidential, ATLANTA, GA Jul’04 – Dec’05

ETL ORACLE DEVELOPER

Responsibilities:

  • Requirement Gathering and Business Analysis.
  • Participated in User meetings, gathering requirements, Analysis, Design and Development, test and implementation of Informatica transformations and workflows for extracting the data from multiple legacy systems.
  • Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data and historical data.
  • Responsible for Data Modeling and populating the business rules using mappings into the Repository for Meta Data Management.
  • Created Logical and Physical models for Staging, Transition and Production Warehouses using Erwin 3.5.
  • Used Repository Manager to create Repository, User groups, Users and managed users by setting up their privileges and profile.
  • Developed Complex mappings in Informatica to load the data from various sources using different transformations like Source Qualifier, Look up (connected and unconnected ), Expression, Aggregate, Update Strategy, Sequence Generator, Joiner, Filter, Update Strategy, Rank and Router Transformations. Used debugger to test the mapping and fixed the bugs.
  • Analyzing the source data and deciding on appropriate extraction, transformation and loading strategy.
  • Identifying and tracking the slowly changing dimensions, heterogeneous Sources and determining the hierarchies in dimensions.
  • Using Workflow manager for Workflow and Session Management, database connection management and Scheduling of jobs to be run in the batch process.
  • Developing number of Complex Informatica Mappings, Mapplets and Reusable Transformations for weekly loading of Data.
  • Fixing invalid Mappings, testing of Stored Procedures and Functions, Unit and Integration Testing of Informatica Sessions, Batches and Target Data.
  • Designed Informatica Transformations for loading data from various sources like flat files/OCI/ODBC Sources.
  • Worked closely with Software Developers to isolate, track and troubleshoot defects.
  • Worked with different Informatica tuning issues and fine-tuned the transformations to make them more efficient in terms of performance. Responsible for error handling.
  • Wrote Shell Scripts, SQL statements and PL/SQL Stored Procedures. Used Control M for scheduling jobs.
  • Implemented Slowly Changing Dimensions Type 1 and Type 2.
  • Assisted in the process of preparation of document Informatica transformation Development Standards. This document describes the general guidelines for Informatica developers, the naming conventions to be used in Transformations and also development and production environment structures.
  • Created Sessions, reusable worklets and batches in Workflow Manager.
  • Monitored the sessions using Workflow Manager.
  • Created reusable transformations and reusable Mapplets for use in Mappings.
  • Extensively used Environment SQL commands in workflows prior to extracting the data in the ETL tool.
  • Scheduled the batches and sessions at specified frequency.
  • Used Business Objects to generate reports.
  • Performed Unit and Integrated testing, Provided support.

Environment: Informatica Power center 6.2/5.1, Erwin 4.0, Oracle 9i, SQL Server 2000, DB2 7.0, First Logic, Power Plug, UNIX Shell Scripting, Sybase 12.x, Business Objects 5.0, Solaris 9.0, Unix, PL/SQL, SQL, Control M, Windows.

Confidential Feb’02-Jul’04

DATABASE DEVELOPER/ BUSINESS ANALYST (INTERN)

Responsibilities:

  • Analysis of Source, Requirement, existing OLTP system and Identification of required dimensions and facts from Database.
  • Extracting the data from various sources of input and loading into the Oracle Data Warehouse.
  • Loading the Data from tables into OLAP application and further aggregate to higher levels for analysis.
  • Creating temporary repository for already migrated database for system analysis.
  • Writing Batch programs and database triggers at staging area for population of warehouse. Creating sessions and batches.
  • Develop data conversion, integration, loading and verification specifications and design of a Mapping specification and creating multidimensional models.
  • Setting of Error Logic for streamlining and automating the data loads for cleansing and trapping incorrect data on staging servers and then loading it to the data warehouse and populating the warehouse.
  • Creating sessions and batch management and Performance Tuning and initial testing followed by volume testing.
  • Resolve technical issues with software consultants and vendors.

Environment: Oracle 7.3, Erwin 3.5.2, HP UNIX
