
Informatica Resume


New Jersey, NJ

SUMMARY:

  • Over 7 years of IT experience in Design, Development and Implementation of various projects using Data Warehousing tools, with a working knowledge of traditional transaction-based (OLTP) systems as well as the latest data warehousing/data mart techniques.
  • Very good experience in Integration and design of complex OLTP database models using ER diagrams and handling performance issues of the OLTP databases.
  • Expertise in designing and handling the models for very large databases.
  • Experience in all the phases of Data warehouse life cycle involving requirements gathering/analysis, design, development, validation & testing of Data warehouses using ETL, Data Modeling, & Reporting tools.
  • Studied the existing OLTP systems (3NF models) and created facts and dimensions in the data mart.
  • Discussed extensively with developers, the data architecture team, Business SMEs and Business Analysts before designing a model that can fulfill all business needs.
  • Clarified client requirements, business needs and project objectives, at each level of project implementation.
  • Extensively followed Ralph Kimball and Bill Inmon Methodologies.
  • Extensive experience in conceptual design, logical design and physical design of data models, Star Join Schema/Snowflake modeling, and FACT & Dimension tables, exclusively using ERwin (7.3.3/7.2/6.0/5.5/4.5/4.0/3.5.5) and ER Studio. Well-versed with various grains like transactional grain, periodic snapshot grain and accumulating snapshot grain.
  • Applied advanced database concepts according to business needs, such as degenerate dimensions, factless fact tables and bus matrix architecture for building the EDW, and built a strategy for late arriving fact rows (a hedged DDL sketch of these constructs follows this list).
  • Experienced in writing PL/SQL ETL routines in data warehousing applications; involved in database design and data modeling.
  • Worked with huge databases containing terabytes of data on enterprise database platforms such as Oracle and DB2.
  • Knowledge in writing, testing, and implementation of the triggers, procedures, functions at database level and form level using PL/SQL
  • Managed various Sub Models according to different subject areas via Nested Hierarchies which allowed high level of organization and Security.
  • Analyzed the forecasting of data for future application growth and worked with DBA’s to come up with the necessary space.
  • Performed Data cleansing and Data profiling for detecting and correcting inaccurate data from the databases and to track data quality and to assess the risk involved in integrating data for new applications.
  • Created and executed Test Plan, Test Scripts and Test Cases based on Design document and User Requirement document for testing purposes.
  • Assisted in developing and reviewing Data Access plans for handling query performance issues
  • Extensively used the ERwin Repository, which provides rich support for model version management and allows organizations to easily manage all successive states of models; implemented Security Management by defining Users and inheriting permissions through Roles.
  • Extensive experience of Software Development Life Cycle (SDLC) methodologies like Waterfall, Agile and RUP.
  • Possess a wide range of IT industry experience across Communication, Banking/Finance and Insurance, with strong analytical, problem-solving, organizational, communication, learning and team skills.
  • Excellent communication and interpersonal skills, good analytical reasoning and ability to learn quickly, high adaptability to new technologies and tools.
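
To illustrate the dimensional constructs referenced above, the following is a minimal Oracle DDL sketch of a star schema with a degenerate dimension and a factless fact table. All table and column names (dim_date, dim_customer, fact_sales, fact_customer_contact) are hypothetical placeholders, not objects from any of the projects listed below.

```sql
-- Hypothetical star schema: two dimensions, one transaction-grain fact,
-- one factless fact. Names are illustrative only, not from an actual project.
CREATE TABLE dim_date (
    date_key        NUMBER(8)     PRIMARY KEY,   -- surrogate key, e.g. 20120131
    calendar_date   DATE          NOT NULL,
    fiscal_month    VARCHAR2(7)   NOT NULL
);

CREATE TABLE dim_customer (
    customer_key    NUMBER(10)    PRIMARY KEY,   -- surrogate key
    customer_id     VARCHAR2(20)  NOT NULL,      -- natural/business key
    customer_name   VARCHAR2(100),
    segment         VARCHAR2(30)
);

-- Transaction-grain fact; order_number is kept as a degenerate dimension.
CREATE TABLE fact_sales (
    date_key        NUMBER(8)     NOT NULL REFERENCES dim_date (date_key),
    customer_key    NUMBER(10)    NOT NULL REFERENCES dim_customer (customer_key),
    order_number    VARCHAR2(20)  NOT NULL,      -- degenerate dimension
    sales_amount    NUMBER(12,2)  NOT NULL,
    quantity        NUMBER(6)     NOT NULL
);

-- Factless fact table: records that an event occurred, no measures.
CREATE TABLE fact_customer_contact (
    date_key        NUMBER(8)     NOT NULL REFERENCES dim_date (date_key),
    customer_key    NUMBER(10)    NOT NULL REFERENCES dim_customer (customer_key)
);
```

The factless fact table carries only foreign keys; events such as customer contacts are analyzed purely by counting rows.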

TECHNICAL SKILLS:


Databases:

Oracle 11g/10g/9i/8i/7.x, DB2 UDB 9.1/8.2, MS SQL Server 2000/2005, SQL, PL/SQL, SQL*Plus.

ETL Tools:

Informatica Power Center 9.0.1/8.6/8.5.1/8.1.1/7.x/6.x

Designing/Data Modeling Tools:

CA ERwin 7.3.9/7.1, ER Studio 8.0/7.1/5.1, MS Visio

Languages:

C, C++, XML, Shell Scripting

Utilities:

Embarcadero DB Artisan, SQL*Loader, SQL Navigator, TOAD 9.6.1.

Operating Systems:

Windows NT/2000/XP/2003, UNIX Sun Solaris

OLAP Reporting Tools:

MS SQL Server Reporting Services, MicroStrategy

Scheduling Tools:

Autosys, Control-M, Tidal

PROFESSIONAL EXPERIENCE:
Confidential, NJ    Jan 2012 - Present


Role: Data Modeler/ Data Analyst/Informatica Developer

Confidential, is the corporate and investment banking division of Bank of America. It provides services in mergers and acquisitions, equity and debt capital markets, lending, trading, risk management, research, and liquidity and payments management. It was formed through the combination of the corporate and investment banking activities of Bank of America and Merrill Lynch following the acquisition. The Compliance Data Warehouse (CDW), or Compliance Surveillance Data Repository (CSDR), is designed as a single-source data repository for all Merrill Lynch and Bank of America compliance requirements.

  • The “CORE” project focuses on retrieving data from various domain feeds and loading it into CSDR for multiple downstream users with various business requirements such as auditing, trade surveillance and reporting.
  • Designed and developed Data Dictionary to support the enterprise architecture for the business and technical users.

CORE Project Responsibilities:

  • Involved in collecting and understanding the requirements of the business users and corporate managers throughout the project
  • Worked as a part of the technical team in converting the business requirements to technical specifications
  • Coordinated with different data providers to source the data and build the Extraction, Transformation, and Loading (ETL) modules based on the requirements to load the data from source to stage and performed Source Data Analysis.
  • Analyzed existing Data Model and accommodated changes according to the business requirements.
  • Created a conceptual model based on the business process, identified the entities, attributes and relationships, and converted the conceptual model to the logical data model.
  • Converted the logical model to Physical Model by giving accurate data types, created indexes on keys and forward engineered the DDL.
  • Utilized Brownstone data dictionary for Metadata Management
  • Ensured that referential integrity is maintained across the data model and compared the new model with the database structure and implemented the new structure on to the database.
  • Created the Source to Target mapping document with the ETL routine, transformation logic and the types of transformations to be used (a hedged example of such a load routine follows this list).
  • Created ETL Design Specifications Document with in depth details about Source, Target, Naming Standards, Parameters/Variables, Lookup, Program Logic/general Processing Rules, ETL Logic, Folder, Session, and Workflow details.
  • Assisted QA team during the UAT phase in testing the data
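
As a rough illustration of the ETL routines documented in the source-to-target mappings above, the fragment below loads a staging table into a target table with a few simple transformation rules. The object names (stg_trade_feed, csdr_trade) and the rules themselves are hypothetical placeholders, not the actual CSDR structures or logic.

```sql
-- Hypothetical stage-to-target load; table and column names are placeholders.
INSERT INTO csdr_trade (trade_id, trade_date, account_id, notional_amt, load_dt)
SELECT s.trade_id,
       TO_DATE(s.trade_date, 'YYYYMMDD'),          -- type-conversion rule
       UPPER(TRIM(s.account_id)),                  -- standardization rule
       NVL(s.notional_amt, 0),                     -- default rule for nulls
       SYSDATE                                     -- audit column
FROM   stg_trade_feed s
WHERE  s.trade_id IS NOT NULL;                     -- reject rule handled upstream
COMMIT;
```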

Data Dictionary Project Responsibilities

  • Interacted frequently with both the business and technical teams and captured the requirements of the Data Dictionary users.
  • Created sample reports from Erwin Data Browser/Report Template builder to review with the users.
  • Analyzed the data in M7 tables of Erwin Model Manager for data quality.
  • Extensively worked on M7 tables such as M7Property, M7Object, M7ObjectProperty, M7Class and M7Library.
  • Edited and created new report queries in Erwin Data Browser for Erwin Model Manager Users.
  • Created metadata-related tables on the database (a hedged DDL sketch follows this list).
  • Created a Source to Target mapping document with ETL logic that retrieves the data from the M7 tables and loads it into the new metadata tables.
  • As a Data Dictionary Specialist, maintained and owned the Compliance metadata database.
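
A minimal sketch of the kind of metadata table referred to above is shown below. The layout and names (md_data_dictionary and its columns) are assumptions for illustration only; the actual Compliance metadata structures are not reproduced here.

```sql
-- Hypothetical layout for one metadata (data dictionary) table;
-- names and columns are illustrative assumptions.
CREATE TABLE md_data_dictionary (
    model_name      VARCHAR2(100) NOT NULL,
    entity_name     VARCHAR2(100) NOT NULL,
    attribute_name  VARCHAR2(100) NOT NULL,
    data_type       VARCHAR2(50),
    definition      VARCHAR2(4000),
    last_updated    DATE DEFAULT SYSDATE,
    CONSTRAINT pk_md_data_dictionary
        PRIMARY KEY (model_name, entity_name, attribute_name)
);
```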

Environment: ERwin 7.3.9, Oracle 11g, Toad, Informatica Power Center 9.0.1, OBIEE 10g, UNIX

Confidential    May 2011 - Dec 2011


Role: Data Modeler/Data Analyst

Confidential, is part of PNC, a member of The PNC Financial Services Group, and is committed to contribution/401(k) plan services. A centralized data repository (relational database) is designed to address the data management that is currently performed outside the record keeping application in MS Access databases, Excel spreadsheets, Word documents, etc. An individual front-end application is designed for each business line to improve the efficiency of its business process. A Vested Interest data warehouse is also developed to replace all reporting needs, including ad-hoc reporting using Crystal Reports.
Responsibilities:
Phase 1: Design of Relational Database

  • Facilitated meetings with the different business lines to understand the business process.
  • Worked with Business Analysts to gather the business requirements and created functional and non-functional requirements documents.
  • Analyzed the current business processes and data which is maintained in the different sources.
  • Designed Conceptual, Logical and Physical Model based on normalization standards (Second Normal Form/Third Normal Form).
  • Conducted JAD session with application developers to determine and analyze the data flow.
  • Created mapping document to load the existing data from MS Access database to the newly designed relational database.
  • Assisted testing team in developing test scenarios, test logic and test data to support unit and system integration testing and executed test plans.
  • Performed unit testing on the applications and data to ensure the correct data flow.
  • Facilitated training sessions and trained business users on the reporting feature of the applications.

Phase 2: Design of Data warehouse

  • Conducted meetings with each business line, gathered and documented the reporting requirements, and obtained approval from the business users.
  • Analyzed current reports used by the business and presented enhanced reporting structure with additional data elements.
  • Performed Source Data Analysis on different sources like SQL Server and Oracle and also on additional informational systems
  • Developed logical and physical data models that capture current-state/future-state data elements and data flows using ERwin / Star Schema.
  • Studied the existing OLTP system(s), analyzed and created fact and dimension tables, and modeled the data warehousing data marts in Star join schema and Snowflake schema.
  • Identified and tracked the slowly changing dimensions/mini dimensions and heterogeneous sources, and determined the hierarchies in dimensions (a hedged SCD Type 2 sketch follows this list).
  • Created application-specific Data Marts so that users can access personalized dashboards of information that is specific to their department and business unit.
  • Drilled out each of the data flow along with source to target mapping document for the ETL team
  • Coordinated with the testing team to write test scripts for testing the functionality of the design.
  • Created and maintained the Data Dictionary and the data model's supporting technical documentation, covering the relational schema and both phases.
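
The slowly changing dimension handling noted in Phase 2 can be pictured with the Type 2 pattern sketched below: the current row is expired and a new version is inserted. The dimension (dim_participant), staging table, sequence and tracked attribute are hypothetical names, not the actual Vested Interest warehouse objects.

```sql
-- Hypothetical SCD Type 2 maintenance for a participant dimension.
-- Step 1: expire the current row when a tracked attribute has changed.
UPDATE dim_participant d
SET    d.current_flag = 'N',
       d.effective_end_dt = TRUNC(SYSDATE) - 1
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_participant s
               WHERE  s.participant_id = d.participant_id
               AND    s.plan_status   <> d.plan_status);

-- Step 2: insert a new current version for changed and brand-new participants.
INSERT INTO dim_participant
       (participant_key, participant_id, plan_status,
        effective_start_dt, effective_end_dt, current_flag)
SELECT dim_participant_seq.NEXTVAL,       -- assumed surrogate-key sequence
       s.participant_id,
       s.plan_status,
       TRUNC(SYSDATE),
       DATE '9999-12-31',
       'Y'
FROM   stg_participant s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_participant d
                   WHERE  d.participant_id = s.participant_id
                   AND    d.current_flag = 'Y');
COMMIT;
```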

Environment: ERwin 7.3.3, MS SQL Server 2008, Oracle 11g, Informatica Power Center 8.6.1, Crystal Reports 12

Confidential, WI    Jan 2010 - April 2011

Role: Data Modeler/Data Analyst

The project is building an Enterprise Data Warehouse that integrates insurance data for accounting purposes. Data is sourced from different systems, which include premium, claims, expenses or losses, and contracts. Flat files, Oracle tables, DB2 tables and Excel spreadsheets, arriving on a daily, weekly and monthly basis, are part of the source data for the present ODS, which is on DB2. The warehouse sits on an Oracle database. Efforts are under way on the Oracle side to build its own data warehouse from the DB2 mainframe database maintained by Total Access. The plan is to build two different types of data warehouses, an EDW and an ODS; the latter maintains data for 6 months, while the former maintains all the history spanning several years.
Responsibilities:

  • Organized and managed meetings with Business Analysts (BAs), Technical Assistants (TAs), Data Stewards, and Subject Matter Experts (SMEs) for requirements gathering, business analysis, testing, metrics and project coordination.
  • Delivered an EDW solution that provides an integrated data store to allow for analysis across subject areas such as customer, product, service, and transactional revenue, and serves as a single, common repository of information used across the enterprise for data categories captured in the EDW.
  • Involved in Data modeling, E/R diagrams, normalization and de-normalization as per business requirements.
  • Designed and developed a detailed Subject Area model to establish relationships between Business entities.
  • Worked on the 3NF tables in the OLTP system and denormalized to load the data into facts and dimensions.
  • Used the data modeling tool ERwin 7.3.3 for creating models, ER diagrams, DDL (Data Definition Language) scripts, and logical and physical data models.
  • Worked extensively in accordance with the strategy overseer, communicating standards and conventions for consistency, and maintaining releases and versions.
  • Developed Enterprise Data Dictionary for reusable objects like domains, attachments, defaults, reference values, User data types and reusable procedural logic.
  • Managed various Sub Models according to different subject areas via Nested Hierarchies which allowed high level of organization and Security.
  • Implemented Security Management by defining Users and inheriting permissions by providing Roles.
  • Handled Model Access, Viewing and overall control by restricting access to sensitive data, maintained Model histories, Tracked changes to every modeling object through versions.
  • Extensively used ERwin for reverse engineering and hosting according to the business requirements on existing models.
  • Reverse engineered the PDM based on the current state of the business process and then developed the detailed LDM in synchronization with the PDM.
  • Developed use cases, data flow models, and performed functional decomposition analysis
  • Created Bus Matrix Architecture for both detailed transaction view and monthly snapshot perspective.
  • Based on the business findings, created Star schema and Snowflake schema using advanced dimensional modeling concepts such as conformed dimensions and facts, large dimensions, degenerate dimensions, factless fact tables and aggregate fact tables in the multidimensional model (a hedged aggregate-fact sketch follows this list).
  • To maintain the consistency and quality of the data, worked with the Data Governance, Data Profiling and Data Quality teams, which required managing the master data from all the business units as well as from IT and ensuring data quality standards across the enterprise.
  • Analyzed the integrated metadata to get information on data usage, end-to-end change impact analysis, and report-to-source data lineage.
  • Designed and developed the model in hierarchical manner where most common information has the least amount of distribution and least common information has highest level distribution.
  • Documented target matrix document, business access rule, data volumes and retention and handed it over to the ETL and DBA teams.
  • Worked with data mapping from source to target and data profiling to maintain the consistency of the data.
  • Participated in the entire requirements engineering process right from requirements elicitation phase to the phase involving documenting of the requirements.
  • Worked efficiently even under tight deadlines. Followed the Agile software development methodology.
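
The monthly-snapshot side of the bus matrix described above could be realized with an aggregate fact built from the transaction-grain fact, as in the sketch below. fact_premium_txn, fact_premium_monthly and dim_date are illustrative names only, not the actual EDW tables.

```sql
-- Hypothetical monthly aggregate (periodic snapshot style) built from the
-- transaction-grain fact; names are placeholders, not the actual EDW tables.
INSERT INTO fact_premium_monthly (month_key, policy_key, premium_amt, txn_count)
SELECT TO_NUMBER(TO_CHAR(d.calendar_date, 'YYYYMM')) AS month_key,
       f.policy_key,
       SUM(f.premium_amt)                            AS premium_amt,
       COUNT(*)                                      AS txn_count
FROM   fact_premium_txn f
JOIN   dim_date d ON d.date_key = f.date_key
GROUP BY TO_NUMBER(TO_CHAR(d.calendar_date, 'YYYYMM')), f.policy_key;
COMMIT;
```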

Environment: ERwin 7.3.3, Oracle 10g, Flat Files, MS SQL Server 2005, PL/SQL, UNIX Shell Scripting, Toad 9.6, Informatica Power Center 8.6.1.

Confidential, Des Moines, Iowa    Dec 2008 - Dec 2009


Role: Data Modeler/ Data Analyst

Confidential, is one of the nation's biggest financial corporations, engaged in commercial and retail banking and offering a comprehensive selection of financial products and services including mortgage lending, depository, investment management, insurance services and other related financial services. An Enterprise Mortgage Data Warehouse (EMD) for loan amortization is maintained, enabling the management, marketing and remittance collection teams to analyze the business over a period of time. Budgeting, marketing and forecasting decisions are based on the reports produced using the data warehouse.

Responsibilities:

  • Gathered Business requirements by organizing and managing meetings with business stake holders, development teams and analysts on a scheduled basis.
  • Analyzed different data sources such as SQL Server, Oracle and flat files, which contain student information, college information, parent/sponsor information, loan contracts, etc.
  • Normalized the incoming files of different sources to 3NF at the staging area before loading into facts and dimension tables.
  • Performed Data Profiling to assess the risk involved in integrating data for new applications, including the challenges of joins and to track data quality.
  • Identified multiple dimension and fact tables. Used advanced data modeling concepts such as degenerate dimensions, sub-dimensions, factless fact tables and aggregate fact tables in the multidimensional model.
  • Used the advanced dimensional modeling concept of multi-valued dimensions to resolve many-to-many relationships using a bridge (associative) entity between tables (a hedged bridge-table sketch follows this list).
  • Handled the problem of late arriving fact rows and multiple date keys by using accumulating snapshot grain.
  • Exclusively used ERStudio 8.5 to build logical or physical data model, allowing data architects to incorporate the same standards in SOA development.
  • Conducted Data Model design review meetings to get approval from cross-functional teams.
  • Involved in creating Enterprise data dictionary and data standards to ensure data quality and consistency.
  • Implemented Data Archiving strategies to handle the problems with large volumes of data by moving inactive data to another storage location that can be accessed easily.
  • Created DDL for documenting and replicating the database’s build procedures and handed it over to the DBA team.
  • Created Source to Target mapping document which contains the transformation logic and handed it over to ETL Team.
  • Supported the testing department to create Test data and Unit test cases to ensure successful data loading process.
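
The multi-valued dimension bullet above refers to the bridge-table pattern; a minimal sketch with hypothetical loan/borrower names follows. In this pattern the fact row carries a group key, and the bridge allocates measures across the group members via a weighting factor.

```sql
-- Hypothetical bridge table resolving the many-to-many relationship
-- between a loan fact and its borrowers; names are illustrative only.
CREATE TABLE dim_borrower (
    borrower_key   NUMBER(10)    PRIMARY KEY,
    borrower_id    VARCHAR2(20)  NOT NULL,
    borrower_name  VARCHAR2(100)
);

CREATE TABLE bridge_loan_borrower (
    borrower_group_key  NUMBER(10)  NOT NULL,   -- key carried on the fact row
    borrower_key        NUMBER(10)  NOT NULL REFERENCES dim_borrower (borrower_key),
    weighting_factor    NUMBER(5,4) NOT NULL,   -- allocates measures across borrowers
    CONSTRAINT pk_bridge_loan_borrower
        PRIMARY KEY (borrower_group_key, borrower_key)
);

-- The loan fact then carries borrower_group_key instead of a single borrower_key.
```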


Environment: ER Studio 8.5, Oracle 10g, Flat Files, SQL Server, Informatica Power Center 8.5.1, Toad for Oracle, MicroStrategy
Confidential, Bensenville, IL    July 2006 - Nov 2008


Role: Data Modeler/ Analyst

Confidential, is a telecom provider of local and long distance telephone services, wireless services and internet access services in the United States. An EDW solution will provide the data that will enable US Cellular to respond to competition and support its Customer Relationship Management (CRM) strategy. Business metrics and customer, market, product and CRM-supporting external data will be readily available and accessible. The scope of this project is to design and implement a real-time data warehouse to support and maintain customers' current and historical data. Sales Managers and Sales Representatives are the major end users of this data warehousing system.

Responsibilities:

  • Collaborated with business analysts and the DBA for requirements gathering, business analysis and designing of the data marts
  • Worked on dimensional modeling to design Star/Snow-flake schemas in ERwin
  • Identified potential fact and dimension tables for providing a unified view to ensure consistent decision making
  • Identified all the conformed dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables
  • Determined the granularity of the fact and dimension tables and identified the relationships between them across each Star Schema
  • Created the database design document for building the data mart that includes all the fact and dimension tables and the business rules to be transformed during the ETL process for these tables
  • Worked with heterogeneous sources from various channels like Oracle, SQL Server, flat files, and web logs
  • Analyzed the data in the source databases and designed the overall data architecture and all the individual data marts in the data warehouse for each of the areas Merchandising, and store operations
  • Generated DDL SQL scripts and collaborated with the DBA team to ensure proper deployment of data structures
  • Prepared source to target matrix (STM) and collaborated with ETL team all through the transformations and loading activities as defined in STM.
  • Created an Update Strategy transformation for all mappings, which inserts, deletes and updates rows based on conditions routed through a Router transformation (a hedged set-based equivalent follows this list)
  • Designed, developed and deployed new data marts along with modifying existing marts to support additional business requirements
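
In set-based SQL terms, the Router plus Update Strategy flow described above behaves roughly like the MERGE sketched below (update matching rows, insert new ones, and flag logical deletes). dim_store, stg_store and dim_store_seq are hypothetical names, not the actual warehouse objects.

```sql
-- Hypothetical set-based equivalent of the router + update strategy flow:
-- update matching rows, insert new ones, and flag logical deletes.
MERGE INTO dim_store t
USING stg_store s
ON (t.store_id = s.store_id)
WHEN MATCHED THEN
    UPDATE SET t.store_name  = s.store_name,
               t.region      = s.region,
               t.delete_flag = CASE WHEN s.status = 'CLOSED' THEN 'Y' ELSE 'N' END
WHEN NOT MATCHED THEN
    INSERT (store_key, store_id, store_name, region, delete_flag)
    VALUES (dim_store_seq.NEXTVAL, s.store_id, s.store_name, s.region, 'N');
COMMIT;
```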

Environment: Informatica Power Center 8.5.1, DB2, Oracle 10g, SQL/PLSQL, SQL*Loader, Erwin 7.1, TOAD for Oracle, UNIX

Confidential, Hyderabad, India    Oct 2005 - May 2006


Role: Programmer/ Data Analyst

This was an on-site project for ICICI Prudential Life Insurance Company Ltd. The project covered the entire functionality of individual and group insurance solutions for nineteen departments, which included the central office and regional offices located in major cities. It aimed to get all the data in different parts of the Life Insurance organization to work together and to synthesize the data into a modern database.

Responsibilities:

  • Database planning and logical and physical design using MS-Access to assist the team lead.
  • Typical forms development included designing forms with different layouts and placing the layout at runtime depending on some form parameters
  • Tuning and code optimization using different techniques like dynamic SQL, dynamic cursors, tuning SQL queries, and writing generic procedures, functions and packages (a hedged dynamic-SQL sketch follows this list)
  • Developing reusable objects like PL/SQL program units and libraries, database procedures and functions, database triggers to be used by the team and satisfying the business rules
  • Documented testing methodology and devised test scripts and plans to help the QA team in the functional aspect of the package and apply the same before submission to the client
  • Assisted in designing implementation plan and monitored the same by actively participating in user training, live data entry and obtained sign-off from the concerned departmental head
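
A small sketch of the kind of generic dynamic-SQL routine referred to above is shown below. The procedure, its parameters and the purge rule are hypothetical and written purely for illustration; they are not the code delivered on this project.

```sql
-- Hypothetical generic purge routine using dynamic SQL; names are illustrative.
CREATE OR REPLACE PROCEDURE purge_old_rows (
    p_table_name IN VARCHAR2,
    p_date_col   IN VARCHAR2,
    p_keep_days  IN NUMBER
) AS
    v_sql VARCHAR2(2000);
BEGIN
    -- Identifiers are concatenated here for brevity; in real code they should
    -- be validated against the data dictionary before use.
    v_sql := 'DELETE FROM ' || p_table_name ||
             ' WHERE ' || p_date_col || ' < TRUNC(SYSDATE) - :keep_days';
    EXECUTE IMMEDIATE v_sql USING p_keep_days;
    DBMS_OUTPUT.PUT_LINE(SQL%ROWCOUNT || ' rows purged from ' || p_table_name);
    COMMIT;
END purge_old_rows;
/
```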

Environment: Oracle 8i, Developer 2000 with Forms 5.0 and Reports 3.0, Oracle Designer 6i, Windows XP, PL/SQL, MS-Access

Education:
Bachelor's in Engineering
