Informatica Developer/Data Analyst Resume
PROFESSIONAL SUMMARY
- 8+ years of experience in data warehouse projects involving data analysis, ETL and report design, development, testing and implementation strategies.
- Strong experience in the design and implementation of ETL using Informatica PowerCenter.
- Very good understanding and design knowledge of data warehouse modeling and Star/Snowflake schemas.
- Experienced in all phases of Software Development Life Cycle (SDLC).
- Extensive experience in Oracle PL/SQL programming, SQL*Plus, Database Triggers, Packages, Cursors, Stored Procedures, Functions and management of Schema objects. Proficiency in Data Definition and Data Manipulation languages.
- Experience using the UNIX platform with shell scripting.
- Excellent skills in Informatica PowerCenter, PowerExchange, Data Explorer (IDE) and Data Quality (IDQ).
- Highly skilled in creating business requirement documents, technical architecture/design documents, unit test plans, deployment plans and supporting documents.
- Strong working experience in Informatica Data Integration Tools - Repository Manager, Designer, Workflow Manager and Workflow Monitor.
- Expertise in Developing Mappings, Mapplets and Transformations between Source and Target using Informatica Designer.
- Excellent experience working with Netezza through Informatica: writing complex SQL queries and Netezza views, and using the bulk writer and bulk reader.
- Worked on integrating data from flat files (fixed-width and delimited), COBOL files and XML files, as both sources and targets.
- Worked with business SMEs on developing the business rules for cleansing, and applied those rules using the Informatica Data Quality (IDQ) tool to cleanse data.
- Presented data cleansing results and IDQ plan results to the business.
- Experience in performance tuning of Informatica sources, targets, mappings and sessions.
- Extensive knowledge of performance tuning, pipeline partitioning and pushdown optimization.
- Experience in writing UNIX shell/Perl scripts and parameter files.
- Good experience working with Oracle, Teradata, Netezza and DB2.
- Extensively used Teradata utilities such as BTEQ, FastLoad, MultiLoad, TPT and TPump.
- Experience in writing various stored procedures, SQL Cards, PL/SQL blocks and triggers.
- Extensive professional experience in preparing and maintaining project design documents like Preliminary Design Document, Detailed Design Document, and Mapping Document.
- Involved in Unit Testing, System Integration Testing (SIT) and User Acceptance Testing (UAT).
- Experience working on end-to-end implementation of data warehouses, with a strong understanding of database and data warehouse concepts: ETL, star schema, normalization, business process analysis, reengineering and data modeling.
- Good experience interacting with business users and gathering requirements.
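As a sketch of the UNIX shell scripting and parameter-file work mentioned above, the following generates a PowerCenter session parameter file. The folder, workflow, session and variable names are illustrative placeholders, not taken from any actual project:

```shell
#!/bin/sh
# Hypothetical example: generate a PowerCenter parameter file for a daily load.
# The folder (SALES_DW), workflow, session, and $$ variables are placeholders.
PARAM_FILE=/tmp/daily_load.param
LOAD_DATE=$(date +%Y-%m-%d)

# The unquoted heredoc expands $LOAD_DATE; \$\$ emits the literal $$ prefix
# that PowerCenter expects for mapping/session parameters.
cat > "$PARAM_FILE" <<EOF
[SALES_DW.WF:wf_daily_load.ST:s_m_load_fact_sales]
\$\$LOAD_DATE=$LOAD_DATE
\$\$SRC_FILE_DIR=/data/incoming
\$\$TGT_SCHEMA=EDW
EOF

echo "Wrote parameter file $PARAM_FILE"
```

A scheduler (e.g. Tivoli or AutoSys, both listed below) would typically run such a script just before invoking the workflow so each run picks up the current load date.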
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 9.5/9.1/8.6.1/8.1.1/7.x/6.x, Informatica Data Quality, Informatica Data Explorer, PowerExchange
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2000/2005/2008, DB2, Teradata, Netezza
Database Tools: SQL*Plus, SQL*Loader, Toad, SQL Navigator, PL/SQL Developer, SQL Advantage, Teradata SQL Assistant, Aginity, Query Analyzer, Embarcadero
Reporting Tools: COGNOS (Impromptu, Transformer, Power Play), Microstrategy, SAP Business Objects
Others: OLAP, Rational ClearQuest, IBM ClearCase, PVCS, Tivoli, AutoSys, Erwin, MS Visio, Rational Rose
Languages: C, C++, Java, Visual Basic, HTML, ASP, Perl, SQL, Shell Scripting (Csh, Ksh)
Operating Systems: Windows 9x/NT/2000/XP/7/8, UNIX (AIX, Solaris, Linux)
PROFESSIONAL EXPERIENCE
Confidential
Informatica Developer/Data Analyst
Responsibilities:
- Determined the fact and dimension tables from the existing source tables, which hold the raw data.
- Performed data mapping to determine the availability of data in those tables and match it against the data requirements set forth in the business requirements document (BRD) for the new data mart.
- Worked on gap analysis to address the missing data and subsequently identify and map the sources from which it would be drawn. Once the data needs had been addressed, used dimensional modeling techniques to design the conceptual, logical and physical data models with the Oracle Designer modeling tool.
- Completed the design process by generating the DDL (Data Definition Language) scripts and subsequently running them on the Oracle database to produce the physical table structures.
- Designed the ETL (extraction, transformation and loading) routines and jobs to extract the raw data from sources using the Informatica ETL tool, then transform, aggregate and summarize the data by applying the business rules specified in the BRD.
- Laid down the standards and best practices for the development team to adhere to while developing ETL objects.
- Worked with the Informatica tools Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Monitor, Workflow Manager and Repository Manager.
- Worked extensively on different types of transformations, such as Source Qualifier, Expression, Filter, Aggregator, Rank, Lookup, Stored Procedure, Sequence Generator and Joiner.
- Created sessions and workflows to schedule the loads at the required frequency using PowerCenter Workflow Manager.
- Analyzed the load dependencies and scheduled the workflows accordingly to avoid load conflicts.
- Analyzed session log files when a session failed in order to resolve errors in mapping or session configuration.
- Optimized the performance of the mappings through various tests on sources, targets and transformations; identified and removed the bottlenecks, and implemented performance tuning logic on targets, sources, mappings and sessions to provide maximum efficiency and performance.
- Worked on designing the metadata, defining logical and complex joins on various subject areas with conformed dimensions for various reports across the physical, logical and presentation layers.
- Involved in developing various dashboards and reports with different delivery mechanisms by creating iBots.
- Performed unit testing and knowledge transfer, and mentored other team members.
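The session-log analysis described above can be sketched in shell. The log path, sample messages and error codes here are fabricated for illustration rather than real Integration Service output:

```shell
#!/bin/sh
# Hypothetical example: triage a failed Informatica session by scanning its
# log for database errors. The sample log written below is fabricated.
LOG=/tmp/s_m_load_fact_sales.log

cat > "$LOG" <<'EOF'
READER_1_1_1> DBG_21438 Reading source rows
WRITER_1_*_1> WRT_8229 Database errors occurred: ORA-01400 cannot insert NULL
MANAGER> PETL_24013 Session run completed with failure.
EOF

# Count Oracle error lines and surface the first one for triage.
ERRORS=$(grep -c 'ORA-' "$LOG")
echo "Database errors found: $ERRORS"
grep 'ORA-' "$LOG" | head -1
```

In practice the script would point at the Integration Service's session log directory instead of writing a sample log first.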
Environment: Informatica PowerCenter 9.5/9.6 (Designer, Repository Manager, Workflow Manager), BO XI R3, flat files, Oracle, mainframe, Wintel, OBIEE 10.1.3.3, TOAD, Tivoli, UNIX, Perl, Oracle Designer and shell scripting
Confidential, Boston, MA
Data Analyst/ETL developer
Responsibilities:
- Interacted with the Cash Management Division and Compliance Office to review AML (Anti-Money Laundering) requirements and provide a Positive Pay service to reduce check fraud.
- Updated the internal anti-money laundering software by monitoring AML activity.
- Implemented Scrum as part of the agile methodology for work management and participated as part of the team within that framework.
- Performed process mapping, data cleansing, data migration and validation of data table structures in the areas of sales, inventory, procurement, production and distribution.
- Involved in logical and physical data modeling, database schema design, and modification of triggers, scripts and stored procedures on Sybase database servers.
- Developed project artifacts such as the requirement list, process models, wireframes, use cases, data journal and business rules catalog using MS Visio and Rational Rose.
- Worked on data modeling and produced data mapping and data definition documentation.
- Responsible for physical/logical data modeling; metadata documentation; user documentation; and production specs.
- Developed detailed auditing and reporting of the current status and history of all components related to a product.
- Involved in formatting data stores and generating UML diagrams of logical and physical data. Responsibilities included source system analysis and data transformation, loading and validation for data marts, the operational data store and the data warehouse.
- Developed business process flows using MS Visio.
- Assumed ownership of Use Case Diagrams, Use Case narratives and other various artifacts.
- Implemented the SDLC, which included requirements, specifications, design, analysis and testing, utilizing the RUP methodology.
- Used the Microsoft Office suite to develop project documents: Visio for wireframes; Word, Excel and PowerPoint for business requirement documents.
- Ensured that all artifacts complied with corporate SDLC policies and guidelines.
- Used SnagIt for editing screenshots, and in JAD sessions for screen sharing and presentations.
- Intricately involved in business process flow development, business process modeling and ad hoc analysis; created source table definitions in the DataStage Repository by studying the data sources.
- Completed a study of the in-house requirements for the data warehouse; analyzed the DW project database requirements from the users in terms of the dimensions they want to measure and the facts against which those dimensions need to be analyzed.
- Prepared UML use cases using Rational Rose.
- Used MS Visio for flowcharting, process modeling and architectural design of the application.
- Documented business workflows textually and in UML diagrams per Scrum for stakeholder review. Reviewed test plans containing test scripts, test cases, test data and expected results for user acceptance testing.
Environment: SQL Server 2012/2008 R2, SSIS, SSRS, PL/SQL, Informatica PowerCenter 9.1, ER Studio 9.1, Oracle 11g, Content Management System
Confidential, Baltimore, MD
Data Analyst/ ETL developer
Responsibilities:
- Studied the in-house requirements for the data warehouse to be developed.
- Conducted one-on-one sessions with business users to gather data warehouse requirements.
- Analyzed database requirements in detail with the project stakeholders by conducting Joint Requirements Development sessions.
- Developed a Conceptual model using Erwin based on requirements analysis.
- Developed normalized Logical and Physical database models to design OLTP system for insurance applications.
- Created a dimensional model for the reporting system by identifying the required dimensions and facts using Erwin 8.0.
- Used forward engineering to create a physical data model with DDL that best suits the requirements from the logical data model.
- Worked with database administrators, business analysts and content developers to conduct design reviews and validate the developed models.
- Identified, formulated and documented detailed business rules and Use Cases based on requirements analysis.
- Facilitated the development, testing and maintenance of quality guidelines and procedures, along with the necessary documentation.
- Responsible for defining the naming standards for the data warehouse.
- Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from legacy DB2 and SQL Server database systems.
- Exhaustively collected business and technical metadata and maintained naming standards.
- Used Erwin for reverse engineering to connect to the existing database and ODS, create a graphical representation in the form of entity relationships and elicit more information.
- Used Informatica Designer, Workflow Manager and Repository Manager to create source and target definitions, design mappings, create repositories and establish users, groups and their privileges.
- Used Erwin's Model Mart for effective model management: sharing, dividing and reusing model information and designs for productivity improvement.
- Extracted data from the databases (Oracle, SQL Server, DB2 and flat files) using Informatica to load it into a single data warehouse repository.
- Facilitated the development of testing procedures, test cases and user acceptance testing (UAT). Integrated the work tasks with the relevant teams for a smooth transition from testing to implementation.
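The forward-engineering step above (logical model to physical DDL) can be illustrated with a small shell sketch; the table and column definitions are invented for the example:

```shell
#!/bin/sh
# Hypothetical example: generate simple CREATE TABLE DDL from a delimited
# column spec, mimicking forward engineering of a physical model.
SPEC=/tmp/dim_customer.spec
DDL=/tmp/dim_customer.sql

# Invented column spec: name|datatype per line.
cat > "$SPEC" <<'EOF'
customer_key|NUMBER(10)
customer_name|VARCHAR2(100)
effective_dt|DATE
EOF

# Emit one column definition per spec line; sed drops the trailing comma.
{
  echo "CREATE TABLE dim_customer ("
  awk -F'|' '{printf "  %s %s,\n", $1, $2}' "$SPEC" | sed '$ s/,$//'
  echo ");"
} > "$DDL"

cat "$DDL"
```

In practice Erwin generates the DDL directly from the model; a script like this is only a stand-in to show the shape of the output.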
Environment: Informatica PowerCenter 8.x/9.x (Designer, Repository Manager, Workflow Monitor, Workflow Manager), Oracle, SQL Server, Toad 9.6.1, Red Hat Linux, Windows XP/7, Microsoft Office 2010 and UNIX shell scripting.
Confidential, NJ
Data Analyst
Responsibilities:
- Involved in developing the conceptual, logical and physical data models, and used a star schema in designing the data mart.
- Translated the business rules into ETL processes that load the repository for metadata management.
- Performed thorough analysis of data and worked on enhancements and troubleshooting processes
- Provided production support of teh nightly batch.
- Designed processes to load data for MYWSJ.COM, a personalized consumer application for WSJ.COM.
- Installed and configured Informatica Power Center and Informatica Client tools.
- Extracted data from SQL Server and flat files into the data warehouse.
- Worked with different transformations, such as Source Qualifier, Lookup and Filter.
- Created sessions and batches and scheduled them.
- Created users, folders and user groups.
- Created various mappings using the Designer that include Source Qualifier, Expression, Aggregator and Lookup transformations.
- Performed requirement analysis and wrote functional design specification documents for different projects.
- Involved in translating the business requirements into technical requirements and prioritizing the development efforts by working with the end users.
- Met with the business users to collect detailed requirements for the addition and implementation of data alerts.
- Involved in developing report design documents (RDDs) with the most recent requirements for all the reports.
- Provided critical performance information reports to support Sales Relationship Management (SRM) for Commercial Banking (CB).
- Involved in creating and updating reports and internal reports for Commercial Data Mart (CDM).
- Generated performance metric reports to support Incentive Calculations for Commercial Banking (CB) Bankers.
- Interacted with Commercial Banking (CB) bankers, Treasury Management (TM) clients and end users to gather and prepare business/system requirement documentation.
- Worked with data analysts to populate base tables in the Production (PBDW) and Bank (BDW) data warehouses.
- Maintained different SAS applications: SAS/QC and SAS/STAT.
- Designed, developed and maintained universes using Business Objects Designer.
- Trained end users on the implementation of BOBJ reports.
- Worked with the Sales Data Control (SDC) team to integrate and maintain one master tree table that validates the complete list of end users, including Relationship Managers (RMs), Market Managers (MMs), Private Bankers (PBs), Government Bankers (GBs), etc.
- Created monthly escrow reporting for the Treasury Management team using Teradata SQL Assistant.
- Worked on the implementation and transition of Commercial Banking SRM reports and Treasury Management incentive reports from the Teradata environment to the BOBJ environment.
- Technical expertise in the analysis, design, implementation and administration of security models with row-level data security, role-based security, etc., in Business Objects.
- Managed and tracked the progress of the development effort and resolved issues with development needs.
- Designed scripts from scratch that populated the Teradata fact, dimension and lookup tables.
- Provided key data elements to populate the monthly fulfillment dashboard and executive dashboard.
- Successfully applied schema change scripts to existing objects to synchronize them with changing business rules.
- Loaded data using the Teradata loader connection, writing Teradata utility scripts (FastLoad, MultiLoad) and working with loader logs.
- Wrote various SQL queries using joins and subqueries in Teradata to develop views for reports.
- Developed stored procedures to test the ETL load per batch and populated an ETL load statistics table to estimate data growth.
- Provided a performance-optimized solution to eliminate duplicate records.
- Identified and resolved performance bottlenecks in sources, targets, mappings, the database and the UNIX file server.
- Designed the data warehouse system testing strategy and developed test plans to validate the functional requirements.
- Reviewed the BI project estimation, recruited team members and allocated work to team members.
- Provided the team with technical leadership on ETL design, development best practices, version control and customization of data loads.
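The Teradata loader work above (FastLoad scripts driven from shell) might look like the following sketch; the TDPID, credentials, table and file names are placeholders, not real systems:

```shell
#!/bin/sh
# Hypothetical example: generate a Teradata FastLoad control script for
# bulk-loading a staging table. All names and paths are placeholders.
FLOAD_SCRIPT=/tmp/load_stg_escrow.fld
DATA_FILE=/data/incoming/escrow.dat   # illustrative input path

# \$TD_PASSWORD stays literal so the credential is resolved at run time,
# while $DATA_FILE is expanded into the generated script.
cat > "$FLOAD_SCRIPT" <<EOF
LOGON tdprod/etl_user,\$TD_PASSWORD;
BEGIN LOADING stg.escrow_monthly
  ERRORFILES stg.escrow_err1, stg.escrow_err2;
SET RECORD VARTEXT "|";
DEFINE acct_id (VARCHAR(18)), escrow_amt (VARCHAR(20))
  FILE=$DATA_FILE;
INSERT INTO stg.escrow_monthly VALUES (:acct_id, :escrow_amt);
END LOADING;
LOGOFF;
EOF

echo "Generated $FLOAD_SCRIPT"
# Real invocation (requires Teradata utilities): fastload < "$FLOAD_SCRIPT"
```

The loader logs mentioned above would then be checked for error-table row counts before the batch is marked successful.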