Business Intelligence Resume
Westborough, MA
Summary
- Extensive experience in the software development life cycle, including requirements study, design, construction, testing, and support, with 9 years of strong experience in Data Warehousing/Business Intelligence and Data Integration as a Sr. ETL/Informatica Engineer and Data Warehouse Developer
- 7+ years of expertise in Extract, Transform, Load (ETL) design and implementation, including requirements gathering, data analysis/data cleansing, slowly changing dimensions, error handling, ETL mapping specifications and test plans, performance tuning, and production support - Informatica PowerCenter 8.6.1, 8.5.1, 8.1.x, 7.1.x, SQL, shell scripting, Unix (AIX, Solaris), Linux
- Four years of experience in relational/dimensional modeling with Operational Data Stores, Enterprise Data Warehouses, and Data Marts in large, terabyte-scale databases. Well versed in Star and Snowflake schema design - normalized/denormalized data structures and conceptual/logical/physical data model design using modeling tools such as Erwin and Visio
- Over 2 years in BI reporting using Cognos Enterprise 7i/8, including data mart creation, metadata modeling, and multidimensional analysis (OLAP/ROLAP/MOLAP)
- Solid experience in database programming using Oracle 10g/9i/8i, SQL, Ksh, and PL/SQL
- Hands-on experience with databases such as SQL Server 2008 R2, DB2 UDB 8.x, and Informix 7.x
- Knowledge of OBIEE 10.1.3, BO 6.5, XML, Web Services, and Brio
- Experience pulling/pushing data across relational databases, PeopleSoft HRMS/Financials, Oracle Apps/Financials, SAP Payroll, Siebel CRM, legacy/mainframe systems, flat files, and VSAM/COBOL
- Business domain knowledge in Finance, Retail, HR, Insurance, and Sales & Marketing applications
- Experience working with job scheduling tools such as Control-M, Autosys, and CRON
- Experience working with QA teams on ETL testing (Unit, SIT, and User Acceptance Tests) under CMM and ISO guidelines, in Agile-SCRUM, Waterfall, and onsite/offshore models. Quick learner with excellent problem-solving, communication, and multitasking skills.
Technical Skills
ETL: Informatica PowerCenter 8.6.1, 8.5.1 AE, 8.1.1/7.1.x/6.2, PowerConnect for PeopleSoft
BI tools: Cognos 8/7i series, OBIEE v10.1.3, Business Objects 6.5, Brio
Data Modeling: Erwin 4.5/7.3, Visio, PowerDesigner
Languages: PL/SQL, Shell Script, MF COBOL, Informix 4GL, ESQL, Java
Data Cleansing: Trillium 7, Axiom
RDBMS: Oracle 10g RAC/9i/8i, SQL Server 2008 R2, DB2 UDB 8.1, Informix 7.31
Database tools: TOAD, SQL*Navigator, WinSQL, Quest Central for DB2 UDB
O.S.: UNIX (HP-UX, Solaris, AIX), Linux 10, Windows XP/2000/NT/9x, Mainframe OS/390
Tools/Utilities: Control M, Kintana, Harvest, ClearCase, PVCS, SCCS, VSS, MS Project
Confidential, Southboro, MA Aug 2009 - May 2012
Sr. ETL/Informatica Developer
Technical Architecture
Informatica PowerCenter 8.6.1, Oracle 10g, SQL Server 2008 R2, TOAD v9.5.1, Erwin 7.3, OBIEE 10.1.3, Solaris 5.1, Control-M. Sources: PeopleSoft HRMS, flat files, Equity source tables (Oracle)
Worked on EMC Enterprise Data Warehouse (EDW) Human Resources (HR) projects such as the Applicant Tracking System (ATS), Equity, Effective-dating functionality implementation, and HRBI reporting phases. The HR reporting and analytic applications are built primarily on Oracle database repositories, with Informatica for extract, transformation, and load of data, and Oracle Business Intelligence Enterprise Edition (OBIEE 10.1.3) for reports and dashboards.
Key Accomplishments:
- Interacted with HR business SMEs to analyze stories and iterations of business tasks
- Prepared a list of technical tasks and effort estimates for the Project Manager
- Participated in logical/physical data modeling activities for projects such as ATS, Equity, Effective-dating functionality, and HRBI phases using CA ERwin
- Designed and developed high-performance ETL load plans for large data volumes using Informatica PowerCenter, Oracle 10g, SQL, Korn shell, and Control-M
- Responsible for technical system design, source-to-target mapping, data quality, error handling, job scheduling, production issue handling, and onsite/offshore team interaction
- Created database objects such as tables, synonyms, materialized views, indexes, and partitions
- Performed performance tuning on SQL queries, mappings, and sessions using techniques such as reusable objects, mapplets, and reusable caches
- Designed and developed SCD Type I & II mappings and full-refresh/incremental load strategies (see the sketch after this list)
- Developed Ksh scripts for file processing, wrapper scripts, and workflow execution
- Handled code migration of DDL/Ksh scripts via Harvest packages, along with Informatica objects
- Responsible for code review, test plans, and test cases for unit/system testing
- Worked closely with HR project teams and followed the Agile-SCRUM methodology
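To illustrate the SCD Type II pattern referenced above, here is a minimal Oracle SQL sketch; the table, column, and sequence names (dim_employee, stg_employee, dim_employee_seq) are hypothetical and not taken from the actual EDW schema. It expires the current row when a tracked attribute changes, then inserts a new current version.

```sql
-- Hypothetical SCD Type II load for an HR dimension (illustrative names only).
-- Step 1: expire current dimension rows whose tracked attributes changed in staging.
UPDATE dim_employee d
   SET d.effective_end_dt = SYSDATE,
       d.current_flag     = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_employee s
                WHERE s.employee_id = d.employee_id
                  AND (s.department <> d.department OR s.job_code <> d.job_code));

-- Step 2: insert a new current version for changed or newly arrived employees.
INSERT INTO dim_employee
       (employee_key, employee_id, department, job_code,
        effective_start_dt, effective_end_dt, current_flag)
SELECT dim_employee_seq.NEXTVAL, s.employee_id, s.department, s.job_code,
       SYSDATE, DATE '9999-12-31', 'Y'
  FROM stg_employee s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_employee d
                    WHERE d.employee_id = s.employee_id
                      AND d.current_flag = 'Y'
                      AND d.department   = s.department
                      AND d.job_code     = s.job_code);

COMMIT;
```

In practice an Informatica mapping would implement the same logic with Lookup and Update Strategy transformations; the SQL form is shown only to make the change-detection and versioning rules explicit.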
Confidential, Boston, MA Aug 2008 - July 2009
Sr ETL/Informatica Developer
Technical Architecture
Informatica PowerCenter 8.1.1, DB2 UDB 8.1, SQL, Solaris, WinSQL, Web services, XML, Control-M. Source system: PolicyStar (PolicyCommon, Policy Product/Unit, Unit/Coverages)
Plymouth Rock Companies, headquartered in Boston, MA, provides auto insurance to New England and New Jersey drivers. The purpose of the AIG Data Mart project is to enable AIG to accurately submit Massachusetts Bureau and Massachusetts DMV reporting for all of its Massachusetts Commercial Auto lines of business. AIG provides the detail necessary to quote and issue MA Commercial Automobile policies. My role was to analyze, design, construct, validate/test, and deploy the AIG Data Mart and Bureau Reporting.
Key Accomplishments:
- Interacted with BAs/SMEs for requirements gathering and analysis
- Prepared a work plan (task list and effort estimates) for the Project Manager
- Wrote the Technical System Design and source-to-target mapping based on the BRD
- Designed and developed logical/physical database designs for Staging, the AIG Bureau reporting environment, and custom-built metadata
- Involved in database object creation and alteration in DB2 UDB
- Developed ETL scripts for ODS to Staging, Staging to Data Mart, Data Mart to Submission
- Developed ETL process for migrating 56 Unix/custom scripts to Informatica mappings
- Designed the Audit table, metadata model, and error-handling process for the AIG Data Mart (see the sketch after this list)
- Developed mappings using Source Qualifier, Lookup, Joiner, Expression, Filter, Aggregator, Update Strategy, Router, Rank, Sequence Generator, Union, and Transaction Control transformations
- Created reusable objects, mapplets, parameters, and variables to facilitate code reusability
- Developed parameter files and Unix scripts to schedule loads through Control-M
- Responsible for performance monitoring and tuning with Informatica PowerCenter
- Responsible for code migration, code review, test plans, and test cases as part of unit testing
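A minimal sketch of the kind of audit/error-handling structure referenced above, written for DB2; the table and column names (etl_audit_log, run_id, etc.) and the sample values are hypothetical, not the actual AIG Data Mart design.

```sql
-- Hypothetical run-level audit table: one row per mapping execution (illustrative names).
CREATE TABLE etl_audit_log (
    run_id        INTEGER      NOT NULL,
    mapping_name  VARCHAR(100) NOT NULL,
    src_row_count INTEGER,
    tgt_row_count INTEGER,
    err_row_count INTEGER,
    run_status    CHAR(1),              -- 'R' running, 'S' success, 'F' failed
    start_ts      TIMESTAMP,
    end_ts        TIMESTAMP,
    PRIMARY KEY (run_id, mapping_name)
);

-- At the end of a load, the workflow records row counts (taken from session
-- statistics) and the final status, for example:
UPDATE etl_audit_log
   SET tgt_row_count = 12500,
       err_row_count = 3,
       run_status    = 'S',
       end_ts        = CURRENT TIMESTAMP   -- DB2 special register
 WHERE run_id = 1042
   AND mapping_name = 'm_policy_to_datamart';
```

Rejected rows would typically be routed to a companion error table keyed by the same run_id so they can be reconciled against the audit counts.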
Confidential, Braintree, MA May 2008 - July 2008
Informatica Integration Analyst
Technical Architecture
Oracle 10g, SQL, SUSE Linux 10, TOAD 8, Informatica PowerCenter Advanced Edition 8.5.1, Informatica PowerExchange, Autosys, TIBCO, Tivoli, Web services, Remedy, PVCS, and Erwin. Sources: Oracle Retail applications, non-Oracle application layer (Mainframe, Item/Customer data warehouses)
Stop and Shop is a grocery supermarket chain and a division of Ahold USA. This Proof of Concept project (NextGen Data Migration) focused on creating a core capability for migrating and integrating data between Oracle Retail and non-Oracle retail applications, leveraging the Informatica product set.
Key Accomplishments:
- Interacted with Global Architect team to study the existing Environments.
- Involved in configuring Informatica PowerCenter Advanced Edition 8.5.1
- Involved in creating new repositories and users for the Test, QA, and Prod environments
- Designed and Implemented Integration Competency Center and ETL best practices guide
- Provided error handling, audit design, and metadata design
- Developed reusable objects and mapplets to use in multiple mappings
- Creation of Users, groups and ETL folders
- Developed the ETL strategy for constructing flat-file-to-table and table-to-table load mappings
- Developed sample dynamic parameter files and Unix wrapper scripts
- Made performance tuning recommendations for mappings and sessions, such as reusable transformations and reusable caches
Confidential, Boston MA Jan 2007 - Apr 2008
Sr ETL Engineer
Technical Architecture
Informatica PowerCenter 8.1, Oracle 10g RAC, SQL, PL/SQL, Unix AIX 5.2, TOAD 8.1, Control-M, ClearCase, Kintana, Axiom, Rational ClearQuest, MS Visio. Source systems: CWE-MDB (Informix), flat files, PlanAdmin (Mainframe)
Fidelity Investments is one of the world's largest providers of financial services. The Customer Warehouse Environment (CWE) is the central enterprise data warehouse for Finance and for the customer and prospect marketing campaigns used across Fidelity Brokerage Company (FBC). Completed two projects: 1. Health Savings Account (HSA); 2. FRAIG (Fidelity Registered Investment Advisor Group) Data Mart.
Key Accomplishments:
- Participated in all phases of Fidelity's Software Delivery Methodologies (FSDM)
- Responsible for requirements gathering, technical system design, development, testing, code review/migration, signoff, UAT, and job scheduling; led the offshore team
- Interacted with the Data Architecture group and the project team to finalize technical design strategies
- Responsible for effort estimation and the technical project plan for the Project Manager
- Designed and implemented a metadata repository and reusable components
- Designed and developed high-performance ETL plans and Staging and Target table designs to load large volumes of data (flat file to table, table to table, table to flat file) using Informatica PowerCenter, parameter files, PL/SQL, Korn shell, and Control-M
- Implemented slowly changing dimension mappings and full-load/incremental-load strategies
- Created database objects such as tables, synonyms, materialized views, indexes, and partitions (see the sketch after this list)
- Designed error-checking mechanisms to reduce failures and monitored error logs through parameter files. Developed several Korn shell scripts to integrate workflows with Control-M
- Performed performance tuning on targets, sources, mappings, and sessions using techniques such as reusable objects, mapplets, reusable caches, and partitioning
- Developed test plans, test scenarios, and test cases as part of unit/integration testing
- Responsible for setting up production jobs using Control-M, Korn shell scripts, and workflows
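To illustrate the kinds of database objects listed above, here is a minimal Oracle 10g DDL sketch; the table, index, and view names are hypothetical and do not reflect the actual CWE schema. It shows a range-partitioned staging table, a local index, and a materialized view for a common aggregate.

```sql
-- Hypothetical range-partitioned table for large-volume loads (illustrative names).
CREATE TABLE stg_account_txn (
    account_id   NUMBER(12)    NOT NULL,
    txn_dt       DATE          NOT NULL,
    txn_amt      NUMBER(14,2)
)
PARTITION BY RANGE (txn_dt) (
    PARTITION p_2007q1 VALUES LESS THAN (DATE '2007-04-01'),
    PARTITION p_2007q2 VALUES LESS THAN (DATE '2007-07-01'),
    PARTITION p_max    VALUES LESS THAN (MAXVALUE)
);

-- Local index, so partition maintenance does not invalidate the whole index.
CREATE INDEX ix_txn_account ON stg_account_txn (account_id) LOCAL;

-- Materialized view pre-aggregating monthly totals for reporting queries.
CREATE MATERIALIZED VIEW mv_txn_monthly
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
AS
SELECT account_id, TRUNC(txn_dt, 'MM') AS txn_month, SUM(txn_amt) AS total_amt
  FROM stg_account_txn
 GROUP BY account_id, TRUNC(txn_dt, 'MM');
```

Range partitioning by load date also lets older partitions be truncated or exchanged during full-refresh runs without touching current data.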
Confidential, Shrewsbury, MA July 2006 - Dec 2006
Sr ETL developer
Technical Architecture
Oracle 10g, Informatica PowerCenter 7.1.3, Solaris 5.2, shell script, Cognos ReportNet. Source systems: PeopleSoft Financials 8.0, PeopleSoft HRMS
Worked on the UMASS Finance Data Mart project, focused on General Ledger, Grant Management, and PeopleSoft HRMS, covering data extraction, data integration, key generalization, data scrubbing, and preparation for loading into the FIN Data Mart to integrate the five University campuses in Massachusetts. Informatica extracts data from the PeopleSoft Financials & HRMS source systems and loads it into the Oracle database; Cognos ReportNet is used for reporting.
Key Accomplishments:
- Interacted with SME/Data modeler
- Defined Logical/Physical star schema and staging database environments
- Developed high performance ETL plans to load large volumes of data
- Designed and developed ETL processes to extract data from PeopleSoft Financial tables
- Designed and implemented slowly changing dimension types for the organizational hierarchy and GL/GM account mappings across the five campuses
- Developed various complex Informatica mappings implementing SCD Type 2 for the Fund, Employee, Project, Vendor, and Contract dimensions, and implemented an incremental load strategy for the fact table loads (Actuals, Encumbrance & Budget)
- Designed a bulk loading utility for large volumes of source data
- Detected bottlenecks and performed troubleshooting and tuning of mappings, queries, and sessions for better performance and efficiency. Handled error-handling routines using a Process Control table and automated error-checking mechanisms to reduce failures and monitor error logs (see the sketch after this list)
- Created ETL batch jobs, worklets, and workflows to automate data warehouse loads. Designed a strategy to integrate workflows with the Control-M scheduler.
- Delivered project documentation such as source-to-target mapping documents, ETL design specs, the ETL test strategy, and production operational documents
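A minimal sketch of a Process Control-table-driven incremental load of the kind referenced above; the table and column names (etl_process_control, fact_gl_actuals, stg_gl_actuals) are hypothetical rather than the actual UMASS data mart objects.

```sql
-- Hypothetical process-control table: one row per target, tracking the last load point.
CREATE TABLE etl_process_control (
    target_name   VARCHAR2(60) PRIMARY KEY,
    last_load_dt  DATE         NOT NULL,
    load_status   VARCHAR2(10)
);

-- Incremental fact load: pick up only source rows newer than the recorded watermark.
INSERT INTO fact_gl_actuals (ledger_key, account_key, posted_dt, amount)
SELECT s.ledger_key, s.account_key, s.posted_dt, s.amount
  FROM stg_gl_actuals s
 WHERE s.posted_dt > (SELECT c.last_load_dt
                        FROM etl_process_control c
                       WHERE c.target_name = 'FACT_GL_ACTUALS');

-- After a successful load, advance the watermark so a failed run can simply be rerun.
UPDATE etl_process_control c
   SET c.last_load_dt = (SELECT MAX(s.posted_dt) FROM stg_gl_actuals s),
       c.load_status  = 'SUCCESS'
 WHERE c.target_name = 'FACT_GL_ACTUALS'
   AND EXISTS (SELECT 1 FROM stg_gl_actuals);

COMMIT;
```

In the Informatica implementation the watermark would usually be read into a mapping variable or parameter file rather than queried inline, but the control flow is the same.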
Confidential, Blue Bell, PA May 2004 - June 2006
Sr. Informatica Consultant/Data Warehouse Developer
Technical Architecture
Informatica PowerCenter 7.1.1, Oracle 9i, PL/SQL, PowerConnect, Kintana 5.0, Solaris 5.2, PVCS, Erwin. Source systems: Oracle Apps & Financials, Legacy, PeopleSoft HRMS, Siebel CRM & flat files
The Unisys Corporate BI Application is a web-based analytical reporting system for corporate Finance, Sales, and Marketing applications. The BI applications are built primarily on Oracle database repositories, with Informatica for extract, transformation, and load of data, and the Business Objects suite for reporting. It consists of various data marts such as AP/Supplier, AR, Bulletin Board, Client Reference Intelligence System (CRIS), Exchange Rate, Marketing Dashboard, MDM, Project Accounting, PO, Sales Metrics, SCORES, PackTrack, SDW Reporting, Supplier, and Webtrex.
Key Accomplishments:
- Responsible for planning, knowledge transfer, knowledge acquisition, offshore coordination, and designing, implementing, and supporting all aspects of the BI applications
- Responsible for creating conceptual, logical, and physical data models for the Staging and Warehouse schemas using ERwin, working closely with Business Analysts
- Designed and developed high-performance ETL/Informatica plans and Staging and Target table designs to load large volumes of data (flat file to table, table to table, table to flat file) using Informatica PowerCenter, parameter files, Korn shell, and Control-M
- Created mapplets and reusable transformations; used mapping and workflow parameters and variables for running sessions and workflows. Created sessions, workflows, and worklets.
- Reviewed existing complex mappings to resolve coding defects and missing functionality.
- Optimized the existing Customer Dim load process, which used to take over an hour; the new plan runs in about 8 minutes alongside other processes (see the sketch after this list)
- Developed back-end routines using Oracle SQL, PL/SQL stored procedures, packages, triggers, and SQL*Loader. Created Unix shell scripts to automate routine tasks.
- Detected bottlenecks and improved the performance of the Informatica maps.
- Performed performance tuning on mappings, sessions, targets, and sources using techniques such as reusable transformations, reusable caches, and partitioning
- Provided on-call support for the nightly load process using Control-M, Unix scripting, Informatica, and Oracle
- Prepared Technical design document, ETL checklist, Test Plan & Test cases
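To illustrate the kind of set-based rewrite behind the Customer Dim optimization mentioned above (the actual Unisys code is not reproduced; all object names here are hypothetical), a single MERGE can replace a per-row lookup-and-update pattern and let the database process the staging data in one pass.

```sql
-- Hypothetical set-based Type 1 dimension refresh: one MERGE instead of
-- row-by-row lookup/update logic (illustrative names only).
MERGE INTO dim_customer d
USING stg_customer s
   ON (d.customer_id = s.customer_id)
WHEN MATCHED THEN
    UPDATE SET d.customer_name  = s.customer_name,
               d.region         = s.region,
               d.last_update_dt = SYSDATE
WHEN NOT MATCHED THEN
    INSERT (customer_key, customer_id, customer_name, region, last_update_dt)
    VALUES (dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.region, SYSDATE);

COMMIT;
```

The same effect can be achieved inside Informatica by moving from a lookup-heavy design to a joiner-based, bulk-target mapping; the SQL form is shown only to make the set-based idea concrete.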
Confidential, Charlotte, NC July 2003 - May 2004
ETL Developer/Data Modeler
Technical Architecture
Informatica PowerCenter 7.1, DB2 UDB 8.2, Erwin, HP-UX, shell script. Sources: SumTotal, PeopleSoft
The implementation of the Learning Data Warehouse at Hewitt for Marriott involves the movement of data from various sources into staging, from which the data warehouse is built. The primary source system, the SumTotal Learning Management System (Docent), resides on an Oracle database.
Key Accomplishments:
- Responsible for analyzing the source systems (Docent & PS HRMS) for functional and data requirements, identifying business logic, and updating the data dictionary
- Responsible for creating conceptual, logical, and physical data models for the Staging and Warehouse schemas using ERwin, working closely with Business Analysts
- Designed key dimensions/facts such as User, Activity, Instance, Location, Domain, and Learning Fact, and designed the Star Schema data model based on the Marriott reporting requirements
- Performed gap analysis for the Source, Staging, and Warehouse tables and created DDL scripts
- Proposed a solution for row/role-level security table implementation (created a security table based on SumTotal roles - Manager, Training/Learning Coordinator, Instructor); see the sketch after this list
- Designed and developed high-performance ETL/Informatica plans and Staging and Target table designs to load large volumes of data (flat file to table, table to table, table to flat file) using Informatica PowerCenter, parameter files, Korn shell, and Control-M
- Created mapplets and reusable transformations; used mapping and workflow parameters and variables for running sessions and workflows. Created sessions, workflows, and worklets.
- Detected bottlenecks and performed troubleshooting and tuning of mappings, queries, and sessions for better performance and efficiency. Handled error-handling routines using a Process Control table and automated error-checking mechanisms to reduce failures and monitor error logs
- Created ETL batch jobs, worklets, and workflows to automate data warehouse loads. Designed a strategy to integrate workflows with the Control-M scheduler.
- Delivered project documentation such as source-to-target mapping documents, ETL design specs, the ETL test strategy, and production operational documents
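A minimal sketch of the row/role-level security approach described above, in DB2-style SQL; the table and column names (sec_user_domain, fact_learning) are hypothetical, not the actual Marriott design.

```sql
-- Hypothetical role-based row-level security table (illustrative names only).
CREATE TABLE sec_user_domain (
    user_id    VARCHAR(30) NOT NULL,
    role_name  VARCHAR(30) NOT NULL,   -- e.g. MANAGER, COORDINATOR, INSTRUCTOR
    domain_id  INTEGER     NOT NULL,
    PRIMARY KEY (user_id, domain_id)
);

-- Reports join through the security table so each user sees only the domains
-- granted to their role; the user id is supplied by the reporting tool at run time.
SELECT f.activity_id, f.completion_dt, f.score
  FROM fact_learning f
  JOIN sec_user_domain s
    ON s.domain_id = f.domain_id
 WHERE s.user_id = 'jdoe';
```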
Confidential, Memphis, USA May 2001 - June 2003
Data warehouse developer
Technical Architecture
Oracle 8i, PL/SQL, SQL*Loader, Cognos Enterprise 6.2, shell script, TOAD, Sun Solaris, Mainframe JCL & XCOM, Erwin 3.0. Source systems: SAP Payroll, CRM, Legacy, JD Edwards & SQL Server (Lawson)
The implementation of the HR Data Warehouse at IP involves the movement of data from six source systems to an Operational Data Store (ODS), from which the data warehouse is built and maintained. SAP is the primary source system. The Analysis/Presentation Delivery Subsystem is represented in end-user standard/ad-hoc reporting and PowerCubes. It has 12 inbound/outbound interfaces.
Key Accomplishments:
- Responsible for designing, developing, testing, implementing, and supporting all aspects of the HR data warehouse
- Developed Cognos Impromptu and PowerPlay reports over the Oracle database.
- Created and maintained Catalog in Impromptu Administrator to create reports
- Created List, Sub, and Cross-Tab reports using Impromptu and distributed the catalog to users for ad-hoc reporting. Built the dimensional data model and Cognos catalog, and set up user access and security. Created dimension maps and multidimensional cubes using IQDs in PowerPlay Transformer, and developed PowerCubes such as Headcount, Termination, New Hire, and Org
- Designed and built Cognos PowerPlay cubes and set up automated update scripts.
- Implemented security for PowerCubes and Impromptu Web Reports over the intranet using Access Manager and Upfront, and handled catalog administration
- Published Impromptu report to UpFront by using IWR server
Confidential, USA July 2000 - Apr 2001
Programmer/Analyst
Technical Architecture
Informix OnLine DE 7.3, I-SQL 7.20, I-SPL, Microfocus COBOL 5.x, shell script, SCCS, HP-UX 10.2
Worked on the Sam's Club relocation project related to Shipment and Order. The system was built using Microfocus COBOL, ESQL, and Informix OnLine Dynamic 7.3 on HP-UX.
Key Accomplishments:
- Conducted requirement study and design sessions with System Analyst
- Developed, modified, and implemented applications such as inventory management, shipping/receiving, warehouse location systems, and bar-code printing and scanning using MF COBOL
- Designed, wrote, and unit tested Microfocus COBOL programs in a test environment
Confidential, Malaysia July 1999 - June 2000
Source System specialist/ETL developer
Client: Agilent Technologies, Malaysia
Technical Architecture
Oracle 8i, PL/SQL, Informix 7.13, HP-UX 10.x, Ardent DataStage, Windows NT. Source systems: Legacy systems (MANMAN), WorkStream (Informix v7.31), IFEP. ETL: PL/SQL, shell scripts, BRIO 6.0
The Semiconductor Product Group (SPG) Mercury Data Warehouse project focused on delivering worldwide manufacturing information for DSS. Its purpose was to generate problem identification and loss tracking for WorkStream users, which helped reduce cycle times.
Key Accomplishments:
- Responsible for identifying user requirements and entity relationship diagrams (ERDs)
- Designed and developed ETL process using existing Workstream tables
- Responsible for data staging process using Shell Scripts to execute SQL*Loader Scripts
HCL Technologies (Infosystems) Ltd (SEI CMM Level 5 certified), Chennai, India
March 1997 - May 1999
Confidential, Mountain View, CA
Project WS071 (Work Stream Open V7.1) Sep 98 - June 99
Technical Architecture
Oracle 8, Pro*C, Unix, Microfocus COBOL 4.1, ESQL, RCS, Informix 7.23, TIBCO RV, HP-UX 10.x
Worked on the WorkStream Open 7.1 (WSO) project at HCL Infosystems Ltd, India, for AMAT, whose main business is Manufacturing Execution Systems applications. Developed new modules such as a Remote Transactions GUI, Product Support requirements, and a Reporting Server, along with bug fixing and Y2K conversion.
Key Accomplishments:
- Designed, wrote, debugged, and unit tested MF COBOL programs
- Developed and modified all required UNIX Shell scripts for the database and the interfaces
- Converted, tested, and migrated several batch COBOL programs to meet Year 2000 standards.
Confidential, Dallas, TX (onsite) Mar 98 - Dec 98
Programmer/Analyst
Technical Architecture
Oracle 8i, PL/SQL, Ksh, Informix 7.23, Informix 4GL, HP-9000/HP-UX 10.x, Endura
The Warehouse Management System (WMS) project was a re-hosting project for EXE Technologies.
The WMS was developed with an Informix database and Informix 4GL as the front end.
Key Accomplishments:
- Replaced Informix 4GL programs and the Informix database with PL/SQL and Oracle 8i
- Developed shell scripts and stored procedures as required by the application.
- Imported/exported/migrated enterprise data from Informix databases to Oracle
Confidential, CA Mar 97 - Feb 98
Oracle 8i, Informix 7.1, HP-UX, TIBCO RV 4.2, RCS, Microfocus COBOL 4.1, Pro*C, & shell scripts
The primary objective of the project was to develop and enhance the capabilities of WSO 6.2.3 to WSO 7.0: a) Y2K conversion, b) database conversion, c) bug fixing
Key Accomplishments:
- Created and modified reports and query scripts in Informix, MF-COBOL, I-SPL, and Ksh
- Simulated, fixed, and tested several complex bugs in the COBOL programs
Education:
- Master's in Computer Applications -
- Bachelor's in Mathematics -
Certification/Achievements:
- Certified in Cognos ReportNet 1.1 and Cognos Impromptu series v7.x
- Training in Data Warehousing concepts and MicroStrategy
- Received a Milestone Achievement award (100% IT KPI Success award) for Human Resource Warehouse production support from the client International Paper, Memphis
- Received customer appreciation for saving International Paper $2.8 million by creating a Tennessee jobs tax credit ad-hoc report in the HR Data Warehouse