Sr. Oracle Data Integrator Consultant Resume
NYC
Summary of Qualifications:
- Worked on all phases of the data warehouse development lifecycle, from requirements gathering through testing, implementation, and support.
- Exceptional background in the analysis, design, development, customization, implementation, and testing of software applications and products.
- Demonstrated expertise with ETL tools, including Oracle Data Integrator (previously known as Sunopsis), Ab Initio, Informatica, and DataStage, and with RDBMS platforms such as Oracle, DB2, Teradata, Sybase, and SQL Server.
- Expertise with all ODI tools: Topology Manager, Security Manager, Designer, and Operator.
- Strong leader with experience training developers and advising technical groups on ETL best practices.
- Coded SQL/PL-SQL, SQL*Loader, and UNIX shell scripts for high-volume data warehouse instances (a sample loader wrapper sketch follows this list).
- Excellent technical and analytical skills with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.
- Good knowledge of the OBIEE BI tool.
- Good knowledge of Oracle Single Sign-On and Oracle WebLogic Server.
- Team player with excellent communication and problem-solving skills.
- Experience with well-known clients in lines of business such as insurance, retail, banking, and shipping.
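As an illustration of the SQL*Loader/shell work noted above, below is a minimal sketch of a load wrapper; the connect string, control file, and all paths are hypothetical placeholders, not values from any actual engagement.

#!/bin/ksh
# Illustrative SQL*Loader wrapper for a high-volume stage load.
# DB_CONN, the control file, and all paths are placeholder names.
DB_CONN="dw_user/dw_pass@DWPROD"
CTL_FILE=/etl/ctl/stg_orders.ctl    # maps the flat file to the stage table
DATA_FILE=/etl/in/orders.dat
LOG_FILE=/etl/log/stg_orders.log
BAD_FILE=/etl/bad/orders.bad

# Direct-path load for throughput; abort if rejects exceed the threshold.
sqlldr userid=$DB_CONN control=$CTL_FILE data=$DATA_FILE \
       log=$LOG_FILE bad=$BAD_FILE direct=true errors=50

if [ $? -ne 0 ]; then
    echo "SQL*Loader load failed; see $LOG_FILE" >&2
    exit 1
fi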
TECHNICAL SKILLS:
ETL: Oracle Data Integrator, Sunopsis, Ab Initio, Informatica PowerMart, Informatica PowerCenter, and SQL*Loader.
BI: MicroStrategy, Business Objects, Cognos.
RDBMS: Oracle 7.x/8.x/9i/10g, MS Access 2000, SQL Server 2000/7.0/6.5, Teradata V2R3/V2R4/V2R5, DB2 8.1/UDB.
Languages: SQL, PL/SQL, UNIX shell scripting, COBOL, Transact-SQL, Java, C, C++, Perl.
Web: HTML, ASP, and JSP.
OS: MS-DOS, Windows 95/98/NT/2000 Server, UNIX (Sun Solaris), IBM AIX 5.x/4.x, HP-UX, Linux.
UNIX Editors: UltraEdit, vi.
Database Tools: Teradata SQL Assistant 6.1, TPump, FastLoad, MultiLoad, Developer 2000 Forms 4.5/5.0/6i, Reports 6i, SQL*Loader, TOAD.
Data Modeling Tools: Erwin 3.0/3.5/4.0/4.1, Visio.
PROFESSIONAL EXPERIENCE:
Confidential, NYC July 09 - Present
Sr. Oracle Data Integrator Consultant
Confidential is a global rating agency committed to providing the world’s credit markets with independent and prospective credit opinions, research, and data. With 50 offices worldwide, the agency’s global expertise, built on a foundation of local market experience, spans capital markets in over 150 countries. The agency is widely recognized by investors, issuers, and bankers for its credible, transparent, and timely coverage.
Worked on two projects: loading SAP BW general ledger data into the Oracle HUB, and loading SUN general ledger data into the Oracle HUB.
Responsibilities:
- Installed ODI. Set up the ODI connection with Oracle, MS SQL Server and flat files.
- Troubleshot the ODI connection to Oracle by fixing the tnsnames.ora file, the TNS listener, and DNS entries.
- Set up the Topology including physical architecture, logical architecture and context.
- Created new models for the data sources: flat files, MS SQL Server, and Oracle.
- Reverse-engineered the data sources and targets.
- Worked closely with the project manager and data architect. Assisted the data architect in design by performing source data analysis, refining the requirements documents, and creating source-to-target mappings.
- Designed the ETL flow for the SUN accounting general ledger system, reusing the existing logic of the TM1 accounting system to meet tight deadlines.
- Coordinated with the offshore UK team. Helped them define the source view for the SUN accounting system and resolve access issues.
- Developed interfaces to load the data from flat files, SQL Server to stage and from stage to Oracle HUB.
- Did SQL and PL/SQL programming for ODI and Oracle.
- Created PL/SQL stored procedures, functions and triggers.
- Developed interfaces for loading the lookup and transactional data.
- Installed and configured TOAD for Oracle and MS SQL Server.
- Created ODI packages and scenarios using interfaces, variables, and procedures.
- Used ODI tools such as OdiFileMove, OdiFileAppend, and OdiFileCopy.
- Implemented logic to archive the source files with a date stamp affixed after a successful load (see the sketch after this list).
- Performance tuned the ODI interfaces.
- Optimized the knowledge modules to improve the functionality of the process.
- Performed unit and integration testing. Created various test scenarios to test the application.
- Delivered assignments ahead of deadlines.
- Conducted code reviews and planned the knowledge transfer to client.
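A minimal sketch of the post-load archive step referenced above, assuming a flat-file source directory; all paths and the *.csv pattern are hypothetical. In ODI the same step could be driven by OdiFileMove inside a package; the shell form is shown for brevity.

#!/bin/ksh
# Archive source files with a date stamp after a successful load.
# SRC_DIR, ARC_DIR, and the file pattern are illustrative placeholders.
SRC_DIR=/etl/in
ARC_DIR=/etl/archive
STAMP=$(date +%Y%m%d_%H%M%S)

for f in "$SRC_DIR"/*.csv; do
    [ -f "$f" ] || continue                      # skip if nothing matched
    mv "$f" "$ARC_DIR/$(basename "$f").$STAMP"   # affix the date stamp
done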
Environment: Oracle Data Integrator 10.1.3.5, Oracle Database 10g Enterprise Edition Release 10.2.0.4.0, TOAD 9.7.2.5, TOAD for SQL Server 4.5.0.854, Java, Python, XML, MS Excel 2007.
Confidential, Secaucus, NJ May 07 - June 09
Sr. Oracle Data Integrator Consultant
Confidential is one of the world’s leading transportation (shipping) companies. Designed, developed, and implemented a central data warehouse (CDW) and various marts for the Accounts Receivables (AR) and Detention & Demurrage modules. Implementation of the AR Unbilled Snapshot Mart, AR Unpaid Mart, AR Invoice History Mart, and Detention & Demurrage Mart increased the efficiency of the billing department, expedited the collection of invoices, and enabled evaluation of end-to-end turn times from invoice creation to payment collection.
Responsibilities:
- Did extensive data analysis using source data and the UI to understand the data, assist the data modeler in designing the databases, identify bad data, and help the business improve and refine the requirements.
- Designed the dimensional model based on the requirements.
- Worked closely with the business (GBPM). Provided the business with options and alternative solutions when data anomalies arose.
- Involved in CDW and Mart design meetings. Created Technical design documents.
- Created ODI design documents from the existing Informatica mappings. Used these design documents in development of ODI interfaces/packages.
- Performance-tuned the interfaces and queries through the use of explain plans, indexes, analytic ranking, Oracle hints, exchange partitions, etc. (see the sketch after this list).
- Created ODI (Sunopsis) interfaces for loading the data from Sybase to CDW and from CDW to different Marts.
- Participated in code reviews. Did code promotions from dev to test, staging, and production environments using ODI.
- Wrote unit test cases, test scenarios, and unit test documents; unit tested all identified scenarios.
- Helped the QA team write appropriate test scenarios and verified them.
- Did error analysis for continued improvement of the system by catching data anomalies and code issues.
- Planned and prioritized tasks to meet deliverable deadlines.
- Participated in all the phases of project development, testing, deployment and support.
- Used Oracle analytic functions, global temporary tables, etc.
- Did SQL and PL/SQL programming for ODI and Oracle.
- Worked extensively with ODI Designer, Operator, and Metadata Navigator.
- Good understanding of ODI architecture and the installation of Topology Manager, Security Manager, agents, Designer, Operator, and the master and work repositories.
- Used Security Manager for managing privileges to the users.
- Led a team spanning the AR and Detention & Demurrage modules.
- Involved in release meetings and created release notes.
- Involved in rotational production support.
- Complied with the JSOX standards.
- Interacted with BI team during design and development of facts and dimensions to accommodate all the reporting requirements.
- Used Cognos cubes, stand-alone reports, and drill-through reports.
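As a concrete illustration of the exchange-partition tuning mentioned above, here is a minimal sketch; the connect string, table, and partition names are hypothetical, not the client's actual objects.

#!/bin/ksh
# Swap a fully loaded staging table into a fact-table partition.
# Because the exchange is a data-dictionary operation, it avoids
# rewriting the rows, which is what makes the mart load fast.
sqlplus -s dw_user/dw_pass@CDW <<'EOF'
ALTER TABLE ar_unpaid_fact
  EXCHANGE PARTITION p_200905
  WITH TABLE ar_unpaid_stage
  INCLUDING INDEXES
  WITHOUT VALIDATION;
EXIT;
EOF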
Environment: Oracle 10g, Sybase 12.5, TOAD, Oracle Data Integrator 10.1.3.4.7 (Sunopsis V3), DB2 9.1, Informatica 7.x, Cognos 8, OSCAR, UNIX, MS Excel 2003.
Confidential, Chevy Chase, MD May 05 - Apr 07
Sr. ETL Consultant
Worked on the “Load Policy Data” and “Claims Matching” projects for the EDW. The purpose of the Load Policy Data project was to ensure that all Omnibus Plus (Oasis policy) data elements were loaded into the Enterprise Data Warehouse (EDW) on a daily basis. It added policy and party data elements to the EDW Central Repository (CR).
The objective of the Claims Matching project was to give business users the ability to relate the policy information that was in effect at the time of a claim. The flexibility provided by this relationship enables users to analyze many business measures, such as loss ratios and claim frequencies, by any combination of claim/policy/rating variables available in the EDW.
Responsibilities:
- Assisted in data value analysis for the policy project. Split the Omnibus Plus data into 37 data files with DMLs corresponding to the mainframe copybooks, then loaded those files into DB2 tables using the DB2 load utility (see the sketch after this list).
- Wrote the Software Design Document (SDD) for the Claims Matching project.
- Was involved in design reviews and code reviews for different projects.
- Extracted data from the mainframe, cleansed and transformed it using Sunopsis, and loaded it into the Oracle database.
- Reused the existing transformations written in COBOL by implementing them in Sunopsis interfaces.
- Created interfaces for loading policy elements, TFS, Forms, CFR, DRVR CRS, etc. into the CR.
- Created policy and party sgk interfaces and generic conditioning interfaces.
- Created input/output balancing, discount balancing, premium balancing interfaces and reports.
- Used all the Sunopsis components: Topology Manager, Security Manager, Designer, Operator, and Repository Explorer.
- Used Sunopsis Designer to create the interfaces.
- Used Sunopsis Operator for code migration and scheduling jobs.
- Used Sunopsis Repository explorer for job monitoring.
- Optimized the Knowledge modules for increasing the efficiency of jobs.
- Migrated the Ab Initio graphs into Sunopsis interfaces.
- Performance tuned the Oracle and DB2 queries.
- Used DB2 utilities such as Load (client), Import, and AutoLoader.
- Updated the Requirements Traceability Matrix.
- Involved in production support of the daily EDW cycle. Responsible for fixing any problems and completing the PIF cycle on time.
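A minimal sketch of loading one of the copybook-split files into DB2, as described in the first bullet above; the database name, credentials, file, and table are hypothetical placeholders.

#!/bin/ksh
# Load one of the split delimited files into a DB2 table.
# EDWDB, the credentials, and EDW.POLICY are illustrative names.
db2 connect to EDWDB user edw_user using edw_pass

# LOAD (rather than IMPORT) for bulk volume; REPLACE truncates first.
db2 "LOAD FROM /edw/in/policy_01.del OF DEL
     SAVECOUNT 10000
     MESSAGES /edw/log/policy_01.msg
     REPLACE INTO EDW.POLICY"

db2 connect reset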
Environment: IBM AIX 5.2, Oracle 9i, Sunopsis V3, DB2 EEE 8.2, GDE 1.13.22, Co-op 2.13.1, EME, Quest Central 3.1.1, MS Access 2003, COBOL, Merant Tracker 8.0.0.4, TSO on z/OS 1.6, ISPF 5.6.
Confidential, Atlanta, GA Oct 04 - Apr 05
ETL Consultant
Worked as an ETL Developer in the full life cycle development of the data warehouse for the "Services EDW for Services Business" and “Store Response Reporting” projects.
Responsibilities:
- Participated in all the phases of the project from Planning, Discovery, Requirements, Design, Construction, Testing, Pilot and Deployment.
- Helped data modelers identify data sources and define the formulas for calculating order retail and order cost.
- Wrote detailed design specs for ETL. Converted the functional specs written by the data modelers into detailed design specs, which were used in developing the Ab Initio code.
- Resolved the lock issue in the production environment, where graphs were in an inconsistent state with the EME.
- Well versed with air commands.
- Developed graphs using Ab Initio.
- Did performance tuning of the graphs by implementing the parallelism features and by following the good design standards.
- Used EME for version control. Defined and checked in project parameters.
- Migrated code from development to QA and from QA to production using DEWS, a Home Depot in-house product used for migration.
- Involved in designing and developing jobs, and performed data loads and transformations using different stages of DataStage and pre-built routines, functions, and macros.
- Used DataStage Designer to develop various jobs to extract, cleanse, transform, integrate, and load data into the data warehouse.
- Used DataStage Director to schedule and run the server jobs, monitor their scheduling, and validate their components.
- Created HDDTM plans for dependency analysis and used Maestro for scheduling jobs.
Environment: IBM AIX 5.1, IBM DB2 8.1, GDE 1.12.6, Co-op 2.11.8, DataStage v6, EME, Maestro 6.0, HDDTM, OS/390.
Confidential, Glen Allen, VA May 04 - Oct 04
ETL Consultant
Worked as an ETL Developer in the full life cycle development of the "Data Environment Setup" (DES) project for the Mortgage line of business.
Responsibilities:
- Created the logical and physical data models using Erwin, following the standards prescribed by the Data Model Review Board of Capital One.
- Created the data dictionary and pre-estimated the growth of the data volume.
- Created the detailed Process flow design document following the standards prescribed by the Script Review Board of Capital One.
- Wrote UNIX scripts for batch loads involving sed/awk (see the sketch after this list).
- Wrote Teradata DDL scripts.
- Used Ab Initio as an ETL tool for developing graphs.
- Created dynamic Teradata BTEQ scripts covering collect statistics, database space issues, skew information, etc., along with back-out scripts.
- Generated reports such as daily triggers, daily summary reporting, and regulatory reporting.
- Created TPump scripts for loading data from the flat files into Teradata.
- Involved in the Data Migration from the user space to the production environment.
- Involved in the batch loads.
- Assisted DBA and EPC UNIX team in setting up the dev/test/prod environment.
- Performed unit, integration and assisted in UAT testing.
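A minimal sketch of the batch-load scripting referenced above, combining an awk cleanup pass with a dynamic BTEQ statistics step; the paths, 12-field layout, table, and logon values are hypothetical placeholders.

#!/bin/ksh
# Clean a pipe-delimited extract, then recollect statistics via BTEQ.
# IN, CLEAN, and DES_DB.MORTGAGE_STG are illustrative names.
IN=/des/in/mortgage.dat
CLEAN=/des/work/mortgage.clean

# Keep only rows with the expected field count; strip carriage returns.
awk -F'|' 'NF == 12' "$IN" | tr -d '\r' > "$CLEAN"

# Dynamic BTEQ step: refresh optimizer statistics after the load.
bteq <<EOF
.LOGON tdprod/des_user,des_pass;
COLLECT STATISTICS ON DES_DB.MORTGAGE_STG COLUMN (ACCT_ID);
.LOGOFF;
.QUIT;
EOF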
Environment: Teradata V2R5, GDE 2.12, Co-op 1.12.5.2, HP-UX B.11.00, Teradata SQL Assistant 6.1, Oracle 8i, Erwin 4.1, Microsoft Visio Professional 2002 (10.0.225), Windows XP.
Confidential, Columbus, OH Dec 02 - Apr 04
ETL Consultant
Worked as an ETL Developer building the data warehouse for the Finance Target Environment (FTE) and Anti-Money Laundering (AML) projects.
Responsibilities:
- Created a workspace, called a sandbox, for the project to work on in Ab Initio.
- Developed Ab Initio graphs for a few of the project's interfaces and extracts.
- Created custom functions for common requirements such as getting the dimension number and natural keys.
- Created a custom component for getting the package ID of the graph being run, for validation and control purposes.
- Planned the strategy for, and carried out, the reverse migration of Ab Initio graphs from production to development and QA for the AML project.
- Deployed and test-ran the graphs as executable Korn shell scripts in the application system (see the wrapper sketch after this list).
- Followed the best design principles, efficiency guidelines and naming standards in designing the graphs.
- Responsible for check in and checkout of the graphs and parameters.
- Used transform components like reformat, rollup, normalize and join.
- Used lookup files and incorporated lookup functions into the common functions.
- Used partition and departition components.
- Did error handling using appropriate error-handling functions.
- Wrote UNIX scripts for backups, archiving, wrapper programs, etc.
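A minimal sketch of the kind of Korn shell wrapper used to run a deployed graph and archive its log; the graph name and all paths are hypothetical placeholders.

#!/bin/ksh
# Run a deployed Ab Initio graph (itself a generated ksh script),
# capture its log, and fail loudly on a non-zero return code.
GRAPH=/apps/aml/run/load_aml_txn.ksh       # illustrative deployed-graph path
LOG=/apps/aml/log/load_aml_txn.$(date +%Y%m%d).log

$GRAPH > "$LOG" 2>&1
RC=$?

if [ $RC -ne 0 ]; then
    echo "Graph failed with rc=$RC; see $LOG" >&2
    exit $RC
fi

# Archive the day's log alongside other run artifacts.
gzip "$LOG"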
Environment: Ab Initio GDE 2.12, Ab Initio Co-op 1.12.5.2, IBM AIX, DB2 8.1, Informatica 5.x, Windows NT/2000, Oracle 8i.
Confidential, Detroit, MI Jan 01 - Dec 02
Database Developer
Confidential, India May 99 - Aug 00
Database Programmer
Education:
- Master of Science in Computer Science
- Bachelor of Technology
Relevant Courses:
- Database Design
- Advanced Database: Data Warehousing
- Software Quality Assurance and Testing
- UNIX
- Networks
Certifications and Training:
- Oracle 9i Developer Certified Associate - Introduction to Oracle 9i: SQL
- Informatica 7.1 training from Ducat.