Sr. Data Modeler Resume
Paramus, NJ
SUMMARY
- Overall 9 years of IT experience involving various aspects of Business Intelligence, data warehousing, and mainframe technologies. Well versed in the complete Software Development Life Cycle (SDLC), including requirement analysis, documentation, coding, testing, implementation, and maintenance.
- 6 years of experience with special emphasis on System Analysis, Design, Development and Implementation of ETL methodologies in all phases of Data Warehousing and Relational Databases applications using IBM InfoSphere Information Server 8.7/8.5/8.1(DataStage, Quality Stage) and Ascential DataStage 7.5 (Designer, Manager, Administrator, Director).
- 3 years of experience working with legacy applications and mainframe technologies.
- 4+ years in Logical, Physical, Conceptual Data Modeling and performance tuning of Oracle, DB2, SQL queries and ETL processes.
- Proficient in handling multiple operational data sources like Oracle, SQL Server, DB2, Teradata, Netezza, flat files, IMS DB and delimited files for extraction, staging and data warehouse environments.
- Expert in Data Warehousing techniques for Data Analysis, Cleansing, Transforming, Testing, Slowly Changing Dimensions, Change Data Capture and Data Loading across source systems.
- Experience in developing UNIX shell scripts for automating Extraction, Transformation and Loading processes for various feeds, and enabling the execution of jobs in the production environment using schedulers like Autosys.
- Experience in gathering requirements, designing and implementing complex reporting solutions using Cognos 8.4 and Series 7 v4.
- Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, Queryman, WinDDI).
- Excellent understanding of Information Analyzer for importing Metadata to Repository and Data Profiling.
- Experience in designing and implementation of Star Schema, Snowflake Schema and Multi-Dimensional Modeling.
- Optimized DataStage jobs utilizing parallelism: Partitioning and Pipelining features of Parallel Extender.
- Proven ability to implement technology based solutions for business problems.
- Excellent communication and interpersonal skills; a flexible, self-directed, and energetic team player with strong problem-solving skills.
- Strong proficiency in different SDLC methodologies including Agile and linear models.
- Certified IBM application developer in DB2 programming.
- Attended training in IBM InfoSphere Advanced DataStage Parallel Framework v9.1.
TECHNICAL SKILLS
- IBM InfoSphere Information Server 8.x (DataStage, QualityStage, Information Analyzer, Metadata Workbench)
- Ascential DataStage Enterprise and DataStage 7.x (Designer, Manager, Director, Administrator)
- IBM Information Server Suite
- Ascential QualityStage 7.5.2/7
- Cognos
- Erwin
- Oracle 10g/9i/8i
- DB2 UDB 7.1/8.1
- SQL Server 2000/2005
- MS Access
- Teradata V2R5
- SQL
- Netezza
- PL/SQL
- HTML
- DHTML
- C
- C++
- Java
- UNIX Shell Scripting
- Linux 7.1/7.2/8
- Windows 2000 Server/Advanced Server
- Windows 95/98/2000/XP
- OS/390
- Windows NT 4.0
- MS-DOS 6.22
- TOAD 7.x/8.x for Oracle.
PROFESSIONAL EXPERIENCE
Confidential - Paramus, NJ
Sr. Data Modeler
RESPONSIBILITIES:
- Participate in strategic initiatives relating to system enhancements and development.
- As lead, work with the client to identify systems requirements and contribute to a formal Systems Requirements Analysis document.
- Contribute to the corresponding delivery specifications and technical design specifications.
- Manage individual tasks and deliverables in order to complete projects on schedule.
- Define and analyze problems in terms of systems requirements and modify system design.
- Perform coding and unit testing per standards.
- Participate in quality control measures including peer code reviews.
- Work closely with the other technology departments within Confidential.
- Maintain user/external group relationships (developers, business analysts, data architects, DBAs).
- As tech lead, guide and supervise team members' tasks and communicate deliverables to the clients.
- Managed Level 1 support for day-to-day operations and issues of the enterprise data warehouse.
- Analyzed the various sources and designed and developed parallel jobs for extracting data from different sources such as SQL Server, Teradata, DB2, Oracle, Netezza, sequential files, and flat files.
- Loaded the data into the Teradata database using load utilities (FastExport, FastLoad, MultiLoad, and TPump).
- Assisted in batch processes using FastLoad, BTEQ, UNIX shell scripts and Teradata SQL to transfer, clean up, and summarize data.
- Involved in understanding the scope of application, present schema, data model and defining relationship within and between the groups of data.
- Good understanding of Ralph Kimball's data warehouse methodology.
- Involved in various projects related to Data Modeling, System/Data Analysis, Design and Development for both OLTP and Data warehousing environments.
- Facilitated data requirement meetings with business and technical stakeholders and resolved conflicts to drive decisions.
- Worked extensively with Erwin and ER Studio on several projects in both OLAP and OLTP applications.
- Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
- Comprehensive knowledge and experience in process improvement, normalization/de-normalization, data extraction, data cleansing, data manipulation.
- Implemented Slowly Changing Dimensions - Type I & II in Dimension tables as per the requirements.
- Familiarity and experience in the work environment consisting of Business analysts, Production Support teams, Subject Matter Experts, Database Administrators and Database developers
- Good understanding of views, synonyms, indexes, joins, and partitioning.
- Experience in extracting, transforming and loading (ETL) data from spreadsheets, database tables and other sources using DataStage.
- Created, documented and maintained logical and physical database models in compliance with enterprise standards, and maintained corporate metadata definitions for enterprise data stores within a metadata repository.
- Established and maintained comprehensive data model documentation, including detailed descriptions of business entities, attributes, and data relationships.
- Developed source-to-target mapping spreadsheets for the ETL team with physical naming standards, data types, volumetrics, domain definitions, and corporate metadata definitions.
- Exceptional communication and presentation skills and an established track record of client interactions.
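A minimal sketch of the kind of batch-load wrapper described above: clean a delimited feed in shell, then generate a BTEQ import script for a Teradata staging table. The feed path, table name, column layout, and logon placeholder are all illustrative assumptions, not details from this engagement.

```shell
#!/bin/sh
# Hedged sketch of a Teradata batch-load wrapper. All names here (feed
# path, staging table, logon) are hypothetical placeholders.
FEED=/tmp/daily_feed.txt
BTEQ_SCRIPT=/tmp/load_staging.bteq

# Stand-in feed data so the sketch is self-contained (one blank line,
# one duplicate row, to give the cleanup step something to do).
printf 'a|A\n\nb|B\na|A\n' > "$FEED"

# Typical pre-load file manipulation: drop blank lines, dedupe.
grep -v '^$' "$FEED" | sort -u > "${FEED}.clean"

# Generate the BTEQ control script. In production, `bteq < "$BTEQ_SCRIPT"`
# would run it under the job scheduler; it is only generated here.
cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,password_placeholder;
.IMPORT VARTEXT '|' FILE=/tmp/daily_feed.txt.clean;
.QUIET ON
.REPEAT *
USING (cust_id VARCHAR(18), cust_name VARCHAR(60))
INSERT INTO stg.customer_feed (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
.QUIT;
EOF

echo "rows ready to load: $(wc -l < "${FEED}.clean")"
```

In practice a wrapper like this would be parameterized per feed and invoked by the scheduler (e.g. Autosys), with the BTEQ return code checked to decide success or restart.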
Environment: InfoSphere DataStage 8.x/7.5.2 (Parallel Extender), QualityStage, Oracle 10g/9i, FTP, SQL, PL/SQL, Toad 9.0, Erwin, UNIX, UDB DB2, Autosys, Netezza, Cognos 8, Teradata
Confidential, Chicago IL
Sr. Data modeler
Responsibilities:
- Studied requirements for the Data warehouse to be developed
- Conducted one-on-one sessions with business users to gather data warehouse requirements
- Analyzed database requirements in detail with the project stakeholders by conducting Joint Requirements Development sessions
- Developed a conceptual model using Erwin based on requirements analysis.
- Developed normalized logical and physical database models to design an OLTP system for insurance applications.
- Created a dimensional model for the reporting system by identifying required dimensions and facts using Erwin r7.1.
- Used forward engineering to create a physical data model with DDL that best suits the requirements from the logical data model.
- Worked with database administrators, business analysts and content developers to conduct design reviews and validate the developed models.
- Identified, formulated and documented detailed business rules and Use Cases based on requirements analysis
- Facilitated development, testing and maintenance of quality guidelines and procedures along with necessary documentation
- Responsible for defining the naming standards for data warehouse
- Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from legacy DB2, Teradata and SQL Server database systems.
- Exhaustively collected business and technical metadata and maintained naming standards.
- Used Erwin for reverse engineering to connect to existing databases and the ODS to create graphical representations in the form of entity relationships and elicit more information.
- Used DataStage Designer and Metadata Manager to create source and target definitions, design mappings, create repositories, and establish users, groups and their privileges.
- Used Erwin's Model Mart for effective model management (sharing, dividing and reusing model information and designs) for productivity improvement.
- Extracted data from the source databases (Oracle, SQL Server, DB2, flat files) using DataStage to load into a single data warehouse repository.
- Facilitated the development of testing procedures, test cases and User Acceptance Testing (UAT).
- Integrated the work tasks with relevant teams for smooth transition from testing to implementation
- Developed star schema data model using suitable dimensions and facts. Involved in analyzing the scope of application, identifying the relationship within and between the groups of data.
- Developed several Test Plans, UNIX Scripts for Unit/Team Testing.
- Involved in Unit testing, Functional testing and Integration testing and provide process run time.
- Created and used DataStage Shared Containers and Local Containers for DS jobs and for retrieving error log information.
- Developed system test plans and test cases for unit testing. Performed data validation, unit testing, and performance analysis for the designed components.
Environment: InfoSphere DataStage 8.x/7.5.2 (Parallel Extender), QualityStage, Oracle 10g/9i, FTP, SQL, PL/SQL, Toad 9.0, Erwin, UNIX, shell scripting, UDB DB2, Autosys, Cognos 8, Teradata
Confidential
ETL Developer
Responsibilities:
- Involved in designing and development of data warehouse environment.
- Developed Parallel extender jobs for extracting data from the various sources.
- Used MetaStage to import and export table definitions shared by multiple users, and applied Information Analyzer to perform primary key and foreign key analysis.
- Extracting data from relational tables, flat file sources and loading to the staging area.
- Designed jobs using different parallel job stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Dataset, Change Data Capture, and Aggregator.
- Used diverse partitioning methods (Auto, Hash, Entire, etc.) and prepared test cases for unit and system testing to check data reliability.
- Extensively used DataStage to load data from Oracle, DB2 and Flat Files to Oracle, also developed PL/SQL programs and UNIX Scripts.
- Used Information Analyzer for column analysis, primary key analysis and foreign key analysis.
- Involved in unit and system integration testing, and moved jobs from the development environment to the production environment using DataStage Administrator.
- Used Toad for writing SQL routines and functions. Used UNIX shell scripting for file manipulation.
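As an illustration of the shell-based file manipulation mentioned above, a minimal archival sketch; the directory layout and feed name are assumptions for the example, not details from this project.

```shell
#!/bin/sh
# Sketch of routine ETL file manipulation: move a processed feed into a
# date-stamped archive directory. Paths and the feed name are illustrative.
SRC_DIR=/tmp/etl_inbox
ARCHIVE_DIR=/tmp/etl_archive/$(date +%Y%m%d)

mkdir -p "$SRC_DIR" "$ARCHIVE_DIR"
printf 'order_id,amount\n1001,25.00\n' > "$SRC_DIR/orders.csv"  # stand-in feed

# Archive each csv feed with a load timestamp so reruns never overwrite.
for f in "$SRC_DIR"/*.csv; do
    stamp=$(date +%H%M%S)
    mv "$f" "$ARCHIVE_DIR/$(basename "$f" .csv).$stamp.csv"
done

ls "$ARCHIVE_DIR"
```

The timestamped naming keeps every load's input reproducible, which is what makes restarting a failed batch safe.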
Environment: Oracle 10g/9i, IBM/Ascential DataStage 7.5 (Enterprise Edition)/7.0/6.x (DataStage Manager, DataStage Director, DataStage Designer), UNIX, Windows 2000 Server, SunOS 5.9
Confidential, Greenville SC
Mainframe Programmer
Responsibilities:
- Requirement Analysis, System Study and Estimation
- Preparation of Functional Specification/Design and Review
- Coding for the Enhancement tasks
- Peer Review and Review of Unit/System Test Strategy/Plan/Results
- User Acceptance Testing Support and Bug Fixing
- Project Communication to get the business requirements from the client.
- Continuous Support activities of Project
- Ensure adherence to Confidential Standards in Release/Change Management
- Ensure Quality of Deliverables- Components and Documents
- Ensure overall Delivery and Operational Excellence in the Customer Support modules of the project
- Provide on-call production support for the Unified Disability System and resolve abends.
Environment: IBM OS/390, MVS/ESA, E-COBOL, JCL, VSAM, REXX, DB2, CICS, File-AID, IBM utilities, TSO/ISPF, ChangeMan, XPEDITER, BMC, QMF, SPUFI, DB2 UDB v8.2, Cognos Impromptu 7.2
Confidential, St. Louis
Mainframe Programmer
Responsibilities:
- Worked as a team member on the project.
- Involved in the analysis, design, implementation and testing of enhancements and bug fixes.
- Production support of Host jobs and Data Warehouse ETL Processes.
- Peer Review and Review of Unit/System Test Strategy/Plan/Results
- User Acceptance Testing Support and Bug Fixing
- Project Communication with Onsite to get the business requirements from the client.
- Continuous Support activities of Project
- Ensure adherence to Cognizant Standards in Release/Change Management
- Ensure Quality of Deliverables- Components and Documents
- Ensure overall Delivery and Operational Excellence in the Customer Support modules of the project
Environment: IBM OS/390, Windows NT, MVS/ESA, E-COBOL, JCL, VSAM, REXX, DB2 v8.1, CICS, File-AID, IBM utilities, TSO/ISPF, ChangeMan, XPEDITER, BMC, QMF, SPUFI, Data Warehouse Center, DB2 UDB v8.2, Cognos Impromptu 7.2, SQL Server
Confidential, San Francisco, CA
Programmer
Responsibilities:
- As a key team member, responsible for understanding the business & the technology.
- Analysis, design, development and testing for maintenance and enhancement work.
- Performance tuned production and test jobs and FTP transmissions.
- Created complex JCL using IBM utilities, IDCAMS, and DFSORT.
Environment: IBM OS/390, Windows NT, MVS/ESA, E-COBOL, JCL, VSAM, REXX, DB2 v8.1, CICS, File-AID, IBM utilities, TSO/ISPF, ChangeMan, XPEDITER, BMC, QMF, SPUFI