Sr. Data Modeler Resume
SUMMARY:
- 7+ years of technical experience in Oracle PL/SQL and Data Modeling, developing effective and efficient solutions and ensuring client deliverables within committed timelines.
- Experienced Data Modeler with strong Conceptual, Logical and Physical Data Modeling skills, data profiling and data quality skills, and experience with JAD sessions for requirements gathering, creating data mapping documents, and writing functional specifications and queries.
- Experienced in Dimensional Data Modeling, Star/Snowflake schemas, and FACT & Dimension tables (see the star-schema sketch after this summary).
- Expertise in AWS resources such as EC2, S3, EBS, VPC, ELB, SNS, RDS, IAM, Route 53, Auto Scaling, CloudFormation, CloudWatch and Security Groups.
- Understanding of the development of Conceptual, Logical and Physical Models for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP).
- Skillful in Data Analysis using SQL on Oracle, MS SQL Server, DB2, Teradata, Hive and AWS.
- Experienced in troubleshooting SQL queries, ETL jobs, and data warehouse and data mart data store models.
- Assisted in creating communication materials based on data for key internal / external audiences.
- Expert in documenting the Business Requirements Document (BRD), generating the UAT Test Plans, maintaining the Traceability Matrix and assisting in Post Implementation activities.
- Enterprise Data Modeler with a deep understanding of developing Enterprise Data Models that strictly meet Normalization Rules, as well as Enterprise Data Warehouses using the Kimball and Inmon Data Warehouse Methodologies.
- Knowledgeable in Best Practices and Design Patterns, Cube design, BI Strategy and Design, and 3NF Modeling.
- Extensive work experience on Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) system environment.
- Handled performance tuning, conducted backups and ensured the integrity and security of databases; managed PostgreSQL in the AWS environment and Aurora PostgreSQL.
- Expert in SQL queries and PL/SQL packages.
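A minimal star-schema sketch of the dimensional modeling pattern referenced in this summary; all table and column names are hypothetical and used only for illustration.

    -- Hypothetical star schema: one fact table keyed to two dimensions.
    CREATE TABLE dim_customer (
        customer_key   INTEGER PRIMARY KEY,   -- surrogate key
        customer_id    VARCHAR(20) NOT NULL,  -- natural/business key
        customer_name  VARCHAR(100),
        region         VARCHAR(50)
    );

    CREATE TABLE dim_date (
        date_key       INTEGER PRIMARY KEY,   -- e.g. 20240131
        calendar_date  DATE NOT NULL,
        fiscal_quarter VARCHAR(6)
    );

    CREATE TABLE fact_sales (
        customer_key   INTEGER NOT NULL REFERENCES dim_customer (customer_key),
        date_key       INTEGER NOT NULL REFERENCES dim_date (date_key),
        quantity       INTEGER,
        sales_amount   DECIMAL(12,2)          -- grain: one row per order line
    );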
PROFESSIONAL EXPERIENCE:
Confidential
Sr. Data Modeler
Responsibilities:
- Working as a Sr. Data Modeler to generate data models using Erwin r9.64 and develop relational database systems.
- Researched, evaluated and deployed new tools, frameworks and patterns to build sustainable Big Data platforms for our clients.
- Translated the business requirements into workable functional and non-functional requirements for Confidential at a detailed production level using Workflow Diagrams, Sequence Diagrams, Activity Diagrams and Use Case Modelling.
- Worked with AWS cloud services (VPC, EC2, S3, RDS, Redshift, Data Pipeline, EMR, DynamoDB, WorkSpaces, Lambda, Kinesis, SNS, SQS).
- Involved in creating Physical and Logical models using Erwin and built the data model as per the requirements.
- Designed the grain of facts depending on reporting requirements.
- Involved with data analysis, primarily identifying data sets, source data, source metadata, data definitions and data formats.
- Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design.
- Expert in data analysis, design, development, implementation and testing using data conversions and Extraction, Transformation and Loading (ETL) with Oracle, SQL Server and other relational and non-relational databases.
- Applied Normalization/Denormalization techniques for optimum performance in relational and dimensional database environments.
- Documented all data mapping and transformation processes in the Functional Design documents based on the business requirements.
- Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch and profile data from legacy DB2 and SQL Server database systems (see the sketch after this list).
- Highly proficient in Data Modeling, retaining concepts of RDBMS, Logical and Physical Data Modeling up to Third Normal Form (3NF), and Multidimensional Data Modeling (Star schema, Snowflake modeling, Facts and Dimensions).
- Used data analysis techniques to validate business rules and identify low-quality and missing data in the existing data.
- Migrated an existing on-premises application to AWS; used AWS services such as EC2 and S3 for small data set processing and storage, and maintained Hadoop on AWS EMR.
- Worked with the data compliance and data governance teams to maintain data models, metadata and data dictionaries, and to define source fields and their definitions.
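A minimal sketch of the kind of ad-hoc profiling query described above, checking row counts, null rates, distinct values and cross-reference matches for a source column before it is mapped into the warehouse; the names legacy_customer, customer_xref and customer_id are hypothetical, and customer_xref is assumed to hold at most one row per legacy id.

    -- Hypothetical profiling query against a legacy source table.
    SELECT src.source_system,
           COUNT(*)                          AS total_rows,
           COUNT(*) - COUNT(src.customer_id) AS null_ids,
           COUNT(DISTINCT src.customer_id)   AS distinct_ids,
           SUM(CASE WHEN x.legacy_customer_id IS NULL THEN 1 ELSE 0 END) AS unmatched_rows
    FROM   legacy_customer src
           LEFT JOIN customer_xref x
                  ON x.legacy_customer_id = src.customer_id  -- cross-reference join
    GROUP  BY src.source_system
    ORDER  BY total_rows DESC;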
Confidential
Sr. Data Modeler
Responsibilities:
- Involved in data mapping specifications to create and execute detailed system test plans; the data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
- Documented logical, physical, relational and dimensional data models.
- Designed the Data Marts in dimensional data modeling using star and snowflake schemas.
- Prepared documentation for all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules and glossary terms as they evolved and changed during the project.
- Coordinated with the DBA on database builds and table normalization and denormalization.
- Identified the entities and the relationships between them to develop the Conceptual Model using ERWIN, then developed the Logical Model from the conceptual model.
- Responsible for different data mapping activities from source systems.
- Involved in data model reviews with internal data teams, business analysts and business users, explaining the data model to make sure it is in line with business requirements.
- Involved with data profiling activities for new sources before creating new subject areas in the warehouse.
- Extensively worked on Data Governance, i.e. metadata management, master data management, data quality and data security.
- Redefined many attributes and relationships in the reverse-engineered model and cleansed unwanted tables/columns in the Teradata database as part of data analysis responsibilities.
- Performed complex data analysis in support of ad-hoc and standing customer requests, and delivered data solutions in report/presentation format according to customer specifications and timelines.
- Used a reverse engineering approach in Erwin to redefine entities, relationships and attributes in the data model as per new specifications, after analysing the database systems currently in use.
- Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design.
- Created the test environment for the Staging area and loaded the Staging area with data from multiple sources.
- Involved in SQL development, unit testing and performance tuning, and in ensuring testing issues are resolved on the basis of defect reports.
- Tested the ETL process both before and after the data validation process, and tested the messages published by the ETL tool and the data loaded into various databases.
- Created UNIX scripts for file transfer and file manipulation.
- Tested the database to check field size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata (see the sketch after this list).
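A hedged sketch of the metadata cross-check described above, comparing physical column sizes in the Oracle data dictionary against documented field sizes; the app_field_spec mapping table is an assumption introduced for illustration.

    -- Flag columns whose physical length differs from the documented size.
    SELECT c.table_name,
           c.column_name,
           c.data_type,
           c.data_length     AS db_length,
           s.expected_length AS spec_length
    FROM   user_tab_columns c
           JOIN app_field_spec s
             ON s.table_name  = c.table_name
            AND s.column_name = c.column_name
    WHERE  c.data_length <> s.expected_length;  -- mismatches feed defect reports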
Confidential
Data Analyst
Responsibilities:
- Defined the scope of the project, gathered business requirements and performed GAP analysis.
- Implemented a Data Lake using Hadoop architecture and loaded data into Hive tables from the Hadoop Distributed File System (HDFS) to provide SQL access to Hadoop data (see the sketch after this list).
- Created rules using IBM InfoSphere Master Data Management for the MDM solution.
- Worked on IBM InfoSphere Data Architect for creating logical data models.
- Involved in converting user requirements into business requirements, functional requirements and technical requirements, and created business process models from the requirement specs.
- Managed requirements using DOORS.
- Defined the business logic for the web services used by the SOA-based application.
- Worked to create Physical Data Designs/First-Cut Data Models for various projects/contracts.
- Extensively worked on performance tuning for the project using IBM InfoSphere DataStage 8.5.
- Worked on different data formats such as flat files, SQL files, databases, XML schemas and CSV files.
- Involved in the project cycle plan for the data warehouse: source data analysis, data extraction process, transformation and ETL loading strategy design.
- Involved in running Hadoop streaming jobs to process terabytes of text data, and worked with different file formats such as Text, SequenceFile, Avro, ORC and Parquet.
- Used Flume to collect log data from different sources and transfer it to Hive tables, using different SerDes to store data in JSON, XML and Sequence file formats.
- Laid out the architecture design for the Data Lake and successfully implemented projects using Data Lake strategies.
- Analyzed system specifications and business requirements for a full understanding of the project, to comply with corporate rules and regulations.
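An illustrative HiveQL sketch of the HDFS-to-Hive pattern described above; the table name, HDFS path and SerDe choice are assumptions, not project specifics.

    -- Hypothetical external Hive table over raw JSON files landed in HDFS,
    -- exposed through a JSON SerDe so the data is queryable with SQL.
    CREATE EXTERNAL TABLE IF NOT EXISTS web_logs (
        event_time STRING,
        user_id    STRING,
        action     STRING
    )
    ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
    STORED AS TEXTFILE
    LOCATION '/data/raw/web_logs/';

    -- Optional: compact into a columnar format (e.g. ORC) for analytics.
    CREATE TABLE web_logs_orc STORED AS ORC AS
    SELECT * FROM web_logs;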
Confidential
PL/SQL Developer
Responsibilities:
- Involved in identifying the process flow, the work flow and the data flow of the core systems.
- Worked extensively on user requirements gathering and gap analysis.
- Involved in the full development cycle of Planning, Analysis, Design, Development, Testing and Implementation.
- Developed PL/SQL triggers and master tables for automatic creation of primary keys (see the sketch after this list).
- Involved in data analysis for data conversion, including data mapping from source to target database schemas, specification and writing of data extract scripts/programming of data conversion, in test and production environments.
- Developed advanced PL/SQL packages, procedures, triggers, functions, indexes and collections to implement business logic using SQL Navigator.
- Generated server-side PL/SQL scripts for data manipulation and validation, and materialized views for the remote instance.
- Used the SQL Server SSIS tool to build high-performance data integration solutions, including extraction, transformation and load packages for data warehousing.
- Extracted data from XML files and loaded it into the database.
- Designed and developed Oracle Forms & Reports, generating up to 60 reports, and was involved in building, debugging and running forms.
- Involved in data loading and extracting functions using SQL*Loader.
- Performed database administration of all database objects including tables, clusters, indexes, views, sequences, packages and procedures.
- Designed and developed all the tables and views for the system in Oracle, and designed and developed forms validation procedures for query and update of data.
- Extensive learning and development activities.
- Handled errors using exception handling extensively for ease of debugging and for displaying error messages in the application.
- Tested all forms and PL/SQL code for logic corrections.
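A minimal PL/SQL sketch of the automatic primary-key pattern mentioned above, using a sequence with a BEFORE INSERT trigger; the orders table and object names are hypothetical.

    -- Hypothetical sequence and trigger that populate the primary key on insert.
    CREATE SEQUENCE orders_seq START WITH 1 INCREMENT BY 1;

    CREATE OR REPLACE TRIGGER orders_pk_trg
    BEFORE INSERT ON orders
    FOR EACH ROW
    BEGIN
        IF :NEW.order_id IS NULL THEN
            :NEW.order_id := orders_seq.NEXTVAL;  -- assign surrogate key (Oracle 11g+ syntax)
        END IF;
    END;
    /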