Sr. Data Modeler/Cloud Data Warehouse Analyst/Data Analyst Resume
Canton, MI
SUMMARY
- Overall 10+ years of IT experience with a focus on data modeling, working as a Data Modeler in the analysis, design, and data modeling (conceptual, logical and physical) of Online Transaction Processing and Online Analytical Processing (OLTP & OLAP) systems in the financial domain.
- Experience with data modeling, entity-relationship diagrams, transactional database modeling and data warehousing using tools such as Erwin, Power Designer, Oracle Designer, Embarcadero ER/Studio and Microsoft Visio.
- Experience analyzing and providing feedback on monthly financial results and key performance measures; prepared management reports against forecast, budget and/or prior year.
- Hands-on experience in architecting and data modeling for AWS Redshift, Oracle RDS, PostgreSQL on RDS and AWS Aurora.
- Good knowledge of cloud data warehouse, big data and data science tools including R, Python, Hadoop, Hive, Azure SQL Data Warehouse/Data Lake/Blob Storage and AWS Redshift.
- Proficient in all phases of Software Development Life Cycle (SDLC) including in-depth knowledge of agile methodology, Enterprise Reporting Life Cycle, SQL Server Migrations, change control rules, problem management and escalation procedures.
- Extensive knowledge of ER modeling, dimensional modeling (star schema, snowflake schema), Enterprise Data Warehousing (EDW) and OLAP tools.
- Experience providing financial expertise: budgeting, projections and forecasts.
- Strong experience in Data Analysis, Data Profiling, Data Migration, Data Integration and Metadata Management Services.
- Experience with writing scripts in Oracle databases to extract data for reporting and analysis.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems.
- Experience in data modeling with expertise in creating star and snowflake schemas, fact and dimension tables, and logical and physical models, and in converting logical models into physical models through forward and reverse engineering using various data modeling tools.
- Worked with and extracted data from various database sources such as Oracle, SQL Server, DB2, MySQL, Sybase, MS Access and Teradata.
- Strong background in various data modeling tools, including Confidential, ER/Studio and Power Designer.
- Experience with Business Intelligence and the full development cycle of a data warehouse, from data profiling and requirements gathering through design, implementation and maintenance.
- Experience working with Agile, Waterfall and Rapid Application Development (RAD) data modeling methodologies.
- Performed data analysis and data profiling using complex SQL on various source systems including Oracle.
- Efficient in dimensional data modeling for data mart design, with a strong understanding of data warehouse concepts, fact tables, dimension tables, and Slowly Changing Dimensions (see the schema sketch after this list).
- Proficient in Normalization (1NF, 2NF and 3NF) and De-normalization techniques for improved database performance in OLTP, OLAP, data warehouse and data mart environments.
- Strong programming skills in a variety of languages such as Python and SQL.
- Experience in Hadoop/Big Data related technologies such as MapReduce, HDFS, Apache Spark, Scala, Sqoop, and Hive for the storage, processing, querying and analysis of data.
- Experienced in consulting with users to meet specific technical and business needs and resolve problems.
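A minimal sketch of the star-schema pattern described in the summary, using hypothetical fact and dimension tables (fact_sales, dim_customer, dim_date and all columns are illustrative names, not drawn from any specific engagement):

```sql
-- Dimension table: one row per customer, surrogate key as primary key
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,    -- surrogate key
    customer_id   VARCHAR(20) NOT NULL,   -- natural/business key
    customer_name VARCHAR(100),
    segment       VARCHAR(30)
);

-- Date dimension keyed by a YYYYMMDD integer
CREATE TABLE dim_date (
    date_key       INTEGER PRIMARY KEY,   -- e.g. 20240131
    full_date      DATE NOT NULL,
    fiscal_quarter VARCHAR(6)
);

-- Fact table: grain = one row per sale, foreign keys to each dimension
CREATE TABLE fact_sales (
    customer_key INTEGER NOT NULL REFERENCES dim_customer (customer_key),
    date_key     INTEGER NOT NULL REFERENCES dim_date (date_key),
    sale_amount  NUMERIC(12,2),
    quantity     INTEGER
);
```

Snowflaking this design would normalize a dimension further, e.g. splitting segment out of dim_customer into its own table.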
TECHNICAL SKILLS
Data Modeling Tools: Erwin r9.1/9.5/9.6/x, ER/Studio, Oracle Designer, Sybase, Power Designer.
OLAP Tools: Microsoft Analysis Services, Business Objects and Crystal Reports 9.
Programming Languages: SQL, T-SQL, PL/SQL, Base SAS, HTML, XML, VB.NET, C, C++, UNIX and Shell Scripting
Database Tools: Microsoft SQL Server 2012/2014, MySQL, Oracle 12c/11g/10g/9i, DB2, MS Access and Teradata
Methodologies: Ralph Kimball, Star Schema and Snowflake Schema.
Packages: Microsoft Office Suite, Microsoft Project 2010, SAP and Microsoft Visio.
ETL Tools: Informatica, Data Junction, Ab Initio, DataStage, SSIS, BODS
Other Tools: SAS Enterprise Guide, Web Service, Code Flow/TFS
PROFESSIONAL EXPERIENCE
Confidential, Canton, MI
Sr. Data Modeler/Cloud Data Warehouse Analyst/Data Analyst
Responsibilities:
- Implemented a dimensional model (logical and physical data modeling) in the existing architecture using ER/Studio.
- Created ER diagrams using the Open Sphere model and implemented concepts such as star schema modeling, snowflake schema modeling, and fact and dimension tables.
- Managed, designed, and created the star schema and snowflake schema for a financial data mart using Erwin.
- Designed and developed Oracle PL/SQL procedures and Linux/UNIX shell scripts for data import/export and data conversions.
- Extensively made use of triggers, tablespaces, pre/post SQL, sequences, materialized views, procedures and packages in data models.
- Collected, reviewed, and analyzed financial data, worked on the development of new financial models, and completed detailed financial forecasts and budgets.
- Designed the ER diagrams and logical model (relationships, cardinality, attributes, and candidate keys).
- Wrote procedures and functions using dynamic SQL, and wrote complex SQL queries using joins, subqueries and correlated subqueries.
- Implemented Slowly Changing Dimension (SCD) Type I and Type II logic in dimensional models per requirements (see the sketch after this section).
- Very strong in data warehousing concepts such as dimensions, facts, surrogate keys, ODS, staging areas and cubes; well versed in the Ralph Kimball and Bill Inmon methodologies.
- Enhanced existing Oracle Forms and Reports, adding new features based on user requirements.
- Generated DDL statements for the creation of new ER/Studio objects such as tables, views, indexes, packages and stored procedures.
- Created tables, views, secure views and user-defined functions in the Snowflake cloud data warehouse.
- Designed the data marts in Erwin using Ralph Kimball's dimensional data mart modeling methodology.
- Used Erwin and Visio to create 3NF and dimensional data models and published them to the business users and ETL/BI teams.
- Developed scripts to migrate data from proprietary database to PostgreSQL.
- Developed logical and physical data models and mapped logical entities to the enterprise data model.
- Architected Extract, Transform and Load (ETL) processes to populate an operational data store from various sources, including databases such as Teradata, SQL Server and Oracle, as well as spreadsheets and flat files.
- Designed and developed star schema model for target database using ER/Studio.
- Extensively worked with SQL, PL/SQL, SQL*Plus, SQL*Loader, query performance tuning, DDL scripts, and database objects such as tables, views, indexes, synonyms and sequences.
- Applied the appropriate level of abstraction in designs and confirmed that data designs supported the integration of data and information flow across systems and platforms.
- Implemented the cloud data warehouse solution for descriptive, diagnostic and predictive analytics.
- Responsible for data model development, review and approval; used Agile methodology for data warehouse development.
- Worked with normalization and de-normalization concepts and design methodologies such as the Ralph Kimball and Bill Inmon data warehouse approaches.
- Analyzed and made recommendations on the impact of designs as they related to common conceptual models.
- Developed Python applications to extract data from sources such as PostgreSQL and export it in the formats desired by the client.
- Worked with data governance teams to create and maintain documentation related to Metadata and data dictionaries of different kinds of data.
- Updated and enforced data architecture policies, standards and procedures.
- Designed the logical data model from the technical design documents and translated it into a physical data model using Erwin 9.6/x.
- Loaded diverse data types (structured, JSON, flat files, etc.) into the Snowflake cloud data warehouse.
- Created mappings to load data into an AWS S3 bucket using the Informatica S3 connector, and populated Snowflake from the S3 bucket using complex SQL queries.
- Coordinated Data Migration and data conversion efforts from legacy platforms to Oracle; documented data mapping, data profiling and data transformation requirements for ETL and data integration.
- Generated and documented metadata while designing OLTP and OLAP system environments.
- Ensured data objects adhered to defined standards and best practices.
- Promoted data models into production metadata environments.
- Worked on Database physical modeling and normalization using Erwin and Oracle Data Modeler.
- Created user-defined stored procedures in PL/SQL Developer for data conversion from flat files to the Oracle database.
- Worked with both relational and dimensional data modeling, data standards and data model practices.
- Performed maintenance of current data diagrams and metadata to enable further enhancements.
- Provided Level of Effort estimates and activity status; identified dependencies impacting deliverables, risks and mitigation plans, and issues and impacts.
- Mapped the data elements from the new system to the downstream BIW (Business Intelligence Warehouse) and produced a source-to-target mapping document.
- Participated in the development of cloud data warehouses and business intelligence solutions.
- Provided support to the Bank Financial department in modeling their risk with retail and commercial real estate sectors.
- Designed database solution for applications, including all required database design components and artifacts.
Environment: Erwin, SSAS, T-SQL, SQL Server, PostgreSQL, LINUX, MDM, Oracle PL/SQL, ER/Studio, ETL, Normalization and De-normalization, Metadata, Star-Schema Modeling, UNIX, Snowflake Schema Modeling, Agile, etc.
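A minimal sketch of the SCD Type II handling mentioned in this role, assuming a hypothetical dim_customer dimension with effective-date and current-flag columns, a stg_customer staging table, and segment as the tracked attribute (all names are illustrative; the surrogate key is assumed to be auto-generated):

```sql
-- Step 1: close out the current version of any customer whose tracked attribute changed
UPDATE dim_customer d
SET    effective_end = CURRENT_DATE,
       is_current    = FALSE
FROM   stg_customer s
WHERE  d.customer_id = s.customer_id
  AND  d.is_current  = TRUE
  AND  d.segment    <> s.segment;       -- the Type II (history-tracked) attribute

-- Step 2: insert a new current version for changed or brand-new customers
INSERT INTO dim_customer
       (customer_id, customer_name, segment, effective_start, effective_end, is_current)
SELECT s.customer_id, s.customer_name, s.segment, CURRENT_DATE, NULL, TRUE
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id
      AND d.is_current  = TRUE
WHERE  d.customer_id IS NULL;           -- no current row: new customer or one just expired
```

A Type I attribute, by contrast, would be overwritten in place with a plain UPDATE, with no new row created.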
Confidential, Jacksonville, FL
Data Modeler/Cloud Data Warehouse Analyst/Database Analyst
Responsibilities:
- Interacted with business users to analyze business processes and requirements, transforming requirements into conceptual, logical and physical data models.
- Developed financial models using Excel macros and data manipulation techniques; adjusted the Excel-based models monthly to incorporate changes in the political and economic environment and, in turn, the models' assumptions.
- Developed logical/physical data models using the ER/Studio tool across the subject areas based on the specifications, and established the referential integrity of the system.
- Developed data mapping, data governance, transformation and cleansing rules for various processes involving OLTP, ODS and OLAP.
- Developed the financial reporting requirements by analyzing the existing Business Objects reports.
- Worked with ETL to create source-to-target mappings and performed validation of the source-to-target mappings.
- Applied core business analysis concepts to logical data modeling, data flow processing and database design.
- Efficient in building enterprise data warehouses using the Kimball and Inmon methodologies.
- Developed data marts in Snowflake cloud data warehouse.
- Performed source system analysis, database design and data modeling for the warehouse layer using MDM concepts and for the package layer using dimensional modeling.
- Created the Physical Data Model (PDM) for the OLAP application using ER/Studio.
- Developed and maintained a data dictionary to create metadata reports for technical and business purposes.
- Loaded data from the Linux file system to HDFS; imported and exported data into HDFS and Hive using Sqoop; implemented partitioning, dynamic partitions and buckets in Hive (see the Hive sketch after this section).
- Identified and Defined Key Performance Indicators in SSAS.
- Played an active role in high-performance cloud data warehouse architecture and design; developed complex data models in Snowflake to support analytics and self-service dashboards.
- Designed and implemented effective Analytics solutions and models with Snowflake.
- Designed and changed the existing data model per business requirements using Erwin 9.5.
- Designed and developed Oracle PL/SQL and shell scripts for data import/export, data conversions and data cleansing.
- Created/tuned PL/SQL procedures and SQL queries for data validation and for various data profiling activities on the current system.
- Used Erwin and Visio to create 3NF and dimensional data models and published them to the business users and ETL/BI teams.
- Extracted and loaded CSV and JSON file data from AWS S3 into the Snowflake cloud data warehouse (see the COPY INTO sketch after this section).
- Implemented a metadata repository; maintained data quality, data cleanup procedures, transformations, data standards and the data governance program; wrote scripts, stored procedures and triggers; and executed test plans.
- Served as SME to Data Governance on the current Common Data Model project.
- Coordinated Data Migration and data conversion efforts from legacy platforms to Oracle; documented data mapping, data profiling and data transformation requirements for ETL and data integration.
- Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
- Worked on logical data models and physical database design and generated database schemas using Erwin.
- Performed backup/restore tests for all PostgreSQL databases and worked on PostgreSQL trigger creation.
- Worked on snowflaking the dimensions to remove redundancy.
- Collaborated with ETL, BI and DBA teams to analyze and provide solutions to data issues and other challenges while implementing the OLAP model.
- Designed the data marts in Erwin using Ralph Kimball's dimensional data mart modeling methodology.
- Migrated Oracle database tables data into Snowflake Cloud Data Warehouse.
- Worked on identifying facts, dimensions, grain of facts and various other concepts of dimensional modeling used for data warehousing projects.
Environment: Erwin, Toad, PL/SQL, Oracle, SQL Server, PostgreSQL, Quest Central for DB2, COBOL, Teradata, Microsoft SQL Reporting Services, Star-Schema Modeling, Data Marts, Agile, etc.
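A minimal HiveQL sketch of the partitioning and bucketing described above (txn_hist, stg_txn and their columns are hypothetical names):

```sql
-- Table partitioned by load date and bucketed by account id for join/sampling efficiency
CREATE TABLE txn_hist (
    account_id BIGINT,
    amount     DECIMAL(12,2)
)
PARTITIONED BY (load_dt STRING)
CLUSTERED BY (account_id) INTO 32 BUCKETS
STORED AS ORC;

-- Enable dynamic partitioning so partitions are derived from the data itself
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

-- Populate partitions from a staging table loaded via HDFS/Sqoop
INSERT OVERWRITE TABLE txn_hist PARTITION (load_dt)
SELECT account_id, amount, load_dt
FROM   stg_txn;
```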
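A minimal sketch of the S3-to-Snowflake loads referenced above, using Snowflake's COPY INTO over a hypothetical external stage (the bucket path, stage and table names are illustrative, and the credentials are placeholders):

```sql
-- External stage pointing at the S3 landing bucket
CREATE OR REPLACE STAGE sales_stage
  URL = 's3://example-bucket/landing/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...');

-- Load CSV files into a typed staging table
COPY INTO stg_sales_csv
FROM @sales_stage/csv/
FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1);

-- Load JSON files into a single VARIANT column for downstream flattening
COPY INTO stg_sales_json (raw)
FROM @sales_stage/json/
FILE_FORMAT = (TYPE = 'JSON');

-- Query the semi-structured JSON with Snowflake path notation
SELECT raw:order_id::STRING      AS order_id,
       raw:amount::NUMBER(12,2)  AS amount
FROM   stg_sales_json;
```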
Confidential, Minneapolis, MN
Data Modeler/Database Analyst
Responsibilities:
- Involved in data analysis and creating data mapping documents to capture source-to-target transformation rules.
- Developed the required data warehouse model using a star schema for the generalized model.
- Used a forward engineering approach to design and create databases for the OLAP model.
- Developed a data mart for the base data in star schema and snowflake schema; involved in developing the data warehouse for the database.
- Worked with the business to identify the distinct data elements in each report to determine the number of reports needed to satisfy all reporting requirements.
- Developed decision analysis tools, Excel models and other ad hoc analysis to analyze current investment performance.
- Established the Data Management Office and chaired the Data Governance and Business Intelligence councils; implemented global Data Governance, Data Stewardship and Data Quality monitoring processes, standards, policies and executive reporting dashboards.
- Supported the PostgreSQL RDBMS underpinning agile development teams and mission-critical production systems.
- Performed scoring and financial forecasting for collection priorities using Python, R and SAS machine learning algorithms.
- Worked with normalization and de-normalization concepts and design methodologies such as the Ralph Kimball and Bill Inmon approaches, and implemented Slowly Changing Dimensions.
- Performed reverse engineering of the current application using Erwin and developed logical and physical data models for Central Model consolidation.
- Developed Python applications to extract data from sources such as PostgreSQL and export it in the formats desired by the client.
- Responsible for testing schemas, joins, data types and column values across source systems, staging and the data mart (see the validation sketch after this section).
- Used SQL tools to run SQL queries and validate the data loaded into the target tables.
- Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements mapped to test scripts, ensuring that any change control in requirements leads to a test case update.
- Designed and developed ETL processes using the Informatica ETL tool for dimension and fact file creation.
Environment: Informatica Power Center, TOAD, Oracle, PL/SQL, PostgreSQL, MS Access, MS Excel, PL/SQL, Business Objects, Erwin, Teradata V2R12, Teradata SQL Assistant 12.0, etc.
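A minimal sketch of the kind of source-to-target validation queries described in this role (stg_orders, fact_orders and dim_customer are hypothetical names):

```sql
-- Row-count reconciliation between staging and the data mart
SELECT 'stg_orders'  AS layer, COUNT(*) AS row_cnt FROM stg_orders
UNION ALL
SELECT 'fact_orders' AS layer, COUNT(*) AS row_cnt FROM fact_orders;

-- Orphan check: fact rows whose customer key has no matching dimension row
SELECT f.customer_key
FROM   fact_orders f
LEFT JOIN dim_customer d ON d.customer_key = f.customer_key
WHERE  d.customer_key IS NULL;

-- Measure spot-check: totals should reconcile across layers
SELECT (SELECT SUM(amount) FROM stg_orders)  AS stg_total,
       (SELECT SUM(amount) FROM fact_orders) AS mart_total;
```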
Confidential, New York, NY
Data Modeler/Data Analyst
Responsibilities:
- Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
- Extensively used the star schema methodology in building and designing the logical data model.
- Part of the team conducting logical data analysis and data modeling JAD sessions; communicated data-related standards.
- Prepared requirements for the design of the data model for data marts using the identified data elements.
- Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
- Migrated data from reports generated by various vendors into PostgreSQL databases using PostgreSQL export/import procedures (see the sketch after this section).
- Prepared the business case for the data mart and then developed and deployed it.
- Used Erwin to transform data requirements into data models.
- Worked on importing and cleansing high-volume data from various sources such as DB2, Oracle and flat files onto SQL Server.
- Troubleshot test scripts, SQL queries, ETL jobs and data warehouse/data mart/data store models.
- Redefined many attributes and relationships in the reverse-engineered model and cleansed unwanted tables/columns as part of data analysis responsibilities.
- Worked extensively in data analysis by querying in SQL and generating various PL/SQL objects.
Environment: ETL, Data Modeling, Data Marts, PostgreSQL, Oracle, SQL Server, Linux, Windows XP, Erwin, MS Visio.
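A minimal sketch of the PostgreSQL export/import pattern referenced in this role, using the built-in COPY command (the table and file names are illustrative):

```sql
-- Export a vendor report extract from the source database (server-side path)
COPY vendor_report TO '/tmp/vendor_report.csv' WITH (FORMAT csv, HEADER true);

-- Import the extract into a staging schema in the target database
COPY staging.vendor_report FROM '/tmp/vendor_report.csv' WITH (FORMAT csv, HEADER true);

-- Client-side alternative in psql when the file lives on the workstation:
-- \copy staging.vendor_report FROM 'vendor_report.csv' WITH (FORMAT csv, HEADER true)
```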