
Sr. Data Architect/Modeler Resume


Atlanta, GA

SUMMARY:

  • Over 10 years of experience as a Senior Data Architect/Modeler/Analyst, an IT professional experienced in Data Analysis, Data Modeling, Data Architecture, and designing, developing, and implementing data models for enterprise-level applications and systems.
  • Experience with the Big Data Hadoop ecosystem for ingestion, storage, querying, processing, and analysis of big data.
  • Experienced in integrating various relational and non-relational sources such as DB2, Teradata, Oracle, Netezza, SQL Server, NoSQL, COBOL, XML, and flat files into the Netezza database.
  • Business Intelligence: requirements analysis, Key Performance Indicator (KPI) and metrics development, sourcing and gap analysis, OLAP concepts and methods, aggregates/materialized views and performance, rapid prototyping, tool selection, and semantic layers. Excellent experience in writing SQL queries to validate data movement between different layers in a data warehouse environment (see the validation query sketch after this list).
  • Experience in BI/DW solutions (ETL, OLAP, data marts), Informatica, and BI reporting tools such as Tableau and QlikView; also experienced in leading teams of application, ETL, and BI developers and testers.
  • Responsible for detailed architectural design, data wrangling, data profiling to ensure the quality of vendor data, and source-to-target mapping.
  • Worked on Informatica PowerCenter tools: Designer, Repository Manager, and Workflow Manager.
  • Logical and physical database design (tables, constraints, indexes, etc.) using Erwin, ER Studio, Toad Data Modeler, and SQL Modeler.
  • Data Warehousing: full life-cycle project leadership, business-driven requirements gathering, capacity planning, feasibility analysis, enterprise and solution architecture, design, construction, data quality, profiling and cleansing, source-target mapping, gap analysis, data integration/ETL, SOA, ODA, data marts, Inmon/Kimball methodology, data modeling for OLTP, canonical modeling, and dimensional modeling for data warehouse star/snowflake design.
  • Excellent knowledge in preparing required project documentation and tracking and reporting regularly on the status of projects to all project stakeholders
  • Extensive ETL testing experience using Informatica 9.x/8.x, Talend, and Pentaho.
  • Worked on background processes in Oracle architecture; also able to drill down to the lowest levels of systems design and construction.
  • Expert in BI reporting and Data Reporting tools like Pentaho and SAP BI.
  • Experience in Dimensional Data Modeling, Star/Snowflake schema, FACT & Dimension tables.
  • Expertise in relational data modeling (3NF) and dimensional data modeling.
  • Experience in developing MapReduce programs using Apache Hadoop for analyzing big data as per requirements.
  • Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
  • Assist in creating communication materials based on data for key internal /external audiences.
  • Skillful in Data Analysis using SQL on Oracle, MS SQL Server, DB2 & Teradata.
  • Proficient in System Analysis, ER/Dimensional Data Modeling, Database design and implementing RDBMS specific features.
  • Experienced in Technical consulting and end-to-end delivery with architecture, data modeling, data governance and design - development - implementation of solutions.
  • Heavy use of Access queries, VLOOKUP, formulas, pivot tables, etc. Working knowledge of CRM automation with Salesforce.com and SAP.
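The following is a minimal sketch of the kind of SQL used to validate data movement between warehouse layers, as referenced in the Business Intelligence bullet above; the STG_ORDERS and DW_ORDER_FACT tables and the order_amount column are hypothetical illustrations, not actual project objects.

    -- Compare row counts and amount totals between the staging and warehouse layers;
    -- matching figures indicate the load moved the data completely and without duplication.
    SELECT 'STAGING'   AS layer, COUNT(*) AS row_cnt, SUM(order_amount) AS total_amount
    FROM   stg_orders
    UNION ALL
    SELECT 'WAREHOUSE' AS layer, COUNT(*) AS row_cnt, SUM(order_amount) AS total_amount
    FROM   dw_order_fact;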

TECHNICAL SKILLS:

Data Modeling Tools: Erwin R6/R9, Rational System Architect, IBM Infosphere Data Architect, ER Studio and Oracle Designer

Database Tools: Microsoft SQL Server 12.0, Teradata 15.0, Oracle 11g/9i/12c and MS Access

Tools: OBIEE 10g/11g, SAP ECC6 EHP5, GoToMeeting, DocuSign, InsideSales.com, SharePoint, MATLAB.

Testing and Defect Tracking Tools: HP/Mercury (Quality Center, WinRunner, QuickTest Professional, Performance Center, Requisite), MS Visio & Visual SourceSafe

BI Tools: Tableau 7.0/8.2, Tableau Server 8.2, Tableau Reader 8.1, Pentaho 6, SAP Business Objects, Crystal Reports

Packages: Microsoft Office 2010, Microsoft Project 2010, SAP, Microsoft Visio, SharePoint Portal Server

Version Tool: VSS, SVN, CVS.

Project Execution Methodologies: Agile, Ralph Kimball and Bill Inmon data warehousing methodologies, Rational Unified Process (RUP), Rapid Application Development (RAD), Joint Application Development (JAD)

Quality Assurance Tools: WinRunner, LoadRunner, TestDirector, QuickTest Pro, Quality Center, Rational Functional Tester.

RDBMS: Microsoft SQL Server 14.0, Teradata 15.0, Oracle 12c/9i, and MS Access

ETL/Data Warehouse Tools: Informatica 9.6/9.1/8.6.1/8.1, SAP Business Objects XI R3.1/XI R2, Web Intelligence, Talend, Tableau 8.2, Pentaho.

Operating System: Windows, Unix, Sun Solaris

PROFESSIONAL EXPERIENCE:

Confidential, Atlanta, GA

Sr. Data Architect/Modeler

Responsibilities:

  • Provided a consultative approach with business users, asking questions to understand the business need and deriving the data flow, logical, and physical data models based on those needs.
  • Designed the Logical Data Model using Erwin 9.64 with the entities and attributes for each subject area.
  • Used Tableau for BI Reporting and Data Analysis.
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Worked on Tableau 7 for insight reporting and data visualization.
  • Designed and developed the architecture for a data services ecosystem spanning relational, NoSQL, and Big Data technologies.
  • Created Logical and Physical Data Model using IBM Data Architect tool.
  • Specified the overall data architecture for all areas and domains of the enterprise, including Data Acquisition, ODS, MDM, Data Warehouse, Data Provisioning, ETL, and BI.
  • Actively used Business Objects and Business Intelligence tools for data analytics and reporting needs.
  • Proficiency in SQL across a number of dialects, including MySQL, PostgreSQL, Redshift, SQL Server, and Oracle.
  • Extensively used SAP Crystal Reports 14.2 for data reporting.
  • Experience with AWS ecosystem (EC2, S3, RDS, Redshift).
  • Advised on and led projects involving ETL-related activities and the migration or conversion of data between enterprise data systems; coordinated interactions between central IT, business units, and data stewards to achieve desired organizational outcomes.
  • Gathered and analyzed existing physical data models for in scope applications and proposed the changes to the data models according to the requirements.
  • Advised on and enforced data governance to improve the quality and integrity of data, with oversight of the collection and management of operational data.
  • Guided and partnered with the VP in architecting solutions for the Big Data organization.
  • Integrated Crystal Reports using Erwin Data Modeler.
  • Used Erwin to support SSL.
  • Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture involving OLTP and ODS.
  • Involved in designing Logical and Physical data models for different database applications using the Erwin.
  • Worked with Netezza and Oracle databases and implemented various logical and physical data models for them.
  • Performed data modeling and designed, implemented, and deployed high-performance custom applications at scale on Hadoop/Spark.
  • Developed and implemented data cleansing, data security, data profiling, and data monitoring processes (see the profiling query sketch after this list).
  • Applied data analysis, data mining and data engineering to present data clearly.
  • Ensured high-quality data and understood how data is generated from experimental design and how these experiments can produce actionable, trustworthy conclusions.
  • Reverse engineered some of the databases using Erwin.
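As a companion to the data profiling work noted above, here is a minimal profiling sketch; the CUSTOMER_STG table and customer_id column are hypothetical examples used only for illustration.

    -- Basic profile of a staging table: volume, null rate on the key column, and distinct values.
    SELECT COUNT(*)                                              AS total_rows,
           SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END)  AS null_customer_ids,
           COUNT(DISTINCT customer_id)                           AS distinct_customer_ids
    FROM   customer_stg;

    -- Business keys that occur more than once are flagged as potential data quality issues.
    SELECT customer_id, COUNT(*) AS dup_cnt
    FROM   customer_stg
    GROUP  BY customer_id
    HAVING COUNT(*) > 1;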

Environment: DB2, CA Erwin 7.0, Oracle 12c, MS Office, SQL Architect, Tableau 7, TOAD Benchmark Factory, SQL Loader, PL/SQL, SharePoint, Erwin r9.64, SQL Server 2008/2012, Pentaho 6, Crystal Reports.

Confidential, Northbrook, IL

Sr. Data Analyst/Modeler

Responsibilities:

  • Designed and built relational database models and defined data requirements to meet the business requirements.
  • Developed strategies for data acquisitions, archive recovery, and implementation of databases.
  • Responsible for developing and supporting a data model and architecture that supports and enables the overall strategy of expanded data deliverables, services, process optimization and advanced business intelligence.
  • Worked with the Data Vault methodology; developed normalized logical and physical database models.
  • Extensively used Erwin r9.6 for Data modeling. Created Staging and Target Models for the Enterprise Data Warehouse.
  • Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ERWIN.
  • Worked with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
  • Created BTEQ, FastExport, MultiLoad, TPump, and FastLoad scripts for extracting data from various production systems.
  • Designed ER diagrams (physical and logical, using Erwin), mapped the data into database objects, identified the facts and dimensions from the business requirements, and developed the logical and physical models using Erwin.
  • Owned and managed all changes to the data models. Created data models, solution designs and data architecture documentation for complex information systems.
  • Worked with reverse engineering Data Model from Database instance and Scripts.
  • Developed mapping spreadsheets for (ETL) team with source to target data mapping with physical naming standards, data types, volumetric, domain definitions, and corporate meta-data definitions.
  • Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
  • Implemented the Data Vault modeling concept, which solves the problem of dealing with change in the environment by separating the business keys, and the associations between those business keys, from the descriptive attributes of those keys using hub, link, and satellite tables (see the DDL sketch after this list).
  • Transformed Logical Data Model to Physical Data Model ensuring the Primary Key and Foreign key relationships in PDM, Consistency of definitions of Data Attributes and Primary Index considerations.
  • Worked with Business Objects XI to help the client with their reporting needs.
  • Worked on AWS Redshift and RDS, implementing data models and data on RDS and Redshift.
  • Gathered and analyzed business data requirements and modeled those needs; worked closely with the users of the information, the application developers, and architects to ensure the information models are capable of meeting their needs.
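The DDL below is a minimal Data Vault sketch of the hub/link/satellite pattern described above; all table and column names (HUB_CUSTOMER, SAT_CUSTOMER, LINK_CUSTOMER_ORDER) are hypothetical and simplified for illustration.

    -- Hub: holds only the business key for Customer plus load metadata.
    CREATE TABLE hub_customer (
        customer_hk    CHAR(32)    NOT NULL PRIMARY KEY,  -- hash key of the business key
        customer_bk    VARCHAR(50) NOT NULL,              -- business key
        load_dts       TIMESTAMP   NOT NULL,
        record_source  VARCHAR(50) NOT NULL
    );

    -- Satellite: descriptive attributes of the customer, versioned by load timestamp.
    CREATE TABLE sat_customer (
        customer_hk      CHAR(32)     NOT NULL REFERENCES hub_customer (customer_hk),
        load_dts         TIMESTAMP    NOT NULL,
        customer_name    VARCHAR(100),
        customer_segment VARCHAR(50),
        record_source    VARCHAR(50)  NOT NULL,
        PRIMARY KEY (customer_hk, load_dts)
    );

    -- Link: association between the Customer and Order business keys.
    CREATE TABLE link_customer_order (
        customer_order_hk CHAR(32)    NOT NULL PRIMARY KEY,
        customer_hk       CHAR(32)    NOT NULL REFERENCES hub_customer (customer_hk),
        order_hk          CHAR(32)    NOT NULL,            -- would reference hub_order (omitted)
        load_dts          TIMESTAMP   NOT NULL,
        record_source     VARCHAR(50) NOT NULL
    );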

Environment: AWS Redshift, RDS, Big Data, JDBC, Cassandra, NoSQL, Spark, Scala, Python, Hadoop, MySQL, PostgreSQL, SQL Server, Erwin, Informatica, Business Objects.

Confidential, Plano, TX

Sr. Data Analyst/Modeler

Responsibilities:

  • Involved in maintaining and updating the Metadata Repository with details on the nature and use of applications/data transformations to facilitate impact analysis.
  • Created DDL scripts using ER Studio and source to target mappings to bring the data from source to the warehouse.
  • Designed the ER diagrams, logical model (relationships, cardinality, attributes, and candidate keys) and physical database (capacity planning, object creation, and aggregation strategies) for Oracle and Teradata.
  • Worked on importing and cleansing of data from various sources such as Teradata, Oracle, flat files, and MS SQL Server with high-volume data.
  • Participated in performance management and tuning for stored procedures, tables and database servers.
  • Created Logical Data Models for Staging, ODS, and Data Mart layers, as well as the Time dimension.
  • Developed the design & Process flow to ensure that the process is repeatable.
  • Performed analysis of the existing source systems (Transaction database).
  • Identified, assessed, and communicated potential risks associated with the testing scope, the quality of the product, and the schedule.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
  • Produced PL/SQL statements and stored procedures in DB2 for extracting as well as writing data.
  • Designed the Logical & Physical Data Model, metadata, and data dictionary using Erwin for both OLTP and OLAP based systems.
  • Coordinated all teams to centralize metadata management updates and follow the standard naming and attribute standards for data and ETL jobs.
  • Finalized the naming standards for data elements and ETL jobs and created a data dictionary for metadata management.
  • Wrote and executed SQL queries to verify that data had been moved from the transactional system to the DSS, data warehouse, and data mart reporting systems in accordance with requirements (see the reconciliation query sketch after this list).
  • Worked on importing and cleansing of data from various sources such as Teradata, Oracle, flat files, and SQL Server 2005 with high-volume data.
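A minimal sketch of the kind of reconciliation query referenced above; SRC_ORDERS and DM_ORDERS are hypothetical source and data mart tables used only for illustration.

    -- Rows present in the transactional source but missing from the data mart after the load;
    -- an empty result means the data moved in accordance with the requirement.
    SELECT order_id, order_date, order_amount
    FROM   src_orders
    EXCEPT                          -- use MINUS on Oracle/Teradata
    SELECT order_id, order_date, order_amount
    FROM   dm_orders;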

Environment: ER Studio, SQL Server 2000/2005/2008, SQL Server Analysis Services, SSIS, Oracle 10g, Business Objects XI, Rational Rose, DataStage, MS Office, MS Visio, SQL, Crystal Reports 9

Confidential, Denver, CO

Sr. Data Analyst/Modeler

Responsibilities:

  • Responsible for producing a data roadmap to introduce BI/DW systems to the organization.
  • Analyzed the business requirements by dividing them into subject areas and understood the data flow within the organization
  • Created DDL scripts for implementing data modeling changes. Created Erwin reports in HTML and RTF formats depending upon the requirement, published the data model in the model mart, created naming convention files, and coordinated with DBAs to apply the data model changes.
  • Generated a separate MRM document with each assignment and shared it on SharePoint along with the PDF of updated data models
  • Created a list of domains in Erwin and worked on building up the data dictionary for the company
  • Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable
  • Worked on Unit Testing for three reports and created SQL Test Scripts for each report as required
  • Extensively used Erwin as the main tool for modeling along with Visio
  • Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
  • Worked on the Metadata Repository (MRM) to keep the definitions and mapping rules up to the mark.
  • Worked on data mapping process from source system to target system. Created dimensional model for the reporting system by identifying required facts and dimensions using Erwin
  • Designed Logical Data Models and Physical Data Models using Erwin.
  • Developed the Conceptual Data Models, Logical Data models and transformed them to creating schema using ERWIN.
  • Used HEDIS for reporting on Health Plan statistics.
  • Worked very closely with Data Architects and the DBA team to implement data model changes in the database in all environments.
  • Developed a data mart for the base data in star schema and snowflake schema designs, and was involved in developing the data warehouse for the database (see the star schema DDL sketch after this list).
  • Developed enhancements to the MongoDB architecture to improve performance and scalability.
  • Performed forward engineering of the data models, reverse engineering of the existing data models, and updates to the data models.
  • Performed data cleaning and data manipulation activities using the NZSQL utility.
  • Analyzed the physical data model to understand the relationships between existing tables, and cleansed the unwanted tables and columns as per the requirements as part of the Data Analyst duties.
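A minimal star schema DDL sketch corresponding to the dimensional design above; the DIM_MEMBER, DIM_DATE, and FACT_CLAIMS names and columns are hypothetical illustrations in the health-plan reporting context.

    -- Conformed dimensions keyed by surrogate keys.
    CREATE TABLE dim_member (
        member_key     INTEGER     NOT NULL PRIMARY KEY,  -- surrogate key
        member_id      VARCHAR(20) NOT NULL,              -- natural/business key
        plan_name      VARCHAR(100),
        effective_date DATE
    );

    CREATE TABLE dim_date (
        date_key       INTEGER     NOT NULL PRIMARY KEY,  -- e.g. 20240131
        calendar_date  DATE        NOT NULL,
        fiscal_year    INTEGER
    );

    -- Fact table whose foreign keys resolve to the dimension surrogate keys.
    CREATE TABLE fact_claims (
        member_key     INTEGER       NOT NULL REFERENCES dim_member (member_key),
        date_key       INTEGER       NOT NULL REFERENCES dim_date (date_key),
        claim_count    INTEGER       NOT NULL,
        claim_amount   DECIMAL(12,2) NOT NULL,
        PRIMARY KEY (member_key, date_key)
    );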

Environment: Erwin r8.2, Oracle SQL Developer, Oracle Data Modeler, Teradata 12, SSIS, Business Objects, SQL Server 2008, ER/Studio, Windows XP, MS Excel.

Confidential

Sr. Data Analyst/Modeler

Responsibilities:

  • Worked with business users during requirements gathering and prepared Conceptual, Logical, and Physical Data Models.
  • Wrote PL/SQL statements, stored procedures, and triggers in DB2 for extracting as well as writing data.
  • Attended and participated in information and requirements gathering sessions
  • Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
  • Designed and implemented business intelligence to support sales and operations functions to increase customer satisfaction.
  • Extensively used the reverse engineering feature of Erwin to keep the data model in sync with production.
  • Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ERWIN
  • Created and maintained Logical Data Model (LDM) for the project. Includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms etc.
  • Validated and updated the appropriate LDM's to process mappings, screen designs, use cases, business object model, and system object model as they evolve and change.
  • Created business requirement documents and integrated the requirements and underlying platform functionality.
  • Excellent knowledge and experience in Technical Design and Documentation.
  • Used forward engineering to create a physical data model with DDL that best suits the requirements from the Logical Data Model (see the DDL sketch after this list).
  • Involved in preparing the design flow for the DataStage objects to pull the data from various upstream applications, apply the required transformations, and load the data into various downstream applications.
  • Performed logical data modeling and physical data modeling (including reverse engineering) using the Erwin data modeling tool.
  • Experience in developing dashboards and client-specific tools in Microsoft Excel and PowerPoint.
  • Responsible for the development and maintenance of Logical and Physical data models, along with corresponding metadata, to support Applications.
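A minimal sketch of forward-engineered DDL of the kind referenced above, carrying primary key, foreign key, and allowed-value rules from the logical model into the physical model; the POLICY and POLICY_HOLDER entities are hypothetical, and the Oracle data types simply reflect the Oracle 9i environment.

    CREATE TABLE policy (
        policy_id     NUMBER(10)    NOT NULL,
        policy_status VARCHAR2(10)  NOT NULL,
        start_date    DATE          NOT NULL,
        CONSTRAINT pk_policy PRIMARY KEY (policy_id),
        CONSTRAINT ck_policy_status CHECK (policy_status IN ('ACTIVE', 'LAPSED', 'CANCELLED'))
    );

    CREATE TABLE policy_holder (
        holder_id  NUMBER(10)    NOT NULL,
        policy_id  NUMBER(10)    NOT NULL,
        full_name  VARCHAR2(100) NOT NULL,
        CONSTRAINT pk_policy_holder PRIMARY KEY (holder_id),
        CONSTRAINT fk_holder_policy FOREIGN KEY (policy_id) REFERENCES policy (policy_id)
    );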

Environment: Oracle 9i, PL/SQL, Solaris 9/10, Windows Server 2003 & 2008, NZSQL, Erwin 8.0, ER/Studio 6.0/6.5, Toad 8.6, Informatica 8.0, IBM OS/390 (V6.0), DB2 V7.1.
