
Sr. Data Modeler/Data Analyst Resume


Glendale, CA

SUMMARY:

  • Data modeling professional with 8+ years of IT experience and expertise in data modeling for data warehouse/data mart development, SQL, and analysis of Online Transaction Processing (OLTP), data warehouse (OLAP) and Business Intelligence (BI) applications.
  • Proficient in all phases of the Software Development Lifecycle (SDLC).
  • Experience in designing star schemas (identification of facts, measures and dimensions) and snowflake schemas for data warehouses, using tools such as Erwin 9.6/8.2/7.0, Power Designer 15, Embarcadero ER Studio and Microsoft Visio.
  • Well versed in normalization and denormalization techniques for optimum performance in relational and dimensional database environments; have performed normalization up to 3NF.
  • Experience working with Agile and Waterfall data modeling methodologies.
  • Generated DDL scripts from the physical data model using the forward engineering technique.
  • Extensive experience in customer interaction and in collecting and handling customer requirements.
  • In a data warehouse environment, designed the staging area based on OLTP concepts; data was cleansed and profiled before loading into data marts.
  • Documented the source to target mappings for both data integration as well as web services.
  • Experienced as both an onsite and offshore project member.
  • Experience working with Zachman framework.
  • Experience in designing a canonical model.
  • Efficient in dimensional data modeling for data mart design, identifying facts and dimensions, with a strong understanding of data warehousing principles, fact tables, dimension tables and Slowly Changing Dimensions (SCD) Type I and Type II.
  • Well versed in Forward and Reverse Engineering principles.
  • Worked with various RDBMS like Oracle 9i/10g/11g, SQL Server 2008/2008R2/2012/2014, DB2 UDB, Teradata.
  • Extensive experience in development of T-SQL, Oracle PL/SQL Scripts, Stored Procedures and Triggers for business logic implementation, Good at using Toad.
  • Expertise in SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS)
  • Created data mappings to load data from source to target using different transformations and executed workflows for data loads to target systems.
  • Worked on the Business As Usual (BAU) operating model defined and agreed in the blueprinting phase.
  • Experienced in conducting Joint Application Development (JAD) sessions with stakeholders, business users and SMEs to obtain domain-level information about projects.
  • Good understanding of Ralph Kimball and Bill Inmon modeling techniques.
  • Expertise in Extract, Transform and Load (ETL) of data from spreadsheets, database tables and other sources using Informatica.
  • Expertise in working with multiple modules of business with QlikView.
  • Excellent knowledge in designing and developing dashboards using QlikView by extracting the data from multiple sources (Flat files, Excel, Access).
  • Involved in Requirement gathering & designing QlikView applications.
  • Experience with QlikView objects including Pivot, Straight Table, Multi box, Bar chart and more.
  • Involved in Extracting data from multiple data sources into QlikView application.
  • Developed QlikView Dashboards using different charts and list boxes.
  • More than four years of experience working with QlikView components such as QlikView Desktop, QlikView Enterprise Management Console (QEMC), QlikView Publisher, QlikView Web Server and Ajax clients.
  • Developed custom dashboards and reports using QlikView.
  • Well-versed in writing SQL queries to perform end-to-end ETL validations and support Ad-hoc business requests.
  • Excellent experience in generating ad-hoc reports using Crystal Reports.
  • Very good exposure to ETL tools such as Informatica and SSIS.
  • Built reports using SQL Server Reporting Services (SSRS), Crystal Reports, Business Objects and Cognos.
  • Excellent analytical and communication skills, with a clear understanding of business process flow and the SDLC. Quick starter with the ability to master and apply new concepts, meet deadlines and handle pressure while coordinating multiple tasks in a work/project environment.
  • Working knowledge on Release management and Change management. Worked on PPM tool to manage the status of applications and project maintenance.
  • Experience in using and maintaining Metadata Management Tools for maintaining Business and Technical Metadata.
  • Worked closely with MDM Informatica architects to fully understand and “read” ETL processes and process flows for business requirements, including the business rules coded within the ETL load processes.
  • Extensive experience in MDM solution and data warehousing solutions involving Data Mapping, ETL Development, Data Modeling, Metadata Management, Data migration and Reporting Solutions
  • Worked with Microsoft Word, Access and Excel.
  • Performed data analysis and created Excel templates using pivot tables, VLOOKUPs and multiple nested functions.
  • Updated model numbers using formulas such as INDEX, MATCH and INDIRECT.
  • Excellent communication skills, self-starter with ability to work with minimal guidance.
  • Designed use cases for searching, adding and updating member data.
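
The end-to-end ETL validation queries mentioned above typically reconcile row counts and look for rows dropped between source and target. A minimal sketch using SQLite; the table and column names (src_orders, tgt_orders, order_id) are hypothetical, not from any actual engagement:

```python
import sqlite3

# Minimal ETL validation sketch: compare source/target row counts and
# detect source rows missing from the target. All names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_orders (order_id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);
""")

# Row-count reconciliation between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]

# Source rows that never made it to the target.
missing = cur.execute("""
    SELECT s.order_id FROM src_orders s
    LEFT JOIN tgt_orders t ON s.order_id = t.order_id
    WHERE t.order_id IS NULL
""").fetchall()
print(src_count, tgt_count, missing)
```

The same anti-join pattern extends naturally to column-level checksum comparisons when counts alone are not enough.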

TECHNICAL SKILLS:

Data Modeling Tools: ERwin 9.5/8.2/7.3/7.0, Power Designer 15/12/11, Embarcadero ER Studio 6.6, MS Visio 2000/2007/2010.

Reporting Tools: Business Objects XI, Crystal Reports 9/2008, SSRS, QlikView 9/10/11.x, Tableau, Cognos 8, Toad, MATLAB.

ETL Tools: Informatica 8.x/9.x, SSIS.

Languages: SQL, PL/SQL, ANSI SQL, Python, R, C#, VB, C, C++, HTML.

Analytics: Visual Analytics, OLAP Cubes.

Databases: Oracle 11g/10g, MS SQL Server 2008/2008R2/2012/2014, MS Access, Teradata, DB2, Sybase 12, NoSQL.

Database Tools: Aqua Data Studio, SQL Server Management Studio.

Operating Systems: Microsoft Windows7/8/8.1/10, UNIX, LINUX

PROFESSIONAL EXPERIENCE:

Confidential, Glendale, CA

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Involved in requirement gathering and data analysis; interacted with business users to understand reporting requirements and analyzed BI needs for the user community.
  • Involved in logical and physical designs and transforming logical models into physical implementations.
  • Normalized the data up to 3rd normal form (3NF).
  • Created Entity/Relationship Diagrams, grouped and created the tables, validated the data, identified PKs for lookup tables.
  • Worked on the Business As Usual (BAU) operating model defined and agreed in the blueprinting phase.
  • Involved in modeling (Star Schema methodologies) in building and designing the logical data model into Dimensional Models.
  • Documented the source to target mappings for both data integration as well as web services.
  • Experience working with MDM team with various business operations involved in the organization
  • Utilized Erwin’s forward/reverse engineering tools and target database schema conversion process.
  • Designed the data marts in dimensional data modeling using star and snowflake schemas.
  • Redefined attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns as part of data analysis responsibilities.
  • Metadata & Data Dictionary Management; Data Profiling; Data Mapping.
  • Support the SSRS environments through report deployment, archival, and subscription maintenance.
  • Built and debugged SSIS ETL packages.
  • Identify the PK, FK relationships across the entities and across subject areas.
  • Developed ETL routines using SSIS packages, to plan an effective package development process, and design the control flow within the packages.
  • Used Informatica as ETL tool, and stored procedures to pull data from source systems/ files, cleanse, transform and load data into databases.
  • Used Python to extract information from XML files.
  • Developed Business Logic using Python on Django Web Framework.
  • Took active role in the design and development of user interface objects in QlikView applications. Connected to various data sources like SQL Server, Oracle and flat files.
  • Presented the Dashboard to Business users and cross functional teams, define KPIs (Key Performance Indicators), and identify data sources.
  • Extensively worked with set analysis; expert in writing complex set analysis expressions in QlikView applications. Created and scheduled weekly QlikView reports distributed by email using NPrinting and Publisher.
  • Worked with applications like R, and Python to develop neural network algorithms, cluster analysis.
  • Created and maintained logical, dimensional data models for different Claim types.
  • Designed data flows that extract, transform, and load data by optimizing SSIS performance
  • Worked with slowly changing dimensions (SCDs) in implementing custom SCD transformations.
  • Involved in loading the data from Source Tables to Operational Data Source tables using Transformation and Cleansing Logic.
  • Involved in development of Informatica mappings and tuning for better performance.
  • Created Informatica mappings with stored procedures to build business rules to load data
  • Gathered data and performed web scraping using Python (BeautifulSoup/Scrapy).
  • Performed data manipulation and munging with pandas, math and scikit-learn.
  • Worked on all data management activities on the project data sources, data migration.
  • Worked on creating DDL, DML scripts for the data models.
  • Worked on stored procedures for processing business logic in the database.
  • Performance query tuning to improve the performance along with index maintenance.
  • Worked on the reporting requirements for the data warehouse.
  • Worked with Microsoft Word, Access and Excel.
  • Performed data analysis and created Excel templates using pivot tables, VLOOKUPs and multiple nested functions.
  • Updated model numbers using formulas such as INDEX, MATCH and INDIRECT.
  • Created support documentation and worked closely with production support and testing team.
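
The XML extraction work described above can be sketched with Python's standard library; the claims-feed structure and the element/attribute names here are invented for illustration, not taken from the actual project files:

```python
import xml.etree.ElementTree as ET

# Hypothetical claims feed; element and attribute names are illustrative.
xml_doc = """
<claims>
  <claim id="C100"><type>auto</type><amount>1250.00</amount></claim>
  <claim id="C101"><type>home</type><amount>980.50</amount></claim>
</claims>
"""

root = ET.fromstring(xml_doc)
# Flatten each <claim> element into a plain dict for downstream loading.
records = [
    {
        "id": claim.get("id"),
        "type": claim.findtext("type"),
        "amount": float(claim.findtext("amount")),
    }
    for claim in root.iter("claim")
]
print(records)
```

For large files, `ET.iterparse` would replace `fromstring` so records can be streamed without holding the whole document in memory.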

Environment: ERwin 9.5/8.2, Oracle 11g, Crystal Reports, Data Mapping, SSRS, Toad, Windows OS, DB2, Teradata, SQL, Python, QlikView 11.6, Informatica 9.5, R.

Confidential, Denver, Colorado

Data Modeler/Data Analyst

Responsibilities:

  • Gathered various reporting requirements from the business analysts.
  • Reverse engineered the reports and identified the data elements (in the source systems), dimensions, facts and measures required for the reports.
  • Conducted design discussions and meetings to arrive at an appropriate data mart at the lowest level of grain for each of the dimensions involved.
  • Created Data flow diagrams for current system.
  • Developed a data topology based on the data storage, various replications, and movements of data.
  • Participated in several JAD (Joint Application Design/Development) sessions to track end to end flow of attributes starting from source screens to all the downstream systems.
  • Designed the view models per the requirements of downstream users such as web portals and other consumers.
  • Developed various QlikView data models by extracting and using data from various sources: Sybase, DB2, Excel, big data sources and flat files.
  • Implemented incremental loads, optimized loads and binary loads for performance tuning and optimized QlikMarts.
  • Maintained data mapping information and documented it properly.
  • Checked for all the modeling standards including naming standards, entity relationships on model and for comments and history in the model. Conducted design walkthrough with project team and got it signed off.
  • Created Technical Mapping documents for the development team to develop mapping workflows.
  • Designed a STAR schema for the detailed data marts and Plan data marts involving shared dimensions (Conformed).
  • Used Teradata utilities such as Fast Export, MLOAD for handling various tasks.
  • Created and maintained Logical Data Model (LDM) for the project. Includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Performed reverse engineering for a wide variety of relational DBMS, including Microsoft Access, Oracle and Teradata, to connect to existing database and create graphical representation (E-R diagram) using Erwin.
  • Used BTEQ SQL scripts to load data from source systems into Teradata tables.
  • Developed Logical data model using Erwin and created physical data models using forward engineering.
  • Validated and updated the appropriate LDM's to process mappings, screen designs, use cases, business object model, and system object model as they evolve and change.
  • Conducted design reviews with the business analysts and content developers to create a proof of concept for the reports.
  • Worked on testing, debugging, supporting reports using SSRS
  • Ensured the feasibility of the logical and physical design models.
  • Collaborated with the Reporting Team to design Monthly Summary Level Cubes to support the further aggregated level of detailed reports.
  • Worked with various Salesforce standard objects like Accounts, Contacts, Leads, Campaigns, Reports, Dashboards
  • Worked as an SQA data analyst in the Technology and Operations Group within the Home Mortgage Loss Mitigation space.
  • Collected and analyzed data regarding customer performance and reported results and concerns to management.
  • Worked on snowflaking the dimensions to remove redundancy.
  • Designed Sales Hierarchy dimensions to handle sales hierarchy reporting historically and dynamically.
  • Worked with the Implementation team to ensure a smooth transition from the design to the implementation phase.
  • Worked closely with the ETL SSIS Developers to explain the complex Data Transformation using Logic.
  • Co-ordinated with QA team to test and validate the reporting system and its data.
  • Suggested effective implementation of the applications, which are being developed.
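
The incremental-load work described above boils down to pulling only the rows changed since the last load and upserting them into the store by key. A minimal sketch with stand-in data structures (all record and field names are illustrative):

```python
from datetime import date

# Existing store keyed by id, plus a source extract with change timestamps.
# All names and values are invented for illustration.
existing = {1: {"id": 1, "name": "Acme", "updated": date(2015, 1, 1)}}
source = [
    {"id": 1, "name": "Acme Corp", "updated": date(2015, 3, 1)},   # changed
    {"id": 2, "name": "Globex", "updated": date(2015, 2, 15)},     # new
    {"id": 3, "name": "Initech", "updated": date(2014, 12, 1)},    # older than last load
]
last_load = date(2015, 1, 1)

# Incremental extract: only rows touched since the last load.
delta = [row for row in source if row["updated"] > last_load]
for row in delta:
    existing[row["id"]] = row  # upsert into the store by key

print(sorted(existing))
```

In QlikView the same pattern is expressed with a WHERE clause on the load timestamp plus a CONCATENATE/WHERE NOT EXISTS merge against the stored QVD.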

Environment: ERwin 8.2, PL/SQL Developer, Teradata, TOAD Data Analyst - 2.1, Oracle 9i, Qlikview 11.6, Quality Center- 9.2, Informatica Powercenter 9.1, Data Mapping, SSRS, R, TD SQL Assistant, Microsoft Visio, SQL Server 2000, Microsoft Excel.

Confidential, Indianapolis, IN

Data Analyst / Data Modeler

Responsibilities:

  • Conducted JAD sessions, wrote meeting minutes and documented the requirements.
  • Collected requirements from business users and analyzed based on the requirements.
  • Designed and built Data marts by using Star Schema.
  • Involved in designing Context Flow Diagrams, Structure Chart and ER- diagrams.
  • Gathered and documented requirements of a Qlikview application from users.
  • Extracted data from various sources (SQL Server, Oracle, text files and excel sheets), used ETL load scripts to manipulate, concatenate and clean source data.
  • Presented QlikView applications to users at various levels.
  • Demonstrated QlikView to data analysts for creating custom reports, charts and bookmarks.
  • Involved in creating scripts for data manipulation and management.
  • Supported Qlikview Administration tasks.
  • Extensive system study, design, development and testing were carried out in the Oracle environment to meet the customer requirements.
  • Serve as a member of a development team to provide business data requirements analysis services, producing logical and Physical data models using ERwin 4.0 and Power Designer.
  • Created tables, views, procedures and SQL scripts.
  • Utilized organizational standards for data naming, structuring and documentation.
  • Responsible for defining database schemas to support business data entities and transaction processing requirements.
  • Developed Informatica mappings, workflows and worklets.
  • Created test harness to enable comprehensive testing utilizing Python.
  • Developed Business Logic using Python on Django Web Framework.
  • Ensure the business metadata definitions of all data attributes and entities in each data model are documented to meet standards.
  • Experience in writing expressions and formulas for reports in SSRS
  • Knowledge on SSRS report performance using execution logs and query improvement for datasets.
  • Ensure the first cut physical data model includes business definitions of the fields (columns) and records (tables) were generated.
  • Integrated high - level business rules (constraints, triggers and indexes) with the code.
  • Assisted DBAs in the implementation of the data models.
  • Closely worked with ETL process development team.
  • Writing and executing customized SQL code for ad hoc reporting duties and used other tools for routine report generation.
  • Worked on R packages to interface with Caffe Deep Learning Framework.
  • Maintained current documentation for all primary and backup responsibilities.
  • Worked as part of a team of Data Management professionals supporting a Portfolio of development projects both regional and global in scope.
  • Applied organizational best practices to enable application project teams to produce data structures that fully meet application needs for accurate, timely, and consistent data that fully meets its intended purposes.
  • Conducted peer reviews of completed data models and plans to ensure quality and integrity from data capture through usage and archiving.
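
A custom test harness of the kind described above can be as simple as registering named checks, running them all, and collecting pass/fail results. A sketch under that assumption; the sample checks are invented:

```python
# Minimal test-harness sketch: run each named check and summarize results.
def run_harness(checks):
    """Run each named check; return {name: 'PASS' or 'FAIL: <reason>'}."""
    results = {}
    for name, check in checks.items():
        try:
            check()
            results[name] = "PASS"
        except AssertionError as exc:
            results[name] = "FAIL: %s" % exc
    return results

def check_totals_match():
    # Illustrative reconciliation check.
    assert sum([10, 20, 30]) == 60, "totals differ"

def check_no_nulls():
    # Illustrative data-quality check that deliberately fails here.
    rows = [{"id": 1}, {"id": None}]
    assert all(r["id"] is not None for r in rows), "null id found"

results = run_harness({
    "totals_match": check_totals_match,
    "no_nulls": check_no_nulls,
})
print(results)
```

Catching only `AssertionError` keeps genuine crashes visible; a production harness would usually log unexpected exceptions separately.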

Environment: Power Designer, Qlikview 11.6, Sybase12, Windows NT, MS Excel, MS Visio, DB2, Teradata, Oracle 10g/9i, XML files, Tableau, Cognos Impromptu and Power Play, Python, SSRS, R, Business Objects.

Confidential, Kansas City, KS

Data Modeler/ Data analyst

Responsibilities:

  • Communicated with users and business analysts to gather requirements.
  • Involved in business process modeling using UML through Rational Rose.
  • Used Sybase Power Designer tool for relational database and dimensional data warehouse designs.
  • Designed a Star Schema based Data warehouse model to accomplish building the central Data warehouse.
  • Involved in the data model changes in XML format.
  • Normalized the database up to 3NF to fit the snowflake schema of the data warehouse.
  • Involved in developing the central repository, populated the metadata and maintained the proper working of the system.
  • Created entity process association matrices, entity-relationship diagrams, functional decomposition diagrams and data flow diagrams.
  • Redefined many attributes and relationships in the reverse-engineered model and removed unwanted data as part of data analysis work.
  • Defined and processed the Facts and Dimensions.
  • Performed data cleansing using Perl scripts and the Informatica tool.
  • Comprehended the existing data model and recognized design for improved performance of the system and documented it.
  • Used Sybase Power Designer for creating tables using Forward Engineering.
  • Arranged brainstorming sessions with project focus groups.
  • Identified KPIs (Key Performance Indicators) for decision support, making use of the Aggregator stage.
  • Created documentation and test cases, worked with users for new module enhancements and testing.
  • Worked with business analysts to design weekly reports using a combination of Cognos Impromptu and PowerPlay multidimensional reports.
  • Validated the model against responses to questionnaires from analysts.
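
The data-cleansing step described above (trimming, case normalization, null-token handling) can be sketched in Python rather than Perl or Informatica; the field names and the set of null tokens are assumptions:

```python
# Data-cleansing sketch: trim whitespace, map null tokens to None, and
# standardize state codes to upper case. Field names are illustrative.
NULL_TOKENS = {"", "n/a", "na", "null", "-"}

def cleanse(record):
    out = {}
    for key, value in record.items():
        value = value.strip()
        if value.lower() in NULL_TOKENS:
            out[key] = None            # normalize null tokens
        elif key == "state":
            out[key] = value.upper()   # standardize state codes
        else:
            out[key] = value
    return out

raw = {"name": "  acme corp ", "state": "ca", "phone": "N/A"}
cleaned = cleanse(raw)
print(cleaned)
```

A real cleansing pass would drive the per-field rules from a mapping document rather than hard-coding them per key.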

Environment: ER Studio, Rational Requisite Pro, Crystal Reports, Cognos Impromptu and PowerPlay, Informatica PowerCenter/PowerMart, MS Office.

Confidential

Data Analyst

Responsibilities:

  • Involved in requirement gathering and database design and implementation of star-schema, snowflake schema/dimensional data warehouse using Erwin 7.0.
  • Initiated Use Case Analysis using UML, which provided the framework for potential use case deliverables and their inter-relationships.
  • Involved in the study of the business logic and understanding the physical system and the terms and condition for sales Data mart.
  • Generated meta-data reports from data models.
  • Used Erwin for creating logical and physical data models for Oracle and Teradata database design, star schema design, data analysis, documentation, implementation and support.
  • Used SQL Server Integrations Services (SSIS) for extraction, transformation, and loading data into target system from multiple sources
  • Created SSIS packages to clean and load data to data warehouse.
  • Created package to transfer data between OLTP and OLAP databases.
  • Created SSIS packages using Pivot Transformation, Fuzzy Lookup, Derived Column, Conditional Split, Term Extraction, Aggregate, Execute SQL Task, Data Flow Task and Execute Package Task to generate underlying data for the reports and to export cleaned data from Excel spreadsheets, text files, MS Access and CSV files to the data warehouse.
  • Created a table to save dirty records.
  • Extracted data from XML to SQL server.
  • Created SSIS packages for data importing, cleansing and parsing; extracted, cleaned and validated the data.
  • Created complex queries to automate data profiling process needed to define the structure of the pre-staging and staging area.
  • Struck a balance between scope of the project and user requirements by ensuring that it covered minimum user requirements without making it too complex to handle.
  • Requirement gathering from the users. A series of meetings were conducted with the business system users to gather the requirements for reporting.
  • Involved in mapping the load data to the target based on primary and foreign key relationships, considering the constraint-based load order.
  • Designed the mapping with a Slowly Changing Dimension Type 2 to keep track of historical data.
  • Designed a mapping to process the incremental changes that exist in the source table. Whenever source data elements were missing in source tables, these were modified/added consistently with the third-normal-form-based OLTP source database.
  • Designed and developed changes and enhancements for the existing submission and refining programs to incorporate the new calculations.
  • Interacted with end users to identify the key dimensions and measures that were relevant and quantitative.
  • Involved in logical and physical designs and transform logical models into physical implementations for Oracle and Teradata.
  • Used Erwin7.0 tool for reverse engineering and target database schema conversion process.
  • Involved in overall testing and different team review meetings.
  • Worked on Master Data Management MDM.
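
The SCD Type 2 mapping described above end-dates the current dimension row and inserts a new current row whenever a tracked attribute changes, preserving full history. A minimal sketch with illustrative column names:

```python
from datetime import date

# One current dimension row per customer; columns are illustrative.
dim = [
    {"cust_id": 7, "city": "Denver", "start": date(2014, 1, 1),
     "end": None, "current": True},
]

def apply_scd2(dim_rows, cust_id, new_city, as_of):
    """Type 2 change: expire the current row and append a new version."""
    for row in dim_rows:
        if row["cust_id"] == cust_id and row["current"]:
            if row["city"] == new_city:
                return  # no change in the tracked attribute
            row["end"] = as_of       # end-date the old version
            row["current"] = False
    dim_rows.append({"cust_id": cust_id, "city": new_city,
                     "start": as_of, "end": None, "current": True})

apply_scd2(dim, 7, "Austin", date(2015, 6, 1))
print(dim)
```

Fact rows loaded after the change would join to the new version via the date range, so history stays queryable "as of" any point in time.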

Environment: HP-Unix 11.11, Oracle 11g, Teradata 12, MS Analysis manager with SQL Server 2008, MS SQL Server 2008, SQL Server 2008 Integration Services, ERwin 7.0, ER/ Studio 8, TOAD 9, XML, Microsoft Excel 2007, Access 2007, Business Objects SI, Acute reports 10, Power Designer.

Confidential

Business Analyst/Data Analyst

Responsibilities:

  • Involved in gathering user requirements along with the Business Analyst.
  • Participated in creating the logical model of an online processing system for a large financial institution using Erwin.
  • Worked with DBAs to generate physical model.
  • Worked on Bill Inmon methodologies for modeling.
  • Created tables, views, procedures and SQL scripts and mapping documents.
  • Worked on slowly changing dimensions (SCD) and hierarchical dimensions
  • Worked on the conversion of data stored in flat files into Oracle tables.
  • Designed and developed SQL procedures, functions and packages to create summary tables.
  • Generated ad-hoc reports using Crystal Reports.
  • Developed database backup and restore policy.
  • Expertise in Creating Report Models for building Ad-hoc Reports Using SSRS.
  • Expertise in Generating Reports using SSRS and Excel Spreadsheet.
  • Assisted in the daily entry of home mortgage data into the system.
  • Analyzed home loans for accuracy and errors.
  • Disclosed seller concessions and lender exceptions to credit the borrower.
  • Disclosed fees on home loans to charge the borrower.
  • Expertise in Creating Various Parameterized, Cascaded, Linked, Drill-through and Drill-down Reports.
  • Hands on Experience in creating ETL Packages using SQL Server 2005 Integration Services (SSIS).
  • Good Understanding in Database and Data Warehousing Concepts.

Environment: Oracle, SQL Server, SSIS, SSRS, ERwin, HTML, Crystal Reports, MS Office, Windows 7.
