Sr. Business Data Analyst Resume
Deerfield, IL
PROFESSIONAL SUMMARY:
- Over 10 years of Information Technology (IT) experience as a Sr. Data Modeler/Data Architect and Data Analyst in the design, development, testing and maintenance of data warehouse, business intelligence and operational data systems.
- Excellent understanding of MDM approaches, including creating a data dictionary and using Informatica or other tools to map source systems to the target MDM data model.
- Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems (RDBMS) and from RDBMS to HDFS.
- Knowledge of and working experience with big data tools such as Hadoop and AWS Redshift.
- Expertise in SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS)
- Excellent understanding of and working experience with industry-standard System Development Life Cycle (SDLC) methodologies, including Rational Unified Process (RUP), Agile and Waterfall.
- Experience in Logical/Physical Data Modeling, Relational and Dimensional Modeling for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) systems (ODS, ADS and Data Marts).
- Experience in developing Entity-Relationship diagrams and modeling transactional databases and data warehouses using tools like Erwin, ER/Studio and Power Designer.
- Efficient in building enterprise data warehouses using the Kimball and Inmon data warehousing methodologies.
- Experience in designing star and snowflake schemas for data warehouses using tools such as Erwin Data Modeler, Power Designer and Embarcadero ER/Studio (a DDL sketch follows this summary).
- Good knowledge of SQL queries and of creating database objects such as stored procedures, triggers, packages and functions using SQL and PL/SQL to implement business logic.
- Good knowledge of Hive, Sqoop, MapReduce, Storm, Pig, HBase, Flume and Spark.
- Involved in various data modeling tasks including forward engineering, reverse engineering, complete compare, creating DDL scripts and creating subject areas.
- Good knowledge of big data and cloud technologies: Hadoop, AWS Cloud, Amazon Redshift, AWS EC2, AWS Lambda, MongoDB and Python.
- Good knowledge of Apache big data frameworks and standards, including Hadoop, Hive, Spark and Pig.
- Proficient in UML modeling, including use case diagrams, activity diagrams and sequence diagrams, using Rational Rose and MS Visio.
- Excellent understanding of Tableau workbooks and background tasks for validating targeted interactions.
- Experience with NoSQL databases and tools such as Apache HBase to handle massive tables containing billions of rows and millions of columns.
- Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain and Queryman), Teradata parallel support and Unix shell scripting.
- Well versed in conducting gap analysis, Joint Application Design (JAD) sessions, User Acceptance Testing (UAT), cost-benefit analysis and ROI analysis.
- Good understanding of normalization (1NF, 2NF, 3NF and BCNF) techniques for OLTP environments and denormalization techniques for improved database performance in OLAP environments.
- Experience in designing Enterprise Data warehouse, Reporting data stores (RDS) and Operational data stores (ODS).
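For illustration, a minimal star-schema sketch of the kind of dimensional design work described above, in ANSI SQL DDL; the table and column names (date_dim, product_dim, sales_fact) are hypothetical, not drawn from any specific engagement.

```sql
-- Minimal star-schema sketch (hypothetical names): one fact table
-- keyed to two dimensions. ANSI SQL DDL; adjust types per platform.
CREATE TABLE date_dim (
    date_key      INTEGER       NOT NULL PRIMARY KEY,  -- surrogate key, e.g. 20240131
    calendar_date DATE          NOT NULL,
    fiscal_year   SMALLINT      NOT NULL
);

CREATE TABLE product_dim (
    product_key   INTEGER       NOT NULL PRIMARY KEY,  -- surrogate key
    product_code  VARCHAR(20)   NOT NULL,              -- natural key from source
    product_name  VARCHAR(100)  NOT NULL
);

CREATE TABLE sales_fact (
    date_key      INTEGER       NOT NULL REFERENCES date_dim (date_key),
    product_key   INTEGER       NOT NULL REFERENCES product_dim (product_key),
    quantity_sold INTEGER       NOT NULL,
    sales_amount  DECIMAL(12,2) NOT NULL,
    PRIMARY KEY (date_key, product_key)
);
```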
TECHNICAL SKILLS:
Data Modeling Tools: Erwin 9.7, IBM InfoSphere Data Architect, ER/Studio 17, Power Designer and Oracle SQL Developer.
Big Data Technology: MapReduce, HBase, HDFS, Sqoop, Hadoop, Hive, Pig
Cloud Platforms: Amazon EC2, Elasticsearch, Elastic Load Balancing.
Operating System: Windows 8/10, UNIX, Sun Solaris.
Database Tools: Microsoft SQL Server 16.0, Teradata 16.0, Oracle 12c and MS Access.
BI Tools: Tableau 10, Tableau Server, Tableau Reader, Crystal Reports
Packages: Microsoft Office 2016, Microsoft Project 2016, SAP, Microsoft Visio and SharePoint Portal Server.
Version Tool: VSS, SVN, CVS.
Programming Languages: Oracle PL/SQL, UNIX Shell Scripting
Methodologies: Agile, Ralph Kimball and Bill Inmon data warehousing methodologies, Rational Unified Process (RUP), Rapid Application Development (RAD), and Joint Application Development (JAD).
WORK EXPERIENCE:
Confidential - Deerfield, IL
Sr. Business Data Analyst
Responsibilities:
- Interviewed Business Users to gather Requirements and analyzed the feasibility of their needs by coordinating with the project manager and technical lead.
- Supported project metrics analysis, team communication, resource planning, risk analysis, report generation and documentation control using the Workbench application.
- Ensured data integrity, data security and process optimization by identifying trends and providing reports as necessary.
- Established a business analysis methodology around the Rational Unified Process (RUP); developed use cases and project plans and managed scope.
- Reviewed and approved the QA team's test scripts and participated in the QA team's defect meetings to review defects entered in Quality Center.
- Wrote UAT testing scripts, conducted UAT planning, and held meetings with business users for UAT testing.
- Performed gap analysis of the processes to identify and validate requirements.
- Identified and documented data sources and transformation rules required to populate and maintain data warehouse content.
- Collected requirements from business users and took part in preparing technical specifications.
- Created new database objects such as tables, procedures, functions, indexes and views.
- Designed constraints and rules, and set primary, foreign, unique and default keys for the hierarchical database.
- Developed stored procedures in SQL Server to standardize DML transactions such as insert, update and delete against the database (a T-SQL sketch follows this list).
- Created databases, tables, stored procedures, DDL/DML triggers, views, user-defined data types, functions, cursors and indexes using T-SQL.
- Wrote queries and analyzed SQL Server views and stored procedures in the underlying data warehouse.
- Identified the dimension and fact tables, designed the data warehouse using a star schema and designed a new schema for the data mart.
- Created SSIS packages to load data from flat files, Excel and Access into SQL Server using connection managers.
- Transformed data using SSIS components such as Derived Column, Conditional Split, Data Conversion, Aggregate and Pivot transformations.
- Created data transformation tasks such as BULK INSERT to import data from clients.
- Managed production and UAT defects raised by business end-users using HP Quality Center; analyzed user stories and data attributes.
- Documented and managed data dictionaries of systems.
- Supported logical and physical mapping efforts.
- Helped review the accuracy of the logical data models and data mapping transformation rules.
- Developed all required stored procedures, user-defined functions and triggers using T-SQL.
- Created SSRS reports, including prompt-generated and parameterized reports.
- Created reports from OLAP sources, as well as sub-reports, bar charts and matrix reports, using SSRS.
- Worked on preparing test plans and test data and executing test cases to verify that application functionality met user requirements.
- Developed test plans with the QA team and helped test every scenario using the Mercury TestDirector tool for system testing.
- Used Test Case distribution and development reports to track the progress of test case planning, implementation and execution results.
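A sketch of the kind of stored procedure described above, standardizing DML behind one T-SQL call; the dbo.Customer table and its columns are hypothetical names, not the actual client schema.

```sql
-- Hypothetical T-SQL procedure standardizing DML against one table:
-- update the row if it exists, otherwise insert it.
CREATE PROCEDURE dbo.usp_UpsertCustomer
    @CustomerID   INT,
    @CustomerName VARCHAR(100),
    @City         VARCHAR(50)
AS
BEGIN
    SET NOCOUNT ON;

    IF EXISTS (SELECT 1 FROM dbo.Customer WHERE CustomerID = @CustomerID)
        UPDATE dbo.Customer
        SET    CustomerName = @CustomerName,
               City         = @City
        WHERE  CustomerID = @CustomerID;
    ELSE
        INSERT INTO dbo.Customer (CustomerID, CustomerName, City)
        VALUES (@CustomerID, @CustomerName, @City);
END;
```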
Environment: SQL, RUP, Quality Center, UAT, SQL Server 2017, DML, DDL, T-SQL, SSIS, SSRS, OLAP, Mercury TestDirector, Rational Rose, Agile, PL/SQL, HTML5, MS Office 2016, MS Visio 2016
Confidential - Denver, CO
Sr. Data Architect/Data Modeler
Responsibilities:
- Responsible for the data architecture design delivery, data model development, review, approval and Data warehouse implementation.
- Created logical and physical 3NF relational models based on XML data extracts from a transportation logistics application using ER/Studio.
- Used star schema and snowflake schema methodologies in building and designing the logical data model into dimensional models.
- Led a team responsible for the design and implementation of an ODS for a transportation and logistics management system.
- Documented logical, physical, relational and dimensional data models. Designed the Data Marts in dimensional data modeling using Star and Snowflake schemas.
- Attended numerous sessions to understand the healthcare domain and the concepts related to the project (healthcare informatics).
- Collaborated with other data modelers to understand and implement best practices within the organization.
- Involved in Data Architecture, Data profiling, data mapping and Data architecture artifacts design.
- Developed strategies for warehouse implementation, data acquisition, and archive recovery.
- Used MultiLoad, FastLoad and TPump to migrate data from Oracle to Teradata.
- Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using ER Studio 9.7.
- Drove the technical design of AWS solutions by working with customers to understand their needs.
- Conducted numerous POCs (Proof of Concepts) to efficiently import large data sets into the database from AWS S3 Bucket.
- Worked on analyzing source systems and their connectivity, discovery, data profiling and data mapping.
- Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from the Teradata database (an example query follows this list).
- Collected large amounts of log data using Apache Flume and aggregated it using Pig in HDFS for further analysis.
- Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
- Designed and architected AWS cloud solutions for data and analytics workloads such as warehouses, big data, data lakes, real-time streams and advanced analytics.
- Interacted with end-users to gather business requirements and strategize the data warehouse processes.
- Wrote complex Netezza views to improve performance and push the load down to the database rather than handling it in the ETL tool.
- Involved in data model reviews with the internal data architect, business analysts and business users, explaining the data model to ensure it was in line with business requirements.
- Created DDL scripts using ER Studio and source to target mappings to bring the data from source to the warehouse.
- Worked with MapReduce frameworks such as Hadoop and associated tools (Pig, Sqoop, etc.).
- Used ETL methodology to support data extraction, transformation and loading processes in a complex MDM environment using Informatica.
- Generated the framework model from IBM Data Architect for the Cognos reporting team.
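A sketch of the kind of ad-hoc Teradata query described above; shipment_fact, carrier_dim and date_dim are hypothetical names, and Teradata's QUALIFY clause keeps one row per shipment.

```sql
-- Hypothetical ad-hoc Teradata query: join shipment facts to carrier and
-- date dimensions, keeping only the latest row per shipment via QUALIFY.
SELECT  s.shipment_id,
        c.carrier_name,
        d.calendar_date,
        s.freight_cost
FROM    shipment_fact s
JOIN    carrier_dim   c ON c.carrier_key = s.carrier_key
JOIN    date_dim      d ON d.date_key    = s.ship_date_key
QUALIFY ROW_NUMBER() OVER (PARTITION BY s.shipment_id
                           ORDER BY d.calendar_date DESC) = 1;
```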
Environment: ER/Studio 9.7, Teradata 15, Amazon Redshift, AWS, Oracle 12c, ODS, OLAP, OLTP, Hadoop 3.0, MapReduce, HDFS, Sqoop 1.4, Apache Flume 1.8, Agile, Kafka, Pig 0.17, Oozie, Cassandra 3.11, MDM, Informatica 9.6, NoSQL, Unix.
Confidential - Broadway, NY
Sr. Data Architect/Data Modeler
Responsibilities:
- Responsible for the data architecture design delivery, data model development, review, approval and Data warehouse implementation.
- Designed and developed the conceptual, then logical and finally physical data models to meet reporting needs.
- Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
- Implemented logical and physical relational databases and maintained database objects in the data model using Erwin.
- Responsible for Big data initiatives and engagement including analysis, brainstorming, POC, and architecture.
- Used an Agile methodology for data warehouse development, managed with Kanbanize.
- Worked with the Hadoop ecosystem covering HDFS, HBase, YARN and MapReduce.
- Performed data mapping and data design (data modeling) to integrate data across multiple databases into the EDW.
- Designed both 3NF Data models and dimensional Data models using Star and Snowflake schemas.
- Involved in Normalization/Denormalization techniques for optimum performance in relational and dimensional database environments.
- Developed Master data management strategies for storing data.
- Worked with Data Stewards and Business analysts to gather requirements for MDM Project.
- Involved in testing, including unit testing, system integration testing and regression testing.
- Worked with SQL Server Analysis Services (SSAS) and SQL Server Reporting Service (SSRS).
- Worked on data modeling and advanced SQL with columnar databases on AWS.
- Performed reverse engineering of the dashboard requirements to model the required data marts.
- Developed Source to Target Matrix with ETL transformation logic for ETL team.
- Cleansed, extracted and analyzed business data on a daily basis and prepared ad-hoc analytical reports using Excel and T-SQL.
- Created Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
- Handled performance requirements for databases in OLTP and OLAP models.
- Conducted meetings with business and development teams for data validation and end-to-end data mapping.
- Responsible for Metadata Management, keeping up to date centralized metadata repositories using Erwin modeling tools.
- Involved in debugging and tuning PL/SQL code and optimizing queries for the SQL database.
- Led data migration from legacy systems into modern data integration frameworks from conception to completion.
- Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from legacy DB2 and SQL Server 2014 database systems.
- Managed the meta-data for the Subject Area models for the Data Warehouse environment.
- Generated DDL and created the tables and views in the corresponding architectural layers.
- Handled importing of data from various data sources, performed transformations using MapReduce, loaded data into HDFS and extracted data from MySQL into HDFS using Sqoop.
- Involved in extensive back-end testing by writing SQL queries and PL/SQL stored procedures to extract data from the SQL database (a sample reconciliation query follows this list).
- Participated in code/design reviews and provided input into best practices for report and universe development.
- Involved in Netezza administration activities such as backup/restore, performance tuning and security configuration.
- Involved in validation of the OLAP reports, including unit testing and system testing of the OLAP report functionality and the data displayed in the reports.
- Created a high-level, industry-standard, generalized data model and converted it into logical and physical models at later stages of the project using Erwin and Visio.
- Participated in Performance Tuning using Explain Plan and TKPROF.
- Involved in translating business needs into long-term architecture solutions and reviewing object models, data models and metadata.
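A sketch of the kind of back-end test described above: a simple source-to-target reconciliation. The stg_orders and dw_order_fact names are hypothetical, not the project's actual tables.

```sql
-- Hypothetical source-to-target reconciliation: row counts and amount
-- totals should match between staging and the warehouse fact table.
SELECT 'source'          AS side,
       COUNT(*)          AS row_cnt,
       SUM(order_amount) AS total_amount
FROM   stg_orders
UNION ALL
SELECT 'target',
       COUNT(*),
       SUM(order_amount)
FROM   dw_order_fact;
-- The two result rows should match; a mismatch flags a load defect.
```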
Environment: Erwin 9.5, HDFS, AWS, HBase, Hadoop 3.0, Metadata, MS Visio 2016, SQL Server 2016, Agile, PL/SQL, ODS, OLAP, OLTP, flat files, MDM.
Confidential - Brentwood, TN
Sr. Data Analyst
Responsibilities:
- Involved in Business and Data analysis during requirements gathering. Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
- Performed segmentation to extract data and create lists to support direct marketing mailings and campaigns.
- Defined data requirements and elements used in XML transactions; reviewed and recommended database modifications.
- Analyzed and rectified data in source systems and Financial Data Warehouse databases.
- Generated and reviewed reports to analyze data using different Excel formats; documented requirements for numerous ad-hoc reporting efforts.
- Troubleshot, resolved and escalated data-related issues and validated data to improve data quality.
- Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
- Involved in Regression, UAT and Integration testing
- Participated in testing of procedures and data, utilizing PL/SQL, to ensure the integrity and quality of data in the data warehouse.
- Performed metrics reporting, data mining and trend analysis in a help desk environment using Access.
- Gathered data from the help desk ticketing system and wrote ad-hoc reports, charts and graphs for analysis.
- Compiled data analysis, sampling, frequencies and statistics using SAS.
- Worked with SQL Server and T-SQL in constructing tables and applying normalization and denormalization techniques to database tables.
- Identified and reported on various computer problems within the company to upper management.
- Reported on emerging trends to identify changes or trouble within the systems, using Access and Crystal Reports.
- Performed User Acceptance Testing (UAT) to ensure that proper functionality is implemented.
- Guided, trained and supported teammates in testing processes, procedures, analysis and quality control of data, drawing on experience with Oracle, SQL, Unix and relational databases.
- Maintained Excel workbooks, including developing pivot tables, exporting data from external SQL databases, producing reports and updating spreadsheet information.
- Modified user profiles, which included changing users' cost center locations and changing users' authority to grant monetary amounts to certain departments; these amounts were part of the overall budget granted per department.
- Extracted data from DB2 and COBOL files and converted it to analytic SAS datasets.
- Deleted users from cost centers, removed users' authority to grant certain monetary amounts to certain departments, and deleted certain cost centers and profit centers from the database.
- Created a report, using the SAP reporting feature, that showed which users had not scanned journal voucher documents into the system.
- Created Excel pivot tables showing users that had not scanned journal voucher documents; users could find documents by double-clicking their names within the pivot table.
- Loaded new or modified data into the back-end Oracle database.
- Optimized and tuned several complex SQL queries for better performance and efficiency.
- Created various PL/SQL stored procedures for dropping and recreating indexes on target tables (a PL/SQL sketch follows this list); worked on issues with migration from development to testing.
- Designed and developed Unix shell scripts as part of the ETL process to automate loading and pulling of data.
- Validated cube and query data from the reporting system back to the source system; tested analytical reports using Analysis Studio.
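A sketch of the index drop/recreate procedures described above, in Oracle PL/SQL; the procedure, index and table names are hypothetical illustrations of the pattern.

```sql
-- Hypothetical Oracle PL/SQL sketch: drop an index before a bulk load
-- and recreate it afterwards. Names are illustrative only.
CREATE OR REPLACE PROCEDURE drop_target_index (p_index_name IN VARCHAR2) AS
BEGIN
    -- Dynamic SQL because DDL cannot be issued statically in PL/SQL
    EXECUTE IMMEDIATE 'DROP INDEX ' || p_index_name;
END drop_target_index;
/

CREATE OR REPLACE PROCEDURE recreate_sales_date_index AS
BEGIN
    EXECUTE IMMEDIATE
        'CREATE INDEX sales_fact_date_idx ON sales_fact (date_key)';
END recreate_sales_date_index;
/
```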
Environment: SAS/BASE, SAS/Access, SAS/Connect, Informatica PowerCenter (PowerCenter Designer, Workflow Manager, Workflow Monitor), SQL*Loader, Cognos, Oracle 11g, SQL Server 2014, Erwin 9.2, Windows 7, TOAD
Confidential . - Atlanta, GA
Data Analyst/ Data Modeler
Responsibilities:
- Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
- Part of a team conducting logical data analysis and data modeling JAD sessions; communicated data-related standards.
- Performed Reverse Engineering of the current application using Erwin, and developed Logical and Physical data models for Central Model consolidation.
- Translated logical data models into physical database models and generated DDLs for DBAs.
- Performed data analysis and data profiling and worked on data transformations and data quality rules.
- Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked with data quality issues.
- Collected, analyzed and interpreted complex data for reporting and/or performance trend analysis.
- Wrote and executed unit, system, integration and UAT scripts in data warehouse projects.
- Extensively used ETL methodology to support data extraction, transformation and loading processes in a complex data warehouse using Informatica.
- Developed and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access.
- Involved in writing T-SQL and working on SSIS, SSRS, SSAS, data cleansing, data scrubbing and data migration.
- Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns as part of Data Analysis responsibilities.
- Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Erwin.
- Wrote complex SQL queries to validate the data against different kinds of reports generated by Business Objects XIR2 (a sample validation query follows this list).
- Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle and flat files.
- Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements.
- Created SQL tables with referential integrity and constraints, and developed queries using SQL, SQL*Plus and PL/SQL.
- Performed gap analysis of the current state against the desired state and documented requirements to control the gaps identified.
- Developed the batch program in PL/SQL for OLTP processing and used Unix shell scripts to schedule it via crontab.
- Identified & record defects with required information for issue to be reproduced by development team.
- Worked on the reporting requirements and was involved in generating reports for the data model using Crystal Reports.
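A sketch of the kind of validation query described above: recomputing a report total directly from the warehouse so it can be compared with the Business Objects figure. The account_fact table, its columns and the snapshot date are hypothetical.

```sql
-- Hypothetical validation query: recompute report-level aggregates
-- directly from the warehouse for comparison with the BO report.
SELECT   region_code,
         COUNT(DISTINCT account_id) AS account_cnt,
         SUM(balance_amt)           AS total_balance
FROM     account_fact
WHERE    snapshot_date = DATE '2010-12-31'
GROUP BY region_code
ORDER BY region_code;
```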
Environment: Erwin 8.5, PL/SQL, Business Objects XIR2, Informatica 8.6, Oracle 10g, Teradata R13, Teradata SQL Assistant 12.0, Flat Files
Confidential
Data Analyst
Responsibilities:
- Performed Data analysis for the existing Data warehouse and changed the internal schema for performance.
- Helped to maintain systems and provided overall support to internal and external customers.
- Worked with internal IT resources to communicate both technical and non-technical requirements, proposed solution architecture, and supported implementation of new features and system upgrades.
- Created packages to transfer data between OLTP and OLAP databases.
- Created pivot tables and charts using worksheet data and external resources; modified pivot tables, sorted items, grouped data, and refreshed and formatted pivot tables.
- Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
- Prepared reports by collecting, merging, analyzing, and summarizing information.
- Developed and implemented a custom inventory management database utilizing Excel (built regression analysis models) and transferred it to MS Access, which contributed to more efficient inventory control and sales projection.
- Documented the complete process flow to describe program development, logic, testing, implementation, application integration and coding.
- Wrote the test cases and technical requirements and got them electronically signed off.
- Performed data analysis and data profiling using complex SQL on various source systems.
- Created SQL scripts to identify keys, data anomalies and data validation issues (example checks follow this list).
- Performed Gap Analysis to check the compatibility of the existing system infrastructure with the new business requirement.
- Worked heavily on SQL query optimization, tuning queries and reviewing their performance metrics.
- Used MS Visio for business flow diagrams and defined the workflow.
- Used cascaded parameters to generate a report from two different Data Sets.
- Involved with query optimization to increase the performance of the report.
- Built risk templates for team members to enhance the business risk framework of Asset Management Services; results from these templates were reported to senior management.
- Translated data from multiple sources into useful information and business drivers utilized by senior management for strategic decision making.
- Acknowledged by superiors for excellent communication of findings and projections in both narrative and oral reports.
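Sketches of the kind of anomaly checks described above; the src_customer and src_order names are hypothetical stand-ins for the source tables.

```sql
-- Hypothetical anomaly checks. 1) Duplicate natural keys in a source table:
SELECT   customer_code,
         COUNT(*) AS dup_cnt
FROM     src_customer
GROUP BY customer_code
HAVING   COUNT(*) > 1;

-- 2) Orphaned foreign keys: orders that reference no known customer.
SELECT o.order_id,
       o.customer_code
FROM   src_order o
LEFT JOIN src_customer c
       ON c.customer_code = o.customer_code
WHERE  c.customer_code IS NULL;
```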
Environment: Erwin 8.0, SQL, MS Office, MS Visio, SAS/BASE, SAS/Access, SAS/Connect, MS Access, SSRS, Windows 2000, Oracle 9i