Sr. Consultant - Technical Data Analyst Resume
Atlanta, GA
SUMMARY:
- Experience in Data Analysis, Data Profiling, Data Integration, Data Migration, Data Governance, Metadata Management, Master Data Management, and Configuration Management.
- Experience with Data Modeling: creating Star and Snowflake schemas, Fact and Dimension tables, and physical and logical data models using Erwin and Embarcadero.
- Ability to collaborate with peers in business and technical areas to deliver optimal business process solutions in line with corporate priorities.
- Assisted with various testing tasks such as system integration testing, UAT, sanity testing, and smoke testing.
- Collaborated with the lead Data Architect to model the data warehouse in accordance with FSLDM subject areas, 3NF format, and Snowflake schema.
- Working knowledge of DiCom and Problem Loan Management applications.
- Help create ETL design document for implementation of data flow from source system to target.
- Knowledge in Business Intelligence tools like Business Objects, Cognos, Tableau and OBIEE
- Experience with Teradata as the target for data marts; worked with BTEQ, FastLoad, and MultiLoad.
- Experience with the reporting tool Microsoft Power BI, building various vendor-spend and trade-cycle reports.
- Experience implementing MDM software solutions with Informatica MDM (formerly Siperian). Strong exposure to scheduling tools such as AutoSys and Control-M.
- Strong experience in interacting with stakeholders/customers, gathering requirements through interviews, workshops, and existing system documentation or procedures, defining business processes, identifying and analyzing risks using appropriate templates and analysis tools.
- Experience in various phases of Software Development life cycle (Analysis, Requirements gathering, Designing) with expertise in documenting various requirement specifications, functional specifications, Test Plans, Data Validation, Source to Target mappings, SQL Joins, Data Cleansing.
- Documented new data to support source-to-target mapping; updated documentation for existing data and assisted with data profiling to maintain data sanitation and validation.
- Experience in conducting Joint Application Development (JAD) sessions for requirements gathering, analysis, design and Rapid Application Development (RAD) sessions to converge early toward a design acceptable to the customer and feasible for the developers and to limit a project’s exposure to the forces of change
- Experience in coding SQL/PL-SQL, including procedures, triggers, and packages.
- Good understanding of Relational Database Design, Data Warehouse/OLAP concepts and methodologies.
- Experience working on Data quality tools Informatica IDQ (9.1), Informatica MDM (9.1).
- Worked on platforms including Windows 95/98/NT and UNIX (Sun Solaris, AIX, HP-UX).
- End-to-end data transformation and analysis to provide the business with consumption-layer data.
- Some knowledge of HDFS (Hadoop Distributed File System) on the Hadoop platform.
- Experience working across multiple industries.
- Implemented optimization techniques for better performance on both the ETL side and the database side.
- Experience in creating functional/technical specifications, data design documents, data lineage documents based on the requirements.
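The data-profiling work listed above typically starts with column-level null and distinct counts to assess data sanitation before mapping. A minimal sketch, using SQLite in place of the production databases; the table and column names are hypothetical examples:

```python
# Illustrative data-profiling sketch: per-column null and distinct counts,
# the kind of checks run before source-to-target mapping.
# Table and column names here are hypothetical.
import sqlite3

def profile_table(conn, table, columns):
    """Return {column: (null_count, distinct_count)} for the given table."""
    profile = {}
    cur = conn.cursor()
    for col in columns:
        # COUNT(*) counts all rows; COUNT(col) skips NULLs, so the
        # difference is the null count for that column.
        cur.execute(f"SELECT COUNT(*) - COUNT({col}), COUNT(DISTINCT {col}) FROM {table}")
        profile[col] = cur.fetchone()
    return profile

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER, region TEXT)")
conn.executemany("INSERT INTO customer VALUES (?, ?)",
                 [(1, "East"), (2, "East"), (3, None), (4, "West")])
print(profile_table(conn, "customer", ["id", "region"]))
```

The same two aggregates translate directly to Teradata or Oracle SQL, so the profile can be collected on any of the source systems named above.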
TECHNICAL SKILLS:
Languages: T-SQL, PL/SQL, SQL, C, C++, XML, HTML, DHTML, HTTP, Matlab, DAX, Python
Databases: SQL Server 2014/2012/2008/2005/2000, MS Access, Oracle 11g/10g/9i, and Teradata
DWH / BI Tools: Microsoft Power BI, Tableau, SSIS, SSRS, SSAS, Business Intelligence Development Studio (BIDS), Visual Studio, Crystal Reports, Informatica 6.1.
Database Design Tools and Data Modeling: MS Visio, ERWIN 4.5/4.0, Star Schema/Snowflake Schema modeling, Fact & Dimensions tables, physical & logical data modeling, Normalization and De-normalization techniques, Kimball & Inmon Methodologies
Tools and Utilities: SQL Server Management Studio, SQL Server Enterprise Manager, SQL Server Profiler, Import & Export Wizard, Visual Studio .NET, Microsoft Management Console, Visual SourceSafe 6.0, DTS, Crystal Reports, Power Pivot, ProClarity, Microsoft Office, Excel Power Pivot, Excel Data Explorer, Tableau, JIRA
PROFESSIONAL EXPERIENCE:
Confidential, Atlanta, GA
Sr. Consultant - Technical Data Analyst
Responsibilities:
- Part of the DPO and trade cycle optimization team as Sr. Technical Consultant, Data Analysis.
- Played a critical role in collecting data from different data sources and systems such as SAP, JDE, and Lubes.
- Gathered data requirements for project implementation and passed them along to the data services teams.
- Collaborated with senior management consultants and business analysts to lay down the data requirements and data architecture.
- Collaborated with various data services teams for specifying the data requirement.
- Worked in an Agile environment, taking into consideration frequent changes in data requirements from the client; used JIRA to track progress.
- Conducted knowledge transfer (KT) sessions with the client to understand their various data management systems and the underlying data.
- Created metadata and a data dictionary for future data use/refresh for the same client.
- Mapping flow of trade cycle data from source to target and documenting the same.
- Transformed and merged all the weekly client data into a yearly file using SSIS ETL packages.
- Analyzed and extracted data from the Oracle DB using SQL in SSMS.
- Structuring the Data Marts to store and organize the customer’s data.
- Collaborated with the DBA to archive outdated data and perform database optimizations.
- Running SQL scripts, creating indexes, stored procedures for data analysis.
- Used various BI tools such as SSRS, Tableau, and Power BI to visually present data analysis findings.
- Creating reports on Microsoft Power BI on the data extracted from the DB.
- Writing DAX scripts on Power BI for reporting purposes.
- Presented the reports to the client and senior management as the consumption layer.
- Provided strategic vision and direction with operational and business leaders for enterprise information initiatives enabling Executives to make information-based decisions.
- Performed QA on the data extracted, transformed, and exported to Excel.
- Performed continuous data validation post-implementation, assuring management of data sanitation and data quality.
- Documented the data lineage from receiving the client's data through to its analysis by top-level management; documented the QA test plans and data analysis scripts for better understanding by the business.
- Documenting the new incremental data from various sources (client and external sources)
- Technology used - SSMS, SSIS, Power BI, Microsoft Excel, Microsoft Power Point
- Environment - Oracle, MS Office, Microsoft reporting tools, etc.
- Proficiency - Writing SQL scripts, building ETL packages for data transformation, writing stored procedures, Building views, Data analysis, performing QA on the data.
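The weekly-to-yearly consolidation described above can be sketched as a single aggregation step, with SQLite standing in for the SSIS/SQL Server pipeline; table names, columns, and figures are hypothetical examples:

```python
# Sketch of consolidating weekly client extracts into a yearly vendor-spend
# table, analogous to the SSIS merge step described above.
# SQLite is used for illustration; names and values are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE weekly_spend (week INTEGER, vendor TEXT, amount REAL)")
conn.executemany("INSERT INTO weekly_spend VALUES (?, ?, ?)", [
    (1, "VendorA", 100.0), (1, "VendorB", 50.0),
    (2, "VendorA", 200.0), (2, "VendorB", 75.0),
])

# Aggregate the weekly rows into a yearly vendor-spend summary.
conn.execute("""
    CREATE TABLE yearly_spend AS
    SELECT vendor, SUM(amount) AS total_spend, COUNT(DISTINCT week) AS weeks
    FROM weekly_spend
    GROUP BY vendor
""")
for row in conn.execute("SELECT * FROM yearly_spend ORDER BY vendor"):
    print(row)
```

In the actual pipeline the same GROUP BY logic would sit inside an SSIS data flow or a stored procedure, with the yearly table feeding the Power BI reports.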
Confidential, Buffalo, NY
Sr. Data Analyst
Responsibilities:
- Played an integral role in Phase 2 of the CODS application development; involved in tasks such as requirements gathering, designing data models, data analysis and mapping, unit/system testing, ETL support, sanity testing, and production support.
- Worked in Oracle SQL Developer; created complex SQL scripts and stored procedures, wrote queries for data analysis, and used SQL constructs such as joins.
- Data Analysis for the credit review application such as DiCom and PLM.
- Used different approaches to data validation keeping in mind the nature of the data stored, e.g. Slowly Changing Dimension (SCD) Types 1, 2, and 3.
- Created stored procedures for Informatica to automate generation of CSV reports and the emailing process to predefined mailboxes for consumption by business users.
- Generated visualization reports on the performance of mining trucks using MS Excel and Tableau.
- Created/altered tables and views across database environments (Development, Test, Cert, and Prod); migrated script changes between environments to assist the test team in ensuring data validation.
- Gathered requirements from the end business users and end application users using the JAD methodology.
- Documented the gathered requirements, creating the BRD and functional specs for individual BRD items; also assisted in defining the project scope and a high-level timeline of deliverables, applying agile methodology at the high level and waterfall for individual deliverables.
- Worked on JIRA to track development progress, in scrum or Agile model.
- Liaised between the technical and functional teams while working directly with the VP of Credit Technology.
- Performed data analysis and data profiling on sample data pulled from the source systems, i.e., EDW (Teradata) and CRSSQL (Microsoft), a compilation of data from various banking systems such as AFS, CLC, CLN, DSI, and ILN.
- Created scenario reports for business users showcasing the data requirements for the Phase 2 release.
- Supporting the modeling team to create Logical Data Model and Physical Data Model in Star and Snowflake Schema using the ERWIN application.
- Helped create a dynamic model in the DW in 3NF format, with logic so the ETL is prepared to 'self-heal' on exposure to unexpected values from the source database.
- Inbound and outbound ETL data mapping for 3NF data model in FSLDM structure, created to store dynamic and static data.
- Identifying the optimal tables and route for the data flow through multiple Data warehouses till the data reaches the end Application.
- Functional understanding of DiCom application used to identify problem loans and assist the audit team to review the loans with high probability of default.
- Assisted the test team with UAT and system integration testing; carried out sanity testing as required and created the master test plan for the testing team to perform due diligence as soon as ETL jobs ran in the development phase.
- Carried out smoke testing to ensure data validation, i.e., that changes to the database were committed and the data reflected the changes as expected.
- Carried out BAU activities, such as monitoring the monthly data load from source to target and ensuring CA7 jobs kicked off as expected.
- Contributed to Data Architecture, Data Design and Data sourcing.
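The SCD Type 2 approach mentioned in the validation work above versions a dimension row whenever a tracked attribute changes: the current row is end-dated and a new current row is inserted. A minimal pure-Python sketch; the key, values, and dates are hypothetical:

```python
# Minimal sketch of Slowly Changing Dimension Type 2 handling: on a change to
# a tracked attribute, the current row is expired and a new current row added.
# Field names and dates are hypothetical examples.
from datetime import date

def apply_scd2(dimension, key, new_value, as_of):
    """dimension: list of dicts with key, value, valid_from, valid_to, current."""
    for row in dimension:
        if row["key"] == key and row["current"]:
            if row["value"] == new_value:
                return dimension          # no change, nothing to version
            row["valid_to"] = as_of       # expire the old version
            row["current"] = False
    dimension.append({"key": key, "value": new_value,
                      "valid_from": as_of, "valid_to": None, "current": True})
    return dimension

dim = [{"key": 42, "value": "Buffalo", "valid_from": date(2015, 1, 1),
        "valid_to": None, "current": True}]
apply_scd2(dim, 42, "Charlotte", date(2016, 6, 1))
print(len(dim), dim[-1]["value"], dim[0]["current"])
```

In the warehouse itself this logic would live in the ETL merge step, with `valid_from`/`valid_to` columns on the dimension table; Type 1 would simply overwrite, and Type 3 would keep the prior value in a companion column.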
Environment: Oracle Database, Teradata, Microsoft SQL Server Management Studio (partly), Microsoft Project, MS Office, Windows, Tableau, DiCom application (CQS), PLMR (tailored in-house application), ERWIN
Confidential, Charlotte, NC
Data Analyst
Responsibilities:
- Planned, coordinated, and monitored project performance and activities to ensure on-time project completion.
- Conducted deep data validation and root cause analysis to ensure high data integrity
- Attending regular post-shop debrief meetings with shopping services to review results and improve performance
- Conduct complex ad hoc analytics, summaries and recommendations, as required
- Wrote simple Python programs per business requirements to assist with data analysis.
- Managed significant volumes of retail transaction activity for the Pricing team, partnering with Pricing Analysts to ensure timely and accurate retail uploads to stores
- Ensured Pricing audit compliance by organizing and tracking key files/information within systems
- Used utilities such as Python, MultiLoad (MLOAD), BTEQ, and FastLoad.
- Worked in a Scrum/Agile process, writing stories in two-week iterations and delivering product each iteration.
- Created the dimensional logical model with approximately 10 facts, 30 dimensions with 500 attributes using ER Studio.
- Implemented the Slowly changing dimension scheme (Type III) for most of the dimensions.
- Implemented standard naming conventions for the fact and dimension entities and attributes of the logical and physical models.
- Reviewed the logical model with business users, the ETL team, DBAs, and the testing team to provide information about the data model and business requirements.
- Extensively worked in Oracle SQL, PL/SQL, SQL*Loader, and query performance tuning; created DDL scripts and database objects such as tables, views, indexes, synonyms, and sequences.
- Worked on transferring the data files to vendor through SFTP & FTP process
- Involved in defining and Constructing the customer to customer relationships based on Association to an account & customer
- Assisted in development of a client-centric Master Data Management (MDM) solution.
- Define and design the Data Acquisition, Transformation, and Data Cleansing approach for the MDM implementation.
- Worked with architects, assisting in the development of current and target state enterprise-level data architectures.
- Worked with project team representatives to ensure that logical and physical data models were developed in line with corporate standards and guidelines.
- Developed the ETL mappings in PL/SQL via packages, stored procedures, functions, views and triggers.
- Involved in defining the source to target data mappings, business rules and data definitions.
- Responsible for defining the key identifiers for each mapping/interface.
- Used data analysis techniques to validate business rules and identify low-quality or missing data in the existing WMS data warehouse.
- Developed PL/SQL packages using bulk collects and bulk variables.
- Performed data analysis and data profiling using complex SQL on various sources systems including Oracle and Teradata.
- Worked on fulfilling business requirements from clients and preparing for business workshops, project kickoffs, data warehouse assessments, Informatica and MDM initiatives, and data profiling and data quality workshops using Informatica Data Explorer and Informatica Data Quality.
- Migrated three critical reporting systems to Business Objects and Web Intelligence on a Teradata platform
- Created Excel charts and pivot tables for ad hoc data pulls.
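The Type III scheme applied to most dimensions above keeps the prior value of an attribute in a companion "previous" column instead of adding versioned rows. A minimal sketch; the customer record and column names are hypothetical:

```python
# Sketch of Slowly Changing Dimension Type 3: the prior value of a tracked
# attribute is retained in a companion "previous" column rather than in
# versioned rows. Column names are hypothetical examples.
def apply_scd3(row, new_value):
    """Shift the current value into the 'previous' slot, then store the new value."""
    if row["current_region"] != new_value:
        row["previous_region"] = row["current_region"]
        row["current_region"] = new_value
    return row

customer = {"id": 7, "current_region": "East", "previous_region": None}
apply_scd3(customer, "West")
print(customer)
```

The trade-off versus Type 2 is that only one level of history survives, which keeps the dimension narrow and the reporting joins simple.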
Environment: Teradata 13.1, Informatica 6.2.1, Ab Initio, Business Objects, Oracle 9i, PL/SQL, Microsoft Office Suite (Excel with VLOOKUP and pivot tables, Access, PowerPoint), Visio, VBA, MicroStrategy, Tableau, ERWIN.
Confidential, Rockville, MD
Data Analyst
Responsibilities:
- Created data governance and privacy policies.
- Assisted the project with Python programming, coding and running QA on the same from time to time.
- Defined accountability procedures governing data access, processing and storage, retention, reporting and auditing measuring contract compliance.
- Ensured that Business, Data Governance and Integration team leads are deeply involved in critical design issues and decisions.
- Developed data stewardship program establishing metadata registry responsibilities.
- Implemented metadata standards, data governance and stewardship, master data management, ETL, ODS, data warehouse, data marts, reporting, dashboard, analytics, segmentation, and predictive modeling.
- Gathering the information about manual sources that are not captured in point of delivery (POD) Mappings.
- Created the Enterprise Information Group encompassing Business Intelligence Center of Excellence, Data Governance Working Group, Enterprise Data Warehousing and Master Data Management.
- Provided strategic vision and direction with operational and business leaders for enterprise information initiatives enabling Executives to make information-based decisions.
- Work with users to identify the most appropriate source of record and profile the data required for sales and service.
- Document the complete process flow to describe program development, logic, testing, and implementation, application integration, coding.
- Define the list codes and code conversions between the source systems and the data mart.
- Worked with internal architects, assisting in the development of current and target state data architectures.
- Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines.
- Involved in defining the source to target data mappings, business rules, and business and data definitions.
- Responsible for defining the key identifiers for each mapping/interface.
- Responsible for defining the functional requirement documents for each source to target interface.
- Document, clarify, and communicate requests for change requests with the requestor and coordinate with the development and testing team.
- Involved in configuration management in the process of creating and maintaining an up-to-date record of all the components of the development efforts in coding and designing schemas.
- Developed the financing reporting requirements by analyzing the existing business objects reports.
- Interact with computer systems end-users and project business sponsors to determine, document, and obtain signoff on business requirements.
- Responsible for maintaining the Enterprise Metadata Library with any changes or updates.
- Document data quality and traceability documents for each source interface.
- Established standard operating procedures.
- Generate weekly and monthly asset inventory reports.
- Evaluated data profiling, cleansing, integration and extraction tools (e.g. Informatica).
- Coordinated with business users to design new reporting in an appropriate, effective, and efficient way based on user needs and existing functionality.
- Remain knowledgeable in all areas of business operations in order to identify systems needs and requirements.
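The list-code conversion work described above maps each source system's local codes to the data mart's conformed codes. A minimal sketch; all system names and codes here are hypothetical examples:

```python
# Illustrative sketch of list-code conversion between source systems and the
# conformed codes of the data mart. All systems and codes are hypothetical.
CODE_MAP = {
    ("SRC_A", "M"): "MALE",
    ("SRC_A", "F"): "FEMALE",
    ("SRC_B", "1"): "MALE",
    ("SRC_B", "2"): "FEMALE",
}

def convert_code(source_system, source_code, default="UNKNOWN"):
    """Map a (source system, local code) pair to the data mart's conformed code."""
    return CODE_MAP.get((source_system, source_code), default)

print(convert_code("SRC_A", "M"), convert_code("SRC_B", "2"), convert_code("SRC_B", "9"))
```

In practice the map lives in a cross-reference table maintained under data governance, and the `UNKNOWN` fallback is what surfaces unmapped codes for stewardship review.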
Environment: SQL/Server, Oracle 9i, MS-Office, Teradata, Informatica, ER Studio, XML, Business Objects
Confidential, NJ
Data Analyst
Responsibilities:
- Work with users to identify the most appropriate source of record and profile the data required for sales and service.
- Document the complete process flow to describe program development, logic, testing, and implementation, application integration, coding.
- Involved in defining the business/transformation rules applied for sales and service data.
- Define the list codes and code conversions between the source systems and the data mart.
- Worked with internal architects, assisting in the development of current and target state data architectures.
- Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines.
- Involved in defining the source to target data mappings, business rules, data definitions.
- Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata.
- Responsible for defining the key identifiers for each mapping/interface.
- Responsible for defining the functional requirement documents for each source to target interface.
- Document, clarify, and communicate requests for change requests with the requestor and coordinate with the development and testing team.
- Coordinated meetings with vendors to define requirements and system interaction agreement documentation between client and vendor system.
- Maintained the Enterprise Metadata Library with any changes or updates.
- Document data quality and traceability documents for each source interface.
- Establish standards of procedures.
- Generate weekly and monthly asset inventory reports.
- Coordinated with business users to design new reporting in an appropriate, effective, and efficient way based on user needs and existing functionality.
- Remain knowledgeable in all areas of business operations in order to identify systems needs and requirements.
- Implemented a metadata repository; maintained data quality, data cleanup procedures, transformations, data standards, and the data governance program; wrote scripts, stored procedures, and triggers and executed test plans.
- Performed data quality checks in Talend Open Studio.
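The data-quality and test-plan work above boils down to rule-based checks against each source interface. A minimal sketch, not the Talend implementation itself; the field names and rules are hypothetical examples:

```python
# Minimal sketch of rule-based data quality checks of the kind run against
# each source interface. Field names and rules are hypothetical examples.
import re

RULES = {
    # account_id must be exactly eight digits
    "account_id": lambda v: v is not None and re.fullmatch(r"\d{8}", v) is not None,
    # balance must be present and non-negative
    "balance": lambda v: v is not None and v >= 0,
}

def check_record(record):
    """Return the list of fields that fail their data-quality rule."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]

good = {"account_id": "12345678", "balance": 100.0}
bad = {"account_id": "12AB", "balance": -5.0}
print(check_record(good), check_record(bad))
```

Failing fields would feed the data-quality and traceability documents kept per source interface.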
Environment: SQL/Server, Oracle 9i, MS-Office, Teradata, Informatica, ER Studio, XML, Business Objects
Confidential
Systems Analyst
Responsibilities:
- Analyze business information requirements and model class diagrams and/or conceptual domain models.
- Gather and review customer information requirements for OLAP and for building the data mart.
- Performed document analysis involving creation of Use Cases and Use Case narrations using Microsoft Visio, in order to present the efficiency of the gathered requirements.
- Calculated and analyzed claims data for provider incentive and supplemental benefit analysis using Microsoft Access and Oracle SQL.
- Analyzed business process workflows and assisted in the development of ETL procedures for mapping data from source to target systems.
- Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata.
- Responsible for defining the key identifiers for each mapping/interface.
- Responsible for defining the functional requirement documents for each source to target interface.
- Document, clarify, and communicate requests for change requests with the requestor and coordinate with the development and testing team.
- Coordinated meetings with vendors to define requirements and system interaction agreement documentation between client and vendor system.
- Maintained the Enterprise Metadata Library with any changes or updates.
- Document data quality and traceability documents for each source interface.
- Establish standards of procedures.
- Generate weekly and monthly asset inventory reports.
- Coordinated with business users to design new reporting in an appropriate, effective, and efficient way based on user needs and existing functionality.
- Managed project requirements, documents, and use cases using IBM Rational RequisitePro.
- Assisted in building an integrated logical data design and proposed a physical database design for building the data mart.
- Document all data mapping and transformation processes in the Functional Design documents based on the business requirements.
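The source-to-target ETL mapping described above is typically verified after each load with a reconciliation check on row counts and a column total. A minimal sketch with SQLite standing in for the production databases; the table names and figures are hypothetical:

```python
# Sketch of a simple source-to-target reconciliation check run after an ETL
# load: compare row counts and a column total between the two tables.
# SQLite is used for illustration; table names and values are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_claims (claim_id INTEGER, amount REAL)")
conn.execute("CREATE TABLE tgt_claims (claim_id INTEGER, amount REAL)")
rows = [(1, 250.0), (2, 125.5), (3, 90.0)]
conn.executemany("INSERT INTO src_claims VALUES (?, ?)", rows)
conn.executemany("INSERT INTO tgt_claims VALUES (?, ?)", rows)

def reconcile(conn, source, target, column):
    """Return True when row counts and the column total match between tables."""
    q = "SELECT COUNT(*), TOTAL({col}) FROM {tbl}"
    src = conn.execute(q.format(col=column, tbl=source)).fetchone()
    tgt = conn.execute(q.format(col=column, tbl=target)).fetchone()
    return src == tgt

print(reconcile(conn, "src_claims", "tgt_claims", "amount"))
```

A mismatch flags the interface for the data-quality and traceability documentation before the load is signed off.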
Environment: SQL Server 2008R2/2005 Enterprise, SSRS, SSIS, Crystal Reports, Windows Enterprise Server 2000, DTS, SQL Profiler, and Query Analyzer