Sr. Data Analytics Engineer Resume
Austin, TX
SUMMARY
- Around 8 years of technical experience in data analysis and data modeling to meet the business needs of clients, developing effective and efficient solutions and ensuring client deliverables within committed timelines.
- Extensive experience with Normalization (1NF, 2NF and 3NF) and De-normalization techniques for improved database performance in OLTP and Data Warehouse/Data Mart environments.
- Good understanding of Relational Database Design, Data Warehouse/OLAP concepts and methodologies.
- Involved in validating, reviewing, and designing Enterprise Data Models and Critical Data Modules
- Experienced in all phases of the Agile Software Development Life Cycle (SDLC) (Analysis, Requirements Gathering, Designing), with expertise in documenting various requirement specifications, functional specifications, test plans, source-to-target mappings, and SQL joins.
- Experienced in performing Data analysis using Excel, Python, SQL
- Performed analysis of implementing Spark using Scala and wrote Spark sample programs using PySpark
- Experienced Data Modeler with strong Conceptual, Logical and Physical Data Modeling skills, Data Profiling skills, maintaining Data Quality, creating data mapping documents, and writing functional specifications and queries.
- Experienced in using R Programming, SAS, Python, Tableau and Power BI for data cleaning, data visualization, risk analysis, and predictive analytics.
- Knowledge in retrieving data from different databases like Microsoft SQL Server, Oracle, MySQL, MS Access, DB2, and Teradata
- Expertise and Vast knowledge of Enterprise Data Warehousing including Data Modeling, Data Architecture, Data Integration (ETL/ELT) and Business Intelligence.
- Skilled in implementing SQL tuning techniques such as Join Indexes (JIs), Aggregate Join Indexes (AJIs), Statistics, and table changes including indexes.
- Extensive experience in development and designing of ETL methodology for supporting data transformations and processing in a corporate-wide environment using Teradata, Mainframes, and UNIX Shell Scripting.
- Developed Tableau visualizations and dashboards with interactive views, trends and drill downs along with user-level security using Tableau Desktop.
- Expert at gathering business requirements by organizing and leading meetings between business users, clients, third party vendors, Application development teams and analysts by conducting and participating in JAD sessions on a scheduled basis
- Experience in design, development and implementation of the enterprise-wide architecture for structured and unstructured data, providing architecture and consulting services to Business Intelligence initiatives and data-driven business applications
- Led the database development team and other members to develop and deliver BI/data service solutions per project requirements while ensuring data governance policies and best practices
- Designed and developed Conceptual, Logical and Physical Database design of relational databases
- Very strong in developing data algorithms, doing data analysis, and applying data warehousing skills like Star schema, Snowflake schema, and Canonical schema
- Well versed in understanding and enhancing the existing models present in the organization as well as creating new Entity-Relationship (E-R) models, applying Normalization/De-normalization techniques for optimum performance in Relational and Dimensional database environments.
- Advanced knowledge of Salesforce.com user administration tasks including permission sets, role assignments, uploads, data backups, Custom Objects, Workflow Rules, Custom Fields, Validation Rules, Visualforce Pages, etc.
TECHNICAL SKILLS
Data Modeling: Conceptual, Logical, Physical, OLAP-Star/Dimensional Modeling, Snowflake, OLTP, E-R Diagrams, Relational Database Design, Master Data Management (MDM).
Data Modeling Design Tools: Sybase Power Designer, ERWin Data Modeler, ER/Studio Data Architect
Databases: Teradata, Oracle 12c/11g/10g/9i, DB2, MS Access, SQL Server
Programming & Query Languages: SQL, PL/SQL, C, C++, UNIX Shell Scripting, VB, Python, JavaScript, Java, HTML5, CSS3, XML, UML
Query Tools: TOAD, PLSQL Developer, Teradata SQL Assistant, Azure Data Studio, SQL Developer
Other tools: Azure Data Factory, Microsoft SSIS, Informatica, Cognos, IBM Infosphere, MS Visio, SPSS, MS Office
Visualization Applications: Tableau, Data Studio, PLX, Power BI, MS Excel
PROFESSIONAL EXPERIENCE
Client: Interior Global Logic, Austin, TX
Role: Sr. Data Analytics Engineer
Description: ILG is a warehouse management company. The project primarily dealt with data conversion and governance for the company's new SAP system.
Responsibilities:
- Developed data management, quality and production strategies for bulk data updates/uploads
- Created and ran advanced SQL queries to formulate and analyze business requirements
- Produced databases, tools, queries, and reports for analyzing, summarizing, and root-causing board failure data. Versed in finding patterns and trends in complex, multivariable data sets.
- Developed VBA and Python scripts to automate the SAP-QM data pull process
- Analyzed ETL mappings based on Facts & Dimensions from source to target tables for direct and indirect moves based on transformation rules & lookup tables.
- Built and published customized interactive reports and dashboards, and scheduled reports, using Tableau Desktop.
- Ensured purchasing requisitions were accurately and completely filled out according to company procedures.
- Worked with accounting to ensure shipments were received, processed, and paid for in a timely manner.
- Worked with the maintenance department to create an organized inventory system.
- Experienced in performing Data Modeling and Data analysis using SQL, Python, and Excel
- Developed pattern requirements and analyzed data on how the requirements interact.
- Created Python programs for data extraction and analysis by creating data science models
- Updated records with a 98% on-time rating; these efforts prevented logistics impact
- Worked with cross-functional teams to develop business requirements into production solutions
- Participated in scrum meetings and coordinated with Business Analysts to understand the business needs and implement them in a functional design
- Developed a data warehouse model in Snowflake for over 100 datasets using WhereScape.
- Heavily involved in testing Snowflake to understand the best possible way to use the cloud resources.
- Developed ELT workflows using NiFi to load data into Hive and Teradata.
- Worked on migrating jobs from NiFi development to pre-PROD and production clusters.
- Scheduled different Snowflake jobs using NiFi.
- Used NiFi to ping Snowflake to keep the client session alive.
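The data-pull automation and analysis queries described above can be sketched in miniature. This is an illustrative example only, assuming a stand-in SQLite database; the table and column names (e.g. `inspection_lots`) are hypothetical and not the actual SAP-QM schema:

```python
# Illustrative sketch: automating a data pull and quality summary with Python + SQL.
# All table/column names are hypothetical examples, not the real SAP-QM schema.
import sqlite3

def load_sample_data(conn):
    """Create a small stand-in table and insert example rows."""
    conn.execute(
        "CREATE TABLE inspection_lots (lot_id TEXT, plant TEXT, defects INTEGER)"
    )
    conn.executemany(
        "INSERT INTO inspection_lots VALUES (?, ?, ?)",
        [("L001", "ATX", 0), ("L002", "ATX", 3), ("L003", "DAL", 1)],
    )

def defect_summary(conn):
    """Aggregate defect counts per plant -- the kind of SQL aggregation
    used to formulate and analyze business requirements."""
    rows = conn.execute(
        """SELECT plant, COUNT(*) AS lots, SUM(defects) AS total_defects
           FROM inspection_lots
           GROUP BY plant
           ORDER BY total_defects DESC"""
    ).fetchall()
    return {plant: (lots, total) for plant, lots, total in rows}

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load_sample_data(conn)
    print(defect_summary(conn))  # {'ATX': (2, 3), 'DAL': (1, 1)}
```

The same pattern scales up: a scheduled script pulls the export, loads it into a staging table, and a set of aggregate queries produces the quality report.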
Confidential, Austin, TX
Sr. Data Analytics Engineer
Responsibilities:
- Created and ran advanced SQL queries to pinpoint clients' data issues
- Created issue tracking system using BI dashboard
- Documented business rules and implemented data transformations.
- Managed the activities required to maintain a data & process governance structure
- Demonstrated experience in Data Quality disciplines
- Working knowledge of Hadoop and Data governance frameworks like Apache Atlas
- Exposure to Search and Lineage, Data security and policy, and data auditing
- Analyzed ETL mappings based on Facts & Dimensions from source to target tables for direct and indirect moves based on transformation rules & lookup tables.
- Built and published customized interactive reports and dashboards, and scheduled reports, using Tableau Desktop.
- Reviewed purchasing agreements with vendors and maintained open lines of communication and strong working relationships with those vendors
- Compared product deliveries with issued purchase orders and contacted vendors when there were discrepancies
- Complied with internal/external processes and procedures (Quality Management System, Quality Assurance Provisions, etc.)
- Developed new vendors as appropriate and collected supplier evaluations for the Approved Supplier List
- Experienced in performing Data Modeling and Data analysis using SQL, Python, and Excel
- Performed analysis of implementing Spark using Scala and wrote Spark sample programs using PySpark
- Developed data management, quality and production strategies for bulk data updates/uploads
- Functioned as SME and managed new system that streamlined customer information process and increased productivity
- Involved in data model reviews with the internal Data Architect, Business Analysts, and Application Architects, with in-depth explanation of the data model to make sure it is in line with business requirements
- Performed high level and detailed design and planning for complex business requirements spanning multiple enterprise systems
- Created a logical data model from the conceptual model and converted it into the physical database design using Power Designer
- Worked on a Master Data Management (MDM) business solution to deliver unique consumer profiles from different sources by coordinating and assisting with a third-party vendor, on-site and off-site developers, and the project team
- Facilitated face to face and classroom training sessions for new users of all levels and positions
- Created ad hoc reports using multiple systems including Informatica, SQL and Python
- Collaborated with internal stakeholders, identifying and gathering analytical requirements for customers, products and projects.
Confidential, Foster City, CA
Data Analytics Engineer
Responsibilities:
- Developed SQL Queries to fetch complex data from different tables in remote databases using joins, database links and Bulk collects
- Extracted and analyzed data in a Hadoop environment to identify potentially fraudulent activity, reducing the chargeback ratio by 50%.
- Developed complex queries using SQL to examine customer transaction data.
- Promoted enterprise-wide business intelligence by enabling report access in SAS BI Portal and on Tableau Server.
- Designed robust analytic methods to process large reports of consumer information using NumPy.
- Analyzed Source to Stage and Stage to Target mapping documents based on Facts & Dimensions, validating DDL, counts, not nulls, duplicates, PK & FK constraints, and business rules to be applied on target tables.
- Identified 16 fraud patterns preventing $3.2 million in loss and protecting customer information.
- Normalized tables, established joins, and created indexes wherever necessary utilizing profiler and execution plans.
- Hands-on experience in analyzing data and writing custom SQL queries for better performance, joining tables and selecting the required data to run the reports.
- Restricted data for users using row-level security and user filters.
- Utilized technology such as SQL and Excel PowerPivot to query test data and customize end-user requests.
- Wrote custom procedures and triggers to improve performance and maintain referential integrity.
- Optimized queries wif modifications in SQL code, removed unnecessary columns and eliminated data discrepancies.
- Utilized dimensional data modeling techniques and storyboarded ETL processes.
- Developed ETL procedures to ensure conformity, compliance with standards and lack of redundancy, and translated business rules and functionality requirements into ETL procedures using Informatica PowerCenter.
- Coordinated and completed ad-hoc inter-department requests and requests from strategic partners by providing decision support and consultative services.
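One simple kind of fraud screen behind work like the above can be sketched with a toy anomaly check. This is a hedged illustration only: the data, the z-score approach, and the thresholds are hypothetical stand-ins, and real pattern detection combined many more signals than amount alone:

```python
# Illustrative fraud-screening sketch: flag transactions whose amount is far
# from a customer's typical spend. All data and thresholds are hypothetical.
from statistics import mean, stdev

def flag_outliers(amounts, z_threshold=3.0):
    """Return indices of amounts more than z_threshold standard deviations
    from the mean of the history."""
    if len(amounts) < 2:
        return []
    mu, sigma = mean(amounts), stdev(amounts)
    if sigma == 0:
        return []
    return [i for i, a in enumerate(amounts) if abs(a - mu) / sigma > z_threshold]

# A customer's recent charges, with one suspicious spike at the end.
history = [42.0, 39.5, 45.1, 40.8, 43.3, 41.2, 2500.0]
print(flag_outliers(history, z_threshold=2.0))  # [6]
```

Flagged transactions would then feed an investigation queue rather than being blocked automatically, since a statistical outlier is not proof of fraud.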
Confidential, Madison, WI
Data Analytics Engineer
Responsibilities:
- Created a logical data model from the conceptual model and converted it into the physical database design using Erwin.
- Used SAS to build models to identify definite patterns and present the business with possible problems and feasible solutions; identified requirements for possible new business flows.
- Created reports using Cognos and/or Tableau based on client needs for dynamic interaction with the data produced.
- Used Excel and PowerPoint on various projects as needed for presentations or summarization of data to provide insight into key business decisions.
- Consolidated data for operational weekly and monthly business reports, delivering effective and efficient solutions for dealing with large datasets.
- Gathered and documented the audit trail and traceability of extracted information for data quality.
- Pulled sales order data and generated ad hoc stock pulling lists daily using advanced Excel functions.
- Extensive work experience in Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) system environment.
- Experienced in creating OLAP cubes, identifying dimensions, attributes, and hierarchies, and calculating measures and dimension members.
- Experienced in working with RDBMSs like Oracle 10g and Microsoft SQL Server 2016.
- Strong working experience on DSS (Decision Support Systems) applications and Extraction, Transformation and Load (ETL) of data from Legacy systems.
- Good experience in Data Modeling using Star Schema, Snowflake Schema.
- Extensive experience in writing SQL to validate database systems and for backend database testing.
- Performed analysis of implementing Spark using Scala and wrote Spark sample programs using PySpark.
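The Star schema modeling mentioned above follows a standard shape: a central fact table of measures keyed to surrounding dimension tables. A minimal sketch, using SQLite as a stand-in and purely illustrative table and column names (not from any actual client model):

```python
# Minimal star-schema sketch: one fact table joined to dimension tables,
# rolled up by dimension attributes. Names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, amount REAL);

INSERT INTO dim_date VALUES (1, 'Jan'), (2, 'Feb');
INSERT INTO dim_product VALUES (10, 'Hardware'), (11, 'Software');
INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 10, 75.0);
""")

# Typical OLAP-style rollup: total sales by month and category.
rollup = conn.execute("""
    SELECT d.month, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month, p.category
""").fetchall()
print(rollup)
# [('Feb', 'Hardware', 75.0), ('Jan', 'Hardware', 100.0), ('Jan', 'Software', 50.0)]
```

A Snowflake schema extends the same idea by normalizing the dimension tables themselves (e.g. splitting `category` into its own table keyed from `dim_product`).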
Confidential
Data Analyst
Responsibilities:
- Built data modules with logic that flags transactions resembling known patterns to build a risk model.
- Created customized reports, processes in SAS and Tableau Desktop
- Identified, analyzed and eliminated the risk/fraud which arises due to the unauthorized use of debit/credit cards.
- Maintained Service Level Agreements, making decisions independently with a high degree of accuracy.
- Redesigned a web-based pivot-table application to perform the business strategy layout.
- Gathered requirements from business user groups and translated them into functional designs and detailed technical designs
- Created logical and physical data models in the Erwin data modeling tool for all releases and forward-engineered the tables to the database while adhering to existing naming standards
- Generated DDL scripts and worked with the DBA to translate and implement them for the data model changes
- Used database design methodologies including star schema, normalization, and selective denormalization to design clean, understandable, and elegant logical and physical databases appropriate for expected data use
- Involved in the implementation, rollout and subsequent releases
- Created and maintained the Source to Target mapping document, which contains the transformation logic, and worked closely with the ETL team to develop various complex transformations
- Maintained warehouse Metadata, and Enterprise Data warehouse standards
- Maintained data model dictionaries and integration of systems through database design.
- Performed Data Validation and Unit Testing before handing over the data for production and Quality Testing
- Verified the test plan, test conditions and test cases to be used in testing based on business requirements, technical specifications and/or product knowledge
- Worked on HP Application Lifecycle Management for defect tracking and resolved Data Modeling defects
- Created procedures in MS InfoPath on how to transmit flat files to the database and upload them to SharePoint
- Investigated and generated various reports resulting in fraud reduction by 30%.
Confidential
Data Analyst
Responsibilities:
- Developed web scrapers to collect raw data from internal tools.
- Identified and reviewed suspected transactions on a day-to-day basis to detect fraud.
- Generated and investigated various reports aimed at preventing fraud beforehand.
- Detected fraud patterns worth $1.05 million and limited exposure to fraud by closing transactions prematurely.
- Achieved and maintained an error rate of 0.2% against a goal of 0.33%.
- Identified multiple risk patterns of identity theft and debit/credit card fraud.
- Worked directly with user groups, management and other team members to analyze and evaluate requests, and wrote detailed descriptions of user needs
- Involved in creating logical models and transforming them into physical models using Erwin
- Designed and developed Oracle PL/SQL Procedures
- Spent the majority of time maintaining numerous entity-relationship diagrams and data repositories, and a sales reporting data mart on customers, employees, human resources and payroll, and marketing data
- Developed stored procedures, triggers, functions and packages in PL/SQL
- Gathered statistics on large tables and redesigned indexes
- Worked on performance tuning of the database, which included indexes, optimizing SQL statements, and monitoring the server
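The index and tuning work above can be illustrated with a small sketch: compare the query plan for the same lookup before and after adding an index. SQLite stands in for the production database here, and the table and column names are hypothetical:

```python
# Index-driven tuning sketch: inspect the query plan before and after
# creating an index. SQLite stands in for the real database; names are
# illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payroll (emp_id INTEGER, dept TEXT, salary REAL)")
conn.executemany("INSERT INTO payroll VALUES (?, ?, ?)",
                 [(i, "D" + str(i % 10), 1000.0 + i) for i in range(1000)])

def plan(sql):
    """Return the first step of SQLite's query plan for sql."""
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchone()[3]

query = "SELECT * FROM payroll WHERE dept = 'D3'"
print(plan(query))   # full table scan, e.g. "SCAN payroll" (wording varies by SQLite version)

conn.execute("CREATE INDEX idx_payroll_dept ON payroll(dept)")
print(plan(query))   # now an index search mentioning idx_payroll_dept
```

The same discipline applies on any RDBMS: gather statistics, read the optimizer's plan, and add or redesign indexes only where the plan shows an avoidable scan.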