
Senior Data Analyst/Data Engineer Resume


SUMMARY

  • More than 14 years of experience in design, development, implementation, testing, data analysis and reporting of applications using Teradata, Oracle, SQL, PL/SQL, Python, AWS, Spark and UNIX.
  • Expertise in the full project life cycle (SDLC), from initial system planning and technology acquisition through installation, training, and operation. Deep understanding of technology with a focus on delivering business solutions.
  • Experience working with databases/data warehouses such as Teradata, Oracle, AWS Redshift, and Snowflake.
  • Performed data analysis using SQL, PL/SQL, Python, Spark, Databricks, Teradata SQL Assistant, SQL Server Management Studio, and SAS.
  • Strong knowledge and use of development methodologies, standards and procedures.
  • Proficient in writing Packages, Stored Procedures, Functions, Views, Materialized Views, and Database Triggers using SQL and PL/SQL in Oracle.
  • Developed and executed campaigns using Unica’s Affinium Campaign tool.
  • Built flowcharts in Unica’s Affinium Campaign, both from existing templates and from scratch.
  • Created automated solutions and tools using Python and JavaScript.
  • Experience in Performance Tuning & Optimization of SQL statements.
  • Experience in analyzing and understanding the business/system requirements.
  • Performed QA/QC from both functional and backend perspectives to validate campaign-related data.
  • Worked with loading utilities in Teradata and Oracle such as Teradata BTEQ, MLOAD, FLOAD, TPUMP, and Oracle SQL*Loader.
  • Extensive experience in database analysis and design using normalization techniques.

TECHNICAL SKILLS

Enterprise Marketing Tools: Unica Affinium Campaign 8.1/8.0/7.5/7.2/6.0

Languages: SQL, PL/SQL, Python, Spark, JavaScript, SAS, ASP, HTML, .NET basics

Databases: Teradata V 15.0, Oracle 8.0, 8i, 9i, 10g, 11g, AWS Redshift, Snowflake

Tools: Teradata SQL Assistant, TOAD 10.5 & 9.6.1, PL/SQL Developer, SQL Developer, Visual SourceSafe, ClearQuest, ClearCase, WinSCP, UltraEdit, PuTTY, Notepad++, Microsoft Visio, ODI, Oracle Reports 10g, Oracle Designer, SQL Workbench, Hue, Eclipse, Databricks, Anaconda, Spyder, Jupyter Notebook, GitHub

DB Utilities: SQL*Loader, Export/Import, Teradata BTEQ, MLOAD, FLOAD, TPUMP

Operating Systems: UNIX, Windows

PROFESSIONAL EXPERIENCE

Confidential

Senior Data Analyst/Data Engineer

Responsibilities:

  • Perform data analysis on analytical data stored in AWS S3, AWS Redshift, Snowflake, and Teradata using SQL, Python, Spark, and Databricks.
  • Design, develop, implement, and execute marketing campaigns for US card customers using Unica Affinium Campaign, Snowflake, AWS S3, PySpark, and Databricks.
  • Create scripts and programs to gain an understanding of data sets, discover data quality and data integrity issues in the analytical data, and perform root cause analysis for those issues.
  • Write complex SQL scripts to analyze data in databases/data warehouses such as Snowflake, Teradata, and Redshift.
  • Perform segmentation analytics for each campaign using both on-premises database technologies (SQL, Teradata, UNIX) and cloud and big data technologies on AWS, such as Spark, Python, and Databricks.
  • Create custom reports and dashboards using business intelligence software such as Tableau and QuickSight to present data analysis and conclusions.
  • Create automated solutions using Databricks, Spark, Python, Snowflake, and HTML.
  • Create Python programs to read real-time data from SDP (Streaming Data Platform), perform analysis, and load the data into the analytical cloud data warehouse.
  • Migrate scripts and programs from the on-premises environment to the AWS cloud environment.
  • Migrate campaigns from the Unica Affinium Campaign marketing tool to Quantum.
  • Automate data quality alerts to a Slack channel and email using Databricks, Python, and HTML, notifying users whenever there are issues with the data (a Python sketch follows this list).
  • Compare SDP (Streaming Data Platform) real-time data against AWS S3 and Snowflake data using Databricks, Spark SQL, and Python (see the comparison sketch after this list).
  • Create and monitor production batch jobs that load analytical data into the data source tables on a daily basis; fix and re-execute jobs when a job fails.
  • Alert data consumers about delays in data loads to the data sources/tables using a Slack bot API integration written in Python.
  • Create batch programs using UNIX shell scripts and Teradata BTEQ.
  • Create and manage versions of the scripts using GitHub.
  • Segment customers using the Unica Affinium Campaign marketing tool.
  • Create segmentation reports (Test-Level Audit Reports) for each campaign.
  • Create fulfillment reports such as data and business intent validation reports; perform ad-hoc queries and extract data from existing data stores.
  • Manage the entirety of each campaign’s logic, including audience segmentation, exclusions, and assignment of offers and channels.
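
Below is a minimal sketch of the data quality alerting flow described above, assuming the slack_sdk Python package, a bot token with chat:write scope, and an internal SMTP relay; the channel, addresses, hosts, table name, and threshold are hypothetical placeholders, not the production values.

```python
# Minimal sketch of a data quality alert to Slack and email.
# Assumes the slack_sdk package; all names below are placeholders.
import os
import smtplib
from email.mime.text import MIMEText

from slack_sdk import WebClient


def send_alerts(table: str, null_pct: float, threshold: float = 1.0) -> None:
    """Notify Slack and email when a data quality check fails."""
    if null_pct <= threshold:
        return  # data looks healthy, nothing to report

    message = f":warning: {table}: {null_pct:.2f}% null keys (threshold {threshold}%)"

    # Slack: post to the data quality channel via the bot token.
    slack = WebClient(token=os.environ["SLACK_BOT_TOKEN"])
    slack.chat_postMessage(channel="#data-quality-alerts", text=message)

    # Email: send a simple HTML-formatted alert through an internal relay.
    body = MIMEText(f"<b>Data quality alert</b><br>{message}", "html")
    body["Subject"] = f"Data quality alert: {table}"
    body["From"] = "dq-alerts@example.com"
    body["To"] = "data-consumers@example.com"
    with smtplib.SMTP("smtp.example.com") as smtp:
        smtp.send_message(body)
```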
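The S3-versus-Snowflake comparison could look roughly like the sketch below, assuming a Databricks notebook (with its built-in spark session and dbutils) and the Snowflake Spark connector; the bucket path, table name, and connection options are hypothetical, and the column names are assumed to match on both sides.

```python
# Sketch of an S3-versus-Snowflake reconciliation in a Databricks notebook.
# `spark` and `dbutils` are the notebook's built-in objects; all other
# values below are hypothetical placeholders.
s3_df = spark.read.parquet("s3://analytics-bucket/card_events/")

sf_options = {
    "sfURL": "myaccount.snowflakecomputing.com",
    "sfUser": "analyst",
    "sfPassword": dbutils.secrets.get("sf", "password"),
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "REPORTING_WH",
}
sf_df = (spark.read.format("snowflake")
         .options(**sf_options)
         .option("dbtable", "CARD_EVENTS")
         .load())

# Quick reconciliation: row counts, plus rows present on one side only
# (columns are assumed to have the same names and order in both sources).
print(f"S3 rows: {s3_df.count()}, Snowflake rows: {sf_df.count()}")
only_in_s3 = s3_df.exceptAll(sf_df.select(*s3_df.columns))
only_in_sf = sf_df.select(*s3_df.columns).exceptAll(s3_df)
print(f"Only in S3: {only_in_s3.count()}, only in Snowflake: {only_in_sf.count()}")
```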

Environment: Teradata V 13.10, Unica Affinium Campaign 8.1/8.0/7.5/7.2/6.0, Teradata SQL Assistant, BTEQ, UNIX, SQL, Python, Databricks, Snowflake, Redshift, AWS, Spark, Quantum, HTML.

Confidential

Technical Lead

Responsibilities:

  • Analyze requirements and communicate them to various internal teams and business owners.
  • Participate in business and functional requirements gathering with different batch teams to create new batches.
  • Write complex SQL queries, including inline views and subqueries, for faster data retrieval from multiple tables.
  • Create Views, Functions, Packages, and Stored Procedures in Oracle to implement application logic.
  • Create batch programs using UNIX to monitor the application.
  • Perform thorough QA/QC before staging the data and validate business-critical reports (a validation sketch follows this list); assist the team in developing business strategies based on data.
  • Create scripts to import and export data between environments using SQL*Loader.
  • Create scripts to clean up the database and Windows servers, and create database links to retrieve data from multiple servers.
  • Develop reports for final approval and send the data downstream to users.
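
As a rough illustration of the pre-staging QA/QC step, the sketch below runs a reconciliation query with the python-oracledb driver; the connection details, schema and table names, and the check itself are hypothetical, not taken from the original project.

```python
# Sketch of a pre-staging QA/QC check against Oracle.
# Assumes the python-oracledb driver; names below are placeholders.
import os

import oracledb

conn = oracledb.connect(
    user="qa_user",
    password=os.environ["QA_DB_PASSWORD"],
    dsn="dbhost/ORCLPDB1",
)

# Example check: staged rows must reconcile with the source row count
# before a business-critical report is released downstream.
sql = """
    SELECT (SELECT COUNT(*) FROM staging.orders) AS staged,
           (SELECT COUNT(*) FROM source.orders)  AS sourced
    FROM dual
"""
with conn.cursor() as cur:
    cur.execute(sql)
    staged, sourced = cur.fetchone()

if staged != sourced:
    raise RuntimeError(f"Row count mismatch: staged={staged}, source={sourced}")
print("QA/QC passed: staged data reconciles with source.")
```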

Environment: Oracle (9i, 10g, 11g), WCC, UNIX, SQL, PL/SQL, SQL*Loader.
