
Data Analyst Resume


Morristown, NJ

SUMMARY

  • 2 years of experience in data warehousing and reporting, including Extract, Transform, and Load (ETL) processes, as a Data Analyst/ETL Developer.
  • Strong SQL skills for extracting data from tables, with SQL programming experience (stored procedures, PL/SQL, query optimization).
  • Proficient in programming languages such as Python and SAS for algorithm design and statistical analysis.
  • Expertise in generating reports and dashboards using tools such as OBIEE, Tableau and Microsoft Excel.
  • Experience working on projects following Agile methodology and all components of the SDLC, including client requirements analysis and development.
  • Good knowledge of the insurance domain, specifically the Duck Creek Policy Administration system.
  • Strong understanding of data warehousing concepts and of OLTP and OLAP data models.
  • Strong communication and interpersonal skills with an analytical background.

TECHNICAL SKILLS

ETL and BI Tools: Informatica PowerCenter, Tableau, OBIEE

Programming Languages: C, Python, SAS

Databases: MySQL, Oracle 11g

Statistical Analysis Tools: SAS, IBM SPSS

Interfaces & Other Tools: Eclipse, MS Visio, MS Excel, Linux

PROFESSIONAL EXPERIENCE

Data Analyst

Confidential, Morristown, NJ

Responsibilities:

  • Performed data analysis activities such as data profiling, data cleansing and data exploration to identify trends, patterns and anomalies using SQL (see the profiling sketch after this list).
  • Involved in ETL testing to ensure the Informatica source and target mappings were built as per requirements.
  • Performed data mapping from source Duck Creek XML to target SQL tables.
  • Collaborated with the offshore team on development and implementation to ensure bugs were prioritized, resolved and retested.
  • Created high and low-level design documents representing source to target mapping based on transformations.
  • Monitored the source and target rows and analysed the rejected rows and worked on reloading them.
  • Performed Unit testing and maintained test logs, created and modified Pivot tables during testing to check the data.
  • Worked closely with the business to resolve production issues and handle change requests.
  • Environment/Tools: Informatica PowerCenter 9.1, JIRA, SQL Server 2016
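A minimal sketch of the kind of SQL-driven profiling checks described in the first bullet above (null counts, duplicate keys, out-of-range values). It uses Python's built-in sqlite3 purely as a stand-in for the project's SQL Server 2016 environment; the policy_staging table and its columns are hypothetical:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.execute("CREATE TABLE policy_staging (policy_id TEXT, premium REAL, eff_date TEXT)")
    cur.executemany(
        "INSERT INTO policy_staging VALUES (?, ?, ?)",
        [("P001", 1200.0, "2016-01-01"),
         ("P001", 1200.0, "2016-01-01"),   # duplicate business key
         ("P002", None, "2016-02-15"),     # missing premium
         ("P003", -50.0, "2016-03-10")],   # out-of-range premium
    )

    # Null counts: COUNT(col) skips NULLs, so the difference is the null count.
    print(cur.execute(
        "SELECT COUNT(*) - COUNT(premium) AS null_premiums FROM policy_staging"
    ).fetchone())

    # Duplicate business keys.
    print(cur.execute(
        "SELECT policy_id, COUNT(*) FROM policy_staging "
        "GROUP BY policy_id HAVING COUNT(*) > 1"
    ).fetchall())

    # Anomaly check: values outside the expected range.
    print(cur.execute(
        "SELECT policy_id, premium FROM policy_staging WHERE premium < 0"
    ).fetchall())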

Associate Software Engineer: ETL Developer

Confidential

Responsibilities:

  • Performed requirement analysis, planning, design and implementation of mappings and workflows.
  • Developed ETL mappings to load heterogeneous source files into target tables, using various transformations to implement the business logic.
  • Optimized the Informatica mappings and SQL queries to achieve high performance and data throughput.
  • Designed interactive dashboards in OBIEE and Tableau using drill-downs, prompts, filters and variables.
  • Developed advanced SQL queries with multi-table joins, subqueries and stored procedures, including export/import in a UNIX environment.
  • Created SCD Type 1 and Type 2 mappings to update slowly changing dimensions (see the sketch after this list).
  • Involved in performance tuning of the dashboards using cache management and aggregate tables.
  • Modeled swim lanes to map the source-to-target data flow used by the client in their business.
  • Assisted in designing the System Understanding document to help members onboarding onto the project understand the client's business flow and requirements.
  • Environment/Tools: Informatica PowerCenter 9.1, Oracle 11g, SQL Server, OBIEE
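A minimal sketch of the Type 2 slowly changing dimension logic referenced in the SCD bullet above. The actual work was done in Informatica mappings against Oracle/SQL Server targets; this uses Python's sqlite3 only for illustration, and the dim_customer/stg_customer tables and their columns are hypothetical:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Hypothetical Type 2 dimension with effective dating and a current-row flag.
    cur.execute("""CREATE TABLE dim_customer (
        customer_id INTEGER, address TEXT,
        eff_start TEXT, eff_end TEXT, is_current INTEGER)""")
    cur.execute("CREATE TABLE stg_customer (customer_id INTEGER, address TEXT)")
    cur.execute("INSERT INTO dim_customer VALUES (1, 'Old Address', '2015-01-01', NULL, 1)")
    cur.execute("INSERT INTO stg_customer VALUES (1, 'New Address')")

    # Step 1 (Type 2): expire the current row when a tracked attribute has changed.
    # (A Type 1 mapping would instead overwrite the attribute in place, keeping no history.)
    cur.execute("""
        UPDATE dim_customer
           SET eff_end = DATE('now'), is_current = 0
         WHERE is_current = 1
           AND customer_id IN (
                SELECT s.customer_id
                  FROM stg_customer s
                  JOIN dim_customer d
                    ON d.customer_id = s.customer_id AND d.is_current = 1
                 WHERE d.address <> s.address)""")

    # Step 2 (Type 2): insert changed or brand-new records as the new current version.
    cur.execute("""
        INSERT INTO dim_customer (customer_id, address, eff_start, eff_end, is_current)
        SELECT s.customer_id, s.address, DATE('now'), NULL, 1
          FROM stg_customer s
          LEFT JOIN dim_customer d
            ON d.customer_id = s.customer_id AND d.is_current = 1
         WHERE d.customer_id IS NULL""")

    conn.commit()
    print(cur.execute(
        "SELECT * FROM dim_customer ORDER BY customer_id, eff_start").fetchall())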
Research project: derived a general empirical model of proteins for understanding the evolution of species.
  • Conducted deep-dive data analysis to gain insights, diagnose issues and provide analytically driven recommendations to improve the final output.
  • Predicted molecular evolution and phylogeny based on the maximum likelihood algorithm using the bioinformatics software MEGA 6.
  • Analyzed the disordered regions between species to study the conserved regions using the data analysis and graphing software SigmaPlot.
  • Implemented a complex algorithm in Python to calculate the 3-dimensional structural difference between two proteins.
  • Developed a PDB file-handling script to calculate the root-mean-square deviation (RMSD) between Cartesian coordinates (see the sketch after this list).
  • Extensively used the NumPy and SciPy libraries in Python on both Linux and Windows platforms.
  • Worked on parsing files from the PDB and on the SearchIO framework.
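A minimal sketch of the RMSD calculation referenced above, assuming the two coordinate sets have already been parsed from PDB files and superposed; the coordinate values below are made up for illustration:

    import numpy as np

    def rmsd(coords_a, coords_b):
        """RMSD between two (N, 3) arrays of Cartesian coordinates."""
        diff = np.asarray(coords_a) - np.asarray(coords_b)
        return float(np.sqrt((diff * diff).sum() / len(coords_a)))

    # Hypothetical C-alpha coordinates (x, y, z) for two aligned structures.
    protein_a = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 9.0]])
    protein_b = np.array([[1.1, 2.1, 2.9], [3.9, 5.2, 6.1], [7.2, 7.8, 9.3]])

    print(round(rmsd(protein_a, protein_b), 3))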
