
Analytics Research Associate Resume


MA

SUMMARY

  • About 8 years of work experience in Analytics and related areas, developing statistical models for various projects, visualizing results and providing business insights.
  • Played multiple analytics-oriented roles on several projects, including problem formulation, data manipulation, model building, hypothesis testing and proposing solutions that helped businesses.
  • Currently pursuing a second MS, in Business Analytics, on a part-time basis at Confidential.
  • Additional experience as a Research Assistant and Research Associate.
  • Collaborative and articulate, with the ability to analyze data, build models and report to key business stakeholders for use in strategic decision making.
  • In depth understanding of Advanced Analytical concepts and Data Mining techniques.
  • Experience in developing and implementing Statistical, Descriptive and Predictive Data Mining algorithms for a variety of problems involving large data sets.
  • Adept in understanding Business Objectives and formulating requisite hypothesis for model building.
  • Proficient in the visualization and interpretation of analytical findings from a Business perspective.
  • Skillful in communicating the results arising out of analytical models for use by the Stakeholders in strategic decision making.
  • Experience with Data Extraction, Data Validation, Data Aggregation and Data Manipulation from multiple sources using SQL, SAS, R Studio and Python.
  • Experience using different analytical tools like SAS, SPSS, XLMiner, R Studio, Python and ASPE.
  • Proficient in the building of ad hoc analytical and optimization models.
  • Proficient with data visualization tools such as Tableau, SAS, R Studio and shinyapps.
  • Excellent with report creation using MS Access, Tableau and R Markdown.
  • Adept with both the theory and approach in building Time Series Forecasting Models using Smoothing techniques, ARIMA, SARIMA, GARCH and VAR models.
  • Experience using web APIs to import data as JSON format in R and subsequent manipulation of data.
  • In-depth proficiency with Hypothesis Testing, Multiple Regression, Logistic Regression, Decision Trees, Cluster Analysis, Factor Analysis, PCA, Queueing Theory, Network Graphs and Linear Programming Models.
  • Experience with Machine Learning techniques like Neural Networks, Naïve Bayes, SVM, k-NN and Association Rules besides ensemble methods like Random Forest.
  • Proficient in writing SQL queries and experience in Data Management using Oracle SQL Developer.
  • Excellent with MS Word, PowerPoint, MS Visio (Workflows/Process Flows), MS Excel (Pivot table, macros, v-look up, h-look up, charts and graphs), MS Access and LaTeX.
  • Proficient in developing Business Process models and use case diagrams using UML methodology with MS Visio and SmartDraw.
  • Extensive experience in gathering and understanding business requirements and communicating effectively with Stakeholders, SMEs, Tech Leads and Business Users.
  • Experienced and comfortable with Software Development Life Cycle (SDLC) methodologies such as Waterfall and Agile.
  • Dynamic Team Player with an aptitude to lead the team effectively and meet deadlines.

TECHNICAL SKILLS

Analytical Tools: SPSS, SAS, R Studio, ASPE, Python, LINDO, LINGO, JMP, Spark

Other Tools: Matlab, Mathematica, MS Excel, MS PowerPoint, MS Visio, SmartDraw, Apache Spark (Hadoop, Cassandra) and Salesforce CRM.

Techniques: Multiple Regression, Logistic Regression, Decision Trees, Random Forest, Factor Analysis, Cluster Analysis, Network Graphs, Linear Programming Optimization, Time Series Forecasting, Data Mining, Text Analytics, Probit, Logit and Machine Learning Algorithms including Neural Networks, Naïve Bayes, Support Vector Machines and k-Nearest Neighbors.

Reporting Tools: Tableau, MS Access, R Studio.

Programming Language: SQL, Perl, C, JavaScript

Database: Oracle, MS Access, NoSQL (Cassandra)

Documentation: MS Word, LaTeX, R Markdown

PROFESSIONAL EXPERIENCE

Confidential, MA

Analytics Research Associate

Responsibilities:

  • Created reference databases from academic and industry journals/articles and identified keywords pertinent to analytics techniques and their business applications over the past 10 years.
  • Worked with SAS Enterprise Miner to develop descriptive Text Mining algorithms that cluster the documents into disjoint sets.
  • Analyzed the descriptive clusters to ascertain descriptive terms for both academia and industry, thereby identifying the gaps between the two.
  • Wrote scripts in R Studio to create Network Graphs for a more comprehensive analysis that further improved identification of those gaps.
  • Developed classifier models using a variety of supervised learning algorithms including neural networks, Naïve Bayes, Support Vector Machines, Decision trees and ensemble method of Random Forest to predict the financial success of a Hollywood movie.
  • Carefully selected the potential significant variables from the data, based on evidence and rationale, which had the predictive power for the classification.
  • Some of these models exceeded the classification accuracy achieved by published academic models.
  • Worked on another project to develop an application using shinyapps and Google Visualization for the visual exploration of global climate change indicators provided by the World Bank.
  • Built models to predict the enrollment probability of graduate students at Confidential to help align recruitment effort for better yields.
  • Built models using logistic regression as well as decision trees for different programs.
  • Used Cross-tabulations to group programs with similar enrollment rates.
  • Provided recommendations based on the Lift Charts.
  • Contributed to several other projects that used techniques such as Time Series Analysis and Queueing Theory, among other analytics techniques.
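The lift-chart analysis described above can be sketched roughly as follows. This is an illustrative example on synthetic scores and outcomes, not the actual enrollment model; the function name and data are hypothetical.

```python
# Minimal sketch of a lift computation: rank records by predicted
# probability, bin them, and compare each bin's response rate to the
# overall rate. All data below are synthetic, for illustration only.

def decile_lift(scores, outcomes, n_bins=10):
    """Return the lift of each bin: (bin response rate) / (overall rate)."""
    overall_rate = sum(outcomes) / len(outcomes)
    # Rank records by predicted score, highest first.
    ranked = sorted(zip(scores, outcomes), key=lambda p: -p[0])
    bin_size = len(ranked) // n_bins
    lifts = []
    for i in range(n_bins):
        chunk = ranked[i * bin_size:(i + 1) * bin_size]
        rate = sum(y for _, y in chunk) / len(chunk)
        lifts.append(rate / overall_rate)
    return lifts

# Synthetic example: higher scores should correspond to more enrollments.
scores   = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05]
outcomes = [1,   1,   1,   0,   1,   0,   0,   0,   0,   0]
lifts = decile_lift(scores, outcomes, n_bins=5)
# The top bin captures mostly enrollees, so its lift exceeds 1.
```

A lift well above 1 in the top bins is what justifies focusing recruitment effort on the highest-scored prospects.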

Confidential, IL

Data Analyst

Responsibilities:

  • Used Smoothing filters, ARIMA and SARIMA to model the stochastic part of the forecasting process using R Studio and SAS.
  • Built multiple linear regression models to model the deterministic part of the forecasting process.
  • Ran diagnostics on the various models to assess their utility in effectively forecasting the desired quantities.
  • Tested and validated the chosen models on a regular basis to determine whether the performance of the implemented models was adequate.
  • Presented the results of the forecasting to the concerned teams along with recommendations for optimizing the resources involved and thereby increasing efficiency.
  • Built nonparametric models using Machine Learning Algorithms to estimate the Customer Churn probability.
  • Built Logistic Regression models as well to estimate the Customer Churn probability using various sets of predictors.
  • Formulated cases of optimization of the network traffic as a minimum cost flow model, a type of linear programming model.
  • Solved the Minimum Cost Flow models using ASPE add-on in MS Excel and presented the results.
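The smoothing-filter step mentioned above can be sketched as simple exponential smoothing; the actual work used R Studio and SAS, so this Python version with synthetic demand data and an arbitrary alpha is illustrative only.

```python
# Minimal sketch of simple exponential smoothing for forecasting.
# The demand series and smoothing constant below are synthetic.

def exp_smooth(series, alpha=0.3):
    """Return smoothed levels; level[t] serves as the forecast for t+1."""
    level = series[0]            # initialize the level at the first observation
    levels = [level]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level   # update the smoothed level
        levels.append(level)
    return levels

demand = [100, 102, 101, 105, 107, 106, 110]
fits = exp_smooth(demand, alpha=0.5)
next_forecast = fits[-1]         # forecast for the next, unseen period
```

ARIMA and SARIMA extend this idea with autoregressive and seasonal terms; the smoothing sketch shows only the simplest member of the family.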

Confidential, MA

Data Analyst

Responsibilities:

  • Worked with the database team and provided inputs to improve the database design, making it easier to query data using SQL.
  • Manipulated data using SQL and SAS prior to building the models.
  • Built Survival models to predict the time duration after which a drug will be successful, if approved.
  • Built models, in conjunction with the research team, to predict the success of clinical trials and thereby predict the drug’s potential market, based on several other factors such as the current market share of similar drugs.
  • Built models using machine learning algorithms to classify the success of potential new drugs based on historic data that included several variables including the ailment type.
  • Created reports to compare the various drugs based on data made available by the database team.

Confidential, PA

Business Analyst

Responsibilities:

  • Built Multiple Regression models to explore the factors that influence insurance premiums for different types of insurances.
  • Built Logistic Regression models to estimate the churn probability of customers.
  • Built Probit and Logit models to estimate insurance pricing for different customer segments. These models were also used for Risk assessment, with further improvements.
  • Created reports using MS Access and Tableau as requested, from Views created using SQL.
  • Interpreted the various models along with their business implications, in addition to validating and testing the models regularly.
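A churn model of the logit type described above can be sketched as a one-predictor logistic regression fit by gradient descent. The customer data below are synthetic and the predictor (support tickets) is hypothetical; the original work used standard statistical packages rather than hand-rolled code.

```python
import math

# Minimal sketch: fit a one-predictor logistic (logit) model by gradient
# descent to estimate churn probability. All data below are synthetic.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, epochs=5000):
    """Return (intercept, slope) fit by minimizing the negative log-likelihood."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            err = sigmoid(b0 + b1 * x) - y   # gradient of the negative log-likelihood
            g0 += err
            g1 += err * x
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

# Synthetic pattern: customers with more support tickets churn more often.
tickets = [0, 1, 1, 2, 3, 4, 5, 6]
churned = [0, 0, 0, 0, 1, 1, 1, 1]
b0, b1 = fit_logistic(tickets, churned)
p_high = sigmoid(b0 + b1 * 6)   # estimated churn probability at 6 tickets
p_low  = sigmoid(b0 + b1 * 0)   # estimated churn probability at 0 tickets
```

A probit model differs only in the link function (the normal CDF instead of the logistic sigmoid); the fitting loop is otherwise analogous.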

Confidential, MA

Ecommerce Analyst

Responsibilities:

  • Collaborated with the Database team to devise an approach for Data Aggregation.
  • Created Views from multiple Tables using Oracle SQL Developer, including all the non-confidential information from the aggregated data.
  • Manipulated data using SQL, R Studio and SAS, including data reconstruction for fields where data were missing or incorrect.
  • Created simple reports, along with descriptive statistics describing Sales, using Tableau and MS Access for weekly, monthly and quarterly data.
  • Used Cluster Analysis to group consumers based on several factors, including their purchasing frequency and purchasing power; this segmentation helped launch promotions targeted at specific groups of consumers.
  • Presented the target segments, with their traits, to the Marketing team to help them devise adequate promotional offers and campaigns.
  • Built multiple regression models, using SAS and SPSS, to estimate the overall sales/ sales of a product segment based on several factors.
  • Tested the statistical significance and usefulness of the models using the Analysis of Variance F-test and the Coefficient of Determination.
  • Used the Variance Inflation Factor (VIF) to detect multicollinearity among predictors.
  • Created PowerPoint presentations describing the models, along with the interpretations, insights derived using the principle of parsimony and helped interpret the results from a Business perspective to the Stakeholders.
  • Was involved in brainstorming sessions with the Sales and Marketing teams to identify data fields to be used in the segmentation models for email campaigns.
  • Built various models to identify segments for email campaigns based on historical data.
  • Built scoring model using RFM with additional variables like demographic information.
  • Built Trees using CHAID and CART methods to score individual customers using SAS.
  • Built Multiple Regression model using Dummy Variables for the response as well as for other Categorical variables.
  • Also applied Neural Networks, a machine learning algorithm, to score individual customers.
  • Used SAS Visual Analytics to visually present various data fields to provide useful insights effectively for decision making.
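The RFM scoring step above can be sketched as follows. The customers and the rank-based scoring scheme are hypothetical illustrations; the actual model also incorporated demographic variables.

```python
# Minimal sketch of RFM (Recency, Frequency, Monetary) scoring: each
# dimension is scored by rank and the three scores are summed.
# All customer data below are synthetic, for illustration only.

def rank_scores(values, reverse=False):
    """Score 1..n by rank; the highest value gets the top score unless reversed."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=reverse)
    scores = [0] * len(values)
    for score, i in enumerate(order, start=1):
        scores[i] = score
    return scores

# Five synthetic customers:
recency   = [5, 40, 12, 90, 2]     # days since last purchase: lower is better
frequency = [10, 2, 6, 1, 14]      # number of purchases
monetary  = [500, 80, 300, 40, 900]  # total spend

r = rank_scores(recency, reverse=True)  # most recent purchase -> highest score
f = rank_scores(frequency)
m = rank_scores(monetary)
rfm = [r[i] + f[i] + m[i] for i in range(len(recency))]
best = rfm.index(max(rfm))              # customer most worth targeting
```

In practice the ranks are usually quintiles over thousands of customers, and the combined score feeds the segmentation used for targeted email campaigns.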

Confidential, NJ

Jr. Data Analyst

Responsibilities:

  • Created data tables using SQL to restructure the database as instructed, and joined different tables to make data access quicker.
  • Wrote SQL queries to extract requisite data and validated that the results were consistent with those in the existing structure.
  • Provided data needed for the reports and dashboards by creating views based on a series of queries.
  • Assisted in building the models for patient throughput and capacity forecasting.
  • Helped build models for the optimal scheduling of Emergency Rooms by minimizing patient wait times.
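The simplest queueing calculation behind wait-time models of this kind is the M/M/1 formula. The arrival and service rates below are hypothetical, and real ER models would use multi-server (M/M/c) or simulation approaches; this is only the textbook starting point.

```python
# Minimal sketch: expected time waiting in queue (Wq) for an M/M/1 system,
# the simplest single-server queueing model. Rates below are hypothetical.

def mm1_wait_in_queue(lam, mu):
    """Expected queue wait Wq = rho / (mu - lam), with rho = lam / mu."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu               # server utilization
    return rho / (mu - lam)

# Hypothetical example: patients arrive at 4/hour; one bay treats 5/hour.
wq_hours = mm1_wait_in_queue(4, 5)   # expected wait before service begins
```

Even this toy formula shows why wait times explode as utilization approaches 1, which is the intuition scheduling models exploit.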

Confidential, MA

Research Associate/ Research Assistant

Responsibilities:

  • Investigated meteorological data, extracted patterns and important features using discrete-time filtering, spectral estimation techniques, and cross-correlation analysis.
  • Groundbreaking results led to a first-author paper, published by the Journal of Atmospheric and Oceanic Technology.
  • Managed a team of undergraduate students to set up experiments for meteorological data acquisition; led the team in conducting field experiments to gather data at the Boulder Atmospheric Observatory (BAO), CO, in 2008.
  • Examined data using custom-coded routines in MATLAB to compute power spectra, periodograms, and spectrograms; observed ocean-generated infrasound signals using an absolute barometer, the first such observation reported.
  • Set up quizzes and assignments, and graded them, for a class of 60 undergraduate engineering students.
  • Explained signal processing concepts and tutored students, resulting in improved overall student performance.
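The periodogram computation described above can be sketched with a direct DFT on a synthetic signal. The original analysis used custom MATLAB routines on real barometric data; this Python version is illustrative only.

```python
import cmath
import math

# Minimal sketch of a periodogram via a direct DFT (O(n^2); fine for short
# signals). The synthetic signal below is a pure sinusoid for illustration.

def periodogram(x):
    """Return power at each DFT bin 0..n//2 for a real-valued signal."""
    n = len(x)
    power = []
    for k in range(n // 2 + 1):
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        power.append(abs(s) ** 2 / n)
    return power

# Synthetic signal: 64 samples of a sinusoid completing 8 full cycles.
n = 64
signal = [math.sin(2 * math.pi * 8 * t / n) for t in range(n)]
spec = periodogram(signal)
peak_bin = max(range(len(spec)), key=lambda k: spec[k])
# The dominant spectral peak falls at bin 8, the sinusoid's frequency.
```

Spotting such a dominant peak is exactly how a narrow-band infrasound signal stands out against background pressure fluctuations.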
