
Data Science Research Analyst Resume


SUMMARY:

  • Mathematics and Computer Science graduate with more than 12 years of work experience.
  • Eight years of experience in data analysis and data science, and four years in business development.
  • Expertise in SAS, SQL, and Python programming for data science and machine learning; experienced in applying agile practices to project development.
  • Highly motivated individual committed to quality work. Domain expertise in banking, auto, retail, channel distribution, trading, oil and gas, the electrical industry, health care, and higher education.
  • Work experience and knowledge to test and deploy end-to-end solutions.
  • Trained and deployed machine learning models using notebooks, Automated ML, and Designer.
  • Set up environments from data ingestion through Azure Data Factory, data lakes, HDInsight clusters, and Azure Databricks to Azure SQL Database for reporting.
  • Performed correlation analysis; simple linear, multiple, and logistic regression; categorical data analysis; cluster analysis; classification modeling; deep learning; time series models (ARIMA, trend, seasonality); decision trees; forecasting; factor analysis; neural networks; and other advanced statistical and econometric techniques.
  • Descriptive, predictive, and prescriptive modeling with machine learning.

TECHNICAL SKILLS:

Analytical Software: SAS, SQL, and Python.

SAS Modules: SAS Base, SAS/SQL, SAS Macros, SAS/STAT, SAS Enterprise Guide, SAS Enterprise Miner, SAS/OR (Operations Research).

Python libraries: NumPy, Pandas, Matplotlib, scikit-learn, PySpark, Spark SQL, Spark MLlib.

Machine learning: Training and evaluating regression, classification, clustering, and deep learning models.
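As an illustrative sketch of the train-and-evaluate workflow above (synthetic data and model choice are mine, not from this resume), a minimal scikit-learn regression example:

```python
# Minimal train/evaluate loop for a regression model with scikit-learn.
# The data here is synthetic and purely illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                                  # 3 synthetic features
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)               # train
r2 = r2_score(y_test, model.predict(X_test))                   # evaluate on held-out data
print(f"held-out R^2 = {r2:.3f}")
```

The same fit/predict/score pattern carries over to scikit-learn's classification and clustering estimators.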

TensorFlow: Application of TensorFlow modules for image processing.

Cloud Technologies: Azure ML, AWS S3, EC2, RDS, AWS Forecast.

Full-stack development: Intermediate knowledge of HTML, CSS, Django, Git, and GitHub.

WORK EXPERIENCE:

Data Science Research Analyst

Confidential

Responsibilities:

  • Evaluated the data requirements for various groups including the president, provost, faculty, staff, and students to identify information needs.
  • Coordinated with different departments in developing data solutions and data products.
  • Analyzed and prepared data, identifying patterns in datasets by applying historical models.
  • Collaborated with senior data scientists to develop analytical solutions.
  • Performed data manipulation, data preparation, normalization, and predictive modeling.
  • Served as co-PI for a research project, “Predictor of Student Success”; the study helped directors make policy decisions on continuing support for students in need.
  • Designed SAS quantitative methods, such as logistic regression, to study the effectiveness of tutoring videos for students in academic distress.
  • Advised department on implementation of analytical projects and data storage in EDW.
  • Utilized SAS SQL to automate daily operations of an analytical team of three processing marketing data reports.
  • Published daily reports in Power BI, Tableau, and SAS.
  • Documented project procedures, report submissions and SAS code.
  • Established good communications procedures across departments for consistent data flow.
  • Served as project manager in integrating and implementing new software while coordinating with resources across departments.
  • Applying SAS linear programming models to allocate classes per the registrar’s requirements; the model is under study.
  • Applied the SAS OPTLP and OPTMILP procedures along with machine learning.
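The resume's linear programming work uses SAS PROC OPTLP/OPTMILP; as an illustrative analogue outside SAS (toy objective and constraints of my own choosing, not the registrar model), the same kind of problem can be expressed with SciPy's `linprog`:

```python
# Illustrative analogue of the kind of LP that SAS PROC OPTLP solves,
# expressed with scipy.optimize.linprog. Toy problem:
#   maximize 3x + 5y  subject to  x + 2y <= 14,  3x - y >= 0,  x - y <= 2.
from scipy.optimize import linprog

c = [-3, -5]                        # linprog minimizes, so negate to maximize
A_ub = [[1, 2], [-3, 1], [1, -1]]   # all constraints rewritten as A_ub @ x <= b_ub
b_ub = [14, 0, 2]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
x, y = res.x
print(f"x = {x:.2f}, y = {y:.2f}, objective = {-res.fun:.2f}")
```

Integer allocation decisions (the OPTMILP case) would additionally require a mixed-integer solver, e.g. `scipy.optimize.milp`.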

Confidential, Houston, TX

Data Lead Consultant

Responsibilities:

  • Provided services developing data products and machine-learning models, including assistance with startup projects for a retail client.
  • Provided end-to-end environment setup for machine learning solutions.
  • Interacted with Business professionals, application developers and technical staff in building machine-learning algorithms in agile environment.
  • Utilized SAS products, Python, SQL, SQLite and ETL tools to access relational and non-relational databases and extract data for further processing.
  • Working knowledge of Azure Machine Learning, Google Cloud, and AWS instances for analytics.
  • Explored and analyzed data with SAS and Python.
  • Trained and evaluated regression, classification, clustering, and deep learning models.
  • Created workflows to schedule data-utilization and analysis tasks using SAS Enterprise Guide and Airflow.
  • Knowledge of continuous integration and continuous deployment using Azure DevOps.
  • Improved reports and scheduled automated tasks to distribute reports.
  • Created dashboards and reports using Power BI, Tableau, and SAS.
  • Trained student community on analytical and mathematical models.
  • Worked with statistics, machine learning, and natural language processing applications.

Data Lead Consultant

Confidential, Atlanta, GA

Responsibilities:

  • Actively involved with new strategy group at Confidential to build analytical applications.
  • Developed “DealShield” as a database application while applying extensive analytical techniques.
  • Participated in database design; the application’s main functionality was to analyze data, interpret results, and produce automated reports.
  • The role involved extensive data mining operations and required extensive statistical and modeling expertise.
  • Developed summary and inferential knowledge on data behavior using regression methods and descriptive stats.
  • Developed SAS stored processes for business users to extract transactional data and summary data through web browser.
  • Developed automated reporting system to schedule and distribute business summary reports through automated email.
  • Extracted data from different sources and built tables and views for reporting.
  • Distributed daily log data files through automated email systems.
  • Aggregated data from multiple data sources and distributed it to business users.
  • Actively involved in developing new database system, a rewrite to the existing model.
  • Developed a GIS application to compute the actual distance between two business locations or a customer group at street-level addresses. The product is an application of Census TIGER files and statistical modeling.
  • Gathered business requirements from various groups to identify data requirements for application development.
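The street-level distances above require routing over the TIGER road network; as a minimal hypothetical baseline (the function, coordinates, and city pair below are mine, not from the resume), the great-circle distance between two locations can be sketched as:

```python
# Hypothetical sketch, not the original GIS application: true street
# distance needs a routing engine over TIGER data, but the great-circle
# (haversine) distance is a common lower-bound baseline.
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 3958.8 * asin(sqrt(a))   # 3958.8 = Earth's mean radius in miles

# Example: Houston, TX to Atlanta, GA (roughly 700 miles great-circle)
print(round(haversine_miles(29.76, -95.37, 33.75, -84.39), 1))
```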

Lead Consultant

Confidential, Troy, MI

Responsibilities:

  • Led the design, execution, and analytical plan for Total Loss Vehicle (“TLV”) validation; TLV is a vehicle identification system.
  • Evaluated the current TLV system to deliver on-time solutions with high quality.
  • Redesigned and built statistical models to estimate the true market value of total-loss vehicles.
  • Provided analytical support to key clients using advanced regression estimation methods.
  • Identified client needs for additional data-solutions services and conveyed them to the business development / client service team.
  • Performed advanced ETL operations using SAS/SQL, PL/SQL, and macros to retrieve input data from various web-based sources.
  • Actively participated in the modeling, prototyping, coding, testing, and documentation stages.
  • Designed analytical solutions to increase readership.
  • Participated in designing survey questions and identifying relevant methods of estimation.
  • Applied multivariate testing methods and their applications.

Research Assistant

Confidential, MS

Responsibilities:

  • Performed data cleaning and analysis on historical price data collected from USDA database to support research activities.
  • Applied SAS Base and Enterprise Guide extensively to perform the necessary transformations.
  • Executed extensive analysis of historical price series to generate seasonal price indexes.
  • Analyzed past data to model deterministic and stochastic seasonal factors.
  • Identified trend coefficients by performing regression analysis.
  • Performed investment analysis using SAS/STAT to measure NPV and IRR.
  • Applied time-series models to forecast sales and inventory levels.
  • Applied trend and seasonal factors in developing sales and inventory forecasts for gas stations.
  • Developed a dashboard for sales and inventory at the tank level, utilizing IoT sensor data obtained through the Kachoolie database.
  • Applied linear programming, as a big data application, to find the optimal product mix for profit maximization.
  • Coordinated with various business functions to gather data and implement model improvements; applied end-to-end machine learning solutions.
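The seasonal price indexes described above can be sketched minimally (with synthetic data, not the USDA series) as each month's average price divided by the overall average:

```python
# Illustrative sketch with synthetic data (not the USDA price series):
# a simple seasonal price index = monthly average price / overall average.
import numpy as np

months = np.tile(np.arange(1, 13), 3)                          # 3 years of monthly data
prices = 100 + 10 * np.sin(2 * np.pi * (months - 1) / 12)      # built-in seasonal pattern

overall_mean = prices.mean()
index = {m: prices[months == m].mean() / overall_mean for m in range(1, 13)}

# Index > 1 marks seasonally high-price months, < 1 low-price months.
peak_month = max(index, key=index.get)
print(f"peak month = {peak_month}, index = {index[peak_month]:.2f}")
```

On real data, the deterministic seasonal factors mentioned above would be estimated the same way after detrending, with stochastic components handled by time series models such as ARIMA.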
