
Machine Learning Engineer Resume


SUMMARY

  • 8+ years of professional IT experience in Data Science, Machine Learning, Reinforcement Learning, and Predictive Analytics, covering project development, implementation, deployment, and maintenance using Big Data technologies.
  • Good understanding of Azure Big Data technologies such as Azure Data Lake Analytics, Azure Data Lake Store, Azure Data Factory, and Azure Databricks.
  • Set up Databricks on AWS and Microsoft Azure, configured Databricks workspaces for business analytics, and managed clusters in Databricks.
  • Worked on end-to-end data pipelines spanning data collection, cleaning, pre-processing, model building, and deployment.
  • Implemented short-term and long-term Data Science & Analytics strategies for the current business problem and presented innovative solutions to the clients.
  • Designed and developed front-end and back-end applications with machine learning capabilities (Python, PHP, JavaScript/Node.js), using web frameworks such as Django, Flask, Laravel, React, and Vue.js.
  • Tech led for multiple production AI/ML projects, leading teams of 7+ Software and ML engineers.
  • Automated machine learning pipelines in Kubeflow.
  • Worked with Business Analysts and Product Managers to frame problems, both mathematically and within the business context.
  • Researched and applied machine learning algorithms in the areas of analytics, computer-vision, NLP and time series analysis.
  • Technical expertise with data models, data mining, and segmentation techniques.
  • Expertise in integrating machine learning solutions with Hadoop, Spark, Python, SQL, Ab Initio, AWS.
  • Processed data from Kafka topics and displayed the real-time streams in dashboards.
  • Experienced in leveraging AI & Machine Learning capabilities in smart factories to provide hyper-automation and real-time analytics.
  • Working knowledge of deployment tools such as Docker, Kubernetes, and Jenkins.
  • Developed and implemented NLP models (topic modeling, semantic search, question answering, chatbots, etc.) for real-time inference on text data.
  • Delivered multiple ML solutions in the cloud for the Sales and Marketing teams that directly contributed to the top-line growth of the business.
  • Defined clear target performance metrics from ambiguous client requirements and accomplished them at a fast pace.
  • Adept at multitasking, working independently, and as part of a team as required. Very flexible at adapting to changing client needs and deadlines. Possesses strong problem-solving and communication skills.
  • Proficient in Machine Learning techniques (Supervised, Unsupervised, and Reinforcement Learning) and Statistical Modeling.
  • Capable of building large-scale neural networks.
  • Experienced in building forecasting models using various Machine Learning algorithms and Deep Learning architectures.
  • Led a team of DevOps engineers supporting machine learning Ops and software development.
  • Experience in designing and implementing data structures and using common business intelligence tools for data analysis.
  • Ability to work effectively in cross-functional team environments, excellent communication, and interpersonal skills.
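As one illustration of the NLP work mentioned above, here is a minimal topic-modeling sketch. The corpus, topic count, and the choice of scikit-learn's LDA are all illustrative assumptions, not necessarily what was used on the job:

```python
# Minimal topic-modeling sketch; corpus and parameters are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "machine learning models forecast demand",
    "deep learning improves forecast accuracy",
    "marketing emails drive sales conversion",
    "sales campaigns and email marketing analytics",
]

# Vectorize the corpus into term counts.
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

# Fit a 2-topic LDA model; the topic count is an illustrative choice.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)

# Each row is a per-document topic distribution that sums to 1.
print(doc_topics.shape)
```

In practice the same pattern scales to real corpora by swapping in a larger vocabulary, more topics, and a tuned vectorizer.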

TECHNICAL SKILLS

Languages: R, Python, SQL, SAS

Tools/Technologies: SciPy, Spark, Pandas, HDFS, Hadoop MapReduce, Tableau, TensorFlow, Keras, Power BI, Excel, Azure, MLOps

IDEs/Text Editors: Jupyter, PyCharm, Visual Studio, Excel, Spyder, RStudio, MATLAB, Notepad++, Cube IDE

Cloud: GCP, Azure, AWS

Automation Tools: Maven, Jenkins, Git, AWS Cloud CI/CD

OLAP Tools: Tableau, SSAS, Business Objects

Database: MySQL, PostgreSQL, MongoDB, SQLite, Oracle

Web Development: Django, Flask, HTML, CSS

PROFESSIONAL EXPERIENCE

Confidential, OR

Machine Learning Engineer

Responsibilities:

  • Fine-tuned GPT-3 to personalize cold emails, which increased email open rates by 20%, response rates by 4%, and conversion rates by 2%.
  • Minimized the dependency on OpenAI's GPT-3 because of its cost overhead by fine-tuning a T5 transformer in-house; this project reduced expenses by almost 60%.
  • Collaborated with the team on the PDG (Predictive Demand Generation) feature; trained an ANN to identify potential buyers from a contact list, achieving an AUC score of 0.81.
  • Built a Business Intelligence tool with which users reported up to a 10% increase in sales, driven by AI-generated suggestions and plots created from past sales campaign data.
  • Worked with teams to design and build automated pipelines that run, monitor, and retrain Client Models for business applications. Built ETL Pipelines for new and existing models.
  • Implement end-to-end solutions for batch and real-time machine learning algorithms along with requisite tooling around monitoring, logging, automated testing, performance testing.
  • Design and implement Model and Pipeline validation procedures alongside teams of Data Scientists, Data Engineers, and other Client Engineers.
  • Developed a customized model for universal Shopper profile that can detect objects on the images uploaded by the user on the web platform to ensure a personalized shopping experience.
  • Implemented and deployed ensemble methods on AWS SageMaker to assess the creditworthiness of customers.
  • Developed an MLOps framework covering feature engineering, data preprocessing, data versioning, model training, model evaluation, model versioning, batch inference, data validation, data drift detection, target drift detection, model performance monitoring, and automated notifications for model retraining.
  • Personalized search and browse recommendations for customer profiles based on their intrinsic behavior.
  • Worked on Real-time ML and experienced in handling streaming services for Machine Learning.
  • Worked on Dimensional and Relational Data Modeling using Star and Snowflake Schemas, OLTP/OLAP system, Conceptual, Logical and Physical data modeling using Erwin.
  • Involved in building long-range forecasts and short-range forecasts.
  • Migrated forecasting models from traditional machine learning algorithms to deep learning models, which further improved accuracy.
  • Built an Artificial Intelligence agent which can automatically adjust forecasts using Reinforcement Learning. This reduced the deviations in forecasts from 15% to 3%.
  • Worked on Docker and Kubernetes architecture for deploying the data pipelines.
  • Worked on Databricks environment and created delta lake tables.
  • Worked on airflow as a scheduling and orchestration tool.
  • Worked on performance tuning of spark jobs for better performance and cost savings.
  • Created external storage integrations within Snowflake for importing and exporting data to and from Snowflake.

Confidential, GA

Machine Learning Engineer

Responsibilities:

  • Implemented Apache Airflow for authoring, scheduling and monitoring Data Pipelines
  • Designed & built infrastructure for the Google Cloud environment from scratch
  • Experienced in ETL concepts, building ETL solutions and Data modeling
  • Leveraged cloud and GPU computing technologies for automated machine learning and analytics pipelines, such as AWS, GCP
  • Involved in creating notebooks for moving data from raw to stage and then to curated zones using Azure Databricks.
  • Deployed Windows Kubernetes (K8s) clusters with Azure Container Service (ACS) from the Azure CLI, and utilized Kubernetes and Docker as the runtime environment of the CI/CD system to build and test, with Octopus Deploy for releases.
  • Built an MLOps pipeline monitoring system with Azure Databricks, Data Factory, and Logic Apps, using MLflow to detect data and model drift and trigger email alerts so users can retrain a model when it goes stale over time.
  • Leveraged Azure ML experiments to submit and track different runs, Blob storage for data storage and Azure Container Instance for deploying models to production.
  • Implemented MLOps guide for the projects from business requirement gathering till Model Operationalization.
  • Led a team of DevOps engineers to support machine learning Ops and software development
  • Built a search engine using a deep learning transformer model (RoBERTa), Celery, Redis, Elasticsearch, MySQL, React, Flask, and Nginx.
  • Led data operations for ML workflows at Confidential to reduce order processing errors at leading international restaurant chains via computer vision solutions for real-time analytics.
  • Created data pipelines to use for business reports and process streaming data by using Kafka on-premises cluster.
  • Developed highly complex Python and Scala code, which is maintainable, easy to use, and satisfies application requirements, data processing, and analytics using inbuilt libraries.
  • Worked on Docker and Kubernetes architecture for deploying the data pipelines.
  • Deployed machine learning models and pipelines built using Docker and Kubernetes architectures.
  • Worked on price optimization, which increased both sales and profits for the organization.
  • Conduct machine learning proof-of-concepts and architect/lead production of state-of-the-art intelligent solutions that deliver business values and continuously innovate on behalf of businesses.
  • Drive vision of the machine learning team in areas of fundamental algorithms, NLP, computer vision, machine learning platforms/infrastructure.
  • Worked closely with the Supply Chain Management team to deliver the right quantity of products to stores based on demand, with that quantity predicted by forecasting models built using Machine Learning and Deep Learning approaches.
  • Built a modern data & advanced analytics platform on AWS to acquire data from multiple sources, centralize and catalog data assets, enable secure access control, accelerate AI & ML models and pipelines, and automate deployment.
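The price-optimization work mentioned above can be illustrated with a toy model. The demand curve, unit cost, and price grid below are invented for the example; a real project would estimate demand from historical sales data:

```python
# Illustrative price-optimization sketch over a synthetic linear demand curve.
import numpy as np

prices = np.linspace(5.0, 20.0, 151)   # candidate price points ($5-$20, $0.10 steps)
demand = 1000.0 - 40.0 * prices        # assumed linear demand: units sold at each price
profit = (prices - 4.0) * demand       # unit cost assumed to be $4

# Pick the price that maximizes profit on the grid.
best_price = prices[np.argmax(profit)]
print(round(float(best_price), 2))  # → 14.5
```

The grid search finds the maximizer of the quadratic profit function; with a fitted (rather than assumed) demand model, the same loop yields a data-driven price recommendation.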

Confidential, MO

Data Scientist

Responsibilities:

  • Experienced in designing and deploying Hadoop clusters and different Big Data analytic tools including Pig, Hive, HBase, Oozie, Sqoop, Kafka, Spark with Cloudera distribution.
  • Developed a scalable and configurable Auto ML solution involving multiple Regression and Classification Algorithms that Optimizes features, algorithms & hyperparameters. Reduced the experimentation time by weeks.
  • Configured, deployed, and maintained multi-node Dev and Test Kafka clusters.
  • Proficient with Snowflake architecture and concepts.
  • Created a logistic regression classification model within GCP BigQuery.
  • Proficient in building interactive visualization dashboards in Tableau.
  • Built an application that automatically predicts the crop type, plant type, and the pests, insects, and fungi affecting the crops from a user-uploaded image.
  • Large neural networks with several different architectures were used to implement this model, which was deployed into a web application.
  • Worked on computer vision (YOLO algorithm) which helps in detecting the insects damaging the crops so that proper precautions can be taken.
  • Also worked on object detection and movement detection, deployed into the web application by combining AWS DeepLens with AWS services such as Rekognition, Greengrass, and Lambda to identify intruders.
  • Involved in building various predictive and forecasting models that forecast features such as rain, soil fertility, and soil strength.
  • Based on the model outputs, field representatives give proper guidance to the farmers, which in turn improves crop yield.
  • Worked on building a recommendation system that recommends the best pesticides, fertilizers, soil nutrients, and crop types to plant based on soil conditions, water availability, rain forecasts, moisture, and environmental conditions, within the user's budget.

Confidential

Data Engineer

Responsibilities:

  • Collaborated with Business Analysts, SMEs across departments to gather business requirements, and identify workable items for further development.
  • Partnered with ETL developers to ensure that data was well cleaned and the data warehouse stayed up to date for reporting purposes, using Pig.
  • Explored Spark to improve the performance and optimization of existing algorithms in Hadoop using SparkContext, Spark SQL, PostgreSQL, DataFrames, OpenShift, Talend, and pair RDDs
  • Involved in integrating the Hadoop cluster with the Spark engine to perform batch and GraphX operations.
  • Enabling ML tools and platforms within Advanced Analytics & Big data team. Supporting data science needs for the platform marketing team to introduce prediction capabilities for enhancing business processes.
  • Involved in defining the source to target Data mappings, business rules, and data definitions.
  • Created Informatica mappings using various transformations such as Joiner, Aggregator, Expression, Filter, and Update Strategy.
  • Hosted client calls for project planning and business analysis to provide oversight of long term and short-term technical project initiatives.
  • Generated report on predictive analytics using Python and Tableau including visualizing model performance and prediction results.
  • Provided business intelligence analysis to decision-makers using an interactive OLAP tool.
  • Developed a chatbot that builds a user's resume by asking the necessary questions; all user responses were stored in a MongoDB database.
  • The responses were then retrieved from MongoDB and placed into a resume template. Since this is a web application, users can directly download their resume after answering all the questions asked by the chatbot.
  • Developed an application that forecasts stock prices using TensorFlow and Keras; it has a unique feature that can automatically buy or sell stocks based on predictions.
  • These forecasting models were built using neural networks with architectures such as LSTMs and RNNs.
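The one-step-ahead forecasting idea behind the stock application can be shown with a much simpler stand-in than an LSTM: an AR(1) model fit by least squares on a synthetic series. The series, coefficient, and noise level are invented for the example:

```python
# Simplified stand-in for the time-series forecasting described above:
# fit an AR(1) coefficient by least squares on a synthetic series.
# (The actual work used LSTM/RNN architectures; this just shows the idea.)
import numpy as np

rng = np.random.default_rng(1)

# Synthetic series following x[t] = 0.9 * x[t-1] + noise.
x = np.zeros(200)
for t in range(1, 200):
    x[t] = 0.9 * x[t - 1] + rng.normal(scale=0.1)

# Least-squares estimate of the autoregressive coefficient.
phi = float(np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1]))

# One-step-ahead forecast from the last observation.
forecast = phi * x[-1]
print(round(phi, 2))
```

An LSTM replaces the single linear coefficient with a learned nonlinear function of a longer history, but the train-on-lagged-values, predict-next-step loop is the same.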
