AWS Automation Engineer Resume
SUMMARY
- Over 11 years of industry experience in all aspects of the software life cycle, including system analysis, design, development, testing, and implementation; this includes 6+ years in AWS cloud services (EC2, S3, IAM, VPC, Athena, CloudFormation, SQS, SNS, CloudWatch, Kinesis, Lambda, API Gateway, EMR, DynamoDB) and DevOps tools (Jenkins, Ansible), and 2 years of experience with the Google Apigee platform.
- Experienced with Ansible playbooks, modules, and roles.
- Experienced with Kubernetes administration and deployment.
- Experienced deploying Spring Boot based Java applications on the Kubernetes platform.
- Experienced with Dockerfile creation to build images for Java web applications.
- Experienced in Java and Python development.
- Experienced with migrations from on-premises to AWS.
- Provided 24/7 support on a weekly rotation for all Apigee Confidential customers.
- Experienced leading a team of 7, including offshore members in India.
- Implemented AWS security features (e.g., audit logging, restrictive IAM policies) for a real-time data ingestion application.
- Implemented OAuth for REST APIs.
- Re-architected the existing platform to improve the performance of the real-time data ingestion application.
- Architected and implemented a real-time data analytics platform using Kinesis, Firehose, Lambda, and S3.
- Implemented a custom authorizer in API Gateway for a serverless application to post data.
- Architected and designed cloud monitoring services using AWS services.
- Designed a distributed processing architecture to monitor and maintain continuous security and compliance using AWS services and Python.
- Architected and designed a move from server-based to serverless deployment to reduce unnecessary cost and management effort.
- Worked with the cloud tools CloudHealth, New Relic, and Alert Logic to analyze application performance, provide cost solutions to the customer, and analyze logs.
- Good understanding of CI/CD pipelines and Jenkins plugins.
- Worked with various customers to provide Cloud architect solutions based on their needs.
- Programming experience in Python, Java, J2EE, UNIX Shell Scripting, and Ajax.
- Experienced working with methodologies such as Agile Scrum, Rally, RUP, and SDLC Waterfall.
- Has extensive domain knowledge of automotive, retail, and pharma.
- Providing infrastructure-level operational support to different AWS project teams.
- Strong analytical, problem-solving, organizational, and planning skills.
- Worked on 24/7 support for highly critical systems on the Confidential project.
- Good understanding of AWS EMR, Hive, Hadoop, HDFS, and Sqoop; developed custom MapReduce programs in Java.
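The real-time ingestion flow described above (events pushed through Kinesis Firehose and delivered to S3) can be sketched roughly as follows. The delivery stream name and record shape are hypothetical, and boto3 is imported lazily so the encoding helper stands on its own:

```python
import json

def encode_record(event: dict) -> dict:
    """Serialize one event as a newline-delimited JSON Firehose record."""
    return {"Data": (json.dumps(event, sort_keys=True) + "\n").encode("utf-8")}

def send_batch(events, stream_name="ingest-stream"):
    """Push a batch of events to Kinesis Firehose, which delivers them to S3.

    The stream name is a made-up example; a real deployment would also
    inspect FailedPutCount in the response and retry failed records.
    """
    import boto3  # imported here so the helper above needs no AWS setup
    firehose = boto3.client("firehose")
    return firehose.put_record_batch(
        DeliveryStreamName=stream_name,
        Records=[encode_record(e) for e in events],
    )
```

Newline-delimited JSON keeps the S3 objects directly queryable from Athena or EMR, which fits the analytics platform mentioned above.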
TECHNICAL SKILLS
AWS Cloud services: EC2, RDS, Redshift, S3, IAM, CloudFormation, VPC, SQS, SNS, Athena, CloudWatch, Lambda, API Gateway, DynamoDB, EMR, Kinesis Streams, and Kinesis Firehose.
Cloud Orchestration Tools: Puppet, Ansible.
Programming Languages: Python, Java.
Scripting: Shell Scripting, PowerShell Scripting.
Frameworks: Spring 3.0, Hibernate 3.0, Struts 2.0, LDAP.
Database: Oracle 11g, Microsoft SQL Server and MySQL.
Servers: JBoss 6.0, IBM Web Sphere 6.0, Tomcat 7.2.24.
XML Technologies: HTTP, SOAP, XSLT, HTML.
Scripting Language: JQuery, Shell Scripting, Ajax, JavaScript, and CSS.
Version Control & Build: GitHub, Maven, SVN.
Methodologies: Agile SCRUM, SDLC Waterfall.
PM Tools: Confluence, Jira, Stash.
PROFESSIONAL EXPERIENCE
AWS Automation Engineer
Confidential
Responsibilities:
- Worked as a Kubernetes application developer, writing YAML manifests to deploy Spring Boot based applications on the Confidential Kubernetes platform.
- Set up a CI/CD pipeline for a Spring Boot based Java web application.
- Set up autoscaling for Apigee RMP nodes and centralized logging for autoscaled instances; developed shell scripts and Ansible playbooks, and configured a user-data shell script that calls the playbooks to install the required components on each EC2 instance.
- Developed Python-based AWS Lambda functions for a security approval flow and for automatic removal of terminated instance information.
- Expanded the Apigee platform to the us-west-2 region, giving Confidential Apigee customers much faster performance; developed Ansible playbooks to automate the expansion.
- Developed rehydration scripts using Ansible, Python, and shell script to automate rehydration of AWS instances once every three months.
- Developed AWS CloudFormation templates to create Auto Scaling group and CloudWatch resources.
- Created a custom Jenkins plugin in Java to automate promotion of Apigee resources from one environment to another, and successfully released it to production.
- Led a team of 7, including an offshore team in India, coordinating JIRA task assignment and tracking.
- Set up Jenkins master and slave nodes to deploy the custom Apigee proxy migration Jenkins plugin.
- Created Ansible scripts to set up the Apigee environment.
- Created a custom plugin to execute CloudFormation templates in any AWS account using Java, Jenkins, and Ansible.
- Developed custom Ansible modules in Python that call Confidential's internal Nebula service to set up host groups dynamically.
- Moved Apigee analytics data older than a given number of years from the Apigee Postgres database to S3, then into Redshift so the data can be viewed in a Grafana dashboard.
- Worked with Aurora MySQL to set up databases and tables for the CI/CD plugin and Apigee user onboarding.
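The terminated-instance cleanup Lambda mentioned above could look roughly like this. The event shape follows the standard EC2 state-change CloudWatch event, while the DynamoDB table name is hypothetical:

```python
def extract_terminated_instance(event: dict):
    """Return the instance id from an EC2 state-change CloudWatch event,
    or None if the event is not a termination."""
    if event.get("source") != "aws.ec2":
        return None
    detail = event.get("detail", {})
    if detail.get("state") != "terminated":
        return None
    return detail.get("instance-id")

def lambda_handler(event, context):
    """Delete the terminated instance's record from a (hypothetical)
    DynamoDB inventory table."""
    instance_id = extract_terminated_instance(event)
    if instance_id is None:
        return {"deleted": None}
    import boto3  # lazy import keeps the parser testable without AWS
    table = boto3.resource("dynamodb").Table("instance-inventory")  # made-up table name
    table.delete_item(Key={"instance_id": instance_id})
    return {"deleted": instance_id}
```

Wiring this handler to a CloudWatch Events (EventBridge) rule filtered on `"state": ["terminated"]` keeps the inventory table from accumulating stale entries.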
Environment: AWS Cloud, Python programming, Ansible, Java, Jenkins, Kubernetes and Docker.
AWS Cloud Architect
Confidential
Responsibilities:
- Advised on and implemented all things AWS, particularly the DevOps, security, and self-service aspects of the platform.
- Created monitoring graphs using Grafana dashboards.
- Used a multi-machine setup to perform load testing on AWS Lambda.
- Implemented a blue-green deployment architecture for microservices.
- Architected and implemented OAuth authorization for public-facing REST APIs using Auth0 and Confidential internal security.
- Architected and implemented API Gateway for REST APIs, with a custom authorizer that grants access based on the bearer JWT token.
- Implemented monitoring using Grafana and Datadog.
- Developed a Hadoop MapReduce Java program to validate data from S3, correct it, and write it back to Amazon S3; deployed the code to an AWS EMR cluster.
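The custom authorizer described above returns an IAM policy document that API Gateway evaluates before invoking the backend. This is a minimal sketch in which the bearer-token check is a placeholder for real JWT validation:

```python
def build_policy(principal_id: str, effect: str, resource: str) -> dict:
    """Build the response shape API Gateway expects from a Lambda
    (custom) authorizer: a principal plus an IAM policy document."""
    return {
        "principalId": principal_id,
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": effect,
                "Resource": resource,
            }],
        },
    }

def authorizer_handler(event, context):
    """Allow or deny based on the Authorization header; a real
    implementation would verify the JWT signature and claims here."""
    token = event.get("authorizationToken", "")
    effect = "Allow" if token.startswith("Bearer ") else "Deny"
    return build_policy("user", effect, event.get("methodArn", "*"))
```

Returning a `Deny` policy (rather than raising an error) yields a clean 403 for callers with a malformed token.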
Environment: AWS Cloud, Python, Java 1.8, Unix Shell Scripting, AngularJS, Ajax, Jenkins, Git, Stash, Jira, AWS Boto SDK, AWS CloudFormation, AWS CLI, Jinja templating, YAML, Python unittest, Mock, Moto (mock for Boto), Amazon DynamoDB, Elastic Beanstalk, Kinesis, EMR, and Kinesis Firehose.
Senior Cloud DevOps Engineer/Cloud Architect
Confidential
Responsibilities:
- Worked as a Python developer on the design and development of the Xbot framework using the Boto SDK.
- Programming experience with the S3, EC2, RDS, Redshift, Athena, CloudFormation, and VPC services.
- Developed front-end web apps using the Python Flask framework, jQuery, Ajax, and AngularJS.
- Deployed applications to public clouds such as AWS, Azure, and Google Cloud Platform.
- Created CloudFormation templates to build the network setup and IAM groups, roles, and users.
- Implemented disaster-recovery and high-availability systems in AWS.
- Architected and designed serverless application CI/CD using the AWS Serverless Application Model.
- Designed and developed a web tool to monitor the server build process across AWS platforms; end users can log in and check EC2 server build status.
- Reviewed code developed by peer team members and followed the PEP 8 Python standard.
- Created a pipeline in Jenkins to build and test Xbot code.
- Architected and designed serverless monitoring of each account's resources using AWS.
- Worked with automated testing tools such as Selenium and JMeter.
- Architected and designed a source code framework that lets customers focus on their development rather than on deployment.
- Worked with various Confidential customers to provide architectural solutions for their on-premises model.
- Created a web application for customers to view application data, alarms, tickets, and logins for their own AWS accounts.
- Implemented service-oriented architecture (SOA and REST).
- Used Sqoop to import data from Oracle into EMR HDFS.
- Worked with Hive on AWS EMR, creating external tables and Hive partitions to analyze data imported from AWS S3 and write it back to S3.
- Worked with Oracle, Postgres, and MySQL databases on AWS to migrate data from on-premises Confidential & Confidential systems to AWS databases.
- Used AWS DMS (Database Migration Service) to migrate data from on-premises to the AWS cloud.
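The external-table and partition work above can be illustrated with a small helper that generates the `ADD PARTITION` DDL for S3-backed Hive data; the table name, partition column, and bucket are made up for the example:

```python
def add_partition_ddl(table: str, dt: str, bucket: str) -> str:
    """Generate Hive DDL registering one date partition whose files
    live under an S3 prefix (hypothetical table/bucket names)."""
    location = f"s3://{bucket}/{table}/dt={dt}/"
    return (
        f"ALTER TABLE {table} ADD IF NOT EXISTS "
        f"PARTITION (dt='{dt}') LOCATION '{location}';"
    )
```

Generating the statements in Python lets a daily job register new S3 prefixes as they arrive, so Hive queries on EMR see fresh data without a full table scan.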
Environment: AWS Cloud, Python, Python Flask framework for web development, Unix Shell Scripting, AngularJS, Ajax, Jenkins, Git, Stash, Jira, AWS Boto SDK, AWS CloudFormation, AWS CLI, Jinja templating, YAML, Python unittest, Mock, Moto (mock for Boto), Amazon DynamoDB, MySQL, Oracle, MS SQL Server.
DevOps Developer
Confidential, Atlanta, GA
Responsibilities:
- Developed a web application using the Spring framework to move images from on-premises storage to the cloud.
- Developed CloudFormation templates to create the network setup between the on-premises data center and the AWS cloud account.
- Worked with Amazon SWF (Simple Workflow) to track image status during the on-premises-to-cloud move.
Environment: Java 1.7, Spring 1.3, AWS Cloud, GitHub.