Site Reliability Engineer / Sr. AWS DevOps Engineer Resume
Dallas, TX
SUMMARY
- Experienced IT professional with over 5 years of experience optimizing systems, supporting infrastructure, monitoring, and improving performance and efficiency.
- Highly proficient in working with Infrastructure components, Azure/AWS DevOps, and application testing.
- Hands-on experience in using DevOps tools such as Jenkins, Maven, Docker, Kubernetes, Ansible, Splunk, and CI/CD pipelines.
- Automated infrastructure provisioning on AWS using Terraform and Ansible
- Developed Docker-based microservices deployment with Kubernetes, Jenkins, and Ansible-based pipelines/frameworks.
- Wrote Ansible playbooks for code deployment and configuration management on Linux
- Performed releases with the Maven release plugin and managed artifacts.
- Set up Jenkins master, added the necessary plugins, added more slaves to support scalability and agility
- Set up a data warehouse, integrating Databricks, Data Factory, data lakes, and SQL databases
- Processed and transformed large volumes of structured data, exploring the data through machine learning models using Python, Scala, SQL, AWS Lambda and RDS, and Azure DevOps CI/CD pipelines
- Created a Python API backed by an Oracle database and created AWS Lambda functions to host the API
- Set up CI/CD pipelines for Microservices & integrated tools such as Jenkins, Maven, Bitbucket, SonarQube, Nexus, Docker, and Slack for providing immediate feedback to DEV teams after code check-in
- Used Terraform to manage AWS applications and Azure DevOps CI/CD pipelines
- Worked on Splunk for monitoring, searching, analyzing, and visualizing the machine-generated data in real-time
- Worked on Azure DevOps CI/CD pipelines, integrating code using Azure Cloud endpoints
AREAS OF EXPERTISE
- Cloud platforms: AWS (AWS CLI, AWS monitoring using CloudWatch), Microsoft Azure Pipelines & Azure Data Warehouse, Pivotal Cloud Foundry (PCF)
- Framework/DevOps tools: Maven, SonarQube, Nexus, Jenkins, Slack, RedHat
- Azure DevOps CI/CD Pipeline, VMware, CentOS, Git, GitHub, Bitbucket, GitLab
- Puppet, Ansible, Terraform, RedHat, CentOS, VMware, Docker to manage instances
- Containers, Terraform, Infrastructure as Code, Terraform variables
- Terraform modules, application/web servers, Tomcat
- Apache, Red Hat Linux, Windows
- Oracle, SQL Server, MySQL, PostgreSQL, AWS Lambda, RDS
- Selenium, Nagios, AWS CloudWatch, Groovy, UNIX, Git Bash, PuTTY, Python
- JSON, Chef, Puppet, Ansible, DevOps security
TECHNICAL SKILLS
- AWS
- JavaScript
- CI/CD Pipelines
- Jenkins
- Maven
- Site Reliability
- Cloud Engineering
- Terraform
- GIT
- Ansible
- Azure
- Python
- Shell
- Bash
PROFESSIONAL EXPERIENCE
Site Reliability Engineer/ Sr. AWS DevOps Engineer
Confidential, Dallas, TX
Responsibilities:
- Performed change management functions involving software deployment and server maintenance requests
- Coordinated and planned changes with other technology teams involving servers within our environment
- Responded to and mitigated incidents as they occurred within the environment
- Contributed details to the post-incident review process
- Communicated and escalated customer impacting issues according to department guidelines
- Represented operations in a technical fashion to leadership and development teams
- Led projects, and engagements that evolve the stability, scalability, & supportability of the Cobalt Services environment
- Partnered with development, and other operations teams to continue evolving our monitoring & operational procedures for the Cobalt Services architecture
- Ensured service requests are completed and assigned with the appropriate priority within the operations teams
- Updated support documentation as needed as environment changes occur
- Wrote Shell, Bash, Python, and PowerShell scripts for setting up baselines, branching, merging, and automation processes across the environments using SCM tools like Git
- Created a Python API backed by an Oracle database and created AWS Lambda functions to host the API (a minimal sketch appears after this list)
- Deployed the API into Lambda using Jenkins and managed the pipeline with a Bogiefile workflow
- Recommended system, process, and procedure improvements within the team and department, and sought opportunities to participate in special projects/special project teams
- Set up day-to-day patching, draining servers and applying patches to US Prod databases
- Wrote Ansible playbooks for code deployment and configuration management on Linux
- Placed all servers into maintenance, using Jenkins jobs to begin and end maintenance on all applications
- Used Jenkins to rebuild release code and UberGUI to enable and disable drained servers
- Used Jenkins tools for the smoke test
- Used perfsat tools for site scopes, and built and maintained end sites to confirm all apps were in safe mode and in working condition
- Virtualized servers using Docker for test and dev environment needs, and automated configuration using Docker containers
- Used Jenkins jobs to start applications and stop them when necessary
- Used GOTMON GUI to manage all Linux and Windows servers
- Used AWS Cloud tools, set up CyberArk, and set up IaC for the new release
- Used ServiceNow for access requests, managed all tasks, and closed tasks when done
- Worked with the Prism client on hosting environment status
- Used Datadog to monitor the environment
- Worked with Cobalt Services on alert rules to monitor all alerts in AppMon and the app TimingGUI
- Used CMDB tools to manage application coherence, servers, layouts, and server profiles, and deployed them with the release set
- Performed day-to-day deployment activities on PROD and QED, and maintained and monitored the environments
- Worked on CSS-CM Cloud collaboration
- Worked on day-to-day database migrations
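A minimal sketch of the kind of Lambda-hosted Python API backed by Oracle DB described above, assuming the python-oracledb driver is packaged with the function; the environment variable, table, and column names are illustrative placeholders, not the actual project's identifiers:

```python
# Minimal sketch of a Lambda-hosted Python API backed by an Oracle database.
# Env vars, table, and columns are illustrative assumptions.
import json
import os

import oracledb  # python-oracledb driver; assumed packaged in a Lambda layer


def handler(event, context):
    """Return rows from an example table as a JSON API response."""
    conn = oracledb.connect(
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASSWORD"],
        dsn=os.environ["DB_DSN"],  # e.g. host:port/service_name
    )
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT id, name FROM example_items FETCH FIRST 25 ROWS ONLY")
            rows = [{"id": r[0], "name": r[1]} for r in cur]
    finally:
        conn.close()

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(rows),
    }
```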
DevOps/AWS Cloud Engineer
Confidential, Austin, TX
Responsibilities:
- Was an integral member of the DevOps practice team
- Main responsibility was to migrate legacy applications to PCF and the Azure DevOps cloud using DevOps tools like GitHub, Jenkins, Maven, Nexus, Docker, SonarQube, & Slack, while using Splunk for setting up the data warehouse
- Set up CI/CD using GitLab to build and deploy on PCF Cloud
- Performed rollback deployments using Ansible Tower
- Used Splunk functions for searching indexes and data
- Worked on aggregate functions, event functions, date / time functions, comparison, and conditional functions
- Set up Splunk and the ELK stack (Elasticsearch, Logstash, and Kibana) deployed on PCF Cloud
- Worked on IaaS, PaaS, SaaS, and IaC (Terraform) infrastructure and orchestration automation
- Used Kibana to search for data in Elasticsearch indexes (a query sketch appears after this list)
- Built a Docker container to manage Instances for developers
- Built multiple containers and instances to resolve developer problems
- Set up a data warehouse, integrating Databricks, Data Factory, data lakes, and SQL databases
- Processed and transformed large volumes of structured data, exploring the data through machine learning models using Python, Scala, SQL, AWS Lambda and RDS, and Azure DevOps CI/CD pipelines
- Integrated code using Azure Cloud endpoints and used Terraform to manage infrastructure via terraform import
- Created an access key and secret key to integrate Terraform with AWS
- Set up a provider.tf
- Used ServiceNow for setting up incidents and change requests, resolving incidents, receiving email communications, performing changes, and requesting tasks
- Created onboarding on ServiceNow, created tables, users, groups, and roles
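A minimal sketch of the kind of Elasticsearch index query that sits behind the Kibana searches mentioned above, using the plain REST API; the endpoint, index name, and field are illustrative assumptions:

```python
# Minimal sketch of querying an Elasticsearch index directly over its REST API.
# URL, index name, and field names are placeholders.
import requests

ES_URL = "http://localhost:9200"  # placeholder endpoint
INDEX = "app-logs"                # placeholder index name

resp = requests.post(
    f"{ES_URL}/{INDEX}/_search",
    json={
        "size": 10,
        "query": {"match": {"message": "deployment failed"}},  # full-text match
        "sort": [{"@timestamp": {"order": "desc"}}],
    },
    timeout=10,
)
resp.raise_for_status()
for hit in resp.json()["hits"]["hits"]:
    print(hit["_source"].get("@timestamp"), hit["_source"].get("message"))
```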
DevOps Engineer
Confidential, Chicago, IL
Responsibilities:
- Was part of a five-member DevOps practice team
- Set up Git repositories and SSH Keys for Agile teams
- Set up the ELK stack (Elasticsearch, Logstash, and Kibana)
- Deployed on AWS Cloud on 5G endpoint servers
- Used Splunk to search through machine data on servers
- Worked on Splunk for monitoring, searching, analyzing, and visualizing the machine-generated data in real-time
- Used Terraform to manage AWS applications and Azure DevOps CI/CD pipelines
- Worked on Terraform input variables and modules to reuse configuration files with changing values
- Built a Docker container to manage instances for developers
- Built multiple containers and instances to resolve developer problems
- Set up Jenkins master, added the necessary plugins, added more slaves to support scalability and agility
- Worked on Azure DevOps CI/CD pipelines, integrating code using Azure Cloud endpoints
- Worked on DevOps security, Static Application Security Testing (SAST) using SonarQube Plugins
- Verified 3rd-party components
- Used Dynamic Application Security Testing (DAST) and ZAP (Zed Attack Proxy) policies
- Scanned infrastructure for credentials and secrets
- Set up CI/CD with Azure on-prem and Azure Data Warehouse
- Set up AWS VPC, AWS CLI, and AWS monitoring using CloudWatch
- Used Terraform to set up IAM, VPC, EC2, and S3 using a provider.tf
- Configured public and private VPCs with an internet gateway, route tables, and subnets for a secure network framework in the cloud (a topology sketch appears after this list)
- Set up CI/CD pipelines for Microservices & integrated tools such as Jenkins, Maven, Bitbucket, SonarQube, Nexus, Docker, and Slack for providing immediate feedback to DEV teams after code check-in
- Set up Slack plug-ins and notifications in Jenkins pipelines for collaboration
- Created Dockerfiles and automated Docker image creation using Jenkins and Docker
- Automated infrastructure provisioning on AWS using Terraform and Ansible
- Developed Docker-based microservices deployment with Kubernetes, Jenkins, and Ansible-based pipelines/frameworks.
- Created nightly builds with integration to code quality tools such as SonarQube, Veracode
- Created quality gates in SonarQube dashboard and enforced in the pipelines to fail the builds when conditions were not met
- Converted Java projects into Maven projects by creating a POM file and ensuring all the dependencies were built
- Worked on integrating GIT into the Continuous Integration (CI) environment along with Jenkins.
- Managed and mentored both onsite and offshore teams
- Enforced test-driven development for the DEV teams every sprint
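The public/private VPC work listed above (VPC, internet gateway, route table, subnet) was done with Terraform; here is a minimal sketch of the same public-subnet topology expressed with boto3, with the region and CIDR ranges as placeholder assumptions:

```python
# Minimal boto3 sketch of a public VPC: VPC, subnet, internet gateway, and a
# route table with a default route. CIDRs and region are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # region is an assumption

vpc_id = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]["VpcId"]
subnet_id = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")["Subnet"]["SubnetId"]

igw_id = ec2.create_internet_gateway()["InternetGateway"]["InternetGatewayId"]
ec2.attach_internet_gateway(InternetGatewayId=igw_id, VpcId=vpc_id)

# A default route to the internet gateway is what makes the subnet public.
rt_id = ec2.create_route_table(VpcId=vpc_id)["RouteTable"]["RouteTableId"]
ec2.create_route(RouteTableId=rt_id, DestinationCidrBlock="0.0.0.0/0", GatewayId=igw_id)
ec2.associate_route_table(RouteTableId=rt_id, SubnetId=subnet_id)
```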
Linux Systems Administrator/AWS DevOps Engineer
Confidential, Irving, TX
Responsibilities:
- Provided user technical support for application and operating system software
- Responsible and reliable team player with a set of strong technical skills
- Launched Amazon EC2 cloud instances using Amazon Web Services (Linux) and configured instances for specific applications (a boto3 sketch appears after this list)
- Implemented AWS Trusted Advisor recommendations to provision resources and improve performance, reliability, and security
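A minimal boto3 sketch of launching and tagging an EC2 instance for a specific application, as described above; the AMI ID, key pair, instance type, and tag values are illustrative placeholders:

```python
# Minimal boto3 sketch of launching a tagged EC2 instance.
# AMI ID, key pair, and tags are placeholders.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")  # region is an assumption

instances = ec2.create_instances(
    ImageId="ami-xxxxxxxxxxxxxxxxx",  # placeholder Linux AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    KeyName="app-keypair",            # placeholder key pair
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Application", "Value": "example-app"}],
    }],
)
print("Launched:", instances[0].id)
```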
Confidential, Dallas, TX
Responsibilities:
- Responsible for site pre-staging (T1 wiring & troubleshooting) before the new network integration (migration), the integration itself, the troubleshooting, and alarm clearing after the integration
- Completed tasks such as call testing, swapped sector test, commissioned GSM, CDMA, 4G, 2.5LTE, RRU, Microwave, and site router
- Commissioned eNBs, grow eNBs, and BTS, and upgraded FW
- Performed releases with the Maven release plugin, and managed artifacts.
- Coordinated with field engineer for call tests
- Provided tier 1 support on RAN and core equipment
- Coordinated with customer, confirmed site firmware and data loads
- Contributed to a daily activity report and support troubleshooting activities as required
- Troubleshot and resolved issues with BTS cell site equipment during commissioning
- Moved RETs to other BTS using the BTS manager
- Shut down the site before commissioning and brought the site back up after commissioning
- Assisted field tech in the commissioning process
- Assisted field tech in clearing external alarms on site
Confidential, Irving, TX
Responsibilities:
- Provided project technical support for troubleshooting, issue resolution, project escalations, and performance and optimization troubleshooting with regard to KPIs
- Provided system-level knowledge in supporting Confidential Siemens Networks Radio access 3G & 4G platform products.
- Worked on T1 verification testing, and developed MOPs, 2G/3G/LTE project competence, process, and deployment documents
- Focused on switching CDMA and wireless networks operations
- Determined whether problems were due to connectivity or if they were higher layer problems
Field Technician
Confidential
Responsibilities:
- Communicated, utilized Q-Scopes to convert T1 lines to fiber, added carrier channels, and assessed ATP testing
- Assisted with diagnosing and resolving problems with VSWR, RSSI, TX, and RX.
- Collaborated with Confidential personnel to deploy and activate sites, and provided technical support
- Performed S3 bucket creation and policies, and worked on IAM role-based policies (a boto3 sketch appears after this list)
- Created and managed the security groups to provide security for the EC2 instances
- Reduced monthly infrastructure running costs by subscribing to Reserved Instances for always-running AWS servers
- Experience with TCP/IP, DHCP, DNS servers, Bash scripting, and shell scripting
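A minimal boto3 sketch of the S3 bucket, bucket policy, and security-group work listed above; the bucket name, region, VPC ID, policy statement, and port choices are illustrative placeholders:

```python
# Minimal boto3 sketch: create an S3 bucket with a policy and a locked-down
# security group. Names, region, VPC ID, and policy are placeholders.
import json
import boto3

s3 = boto3.client("s3", region_name="us-west-2")
ec2 = boto3.client("ec2", region_name="us-west-2")

bucket = "example-app-artifacts-bucket"  # placeholder bucket name
s3.create_bucket(
    Bucket=bucket,
    CreateBucketConfiguration={"LocationConstraint": "us-west-2"},
)

# Example bucket policy: deny any access that is not over HTTPS.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyInsecureTransport",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": [f"arn:aws:s3:::{bucket}", f"arn:aws:s3:::{bucket}/*"],
        "Condition": {"Bool": {"aws:SecureTransport": "false"}},
    }],
}
s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))

# Security group allowing only HTTPS in, as an example of securing EC2 instances.
sg = ec2.create_security_group(
    GroupName="example-app-sg", Description="HTTPS only", VpcId="vpc-xxxxxxxx"
)
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
        "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
    }],
)
```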