
Sr. Cloud/DevOps Engineer | (Azure) Resume


Bothell, WA

SUMMARY

  • Over 8 years of experience in the IT industry as a Cloud/DevOps Engineer, with a major focus on cloud infrastructure providers, data center migration, containerization, configuration management, CI/CD pipelines, and virtualization, using a range of tools and cloud services across Azure, AWS, and OpenStack.
  • Extensive experience in Azure development and Azure compute services, including Azure Web Applications, Azure SQL Database, Content Delivery Network, Virtual Machines, Azure Active Directory, Key Vault, Azure VPN Gateway, Service Fabric, Azure Search, App Services, and Notification Hubs.
  • Experience in architecting and securing infrastructure on AWS using IAM, KMS, EMR, Cognito, API Gateway, CloudTrail, CloudWatch, Amazon Simple Queue Service (Amazon SQS), AWS Kinesis, Lambda, NACLs, Elastic Beanstalk, Redshift, and CloudFormation.
  • Experience working with Vagrant boxes to set up local Kafka and StreamSets pipelines.
  • Implemented the StreamSets Data Collector tool for ingestion into Hadoop and worked with the JSON file format in StreamSets.
  • Experience in developing Spark applications using Spark SQL in Databricks for data extraction, transformation, and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns.
  • Experienced in migrating on-premises applications using Terraform and ARM templates; used Azure Site Recovery and Azure Backup to migrate storage to Microsoft Azure, and deployed Azure IaaS virtual machines and cloud services (PaaS role instances) into secure subnets and virtual networks.
  • Expertise on Azure Data Factory to orchestrate the intake of data from different sources like Azure Storage Blobs, Azure SQL Database, Azure SQL Data Warehouse, and on-premises databases to Azure Data Lake Store.
  • Created dimensional models based on the Kimball methodology and developed fact and dimension tables using star and/or snowflake schemas; in-depth understanding of Snowflake cloud technology.
  • In-depth understanding of Snowflake multi-cluster sizing and credit usage; played a key role in migrating Teradata objects into the Snowflake environment.
  • Experience with Snowflake multi-cluster warehouses and Snowflake virtual warehouses.
  • Migrated workloads from on-premises environments to Google Cloud, working with GCP services such as Compute Engine, BigQuery, Cloud Storage, Bigtable, Spanner, Memorystore, IoT Core, and Network Service Tiers.
  • Strong Experience in Azure IoT and Azure PaaS components like IoT Hub, Event Hub, Stream Analytics, Service Fabric, Document DB, App services, Service bus, Distributed cache, and Azure Functions.
  • Developed different environments for different applications on Google Cloud Platform (GCP) by provisioning Kubernetes clusters on GCE instances and GKE using Docker, Bash, and Python.
  • Experience in building and securing infrastructure on AWS using IAM, EC2, EBS, S3, VPC, Elastic Beanstalk, CloudFront, Route 53, DynamoDB, Redshift, RDS, KMS, ECS, ELB, EFS, CloudFormation, ElastiCache, CloudWatch, SNS, SQS, SES, and AWS Kinesis, focusing on high availability, fault tolerance, and auto scaling.
  • Experienced in creating AWS CloudFormation templates for custom VPCs, subnets, and NAT gateways to ensure successful deployment of application and database templates; used Informatica to migrate data to AWS Redshift, and used AWS Elastic Beanstalk for fast deployment, scaling, and load balancing of web applications and services developed with Java, Python, and Docker on familiar web servers such as Apache.
  • Extensively involved in infrastructure as code, execution plans, resource graph, and change automation using Terraform, and used Auto Scaling launch configuration templates for launching amazon EC2 instances while deploying microservices.
  • Experience in creating and deploying a highly available and fault-tolerant infrastructure on AWS using terraform modules, these modules install a web application in the Public subnet and database in the Private subnet which can communicate using the routing table in the VPC.
  • Experience with Pivotal Cloud Foundry, Kubernetes architecture and design, troubleshooting issues with platform components (PCF), and developing global or multi-regional deployment models and patterns for large-scale developments and deployments on Cloud Foundry and Kubernetes.
  • Experience in supporting the deployment of a user application in OpenStack with Nova, Neutron, Cinder, Swift, Keystone command-line clients, and hands-on experience on testing private cloud storage using OpenStack SWIFT software and also worked with OpenStack Multi-node installation, Configuration, and Administration.
  • Used Azure Kubernetes Service to deploy a managed Kubernetes cluster in Azure and created an AKS cluster in the Azure portal, with the Azure CLI, also used template-driven deployment options such as Resource Manager templates and Terraform.
  • Implemented microservices, application development, and migration using AWS/Azure services such as Azure DevOps, Azure Kubernetes Service (AKS), Container Registry, Cosmos DB, Grafana, Azure Pipelines, Azure Monitor, RBAC, AWS EKS, and the Kubernetes API to run workloads on EKS clusters.
  • Experience in creating clusters using Kubernetes, and in creating pods, replication controllers, deployments, labels, health checks, and ingress by writing YAML files; hands-on experience in building and deploying application code using the Kubernetes CLI.
  • Expertise in setting up Docker environments, working on Docker hub, creating Docker images and handling multiple images, used Docker File to create custom images, and used Docker Compose to build and host custom applications.
  • Experienced in using Docker Swarm and its Overlay Networking to keep the related containers under the same network, scaled out with Routing Mesh of Docker, and used Secrets in Docker Swarm Services.
  • Worked on integrating Ansible Tower with a cloud environment, providing role-based access control, setup job monitoring, email notifications, Scheduled jobs, multi-playbook workflow to chain playbooks.
  • Experience in working with Ansible Playbooks to automate various deployment tasks and working knowledge on Ansible Roles, Ansible Inventory, and Ansible Galaxy.
  • Used Atlassian tools for defect management, team collaboration, source code management, and continuous integration and deployment practices.
  • Responsible for plugin management in Jenkins, upgrading and downgrading plugin versions according to requirements; hands-on experience with Bash, Perl, and Python scripting.
  • Implemented CI/CD pipelines with Jenkins, built pipelines and multi-branch pipelines, and has hands-on experience handling various jobs through Jenkinsfiles.
  • Experienced in authoring pom.xml and build.xml files, performing releases with the Maven and Ant release plugins, and managing artifacts in Nexus and JFrog Artifactory.
  • Skilled in monitoring servers using Nagios, Splunk, ELK, and New Relic for Resource Monitoring, Network Monitoring, and Log Trace Monitoring.
  • Managed DNS, FTP, Tomcat & Apache web servers on CentOS, Ubuntu, Red Hat Enterprise Linux environments.
  • Proficient in automating the build and deployment process in various enterprise environments by writing automation scripts using C++, Python, Shell, PowerShell, Bash.
  • Integrated systems using a wide variety of protocols, such as REST, SOAP, MQ, TCP/IP, and JSON.
  • Experienced in implementing JMS to exchange information over a reliable channel in an asynchronous way, using ActiveMQ and RabbitMQ as message queues.
  • Experience in building enterprise applications and distributed systems using technologies such as Java, J2EE, and GemFire.
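The Kubernetes work summarized above (deployments, pods, labels, and health checks written as manifest files) can be sketched in a few lines of Python. This is an illustrative sketch, not a production manifest: the `web-app` name, image URL, and probe path are hypothetical, and the manifest is emitted as JSON, which the Kubernetes API accepts interchangeably with YAML.

```python
import json

def deployment_manifest(name, image, replicas=2, port=8080):
    """Build a minimal Kubernetes Deployment with a liveness probe."""
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {
                    "containers": [{
                        "name": name,
                        "image": image,
                        "ports": [{"containerPort": port}],
                        # Health check: restart the container if /healthz stops answering
                        "livenessProbe": {
                            "httpGet": {"path": "/healthz", "port": port},
                            "initialDelaySeconds": 10,
                        },
                    }]
                },
            },
        },
    }

manifest = deployment_manifest("web-app", "registry.example.com/web-app:1.0")
print(json.dumps(manifest, indent=2))
```

A manifest generated this way would typically be written to a file and applied with `kubectl apply -f`.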

TECHNICAL SKILLS

Cloud: Azure, GCP, AWS, PCF, OpenStack, Cognito

Big Data Technologies: Apache Kafka, Cassandra, StreamSets.

Configuration Tools: Chef, Puppet, Ansible

CI/CD Tools: Jenkins, TeamCity, Bamboo

Build & IaC Tools: Terraform, CloudFormation, Maven, Ant

Container Tools: Kubernetes, Docker, Docker swarm, OpenShift

Version Control: GitHub, Bitbucket

Monitoring Tools: Nagios, Splunk, ELK, New Relic

Scripting: Bash/Shell, Python, Perl, PHP, PowerShell, JSON, YAML, HTML5, JavaScript, C++, JAVA

Databases: MySQL, MS SQL, DynamoDB, MongoDB, Cassandra, AWS RDS, GemFire

Application Servers: JBoss 4.x/5.x, Apache Tomcat 5.x/7.x, WebSphere, WebLogic

Web Servers: Apache HTTP, Nginx, Apache Tomcat

Networking: TCP/IP, DNS, NFS, ICMP, SMTP, DHCP, UDP, NIS, LAN, FTP

Operating Systems: Red Hat Linux 7/6/5/4, Ubuntu 16/14/13/12, Debian, CentOS, Windows, Solaris 11/10, macOS, Fedora

Virtualization Tech: VMware vSphere ESXi 5.x/4.x, ESX 3.x, VMware Workstation, Vagrant, Oracle VirtualBox

PROFESSIONAL EXPERIENCE

Confidential - Bothell, WA

Sr. Cloud/DevOps Engineer | (Azure)

Responsibilities:

  • Prepared capacity and architecture plan to create the Azure Cloud environment to host migrated IaaS, VMs, and PaaS role instances for refactored applications and databases.
  • Worked on designing and developing the Real-Time Time Tax Computation Engine using Oracle, StreamSets, Kafka, Spark Structured Streaming, and MySQL.
  • Involved in ingestion, transformation, manipulation, and computation of data using StreamSets, Kafka, MemSQL, Spark.
  • Performed implementation of Azure operations dealing with IaaS infrastructure (Azure VMs, virtual networking, Azure services, website deployments) and deployed applications as PaaS (websites, web roles, and worker roles).
  • Created Azure Automation assets and graphical and PowerShell runbooks to automate specific tasks. Expertise in deploying Azure AD Connect and configuring ADFS installation using Azure AD Connect.
  • Involved in migrating SQL server database to SQL Azure database using SQL Azure migration wizard and used Python API to upload agent logs into Azure blob storage.
  • Developed data marts in the Snowflake cloud data warehouse.
  • Extracted and loaded data into Azure Blob Storage and Snowflake databases using Azure Data Factory and Databricks.
  • Developed CI/CD pipelines using Azure DevOps to deploy SnowSQL scripts, Data Factory components, and Python scripts to their respective destinations.
  • Implemented a CI/CD pipeline with Docker, Jenkins (with the TFS plugin installed), Team Foundation Server (TFS), GitHub, and Azure Container Service; whenever a new TFS/GitHub branch gets started, Jenkins, our continuous integration (CI) server, automatically attempts to build a new Docker container from it.
  • Involved in migrating objects from Teradata to Snowflake. Created Snowpipes for continuous data loading.
  • Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics).
  • Ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed the data in Azure Databricks.
  • Responsible for estimating the cluster size, monitoring, and troubleshooting of the Spark Azure Databricks cluster.
  • Creation and Maintenance of MS Azure Cloud Infrastructure and Virtual Network between MS Azure Cloud and On-premise network for backend communication.
  • Designed Architecture for API development & deployment as Microservice including Python code in Docker container and Azure Service Fabric.
  • Hands-on development experience in several areas of the Microsoft Azure cloud, including Compute, Storage, App Services, PaaS v2, Service Fabric, microservices, Functions, messaging, the Resource Manager API, SQL Azure, Redis Cache, IoT, Power BI, Power BI Embedded, Cognitive Services APIs, Cortana Analytics, the Bot Framework, deployment slots, Dev/Test Labs, Azure Active Directory integration, and WebJobs.
  • Developed and Deployed Integration solution using a serverless architecture. Utilized AWS S3, Dynamo DB, EC2, Cognito, Lambda, APIGateway, and SQS to integrate Warehouse, Lab, and third-party applications to consumer-facing websites.
  • Developed a large greenfield application using AWS Cognito, Lambda, API Gateway, a Node backend, Postgres, and a React/Redux front end.
  • Worked on Azure Service Fabric, microservices, IoT, and Docker containers in Azure, and was involved in setting up a Terraform continuous build integration system. Used Azure Internal Load Balancer to provide high availability for IaaS VMs and PaaS role instances.
  • Designed IoT SDK tools for automating Azure IoT Hub testing using Python, Docker, Bash, PowerShell, REST, C#, and C++.
  • Worked with Azure Container Registry (ACR), Swagger, remote Docker debugging, JUnit test suites, Docker logging, IoT Hub, device and module creation and testing, IoT security, TLS, and encryption.
  • Wrote PowerShell scripts to automate Azure cloud system creation, including end-to-end infrastructure, VMs, storage, firewall rules, etc.
  • Performed Azure Automation through runbook creation and migration of existing .ps1 scripts, including authoring, configuring, and scheduling.
  • Migration of on-premise data (Oracle/ SQL Server/ DB2/ MongoDB) to Azure Data Lake Store (ADLS) using Azure Data Factory (ADF V1/V2).
  • Worked on service-oriented systems that utilize REST web frameworks with Spring MVC, Spring REST templates, Rabbit MQ, Spring Integration.
  • Deployed applications on PCF using cf push and UrbanCode Deploy; performed PCF backups for all environments and set up Jenkins Maven build automation with uploads to Pivotal Cloud Foundry (PCF).
  • Used Terraform on Azure to deploy the infrastructure necessary to create development, test, and production environments for a software development project.
  • Responsible for implementing containerized applications on Azure Kubernetes Service (AKS), covering cluster management, a virtual network for deploying agent nodes, an ingress API gateway, MySQL databases, and Cosmos DB for stateless storage of external data, and set up an Nginx reverse proxy in the cluster.
  • Implemented Jenkins pipelines into Azure pipelines to drive all microservices builds out to the Docker registry and then deployed to Kubernetes, Created Pods, and managed using AKS. Used Kubernetes to deploy, load balance, scale and manage docker containers with multiple name-spaced versions.
  • Developed Kubernetes templates for various applications like Cassandra, Grafana and setting up Kubernetes Clusters for running microservices and pushed microservices into production with Kubernetes backed Infrastructure
  • Worked on Docker container snapshots, attaching to a running container, removing images, managing director structures, and managing containers.
  • Worked on the creation of custom Docker images, tagging, and pushing the images, and creating the Docker containers and Docker consoles for managing the application life cycle.
  • Worked on Ansible playbooks with Ansible roles and created inventories in Ansible for automating continuous deployment; configured servers, deployed software, and orchestrated continuous deployments or zero-downtime rolling updates.
  • Created Ansible roles in YAML and defined tasks, variables, files, handlers, and templates. Created inventory and configured the Ansible files for parallel deployment in Ansible for automating the continuous delivery process.
  • Configured the Jenkins master with the necessary plugins and slaves to support scalability and agility, configured Jenkins to implement nightly builds on a daily basis, and generated a changelog to include daily changes.
  • Integrated Jenkins CI with GIT version control and implemented continuous build based on check-in for various cross-functional applications and created GitHub Webhooks to set up triggers for commit, push, merge, and pull request events.
  • Configuring and managing Splunk, to collect, search, and analyze log files from across the servers and integration of Application with monitoring tool New Relic for complete insight and proactive monitoring.
  • Worked with version control systems like Git and used source code management client tools such as Sourcetree, Git Bash, GitHub, Git GUI, and other command-line applications.
  • Maintained build-related scripts developed in shell for Maven builds; created and modified build configuration files, including pom.xml.
  • Performed a POC to compare the time taken for Change Data Capture (CDC) of Oracle data across Striim, StreamSets, and DBVisit.
  • Deployed infrastructure using ARM templates for Azure PaaS services with deployment tools such as Octopus and VSTS.
  • Experienced with WebLogic 9.x/10.x, implementing zero-downtime deployments using C++ and Bash/shell scripts automated through Jenkins.
  • Created scripts in Python to automate log rotation of multiple logs from web servers, and worked with Python ORM libraries, including the Django ORM, to create web applications.
  • Wrote shell scripts to apply the Integration label to all files that required manual labeling.
  • Configuring JIRA as a defect tracking system and configured various workflows, customizations, and plugins for the JIRA bug/issue tracker.
  • Good working knowledge on Database technologies such as Oracle, DB2, MS SQL, Big Data Greenplum, Gemfire, NoSQL as they are related to infrastructure and communication.
  • Worked on the NoSQL database MongoDB for replica set setup and sharding; also experienced in managing replica sets.
  • Performed day-to-day Linux server maintenance and supported the developer team with application issues, tuning, and troubleshooting of software running on the servers.

Environment: Microsoft Windows Azure, Azure AD, Azure SQL, Azure Network, Web Applications, Cognito, Kubernetes, Virtual Machines, PCF, Ansible, Chef, Jenkins, Puppet, Docker, ACS & AKS, Python, C++, PowerShell, MongoDB, Microsoft Azure Storage, Spark (2.1), Databricks, Big Data, GemFire, Azure Data Factory, Azure Data Lake, HDFS, MapReduce, Hive, Kafka, Python 2.7, DevOps, Azure App Services, StreamSets, Logic Apps, API Apps, Service Fabric, IoT Hub, Service Bus, Event Hubs, Application Insights, Power BI, Power BI Embedded/Desktop, WebJobs, Cloud Services, SQL Azure, Azure Storage (Table, Blob, Queue, File), Azure Resource Manager, Microservices, Azure Functions, Azure Stack, Cognitive Services API, Cortana Analytics, Bot Framework, Microsoft .NET 5 / 4.5 stack, MVC 5, IIS 7.0/6.0, C#, LINQ, Bootstrap 3.0, CSS/LESS, HTML5
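The agent-log upload to Azure Blob Storage mentioned above is usually done with the azure-storage-blob SDK; as a minimal, standard-library-only sketch of the underlying Put Blob REST call, it can look like the following. The storage account, container, and SAS token values here are placeholders, and the request is built but not sent.

```python
import urllib.request

def build_blob_put_request(account: str, container: str, blob_name: str,
                           sas_token: str, data: bytes) -> urllib.request.Request:
    """Build (but do not send) a Put Blob request against the Azure Blob REST API."""
    url = f"https://{account}.blob.core.windows.net/{container}/{blob_name}?{sas_token}"
    req = urllib.request.Request(url, data=data, method="PUT")
    # Block blobs are the standard blob type for uploaded log files
    req.add_header("x-ms-blob-type", "BlockBlob")
    req.add_header("Content-Length", str(len(data)))
    return req

# Hypothetical agent log and SAS token; in practice the token would come from Key Vault
log_bytes = b"2020-01-01 00:00:00 agent started\n"
req = build_blob_put_request("mystorageacct", "agent-logs",
                             "host01/agent.log", "sv=...&sig=...", log_bytes)
print(req.get_method(), req.full_url.split("?")[0])
# Actually uploading would be: urllib.request.urlopen(req)
```

The same call shape is what the SDK issues under the hood; swapping in `BlobClient.upload_blob` from azure-storage-blob removes the need to manage SAS query strings by hand.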

Confidential, Connecticut

Sr. Cloud/DevOps Engineer | (AWS/GCP)

Responsibilities:

  • Created AWS CloudFormation templates to create custom-sized VPCs, subnets, EC2 instances, ELBs, and security groups, and worked on a tagging standard for proper identification and ownership of EC2 instances and other AWS services like CloudFront, CloudWatch, RDS, S3, Route53, SNS, SQS, and CloudTrail.
  • Implemented multiple AWS services for portal-based functionality, including Cognito, DNS, and Route 53.
  • Integrated AWS Dynamo DB using AWS lambda to store the values of the items and backup the Dynamo DB streams.
  • Automated backup of data in EBS and instance store to S3 buckets and created a backup of AMI for mission-critical production servers from AWS CLI and used the AWS Data pipeline to configure data loads from S3 into Redshift.
  • Configured AWS Route53 to manage DNS zones globally, create recordsets, DNS failover and health checks of domains, assign domain names to ELB, and CloudFront.
  • Developed a framework for converting existing PowerCenter mappings to PySpark (Python and Spark) jobs.
  • Created a PySpark framework to bring data from DB2 to Amazon S3.
  • Provisioned EC2 instances using Terraform and cloud formation, wrote new plugins to support new functionality in Terraform.
  • Used Bash and Python, to supplement automation provided by Ansible and Terraform for tasks such as encrypting EBS volumes backing AMIs and scheduling Lambda functions for routine AWS tasks.
  • Migrated on-premises applications to Google Cloud Platform (GCP) to address scalability and performance issues, and created Google Kubernetes Engine (GKE) clusters to manage and orchestrate Docker container clusters running on Google's public cloud.
  • Developed a data warehouse model in Snowflake for over 100 datasets using WhereScape.
  • Heavily involved in testing Snowflake to understand the best possible way to use cloud resources.
  • Scheduled different Snowflake jobs using NiFi, and used NiFi to ping Snowflake to keep client sessions alive.
  • Used Terraform as infrastructure as code, execution plans, resource graph, and change automation. Managed GCP infrastructure as code using Terraform.
  • Managed Kubernetes charts using Helm and Created reproducible builds of the Kubernetes applications, managed Kubernetes manifest files and managed releases of Helm packages.
  • Established a local dev workflow that centered on minikube to validate deployments in Kubernetes. Created RESTful API to store, update, read & delete bucket information on Console using Kubernetes interfaces.
  • Virtualized servers on AWS using Docker, created Dockerfiles and version control to achieve the continuous delivery goal in a highly scalable environment, and used Docker coupled with load-balancing Nginx.
  • Integrated Docker with Jenkins using the CloudBees Docker plugin to automate container deployment, and wrote Docker Compose files in YAML to manage the whole life cycle of multi-container applications.
  • Automated the cloud deployment using Chef and AWS Cloud Formation. Used Chef for unattended bootstrapping in AWS.
  • Involved in setting up Chef Infra, bootstrapping nodes, creating, and uploading recipes, node convergence in Chef SCM.
  • Developed a pipeline job in Jenkins to provide API access to a private repo and deploy the artifact to EC2 instance.
  • Integrated AWS with other systems like Jenkins, Chef, and HP service Manager to develop CI/CD pipelines for automating Prod deployments using REST APIs.
  • Developed and maintained continuous integration and deployment systems using Jenkins, Maven, JBoss, and JFrog Artifactory.
  • Used a MicroService architecture, interacting through a combination of REST and MQ and leveraging AWS to build, test, and deploy Identity Micro Services.
  • Developed builds using Maven as the build tool and used Jenkins to kick off builds and promote them from one environment to the next.
  • Designed project workflows/pipelines using Jenkins as a CI tool and on building Jenkins jobs to create AWS infrastructure from bitbucket repositories.
  • Created and Implemented branching and merging strategy with multiple branches and used bitbucket as a source code management repository to keep track of version changes.
  • Built Elasticsearch, Logstash, and Kibana (ELK) for centralized logging, storing logs and metrics in an S3 bucket by triggering a Lambda function.
  • Collected system logs and CloudTrail data using Splunk, including Splunk installation, collector configuration, and multi-indexer setup in the production environment.
  • Generated reports using JIRA for creating projects, assigning permissions to users and groups for the project & created mail handlers and notifications schemes for JIRA.

Environment: Amazon Web Services (AWS) (EC2, CloudFront, CloudWatch, RDS, ELB, EBS, S3, Route53, SNS, SQS, KMS, CloudTrail, IAM, CloudFormation, Virtual Private Cloud (VPC), LDAP, Terraform), Cognito, Git, Jenkins, Chef, Ansible, ELK, RedHat Linux, Docker, Bash, Shell, Python, Tomcat.
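The CloudFormation templates described above (custom VPCs and subnets) are plain JSON or YAML documents, so generating one from Python is straightforward. The sketch below emits a minimal template with one VPC and a public/private subnet pair; the logical resource names and CIDR ranges are illustrative, not taken from the projects above.

```python
import json

def vpc_template(cidr="10.0.0.0/16", public_cidr="10.0.1.0/24",
                 private_cidr="10.0.2.0/24"):
    """Emit a minimal CloudFormation template: a VPC with public and private subnets."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Description": "Custom-sized VPC with a public and a private subnet",
        "Resources": {
            "Vpc": {
                "Type": "AWS::EC2::VPC",
                "Properties": {"CidrBlock": cidr, "EnableDnsSupport": True},
            },
            "PublicSubnet": {
                "Type": "AWS::EC2::Subnet",
                "Properties": {
                    "VpcId": {"Ref": "Vpc"},          # intrinsic Ref ties the subnet to the VPC
                    "CidrBlock": public_cidr,
                    "MapPublicIpOnLaunch": True,       # instances here get public IPs
                },
            },
            "PrivateSubnet": {
                "Type": "AWS::EC2::Subnet",
                "Properties": {"VpcId": {"Ref": "Vpc"}, "CidrBlock": private_cidr},
            },
        },
    }

print(json.dumps(vpc_template(), indent=2))
```

A template like this would be deployed with `aws cloudformation deploy --template-file vpc.json --stack-name demo-vpc`; a real stack would add route tables, an internet gateway, and NAT as the bullets above describe.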

Confidential, Foster City - CA

DevOps Engineer/ Build and Release

Responsibilities:

  • Implemented multiple EC2 instances at the same time and provided highly durable and available data by using the S3 data store, versioning, and lifecycle policies, and created AMIs of mission-critical production servers for backup.
  • Designed AWS CloudFormation templates to create custom-sized VPCs, subnets, and NAT gateways to ensure successful deployment of web applications and database templates.
  • Designed roles and groups for users and resources using AWS Identity and Access Management (IAM) and built server deployments on Amazon Cloud (EC2) servers with the help of DevOps tools like Chef.
  • Managed the Chef workstation, Chef attributes, templates, recipes, and files for managing and bootstrapping configurations across various nodes and setting up keys, and created Chef cookbooks to install and configure infrastructure across environments, automating the process with Python scripts.
  • Implemented Chef software setup and configuration on VMs from scratch, deployed run-lists to the Chef server, and bootstrapped Chef clients remotely.
  • Designed CI/CD processes in the context of a Jenkins orchestration, including the use of automated build, test/QA, and deployment tools.
  • Involved in editing the existing MAVEN files in case of errors or changes in the project requirements.
  • Extensively used Maven to do the builds and integrated it with Jenkins as part of the continuous integration process, modifying build configuration files including pom.xml.
  • Created Git repositories, granted access rights to authorized developers, and worked on Artifactory.
  • Used SonarQube for continuous inspection of code quality and to perform automatic reviews of code to detect bugs. Automated Nagios alerts and email notifications using Python script and executed them through Chef.
  • Developed Shell Scripts and Perl for automation of the build and release process. Developed custom solutions in C# and PowerShell to validate availability, consistency, and compliance with environments.
  • Created scripts in Python to automate log rotation of multiple logs from web servers, and wrote shell scripts to automate the process of adding SSH keys and generating passwords in Python.
  • Implemented performance capacity and availability monitoring using tools like Nagios, Datadog, PagerDuty.

Environment: Amazon Web Services (AWS), Chef, Jenkins, Bitbucket, Maven, PowerShell Scripting, Linux, VMWare Servers, Shell scripting, Bash, Linux/RHEL, Windows, Python, PHP, Nexus, Artifactory.
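The Python log-rotation scripting mentioned in this section can be sketched with the standard library alone. The thresholds, the `*.log` naming convention, and the compress-then-delete policy below are assumptions for illustration, not the exact scheme used on the production web servers.

```python
import gzip
import shutil
import time
from pathlib import Path

def rotate_logs(log_dir, compress_after_days=1, delete_after_days=30):
    """Compress *.log files older than compress_after_days into .gz archives,
    and delete archives older than delete_after_days.
    Returns (compressed_count, deleted_count)."""
    now = time.time()
    compressed = deleted = 0
    for path in Path(log_dir).glob("*.log"):
        age_days = (now - path.stat().st_mtime) / 86400
        if age_days >= compress_after_days:
            gz_path = path.with_suffix(path.suffix + ".gz")
            # Stream-compress so large logs never load fully into memory
            with path.open("rb") as src, gzip.open(gz_path, "wb") as dst:
                shutil.copyfileobj(src, dst)
            path.unlink()  # remove the uncompressed original
            compressed += 1
    for path in Path(log_dir).glob("*.log.gz"):
        if (now - path.stat().st_mtime) / 86400 >= delete_after_days:
            path.unlink()
            deleted += 1
    return compressed, deleted
```

In practice a script like this runs from cron on each web server; the same policy is what `logrotate` expresses declaratively with `compress` and `maxage` directives.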
