DevOps/Infrastructure Engineer Resume
Mountain View
PROFESSIONAL SUMMARY:
- AWS Certified Developer with 9+ years of experience in DevOps, build automation, software configuration, build and release engineering, Linux administration, and OpenStack, gained in large and small software development organizations working with cloud computing platforms such as Amazon Web Services (AWS), Azure, and Google Cloud Platform (GCP).
- 4+ years of work with DevOps and OpenStack tools such as Puppet, Chef, Docker, OpenShift, and Red Hat CloudForms.
- Transformed traditional environments into virtualized environments with AWS (EC2, S3, EBS, EMR, ELB, Kinesis, Redshift), Matillion, Chef, Puppet, Jenkins, Jira, Docker, Vagrant, OpenStack (Nova, Neutron, Swift, Cinder), and VMware.
- Experience in deploying SAML-based, highly available Identity Provider and Service Provider solutions using PingFederate, CA SiteMinder SSO/Federation, and SimpleSAML systems.
- Implemented OAuth and OpenID for mobile and non-browser solutions using Ping Federate.
- Experience configuring Azure App Services, Azure Application Insights, Azure Application Gateway, Azure DNS, and Azure Traffic Manager; analyzing Azure networks with Azure Network Watcher; and implementing Azure Site Recovery, Azure Stack, Azure Backup, and Azure Automation.
- Experienced in managing DNS, LDAP, FTP, JBoss, Tomcat, and Apache web servers on Linux servers.
- Working knowledge of Visual Studio builds, NAnt, and MSBuild.
- Experience with SonarQube and JUnit for testing and reviewing code quality in CI/CD processes.
- Gained extensive experience in RPM deployment via Chef, build automation through Jenkins, and server management on RHEL.
- Experience in designing/working on Amazon Web Services such as EC2, S3, Route 53, ELB, VPC, Auto Scaling, AMI, EBS, IAM, CloudFormation, and CloudWatch.
- Wrote various Chef modules and Python and Bash scripts to automate deployment of OpenStack components, Linux components, and numerous other tools.
- Engineered OpenStack (Grizzly and Havana) private/public clouds on RHEL 6.x/7.x for the Kraft client.
- Expertise in scanning and remediating application vulnerabilities using SSAP scan analysis and Black Duck scanning.
- Developed processes, tools, and automation for UrbanCode-based build systems and software build delivery.
- Experienced with the Parasoft Virtualize service virtualization tool, Parasoft SOAtest, and the SoapUI/ReadyAPI web services test automation tools.
- Good working experience in client-side development with HTML, XHTML, CSS, JavaScript, jQuery, and AJAX.
- Worked on the infrastructure team installing, configuring, and administering CentOS 5.x/6.x/7, Red Hat Linux 8/9, RHEL 5.x/6.x/7, Windows Server, and SUSE Linux 10.x/11.
- Experienced in configuring and integrating servers across environments to automatically provision and create new machines using configuration management/provisioning tools such as Ansible, Chef, and Puppet.
- Automated deployment of builds to different environments using TeamCity, Jenkins.
- Involved in the design, development, and testing of web application and integration projects using object-oriented technologies such as Core Java, J2EE, Struts, JSP, JDBC, Spring Framework, Hibernate, JavaBeans, REST/SOAP web services, XML, XSLT, XSL, and Ant.
- Applied machine learning and GPU acceleration for general-purpose computing.
- Experience in designing and developing applications in Spark using Scala to compare the performance of Spark with Hive and SQL/Oracle.
- Experience with Airflow and related tools to meet client requirements.
- Experience in Designing, Architecting and implementing scalable cloud-based web applications using AWS and GCP.
- Developed data pipelines for machine learning models to improve the functionality and user experience of the Google test cloud platform and its data.
- Implemented and managed Docker and Kubernetes infrastructure, and worked in a DevOps group running Jenkins in a Docker container with EC2 slaves in an AWS cloud configuration.
- Worked on maintaining Docker Images and containers.
- Performance-tuned Redshift tables and performed data validation and quality checks in Redshift using Python.
- Strong working knowledge of developing RESTful web services and microservices using Golang.
- Experienced in handling big data systems using NoSQL databases such as Cassandra and data streaming tools like Kafka in multi-data-center clusters.
- Experience in using Tomcat, JBoss, WebLogic, and WebSphere application servers for deployments.
- Performed automation tasks on various Docker components such as Docker Hub, Docker Engine, Docker Machine, Docker Compose, and Docker Registry; deployed and maintained microservices using Docker.
- Monitored key metrics such as network packets, CPU utilization, and load balancer latency.
- Excellent communication, interpersonal, and analytical skills to work efficiently in both independent and teamwork environments.
- Working Knowledge of ISO OSI stack and Network Protocols like TCP/IP, UDP/IP and Embedded Ethernet.
- Excellent experience with Hadoop architecture and components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
- Have sound exposure to Retail market including Retail Delivery System.
- Hands-on experience installing, configuring, and using Hadoop ecosystem components such as MapReduce, HDFS, HBase, Hive, Sqoop, Pig, ZooKeeper, and Flume.
- Good exposure to Apache Hadoop MapReduce programming, Pig scripting, distributed applications, and HDFS.
- Good Knowledge on Hadoop Cluster architecture and monitoring the cluster.
- In-depth understanding of Data Structure and Algorithms.
- Experience in managing and reviewing Hadoop log files.
- Experience with Snowflake Multi-Cluster and Virtual Warehouses.
- In-depth knowledge of Data Sharing in Snowflake.
- In-depth knowledge of Snowflake database, schema, and table structures.
- Experience in using Snowflake Clone and Time Travel (see the Python sketch following this summary).
- Areas of expertise include analysis, design, and development of software involving technologies such as Java, J2EE, Servlets, JSP, JDBC, JSTL, Spring 3.0/2.5, JPA, Hibernate 3.0, Struts 2.0, Web Services, WSDL, JMS, EJB, XML, XSLT, JNDI, HTML, JavaScript, AJAX, and JSF PrimeFaces.
- Set up GCP firewall rules to allow or deny traffic to and from VM instances based on specified configurations, and used GCP Cloud CDN (content delivery network) to deliver content from GCP cache locations, drastically improving user experience and latency.
- Expertise in using build tools such as Maven and Ant to build deployable artifacts (WAR, EAR) from source code.
- Proficient in cloud provisioning tools such as Terraform and CloudFormation.
- Experience in Python, Perl, bash, Ruby, Groovy and Shell Scripting.
- Experience in software methodologies like Waterfall model, Agile Methodology, Scrum, and TDD.
- Extensive experience in using Continuous Integration tools like Cruise Control, Jenkins/Hudson, Build Forge, Team City, and Bamboo.
- Extensive experience creating Datamart and entering market data, trades, and securities into Murex systems.
- Hands-on experience performing extractions, margin calculations, and validations on data via Java and re-entering the data into the Murex system.
- Developed User Interface in JSP, JavaScript and HTML with Backbone JS Framework.
- Experience with Spring modules such as MVC, AOP, JDBC, ORM, JMS, and Web Services using the Eclipse and STS IDEs.
- Experience in SAML 1.1 and 2.0 based authentication using PingFederate and CA SiteMinder Federation.
- Good knowledge of connecting one VNet to another using virtual network peering or an Azure VPN Gateway.
- Ability to act as the Murex API Architect, Java Developer and Analyst in many areas on projects.
- Experience with AWS instances spanning Dev, Test, and Pre-production environments, and cloud automation through open-source DevOps tools such as Ansible, Jenkins, OpenShift, and Kubernetes.
- Experienced in all phases of the software development lifecycle (SDLC) with specific focus on the build and release of quality software. Experienced in Waterfall, Agile/Scrum, Lean and most recently Continuous Integration (CI) and Continuous Deployment (CD) practices.
- Experience in Branching, Merging, Tagging and maintaining the version across the environments using SCM tools like Subversion (SVN), GIT (GitHub) and ClearCase.
- Strong experience with C, multi-threading, Boost, STL, SQLite, GDB, Purify, Quantify, Fortify, and Makefiles on Unix/Windows platforms.
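Snowflake Clone and Time Travel sketch (referenced above): a minimal Python example of the zero-copy clone and Time Travel usage listed in this summary, assuming the snowflake-connector-python client; the account, warehouse, and ORDERS table names are placeholders rather than actual project values.

```python
# Minimal sketch of Snowflake zero-copy cloning and Time Travel.
# Connection parameters and object names below are illustrative assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",           # hypothetical user
    password="***",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)

cur = conn.cursor()
try:
    # Zero-copy clone of a table for a safe test copy (no data is duplicated).
    cur.execute("CREATE OR REPLACE TABLE ORDERS_CLONE CLONE ORDERS")

    # Time Travel: query the table as it existed one hour ago.
    cur.execute("SELECT COUNT(*) FROM ORDERS AT(OFFSET => -3600)")
    print("row count one hour ago:", cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```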
TECHNICAL SKILLS:
Version Control: SCM, SVN, GIT, ClearCase, GitHub, Bitbucket, TFS.
Scripting: Perl, Ant, Maven, Shell Scripting, JMS, JavaScript and Python
CI/CD Tools: Jenkins, Hudson, AnthillPro, Build Forge, uBuild, Bamboo
Build Tools: MAVEN, Gradle, ANT, Make and MSBuild
Container Technologies: Docker, Kubernetes
Configuration Mgmt: Chef, Puppet, Ansible and Vagrant
Deployment Tools: uDeploy, Octopus Deploy, Rundeck
Testing Tools: SonarQube, Junit, Fortify, Parasoft
Tracking Tools: IBM Clear Quest, Perforce, JIRA
Databases: Oracle 9i/8i/10g, IBM DB2/UDB, MongoDB
Platforms: Windows, UNIX, Ubuntu and Linux
Servers: Apache Web Server, Airflow, WebSphere, WebLogic, Tomcat, and JBoss
Cloud Technologies: AWS (VPC, EC2, S3, CloudWatch, Lambda, RDS, EBS, IAM), GCP (IaaS, PaaS, SaaS, XaaS), Murex, Snowflake, Hadoop, Golang, Spark, Terraform, Cloudant, Redis, RabbitMQ, PostgreSQL, Kafka, OpenShift
PROFESSIONAL EXPERIENCE:
Confidential - Mountain View
DevOps/ Infrastructure Engineer
Responsibilities:
- Develop tools to automate the deployment, administration, and monitoring of a large-scale AWS Linux environment.
- Analyze the business, technical, functional, performance and infrastructure requirements needed to access and process large amounts of data.
- Experience using Git/GitHub integrated with Jenkins (CI), Groovy, and Nexus.
- Extensive experience using Groovy-based build tooling to produce deployable artifacts (JAR, WAR, and EAR) from source code.
- Experience in building CI/CD pipelines using Jenkins for deployments for End to End automation to support all build and deployment as a pipeline.
- Automated machine learning DNN training and optimized it on a GPU with CUDA.
- Experience as a Git, Environment Management, and Build/Release Engineer, automating, building, releasing, go-live, and configuring changes from one environment to another.
- Worked extensively on building and maintaining clusters managed by Kubernetes, using Linux, Bash, Git, and Docker on GCP (Google Cloud Platform).
- Experience with Google Cloud components, Google Container Builder, GCP client libraries, and the Cloud SDK.
- Developed Vagrant file for Updating, Upgrading, Deploying and configuring the existing and new Infrastructure applications.
- Wrote Terraform templates for AWS Infrastructure as a code to build staging, production environments & set up build & automations for Jenkins.
- Reviewed Tier-I and Tier-II MRM data, dashboard data, and control tower data.
- Developed new RESTful API services in Golang that work as middleware between our application and the third-party APIs we consume.
- Played a key role in migrating Teradata objects into the Snowflake environment.
- Developed a data warehouse model in Snowflake for over 100 datasets using WhereScape.
- Heavily involved in testing Snowflake to understand best possible way to use the cloud resources.
- Setup Alerting and monitoring using Stackdriver in GCP.
- Used Splunk for the engineering dashboards used by the developers for easy track of application health checks.
- Experience writing data APIs and multi-server applications to meet product needs using Golang.
- Create and maintain highly scalable and fault tolerant multi-tier AWS and Azure environments spanning across multiple availability zones using Terraform and CloudFormation.
- Managing day to day SAP Basis issues for AWS hosted customers.
- Deploying and managing all SAP Applications on AWS.
- Daily Support of SAP Landscape Infrastructure.
- Experience in system administration, system and server builds, upgrades, patches, migration, troubleshooting, security, backup, disaster recovery, performance monitoring, and fine-tuning on Linux servers.
- Experience in Designing, Architecting and implementing scalable cloud-based web applications using AWS and GCP.
- Built custom dashboards with Smashing/dashing open source widgets to display Operational tasks on displays.
- Configured the QA Environment for Manual Testing as well as Automation through autosys.
- Implemented preventive guardrails using Service Control Policies (SCPs).
- Implemented detective guardrails using Cloud Custodian policies and AWS config.
- Set up GCP firewall rules to allow or deny traffic to and from VM instances based on specified configurations, and used GCP Cloud CDN (content delivery network) to deliver content from GCP cache locations, drastically improving user experience and latency.
- Monitored SolarWinds on unclassified and classified networks on the SOCOM domain for over 100 servers, including domain controllers and VM servers.
- Good experience working with Airflow.
- Generated workflows through Apache Airflow, and previously Apache Oozie, to schedule the Hadoop jobs that drive large data transformations (see the DAG sketch after this list).
- Worked on deployment procedures using middleware such as Tomcat, creating deploy scripts and settings for production releases.
- Expertise in scanning and remediating application vulnerabilities using static code analysis and Black Duck scanning.
- Strong production experience and insights of Consulting, Architecting/Designing and Implementing virtual environments for continuous delivery systems and its methodologies.
- Managing DNS, LDAP, FTP, JBOSS, Tomcat and Apache web servers on Linux servers.
- Developed automation scripts for various configuration management tools, including AWS Lambda functions, to reduce day-to-day repetitive work using Python/Shell (see the Lambda sketch after this list).
- Responsible for ensuring that team delivers projects that are technically sound and comply with defined standards and procedures.
- Utilized NIST A and NIST rev-4 to review implemented controls and enter information into the Requirements Traceability Matrix (RTM) and findings into the Security Assessment Report (SAR).
- Experience in writing Shell, Perl, Python and JSON scripts.
- Automated RabbitMQ cluster installations and configuration using Python/Bash.
- Installed Puppet/Chef/Docker for the OpenStack and OpenShift environments, along with scripting in Perl, Ruby, and Python.
- Configured and maintained Jenkins to implement the CI process and integrated the tool with Ant and Maven to schedule the builds and automated the deployment on the application servers using the "code deploy" plugin for Jenkins.
- Wrote cron jobs to automate daily scripts.
- Continuous integration with Jenkins, continuously evaluate and recommend improvement to CI/CD processes.
- Provided Administration for TeamCity (Continuous Integration) & Build servers.
- Branching, Tagging, Release Activities on Version Control Tool SVN, TeamCity.
- Trained team members on PTC MKS Integrity. Utilized Coverity and Parasoft for static analysis.
- Provided access to data necessary to perform analysis on scheduling, pricing, bus bunching and performance. Queries in the Redshift environment performed x faster than in legacy environments.
- Created a Python process hosted on Elastic Beanstalk to load the Redshift database daily from several sources.
- Integrated MISRA rules for a safer firmware. Performed code reviews.
- Successfully migrated the website's main database from MySQL to PostgreSQL.
- Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing).
- Using Informatica PowerCenter Designer, analyzed the source data to extract and transform from various source systems (Oracle 10g, DB2, SQL Server, and flat files), incorporating business rules via the objects and functions the tool supports.
- Using Informatica PowerCenter, created mappings and mapplets to transform the data according to the business rules.
- Involved in implementing the Land Process of loading the customer Data Set into Informatica MDM from various source systems.
- Worked on Continuous Integration (CI) workflows using virtual environments such as Docker and Kubernetes to build containers and deploy microservices-oriented environments for scalable applications; implemented Twistlock for container and application security.
- Design, build, configure, test, install software, manage and support all aspects and components (Chef) of the application development environments in AWS.
- Writing Chef recipes and cookbooks and uploading them to Chef server, managing on-site OS, Applications, Services, Packages using Chef.
- Maintained Chef Configuration Management spanning several environments in VMWare and AWS.
- Used Docker, OpenShift, and an Amazon cloud architecture that best utilizes our existing technology patents to serve real-time needs and deployments.
- Experience in dealing with Windows Azure IaaS - Virtual Networks, Virtual Machines, Cloud Services, Resource Groups, Express Route, Traffic Manager, VPN, Load Balancing, Application Gateways, and Auto-Scaling.
- Building the AWS Infrastructure using VPC, EC2, S3, Route 53, EBS, Security Group, Auto Scaling, and RDS in Cloud Formation, AMI, EBS, IAM, and Cloud Watch.
- Experience designing Datadog monitoring for Docker containers, RDS storage, and CPU usage.
- Implemented the new cluster configuration with node pools, cross region implementation and migrated existing applications to new AKS infrastructure.
- Experience in Agile/Scrum methodologies on most recent Continuous Integration (CI) and Continuous Deployment (CD) practices.
- Used Terraform to reliably version and create infrastructure on Azure.
- Created resources using Azure Terraform modules and automated infrastructure management.
- Deployed similar infrastructure to Azure, additional cloud providers, and on-premises datacenters using Terraform, managing infrastructure across multiple cloud providers.
- Deployment of web, enterprise java components, messaging components, concurrency and multi-threading.
- Worked with Docker, OpenShift, and Kubernetes as a container security engineer, implementing monitoring/auditing of security events on containers and container network security detection.
- Designed and implemented applications using JSP, Spring MVC, JNDI, Spring IoC, Spring Annotations, Spring AOP, Spring Transactions, Hibernate 3.0, SQL, Ant, JMS, Oracle, and the Oracle WebLogic application server.
- Developed custom consumers and producers for Apache Kafka in Go (Golang) for a car monitoring system.
- Designed the real-time analytics and ingestion platform using Storm and Kafka. Wrote Storm topology to accept the events from Kafka producer and emit into Cassandra DB.
- Provided Infrastructure support and user support for AWS.
- Experience in Setting up the build and deployment automation for Terraform scripts using Jenkins.
- Provisioned the highly available EC2 Instances using Terraform and cloud formation and wrote new plugins to support new functionality in Terraform.
- Used Bash and Python, including Boto3, to supplement automation provided by Ansible and Terraform for tasks such as encrypting the EBS volumes backing AMIs and scheduling Lambda functions for routine AWS tasks (see the boto3 sketch after this list).
- Implemented Federation Solution using SAML 2.0 Ping Federate 6.
- Managed, developed, and designed a dashboard control panel for customers and Administrators using Django, Oracle DB, PostgreSQL, and VMWare API calls.
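Airflow DAG sketch (referenced from the Airflow/Oozie bullet above): a minimal example of the kind of DAG used to schedule Hadoop jobs; the DAG id, schedule, Sqoop connection string, and jar path are illustrative assumptions, not the production values.

```python
# Minimal Airflow DAG sketch (Airflow 2.x-style imports) chaining a Sqoop
# import to a MapReduce transformation. All paths and names are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-eng",
    "retries": 1,
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="hadoop_daily_transform",      # hypothetical DAG name
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:

    extract = BashOperator(
        task_id="sqoop_import",
        bash_command="sqoop import --connect jdbc:oracle:thin:@db:1521/ORCL "
                     "--table ORDERS --target-dir /data/raw/orders",  # placeholder
    )

    transform = BashOperator(
        task_id="mapreduce_transform",
        bash_command="hadoop jar /opt/jobs/transform.jar /data/raw/orders /data/curated/orders",
    )

    extract >> transform
```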
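AWS Lambda sketch (referenced from the automation-scripts bullet above): an illustrative Python handler for the kind of routine task automated with Lambda, here stopping development EC2 instances by tag; the AutoStop tag convention is an assumption, not the exact production logic.

```python
# Sketch of a Python Lambda function for routine AWS automation:
# stop every running EC2 instance tagged AutoStop=true (assumed convention).
import boto3

ec2 = boto3.client("ec2")

def lambda_handler(event, context):
    reservations = ec2.describe_instances(
        Filters=[
            {"Name": "tag:AutoStop", "Values": ["true"]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )["Reservations"]

    instance_ids = [
        inst["InstanceId"]
        for res in reservations
        for inst in res["Instances"]
    ]

    if instance_ids:
        ec2.stop_instances(InstanceIds=instance_ids)

    return {"stopped": instance_ids}
```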
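Boto3 sketch (referenced from the Boto3 bullet above): a hedged example of copying an AMI with encryption enabled so its backing EBS snapshots are encrypted; the region, AMI id, and KMS key alias are placeholders.

```python
# Sketch of a boto3 helper that produces an encrypted copy of an AMI,
# which encrypts the EBS snapshots backing it. Values are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

def encrypt_ami(source_image_id: str, kms_key_id: str = "alias/aws/ebs") -> str:
    """Copy an AMI with encryption enabled using the given KMS key."""
    resp = ec2.copy_image(
        Name=f"{source_image_id}-encrypted",
        SourceImageId=source_image_id,
        SourceRegion="us-east-1",
        Encrypted=True,
        KmsKeyId=kms_key_id,
    )
    return resp["ImageId"]

if __name__ == "__main__":
    print(encrypt_ami("ami-0123456789abcdef0"))  # hypothetical AMI id
```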
Environment: Linux, Shell, Python, Java, Git, Gradle, Chef, Puppet, Oracle, Vagrant, Tomcat, JBoss, AWS services (EC2, VPC, S3, IAM, RDS, SNS, CloudWatch, Elastic Beanstalk, Route 53, EBS, ELB), Lambda, Datadog, Docker, Kubernetes, Jenkins, Maven, Bamboo, Nexus, JUnit, Black Duck, JMS, Twistlock, Terraform, RabbitMQ, PostgreSQL, ETL, Kafka, Redshift, Airflow, OpenShift.
Confidential
Cloud DevOps Engineer
Responsibilities:
- Worked in an agile development life cycle.
- Installed and configured virtual machines, storage accounts, virtual networks, and Azure load balancers in the Azure cloud.
- Responsible for implementing, designing, and architecting solutions for Azure cloud and network infrastructure, and for data center migration across public, private, and hybrid clouds.
- Performed assessments of the existing environment (applications, servers, databases) using tools such as the MAP Toolkit, the Azure Website Migration Assistant, and manual assessment.
- Developed a migration approach to move workloads from on-premises to Windows Azure for Windows machines and to AWS for Linux/Solaris machines. Administered RHEL, CentOS, Ubuntu, UNIX, and Windows servers.
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Development of Data integration scripts using Anaplan connect to tie in Oracle as a data source to Anaplan EPM tool for CIOX.
- Implemented Federation Solution using SAML 2.0 Ping Federate 6.
- Worked on Google Cloud Platform (GCP) services such as Compute Engine, Cloud Load Balancing, Cloud Storage, Cloud SQL, Stackdriver Monitoring, and Cloud Deployment Manager.
- Set up GCP firewall rules to allow or deny traffic to and from VM instances based on specified configurations, and used GCP Cloud CDN (content delivery network) to deliver content from GCP cache locations, drastically improving user experience and latency.
- Experience in installing configuring and using Hadoop ecosystem components.
- Practiced consulting on Snowflake data platform solution architecture, design, development, and deployment, focused on bringing a data-driven culture across enterprises.
- Drove replacement of other data platform technologies with Snowflake at the lowest TCO, with no compromise on performance, quality, or scalability.
- Built solutions once, for all use cases, with no band-aid approach.
- Designed and developed the Dynamics-AAA (Access, Authorize & Audit) Portal, which provides secure access to Azure resources and assigns custom roles; this portal became a standard for granting access and ensured compliance with MSIT standards.
- Responsible for implementing monitoring solutions in Ansible, Terraform, Docker, Openshift and Jenkins.
- Implemented the new cluster configuration with node pools, cross region implementation and migrated existing applications to new AKS infrastructure.
- Continued evaluation of the guardrail roadmap context and utilized it to ensure ongoing performance.
- Extensively used data warehouse ETL methodology in testing, supporting data extraction, transformation, and loading in a corporate-wide ETL solution using tools such as Informatica.
- Worked on data cleansing using the cleanse functions in Informatica MDM.
- Good exposure in Informatica MDM where data Cleansing, De-duping and Address correction were performed.
- Administration of DevOps tools suite like Puppet Enterprise, AWS, TeamCity, GitHub, JIRA, Confluence, Rundeck, Puppet, Octopus Deploy, Splunk and ELK stack.
- Experience in using Continuous Integration tools like TFS Team Build, Cruise Control, Build Forge, TeamCity, Bamboo, Hudson, and Jenkins for End-to-End automation for all build and deployments.
- Involved in Informatica MDM hub configuration, IDQ cleanse function implementation, Hierarchy Configuration in MDM hub.
- Orchestrated Corrective Actions and Preventive measures throughout major project phase.
- Focused on Preventive actions over Corrective actions to mitigate risks.
- Developed new RESTful API services in Golang that work as middleware between our application and the third-party APIs we consume.
- Experience writing data APIs and multi-server applications to meet product needs using Golang.
- Deployed the tools Microsoft Azure Cloud Service (PaaS, IaaS), and Web Apps.
- Used SQL Azure extensively for database needs in CustomerLookup and AzNot.
- Coordinated with QA Testing Team to test various test scenarios involving Test Plans, Test Cases and Test Scripts.
- Migrated the Azure CXP Tools to HTTPS based authentication using SSL encryption.
- Worked with Nagios for Azure Active Directory & LDAP and data consolidation for LDAP users. Monitored system performance using Nagios, maintained Nagios servers, and added new services and servers.
- Worked with the application based on JSP, JavaScript, Struts 2.0, JSF 2.0, Cloudant, Hibernate 3.0, Service Oriented Architecture System Analysis and Design methodology as well as Object Oriented Design.
- Direct technical engagement with SOCOM, TRANSCOM, MARSOC, NAVSOC, and other high level security agencies for use of DVN in Clandestine Communications, Managed Attribution for Open Source Analysis, and mission critical communications.
- Expertise in scanning and remediating application vulnerabilities using static code analysis and Black Duck scanning.
- Managing SAP users, authorizations, and profiles, distributing online SAP user workloads, monitoring and managing SAP background job workloads.
- Perform risk assessments, update and review System Security Plans (SSP) using NIST (Guide for Developing Security Plans for federal information systems) Plans of Action and Milestones (POA&M), Security Control Assessments, Configuration.
- Performed vulnerability scans and continuous monitoring using NIST guidance with the aid of Nessus.
- Expertise in Preparing, arranging and testing the Splunk search strings and operational strings.
- Extensive experience in deploying, configuring, and administering Splunk Clusters.
- Experience in developing Splunk queries and dashboards targeted at understanding application performance and capacity analysis.
- Created and Scheduled the Bots in Control Tower. Expertise in scheduling, triggering of tasks with advance features of task-queuing technology and deploying tasks in Remote PC's.
- Experience in working with Splunk authentication and permissions and having significant experience in supporting large scale Splunk deployments.
- Expert in installing SPLUNK apps for Linux and UNIX environments.
- Data Profiling, Mapping and Integration from multiple sources to AWS S3/RDS/Redshift.
- Automation of ETL loads into Redshift Database using Windows Batch Scripts.
- Performed Murex CVA desk tasks.
- Secured data is stored in MySQL. Vault (by HashiCorp) secures, stores, and tightly controls access tokens and passwords used by the overall platform; it started in the AWS cloud and currently integrates with several services such as AWS IAM, Amazon DynamoDB, Amazon SNS, and Amazon RDS.
- Automated various infrastructure activities such as continuous deployment, application server setup, and stack monitoring using Ansible Playbooks, and integrated Ansible with Jenkins.
- Created Ansible cloud modules for interacting with Azure services, which provide the tools to easily create and orchestrate infrastructure on Azure, and automated cloud-native applications in Azure using Azure microservices such as Azure Functions and Kubernetes on Azure.
- Deployed and managed containerized applications with a fully managed Kubernetes service; Azure Kubernetes Service (AKS) offers serverless Kubernetes, an integrated continuous integration and continuous delivery (CI/CD) experience, and enterprise-grade security and governance, uniting development and operations teams on a single platform to rapidly build, deliver, and scale applications.
- Experience in dealing with Windows Azure IaaS - Virtual Networks, Virtual Machines, Cloud Services, Resource Groups, Express Route, Traffic Manager, VPN, Load Balancing, Application Gateways, and Auto-Scaling.
- Designed and created the database tables and wrote SQL queries to access PostgreSQL.
- Worked on Continuous Integration (CI) workflows using virtual environments such as Docker and Kubernetes to build containers and deploy microservices-oriented environments for scalable applications; implemented Twistlock for container and application security.
- Configured RBAC and Azure Monitor for adding security in Azure Cloud.
- Installed and configured SCM tools, Chef on Azure.
- Coordination with continuous Integration to ensure that all applicable environment issues are resolved in advance of production implementation.
- Automated RabbitMQ cluster installations and configuration using Python/Bash (see the sketch after this list).
- Designed and deployed applications utilizing all the AWS stack (Including EC2, Route53, S3, ELB, EBS, VPC, RDS, DynamoDB, SNS, SQS, IAM, KMS, Lambda, Kinesis) and focusing on high-availability, fault tolerance and auto-scaling in AWS Cloud Formation, deployment services (Ops Works and Cloud Formation) and security practices (IAM, Cloud Watch, Cloud Trail).
- Configured AWS IAM and Security Group in Public and Private Subnets in VPC.
- Designed AWS Cloud Formation templates to create custom sized VPC, subnets, NAT to ensure successful deployment of Web applications and database templates.
- Created AWS Route53 to route traffic between different regions.
- Performed SVN to GIT/BitBucket migration and managed branching strategies using GIT flow workflow. Managed User access control, Triggers, workflows, hooks, security and repository control in BitBucket.
- Implemented multiple CI/CD pipelines as part of a DevOps role for on-premises and cloud-based software using Jenkins, Chef, OpenShift, and AWS/Docker.
- Experience in Configuration Management, Cloud Infrastructure, and Automation like Amazon Web Services (AWS), Ant, Maven, Jenkins, Chef, SVN, GitHub, Clear Case, Tomcat, and Linux.
- Used JIRA as defect tracking system and configure various workflows, customizations and plugins for JIRA bug/issue tracker, integrated Jenkins with JIRA, GitHub.
- Extensively experienced in Bash, Perl, Python, Ruby scripting on Linux.
- Experienced in performance tuning of Spark Applications for setting right Batch Interval time, correct level of Parallelism and memory tuning.
- Optimized existing Hadoop algorithms using SparkContext, Spark SQL, DataFrames, and pair RDDs.
- Implemented the ELK (Elasticsearch, Logstash, Kibana) stack to collect and analyze the logs produced by the Spark cluster.
- Performed advanced procedures like text analytics and processing, using the in-memory computing capabilities of Spark.
- Experienced in handling large datasets using Partitions, Spark in Memory capabilities, Broadcasts in Spark, Effective & efficient Joins, Transformations and other during ingestion process itself.
- Installed Puppet/Chef/Docker for the OpenStack and OpenShift environments, along with scripting in Perl, Ruby, and Python.
- Carried automated Deployments and builds on various environments using continuous integration (CI) tool Jenkins.
- Implemented a RESTful web service to interact with the Redis cache framework.
- Worked on developing RESTful endpoints to cache application-specific data in in-memory data clusters like Redis and exposed them as RESTful endpoints.
- Installing, configuring and administering Jenkins CI tool on Linux machines.
- Build Scripts using Ant and Maven build tools in Jenkins to move from one environment to other environments.
- Maintained the Elasticsearch cluster and Logstash nodes to process around 5 TB of data daily from various sources such as Kafka, Kubernetes, etc.
- Wrote Chef Recipes to automate our build/deployment process and do an overall process improvement to any manual processes.
- Wrote multiple cookbooks in Chef and implemented environments, roles and Data Bags in Chef for better environment management.
- Implemented Chef Knife and Cookbooks by Ruby scripts for Deployment on internal Data Centre Server and reused same Chef Recipes to create a Deployment directly into EC2 instances.
- Created PostgreSQL and Oracle databases on AWS and worked on modifying their settings.
- Created and managed multiple instances of Apache Tomcat and deployed several test applications in those instances in QA environment.
- Managed merging and automation processes across environments using SCM tools such as Git, Octopus, Stash, and TFS on Linux and Windows platforms.
- Investigation of issues found in the production environment, Apache Tomcat configuration and support for other teams within IT.
- Deployed and maintained production environment using AWS EC2 instances and Elastic Container Services with Docker.
- Good Knowledge on container management using Docker in creating images.
- Worked on Docker components like Docker Engine and creating Docker images.
- Implemented a Continuous Delivery pipeline with Docker, Openshift, Jenkins and GitHub and AWS AMI's.
- Wrote Jenkins DSL scripts for code quality analysis using SonarQube and HP Fortify tools.
- Monitoring the environments with Sensu monitoring tool.
- Reduced build and deployment times by designing and implementing a Docker workflow.
- Experience in creating Docker containers and Docker consoles for managing the application life cycle.
- Documented release builds and source control procedures and plans.
- Develop scalable build, test and deployment systems in virtualized environments.
- Resolved the issues on Amazon web services by capturing the snapshots of build boxes.
- Developed microservice onboarding tools leveraging Python and Jenkins, allowing easy creation and maintenance of build jobs and Kubernetes deployments and services (see the sketch after this list).
- Good experience working with Airflow.
- Generated workflows through Apache Airflow, and previously Apache Oozie, to schedule the Hadoop jobs that drive large data transformations.
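RabbitMQ configuration sketch (referenced from the RabbitMQ bullet above): one plausible Python shape for the post-install configuration step, driven through the RabbitMQ management HTTP API with requests; the host, credentials, vhost, and HA policy are assumptions for illustration.

```python
# Post-install RabbitMQ configuration via the management HTTP API.
# Endpoint, credentials, and policy values are placeholders.
import json
import requests

API = "http://rabbit-01:15672/api"          # hypothetical management endpoint
AUTH = ("admin", "admin-password")          # placeholder credentials
HEADERS = {"content-type": "application/json"}

def put(path: str, body: dict) -> None:
    resp = requests.put(f"{API}{path}", auth=AUTH, headers=HEADERS,
                        data=json.dumps(body))
    resp.raise_for_status()

# Application user with full permissions on the default vhost (%2F).
put("/users/app_user", {"password": "s3cret", "tags": ""})
put("/permissions/%2F/app_user", {"configure": ".*", "write": ".*", "read": ".*"})

# Mirror every queue across the cluster (classic HA policy).
put("/policies/%2F/ha-all", {
    "pattern": ".*",
    "definition": {"ha-mode": "all"},
    "apply-to": "queues",
})
```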
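Microservice onboarding sketch (referenced from the Python/Jenkins bullet above): an illustrative helper that creates or updates a templated Jenkins pipeline job per service, assuming the python-jenkins client; the Jenkins URL, credentials, and the truncated job-config XML are placeholders.

```python
# Onboarding helper: given a service name and repo, create or update a
# templated Jenkins pipeline job. The XML template is a truncated placeholder.
import jenkins

JOB_TEMPLATE = """<flow-definition plugin="workflow-job">
  <definition class="org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition" plugin="workflow-cps">
    <scm class="hudson.plugins.git.GitSCM">
      <userRemoteConfigs><hudson.plugins.git.UserRemoteConfig>
        <url>{repo_url}</url>
      </hudson.plugins.git.UserRemoteConfig></userRemoteConfigs>
    </scm>
    <scriptPath>Jenkinsfile</scriptPath>
  </definition>
</flow-definition>"""

def onboard_service(name: str, repo_url: str) -> None:
    server = jenkins.Jenkins("https://jenkins.example.com",   # hypothetical URL
                             username="automation", password="api-token")
    config = JOB_TEMPLATE.format(repo_url=repo_url)
    if server.job_exists(name):
        server.reconfig_job(name, config)   # keep existing jobs in sync
    else:
        server.create_job(name, config)

onboard_service("payments-service", "git@github.example.com:org/payments.git")
```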
Environment: Azure (Web Roles, Worker Roles, SQL Azure, Azure Storage, Azure AD, Resource Groups, Office 365, RBAC), GCP, SVN, Git, GitHub, Bitbucket, DSL (Groovy), SonarQube, Sensu, Ant, Maven, PostgreSQL, AWS, Docker, Kubernetes, JIRA, Shell scripts, Chef, Python, Ruby, Jenkins, Groovy, Octopus, WebLogic, Tomcat, WebSphere, Golang, Black Duck, Twistlock, Fortify, Spark, Terraform, Cloudant, Redis, RabbitMQ, ETL, Airflow, OpenShift.
Confidential, Atlanta, GA
DevOps/ Puppet Automation Engineer
Responsibilities:
- Built and Deployed java source code into application servers in an AGILE continuous integration environment.
- Worked in DevOps collaboration team for internal build and automation configuration management in Linux/Unix and windows environment.
- As a DevOps engineer, used a series of tools (Subversion, CVS, Maven, Jenkins, Chef, Jira) and was involved in day-to-day build and release cycles.
- Maintained and modified build related scripts developed in ANT (build.xml files).
- Managed source control systems GIT and SVN.
- Developed build and deployment scripts and used ANT/Maven tools in Jenkins to span from one environment to other.
- Installation and support of various applications and databases including Oracle and MySQL, along with WebLogic 10, JBoss 4.2.x, Oracle 10g, and Tomcat.
- Designed and maintained highly available, scalable, secure AWS network infrastructure with monitoring and alerting.
- Thorough knowledge of Linux internals and utilities (kernel, memory, swap, CPU).
- Experienced in cloud automation using AWS CloudFormation templates, Chef, OpenShift, and Puppet.
- Implemented CloudTrail in order to capture the events related to API calls made to AWS infrastructure.
- Configured AWS Identity Access Management (IAM) Group and users for improved login authentication.
- Launching and configuring of Amazon EC2 (AWS) Cloud Servers using AMI's (Linux/Ubuntu) and configuring the servers for specified applications.
- Worked on AWS designing and followed Info security compliance related guidelines.
- Designed and implemented fully automated server build management, monitoring, and deployment using DevOps technologies such as Puppet.
- Involved in leading Automation Deployment Team by working with Puppet.
- Created puppet manifests, profiles and roles module to automate system operations.
- Developed/managed Puppet manifest for automated deployment to various servers.
- Used Puppet to automate Configuration management & Applications.
- Deployed Puppet, Puppet Dashboard, and PuppetDB for configuration management on existing infrastructure.
- Constructed the puppet modules for continuous deployment and worked on Jenkins for continuous integration.
- Automated Linux production server’s setup using Puppet scripts. Used these scripts to replicate production build environments on a local dev boxes using Vagrant and Virtual Box.
- Wrote Ansible playbooks with a Python SSH wrapper to manage configurations of OpenStack nodes, and tested playbooks on AWS instances using Python (see the sketch after this list).
- Create develop and test environments of different applications by provisioning Kubernetes clusters on AWS using Docker, Openshift, Ansible, and Terraform.
- Installed, configured and automated build jobs in Jenkins for continuous integration using various plugins in AWS pipelining.
- Secured data is stored in MySQL. Vault (by HashiCorp) secures, stores, and tightly controls access tokens and passwords used by the overall platform; it started in the AWS cloud and currently integrates with several services such as AWS IAM, Amazon DynamoDB, Amazon SNS, and Amazon RDS.
- Configuring Chef to build up services and applications on the instances once they have been configured using cloud formation.
- Performed Continuous Delivery in a MicroServices infrastructure with Amazon cloud, Docker and Kubernetes.
- Built and maintained Docker infrastructure for service-oriented architecture (SOA) applications.
- Worked on Continuous integration tools like Jenkins to build and test the applications and working on issue tracking tool like iTrack, JIRA, Confluence.
- Managed monitoring using Nagios and updated parameters with active and passive checks.
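Python SSH wrapper sketch (referenced from the Ansible/OpenStack bullet above): an illustrative example of running a configuration check on each OpenStack node over SSH before re-running playbooks, assuming paramiko as the SSH library; hostnames, user, and key path are placeholders.

```python
# Push a service-status check to each OpenStack node over SSH.
# Inventory, user, and key path below are illustrative assumptions.
import paramiko

NODES = ["controller-01", "compute-01", "compute-02"]   # hypothetical inventory

def run_on_node(host: str, command: str) -> str:
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username="stack", key_filename="/home/stack/.ssh/id_rsa")
    try:
        _, stdout, stderr = client.exec_command(command)
        return stdout.read().decode() + stderr.read().decode()
    finally:
        client.close()

if __name__ == "__main__":
    for node in NODES:
        # Verify the nova-compute service is up before re-running playbooks.
        print(node, run_on_node(node, "systemctl is-active openstack-nova-compute || true"))
```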
Environment: Linux (Red Hat, Solaris, Ubuntu), Windows, AWS, Puppet, PuppetDB, Chef, Ansible, Docker, WebLogic, JBoss, Oracle, MySQL, Ant, Maven, CVS, Git, SVN, Jenkins, iTrack, Jira, kernel, Memory, Swap, Terraform, OpenShift.
Confidential, Atlanta, GA
SCM Build Engineer
Responsibilities:
- Designed and implemented GIT metadata including elements, labels, attributes, triggers and hyperlinks.
- Implemented & maintained the branching and build/release strategies utilizing GIT.
- Experience in creating the company's DevOps strategy in a mix environment of Linux (RHEL, Ubuntu) servers along with creating and implementing a cloud strategy based on Amazon Web Services.
- Performed all necessary day-to-day GIT support for different projects.
- Responsible for design and maintenance of the GIT Repositories, views, and the access control strategies.
- Experience with configuration management tools (Puppet, Chef).
- Conduct DevOps Process Planning.
- Automated deployment of servers on ESXi and in the cloud. Managed server instances on the Amazon Web Services (AWS) platform using Chef configuration management.
- Responsibilities include developing complex build, test, provision, secure and deployment systems and providing support to a large community of developers and testers.
- Implemented rapid provisioning and life-cycle management for Ubuntu Linux using Amazon EC2, Chef, and custom Ruby/Bash scripts (a Python/boto3 analogue of the provisioning step is sketched after this list).
- Implemented Elastic Beanstalk to auto-deploy and auto-scale applications using services such as EC2 instances, load balancers, and RDS databases in the AWS environment.
- Responsible for writing Manifests to configure nodes.
- Strong understanding of JAVA project structures.
- Experience in build automation using JENKINS, MAVEN, ANT.
- Experience in deploying JAVA projects using MAVEN/ANT and JENKINS.
- Develop scalable build, test and deployment systems in virtualized environments.
- Lead configuration management and workflow development efforts for the development team.
- Created and maintained various DevOps related tools for the team such as provisioning scripts, deployment tools, and development and staging environments on AWS, Rackspace and Cloud.
- Deployment and implementation of Chef.
- Experience with CI tools (Jenkins, Hudson) and Version Control Tools or Source Code Management tool (GIT).
- Good understanding of building the Android Applications using the Maven and Jenkins.
- Built post-install scripts using shell scripting on Linux servers.
- Strong skills in managing Red Hat Linux servers, Virtualization, and system security.
- Experience with Apache/Tomcat, Load Balancer (Apache) and expertise in making configuration changes.
- Knowledge in load balancing and setting up load balancer and firewall rules in and enterprise environment.
- Provided 24x7 production support and development environments.
- Ability to communicate requirements effectively to team members and manage applications.
- Self-motivated and able to easily adapt to new technologies and tools.
- Ability to work in both independent and Team environments.
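Provisioning sketch (referenced from the rapid-provisioning bullet above): the original scripts were Ruby/Bash; this is a Python/boto3 analogue of the same EC2 launch step, with a user-data stub standing in for the Chef bootstrap. The AMI id, key pair, and security group are placeholders.

```python
# Python/boto3 analogue of the EC2 provisioning step, with a user-data stub
# standing in for the Chef bootstrap. All identifiers are placeholders.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

USER_DATA = """#!/bin/bash
# placeholder: install chef-client and register the node with the Chef server
curl -L https://omnitruck.chef.io/install.sh | bash
"""

def provision_ubuntu(count: int = 1):
    instances = ec2.create_instances(
        ImageId="ami-0abcdef1234567890",     # hypothetical Ubuntu AMI
        InstanceType="t3.medium",
        MinCount=count,
        MaxCount=count,
        KeyName="ops-keypair",               # placeholder key pair
        SecurityGroupIds=["sg-0123456789abcdef0"],
        UserData=USER_DATA,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "Role", "Value": "app-node"}],
        }],
    )
    return [i.id for i in instances]

if __name__ == "__main__":
    print(provision_ubuntu())
```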
Environment: Git, AWS, Ruby, AnthillPro, Hudson, Chef, Java/J2EE, Ant, Maven, Jenkins, XML, Red Hat Linux, WebLogic, MySQL, Perl scripts, Shell scripts.
Confidential, New York, NY
System Administrator
Responsibilities:
- Responsible for remote Linux support for more than 600 servers.
- Installation of Ubuntu and RHEL operating systems on HP and Dell hardware.
- Installation and configuration of Webserver (Apache 2.2.17), MySQL 5.5 and PHP in a LAMP stack.
- Responsible for creating and managing user accounts, security, rights, disk space and process monitoring in Solaris, CentOS and Redhat Linux.
- Utilized commands and utilities such as iptables, netstat, and ping to implement operating system and network security. Managed and upgraded UNIX server services such as BIND DNS.
- Configuration and administration of web (Apache), DHCP, and FTP servers on Linux and Solaris servers. Decommissioned old servers and kept track of decommissioned and new servers using an inventory list.
- Involved in upgrades of the Bamboo and Artifactory servers.
- Installation, configuration, and administration of DNS, LDAP, NFS, NIS, NIS+, and Sendmail on Red Hat Linux/Debian servers. Configured and managed ESX VMs with VirtualCenter and the VI Client. Provided support for a fixed-income application and batch processes running on UNIX servers.
- Installation, Configuration and Maintenance of VERITAS cluster server VCS for UNIX boxes.
- Worked on implementing Sudo on servers and providing root access for application users.
- Creation and management of users' and groups' accounts, passwords, profiles, security ( ACL, Disk Quota, and PAM ), permissions, disk space usage and processes.
- Administered RedHat Linux servers for several functions including managing Apache/Tomcat server, Mail server, MySQL database and firewalls in both development and production.
- Installation, configuration, support and security implementation on DNS, DHCP, NFS, HTTPD.
- Performance of RPM and YUM package installations, Yum repository management.
- Used the remedy ticketing system to troubleshoot and resolve issues with the servers such as mount point issues.
- Responsible for applying patches and supporting Linux servers with Oracle Database servers.
- Handling problems or requirements as per the ticket (Request Tracker) created. Configuration and troubleshooting - LAN and TCP/IP issues.
Environment: Subversion, Solaris, Red Hat Linux/Unix, Apache Tomcat, Ubuntu, RHEL, Jira, CentOS 3/4, Linux, WebLogic, Perl scripts, UNIX, Shell scripts, YUM, Nagios.