
DevOps Tools Engineer Resume


SUMMARY

  • Software engineering experience with multiple consulting firms, covering design, coding, testing, deployment, and support for a wide variety of tooling, primarily backend development. Primary experience is with Linux/Unix systems using Python and Perl.

TECHNICAL SKILLS

OSes: Linux (Red Hat, CentOS, Ubuntu), Mac OS X

Languages: Python, Perl, Bash, HTML, SQL, InfluxDB query language, JavaScript, CSS, JSON, YAML

Databases: MySQL, MS SQL, Oracle, InfluxDB

Systems: Active Directory, LDAP

Tools: AWS, boto3, Git, Perforce, SVN, CVS, Confluence, pip, Composer, gem, PyCharm, Jenkins, Tenable Security Center API, Jira API

PROFESSIONAL EXPERIENCE

DevOps Tools Engineer

Confidential

Responsibilities:

  • Work in a DevOps team developing and maintaining monitoring and alerting tools for the Confidential production site, including Perl monitoring scripts that collect production status data and insert it into InfluxDB time-series databases across regional datacenters (see the sketch after this list).
  • Configure Grafana panels in JSON/YAML to graph and display the data. Investigate and debug complex problems across multiple systems.
  • Main customer/user of our tools was Confidential's SRE Team.
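
A minimal Python sketch of the collect-and-insert idea behind those Perl monitoring scripts, using the influxdb client library; the host, database, measurement, and tag names are hypothetical:

```python
# Hypothetical Python equivalent of the Perl collectors: gather a status
# metric and write it as a point into InfluxDB.
from datetime import datetime, timezone

from influxdb import InfluxDBClient  # influxdb-python client


def write_status_point(host="influxdb.example.com", database="production_status"):
    client = InfluxDBClient(host=host, port=8086, database=database)
    point = {
        "measurement": "service_status",                      # hypothetical measurement
        "tags": {"datacenter": "us-west", "service": "web"},  # hypothetical tags
        "time": datetime.now(timezone.utc).isoformat(),
        "fields": {"up": 1, "response_ms": 42.0},             # example status fields
    }
    client.write_points([point])


if __name__ == "__main__":
    write_status_point()
```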

Confidential: Linux, Grafana, InfluxDB, Perl, Git, GitHub, JSON, YAML, JavaScript, Bash, Go, Perforce, Agile, TDD

Confidential

Software Engineer

Responsibilities:

  • Program to query Active Directory and return the hierarchy of managers upward from a given person; conversely, another to return the downward reporting tree from any given manager. Perl, Linux, Git, GitHub, JSON, Active Directory, Net::LDAP
  • Program to scan the S3 buckets in an AWS account and report any that had unsafe global access permissions (see the first sketch after this list). Python, Linux, JSON, AWS S3, boto3
  • Program to scan log objects under S3 bucket/prefixes to determine whether the logs are current (PASS) or not (FAIL); see the second sketch after this list. Python, Linux, JSON, AWS S3, boto3
  • EC2 instance startup script to check whether AWS S3 objects (containing service config files) were updated since the last run and, if so, download them to the EC2 instance and start/restart the service with the changed files. Bash shell, AWS CLI
  • Program to build Jira Query Language (JQL) from given search criteria, run the query, and list the results with all relevant fields for each issue (see the Jira sketch after this list). Jira API, Python
  • Script to query Tenable Security Center via its API for vulnerability and asset data, then aggregate vulnerabilities for all hosts by Severity across Organization, DataCenter, Operating System, BuildVersion, and Environment for use in Jira ticket creation/updates.
  • Script to poll an AWS SQS queue and initiate an S3 log transfer for each queue item processed (see the SQS sketch after this list). Python and boto3.
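
The first sketch below is a hedged boto3 take on the S3 permission audit: it flags any bucket whose ACL grants access to the global AllUsers or AuthenticatedUsers groups (bucket selection and report format are assumptions).

```python
import boto3

# ACL grantee URIs that indicate global access
PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}


def find_public_buckets():
    s3 = boto3.client("s3")
    unsafe = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        for grant in s3.get_bucket_acl(Bucket=name)["Grants"]:
            grantee = grant.get("Grantee", {})
            if grantee.get("Type") == "Group" and grantee.get("URI") in PUBLIC_GROUPS:
                unsafe.append((name, grant["Permission"]))
    return unsafe


if __name__ == "__main__":
    for name, permission in find_public_buckets():
        print(f"UNSAFE: {name} grants {permission} to a global group")
```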
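
The second sketch covers the log-currency check: list the objects under a bucket/prefix and return PASS or FAIL based on the newest LastModified timestamp (bucket, prefix, and freshness threshold are placeholders).

```python
import datetime

import boto3


def logs_are_current(bucket, prefix, max_age_hours=24):
    s3 = boto3.client("s3")
    newest = None
    for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            if newest is None or obj["LastModified"] > newest:
                newest = obj["LastModified"]
    if newest is None:
        return False  # no log objects at all counts as FAIL
    age = datetime.datetime.now(datetime.timezone.utc) - newest
    return age <= datetime.timedelta(hours=max_age_hours)


if __name__ == "__main__":
    ok = logs_are_current("app-logs", "service-a/")  # placeholder bucket/prefix
    print("PASS" if ok else "FAIL")
```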
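
The Jira sketch assembles a JQL string from simple criteria and runs it through Jira's REST search endpoint; the server URL, credentials, and field list are placeholders.

```python
import requests

JIRA_URL = "https://jira.example.com"  # placeholder
AUTH = ("api-user", "api-token")       # placeholder credentials


def search_issues(project, status, fields=("key", "summary", "status", "assignee")):
    jql = f'project = "{project}" AND status = "{status}" ORDER BY updated DESC'
    resp = requests.get(
        f"{JIRA_URL}/rest/api/2/search",
        params={"jql": jql, "fields": ",".join(fields), "maxResults": 100},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["issues"]


if __name__ == "__main__":
    for issue in search_issues("OPS", "Open"):
        print(issue["key"], issue["fields"]["summary"])
```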
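
The SQS sketch long-polls a queue and copies the referenced S3 object for each message; the queue URL, destination bucket, and message body format are assumptions.

```python
import json

import boto3

QUEUE_URL = "https://sqs.us-west-2.amazonaws.com/123456789012/log-transfer"  # placeholder
DEST_BUCKET = "archived-logs"                                                # placeholder


def poll_and_transfer():
    sqs = boto3.client("sqs")
    s3 = boto3.client("s3")
    while True:
        resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20)
        for msg in resp.get("Messages", []):
            body = json.loads(msg["Body"])  # assumed to contain "bucket" and "key"
            s3.copy_object(
                Bucket=DEST_BUCKET,
                Key=body["key"],
                CopySource={"Bucket": body["bucket"], "Key": body["key"]},
            )
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])


if __name__ == "__main__":
    poll_and_transfer()
```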

Confidential, Sunnyvale, CA

Operations Programmer

Responsibilities:

  • Maintenance of Python scripts that start and stop various Linux applications or services from Jenkins; manipulate Zabbix host monitoring; rotate developer build clusters in the Zeus Load Balancer; migrate a Route53 HostedZone from one AWS account to another (see the sketch after this list); and truncate ElasticSearch log records based on age.
  • Cron-driven Perl scripts to verify that critical systems are running. Write automated tests to validate functionality. Create a testing app integrated with MongoDB for reproducing a developer's CSV issue.
  • Assist with maintenance of Confidential systems (AWS, Zabbix, Zeus Load Balancer, Jenkins) driven from the developer Jira ticket request system.
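
A hedged boto3 sketch of the Route53 HostedZone migration: read record sets with a source-account session and UPSERT them into a zone owned by the destination account (profile names, zone IDs, and zone name are placeholders; apex NS/SOA records are skipped because the destination zone supplies its own).

```python
import boto3


def migrate_zone(src_profile, dst_profile, src_zone_id, dst_zone_id, zone_name):
    src = boto3.Session(profile_name=src_profile).client("route53")
    dst = boto3.Session(profile_name=dst_profile).client("route53")

    for page in src.get_paginator("list_resource_record_sets").paginate(HostedZoneId=src_zone_id):
        for rrset in page["ResourceRecordSets"]:
            # Skip the apex NS/SOA; the destination zone has its own.
            if rrset["Type"] in ("NS", "SOA") and rrset["Name"].rstrip(".") == zone_name.rstrip("."):
                continue
            dst.change_resource_record_sets(
                HostedZoneId=dst_zone_id,
                ChangeBatch={"Changes": [{"Action": "UPSERT", "ResourceRecordSet": rrset}]},
            )


if __name__ == "__main__":
    migrate_zone("source", "destination", "ZSRCEXAMPLE", "ZDSTEXAMPLE", "example.com")
```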

Confidential, Glendale, CA

Operations Engineer

Responsibilities:

  • Created a Linux-based Perl tool to automate a small tape robot system, making it easy for operators to control archiving of multi-terabyte assets to multiple tapes with one command, requiring no handling until the job was done.
  • Created a PHP Laravel web application to make it easy for system engineers to enable/disable A10 load balancer GSLB and SLB hosts so traffic would not flow to them while they were being serviced. In addition, a command-line version of the PHP app was created. Used the A10 load balancer web service API with JSON data communication and a MySQL database for state storage.
  • Debugged and finished an incomplete Perl tool that controlled storage and retrieval of assets using the StorNext API on Quantum i6000 robotic tape library archive systems.
  • Developed a Perl tool to migrate all Disney assets from a compressed storage device to normal uncompressed storage. All tasks completed successfully.
  • Using Python, PEP 8 coding standards, and the AWS API, developed tools to configure AWS load balancers, databases, and web servers and to deploy Aspera Faspex. Used Python to produce CLI statistical reports of AWS server statuses (see the sketch after this list) and to analyze Splunk logs to produce usage reports.
  • Created Ruby/Chef recipes to install Aspera entsrv and Aspera p2p.
  • Write automated tests to validate functionality.
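
A hedged boto3 sketch of the kind of CLI status report described above, summarizing EC2 instance state and health checks (region and output layout are assumptions).

```python
import boto3


def instance_status_report(region="us-west-2"):
    ec2 = boto3.client("ec2", region_name=region)
    print(f"{'Instance':<22}{'State':<12}{'Instance check':<16}{'System check'}")
    for page in ec2.get_paginator("describe_instance_status").paginate(IncludeAllInstances=True):
        for status in page["InstanceStatuses"]:
            print(
                f"{status['InstanceId']:<22}"
                f"{status['InstanceState']['Name']:<12}"
                f"{status['InstanceStatus']['Status']:<16}"
                f"{status['SystemStatus']['Status']}"
            )


if __name__ == "__main__":
    instance_status_report()
```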

Confidential, Sunnyvale, CA

Operations Engineer

Responsibilities:

  • Created/modified Perl-scripted tools to analyze and route received 7z/tgz Confidential storage filer core dump and data files (up to 100 GB in size) to the appropriate server on the network, where they would be further processed or stored for analysis by Tech Support (see the sketch after this list). The scripts made use of the third-party 7z and Aspera ascp tools, as well as rsync, to transfer the files across Confidential's Linux servers.
  • Created/modified Perl/Bash tools to speed up Confidential's existing archival process, used to get data onto storage filers, by allowing it to run multiple instances in parallel.
  • Assist with Perl program changes needed for a data center migration.
  • Systems & Technologies used: Perl, Linux, Solaris, Aspera High-Speed File Transfer, Perforce.
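
A minimal Python sketch of the routing idea (the production tools were Perl): choose a destination server for a dump file and hand the transfer to rsync. The size-based routing rule, hostnames, and paths are hypothetical.

```python
import subprocess
from pathlib import Path

LARGE_FILE_BYTES = 20 * 1024**3  # hypothetical cutoff for very large dumps
SERVERS = {"large": "dump-big.example.com", "normal": "dump-std.example.com"}  # placeholders


def route_dump(path, dest_dir="/incoming/cores/"):
    size = Path(path).stat().st_size
    host = SERVERS["large"] if size >= LARGE_FILE_BYTES else SERVERS["normal"]
    # rsync with resume support for multi-gigabyte transfers
    subprocess.run(["rsync", "--partial", "--progress", path, f"{host}:{dest_dir}"], check=True)


if __name__ == "__main__":
    route_dump("/var/crash/core-dump.7z")
```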
