Consultant Resume
SUMMARY:
- Seeking full-time positions at the director level or expert-level consulting engagements for long-term projects
- 20 years of experience in professional software development and architecture
- Leadership, management, and small business ownership experience
- Strong foundation of skills and experience with big data projects, real-time analytics, customer-facing web applications, API service back ends, and DevOps infrastructure management
- I have led design, architecture, and implementation work to streamline web applications.
- I am a self-starter and leader equally at home working on a team as I am at finding clients and completing work for my consultancy business.
- I have managed and participated in projects using a variety of methodologies and actively encourage the use of agile methodologies and scrum whenever I can.
- My experience is founded on a lifelong passion for programming, technology, and tackling difficult problems.
- I apply the discipline and attention to detail I have learned while exploring the worlds of software development, systems architecture, project management, and leadership.
SPECIFIC SKILLS AND TECHNOLOGIES:
- Linux / Windows Server / Solaris / AIX
- Scrum Master / Agile Coach / Team Lead
- Chef / Puppet / Ansible / Vagrant
- Project Planning / Architecture / Waterfall / Agile
- AWS / Azure / GCP / Linode / Heroku / Vultr
- Docker / Mesos / Marathon / DCOS / Kubernetes
- Multichannel Marketing / Advertising
- Jenkins / HAProxy / IPTables / DNS
- Health and Wellness
- Active Directory / OpenLDAP / Kerberos
- Income and Identity Verification
- Log Shipping / Logstash / Kibana / Grafana
- Startups / Mobile Gaming
- ESXi / Xen / KVM / Virtualbox
- Consulting
- New Relic / Nagios / NetSNMP
- SQL / PostgreSQL / MySQL
- Hadoop / HBase / Hive / Pig / Oozie
- Netezza
- Spark / Microbatches / Streaming
- Microsoft SQL Server
- Kafka / Event Sourcing / Message Queue
- MongoDB / CouchDB / InfluxDB
- Data Warehouse / Data Mart / Data Lake
- Solr / Elasticsearch
- H2O.ai / Predictive Analytics / Scoring / ML
- Schema Design / Evolution / Data Shape
- Zookeeper / etcd / Zuul / Tyk
- CSV / JSON / XML / Avro / Parquet
- OLAP Cube / Star / Snowflake Schema
- Directed Acyclic Graph / Hierarchical Data Structure
- ERD / Master Data Management
- Confluent Tools / Schema Registry / Kafka Connect
- Bus Model / Fully Incremental Updates
- Slowly Changing Dimensions / MDX
- Java Spring / Servlet / JSP / Scalatra
- Pentaho / Kettle / StreamSets / Talend
- Express / Ember.js / Backbone / KnockoutJS
- Ruby on Rails / Sinatra / Ramaze / Django
- Java / Scala / JRuby
- JSON API Servers / Swagger
- JavaScript / Node.js
- HTML5 / CSS3
- Ruby / Python / Perl / PHP / Go
- Web Sockets / Push Technology / Message Queue
- C / C++ / Objective C
- Shell Scripting / Bash / KSH / ZSH / Awk / Sed
- Slack / Google Hangouts / IRC / Jabber
- TDD / BDD / Cucumber / JUnit / ScalaTest
- Atlassian Tools / Jira / Bitbucket / Confluence
- MVC / MVVM / Dependency Injection / IOC
- Git / Subversion / Vim / IntelliJ / tmux
PROFESSIONAL EXPERIENCE:
Confidential
Consultant
Responsibilities:
- Uptake - Streaming Data Management Platform and Predictive Analytics
- Symphony Project, designed as a streaming enterprise data warehouse and analytics platform for Confidential
- Utilized Apache Spark, Kafka, Avro, Confluent Schema Registry, Confluent Kafka Connect, StreamSets, Mesos, Marathon, Docker, Elasticsearch, and PostgreSQL (a publishing sketch follows this list)
- Led and drove the team to adhere to best practices for Scrum, testing, acceptance criteria, business-value-oriented requirements decomposition, and avoiding purely reactive behaviors
- Set up a continuous deployment pipeline in Jenkins to automatically deploy to our development environment once a pull request was merged
- Set up and managed Jenkins to build pull requests from Bitbucket, run tests, and report results to Slack
- Implemented Pact to help coordinate production and consumption of microservice APIs
- Practiced test-driven development with a mixture of unit tests using JUnit and ScalaTest as well as BDD tests with Cucumber
- Built a series of microservices for managing the flow of data from the data sources throughout the platform using StreamSets and its associated APIs
- Programmed primarily in JVM-based languages, including Java, Scala, and Groovy
- Web front end built with Angular 2
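
As an illustration of the Avro-over-Kafka pattern above, here is a minimal Scala sketch of publishing a schema-registered event; the SensorReading schema, topic name, and broker/registry URLs are hypothetical stand-ins, not the project's actual contracts.

```scala
import java.util.Properties

import org.apache.avro.Schema
import org.apache.avro.generic.{GenericData, GenericRecord}
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object EventPublisher {
  // Hypothetical schema standing in for the platform's real Avro contracts.
  private val schemaJson =
    """{"type": "record", "name": "SensorReading", "fields": [
      |  {"name": "assetId", "type": "string"},
      |  {"name": "value",   "type": "double"}
      |]}""".stripMargin
  private val schema = new Schema.Parser().parse(schemaJson)

  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092")
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    // The Confluent serializer talks to the Schema Registry before encoding.
    props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer")
    props.put("schema.registry.url", "http://localhost:8081")

    val producer = new KafkaProducer[String, GenericRecord](props)
    val reading: GenericRecord = new GenericData.Record(schema)
    reading.put("assetId", "pump-42")
    reading.put("value", 98.6)

    producer.send(new ProducerRecord("sensor-readings", "pump-42", reading))
    producer.close()
  }
}
```

The Confluent serializer registers the schema on first use and embeds its registry ID in each message, which is what lets downstream consumers evolve their schemas safely.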
Confidential
Consultant
Responsibilities:
- Brought on to work on the Personalization platform, a Ruby on Rails application providing real-time advertisement recommendations for point-of-sale printing
- Analyzed the performance of the application under load using the FunkLoad testing tool, characterizing weak points within the application, determining its scalability, and providing recommendations on the best path forward to meet client demand
- Rewrote the critical portion of the web application, the scoring models, in C as Ruby extensions, offering several orders of magnitude improvement over the Ruby code
- Utilized Vagrant and Puppet to orchestrate changes to the application environment, including managing software, Amazon instances, security groups, and S3 storage
- Designed and developed a star schema data warehouse on PostgreSQL, replacing Solr for reporting and providing significantly faster reporting and ad-hoc analysis
- Populated the data warehouse using Spark Streaming with modern Java 8 code, pulling data from an Avro-encoded event stream stored in Kafka (loader sketched after this list)
- Worked on core Java and JavaScript applications to pull event data from Kafka and display it in real time on a NOC-style heatmap dashboard
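
A sketch of that warehouse-loading path, written in Scala for consistency with the other sketches (the project code was Java 8); the topic, JDBC URL, credentials, and fact table are assumptions, and schema-registry Avro decoding is elided in favor of raw strings.

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object WarehouseLoader {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("warehouse-loader")
    val ssc = new StreamingContext(conf, Seconds(10)) // 10-second microbatches

    val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
    // Avro decoding via the schema registry is elided; raw strings here.
    val events = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, Set("ad-events"))

    events.map(_._2).foreachRDD { rdd =>
      rdd.foreachPartition { partition =>
        // One JDBC connection per partition, inserting into a fact table.
        val conn = java.sql.DriverManager.getConnection(
          "jdbc:postgresql://localhost/warehouse", "etl", "secret")
        val stmt = conn.prepareStatement(
          "INSERT INTO fact_impressions (payload) VALUES (?::jsonb)")
        partition.foreach { event =>
          stmt.setString(1, event)
          stmt.executeUpdate()
        }
        conn.close()
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```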
Confidential
CTO
Responsibilities:
- As CTO, continued to grow the company's technology through partnerships
- Migrated hardware from a fully managed hosted operation to a privately hosted, internally managed, colocated site, cutting hardware spending by a factor of four
- Found, interviewed, and hired additional development resources from the Ruby on Rails community
- Continued to grow and develop our Ruby on Rails platform, migrating to Rails 4 and improving the performance of our Mashery-backed APIs by moving JSON generation into the PostgreSQL database
- Ensured delivery of a quality product through a test-driven development model, with all new code covered by unit tests plus integration or functional tests
- Led the group in choosing Agile project management for our development processes, holding daily scrums, using Pivotal Tracker for project management, and executing short, week-long sprints
- Designed an extensible algorithm for matching and scoring a food item against a nutrition guideline, pre-calculating some of the score components using a Hadoop cluster with custom MapReduce jobs
- Created an algorithm for real-time scoring of food items in the database at request time, reducing calculation time by a factor of 200 and allowing us to meet sub-second SLAs (see the scoring sketch after this list)
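
A hypothetical Scala sketch of that two-phase design: the Hadoop jobs precompute per-item component scores offline, so the request-time path reduces to a cheap weighted combination.

```scala
// Hypothetical model of the two-phase scoring idea: expensive per-item
// score components are precomputed offline (the Hadoop MapReduce step),
// and request-time scoring is just a weighted sum over those components.
case class FoodItem(id: Long, componentScores: Map[String, Double])
case class Guideline(name: String, weights: Map[String, Double])

object Scoring {
  // Request-time path: O(#weights) arithmetic over precomputed components,
  // rather than recomputing each component from raw nutrition data.
  def score(item: FoodItem, guideline: Guideline): Double =
    guideline.weights.foldLeft(0.0) { case (total, (component, weight)) =>
      total + weight * item.componentScores.getOrElse(component, 0.0)
    }
}
```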
Confidential
Owner
Responsibilities:
- Brought on to help with performance issues
- Company matches food items against USDA or dietitian-crafted guidelines, focusing on persons with health conditions
- Worked on the challenge of having billions of possible matches between food items and nutrition guidelines
- Architected a solution interfacing a sharded PostgreSQL cluster with Hadoop and Elasticsearch to process and store the recommendations (routing sketched below)
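
A minimal sketch of the shard-routing piece of that design, assuming simple modulo routing and a hypothetical shard naming scheme.

```scala
// Hypothetical routing of a food item to one of N PostgreSQL shards so
// billions of item-to-guideline matches can be processed and stored in
// parallel with predictable placement.
object ShardRouter {
  def shardFor(foodItemId: Long, shardCount: Int): Int =
    (math.abs(foodItemId) % shardCount).toInt

  def jdbcUrlFor(foodItemId: Long, shardCount: Int): String =
    s"jdbc:postgresql://shard-${shardFor(foodItemId, shardCount)}.internal/matches"
}
```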
Confidential
Consultant
Responsibilities:
- Replaced the initial developer, a friend of the owners, who had started the game's development
- Developed all backend logic, game logic, player profiles, Facebook integration, and game management
- Backend is written in Ruby on Rails 3
- The API is RESTful, returning JSON-encoded data
- Requests are signed using a modified version of the Amazon Web Services Signature Version 2 algorithm (see the signing sketch after this list)
- Full test suite for the application done using Cucumber
- Application written using Behavior-Driven and Test-Driven Development principles
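
A Scala sketch of signing in the spirit of AWS Signature Version 2 (the project's modifications are not reproduced; HmacSHA256 and this canonical string layout are assumptions): sort the query parameters, build a canonical string, and HMAC it with the shared secret.

```scala
import java.net.URLEncoder
import java.util.Base64
import javax.crypto.Mac
import javax.crypto.spec.SecretKeySpec

object RequestSigner {
  def sign(method: String, host: String, path: String,
           params: Map[String, String], secret: String): String = {
    // Canonicalize: URL-encode and sort parameters by name, then join.
    val canonicalQuery = params.toSeq.sortBy(_._1)
      .map { case (k, v) =>
        s"${URLEncoder.encode(k, "UTF-8")}=${URLEncoder.encode(v, "UTF-8")}"
      }
      .mkString("&")
    val stringToSign = Seq(method, host, path, canonicalQuery).mkString("\n")

    // HMAC the canonical string with the shared secret and Base64-encode it.
    val mac = Mac.getInstance("HmacSHA256")
    mac.init(new SecretKeySpec(secret.getBytes("UTF-8"), "HmacSHA256"))
    Base64.getEncoder.encodeToString(mac.doFinal(stringToSign.getBytes("UTF-8")))
  }
}
```

The server recomputes the same signature from the received parameters and rejects any request whose signature does not match, which prevents tampering without shipping the secret over the wire.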
Confidential
Consultant
Responsibilities:
- Client runs a company specializing in reporting on search engine ranking over time for tracked keywords and local business rankings
- Client complained that their existing reporting system was too slow
- They required near real-time reporting with incremental data mart refreshes as facts were updated
- The solution was built on ActiveMQ, using message passing to send keyword rankings back to the warehouse for processing as they were collected (consumer sketched below)
- Used a classical dimensional model to store facts
- Migrated reporting queries to the dimensional model
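
A Scala sketch of the consuming side of that design; the broker URL, queue name, and text-message format are assumptions.

```scala
import javax.jms.{Message, MessageListener, Session, TextMessage}
import org.apache.activemq.ActiveMQConnectionFactory

// Hypothetical sketch of the incremental refresh path: a JMS listener on an
// ActiveMQ queue receives each keyword ranking as it arrives and applies it
// to the fact table, keeping the data mart near real time.
object RankingConsumer {
  def main(args: Array[String]): Unit = {
    val factory = new ActiveMQConnectionFactory("tcp://localhost:61616")
    val connection = factory.createConnection()
    val session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE)
    val consumer = session.createConsumer(session.createQueue("keyword.rankings"))

    consumer.setMessageListener(new MessageListener {
      override def onMessage(msg: Message): Unit = msg match {
        case text: TextMessage =>
          // Parse the ranking and upsert the corresponding fact row here.
          println(s"ranking received: ${text.getText}")
        case _ => () // ignore unexpected message types
      }
    })

    connection.start()
  }
}
```

Because each message touches only the affected fact rows, the data mart refresh stays incremental rather than requiring full rebuilds.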