Application Engineer Resume
PROFESSIONAL SUMMARY:
- 11+ years of experience as a Web/Backend Architect, Developer, Data Engineer, and DevOps Engineer.
- Experienced with the full software development life cycle, architecting scalable platforms, object-oriented programming, database design, and agile methodologies.
- Experienced in MVC frameworks such as Django and Flask, and in front-end technologies including AngularJS, JavaScript, jQuery, and Node.js.
- Expert knowledge of and experience with object-oriented design and programming (OOP) concepts, applied primarily in Python.
- Strong experience in the design, analysis, implementation, testing, development, and maintenance of business applications using Model View Controller (MVC) architectures such as Django and Flask.
- Practiced test-driven development with Django and Flask unit testing and Nose.
- Hands-on experience writing microservices.
- Hands-on experience writing asynchronous tasks using Celery with Redis and RabbitMQ.
- Hands-on experience with version control tools such as Perforce and Git.
- Worked on big data technologies such as Confidential Borg, GFS, Flume, Bigtable, and MapReduce.
- Data analysis: comfortable with NumPy, pandas, and Jupyter notebooks.
- Comfortable with basic techniques of statistical analysis and machine learning.
- Intermediate working familiarity with Hadoop and Spark.
- Good hands-on experience with cloud automation using Ansible, Terraform, and Kubernetes.
- Sound knowledge of Selenium and Robot Framework for automating web application testing.
- Good hands-on experience writing automation scripts in Python, Perl, and shell.
- Experienced in RESTful and micro web services.
- Experienced in understanding and designing highly scalable, distributed systems for running web applications and web services (e.g., cloud computing), including RESTful web service API design.
- Good understanding of network and VPN concepts.
- Good understanding of Node.js and the MEAN stack.
- Hands-on experience spinning up Docker containers and Docker Swarm, and writing Docker Compose files.
- Experience writing subqueries, stored procedures, triggers, cursors, and functions on MySQL and PostgreSQL databases.
- Experience with the NoSQL database MongoDB.
- Expert-level understanding of the AWS cloud computing platform and related services.
- Expert-level understanding of scalable infrastructure design principles.
- Expert understanding of distributed data storage and processing technologies, particularly ColumnIO databases.
- Expert-level understanding of Linux/Unix administration and internals.
- Worked in Agile and Waterfall methodologies, delivering high-quality deliverables on time.
TECHNICAL SKILLS:
Frameworks: Django, web2py, Flask, Spring, Pylons, and Bootstrap (CSS)
Web Technologies: HTML, CSS, DOM, SAX, JavaScript, jQuery, AJAX, XML, AngularJS
Programming Languages: Python 3.5 & 2.7, Java, SQL, and PL/SQL.
Version Control: Git, Perforce
Application/Web servers: Apache Tomcat, Nginx, Apache2, Gunicorn
Databases: MySQL, PostgreSQL, MongoDB, Firebase
IDEs/Development Tools: Eclipse, PyCharm, and Sublime Text.
Cloud: AWS, Confidential, Azure, Rackspace Cloud, OVH, DigitalOcean, Heroku, and Scaleway.
Operating Systems: Ubuntu, CentOS, macOS.
Protocols: TCP/IP, HTTP/HTTPS, SOAP, SNMP, SMTP
CI/CD: Jenkins
PROFESSIONAL EXPERIENCE:
Confidential
Application Engineer
Responsibilities:
- Developed Flask REST APIs for retrieving parts information from hardware test suites.
- Utilized Flask and the Flask-RESTful, JWT, and SQLAlchemy extensions.
- Used Bootstrap and Flask-WTF combined with JavaScript to develop web forms.
- Designed interactive web pages for the front end of the web application using HTML, JavaScript, AngularJS, jQuery, and AJAX, and implemented CSS for an improved look and feel.
- Involved in developing tools for migrating data from BigQuery to Spanner.
- Deployed the project on Confidential App Engine.
- Wrote BigQuery statements using Python APIs for capturing open test process results.
- Developed a scalable REST API backend using Flask.
- Developed RESTful microservices hosted on GCP using Kubernetes.
- Developed a user interface to consume the microservices.
- Created Docker images and deployed them on Kubernetes.
- Worked with BigQuery, Dremel, GoogleSQL, Plx, and Python to help build out analytics pipelines and dashboards.
- Implemented Confidential protocol buffers (protobuf) for hardware vendors to upload large data files.
- Established a data pipeline for supplier performance metrics to evaluate and monitor suppliers.
Environment: Python, Flask, SQLAlchemy, Confidential App Engine, BigQuery, Memcached, CherryPy, Gerrit, PyCharm, MySQL, Confidential Spanner, pandas, Confidential Visualization, GCP, Docker, Kubernetes, App Engine flex, microservices, Plx, protobuf
Confidential
Consultant
Responsibilities:
- Developed a GUI using Python and Django for dynamically displaying the team's bug tracking.
- Developed applications with a RESTful architecture using Node.js for real-time data.
- Responsible for setting up the React framework for UI development.
- Wrote and executed various PostgreSQL database queries from Python.
- Added new functionality to existing Python and Django application.
- Deployed application in Orchard (Heroku).
Environment: Python, Django, React, Node.js, npm, Orchard (Heroku), PyCharm, PostgreSQL.
Confidential
Role: Founder and developer
Responsibilities:
- Single-handedly built and developed the “Confidential” mobile Confidential using Python/Django and the Ionic Framework for Android and iOS.
- Used Python libraries such as Beautiful Soup and Scrapy.
- Built integrations with a number of third parties, making heavy use of asynchronous processing with Celery and RabbitMQ.
- Designed the front-end UI using HTML, AngularJS, CSS, and JavaScript.
- Involved in Continuous Integration (CI) and Continuous Delivery (CD) process implementation using Jenkins and Docker along with shell scripts.
- Developed a scalable REST API backend using the Django REST Framework.
- Created Docker images and deployed them on Kubernetes.
- Performed performance testing of RESTful APIs using Apache JMeter.
- Wrote Python libraries for Robot Framework for test automation.
- Designed and architected database solutions that are highly available, scalable, and efficient using PostgreSQL.
Environment: Python, Django 1.8, PostgreSQL, Docker, BitBucket, Codeship, Kubernetes, OVH, Facebook API, Android Studio, PyCharm, iOS Xcode, Firebase, OneSignal, Celery, RabbitMQ, Confidential Maps API
Confidential
Sr Software Engineer
Responsibilities:
- Designed and developed a custom DNS filtering service in Python using Twisted.
- Built a lookup engine that consumes a JSON search description, builds an equivalent SQL query, and executes the query on a sharded MySQL database system; this is used for filtering all URL requests from RESTful API services.
- Served as AWS Architect for applications hosted in AWS, including VPC, EC2, ELB, and Auto Scaling, with CloudWatch metrics integration.
- Migrated the lookup database from MS SQL Server to Amazon RDS.
- Migrated API services from a Windows environment to Ubuntu.
- Scraped website “meta titles” data using Scrapy.
- Developed Scrapy framework (Python) based web scraping pipelines to schedule incremental downloads of public data via HTTP and API calls.
- Defined and executed feeds, processes, data pipelines, and jobs, mirroring between production and testing clusters using Falcon.
- Assisted in getting an OEM website/URL content filtering service up and running on different PaaS platforms.
- Designed CloudFormation scripts for provisioning AWS resources (IAM, EC2, S3, Route 53, SNS, RDS, ELB, and Auto Scaling) that make calls to Puppet scripts, which handle the rest (provisioning/configuration of servers on the instances).
- Single-handedly built and developed scalable API services to create a professionally used website/URL content filtering service.
- Created end-to-end Internet of Things (IoT) solutions, including software and hardware, using Raspberry Pi.
- Built a command-line Confidential using the Click framework with RESTful API layers.
- Crawled data out of HTML and XML files with Scrapy/Beautiful Soup.
- Interacted with web Confidential using Flask.
- Performed data analysis using Python pandas.
- Processed data records and returned computed results to MongoDB.
- Worked with structured and unstructured data, involving advanced analytics, machine learning models, and big data techniques using Python (pandas, NumPy, SQL), AWS, Hadoop HDFS, YARN, Hive, Spark, and Kafka, along with Tableau, to evaluate data for anomaly detection and trend analysis.
- Responsible for continuous integration and automation using Jenkins.
Environment: Python, Django, MySQL, MS SQL Server, Amazon RDS, EC2, ELB, Memcached, OpenVPN, SQLite, Java, Tomcat, Docker, Node.js, Express, Ubuntu, Scrapy, Route 53, AWS EC2 Container Service, AWS CodeCommit, GitHub, BitBucket, Click framework, Hadoop, Jenkins.
Confidential
Lead Developer
Responsibilities:
- Served as the lead developer for internal tools and dashboards for Confidential Maps operations. I personally worked with managers to create custom software products and dashboards tailored to their daily needs. In the process, I implemented multiple techniques and practices to help improve geo-process Maps productivity for managers and users, while managing a small development team.
- Developed web applications using the Django/Flask model-view-controller (MVC) architecture.
- Developed web-based Django applications and integrated them with the Apache web server to provide stronger authentication, authorization, and SSO to the end user.
- Performed efficient delivery of code based on the principles of Test-Driven Development (TDD) and continuous integration, in keeping with Agile software methodology.
- Applied different testing methodologies, including unit testing, integration testing, and web application testing.
- Developed test scripts for automation with Selenium.
- Designed and maintained databases using Python, and developed a Python-based RESTful web service API using Flask, SQLAlchemy, and PostgreSQL.
- Designed and developed the UI of the website using HTML, CSS, and JavaScript.
- Developed entire front-end and backend modules using Python on Django Web Framework.
- Designed and implemented a data ETL pipeline in Python for analytics, tracking each operator's data and performance.
- Wrote a mid-tier backend API in Python to provide RESTful web services that deliver data to our global teams in India, the USA, Ireland, and Tokyo, instead of requiring each site to maintain its own data. My tier combines everything into a JSON response for building dashboards, providing an additional layer of indirection and eliminating the duplicate work each dashboard would otherwise have to perform to get analytical data.
- Developed complex reports on a massive dataset involving complicated joins and filters in order to develop a diverse set of metrics and enable multiple levels of drilling which included working closely with Data Modeling, ETL, and Testing work streams.
- Analyzed large data sets by running Bigtable query tools and Plx scripts.
- Developed MapReduce jobs in Python to analyze logs collected from Bigtable and generate reports.
- Developed mock-up Tableau Dashboards based on preliminary client requirements by leveraging sample source data and building/structuring dummy data in Excel.
- Wrote Puppet config files for automation.
- Rewrote existing Python/Django modules to deliver specific formats of data.
- Used the Django database API to access database objects.
- Created an internal Plx dashboard using SQL and Tableau to help managers track weekly demand, prospect, and productivity data.
- Populated GFS (similar to HDFS) with huge amounts of data using Flume (similar to Apache Spark).
- Set up automated Borg cron jobs to populate Bigtable data into the database.
- Automated the reporting for quarterly presentations.
- Wrote Python scripts to parse XML documents and load the data into the database.
- Performed troubleshooting and fixed and deployed many Python bug fixes for the two main applications that were a primary source of data for both customers and the internal customer service team.
- Used the pandas library for statistical analysis.
- Managed large datasets using pandas DataFrames and MySQL.
- Performed risk analysis of potential points of failure (database, communication points, file system errors).
- Troubleshot process execution and worked with other team members to correct issues.
- Actively worked as a part of team with managers and other staff to meet the goals of the project in the stipulated time.
- Interacted with QA to develop test plans from high-level design documentation.
Environment: Python, Django 0.96 to 1.6, HTML/CSS, Bigtable, MySQL, Eclipse, Linux, Shell Scripting, jQuery, GitHub, AngularJS, Borg, Confidential Cloud Platform, Java, Confidential Maps API, Ubuntu
Confidential
Linux Support Engineer
Responsibilities:
- Troubleshot client issues via email tickets and live chats.
- Resolved VPS issues.
- Resolved domain propagation issues by editing zone files.
- Maintained servers such as DNS, DHCP, web servers (Apache), and LAMP stacks; administered and monitored Linux servers remotely.
- Managed file sharing services (FTP, NFS, Samba), installed software on Linux machines, handled web servers (Apache) and MySQL, managed user accounts (FTP), remote desktop connections over SSH, and backups.
- Took backups using Python and shell scripting.
- Proficient with cPanel and WHM (setting up customer accounts, DNS, IPs, etc.).
- Performed third-party software installations: CMSs (e-commerce, PHP-Nuke, Drupal), Fantastico, RVSkin, ImageMagick, NetPBM, and Gallery software.
- Set up MySQL and MS SQL Server databases for web applications.
- Performed hosting administration with WHM and cPanel, and troubleshot network problems.
Environment: Python, Shell Scripting, Linux, Red Hat, CentOS