
Sr. Python Developer / AWS DevOps Engineer Resume


TX

SUMMARY

  • 8 years of IT experience as a Web/Application Developer, with analytical programming using Python, Django, MySQL, JavaScript, C/C++, and Go.
  • Experienced in object-oriented programming (OOP) concepts using Python and C++.
  • Expertise in database connectivity for Python; used MySQL, Microsoft SQL Server, and Oracle, running various queries.
  • Experience with Python libraries like Pandas, Matplotlib, and NumPy to manipulate and visualize data using interactive charts.
  • Experienced in developing forms using HTML and performing client-side validations using JavaScript, jQuery, and Bootstrap.
  • Worked on various complex modules in Python, including ODBC, pyodbc, pyETL, JSON, XML, Requests, and profiler.
  • Expert in writing Python scripts to scrape web data for data usage/collection using Beautiful Soup, Scrapy, and Selenium.
  • Experienced in building a command-line tool to interact with a RESTful API using Go.
  • Experienced in Agile methodologies, Scrum stories, and sprints in a Python-based environment, along with data analytics, data wrangling, and Excel data extracts.
  • Familiarity with MongoDB for storing/retrieving data in JSON format.
  • Created mappings in Informatica for data manipulation and loading.
  • Designed UI interfaces using Django Templates, HTML, CSS and Bootstrap.
  • Expert in business process and software development life cycle, including analysis, design, development, testing and implementation of software applications.
  • Excellent working knowledge in UNIX and Linux shell environments using command line utilities and shell scripting.
  • Expertise in Production support and Knowledge of deployment using Jenkins.
  • Good knowledge of development best practices such as code reviews, unit testing, and system integration.
  • Worked on a new web app built in React and Redux using ES6.
  • Basic knowledge of REST APIs, JSON parsing, jQuery, and AngularJS.
  • Good knowledge of Apache HTTP Server, Apache Tomcat, and WebLogic application servers.
  • Experienced in working on Big Data integration and analytics based on Hadoop, Spark, and NoSQL databases like HBase and MongoDB.
  • Experience with continuous integration and automation using Jenkins.
  • Experience with unit testing / Test-Driven Development (TDD) and load testing.
  • Good experience of software development in Python (libraries used: Beautiful Soup, NumPy, SciPy, Matplotlib, Pandas data frames, urllib2, MySQLdb for database connectivity) and IDEs: Sublime Text, PyCharm, and Visual Studio Code.
  • Designed and developed presentation layer for web applications using technologies like HTML, CSS, and JavaScript.
  • Experience in writing Sub Queries, Stored Procedures, Triggers, Cursors, and Functions on SQL and PostgreSQL database.
  • Used Python scripts to parse XML and JSON reports and load the information into the database.
  • Experienced in writing Perl scripts to extract data from text files, automate web tasks, and convert file formats.
  • Developed CloudFormation templates and launched AWS Elastic Beanstalk for deploying, monitoring, and scaling web applications on platforms such as Docker and Python.
  • Extensively worked with automation tools like Jenkins, Artifactory, and SonarQube for continuous integration and continuous delivery (CI/CD) and to implement end-to-end automation.
  • Ability to analyze complex systems and be in command of the details to provide solutions.
  • Highly motivated, dedicated, quick learner and have proven ability to work individually and as a team.
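The XML/JSON report-loading work mentioned above can be illustrated with a minimal, standard-library-only sketch; the name/value report schema and table layout here are assumptions for illustration, not the actual project schema:

```python
import json
import sqlite3
import xml.etree.ElementTree as ET

def load_reports(json_text, xml_text, db_path=":memory:"):
    """Parse a JSON report and an XML report and load the records into SQLite."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS reports (source TEXT, name TEXT, value REAL)")
    # JSON report assumed to be a list of {"name": ..., "value": ...} objects.
    for rec in json.loads(json_text):
        conn.execute("INSERT INTO reports VALUES ('json', ?, ?)", (rec["name"], rec["value"]))
    # XML report assumed to look like <report><item name="..." value="..."/></report>.
    for item in ET.fromstring(xml_text).iter("item"):
        conn.execute("INSERT INTO reports VALUES ('xml', ?, ?)",
                     (item.get("name"), float(item.get("value"))))
    conn.commit()
    return conn

conn = load_reports('[{"name": "cpu", "value": 0.75}]',
                    '<report><item name="mem" value="0.5"/></report>')
print(conn.execute("SELECT COUNT(*) FROM reports").fetchone()[0])  # 2
```

The same pattern extends to any relational target by swapping the sqlite3 connection for the appropriate DB-API driver.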

TECHNICAL SKILLS

Programming Languages: C, C++, Python 3.x, JavaScript, PHP, XML, Java, SQL.

Web Technologies: HTML/HTML5, CSS/CSS3, XML, JSON and CSS Bootstrap.

SCM Tools: Subversion, Jenkins/Hudson, Jira, TFS, Confluence, ClearCase, Git, GitHub, Artifactory.

Operating Systems: UNIX, Linux, Solaris, Windows, macOS, Ubuntu, DOS, VMware.

Version Control Systems: Confidential, SVN, Git and GitHub.

Python Frameworks: Django, Pyramid, Flask, web2py.

Database: PostgreSQL, SQL Server, MySQL, MongoDB and Oracle

Application Servers: WebLogic, JBoss, IBM WebSphere, Apache Tomcat 5.5, IIS

Development IDEs: PyCharm, PyDev Eclipse, Vim, NetBeans, MS Visio, Sublime Text, Notepad++

Methodologies: Agile, Scrum

Delivery tools: SaltStack, Chef & Puppet, Ansible

PROFESSIONAL EXPERIENCE

Confidential, TX

Sr. Python Developer / AWS DevOps engineer

Responsibilities:

  • Worked on building PySpark algorithms for different aggregations of data based on the specifications. Involved with performance and process enhancement of the PySpark framework.
  • Developed integration checks around the PySpark framework for Processing of large datasets.
  • Worked on migration of PySpark framework into AWS Glue for enhanced processing.
  • Wrote various automation scripts for automation of data processing on AWS Glue.
  • Scripted CloudFormation templates for auto-provisioning of resources.
  • Worked on various quality-control checks for data processing using Spark SQL.
  • Built a data processing algorithm for incremental processing of data that is aggregated each week.
  • Involved in the complete software development lifecycle (SDLC) to develop the application.
  • Worked on data transformation like cleaning, partitioning of data for enhanced processing of data.
  • Worked on setting up a SEE5 data mining server on AWS for pattern finding in the data.
  • Extensively worked on AWS Athena database to provide various client reports.
  • Involved with migration of AI based project from Google Cloud Platform into AWS.
  • Built schedule-based AWS Lambda functions for automatic build and run of data processing pipelines.
  • Worked on building a standalone UI tool using wxPython for creating custom requirement JSON templates to feed into AWS Glue for ETL processing of data.
  • Worked on setting of automated loading of data into SQL database using AWS Glue and Step Functions.
  • Worked on integrating different AWS components like EC2 and Lambdas to work with AWS Athena.
  • Scheduled an ETL process through an S3 event trigger to load data from S3 into tables in AWS Redshift using AWS Lambda.
  • Created S3 buckets and managed their policies; utilized S3 buckets and Glacier for storage and backup on AWS.
  • Built an integration testing system that exercised the entire framework on each code check-in.
  • Experienced in writing unit tests with Bazel, Google's open-source automated build and test framework.
  • Implemented Security features around AWS CodeCommit repository like pull requests using AWS IAM policies.
  • Experienced in using different community and enterprise IDE’s (Integrated Development Environments) and tools like PyCharm and VSCode
  • Involved in translating technical documents into engineering Specifications.
  • Coordinated with DevOps teams for bug fixes and code releases
  • Developed a rich user interface using CSS, HTML, JavaScript, and jQuery.
  • Worked on automating the repetitive tasks using Ansible.
  • Extracted data from the database using SAS/ACCESS and SAS SQL procedures, and created SAS data sets.
  • Created and modified PL/SQL scripts for data conversions.
  • Developed and maintained various automated web tools for reducing manual effort and increasing efficiency of the Global Shipping Team.
  • Created databases using MySQL, wrote several queries to extract data from the database.
  • Communicated effectively with the external vendors to resolve queries.
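The incremental weekly aggregation described above ran on PySpark and AWS Glue; the core idea — aggregate only the new week's records, then merge into running totals instead of reprocessing history — can be sketched in plain Python (the account keys and amounts are illustrative):

```python
from collections import defaultdict

def aggregate_week(records):
    """Aggregate one week's raw (key, amount) records into per-key totals."""
    weekly = defaultdict(float)
    for key, amount in records:
        weekly[key] += amount
    return dict(weekly)

def merge_incremental(running_totals, weekly_totals):
    """Merge a week's aggregates into running totals without reprocessing history."""
    merged = dict(running_totals)
    for key, amount in weekly_totals.items():
        merged[key] = merged.get(key, 0.0) + amount
    return merged

totals = {}
week1 = [("acct-a", 10.0), ("acct-b", 5.0), ("acct-a", 2.5)]
week2 = [("acct-b", 1.0)]
totals = merge_incremental(totals, aggregate_week(week1))
totals = merge_incremental(totals, aggregate_week(week2))
print(totals)  # {'acct-a': 12.5, 'acct-b': 6.0}
```

On PySpark the per-week step would be a groupBy/agg over the new partition only, with the merge writing back to the totals table.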

Environment: Python, PySpark, wxPython, Apache, AWS Glue, AWS Athena, AWS S3, AWS Step Functions, CloudFormation, pytest, Bootstrap, Flask, Oracle, PL/SQL, MySQL, MS-SQL, REST, PyCharm, Windows, Linux.

Confidential, CA

Sr. Python Developer / AWS DevOps engineer

Responsibilities:

  • Wrote Python routines to log into the websites and fetch data for selected options. Used other packages such as Beautiful soup for data parsing.
  • Worked on writing as well as reading data in CSV and Excel file formats.
  • Developed a MATLAB algorithm which determines an object's dimensions from digital images.
  • Proficient in performance analysis, monitoring, and SQL query tuning using EXPLAIN PLAN, COLLECT STATISTICS, hints, and SQL Trace in both Teradata and Oracle.
  • Hands-on experience using query tools like TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant, and Queryman.
  • Web-services backend development using Python (CherryPy, Django, SQLAlchemy).
  • Designed and configured database and backend applications and programs.
  • Participated in developing the company's internal framework in Python.
  • This framework became a foundation for rapid service development; it is based on CherryPy with GnuPG encryption (reGnuPg module) on top.
  • Used Luigi to build data pipelines, creating dependencies between tasks and making it easy to write modular code.
  • Used Service-Oriented Architecture (SOA) to build an architecture of RESTful web services carrying out actions like producing data and validating a customer.
  • Leveraged the SOA architecture to enable loosely coupled, reusable services, saving significant time.
  • Experience in creating databases, users, tables, views, functions, Packages, joins and hash indexes in Teradata database.
  • Wrote insert and update SQL in Teradata scripts and validated the load process into the target tables.
  • Used Luigi in data pipelining to enable immediate resumption after failure.
  • Used Airflow for pipeline creation to write code that initiates pipelines dynamically.
  • Created a full-service catalog system with a complete workflow using Elasticsearch.
  • Exposed basic search used by different systems via an API on top of Elasticsearch.
  • Used object-relational mapper (ORM) code library to automate the data transfer from relational tables in to objects which are being used in application code.
  • Used object-relational mapper (ORM) as a bridge between relational database tables and python objects.
  • Used ORM to switch between different relational databases whenever required.
  • Used Ansible to document application dependencies into version control.
  • Implemented Ansible to manage all existing servers and automate the build/configuration of new servers.
  • Worked with direct PostScript to get more efficient PostScript output, which prints faster than generic printer drivers.
  • Adding Virtualization nodes into OpenStack.
  • DevOps role converting existing AWS infrastructure to Server-less architecture AWS Lambda deployed via CloudFormation.
  • Experienced with event-driven and scheduled AWS Lambda functions to trigger various AWS resources.
  • Has good experience using the OpenStack CLI prompt and dashboard service.
  • Generated PostScript files as plain text using Python.
  • Used Python's split function to break down large strings into smaller strings.
  • Used PostScript commands like showpage, which forces the printer to print the currently drawn page.
  • Experience in moving data between GCP and Azure using Azure Data Factory.
  • Worked on Google Cloud Platform (GCP) services like compute engine, cloud load balancing, cloud storage and cloud SQL.
  • Developed backend of the application using the flask framework.
  • Developed and tested many features for dashboard using Flask, CSS and JavaScript.
  • Worked on ETL tasks like pulling, pushing data from and to various servers.
  • Worked on resulting reports of the application and Tableau reports.
  • Built Web application using Python, Django, Flask, JavaScript, AJAX, HTML and template languages.
  • Experience in developing and configuring slack applications.
  • Used slack API to integrate complex services with slack to get out of the box integrations.
  • Built monitoring and self-healing with InfluxDb, Bash and Python.
  • Developed dashboards to monitor druid cluster health using InfluxDb, Kafka.
  • In-depth knowledge of the Hadoop ecosystem: HDFS, YARN, MapReduce, Hive, Hue, Sqoop, Flume, Kafka, Spark, Oozie, NiFi, and Cassandra.
  • Worked on HTML5, CSS3, JavaScript, Git, REST API, MongoDB, Riak, and IntelliJ IDEA.
  • Designed and set up MongoDB environments with shards and replica sets (Dev/Test and Production).
  • Created reports for the BI team using Sqoop to export data into HDFS and Hive.
  • Created custom VB scripts for repackaging applications as needed.
  • NLP file prep for settlement: prepared files for settlement review.
  • Used AWS Redshift data warehouse to analyze data using standard SQL.
  • Experience with Amazon SQS, and Amazon Web Services like EC2, Redshift and S3.
  • Held meetings with the client and worked independently on the entire project with limited help from the client.
  • Automated deployment of micro services to pull an image from private Docker Registry and deploy Docker swarm cluster using Ansible.
  • Setup automated cron jobs to upload data into database, generate graphs, bar charts, upload these charts to wiki, and backup the database.
  • Monitoring and logging with Pingdom, Logstash, and Fluentd.
  • Experience with web services development: REST APIs / microservices.
  • Wrote scripts in Python for extracting data from HTML file.
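The HTML data-extraction scripts above can be sketched using only Python's standard-library html.parser (production scraping elsewhere in this resume used packages like Beautiful Soup; the sample markup here is illustrative):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect (href, link text) pairs for every anchor tag in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        # Only accumulate text while inside an open anchor tag.
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

parser = LinkExtractor()
parser.feed('<p><a href="/a">First</a> and <a href="/b">Second</a></p>')
print(parser.links)  # [('/a', 'First'), ('/b', 'Second')]
```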

Environment: Python 3.4/2.7, Django 1.7, HTML5, CSS, Bootstrap, jQuery, JSON, JavaScript, PostgreSQL, T-SQL, MongoDB, Ansible, SoapUI, Sqoop 1.4.6, Hive 1.2.0, Apache Hadoop 2.6, Kafka, Jython 2.7, VuGen, Oracle 11g/10i, MS Office, Google Cloud Platform (GCP), Elastic Load Balancer, Elasticsearch, WordPress, Chartbeat, Docker, MySQL, NoSQL, Oracle Warehouse Builder (OWB), Data Migrator (IBI), Google Cloud, Azure IoT Suite, Amazon S3, Redshift, Bugzilla, JIRA.

Confidential, RI

Python Developer / AWS DevOps engineer

Responsibilities:

  • Involved in preparing engineering specification with OOA and OOD.
  • Used Rational Rose Enterprise to develop Use Case diagrams, Class diagrams, Collaboration and Sequence Diagrams, State Diagrams, Data Modeling.
  • Implemented a common fault-tolerant solution for ingesting real-time data into the data lake using Kafka Streams, with schema evolution handled via the Confluent Schema Registry.
  • Performed data design and analysis in order to handle huge amounts of data.
  • Developed a generic Kafka producer which can publish any Avro message which follows coding standard events to any topic on GKS.
  • Developed application logic using Python, Jython, and JavaScript.
  • Used JMS for updating mailing plans and tracking them. Implemented the front end for a third-party web service using jQuery, HTML, AJAX, JSON, and JavaScript.
  • Used Java Server Pages for content layout and presentation with Python.
  • Developed the frontend for interaction by using the Django framework. Created Data layer in MYSQL. Extracted and loaded data using Python scripts and PL/SQL packages.
  • Supported Java application for Media portal management.
  • Involved in the development of web services using SOAP for sending and receiving data from the external interface in XML format.
  • Involved in preparing technical design document.
  • Used Connect SOAP rule to fetch the Webservices.
  • Used SOAP UI to test the external service.
  • Used JIRA for project tracking.
  • Involved in testing the application.
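The SOAP-based data exchange above can be sketched by parsing a SOAP 1.1 envelope with the standard library; the TrackResponse payload and its fields below are hypothetical illustrations, not the project's actual service contract:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def extract_body_fields(envelope_xml):
    """Pull the child element names and text out of a SOAP 1.1 Body."""
    root = ET.fromstring(envelope_xml)
    body = root.find(f"{{{SOAP_NS}}}Body")
    fields = {}
    for call in body:          # e.g. the operation response element
        for child in call:     # its individual result fields
            # Strip any namespace prefix from the tag name for readability.
            tag = child.tag.split("}")[-1]
            fields[tag] = child.text
    return fields

# Hypothetical response envelope for a mailing-plan tracking call.
response = """<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <TrackResponse>
      <planId>MP-100</planId>
      <status>SENT</status>
    </TrackResponse>
  </soap:Body>
</soap:Envelope>"""
print(extract_body_fields(response))  # {'planId': 'MP-100', 'status': 'SENT'}
```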

Environment: Java, Python 2.6, Django, Confidential, JavaScript, HTML/CSS, MySQL, PL/SQL, JDBC, Unix Shell Scripting, Red Hat Linux, WebLogic Application Server.

Confidential, NJ

Python Developer / AWS DevOps engineer

Responsibilities:

  • Worked on predictive analytics use-cases using R language.
  • Used Python unit and functional testing modules such as unittest, unittest2, mock, and custom frameworks in line with Agile software development methodologies.
  • Added a licensing feature for three products (CVU, CCLEAR, CSTOR) from scratch by going through most backend code modules, and created HTML pages to view the EULA (End User License Agreement).
  • Developed microservice APIs for individual modules and exposed the results in JSON format using the Bottle framework.
  • Involved in development of testing frameworks, used selenium web driver for Automation.
  • Worked with Docker Container, have setup an environment and used accordingly.
  • Wrote Python code embedded with JSON and XML to produce HTTP GET and POST requests for parsing HTML data from website.
  • Used JIRA for issue tracking and bug tracking for each individual sprint and used confluence to create design documents.
  • Installed Hadoop, MapReduce, HDFS, and AWS, and developed multiple MapReduce jobs in Pig and Hive for data cleaning and pre-processing.
  • Managed datasets using Pandas data frames and MySQL; queried the MySQL database from Python using the Python-MySQL connector and the MySQLdb package to retrieve information.
  • Involved in the web/application development using Python 3.5, HTML5, CSS3, JSON, and jQuery.
  • Developed and tested many features for the dashboard using Python, Java, Bootstrap, CSS, JavaScript, and jQuery.
  • Generated Python Django forms to record data from online users and used pytest for writing test cases.
  • Implemented and modified various SQL queries and Functions, Cursors and Triggers as per the client requirements.
  • Cleaned and processed third-party spending data into maneuverable deliverables in specific formats with Excel macros and Python libraries such as NumPy, SQLAlchemy, and Matplotlib.
  • Used Pandas as API to put the data as time series and tabular format for manipulation and retrieval of data.
  • Helped with the migration from the old server to Jira database (Matching Fields) with Python scripts for transferring and verifying the information.
  • Analyzed and formatted data using machine learning algorithms with Python scikit-learn.
  • Experience in Python, Jupyter, and the scientific computing stack (NumPy, SciPy, Pandas, and Matplotlib).
  • Performed troubleshooting, and fixed and deployed many Python bug fixes for the two main applications that were the main source of data for both customers and the internal customer service team.
  • Wrote Python scripts to parse JSON documents and load the data into the database.
  • Generated various graphical capacity planning reports using Python packages like NumPy and Matplotlib.
  • Analyzed various logs being generated and predicted/forecast the next occurrence of events with various Python libraries.
  • Developed a single-page application using AngularJS backed by MongoDB and Node.js.
  • Designed and maintained databases using Python, and developed a Python-based RESTful web service API using Flask, SQLAlchemy, and PostgreSQL.
  • Managed code versioning with GitHub and Bitbucket and deployment to staging and production servers, and implemented MVC architecture in developing the web application with the help of the Django framework.
  • Used Celery as a task queue and RabbitMQ and Redis as message brokers to execute asynchronous tasks.
  • Designed and managed API system deployment using a fast HTTP server and Amazon AWS architecture.
  • Developed remote integrations with third-party platforms using RESTful web services, and successfully implemented Apache Spark and Spark Streaming applications for large-scale data.
  • Built various graphs for business decision making using the Python Matplotlib library.
  • Involved in development of Web Services using SOAP for sending and getting data from the external interface in the XML format.
  • Exported test case scripts, modified the Selenium scripts, and executed them in the Selenium environment.
  • Developed entire frontend and backend modules using Python on Django Web Framework.
  • Scraped websites using Python's Beautiful Soup and parsed the XML content.
  • Output the parsed data as JSON/BSON and stored it in MongoDB.
  • Used NLTK and StanfordNLP to process text data and created offline intelligence.
  • Queried data from MongoDB and used it as input for the machine learning models.
  • Used AWS for application deployment and configuration.
  • Wrote UNIX shell scripting for automation.
  • Developed views and templates with Django view controller and template Language to create a user-friendly website interface.
  • Used JavaScript and JSON to update a portion of a webpage.
  • Developed consumer-based features using Django, HTML, and Test-Driven Development (TDD).
  • Increased the speed of pre-existing search indexes through Django ORM optimizations.
  • Developed a module to build Django ORM queries that pre-load data, greatly reducing the number of database queries needed to retrieve the same amount of data.
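The unittest/mock-based testing approach above can be sketched as follows; fetch_user_name and the /users endpoint are hypothetical illustrations (not the project's actual API), and the mock stands in for an HTTP client so the test needs no network access:

```python
import unittest
from unittest import mock

def fetch_user_name(client, user_id):
    """Look up a user via an injected HTTP client and return the display name."""
    payload = client.get(f"/users/{user_id}")
    return payload["name"]

class FetchUserNameTest(unittest.TestCase):
    def test_returns_name_and_calls_endpoint_once(self):
        # Mock the client so the unit under test is isolated from the network.
        client = mock.Mock()
        client.get.return_value = {"name": "Ada"}
        self.assertEqual(fetch_user_name(client, 7), "Ada")
        client.get.assert_called_once_with("/users/7")

# Run the suite programmatically instead of via unittest.main().
suite = unittest.defaultTestLoader.loadTestsFromTestCase(FetchUserNameTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

Injecting the client as a parameter (rather than importing it) is what makes the function trivially mockable; in TDD this test would be written before the function body.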

Environment: Python, Django, HTML5/CSS, PostgreSQL, MS SQL Server 2013, MySQL, JavaScript, Jupyter Notebook, VIM, PyCharm, Shell Scripting, AngularJS, JIRA.

Confidential

Python Developer

Responsibilities:

  • Wrote Python routines to log into the websites and fetch data for selected options.
  • Implemented code in Python to retrieve and manipulate data.
  • Used Python modules such as requests, urllib, and urllib2 for web crawling.
  • Used other packages such as Beautiful soup for data parsing.
  • Worked on writing as well as reading data in CSV and Excel file formats using Python.
  • Web-services backend development using Python (CherryPy, Django, SQLAlchemy).
  • Worked on resulting reports of the application and Tableau reports.
  • Participated in developing the company's internal framework in Python. This framework became a foundation for rapid service development.
  • Worked on HTML5, CSS3, JavaScript, AngularJS, Node.JS, Git, REST API, and MongoDB.
  • Designed and set up the MongoDB environment with shards and replica sets (Dev/Test and Production).
  • Designed and developed components using Python with Django framework.
  • Built a private VPN using Ubuntu, Python, Django, CherryPy, Bootstrap, and jQuery.
  • Created a Python-based GUI application for Freight Tracking and processing.
  • Experience in designing and developing applications in Spark using Scala.
  • Wrote scripts in Python for extracting data from HTML files.
  • Used Python and Django creating graphics, XML processing of documents, data exchange and business logic implementation between servers.
  • Developed a rich user interface using CSS, HTML, JavaScript, and jQuery.
  • Participated in the complete SDLC process.
  • Worked on automating the repetitive tasks using Ansible.
  • Extracted data from the database using SAS/ACCESS and SAS SQL procedures, and created SAS data sets.
  • Created and modified PL/SQL scripts for data conversions.
  • Developed and maintained various automated web tools for reducing manual effort and increasing efficiency of the Global Shipping Team.
  • Created databases using MySQL, wrote several queries to extract data from the database.
  • Communicated effectively with the external vendors to resolve queries.
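The CSV read/write work above can be sketched with Python's csv module (the shipment records are illustrative; in-memory buffers stand in for files on disk):

```python
import csv
import io

def rows_to_csv(rows, fieldnames):
    """Write a list of dict rows out as CSV text with a header line."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

def csv_to_rows(text):
    """Read CSV text back into a list of dict rows keyed by the header."""
    return list(csv.DictReader(io.StringIO(text)))

rows = [{"shipment": "S-1", "weight": "12.5"}, {"shipment": "S-2", "weight": "3.0"}]
text = rows_to_csv(rows, ["shipment", "weight"])
assert csv_to_rows(text) == rows  # round-trips losslessly (values stay strings)
print(csv_to_rows(text)[0]["shipment"])  # S-1
```

For real files, replace the StringIO buffers with open(path, newline="") handles; Excel formats would use a third-party reader such as openpyxl.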

Environment: Python, Django, MySQL, Windows, Linux, HTML, CSS, jQuery, JavaScript, Apache, Linux, Quality Centre, Ansible, PL/SQL.
