
Python AWS Developer Resume


SUMMARY

  • Around 6 years of experience as a Python Developer with analytical programming using Python and Django.
  • Skilled in debugging/troubleshooting issues in complex applications
  • Developed consumer-based features and applications using Python, Django, and HTML.
  • Experience in Cloud (Azure, AWS, GCP), DevOps, Configuration management, Infrastructure automation, Continuous Integration & Continuous Delivery (CI/CD).
  • Working with various Python Integrated Development Environments (IDEs) such as PyCharm and Spyder.
  • Experience in application software development and design, object-oriented programming, technical documentation, software testing, and debugging.
  • Strong working knowledge of developing RESTful web services and microservices using Flask, Django, API Gateway, and Apigee.
  • Experience in building and managing SOAP, REST, and GraphQL APIs both on AWS and on premises.
  • Experience in design, development, and deployment of enterprise applications for the J2EE platform using Java, J2EE, Hibernate 3.0, Spring, JPA, Web Services, EJB 3.0, XML, JavaScript, SQL, and HTML.
  • Exceptional ability to work independently as well as lead a team of data engineers building modern data pipelines with a serverless strategy in AWS.
  • Built web-based apps using Django for insurance premium calculations, implementing them with a microservices architecture.
  • Experience in AWS Cloud Computing services such as EC2, S3, Lambda, API Gateway, DynamoDB, EBS, VPC, ELB, Route 53, CloudWatch, Security Groups, CloudTrail, IAM, CloudFront, Snowball, EMR, and Glacier.
  • Experience deploying on Amazon Web Services using CodeCommit and CodeDeploy to EC2 instances of various flavors, including Amazon Linux AMI, Red Hat Enterprise Linux, SUSE Linux, Ubuntu Server, and Microsoft Windows Server 2012.
  • Extensive experience with configuration management tools such as Ansible and Chef (integrated with OpsWorks in AWS).
  • Experience in creating one-click deployments using CloudFormation, Serverless, and Terraform.
  • Studied and stayed current on features and functionality of PostgreSQL and RedshiftDB.
  • Experience in upgrading and migrating various versions of PostgreSQL database on different platforms.
  • Responsible for backup, recovery, and upgrades of all PostgreSQL databases; ETL and data warehouse design experience.
  • Proficient with PostgreSQL procedural languages (PL/pgSQL).
  • Developed entire frontend and back-end modules using Python on Django Web Framework.
  • Good experience in Linux Bash scripting: created and sourced global configuration modules on a Linux server and implemented exception traps using Linux signals.
  • Good experience in software development in Python (libraries used: NumPy, SciPy, Matplotlib, python-twitter, pandas DataFrame, network, urllib2, and MySQLdb for database connectivity).
  • Experienced in developing web-based applications using Python and HTML.
  • Experience in developing web applications following the Model View Controller (MVC) architecture using the server-side framework Django.
  • Vast experience in dynamic programming languages and implementation of design patterns such as MVC, Factory, and Decorator.
  • Experience with list comprehensions and Python built-in functions such as map, filter, and lambda (a brief illustrative sketch follows this list).
  • Experienced in Agile methodologies, Scrum stories, and sprints in a Python-based environment, along with data analytics, data wrangling, and Excel data extracts.
  • Work experience in various environments, including Python, Django, HTML/CSS, MS SQL Server 2012, Script, Eclipse, Git, GitHub, AWS, Linux, and shell scripting.
  • Experienced in Unit testing, System Integration testing (SIT), User acceptance testing (UAT), Functional testing.
  • Performed code reviews and implemented best Pythonic programming practices.
  • Good experience in handling errors/exceptions and debugging the issues in large scale applications.
  • Implemented a CI/CD pipeline using Azure DevOps (VSTS, TFS) in both cloud and on-premises environments with Git, MSBuild, Docker, and Maven, along with Jenkins plugins.
  • Extensive knowledge in Marketing and SCM Domains.
  • Solid understanding of NOSQL and RDBMS
  • Experience in developing MapReduce programs using Apache Hadoop for analyzing big data as per requirements.
  • Experience with major Hadoop ecosystem components such as Pig, Hive, HBase, Sqoop, and Kafka, monitoring them with Cloudera Manager and Ambari.
  • Expertise in services such as Azure Kubernetes and AWS Elastic container service.
  • Good exposure to UI design using Bootstrap, HTML, CSS, and JavaScript. In-depth knowledge of data structures and algorithms and design patterns; proficient in UNIX shell scripting, Python scripting, and SQL query building (queries with joins, subqueries, correlated queries, and analytical queries).
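For illustration of the list-comprehension and map/filter/lambda usage mentioned above, a minimal, self-contained sketch; the premium values are hypothetical and chosen only for the example:

    # Keep only premiums above 1000 and apply a 5% surcharge, first as a list
    # comprehension, then with the map/filter/lambda built-ins for comparison.
    premiums = [1200.0, 850.5, 2300.0, 410.25]

    surcharged = [p * 1.05 for p in premiums if p > 1000]

    surcharged_fn = list(map(lambda p: p * 1.05, filter(lambda p: p > 1000, premiums)))

    assert surcharged == surcharged_fn  # both approaches produce the same result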

TECHNICAL SKILLS

Programming Languages: Python, C

Web Technologies: HTML5, CSS3, XML, Bootstrap 3, AJAX, DOM, Spring Boot, jQuery, Angular.js, ServiceNow

Web Servers: WebSphere, JBoss, WebLogic, Apache Tomcat

Application Servers & Services: Tomcat, SOAP, RESTful services, SOA

Databases: PostgreSQL, SQL, MySQL, Oracle, MongoDB

Frameworks: Django, Flask, Pyramid

Tools & IDEs: Visual Studio, Eclipse, PyTest, PyCharm, Xcode, IDLE

Python Libraries: NumPy, Pandas, matplotlib

Operating systems: Windows, Linux

Automation tools: Jenkins, Chef, Puppet, Ansible, Docker, Kubernetes, Terraform

Methodologies: Agile, Scrum, Waterfall

Cloud services: AWS S3, EC2, Amazon EMR, Google Cloud Platform

Monitoring Tools: Splunk, CloudWatch

PROFESSIONAL EXPERIENCE

Confidential

Python AWS Developer

Responsibilities:

  • Developed views and templates with Python and Django's view controller and templating language to create a user-friendly, high-performing interface.
  • Designed new features and improvements to existing features.
  • Wrote code using Python, Django, MySQL, and other applicable technologies to build application features.
  • Administering the application, related databases and the hosting environments.
  • Fixing Bugs and performing any other remediation as necessary
  • Providing technical support to users of the application
  • Providing user documentation on features of the application.
  • Involved in setting up Jenkins Master and multiple slaves for the entire team as a CI tool as part of Continuous development and deployment process
  • Responsible for managing and supporting Continuous Integration (CI) using Jenkins
  • Used the Django framework to develop the application and built all database mapping classes using Django models.
  • Helped create interactive API documentation for specific Python SDK methods.
  • Experienced with the Python SDK, a Python extension module that provides users the ability to write custom requirements.
  • Identified improvements to enhance CI/CD
  • Developed a framework for converting existing PowerCenter mappings to PySpark (Python and Spark) jobs.
  • Created PySpark data frames to bring data from DB2 to Amazon S3.
  • Provided guidance to the development team working on PySpark as an ETL platform.
  • Optimized PySpark jobs to run on a Kubernetes cluster for faster data processing.
  • Troubleshot non-running pod issues in Kubernetes.
  • Helped troubleshoot a Kubernetes issue where load balancers were plumbed inconsistently across the cluster nodes.
  • Worked on development of the company's internal CI system, providing a comprehensive API for CI/CD.
  • Refactored existing batch jobs and migrated existing legacy extracts from Informatica to Python based micro services and deployed in AWS with minimal downtime.
  • Developed the application using Spring Web MVC and other components of the Spring Framework, with the Spring Core DispatcherServlet as the controller; also implemented dependency injection using the Spring framework.
  • Created AWS Security Groups for deploying and configuring AWS EC2 instances.
  • Added support for Amazon AWS S3 and RDS to host files and the database into Amazon Cloud.
  • Debugged the application by following messages in log files to identify errors when they occurred.
  • Assisted with development of web applications in Flask.
  • Developed APIs by modularizing existing Python modules with the help of the PyYAML library.
  • Developed and tested many features for dashboard using Flask, CSS and JavaScript.
  • Developed the backend of the application using the Flask framework.
  • Developed Restful microservices using Flask and Django
  • Used Celery with RabbitMQ, MySQL, Django, and Flask to create a distributed worker framework.
  • Designed and maintained databases using Python and developed Python-based APIs (RESTful web services) using Flask (a brief illustrative sketch follows this list).
  • Worked on developing backend services with Node.js and Python in AWS Lambda for creating microservices.
  • Worked on a project driven by Amazon Connect, where we enabled communication between Lex, Lambda, and the customer.
  • Observed data behavior and made reusable scripts to help other teams handle similar cases.
  • Worked on the backend team for an Android application using AppSync, Lambda, SNS, RDS Aurora Serverless, DynamoDB, and API Gateway; created mutations, resolvers, queries, and schemas for AppSync.
  • Extensively worked on data extraction from REST APIs, data munging, data modeling, loading data into Redshift, and data analysis.
  • Worked with networking teams in configuring AWS Direct Connect to establish dedicated connection to datacenters.
  • Built servers using AWS Connect, importing volumes, launching EC2 and RDS instances, and creating security groups and auto-scaling load balancers (ELBs) in the defined virtual private cloud.
  • Developed Python code for tasks, dependencies, SLA watchers, and time sensors for each job, for workflow management and automation using Airflow.
  • Responsible for designing and implementing the data pipeline using Big Data tools including Hive, Oozie, Airflow, Spark, Drill, Kylin, Sqoop, Kylo, Nifi, EC2, ELB, S3, and EMR.
  • Created Airflow scheduling scripts in Python (see the DAG sketch after this list).
  • Responsible for the ETL and scheduling process using Airflow.
  • Loaded finalized data into Redshift and automated the entire QA process so that scripts validate data scenarios and raise alerts when needed.
  • Built an interface between Django and Salesforce with REST APIs.
  • Used REST and SOAP to build web services and tested them using the requests library and Postman.
  • Developed Stored Procedures in PostgreSQL.
  • Backend experience with data processing using NoSQL stores: Redis and MongoDB.
  • Working knowledge of Redis Cache, Memcached, and RabbitMQ.
  • Designed GraphQL to build client applications by providing an intuitive and flexible syntax and system for describing their data requirements and interactions.
  • Used GraphQL to provide a complete and understandable description of the data in our API, giving clients the power to ask for exactly what they need.
  • Used GraphQL to enable powerful developer tools.
  • Used GraphQL, which makes it easy to add features or make changes to the app without touching the server code.
  • Deep experience with the design and development of Tableau visualization solutions
  • Generated tableau dashboards for sales with forecast and reference lines.
  • Generated tableau dashboards with combination charts for clear understanding.
  • Utilized PyUnit, the Python unit test framework, for all Python applications.
  • Creating unit test/regression test framework for working/new code.
  • Assisted with writing effective user stories and dividing the stories into Scrum tasks.
  • Experience with Docker, deploying applications inside software containers.
  • Experience in scientific programming using NumPy and pandas in Python for data manipulation.
  • Introduced new coding standards for evaluating sanity checks.
  • Designed a plan to implement serverless wherever required; worked on code reviews for backend code (Python and Node.js).
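For illustration of the Flask-based RESTful services referenced above, a minimal sketch; the /premiums resource, field names, and in-memory store are assumptions made only for the example, not the actual project code:

    # Minimal Flask REST sketch: an illustrative /premiums resource backed by an
    # in-memory dict standing in for the real database.
    from flask import Flask, jsonify, request, abort

    app = Flask(__name__)
    premiums = {}  # hypothetical store: {policy_id: premium_amount}

    @app.route("/premiums/<policy_id>", methods=["GET"])
    def get_premium(policy_id):
        if policy_id not in premiums:
            abort(404)
        return jsonify({"policy_id": policy_id, "premium": premiums[policy_id]})

    @app.route("/premiums", methods=["POST"])
    def create_premium():
        payload = request.get_json(force=True)
        premiums[payload["policy_id"]] = payload["premium"]
        return jsonify(payload), 201

    if __name__ == "__main__":
        app.run(port=5000)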
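The Airflow scheduling work above can be pictured with a simplified DAG sketch; the DAG id, schedule, SLA, and callables are hypothetical placeholders, not the production jobs:

    # Illustrative Airflow DAG: two dependent tasks with an SLA and retries,
    # roughly the shape of the scheduling scripts described above.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull data from the source API")

    def load():
        print("load data into Redshift")

    with DAG(
        dag_id="example_etl",
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
        catchup=False,
        default_args={"sla": timedelta(hours=1), "retries": 1},
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> load_task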

Environment: Python, Flask, AWS, Pyramid, Redis, Django, Docker, REST, GitHub, Linux, NumPy, Node.js, AJAX, ReactJS, Angular 2, Azure DevOps

Confidential

Python AWS Developer

Responsibilities:

  • Developed a Python-based microservice to extract data from systems of record into the enterprise data warehouse.
  • All microservices were written in Python, using distributed message passing via the Kafka message broker with JSON as the data exchange format.
  • Developed new RESTful API services, using Flask and API Gateway, that work as middleware between our application and the third-party APIs we consume.
  • Using Go, developed a microservice for reading large volumes of data (millions of records) from a PostgreSQL database.
  • Experience writing data APIs and multi-server applications to meet product needs using Golang.
  • Experience in writing the HTTP RESTful Web services and SOAP API's in Golang.
  • Managed large-scale, geographically distributed database systems, including relational (Oracle, SQL server) and NoSQL (MongoDB, Cassandra) systems.
  • Wrote wrapper scripts to automate deployment of cookbooks on nodes and running the Chef client on them in a Chef Solo environment.
  • Optimized Hive queries using best practices and the right parameters, with technologies such as Python and PySpark.
  • Developed Spark applications in Python (PySpark) in a distributed environment to load a huge number of CSV files with different schemas into Hive ORC tables.
  • Developed templates for AWS infrastructure as code using Terraform to build staging and production environments.
  • Built, managed, and continuously improved the build infrastructure for global software development engineering teams, including implementation of build scripts, continuous integration infrastructure, and deployment tools.
  • Troubleshot production issues pertaining to AWS cloud resources and application infrastructure.
  • Worked on reading and writing multiple data formats like JSON, on HDFS using PySpark.
  • Worked on Swagger API and auto-generated documentation for all REST calls
  • Built numerous Lambda functions using Python and automated the process using the events created.
  • Created an AWS Lambda architecture to monitor AWS S3 buckets, with triggers for processing source data (a brief illustrative sketch follows this list).
  • Experience writing and managing Swagger definitions.
  • Worked on app development using AWS AppSync with GraphQL as the API platform.
  • Built isomorphic applications using Redux with GraphQL on the server side.
  • Worked with GraphQL queries using the Apollo GraphQL library.
  • Involved in and responsible for managing more than 75 NoSQL clusters.
  • Migrated databases from SQL databases (Oracle and SQL Server) to NoSQL databases (MongoDB).
  • Involved in setting up microservices using API Gateway, Lambda, and DynamoDB that connect to the UI.
  • Converted production support scripts to Chef and tested cookbooks with ChefSpec.
  • Used Puppet server and workstation to manage and configure nodes.
  • Performed troubleshooting, fixed and deployed many Python bug fixes of the main applications that were sources of data for both devices and Lab.
  • Experience writing Infrastructure as Code (IaC) in Terraform, Azure Resource Manager, and AWS CloudFormation; created reusable Terraform modules in both Azure and AWS cloud environments.
  • Analyzed Hive tables using Spark by integrating Hive with Spark.
  • Involved in architecting, building, and maintaining highly available, secure, multi-zone AWS cloud infrastructure utilizing Chef with AWS CloudFormation and Jenkins for continuous integration.
  • Installed, configured, and automated the Jenkins build jobs for continuous integration and AWS deployment pipelines using various plug-ins such as the Jenkins EC2 plug-in and the Jenkins CloudFormation plug-in.
  • Set up and implemented the Continuous Integration and Continuous Delivery (CI/CD) process stack using AWS, GitHub/Git, and Jenkins.
  • Created recommendations on how to duplicate a subset of on-premises machines to the Azure Infrastructure as a Service (IaaS) offering to be used for disaster recovery. The analysis included the specifics of synchronizing on-premises data with SQL Server and SharePoint instances hosted in VMs.
  • Led migration of Virtual Machines to Azure Virtual Machines for multiple global business units.
  • Worked on Ansible Tower to manage Multiple Nodes and Manage Inventory for different Environments.
  • Performed client acceptance and prototyping using Azure Compute and SQL Azure instances.
  • Implemented high availability with Azure Resource Manager deployment models.
  • Set up Azure virtual appliances (VMs) to meet security requirements for software-based appliance functions (firewall, WAN optimization, and intrusion detection).
  • Researched and implemented new tools such as Kubernetes with Docker to assist with auto-scaling, continuous integration, and rolling updates with no downtime.
  • Good experience in backend application development with Django, Flask, JavaScript, Angular JS, MySQL, PostgreSQL.
  • Worked with process owners to develop workflows, implemented the workflows in service applications, administered the tools, and enhanced requests with JavaScript.
  • Created procedure guidelines used in supporting technology and application issues; responsible for maintaining and growing data held within ServiceNow, such as our users and the service catalog.
  • Captured images of virtual machines, attached disks to virtual machines, and created and managed virtual networks and endpoints in the Azure portal.
  • Maintained a farm of AWS resources using EC2 instances, ELBs, S3, EBS, Auto Scaling, and RDS; set up servers through AWS for deployment and other application uses.
  • Designed roles and groups for users and resources using AWS Identity Access Management (IAM) and managed network Security using Security Groups, and IAM.
  • Expertise in Application Deployments & Environment configuration using Chef.
  • Involved working on Ansible as configuration management tool to automate repetitive tasks and to deploy applications.
  • Installed Jenkins plugins for the Git repository, set up SCM polling for immediate builds with Maven and the Maven repository, and deployed apps using custom modules through Puppet as a CI/CD process.
  • Used version controlling systems like GIT and BitBucket.
  • Built and maintained Docker container clusters managed by Kubernetes on GCP, using Linux, Bash, Git, and Docker; utilized Kubernetes and Docker as the runtime environment of the CI/CD system to build, test, and deploy.
  • Implemented a production ready, load balanced, highly available, fault tolerant Kubernetes infrastructure.
  • Managed Kubernetes charts using Helm. Created reproducible builds of the Kubernetes applications, managed Kubernetes manifest files and Managed releases of Helm packages.
  • Loaded data into Spark RDDs, performed in-memory data computation, and implemented data transformations using DynamicFrames and Spark SQL.
  • Configured and maintained Jenkins to implement the CI/CD process and integrated the tool with Ant and Maven to schedule the builds.
  • Created Maven POMs to automate build process for the new projects and integrated them with third party tools like SonarQube.
  • Involved in infrastructure as code, execution plans, resource graph and change automation using Terraform.
  • Managed AWS infrastructure as code using Terraform.
  • Managed Amazon Web Services (AWS) infrastructure with automation and configuration management tools such as Chef; designed cloud-hosted solutions, with specific AWS product suite experience.
  • Created S3 buckets and managed policies and Utilized S3 bucket and Glacier for storage and backup on AWS.
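As a hedged illustration of the S3-triggered Lambda processing referenced in the list above, a minimal handler sketch; the bucket handling and the downstream processing step are assumptions for the example, not the production code:

    # Illustrative AWS Lambda handler for S3 object-created events: it reads the
    # bucket/key from the event and fetches the object with boto3 for processing.
    import json
    import urllib.parse

    import boto3

    s3 = boto3.client("s3")

    def lambda_handler(event, context):
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

            obj = s3.get_object(Bucket=bucket, Key=key)
            body = obj["Body"].read()

            # Placeholder for the real processing step (e.g. validate and load).
            print(f"received {len(body)} bytes from s3://{bucket}/{key}")

        return {"statusCode": 200, "body": json.dumps("processed")}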

Environment: AWS EC2, GCP, Route 53, S3, ELB, SVN, ClearCase, Maven, Ant, Gradle, Jenkins, Git, Chef, Kubernetes, WebSphere, Jira, SDLC, Docker, Nagios, shell scripts, Unix/Linux environment, Python, Spark, Spark API, Spark SQL

Confidential

Python Developer

Responsibilities:

  • Developed internal auxiliary web apps using the Python Django framework with Angular.js and the Bootstrap CSS/HTML framework.
  • Built a simple web app for reviewing sitcoms that gives users the ability to view, add, review, up/down vote, and search shows.
  • Set up rules and policies with a Node.js/Python back end.
  • Utilized Active Record eager loading to improve rendering time of index pages, incorporated up/down voting, reviewing, and several custom sorting methods for shows to provide smooth user experience.
  • Supported the development of BI portal using SQL.
  • Designed satcom solutions for Vertical markets, conducted POCs for DSNGs with major media houses and channels.
  • Involved in tokenizing sensitive data before archiving it in AWS S3 using an enterprise tokenization service, and encrypting the data before sending it over the wire to external systems.
  • Used AWS for application deployment and configuration
  • Checked the accuracy of the data being updated, ensured correct reporting, and verified data accuracy through various SQL queries.
  • Migrated data from reports generated by various vendors into PostgreSQL databases using PostgreSQL export/import procedures.
  • Managed the space and storage of the databases through various shell scripts scheduled periodically in the crontab; wrote complex shell scripts that automate the import of reports into PostgreSQL and MySQL databases.
  • Used NumPy to create tabular timestamps for easy timestamp data manipulation and retrieval (a brief illustrative sketch follows this list).
  • Cross-referenced Seat Geek and Spotify APIs to get album artwork and track previews for each artist
  • Good experience in developing web applications and implementing Model View Controller (MVC) architecture using server-side frameworks like Django.
  • Created the Linux Services to run REST web services using Shell script.
  • Involved in various phases of Software Development Life Cycle (SDLC) such as requirements gathering, modeling, analysis, design and development.
  • Built AWS and Redis servers for storing data, and performed defect analysis and interaction with business users.
  • Support customers with SDK-usage, design related queries and create high quality collateral.
  • Implementation exposure on service based, SOA, RESTful technologies.
  • Experienced in Git, Object-oriented programming, MySQL and Amazon Web Services.
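The tabular timestamp handling mentioned in the list above can be sketched roughly as follows; the column names and values are made up for illustration, using NumPy together with pandas (also listed in the skills above):

    # Illustrative sketch of tabular timestamp manipulation with NumPy/pandas.
    import numpy as np
    import pandas as pd

    # Hypothetical report rows with string timestamps.
    frame = pd.DataFrame({
        "reported_at": ["2020-01-01 08:30", "2020-01-01 17:45", "2020-01-02 09:15"],
        "amount": [120.0, 80.5, 200.0],
    })

    # Parse to datetime64 and derive columns useful for retrieval and filtering.
    frame["reported_at"] = pd.to_datetime(frame["reported_at"])
    frame["date"] = frame["reported_at"].dt.date

    # NumPy datetime64 arithmetic: hours elapsed since the start of the period.
    start = np.datetime64("2020-01-01T00:00")
    frame["hours_since_start"] = (frame["reported_at"].values - start) / np.timedelta64(1, "h")

    print(frame)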

Environment: Python, Django, PyCharm, AWS, Git, SQL, MySQL, RESTful, SOA, Linux, WebSphere, Tomcat, WebLogic, Angular.js, Bootstrap, NumPy.
