
AWS Data Engineer Resume


Greensboro, North Carolina

SUMMARY

  • 7+ years of extensive IT experience in all phases of the Software Development Life Cycle (SDLC), with skills in data analysis, design, development, testing, and deployment of software systems.
  • 3+ years of strong experience working with Apache Hadoop ecosystem components such as MapReduce, HDFS, Hive, and Spark.
  • Hands-on experience with AWS services including S3, EC2, EMR, SNS, SQS, Lambda, Redshift, Data Pipeline, Athena, AWS Glue, S3 Glacier, CloudWatch, CloudFormation, IAM, AWS Single Sign-On, Key Management Service, AWS Transfer for SFTP, VPC, SES, CodeCommit, and CodeBuild.
  • Utilized AWS services with a focus on big data analytics, enterprise data warehousing, and business intelligence solutions to ensure optimal architecture, scalability, and flexibility.
  • Developed Spark jobs to process JSON, CSV, and text files; used Spark DataFrames to perform joins, apply transformations, and store the results in S3 (see the sketch after this list).
  • Used AWS Athena to query records in S3 for analytical purposes.
  • Experience with the AWS Cloud platform and its features, including EC2, AMI, EBS, CloudWatch, AWS Config, Auto Scaling, IAM user management, and AWS S3.
  • Experienced with event-driven architectures; worked on AWS Lambda functions to trigger various AWS resources.
  • Experience with data formats such as JSON, Avro, and Parquet, and compression codecs such as Snappy and bzip2.
  • Used AWS Data Pipeline to automate daily jobs that load raw data into the data lake and from the data lake into Redshift.
  • Good knowledge of data warehousing concepts and ETL processes.
  • Excellent experience with Hadoop architecture and its components, such as HDFS, NameNode, DataNode, task nodes, and the MapReduce programming paradigm.
  • Created CloudFormation templates for provisioning services and used nested stacks to deploy end-to-end architectures.
  • Sound exposure to the retail domain.
  • Experience with large data sets: regularly transforming and querying large tables and writing high-performance, reliable, and maintainable code.
  • Assisted data architects in designing plans to perform ETL procedures for loading data into the Enterprise Data Warehouse (EDW) from heterogeneous sources and in designing data models; experienced in preparing data mapping documents.
  • Worked with data owners, business units, the Data Integration team, and customers in a fast-paced Agile/Scrum environment.
  • Facilitated Scrum ceremonies such as Sprint Planning, Sprint Review, Sprint Retrospective, Daily Scrum, and Backlog Grooming to improve the team's performance and maximize value while focusing on the Sprint Goal.
  • Extensive experience in developing web pages quickly and effectively using jQuery, HTML5, CSS3, responsive web design, Bootstrap, and AngularJS.
  • Expertise in the design and development of web and enterprise applications using J2EE technologies such as JSP, JavaBeans, Servlets, JDBC, EJB, JMS, JVM, and JNDI, along with JavaScript, HTML, AJAX, jQuery, and Web Services.
  • Developed SOAP and REST web services using Java, Hibernate, JAX-WS, JAX-RS and JAXB.
  • Expertise in OOA, OOD, SDLC, Java application development, distributed application development, object-oriented programming (OOP), UML, and Rational Rose 2000.
  • Experience with RDBMS concepts and in writing queries, functions, triggers, stored procedures, cursors, and PL/SQL packages with databases such as IBM DB2, Oracle 10g/11g, SQL Server, and MySQL, plus NoSQL stores (Cassandra and MongoDB).
  • Expertise in implementing and integrating frameworks such as Struts, Spring MVC, Spring IoC, Spring AOP, Hibernate, and JPA.
  • Experienced in Developing and Deploying Applications using JBoss, WebLogic, WebSphere, and Apache Tomcat.
  • Experience working on projects utilizing test-driven development (TDD) and acceptance test-driven development (ATDD) methodologies.
  • Sound knowledge of version control systems such as Git, CVS, Subversion, ClearCase, and VSS.
  • Committed to excellence; a self-motivated, fast-learning team player and prudent developer with strong problem-solving and communication skills.
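
To make the Spark bullet above concrete, here is a minimal sketch in Java of the kind of job described: it reads JSON and CSV inputs from S3, joins them with DataFrames, and writes the result back to S3 as Parquet. The bucket paths and column names are hypothetical.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Minimal sketch of a Spark job of the kind described above; the bucket
// paths and column names (customer_id, id, status) are hypothetical.
public class OrderEnrichmentJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("order-enrichment")
                .getOrCreate();

        // Read raw JSON and CSV inputs from S3.
        Dataset<Row> orders = spark.read().json("s3://raw-bucket/orders/");
        Dataset<Row> customers = spark.read()
                .option("header", "true")
                .csv("s3://raw-bucket/customers/");

        // Join the two sources and apply a simple transformation.
        Dataset<Row> enriched = orders
                .join(customers, orders.col("customer_id").equalTo(customers.col("id")))
                .filter(orders.col("status").equalTo("COMPLETE"));

        // Store the result back in S3 as Parquet with Snappy compression.
        enriched.write()
                .option("compression", "snappy")
                .parquet("s3://refined-bucket/orders_enriched/");

        spark.stop();
    }
}
```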

TECHNICAL SKILLS

Languages: Java, J2EE, C, C++, SQL, PL/SQL, NoSQL.

Big Data: Hadoop, HDFS, MapReduce, Hive, Spark.

AWS Services: S3, EC2, EMR, SNS, SQS, Lambda, Redshift, Data Pipeline, Athena, AWS Glue, S3 Glacier, CloudWatch, CloudFormation, IAM, AWS Single Sign-On, Key Management Service, AWS Transfer for SFTP, VPC, SES, CodeCommit, CodeBuild.

Java Technologies: Java, J2EE, JDK 1.4/1.5/1.6/1.7/1.8, JDBC, Hibernate, JSP 1.2/2.0, Servlets, JMS, Struts, Spring Framework, JavaBeans, Web Services (REST, SOAP), AJAX, JNDI, XML Parsers, EJB.

Frameworks: MVC, Spring 2.0/2.5/3.0/3.5, Hibernate 3.0/3.2, Struts 1.2/2.0.

Web Technologies: JavaScript, HTML, DHTML, XHTML, CSS, Bootstrap, AngularJS, jQuery.

Servers: WebLogic, WebSphere, JBoss, Apache Tomcat.

Databases: Oracle 12c/11g/10g/9i/8i, DB2, SQL Server, MySQL, MS Access, PostgreSQL, MongoDB.

Tools & Utilities: Eclipse, RAD, JBoss Dev Studio, TOAD, IntelliJ, NetBeans, EditPlus, Dreamweaver, SoapUI.

Build & CI Tools: Ant, Maven, Jenkins.

Testing & Logging: JUnit, Mockito, Selenium, Cucumber, Log4j.

Defect Tracking Tools: Jira, Rally.

Version Control Systems: Subversion (SVN), Concurrent Versions System (CVS), Git, IBM Rational ClearCase.

PROFESSIONAL EXPERIENCE

Confidential, Greensboro, North Carolina

AWS Data Engineer

Responsibilities:

  • The objective of this project was to build a data lake as a cloud-based solution in AWS using Amazon S3 and make it the single source of truth.
  • Worked on core AWS services (S3, EC2, ELB, EBS, Route 53, VPC, Auto Scaling, etc.), deployment services (CloudFormation), and security practices (IAM, CloudWatch, and CloudTrail).
  • Used SFTP to transfer raw data files from the source system to AWS S3.
  • Worked on ETL migration services by developing and deploying AWS Lambda functions that form a serverless data pipeline whose output is registered in the Glue Data Catalog and can be queried from Athena (see the Lambda sketch after this list).
  • Actively participated in decision making and regularly interacted with the business analysts and the development team to gain a better understanding of the business process, requirements, and design.
  • Migrated raw data to Amazon S3 and performed refined data processing.
  • Created Hive tables (internal/external) for loading data and wrote queries that run internally in MapReduce to process the data.
  • Experienced with event-driven architectures; worked on AWS Lambda functions to trigger various AWS resources.
  • Created DDLs for target and staging tables in both the data lake and Redshift, with partition columns and distribution and sort keys for better query performance (see the DDL sketch after this list).
  • Used AWS Athena to query records in S3 for analytical purposes.
  • Developed Spark code to apply the business transformation rules, reading data from multiple database sources and loading the results into the Glue Data Catalog, which holds the metadata tables for the conformed, enriched, and trusted layers stored in Amazon S3.
  • Used Data Pipeline to perform transformations on EMR clusters and to load the transformed raw data into the data lake and from the data lake into Redshift.
  • Created views on Redshift tables to give the business team and users access for reporting purposes.
  • Worked with Hadoop in AWS using EMR; set up CloudWatch alarms for monitoring and alerting.
  • Wrote to the Glue metadata catalog, which in turn enables querying the refined data from Athena, achieving a serverless querying environment.
  • Wrote CloudFormation templates in JSON for network and content delivery of the AWS cloud environment.
  • Worked with different file formats such as CSV, JSON, and text.
  • Worked with various HDFS file formats such as Parquet, ORC, and Avro for serialization and deserialization.
  • Created large datasets by combining individual datasets using various inner and outer joins in SQL.
  • Worked on PL/SQL queries for processing the data and for optimization.
  • Tested data mappings and specifications to ensure data formats and schemas matched the data warehouse per the business requirements and to catch any changes in the underlying data.
  • Involved in Data Validation and fixing discrepancies by working in coordination with the Data Integration and Infra Teams.
  • Worked with data owners, business units, the Data Integration team, and customers in a fast-paced Agile/Scrum environment.
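
A minimal sketch of the serverless-pipeline piece described above, assuming the AWS SDK for Java v1 and a Lambda handler wired to S3 event notifications; the crawler name and bucket layout are hypothetical:

```java
import com.amazonaws.services.glue.AWSGlue;
import com.amazonaws.services.glue.AWSGlueClientBuilder;
import com.amazonaws.services.glue.model.StartCrawlerRequest;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;

// Hypothetical handler: when a new raw file lands in S3, start a Glue crawler
// so the object is registered in the Glue Data Catalog and queryable from Athena.
public class RawDataCatalogHandler implements RequestHandler<S3Event, String> {

    private static final String CRAWLER_NAME = "raw-datalake-crawler"; // assumed name

    private final AWSGlue glue = AWSGlueClientBuilder.defaultClient();

    @Override
    public String handleRequest(S3Event event, Context context) {
        String key = event.getRecords().get(0).getS3().getObject().getKey();
        context.getLogger().log("New raw object: " + key);

        // Crawl the landing prefix so the new table/partition shows up in the catalog.
        glue.startCrawler(new StartCrawlerRequest().withName(CRAWLER_NAME));
        return "started crawler for " + key;
    }
}
```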
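
And a sketch of the Redshift staging-table DDL mentioned above, shown here executed over JDBC as one possible vehicle; the cluster endpoint, credentials, and table definition are hypothetical, with DISTKEY and SORTKEY chosen to match common join and filter columns. The Redshift JDBC driver must be on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Sketch of creating a Redshift staging table with distribution and sort keys;
// the JDBC URL, credentials, and table/column names are hypothetical.
public class CreateStagingTable {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:redshift://example-cluster.abc123.us-east-1.redshift.amazonaws.com:5439/dev";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement()) {
            stmt.executeUpdate(
                "CREATE TABLE IF NOT EXISTS stg_orders ("
                + "  order_id    BIGINT,"
                + "  customer_id BIGINT,"
                + "  order_date  DATE,"
                + "  amount      DECIMAL(12,2)"
                + ") DISTKEY(customer_id) SORTKEY(order_date);");
        }
    }
}
```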

Environment: S3, EC2, EMR, SNS, SQS, Lambda, Redshift, Data Pipeline, Athena, AWS Glue, S3 Glacier, CloudWatch, CloudFormation, IAM, AWS Single Sign-On, Key Management Service, AWS Transfer for SFTP, VPC, SES, CodeCommit, CodeBuild.

Confidential, Cincinnati, Ohio

AWS Data Engineer

Responsibilities:

  • Designed and deployed multiple applications utilizing much of the AWS stack (EC2, S3, VPC, ELB, Auto Scaling Groups, SNS, SQS, IAM, CloudFormation, Lambda, Glue), focusing on high availability, fault tolerance, and auto-scaling.
  • Worked on AWS S3: created buckets and configured them with permissions, logging, versioning, and tagging.
  • Set up lifecycle policies to move data from AWS S3 to S3 Glacier (see the sketch after this list); worked with various AWS CLI tools for EC2 and S3.
  • Deployed a website using Amazon S3 for hosting, Lambda functions for backend processing, DynamoDB as the database, and API Gateway for RESTful endpoints.
  • Created Lambda functions in Java for event-driven processing of DynamoDB, S3, and Glue events, and viewed their logs in CloudWatch.
  • Followed the Agile methodology to analyze, define, and document the application in support of functional and business requirements, coordinating these efforts with functional architects.
  • Implemented the application using Spring MVC Framework and handled the authentication, authorization, and access-control features by using Spring Security.
  • Developed REST web services to perform transactions from front end to our backend applications.
  • Designed, configured, and deployed Amazon Web Services (AWS) for a multitude of applications utilizing AWS CloudFormation.
  • Involved in writing and executing PL/SQL.
  • Used JIRA to assign, track, report and audit the issues in the application.
  • Used CodeCommit to maintain the code.
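
A sketch of the S3-to-Glacier lifecycle setup referenced above, using the AWS SDK for Java v1; the bucket name, prefix, and retention periods are hypothetical:

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.BucketLifecycleConfiguration;
import com.amazonaws.services.s3.model.StorageClass;
import com.amazonaws.services.s3.model.lifecycle.LifecycleFilter;
import com.amazonaws.services.s3.model.lifecycle.LifecyclePrefixPredicate;

// Sketch: transition objects under a prefix to Glacier after 90 days and
// expire them after one year. Bucket name, prefix, and days are hypothetical.
public class GlacierLifecycle {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        BucketLifecycleConfiguration.Rule archiveRule =
            new BucketLifecycleConfiguration.Rule()
                .withId("archive-raw-data")
                .withFilter(new LifecycleFilter(new LifecyclePrefixPredicate("raw/")))
                .addTransition(new BucketLifecycleConfiguration.Transition()
                        .withDays(90)
                        .withStorageClass(StorageClass.Glacier))
                .withExpirationInDays(365)
                .withStatus(BucketLifecycleConfiguration.ENABLED);

        // Apply the rule to the bucket.
        s3.setBucketLifecycleConfiguration("example-raw-bucket",
            new BucketLifecycleConfiguration().withRules(archiveRule));
    }
}
```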

Environment: EC2, Route 53, S3, VPC, IAM, CloudWatch, SNS, Lambda, AWS CLI, Hadoop, Hive, Spark.

Confidential, San Antonio, Texas

Sr. Java/J2EE Developer

Responsibilities:

  • Involved in entire Software Development Life Cycle (SDLC) of the project like Requirement gathering, Conceptual design, Analysis, Detail design, Development, System Testing, and User Acceptance.
  • Worked in Agile software development; attended daily scrum meetings, sprint planning, and sprint retrospectives, and tracked progress and defects in Rally.
  • Developed web interface for user's modules using JSP, HTML, XML, CSS, JavaScript, AJAX.
  • Involved in core Java coding using Java APIs such as Collections, multithreading, exception handling, generics, enumerations, and Java I/O to implement the business logic.
  • Used AJAX to develop asynchronous web applications on the client side.
  • Used Spring MVC and JdbcTemplate for object-to-relational mapping from the database, creating POJOs (see the sketch after this list).
  • Implemented the service layer using the CXF framework and designed full-fledged RESTful web services.
  • Integrated Spring Security and Spring OAuth2 for application security and API security.
  • Designed the database schema and created complex SQL queries for creating, reading, updating, and deleting data in the database.
  • Developed scalable applications using the MEAN stack and created POCs for REST service development using Node.js, Express.js, and MongoDB.
  • Involved in configuring Hibernate mapping files/annotations and POJO objects.
  • Involved in working with different external databases to consume the data and provide it to the UI through the service layer.
  • Involved in deployment of this application on JBoss Application Server for various environments like Development, UAT and production.
  • Experience in building RESTful web services with Spring MVC and JAX-RS.
  • Used Node.js for server-side web applications and real-time communication.
  • Implemented mock services in Node.js with the help of modules using Express.js.
  • Successfully handled JSON data and carried out JSON parsing for form submissions and DOM manipulation.
  • Worked on binding XML schemas to Java objects using the JAXB API.
  • Designed and coded application components in an agile environment utilizing Acceptance Test-Driven Development (ATDD) approach.
  • Used JUnit and the Mockito framework for unit test coverage of the code.
  • Used Cucumber for automation.
  • Used Maven for dependency management and for building the application and Jenkins for continuous integration.
  • Used Log4j utility to generate run-time logs.
  • Worked with Git for configuration management and version control.
  • Worked closely with the QA team to fix bugs and issues found in the application; supported QA defect resolution and used Rally to publish daily defect reports to management.
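
A sketch of the JdbcTemplate-to-POJO mapping pattern referenced above; the Account table and its columns are hypothetical:

```java
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.List;
import javax.sql.DataSource;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;

// Sketch of mapping relational rows to POJOs with Spring's JdbcTemplate;
// the account table and its columns are hypothetical.
public class AccountDao {

    private final JdbcTemplate jdbcTemplate;

    public AccountDao(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    public List<Account> findByStatus(String status) {
        return jdbcTemplate.query(
            "SELECT id, name, status FROM account WHERE status = ?",
            new Object[] { status },
            new RowMapper<Account>() {
                public Account mapRow(ResultSet rs, int rowNum) throws SQLException {
                    // Each row becomes one POJO instance.
                    return new Account(rs.getLong("id"), rs.getString("name"),
                                       rs.getString("status"));
                }
            });
    }
}

// Simple POJO populated by the RowMapper above.
class Account {
    private final long id;
    private final String name;
    private final String status;

    Account(long id, String name, String status) {
        this.id = id; this.name = name; this.status = status;
    }
    public long getId() { return id; }
    public String getName() { return name; }
    public String getStatus() { return status; }
}
```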

Environment: J2EE, Java 1.7, JBoss 6.4.0, JBoss Dev Studio, Spring Framework, Hibernate, multithreading, JDBC, Web Services, REST, XML, SoapUI, Node.js, Express.js, Apache CXF, MySQL, Agile methodology, design patterns, Maven, Jenkins, Git, JUnit, Mockito, Cucumber, Rally, Eclipse.

Confidential

Jr. Java/J2EE Developer

Responsibilities:

  • Involved in various phases of SDLC such as requirements gathering, modelling, analysis, and design.
  • Developed applications using the Waterfall methodology.
  • Developed the GUIs using HTML, CSS, JSP, and AngularJS framework components.
  • Used Hibernate named queries to retrieve data from the database and integrated with Spring MVC to interact with the back-end persistence system (Oracle).
  • Utilized core J2EE design patterns such as DTO (Data Transfer Object), DAO in the implementation of the services.
  • Used SOAP web services to consume third-party services.
  • Developed SOAP web services using the JAX-WS API for internal applications (see the sketch after this list).
  • Designed a web application using a Web API with AngularJS and populated data using the Java entity framework.
  • Developed SQL stored procedures and prepared statements for updating, inserting, and accessing data in the database.
  • Developed various test cases to test the functionality for E2E (End-to-End) flow of application.
  • Developed ANT scripts to build .war and .ear files for the application.
  • Also worked in production support; analyzed issues and fixed them by writing SQL scripts and Java code.
  • Mapped requests and responses to and from the application server using Spring AOP, with annotation-based Spring configuration in the Eclipse IDE.
  • Involved in deploying the application under WebLogic.
  • Developed MDBs using JMS to exchange messages between different applications over MQ Series.
  • Used JIRA tracking tool to manage and track the issues reported by QA and prioritize and take action based on the severity.
  • Configured log4j to enable/disable logging in application.
  • Involved in JUnit Testing on various modules by generating the Test Cases.
  • Involved in training team members with the application and business functionality.
  • Used Tortoise SVN for maintaining the component and for release and version management.
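
A minimal sketch of a JAX-WS SOAP endpoint of the kind referenced above; the service name and operation are hypothetical, and Endpoint.publish is used here only for local testing (the project deployed on WebLogic instead):

```java
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Hypothetical JAX-WS SOAP service; WSDL is generated from the annotations.
@WebService
public class AccountLookupService {

    @WebMethod
    public String getAccountStatus(String accountId) {
        // A real implementation would look the account up in the database.
        return "ACTIVE";
    }

    public static void main(String[] args) {
        // Publish on a local endpoint for testing; deployment on WebLogic
        // would use the server's deployment descriptors instead.
        Endpoint.publish("http://localhost:8080/accounts", new AccountLookupService());
    }
}
```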

Environment: Java 1.7, Spring 3.2.3, Hibernate 3.0, HTML5, CSS3, AngularJS 1.2/2.0, Node.js, RESTful Web Services, SQL, XML, JSP, JMS, SOAP, Eclipse, WebLogic Server, JUnit, Ant, Oracle 11g, Tortoise SVN, JIRA.

Confidential

Jr. Java/J2EE Developer

Responsibilities:

  • Involved in analysis, design and development of the project.
  • Worked in Waterfall Software Development methodology.
  • Worked on the user interface of the system using HTML, CSS, and JavaScript.
  • Modeled classes and interfaces to assist the programmers in coding using UML.
  • Used JSP's to create dynamic pages for user interaction.
  • Worked on MySQL, Oracle, and SQL Server 2008 databases, creating DB tables and DB objects.
  • Involved in writing deployment files for deploying EJB onto the Tomcat Web server.
  • Involved in writing Enterprise JavaBeans, including session beans.
  • Developed ANT scripts for auto-generation and deployment of the web service.
  • Used Microsoft VSS for version control to support code check-in and checkout during development.
  • Used JMS to establish message communication (see the sketch after this list).
  • Extensively used Log4J for logging and tracing the messages.
  • Used JUnit for testing the application, rectified performance bottlenecks and eliminated critical bugs.
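
A sketch of JMS point-to-point messaging as referenced above; the JNDI names and message content are hypothetical and depend on the application server configuration:

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;

// Hypothetical JMS sender: looks up the connection factory and queue in JNDI,
// then sends a simple text message to the queue.
public class OrderMessageSender {
    public static void main(String[] args) throws Exception {
        InitialContext ctx = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
        Queue queue = (Queue) ctx.lookup("jms/OrderQueue");

        Connection connection = factory.createConnection();
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(queue);

            TextMessage message = session.createTextMessage("order-created:12345");
            producer.send(message);
        } finally {
            connection.close();
        }
    }
}
```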

Environment: Core Java, J2EE, JSP, JDBC, UML, HTML/XML, CSS, SVN, JavaScript, Apache Tomcat, JMS, Waterfall methodology, MySQL, Microsoft VSS, Eclipse, EJB.
