
Java/Big Data & Spark Developer Resume


SUMMARY:

  • 11+ years of experience in the IT industry across complete project lifecycles, including design, development, testing and production support, in Big Data and Java/J2EE technologies.
  • 4 years of hands-on experience in Big Data analysis using Hadoop, HDFS, MapReduce, Hive, Spark, Scala, Python, Avro, Parquet, Sqoop, Flume, Kafka, HBase, Cassandra, Cloudera Manager and shell scripting.
  • Experience in Java/J2EE technologies, Web Services, WebLogic, Oracle SOA Suite and Oracle Service Bus.
  • Strong experience creating real-time data streaming solutions using Spark Core, Spark SQL and DataFrames, Spark Streaming and Kafka (see the sketch after this list).
  • Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems (RDBMS) and from RDBMS to HDFS
  • Expertise in writing MapReduce programs and UDFs to incorporate complex business logic into Hive queries when performing high-level data analysis with Hive.
  • Experience in Avro, Parquet and NoSQL technologies like HBase, Cassandra
  • Experience in designing Time Driven and Data Driven automated Oozie workflows
  • Hands-on experience in Spring Boot and Micro Services Development using REST
  • Hands-on Experience in Spring, Hibernate, MVC Architecture, Struts Framework
  • Experience in JSP, Servlets, JDBC, SOAP, XSD, XML, AJAX, ANT, JUnit and TestNG
  • Experience in Web Services Restful, SOAP, WSDL, Apache Axis2, JaxB, XMLBeans
  • Good Experience and exposure to relational databases like Oracle, MySQL and SQLServer
  • Experience and exposure to Applications servers like WebLogic, WebSphere and JBoss.
  • Experience in Cloud Computing Technology - AWS and Azure
  • Architected and developed highly scalable, large-scale distributed data processing systems on the Amazon Web Services (AWS) cloud platform.
  • Expertise in developing Machine Learning algorithms using Apache Spark
  • Architected and provisioned cloud infrastructure using AWS services such as EC2, S3, EBS, SQS, ELB, VPC, DynamoDB and Redshift.
  • Adept in dealing with people and leading teams. Mentored developers and evaluated their performance.
  • Expertise in triggering Spark jobs on AWS Elastic MapReduce (EMR) cluster resources and fine-tuning them based on cluster scalability.
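
To make the streaming experience above concrete, here is a minimal, hypothetical sketch of a Spark Structured Streaming job that consumes a Kafka topic; the broker address and topic name are illustrative placeholders, not details from any engagement described here.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

// Requires the spark-sql-kafka connector package on the classpath.
public class KafkaStreamSketch {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("KafkaStreamSketch")
                .getOrCreate();

        // Subscribe to a hypothetical Kafka topic; Spark treats the
        // stream as an unbounded DataFrame.
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "broker1:9092")
                .option("subscribe", "events")
                .load();

        // Kafka delivers key/value as binary; cast both to strings.
        Dataset<Row> decoded = events.selectExpr(
                "CAST(key AS STRING)", "CAST(value AS STRING)");

        // Write each micro-batch to the console; a real job would write
        // to HDFS, S3 or a database sink with checkpointing enabled.
        StreamingQuery query = decoded.writeStream()
                .format("console")
                .start();

        query.awaitTermination();
    }
}
```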

TECHNICAL SKILLS:

Operating Systems: Windows and UNIX

Languages: J2SE & J2EE, Scala, Python

BIG DATA Frameworks: Hadoop (HDFS, MapReduce), Spark, Hive, HBase, Cassandra, Phoenix, ZooKeeper, Flume, Oozie, Sqoop and Kafka

Cloud Services: AWS (S3, EC2, VPC, DynamoDB, Redshift) and Azure

DevOps Tools: Jenkins, Docker, GitHub, Chef, Puppet

Web Business Logic Technologies: Servlets

Mailing Services: Java Mailing API

GUI: JSP, JavaScript, AJAX, jQuery

Enterprise Technologies: EJB 2.x, EJB 3.x and JMS

Middleware Technologies: Web Services, IBM MQ, Oracle SOA 12c (BPEL, OSB, BAM) and OAG (Oracle API Gateway)

Web Frameworks: Struts Framework and Spring Framework

ORM Frameworks: Hibernate, JPA (Java Persistence API)

Tools & Utilities: MyEclipse, ANT, Maven and RAD

Parsing Technologies: SAX, DOM and JAX-B binding framework

Web/App Servers: Tomcat / WebLogic / WebSphere

Testing: Unit Testing, SOAP UI testing, SOA Testing, Performance Testing, Java Performance Tuning, Profiling, and Memory Management, CA LISA Service Virtualization automation tool

PROFESSIONAL EXPERIENCE:

Confidential

Java/Big Data & Spark Developer

Responsibilities:

  • Involved in requirements study, design, development and unit testing.
  • Involved in importing data in various formats into the HDFS environment.
  • Imported and exported data between Netezza/Teradata and HDFS using Sqoop.
  • Involved in implementing shell scripts.
  • Involved in sourcing data from Excel sheets into HDFS using Java.
  • Involved in writing and configuring Autosys jobs.
  • Involved in Data Quality and Data Integrity implementation.
  • Involved in creating Hive tables, loading them with data, shaping results, writing Hive queries and writing UDFs for Hive (see the sketch at the end of this section).
  • Involved in developing Spark workflows.
  • Implemented workflows using Oozie for various business scenarios.
  • Involved in all phases of the SDLC using the Agile Scrum methodology.

Environment: Java (JDK 1.6), Scala, Python, Hadoop, HDFS (Hadoop Distributed File System), MapReduce (YARN), Hive, HBase, Phoenix, Sqoop, Flume, Oozie, Spark, Kafka, Cloudera Manager, Hue UI, Cloudera Distribution, Eclipse, SVN, GIT, DB2, Netezza, Teradata, UNIX Shell, Rally, Autosys, Tableau, Linux
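
As an illustration of the Hive UDF work listed above, here is a minimal sketch of a custom UDF in the classic Hive style; the class name and masking rule are hypothetical examples, not code from the project.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical example: masks an account number so only the last four
// characters remain visible. Register it in Hive with:
//   ADD JAR mask_udf.jar;
//   CREATE TEMPORARY FUNCTION mask_account AS 'MaskAccountUDF';
public class MaskAccountUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null; // propagate SQL NULLs unchanged
        }
        String value = input.toString();
        if (value.length() <= 4) {
            return input; // too short to mask
        }
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < value.length() - 4; i++) {
            masked.append('*');
        }
        masked.append(value.substring(value.length() - 4));
        return new Text(masked.toString());
    }
}
```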

Confidential

Java/ Big Data & Spark Developer

Responsibilities:

  • Ingested data from Oracle and MySQL into S3, where it can be queried through Hive and Spark SQL tables.
  • Worked on Sqoop jobs for ingesting data from MySQL into Amazon S3.
  • Created Hive external tables for querying the data.
  • Used the Spark DataFrame API to ingest Oracle data into S3 and store it in Redshift.
  • Wrote a script to load RDBMS data into Redshift.
  • Processed the datasets and applied different transformation rules on top of them.
  • Processed complex/nested JSON and CSV data using the DataFrame API (see the sketch at the end of this section).
  • Automatically scaled up the EMR instances based on the data volume.
  • Applied transformation rules on top of DataFrames.
  • Ran and scheduled the Spark scripts in EMR Pipes.
  • Processed Hive, CSV, JSON and Oracle data at the same time (POC).
  • Validated and debugged the scripts between source and destination.
  • Validated the source and final output data.
  • Tested the data using the Dataset API instead of RDDs.
  • Debugged and tested whether the process met the client's expectations.
  • Triggered query execution and improved process timing.
  • Applied different optimization and transformation rules based on new Spark versions.
  • Debugged the scripts to minimize data shuffling.
Environment: Scala, Spark SQL, Spark Streaming, Sqoop, AWS (EMR, Elasticsearch, S3, DynamoDB, Pipes, Redshift)
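
To illustrate the nested JSON processing mentioned above, the sketch below flattens a nested JSON dataset with the Spark DataFrame API and writes Parquet back to S3; all field names and bucket paths are hypothetical.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;

public class FlattenNestedJson {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("FlattenNestedJson")
                .getOrCreate();

        // Hypothetical input path; multiLine handles pretty-printed JSON.
        Dataset<Row> raw = spark.read()
                .option("multiLine", true)
                .json("s3://example-bucket/raw/customers/");

        // Pull nested fields up to top-level columns so the result can be
        // queried as a flat Hive / Spark SQL table.
        Dataset<Row> flat = raw.select(
                col("customer.id").alias("customer_id"),
                col("customer.address.city").alias("city"),
                col("order.total").alias("order_total"));

        // Parquet output keeps the data efficient for downstream queries.
        flat.write().mode("overwrite")
                .parquet("s3://example-bucket/curated/customers/");

        spark.stop();
    }
}
```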

Confidential

Java/Big Data Developer

Responsibilities:

  • Involved in a POC (Proof of Concept) for processing large amounts of data (in TB) and performing manipulations on the data.
  • Involved in Informatica Big Data Edition configuration.
  • Involved in writing Java code to merge large split XML files into a single XML file.
  • Involved in multithreaded batch processing for source XML.
  • Involved in shell scripting to unzip zipped XML files in the HDFS system.
  • Involved in creating tables in Hive.
  • Involved in loading data into Hive.
  • Collected and aggregated large amounts of log data using Apache Flume and staged the data in HDFS for further analysis.
  • Very good understanding of partitioning and bucketing concepts in Hive; designed both managed and external tables in Hive to optimize performance (see the sketch at the end of this section).
  • Solved performance issues in Hive scripts with an understanding of joins, grouping and aggregation and how they translate to MapReduce jobs.
  • Installed and configured Hadoop clusters in test and production environments.
  • Performed both major and minor upgrades to the existing CDH cluster.
  • Very good experience in monitoring and managing the Hadoop cluster using Cloudera Manager.
  • Developed Sqoop jobs to transfer data from several data sources, such as Oracle, to HDFS.

Environment: Teradata, Hive, HDFS (Hadoop Distributed File System), Informatica Big Data Edition, Maven, J2EE, Web Services, JSON, XML, Oracle Service Bus, WebLogic, Linux, TFS.
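
A minimal sketch of the partitioned external Hive table design described above, issued over the HiveServer2 JDBC driver; the host, table and column names are hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreatePartitionedTable {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC endpoint; host and database are hypothetical.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hive-host:10000/default", "hive", "");
             Statement stmt = conn.createStatement()) {

            // External table: Hive manages metadata only, so dropping the
            // table leaves the underlying HDFS files intact.
            stmt.execute(
                "CREATE EXTERNAL TABLE IF NOT EXISTS web_logs ("
              + "  ip STRING, url STRING, bytes BIGINT)"
              + " PARTITIONED BY (log_date STRING)"
              + " ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t'"
              + " LOCATION '/data/web_logs'");

            // Each partition maps to one HDFS directory; queries filtering
            // on log_date only scan the matching directories.
            stmt.execute(
                "ALTER TABLE web_logs ADD IF NOT EXISTS"
              + " PARTITION (log_date='2015-01-01')"
              + " LOCATION '/data/web_logs/2015-01-01'");
        }
    }
}
```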

Confidential

Java/ Hadoop Developer

Responsibilities:

  • Involved in analysis for migrating Oracle WebLogic 11g to Oracle WebLogic 12.1.2.
  • Involved in the design, estimation, development and unit-testing phases for all the applications.
  • Involved in environment setup for the WebLogic 12c track.
  • Involved in preparing the design documents (Technical Design Specification document).
  • Involved in new WebLogic track setup for the Dev, SIT, PERF, UAT and PROD environments.
  • Involved in configuring BPEL (Business Process Execution Language) and composite deployment using the Hudson deployment tool.
  • Played the role of developer and technical lead.
  • Involved in deploying applications to the WebLogic 12c server and monitoring them.
  • Involved in deploying applications on various non-production environments using script and console modes on the WebLogic application server.
  • Troubleshot issues raised in the Dev, SIT, UAT and Production environments.
  • Involved in installing and configuring Oracle SOA Suite 11g components in Oracle WebLogic Server domains.
  • Monitored and managed SOA components using the Oracle Enterprise Manager Fusion Middleware Control Console to perform administrative tasks.

Environment: Java/J2EE, WSDL, SOAP, BPEL, OSB, Big Data, Oracle 11g, Hadoop Hive, Web Services (Apache Axis2), EJB 2.x, EJB 3.x, WebLogic Server 12c, IBM MQ, SOA (BPEL, OSB and BAM), MyEclipse Blue Edition 10.5, JDeveloper 10g, MQ Explorer, TOAD, ANT, Windows 7 Basic

Confidential

Java Application Developer

Responsibilities:

  • Gathered requirements and updated the design document.
  • Involved in estimating upcoming projects in the same business line.
  • Performed impact analysis of new enhancements on the existing implementation.
  • Extensively used jQuery, the Struts 2 framework, the Spring Framework, the Hibernate framework and Web Services for the presentation, control and model layers.
  • Developed Business Delegate classes to minimize tight coupling between the presentation tier and the business tier (see the sketch at the end of this section).
  • Involved in application deployment to the dev, stage and production servers.
  • Responsible for fixing bugs reported in Mercury Quality Center.
  • Involved in monthly and quarterly production releases.
  • Moved code from dev to the stage and QC environments.

Environment: Java/J2EE, JSP, jQuery, Ajax, JavaScript, Oracle 11g, Siebel, Struts 2.2, Spring 2.5 & 3.0, Hibernate 2.5 & 3.0, Web Services (WS), Java Mail API, WebLogic Server 10.6, SOA (BPEL, OSB, BAM), MyEclipse Blue Edition 10.5, Maven, SVN, Windows 7 Basic
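
A minimal sketch of the Business Delegate pattern referenced above; the service and class names are hypothetical, and a production version would obtain the business service through JNDI or a Service Locator rather than direct instantiation.

```java
// Business-tier interface the presentation layer should not depend on directly.
interface CustomerService {
    String findCustomerName(long customerId);
}

// Hypothetical concrete service; in a real J2EE application this would be
// an EJB or remote Web Service obtained via JNDI / a Service Locator.
class CustomerServiceImpl implements CustomerService {
    public String findCustomerName(long customerId) {
        return "customer-" + customerId; // placeholder lookup
    }
}

// The delegate is the only class the presentation tier talks to. It hides
// lookup, remoteness and exception details of the business tier, keeping
// the two tiers loosely coupled.
public class CustomerDelegate {
    private final CustomerService service;

    public CustomerDelegate() {
        // Direct instantiation keeps the sketch self-contained; swap in a
        // ServiceLocator lookup in a real application.
        this.service = new CustomerServiceImpl();
    }

    public String getCustomerName(long customerId) {
        try {
            return service.findCustomerName(customerId);
        } catch (RuntimeException e) {
            // Translate business-tier failures into a presentation-friendly error.
            throw new IllegalStateException("Customer lookup failed", e);
        }
    }

    public static void main(String[] args) {
        System.out.println(new CustomerDelegate().getCustomerName(42L));
    }
}
```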

Confidential

Java Application Developer

Responsibilities:

  • Gathered requirements and updated the design document.
  • Performed impact analysis of new enhancements on the existing implementation.
  • Extensively used the Struts and Hibernate frameworks and Web Services for the presentation, control and model layers.
  • Involved with Web Services in the Care Point Customer Search project.
  • Involved in writing business logic in stateful session beans (see the sketch at the end of this section).
  • Involved in generating the WSDL file to expose the service.
  • Involved in creating the customer-information response in an XML tree structure format.
  • Involved in moving code into ClearCase.
  • Involved in deployment to the dev environment.
  • Supported enhancement project development.
  • Released monthly and quarterly production EARs.
  • Unit testing.

Environment: JAVA/J2EE, DB2, Struts, EJB, Hibernate, Web Services, WebSphere 6.1, RAD, ClearCase and ClearQuest tools, Windows XP.
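
A minimal sketch of a stateful session bean holding conversational search state, written in EJB 3 annotation style for brevity (a WebSphere 6.1 project would more likely have used EJB 2.x home/remote interfaces); the names are hypothetical and the bean runs inside an EJB container.

```java
import java.util.ArrayList;
import java.util.List;
import javax.ejb.Remove;
import javax.ejb.Stateful;

// A stateful session bean keeps conversational state for one client across
// multiple calls -- here, the criteria accumulated during a customer search.
@Stateful
public class CustomerSearchBean {
    private final List<String> criteria = new ArrayList<String>();

    public void addCriterion(String criterion) {
        criteria.add(criterion); // state survives between client calls
    }

    public List<String> currentCriteria() {
        return new ArrayList<String>(criteria); // defensive copy for the caller
    }

    @Remove // tells the container to discard this bean instance afterwards
    public void endSearch() {
        criteria.clear();
    }
}
```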

Confidential

Java Developer

Responsibilities:

  • Gathered requirements and updated the design document.
  • Performed impact analysis of new enhancements on the existing implementation.
  • Extensively used the ADF, Spring and Hibernate frameworks for the presentation, control and model layers.
  • Developed Business Delegate classes to minimize tight coupling between the presentation tier and the business tier.
  • Developed the required SQL/PLSQL queries.
  • Responsible for maintaining and implementing enhancements for the Batch Job Engine.
  • Involved in application deployment to the dev, stage and production servers.
  • Responsible for fixing bugs reported in Mercury Quality Center.
  • Involved in monthly and quarterly production releases.
  • Moved code from dev to the stage environment.

Environment: JAVA/J2EE, Struts, Spring, Hibernate, ADF Framework, Web Services, OC4J, Eclipse, CVS, Oracle 9i/PLSQL, Windows XP

Confidential

Java Developer

Responsibilities:

  • Resolved production ticket issues daily.
  • Released monthly and quarterly production EARs.
  • Moved code into the ST1 and ST2 environments.
  • Gathered requirements from System Engineer (SE) staff.
  • Supported enhancement project development.
  • Analysis and design.
  • Prepared algorithms for existing code.
  • Prepared algorithms for PL/SQL code.
  • Debugged previous Java code through logs.
  • Analyzed business logic and implementation.
  • Coding and implementation (Action classes).
  • Unit testing.
  • Involved in the View Biller and Delete Biller modules.
  • Involved in URD and prototype development.
  • Code review.

Environment: Java, Struts, JSP, JDBC, JSTL, ISO Messages, Oracle 8i, JDeveloper, OC4J (Oracle Containers for J2EE) server, CVS, Windows 2000 Server.
