
Java/Spark/AWS Developer Resume


SUMMARY:

  • 7+ years of professional IT experience in all phases of the software development life cycle, including hands-on experience in Java/J2EE, Hadoop, Spark, big data analytics, and cloud.
  • Hands-on experience in data analysis, data transformation, and data validation using Spark (PySpark).
  • Hands-on experience in the AWS cloud (S3, EC2, EMR, Auto Scaling, Elastic Load Balancing, VPN, etc.)
  • Experience in designing and building data lake solutions on the Hortonworks Hadoop platform.
  • Hands-on experience with Spring Boot and REST APIs using a cloud RDS instance and running on EMR.
  • Hands-on experience with notebooks for cross-platform validation using PySpark.
  • Experience with Kafka for near-real-time batch jobs loading data from an existing warehouse to the Snowflake cloud warehouse via S3.
  • Experience in data ingestion using Sqoop, Flume, and Attunity, and in managing distributed storage on HDFS.
  • Experience in data processing and analysis of big data using distributed technologies such as MapReduce, Hive, Pig, Spark, and Scala.
  • Experience in developing custom UDFs in Java to extend Hive and Pig Latin functionality.
  • Experience in developing core Java/J2EE applications and database applications using Spring, Hibernate, Oracle/SQL, batch programming, Unix, and PHP.
  • Experience in web services implementation and XML parsing (using REST, JAXB, JAX-RS, Jersey).
  • Experience in Unix shell scripting; experience with XML, XSD, XSLT, and JSON.
  • Good knowledge of Log4j for error logging; developed stored procedures and queries using PL/SQL; expertise in RDBMSs such as Oracle, PostgreSQL, MySQL, RDS, and Snowflake.
  • Good understanding of production issues in the Hadoop ecosystem and the resolution steps to fix them.
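The cross-platform validation mentioned above (comparing an on-premises warehouse against its cloud copy) can be sketched roughly as follows. This is an illustrative stand-in only: the resume describes doing this with PySpark DataFrames and notebooks, while the sketch below uses plain Python lists of dicts so it is self-contained, and all names in it are hypothetical.

```python
# Hypothetical sketch: validate a cloud copy of a table against its source
# extract by row count and per-column numeric checksums. In the work described
# above this kind of comparison would run over PySpark DataFrames instead.

def validate_extracts(source_rows, target_rows, numeric_cols):
    """Return a dict of pass/fail findings between two table extracts."""
    findings = {"row_count_match": len(source_rows) == len(target_rows)}
    for col in numeric_cols:
        src_sum = sum(r[col] for r in source_rows)   # source-side checksum
        tgt_sum = sum(r[col] for r in target_rows)   # target-side checksum
        findings[f"{col}_checksum_match"] = src_sum == tgt_sum
    return findings

# Example: a matching pair of extracts passes both checks.
onprem = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
cloud = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
result = validate_extracts(onprem, cloud, ["amount"])
```

In practice such checks are usually run per table and per partition, with mismatches routed to a report for investigation.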

SKILL:

Big Data Ecosystems: Spark, Hortonworks Hadoop Data Platform, PySpark, MapReduce, HDFS, ZooKeeper, Hive, Pig, Sqoop, Flume

Languages: Java/J2EE (Core Java, Servlets, JSP, Hibernate), PL/SQL, XML (SAX parsing), JavaScript, Design Patterns, Unix, Python

Databases: Oracle, PostgreSQL, cloud database RDS instances, Snowflake, NoSQL DB (HBase)

Development Environments: IntelliJ IDEA, Eclipse IDE (Helios), NetBeans

Operating Systems: Windows, Unix

Software Tools: VSS/TFS/SVN, Apache Tomcat

Web Services: REST (JAX-RS), XML/JSON (produce, consume), JAXB, Jersey

Frameworks: Spring, Spring Boot, MVC, Hibernate (ORM)

Build Tools: Ant, Maven

Domains: Finance and Risk, Banking, Insurance, QReuse

EXPERIENCE:

Confidential

Java/Spark/AWS Developer

Technologies: Java, Python, Spark, cloud, AWS, Kafka, shell scripting, Snowflake, PL/SQL, REST API, S

Tools: IntelliJ IDEA, Snowflake UI, frameworks.

Responsibilities:

  • Involved in requirement gathering/understanding for the module and in designing the Confidential solution's processing steps.
  • Imported data using Kafka from different databases to S3 cloud storage on a regular basis.
  • Cleaned and transformed data, built fact and dimension tables, and reloaded them using SQL, PySpark, etc.
  • Productionized jobs to run on a schedule.
  • Performed extensive cross-platform data validation across warehouses using PySpark and notebooks.
  • Created complex PL/SQL queries that helped build metrics and end-user views.
  • Debugged jobs and analyzed logs on cloud EMR.
  • Worked on a small dashboard project using Java Spring Boot, REST APIs, RDS instances, and EMR.
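The clean-and-transform step above (deriving fact and dimension tables from raw records) can be sketched as follows. This is a hypothetical illustration, not the original project's code: column names and the sales-like shape of the data are assumptions, and the real work is described as running in SQL and PySpark.

```python
# Hypothetical sketch of building a dimension lookup plus a fact table from
# raw rows. Column names ("product", "qty") are illustrative assumptions.

def build_fact_and_dimension(raw_rows):
    """Split raw rows into a product dimension (name -> surrogate key)
    and a fact table that references the dimension by key."""
    dim, fact = {}, []
    for row in raw_rows:
        name = row["product"].strip().lower()   # basic cleaning/dedup key
        if name not in dim:
            dim[name] = len(dim) + 1            # assign surrogate key
        fact.append({"product_key": dim[name], "qty": int(row["qty"])})
    return dim, fact

# Example: " Widget" and "widget " collapse to one dimension row.
raw = [
    {"product": " Widget", "qty": "2"},
    {"product": "widget ", "qty": "3"},
    {"product": "Gadget", "qty": "1"},
]
dim, fact = build_fact_and_dimension(raw)
```

The same shape of logic, expressed as DataFrame joins and window functions, is how this is typically done at warehouse scale.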

Confidential

Java/Hadoop Developer

Technologies: MapReduce, Hive, Sqoop, Scala, Attunity, Pig, Java, SVN, shell scripting, AWS

Tools: Eclipse 3.5, Autosys

Responsibilities:

  • Involved in requirement gathering/understanding for the module and in designing the data lake solution.
  • Imported data using Sqoop from different databases to HDFS on a regular basis.
  • Built a file framework that picks up data from different sources (S3, local, or mount locations), cleans and formats it with Pig (converting fixed-length files to delimited files and validating records as good/bad), and ingests it into Hive.
  • Developed UDFs in Java for Hive and Pig.
  • Developed MapReduce programs to clean and parse the raw data, populate staging tables, and store the refined data in partitioned Hive tables.
  • Developed a MapReduce utility to compare two files and report the differences between them; merged small JSON files into one large JSON file.
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
  • Managed and reviewed Hadoop log files.
  • Wrote Pig scripts for transformations on raw data based on business rules.
  • Worked with BLOB/CLOB data ingestion into Hive.
  • Worked on Attunity.
  • Involved in a POC to migrate the existing data lake to a cloud data lake.

Confidential

Java/J2EE-Hadoop Developer

Technologies: MapReduce, Hive, Scala, Sqoop, Pig, Java, Maven

Tools: Eclipse 3.5

Responsibilities:

  • Involved in requirement gathering/understanding for the module and in designing the data lake solution.
  • Imported data using Sqoop from different databases to HDFS on a regular basis.
  • Developed UDFs in Java for Hive and Pig.
  • Completed a POC on Spark/Scala/HBase.
  • Developed MapReduce programs to clean and parse the raw data, populate staging tables, and store the refined data in partitioned Hive tables.
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
  • Managed and reviewed Hadoop log files.
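The clean-parse-and-partition pattern above can be sketched in miniature as follows. This is a hypothetical illustration of the idea only (the original ran as Java MapReduce jobs writing partitioned Hive tables); the comma-delimited input shape and partition column are assumptions.

```python
# Hypothetical sketch of the clean-and-partition step: parse raw delimited
# records, drop incomplete ones, and bucket the clean records by a partition
# column, analogous to storing refined data in partitioned Hive tables.

from collections import defaultdict

def clean_and_partition(raw_lines, partition_index=0):
    """Return {partition_value: [clean_record, ...]} from raw lines."""
    partitions = defaultdict(list)
    for line in raw_lines:
        parts = [p.strip() for p in line.split(",")]
        if any(not p for p in parts):        # drop incomplete/bad records
            continue
        partitions[parts[partition_index]].append(parts)
    return dict(partitions)

# Example: records bucketed by year; the malformed ", d" row is dropped.
out = clean_and_partition(["2020, a", "2020, b", "2021, c", ", d"])
```

In a real MapReduce job the partition value would be the map output key, with reducers writing each key's records to its own partition directory.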

Confidential

Java Developer

Technologies: Spring 3.0, Hibernate 3.2, Development Server-Tomcat 6.0, Eclipse 3.4, Oracle 9i, Maven, Unix, PHP

Tools: Eclipse 3.5

Responsibilities:

  • Involved in requirement gathering/understanding for the module and in its design.
  • Developed code for the product, coordinated with other team members, and prepared unit test cases.
  • Involved in fixing many production bugs.
  • Wrote batch programs for the component.
  • Worked on the report server using Unix and PHP to support the system.
  • Performed unit testing and integration testing in the dev environment for owned use cases.
  • Made builds and created build-related documents for releases.
  • Environment setup.

Confidential

Java Developer

Technologies: Spring 3.0, Hibernate 3.2, Development Server-Tomcat 6.0, Eclipse 3.4, Oracle 9i, Java Mail API, Apache POI.

Responsibilities:

  • Involved in requirement understanding for the "Template Management" module.
  • Responsible for creating, updating, deleting, deactivating, previewing, searching, cloning, publishing, and validating templates.
  • Developed code for this module, coordinated the work of other team members, and prepared unit test cases.
  • Used XML parsing and marshalling and read/wrote Excel files in the module.
  • Creating Build related documents for Release.
  • Environment setup.

Confidential

Java Developer

Technologies: JSP, Servlets, Struts, Oracle.

Responsibilities:

  • Involved in requirement understanding for the module and in its design.
  • Developed code for the product, coordinated the work of other team members, and prepared unit test cases.
  • Made builds and created build-related documents for releases.
  • Environment setup.
