
Hadoop/Java/Oracle Developer Resume


Phoenix, AZ

SUMMARY:

  • 9+ years of professional experience in the IT industry, involved in developing, implementing, configuring, and testing Hadoop ecosystem components and maintaining various web-based applications.
  • Around 5 years of experience with Big Data tools such as Hadoop.
  • As a Tech Lead, supervised a team of 5-8 engineers and helped them maintain their quality of service by reviewing their work and providing constructive feedback.
  • Strong understanding of the complex processing needs of big data, with experience developing code and modules to address those needs.
  • In-depth understanding of MapReduce and the Hadoop infrastructure.
  • Vast experience with Hadoop ecosystem components such as HDFS, MapReduce, YARN, Pig, Hive, HBase, Oozie, ZooKeeper, Sqoop, Impala, and Kafka.
  • Extensive experience with Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, Application Master, ResourceManager, NodeManager, and the MapReduce programming paradigm.
  • Experience in analyzing data using Hive, Pig Latin, HBase, and custom MapReduce programs.
  • Good exposure to Apache Hadoop MapReduce programming, Pig scripting, distributed applications, and HDFS.
  • Expertise in optimizing traffic across the network using Combiners, joining datasets with multiple schemas using Joins, and organizing data using Partitioners and Buckets.
  • Experience in working with different file formats and compression techniques in Hadoop.
  • Expertise in extending Hive and Pig core functionality by writing custom User Defined Functions (UDFs); a minimal UDF sketch follows this list.
  • Handled Text, JSON, XML, SequenceFile, and Parquet data using Hive (SerDes) and Pig, and filtered the data based on query criteria.
  • Strong knowledge of Hive and its analytical functions.
  • Hands-on experience with job workflow scheduling and monitoring tools like Oozie and ZooKeeper.
  • Captured data from existing databases that provide SQL interfaces using Sqoop.
  • Efficient in building Hive, Pig, and MapReduce scripts.
  • Experience with the Oozie Workflow Engine in running workflow jobs with actions that run Hadoop MapReduce and Pig jobs.
  • Expertise with NoSQL databases such as HBase.
  • Good knowledge of Hadoop cluster architecture and cluster monitoring.
  • Expertise in writing Splunk queries to view historical data and trends.
  • Experience in managing and reviewing Hadoop log files.
  • Hands-on experience with DevOps tools like GitHub, Jenkins, Maven, Chef, and Sonar, used for deployments in Hadoop use cases.
  • Proficient in using IDEs like Eclipse and MyEclipse.
  • Expertise in using the ServiceNow tool for creating Incidents, Problems, Knowledge Articles, and Change Requests.
  • Hands-on experience with the build management tools Maven and Ant.
  • Involved in Hadoop testing.
  • Hands-on experience with UNIX scripting.
  • Expertise in creating technical documents and user manuals; used Confluence as the document repository.
  • Hands-on experience in application development using Java, RDBMS, and Linux shell scripting.
  • Wrote scripts to deploy monitoring checks and automate critical system administration functions.
  • Experience in database design using stored procedures, functions, and triggers, with strong experience writing complex queries for Oracle.
  • Good knowledge of J2EE design patterns and Core Java design patterns.
  • Expertise in using version control systems like SVN and GitHub, along with Jenkins for builds.
  • Expertise in J2EE technologies like Servlets, JSP, Struts, Hibernate, EJB and JDBC.
  • Hands-on experience with Spark for handling streaming data.
  • Good knowledge of stored procedures, functions, etc. using SQL and PL/SQL.
  • Expert in SQL, PL/SQL, Oracle Developer 6i/9i (Forms 6i/9i/10g, Reports 6i/9i/10g), Workflow Builder 2.6, SQL*Loader, shell scripting, etc.
  • Self-motivated and able to manage workload efficiently and effectively.
  • Presented multiple Show and Tell sessions to clients on Hadoop use cases.
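
To illustrate the UDF work noted above, here is a minimal sketch of a custom Hive UDF, assuming the classic org.apache.hadoop.hive.ql.exec.UDF base class; the package, class name, and masking logic are illustrative only, not taken from any specific project on this resume.

```java
package com.example.hive.udf; // hypothetical package

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Masks the local part of an e-mail address; Hive locates the
// evaluate() method by reflection at query time.
public class MaskEmail extends UDF {
    public Text evaluate(Text email) {
        if (email == null) return null;          // pass NULLs through
        String s = email.toString();
        int at = s.indexOf('@');
        // "jdoe@example.com" -> "****@example.com"
        return new Text(at < 0 ? "****" : "****" + s.substring(at));
    }
}
```

Once packaged into a jar, such a function would typically be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION mask_email AS 'com.example.hive.udf.MaskEmail'.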

TECHNICAL SKILLS:

Big Data: MapReduce, Pig, Hive, Sqoop, HBase, Oozie, Kafka, ZooKeeper, YARN.

Programming Languages: Core Java, UNIX shell scripting, SQL; knowledge of Scala.

J2EE Technologies: JSP, Servlets.

Frameworks: Struts, JUnit.

Client Technologies: JavaScript, AJAX, CSS, HTML5.

Operating Systems: UNIX, Linux, Windows.

Web Technologies: JSP, Servlets, Socket Programming, JNDI, JDBC, JavaScript.

Databases: Oracle 8i/9i/10g, MySQL 4.x/5.x.

Java IDE: Eclipse.

Tools: Oozie, Hue, SQL Developer, Ant, Maven, Jenkins, GitHub, Splunk, Confluence.

NoSQL Databases: HBase.

PROFESSIONAL EXPERIENCE:

Confidential, Phoenix, AZ

Technical Lead

Responsibilities:

  • Involved in the development and monitoring of Hadoop use cases.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, and loaded data into HDFS.
  • Exported data from Hadoop to HBase for real-time applications using the HBase Java API (a Java API sketch follows this list).
  • Stored, accessed, and processed data in different file formats: Text, ORC, and Parquet.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries.
  • Loaded XML files into HDFS and processed them with Hive.
  • Loaded data into HBase using the Java API.
  • Performance-tuned Hive and Pig queries.
  • Developed UDFs to implement complex transformations on Hadoop.
  • Worked on optimizing the Shuffle and Sort phase of MapReduce jobs.
  • Supported MapReduce programs running on the cluster.
  • Managed and reviewed Hadoop log files to identify bugs.
  • Scheduled Oozie workflows and used Spring Batch to configure workflows for different job types such as Hive, MapReduce, and shell.
  • Wrote shell scripts to start use cases and run pre-validations.
  • Automated all jobs for pulling data from an FTP server and loading it into Hive tables using Oozie workflows.
  • Monitored and scheduled the UNIX scripting jobs.
  • Actively involved in code reviews and bug fixing to improve performance.
  • Developed and optimized Pig and Hive UDFs (User-Defined Functions) to implement the functionality of external languages as and when required.
  • Coordinated with the administrator team to analyze MapReduce job performance and resolve any cluster-related issues.
  • Performed platform-related Hadoop production support tasks by analyzing job logs.
  • Coordinated with different teams to determine root causes and took steps to resolve them.
  • Responsible for continuous integration with Jenkins and for deploying applications into production using XL Deploy.
  • Managed and reviewed Hadoop log files to identify issues when jobs failed and to find the root cause.
  • Utilized ServiceNow to provide application support for existing clients.
  • Loaded log data into HDFS using Kafka.
  • Set up Kafka brokers, producers, and consumers; a producer sketch follows this list.
  • Set up data pipelines into Hadoop with Kafka.
  • Created partitioned tables in Hive for better performance and faster querying.
  • Managed and scheduled jobs on the Hadoop cluster using crontab, the Oozie scheduler, and Event Engine (an in-house tool).
  • Ensured the quality and integrity of data were maintained as design and functional requirements changed.
  • Developed and documented design/implementation impacts based on system monitoring reports.
  • Reduced the number of open tickets for a couple of use cases by analyzing, categorizing, and prioritizing all recorded open issues; this required motivating the offshore team and thoughtfully delegating workable tickets to offshore resources to maximize efficiency.
  • Reduced response time to clients on issues reported in the production environment.
  • Drafted support procedures to decrease client-reported ticket/issue turnaround.
  • Used the ServiceNow ITSM tool for creating Incidents, Problems, Knowledge Articles, and Change Requests.
  • Presented weekly status reports to the client on use case progress and the issue tracker.
  • Responsible for designing the shift roster for the production support team.
  • Guided offshore programmers assigned to the production support group.
  • Conducted code reviews to maintain code quality.
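
As referenced in the HBase bullets above, here is a minimal sketch of loading one record into HBase through the Java client API (HBase 1.x Connection/Table style); the table name, column family, qualifier, and row key are hypothetical.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseLoader {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create(); // reads hbase-site.xml
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("txn_events"))) {
            Put put = new Put(Bytes.toBytes("row-20170101-0001")); // row key
            put.addColumn(Bytes.toBytes("d"),        // column family
                          Bytes.toBytes("amount"),   // qualifier
                          Bytes.toBytes("42.50"));   // value
            table.put(put);
        }
    }
}
```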
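
And a minimal sketch of the producer side of the Kafka log pipeline, using the standard org.apache.kafka.clients producer API; the broker address and topic name are placeholders. A downstream consumer job would then land the records in HDFS.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class LogProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1:9092"); // placeholder broker
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("acks", "all"); // wait for full replication before ack

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            // Each log line becomes one record on the "app-logs" topic,
            // keyed by host so lines from one host stay on one partition.
            producer.send(new ProducerRecord<>("app-logs", "host1",
                                               "GET /index 200"));
        }
    }
}
```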

Environment: Linux, Hadoop, Hive, HBase, Git, Pig, Java, Kafka, MapReduce, Sqoop.

Confidential

Hadoop/Java/Oracle Developer

Responsibilities:

  • Developed applications using Java, RDBMS, and Linux shell scripting.
  • Worked on Sqoop to import/export data into HDFS and Hive.
  • Handled data from different data sets, joining and pre-processing them using Hive join operations.
  • Wrote MapReduce programs to aggregate the data.
  • Developed Hive queries to analyze data and generate results.
  • Managed, reviewed, and interpreted Hadoop log files.
  • Strong understanding of the Hadoop ecosystem, including HDFS, MapReduce, HBase, ZooKeeper, Pig, Hadoop Streaming, Sqoop, Oozie, and Hive.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Oracle into HDFS using Sqoop.
  • Implemented secondary sorting to sort reducer output globally in MapReduce (a composite-key sketch follows this list).
  • Implemented a data pipeline by chaining multiple mappers using ChainMapper.
  • Created Hive dynamic partitions to load time-series data (a JDBC load sketch follows this list).
  • Experienced in handling different types of joins in Hive, such as map joins, bucket map joins, and sorted bucket map joins.
  • Created tables, partitions, and buckets, and performed analytics using Hive ad-hoc queries.
  • Actively participated in software development lifecycle (scope, design, implement, deploy, test), including design and code reviews.
  • Generated reports using Oracle Reports 10g to meet a variety of business needs, including custom reports.
  • Modified PL/SQL packages, procedures, and functions for data population and updates per user requirements.
  • Created custom templates with data definitions for XML Publisher, developed reports using Report Builder, and registered them as XML reports in the applications.
  • Customized various XML reports by adding new fields and modifying templates per business requirements.
  • Responsible for creating an interface for Purchase Order upload, which creates POs from a CSV file.
  • Designed and developed an Item Conversion program to import legacy inventory item data.
  • Personalized many standard Oracle Forms per client requirements.
  • Designed a wide variety of reports in Order Management with BI/XML Publisher.
  • Developed many reports, designed to render output in different formats like PDF, Excel, RTF, and HTML, using BI/XML Publisher.
  • Developed and customized complex XML reports by communicating with end users and interpreting their business requirements.
  • Developed various technical components, including SQL, PL/SQL, procedures, functions, packages, views, materialized views, and collections, for Forms, Reports, Interfaces, and Conversions.
  • Developed PL/SQL scripts to load data from external systems into AP and AR interface tables.
  • Documented all the above-mentioned processes with MD050 and MD070 using the AIM methodology.
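
A condensed sketch of the secondary-sort pattern mentioned in the list above: a composite key carries the natural key plus a sort field, the partitioner and grouping comparator look only at the natural key, and compareTo() orders on both, so each reducer group arrives pre-sorted. All field and class names here are illustrative.

```java
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.WritableComparable;
import org.apache.hadoop.io.WritableComparator;
import org.apache.hadoop.mapreduce.Partitioner;

public class CompositeKey implements WritableComparable<CompositeKey> {
    String naturalKey = "";
    long   timestamp;

    public void write(DataOutput out) throws IOException {
        out.writeUTF(naturalKey);
        out.writeLong(timestamp);
    }
    public void readFields(DataInput in) throws IOException {
        naturalKey = in.readUTF();
        timestamp  = in.readLong();
    }
    // Sort by natural key first, then by timestamp (the "secondary" sort).
    public int compareTo(CompositeKey o) {
        int c = naturalKey.compareTo(o.naturalKey);
        return c != 0 ? c : Long.compare(timestamp, o.timestamp);
    }
}

// Partition on the natural key only, so all records for one key
// reach the same reducer regardless of timestamp.
class NaturalKeyPartitioner extends Partitioner<CompositeKey, Object> {
    public int getPartition(CompositeKey key, Object value, int numPartitions) {
        return (key.naturalKey.hashCode() & Integer.MAX_VALUE) % numPartitions;
    }
}

// Group reducer input on the natural key only, ignoring the timestamp,
// so one reduce() call sees the whole, already-sorted group.
class NaturalKeyGroupingComparator extends WritableComparator {
    NaturalKeyGroupingComparator() { super(CompositeKey.class, true); }
    public int compare(WritableComparable a, WritableComparable b) {
        return ((CompositeKey) a).naturalKey
                .compareTo(((CompositeKey) b).naturalKey);
    }
}
```

In the job driver these would be wired up with job.setPartitionerClass(NaturalKeyPartitioner.class) and job.setGroupingComparatorClass(NaturalKeyGroupingComparator.class).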
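
And a minimal sketch of a dynamic-partition load, assuming Hive is reached over HiveServer2 JDBC with the hive-jdbc driver on the classpath; the host, credentials, and table names are placeholders. Hive derives the partition value for each row from the last column of the SELECT list.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class DynamicPartitionLoad {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver"); // for older JDBC setups
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hive2://hive-host:10000/default", "user", ""); // placeholders
             Statement st = conn.createStatement()) {
            // Allow Hive to create partitions from the data itself.
            st.execute("SET hive.exec.dynamic.partition = true");
            st.execute("SET hive.exec.dynamic.partition.mode = nonstrict");
            // event_date, the partition column, comes last in the SELECT list.
            st.execute("INSERT INTO TABLE events_partitioned PARTITION (event_date) "
                     + "SELECT id, payload, event_date FROM events_staging");
        }
    }
}
```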

Environment: Hadoop, Hive, HBase, Git, Pig, Java, Kafka, MapReduce, Sqoop, Oracle Apps EBS R12, SQL, PL/SQL, Oracle Reports 10g, BI/XML Publisher, UNIX, SQL Developer.

Confidential

Java/J2EE Developer

Responsibilities:

  • Implemented Hibernate for the ORM layer in transacting with the MySQL database.
  • Built event-driven applications using AJAX, object-oriented JavaScript, JSON, and XML.
  • Good knowledge of developing asynchronous applications using jQuery.
  • Good experience with form validation using regular expressions and jQuery Lightbox.
  • Involved in the design and development of the UI using HTML, JavaScript, and CSS.
  • Developed, coded, tested, debugged, and deployed JSPs and Servlets for the input and output forms on web browsers.
  • Designed and developed various data-gathering forms using HTML, CSS, JavaScript, JSP, and Servlets.
  • Developed user interface modules using JSP, Servlets, and the MVC framework.
  • Implemented J2EE standards and the MVC2 architecture using the Struts framework.
  • Developed J2EE components on Eclipse IDE.
  • Used JDBC to invoke stored procedures and for SQL database connectivity; a servlet sketch follows this list.
  • Deployed the applications on the Tomcat application server.
  • Created Java Beans accessed from JSPs to transfer data across tiers.
  • Modified the database using SQL, PL/SQL, stored procedures, triggers, and views in Oracle 9i.
  • Worked through the bug queue: analyzing, fixing, and escalating bugs.
  • Involved in significant customer interaction, resulting in stronger customer relationships.
  • Responsible for working with other developers across the globe on implementation of common solutions.
  • Involved in Unit Testing.
  • Designed and developed front-end modules with business logic.
  • Prepared unit test cases and was involved in executing them.
  • Involved in Bug Fixing.
  • Attended various conference calls to track project status.
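
As referenced above, here is a minimal sketch of a servlet invoking a stored procedure through JDBC; the JNDI datasource name, procedure signature, and form parameters are hypothetical, not taken from the actual project.

```java
import java.io.IOException;
import java.sql.CallableStatement;
import java.sql.Connection;
import javax.naming.InitialContext;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.sql.DataSource;

public class OrderServlet extends HttpServlet {
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        try {
            // Container-managed connection pool looked up via JNDI.
            DataSource ds = (DataSource) new InitialContext()
                    .lookup("java:comp/env/jdbc/appDB"); // hypothetical JNDI name
            try (Connection conn = ds.getConnection();
                 CallableStatement cs =
                         conn.prepareCall("{call update_order(?, ?)}")) {
                cs.setString(1, req.getParameter("orderId"));
                cs.setString(2, req.getParameter("status"));
                cs.execute(); // runs the PL/SQL procedure
            }
            resp.sendRedirect("confirmation.jsp");
        } catch (Exception e) {
            throw new ServletException(e);
        }
    }
}
```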

Environment: Java, JSP, Servlets, JDBC, Eclipse, Web services, Spring 3.0, Hibernate 3.0, MySQL, JSON, Struts, HTML, JavaScript, CSS

Confidential

Internship and Java/Oracle Apps Developer

Responsibilities:

  • During my internship, I attended several software trainings and completed a couple of POCs on Java/J2EE applications and Oracle Apps; after the internship, I started working on the OCA tool.
  • Understood the requirements and proposed better ideas wherever necessary.
  • Developed form personalizations.
  • Developed JSPs and JavaScript.
  • Developed PL/SQL packages.
  • Developed AOL objects.
  • Involved in Bug Fixing.
  • Wrote Oracle packages to store data in base tables.
  • Implemented the OCA tool in Invoices, Purchase Orders, and Sales Orders.
  • Used Oracle APIs to insert attachments into the Oracle database as objects.
  • Used Java Applets to call the hardware from the web interface.
  • Developed UI using JSP and CSS.
  • Performed Oracle form personalization to call the JSP page/scanner from Oracle Forms.
  • Created custom objects like categories, data types, LOB objects for all new attachments.
  • Created a software kit bundle for ready-to-install setups.
  • Created a user manual for the OCA tool.

Environment: Oracle Apps Financials EBS R12, SQL, PL/SQL, Oracle, UNIX, SQL Developer, Java, JSP, CSS, Applets, Oracle forms.
