
Hadoop Consultant Resume


Tulsa, Oklahoma

SUMMARY

  • Over seven years of comprehensive IT experience in Big Data and Big Data Analytics across the Banking, Insurance, and Energy industries.
  • Substantial experience writing MapReduce jobs in Java, Pig, and Python.
  • Experience working with Java, C++, and C.
  • Hands-on experience productionizing Hadoop applications: administration, configuration management, monitoring, debugging, and performance tuning.
  • Hands-on experience installing, configuring, and using ecosystem components such as Hadoop MapReduce, HDFS, HBase, ZooKeeper, Oozie, Hive, Cassandra, Sqoop, Pig, and Flume.
  • Extensive experience in SQL and NoSQL development.
  • In-depth understanding of data structures and algorithms.
  • Knowledge of Mahout.
  • Experience creating web-based applications using ActiveX Controls, JSP, and Servlets.
  • Experience implementing open-source frameworks such as Struts, Spring, Hibernate, and Web Services.
  • Experience deploying applications on heterogeneous application servers: Tomcat, WebLogic, and Oracle Application Server.
  • Worked in multi-clustered environments and set up the Cloudera Hadoop ecosystem.
  • Background with traditional databases such as Oracle, Teradata, and SQL Server, as well as ETL tools/processes and data warehousing architectures.
  • Extensive experience designing analytical/OLAP and transactional/OLTP databases.
  • Proficient with ERwin for designing backend data models and entity relationship diagrams (ERDs) for star schemas, snowflake dimensions, and fact tables.
  • Ability to perform at a high level, meet deadlines, and adapt to ever-changing priorities.

TECHNICAL SKILLS

Big Data Ecosystem: Hadoop, MapReduce, HDFS, HBase, ZooKeeper, Hive, Pig, Sqoop, Cassandra, Oozie, Flume

Programming Languages: Java, J2EE (JDBC, Servlets, JSP, Multithreading)

Frameworks: Struts, Spring, Hibernate

Web Technologies: HTML, DHTML, JavaScript, Ajax, CSS, XML, DTD

Web Services: Apache CXF/XFire, Apache Axis, SOAP

Testing/Logging Tools: JUnit, EasyMock, JMock, Log4j

Databases: Oracle 11g/10g, DB2, MySQL, SQL Server, Teradata

Application Servers: Apache Tomcat, JBoss, WebSphere, WebLogic

Tools: ANT, TOAD

Operating Systems: Windows XP/Vista/7, UNIX, Linux

PROFESSIONAL EXPERIENCE

Hadoop Consultant

Confidential, Tulsa, Oklahoma

Responsibilities:

  • Responsible for complete SDLC management using methodologies such as Agile, Incremental, and Waterfall.
  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing (a minimal mapper sketch follows this list).
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Managed and reviewed Hadoop log files.
  • Ran Hadoop streaming jobs to process terabytes of XML-format data.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data.
  • Managed data coming from different sources.
  • Supported MapReduce programs running on the cluster.
  • Managed jobs using the Fair Scheduler.
  • Provided cluster coordination services through ZooKeeper.
  • Loaded data from the UNIX file system into HDFS.
  • Installed and configured Hive and wrote Hive UDFs (a skeleton UDF also follows this list).
  • Created Hive tables, loaded them with data, and wrote Hive queries that run internally as MapReduce jobs.
  • Automated the jobs that pull data from the FTP server and load it into Hive tables using Oozie workflows.
  • Performed data scrubbing and processing with Oozie.
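
Below is a minimal sketch of the kind of Java data-cleaning mapper used in this role. The class name, expected field count, and counter names are illustrative assumptions, not details from the actual project:

    import java.io.IOException;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Illustrative data-cleaning mapper: drops blank or malformed records and
    // trims whitespace so downstream jobs see only well-formed input.
    public class CleanRecordsMapper
            extends Mapper<LongWritable, Text, Text, NullWritable> {

        private static final int EXPECTED_FIELDS = 5; // assumed record width

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString().trim();
            if (line.isEmpty()) {
                return; // skip blank lines
            }
            if (line.split(",").length != EXPECTED_FIELDS) {
                context.getCounter("clean", "malformed").increment(1);
                return; // count and skip malformed records
            }
            context.write(new Text(line), NullWritable.get());
        }
    }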
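
And a skeleton of a Hive UDF in the style written here; the function's purpose (normalizing text fields) is an assumption for illustration:

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Illustrative UDF: normalizes free-text columns before comparison.
    // Registered in Hive via ADD JAR and CREATE TEMPORARY FUNCTION.
    public final class NormalizeText extends UDF {

        public Text evaluate(Text input) {
            if (input == null) {
                return null; // Hive passes NULLs through
            }
            return new Text(input.toString().trim().toLowerCase());
        }
    }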

Environment: Hadoop, MapReduce, HDFS, Hive, Oozie, Java (JDK 1.6), Hadoop distributions from Hortonworks, Cloudera, and MapR, Oracle 11g/10g, PL/SQL, SQL*Plus, Toad 9.6, Windows NT, UNIX Shell Scripting.

Hadoop Consultant

Confidential, Richmond, Virginia

Responsibilities:

  • Worked on Hadoop MapReduce as a Programmer Analyst, covering business requirement gathering, analysis, scoping, documentation, design, development, and test case creation.
  • Developed big data analytic models for customer fraud transaction pattern detection using Hive on customer transaction data, including transaction sequence analysis (with and without gaps) and network analysis between common customers for the top fraud patterns.
  • Developed a customer transaction event path tree extraction model using Hive on customer transaction data.
  • Enhanced and optimized the customer path tree GUI viewer to incrementally load tree data from the HBase NoSQL database (a read sketch follows this list); used the Prefuse open-source Java framework for the GUI.
  • Designed and implemented MapReduce jobs to support distributed data processing.
  • Processed large data sets on the Hadoop cluster.
  • Designed NoSQL schemas in HBase.
  • Developed MapReduce ETL in Java/Pig.
  • Performed extensive data validation using Hive.
  • Imported and exported data using Sqoop between HDFS and relational database systems.
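
A rough illustration of the HBase read path behind the incremental tree loading described above. The table, column family, and qualifier names are assumptions, and the sketch uses the classic HTable client API of that era:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.util.Bytes;

    // Illustrative read path: fetch one node of the customer path tree by row
    // key so the GUI can expand the tree incrementally rather than loading it
    // all at once.
    public class PathTreeReader {

        public static String fetchNode(String rowKey) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            HTable table = new HTable(conf, "path_tree"); // table name assumed
            try {
                Get get = new Get(Bytes.toBytes(rowKey));
                get.addColumn(Bytes.toBytes("t"), Bytes.toBytes("children"));
                Result result = table.get(get);
                byte[] value = result.getValue(Bytes.toBytes("t"),
                        Bytes.toBytes("children"));
                return value == null ? null : Bytes.toString(value);
            } finally {
                table.close();
            }
        }
    }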

Environment: Eclipse IDE, Linux, Hadoop MapReduce, Pig Latin, Sqoop, Java, Hive, HBase, UNIX Shell Scripting.

Java/J2EE Developer

Confidential, Cleveland, Ohio

Responsibilities:

  • Involved in designing and documenting flows and functional diagrams.
  • Responsible for writing AJAX functions using JavaScript and CSS in HTML.
  • Used JavaScript code, HTML markup, and CSS style declarations to enrich websites.
  • Involved in creating and extracting data from the database using SQL queries, stored procedures, triggers, and packages on the Oracle database.
  • Implemented the application using the Struts Framework, which is based on the MVC design pattern.
  • Used Hibernate for object-relational mapping and Spring AOP for connection management and transaction management.
  • Used the Struts Validation Framework and JavaScript for server-side and client-side validation.
  • Deployed the application on WebSphere application servers.
  • Used ClearQuest as the bug tracking system; extracted logging errors with Log4j.
  • Wrote test cases for unit-level testing using JUnit.
  • Made extensive use of the ANT build process for delivery of the end product.
  • Used Apache CXF as the web services framework to implement the REST APIs involved (a bare-bones resource sketch follows this list).
  • Involved in testing and debugging the complete flow of the modules.
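
A bare-bones sketch of a REST endpoint of the kind implemented with Apache CXF. The resource path and payload are invented for illustration; CXF serves standard JAX-RS annotated classes like this once they are registered with its servlet or Spring context:

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.Produces;

    // Illustrative JAX-RS resource exposing a read-only account lookup.
    @Path("/accounts")
    public class AccountResource {

        @GET
        @Path("/{id}")
        @Produces("application/xml")
        public String getAccount(@PathParam("id") String id) {
            // Real code would look the account up and marshal a domain
            // object; a canned payload keeps the sketch self-contained.
            return "<account><id>" + id + "</id></account>";
        }
    }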

Environment: J2EE, JDBC API, XML, HTML, AJAX, CSS, JavaScript, Servlet 2.3, JSP, Spring AOP, Struts 2.0, Hibernate, ClearQuest, WebSphere, DB2, SQL, JUnit, ANT, Log4j, Eclipse 3.1, UNIX.

Java/J2EE Developer

Confidential, Neenah, WI

Responsibilities:

  • Involved in various phases of software development, such as modeling, system analysis and design, code generation, and testing, using the Agile methodology.
  • Participated in daily stand-up meetings with the Scrum Master.
  • Designed and developed the web interface in the J2EE framework using the Struts Framework (MVC design pattern), J2EE, Servlets, JavaBeans, JDBC, Swing, HTML, DHTML, and XML per the Use Case specification.
  • Produced visual models of the system by generating UML use-case diagrams from the requirements.
  • Designed, developed, and deployed the application using Eclipse and the Tomcat application server.
  • Designed classes using object-oriented design (OOD) concepts such as encapsulation and inheritance.
  • Created custom tags to reuse common functionality (a minimal tag handler follows this list).
  • Prepared and reviewed test cases for the module using the user requirement documents.
  • Extensively used CVS version control to maintain the source code.
  • Involved in testing the module per user requirements.
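
A minimal tag handler of the sort described above; the tag name, attribute, and markup are illustrative:

    import java.io.IOException;

    import javax.servlet.jsp.JspException;
    import javax.servlet.jsp.tagext.TagSupport;

    // Illustrative tag handler: prints a formatted header so the same markup
    // can be reused across JSPs via a single <app:header title="..."/> tag.
    public class HeaderTag extends TagSupport {

        private String title; // populated from the tag attribute in the JSP

        public void setTitle(String title) {
            this.title = title;
        }

        @Override
        public int doStartTag() throws JspException {
            try {
                pageContext.getOut().print(
                        "<h1 class=\"app-header\">" + title + "</h1>");
            } catch (IOException e) {
                throw new JspException(e);
            }
            return SKIP_BODY;
        }
    }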

Environment: Java/J2EE, JSP 1.2, Hibernate, Eclipse 2.0, WAS, Struts 1.2, Struts Tiles 1.x

Oracle Developer

Confidential

Responsibilities:

  • Worked on data modeling, covering both physical and logical design.
  • Developed the operational plan for ETL, data loading, and data cleaning, and wrote shell scripts to automate the process.
  • Performed backend programming using Oracle 10g/9i.
  • Wrote stored procedures, functions, and other PL/SQL blocks to automate processing and loading data into tables (an illustrative caller follows this list).
  • Wrote UNIX shell scripts to monitor Oracle instance performance, tablespaces, users, and objects, and to automatically send email to pagers.
  • Involved in designing, modeling, and maintaining the database.
  • Created schema objects through the ERwin tool and transferred data from non-Oracle platforms to the Oracle database using SQL*Loader.
  • Involved in monitoring, tuning, auditing users, assigning roles and privileges, and backup and recovery.
  • Involved in partition design, tablespace management, and table partitioning for data warehousing purposes.
  • Developed shell scripts to execute different module procedures for batch processing.
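
As a sketch of how such a PL/SQL load procedure might be invoked from a JDBC client (the procedure name, signature, and connection details are assumptions; in practice these procedures were typically driven from shell scripts via SQL*Plus):

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;

    // Illustrative caller for an assumed PL/SQL procedure
    // load_staging(p_batch_date IN VARCHAR2).
    public class LoadRunner {

        public static void main(String[] args) throws Exception {
            Class.forName("oracle.jdbc.OracleDriver"); // Oracle thin driver
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@localhost:1521:ORCL", "user", "password");
            try {
                CallableStatement cs = conn.prepareCall("{call load_staging(?)}");
                cs.setString(1, "2009-01-31"); // batch date parameter
                cs.execute();
                cs.close();
            } finally {
                conn.close();
            }
        }
    }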

Environment: Oracle 9i, Oracle 10g, ERwin, SQL*Loader, TOAD, MS Visio, UNIX Shell Scripting, Sun Solaris/HP-UX/NT
