Hadoop Consultant Resume

LA, CA

SUMMARY:

  • 7+ years of overall experience in systems administration and enterprise application development across diverse industries, with hands-on experience in Big Data ecosystem technologies
  • 2.5 years of comprehensive experience as a Hadoop, Big Data & Analytics Developer
  • Experienced in processing Big data on the Apache Hadoop framework using MapReduce programs
  • Experienced in installation, configuration, supporting and monitoring Hadoop clusters using Apache, Cloudera distributions and AWS
  • Experienced with YARN for interactive SQL, real-time streaming, data science and batch processing workloads on data stored in a single platform
  • Experienced in using Pig, Hive, Sqoop, Oozie, ZooKeeper, HBase and Cloudera Manager
  • Imported and exported data using Sqoop from HDFS to RDBMS
  • Application development using Java, RDBMS, and Linux shell scripting
  • Extended Hive and Pig core functionality by writing custom UDFs (a minimal UDF is sketched after this summary)
  • Experienced in analyzing data using HiveQL, Pig Latin and custom MapReduce programs in Java
  • Familiar with Java virtual machine (JVM) and multi-threaded processing
  • Worked on NoSQL databases including HBase, Cassandra and MongoDB
  • Experienced in job workflow scheduling with Oozie and cluster coordination with ZooKeeper
  • Experienced in designing, developing and implementing connectivity products that allow efficient exchange of data between the core database engine and the Hadoop ecosystem
  • Experienced in Data warehousing and using ETL tools like Informatica and Pentaho
  • Expert-level skills in developing intranet/internet applications using Java/J2EE technologies, including the Struts framework, MVC design patterns, Chordiant, Servlets, JSP, JSTL, XML/XSLT, JavaScript, AJAX, EJB, JDBC, JMS, JNDI, RDBMS, SOAP, Hibernate and custom tag libraries
  • Experience using XML, XSD and XSLT
  • Experience with web-based UI development using jQuery UI, jQuery, ExtJS, CSS, HTML, HTML5, XHTML and JavaScript
  • Extensive experience in middle-tier development using J2EE technologies such as JDBC, JNDI, JSP, Servlets, JSF, Struts, Spring, Hibernate and EJB
  • Excellent technical skills, a record of delivering ahead of schedule, and strong interpersonal and communication skills
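
To make the custom-UDF bullet concrete, here is a minimal sketch of a Hive UDF in Java, written against the classic org.apache.hadoop.hive.ql.exec.UDF base class that matches the Hive versions of this period; the class name and the normalization rule are hypothetical, not taken from any project described below.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical UDF that normalizes free-text fields before analysis.
    public final class NormalizeText extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null; // pass NULLs through, as Hive expects
            }
            // Lower-case the value and collapse runs of whitespace.
            String cleaned = input.toString().trim().toLowerCase().replaceAll("\\s+", " ");
            return new Text(cleaned);
        }
    }

Packaged in a jar, a function like this would be registered in a Hive session with ADD JAR and CREATE TEMPORARY FUNCTION before being called from HiveQL.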

TECHNICAL SKILLS:

Hadoop/Big Data: HDFS, MapReduce, HBase, Pig, Hive, Sqoop, Flume, MongoDB, Cassandra, PowerPivot, Puppet, Oozie, ZooKeeper

Java & J2EE Technologies: Core Java, Servlets, JSP, JDBC, JNDI, JavaBeans

IDEs: Eclipse, NetBeans

Big Data Analytics: Datameer 2.0.5

Frameworks: MVC, Struts, Hibernate, Spring

Programming languages: C, C++, Java, Python, Ant scripts, Linux shell scripts

Databases: Oracle 11g/10g/9i, MySQL, DB2, MS-SQL Server

Web Servers: WebLogic, WebSphere, Apache Tomcat

Web Technologies: HTML, XML, JavaScript, AJAX, SOAP, WSDL

Network Protocols: TCP/IP, UDP, HTTP, DNS, DHCP

ETL Tools: Informatica, Pentaho

Testing: WinRunner, LoadRunner, QTP

PROFESSIONAL EXPERIENCE:

Confidential, LA, CA

Hadoop Consultant

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop
  • Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, managing and reviewing data backups and Hadoop log files
  • Continuously monitored and managed the Hadoop cluster through Cloudera Manager
  • Upgraded the Hadoop cluster from CDH3 to CDH4, set up a high-availability cluster and integrated Hive with existing applications
  • Analyzed data by running Hive queries and Pig scripts to understand user behavior
  • Installed the Oozie workflow engine to run multiple Hive and Pig jobs
  • Handled importing of data from various sources, performed transformations using Hive and MapReduce, loaded data into HDFS and extracted data from Teradata into HDFS using Sqoop
  • Worked extensively with Sqoop for importing metadata from Oracle
  • Configured Sqoop and developed scripts to extract data from MySQL into HDFS
  • Hands-on experience productionizing Hadoop applications: administration, configuration management, monitoring, debugging and performance tuning
  • Created HBase tables to store various formats of PII data coming from different portfolios (see the sketch after this list)
  • Provided cluster coordination services through ZooKeeper
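
As a hedged sketch of the HBase table work above, the following Java snippet uses the HBase client API of the CDH3/CDH4 era (HBaseAdmin, HTableDescriptor) to create a table; the table name and column families are hypothetical placeholders, not the actual production schema.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.HColumnDescriptor;
    import org.apache.hadoop.hbase.HTableDescriptor;
    import org.apache.hadoop.hbase.client.HBaseAdmin;

    public class CreatePiiTable {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create(); // reads hbase-site.xml
            HBaseAdmin admin = new HBaseAdmin(conf);
            // Hypothetical table: one column family per incoming data format.
            HTableDescriptor table = new HTableDescriptor("pii_data");
            table.addFamily(new HColumnDescriptor("raw"));
            table.addFamily(new HColumnDescriptor("normalized"));
            if (!admin.tableExists("pii_data")) {
                admin.createTable(table);
            }
            admin.close();
        }
    }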

Environment: Hadoop, MapReduce, HDFS, Hive, Java, SQL, Cloudera Manager, Pig, Sqoop, Oozie, ZooKeeper, Teradata, PL/SQL, MySQL, Windows, HBase

Confidential, Edison, NJ

Hadoop and Java Developer

Responsibilities:

  • Installed and configured Hadoop MapReduce and HDFS, and developed multiple MapReduce jobs in Java for data cleansing and preprocessing (a map-only cleansing job is sketched after this list)
  • Imported and exported data into HDFS and Hive using Sqoop
  • Used multithreading, synchronization, caching and memory management
  • Applied Java/J2EE application development skills with object-oriented analysis and was involved extensively throughout the Software Development Life Cycle (SDLC)
  • Proactively monitored systems and services; worked on architecture design and implementation of Hadoop deployment, configuration management, backup and disaster recovery systems and procedures
  • Extracted files from CouchDB through Sqoop, placed them in HDFS and processed them
  • Used Flume to collect, aggregate, and store the web log data from different sources like web servers, mobile and network devices and pushed to HDFS
  • Loaded and transformed large sets of structured, semi-structured and unstructured data
  • Supported MapReduce programs running on the cluster
  • Wrote shell scripts to monitor the health of Hadoop daemon services and respond to any warning or failure conditions
  • Involved in loading data from UNIX file system to HDFS, configuring Hive and writing Hive UDFs
  • Utilized Java and MySQL from day to day to debug and fix issues with client processes
  • Managed and reviewed log files
  • Implemented partitioning, dynamic partitions and buckets in Hive
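
The sketch below illustrates the first bullet: a minimal map-only cleansing job in Java against the old-style (Hadoop 0.20/1.x) MapReduce API used on this stack. The pipe delimiter and the five-field rule are hypothetical examples of a cleansing criterion.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Map-only job that drops malformed delimited records before downstream processing.
    public class CleanseJob {

        public static class CleanseMapper
                extends Mapper<LongWritable, Text, Text, NullWritable> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                // Hypothetical rule: keep only rows with exactly five pipe-delimited fields.
                if (value.toString().split("\\|", -1).length == 5) {
                    context.write(value, NullWritable.get());
                }
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = new Job(new Configuration(), "cleanse");
            job.setJarByClass(CleanseJob.class);
            job.setMapperClass(CleanseMapper.class);
            job.setNumReduceTasks(0); // map-only: mapper output is the job output
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(NullWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }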

Environment: Hadoop, MapReduce, HDFS, Hive, CouchDB, Flume, Oracle 11g, Java, J2EE, Struts, Servlets, JDBC, JNDI, Maven, HTML, XML, SQL, JUnit, Tomcat 6, Eclipse

Confidential, Richmond, VA

Hadoop Consultant

Responsibilities:

  • Responsible for maintaining Hadoop clusters
  • Used Pig Scripting to transfer data from Hive tables to HBase via stage tables
  • Loaded data into HBase and extracted data from it; used Impala to query HBase tables
  • Monitored Hadoop cluster using tools like Nagios, Ganglia and Cloudera Manager
  • Wrote automation scripts, run as cron jobs, to monitor HDFS and HBase (a usage-check utility is sketched after this list)
  • Processed massive amounts of marketing information, complete with information enrichment, text analytics and natural language processing
  • Prepared multi-cluster test harness to exercise the system for performance and failover
  • Developed a high-performance cache and improved its performance
  • Provided administrative support for parallel computation research on a 24-node Fedora Linux cluster
  • Built and supported standard-based infrastructure capable of supporting tens of thousands of computers in multiple locations
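
As a hedged sketch of the monitoring automation above: a small Java utility that reports HDFS capacity through the FileSystem API and exits non-zero past a threshold, so a cron job can alert on its exit code. The 85% threshold is a hypothetical value.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.FsStatus;

    // Prints HDFS usage; exits non-zero when usage crosses a threshold so cron can alert.
    public class HdfsUsageCheck {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration()); // reads core-site.xml
            FsStatus status = fs.getStatus();
            double used = (double) status.getUsed() / status.getCapacity();
            System.out.printf("HDFS usage: %.1f%%%n", used * 100);
            System.exit(used > 0.85 ? 1 : 0); // hypothetical 85% alert threshold
        }
    }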

Environment: Hadoop, HBase, Hive, Pig, Impala, Nagios, Ganglia, Cloudera Manager, Kickstart, Kerberos, LDAP, Fedora Linux, Grid Services

Confidential, Roanoke, VA

Data Warehouse/BI Analyst

Responsibilities:

  • Created Enterprise Business Intelligence reports and worked as Dashboard designer in a Data Warehouse environment
  • Facilitated analyses of analytical business needs and translated them into system requirements and solution definitions
  • Developed and managed metrics, KPIs and reports based on user needs
  • Experienced with data dictionaries, data analysis, data marts and relational databases
  • Worked on creating data warehouse solutions: multidimensional models, logical and physical model design, data types, staging processes, and different design approaches (top-down, bottom-up) and schemas (star, snowflake; a star-schema sketch follows this list)
  • Led Hadoop ETL/ELT solution design and development using industry-standard and custom-built tools
  • Strong experience in designing and developing large-scale, optimized data warehouses using Teradata and Oracle
  • In-depth hands on experience in database, ETL/ELT design, development and excellent data analysis skills
  • Experience in developing DW solutions for Sales, Operations, Support and Finance
  • Proficient in SQL, with working knowledge of Informatica and Pentaho
  • Experience with commercial OLAP products and metadata tools
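
To make the star-schema bullet concrete, here is a minimal Java/JDBC sketch that creates one fact table referencing two dimension tables. The connection URL, credentials and table layout are hypothetical illustrations, not the actual warehouse design.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    // Minimal star schema: a central fact table keyed to dimension tables.
    public class StarSchemaSketch {
        public static void main(String[] args) throws Exception {
            Class.forName("oracle.jdbc.OracleDriver"); // Oracle JDBC driver on the classpath
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dw-host:1521/DW", "dw_user", "secret"); // hypothetical
            Statement stmt = conn.createStatement();
            stmt.execute("CREATE TABLE dim_date (date_key NUMBER PRIMARY KEY, cal_date DATE)");
            stmt.execute("CREATE TABLE dim_product (product_key NUMBER PRIMARY KEY, name VARCHAR2(100))");
            stmt.execute("CREATE TABLE fact_sales ("
                    + " date_key NUMBER REFERENCES dim_date(date_key),"
                    + " product_key NUMBER REFERENCES dim_product(product_key),"
                    + " units NUMBER, revenue NUMBER(12,2))");
            stmt.close();
            conn.close();
        }
    }

Reports then join the fact table to whichever dimensions they need, which is what makes the star layout friendly to BI tools.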

Environment: Teradata, Oracle, Informatica, Pentaho, SQL, OLAP tools

Confidential, Chicago, IL

Sr. JAVA Developer

Responsibilities:

  • Developed the application using the Struts framework, which follows the classic Model-View-Controller (MVC) architecture; UML diagrams such as use case, class, interaction (sequence and collaboration) and activity diagrams were used during design
  • Worked in an Agile work environment with Content Management system for workflow management and content versioning
  • Involved in designing user screens and validations using HTML, jQuery, Ext JS and JSP as per user requirements
  • Responsible for validation of Client interface JSP pages using Struts form validations
  • Integrated Struts with Spring IoC
  • Used Spring dependency injection to provide loose coupling between layers (see the sketch after this list)
  • Implemented the Web Service client for the login authentication, credit reports and applicant information using Apache Axis 2 Web Service
  • Used the Hibernate ORM framework with the Spring framework for data persistence and transaction management
  • Used Hibernate 3.0 object-relational mapping to persist and retrieve data from the database
  • Wrote SQL queries, stored procedures, and triggers to perform back-end database operations
  • Developed ANT Scripts to do compilation, packaging and deployment in the WebSphere server
  • Implemented the logging mechanism using Log4j framework
  • Wrote test cases in JUnit for unit testing of classes
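
A minimal sketch of the loose coupling described above, assuming a hypothetical credit-report client such as the Axis 2 web service mentioned earlier. Each type would live in its own source file, and Spring 2.0 XML bean definitions would wire the concrete implementation into the service by reference.

    // CreditReportClient.java: the service depends only on this interface.
    public interface CreditReportClient {
        String fetchReport(String applicantId);
    }

    // LoanReviewService.java: Spring injects the implementation via the setter,
    // so layers stay loosely coupled and tests can substitute a stub.
    public class LoanReviewService {
        private CreditReportClient creditReportClient;

        public void setCreditReportClient(CreditReportClient client) {
            this.creditReportClient = client;
        }

        public String review(String applicantId) {
            return creditReportClient.fetchReport(applicantId);
        }
    }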

Environment: JDK 1.5, J2EE 1.4, Agile Development Process, Struts 1.3, Spring 2.0, Web Services (JAX-WS, Axis 2), Hibernate 3.0, RSA, JMS, JSP, Servlets 2.5, WebSphere 6.1, SQL Server 2005, Windows XP, HTML, XML, XSLT, XSD, IBM Rational Application Developer (RAD), ANT 1.6, Log4j, jQuery, JavaScript, Ext JS, JUnit 3.8, SVN

Confidential

Junior JAVA Developer

Responsibilities:

  • Involved in Design, Development and Support phases of Software Development Life Cycle (SDLC)
  • Reviewed the functional, design, source code and test specifications
  • Involved in developing the complete front end using JavaScript and CSS
  • Authored functional, design and test specifications
  • Implemented the backend, configuration DAO and XML generation modules of DIS
  • Analyzed, designed and developed the component
  • Used JDBC for database access
  • Used the Data Transfer Object (DTO) design pattern (a sketch follows this list)
  • Performed unit testing and rigorous integration testing of the whole application
  • Wrote and executed test scripts using JUnit
  • Actively involved in system testing
  • Developed XML parsing tool for regression testing
  • Prepared the installation, customer and configuration guides that were delivered to the customer along with the product
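
As a brief sketch of the DTO pattern noted above: a serializable value object that carries data between tiers without exposing domain or persistence logic. The class and field names are hypothetical.

    import java.io.Serializable;

    // Carries data across tiers (e.g., servlet to JSP) as a plain value object.
    public class CustomerDTO implements Serializable {
        private String id;
        private String name;

        public String getId() { return id; }
        public void setId(String id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }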

Environment: Java, JavaScript, HTML, CSS, JDK 1.5.1, JDBC, Oracle 10g, XML, XSL, Solaris, UML
