
Hadoop Administrator Resume


SUMMARY

  • 5 years of diverse experience in software engineering and administration.
  • 2+ years of experience in Hadoop administration with a leading global client.
  • 3 years of experience in Linux administration with Confidential.
  • 3 years of automation experience in Perl, shell, and UNIX.
  • 7 years of software development experience in Java, C, C++, and Linux.
  • Experience in installation, configuration and management of Hadoop Clusters
  • Experience with Hortonworks HDP 1.3 through HDP 2.4 distributions.
  • Experience in using Ambari for tracking cluster utilization and defining data lifecycle rules.
  • Good knowledge of deploying Hadoop 2 clusters on the AWS EC2 cloud service.
  • In-depth knowledge of the functionality of every Hadoop daemon, the interactions between them, resource utilization, and dynamic tuning to keep the cluster available and efficient.
  • Experience in securing Hadoop clusters with Kerberos and SSL.
  • Experience in creating job pools, assigning users to pools and restricting production job submissions based on pool
  • Experience in setting up monitoring tools such as Nagios and Ganglia to monitor and analyze cluster health.
  • Experience in setting up and managing Hive, Oozie, and Spark.
  • Good understanding of NoSQL databases such as HBase and Cassandra.
  • Experience in analyzing data on HDFS through MapReduce, Hive, and Pig.
  • Extensive experience with big data ETL and query tools such as Pig Latin and HiveQL.
  • Experience in setting up and scheduling workflows using Oozie.
  • Experience with UNIX commands and shell scripting.
  • Excellent interpersonal, communication, documentation and presentation skills
  • Strong experience in interacting with business analysts and developers to analyze the user requirements, functional specifications and system specifications.

TECHNICAL SKILLS

Hadoop/Big Data platform: HDFS, MapReduce, HBase, Cassandra, Hive, Pig, Oozie, ZooKeeper, Flume, Sqoop, Spark, Storm

Hadoop distribution: Cloudera, Hortonworks

Admin operations: Access control, Cluster maintenance, Performance tuning, Storage capacity management

Programming Languages: C, C++, Java, Pig Latin

Web Development Tools: VB Script

Operating Systems: Windows, HP-UX, Linux (RHEL), IBM AIX, Ubuntu, CentOS

Databases: MySQL, HBase, Cassandra

Scripting Languages: Perl, Shell, Python

PROFESSIONAL EXPERIENCE

Confidential

Hadoop Administrator

Responsibilities:

  • Worked on HDFS commands, and on HA with QJM and NFS (see the health-check sketch at the end of this section).
  • Worked with HDFS architecture, permissions and quotas, and HFTP.
  • Worked on HDP rolling upgrades.
  • Well versed in YARN architecture, schedulers, and ResourceManager HA.
  • Created and set up HDP 2.2 Hadoop production clusters single-handedly for the client.
  • Worked on implementing Kerberos on the Hadoop cluster.
  • Configured SSL on Hadoop clusters.
  • Responsible for building scalable distributed data solutions using Hadoop (Cloudera).
  • Optimized MapReduce jobs to use HDFS efficiently by applying various compression mechanisms.
  • Worked on Ambari metrics service in embedded mode.
  • Enabled HA for the NameNode, ResourceManager, and Hive Metastore.
  • Monitored Hadoop cluster job performance and capacity planning.
  • Monitored and reviewed Hadoop log files.
  • Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, and reviewing Hadoop log files.
  • Handled importing of data from various data sources, performed transformations using Hive, MapReduce, and loaded data into HDFS.
  • Analyzed data by running Hive queries and Pig scripts to understand user behavior.
  • Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
  • Installed the Oozie workflow engine to run multiple Hive jobs.
  • Integrated Impala and Hive with Tableau.
  • Performed performance tuning of Impala jobs and resource management in the cluster.
  • Installed and configured Spark, Oozie, and Storm.

Environment: MapReduce, HDFS, Hive, SQL, Oozie, Sqoop, UNIX shell scripting, YARN, Spark.
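
As an illustration of the routine HDFS and HA checks referenced in the responsibilities above, below is a minimal health-check sketch in shell. It assumes a QJM-based HA HDFS cluster; the NameNode service IDs (nn1, nn2) are hypothetical placeholders, not values from the actual environment.

    #!/bin/bash
    # Minimal sketch of routine HDFS/HA health checks (illustrative; nn1/nn2 are hypothetical service IDs)
    hdfs dfsadmin -report                       # capacity summary and live/dead DataNodes
    hdfs dfsadmin -safemode get                 # confirm the NameNode is not stuck in safe mode
    hdfs haadmin -getServiceState nn1           # which NameNode is active (QJM-based HA)
    hdfs haadmin -getServiceState nn2           # which NameNode is standby
    hdfs fsck / -files -blocks -locations | tail -n 20   # file system integrity summary
    yarn node -list -all                        # NodeManager status across the cluster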

Confidential

Linux Administrator

Responsibilities:

  • Worked on installation and configuration of Linux clusters.
  • Performed network OS installations.
  • Configured and scheduled tasks using cron (see the example crontab entries at the end of this section).
  • Attached systems to network directory services (NIS and LDAP).
  • Configured Autofs.
  • Added and managed users, groups, quotas, and file access control lists.
  • Configured and managed RPM packages using YUM repositories.
  • Updated kernel packages.
  • Modified the boot loader and booted with kernel options.
  • Configured RAID and network bonding.
  • Capable of configuring network services such as HTTP(S), SMB, NFS, web proxy, SMTP, IMAP, SSH, DNS, DHCP, and NTP.
  • Installed and configured monitoring services such as Nagios and Ganglia on Linux and Hadoop clusters.

Environment: Red Hat Linux, IBM AIX, Ubuntu, CentOS, Windows.
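
As a concrete example of the cron-based task scheduling mentioned above, the entries below show the kind of crontab lines that might be used; the script paths and schedules are hypothetical and would normally be added with crontab -e.

    # Hypothetical crontab entries (added via `crontab -e`); paths and times are illustrative
    # Rotate application logs every day at 02:30
    30 2 * * * /usr/local/bin/rotate_app_logs.sh >> /var/log/rotate_app_logs.log 2>&1
    # Refresh the local YUM metadata cache every Sunday at 04:00
    0 4 * * 0 /usr/bin/yum makecache >> /var/log/yum_makecache.log 2>&1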
