
Hadoop Admin Resume


Richardson, TX

SUMMARY

  • 11+ years of extensive IT experience, with 5 years as a Hadoop Administrator and 6 years as a Linux/Network Administrator.
  • Hands-on experience with the Hadoop stack (HDFS, MapReduce, YARN, Sqoop, Flume, Hive/Beeline, Impala, Tez, Pig, Zookeeper, Oozie, Solr, Sentry, Kerberos, Centrify DC, Falcon, Hue, Kafka, and Storm).
  • Experience with Cloudera Hadoop clusters running CDH 5.6.0 with Cloudera Manager 5.7.0.
  • Experienced with Hortonworks Hadoop clusters running HDP 2.4 with Ambari 2.2.
  • Hands-on day-to-day operation of the environment, with knowledge and deployment experience across the Hadoop ecosystem.
  • Configured property files such as core-site.xml, hdfs-site.xml, mapred-site.xml, and hadoop-env.sh based on job requirements.
  • Installed, configured, and maintained Apache Hadoop clusters for application development, along with Hadoop tools such as Hive, Pig, HBase, Zookeeper, and Sqoop.
  • Strong experience in automating system administration tasks using Perl, Python, and shell scripts.
  • Extensive experience in installing, upgrading, maintaining, and configuring Red Hat Enterprise Linux on Dell PowerEdge servers using custom and Kickstart installation, and Sun Solaris SPARC servers using Jumpstart installation.
  • Experience in installing, configuring, and optimizing Cloudera Hadoop versions CDH 3, CDH 4.x, and CDH 5.x in a multi-cluster environment.
  • Commissioning and decommissioning cluster nodes, and data migration.
  • Involved in setting up a DR cluster with BDR replication and implementing wire encryption as well as encryption for data at rest.
  • Implemented TLS (Level 3) security across all CDH services along with Cloudera Manager.
  • Implemented DataGuise analytics over the secured cluster.
  • Successfully implemented Blue-Talend integration and Greenplum migration.
  • Ability to plan and manage HDFS storage capacity and disk utilization (a shell sketch of typical admin commands follows this list).
  • Assist developers with troubleshooting MapReduce and BI jobs as required.
  • Provide granular ACLs for local file datasets as well as HDFS URIs; maintain role-level ACLs.
  • Cluster monitoring and troubleshooting using tools such as Cloudera Manager, Ganglia, Nagios, and Ambari metrics.
  • Experience in importing and exporting data using Sqoop between HDFS and relational database systems/mainframe (see the Sqoop sketch after this list).
  • Experience in collecting and ingesting logs using Flume.
  • Optimizing performance of HBase/Hive/Pig jobs.
  • Hands-on experience with Zookeeper and ZKFC for managing and configuring NameNode failover scenarios.
  • Expertise in automating processes such as starting and stopping servers using shell scripting.
  • Experience in monitoring the availability and performance of Red Hat Linux servers through tools like mpstat, vmstat, iostat, netstat, and nfsstat.
  • Good working knowledge of network & UNIX security procedures.
  • Participate in installing and configuring UNIX/Linux-based Oracle 10g products.
  • Experience in backup/restore of PostgreSQL, Informix databases.
  • Excellent knowledge of Cassandra architecture, Cassandra data modeling, and monitoring Cassandra using OpsCenter.
  • Manage and review HDFS data backups and restores on Production cluster.
  • Excellent knowledge of CQL (Cassandra Query Language) for retrieving data from a Cassandra cluster.
  • Implement new Hadoop infrastructure, OS integration, and application installation. Install OS (RHEL 5/6, CentOS, and Ubuntu) and Hadoop updates, patches, and version upgrades as required.
  • Implement and maintain cluster security (LDAP, Kerberos) as designed.
  • Expert in setting up Hortonworks (HDP 2.4) clusters with and without Ambari 2.2.
  • Experienced in setting up Cloudera (CDH 5.6) clusters using packages as well as parcels with Cloudera Manager 5.7.0.
  • In-depth understanding of Hadoop architecture and its components, including HDFS, YARN, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce concepts.
  • Solid understanding of all phases of development using multiple methodologies, such as Agile with JIRA and Kanban boards, along with the ticketing tools Remedy and ServiceNow.
  • Expertise in Red Hat Linux tasks including upgrading RPMs using YUM, kernel upgrades, and configuring SAN disks, multipath, and LVM file systems.
  • Good Working knowledge of LDAP, Active Directory, SSSD, Kerberos and Samba.
  • Experience in installing, configuring and administering Unix Utilities like SUDO.
  • Installation and configuration of httpd, ftp servers, TCP/IP, DHCP, DNS, NFS and NIS.
  • Experience in understanding and configuring the /etc/fstab and /etc/mtab configuration files.
  • Troubleshoot NIS, NFS, DNS, and other network issues; create dump files and backups.
  • Creating and maintaining user accounts, profiles, security, rights, disk space, and process monitoring. Handling and generating tickets via the BMC Remedy ticketing tool.
  • Configure UDP, TLS, SSL, HTTPD, HTTPS, FTP, SFTP, SMTP, SSH, Kickstart, Chef, Puppet, and PDSH.
  • Overall strong experience in system administration, installation, upgrading, patches, migration, configuration, troubleshooting, security, backup, disaster recovery, performance monitoring, and fine-tuning on Linux (RHEL) systems.
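
A minimal sketch of the shell commands behind the HDFS capacity and commissioning/decommissioning work above; the hostnames, paths, and excludes-file location are placeholders and vary by distribution.

    #!/bin/bash
    # Illustrative hosts and paths only.

    # Overall capacity, per-DataNode usage, and dead/decommissioning nodes
    hdfs dfsadmin -report | head -40

    # Per-directory usage to spot runaway datasets
    hdfs dfs -du -s -h /user /data /tmp

    # Rebalance blocks if any DataNode drifts more than 10% from the mean
    hdfs balancer -threshold 10

    # Decommission a DataNode: list it in the excludes file, then refresh
    echo "worker05.example.com" >> /etc/hadoop/conf/dfs.exclude
    hdfs dfsadmin -refreshNodes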
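
Likewise, a hedged example of the Sqoop import/export pattern noted above; the JDBC URL, credentials, tables, and directories are placeholders.

    #!/bin/bash
    # Placeholder connection details and paths.

    # Import an RDBMS table into HDFS, split across four mappers
    sqoop import \
      --connect jdbc:mysql://dbhost.example.com:3306/sales \
      --username etl_user --password-file /user/etl/.dbpass \
      --table orders --split-by order_id --num-mappers 4 \
      --target-dir /data/raw/orders --as-textfile

    # Export aggregated results from HDFS back to the relational database
    sqoop export \
      --connect jdbc:mysql://dbhost.example.com:3306/sales \
      --username etl_user --password-file /user/etl/.dbpass \
      --table order_summary --export-dir /data/marts/order_summary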

TECHNICAL SKILLS

Big Data Technologies: HDFS, Hive, MapReduce, Cassandra, Pig, Hcatalog, Sqoop, Flume, Zookeeper, Kafka, Mahout, Oozie, CDH, HDP

Tools: Quality Center v11.0/ALM, TOAD, JIRA, HP UFT, Selenium, Kerberos, JUnit

Programming Languages: Shell Scripting, Puppet, Python, Bash, CSH, Java

QA Methodologies: Waterfall, Agile, V-model

Front End Technologies: HTML, XHTML, CSS, XML, JavaScript, AJAX, Servlets, JSP

Java Frameworks: MVC, Apache Struts2.0, Spring and Hibernate

Domain Knowledge: GSM, WAP, GPRS, CDMA, and UMTS (3G)

Web Services: SOAP (JAX-WS), WSDL, SOA, RESTful (JAX-RS), JMS

Application Servers: Apache Tomcat, WebLogic Server, WebSphere, JBoss

Databases: Oracle 11g, MySQL, MS SQL Server, IBM DB2

NoSQL Databases: HBase, MongoDB, Cassandra

Automation tools: Jenkins, Ansible

AWS: EC2 instances, Route 53, Subnets

Virtualization/Storage: Logical Volume Manager (LVM) & Veritas Volume Manager, LAN/WAN administration, VMware Fault Tolerance (FT)

Servers Managed: Application servers, Database servers, Web servers, DNS server, NIS, LDAP, NFS, FTP server.

Monitoring Tools: Nagios, Ganglia.

Software/Applications/Tools: DBCA, DBUA, Recovery Manager (RMAN), Couchbase, Oracle Enterprise Manager (OEM), OEM Grid Control, ETL, Oracle Data Guard, API, PostgreSQL, WebLogic, Oracle Management Service (OMS), IBM i, Real Application Clusters (RAC), ASM, Data Pump (expdp, impdp), UML diagrams, SQL*Plus, SQL*Loader, Golden Gate, MongoDB

Languages: UNIX Shell Scripting, Perl Scripting, HTML, JavaScript, SQL*Plus, PL/SQL, C, C++, ASP.NET, ADO.NET, C#.NET, FORTRAN

PROFESSIONAL EXPERIENCE

Sr. Hadoop Admin

Confidential, St. Louis, MO

Responsibilities:

  • Created Hive tables and worked on them using HiveQL.
  • Developed Spark scripts using Scala shell commands as required to read/write JSON files.
  • Analyzed the data by performing Hive queries and running Pig scripts to understand client behavior.
  • Worked on migrating and offloading COBOL workloads from the mainframe to Hadoop.
  • Strong experience working with Apache Hadoop, including creating and debugging production-level jobs.
  • Installation, configuration, supporting and managing Hortonworks Hadoop cluster.
  • Analyzed Complex Distributed Production deployments and made recommendations to optimize performance.
  • Successfully drove HDP POCs with various lines of business.
  • Migrated the Cloudera distribution from MR1 to MR2.
  • Configured memory settings for YARN and MRv2.
  • Designed and developed an automated data archival system using Hadoop HDFS, with a configurable limit on archived data for efficient usage of HDFS disk space.
  • Configure Apache Hive tables for analytic jobs and create HiveQL scripts for offline jobs.
  • Designed Hive tables for partitioning and bucketing based on different use cases (see the Beeline sketch after this list).
  • Develop UDFs to enhance Apache Pig and Hive with client-specific data filtering logic.
  • Designed and implemented a stream filtering system on top of Apache Kafka to reduce stream size (topic setup is sketched after this list).
  • Wrote a Kafka REST API to collect events from the front end.
  • Implemented Apache Ranger configurations in the Hortonworks distribution.
  • Responsible for developing data pipeline using HDInsight, flume, Sqoop and pig to extract the data from weblogs and store in HDFS.
  • Involved in migration of ETL processes from Oracle to Hive to test the easy data manipulation.
  • Managed log files, backups and capacity.
  • Involved in designing various stages of migrating data from RDBMS to Cassandra.
  • Found and troubleshot Hadoop errors.
  • Experience with Cloudera Navigator and Unravel Data for auditing Hadoop access.
  • Created Ambari Views for Tez, Hive and HDFS.
  • Gained Hands on experience in analyzing the Cassandra data from flat files using Spark.
  • Architected and designed a 30-node Hadoop innovation cluster with Sqrrl, Spark, Puppet, and HDP 2.2.4.
  • Working with data delivery teams to setup new Hadoop users. This job includes setting up Linux users, setting up Kerberos principals and testing HDFS, Hive.
  • Managed 350+ Nodes HDP 2.2.4 cluster with 4 petabytes of data using Ambari 2.0 and Linux Cent OS 6.5.
  • Responsible for writing MapReduce programs using Java.
  • Hands-on experience writing MapReduce code to convert unstructured data into structured data and to insert data into HBase from HDFS.
  • Involved in transforming data from Mainframe tables to HDFS, and HBase tables using Sqoop.
  • Experience in porting existing patches from community to Cassandra nodes.
  • Completed end-to-end design and development of an Apache NiFi flow that acts as the agent between the middleware team and the EBI team and executes all the actions mentioned above.
  • Created 25+ Linux Bash scripts for users, groups, data distribution, capacity planning, and system monitoring.
  • Setup, configured, and managed security for the Cloudera Hadoop cluster.
  • Upgraded the Hadoop cluster from CDH4.7 to CDH5.2.
  • Supported MapReduce Programs and distributed applications running on the Hadoop cluster.
  • Continuous monitoring and managing EMR cluster through AWS Console.
  • Installing, Upgrading and Managing Hadoop Cluster on Cloudera distribution.
  • Managing and reviewing Hadoop and HBase log files.
  • Deployed Datalake cluster with Hortonworks Ambari on AWS using EC2 and S3.
  • Experience with UNIX or LINUX, including shell scripting.
  • Loading the data from the different Data sources like (Teradata and DB2) into HDFS using Sqoop and load into Hive tables, which are partitioned.
  • Built automated set up for cluster monitoring and issue escalation process.
  • Created HBase tables to store variable data formats of data coming from different portfolios.
  • Developed MapReduce jobs to automate transfer of data from HBase.
  • Administered, installed, upgraded, and managed Hadoop distributions (CDH3, CDH4, Cloudera Manager), Hive, HBase, and Hortonworks.
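
A minimal sketch of the partitioned and bucketed Hive DDL referenced above, driven through Beeline; the HiveServer2 URL, table, and columns are assumptions for illustration.

    #!/bin/bash
    # Hypothetical HiveServer2 endpoint and schema.
    beeline -u "jdbc:hive2://hiveserver.example.com:10000/default" -n hive -e "
      CREATE TABLE IF NOT EXISTS web_events (
        user_id BIGINT,
        url     STRING,
        ts      TIMESTAMP
      )
      PARTITIONED BY (event_date STRING)
      CLUSTERED BY (user_id) INTO 32 BUCKETS
      STORED AS ORC;

      SET hive.enforce.bucketing=true;
      SET hive.exec.dynamic.partition.mode=nonstrict;
      INSERT OVERWRITE TABLE web_events PARTITION (event_date)
      SELECT user_id, url, ts, to_date(ts) AS event_date
      FROM staging_web_events;
    "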
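
And a hedged example of how the Kafka topics behind the stream-filtering pipeline might be created and checked from the shell; the ZooKeeper quorum, broker, and topic names are placeholders, and newer Kafka releases take --bootstrap-server in place of --zookeeper.

    #!/bin/bash
    # Placeholder ZooKeeper quorum and topic names.
    ZK="zk1.example.com:2181,zk2.example.com:2181,zk3.example.com:2181"

    # Raw event stream in, filtered stream out
    kafka-topics.sh --create --zookeeper "$ZK" --topic events.raw      --partitions 12 --replication-factor 3
    kafka-topics.sh --create --zookeeper "$ZK" --topic events.filtered --partitions 12 --replication-factor 3

    # Sanity checks: list topics and tail the filtered stream
    kafka-topics.sh --list --zookeeper "$ZK"
    kafka-console-consumer.sh --bootstrap-server broker1.example.com:9092 \
      --topic events.filtered --from-beginning --max-messages 10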

Environment: Hive, MR1, MR2, YARN, Pig, HBase, Apache NiFi, PL/SQL, Mahout, Java, UNIX shell scripting, Sqoop, ETL, Ambari 2.0, Linux CentOS, MongoDB, Cassandra, Ganglia, and Cloudera Manager.

Hadoop Admin

Confidential, Richardson, TX

Responsibilities:

  • Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs, Spark SQL. Coordinated with business customers to gather business requirements.
  • Installing and Configuring Systems for use with Cloudera distribution of Hadoop (consideration given to other variants of Hadoop such as Apache, MapR, Hortonworks, Pivotal, etc.)
  • Migrate huge volumes of data from various semi-structured sources, RDBMSs, and COBOL mainframe workloads to Hadoop.
  • Writing Scala User-Defined Functions (UDFs) to solve the business requirements.
  • Creating Scala case classes.
  • Working with DataFrames and RDDs.
  • Install and maintain the Hadoop Cluster and Cloudera Manager Cluster.
  • Performed manual upgrades and MRv1 installation with Cloudera Manager.
  • Importing and exporting data into HDFS from database and vice versa using Sqoop.
  • Responsible for managing data coming from different sources.
  • Worked on analyzing the Hadoop cluster and different big data analytic tools including Pig, the HBase database, and Sqoop.
  • Hands on experience on Hortonworks and Cloudera Hadoop environments.
  • Primarily used Cloudera Manager, with some command-line administration.
  • Developed the Map-Reduce programs and defined the job flows.
  • Integrated NoSQL database like HBase with Map Reduce to move bulk amount of data into HBase.
  • Hands-on experience with the open-source monitoring tools Ambari and Cloudera Manager.
  • Experience configuring Storm to load data from MySQL to HBase using JMS.
  • Load and transform large sets of structured and semi structured data.
  • Collecting and aggregating large amounts of log data using Apache Flume and staging the data in HDFS for further analysis.
  • Installed and configured Hadoop MapReduce, HDFS, Hive, Pig, Sqoop, and Oozie on the Hadoop cluster.
  • Analyzed data using Hadoop components Hive and Pig.
  • Worked on evaluating, architecting, installation/setup of Hortonworks 2.1/1.8 Big Data ecosystem which includes Hadoop, Pig, Hive, Sqoop etc.
  • Creating and truncating HBase tables in Hue and taking backups of the submitter ID.
  • Involved in running Hadoop streaming jobs to process terabytes of data.
  • Gained experience in managing and reviewing Hadoop log files.
  • Knowledge of installing and configuring Cloudera Hadoop in production and development environments.
  • Involved in writing Hive/Impala queries for data analysis to meet the business requirements.
  • Worked on streaming the analyzed data to the existing relational databases using Sqoop for making it available for visualization and report generation by the BI team.
  • Created HBase tables to store variable data formats coming from different portfolios; performed real-time analytics on HBase using the Java API and REST API.
  • Involved in creating the workflow to run multiple Hive and Pig jobs, which run independently with time and data availability.
  • Contributed to building hands-on tutorials for the community on how to set up Hortonworks Data Platform (powered by Hadoop) and Hortonworks DataFlow (powered by NiFi).
  • Designed framework for doing migration from RDBMS to Cassandra.
  • Monitored the health of Map Reduce Programs which are running on the cluster.
  • Developed Spark scripts by using Python as per the requirement.
  • Developed Pig Latin scripts for the analysis of semi structured data.
  • Worked on implementing Hadoop on AWS EC2, using a few instances for gathering and analyzing log files.
  • Monitor Hadoop cluster using tools like Nagios, Ganglia, Ambari and Cloudera Manager.
  • Imported data using Sqoop to load data from MySQL to HDFS on regular basis.
  • Assembled Puppet Master, Agent and Database servers on Red Hat Enterprise Linux Platforms.
  • Implemented Hive and its components and troubleshot any issues that arose with Hive; published Hive LLAP in the development environment.
  • Designed, Automated the process of installation and configuration of secure DataStax Enterprise Cassandra cluster using puppet.
  • Involved in the process of designing Cassandra Architecture.
  • Involved in scheduling the Oozie workflow engine to run multiple Hive and Pig jobs (see the Oozie CLI sketch after this list).
  • Supported/Troubleshoot Map-Reduce programs running on the cluster.
  • Addressed Data Quality Using Informatica Data Quality (IDQ) tool.
  • Experience with Azure components and APIs.
  • Thorough knowledge of the Azure IaaS and PaaS platforms.
  • Managed an Azure-based SaaS environment.
  • Worked with Azure Data Lake and Data Factory.
  • Created and supported one Cassandra cluster with 48 nodes for the inventory purpose.
  • Worked on configuring Hadoop cluster on AWS.
  • Monitored workload, job performance and capacity planning using Cloudera Manager.
  • Used Hive and created Hive tables, loaded data from Local file system to HDFS.
  • Created user accounts and granted users access to the Hadoop cluster.
  • Performed HDFS cluster support and maintenance tasks like adding and removing nodes without any effect to running nodes and data.
  • Experience with Oracle OBIEE.
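
A hedged sketch of how the Oozie-scheduled Hive and Pig jobs above are typically driven from the command line; the Oozie URL, property values, and job id are placeholders.

    #!/bin/bash
    # Hypothetical Oozie server; picked up by the CLI when -oozie is not passed.
    export OOZIE_URL="http://oozie.example.com:11000/oozie"

    # job.properties points at the coordinator definition in HDFS, for example:
    #   nameNode=hdfs://nn.example.com:8020
    #   jobTracker=rm.example.com:8032
    #   oozie.coord.application.path=${nameNode}/apps/daily-etl

    # Submit and start the coordinator that chains the Hive and Pig actions
    oozie job -config job.properties -run

    # Check recent coordinators, then inspect one run and its logs
    oozie jobs -jobtype coordinator -len 10
    oozie job -info 0000012-200101000000000-oozie-oozi-C
    oozie job -log  0000012-200101000000000-oozie-oozi-C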

Environment: Hadoop, HDFS, MapReduce, Shell Scripting, Spark, Pig, Hive, HBase, Sqoop, Flume, Oozie, Zookeeper, Red Hat Linux, Cloudera Manager, Hortonworks.

Linux/AWS Administrator

Confidential - Silver Spring, MD

Responsibilities:

  • Experience as Red Hat Enterprise Linux Systems Administrator and performed support role for applications on mission critical Enterprise Networks and Multi-server environment.
  • Experience in Administration, implementation and support of OS RHEL 5/6.
  • Installed SAS 9.3 on Linux 6.2 server in Test & Production Environment.
  • Experience in the kick-start installations, support, configuration and maintenance of RHEL 5.5, and x86-64 Servers under heterogeneous environments.
  • Experience with patch installation, patch upgrades, and package installation on Red Hat Linux servers using RPM and YUM.
  • Responsible for supporting user base in resolving incidents with accessing the SaaS Compliance archiving application.
  • Built .NET 4.5 applications from Ubuntu Linux Bash.
  • Handled automation and configuration process using Ansible and Chef; used it to maintain company's servers for easy automation.
  • Configure diagnostics, monitoring and analytics on Azure Platform along with scale and resilience for Azure Web sites.
  • Systems administration including Windows Server 2003/2008, Terminal Servers, Citrix XenApp 6.0, and XenServer 6.0.
  • Exported NAS volumes and qtrees to Linux servers and created CIFS shares for Windows access.
  • Responsible for Load balancing, parameters and Performance tuning for kernel and kernel module.
  • Extensive experience in Version Control tools like RTC, GIT and Subversion.
  • Used IBM Rational Clear Case as Version Control.
  • Managed Windows virtual servers and Ubuntu Linux (Rackspace and AWS EC2) with Chef using Git.
  • Involved in developing custom scripts using Perl and Shell scripts to automate jobs.
  • Configuration of Hardware and Software RAID on Digital & Sun Servers.
  • Development on Linux platforms via Docker on the AWS instances.
  • Design and implement/customize OpenStack features, fix defects and provide improvements wherever required in Python.
  • Extensively used Unix Scripting, Scheduled PMCMD and PMREP to interact with Informatica Server from command mode.
  • Experience in installation, configuration, tuning, security, backup, recovery, and upgrades of IBM i, AIX, and Linux (Red Hat & SuSE).
  • Developing BASH and Python scripts to automate cron jobs and system maintenance. Scheduled cron jobs for job automation.
  • Used Nagios to monitor Linux system activities; set up and configured Apache web servers.
  • Used Informatica and ETL debugger to troubleshoot error.
  • Experienced in running Python on Linux.
  • Monitor and communicate the availability of UNIX and Informatica Environment. Follow-up on problem tickets.
  • Implemented a SharePoint disaster recovery plan for multiple SharePoint 2010 environments.
  • Developed automation and deployment utilities for DevOps using Ruby, Bash, PowerShell, Python, and Rundeck.
  • Worked on setting up multiple provider projects using Enterprise OpenShift container platform.
  • Implemented the Chef cookbook SSSD to automate the integration process between RHEL and Windows AD using Kerberos keytab file.
  • Planning and Implementation of Kernel patches on RHEL Servers.
  • Experience in providing day-to-day user administration like adding/deleting users in local and global groups on Red Hat Linux platform and managing user's queries.
  • Created Puppet Master and Agents machines in the production environment for deploying, updating the software in automation process.
  • Installed Apache/Tomcat Server.
  • Implemented RAID and created disk groups and volumes using VERITAS Volume Manager.
  • Installed VMware vSphere client software, which is a part of vSphere suite on a client machine to monitor the virtual hosts.
  • Performed AIX Migration of AIX 5.3/6.1 to 6.1/7.1 and AIX TL Upgrades. Applied Semi-annual security patches and service packs to IBM AIX servers and RHEL6 servers.
  • Configuring Docker containers and creating Dockerfiles for various environments.
  • Coordinated with NetApp on upgrades and major issues in the production environment.
  • Experience in managing virtual instances and disks using Puppet.
  • Provisioning Windows and Linux servers in the Azure cloud, implementing Service Fabric, creating resource groups, and assigning permissions to resource groups.
  • Knowledge in configuration and managing Linux Virtual Machines under VMware 5.x.
  • Configured distributed file systems and administering NFS server and NFS clients and editing auto-mounting mapping as per system / user requirements.
  • Writing shell scripts in UNIX to automate the ETL process.
  • Extracted files from MongoDB through Sqoop, placed in HDFS, and processed.
  • Performed network troubleshooting using 'ndd', 'traceroute', 'netstat', 'ifconfig', 'snoop', etc.
  • Developed automated processes that run daily to check disk usage and perform cleanup of file systems on UNIX environments using shell scripting and cron (a minimal script sketch follows this list).
  • Developed custom OpenShift templates to deploy the applications and to create the OpenShift objects: builds, deployment configs, services, routes, and persistent volumes.
  • Performed administrative activities such as site creations, creating and managing user's permissions, backup and restore SharePoint sites.
  • Assist other team members to troubleshoot Drupal issues and provide bug fixes.
  • Developed on a proprietary CMS platform using MySQL databases and the ColdSpring IoC framework.
  • Installed and configured Web hosting administration HTTP, FTP, SSH, & RSH.
  • Developed bash shell scripts to automate routine activities.
  • Monitored hosts and networks using sar, iostat, vmstat, mpstat, and other tools.
  • Developed UNIX shell scripts for day-to-day administration tasks.
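
A minimal sketch of the daily disk-usage check and cleanup pattern mentioned above; the threshold, paths, retention windows, and mail recipient are assumptions.

    #!/bin/bash
    # /usr/local/bin/disk_cleanup.sh - illustrative values only.
    THRESHOLD=85
    ALERT_MAIL="unix-admins@example.com"

    # Mail a warning for any filesystem above the threshold
    USAGE=$(df -hP | awk -v t="$THRESHOLD" 'NR>1 {gsub("%","",$5); if ($5+0 > t) print $6" is at "$5"%"}')
    [ -n "$USAGE" ] && echo "$USAGE" | mail -s "Disk usage alert on $(hostname)" "$ALERT_MAIL"

    # Purge temp and rotated-log clutter past its retention window
    find /tmp /var/tmp -xdev -type f -mtime +14 -delete
    find /var/log -type f -name "*.gz" -mtime +30 -delete

    # Cron entry (crontab -e), run daily at 02:30:
    # 30 2 * * * /usr/local/bin/disk_cleanup.sh >/dev/null 2>&1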

Environment: RHEL 5/6, RPM & YUM Server, HTTP, FTP, SSH, RSH, Cron, UNIX, ETL, Puppet Enterprise 2016.1.x, IBM i, VERITAS Volume Manager, Python, VxFS file system, .NET, Apache/Tomcat Server, PHP, NFS & VMware.

Linux Administrator

Confidential - West Palm Beach, FL

Responsibilities:

  • Administration of RHEL AS 4, 5, and 6, including installation, testing, tuning, upgrading and loading patches, and troubleshooting both physical and virtual server issues.
  • Installation, configuration, administration of Solaris 9, 10 on SPARC based servers using Jumpstart.
  • Creating and cloning Linux virtual machines and templates using VMware vSphere 4.0 and migrating servers between ESX hosts.
  • Solid UNIX and NetApp storage administration.
  • Installed firmware upgrades and kernel patches and performed systems configuration and performance tuning on UNIX/Linux systems.
  • Hands-on experience with Ansible and Ansible Tower as configuration management tools to automate repetitive tasks, manage changes, and quickly deploy critical applications.
  • Designing the deployment and migration plan for the Red Hat Enterprise Virtualization environment and Red Hat Enterprise OpenStack.
  • Develop, debug and implement programs and/or scripts (Shell & Perl) in support of customer installations.
  • Reduced the imaging of an application from 8 months to 15 days through DevOps.
  • Operated and supported OpenShift Enterprise and Docker cloud services.
  • Worked with OpenStack databases such as MySQL and MariaDB.
  • Developed an application that would allow transfer of log files from Linux computer to Linux server using C++ multithreading environment.
  • Maintained and Administered Rational Clear case, RTC/GIT SCM tools.
  • Experience in support of a Citrix XenApp 6.0 environments (including hands-on experience architecting/engineering and supporting implementations of Citrix).
  • Provide support and maintenance of all corporate databases (Informix/DB2/MySQL/SQL Server).
  • Worked on configuring LDAP, CA SiteMinder, OpenID Connect, and OAuth with CA API Gateway and CA SiteMinder.
  • Implemented multiprotocol accessing in NAS shares (ntfs security style share accessing in UNIX & vice versa).
  • Working on Volume management, Disk Management, software RAID solutions using VERITAS.
  • Designed the MongoDB application to get the feed from other oracle databases.
  • Installing Red Hat Linux using Kickstart and applying security policies for hardening the servers based on company policies.
  • Possess experience on OpenSSL/Keytool (JKS/PKI) to create and install web/Java code signing based SSL certificates to secure web applications and networks.
  • Extensively used Hibernate Query Language (HQL) and Criteria based queries to work with Oracle databases.
  • Worked on automation of processes using Crontab and shell scripting using Bash.
  • Responsible for redesigning security and authentication method moving away from LDAP based authentication to PAM based authentication.
  • Configured and deployed Nagios Monitoring system for managing all the Linux warehouse Systems.
  • Designed and implemented User Directory changes from LDAP to AD.
  • Pro-actively identify, troubleshoot and resolve live MongoDB issues.
  • Work as project administrator in the Version 1 agile management tool.
  • Installed and Configured Veritas Symantec NetBackup and Veritas Cluster Suite on the UNIX Servers.
  • Used Docker for local application deployment and testing; orchestrated containers using Docker Swarm and wrote Dockerfiles.
  • Built a deployment pipeline for deploying tagged versions of applications to AWS Elastic Beanstalk using Jenkins CI.
  • Experience in working with UNIX Shell Scripts for automatically running sessions, and creating parameter files to run various batch jobs.
  • Implemented Puppet modules to automate configuration of various services and deployment of various applications.
  • Analyze and resolve compilation and deployment errors related to code development, branching, merging and building of source code repository using GIT version control.
  • Created the LDAP scripts, which monitors the LDAP connectivity and alerts the Admin Group if connection is closed.
  • Developed an API for managing, monitoring, and alerting on hardware resources in a Linux OS environment using C#, C++, and the Mono framework, with integration with various Linux modules and protocols.
  • Performed virtual machine backup and recovery from a Recovery Services vault using Azure PowerShell and the Azure Portal.
  • Managing systems routine backup, scheduling jobs like disabling and enabling cron jobs, enabling system logging, network logging of servers for maintenance, performance tuning, testing.
  • Set up Kerberos locally on five node POC cluster using Ambari and evaluated the performance of cluster, did impact analysis of Kerberos enablement.
  • Installation and deployment of a Red Hat Network Satellite Server 5.4.1.
  • Decommissioning of old servers and keeping track of decommissioned and new servers using an inventory list.
  • Responsible for resolving GPOS transactional errors according to documented standard operating procedure.
  • Installation and configuration of the Veritas Cluster on the UNIX servers.
  • Troubleshoot various systems problems such as application related issues, network related issues, hardware related issues etc.
  • Built new virtual machines and physical servers.
  • Managing VMware ESX and ESXi servers 5.5, 5.1, 4.0.0, and 3.5.0 based on requirements.
  • Troubleshooting Linux network and security related issues, capturing packets using tools such as iptables, firewalls, TCP wrappers, and nmap (see the sketch after this list).
  • Preparing servers for Oracle RAC installation which includes tuning the kernel, agent installation, adding NAS storage on 2, 3, 4 node clusters.
  • Designing Firewall rules for new servers to enable communication with application, Oracle 10g servers.
  • Experienced in Troubleshooting critical hardware and software issues and other day-to-day user trouble tickets.
  • Developed automated processes that run daily to check disk usage and perform clean up of file systems on UNIX environments using shell scripting and CRON.
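
A hedged sketch of the network and firewall troubleshooting commands behind the bullet above; interfaces, hosts, and ports are placeholders.

    #!/bin/bash
    # Illustrative hosts, ports, and interface names.

    # Review local firewall rules and TCP wrappers configuration
    iptables -L -n -v --line-numbers
    grep -v '^#' /etc/hosts.allow /etc/hosts.deny

    # Confirm what is listening locally on common service ports
    netstat -tulnp | grep -E ':(22|80|443)\b'

    # Probe a remote service port, then capture its traffic for analysis
    nmap -p 1521 dbhost.example.com
    tcpdump -i eth0 -nn -c 100 -w /tmp/ora.pcap host dbhost.example.com and port 1521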

Environment: Red Hat Linux 5/6, KVM, Acronis, VMware, Couchbase, ESX, Solaris, Sun Enterprise Servers, UNIX, SUN FIRE 6800/E6500/E4500, Sun SPARC 1000, IBM RS/6000, Disk Suite, POP, Veritas Volume Manager, LDAP, DNS, NIS, NIS+, SNMP, shell scripting, Sendmail, Informix/DB2/MySQL/SQL Server, Apache, Puppet, WebSphere, Sun and Veritas Clusters.
