Hadoop Developer Resume
TX
OBJECTIVE:
- To be associated with a progressive organization that provides an opportunity to apply my knowledge and skills, working with new and emerging technologies to keep pace with the fast-changing IT world for business enhancement.
SUMMARY:
- Senior Software Engineer with a total of 8 years of experience in the software industry, seeking a position in the Information Technology industry that utilizes my skills and abilities and offers professional growth while being resourceful, innovative, and flexible.
- Around 8 years of total IT experience, including 1.5 years on the Hadoop and Big Data platform.
- Good exposure to various Hadoop distributions such as Apache Hadoop and Cloudera Enterprise Hadoop.
- Responsible for transforming business functionality from EDW (Oracle, Teradata, Netezza) and Mainframe environments to the Hadoop environment.
- Experience in data migration from MySQL/DB2/Oracle/Teradata/MS SQL Server to HDFS and Hive using Sqoop.
- Strong in Hive queries; built custom Hive UDFs.
- Experience in developing MapReduce programs and Pig scripts.
- Developed Oozie workflows to automate loading data into HDFS and running MapReduce, Pig, Sqoop, Hive, and email-notification jobs.
- Experienced in onsite and offshore project delivery model activities, including SOW initiation, requirements gathering, effort estimation, project tracking, preparing project planning documentation, scheduling, project execution and delivery.
- Experienced in onsite customer interaction and onsite-offshore project coordination while maintaining excellent customer satisfaction.
- Performed root cause analysis for many critical client issues and provided permanent fixes with innovative technical approaches.
- Possess strong interpersonal and communication skills.
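The custom Hive UDF work noted above can be sketched in plain Java. The actual UDFs are not part of this resume, so the example below is a hypothetical masking function showing only the evaluate() logic; a real Hive UDF would extend org.apache.hadoop.hive.ql.exec.UDF (hive-exec dependency) and be registered with CREATE TEMPORARY FUNCTION.

```java
// Hypothetical sketch of the string-masking logic a custom Hive UDF might carry.
// A real UDF would extend org.apache.hadoop.hive.ql.exec.UDF; the evaluate()
// method is shown here as plain Java so the sketch runs standalone.
public class MaskUdfSketch {
    // Mask all but the last four characters, as in:
    //   SELECT mask_acct(account_no) FROM accounts;   (hypothetical usage)
    public static String evaluate(String input) {
        if (input == null) return null;            // Hive UDFs must be null-safe
        int keep = Math.min(4, input.length());
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < input.length() - keep; i++) sb.append('*');
        sb.append(input.substring(input.length() - keep));
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(evaluate("1234567890")); // ******7890
    }
}
```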
TECHNICAL SKILLS:
Technology Platforms: Apache Hadoop, Cloudera Hadoop
Big Data Tools: CDH 5.4.2, Hadoop 2.7.0, HDFS, MapReduce/YARN, Pig 0.14.0, Hive 1.0, HBase 1.1, Sqoop 1.4.5, Oozie 4.3.0, ZooKeeper, Flume-ng 1.5.1, Hue 3.7.0, Java 1.7, Linux Ubuntu 14.04, Tableau 7.2, MySQL 5.6, Impala 1.3, JSON, Avro, HCatalog, ORC
Big Data Testing Tools: MRUnit, PigUnit
Programming Languages: Java, PL/SQL, Linux Shell Scripting, C, LotusScript, Formula Language
Databases: SQL Server 2000, MySQL, DB2, Oracle, Teradata, HBase
Dev & Build Tools: Eclipse, Maven, SVN, Ant, GitHub.
Operating Systems: Windows, UNIX, Linux (Ubuntu, CentOS, Red Hat)
Domains: Banking, Insurance, Storage Devices.
Groupware: Lotus Notes R6.5.5, R7.0.3, R8.5.3 Powered by Domino
Tools: Quest tool for Notes to SharePoint Migration
PROFESSIONAL EXPERIENCE:
Confidential, TX
Hadoop Developer
Responsibilities:
- Involved in defining job flows, managing and reviewing log files.
- Supported MapReduce programs running on the cluster.
- As a Big Data Developer, implemented solutions for ingesting data from various sources and processing the data-at-rest utilizing Big Data technologies such as Hadoop, the MapReduce framework, HBase, Hive, Oozie, Flume, and Sqoop.
- Analyzed large data sets to determine the optimal way to aggregate and report on them.
- Imported bulk data into HBase using MapReduce programs.
- Developed Apache Pig and Hive scripts to process HDFS data.
- Performed analytics on time-series data stored in HBase using the HBase API.
- Designed and implemented Incremental Imports into Hive tables.
- Involved in unit testing; delivered unit test plans and results documents using JUnit and MRUnit.
- Involved in collecting, aggregating, and moving data from servers to HDFS using Apache Flume.
- Wrote Hive jobs to parse the logs and structure them in tabular format to facilitate effective querying on the log data.
- Wrote multiple Java programs to pull data from HBase.
- Involved with File Processing using Pig Latin.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Optimized MapReduce algorithms using combiners and partitioners to deliver the best results, and worked on application performance optimization for an HDFS cluster.
- Implemented business logic by writing UDFs in Java and used various UDFs from Piggybank and other sources.
- Worked on debugging and performance tuning of Hive and Pig jobs.
- Used Hive to find correlations between customers' browser logs across different sites and analyzed them to build risk profiles for those sites.
- Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
Environment: Java, Hadoop 2.1.0, MapReduce 2, Pig 0.12.0, Hive 0.13.0, Linux, Sqoop 1.4.2, Flume 1.3.1, Eclipse, AWS EC2, and Cloudera CDH 4.
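The log-parsing bullet above (Hive jobs that structure raw logs in tabular form) can be sketched in plain Java. The original log layout is not given in this resume, so the access-log format below (host, timestamp, request, status, bytes) and the class name are assumptions; the idea is that tab-separated output maps directly onto a Hive table declared with ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'.

```java
// Hypothetical sketch of the log-to-tabular step behind the Hive parsing jobs:
// turn a raw access-log line into tab-separated fields that a delimited Hive
// table can query directly. The log layout here is an assumed example.
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LogToTsv {
    private static final Pattern LINE = Pattern.compile(
        "^(\\S+) \\S+ \\S+ \\[([^\\]]+)\\] \"([^\"]*)\" (\\d{3}) (\\d+|-)");

    // Returns host \t timestamp \t request \t status \t bytes,
    // or null if the line does not match (bad records are skipped).
    public static String toTsv(String line) {
        Matcher m = LINE.matcher(line);
        if (!m.find()) return null;
        return String.join("\t", m.group(1), m.group(2), m.group(3),
                           m.group(4), m.group(5));
    }

    public static void main(String[] args) {
        String raw = "10.0.0.1 - - [12/Mar/2015:10:15:32 +0000] "
                   + "\"GET /index.html HTTP/1.1\" 200 4321";
        System.out.println(toTsv(raw));
    }
}
```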
Confidential, Pasadena, CA
Hadoop Developer
Responsibilities:
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Experience in installing, configuring and using Hadoop Ecosystem components.
- Experience in importing and exporting data into HDFS and Hive using Sqoop.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Worked with different file formats such as Sequence files, XML files, and Map files using MapReduce programs.
- Responsible for managing data coming from different sources.
- Experience in working with Storm.
- Continuously monitored and managed the Hadoop cluster using Cloudera Manager.
- Strong expertise in the MapReduce programming model with XML, JSON, and CSV file formats.
- Gained good experience with NoSQL databases.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Responsible for building scalable distributed data solutions using Hadoop.
- Involved in collecting, aggregating, and moving data from servers to HDFS using Apache Flume.
- Experience in managing and reviewing Hadoop log files.
- Involved in loading data from LINUX file system to HDFS.
- Implemented test scripts to support test driven development and continuous integration.
- Created Pig Latin scripts to sort, group, join, and filter the enterprise-wide data.
- Worked on tuning the performance of Pig queries.
- Mentored the analyst and test teams in writing Hive queries.
- Installed the Oozie workflow engine to run multiple MapReduce jobs.
- Implemented processing of data from different sources using multiple input formats with GenericWritable and ObjectWritable.
- Provided cluster coordination services through ZooKeeper.
- Extensive working knowledge of partitioned tables, UDFs, performance tuning, compression-related properties, and the Thrift server in Hive.
- Worked with application teams to install operating system, Hadoop updates, patches, version upgrades as required.
- Worked with the Data Science team to gather requirements for various data mining projects.
Environment: Cloudera CDH 4, HDFS, Hadoop 2.2.0 (YARN), Flume 1.5.2, Eclipse, MapReduce, Hive 1.1.0, Pig 0.14.0, Java, SQL, Sqoop 1.4.6, CentOS, ZooKeeper 3.5.0, and a NoSQL database.
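The Pig grouping work above (scripts that sort, group, join, and filter data) can be sketched in plain Java. The original scripts are not part of this resume, so the example below is a hypothetical standalone equivalent of a GROUP BY key followed by COUNT, with the real Pig Latin shown only in a comment.

```java
// Plain-Java sketch of the grouping/counting that such Pig scripts perform,
// roughly:  grouped = GROUP records BY key;
//           counts  = FOREACH grouped GENERATE group, COUNT(records);
// Shown standalone since the actual Pig jobs are not reproduced here.
import java.util.LinkedHashMap;
import java.util.Map;

public class GroupAndCount {
    // Count occurrences per key, preserving first-seen key order.
    public static Map<String, Long> countByKey(String[] keys) {
        Map<String, Long> counts = new LinkedHashMap<>();
        for (String k : keys) counts.merge(k, 1L, Long::sum);
        return counts;
    }

    public static void main(String[] args) {
        String[] regions = {"TX", "CA", "TX", "WI", "TX"};
        System.out.println(countByKey(regions)); // {TX=3, CA=1, WI=1}
    }
}
```

In a real cluster job the same aggregation would run in parallel, with a combiner doing this local counting on each mapper before the reduce phase.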
Confidential
Lotus Notes Consultant
Responsibilities:
- Served as the dedicated production-support resource for existing Lotus Notes applications, resolving over 3000 support tickets across various applications.
- Developed new client applications: gathered requirements from end users and clients, and prepared the statement of work, functional design document, technical design document, coding, test cases, black-box testing, user acceptance testing, implementation, and post-implementation verification.
- Implemented changes to existing applications by gathering required changes from end users and database owners, moving these changes to production without affecting existing database functionality and with zero server downtime.
- Migrated over 50 team room databases and other custom databases to SharePoint lists using the Quest tool.
- Archived the migrated databases and revoked access for existing users.
Environment: Lotus Notes & Domino Server 8.5.3 and Windows 7
Confidential
Lotus Notes Application Specialist
Responsibilities:
- Served as the dedicated production-support resource for existing Lotus Notes applications, resolving over 3000 support tickets across various applications.
- Developed new client applications: gathered requirements from end users and clients, and prepared the statement of work, functional design document, technical design document, coding, test cases, black-box testing, user acceptance testing, implementation, and post-implementation verification.
- Implemented changes to existing applications by gathering required changes from end users and database owners, moving these changes to production without affecting existing database functionality and with zero server downtime.
- Migrated over 50 team room databases and other custom databases to SharePoint lists using the Quest tool.
- Archived the migrated databases and revoked access for existing users.
Environment: Lotus Notes & Domino Server 8.5.3 and Windows 7
Confidential, Wilmington, DE
Lotus Notes Application Developer
Responsibilities:
- Involved in production support for existing Lotus Notes applications, resolving over 1000 support tickets across various applications.
- Developed new client applications: gathered requirements from end users and clients, and prepared the statement of work, functional design document, technical design document, coding, test cases, black-box testing, user acceptance testing, implementation, and post-implementation verification.
- Implemented changes to existing applications by gathering required changes from end users and database owners, moving these changes to production without affecting existing database functionality and with zero server downtime.
- Migrated over 30 team room databases and other custom databases to SharePoint lists using the Quest tool.
- Archived the migrated databases and revoked access for existing users.
Environment: Lotus Notes & Domino Server R7.0.3 and Windows 7
Confidential, Milwaukee, WI
Senior Software Engineer
Responsibilities:
- Involved in continuous production support for all Lotus Notes applications, resolving over 2000 support tickets across various applications used for numerous modules such as service, manufacturing, supply chain, and inventory control.
- Developed new client applications: gathered requirements from end users and clients, and prepared the statement of work, functional design document, technical design document, coding, test cases, black-box testing, user acceptance testing, implementation, and post-implementation verification.
- Worked as Quality Control Lead, ensuring the highest quality of code and documentation.
- Worked as offshore team lead for the Java and Data Warehouse teams.
- Presented a weekly dashboard to the client manager and managed numerous projects.
Environment: Lotus Notes & Domino Server 6.5.3, 7.0.3 and Windows XP