Hadoop Developer Resume
Dallas, TX
SUMMARY:
- Over 8 years of professional IT experience, including 3+ years in the Big Data ecosystem covering ingestion, querying, processing, and analysis of big data.
- Experience in using Hadoop ecosystem components like MapReduce, HDFS, HBase, ZooKeeper, Hive, Sqoop, Pig, Flume, and Cloudera.
- Knowledge of NoSQL databases like HBase and Cassandra.
- Experience includes requirements gathering, design, development, integration, documentation, testing, and build.
- Experience in working with MapReduce programs, Pig scripts, and Hive commands to deliver the best results.
- Extensively worked on development and optimization of MapReduce programs, Pig scripts, and Hive queries to create structured data for data mining.
- Solid knowledge of Hadoop architecture and daemons such as NameNode, DataNodes, JobTracker, and TaskTrackers.
- Good knowledge of ZooKeeper for cluster coordination.
- Experience in database design, data analysis, SQL programming, PL/SQL stored procedures, and triggers in Oracle and SQL Server.
- Experience in extending Hive and Pig core functionality with custom User Defined Functions (UDFs).
- Experience in writing custom classes, functions, procedures, problem management, library controls, and reusable components.
- Working knowledge of Oozie, a workflow scheduler system used to manage jobs that run on Pig, Hive, and Sqoop.
- Followed test-driven development within Agile and Scrum methodologies to produce high-quality software.
- Experienced in integrating Java-based web applications in a UNIX environment.
- Developed applications using Java, JSP, Servlets, JDBC, JavaScript, XML, and HTML.
- Strong analytical skills with the ability to quickly understand clients' business needs; involved in meetings to gather information and requirements from clients.
- Research-oriented, motivated, proactive self-starter with strong technical, analytical, and interpersonal skills.
TECHNICAL SKILLS:
Hadoop Ecosystem: HDFS, MapReduce, Hive, Impala, Cassandra, Pig, Sqoop, Flume, Oozie, ZooKeeper, Ambari, Spark, Storm
Project Management / Tools: All MS Office suites (incl. 2003), MS Exchange & Outlook, Lotus Domino Notes, Citrix Client, SharePoint, MS Internet Explorer, Firefox, Chrome
Web Technologies: HTML, XML, CSS, JavaScript
NoSQL Databases: HBase, Cassandra
Databases: Oracle 8i/9i/10g, MySQL
Languages: Java, SQL, PL/SQL, Ruby, Shell Scripting
Operating Systems: UNIX (OS X, Solaris), Windows, Linux (CentOS, Fedora, Red Hat)
IDE Tools: Eclipse, NetBeans
Application Server: Apache Tomcat
PROFESSIONAL EXPERIENCE:
Confidential, Dallas, TX
Hadoop Developer
Responsibilities:
- Installed and configured Pig and wrote Pig Latin scripts.
- Involved in managing and reviewing Hadoop JobTracker log files and Control-M log files.
- Scheduled and managed cron jobs and wrote shell scripts to generate alerts.
- Monitored and managed daily jobs, processing around 200k files per day, and monitored them through RabbitMQ and the Apache dashboard application.
- Used the Control-M scheduling tool to schedule daily jobs.
- Administered and maintained a multi-rack Cassandra cluster.
- Gained good experience with NoSQL databases like Cassandra and HBase.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Used Sqoop to efficiently transfer data between databases and HDFS, and used Flume to stream log data from servers and sensors.
- Developed MapReduce programs to cleanse data in HDFS obtained from heterogeneous data sources and make it suitable for ingestion into the Hive schema for analysis.
- Used the Hive data warehouse tool to analyze unified historic data in HDFS to identify issues and behavioral patterns.
- Created Hive tables as internal or external tables per requirement, defined with appropriate static and dynamic partitions for efficiency.
- Used the Control-M scheduling tool to manage interdependent Hadoop jobs and to automate several types of Hadoop jobs such as Java MapReduce, Hive, and Sqoop, as well as system-specific jobs.
- Worked with BI teams on generating reports and designing ETL workflows in Tableau.
- Involved in Scrum calls, grooming, and demo meetings; very good experience with Agile methodology.
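The cleansing step described above can be sketched as the kind of record-normalization logic a MapReduce mapper would call before rows are loaded into a Hive schema. This is a minimal plain-Java sketch, not the actual job; the CSV layout and field rules are hypothetical.

```java
// Sketch of record-cleansing logic of the kind used in the MapReduce jobs above.
// A real job would invoke this from a Mapper; it is plain Java here so it
// compiles without Hadoop on the classpath. Field layout is hypothetical.
public class RecordCleanser {

    // Normalize one raw CSV record: trim fields, lowercase the key column,
    // and reject blank or malformed rows (wrong field count).
    public static String cleanse(String rawLine, int expectedFields) {
        if (rawLine == null || rawLine.trim().isEmpty()) {
            return null; // skip blank lines
        }
        String[] fields = rawLine.split(",", -1);
        if (fields.length != expectedFields) {
            return null; // skip malformed rows
        }
        StringBuilder out = new StringBuilder();
        for (int i = 0; i < fields.length; i++) {
            String f = fields[i].trim();
            if (i == 0) {
                f = f.toLowerCase(); // normalize the join key
            }
            if (i > 0) {
                out.append(',');
            }
            out.append(f);
        }
        return out.toString();
    }

    public static void main(String[] args) {
        System.out.println(cleanse("  User42 , Dallas , 2014-01-05 ", 3));
        System.out.println(cleanse("broken,row", 3));
    }
}
```

In a production job the same method would sit inside `Mapper.map()`, emitting only the non-null results.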
Environment: Apache Hadoop 2.3, gphd-1.2, gphd-2.2, MapReduce 2.3, HDFS, Hive, Java 1.6 & 1.7, Cassandra, Pig, Spring XD, Linux, Eclipse, RabbitMQ, ZooKeeper, Postgres, Apache Solr, Control-M, Redis, Tableau, QlikView, DataStax.
Confidential, Charlotte, NC
Hadoop Developer
Responsibilities:
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Installed and configured Pig and wrote Pig Latin scripts.
- Involved in managing and reviewing Hadoop log files.
- Exported data using Sqoop from HDFS to Teradata on a regular basis.
- Developed scripts and batch jobs to schedule various Hadoop programs.
- Wrote Hive queries for data analysis to meet business requirements.
- Created Hive tables and worked on them using HiveQL.
- Experienced in defining job flows.
- Gained good experience with NoSQL databases like Cassandra.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Designed and implemented a MapReduce-based large-scale parallel relation-learning system.
- Setup and benchmarked Hadoop clusters for internal use.
- Worked with BI teams in generating the reports and designing ETL workflows on Tableau.
Environment: Cloudera Hadoop (CDH 4.4), MapReduce, HDFS, Hive, Java 6, Pig, Cassandra, Linux, XML, MySQL, MySQL Workbench, Eclipse, PL/SQL, SQL connector, Subversion.
Confidential, New York, NY
Hadoop Developer
Responsibilities:
- Involved in review of functional and non-functional requirements.
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Installed and configured Pig and developed Pig Latin scripts.
- Involved in managing and reviewing Hadoop log files.
- Imported and exported data using Sqoop to load data between Teradata and HDFS on a regular basis.
- Imported and exported data between RDBMS and HDFS, Hive, and HBase using Sqoop and Flume.
- Developed scripts and batch jobs to schedule various Hadoop programs.
- Created Hive tables and worked on them using HiveQL.
- Experienced in defining job flows.
- Good exposure to the NoSQL database HBase.
- Developed custom UDFs in Pig.
- Prepared shell scripts for executing Hadoop commands for single execution.
- Extracted feeds from social media sites such as Facebook and Twitter using Python scripts.
- Setup and benchmarked Hadoop/HBase clusters for internal use.
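The custom Pig UDFs mentioned above wrap ordinary Java string logic. The sketch below shows only that core as a plain static method (a real UDF would extend `org.apache.pig.EvalFunc<String>`, which needs Pig's jars on the classpath); the normalization rule itself, cleaning social-media handles, is a hypothetical example.

```java
// Hypothetical example of logic that could be packaged as a custom Pig UDF.
// The EvalFunc wrapper is omitted so this compiles standalone; only the
// per-tuple transformation is shown.
public class NormalizeHandleUdf {

    // Strip a leading '@', lowercase, and remove internal whitespace,
    // e.g. for cleaning handles pulled in by the social-media feed scripts.
    public static String normalizeHandle(String raw) {
        if (raw == null) {
            return null; // Pig passes null through for empty fields
        }
        String s = raw.trim();
        if (s.startsWith("@")) {
            s = s.substring(1);
        }
        return s.toLowerCase().replaceAll("\\s+", "");
    }

    public static void main(String[] args) {
        System.out.println(normalizeHandle(" @Some User ")); // someuser
    }
}
```

Wrapped in an `EvalFunc`, the method would be registered in a Pig script with `REGISTER` and `DEFINE` and applied per tuple in a `FOREACH ... GENERATE`.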
Environment: Hadoop CDH 4.1.1, Pig 0.9.1, Avro, Oozie 3.2.0, Sqoop, Hive, Java 1.6, Eclipse, Teradata, HBase.
Confidential, Newark, DE
Java Developer
Responsibilities:
- Involved in Requirement analysis and design phase of Software Development Life cycle (SDLC).
- Designed, developed, and modified front-end UI using HTML, CSS, JavaScript, and jQuery.
- Involved in designing the front-end screens and prepared low-level design documents for the project.
- Wrote complex SQL and PL/SQL queries for stored procedures.
- Generated unit test cases with the help of internal tools.
- Used JavaScript and jQuery for development.
- Used HTML and CSS for an enriched front end.
- Developed Client applications to consume the Web services based on SOAP.
- Designed the projects using MVC architecture providing multiple views and thereby providing efficient modularity and scalability.
- Performed business validations at the back-end using Java modules and at the front-end using JavaScript.
- Performed building and deployment of EAR, WAR, and JAR files on test and stage systems in WebLogic Application Server.
- Used Singleton, DAO, DTO, and MVC design patterns.
- Involved in resolving Production Issues, Analysis, Troubleshooting and Problem Resolution.
- Involved in development and deployment of application on Linux environment.
- Involved in defect Tracking, Analysis and Resolution of bugs in system testing.
- Involved in Designing and creating database tables.
- Prepared project metrics for time, cost, schedule, and customer satisfaction (health of the project).
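The DAO/DTO separation used in this project can be sketched as follows. This is a minimal illustration, not the actual code: the `CustomerDto` entity and the in-memory map are hypothetical stand-ins for the real JDBC/Oracle-backed implementation.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the DAO/DTO pattern: the DTO is a plain data carrier, and the
// DAO is the only layer that knows how records are stored (an in-memory map
// here; JDBC against Oracle in the real application).
public class DaoSketch {

    // DTO: carries data between layers, no business logic.
    static class CustomerDto {
        final int id;
        final String name;
        CustomerDto(int id, String name) { this.id = id; this.name = name; }
    }

    // DAO: hides the persistence mechanism behind save/find operations.
    static class CustomerDao {
        private final Map<Integer, CustomerDto> store = new HashMap<>();

        void save(CustomerDto c) { store.put(c.id, c); }
        CustomerDto findById(int id) { return store.get(id); }
    }

    public static void main(String[] args) {
        CustomerDao dao = new CustomerDao();
        dao.save(new CustomerDto(1, "Acme"));
        System.out.println(dao.findById(1).name); // Acme
    }
}
```

Swapping the map for JDBC calls changes only the DAO, which is the point of the pattern: callers and DTOs are untouched.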
Environment: Java, J2EE, XML, Spring, Struts, Hibernate, Design Patterns, Maven, Eclipse, Toad, Apache Tomcat, and Oracle.
Confidential, Framingham, MA
Java/J2EE Developer
Responsibilities:
- Worked with Java, J2EE, Struts, web services, and Hibernate in a fast-paced development environment.
- Followed Agile methodology, interacted directly with the client on features, implemented optimal solutions, and tailored the application to customer needs.
- Involved in design and implementation of web tier using Servlets and JSP.
- Developed the user interface using JSP and JavaScript to view all online trading transactions.
- Designed and developed Data Access Objects (DAO) to access the database.
- Used the DAO Factory and Value Object design patterns to organize and integrate the Java objects.
- Coded Java Server Pages for dynamic front-end content that uses Servlets and EJBs.
- Coded HTML pages using CSS for static content generation, with JavaScript for validations.
- Used the JDBC API to connect to the database and carry out database operations.
- Used JSP and JSTL tag libraries for developing user interface components.
- Performed code reviews.
- Performed unit testing, system testing and integration testing.
- Involved in building and deployment of application in Linux environment.
Environment: Java, J2EE, JDBC, Struts, SQL, Hibernate, Eclipse, Apache POI, CSS.
Confidential, Jersey City, NJ
Application Packager
Responsibilities:
- Communicated with customers to understand their business and technical requirements, gathered information, and created requirements-gathering documents.
- Created MSI packages (Windows Installer Packages) using InstallShield AdminStudio; configured vendor-supplied MSIs with Install Tailor to create transforms (.MST).
- Used VMware to build images for workstations with sequencing and test builds.
- Created sequenced applications using App-V 4.5 in a Windows XP environment.
- Experience in creating, editing, upgrading, and customizing existing SoftGrid packaged sequences.
- Worked with the SoftGrid console to import sequenced applications and publish them for testing.
- Sequenced different vendor products such as Microsoft, Adobe, and Oracle software.
- Created documentation for the package creation, the configurations made, and the custom actions included.
- Converted several third-party (COTS) applications into MSI format (Windows Installer) using InstallShield AdminStudio.
- Used FileMon, RegMon, and Process Monitor tools to troubleshoot applications in a lockdown environment. Used the Beyond Compare tool to validate applications by comparing them with the gold image.
- Extensively used Wise Script Editor and VBScript to develop custom actions and create deployment wrappers per the standards.
- Applied GPOs to control the working environment of user accounts and computer accounts
Environment: Windows XP, App-V 4.5, InstallShield AdminStudio 9.0, VMware, SCCM 2007
Confidential, Round Rock, TX
Application Packaging Engineer
Responsibilities:
- Created silent MSI application packages using Wise Package Studio 6.0.
- Converted several third-party (COTS) applications into MSI.
- Involved in re-packaging, testing, deployment, and support of MSI packages.
- Created transforms for several vendor MSI applications using the Wise Package Studio Install Tailor tool.
- Used ORK (Office Resource Kit 2003) to create transforms for Microsoft products.
- Used Visual Basic scripting and Wise Script Editor to create custom actions.
- Worked extensively with the Windows Registry and file/registry security and permissions.
- Interacted with business unit management and support personnel to determine specifications on a day-to-day basis.
- Worked with software vendors to resolve installation problems.
- Used Windows Installer SDK development tools such as Orca as part of packaging and troubleshooting.
- Worked on VMware Workstation to create and test MSI packages and related tasks for package building.
- Used Sysinternals tools (FileMon, RegMon, Install Analyzer, etc.) as part of troubleshooting.
- Interacted with users and application technical contacts to provide application packaging and distribution support for package creation.
Environment: VBScript, Wise Package Studio 6.0, Wise Install Master 9.0, Ghost 8.x, Windows XP
Confidential
Application Scripter
Responsibilities:
- Created Windows Installer Applications using Wise Package Studio 5.6
- Worked on different Hardware devices like Printers, Scanners, and Other devices used in Staples Retail Environment.
- Worked on Windows XP and Server 2003 OS for Packaging the applications using Wise Package Studio and testing them.
- Tested pre-existing packaged applications on Windows XP using application compatibility testing.
- Used Package Validation to check for ICE errors and Microsoft Logo standards, and Test Expert to test the quality and functionality of the packaged applications.
- Created documentation for the package creation and the custom actions included.
- Used FileMon, RegMon, and Process Monitor tools to troubleshoot applications in a lockdown environment.
- Extensively used Wise Script Editor to develop custom actions and create deployment wrappers per Staples standards.
Environment: Windows XP, Windows 2003, Wise Package Studio 5.6, VMware, Altiris 6.0.