Hadoop Developer Resume
Conshohocken, PA
SUMMARY:
- 7+ years of experience in software development, spanning MVC, intranet, and industrial system development, including Hadoop development and administration.
- 3 years of experience in Java programming, including J2EE, JSP, Servlets, and JavaScript.
- 3 years of experience in Hadoop development and Big Data processing using Hadoop ecosystem tools.
- 1 year of experience in Hadoop administration.
- Thorough understanding of Hadoop concepts such as HDFS, NameNode, DataNode, NodeManager, ZooKeeper, and the MapReduce framework.
- Good knowledge of installing, configuring, and managing Hadoop clusters and ecosystem components such as Hive, Pig, Sqoop, Flume, Kafka, Storm, and Spark.
- Hands-on experience with Cloudera distributions and the Hortonworks Sandbox.
- Excellent database development skills and a solid understanding of both relational and NoSQL database technologies.
- Excellent knowledge of and experience with software development methodologies across the SDLC, including Waterfall, Agile, and Scrum, as well as business modeling.
- Expertise in writing, reviewing, and executing test cases.
- Conceptual knowledge of load, performance, and stress testing.
- Experienced in project monitoring, with strong problem-solving skills.
TECHNICAL SKILLS:
Programming Languages: Java, Pig Latin, HiveQL, Shell Script, C#, SQL, PL/SQL, PHP, VB.NET
Hadoop/Big Data Technologies: HDFS, MapReduce, ZooKeeper, Hive, Pig, Sqoop, Flume, Kafka, Storm, Spark, Oozie, Ganglia
Web Technologies: HTML, XML, CSS, JavaScript, Ajax, ASP.NET
Platforms: Windows 10/7/2000/XP Professional, Linux, Ubuntu, CentOS
Databases: Oracle 11g/10g/9i, HBase, SQL Server, MS Access, DB2
IDEs/Tools: Cloudera, Hortonworks Sandbox, Eclipse, NetBeans, Dreamweaver, Microsoft Visual Studio, PHPEdit, PL/SQL Developer
Methodologies: Agile, Scrum, Waterfall
PROFESSIONAL EXPERIENCE:
Confidential, Conshohocken, PA
Hadoop Developer
Responsibilities:
- Developed MapReduce programs in Java for parsing the raw data.
- Developed Oozie workflows to automate loading data into HDFS and pre-processing it with Pig.
- Developed UDFs in Pig using Java.
- Wrote Hive queries to provide a consolidated view of the data.
- Used the Hue interface to query the data.
- Worked on managing and reviewing Hadoop log files.
- Created multiple Hive tables and implemented partitioning, dynamic partitioning, and bucketing in Hive for efficient data access.
- Developed custom InputFormats and OutputFormats to meet business requirements.
- Installed and configured Hive and wrote Hive UDFs (see the UDF sketch at the end of this section).
- Involved in writing Java MapReduce programs to process data in HDFS.
- Involved in creating Hive external tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Involved in loading data into HBase using the HBase shell and Sqoop.
- Ran various Hive queries on the data dumps and generated aggregated datasets for downstream systems for further analysis.
- Accountable for performance tuning and resource management of Hadoop clusters and MapReduce routines.
- Developed combiners and partitioners to meet business requirements (a minimal sketch follows this list).
- Worked in a distributed, Agile environment.
- Participated in test case review calls with the test team.
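As an illustration of the combiner and partitioner work above, here is a minimal, self-contained sketch of such a MapReduce job; the class names, the tab-delimited record layout, and the first-character partitioning rule are illustrative assumptions, not the project's actual code.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Partitioner;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class EventCountJob {

        // Parses each raw tab-delimited line and emits (eventType, 1).
        public static class ParseMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text eventType = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split("\t");
                if (fields.length > 1 && !fields[1].isEmpty()) { // skip malformed rows
                    eventType.set(fields[1]);
                    context.write(eventType, ONE);
                }
            }
        }

        // Sums counts; reused as the combiner for map-side pre-aggregation.
        public static class SumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        // Routes keys by first character so related event types land together.
        public static class EventPartitioner extends Partitioner<Text, IntWritable> {
            @Override
            public int getPartition(Text key, IntWritable value, int numPartitions) {
                String k = key.toString();
                return (k.isEmpty() ? 0 : k.charAt(0)) % numPartitions;
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "event-count");
            job.setJarByClass(EventCountJob.class);
            job.setMapperClass(ParseMapper.class);
            job.setCombinerClass(SumReducer.class);     // cuts shuffle volume
            job.setReducerClass(SumReducer.class);
            job.setPartitionerClass(EventPartitioner.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Reusing the reducer as the combiner pre-aggregates counts on the map side, which reduces shuffle volume for counting jobs of this kind.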
Technologies: Hadoop, HDFS, MapReduce, Sqoop, Hive, Flume, Oozie, ZooKeeper, Cloudera distribution, JIRA, MySQL, Eclipse.
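The Hive UDF work mentioned above typically follows the pattern below; this is a minimal sketch, and the class name and normalization logic are hypothetical, not taken from the project.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Illustrative Hive UDF: strips non-digit characters from a string column.
    public final class NormalizePhone extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            // e.g. "(610) 555-0100" -> "6105550100"
            return new Text(input.toString().replaceAll("[^0-9]", ""));
        }
    }

Once packaged into a JAR, a UDF like this is registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before it can be used in queries.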
Confidential, Newark, NJ
Hadoop Developer
Responsibilities:
- Worked closely with business intelligence analysts to develop solutions.
- Involved in creating Hive tables and loading and analyzing data using Hive queries, which internally run as MapReduce jobs.
- Developed simple to complex MapReduce jobs using Hive and Pig.
- Analyzed large data sets by running Hive queries and Pig scripts (a minimal Hive query sketch follows this section).
- Involved in running Hadoop jobs to process millions of records of text data.
- Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
- Involved in loading data from the Linux file system into HDFS.
- Used the Oozie workflow engine to run multiple Hive and Pig jobs.
- Performed continual data backups using Falcon for data recovery and burst capacity.
- Involved in installing and configuring Hive, Pig, Sqoop, Flume, and Oozie on the Hadoop cluster.
- Responsible for managing data coming in from multiple sources.
- Extracted data from MySQL through Sqoop, placed it in HDFS, and processed it.
- Managed and reviewed Hadoop log files.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Assisted in exporting analyzed data to relational databases using Sqoop.
Technologies: Hadoop, HDFS, MapReduce, Hive, Pig, Sqoop, Oozie, MySQL, Linux, Ubuntu
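As a sketch of running Hive queries programmatically, the snippet below uses the HiveServer2 JDBC driver; the hostname, database, table, and query are placeholder assumptions (the actual analysis may equally have been run from the Hive CLI or Hue).

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveQueryRunner {
        public static void main(String[] args) throws Exception {
            // HiveServer2 JDBC driver; host, database, and table are placeholders.
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://hiveserver-host:10000/default", "user", "");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                     "SELECT category, COUNT(*) FROM events GROUP BY category")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
                }
            }
        }
    }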
Confidential, Allendale, NJ
Hadoop Administrator
Responsibilities:
- Provided administration, management, and support for mid-scale Big Data platforms on the Hadoop ecosystem.
- Involved in Cluster Capacity planning, deployment and managing Hadoop for our data platform operations.
- Implemented large multi-node Hadoop clusters in the AWS cloud from scratch.
- Installed and configured Hadoop clusters using Puppet.
- Decommissioned failed nodes and commissioned new nodes as the cluster grew to accommodate more data on HDFS.
- Enabled High Availability to avoid data loss and cluster downtime.
- Configured backups and performed recovery from NameNode failures.
- Performed cluster upgrades and patching with proper backup plans and without data loss.
- Configured rack awareness for quick availability and processing of data.
- Supported HDFS data storage and MapReduce job execution (see the HDFS client sketch after this section).
- Involved in analyzing system failures, identifying root causes, and recommending courses of action; documented system processes and procedures for future reference.
- Managed and reviewed Hadoop log files as part of administration for troubleshooting purposes.
- Applied extensive knowledge of HDFS and YARN implementation to improve cluster performance.
- Installed and configured Sqoop to establish import/export pipelines with databases such as MySQL.
- Installed and configured Flume to establish ingestion pipelines from various unstructured and semi-structured data sources.
- Worked with systems engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters.
Technologies: Hadoop, MapReduce, YARN, Cloudera Manager, Flume, Sqoop, Red Hat Linux, Amazon Web Services (AWS).
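For the HDFS storage support mentioned above, a small Java client using the Hadoop FileSystem API can report per-file replication and size, the kind of routine health check an administrator runs; the class name and directory argument are illustrative.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsReplicationCheck {
        public static void main(String[] args) throws Exception {
            // Picks up core-site.xml/hdfs-site.xml from the classpath.
            Configuration conf = new Configuration();
            try (FileSystem fs = FileSystem.get(conf)) {
                // Report replication factor and size for each file in a directory.
                for (FileStatus status : fs.listStatus(new Path(args[0]))) {
                    System.out.printf("%s  repl=%d  size=%d%n",
                        status.getPath(), status.getReplication(), status.getLen());
                }
            }
        }
    }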
Confidential, Chantilly, VA
Java Developer
Responsibilities:
- Worked with Agile and Waterfall methodologies, participated in monthly sprints and weekly scrums.
- Gathered requirements from functional teams.
- Implemented OOP concepts to good effect in various parts of the application.
- Developed Java applications using Struts, JSP, and Hibernate (see the Hibernate sketch after this section).
- Coded in Java and JavaScript, implementing design patterns consistent with the existing application.
- Involved in developing the presentation layer and GUI framework in JSP and HTML.
- Worked with Ajax on the client side for asynchronous communication with the server.
- Worked across the full development life cycle in an Agile environment.
- Mentored a team of developers in Java, J2EE, unit testing, configuration management and agile methodologies.
- Interacted with developers and testers to resolve major bugs in later phases.
Technologies: Core Java 1.6, JSP, HTML, JavaScript, CSS, AJAX, Oracle, Eclipse
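A minimal sketch of the Hibernate persistence pattern used in applications like the one above; the Customer entity, its mapping file, and the DAO shape are illustrative assumptions, not the project's actual model.

    import org.hibernate.Session;
    import org.hibernate.SessionFactory;
    import org.hibernate.Transaction;
    import org.hibernate.cfg.Configuration;

    public class CustomerDao {
        // Reads hibernate.cfg.xml from the classpath.
        private static final SessionFactory FACTORY =
            new Configuration().configure().buildSessionFactory();

        public void save(Customer customer) {
            Session session = FACTORY.openSession();
            Transaction tx = session.beginTransaction();
            try {
                session.save(customer);   // INSERT via the Customer mapping
                tx.commit();
            } catch (RuntimeException e) {
                tx.rollback();
                throw e;
            } finally {
                session.close();
            }
        }
    }

    // Plain POJO; assumes a Customer.hbm.xml mapping declared in hibernate.cfg.xml.
    class Customer {
        private Long id;
        private String name;
        public Long getId() { return id; }
        public void setId(Long id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }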
Confidential, Philadelphia, PA
Jr. Java Developer
Responsibilities:
- Developed, integrated, and maintained new and existing Java/J2EE applications.
- Worked in a dynamic development environment and was responsible for delivering reliable software components.
- Participated in the analysis, design, implementation, test, and documentation phases.
- Designed and built web services that integrate with multiple backend systems.
- Evaluated third-party and open-source software and integrated it with new, ongoing, and existing applications.
- Developed database script modifications using SQL.
Technologies: Java/J2EE, MySQL, Apache Tomcat, NetBeans
Confidential
Trainee and Team Member
Responsibilities:
- Eliminated the need for hard coding by generating helpful information from existing data.
- Generated and printed barcodes and labels for each level of production and packaging.
- The system also logs all modified entries along with details such as who made a change and when.
- Developed web pages using HTML and JSP.
- Developed server-side code using Servlets, JSP, and Enterprise Beans running on Apache Tomcat 4.1.30.
- Created database tables in Oracle 10g and used JDBC to perform database operations.
- Created PL/SQL queries, functions, procedures, and triggers for database operations (a minimal JDBC sketch follows this list).
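A minimal sketch of invoking an Oracle stored procedure over JDBC, in the spirit of the database work above; the connection URL, credentials, procedure name, and parameters are all hypothetical.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Types;

    public class LabelPrintLogger {
        public static void main(String[] args) throws Exception {
            // Oracle thin-driver URL; host, SID, credentials, and the
            // log_label_print procedure are illustrative assumptions.
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@dbhost:1521:orcl", "user", "pass");
                 CallableStatement cs =
                     conn.prepareCall("{call log_label_print(?, ?)}")) {
                cs.setString(1, args[0]);              // barcode value to log
                cs.registerOutParameter(2, Types.INTEGER);
                cs.execute();
                System.out.println("log id = " + cs.getInt(2));
            }
        }
    }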
Confidential
Responsibilities:
- Built an application, per the client's requirements, to file income tax details for her business.
- It provides options to file income tax quarterly, half-yearly, or yearly, as the client requires.
- The application's GUI and flow made it easy for the client to enter data.
- Used advanced mathematical equations and functions where required.
- The application's logic was written in C#.
- Created the database, provided a secure database connection, and managed all database operations in Oracle.
Confidential
Responsibilities:
- The system maintains records for all doctors working in the hospital as well as referral doctors.
- It also maintains detailed records for all patients, including allergies, blood group, and other personal information, which helps doctors on later visits and saves time in emergencies.
- The system also manages the hospital's entire inventory, from medicine to bed sheets.
- It also handles transactions for ordering, purchasing, shipping, and receiving inventory.
- The client owned more than one hospital in the region, so the system was integrated across all of them.
- The application was built on the .NET Framework, with all forms created in Visual Basic .NET.
- The database was created in MS Access, with connectivity handled through a connection string.
- Created Crystal Reports for reporting.