Senior Hadoop Developer Resume
Sunnyvale, CA
SUMMARY
- Over 6 years of software development experience across a variety of industries, including hands-on experience with Big Data technologies.
- In-depth understanding of Hadoop architecture and its components, such as HDFS, NameNode, DataNode, ResourceManager, ApplicationMaster, NodeManager, JobTracker, and TaskTracker.
- Knowledge of Big Data analysis tools such as Hadoop, HDFS, MapReduce, Hive, Pig, Oozie, Sqoop, and Hue.
- Acquainted with the Cloudera, Hortonworks Data Platform, and Apache Hadoop distributions.
- Experience developing MapReduce programs on Apache Hadoop to process Big Data.
- Excellent understanding of both classic MapReduce and YARN and their applications in Big Data analytics.
- Developed a data validation framework using MapReduce.
- Expertise in writing Hadoop jobs to analyze data using Hive and Pig.
- Experience in importing and exporting data with Sqoop between HDFS and relational database systems.
- Experience in scheduling and monitoring Hadoop workflows using Oozie.
- Good experience working with NoSQL databases such as Cassandra, including its architecture.
- Good knowledge of creating keyspaces and tables and of inserting into and updating tables in Cassandra.
- Good experience in core and advanced Java technologies.
- Good experience in web application development using the Spring framework.
- Good experience working with relational databases such as MySQL, Oracle, DB2, and Teradata.
- Good experience working with RESTful web services.
- Good experience writing Unix shell scripts.
- Hands-on experience with IDEs such as Rational Application Developer and Eclipse, code management tools such as Rational Team Concert (RTC) and MKS, and build tools such as Maven.
- Experience in various domains, including Banking, Telecom, and Agriculture.
- Strong experience with and understanding of software development methodologies such as Waterfall and Agile (Scrum).
- Recognized with the Confidential RISE Award, the Confidential UK to GLT Award, and the Confidential Above and Confidential Expectations Award for the best service in the period of .
- Certified as an Oracle Certified Professional, Java SE 6 Programmer (Exam Code: ).
- Handled several techno-functional responsibilities, including estimation, identifying functional and technical gaps, requirements gathering, solution design, development, documentation, and production support.
- An individual with excellent interpersonal and communication skills, strong business acumen, creative problem-solving skills, technical competency, the ability to learn new technologies quickly and adapt to new environments, self-motivation, a team-player spirit, and leadership skills.
TECHNICAL SKILLS
Operating Systems: Unix, Linux, Windows NT, Windows 95/98, Windows 2000 Professional, Windows XP/VISTA/7/8/10.
Hadoop Ecosystem & Tools: Apache Hadoop, HDFS, MapReduce, YARN, Hive, Tez, Pig, Oozie, Sqoop, Hue
NoSQL Databases: Apache Cassandra
Hadoop Distributions: Hortonworks and Cloudera Hadoop distributions
Languages: Java (JDK 1.4/1.5/1.6/1.7/1.8), SQL
Web/Distributed Technologies: J2EE, Servlets 2.x/3.x, JSP 2.x, JDBC 2.0/3.0, RMI, XML, SOA, UML, MVC, Spring, Struts
Scripting: JSP 2.x, Java Script, HTML, CSS, Unix Shell Scripting.
Frameworks: J2EE, JDBC, RMI, Struts 1.1/1.2, Spring Framework
RDBMS: Oracle 8i/9i/10g, DB2, MySQL, Derby, Teradata
Web/Application Servers: WebSphere Application Server 5.1/6.0/7.0, JBoss, Apache Tomcat
IDE / Tools: RAD, RTC, Eclipse 3.1, Maven, MKS, Log4j, Quality Center
Methodologies: Unified Modeling Language (UML), Agile Methodology (Scrum)
PROFESSIONAL EXPERIENCE
Senior Hadoop Developer
Confidential, Sunnyvale, CA
Responsibilities:
- Responsible for Cassandra installation, configuration, troubleshooting, monitoring, and upgrades.
- Working on architecting and creating Cassandra database systems.
- Responsible for designing the Cassandra data model, including keyspaces and column families.
- Writing CQL scripts to create, alter, and drop keyspaces and column families.
- Writing CQL scripts to perform CRUD operations on the Cassandra database.
- Responsible for writing Cassandra batch statements using cqlsh.
- Creating Cassandra tables to load large sets of structured, semi-structured, and unstructured data coming from UNIX, NoSQL, and a variety of portfolios.
- Monitoring Cassandra and tuning Bloom filters.
- Designing UNIX shell scripts for retrieving cluster and node information.
- Creating and maintaining Technical documentation for launching Cassandra Clusters and for executing CQL queries.
- Evaluate options for meeting user needs, and ensure that system requirements are identified, prioritized, and incorporated effectively and efficiently.
- Troubleshoot complex development and production application problems and provide technical and production support on an on-call basis.
- Work with project stakeholders to define system requirements for various projects.
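The CQL work described above can be sketched as follows; the keyspace, table, and column names are illustrative and not taken from the actual project.

```sql
-- Create a keyspace (illustrative names; SimpleStrategy for a single data center)
CREATE KEYSPACE IF NOT EXISTS portfolio
  WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 3};

-- Column family (table) keyed for per-account lookups
CREATE TABLE IF NOT EXISTS portfolio.positions (
  account_id text,
  symbol     text,
  quantity   int,
  updated_at timestamp,
  PRIMARY KEY (account_id, symbol)
);

-- CRUD operations
INSERT INTO portfolio.positions (account_id, symbol, quantity, updated_at)
VALUES ('A1001', 'AAPL', 50, toTimestamp(now()));

UPDATE portfolio.positions SET quantity = 75
WHERE account_id = 'A1001' AND symbol = 'AAPL';

SELECT * FROM portfolio.positions WHERE account_id = 'A1001';

DELETE FROM portfolio.positions WHERE account_id = 'A1001' AND symbol = 'AAPL';

-- Batch statement, as executed via cqlsh
BEGIN BATCH
  INSERT INTO portfolio.positions (account_id, symbol, quantity, updated_at)
  VALUES ('A1002', 'MSFT', 20, toTimestamp(now()));
  UPDATE portfolio.positions SET quantity = 30
  WHERE account_id = 'A1002' AND symbol = 'MSFT';
APPLY BATCH;
```

Shell scripts for cluster and node information of the kind mentioned above would typically wrap commands such as `nodetool status` and `nodetool info`.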
Hadoop Developer
Confidential, Florham Park, NJ
Responsibilities:
- Responsible for loading data from the BDW Oracle database and Teradata into HDFS using Sqoop.
- Imported and exported data into HDFS and Hive using Sqoop.
- Wrote MapReduce and Hive jobs to process data per business requirements for further analysis by data analysts.
- Used Hive partitioning and bucketing to improve performance and to maintain historical data in tables in a modular fashion.
- Involved in designing and developing the Hive data model, loading data, and writing Java UDFs for Hive.
- Used Pig as an ETL tool to perform transformations, event joins, and pre-aggregations before storing the data in HDFS.
- Developed Pig UDFs to process the data for analysis.
- Developed Pig Latin scripts for transformations, event joins, and filtering.
- Wrote Oozie workflows to schedule batch jobs.
- Monitored application status using the ResourceManager web console.
- Involved in installing and configuring the Hadoop ecosystem using the Hortonworks distribution.
- Performed analysis, resolved problems, and monitored systems to proactively prevent problems from occurring.
- Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
- Troubleshoot complex development and production application problems and provide technical and production support on an on-call basis;
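The Sqoop ingestion and Hive partitioning/bucketing described above might look like the following sketch; the connection string, directory, table, and column names are hypothetical.

```shell
# Import an Oracle table into HDFS (hypothetical connection details)
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/BDW \
  --username etl_user -P \
  --table SALES \
  --target-dir /data/raw/sales \
  --num-mappers 4
```

```sql
-- Partitioned, bucketed Hive table to keep historical data modular
CREATE TABLE sales_hist (
  order_id    BIGINT,
  amount      DOUBLE,
  customer_id STRING
)
PARTITIONED BY (load_date STRING)
CLUSTERED BY (customer_id) INTO 32 BUCKETS
STORED AS ORC;

-- Load one partition per day via dynamic partitioning
SET hive.exec.dynamic.partition.mode=nonstrict;
INSERT OVERWRITE TABLE sales_hist PARTITION (load_date)
SELECT order_id, amount, customer_id, load_date FROM sales_staging;
```

Partitioning by load date prunes scans to the dates a query touches, while bucketing on the customer key spreads each partition evenly and speeds up joins on that key.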
Hadoop Developer
Confidential, New York, NY
Responsibilities:
- Involved in collecting the business requirements for the project.
- Attended business meetings to gather the requirements and wrote the functional requirements document based on them.
- Participated in technical discussions and overall architecture, and communicated with the other integration teams.
- Used Sqoop to import/export data between the RDBMS and HDFS.
- Wrote MapReduce and Hive jobs to process data per business requirements for further analysis by data analysts.
- Used Hive partitioning and bucketing to improve performance and to maintain historical data in tables in a modular fashion.
- Wrote Oozie workflows to schedule batch jobs.
- Extensively worked on unit and integration testing.
- Actively involved in QA and production bug fixes.
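An Oozie workflow of the kind used to schedule these batch jobs could be sketched as below; the workflow, action, and script names are hypothetical.

```xml
<workflow-app name="daily-batch" xmlns="uri:oozie:workflow:0.5">
  <start to="hive-node"/>
  <!-- Run a Hive script as one batch step -->
  <action name="hive-node">
    <hive xmlns="uri:oozie:hive-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <script>etl.hql</script>
    </hive>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Hive job failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
  </kill>
  <end name="end"/>
</workflow-app>
```

For recurring batch runs, a workflow like this is typically paired with an Oozie coordinator that triggers it on a schedule (e.g. daily) and passes in the dataset date.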
Java Developer
Confidential
Responsibilities:
- Involved in walking through the functional requirements and estimating the effort for delivering these requirements.
- Developed the backend code, which contains the business logic and interacts with the database.
- Performed unit testing and delivered quality code.
- Performed build and deployment activities using Maven.
- Involved in database schema design and developed stored procedures.
Java Developer
Confidential, New York, NY
Responsibilities:
- Involved in walking through the functional requirements and estimating the effort for delivering these requirements.
- Developed the front-end UI screens and the respective business logic for all modules.
- Implemented the application using the Spring MVC framework.
- Performed unit testing and delivered quality code.
- Performed build and deployment activities using Maven.
- Involved in database schema design and developed stored procedures. Extensively worked on unit and integration testing.
- Actively involved in QA and production bug fixes.