
Hadoop Developer Resume


MN

SUMMARY

  • Over 7 years of experience with the System Development Life Cycle (SDLC), including analysis, design, development, testing, implementation and support, with exposure to databases
  • In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS, MapReduce v1 and YARN concepts
  • Experience with configuration of Hadoop Ecosystem components: Hive, HBase, Pig, Sqoop, Mahout, Zookeeper
  • Proficient in using Cloudera Manager, an end-to-end tool to manage Hadoop operations
  • Experience in Database design, Entity relationships, Database analysis, SQL programming, PL/SQL Stored Procedures, Packages and Triggers in Oracle and SQL Server on Windows and UNIX.
  • Basic MySQL administration skills
  • Proficient in UNIX/Linux (CentOS, RHEL) and Windows operating systems.
  • Knowledge of Relational Database Management System (RDBMS).
  • Strong working experience in Data Analysis, Modeling, Logical and Physical database design.
  • Experience in creating different kinds of Partitions on tables and indexes for managing large tables.
  • Proficient in ETL Processes using Oracle PL/SQL, Unix Scripts and SQL*Loader for data migration to Enterprise Data Warehouse with large data volume.
  • Ability to understand Functional specifications, Business requirements and technical Design documents.
  • Excellent team player with the ability to work independently and interact with people at all levels.
  • Excellent communication and interpersonal skills.
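
To illustrate the partitioning experience above, a minimal Oracle DDL sketch (the table and column names are hypothetical):

```sql
-- Hypothetical sales table, range-partitioned by date so that old
-- partitions can be truncated, exchanged or archived independently.
CREATE TABLE sales_fact (
    sale_id    NUMBER       NOT NULL,
    sale_date  DATE         NOT NULL,
    amount     NUMBER(12,2)
)
PARTITION BY RANGE (sale_date) (
    PARTITION p_2013_q1 VALUES LESS THAN (TO_DATE('2013-04-01', 'YYYY-MM-DD')),
    PARTITION p_2013_q2 VALUES LESS THAN (TO_DATE('2013-07-01', 'YYYY-MM-DD')),
    PARTITION p_max     VALUES LESS THAN (MAXVALUE)
);

-- A LOCAL index is equipartitioned with the table, so partition
-- maintenance does not invalidate the whole index.
CREATE INDEX idx_sales_date ON sales_fact (sale_date) LOCAL;
```

Queries that filter on sale_date can then benefit from partition pruning, scanning only the relevant partitions.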

TECHNICAL SKILLS

Big Data: HDFS, Hive, Pig, HBase, Sqoop, MapReduce v1, YARN

Languages: PL/SQL, SQL, Shell Programming, Java.

Databases: Oracle 11g/10g/9i, SQL Server.

GUI / Tools / Utilities: SQL*Loader, TOAD 8.0/7.1, PL/SQL Developer, Data Pump (Import/Export), SQL*Plus, ERWIN 3.5/4.0, SQL Navigator, MS Visio, vi Editor, Oracle Wrap, FTP, SFTP, Microsoft Excel, Microsoft PowerPoint.

Operating Systems: Windows (2000/XP), UNIX/Linux, Solaris, AIX, HP-UX.

Scripting: Unix Shell Scripting

Reporting Tools: Oracle Reports 9i

PROFESSIONAL EXPERIENCE

Confidential, MN

Hadoop Developer

Responsibilities:

  • Analyzed the existing system to develop a migration plan to Hadoop
  • Designed, developed, tested and deployed the new system in the Hadoop environment
  • Imported and exported data between relational databases and HDFS, Hive and HBase using Sqoop
  • Extracted files from HBase and placed them in HDFS/Hive for processing
  • Exported the analyzed data to a relational database / data warehouse and generated reports for data visualization using RHadoop and OBIEE
  • Used Hive and Pig to analyze data from HDFS
  • Loaded and transformed large sets of structured, semi-structured and unstructured data
  • Hands on experience in developing Sqoop jobs to import data from RDBMS sources into HDFS as well as export data from HDFS into Relational tables
  • Responsible to manage data coming from different sources
  • Supported MapReduce programs running on the cluster
  • Involved in loading data from UNIX file system to HDFS.
  • Involved in creating Hive tables, loading with data and writing hive queries
  • Automated all the jobs, for pulling data from FTP server to load data into Hive tables, using Oozie workflows.
  • Implemented Oozie Workflow to run multiple PIG scripts and Hive queries
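
The Hive-side loading described above can be sketched in HiveQL as follows (table names, columns and HDFS paths are hypothetical, assuming comma-delimited data landed in HDFS by Sqoop):

```sql
-- External table over raw files in HDFS; dropping it leaves the data in place.
CREATE EXTERNAL TABLE staging_orders (
    order_id   BIGINT,
    customer   STRING,
    amount     DOUBLE,
    order_date STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/staging/orders';

-- Managed table partitioned by date for efficient date-bounded queries.
CREATE TABLE orders (
    order_id BIGINT,
    customer STRING,
    amount   DOUBLE
)
PARTITIONED BY (order_date STRING);

-- Dynamic partition insert: the partition column comes last in the SELECT.
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

INSERT OVERWRITE TABLE orders PARTITION (order_date)
SELECT order_id, customer, amount, order_date
FROM   staging_orders;
```

The external/managed split keeps the raw Sqoop landing zone untouched while the partitioned table serves analytic queries.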

Environment: Hadoop, MapReduce, HDFS, Hive, Java (JDK 1.6), Hadoop distributions from Hortonworks, Cloudera and MapR, IBM DataStage 8.1 (Designer, Director, Administrator), Flat files, Oracle 11g/10g, PL/SQL, SQL*Plus, Toad 9.6, Windows NT, UNIX Shell Scripting

Confidential, TX

Hadoop Developer

Responsibilities:

  • Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
  • Supported code/design analysis, strategy development and project planning.
  • Created reports for the BI team using Sqoop to export data into HDFS and Hive.
  • Importing and exporting data into HDFS and Hive using Sqoop
  • Developed Pig Latin scripts to extract data from web server output files and load it into HDFS
  • Executed queries using Hive and developed Map-Reduce jobs to analyze data.
  • Developed Hive queries for the analysts.
  • Used Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data.
  • Provided design recommendations to sponsors/stakeholders that improved review processes and resolved technical problems.
  • Managed and reviewed Hadoop log files.
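
A typical analyst-facing Hive query of the kind described above might look like this (the web_logs table and its columns are hypothetical):

```sql
-- Daily counts of server-side errors from parsed web server logs,
-- the sort of aggregate handed off to a BI/reporting team.
SELECT request_date,
       status_code,
       COUNT(*) AS hits
FROM   web_logs
WHERE  status_code >= 500
GROUP BY request_date, status_code
ORDER BY request_date, status_code;
```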

Environment: Eclipse, Oracle 10g, Hadoop, MapReduce, HDFS, Hive, MapR, Linux, SQL, Toad 9.6.

Confidential, TX

Oracle Developer

Responsibilities:

  • Involved in developing SQL*Loader scripts for data loading.
  • Created database objects like tables, synonyms, sequences and views.
  • Gathered business requirements.
  • Created procedures and functions to implement business requirements.
  • Involved in the optimization of Oracle queries/scripts which resulted in substantial performance improvement for the conversion processes using Oracle Hints, Explain Plans and Trace Sessions.
  • Worked on Materialized Views.
  • Used Informatica Power Center for (ETL) extraction, transformation and loading data from heterogeneous source systems
  • Extracted, transformed, cleansed, and loaded (ETL) Long Range Planning Department data from their legacy RBase system to Oracle.
  • Imported data from Excel spreadsheets into Oracle tables, exported data from Oracle tables to Excel spreadsheets, and exported Crystal Reports to Excel spreadsheets.
  • Worked with Bulk Collects to improve the performance of multi-row queries.
  • Profiled and traced PL/SQL programs to analyze execution and enhance performance.
  • Created indexes for faster retrieval and improved query performance.
  • Worked on PL/SQL Tables, Records and Collections.
  • Implemented triggers based on the business rules.
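
The Bulk Collect work mentioned above can be sketched as a minimal PL/SQL block (the orders table and status values are hypothetical):

```sql
-- BULK COLLECT fetches all matching rows in one round trip instead of
-- row-by-row cursor fetches; FORALL then applies the updates in bulk.
DECLARE
    TYPE t_ids IS TABLE OF orders.order_id%TYPE;
    v_ids t_ids;
BEGIN
    SELECT order_id
    BULK COLLECT INTO v_ids
    FROM   orders
    WHERE  status = 'PENDING';

    FORALL i IN 1 .. v_ids.COUNT
        UPDATE orders
        SET    status = 'PROCESSED'
        WHERE  order_id = v_ids(i);

    COMMIT;
END;
/
```

For very large result sets, the same pattern is usually combined with a LIMIT clause inside a loop to bound PGA memory usage.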

Environment: Oracle 10g, UNIX/Linux RHEL 5, Oracle Enterprise Manager, WinCVS, ERWIN 3.5, Toad 7.5, Oracle Reports 9i, XML.

Confidential

PL/SQL Developer

Responsibilities:

  • Involved in interacting with the end-user (client) to gather business requirements.
  • Resolved application problems to maintain high level of customer satisfaction.
  • Involved in performance fine-tuning of the queries/report using PL/SQL and SQL Plus.
  • Developed UNIX shell scripts to automate backend jobs and load data into the database using SQL*Loader.
  • Handled errors using system-defined exceptions such as INVALID_NUMBER and NO_DATA_FOUND, and user-defined exceptions with PRAGMA EXCEPTION_INIT.
  • Wrote shell scripts in Crontab to automate Backend Jobs.
  • Minimized CPU overhead by tuning SQL statements, subprogram invocations and computation-intensive PL/SQL code.
  • Analyzed Oracle objects and created Partitions for very large tables to reduce disk contention and improve performance.
  • Managed tables, indexes, constraints, views, sequences, synonyms and stored program units.
  • Developed database triggers required for the Integrity constraints.
  • Created Logical and Physical Models using ERWIN.
  • Performed performance tuning and query optimization.
  • Participated in Performance Tuning of SQL queries using Explain Plan to improve the performance of the application.
  • Coded complex SQL queries to retrieve data from the database depending on the need.
  • Created Cursors and Ref cursors as a part of the procedure to retrieve the selected data.
  • Wrote PL/SQL cursors for transaction processing.
  • Wrote queries for management in the form of stored procedures and packages.
  • Involved in writing complex queries to generate reports as per client request as a part of production support.
  • Extracted data from Flat files, Oracle and SQL Server sources.
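
The exception handling described above can be sketched as follows (the customers table and the constraint scenario are hypothetical):

```sql
-- PRAGMA EXCEPTION_INIT binds a named exception to an Oracle error code
-- so it can be caught by name; ORA-02292 is "child record found".
DECLARE
    e_child_record EXCEPTION;
    PRAGMA EXCEPTION_INIT(e_child_record, -2292);
    v_name customers.name%TYPE;
BEGIN
    SELECT name INTO v_name FROM customers WHERE customer_id = 101;
    DELETE FROM customers WHERE customer_id = 101;
EXCEPTION
    WHEN NO_DATA_FOUND THEN
        DBMS_OUTPUT.PUT_LINE('No such customer.');
    WHEN e_child_record THEN
        DBMS_OUTPUT.PUT_LINE('Customer has dependent rows; delete them first.');
END;
/
```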

Environment: Oracle 9i, Solaris 5.9, SQL Developer 1.1, Serena PVCS, Oracle Enterprise Manager, Windows 2000, Oracle Forms 9i

Confidential

PL/SQL Developer

Responsibilities:

  • Implemented PL/SQL Tables of Records to improve performance and provide a temporary storage area.
  • Migrated data from text files to an Oracle database.
  • Developed SQL queries to fetch complex data from different tables in remote databases using database links.
  • Used Oracle pre-defined packages DBMS_SQL and UTL_FILE.
  • Handled application errors using the RAISE_APPLICATION_ERROR built-in procedure.
  • Created Materialized Views.
  • Utilized tools like TOAD during development of the application.
  • Debugged PL/SQL code using DBMS_OUTPUT.
  • Created DDL scripts to create, alter, drop tables, views, synonyms and sequences.
  • Fine-tuned SQL queries to improve execution time.
  • Developed UNIX Shell scripts to automate repetitive database processes.
  • Participated in application planning, design activities by interacting and collecting requirements from the end users.
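
The UTL_FILE and RAISE_APPLICATION_ERROR usage noted above can be sketched as follows (the EXPORT_DIR directory object, file name and customers table are hypothetical):

```sql
-- Writes a delimited extract via UTL_FILE; UTL_FILE.FOPEN takes an Oracle
-- directory object, a file name and an open mode ('w' = write).
DECLARE
    v_file UTL_FILE.FILE_TYPE;
BEGIN
    v_file := UTL_FILE.FOPEN('EXPORT_DIR', 'customers.txt', 'w');
    FOR rec IN (SELECT customer_id, name FROM customers) LOOP
        UTL_FILE.PUT_LINE(v_file, rec.customer_id || '|' || rec.name);
    END LOOP;
    UTL_FILE.FCLOSE(v_file);
EXCEPTION
    WHEN UTL_FILE.INVALID_PATH THEN
        -- Surface a custom application error in the user-defined range.
        RAISE_APPLICATION_ERROR(-20001, 'Directory EXPORT_DIR is not accessible');
END;
/
```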

Environment: Oracle 9i, VB 6, UNIX, SQL*Loader, SQL*Plus
