
Big Data Lead Resume


Foster City, CA

SUMMARY:

  • 17 years of experience in analysis, design, development, testing, implementation, production support, and maintenance in the IT industry, including 3 years of experience with Big Data ecosystem technologies (HDFS, MapReduce, Pig, Hive, HBase, Sqoop, and Flume).
  • Complete understanding of Hadoop fundamentals: the MapReduce distributed computing paradigm, Hadoop 1.0 and 2.0, MRv1 and MRv2/YARN, JobTracker, TaskTracker, HDFS, NameNode, DataNode, and HDFS Federation.
  • A robust supervisor with expertise in coaching, mentoring, and leading resources into becoming highly productive team members in a lean and agile environment.
  • Expertise in developing Hive and Pig reports.
  • Extensive knowledge of creating managed tables and external tables in the Hive ecosystem.
  • Expertise in tuning high-volume MapReduce jobs by managing InputSplits to mappers, developing custom combiners and partitioners, and optimizing the shuffle/sort phase by implementing a RawComparator (a minimal sketch follows this list).
  • Experience in importing and exporting data between HDFS and relational databases using Sqoop.
  • Expertise in developing custom UDFs in Java to extend Hive and Pig Latin functionality (see the UDF sketch after this list).
  • Excellent understanding and knowledge of NoSQL databases like MongoDB, HBase and Cassandra.
  • Expertise in developing UNIX shell scripts.
  • Experience working with software methodologies such as Agile Unified Process (AUP), Scrum, Rational Unified Process (RUP), and Waterfall.
  • Experienced in the full Software Development Life Cycle (SDLC): requirements analysis, design, development, testing, implementation, and maintenance.
  • Experience in Object-Oriented Analysis and Design (OOAD) using UML (use case, activity, sequence, and class diagrams, etc.).
  • Broad understanding and experience of real-time analytics, data modeling, data management, and analytical tools, with hands-on RDBMS development experience including complex SQL queries, PL/SQL, views, stored procedures, and triggers.
  • Hands-on working experience with MS Word, MS Excel, MS Project, Visio, and PowerPoint.
  • Expertise in system design to deliver strategic business solutions, with expert-level experience across the full software testing cycle. Self-starter able to handle multiple projects while meeting deadlines, with strong technical, analytical, and problem-solving skills.
  • Good team player with the ability to work in a collaborative, team-oriented environment.
  • Extensively worked in a global delivery model, managing resources both on-site and off-shore.
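
A minimal sketch, in Java, of the MapReduce tuning hooks referenced above: a custom partitioner plus a RawComparator for the shuffle/sort phase. The class names and the assumed "prefix|rest" key format are illustrative, not code from an actual engagement.

  import org.apache.hadoop.io.IntWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.io.WritableComparator;
  import org.apache.hadoop.io.WritableUtils;
  import org.apache.hadoop.mapreduce.Partitioner;

  // Routes keys by a hashed prefix so related records reach the same reducer.
  public class PrefixPartitioner extends Partitioner<Text, IntWritable> {
      @Override
      public int getPartition(Text key, IntWritable value, int numPartitions) {
          String prefix = key.toString().split("\\|", 2)[0];
          return (prefix.hashCode() & Integer.MAX_VALUE) % numPartitions;
      }
  }

  // Compares serialized Text keys byte-by-byte during shuffle/sort,
  // avoiding the cost of deserializing each key.
  class RawTextComparator extends WritableComparator {
      protected RawTextComparator() {
          super(Text.class);
      }

      @Override
      public int compare(byte[] b1, int s1, int l1, byte[] b2, int s2, int l2) {
          // Skip the vint length header that Text writes before its bytes.
          int n1 = WritableUtils.decodeVIntSize(b1[s1]);
          int n2 = WritableUtils.decodeVIntSize(b2[s2]);
          return compareBytes(b1, s1 + n1, l1 - n1, b2, s2 + n2, l2 - n2);
      }
  }

Both classes would be attached to a job via job.setPartitionerClass(PrefixPartitioner.class) and job.setSortComparatorClass(RawTextComparator.class).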
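
Likewise, a minimal sketch of a custom Hive UDF in Java; the masking function, its name, and its logic are hypothetical, shown only to illustrate the pattern.

  import org.apache.hadoop.hive.ql.exec.UDF;
  import org.apache.hadoop.io.Text;

  // Masks all but the last four characters of a string column.
  public final class MaskUdf extends UDF {
      public Text evaluate(Text input) {
          if (input == null) {
              return null;
          }
          String s = input.toString();
          if (s.length() <= 4) {
              return input;
          }
          StringBuilder masked = new StringBuilder();
          for (int i = 0; i < s.length() - 4; i++) {
              masked.append('*');
          }
          masked.append(s.substring(s.length() - 4));
          return new Text(masked.toString());
      }
  }

Packaged in a JAR, such a function is registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION, then called like any built-in function in HiveQL.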

TECHNICAL SKILLS:

Operating Systems: Linux, Unix, Windows XP/7, z/OS, MVS/ESA

Big Data: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, Oozie, HBase, Impala, Zookeeper

Languages: C, C++, Java, J2EE, Python, COBOL, SQL, and PL/SQL 2.x

Database: Oracle 10g/11g, MySQL, DB2 9.0, IMS, MongoDB

Java Web Technologies: Servlets 3.0, JDBC 4.0, JSP 2.0

Web Technologies: HTML, JavaScript, XML, XSLT

Application Servers: WebLogic 11g, WebSphere 6.1/7.0

Web Servers: Tomcat 6.0

IDE Tools: Eclipse

Defect Tracking Tools: HP Quality Center, Bugzilla

PROFESSIONAL EXPERIENCE:

Confidential, Foster City, CA

Big Data Lead

Responsibilities:

  • Participated in BRI meetings and gathered business requirements.
  • Prepared low level design documents based on the business and functional requirements and reviewed them with the SMEs.
  • Led a team of on-site and off-shore resources and effectively coordinated work efforts between the two.
  • Mentored off-shore developers as well as junior on-site developers.
  • Involved in design reviews, code reviews and peer reviews of the on-site/off-shore development.
  • Responsible for daily technical interaction with the client, covering architecture and design, and for ensuring sound technical direction and choices.
  • Involved in the design and development of Secure Data Service to provide application-level encryption/decryption.
  • Developed MapReduce jobs and scripts in Hive and Pig.
  • Optimized MapReduce jobs to use HDFS efficiently by applying various compression mechanisms (see the compression sketch after this list).
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Ab Initio and DB2 into HDFS using Sqoop.
  • Developed Hive UDFs to implement business logic in Hadoop.
  • Supported QA in setting up test environments and resolving the issues identified.
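
A minimal sketch of the compression tuning mentioned above, enabling Snappy for both intermediate map output and the final job output; the class name, job name, and codec choice are illustrative assumptions.

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.io.SequenceFile;
  import org.apache.hadoop.io.compress.CompressionCodec;
  import org.apache.hadoop.io.compress.SnappyCodec;
  import org.apache.hadoop.mapreduce.Job;
  import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
  import org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat;

  public class CompressedJobSetup {
      public static Job configure() throws Exception {
          Configuration conf = new Configuration();
          // Compress intermediate map output to cut shuffle I/O.
          conf.setBoolean("mapreduce.map.output.compress", true);
          conf.setClass("mapreduce.map.output.compress.codec",
                  SnappyCodec.class, CompressionCodec.class);

          Job job = Job.getInstance(conf, "compressed-job");
          // Write the final output as block-compressed SequenceFiles in HDFS.
          job.setOutputFormatClass(SequenceFileOutputFormat.class);
          FileOutputFormat.setCompressOutput(job, true);
          FileOutputFormat.setOutputCompressorClass(job, SnappyCodec.class);
          SequenceFileOutputFormat.setOutputCompressionType(
                  job, SequenceFile.CompressionType.BLOCK);
          return job;
      }
  }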

ENVIRONMENT: Linux, Java, Hadoop, MapReduce, Pig Latin, Hive, HBase, Sqoop, HDFS, Cloudera, XML, Eclipse, Ab Initio, Informatica, DB2.

Confidential, Los Angeles, CA

Sr. Hadoop Developer/Lead

Responsibilities:

  • Gathered business requirements from the Business Partners and Subject Matter Experts.
  • Led a team of on-site and off-shore resources and effectively coordinated work efforts between the two.
  • Mentored off-shore developers as well as junior on-site developers.
  • Involved in design reviews, code reviews and peer reviews of the on-site/off-shore development.
  • Developed simple to complex MapReduce jobs using Hive and Pig.
  • Optimized MapReduce jobs to use HDFS efficiently by applying various compression mechanisms.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop.
  • Analyzed the data by performing Hive queries and running Pig scripts to study customer behavior.
  • Implemented business logic in Hadoop by writing UDFs in Java, and used various UDFs from Piggybank and other sources (a minimal Pig UDF sketch follows this list).
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Analyzed large amounts of data sets to determine optimal way to aggregate and report on it.
  • Supported setting up the QA environment and updating configurations for implementing scripts with Pig and Sqoop.
  • Worked on loading techniques using Sqoop and Hive, and on serialization and compression formats such as Avro.
  • Developed Java APIs for invocation in Pig scripts to solve complex problems.
  • Developed scripts and batch jobs to schedule various Hadoop programs using Oozie (see the Oozie sketch after this list).
  • Involved in maintaining various UNIX shell scripts.
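
A minimal sketch of a Pig UDF in Java, in the style of the Piggybank functions referenced above; the function name and normalization logic are illustrative assumptions.

  import java.io.IOException;
  import org.apache.pig.EvalFunc;
  import org.apache.pig.data.Tuple;

  // Normalizes a free-text field: trims whitespace and lower-cases it.
  public class NormalizeText extends EvalFunc<String> {
      @Override
      public String exec(Tuple input) throws IOException {
          if (input == null || input.size() == 0 || input.get(0) == null) {
              return null;
          }
          return input.get(0).toString().trim().toLowerCase();
      }
  }

In a Pig script the JAR is loaded with REGISTER, after which the function can be called directly inside a FOREACH ... GENERATE statement.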
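
And a minimal sketch of launching a workflow through the Oozie Java client API; the server URL, HDFS application path, and property values are illustrative assumptions.

  import java.util.Properties;
  import org.apache.oozie.client.OozieClient;

  public class SubmitWorkflow {
      public static void main(String[] args) throws Exception {
          // Point the client at the Oozie server (URL is an assumption).
          OozieClient client = new OozieClient("http://oozie-host:11000/oozie");
          Properties props = client.createConfiguration();
          // HDFS path of the deployed workflow application (illustrative).
          props.setProperty(OozieClient.APP_PATH,
                  "hdfs://namenode:8020/user/hadoop/apps/etl-wf");
          props.setProperty("queueName", "default");
          // Submit and start the workflow; Oozie returns a job id for polling.
          String jobId = client.run(props);
          System.out.println("Started workflow " + jobId);
      }
  }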

ENVIRONMENT: Linux, Java, Python, GitHub, Hadoop, MapReduce, Pig Latin, Hive, HBase, Zookeeper, Sqoop, Oozie, HDFS, Cloudera, XML, MySQL, Eclipse, Oracle, and MongoDB.

Confidential, Los Angeles, CA

Technical Lead

Responsibilities:

  • Documented and developed design requirements for new programs and modifications to existing ones, then implemented those requirements in code.
  • Responsible for the definition, development and implementation of new systems and major enhancements to existing systems as well as production support for systems with high complexity.
  • Understood business requirements, coded in Java, and performed unit testing with JUnit.
  • Developed XSL stylesheets to render the XML response (see the XSLT sketch after this list).
  • Supported the team on day-to-day issues.
  • Performed integration testing for new enhancements.
  • Involved in end-to-end system architecture.
  • Designed and executed test cases manually.
  • Developed JSPs for the Echo-View front end.
  • Developed the GUI using CSS, JavaScript, JSP, Servlets, and JSTL.
  • Performed functional testing, user interface testing, and retesting/regression testing.
  • Analyzed test results and reported them to management.
  • Reported bugs, attended weekly status meetings, and provided detailed status reports.
  • Responsible for developing, modifying, and executing JCLs and PROCs to run the new programs.
  • Modified CICS online programs to implement screen changes and debugged them using CEDF.
  • Performed VSAM file manipulations and test scenario/data simulation.
  • Created and published the development strategy, plan, and requirements traceability matrix.
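
A minimal sketch of the XSLT rendering mentioned above, using the standard JAXP API in Java; the class and file names are illustrative assumptions.

  import java.io.File;
  import javax.xml.transform.Transformer;
  import javax.xml.transform.TransformerFactory;
  import javax.xml.transform.stream.StreamResult;
  import javax.xml.transform.stream.StreamSource;

  public class XmlResponseRenderer {
      public static void main(String[] args) throws Exception {
          // Compile the stylesheet; the Transformer is reused per response.
          Transformer transformer = TransformerFactory.newInstance()
                  .newTransformer(new StreamSource(new File("response.xsl")));
          // Apply the stylesheet to the XML response and emit HTML.
          transformer.transform(
                  new StreamSource(new File("response.xml")),
                  new StreamResult(new File("response.html")));
      }
  }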

ENVIRONMENT: Java, Swing, Applets, JSP, Servlets, SOAP, REST Web Service, XML, XSLT, XPath, JavaScript, jQuery, CSS, COBOL II, CICS, JCL, DB2, IMS, VSAM, SQL, TSO/ISPF, MVS, File Aid, Expeditor, Endevor, Super C, CA-View, Test Director.

Confidential, Los Angeles, CA

Sr. Programmer Analyst

Responsibilities:

  • Worked with business analysts and end-users to gather the requirements and documented them using MS-Word and MS-Visio.
  • Created Detail Design documents and Program Specification Documents based on requirements.
  • Worked on Impact Analysis for changes to be done for Customer Information System, Surge Protection, CRM, Electronic Bill Payment and Presentation and Mobile Data Dispatching System projects.
  • Used CODE-1 for address validations according to USPS standards.
  • Worked extensively with DB2 tables using SPUFI and QMF.
  • Involved in resolving online problems that users encounter during testing.
  • Involved in preparing test cases for testing various online screens for data validation.
  • Worked with JCL utilities like IDCAMS, IEBGENER and IEBCOPY.
  • Developed new programs in COBOL, CICS, IMS, DB2, Easytrieve and MVS/JCL.
  • Involved in preparing and executing JCLs and scheduling batch jobs using CA7.
  • Involved in setting up test environments for online testing, including loading/re-loading files and tables.
  • Involved in building queries in DB2 SQL to extract data from production to meet test requirements and load it into the test environment.
  • Involved in setting up JCLs for batch cycles, migrating data from production to test environment, executing batch cycles and validating the results.
  • Involved in unit testing, system testing, regression testing, and user acceptance testing.

ENVIRONMENT: OS/390, VS COBOL II, CICS, VSAM, IMS, DB2, JCL, FILE-AID, SPUFI, EASYTRIEVE, LIBRARIAN, XPEDITER, CA7 and ABEND-AID.
