Senior Hadoop Developer Resume

Pennington, NJ

SUMMARY

  • 12.7 years of IT experience in Hadoop/Big Data analytics and mainframe technologies.
  • Currently working as a Senior Developer/Technical Lead in Hadoop/Big Data analytics.
  • Worked as a Technical Lead/Senior Developer/Onsite Coordinator in mainframe technologies across various clients and domains.
  • Extensive experience in ETL processes, analysis, coding, and testing on Big Data analytics projects using Hadoop tools and technologies.
  • Familiar with data architecture, including data ingestion pipeline design, Hadoop information architecture, data modeling, data mining, machine learning, and advanced data processing. Experienced in optimizing ETL workflows.
  • Expertise in IBM mainframes, with deep knowledge of mainframe-based applications and mainframe tools.
  • Expert troubleshooter, able to lead a team in resolving major production issues.
  • Proficient in project management, production support, application development, programming, system analysis, software quality assurance, and change management processes with various clients.
  • Conversant with all phases of the project life cycle, including requirement gathering, analysis, design, development, testing, and implementation, as well as software quality standards, configuration management, change management, and quality procedures.
  • Expertise in application development projects, with very good exposure to development methodologies such as Agile and Waterfall.
  • Expertise in handling support and maintenance projects, with hands-on experience in ticket tracking tools such as HP SMPO, ITSM, Remedy, and JIRA.
  • Hands-on experience in migrating mainframe applications to other technologies such as SAP and UNIX, and in re-hosting and decommissioning mainframes to Micro Focus Enterprise Server.
  • Handled small/medium-sized teams; good at operational planning, work allocation, tracking, and reviewing.
  • Worked on the creation of SOWs and RFP discussions for various new proposals.
  • An individual with proven leadership, administrative, communication, and organizational skills.
  • A good combination of technical, communication, and interpersonal skills, providing the ability to be an effective mediator between programmers, end users, and clients.
  • Highly motivated, with the ability to work effectively in teams as well as independently.
  • Experienced in Onsite-Offshore project execution models; proficient in document management and able to prioritize and complete multiple tasks.
  • In-depth knowledge of various domains, including retail, banking, insurance, and healthcare.

TECHNICAL SKILLS

Platforms/Frameworks: Hadoop 2.7, IBM S/390, IBM PC compatibles

Operating Systems: Linux, OS/390, Windows 10/7/XP/2000/Server, MS-DOS

APIs: Spark Engine, MapReduce

Programming Languages: Python, Scala, Java, VS COBOL, JCL, Easytrieve, SAS

Scripting Languages: Korn shell/UNIX shell scripting, XML, SQL

Workflow: Oozie

Databases: Hive, Impala, DB2, Oracle, IMS DB

ETL Tool: Sqoop, Flume

Web Interface: Hue

File Systems: Avro files, Parquet files, HDFS, VSAM

OLTP: CICS, IMS DC/TM

Middleware: MQ Series

Tools/Technologies: Spring Tool Suite, Crucible, Changeman, Endevor, Panvalet, Xpediter, DB2/VSAM File-AID, Platinum StarTool, SAR, Jobtrac, SPUFI, QMF, Tape Management System (TMS), OPC Scheduler, Abend-AID, DADS, IBM Debugger, Mainframe Express

Ticket Tracking Tools: Remedy, ITSM, HP SMPO, JIRA

PROFESSIONAL EXPERIENCE

Confidential, Pennington, NJ

Senior Hadoop Developer

Environment: Hadoop 2.7, Spark, Scala, Oozie, Sqoop, Hive, Impala, Oracle, Hue, UNIX shell scripting

Responsibilities:

  • Design the data lake to pull HMDA loan details for various clients from upstream systems such as Peaks and nCino.
  • Design and implement the Sqoop process to pull client data from various Oracle databases into the BOFA Hadoop environment.
  • Implement the ETL process and conformance code for the HMDA data lake.
  • Design and implement the Oozie workflows to import and export clients' loan information to various loan processing and data analytics systems at BOFA.
  • Create the Hive tables in the Hadoop data hub region and store the Sqoop-imported data in Parquet format.
  • Design and code the conformance logic in Spark-Scala for use by target/consuming systems.
  • Optimize the Spark-Scala and Spark SQL code in the conformance layer for process improvement.
  • Implement the Oozie coordinators and schedule the daily/weekly/monthly jobs.
  • Create the test suites using JUnit and perform unit, integration, and end-to-end testing in the QA and SIT regions.
  • Optimize Hive queries using partitioning and bucketing techniques to control data distribution (a sketch follows this list).
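
For illustration, below is a minimal PySpark sketch of the pattern from the last two bullets: a Hive Parquet table in the data hub, with partitioning and bucketing. The table, column, and path names are hypothetical placeholders, and the production conformance code was Spark-Scala; PySpark is used here only to keep both sketches in this resume in one language.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Hive support so the table registers in the shared metastore.
    spark = (SparkSession.builder
             .appName("hmda-conformance-sketch")
             .enableHiveSupport()
             .getOrCreate())

    # Raw loan data landed by the Sqoop import (hypothetical path).
    raw = spark.read.parquet("/data/hub/hmda/loans_raw")

    # Minimal conformance step: normalize the key, de-duplicate, and
    # stamp a load_date column to partition on.
    conformed = (raw
                 .withColumn("client_id", F.col("client_id").cast("string"))
                 .dropDuplicates(["loan_id"])
                 .withColumn("load_date", F.current_date().cast("string")))

    # Partitioning prunes whole directories at query time; bucketing
    # pre-clusters rows on loan_id, which speeds joins and sampling.
    # Assumes an "hmda" database already exists in the metastore.
    (conformed.write
     .partitionBy("load_date")
     .bucketBy(32, "loan_id")
     .sortBy("loan_id")
     .format("parquet")
     .mode("overwrite")
     .saveAsTable("hmda.loans_conformed"))

Queries that filter on load_date then touch only the matching partition directories, which is the data-distribution control the last bullet refers to.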

Confidential, Malvern, PA

Technology specialist

Environment: Hadoop 2.7, Spark, Python, Scala, Oozie, Sqoop, Hive, Impala, Oracle, DB2, Hue, UNIX shell scripting, SAS

Responsibilities:

  • Design the ETL process to bring client score details from Tealeaf, the data warehouse, and enterprise systems into the Confidential Hadoop ecosystem.
  • Discuss with business users to understand and clarify the business requirements, and prepare the design documents.
  • Design, code, and implement the Sqoop process to import the score details into the Hadoop data hub.
  • Cleanse and validate the imported data and convert it to the Avro file format, which is accessible to the Hadoop data mart environment.
  • Make the necessary changes to the cleanse-and-validate programs using Spark-Scala.
  • Design and code the score calculation logic for the Confidential client using PySpark, and execute the PySpark programs in the Hadoop data mart environment.
  • Design and implement the Oozie workflows for the daily/weekly/monthly client score calculations and web interaction reports.
  • Implement the Oozie coordinators and schedule the daily/weekly/monthly jobs.
  • Create the test suites using PySpark and perform unit, integration, and end-to-end testing using PyUnit.
  • Convert the Avro files in the Hadoop data hub to Parquet format using Hive scripts.
  • Import the data from the Oracle and DB2 databases into the Hadoop ecosystem using Sqoop.
  • Create the Hive tables in the Hadoop data mart environment and validate the performance of Hive and Impala queries against the master tables.
  • Optimize Hive queries using partitioning and bucketing techniques to control data distribution.
  • Fine-tune the PySpark code for optimized utilization of Hadoop resources in the production run.
  • Execute comparison tests in the production region and fine-tune the end results to ensure accuracy.
  • Troubleshoot and fix daily Oozie workflow failures and implement permanent fixes.
  • Analyze the Java MapReduce programs, prepare the analysis documents, and perform the feasibility study for converting them to Spark-Python (PySpark).
  • Prepare the high-level/low-level design documents for the conversion of the Java MapReduce code to PySpark.
  • Re-code the Java programs in PySpark and perform unit/integration/regression and comparison testing to ensure the converted code matches the functionality and performance of the Java originals (a sketch follows this list).
  • Mentor junior developers and provide application training for new joiners on the team.
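
As an illustration of the Avro-to-Parquet step and the MapReduce-to-PySpark conversion described above, here is a minimal sketch. The paths, column names, and scoring rule are hypothetical placeholders, and reading Avro assumes the spark-avro package is on the classpath.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Reading Avro needs the spark-avro package, e.g.:
    #   spark-submit --packages org.apache.spark:spark-avro_2.12:<spark-version> score.py
    spark = (SparkSession.builder
             .appName("client-score-sketch")
             .getOrCreate())

    # Cleansed/validated Avro files from the data hub (hypothetical path).
    events = spark.read.format("avro").load("/data/hub/scores/events")

    # The old MapReduce job's mapper emitted (client_id, score) pairs and
    # its reducer summed them; in Spark that whole pattern collapses into
    # a single groupBy/agg over the DataFrame.
    scores = (events
              .filter(F.col("score").isNotNull())        # validation step
              .groupBy("client_id")
              .agg(F.sum("score").alias("total_score"),
                   F.count("*").alias("event_count")))

    # Land the result in the data mart as Parquet for Hive/Impala queries.
    scores.write.mode("overwrite").parquet("/data/mart/scores/client_totals")

Keeping the aggregation this declarative makes the functional-parity comparison testing against the Java original straightforward to verify.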

Confidential, Nashville, Tennessee

Project Lead

Environment: MVS/ESA, TSO/ISPF, JCL, VS COBOL II, VSAM, DB2, CICS

Responsibilities:

  • Go through the various BRDs to understand the business requirements, and discuss with the client to clarify them.
  • Prepare the technical specifications and send them to the client for approval.
  • Provide estimates for the design, coding, testing, implementation, and support phases of each release.
  • Allocate tasks to the team members and review their deliverables.
  • Coordinate the analysis/coding/testing activities with the team to ensure traceability to requirements is maintained.
  • Review the code and test results and prepare the review and defect logs.
  • Attend the daily/weekly status calls with the onsite team and client, provide updates, and answer queries.
  • Prepare the daily, weekly, and monthly status reports; attend the SMR and prepare the monthly SMR minutes of meeting.
  • Provide and coordinate cross-functional training for peers.
  • Identify value-add enhancements for the application, create the impact analysis document, and submit it to the client for approval.
  • Present the RCA reports to the client and suggest enhancements to the application modules to fine-tune application performance.
  • Analyze system performance.
  • Review the quality documents prepared by the team as per the change management process.

Confidential, New Jersey

Project Lead

Environment: MVS/ESA, TSO/ISPF, JCL, VS COBOL II, VSAM, DB2, CICS

Responsibilities:

  • In the role of Project Lead, responsible for generating the daily, weekly, and monthly status reports for the L2 and L3 tickets.
  • Resolve L3 tickets for the Confidential banking application.
  • Work on major and minor enhancements and provide estimates to the client.
  • Identify value-add enhancements for the application, create the impact analysis document, and submit it to the client for approval.
  • Allocate tasks to the team members and review their deliverables.
  • Initiate and coordinate bridge calls for critical and high-priority incidents.
  • Provide 24/7 on-call support and serve as the first-level escalation manager for on-call support.
  • Track trouble tickets and resolve escalated support incidents.
  • Answer technical support queries and implement fixes for application problems.
  • Provide and coordinate cross-functional training for peers.
  • Coordinate the analysis/coding/testing activities with the team to ensure traceability to requirements is maintained.
  • Perform root cause analysis for recurring issues and provide permanent fixes.
  • Present the RCA reports to the client and suggest enhancements to the application modules to fine-tune application performance.
  • Attend client meetings.
  • Analyze system performance.
  • Monitor the batch cycles and the flow of system jobs.
  • Review the quality documents prepared by the team as per the change management process.

Confidential - Milwaukee, WI

Team Leader

Environment: MVS/ESA, TSO/ISPF, JCL, VS COBOL II, VSAM, DB2, IMS DC

Responsibilities:

  • In the role of Team Lead, coordinate and guide the onsite and offshore teams.
  • Interact with the client to gather the project requirements.
  • Provide technical Tier 2 and Tier 3 production support for Rockwell's applications.
  • Resolve the L2 and L3 tickets created by business users while adhering to the SLA.
  • Resolve job and transaction abends, including critical and high-priority production-down issues.
  • Provide 24/7 on-call support.
  • Track trouble tickets and resolve escalated support incidents.
  • Answer technical support queries and implement fixes for application problems.
  • Work on cross-functional teams to proactively address support issues.
  • Coordinate the analysis/coding/testing activities with the offshore team to ensure traceability to requirements is maintained.
  • Perform root cause analysis for recurring issues and provide permanent fixes.
  • Present the RCA reports to the client and suggest enhancements to the application modules to fine-tune application performance.
  • Designed and coded the mainframe interface programs during the migration of the application from the IBM mainframe to SAP.
  • Attend client meetings.
  • Analyze system performance.
  • Perform minor and major enhancements.
  • Monitor the batch cycles and the flow of system jobs.
  • Review the quality documents prepared by the onsite and offshore teams as per the change management process.

Confidential

Team Member

Environment: MVS/ESA, TSO/ISPF, JCL, VS COBOL II, VSAM, DB2, CICS, ADS Plus, IMS DB

Responsibilities:

  • Resolve L2 and L3 tickets.
  • Provide 24/7 on-call support.
  • Analyze system performance.
  • Perform minor and major enhancements.
  • Monitor the batch cycles and the flow of system jobs.
  • Prepare the quality documents as per the change management process.
  • Create and update the KOP and KEDB documents.
  • Work with the onsite team to gather the requirements for project enhancements.
  • Create and update the project documents for Confidential as per the quality standards for internal and CMMI quality audits.
  • Participate extensively in RCA brainstorming sessions and contribute value additions.
  • Train the new joiners in the team.
