Sr. Hadoop Developer/Team Lead Resume
New York, NY
SUMMARY:
- 9 years of IT experience, including 3 years of hands-on experience with Big Data Hadoop technologies such as MapReduce, Hive, HBase, Pig, Sqoop, Oozie, ZooKeeper and HDFS.
- Experience in building and maintaining multiple Hadoop clusters (prod, dev, etc.) of different sizes and configurations, and in setting up the rack topology for large clusters.
- Extensive experience in HDFS, MapReduce, Pig, Hive, Sqoop, Flume, Oozie, ZooKeeper, Maven, HBase and Cassandra.
- Good experience with core and advanced Java concepts.
- Extensive experience with big data ETL and query tools such as Pig Latin and HiveQL.
- Hands-on experience with big data ingestion tools such as Flume and Sqoop.
- Experience in tuning and troubleshooting performance issues in Hadoop clusters.
- Implemented proofs of concept on the Hadoop stack and various big data analytics tools.
- Migrated data from different databases (VSAM, DB2, Oracle PL/SQL and MySQL) to Hadoop.
- Hands-on NoSQL database experience with HBase and Cassandra.
- Good understanding of Data Lakes.
- Experience in data management and in implementing Big Data applications using Hadoop frameworks.
- Good knowledge of Spark and Spark SQL.
- Good database experience with SQL Server, including stored procedures, cursors, constraints and triggers.
- Experience in designing, sizing and configuring Hadoop environments.
- Worked with application teams to install operating system and Hadoop updates, patches and version upgrades as required.
- Extensive experience in documenting requirements, functional specifications and technical specifications.
- Highly motivated, adaptive and a quick learner.
- Good functional knowledge of financial and capital markets.
- Exhibited excellent communication and leadership capabilities.
- Excellent analytical, problem-solving and technical skills.
- Strong ability to handle multiple priorities and a heavy workload, and to understand and adapt quickly to new technologies and environments.
TECHNICAL SKILLS:
Operating System: MVS, Windows, Linux
Database: VSAM, MySQL, Oracle, NoSQL (HBase, Cassandra)
ETL/BI Tools: Talend, Tableau
Big Data: Apache Hadoop, HBase, Hive, Pig, Sqoop, Oozie, ZooKeeper, Flume, Kafka, Storm, Spark
Languages: Java, Pig Latin, HiveQL, COBOL, CICS, JCL, Easytrieve
Technologies: Credit Card Processing Application (Vision Plus™ package) domain
Web Technologies: Servlets, JSP, XML, Tomcat, HTML, JavaScript, PrimeFaces, JSF
IDEs: Eclipse, NetBeans
Web Server: Tomcat
Development Methodologies: Agile/Scrum, Waterfall
PROFESSIONAL EXPERIENCE:
Confidential, New York, NY
Sr. Hadoop Developer/Team Lead
Technology: Hortonworks Hadoop 2.0 (Big Data), Apache Pig, Hive, Sqoop, Java, Servlets, JDBC, UNIX, MySQL, Cassandra, Maven, Spark, Kafka, Storm, Oracle, PL/SQL.
Responsibilities:
- Responsible for analyzing and understanding data sources such as iTunes, Spotify, YouTube and Facebook data.
- Developed a multithreaded framework to pull data for playback, traffic source, social, device and demographic reports from YouTube.
- Developed a reusable component in Java to load data from the Hadoop Distributed File System into ParAccel.
- Developed MapReduce jobs to process the music metrics data, and scripts for uploading the data to the ParAccel server.
- Developed MapReduce code for data manipulation.
- Implemented a POC using Spark and Spark SQL.
- Worked as an architect providing solutions.
- Involved in designing and creating Hive tables to load data into Hadoop.
- Migrated all historical data from ParAccel to the AWS S3 file system using Sqoop for feeds such as iTunes Preorders and Radio Monitor.
- Responsible for the entire data flow and for data quality.
- Responsible for end-to-end development for the client.
- Developed shell scripts to automate all processes.
- Involved in design, development, coding and unit testing.
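The multithreaded report pull described above can be sketched roughly as follows. This is a minimal illustration, not the actual framework: the report names and the `fetchReport` stub are hypothetical placeholders standing in for the real YouTube reporting API calls.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.Collectors;

// Minimal sketch of pulling several YouTube report types in parallel.
// fetchReport is a hypothetical stand-in for a real API client call.
public class ReportPuller {
    static String fetchReport(String reportType) {
        // In a real framework this would call the reporting API;
        // here it just tags the report type as downloaded.
        return reportType + ":downloaded";
    }

    public static List<String> pullAll(List<String> reportTypes) {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            List<Callable<String>> tasks = reportTypes.stream()
                    .map(r -> (Callable<String>) () -> fetchReport(r))
                    .collect(Collectors.toList());
            List<String> results = new ArrayList<>();
            // invokeAll runs the downloads concurrently and preserves order.
            for (Future<String> f : pool.invokeAll(tasks)) {
                results.add(f.get());
            }
            return results;
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) {
        List<String> reports = Arrays.asList("playback", "traffic_source", "social", "demographic");
        System.out.println(pullAll(reports));
    }
}
```

A fixed thread pool bounds the number of concurrent downloads, which matters when an upstream API rate-limits clients.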
Confidential, Minneapolis, MN
Sr. Hadoop Developer/Team Lead
Technology: Hadoop, Apache Pig, Hive, Sqoop, Java, Servlets, JDBC, UNIX, MySQL, Maven.
Responsibilities:
- Moved all crawl-data flat files generated by various retailers to HDFS for further processing.
- Wrote Apache Pig scripts to process the HDFS data.
- Created Hive tables to store the processed results in a tabular format.
- Developed Sqoop scripts to move data between HDFS and the MySQL database.
- Involved in gathering requirements, design, development and testing.
- Wrote script files for processing data and loading it into HDFS.
- Wrote HDFS CLI commands.
- Developed UNIX shell scripts for creating reports from Hive data.
- Fully involved in the requirement-analysis phase.
- Analyzed the requirements for setting up a cluster.
- Created two separate users (hduser for performing HDFS operations and mapred for performing MapReduce operations only).
- Set up passwordless SSH for Hadoop.
- Set up cron jobs to delete old Hadoop logs, local job files and cluster temp files.
- Set up Hive with MySQL as a remote metastore.
- Moved all log/text files generated by various products into an HDFS location.
- Wrote MapReduce code that takes log files as input, parses them and structures them in a tabular format to facilitate effective querying of the log data.
- Created external Hive tables on top of the parsed data.
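The log-parsing step described above can be sketched as plain Java. In the actual MapReduce job this logic would live inside the Mapper; the input layout assumed here (date, time, level, then a free-form message) is an illustrative example, not the real product log format.

```java
// Minimal sketch of turning a raw log line into a tab-delimited record
// so that an external Hive table (FIELDS TERMINATED BY '\t') can query it.
// The assumed layout "2016-01-15 10:22:31 ERROR Connection refused" is illustrative.
public class LogLineParser {
    public static String toTabular(String line) {
        // Split into date, time, level, and the remaining message.
        String[] parts = line.split(" ", 4);
        if (parts.length < 4) {
            return null;             // skip malformed lines
        }
        String date = parts[0];
        String time = parts[1];
        String level = parts[2];
        String message = parts[3];
        // Tab-separated output matches the Hive table's field delimiter.
        return String.join("\t", date, time, level, message);
    }

    public static void main(String[] args) {
        System.out.println(toTabular("2016-01-15 10:22:31 ERROR Connection refused"));
    }
}
```

Returning `null` for malformed lines lets the surrounding Mapper simply skip them instead of emitting broken rows into the Hive table.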
Confidential, Valparaiso, FL
Sr. Hadoop Developer/Team Lead
Technology: Hadoop (CDH4), Java, Servlets, JDBC, HDFS, MapReduce, HBase, Oozie.
Responsibilities:
- Set up the Hadoop cluster.
- Involved in requirements gathering and analysis.
- Involved in understanding and designing the architecture.
- Stored data in HDFS.
- Processed data using MapReduce and stored the results in HBase, displayed per user requirements as a pie chart, a bar chart or both.
- Responsible for the delivery and review of all tasks.
- Prepared weekly and monthly status reports.
- Attended defect calls to provide the latest status to the client.
Confidential, Tampa, FL
Team Lead
Technology: Java, Servlets, JDBC, Vision Plus involving COBOL, CICS, MVS, JCL, MQ Series.
Responsibilities:
- Worked as part of the project team at the client location.
- Coding and unit testing.
- Analyzed and resolved batch and online production tickets.
- Responsible for the delivery and review of all tasks.
- Tracked defect fixes.
- Responsible for release-upgrade implementation in Production.
- Provided business analysis documents explaining the functionality and system flow for each incident raised.
- Handled code management for a multi-region architecture.
- Prepared weekly and monthly status reports.
- Attended defect calls to provide the latest status to clients.
Confidential
Onsite Coordinator
Technology: Vision Plus involving COBOL, CICS, MVS, JCL, MQ Series, Oracle, PLSQL
Responsibilities:
- Worked as an onsite coordinator at the client location.
- Gathered requirements and documented them.
- Developed automated tools for region verification before testing.
- Involved in designing the environment, including CICS regions and scheduler design.
- Handled code management for multi-region operation, which enabled parallel development of code.
- Provided timely solutions to issues, which helped in meeting project deadlines.
- Ensured the quality of deliverables.
- Attended defect calls and provided updates to clients at regular intervals.
Confidential
Team Lead
Technology: Vision Plus involving COBOL, CICS, MVS, JCL and MQ Series.
Responsibilities:
- Worked as part of the project team at the client location.
- Coding and unit testing.
- Analyzed and resolved batch and online production tickets.
- Responsible for the delivery and review of all tasks.
- Tracked defect fixes.
- Responsible for release-upgrade implementation in Production.
- Provided business analysis documents explaining the functionality and system flow for each incident raised.
- Handled code management for a multi-region architecture.
- Prepared weekly and monthly status reports.
- Attended defect calls to provide the latest status to clients.
Confidential
Application Programmer
Technology: Vision Plus (CMS) involving COBOL, CICS, MVS, JCL, MQ Series.
Responsibilities:
- Maintained all regions (SIT, UAT, Production).
- Refreshed all regions on a monthly basis.
- Ran batches and resolved errors.
- Handled change requests.
- Provided permanent solutions for abends in Production.
- Attended status meetings, conference calls and review meetings.