Hadoop/Spark Technical Consultant Resume
Dallas, Texas
SUMMARY
- Over 7 years of professional IT experience, including Big Data ecosystem technologies and Oracle (11g/10g/9i/8i) in production support environments.
- Excellent understanding / knowledge of Hadoop architecture and various components such as HDFS, Job Tracker, Task Tracker, Name Node, Data Node and Map Reduce programming paradigm.
- Hands-on experience with major components of the Hadoop ecosystem such as Hadoop MapReduce, HDFS, Hive, Pig, HBase, Sqoop, Oozie and Flume.
- Experience in managing and reviewing Hadoop log files.
- Experience in analyzing data using HiveQL, Pig Latin, HBase and custom MapReduce programs in Java.
- Experience in data management and implementation of Big Data applications using Hadoop frameworks.
- Experience in importing and exporting data using Sqoop from HDFS to relational database systems and vice versa.
- Well versed in installing, configuring, supporting and managing Hadoop clusters and the underlying Big Data infrastructure.
- Experience with Oozie Workflow Engine in running workflow jobs with actions that run Hadoop Map/Reduce and Pig jobs.
- Understanding of the basic concepts of Spark Core, Spark SQL and Spark Streaming.
- Worked as part of a team in a 24x7 Level-3 production environment and provided on-call and day-to-day support.
- Experience in designing, creating and maintaining Oracle databases.
- Expertise in Hadoop ecosystem components HDFS, MapReduce, Pig, Sqoop and Hive for scalability, distributed computing and high-performance computing.
- Expertise in designing and implementing critical code in Oracle PL/SQL and SQL.
- A track record of maintaining and improving job skills through training, self-research and self-study.
- Ability to work individually as well as in a team, with excellent problem-solving and troubleshooting capabilities.
TECHNICAL SKILLS
Hadoop Ecosystem Development: HDFS, MapReduce, Hive, Pig, Oozie, HBase, Sqoop, Flume, Spark
Operating Systems: Windows XP, Windows 7 & 10, UNIX
RDBMS: Oracle (11g/10g/9i/8i)
Languages: SQL, PL/SQL, C, C++, Core Java, Web Services, Pig, Sqoop, Hive, MapReduce
Tools/Utilities: SQL*Loader, SQL Developer, Query Builder, MS Access, PuTTY, BMC Remedy, Splunk
Web Tools: XML, HTML, SOAP UI
PROFESSIONAL EXPERIENCE
Confidential, Dallas, Texas
Hadoop/Spark Technical Consultant
Responsibilities:
- Worked mainly on Hive queries to categorize data for different claims.
- Involved in creating Hive and Pig tables, loading data, and writing Hive queries and Pig scripts.
- Used Sqoop to export data back to relational databases for business reporting.
- Monitored Hadoop scripts that take input from HDFS and load the data into Hive.
- Implemented partitioning, dynamic partitions and bucketing in Hive.
- Designed and created Hive external tables using a shared metastore instead of Derby, with partitioning, dynamic partitioning and buckets (see the sketch after this section).
- Responsible for managing test data coming from different sources and for reviewing peers' Hive table creation, data loading and queries.
- Used Oozie workflows to coordinate Pig and Hive scripts.
- Used Spark for interactive queries, processing of streaming data and integration with a popular NoSQL database for large data volumes.
- Used Spark SQL to load data, create schema RDDs, load them into Hive tables and handle the data through Spark SQL (also shown in the sketch below).
Environment: Hadoop, HDFS, Hive, MapReduce, Pig, Sqoop, Spark, Core Java, Oracle, UNIX
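A minimal HiveQL sketch of the partitioned, bucketed external table and claim-categorization query pattern described above; the table, column and path names (claims_raw, claims_staging, /data/claims/raw) are hypothetical and for illustration only, and the same queries can also be issued through Spark SQL against the shared metastore.

```sql
-- Hypothetical external claims table with partitioning and bucketing.
CREATE EXTERNAL TABLE IF NOT EXISTS claims_raw (
  claim_id     STRING,
  member_id    STRING,
  claim_amount DOUBLE,
  claim_status STRING
)
PARTITIONED BY (claim_month STRING)
CLUSTERED BY (member_id) INTO 32 BUCKETS
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/data/claims/raw';

-- Dynamic-partition load from an assumed staging table.
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;
INSERT OVERWRITE TABLE claims_raw PARTITION (claim_month)
SELECT claim_id, member_id, claim_amount, claim_status, claim_month
FROM claims_staging;

-- Categorization query; runnable in Hive or via spark.sql(...).
SELECT claim_status,
       COUNT(*)          AS claim_count,
       SUM(claim_amount) AS total_amount
FROM claims_raw
GROUP BY claim_status;
```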
Confidential
Technical Consultant
Responsibilities:
- Provided day-to-day Level-3 support, including monitoring, troubleshooting, resolution and customer support.
- Maintained and monitored applications such as Profile, FMS (Financial Management System), ATM and Report Server, and Mibank.
- Carried out back-end and production support activities for various applications implemented at USAA.
- Prepared different setups per the bank's needs, handled maintenance and performed EOD (end-of-day) activities.
- Proficient in billing/EOD production support and application support.
- Extensive experience monitoring BOD and EOD jobs through the database and dashboards for the production systems.
- Handled Remedy tickets to resolve incidents.
- Prepared root cause analyses for problems that occurred.
- Involved in developing applications on Oracle using PL/SQL; created tables, cursors and SQL queries.
- Used Oracle predefined functions and expressions such as NVL, NVL2, DECODE and CASE statements (see the SQL example after this section).
- Provided excellent customer service and displayed professionalism when dealing with clients.
- Performed several DDL, DML and TCL operations.
- Worked with Oracle pseudo columns and functions such as ROWID, ROWNUM, SYSDATE and USER.
- Generated primary key values using Oracle SEQUENCE objects (also illustrated below).
- Developed packages, procedures, functions and triggers for the application.
- Transferred data using tools such as FTP (File Transfer Protocol) and SFTP (Secure File Transfer Protocol).
- Automated data fetches using UNIX shell scripts.
- Responsible for developing, supporting and maintaining ETL (Extract, Transform and Load) processes using Oracle.
- Used ETL to extract files for the external vendors and coordinated that effort.
- Used FTP to transfer files to different servers as needed by business users.
- Provided 24x7 on-call support for the production environment.
- Adhering to the quality and process of the job profile.
- Displaying team work and positive attitude.
- Knowledge on Bond, Equity and FX Futures.
- Plan and supervise the daily activities of the application support.
- Mentored the team and contributed to the knowledge base.
Environment: Oracle (11g/10g), SQL*Plus, PL/SQL, PuTTY, BMC Remedy, SQL*Loader, UNIX
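A minimal Oracle SQL/PL/SQL sketch of the patterns referenced above (a SEQUENCE-backed primary key populated by a BEFORE INSERT trigger, and NVL/DECODE/CASE in a reporting query); the billing_txn table, billing_txn_seq sequence and all column names are hypothetical.

```sql
-- Hypothetical sequence and table used only to illustrate the pattern.
CREATE SEQUENCE billing_txn_seq START WITH 1 INCREMENT BY 1 NOCACHE;

CREATE TABLE billing_txn (
  txn_id     NUMBER        PRIMARY KEY,
  account_no VARCHAR2(20)  NOT NULL,
  txn_amount NUMBER(12,2),
  txn_type   VARCHAR2(10),
  txn_date   DATE          DEFAULT SYSDATE
);

-- Trigger that fills the primary key from the sequence when none is supplied.
CREATE OR REPLACE TRIGGER billing_txn_bir
BEFORE INSERT ON billing_txn
FOR EACH ROW
BEGIN
  IF :NEW.txn_id IS NULL THEN
    SELECT billing_txn_seq.NEXTVAL INTO :NEW.txn_id FROM dual;
  END IF;
END;
/

-- NVL / DECODE / CASE used to normalize values for a daily report.
SELECT account_no,
       NVL(txn_amount, 0)                                        AS amount,
       DECODE(txn_type, 'CR', 'Credit', 'DR', 'Debit', 'Other')  AS txn_label,
       CASE WHEN txn_amount > 10000 THEN 'HIGH' ELSE 'NORMAL' END AS risk_band
FROM billing_txn
WHERE txn_date >= TRUNC(SYSDATE) - 1;
```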
Confidential
Technical Consultant
Responsibilities:
- Fix outages within the stipulated Service Level Objective (SLO).
- Ensure that application downtime is kept to a minimum; Return to Service (RTS) is one of the main objectives of ACC.
- Once outages are fixed, applications are returned to service from an application maintenance standpoint.
- Analyze the root cause as part of Application Support to prevent similar outages in future.
- Level-3 troubleshooting: Java code changes and data fixes in databases to process abandoned online transactions.
- Create solutions in the Wisdom knowledge management system for successfully resolved production outages.
- Update tickets and manage production incidents and application workflow using the Infra tool.
- Schedule and manage batch job execution using the Control-M scheduler.
- Perform code walkthroughs, debugging and error fixing.
Environment: Windows, UNIX, Java, WebSphere, DB2, Oracle
Confidential
Oracle DB Production Support Analyst
Responsibilities:
- Responsible for providing 24x7 database administration across different database servers, covering RDBMS platforms such as Oracle 10g.
- Alert monitoring and Ticketing.
- Efficient in using Oracle tools such as SQL*Plus and SQL*Loader.
- Creation of database objects like Tables, Indexes, Views, Users, etc.
- Disk Space Monitoring.
- Shell scripting and PL/SQL programming to resolve business problems of various natures.
- Knowledge of the BMC Remedy tool.
- General database monitoring for performance optimization.
- Monitored BMC Remedy tickets and took action accordingly.
- Responsible for data loads on different databases using SQL*Loader.
- Helped business users in resolving many issues in all the manufacturing modules.
- Performed daily monitoring of Oracle instances, users, tablespaces, memory structures, logs and alerts (a sample tablespace-usage query is sketched below).
- Month-end job monitoring, checklist preparation and coordination.
Environment: Oracle 9i/10g, SQL*Plus, PL/SQL, PuTTY, BMC Remedy, SQL*Loader, UNIX
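A sample tablespace-usage query of the kind used for daily disk-space and tablespace monitoring, written against the standard DBA_DATA_FILES and DBA_FREE_SPACE dictionary views; the rounding and ordering are illustrative only.

```sql
-- Illustrative tablespace-usage report for daily monitoring.
SELECT df.tablespace_name,
       ROUND(df.total_mb, 1)                      AS total_mb,
       ROUND(df.total_mb - NVL(fs.free_mb, 0), 1) AS used_mb,
       ROUND(100 * (df.total_mb - NVL(fs.free_mb, 0)) / df.total_mb, 1) AS pct_used
FROM  (SELECT tablespace_name, SUM(bytes) / 1024 / 1024 AS total_mb
       FROM dba_data_files
       GROUP BY tablespace_name) df
LEFT JOIN
      (SELECT tablespace_name, SUM(bytes) / 1024 / 1024 AS free_mb
       FROM dba_free_space
       GROUP BY tablespace_name) fs
  ON df.tablespace_name = fs.tablespace_name
ORDER BY pct_used DESC;
```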