
Hadoop Admin Resume


San Jose

SUMMARY:

  • 16+ years of experience in the IT industry encompassing analysis, design, development, implementation, upgrades, administration and support
  • 12 years of experience in Oracle Applications E-Business Suite (ERP). Experience includes managing sizeable teams, analyzing gaps in the product, estimating technical components and cost, offshore development coordination and management, and RICE development using Oracle technologies like Forms, Reports, APIs and the Oracle AIM methodology.
  • 5 years of experience as a Hadoop Admin/Hadoop Developer.
  • Hands-on experience in installing, configuring, supporting and managing clusters on the Hortonworks, MapR and Cloudera distributions.
  • Cluster capacity planning, performance tuning, cluster monitoring and troubleshooting.
  • Designed Big Data solutions for traditional enterprise businesses.
  • Good experience with Big Data technologies like the Hadoop framework, MapReduce, Hive, HBase, Pig, Sqoop, Spark, Kafka, Flume, ZooKeeper and Oozie.
  • Experienced in writing complex MapReduce programs that work with different file formats like Text, Sequence, XML, JSON and Avro.
  • Working experience on the Cloudera Data Platform using VMware Player in a CentOS 6 Linux environment.
  • Strong experience with Hadoop distributions like Cloudera and Hortonworks.
  • Used monitoring daemons and tools like Cloudera Management Services, Ambari alerts, Nagios and Checkmk.
  • Adding new nodes to and removing nodes from an existing cluster.
  • Backup configuration and recovery from a NameNode failure.
  • Commissioning and decommissioning nodes on a running cluster.
  • Installation of various ecosystem components and daemons.
  • Experience securing clusters using Linux permissions, LDAP and Apache Ranger via Ambari.
  • Experienced in authorization, authentication, auditing, data encryption and security administration using Apache Ranger and Apache Knox on a Kerberized cluster.
  • Excellent command of backup, recovery and disaster recovery procedures, and of implementing backup and recovery strategies for offline and online backups.
  • Involved in benchmarking Hadoop/HBase cluster file systems with various batch jobs and workloads.
  • Prepared clusters for development teams working on POCs.
  • Experience in minor and major upgrades of Hadoop and its ecosystem components.
  • Experienced in scheduling backup and recovery of entire EDW databases across various geographical locations for business continuity and response time.
  • Efficient in vacuuming and analyzing databases through regular cleanup of old logs, deletion of old data, statistics collection and query tuning for efficient database operation.
  • Extensive experience in creating roles and users, granting privileges to roles, and user management.
  • Expert in extending the core functionality of Hive and Pig by writing custom UDFs in Java and Python based on user requirements.
  • Very good experience with partitioning and bucketing concepts in Hive; designed both managed and external tables in Hive to optimize performance (see the sketch after this list).
  • Optimization and performance tuning of HiveQL, and formatting table columns using Hive functions.
  • Experienced in maintaining and monitoring database backups and disk space.
  • Experienced in troubleshooting and analyzing problems like slowdowns, revalidating distribution keys for structured data distribution, and updating the dev team accordingly.
  • Hands-on experience analyzing log files on the server for the master, segments and mirrors, and finding root causes.
  • Extensive experience in creating and modifying resource queues.
  • Experienced in modifying connection limits on the master and segments.
  • Extensive knowledge of creating schemas and databases, and of creating tables with distribution keys and partitions with orientation and append-only conditions.
  • Extensive knowledge of creating external tables and loading data using gpfdist, and of analyzing error logs.
  • Experienced in maintaining and deleting partitions, and terminating frozen transactions and connections.
  • Extensive experience in Big Data/Hadoop solution implementation for operational reporting and analytics using Hive and Sqoop. Good knowledge of Spark and Spark SQL.
  • Experience in UNIX shell scripting and scheduling jobs using crontab.
  • Good hands-on experience in coding, development and EXPLAIN plan analysis.
  • Very good knowledge of Hadoop architecture and its components.
  • Proficient in data analysis and documenting findings.
  • Expertise in developing interfaces and conversion programs to integrate Oracle Applications modules and import data from various sources into Oracle Applications using Data Load, PL/SQL and SQL*Loader.
  • Strong programming experience in creating application-specific procedures, APIs, packages, functions, triggers and other database objects using SQL and PL/SQL.
  • Good interpersonal, presentation and development skills with a strong analytical and problem-solving approach; an excellent team player.
  • Very good knowledge of ERP systems, with proven experience in developing and implementing ERP solutions for various clients.
  • Self-confident and positive approach to challenging assignments.
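
As a flavor of the Hive partitioning and bucketing noted above, the following is a minimal sketch of a partitioned, bucketed managed table with a dynamic-partition load; the database, table and column names are hypothetical, not from an actual engagement:

    #!/bin/bash
    # Create a partitioned, bucketed Hive table and load it from a
    # staging table. All names below are illustrative placeholders.
    hive -e "
      CREATE TABLE IF NOT EXISTS sales (
        order_id BIGINT,
        amount   DOUBLE
      )
      PARTITIONED BY (order_date STRING)
      CLUSTERED BY (order_id) INTO 32 BUCKETS
      STORED AS ORC;

      SET hive.exec.dynamic.partition=true;
      SET hive.exec.dynamic.partition.mode=nonstrict;
      INSERT OVERWRITE TABLE sales PARTITION (order_date)
      SELECT order_id, amount, order_date FROM staging_sales;
    "

Partitioning by order_date lets Hive prune scans to the requested dates, while bucketing on order_id supports sampling and bucket map joins.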

TECHNICAL SKILLS:

Databases: Oracle, MySQL, PostgreSQL

ERP: Oracle ERP R12, 11i/11.5.9/10.7 Order Management (OM), TCA (Trading Community Architecture), General Ledger (GL), Accounts Receivable (AR), Accounts Payable (AP), Purchase Order (PO), Inventory (INV), Application Object Library (AOL), SysAdmin.

GUI Tools: Oracle Developer 6i/2000 (Forms 6i/4.5, Reports 6i/3.0), Oracle Database 10g, SQL*Loader, Oracle 8/8i/9i/10g

Languages: Java, Python, SQL, PL/SQL

Web Related: HTML, XML, JSON

Tools: pgAdmin III, Aginity Workbench, TOAD, Benthic, PuTTY, WinSCP, GoldenGate, PVCS, GitHub, Jenkins, Microsoft Office tools, Quality Center, Remedy

Operating Systems: Windows Server/Vista/XP/2000/2007/2008, Ubuntu, UNIX/Linux, CentOS

Security: LDAP, AD, Kerberos, Apache Ranger, Apache Knox, Sentry

Big Data: Hortonworks HDP 2.2 to 2.5.0.0, Cloudera CDH 4.x/5.x, Apache Pig, Hive, HBase, Sqoop, Flume, Puppet, Chef, ZooKeeper, Ambari, Oozie, Spark, Kafka, Cloudera Manager, Pivotal HD, Apache Solr, AWS

PROFESSIONAL EXPERIENCE:

Confidential, San Jose

Hadoop Admin

Responsibilities:

  • Currently working as a Hadoop admin on the CDH 5.7 distribution across 4 clusters, ranging from Dev and QA to PROD, totaling 100 nodes.
  • Responsible for cluster maintenance, cluster monitoring, commissioning and decommissioning data nodes (see the sketch after this list), troubleshooting, managing and reviewing data backups, and managing and reviewing log files.
  • Extensively worked on capacity planning, design and installation of clusters to fulfill business requirements.
  • Day-to-day responsibilities include solving developer issues, troubleshooting jobs, providing prompt solutions to reduce impact, documenting them and preventing future issues.
  • Worked on performance tuning at the cluster level to improve overall performance for the applications running on it.
  • Experience with new component installations and upgrading the cluster with proper strategies.
  • Very good hands-on experience with Linux admin tasks for OS-level issues affecting the Hadoop cluster.
  • Experience installing discovery tools like Tableau, Greenplum and Informatica and integrating them with Hadoop components.
  • Monitoring systems and services; architecture design and implementation of deployment, configuration management, backup, and disaster recovery systems and procedures.
  • Hands-on experience with cluster upgrades and patch upgrades without data loss, backed by proper backup plans.
  • Changing configurations based on user requirements for better job performance.
  • Experienced in guiding teams through the application onboarding process, from granting access to preparing a run book for using the Hadoop cluster.
  • Involved in snapshots and mirroring to maintain backups of cluster data, including remote backups.
  • Installation of various Hadoop ecosystem components and daemons.
  • Experienced in managing and reviewing log files.
  • Working experience in creating and maintaining MySQL/Postgres databases, setting up users and maintaining backups of the cluster metadata databases.
  • Set up MySQL/Postgres master-slave replication and helped business applications maintain their data on MySQL and Postgres servers.
  • Helped users with production deployments throughout the process.
  • Experienced in production support, solving user incidents varying from Sev1 to Sev4.
  • Managed and reviewed log files as part of administration for troubleshooting purposes; communicated and escalated issues appropriately.
  • As an admin, followed standard backup policies to ensure high availability of the cluster.
  • Developed Hive scripts and loaded data into Hive tables from HDFS.
  • Scheduled Hive jobs through the UC4 scheduling tool.
  • Worked on moving data marts into Hadoop using Python, Hive, Sqoop and Tableau.
  • Worked on Hadoop optimization for moving Teradata data into Hadoop.
  • Involved in Hive partitioning and bucketing, performing different types of joins on Hive tables and implementing SerDes such as the RegEx SerDe.
  • Designed, developed and tested complex ETL jobs using Teradata.
  • Designed and developed UC4 (Automic) jobs and workflows to schedule batch jobs.
  • Worked with the JIRA agile software development tool.
  • Expertise in Git version control.
  • Created and developed reports and dashboards using the Tableau visualization tool.
  • Involved in analyzing system failures, identifying root causes and recommending courses of action; documented system processes and procedures for future reference.
  • Worked with the systems engineering team to plan and deploy new environments and expand existing clusters.
  • Part of the version upgrade from CDH 4.x to CDH 5.x.
  • Monitored multiple cluster environments using Cloudera Manager monitoring services like the alert publisher and service/host monitors, and Nagios.
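
A minimal sketch of the data node decommissioning flow mentioned above, in plain-Apache style (on CDH this is normally driven from Cloudera Manager instead); the host name and file path are placeholders:

    #!/bin/bash
    # Add the host to the excludes file referenced by dfs.hosts.exclude,
    # then ask the NameNode to re-read it; HDFS re-replicates the node's
    # blocks before marking it decommissioned.
    echo "worker17.example.com" >> /etc/hadoop/conf/dfs.exclude
    hdfs dfsadmin -refreshNodes
    # Track progress until the node reports "Decommissioned".
    hdfs dfsadmin -report | grep -A 3 "worker17.example.com"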

Environment: CDH 5.7, Hue, Hive, Pig, Sqoop, Flume, ZooKeeper, Spark, HBase, MySQL, Python, Shell Scripting, Red Hat Linux, Sentry, Kerberos, RStudio, Historian, Greenplum, Solr, Jupyter Notebook, Teradata.

Confidential, San Jose

IT Analyst

Responsibilities:

  • Gathered details about current processes and requirements from the business.
  • Designed and developed reports and dashboards using the Tableau data visualization tool.
  • Designed the process to load data from Teradata to the Anaplan planning tool for dashboard creation.
  • Worked as lead in data modeling, dimensional modeling and physical design of data warehouse projects.
  • Designed, architected and implemented highly efficient and scalable ETL/ELT processes using Informatica.
  • Created and maintained standards and best-practices documents for data warehouse design and development.
  • Performance tuning of SQL queries in Teradata and Informatica workflows to meet SLAs.
  • Designed scalable solutions for migrating large volumes of data (terabytes) to the EDW Teradata warehouse.
  • Worked with QA and business teams on unit, integration and UAT testing.
  • Developed $U Uprocs, Sessions, Tasks, Management Units (MUs) and Rules, and scheduled Informatica workflows using the $U scheduling tool (see the sketch after this list).
  • Actively involved in supporting UAT (user acceptance testing), production deployment and normalization.
  • Followed up with the infrastructure, DBA and Informatica support teams to set up the Dev, QA and Production environments and resolve any environment issues affecting project timelines.
  • Documented best practices; continuous process improvement through bi-monthly sessions with teams.
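
The scheduler-driven workflow runs described above typically boil down to a wrapper script the scheduling tool invokes. This sketch uses Informatica's pmcmd command-line client; the service, domain, folder and workflow names are placeholders:

    #!/bin/bash
    # Start an Informatica workflow and block until it finishes, so the
    # scheduler can key success or failure off the exit code.
    # All names and credentials below are placeholders.
    pmcmd startworkflow \
      -sv INT_SVC_DEV -d DOM_DEV \
      -u "$INFA_USER" -p "$INFA_PASS" \
      -f EDW_LOADS \
      -wait wf_load_sales_fact

The -wait flag keeps pmcmd in the foreground, which lets $U or Tidal treat a non-zero exit code as a job failure.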

Environment: Informatica 9.5/8.x, Talend, SQL, ER/Studio 7.6.0, Tidal Scheduling Tool, Tableau, OBIEE, Teradata V15.0, Oracle, PVCS, Git, UCS UNIX Server, Hadoop, Hive, Pig, Spark, Spark SQL, FastLoad, TPump, MultiLoad, FastExport & TPT

Confidential, San Jose

IT Analyst

Responsibilities:

  • Gathered details about current processes and requirements from the business.
  • BI POC for CAB/CCB meetings, performing impact assessment on change requests.
  • BI impact assessment for CCW releases.
  • BI impact assessment for SOE releases.
  • BI impact assessment for revenue attribution.
  • BI impact assessment for the SAJO project.
  • BI impact assessment for NGCCRM.
  • Experience writing detailed documentation including RD050, MD050, CV40 and BR100.
  • Worked with global project and operations teams to clearly and concisely communicate proposal issues and statuses.
  • Supported quarter and year-end reconciliation and the closing process.
  • Provided timely communication and escalation of issues to the operations leadership team.
  • Worked as lead in data modeling, dimensional modeling and physical design of data warehouse projects.
  • Designed, architected and implemented highly efficient and scalable ETL/ELT processes using Informatica.
  • Created and maintained standards and best-practices documents for data warehouse design and development.
  • Designed scalable solutions for migrating large volumes of data (terabytes) to the EDW Teradata warehouse.
  • Extensively used Teradata utilities like FastLoad, FastExport, MultiLoad and TPump (see the sketch after this list).
  • Developed $U Uprocs, Sessions, Tasks, Management Units (MUs) and Rules, and scheduled Informatica workflows using the $U scheduling tool.
  • Actively involved in supporting UAT (user acceptance testing), production deployment and normalization.
  • Followed up with the infrastructure, DBA and Informatica support teams to set up the Dev, QA and Production environments and resolve any environment issues affecting project timelines.
  • Worked on multiple concurrent projects, coordinating with global teams to provide end-to-end solution delivery.
  • Designed and developed reports and dashboards using the Tableau data visualization tool.
  • Designed and developed job groups and jobs, and scheduled Informatica workflows using the Tidal Enterprise Scheduler (TES).
  • Performance tuning of SQL queries in Teradata and Informatica workflows to meet SLAs.
  • Investigated and resolved issues and change requests from the customer.
  • Worked with QA and business teams on unit, integration and UAT testing.
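
As a sketch of the Teradata utility usage called out above, a FastLoad run into an empty staging table might look like the following; the TDPID, credentials, table and file names are placeholders, and real credentials would come from a protected logon file rather than inline:

    #!/bin/bash
    # Bulk-load a pipe-delimited flat file into an empty Teradata
    # staging table with FastLoad. Placeholders throughout.
    fastload << 'EOF'
    LOGON tdprod/etl_user,etl_password;
    SET RECORD VARTEXT "|";
    BEGIN LOADING stage_db.stg_orders
      ERRORFILES stage_db.stg_orders_e1, stage_db.stg_orders_e2;
    DEFINE order_id (VARCHAR(18)),
           amount   (VARCHAR(18))
      FILE = /data/incoming/orders.txt;
    INSERT INTO stage_db.stg_orders (order_id, amount)
      VALUES (:order_id, :amount);
    END LOADING;
    LOGOFF;
    EOF

FastLoad requires an empty target table and writes rejected rows to the two error tables, which is why it is usually pointed at a staging table rather than the warehouse table itself.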

Environment: UNIX, Windows NT, Oracle 9i, SQL, PL/SQL, Oracle Apps modules, Teradata, Dollar Universe ($U) Scheduling Tool, Informatica 8.x, ERwin 7.6.0, OBIEE & Business Objects Reporting Tools, PVCS, UCS UNIX Server, FastLoad, TPump, MultiLoad, FastExport

Confidential, San Jose

Oracle Apps 11i Techno functional Consultant/Analyst

Responsibilities:

  • Gathered details about current processes and requirements from the business.
  • Worked with various stakeholders, including internal and external partners, to set up new Operating Units for different countries.
  • Participated in 3 different phases of Business Process Simulations.
  • Performed gap analysis of the 11i and R12 OM processes.
  • Worked with the business to implement Oracle out-of-the-box functionality.
  • Interacted with various cross-functional teams to gather business requirements.
  • In one R12 instance, simulated additional Cisco scenarios identified for Phase 1.2, without integrations, in an Oracle-hosted environment.
  • Re-executed BPS 1.1 scenarios with Cisco configuration and data.
  • Simulated limited cross-functional Order-to-Cash scenarios within R12.
  • Gathered requirements and implemented changes for ordering, invoice management, vendor dispute processing for indirect procurement, and AP/AR netting/offset.
  • Analyzed, designed and developed an interface to load GL period rates (see the sketch after this list).
  • Implemented Subledger Accounting (SLA).
  • Implemented milestone-based Revenue Recognition, Revenue Transfers, Revenue Reconciliation, Amortized Revenue and Allocation Posting.
  • Designed and developed enhancements for Purchasing, Payables and Requisitions.
  • Experience writing detailed documentation including RD050, BR100 and MD050.
  • Developed the conversion strategy and mapping (CV40) for various data objects like open AR invoices, customers, sales orders, etc.
  • Wrote test scripts and trained users on the process steps.
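
A hedged sketch of how the GL period-rate interface mentioned above could stage its flat file with SQL*Loader; the control file, staging table, columns, paths and credentials are illustrative only, with a PL/SQL validation step assumed to feed GL afterwards:

    #!/bin/bash
    # Stage period-rate records from a CSV into a custom staging table
    # with SQL*Loader. Table, columns, paths and credentials are
    # placeholders, not an actual project's objects.
    cat > gl_rates.ctl << 'EOF'
    LOAD DATA
    INFILE '/data/gl/period_rates.csv'
    APPEND INTO TABLE xx_gl_rates_stg
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (period_name, from_currency, to_currency, conversion_rate)
    EOF
    sqlldr userid=apps/"$APPS_PWD" control=gl_rates.ctl log=gl_rates.log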

Environment: UNIX, Windows NT, Oracle 9i, SQL, PL/SQL, Oracle Forms 6i, Reports 6i, Oracle Apps modules, R12

Confidential, San Jose

Oracle Apps 11i Techno functional Consultant/Analyst

Responsibilities:

  • Worked with various stakeholders, including internal and external partners, to set up new Operating Units for different countries.
  • Performed gap analysis and designed custom solutions for different requirements.
  • Reviewed standard Oracle capabilities in the Order-to-Cash space for new buy-sell entities.
  • Developed the BP080 (to-be future process) documents for AP, AR, PO, FA and GL.
  • Designed and developed solutions for end-to-end processes for evaluation orders, donation orders, service fulfillment, RMAs, etc.
  • Interacted with various cross-functional teams to gather business requirements.
  • Designed the party and customer data model using TCA from business requirements.
  • Worked on integrating Oracle ERP OM and AR with the RevPro system for the revenue recognition process.
  • Worked on internal order transformation where the ordering and booking entities are different.
  • Worked with the revenue team to define the revenue recognition process for software and subscription parts.
  • Developed RD050 documents for various interfaces and customizations.
  • Wrote the BP080 (to-be future process) for OM, AR and RMA processes, covering entitlement checking, the closed-loop RMA process, Install Base checking, etc.
  • Analyzed the Oracle 11i as-is and Oracle R12 to-be processes and documented the business impacts for the revenue recognition process.
  • Worked on mapping the to-be process for standard revenue recognition for both products and services.
  • Configured Revenue Recognition rules, AGIS, Intercompany Relations and trade compliance business processes in Oracle.
  • Developed process training manuals for various business groups.
  • Developed the BR100 (setup) document for finance modules.
  • Gathered requirements and implemented changes for ordering, invoice management, vendor dispute processing for indirect procurement, and AP/AR netting/offset.
  • Implemented E-Business Tax for Germany, Singapore and Australia.
  • BIE rule setups by country.
  • Experience writing detailed documentation including RD050, BR100 and MD050.
  • Wrote test scripts and trained users on the process steps.
  • Provided support to QA teams during UAT/BAT testing.
  • Provided KT sessions to the production support team on new enhancements.

Environment: UNIX, Windows NT, Oracle 9i, SQL, PL/SQL, Oracle Forms 6i, Reports 6i, Oracle Apps modules, R12

Confidential, Richmond, VA

Oracle Apps 11i Techno functional Consultant.

Responsibilities:

  • Prepared technical specifications for the given functional specifications and completed database design for custom ERP tables.
  • Developed custom forms and reports and registered them with AOL.
  • Reviewed technical specifications written by other developers and suggested improvements where necessary.
  • Implemented a series of scripts to check data integrity between the Comergent front-end database (Oracle) and the ERP database.
  • Worked on the GL Copy program to transfer data from the journal tables into custom tables and then into the GL_INTERFACE table.
  • Involved in studying historical data and feeder system data; developed interface programs to import journal entries into Oracle General Ledger using the Journal Import feature (see the sketch after this list).
  • Worked on GL posting from AR and on Receivables interfaces from the Order Entry module.
  • Involved in customization of the Enter Purchase Order form, creation of the PA transaction interface correction screen, and creation of an interface program from GL to the PA transaction interface table.
  • Analyzed Credit Memos, Debit Memos, Manual Invoices, AutoInvoice, Lockbox, Adjustments, etc.
  • Designed and implemented custom invoice processing and an upgrade allowance program in Credit Memos.
  • Created database-level procedures for tax deduction calculations.
  • Customized the Trial Balance report.
  • Generated funds-availability reports.
  • Involved in unit testing, integration testing and system testing of the application.
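
To illustrate the Journal Import feeder work above, a minimal sketch that stages one journal line in the open interface table; the column list is abbreviated, all values (book, source, category, segments, amounts) are placeholders, and a balancing credit line would follow in practice:

    #!/bin/bash
    # Insert a single 'NEW' line into GL_INTERFACE; the standard
    # Journal Import concurrent program then picks it up and creates
    # the journal. Values below are illustrative placeholders.
    sqlplus -s apps/"$APPS_PWD" << 'EOF'
    INSERT INTO gl_interface
      (status, set_of_books_id, accounting_date, currency_code,
       date_created, created_by, actual_flag,
       user_je_source_name, user_je_category_name,
       segment1, segment2, segment3, entered_dr, entered_cr)
    VALUES
      ('NEW', 1, SYSDATE, 'USD',
       SYSDATE, 1111, 'A',
       'Spreadsheet', 'Adjustment',
       '01', '110', '7640', 100, NULL);
    COMMIT;
    EXIT;
    EOF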

Environment: UNIX, Windows NT, Oracle 8i, SQL, PL/SQL, Oracle Forms 4.5, Reports 2.5, Oracle Order Entry 10.7, Oracle Purchasing, Project Accounting, GL, AOL and SysAdmin.

Confidential

Oracle Developer

Responsibilities:

  • Worked with end users to gather business requirements.
  • Designed and developed the screens pertaining to clients, plans, policies and premium payments.
  • Developed reports on premium dues, policy status and commission details by agency.
  • Developed and executed test cases.
  • Trained end users on using the system.
  • Involved in system analysis, design, coding, data conversion, development and implementation.
  • Installed Oracle 8i and Developer 2000 on Windows NT/2000/95.
  • Coded SQL scripts to create the development, testing and production databases, including tablespaces (adding data files, managing redo log files and control files), rollback segments, users, synonyms, roles, profiles and privileges, and changed init.ora parameters.
  • Wrote PL/SQL and Pro*C programs and UNIX shell scripts required for data transformation and loading.
  • Extensively involved in writing SQL queries (subqueries and join conditions) and PL/SQL programs.
  • Involved in creating sequences for automatic generation of voucher numbers and views for hiding data.
  • Performed performance tuning of existing SQL and PL/SQL and made changes to views, improving performance.
  • Wrote many stored procedures, stored functions and packages, used in many forms and reports.
  • Used TOAD and PL/SQL Developer for faster application design and development.
  • Wrote many database triggers to automatically update tables and views.
  • Wrote shell scripts for batch jobs in UNIX (see the sketch after this list).
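
A minimal sketch of one such UNIX batch job, scheduled from cron; the paths, schema name and retention window are placeholders:

    #!/bin/bash
    # nightly_cleanup.sh - purge old application logs and refresh
    # optimizer statistics. Scheduled from crontab, e.g.:
    #   0 2 * * * /home/oracle/bin/nightly_cleanup.sh
    find /u01/app/logs -name '*.log' -mtime +30 -delete
    sqlplus -s scott/"$DB_PWD" << 'EOF'
    EXEC DBMS_STATS.GATHER_SCHEMA_STATS('SCOTT');
    EXIT;
    EOF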

Environment: SQL, PL/SQL, Oracle 8.0.6/8i/9i, TOAD, PL/SQL Developer, Developer 6i, SQL*Loader, UNIX and Windows 2000.
