System/Network Engineer Resume
San Francisco, CA
SUMMARY
- 13+ years of professional IT experience, including 3+ years developing strategic methods for deploying big data technologies such as HDFS, MapReduce, HBase, Hive, Pig, Flume, Oozie, Sqoop, and ZooKeeper on the Apache Hadoop framework to efficiently solve Big Data processing requirements.
- Extensive experience in development of Big Data projects using Hadoop, MapReduce, Pig, Hive, Sqoop, Flume, and Oozie.
- Good experience working with Hortonworks Distribution and Cloudera Distribution.
- Experience in installation, configuration, supporting and managing Hadoop clusters.
- Ability to import and export data between HDFS and Relational Database Management Systems using Sqoop.
- Worked with multiple input formats such as TextInputFormat, KeyValueTextInputFormat, SequenceFileInputFormat, and NLineInputFormat.
- In-depth understanding of installing and configuring Pig, Hive, HBase, Flume, and Sqoop on Hadoop clusters.
- Experience in analyzing data using HiveQL, Pig, HBase, and custom MapReduce programs in Java.
- Experience in extending Hive and Pig core functionality by writing custom UDFs in Java (a brief UDF sketch follows this summary).
- Experience writing MapReduce programs with custom logic based on requirements.
- Experience optimizing MapReduce jobs using Combiners and Partitioners to deliver the best results.
- Hands-on experience in storing and processing unstructured data using NoSQL databases like HBase.
- Logical schema design for, and application interaction with, HBase.
- Implemented many Impala scripts and shell scripts for data validation and data analytics.
- Knowledge of cluster coordination and monitoring using ZooKeeper.
- Worked on different operating systems, including UNIX/Linux, and developed various shell scripts.
- Worked in multiple environments on installation and configuration.
- Experience in using Sqoop, Oozie and Cloudera Manager.
- Working knowledge of real-time and batch processing using Spark.
- Good knowledge of Hadoop cluster architecture and cluster monitoring.
- Experience in managing and reviewing Hadoop Log files.
- Knowledge of data warehousing and ETL tools like Talend Open Studio.
- Background with traditional relational databases such as Oracle and MySQL.
- Knowledgeable in Database concepts and writing finely tuned queries.
- Skilled in manipulating and analyzing large datasets and stored data to find patterns and insights based on requirements.
- Document and explain implemented processes and configurations in upgrades.
- Support development, testing, and operations teams during new system deployments.
- Evaluate and propose new tools and technologies to meet the needs of the organization.
- Implemented stand-alone installation, file system management, backups, process control, user administration and device management in a networked environment.
- An excellent team player and self-starter with good communication skills and proven abilities to finish tasks before target deadlines.
- Sound knowledge of Business Intelligence and reporting; prepared dashboards using Tableau.
- Experienced in job workflow scheduling and monitoring with Oozie and cluster coordination with ZooKeeper.
- Knowledge of the Java Virtual Machine (JVM) and multithreaded processing.
- Strong programming skills in designing and implementing applications using Core Java, J2EE, JDBC, HTML, and JavaScript.
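Illustrative sketch (hypothetical, not from a specific engagement): a minimal Hive UDF in Java of the kind described above. The class name, function purpose, and lookup entries are assumptions for illustration only.

import java.util.HashMap;
import java.util.Map;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical Hive UDF that normalizes state names to two-letter codes.
public final class NormalizeState extends UDF {
    private static final Map<String, String> STATES = new HashMap<String, String>();
    static {
        // Illustrative entries only; a real UDF would load the full lookup table.
        STATES.put("california", "CA");
        STATES.put("minnesota", "MN");
        STATES.put("texas", "TX");
    }

    // Hive calls evaluate() once per row; returning null propagates SQL NULL.
    public Text evaluate(final Text input) {
        if (input == null) {
            return null;
        }
        String code = STATES.get(input.toString().trim().toLowerCase());
        return code != null ? new Text(code) : input;
    }
}

Such a UDF would be registered in a Hive session with ADD JAR followed by CREATE TEMPORARY FUNCTION normalize_state AS 'NormalizeState'.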
TECHNICAL SKILLS
Big Data Ecosystems: Hadoop (HDFS, MapReduce), Pig, Hive, Sqoop, HBase, Flume, Oozie
Web Technologies: Core Java, J2EE, CSS3, HTML, XML, JavaScript, Servlets, JSP
Programming Languages: C, Java, J2EE, SQL, PHP
Methodologies: Agile, UML, Design Patterns (Core Java and J2EE)
Operating Systems: Microsoft Windows, Linux
Databases: Oracle, MySQL
Tools & IDEs: NetBeans, Visual Studio, Git, Eclipse, JIRA
Monitoring & Reporting tools: Tableau, Custom Shell Scripts
PROFESSIONAL EXPERIENCE
Confidential - Richfield, Minnesota
Senior Hadoop Developer
Responsibilities:
- Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
- Configured Sqoop Jobs to import data from RDBMS into HDFS using Oozie workflows.
- Involved in creating Hive internal and external tables, loading data, and writing Hive queries, which run internally as MapReduce jobs.
- Involved in Migrating the Hive queries to Impala.
- Created batch analysis job prototypes using Hadoop, Pig, Oozie, Hue and Hive.
- Deployed and configured Flume agents to stream log events into HDFS for analysis.
- Assisted with data capacity planning and node forecasting.
- Integrated Oozie with the rest of the Hadoop stack, supporting several types of Hadoop jobs out of the box (such as MapReduce, Pig, Hive, and Sqoop) as well as system-specific jobs (such as Java programs and shell scripts).
- Proposed a Hadoop-based BI infrastructure including an HDFS data layer storing the data in Parquet files with an Avro schema on top, fed by a Kafka data broker (a producer sketch follows this role).
- Parsed JSON and XML files in Pig using Pig Loader functions and extracted meaningful information from Pig relations by providing a regex to Pig's built-in functions.
- Proactively monitored systems and services, architecture design and implementation of Hadoop deployment, configuration management, backup, and disaster recovery systems and procedures.
- Analyzed system failures, identified root causes, and recommended courses of action.
- Documented system processes and procedures for future reference.
- Worked with systems engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters.
- Monitored workload, job performance, and capacity planning using Cloudera Manager.
- Monitored and performance-tuned Hadoop clusters; screened cluster job performance for capacity planning; monitored cluster connectivity and security; managed and reviewed Hadoop log files.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
Environment: HDFS, JDK 1.6, CentOS, Flume, MapReduce, HBase, MongoDB, Pig, Hive, Sqoop, Oozie, ZooKeeper, Java, Eclipse, MySQL, Tableau, Unix, WebLogic, Kafka, J2EE.
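Illustrative sketch of the Kafka feed mentioned above: a minimal Java producer. The broker address, topic name, key, and payload are hypothetical, and the Avro payload is simplified here to a JSON string.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public final class LogEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical broker address; a real deployment would list several brokers.
        props.put("bootstrap.servers", "broker1:9092");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        Producer<String, String> producer = new KafkaProducer<String, String>(props);
        try {
            // One record per log event; keying by host keeps a host's events ordered.
            producer.send(new ProducerRecord<String, String>(
                    "web-logs", "host-01", "{\"event\":\"page_view\"}"));
        } finally {
            producer.close();
        }
    }
}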
Confidential - Dallas, Texas
Senior Hadoop Developer
Responsibilities:
- Responsible for building scalable distributed data solutions using Hadoop.
- Installed and configured Hive, Pig, Oozie, and Sqoop on Hadoop cluster.
- Developed simple to complex MapReduce jobs in Java, along with equivalent Hive and Pig implementations.
- Supported MapReduce programs running on the cluster.
- Performed cluster monitoring, maintenance, and troubleshooting.
- Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop.
- Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
- Analyzed the data by performing Hive queries (HiveQL) and running Pig Scripts (Pig Latin).
- Installed Oozie workflow engine to run multiple Hive and Pig jobs.
- Worked on NoSQL database including MongoDB and HBase.
- Created HBase tables for large data sets of structured and unstructured data (a table-creation sketch follows this role).
- Worked with MongoDB using CRUD operations, indexing, replication, and sharding; used indexing to keep data sorted.
- Generated reports and dashboards using Tableau.
- Performance-tuned Kafka and Storm clusters; benchmarked real-time streams.
- Generated Tableau reports with trend lines and used filters, sets and calculated fields on the reports.
- Experienced in loading and transforming large sets of structured, semi-structured, and unstructured data.
- Analyzed large data sets to determine the optimal way to aggregate and report on them.
- Wrote multiple MapReduce programs in Java for data extraction, transformation, and aggregation from multiple file formats, including XML, JSON, CSV, and other compressed file formats.
Environment: HDFS, JDK 1.6, CentOS, Flume, MapReduce, HBase, MongoDB, Pig, Hive, Sqoop, Oozie, ZooKeeper, Java, Eclipse, MySQL, Tableau, Unix, WebLogic, JUnit, Kafka.
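Illustrative sketch of the HBase table work above, using the classic Java client API of that era. The table name, column family, and row-key scheme are hypothetical.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public final class EventsTableSetup {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();

        // Create the table once; a single short-named column family keeps storage lean.
        HBaseAdmin admin = new HBaseAdmin(conf);
        if (!admin.tableExists("events")) {
            HTableDescriptor table = new HTableDescriptor(TableName.valueOf("events"));
            table.addFamily(new HColumnDescriptor("d"));
            admin.createTable(table);
        }
        admin.close();

        // Reversed-timestamp row key so the newest events sort first on scan.
        HTable events = new HTable(conf, "events");
        Put put = new Put(Bytes.toBytes(
                "evt#" + (Long.MAX_VALUE - System.currentTimeMillis())));
        put.add(Bytes.toBytes("d"), Bytes.toBytes("type"), Bytes.toBytes("page_view"));
        events.put(put);
        events.close();
    }
}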
Confidential, San Francisco, CA
Java Developer
Responsibilities:
- Involved in the complete software development life cycle (SDLC) of the application, from requirement analysis to testing.
- Developed the modules based on MVC Architecture.
- Developed the UI using JavaScript, JSP, HTML, and CSS for interactive cross-browser functionality and a complex user interface.
- Created business logic using Servlets and Session Beans and deployed them on a WebLogic server.
- Used MVC framework for application design.
- Created complex SQL queries and PL/SQL stored procedures and functions for the back end.
- Prepared the Functional, Design and Test case specifications.
- Involved in writing stored procedures in Oracle for database-side validations (a JDBC call sketch follows this role).
- Performed unit testing, system testing, and integration testing.
- Developed unit test cases; used JUnit for unit testing of the application.
- Provided technical support for production environments: resolved issues, analyzed defects, and provided and implemented solutions.
- Resolved high-priority defects on schedule.
Environment: Java, JSP, Servlets, JDBC, Unix/Linux, JavaScript, CSS, HTML, SQL, JUnit, Eclipse, Apache Tomcat.
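Illustrative sketch of calling an Oracle stored procedure from Java via JDBC, as in the validation work above. The procedure name, parameters, and connection details are hypothetical.

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

public final class OrderValidator {
    // Calls a hypothetical procedure VALIDATE_ORDER(p_order_id IN, p_status OUT).
    public static String validateOrder(long orderId) throws Exception {
        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "app_user", "secret");
        try {
            CallableStatement call = conn.prepareCall("{call VALIDATE_ORDER(?, ?)}");
            call.setLong(1, orderId);
            call.registerOutParameter(2, Types.VARCHAR);
            call.execute();
            return call.getString(2); // status computed by the database-side validation
        } finally {
            conn.close();
        }
    }
}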
Confidential, MN
Java Developer
Responsibilities:
- Involved in various stages of application development through requirement analysis, development, testing and deployment.
- Developed the application using the Agile methodology.
- Developed web application using Spring MVC framework.
- Developed back-end logic in Core Java using technologies including the Collections Framework, multithreading, exception handling, generics, and annotations.
- Implemented front-end functionality in HTML, CSS, JavaScript, jQuery, and Bootstrap.
- Designed REST APIs that allow sophisticated, effective, and low-cost application integration (a controller sketch follows this role).
- Used Spring DAO support to access the database.
- Used Spring IoC for dynamic bean injection and Spring AOP to modularize cross-cutting concerns in aspects.
- Used JavaScript to enhance user interaction with the application and for client-side validation.
- Used JIRA for bug tracking, issue tracking and project management.
- Created Class and sequence diagrams by using Enterprise Architect.
- Used Maven to build and run the application and to create JAR and WAR files, among other uses.
- Used Spring IoC with autowired POJO and DAO classes in Spring controllers.
- Used CSS to keep the application's look uniform.
- Deployed the application to a Tomcat server and used the ALM tool for defect tracking.
- Created numerous test cases using the JUnit framework, including for front-end UI testing.
- Used SVN as the code repository and Eclipse as the IDE.
- Used Log4j for application logging and debugging.
- Developed JUnit test cases for all use cases and executed them.
Environment: Java, MyBatis, Maven, Spring MVC, HTML, CSS, JavaScript, JUnit, SVN, WebSphere Application Server, Jenkins, Eclipse, JSON, JSP, Servlets, Log4j.
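Illustrative sketch of a Spring MVC REST endpoint of the kind described above. The controller, service, domain type, and URL path are all hypothetical.

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseBody;

@Controller
@RequestMapping("/api/accounts")
public class AccountController {

    // Hypothetical domain type; serialized to JSON by Spring's message converters.
    public static class Account {
        public long id;
        public String name;
        public Account(long id, String name) { this.id = id; this.name = name; }
    }

    // Hypothetical service contract, backed elsewhere by a Spring DAO bean.
    public interface AccountService {
        Account findById(long id);
    }

    private final AccountService accountService;

    @Autowired
    public AccountController(AccountService accountService) {
        this.accountService = accountService; // injected via Spring IoC
    }

    // GET /api/accounts/42 returns the account as a JSON body via @ResponseBody.
    @RequestMapping(value = "/{id}", method = RequestMethod.GET)
    @ResponseBody
    public Account getAccount(@PathVariable("id") long id) {
        return accountService.findById(id);
    }
}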
Confidential, San Francisco, CA
System/Network Engineer
Responsibilities:
- Involved in the configuration and troubleshooting of routing and label distribution protocols: MP-BGP, OSPF, LDP, EIGRP, RIP, and BGPv4. Configured IP access filter policies.
- Key contributions included troubleshooting complex LAN/WAN infrastructure.
- Configured firewall logging, DMZs, and related security policies and monitoring.
- Modified internal infrastructure by adding switches to support server farms and added servers to existing DMZ environments to support new and existing application platforms.
- Created private VLANs, prevented VLAN hopping attacks, and mitigated spoofing with DHCP snooping and IP Source Guard.
- Installed and configured a Confidential PIX 535 series firewall and configured remote-access IPsec VPN on the Confidential PIX firewall.
- Enabled STP enhancements to speed up network convergence, including PortFast, UplinkFast, and BackboneFast.
- Other responsibilities included documentation and change control.
- Installed and configured a Confidential VPN Concentrator 3060 for VPN tunnels with Confidential VPN hardware and software clients and the PIX firewall.
- Documented new VPN enrollments in a database and created standard procedures for further improvement.
- Involved in troubleshooting DNS, DHCP, and other IP conflict problems.
- Used various scanning and sniffing tools like Wireshark.
- Hands-on experience with security tasks such as applying ACLs and configuring NAT and VPNs.
- Documented and analyzed logs for the Confidential ASA 5500 series firewall.
- Configured BGP for CE-to-PE route advertisement inside the lab environment.
- Spearheaded meetings and discussions with team members regarding network optimization and BGP issues.
Environment: Confidential 3750/3550/3500/2960 switches and Confidential 3640/5/3600/2800 routers, Confidential ASA 5500, Confidential PIX firewall, Checkpoint, LAN, OSPF, BGP, RIP, EIGRP, Wireshark, Windows NT/2000 Server, Windows XP.
Confidential
Fiber Optic Engineer
Responsibilities:
- Supervised installation of optical ground wire (OPGW) conductor.
- Spliced and tested fiber optic networks.
- Installed, operated, and maintained teleprotection equipment at electrical substations.
- Performed LAN administration and troubleshooting for Windows 2003 servers and Windows XP workstations.
- DNS management (IP addressing).
- Executed campus network upgrades, replacements, and expansions.
- Provided onsite support for desktops, servers, LAN equipment, and WAN links.
- Assisted in LAN design and support.
- Involved in data center build and support, implementation, migrations, network support, and interconnectivity between an old data center and a new data center.