Big Data Lead Resume

Des Moines, Iowa

SUMMARY

  • Over 9 years of professional experience in software development and deployment, including three years of experience in Big Data ecosystems and data engineering technologies.
  • Strong expertise in Hadoop architecture and its components, including HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce paradigm.
  • Experience in designing, developing, and monitoring large clusters involving both relational and NoSQL databases.
  • Expertise in NoSQL database implementations such as Cassandra, MongoDB, HBase, MarkLogic, Elasticsearch, and Redis.
  • Good understanding of distributed architectures and horizontal scalability methodologies.
  • Hands-on experience in installing, configuring, and using Hadoop ecosystem components such as MapReduce, HDFS, HBase, ZooKeeper, Oozie, Hive, Sqoop, Pig, and Flume.
  • Experience in handling data ingestion into Hadoop ecosystems.
  • Experience in designing Java integration APIs for HBase, Cassandra, Elasticsearch, Flume, and Redis.
  • Expertise in message brokers such as ActiveMQ, Apache Kafka, and IBM MQ.
  • Experience with data serialization formats such as Avro and Thrift.
  • Experience in importing and exporting data between HDFS and relational database systems using Sqoop.
  • Experience in developing mobile applications using the Kony mobile development platform.
  • Experience in Agile and Waterfall SDLC implementations.
  • Strong experience in designing SOA patterns using the ESB (Enterprise Service Bus) model.
  • Experience in the Banking and Financial Services, Cards and Payments, and Communications domains.
  • Expertise in working with J2EE frameworks such as Spring, Struts, Web Services, Hibernate, and iBatis.
  • Involved in all phases of the software development life cycle.
  • Good team player with the ability to deliver tasks on time.

TECHNICAL SKILLS

Languages: C, Java, Scala, Ruby

J2EE Frameworks: Struts, Spring, Hibernate, JSF, iBatis, Web Services, EJB, JMS, Camel

Big Data Technologies: Hadoop ecosystem, Hive, HBase, ZooKeeper, Kafka, Storm, Apache Flume, Sqoop, Pig, Apache Spark, Splunk, Logstash, Kibana

NoSQL Databases: HBase, Cassandra, Elasticsearch, MongoDB, Redis

Web Technologies: HTML, HTML5, JavaScript, Dojo, Ext JS, jQuery, Ajax, WebLogic Portal, content management tools

Message Protocols: JMS, AMQP, STOMP, TCP

Message Brokers: WebSphere MQ, ActiveMQ, RabbitMQ, Kafka

Application Servers: WebLogic, WebSphere, Tomcat

Mobile Technologies: Android, Kony Studio

Build Tools: Ant, Maven, Gradle

RDBMS: Oracle, MySQL

PROFESSIONAL EXPERIENCE

Confidential - Des Moines, Iowa

Big Data Lead

Responsibilities:

  • Implemented a Java API for Apache Flume to ingest auditing data into HBase using AsyncHBase streaming.
  • Wrote HBase Scan operations for the custom search implementation (a sketch follows this list).
  • Wrote analysis and design documents for the changes made.
  • Analyzed the scope of the project and the approach for implementing the functional requirements.
  • Developed Flume agent configuration files with an Avro source and an HBase sink (a sample configuration also follows this list).
  • Implemented an API to write transaction metadata into an Oracle database.
  • Implemented data loading and transformation from structured to semi-structured data using Sqoop.
  • Implemented and integrated a Kafka producer with Storm spout implementations for auditing.
  • Managed data using Hadoop clusters.
  • Configured and installed HBase, ZooKeeper, and Hive.
  • Participated in status reviews of deliverables.
  • Reviewed and integrated code in SVN per coding standards.
  • Worked with the infrastructure team to design the Hadoop cluster.
  • Implemented a retrieval API and search queries for payloads stored in HBase.
  • Monitored health checks for HDFS clusters.
  • Analyzed and implemented Elasticsearch integration with HDFS.
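
A minimal sketch of the HBase Scan usage referenced above, written against the HBase 1.x Java client; the table name, column family, qualifier, and row-key range are illustrative assumptions rather than values from the project.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.ResultScanner;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class AuditSearch {

        // Scans a row-key range and prints the payload column of each matching row.
        public static void scanAuditEvents(Connection conn, String startKey, String stopKey) throws Exception {
            try (Table table = conn.getTable(TableName.valueOf("audit_events"))) {    // assumed table name
                Scan scan = new Scan();
                scan.setStartRow(Bytes.toBytes(startKey));                            // inclusive start of range
                scan.setStopRow(Bytes.toBytes(stopKey));                              // exclusive end of range
                scan.addColumn(Bytes.toBytes("d"), Bytes.toBytes("payload"));         // assumed family/qualifier
                try (ResultScanner scanner = table.getScanner(scan)) {
                    for (Result result : scanner) {
                        byte[] payload = result.getValue(Bytes.toBytes("d"), Bytes.toBytes("payload"));
                        System.out.println(Bytes.toString(result.getRow()) + " -> " + Bytes.toString(payload));
                    }
                }
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();                         // reads hbase-site.xml from the classpath
            try (Connection conn = ConnectionFactory.createConnection(conf)) {
                scanAuditEvents(conn, "20160101", "20160201");
            }
        }
    }

Restricting the scan to a row-key range and a single column keeps the custom search from reading more data than it needs.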
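
The Flume agent configuration mentioned above pairs an Avro source with the AsyncHBase sink. A minimal sample configuration under those assumptions follows; the agent, port, table, and column-family names are placeholders.

    # Illustrative Flume agent: Avro source -> memory channel -> AsyncHBase sink
    agent1.sources  = avroSrc
    agent1.channels = memCh
    agent1.sinks    = hbaseSink

    # Avro source listening for incoming audit events
    agent1.sources.avroSrc.type = avro
    agent1.sources.avroSrc.bind = 0.0.0.0
    agent1.sources.avroSrc.port = 41414
    agent1.sources.avroSrc.channels = memCh

    # In-memory channel buffering events between source and sink
    agent1.channels.memCh.type = memory
    agent1.channels.memCh.capacity = 10000

    # AsyncHBase sink writing events into an assumed table and column family
    agent1.sinks.hbaseSink.type = org.apache.flume.sink.hbase.AsyncHBaseSink
    agent1.sinks.hbaseSink.table = audit_events
    agent1.sinks.hbaseSink.columnFamily = d
    agent1.sinks.hbaseSink.channel = memCh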

Confidential - Des Moines, Iowa

Big Data Lead

Responsibilities:

  • Evaluated high-volume auditing data processing with the open-source tools Logstash, Elasticsearch, Redis, and Kibana.
  • Implemented a data ingestion API to insert data into Cassandra.
  • Implemented custom Elasticsearch queries using the Elasticsearch Java API (a sketch follows this list).
  • Installed the Elasticsearch cluster.
  • Tuned Elasticsearch sharding and replication for log data.
  • Implemented the Elasticsearch River API to read data from RDBMS sources.
  • Analyzed and integrated Storm processing with a Redis cluster.
  • Analyzed and implemented Elasticsearch integration with HDFS.
  • Implemented Apache Flume agents to process and ingest data into Elasticsearch.
  • Integrated Kibana to visualize data stored in Elasticsearch.
  • Monitored health checks for HDFS clusters.
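
A minimal sketch of a custom query built with the Elasticsearch Java API (TransportClient era), as referenced above; the index name, field names, and time window are illustrative assumptions.

    import org.elasticsearch.action.search.SearchResponse;
    import org.elasticsearch.client.Client;
    import org.elasticsearch.index.query.QueryBuilders;
    import org.elasticsearch.search.SearchHit;
    import org.elasticsearch.search.sort.SortOrder;

    public class AuditQuery {

        // Returns the most recent audit entries for one application over the last 24 hours.
        public static void printRecentEvents(Client client, String appName) {
            SearchResponse response = client.prepareSearch("audit-logs")                     // assumed index name
                    .setQuery(QueryBuilders.boolQuery()
                            .must(QueryBuilders.termQuery("application", appName))           // assumed field name
                            .filter(QueryBuilders.rangeQuery("@timestamp").gte("now-24h")))  // assumed timestamp field
                    .addSort("@timestamp", SortOrder.DESC)
                    .setSize(50)
                    .get();

            for (SearchHit hit : response.getHits().getHits()) {
                System.out.println(hit.getSourceAsString());
            }
        }
    }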

Confidential

Module Lead

Responsibilities:

  • Implemented a Java API to insert data into Cassandra, exposed through a Java REST API (a sketch follows this list).
  • Designed the data ingestion model to use recovery and retry logic for insertions.
  • Implemented a data retrieval API for Cassandra.
  • Implemented transaction-handling logic for data insertion.
  • Fixed unit test and SIT issues.
  • Reviewed code with the Code Review Panel team for SQP projects.
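
A minimal sketch of inserting data into Cassandra with simple retry logic, along the lines described above; it assumes the DataStax Java driver, and the keyspace, table, and column names are placeholders rather than the project's actual schema.

    import com.datastax.driver.core.BoundStatement;
    import com.datastax.driver.core.Cluster;
    import com.datastax.driver.core.PreparedStatement;
    import com.datastax.driver.core.Session;

    import java.util.Date;
    import java.util.UUID;

    public class EventWriter {

        private final Session session;
        private final PreparedStatement insert;

        public EventWriter(String contactPoint) {
            Cluster cluster = Cluster.builder().addContactPoint(contactPoint).build();
            this.session = cluster.connect("audit");                                   // assumed keyspace
            this.insert = session.prepare(
                    "INSERT INTO events (id, payload, created_at) VALUES (?, ?, ?)");  // assumed table and columns
        }

        // Retries the insert a few times before giving up, as a simple recovery strategy.
        public void writeWithRetry(String payload, int maxAttempts) {
            BoundStatement stmt = insert.bind(UUID.randomUUID(), payload, new Date());
            for (int attempt = 1; ; attempt++) {
                try {
                    session.execute(stmt);
                    return;
                } catch (RuntimeException e) {
                    if (attempt >= maxAttempts) {
                        throw e;                                                       // exhausted all attempts
                    }
                }
            }
        }
    }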

Confidential

Module Lead

Responsibilities:

  • Analyzed the scope of the project and the approach for implementing the functional requirements.
  • Implemented changes in batch processing for the Japan downstream systems.
  • Designed XSL templates for instant requests and batch processing.
  • Designed layout templates for JSPs.
  • Implemented performance improvements in the XSL transformations (a sketch follows this list).
  • Executed the project following the SDLC.
  • Fixed SIT and UAT issues.
  • Reviewed code with the Code Review Panel team for SQP projects.
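
One common way to speed up repeated XSL transformations is to compile the stylesheet once into a reusable javax.xml.transform.Templates object. The sketch below illustrates that approach under assumed file names, not the project's actual stylesheets.

    import javax.xml.transform.Templates;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;
    import java.io.File;

    public class BatchTransformer {

        // Compiled once; Templates is thread-safe and can be reused across the whole batch.
        private final Templates templates;

        public BatchTransformer(File stylesheet) throws Exception {
            this.templates = TransformerFactory.newInstance()
                    .newTemplates(new StreamSource(stylesheet));
        }

        // Each transformation gets its own Transformer but reuses the compiled stylesheet.
        public void transform(File input, File output) throws Exception {
            Transformer transformer = templates.newTransformer();
            transformer.transform(new StreamSource(input), new StreamResult(output));
        }
    }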

Confidential

Team Lead

Responsibilities:

  • As module lead, was responsible for providing project metrics and assisted the business analysis team with prototyping.
  • Contributed to the Sales Management and Admin Management modules.
  • Implemented the web service client for passing data to downstream systems using JAX-WS (a sketch follows this list).
  • Implemented the UI using Ext JS.
  • Participated in design discussions and prepared design documents.
  • Guided the team in implementing requirements and monitored reviews.
  • Implemented Struts actions and business components.
  • Involved in system testing and user acceptance testing, and implemented change requests.
  • Implemented Spring DAOs for database access.
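
A minimal sketch of a JAX-WS web service client of the kind referenced above; the WSDL URL, service name, and OrderPort interface are hypothetical placeholders (in practice the port interface would typically be generated with wsimport).

    import javax.jws.WebMethod;
    import javax.jws.WebService;
    import javax.xml.namespace.QName;
    import javax.xml.ws.Service;
    import java.net.URL;

    // Hypothetical service endpoint interface standing in for a generated stub.
    @WebService(targetNamespace = "http://downstream.example.com/")
    interface OrderPort {
        @WebMethod
        void submitOrder(String orderId);
    }

    public class DownstreamClient {

        public static void main(String[] args) throws Exception {
            // The WSDL location and service name are placeholders.
            URL wsdl = new URL("http://downstream.example.com/orders?wsdl");
            QName serviceName = new QName("http://downstream.example.com/", "OrderService");

            Service service = Service.create(wsdl, serviceName);
            OrderPort port = service.getPort(OrderPort.class);   // dynamic proxy bound to the endpoint
            port.submitOrder("ORD-1001");
        }
    }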

Confidential

Environment: Java 1.6, JSP, Spring, XML, Oracle, Ajax, Ext JS, Hibernate, Web Services, Spring STS

Responsibilities:

  • Involved in writing request-handling controllers using Spring MVC (a sketch follows this list).
  • Developed pages using the Ext JS framework.
  • Built the application using Agile methodologies.
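
A minimal sketch of a Spring MVC request-handling controller of the kind referenced above; the URL, request parameter, and view names are illustrative assumptions.

    import org.springframework.stereotype.Controller;
    import org.springframework.ui.Model;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RequestMethod;
    import org.springframework.web.bind.annotation.RequestParam;

    @Controller
    public class AccountController {

        // Handles GET /account/search and forwards the result to a JSP view.
        @RequestMapping(value = "/account/search", method = RequestMethod.GET)
        public String search(@RequestParam("q") String query, Model model) {
            model.addAttribute("query", query);     // data made available to the JSP
            return "accountSearch";                 // logical view name resolved by the view resolver
        }
    }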

Confidential

Environment: Java 1.6, Struts 2.0, JSP, Spring, XML, MySQL, Ajax, Dojo, iBatis, Hibernate, JProfiler, Web Services, Tomcat 6.0

Responsibilities:

  • Involved in gathering requirements from the client.
  • Developed a prototype for the project using wireframing software.
  • Involved in database design.
  • Developed action classes, service layers, and DAOs.
  • Implemented Ajax requests for 7 modules.
  • Designed the project front end using the Dojo framework.
  • Developed Struts 2 actions to service requests.
  • Configured the Quartz framework to schedule reports (a sketch follows this list).
  • Implemented client-side validations in JavaScript.
  • Implemented mailing functionality with Spring.
  • Involved in development, integration, and bug fixing.
  • Gave knowledge transfer (KT) sessions to onsite and offshore teams.
  • Tracked defects and fixed bugs.
  • Extensively worked on performance issues.
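
A minimal sketch of scheduling a report job with the Quartz 2.x API, as referenced above; the job class, trigger names, and cron schedule are illustrative assumptions.

    import org.quartz.CronScheduleBuilder;
    import org.quartz.Job;
    import org.quartz.JobBuilder;
    import org.quartz.JobDetail;
    import org.quartz.JobExecutionContext;
    import org.quartz.Scheduler;
    import org.quartz.Trigger;
    import org.quartz.TriggerBuilder;
    import org.quartz.impl.StdSchedulerFactory;

    public class ReportScheduler {

        // Hypothetical job body; the real job would generate and distribute the report.
        public static class ReportJob implements Job {
            public void execute(JobExecutionContext context) {
                System.out.println("Generating scheduled report...");
            }
        }

        public static void main(String[] args) throws Exception {
            Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();

            JobDetail job = JobBuilder.newJob(ReportJob.class)
                    .withIdentity("dailyReport")
                    .build();

            // Fires every day at 6:00 AM; the cron expression is an illustrative assumption.
            Trigger trigger = TriggerBuilder.newTrigger()
                    .withIdentity("dailyReportTrigger")
                    .withSchedule(CronScheduleBuilder.cronSchedule("0 0 6 * * ?"))
                    .build();

            scheduler.scheduleJob(job, trigger);
            scheduler.start();
        }
    }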
