
Sr. Big Data Engineer Resume


SUMMARY:

  • Skilled and goal-oriented developer with the ability to produce analytic solutions across many departmental boundaries and large organizations on small and large-scale, multi-million-dollar Big Data & Java projects. Proven experience in requirement gathering, impact analysis, design, estimation, development, unit & system testing, and deployment for major organizations.
  • Strong knowledge and work experience in analyzing data using the Hadoop ecosystem, including Spark, Spark Streaming, HDFS, Hive, NiFi, Kafka, HBase, ZooKeeper, and Sqoop.
  • Extensive Knowledge of Hadoop and Spark Architecture and its core components.
  • Experience in working with Amazon Web Services (AWS) suite and CDH clusters.
  • Expertise in Writing, Configuring, Deploying Spark Applications on a Hadoop Cluster.
  • Experience in developing Spark Streaming applications: ingesting data using NiFi, writing the streamed data into Kafka, and analyzing the data through Spark (conducted ETL processes and connected to different SQL and NoSQL databases).
  • Experience in writing queries for moving data from HDFS to Hive and analyzing data using Hive-QL.
  • Very good knowledge of partitioning, bucketing, join optimization, and query optimization concepts in Hive.
  • Experience in importing and exporting data using Sqoop from Relational Database System (RDBMS) into HDFS.
  • Imported data from different relational data sources like Oracle, Teradata to HDFS using Sqoop.
  • Extensive hands-on experience in building high-performance Java distributed systems, performance tuning, stress testing, and load testing. Proficient in Java, J2EE, MVC patterns, ORM technologies, RESTful web services, Message Broker Services, Web development (HTML, JavaScript, CSS, jQuery, client-side MVC frameworks) and application servers (WebLogic, WebSphere, Jboss and Tomcat).
  • Very strong understanding of object-oriented design/development and of real-time, high-throughput systems and architecture.
  • Strong exposure to and understanding of database tables, data navigation, warehousing, and extraction using Oracle, MySQL and MS Access.
  • Good experience in using CI tools like Bamboo to create automatic build plans and deployment plans.
  • Worked with XML parsers like JAXP (SAX and DOM) and JAXB
  • Proficient in performing code reviews and identifying functional & technical gaps in the end-to-end application development life cycle.
  • Experience in managing teams of onsite and offshore functional analysts and developers. Performed overall planning, budgeting, resource allocation, and project management for various phases in several projects.
  • Experienced in Software Development Life Cycle (SDLC) methodologies including traditional Waterfall methods, and iterative approaches such as Agile.
  • Experienced in handling meetings with end clients and business teams on requirement discussions and critical issues in the UAT & production regions.
  • Strong team player; extremely flexible and able to work in a deadline-driven, fast-paced environment, adapting to constantly changing business priorities.
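The Sqoop-based imports from relational sources into HDFS and Hive mentioned above typically take a form like the following. This is a hedged sketch: the connection string, credentials, table name, and target paths are hypothetical placeholders, not details from an actual engagement.

```shell
# Hypothetical Sqoop import from an Oracle source into HDFS and Hive.
# All connection details, table names, and paths are illustrative only.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table TRADES \
  --target-dir /data/raw/trades \
  --hive-import --hive-table staging.trades \
  --num-mappers 4
```

The `--num-mappers` setting controls import parallelism; `--hive-import` creates (or loads into) the named Hive table after the HDFS copy completes.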

TECHNICAL SKILLS:

Big Data: Spark, Kafka, HDFS, HBase, Hive, NiFi, Zeppelin, Elasticsearch

Methodologies: Waterfall, Agile

Languages: Scala, Java/J2EE, XML parsing, Unix Shell Scripting, JavaScript, jQuery, CSS, HTML

Databases: Oracle 11g, SQL Server 2000, MS Access, MySQL, Teradata

Framework: Spring Framework (IoC, MVC, JMS), Struts Framework, XSLT, asynchronous message processing from MQ, JDBC, JUnit, EJB, iBatis, DAO

CI Tools: Bamboo

Tools: IntelliJ, Eclipse, Control-M job scheduler, Altova XMLSpy, Microsoft Visual Studio, Rapid SQL (Oracle database client), Oracle SQL Developer, Toad, RAD 7.0

Application Servers: Oracle WebLogic Server 11g, IBM WebSphere (WAS) 6.0, JBoss 5.1, Tomcat

Source Control: Bitbucket, Team Foundation Server (TFS), PVCS, SVN, VSS

Knowledgeable: IBM MQ Server, HP ALM, JCAPS, Misys Summit FT (Trade Management System), Lombard Colline (Collateral Management System)

PROFESSIONAL EXPERIENCE:

Confidential

Sr. Big Data Engineer

Responsibilities:

  • Received trade messages from different clients in different time zones across three regions (NA, Europe, APAC) through a message broker service; applied validation and business rules to detect fraud and performed transformations on the consumed messages.
  • Transformed each message to the format required by the different downstream systems (asset systems, fund accounting systems, etc.) and routed it to the corresponding system through the messaging service; every downstream system has its own defined format requirements.
  • Loaded every trade message transaction into the data repository to enable the trade status dashboards.
  • Routed the transformed message to the central counterparty (CCP) for trade matching, settlement, and obtaining clearance from the CCP house; archived the trade messages.
  • Extracted data from existing repositories and sent reports to clients.
  • Designed Streaming architecture, and implemented strategies for governance, management, ingestion, storage, consumption and delivery processes of diverse, complex and large-volume data sets.
  • Designed and developed methodologies to assure validity, completeness, accuracy, and consistency of the data.
  • Enabled and configured Apache Kafka as the message broker system between the client and its customers for input and outgoing messages.
  • Developed real-time trade message processing using Spark Streaming to consume messages from the Kafka broker; saved the raw input messages to HBase for backup and for further use by the analytics team; applied validation and transformation; saved the successfully processed messages to Hive using dynamic partitioning; and converted the processed messages to JSON format and sent them to the downstream systems through the Kafka broker for further trade processing.
  • Developed the EOD batch reports using Spark SQL.
  • Wrote Sqoop batch scripts to migrate the existing derivative trade history data from a legacy Oracle database to HDFS and Hive.
  • Wrote HQL batch scripts to extract the data from Hive, apply the business rules, transform the data, and generate the client reports (in CSV format).
  • Wrote wrapper shell scripts for spark-submit jobs and Hive scripts.
  • Coordinated development, unit testing, and system testing against the finalized design and ensured the SDLC process was followed.
  • Led Quality Assurance initiatives, including managing Hadoop application and infrastructure testing teams and contributing to the testing effort in a hands-on role.
  • Worked closely with cross-functional organizations, project teams, and clients to develop project schedules, execute test plans and cases, and deliver high-quality products.
  • Developed HBase data ingestion code using the Spark-HBase API and was involved in developing the HBase data model for storing horizontal and vertical data.
  • Experienced in performance tuning of Spark Applications for setting right Batch Interval time, correct level of Parallelism and memory parameters.
  • Involved in POC design and implementation to analyze social media data to assist with derivatives trade market trend analysis. Connected to Twitter using a NiFi processor, extracted public tweets matching the searched keywords, routed them to the Spark Streaming process through the Kafka broker, then processed and ingested them into HBase for further trend analysis by the analytics team.
  • Worked on an Enterprise Logger POC to analyze the server logs generated by all internal application systems to improve customer intelligence, monitoring capability, and the alert system. Using the NiFi FTP processor, retrieved all systems' log files and pushed them to the Spark processor to process and analyze the log data.
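The per-message validate/transform/route step described above can be sketched in plain Java, with the Kafka, HBase, and Hive I/O stubbed out. All class, field, and topic names here are hypothetical illustrations, not details of the actual system.

```java
import java.util.Map;

// Minimal sketch of the validate -> transform -> route step for one trade
// message; Kafka/HBase/Hive I/O is stubbed out. All names are hypothetical.
public class TradeMessageProcessor {

    /** Basic validation: required fields present and quantity positive. */
    public static boolean isValid(Map<String, String> trade) {
        return trade.containsKey("tradeId")
                && trade.containsKey("symbol")
                && Integer.parseInt(trade.getOrDefault("quantity", "0")) > 0;
    }

    /** Transform a validated trade into the JSON shape a downstream system expects. */
    public static String toDownstreamJson(Map<String, String> trade) {
        return String.format(
                "{\"tradeId\":\"%s\",\"symbol\":\"%s\",\"quantity\":%s}",
                trade.get("tradeId"), trade.get("symbol"), trade.get("quantity"));
    }

    /** Route by asset class: a stand-in for topic selection on the Kafka broker. */
    public static String downstreamTopic(Map<String, String> trade) {
        return "EQ".equals(trade.get("assetClass")) ? "asset-system" : "fund-accounting";
    }

    public static void main(String[] args) {
        Map<String, String> trade = Map.of(
                "tradeId", "T100", "symbol", "IBM", "quantity", "500", "assetClass", "EQ");
        if (isValid(trade)) {
            System.out.println(downstreamTopic(trade) + " <- " + toDownstreamJson(trade));
        }
    }
}
```

In the real pipeline, this logic would run inside a Spark Streaming transformation, with rejected messages diverted to an error path rather than silently dropped.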

Confidential

Sr. Java Developer

Responsibilities:

  • Used the Spring MVC & IoC framework to track the trade process.
  • Used Spring JMS & EJB MDBs to consume and send messages via IBM MQ Manager with asynchronous message processing.
  • Implemented concurrent processing of trade messages using Java multi-threading and routed them to different internal systems using MQ.
  • Used JCAPS technology as the hub system through which all trade messages are routed.
  • Used Unix shell scripting, core Java, JDBC, and JMS to create and send reports to the various clients.
  • Used XSLT to convert messages from one format to another.
  • Used XML parsing techniques: DOM parsing & SAX parsing.
  • Used Oracle WebLogic 11g Server and JBoss 5.1 as application servers.
  • Used IBM MQ 7.5 server for message processing.
  • Used Oracle Database 11g as the database.
  • Used Team Foundation Server (TFS) as the source control system.
  • Working closely with business, client partners and systems group to understand the business and functional requirements and responsible for creating the design document, architecture diagrams, estimations and related artifacts for the business requirements.
  • Coordinated development, unit testing, and system testing against the finalized design and ensured the SDLC process was followed.
  • Performed FAT and UAT with the Change Management group and provided clarifications/resolutions to technical/business-related issues.
  • Working closely with cross-functional organizations, project teams and clients to develop project schedules, execute test plans and cases and deliver high quality products.
  • Responsible for the installation in Production environment and providing warranty support for any issue during post installation.
  • Also responsible for preparing the installation RUNBOOK, which describes the step-by-step procedure for installation in the Production/Live environment.
  • Responsible for efficient offshore coordination for task allocation, provide status updates to the project management team, monitor and provide the support for installation of ongoing projects.
  • Led the rest of the team, coaching them in the areas of their function.
  • Mentored team members on best practices. Identified and suggested opportunities for immediate workgroup to help promote bench strength.
  • Responsible for mentoring the other team members on technical & functional aspects of the project.
  • Responsible for code reviews and for approving code to migrate from the development environment to UAT environments.
  • Performed the SME role.
  • Led the team and also performed management activities such as acting as HCM supervisor.
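The concurrent trade-message processing mentioned above can be sketched with a plain `java.util.concurrent` thread pool, with per-system `BlockingQueue`s standing in for the IBM MQ destinations. All names and the routing rule are hypothetical illustrations, not the production design.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// Sketch of concurrent trade-message processing: a thread pool consumes
// messages and routes them to per-system queues standing in for IBM MQ.
public class ConcurrentTradeRouter {

    // Thread-safe queues simulating the MQ destinations of two internal systems.
    static final Map<String, BlockingQueue<String>> QUEUES = Map.of(
            "SETTLEMENT", new LinkedBlockingQueue<>(),
            "ACCOUNTING", new LinkedBlockingQueue<>());

    /** Pick the destination queue from an (invented) message prefix. */
    static String destination(String msg) {
        return msg.startsWith("S:") ? "SETTLEMENT" : "ACCOUNTING";
    }

    /** Process a batch of messages concurrently and wait for completion. */
    public static void process(List<String> messages) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (String msg : messages) {
            pool.submit(() -> QUEUES.get(destination(msg)).add(msg));
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }

    public static void main(String[] args) throws InterruptedException {
        process(List.of("S:trade-1", "A:trade-2", "S:trade-3"));
        System.out.println("settlement=" + QUEUES.get("SETTLEMENT").size()
                + " accounting=" + QUEUES.get("ACCOUNTING").size());
    }
}
```

`LinkedBlockingQueue` is thread-safe, so multiple workers can route into the same destination without explicit locking, mirroring how concurrent producers would put messages on an MQ queue.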

Confidential

SENIOR JAVA DEVELOPER

Responsibilities:

  • Performed analysis of each Confidential client's requirements (across different industries) and configured the Confidential product using XML configuration. Industries included Automotive and Industrial, B2B Services, Cargo, Freight & Logistics, Chemicals and Energy, Consumer Goods, Food and Beverage, Healthcare, High Tech, and Travel & Hospitality.
  • Responsible for on-time delivery of quality code.
  • Responsible for Unit Testing and Integration Testing of the deliverables.
  • Led the rest of the team, coaching them in the areas of their function. Mentored team members on best practices. Identified and suggested opportunities for the immediate workgroup to help promote bench strength.
  • Served as a subject matter expert.
  • Worked with business teams and technical analysts to understand business requirements and determined how to leverage technology to create solutions that satisfy the business requirements.
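The XML-driven product configuration described above can be read with the standard JAXP DOM API. A minimal sketch follows; the XML shape (`<config industry="...">` with nested `<feature>` elements) is invented for illustration and is not the actual Confidential configuration schema.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

// Minimal JAXP DOM sketch for reading a per-industry product configuration.
// The element and attribute names are hypothetical placeholders.
public class ProductConfigReader {

    /** Parse the config XML and return its top-level industry attribute. */
    public static String industryOf(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        Element root = doc.getDocumentElement();
        return root.getAttribute("industry");
    }

    public static void main(String[] args) throws Exception {
        String xml = "<config industry=\"Healthcare\"><feature name=\"tracking\"/></config>";
        System.out.println(industryOf(xml));  // prints Healthcare
    }
}
```

For large configuration files a SAX parser would avoid building the whole tree in memory; DOM is shown here because per-client product configs are typically small.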

Confidential

SENIOR JAVA DEVELOPER

Responsibilities:

  • Involved in service side development, deployment and unit testing.
  • Involved in writing class-level and integration-level JUnit tests using the JUnit framework.
  • Involved in Maven Configuration and Maven build
  • Responsible for unit development & deploying the code to the local environment on IBM WebSphere Application Server. Performed the IBM WAS server configuration in the unit environment and configured the connection pools for databases and the shared libraries for the application server.
  • Ensuring timely delivery with adherence to quality standards
  • Involved in Requirement Analysis/Impact Analysis, Analysis review, Coding, Code review and Testing of Maintenance Requests.
  • Performed Configuration Manager (CM) role
  • Provided solutions for production fixes in service for all the applications.
  • Worked closely with cross-functional organizations, project teams and clients to develop project schedules, execute test plans and cases and deliver high quality products.
  • Responsible for mentoring the other team members/new joiners on technical & functional aspects of the project.
  • Responsible for code reviews and for approving code to migrate from the development environment to UAT environments.
