
Sr. Implementation Analyst Resume


Fremont, CA

SUMMARY

  • Almost 10 years of industry and research experience in Java, J2EE, Java-based frameworks, Hadoop, Big Data, MPI and Hadoop clusters, distributed and parallel programming, numerical methods libraries, data warehouse design, cube modeling and design, ETL processes, Hyperion Planning, Excel reporting and BI.
  • Working on Big Data and the Hadoop ecosystem (MapReduce, HDFS, Hive, Pig, Impala, Avro, etc.) since 2013. Experienced in developing web-based applications using Java, Spring, MVC and J2EE.
  • Worked on Core Java, J2EE, JavaScript, AngularJS, Shell, Jacl, AJAX, JSON, and thread pools with the Executor Framework to call multiple services and aggregate results (see the sketch after this list).
  • Research and teaching experience on Hadoop and MPI clusters, distributed and parallel architecture and development, cloud computing, Hadoop architecture, MapReduce, Pig, Hive, HBase, parallel algorithms and multi-core architecture, Linux and numerical methods.
  • Prefer to work on Java-based frameworks and distributed environments, having analysis, development, teaching and lab experience in distributed and cluster computing; this experience helps me drive any scalable project. Also worked on LAPACK and numerical methods libraries.
  • Published three international conference and journal papers on numerical and parallel algorithms; Cloudera Certified Hadoop Developer (CCD-410, January 2015).
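
A minimal sketch of the Executor Framework pattern referenced above, assuming a fixed thread pool fanning out to a few services; the service calls and results here are placeholders, not from an actual project:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class ServiceAggregator {
        public static void main(String[] args) throws Exception {
            ExecutorService pool = Executors.newFixedThreadPool(4);
            // Hypothetical service calls; each Callable wraps one remote lookup.
            List<Callable<String>> calls = new ArrayList<>();
            calls.add(() -> "result-from-service-A");
            calls.add(() -> "result-from-service-B");

            // invokeAll blocks until every task completes, preserving order.
            List<Future<String>> futures = pool.invokeAll(calls);
            StringBuilder aggregate = new StringBuilder();
            for (Future<String> f : futures) {
                aggregate.append(f.get()).append('\n');
            }
            System.out.print(aggregate);
            pool.shutdown();
        }
    }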

TECHNICAL SKILLS

Tools/Ecosystem: parallel programming (MPI & OpenMP), Hadoop, Spark, HDFS, MapReduce, Hive, Pig, BLAST, Avro, Kafka, cloud, HiBench, OCR tools (Kofax/ABBYY), J2EE, Spring, XML and Eclipse

Cloud and Data Visualization: OpenStack Nova, Heat, Glance, Gephi, Sci2 and GUESS

Products: Hyperion OIOP (OLAP tool), Hyperion Planning, Essbase, Forecast Pro

Scripting: Shell, Excel VBA for reporting, Jacl scripts for ETL processes, Pig

Programming Languages: Core Java, Ruby, JSP, Servlets, JDBC

Databases: Oracle 11g, SQL, Teradata, HBase, PL/SQL and MDX queries

Operating Systems: Linux, Windows, Mac OS

Version Control: Chef, Git, Serena VM, Radar, SVN

Application Servers: Tomcat 7.x, WebLogic 10.3, WebLogic 11

Tools: Intel tools (Thread Profiler, Thread Checker, etc.), SoapUI 3.0

Current Work: Statistical analysis of the QR algorithm and binary matrix decomposition; Big Data analysis using Hadoop and other BI tools

PROFESSIONAL EXPERIENCE

Confidential, Fremont, CA

Sr. Implementation Analyst

Responsibilities:

  • The objective of this project was to build a flood data mart in Teradata and other big data technologies such as Hadoop. It also required extracting text from scanned image documents using OCR in a distributed environment, Java regex and text-extraction tools, and storing the extracted text in Teradata.
  • Worked on Java/J2EE and developed a Java user interface on top of Sqoop to transfer data from Oracle/Teradata/MS SQL to HDFS/Hive.
  • Worked on Spark to move all statistics-related computation onto Spark as per business requirements.
  • Developed the user interface for Hive data masking and governance.
  • Implemented the image text extraction objectives using Tesseract + Hadoop + Sqoop + Teradata and Kofax/ABBYY (see the first sketch after this list).
  • Used Avro to read schema definitions in JSON format (see the second sketch after this list).
  • Applied business data validation rules wherever applicable.
  • Accessed the Teradata database and loaded the data captured from images into Teradata.
  • Compared performance in processing the images/documents.
  • The POC helped establish a process for capturing business-relevant data from image copies and storing it in a database system.
  • Prepared an evaluation sheet for the selected OCR tools (ABBYY and Kofax) and assessed implementation feasibility for image text extraction.
  • Looked at alternative solutions to the problem, such as OCR tools plus Hadoop, GATE and other frameworks.
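
A minimal sketch of the OCR-to-Teradata step described above, assuming the Tess4J wrapper for Tesseract and the Teradata JDBC driver; the connection URL, credentials, and table/column names are hypothetical:

    import java.io.File;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import net.sourceforge.tess4j.Tesseract;

    public class ImageTextLoader {
        public static void main(String[] args) throws Exception {
            // Run OCR on a scanned document (Tess4J wraps the Tesseract engine).
            Tesseract ocr = new Tesseract();
            ocr.setDatapath("/usr/share/tesseract/tessdata"); // environment-specific
            String text = ocr.doOCR(new File(args[0]));

            // Load the extracted text into Teradata over JDBC.
            try (Connection con = DriverManager.getConnection(
                    "jdbc:teradata://tdhost/database=flood_mart", "user", "password");
                 PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO extracted_text (doc_name, doc_text) VALUES (?, ?)")) {
                ps.setString(1, args[0]);
                ps.setString(2, text);
                ps.executeUpdate();
            }
        }
    }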
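
And a minimal sketch of reading an Avro schema defined in JSON, using Avro's Schema.Parser; the record and field names are illustrative:

    import org.apache.avro.Schema;

    public class SchemaReader {
        public static void main(String[] args) {
            // Avro schemas are defined in JSON; names here are placeholders.
            String json = "{\"type\":\"record\",\"name\":\"FloodDoc\","
                    + "\"fields\":[{\"name\":\"doc_name\",\"type\":\"string\"},"
                    + "{\"name\":\"doc_text\",\"type\":\"string\"}]}";
            Schema schema = new Schema.Parser().parse(json);
            System.out.println(schema.getName() + ": " + schema.getFields());
        }
    }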

Environment/Software: Cloudera Hadoop, Spark, Sqoop, Hive, Pig, Avro, PDFBox, Splice Machine, Java regex, JSP, Servlets, Kofax, ABBYY, Tesseract OCR (image to text), VBScript, Teradata, MS SQL, Oracle, Java, Spring Framework, MapReduce, Linux, Eclipse, Tomcat, etc.

Confidential, Santa Clara, CA

Sr. Implementation Analyst

Responsibilities:

  • The objective was to develop a system to transfer logs to the log server, plus a security interface for clusters in the Confidential cloud environment (a minimal log-transfer sketch follows this list).
  • Developed a BI application to run on top of OpenStack as a service, and worked on Nova images and flavors to provision the application in virtual instances.
  • Implemented a Hortonworks Hadoop cluster for use in a private cloud.
  • Worked on deploying the Hadoop cluster on OpenStack.
  • Worked on writing Chef recipes for the Hadoop cluster.
  • Used HiBench for load/stress testing on the Hadoop cluster.
  • Worked on implementing the Hadoop cluster on the escale file system.
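
One plausible form of the log transfer described above is shipping local logs into HDFS on the cluster; a minimal sketch using the Hadoop FileSystem API, with a hypothetical namenode URI and paths:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class LogShipper {
        public static void main(String[] args) throws Exception {
            // Copy a local log file into HDFS; URI and paths are placeholders.
            Configuration conf = new Configuration();
            try (FileSystem fs = FileSystem.get(
                    URI.create("hdfs://namenode:8020"), conf)) {
                fs.copyFromLocalFile(new Path("/var/log/app/app.log"),
                                     new Path("/logs/app/app.log"));
            }
        }
    }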

Environment/Software: Hadoop, Spark, YARN, Intel HiBench, OpenStack, Nova, Keystone, Jira, Stash, Chef recipes, shell scripts, Ruby, Java, Git, etc.

Confidential - Sunnyvale, CA

Sr. Implementation Analyst

Responsibilities:

  • The objective was to implement SCM BI solutions using BI and Big Data tools, where the input sources were Teradata, SQL and web-based log files from different vendors.
  • Gathered requirements from users and held discussions with other teams (ETL using MapReduce; Hyperion IOP/SAP, EAI and Essbase teams) for the new architecture, and produced design documents.
  • Developed custom Java classes to extend Hive functionality (see the UDF sketch after this list); wrote Pig scripts and MapReduce code to solve Big Data problems; supported code/design analysis for the Hadoop ecosystem.
  • Performed extensive data-level preprocessing, validation, debugging and optimization in MapReduce.
  • Formulated analytical queries using MapReduce for early fraud detection in iTunes app purchases.
  • Worked with the Teradata team to prepare history and daily data on the Hadoop cluster. Responsible for designing JSP pages, writing JavaScript for the user interface and looking after the cluster environment.
  • Responsible for running jobs and migrating code to UAT and production; looked after the production environment on Linux and was responsible for bug/Radar resolution.
  • Imported and exported data into HDFS and Hive using Sqoop and Flume; used Avro and Kafka to convert and store unprocessed messages.
  • Scheduled and automated Oozie workflow XML generation by writing custom stored procedures in PL/SQL.
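
A minimal sketch of a custom Java class extending Hive functionality, using the classic Hive UDF API; the function name and normalization rule are hypothetical:

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical UDF: normalizes a vendor code before it is joined
    // against Teradata-sourced dimensions. Registered in Hive with:
    //   ADD JAR udfs.jar;
    //   CREATE TEMPORARY FUNCTION normalize_vendor AS 'NormalizeVendor';
    public class NormalizeVendor extends UDF {
        public Text evaluate(Text input) {
            if (input == null) return null;
            return new Text(input.toString().trim().toUpperCase());
        }
    }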

Environment/Software: Hadoop, Pig, Hive, MapReduce using Core Java, Avro, Kafka, Hyperion Shared Services, Core Java, JSP/Servlets, Hyperion OIOP, Perl, Jacl, Oracle 11g, SQL and MDX queries, WebLogic 11g server, etc.

Confidential - Chicago, IL

Responsibilities:

  • The objective was to develop framework components and infrastructure to handle very large sales and forecast data using BI tools and custom Java development.
  • Wrote the Core Java code for new enhancements to calendar generation and the forecast file system; worked on JSP and JavaScript for the web interface; worked with different file types (mlt, log, dat, etc.) for extracting data.
  • Transformed business-critical information in the ETL workflow using Java and Jacl scripts.
  • Developed custom Java serialization to handle specific data inputs, generate the sales forecast and apply overrides to it (see the serialization sketch after this list).
  • Worked with the Eton team to build the Linux cluster and virtual machines to implement the model.
  • Worked on the POC and produced the design documents for the model; responsible for the overall flow and for guiding the team through implementation.
  • Primary tasks were requirement gathering and discussions with users and other teams (ETL, Cognos) for the new architecture, plus producing design documents.
  • Guided the ETL team on the required tables and the design model (sales cube with multi-currency, dimensions, data source, row source, etc.) and unit-tested the same.
  • Migrated the model to different QA and production servers (three servers: NA, EMEA and APAC); worked with the ETL team on data migration to Teradata.
  • Worked on Core Java, development changes in workflow configuration, memory issues with data blocks, cluster environment issues on Linux, and discussions with the Oracle team on product issues.
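
A minimal sketch of custom Java serialization in the spirit of the forecast work above; the class, fields and re-derivation rule are hypothetical:

    import java.io.IOException;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;

    // Hypothetical forecast record with custom serialization hooks; the
    // override flag is recomputed on read instead of being stored.
    public class ForecastRecord implements Serializable {
        private static final long serialVersionUID = 1L;

        private String sku;
        private double forecast;
        private transient boolean overridden; // derived, not persisted

        private void writeObject(ObjectOutputStream out) throws IOException {
            out.defaultWriteObject();           // write sku and forecast
        }

        private void readObject(ObjectInputStream in)
                throws IOException, ClassNotFoundException {
            in.defaultReadObject();
            overridden = forecast < 0;          // illustrative re-derivation
        }
    }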

Environment/Software: Core Java, JSP, Servlets, JavaScript, Hyperion Shared Services, Hyperion OIOP, Forecast Pro 4.x, Cognos 8, Jacl, Oracle 11g, Teradata, MDX queries, Tomcat and WebLogic servers, etc.

Confidential - Atlanta, GA

Responsibilities:

  • Wrote the Core Java code for displaying and handling customer data; worked on JDBC database connections, data analysis and query performance issues (see the JDBC sketch after this list).
  • Wrote Java classes for the order and payment modules.
  • Responsible for writing JUnit test cases for the same, and involved in unit testing for the online payment module.
  • Responsible for writing JSPs, servlets and JavaScript for the Confidential.
  • Worked on the payment and order modules for the Confidential.
  • Responsible for migrating code to the Unix environment for the testing team.
  • Coding, unit testing, bug fixing and documentation.
  • Worked on the WebLogic server in a Linux environment, and performed maintenance and admin tasks for it.
  • Used SVN to update and commit code.
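
A minimal JDBC sketch of the parameterized-query work described above; the connection details and schema are hypothetical:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class CustomerDao {
        public static void main(String[] args) throws Exception {
            // URL, credentials and table/column names are placeholders.
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:orcl", "user", "password");
                 PreparedStatement ps = con.prepareStatement(
                    "SELECT customer_id, name FROM customers WHERE region = ?")) {
                ps.setString(1, "WEST");
                // A bound PreparedStatement lets the database reuse the parsed
                // plan, one lever for the query-performance work above.
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getInt(1) + " " + rs.getString(2));
                    }
                }
            }
        }
    }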

Environment/Software: Oracle 11g, Excel, JDBC, Java, JSP, Servlets, JavaScript, WebLogic server, Struts, etc.

Confidential

Environment/Software: 12-node MPI cluster, OpenMP, multithreading, parallel algorithms, Intel LAPACK library, and Intel Thread Checker tools

Responsibilities:

  • Analyzed the performance of the algorithm and its output; coding and unit testing.
  • Extensively involved in debugging, optimization and validation of the code.
  • Implemented matrix multiplication and the QR algorithm using OpenMP, in both serial and parallel versions; also submitted an international paper on the same.
  • Installed various multithreading tools and configured MPI cluster daemons.
  • Worked on Core Java and benchmarked performance on the MPI cluster; collected and analyzed boiler data from a power company.
  • Handled files larger than 1 GB by dividing them into chunks for further processing; performed many analyses on streaming the file data for efficient processing (see the chunking sketch after this list).
  • Used Intel tools such as VTune, Thread Checker and Thread Profiler for multithreading and multitasking.
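
A minimal Java sketch of splitting a 1 GB+ file into chunks as described above; the chunk size and naming scheme are illustrative choices, not from the original work:

    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.io.IOException;

    public class FileChunker {
        static final int CHUNK_SIZE = 64 * 1024 * 1024; // 64 MB, illustrative

        public static void main(String[] args) throws IOException {
            byte[] buffer = new byte[CHUNK_SIZE];
            int part = 0;
            try (FileInputStream in = new FileInputStream(args[0])) {
                int read;
                // Stream the large input and write each filled buffer out
                // as its own chunk file for downstream processing.
                while ((read = in.read(buffer)) > 0) {
                    try (FileOutputStream out =
                            new FileOutputStream(args[0] + ".part" + part++)) {
                        out.write(buffer, 0, read);
                    }
                }
            }
        }
    }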

Confidential

Project Engineer

Environment/Software: Java, JavaScript, HTML, assembly language, MySQL, Excel VBA, microcontroller kit, etc.

Responsibilities:

  • Met customers personally to understand their requirements; handled more than 50 clients.
  • Involved in writing HTML and JavaScript for the measurement user interface.
  • Wrote Java classes to measure the dimensions and size of the virtual image (a minimal sketch follows this list).
  • Also used the parallel port interface to send signals to the measurement kit.
  • Coding, unit testing, data measurement, data analysis and bug fixing.
  • Managed the maintenance team and stayed in touch with clients to track issues.
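
A minimal sketch of measuring image dimensions in Java with ImageIO; the pixel-to-millimetre calibration factor is a hypothetical placeholder:

    import java.awt.image.BufferedImage;
    import java.io.File;
    import javax.imageio.ImageIO;

    public class ImageMeasure {
        public static void main(String[] args) throws Exception {
            // Read an image and report its pixel dimensions; the scale factor
            // converting pixels to millimetres is an assumed calibration.
            BufferedImage img = ImageIO.read(new File(args[0]));
            double mmPerPixel = 0.26;
            System.out.printf("%d x %d px (%.1f x %.1f mm)%n",
                    img.getWidth(), img.getHeight(),
                    img.getWidth() * mmPerPixel, img.getHeight() * mmPerPixel);
        }
    }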
