
Big Data Consultant Resume


Pennsylvania, PA

SUMMARY:

  • Around 9 years of experience in the SDLC: design, development, quality assurance, build and release, integration, support, and maintenance of large-scale enterprise applications.
  • 3+ years of experience with Big Data technologies - Hadoop HDFS, MapReduce, Pig, Hive, Sqoop, Kafka, Flume, HBase, Zookeeper, YARN, Scala, R, Cassandra, Spark, RDDs, Streaming, data warehousing, Impala, Oozie, WebLech, Talend.
  • Excellent experience with DevOps automation tools: AWS, Puppet, Chef, Docker, and Ansible.
  • 6 years of experience in Java/J2EE development, including managing, supporting, and deploying applications.
  • Experience with HDFS architecture and cluster concepts, MapReduce, Pig, Hive, Sqoop, Impala, Kafka, Flume, HBase, and Redshift.
  • Extensive experience writing scripts using Scala, Shell, R, Python, SQL, MapR, and Pig Latin for relevant applications.
  • Experience working on MapReduce and Spark applications, and ETL operations using Talend Studio, Pig scripting, Hive, and HQL.
  • Experience with version control systems like CVS, Subversion, and Git, and source code management client tools like VisualSVN, TortoiseSVN, PuTTY, Git Bash, GitHub, Cloudera, Hortonworks, and other command-line applications.
  • Extensively worked on Jenkins for continuous integration (CI) and end-to-end automation of all builds and deployments.
  • Good experience with ETL tools like Talend DI and ESB - job design, web services, MDM, the Eclipse tool, and Talend Studio.
  • Experience executing XML, ANT scripts, shell scripts, Ruby, Go, YAML, Python, Perl scripts, PowerShell scripts, and JavaScript.
  • Experience with AWS - EC2, S3, RDS, CloudWatch, Lambda, ECS, EBS, Glacier, EC2 Container Service, IAM, and Bitbucket.
  • Good exposure to monitoring, troubleshooting, and managing deployed applications and environments.
  • Experience on Software Development Life Cycle, Test Driven Development, Continuous Integration and Continuous Delivery.
  • Work experience supporting multiple platforms like Ubuntu, Fedora, iOS, and Windows for production, test, and development servers.
  • Extensive experience architecting object-oriented, distributed, web-based, and enterprise applications.
  • Extensive experience with J2EE, Java, web services, SOAP, REST, EJB, JMS, and Spring design patterns.
  • Good exposure to web/application servers like Tomcat, Apache, Nginx, WebLogic, IBM WebSphere, JBoss, Jetty, and IIS.
  • Extensively worked on the interaction of Java/J2EE systems with databases like Oracle, SQL Server, DB2, and MySQL - SQL queries and stored procedures.
  • Extensively worked on developing GUIs using HTML, DHTML, JavaScript, JSP, XML, and Perl.
  • Experience deploying Couchbase, Tomcat, and Elasticsearch clusters using Docker.
  • Experience with frameworks like Struts, Spring, JSF, Hibernate, iBatis, Swing, AWT, AngularJS, and Node.js.
  • Experience writing PL/SQL, packages, Confidential SQL, stored procedures, modeling, functions, and complex SQL queries using Oracle, MySQL, Sybase, DB2, and PostgreSQL.
  • Application integration with IDEs like Eclipse, RAD, WSAD, WTX, and JDeveloper, and deployment to application servers.
  • Experience in the finance, banking, healthcare, and accounting domains.
  • Experience with methodologies like SOA, OOP, and Agile.
  • Ability to communicate with multiple teams and vendors effectively
  • Excellent Problem solving, communication and interpersonal skills

TECHNICAL SKILLS:

Operating systems: Linux (Ubuntu, RedHat, Fedora, CentOS), Unix, Windows, MS-DOS

Java: JDK, JDBC 2.x, Java 5-8

AWS: EC2, ECS, RDS, Lambda, DynamoDB, MongoDB, CloudWatch

BIGDATA/ETL: MongoDB, Hadoop, MapReduce, Pig, Sqoop, Flume, HDFS, Hive, HBase, Storm, Spark MLlib, Spark Streaming, WebLech, Kafka, Flink, Cassandra, Talend Studio

Automation tools for continuous integration/delivery/deployment: Puppet, Chef, Ansible, Docker, Jenkins, GitHub

J2EE technologies: JSP, Servlets, EJB, JMS, JAAS

Version control/build/PM: Git, Maven, CVS, VSS, SVN, Ant, Perforce

Application servers: WebLogic, Tomcat, WebSphere, OC4J, JBoss, WESB

Web services: SOAP, Apache Axis, WSDL, JAX-WS, RESTful

Web servers: IIS, Apache, Tomcat, Jetty, Nginx

IDE and Debugging tools: Eclipse, RAD, Toad, Visual Studio, Bugzilla, Firefox

Frameworks: Struts, Spring, Hibernate, JSF

Databases: Oracle, DB2/UDB, Sybase, SQL Server, MariaDB

Design tools: OOAD, Design Patterns, UML, Poseidon, Dreamweaver, Talend Studio, MySQL Workbench

Methodologies: Agile, OOAD, SOA

Scripts: JavaScript, XML, HTML5, CSS3, Perl, Ajax, PHP, Python

CMS: Interwoven Teamsite

XML: XSL, XSLT, DOM, SAX

PROFESSIONAL EXPERIENCE:

Confidential, Pennsylvania, PA

Big Data Consultant

Responsibilities:

  • Developed MapReduce programs to parse the raw data and store the refined data in tables
  • Designed and Modified Database tables and used HBASE Queries to insert and fetch data from tables.
  • Involved in loading and transforming large sets of Structured, Semi structured and Unstructured data from relational databases into HDFS using Sqoop imports.
  • Worked on implementation of Avro, ORC, and Parquet data formats for Apache Hive computations to handle custom business requirements.
  • Used Flume to collect, aggregate, and store the web log data from different sources like web servers and network devices and pushed to HDFS
  • Worked on Spark transformations, actions, RDDs, SparkContext, sessions, datasets for different jobs
  • Worked on retrieving transaction data from RDBMS into HDFS and saving the output in Hive tables per user requirements using MapReduce jobs.
  • Developed Pig Latin scripts to extract the data from the web server output files to load into HDFS.
  • Created Hive tables, loaded data, and wrote Hive queries that run internally in MapReduce; worked with the Cassandra database.
  • Used OOZIE Operational Services for batch processing and scheduling workflows dynamically
  • Worked on Kafka configurations for streaming, consumers, and producers by extending the Kafka high-level API in Java and ingesting data to HDFS or HBase depending on the context.
  • Worked on creating the workflow to run multiple Hive and Pig jobs, which run independently based on time and data availability.
  • Worked on MapReduce, Spark transformations, Spark RDDs, Spark Streaming, and Spark SQL using Python and Scala.
  • Worked with Talend ETL/DI Studio to create jobs and transformations from different databases for data analysis.
  • Worked on code reviews, review the database models, generating reports and data analysis
  • Developed Spark scripts by using Python, Scala,Java as per the requirement
  • Worked on implementing machine learning techniques in Spark by using Spark Mlib.
  • Participated in onsite calls, status meetings, and review meetings about backlogs.
  • Involved in moving data from Hive tables into HBase and Cassandra for real-time analytics on Hive tables.
  • Involved in cluster maintenance which includes adding, removing cluster nodes, cluster monitoring and troubleshooting, reviewing and managing data backups and Hadoop log files.
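The "parse the raw data, store the refined data" MapReduce pattern in the first bullet can be sketched without a Hadoop cluster. This is a minimal, framework-free illustration; the tab-separated log format and field names are hypothetical, not from the actual project.

```python
from itertools import groupby
from operator import itemgetter

def mapper(line):
    """Parse a raw 'user_id<TAB>amount' line into a (key, value) pair."""
    user_id, amount = line.strip().split("\t")
    return user_id, float(amount)

def reducer(pairs):
    """Sum values per key, mimicking the shuffle/sort and reduce phases."""
    pairs = sorted(pairs, key=itemgetter(0))  # shuffle/sort: group identical keys
    return {k: sum(v for _, v in grp) for k, grp in groupby(pairs, key=itemgetter(0))}

raw = ["u1\t10.0", "u2\t5.5", "u1\t2.5"]
refined = reducer(mapper(line) for line in raw)  # {"u1": 12.5, "u2": 5.5}
```

In a real job the mapper and reducer run on separate cluster nodes and the refined records land in Hive/HBase tables rather than a dict.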

Environment: J2EE, AWS, EC2, EMR, S3, Route53, RDS, R, version control, Java 8, Maven, Hadoop 2, MapReduce, Pig, Sqoop, WebLech, Hive, Spark Streaming, Cassandra, Ambari, Oozie, HBase, Python, SQL, PL/SQL, HTML5, Linux, Tika, RedHat, MySQL, YARN, NoSQL, JavaScript, CSS3, Git, Shell, Tomcat, Talend DI, Studio, ETL, Oracle, SQL Server, DB2, Jenkins.

Confidential, Texas

Big Data Consultant

Responsibilities:

  • Worked on creating software scripts to automate test, staging, and production service deployments.
  • Developed Map Reduce jobs in Java for data cleansing, preprocessing and implemented complex data analytical algorithms.
  • Created Hive generic UDFs to process business logic with HiveQL.
  • Involved in implementing Maven build scripts, to work on maven projects and integrated with Jenkins.
  • Worked on Hadoop MapReduce, HDFS, MongoDB, Spark MLlib, Streaming, RDDs, DataFrames, and Datasets.
  • Responsible for managing data coming from different sources and involved in HDFS maintenance and loading of structured and unstructured data.
  • Developed MapReduce programs to parse the raw data and store the refined data in tables.
  • Designed and Modified Database tables and used HBASE Queries to insert and fetch data from tables.
  • Involved in moving all log files generated from various sources to HDFS for further processing through Flume.
  • Developed algorithms for identifying influencers within specified social network channels.
  • Involved in loading and transforming large sets of Structured, Semi structured and Unstructured data from relational databases into HDFS using Sqoop imports.
  • Analyzed and cleansed raw data with Hive, Pig, Hadoop Streaming, and the Talend ETL tool by performing Hive queries and running Pig scripts on the data.
  • Developed Pig Latin scripts to extract the data from the web server output files to load into HDFS.
  • Created Hive tables, loaded data, and wrote Hive queries that run internally in MapReduce.
  • Used OOZIE Operational Services for batch processing and scheduling workflows dynamically.
  • Populated HDFS and Cassandra with huge amounts of data using Apache Kafka.
  • Hands-on experience in application development using Java, RDBMS, and Linux shell scripting. Performed data mining investigations to find new insights related to customers.
  • Involved in forecast based on the present results and insights derived from Data analysis.
  • Developed sentiment analysis system per particular domain using machine learning concepts by using supervised learning methodology.
  • Created a continuous build process using Jenkins as the continuous integration tool.
  • Leveraged AWS cloud services such as EC2, auto-scaling and VPC (Virtual Private Cloud) to build secure, highly scalable and flexible systems that handled expected and unexpected load bursts, and are able to quickly evolve during development iterations.
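The supervised sentiment-analysis bullet above can be illustrated with a toy model: learn per-word label counts from labeled examples, then score new text by the labels its words were seen with. The training samples and labels below are made up for the sketch; the actual system would use a proper feature pipeline and a real classifier.

```python
from collections import Counter, defaultdict

def train(samples):
    """Learn word -> label counts from (text, label) training pairs."""
    counts = defaultdict(Counter)
    for text, label in samples:
        for word in text.lower().split():
            counts[word][label] += 1
    return counts

def classify(counts, text, default="neutral"):
    """Vote with the labels each word of `text` was trained with."""
    votes = Counter()
    for word in text.lower().split():
        votes.update(counts.get(word, Counter()))  # unseen words contribute nothing
    return votes.most_common(1)[0][0] if votes else default

model = train([("great fast service", "pos"),
               ("slow bad support", "neg"),
               ("great support", "pos")])
label = classify(model, "fast and great")  # "pos"
```

A production version would add smoothing and probabilities (e.g. naive Bayes), but the train/predict split shown here is the supervised-learning shape the bullet refers to.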

Environment: Hadoop, AWS, MapReduce, Big Data, Hive, Impala, Cassandra, Spark, Avro, Kafka, Storm, Pig, Flume, Java 1.8, MongoDB, NoSQL, S3, RDS, Git, Linux, Redshift, Talend ETL, Ubuntu, EC2, Lambda, Cloud, Python, Scala, Oracle 10g, LDAP, Shell Script, YARN, Eclipse, RAD, Maven, WebSphere AS, Tomcat WS

Confidential, Greensboro, NC

Hadoop System Analyst

Responsibilities:

  • Worked on Processing large data sets in parallel across the Hadoop cluster for pre-processing.
  • Developed the code for Importing and exporting data into HDFS using Sqoop
  • Developed Map Reduce programs to join data from different data sources using optimized joins by implementing bucketed joins or map joins depending on the requirement.
  • Imported data from structured data sources into HDFS using Sqoop incremental imports.
  • Worked on Implementing Kafka Custom partitioners to send data to different categorized topics.
  • Implemented a Storm topology with stream groupings to perform real-time analytical operations.
  • Worked on Kafka spouts for streaming data and different bolts to consume data.
  • Created Hive tables and partitions, and implemented incremental imports to perform ad-hoc queries on structured data.
  • Wrote shell scripts that run multiple Hive jobs, automating incremental updates of the Hive tables used to generate reports in Tableau for business use.
  • Wrote JavaScript to execute different MongoDB queries. Involved in editing existing ANT/Maven files in case of errors or changes in the project requirements.
  • Worked on deploying systems using Amazon Web Services Infrastructure services EC2, S3, RDS, SQS
  • Deployed and monitored scalable infrastructure on Amazon Web Services (AWS) and handled configuration management using Chef.
  • Collaborate in the automation of AWS infrastructure via Jenkins - software and services configuration via chef cookbooks.
  • Worked with Hadoop MapReduce, Sqoop, and HDFS big-data file system components.
  • Worked on Docker for managing containers, snapshots, images, managing directory structures.
  • Handled the tasks like build configurations, LDAP code changes, build applications, deployments to Application containers.
  • Monitored software, hardware, and/or middleware updates utilizing technologies like Jenkins/Hudson, Ant, Maven, TFS Team Explorer, and Subversion.
  • Worked on communicating with customers and resolved the high severity issues.
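The Sqoop incremental imports mentioned above work by remembering the highest value of a check column and pulling only rows beyond it on the next run. A plain-Python sketch of that watermark logic (row shape and column name are hypothetical):

```python
def incremental_import(rows, last_value, check_column="id"):
    """Return rows with check_column > last_value, plus the new watermark."""
    new_rows = [r for r in rows if r[check_column] > last_value]
    new_last = max((r[check_column] for r in new_rows), default=last_value)
    return new_rows, new_last

source = [{"id": 1, "txn": "a"}, {"id": 2, "txn": "b"}, {"id": 3, "txn": "c"}]
batch, watermark = incremental_import(source, last_value=1)
# batch holds ids 2 and 3; watermark becomes 3 and is saved for the next run
```

Sqoop itself does this with `--incremental append --check-column id --last-value <n>` against the source RDBMS; the point of the sketch is only the bookkeeping, not the tool.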

Environment: Java, AWS, Ant, Maven, EC2, S3, RDS, Python, Spring, Shell Script, SQL, PL/SQL, Hadoop, MapReduce, MongoDB, Sqoop, Kafka, Storm, Hive, Flume, Bash, WebSphere Application Server 8, Tomcat 6, Windows, DB2, SQL Server, Subversion.

Confidential, Memphis, TN

DevOps Engineer

Responsibilities:

  • Deployed and monitored scalable infrastructure on Amazon Web Services (AWS) and handled configuration management using Chef.
  • Used Bitbucket Server for secure, fast, enterprise-grade controls, like fine-grained permissions and powerful management features
  • Trained end users on branching strategies so that all Subversion (SVN) users could use the tool effectively.
  • Used Jenkins for enterprise scale infrastructure configuration and application deployments.
  • Used ANT and Puppet/Chef scripts to build and deploy the application.
  • Created a continuous build process using Jenkins as the continuous integration tool.
  • Leveraged AWS cloud services such as EC2, auto-scaling and VPC (Virtual Private Cloud) to build secure, highly scalable and flexible systems that handled expected and unexpected load bursts, and are able to quickly evolve during development iterations.
  • Implemented multiple high-performance MongoDB replica sets on EC2 with robust reliability
  • Experience and ability to setup automated monitoring and alerting systems.
  • Worked on deploying systems using Amazon Web Services Infrastructure services EC2, S3, RDS, SQS
  • Collaborate in the automation of AWS infrastructure via Jenkins - software and services configuration via chef cookbooks.
  • Handled tasks development, execution, Bug tracking and periodical technical review of various components.
  • Extensively worked on Spring, Servlets, JSP, JSF, JSR168, XML configurations for framework, servers
  • Worked on Talend ETL tool installation, configuration, and plugin installation.
  • Worked on supporting EJB session beans and design patterns to communicate with DAO components.
  • Extensively worked on web services - SOAP, WSDL, JAX-RPC, JAX-WS, XML, and XML Schema were used to distribute the RPCs and messages in different formats.
  • Extensively worked on XML schema, XSLT, XML DOM, SAX, parsing, transformations, different formats.
  • Worked on Docker, Jenkins, Chef cookbooks for managing containers, snapshots, images, managing directory structures, deployments.
  • Handled the tasks like build configurations, LDAP code changes, build applications, deployments to Application containers.
  • Created Calendar entries for the scheduling of promotions to different environments like Integration, UAT, Staging, and Production.
  • Worked on integrating application through Web services, Spring, Hibernate components
  • Worked on JMS using MDBs to create pub/sub and point-to-point connections - topics, queues, connections, and messages.
  • Developed SQL queries, stored procedures, triggers, and functions to manipulate data from the front end to the database.
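The JMS pub/sub vs point-to-point distinction in the bullets above can be shown with an in-memory analogue: a topic copies every message to each subscriber, while a queue hands each message to exactly one consumer. This is only an analogy (real JMS goes through a broker); names and messages here are invented.

```python
from collections import deque

class Topic:
    """Pub/sub: every subscriber receives a copy of each published message."""
    def __init__(self):
        self.subscribers = []
    def subscribe(self):
        inbox = deque()
        self.subscribers.append(inbox)
        return inbox
    def publish(self, msg):
        for inbox in self.subscribers:
            inbox.append(msg)

class Queue:
    """Point-to-point: each message is delivered to exactly one consumer."""
    def __init__(self):
        self.messages = deque()
    def send(self, msg):
        self.messages.append(msg)
    def receive(self):
        return self.messages.popleft() if self.messages else None

topic = Topic()
a, b = topic.subscribe(), topic.subscribe()
topic.publish("order-created")        # both a and b get a copy

q = Queue()
q.send("task-1")
first, second = q.receive(), q.receive()  # only the first receive gets the message
```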

Environment: EJB 3.0, JSP 1.1, XML, XSLT, JDBC 3.0, Docker, Chef, Python, Talend, AWS, RDS, EC2, MongoDB, YAML, Go, Java 1.6, Spring 2.5, MVC, AJAX, Shell script, Oracle 10g, LDAP, JSP, WS, Hibernate 3.0, Ant, SQL, PL/SQL, HTML5, CSS3, JavaScript, WebLogic Application Server 9.0, Eclipse, Unix, Subversion, AIX, DB2, SQL Server.

Confidential, Springfield, OR

Sr Application Analyst

Responsibilities:

  • Involved in the meetings related to business and technical requirements gathering and requirements analysis discussions.
  • Enhanced the design of application components like use cases, class diagrams, component diagrams, and sequence diagrams using UML, based on OOP and service-oriented architecture standards.
  • Extensively worked on Struts MVC, Servlets, JSP, Java beans, JSTL, Spring, EJB and Hibernate, Core Java.
  • Worked on Struts, request-response life cycle, process validations, Configure navigation rules, validation rules.
  • Worked on web page development using HTML, Perl, DHTML, XSLT, XSD, XHTML, JSR, JSF, Ajax, CSS, Swing, JavaScript, and XML - SAX and DOM parsers.
  • Worked on core Java concepts like I/O, multithreading, generics, annotations, and collections.
  • Extensively worked on Web services, SOAP, JAXB, JAX-RPC/JAX-WS, WSDL, EJB, MDB, JAXB, XML, Services/Proxy and Java Beans.
  • Extensively worked on production support and resolving production issues.
  • Implemented Spring dependency/IOC concept, Spring AOP, Spring JDBC, Spring Security, integration, Factory, Application Context modules.
  • Integrated the application Servlets, JSP, JSF, JSR 168, JSTL, JMS, Spring, JDBC, Hibernate components through Web services with SOA standards.
  • Actively handled concurrency issues, Hibernate Caching strategies, HQL, Performance issues.
  • Worked on UNIX environment to deploy the application Components on WebLogic Application server.
  • Worked on TOAD to develop PL/SQL procedures, Joins, Indexes, Triggers, Functions, SQL queries to handle database operations.
  • Maven was used to create components/ dependencies/artifacts for build directory structure.
  • Used the Eclipse IDE to write code for all application components, unit tests, and integration tests.

Environment: J2EE, Maven 2.0, XML 1.0, Java 5.0, Struts 2, multi-threading, Oracle 10g, TDD, Web Services-Axis2, SOAP, REST, WSDL, EDI, JSR 168, JSF, JAXB, JAX-RPC/JAX-WS, Messages, JDBC, Spring-IOC, Hibernate, MVC, UNIX, JMS, Eclipse 3.4, Servlets, JSP, CSS, SVN, Perforce, ANT, JUnit, HTML 3.0, Shell Script, JavaScript, design patterns, Apache Tomcat WS, WebSphere AS.

Confidential, OH

Sr Technical Consultant

Responsibilities:

  • Involved in the analysis, specification, design, implementation, and testing phases of the Software Development Life Cycle (SDLC) and used the Rational Unified Process (RUP)/TDD.
  • Worked on Struts framework - JSP, XML, HTML, Servlets, MVC design pattern, validation rules, JSP in the presentation layer.
  • Adapted various design patterns like MVC, Data Transfer Object (DTOs), Business Delegate, Service Locator, Session Facade, Data Access Objects (DAO), etc.
  • Designed EJB's like Stateful Session Beans for the Session Facade design pattern and used Message Driven Beans (MDB) as a listener for JMS messages.
  • Consumed Web Service using WSDL and SOAP to get the credit history from the service provider.
  • Extensively used Hibernate in data access layer to access and update information in the database.
  • Used Spring Framework for Dependency injection and integrated with the Hibernate.
  • Worked on JMS with MQ Series to create channels and queue managers, monitor load balancing, and integrate with other components.
  • Implemented the database connectivity using JDBC with SQL Server database as backend.
  • Worked on Parsing the XML based responses using JAXB Parser and developed XML Schema for Loan application module.
  • Worked on PL/SQL for querying the Database and Stored procedures and triggers for SQL Server using TOAD.
  • Extensively used Eclipse while writing code and CVS for version control.
  • Used JUnit to implement test cases for Unit testing of modules.
  • Monitored the error logs using Log4J and Involved in the documentation of the application

Environment: JBoss App Server, Eclipse IDE, EJB 2.0, JSP 2.0, Struts 1.1, Hibernate 3.0, Spring 1.0.2, MQ Series, JDBC, JiBX, XML, XSL, XSLT, JAXB, WSDL, SOAP, JMS, CVS, JNDI, JUnit 3.8.1, Microsoft SQL Server, Log4J 1.2.7, RUP, UNIX.

Confidential, CA

Sr Technical Associate

Responsibilities:

  • Interacted with the client, gathering requirements and identifying use cases.
  • Led a small team: assigned work, supported issue resolution, and helped with other development activities.
  • Used J2EE design patterns such as Service Locator for lookup, DAO for database operations and Singleton for unique copy of object.
  • Web user interface and retail customer user interface were designed and developed using JSP, Struts, Servlets, HTML, and JavaScript.
  • Worked with EJB session beans, JNDI, and design patterns to communicate with DAO components.
  • Used Stateless Session Bean for maintaining the transaction to minimize resource overhead.
  • Responsible for object-oriented design using Rational Rose and UML to generate the object model, sequence diagrams, and static diagrams for interfaces and classes in the system.
  • Enhanced and designed the application per the J2EE architecture.
  • Designed the business objects and user interface aspects with UML/Rational Rose.
  • Web Services - SOAP, WSDL, XML were used to distribute the RPCs and messages in different formats.
  • Used Hibernate for O-R mapping, persistence, and automatic primary key generation.
  • Used XML, XSLT, and XPath to distribute formats for viewing by users.
  • Deployed session enterprise Java beans using the IBM WebSphere application server and the WebSphere application developer (RAD) tool.
  • Worked on JMS to communicate messages asynchronously
  • Was involved in testing and maintenance of components.
  • Wrote code to communicate with different LDAP servers.
  • Worked on JMS - MDB to create pub/sub, point-to-point connections-topics, Queues, connections, messages.
  • Developed several complex SQL queries, PL/SQL stored procedures, triggers, and functions to manipulate data from the front end to the database.
  • SOAP was implemented for the retail customer interface.
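The Service Locator and Singleton patterns named in the bullets above can be rendered compactly. The original work was J2EE; this Python sketch is only an analogy, and the registered service and its name are invented for illustration.

```python
class ServiceLocator:
    """Central registry for looking up services by name (Service Locator pattern)."""
    _services = {}

    @classmethod
    def register(cls, name, service):
        cls._services[name] = service

    @classmethod
    def lookup(cls, name):
        return cls._services[name]

class ConfigSingleton:
    """Guarantees a single shared instance (Singleton pattern)."""
    _instance = None

    def __new__(cls):
        if cls._instance is None:  # create the unique copy only once
            cls._instance = super().__new__(cls)
        return cls._instance

ServiceLocator.register("dao", {"table": "customers"})  # hypothetical DAO stand-in
same = ConfigSingleton() is ConfigSingleton()           # True: one shared object
```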

Environment: EJB 1.2, JSP 1.1, XML, Servlets 2.5, JDBC 3.0, JSF, Java 1.5, Struts 2, Ajax, DB2, LDAP, single sign-on, WebLogic Application Server 9, Hibernate 3.0, Ant, SQL, PL/SQL, HTML 3, JavaScript 1.3, ClearCase, IBM WebSphere Application Server 5.1, WSAD/RAD, Unix, Windows NT.

Confidential

Sr Technical Consultant

Responsibilities:

  • Leading design and development team of Integration Framework, to integrate Open System Applications (part of e-Banking solutions) with the Mainframes.
  • Web user interface and retail customer user interface screens and web pages were designed and developed using JSP, Struts, Servlets, HTML, and JavaScript.
  • Used JDBC Oracle thin driver to access databases from Java.
  • Fine-tuning integration framework to make solution compliant with SLAs.
  • Measurement of Change Impact for the SRs raised subsequently.
  • Writing utilities to automate application deployment processes.

Environment: TIBCO BusinessWorks 5.1.3, J2EE, WebLogic Workshop, Struts, WebLogic Application Server 6.1, Windows 2000/XP, Solaris 2.8, ClearCase

Project: Ignite OEX BT Global Trading and Finance Services

Client: Confidential

Sr Technical Associate

Responsibilities:

  • Designed and developed the screens for the invoice and customer modules using FrontPage.
  • Developed java code for the user interface screens
  • Worked on JSF, JSP, Manage beans, form beans, validation rules, navigation rules.
  • Updated screens, test screens as per client modifications.
  • Worked on class diagrams and component diagrams, font styles, automatic code generation, and the required screens per requirements.
  • Extensively worked with JiBX to improve performance by binding XML data to Java objects and Java objects to XML data.
  • Developed Java code for value objects, DAO classes, database connections, and database functions for the customer and invoice modules.
  • Used the WebLogic 8.1 application server to deploy Java components.

Environment: EJB 2.0, JSP 1.2, XML 1.0, Servlets 2.3, JDBC 2.0, Java 2.0, X12 EDI, RosettaNet EDI, Oracle 9i, SQL, PL/SQL, HTML 3.0, JavaScript 1.4, JMS, WebLogic 8.1, Oracle 9i Application Server, WebLogic Workshop, Eclipse, Toad, Equity Derivatives, Web services-SOAP, XML-SAX, DOM, Windows 2000.
