
BigData AML Analyst Resume


Tampa, FL

SUMMARY:

  • Accomplished BigData AML Analyst, Data Scientist, and Hadoop Administrator with expertise in Project Management, Hadoop Cluster Setup, QA, System Integration, and Data Migration.
  • 10.2 years of experience in BigData AML monitoring, project management, Hadoop administration, SDET, testing, and support. Currently working as a BigData Hadoop Consultant at Confidential Inc., Tampa, FL.
  • Strong communication skills and executive presence, with extensive leadership experience.
  • Key Competencies:
  • Hands-on experience in installation, configuration, upgrades, patches, support, and management of Hadoop clusters using Apache and Cloudera distributions.
  • Hadoop cluster capacity planning, performance tuning, monitoring, and troubleshooting.
  • Adding and removing nodes in an existing Hadoop cluster.
  • Backup configuration and recovery from a NameNode failure.
  • Commissioning and decommissioning nodes on a running Hadoop cluster.
  • Installation of various Hadoop ecosystem components and Hadoop daemons.
  • Installation and configuration of Sqoop.
  • Strong command of backup, recovery, and disaster-recovery procedures, including backup and recovery strategies for both offline and online backups.
  • Benchmarked Hadoop/HBase cluster file systems under various batch jobs and workloads.
  • Prepared Hadoop clusters for development teams working on POCs.
  • Experience with minor and major upgrades of Hadoop and the Hadoop ecosystem.
  • Testing: unit testing, system integration testing, and user acceptance testing.
  • Test management tool: ALM Quality Center 11.0.
  • Expertise in Linux (Ubuntu) and UNIX commands.
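
As an illustrative sketch of the node-decommissioning workflow listed above (the exclude-file path and hostname are placeholders, not taken from any actual cluster), a DataNode is typically retired like this:

```shell
# hdfs-site.xml must already point dfs.hosts.exclude at an exclude file, e.g.:
#   <property>
#     <name>dfs.hosts.exclude</name>
#     <value>/etc/hadoop/conf/dfs.exclude</value>
#   </property>

# 1. Add the DataNode hostname to the exclude file
echo "datanode07.example.com" >> /etc/hadoop/conf/dfs.exclude

# 2. Tell the NameNode to re-read the include/exclude lists
hdfs dfsadmin -refreshNodes

# 3. Watch the node move from "Decommissioning" to "Decommissioned"
hdfs dfsadmin -report
```

Commissioning is the reverse: remove the host from the exclude file and run `-refreshNodes` again.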

TECHNICAL SKILLS:

Languages / APIs: Java, PL/SQL, UNIX Shell Scripting

Web / Enterprise: Struts, Servlets, AJAX, JSP, Web Services, JDBC, JavaScript, CSS, XML, XSLT, HTML

IDE: HP Quality Center, VGen, Eclipse, JDeveloper

Application Servers: JBoss, Tomcat

RDBMS: Oracle 9i/10g, MS-SQL Server 2000, DB2, MySQL

Operating Systems: CentOS, Windows NT/2000/XP, Ubuntu.

Configuration Management: SVN, CVS, Visual SourceSafe, Team Forge, Remedy

PROFESSIONAL EXPERIENCE:

Confidential, Tampa, FL

BigData AML Analyst

Responsibilities:

  • Responsible for the architecture, installation, and administration of CDH 4.5 and Cloudera Manager 4.8.
  • Set up Hadoop (YARN/MapReduce) clusters on platforms such as RHEL 5.6 and CentOS 6.4.
  • Configured High-Availability (HA) architecture for the NameNode and ResourceManager.
  • Experienced in commissioning and decommissioning nodes on a running cluster.
  • Monitored the Hadoop cluster using tools such as Ganglia and Cloudera Manager.
  • Installed, configured, and integrated Hadoop ecosystem components (Hive, Sqoop, Flume, Pig, ZooKeeper, Oozie).
  • Planned and managed HDFS storage capacity; advised the team on best practices and optimal processes.
  • Implemented backup and recovery procedures for NameNode failure using NFS.
  • Imported and exported data between RDBMS and HDFS/Hive using Sqoop.
  • Collected log data from web servers and ingested it into HDFS using Flume.
  • Worked on standards and proofs of concept in support of the CDH4 implementation using AWS cloud infrastructure.
  • Knowledge of Kerberos, Ranger, Knox, and Puppet.
  • Ingested data from Oracle into Hive across different data centers (NAM, LATAM, APAC, EMEA).
  • Worked with the Legal/Compliance clearance team to obtain approvals from the DPO, BISO, and DPAT.
  • Prepared RunBooks for data ingestion.
  • Gathered requirements from different teams.
  • Monitored the Hadoop cluster in Hue.
  • Analyzed requirements for new enhancements.
  • Coordinated between onsite and offshore teams.

Environment: CDH 5.4, Cloudera Manager, Hive, Oracle, Sqoop, Platfora, Datameer, Hue, Service Now, SVN.
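
The Oracle-to-Hive ingestion described above can be sketched with a Sqoop import; the connection string, table name, and credential file below are hypothetical placeholders for illustration only:

```shell
# Import an Oracle table into a Hive table via Sqoop
# (host, service, schema/table, and password file are placeholders)
sqoop import \
  --connect jdbc:oracle:thin:@//oracledb.example.com:1521/ORCL \
  --username aml_reader \
  --password-file /user/etl/oracle.pwd \
  --table AML_ALERTS \
  --hive-import \
  --hive-table aml.alerts \
  --num-mappers 4
```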

Confidential, Boston, MA

Functional & Technical Onsite Lead for QA and Hadoop Administration

Responsibilities:

  • Responsible for cluster maintenance, monitoring, commissioning and decommissioning of DataNodes, troubleshooting, and managing and reviewing data backups and log files.
  • Day-to-day responsibilities included resolving developer issues, deploying code between environments, granting access to new users, providing quick fixes to reduce impact, documenting them, and preventing future issues.
  • Experienced in adding and installing new components.
  • Architecture design and implementation of deployment, configuration management, backup, and disaster recovery systems and procedures.
  • Built a local Hadoop cluster with multiple nodes (NameNode, DataNode, JobTracker, Hive, Cloudera Manager, etc.).
  • Installation, configuration, performance tuning, and upgrades of the Hadoop environment.
  • Configuration management and project management.
  • Requirement analysis for new enhancements.
  • Test strategy and test estimation.
  • Onsite-offshore coordination.
  • Performance testing.

Environment: HP Quality Center 11.00, CentOS, Ubuntu, hadoop-0.20.205, hadoop-2.0.4-alpha, jdk7, Oracle VM VirtualBox, VMware, Putty/SSH client, bash script, Cloudera Manager, XML, Oracle, SharePoint, Unix, VGen.
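
One common way to implement the NameNode metadata backup and disaster-recovery setup described above is to list an NFS mount as a second metadata directory in `hdfs-site.xml`, so a copy of the fsimage/edits survives a NameNode disk failure. The paths here are illustrative placeholders:

```xml
<!-- hdfs-site.xml: write NameNode metadata to a local disk AND an NFS mount.
     Both directories receive identical fsimage/edits data. -->
<property>
  <name>dfs.namenode.name.dir</name>
  <value>file:///data/dfs/nn,file:///mnt/nfs/dfs/nn</value>
</property>
```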

Confidential, Memphis, TN

Functional - Technical Onsite Lead

Responsibilities:

  • Installation of various Hadoop ecosystem components and Hadoop daemons.
  • Installation and configuration of Sqoop and Flume.
  • Managed and reviewed Hadoop log files as part of administration for troubleshooting; communicated and escalated issues appropriately.
  • As an administrator, followed standard backup policies to ensure high availability of the cluster.
  • Commissioned and decommissioned nodes on a running Hadoop cluster.
  • Experience in benchmarking and in backup and disaster recovery of NameNode metadata and important sensitive data residing on the cluster.
  • Worked on a pseudo-distributed Hadoop cluster.
  • Set up a Hadoop cluster from scratch, installed Hive, and configured Ganglia.
  • Analyzed new requirements and conducted feasibility studies.
  • Requirement analysis for new enhancements.
  • Design and development of the product using Java technology.
  • Application development and unit testing.
  • System integration and user acceptance testing.
  • Assessed product serviceability and diagnostic functionality and made change recommendations to improve overall product quality.
  • Supported bug fixes and issues reported in the production environment in real time.
  • Managed a large offshore development and testing team.
  • Performed requirements analysis.

Environment: Java, XML, Oracle, Eclipse Indigo IDE, Oracle SQL Developer, Oracle 10g, Teradata, Team Forge, Unix, Batch jobs, Mainframe, Hive, HBASE, Sqoop, Flume, Talend.
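
The Flume configuration work mentioned above typically looks like the following minimal agent definition; the agent name, log path, and HDFS sink directory are assumptions for illustration, not taken from this project:

```properties
# Hypothetical single-agent Flume config: tail web server logs into HDFS.
agent1.sources = weblog
agent1.channels = mem
agent1.sinks = hdfs-out

# Source: follow the access log as it grows
agent1.sources.weblog.type = exec
agent1.sources.weblog.command = tail -F /var/log/httpd/access_log
agent1.sources.weblog.channels = mem

# Channel: in-memory buffer between source and sink
agent1.channels.mem.type = memory
agent1.channels.mem.capacity = 10000

# Sink: write events into date-partitioned HDFS directories
agent1.sinks.hdfs-out.type = hdfs
agent1.sinks.hdfs-out.channel = mem
agent1.sinks.hdfs-out.hdfs.path = /flume/weblogs/%Y-%m-%d
agent1.sinks.hdfs-out.hdfs.fileType = DataStream
```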

Confidential

Technical Lead

Responsibilities:

  • Standardized design and layout to the extent possible. Built a library of the components required for reports, reused across the project.
  • Modified the product per client requirements.
  • Provided HTS training to team members.

Environment: Java, XML, Oracle, Eclipse Indigo IDE.

Confidential, Canton, MA

Onsite Coordinator

Responsibilities:

  • Incident Diagnosis, Root Cause Analysis, Preventive Maintenance for Confidential and PIM Application.
  • Upgrades, Bug Fixes, Minor Enhancements, Application Performance Tuning for Confidential and PIM application.
  • Worked closely with business users and coordinated on technical issues.
  • Prepared test cases.

Environment: Java/J2EE, JSP, SQL Server, XML, Oracle.

Confidential, Westborough, MA

Technical Lead

Responsibilities:

  • Responsible for knowledge transfer (KT) from SMEs.
  • Analyzed the code and database.
  • Prepared the SMTD.
  • Provided application support.

Environment: Java/J2EE, JSP, XML, SQL Server 2000, DB2, MKS, Harvest.

Confidential, Nashville, TN

Java/J2EE Developer & Testing

Responsibilities:

  • Responsible for the design and development of the core architecture for vendor applications such as GE CV PACS and OB Link to integrate with the portal environment.
  • Played a major role in understanding the healthcare application Confidential and the Confidential t Keeper portal.
  • Developed and integrated authentication and study-level contexts of clinical applications within the Confidential t Keeper portal.
  • Guided various divisions on the clinical application setup required for the portal integration work.
  • Implemented SSO for different divisions using Sentillion.

Environment: AJAX, J2EE, JSP, XML, CSS, Confidential 6.0, Sentillion, Oracle.

Confidential

Java/J2EE Developer

Responsibilities:

  • Customized and enhanced the standard UCES application per client requirements; changed UCES screens to match the look and feel of the Confidential, Cambridge, and Sudbury portals.
  • Developed new screens and integrated them into the existing UCES application.
  • Used a single-instance, multiple-client methodology to serve the UCES portal for Confidential, Sudbury, and Codac on a single server.
  • For database connectivity, used a JCo connection as the intermediary between SAP function modules and Java.
  • Displayed, via RFC, PDF documents stored on separate SAP and Oracle servers in the UCES portal.
  • Modified and developed UCES screens (Current Balance, Paid Bills, Address Data, Payment Option, Meter Reading Entry, Budget Billing Plans, etc.).

Environment: AJAX, J2EE, JSP, Java Advanced Imaging (JAI) API, Java Beans, Java Servlets, SAP Netweaver Developer Studio, HTML, Java, XHTML, XML

Confidential

Java/J2EE Developer

Responsibilities:

  • Developed, supported, and maintained the entire application.

Environment: Corba Services, Teradata, Oracle Lite (device), Oracle, SQL Server, Struts and Core Java.

Confidential

Java Developer

Responsibilities:

  • Developed the TeaTime Module.

Environment: J2EE framework with Oracle as the back end, using Servlets, JSP, and CSS.

Confidential

Java Developer

Responsibilities:

  • Coded the PC-based application following the MVC (Struts) architecture; created web services, performed XML parsing, and generated reports (iReport).

Environment: J2EE (Struts) framework with Oracle as the back end; MVC (Struts) architecture, XML parsing, report generation (iReport).

Confidential

Java Developer

Responsibilities:

  • Developed the ERP management application following the MVC (Struts) architecture.
  • Maintained DTO and DAO database layers for transparency.
  • Used connection pooling for database connectivity.
  • Configured the JBoss application server.

Environment: Struts, Java-J2EE, JSP-Servlets, XML-XHTML-CSS.
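
The connection pooling and JBoss configuration mentioned above is commonly done through a `*-ds.xml` datasource descriptor deployed to the server; the JNDI name, connection URL, and credentials below are placeholders for illustration:

```xml
<!-- Hypothetical JBoss datasource (e.g. erp-ds.xml) with a bounded
     connection pool; values are not taken from this project. -->
<datasources>
  <local-tx-datasource>
    <jndi-name>ErpDS</jndi-name>
    <connection-url>jdbc:oracle:thin:@dbhost:1521:ERP</connection-url>
    <driver-class>oracle.jdbc.OracleDriver</driver-class>
    <user-name>erp_app</user-name>
    <password>changeit</password>
    <!-- Pool bounds: idle connections kept / hard ceiling -->
    <min-pool-size>5</min-pool-size>
    <max-pool-size>20</max-pool-size>
  </local-tx-datasource>
</datasources>
```

Application code then looks up `java:/ErpDS` via JNDI instead of opening raw JDBC connections.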
