Sr. Software Engineer Resume
Atlanta, GA
SUMMARY
- Over 7 years of experience in Java (J2EE) software development, relational databases, data warehouses, client-server applications, and Oracle and SQL Server database administration.
- Over 3 years of experience with tools in the Hadoop ecosystem, including Pig, Hive, Impala, HDFS, Flume, HBase, MapReduce, Sqoop, Oozie, ZooKeeper, and Apache Hue.
- Experience with distributed systems, large-scale non-relational data stores, RDBMS, Hadoop MapReduce systems, data modeling, database performance, and multi-terabyte warehouses and data marts.
- Used the Apache Mahout machine learning libraries to implement filtering, clustering, and classification, extending the capabilities of the MapReduce framework.
- Implemented ad-hoc Hive queries to satisfy immediate or urgent decision-making requirements (a minimal sketch of running such a query appears after this list).
- Good understanding of NoSQL databases and hands-on experience writing applications against NoSQL databases such as Cassandra and MongoDB.
- Hands-on experience with the entire life cycle of data warehouse design, and with designing, writing, and executing test plans.
- Hands-on experience building application interfaces with JavaScript and building controls for dashboard, standard, and ad-hoc reports.
- Experience architecting complete data models that combine multiple transaction systems, ETL tools, and relational and multidimensional data stores, using star and snowflake schema designs and reporting tools.
- Handled performance tuning, weekly backups, and monthly data consolidations, and automated monthly, quarterly, and annual report generation.
- Effective team player: self-motivated, quick to adapt to new technology, with strong problem-solving and analytical skills and excellent verbal, written, and interpersonal communication skills.
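A minimal sketch of the ad-hoc Hive query work described above, submitted to HiveServer2 over JDBC from Java; the host, credentials, table, and query are illustrative placeholders, not details from any actual engagement.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class AdHocHiveQuery {
    public static void main(String[] args) throws Exception {
        // Register the HiveServer2 JDBC driver.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hive-server:10000/default", "analyst", "");
             Statement stmt = conn.createStatement();
             // Hypothetical ad-hoc aggregation for a decision-making request.
             ResultSet rs = stmt.executeQuery(
                     "SELECT region, COUNT(*) AS orders FROM sales GROUP BY region")) {
            while (rs.next()) {
                System.out.println(rs.getString("region") + "\t" + rs.getLong("orders"));
            }
        }
    }
}
```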
TECHNICAL SKILLS
RDBMS: Oracle 10g/9i/8i/7.0, DB2, Teradata, MS SQL Server, MS Access.
Operating systems: Windows 98/2000/XP, MS-DOS 6.22, UNIX (Solaris), Linux, and AIX 5L.
Languages: C, C++, Java, Visual Basic, C#, JavaScript, VBScript, SAP ABAP, SQL, PL/SQL, SQL*Plus, JSP, Struts, Hibernate, AJAX, ASP, XML, HTML, JDBC, RMI, multithreading, UNIX shell scripting.
Hadoop/Big Data: Pig, Hive, Impala, HDFS, Flume, HBase, MapReduce, Sqoop, Oozie, ZooKeeper, and Apache Hue.
PROFESSIONAL EXPERIENCE
Confidential, Johnston, RI
Hadoop Engineer/Administrator
Responsibilities:
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing (a minimal mapper sketch appears after this list).
- Experienced in installing, configuring and using Hadoop Ecosystem components.
- Experienced in Importing and exporting data into HDFS and Hive using Sqoop.
- Knowledge in performance troubleshooting and tuning Hadoop clusters.
- Participated in development/implementation of Cloudera Hadoop environment.
- Implemented partitioning, dynamic partitions, and bucketing in Hive for efficient data access.
- Experienced in running queries using Impala and used BI tools to run ad-hoc queries directly on Hadoop.
- Installed and configured Hive, wrote Hive UDFs, and used JUnit to unit-test MapReduce code.
- Experienced in working with various kinds of data sources such as Teradata and Oracle.
- Loaded files from Teradata into HDFS, and loaded data from HDFS into Hive and Impala.
- Experienced in using ZooKeeper and Oozie operational services to coordinate the cluster and schedule workflows.
- Used Piggybank, a repository of user-defined functions (UDFs) for Pig Latin.
- Experienced in managing and reviewing Hadoop log files.
- Worked on cluster installation, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, and slot configuration.
- Supported MapReduce programs running on the cluster and loaded data from the UNIX file system into HDFS.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
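A minimal sketch of a map-only data-cleaning job of the kind described in the first bullet above: it drops malformed pipe-delimited records and trims each field before writing the record back out. The delimiter and expected field count are assumptions for illustration, not the actual feed layout.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class CleanRecordsJob {

    public static class CleanMapper extends Mapper<Object, Text, Text, NullWritable> {
        private static final int EXPECTED_FIELDS = 8; // assumed record layout

        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\\|", -1);
            if (fields.length != EXPECTED_FIELDS) {
                return; // skip malformed records
            }
            StringBuilder cleaned = new StringBuilder();
            for (int i = 0; i < fields.length; i++) {
                if (i > 0) cleaned.append('|');
                cleaned.append(fields[i].trim());
            }
            context.write(new Text(cleaned.toString()), NullWritable.get());
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "clean-records");
        job.setJarByClass(CleanRecordsJob.class);
        job.setMapperClass(CleanMapper.class);
        job.setNumReduceTasks(0); // map-only cleaning pass
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(NullWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```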
Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Impala, Java, SQL, Tableau, ZooKeeper, Sqoop, Teradata, CentOS, Solr.
Confidential - Atlanta, GA
Hadoop Administrator
Responsibilities:
- Installed and configured Apache Hadoop to test the maintenance of log files in the Hadoop cluster.
- Installed and configured Hive, Pig, Sqoop, Flume and Oozie on the Hadoop cluster.
- Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
- Setup and benchmarked Hadoop/HBase clusters for internal use.
- Developed Java MapReduce programs for the analysis of sample log files stored in the cluster.
- Developed simple to complex MapReduce jobs using Hive and Pig.
- Developed MapReduce programs for data analysis and data cleaning.
- Developed Pig Latin scripts for the analysis of semi-structured data.
- Developed industry-specific UDFs (user-defined functions); a minimal Hive UDF sketch appears after this list.
- Created Hive tables, loaded data into them, and wrote Hive UDFs.
- Used Sqoop to import data into HDFS and Hive from other data systems.
- Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
- Migrated ETL processes from RDBMS to Hive to test easier data manipulation.
- Developed Hive queries to process the data for visualization.
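A minimal sketch of a Hive UDF of the sort described above; the class name and normalization logic are illustrative assumptions, not a specific business rule from this project.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Normalizes a free-text code: trims whitespace and upper-cases it.
public class NormalizeCodeUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null; // preserve SQL NULL semantics
        }
        return new Text(input.toString().trim().toUpperCase());
    }
}
```

Once packaged in a JAR, a UDF like this would be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being called from HiveQL.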
Environment: Apache Hadoop, HDFS, Cloudera Manager, CentOS, Java, MapReduce, Eclipse, Hive, Pig, Sqoop, Oozie, and SQL.
Confidential
Sr. Software Engineer
Responsibilities:
- Responsible for coding, testing and documenting various packages, procedures, and functions for libraries.
- Designed use cases for the Application as per the business requirements.
- Participated in requirement-gathering and framework-implementation sessions following an Agile TDD methodology.
- Worked closely with the business/user teams in translating technical requirements into application code.
- Developed the web layer using the Struts framework to manage the project in the MVC pattern.
- Integrated Struts Action classes in the presentation tier.
- Used the Struts Tiles framework to design the application layout.
- Built client pages using HTML, CSS, JSP, and JavaScript/jQuery.
- Extensively used core Java features such as exceptions and collections.
- Used JSTL, developed the required Tiles and Tile definitions for templating, and defined the configuration in the Struts configuration file (struts-config.xml).
- Developed helper classes, delegate classes, and value objects to access the business tier and transfer information from the business layer to the presentation layer.
- Developed the business layer using EJB stateless session beans and deployed it on the WebLogic application server.
- Used the DAO and DTO patterns to persist data to the database (a minimal DAO sketch appears after this list).
- Used Hibernate as the persistence framework for the DAO layer to access the Oracle 10g database.
- Produced SOAP web services using the Metro JAX-WS reference implementation for the business application layer (see the endpoint sketch after this list).
- Defined XML documents for the input and output of the Web services, and created XSD schema documents for XML validation.
- Used Maven as the build framework and Jenkins as the continuous-build system.
- Used Eclipse for development and deployed the application to the WebLogic application server.
- Used SVN for version control.
- Actively involved in code reviews and bug fixing.
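Two minimal sketches of the patterns named above. First, a Hibernate-backed DAO; the Account entity and the SessionFactory wiring are illustrative assumptions, not the project's actual domain model.

```java
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

// Hypothetical mapped entity (Hibernate mapping omitted for brevity).
class Account {
    private Long id;
    private String owner;
    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }
    public String getOwner() { return owner; }
    public void setOwner(String owner) { this.owner = owner; }
}

public class AccountDao {
    private final SessionFactory sessionFactory;

    public AccountDao(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    // Persist a new account and return its generated identifier.
    public Long save(Account account) {
        Session session = sessionFactory.openSession();
        Transaction tx = session.beginTransaction();
        try {
            Long id = (Long) session.save(account);
            tx.commit();
            return id;
        } catch (RuntimeException e) {
            tx.rollback();
            throw e;
        } finally {
            session.close();
        }
    }

    // Load an account by primary key, or null if absent.
    public Account findById(Long id) {
        Session session = sessionFactory.openSession();
        try {
            return (Account) session.get(Account.class, id);
        } finally {
            session.close();
        }
    }
}
```

Second, a JAX-WS-style SOAP endpoint; the service name, operation, and publish URL are placeholders.

```java
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

@WebService
public class QuoteService {
    @WebMethod
    public double getQuote(String symbol) {
        // Placeholder logic; a real service would delegate to the business layer.
        return "ACME".equals(symbol) ? 42.0 : 0.0;
    }

    public static void main(String[] args) {
        // Publish the endpoint on a local test URL.
        Endpoint.publish("http://localhost:8080/quote", new QuoteService());
    }
}
```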
Environment: Java, JSP, JavaScript, AJAX, jQuery, Ant, Struts, Spring, Hibernate, RSA, WebSphere Application Server, DB2, XML, LDAP, AccuRev, JUnit, and Windows 2000.
Confidential
Software Analyst
Responsibilities:
- Developed profitability reports detailing unit cost, price and volume trends, and expense variances over prior periods.
- Developed new dashboards and ad-hoc reports for different business teams using Interactive Reporting (Brio) and Java-based dashboard reports.
- Assisted with integrating the Brio reporting process and streamlining the data collection and loading processes.
- Responsible for gathering business requirements from numerous business units.
- Created custom calendars using the Job Utilities menu to support the events and jobs scheduled to run the IR reports.
- Applied various Filters for efficient and accurate retrieval of required data using Interactive Reporting Studio.
- Upgraded and migrated all applications and databases from older versions.
Environment: Java, JSP, JavaScript, AJAX, jQuery, Ant, Struts, Spring, Hibernate, RSA, WebSphere Application Server, DB2, XML, LDAP, AccuRev, JUnit, and Windows 2000.