Application Developer Resume
New York, NY
SUMMARY
- 8+ years of total IT experience, including 2+ years in all phases of Hadoop and HDFS development and 4+ years in the analysis, design, development, testing, and deployment of various software applications, with an emphasis on object-oriented programming.
- Excellent understanding of Hadoop architecture and its components, including HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
- Hands-on experience in installing, configuring, and using Hadoop ecosystem components such as MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, Pig, and Flume.
- Experience in managing and reviewing Hadoop log files.
- Experience in analyzing data using HiveQL, Pig Latin, HBase, and custom MapReduce programs in Java.
- Experience extending core Hive and Pig functionality by writing custom UDFs.
- Experience in data management and implementation of Big Data applications using Hadoop frameworks.
- Experience in designing, developing and implementing connectivity products that allow efficient exchange of data between our core database engine and the Hadoop ecosystem.
- Experience in importing and exporting data using Sqoop between HDFS and relational database systems.
- Hands-on experience in application development using Java, RDBMS, and Linux shell scripting.
- Strong experience as an applications developer in 3-tier architecture and client/server technologies using VB, C#, Java applications, and SQL.
- Experience in health care data services, direct marketing, stock market systems, financial data services, hospital information systems, and mailing data services using RDBMS and application software.
- Experience in UNIX Shell scripting.
- Experienced in database development, ETL, and reporting using SQL Server DTS, SSIS (SQL Server 2005/2008), SSRS, Crystal Reports XI, and SAP BO.
- Ability to contribute to all stages of the SDLC: client requirement analysis, prototyping, coding, testing, and documentation.
- Experience in Agile Engineering practices.
- Techno-functional responsibilities include interfacing with users, identifying functional and technical gaps, producing estimates, designing custom solutions, development, leading developers, producing documentation, and production support.
- Excellent interpersonal and communication skills; creative, research-minded, technically competent, and results-oriented, with strong problem-solving and leadership abilities.
TECHNICAL SKILLS
Big Data Technologies: Hadoop, HDFS, Hive, MapReduce, Pig, Sqoop, Flume, ZooKeeper, Oozie, Avro, HBase
Languages: VB, Java, C/C#, FoxPro, Linux shell script, SQL
Web Technologies: ASP, HTML, XML, JavaScript, JSON
IDE/Tools: Eclipse, Visual Studio 2012, VMware, Apache, VSS, TFS 2008, Visio
GUI: Visual Basic 6.0, Oracle, MS Office (Word, Excel, Outlook, PowerPoint, Access)
Browsers: Google Chrome, Mozilla Firefox, IE8
Reporting Tools: Crystal XI, SAP BO 4.1, Dashboard, InfoView
DB Languages: MS SQL 2005/2008/2012, MS Access, MySQL, Pervasive SQL
Operating Systems: Windows XP, 7, 8, Linux/UNIX
PROFESSIONAL EXPERIENCE
Confidential, Minneapolis, MN
Hadoop Developer
Responsibilities:
- Imported and exported data into HDFS using Hadoop shell commands.
- Moved data efficiently between clusters using DistCp (distributed copy).
- Used Sqoop to connect to DB2 and move the pivoted data into Hive tables or Avro files.
- Managed the Hive database, including data ingestion and indexing.
- Exported data from Avro files and indexed documents in SequenceFile or SerDe-based file formats.
- Hands-on experience writing custom UDFs as well as custom input and output formats.
- Migrated ETL processes from Oracle to Hive to simplify data manipulation.
- Developed Hive queries to process data for visualization.
- Configured and maintained different topologies in the Storm cluster and deployed them on a regular basis.
- Developed unit test cases and automated the scripts.
- Hands-on experience with Oozie workflows.
- Worked in an Agile environment, maintaining story points in the Scrum model.
Environment: Hadoop, Big Data, Hive, HBase, Sqoop, Oozie, HDFS, MapReduce, Java, UNIX, SQL.
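The Sqoop ingestion from DB2 and the DistCp transfers described above could be sketched as shell commands like the following; the host, database, table, and path names are illustrative placeholders, not details from the original project.

```shell
# Import a DB2 table into a Hive table (hypothetical connection details).
sqoop import \
  --connect jdbc:db2://db2host:50000/SALESDB \
  --username etl_user -P \
  --table TRANSACTIONS \
  --hive-import --hive-table sales.transactions

# Alternatively, land the same table on HDFS as Avro files.
sqoop import \
  --connect jdbc:db2://db2host:50000/SALESDB \
  --username etl_user -P \
  --table TRANSACTIONS \
  --as-avrodatafile \
  --target-dir /data/sales/transactions_avro

# Move data efficiently between clusters with DistCp.
hadoop distcp hdfs://cluster-a/data/sales hdfs://cluster-b/data/sales
```

Note that Sqoop's Hive import and Avro output are shown as two separate runs, since they target different destinations (a Hive-managed table versus raw Avro files on HDFS).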
Confidential, Farmington Hills, MI
Hadoop Developer
Responsibilities:
- Imported data from MySQL into HDFS and exported data from HDFS into MySQL using Sqoop.
- Configured Sqoop for Microsoft SQL Server.
- Exported data from HDFS into SQL Server and imported data from SQL Server into HDFS.
- Exported data from HDFS into MongoDB using Pig.
- Developed Pig Latin scripts to extract data from web server output files and load it into HDFS.
- Developed Pig UDFs to preprocess the data for analysis.
- Involved in loading data from Linux and UNIX file systems into HDFS.
- Supported the setup of the QA environment and updated configurations for implementing Pig scripts.
- Optimized Pig and Hive scripts that were already in use.
- Implemented external tables, dynamic partitions, and buckets using Hive.
- Involved in managing and reviewing Hadoop log files.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Responsible for managing data coming from different sources.
- Supported MapReduce programs running on the cluster.
- Analyzed large data sets to determine the optimal way to aggregate and report on them.
- Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers.
Environment: Hadoop 0.20, HDFS, Hive, Sqoop, Linux, Pig, Java (JDK 1.6), Eclipse, MS SQL Server 2005
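The Hive external-table, dynamic-partition, and bucketing work listed above can be sketched in HiveQL run from the shell; the table, column, and path names here are hypothetical, and the two `SET` statements are the standard switches Hive requires before a dynamic-partition insert.

```shell
# Define an external, bucketed Hive table and load it with dynamic partitions
# (illustrative schema and paths only).
hive -e "
  CREATE EXTERNAL TABLE IF NOT EXISTS web_logs (
    ip STRING, url STRING, status INT
  )
  PARTITIONED BY (log_date STRING)
  CLUSTERED BY (ip) INTO 16 BUCKETS
  STORED AS TEXTFILE
  LOCATION '/data/web_logs';

  SET hive.exec.dynamic.partition=true;
  SET hive.exec.dynamic.partition.mode=nonstrict;

  INSERT OVERWRITE TABLE web_logs PARTITION (log_date)
  SELECT ip, url, status, log_date FROM staging_logs;
"
```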
Confidential, New York, NY
Application Developer
Responsibilities:
- Application development in a three-tier environment: code development in VB6 for the enterprise application; bug fixing, enhancements, and front-end form design in VB6 and C#/.NET.
- Database development: created schemas, tables, views, stored procedures, and triggers in T-SQL.
- Designed enterprise reports using Crystal Reports XI; developed VB macros in Excel.
- SDLC: job requirement analysis and creation of the requirements specification document, technical solution, risk assessment, and test cases; module-level and unit testing, with a demo of the completed work after each sprint.
- C# programming and XML scripting in VS2008 for the code migration from VB6 to .NET; QA testing, including end-of-release regression testing of the applications on existing and latest versions of Windows and SQL Server.
Environment: SQL Server 2005/2008, Visual Basic 6.0, C# and XML scripting, Crystal Reports XI, TFS 2013, Agile methodology. Trained in SAP BO 4.1 to design reports using various BO tools.
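The T-SQL stored-procedure development mentioned above might look like the following sketch, executed with the `sqlcmd` utility; the server, database, and object names are made up for illustration.

```shell
# Create a simple T-SQL stored procedure via sqlcmd
# (-E uses Windows authentication; names are hypothetical).
sqlcmd -S SQLHOST -d EnterpriseDB -E -Q "
CREATE PROCEDURE dbo.usp_GetOrdersByCustomer
    @CustomerId INT
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderId, OrderDate, Total
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;
END
"
```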
Confidential, Lisle, IL
Software Developer
Responsibilities:
- Involved in the requirement analysis and design phases of the Software Development Life Cycle (SDLC).
- Analyzed business requirements and technical requirements.
- Involved in designing the front-end screens and produced low-level design documents for my module.
- Wrote complex SQL and PL/SQL queries for stored procedures.
- Generated unit test cases with the help of internal tools.
- Used Core Java for development.
- Used HTML and CSS for an enriched front end.
- Involved in defect tracking, analysis, and resolution of bugs in system testing.
- Involved in designing and creating database tables.
- Prepared project metrics for time, cost, schedule, and customer satisfaction (health of the project).
- Developed stored procedures, triggers, and views in SSMS; designed DTS and ETL operations in SSIS; designed reports using SSRS.
- As a member of the parallel database team, performed daily data loads using SQL scripts and a FoxPro application, running bulk-insert data maps and resolving key operations.
- Generated daily reconciliation summaries for clients and fixed problems with history data using complex T-SQL queries.
Environment: Java, XML, LINUX, SQL, Crystal Reports.
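The daily data loads and bulk-insert runs described above can be sketched as a `sqlcmd` invocation of T-SQL's `BULK INSERT`; the server name, table, and file path are hypothetical.

```shell
# Daily load of a delimited feed file into a staging table
# (illustrative names; FIRSTROW = 2 skips the header line).
sqlcmd -S SQLHOST -d HistoryDB -E -Q "
BULK INSERT dbo.DailyFeed
FROM 'C:\loads\daily_feed.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);
"
```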
Confidential, Burr Ridge, IL
Database Developer
Responsibilities:
- Developed stored procedures and UDFs in SSMS, developed DTS packages using SQL 2000 and SSIS, and generated reports for the data warehouse using reporting tools such as SSRS and Crystal XI.
- Handled day-to-day database operations such as backing up, restoring, and detaching databases, and designed database maintenance plans for user and system databases. Migrated DTS packages from SQL Server.
- Created and monitored ETL jobs in Business Objects Data Integrator Designer and Administrator for the data warehouse.
- Tested different data parsing, cleansing, and ETL tools for the data warehouse; developed user-defined functions to parse customer names.
- Provided support for a Core Media application built on a FoxPro database with Access 2000 back-end reporting.
- Developed FoxPro programs to purge historical data from the Core Media application to save processing time.
- Created database relationship diagrams using Microsoft Visio with multiple ODBC connections.
- Automated several daily processes, replacing manual operations with batch programs using Access and Excel macros with VB scripts.
- Created reports in Crystal Reports 10 against a Pervasive SQL back end.
- Developed an Auto Tap tool in VB6 using custom controls, adding web-browsing utilities, media controls, and a help-text browser (built with the Help Scribble tool) to the application.
- Tested the tool's communication with automobiles, pulling trouble codes that help detect and diagnose problems in the vehicle.
- Member of the team that developed an access control device using C/C++ on a Linux system; developed modules to create classes and instantiate them using timers for each session.
- Tracked and granted or denied badge access for specific locations and sessions, capturing attendee information and matching it against a configuration file.
Environment: Windows XP, SQL Server, Business Objects Data Integrator XI R1/R2, MS Access, VB6, and Crystal Reports 10.