Software Engineer / Architect Resume
Atlanta, GA
SUMMARY:
- More than 12 years of professional experience in the analysis, design, and development of high-volume, highly scalable, distributed software systems.
- Cloudera Certified Hadoop Developer (CCD-410, passed in 2013) with 4+ years of professional experience in Big Data technologies and expertise in HDFS/Hadoop architecture, MapReduce algorithms, HBase, and other Hadoop ecosystem tools such as Hive, Pig, and Sqoop.
- Experience using Apache Spark and Scala.
- Worked on proof-of-concept models using NoSQL databases such as Redis, Cassandra, and MongoDB.
- Worked with MySQL, MSSQL, and Oracle PL/SQL, including triggers and stored procedures.
- Sun Certified Java Developer (passed in 2008) with more than 10 years of professional experience developing Java backend software using traditional databases such as MySQL, Oracle, and MSSQL.
- Extensive experience in the development of reporting tools and various charting systems.
- Hands-on experience with Unix shell scripting.
- Experience developing highly distributed applications using Java, J2EE, Servlets, JSP, Spring, ActiveMQ JMS, JDBC, Apache Tomcat, JavaScript, HTML, XML, XSL, and SQL on Unix operating systems.
- End-to-end experience in the development of real-time, high-volume software systems.
- Experience working in a fast-paced, flexible environment, and taking the initiative to learn new tools quickly.
- Good experience with MVC architecture; proficient in OOP concepts.
- Expertise in preparing test cases, documenting, and performing unit and integration testing.
- Expertise in cross-platform (PC/Mac, desktop, laptop, tablet) and cross-browser (IE, Chrome, Firefox, Safari) development.
- Skilled in problem solving and troubleshooting, strong organizational and interpersonal skills.
- Professional and cooperative attitude, with an adaptable approach to problem analysis and solution definition.
- Good team player with strong analytical and communication skills.
TECHNICAL SKILLS:
Programming Languages: Java, C, C++, Scala, Perl, Bash
BigData Technologies: Hadoop, HDFS, MapReduce, Hive, Pig, HBase, Impala, Sqoop, Cascading, ZooKeeper, Spark, Spark SQL
NoSQL: HBase, Redis, Cassandra
Development Tools: Eclipse, IntelliJ IDEA, StreamBase Studio
Web Development: JSP, Struts, Spring, REST (Jersey, JSON), jQuery, AngularJS, Apache Thrift
Database Tools: MySQL, Oracle, MSSQL
Design & Analysis: Design patterns, UML modelling, MVC
Software Tools: Jenkins, Maven, Nexus
Frameworks: Spring MVC, Struts
PROFESSIONAL EXPERIENCE:
Software Engineer/ Architect
Confidential, Atlanta, GA
Environment: Apache Hadoop, HDFS, HBase, Java, ActiveMQ JMS, FIX, Apache Thrift, Spring, MySQL, Maven, Jenkins, Nexus
Responsibilities:
- Develop and support different types of fraud-analytics models for creating alerts, and support the configuration of various alert parameters/thresholds in the system.
- Architected the feed module, which handles large volumes of market data from Interactive Data Corp. (IDC) and other market data vendors.
- Designed and developed market data and news data APIs for use by analytics system to alert suspected fraud transactions.
- Bulk-loaded FX data from IDC-defined format files into HBase using MapReduce jobs.
- Designed and developed various ETL jobs to import data into HBase.
- Developed a fully automated reporting system in which cron jobs run shell scripts that generate various regulatory and client-specific reports by ingesting data stored in the Big Data platform.
- Worked in an Agile development environment with continuous deployment; releases were made using Jenkins and Nexus.
- Wrote Bash scripts to support easy deployment of data and the platform on QA machines.
- Implemented proof-of-concept models in Redis, Cassandra, and other major NoSQL databases to analyze the latency of the Big Data platform.
- Developed and deployed Apache Spark code written in Scala to detect Wash Trade fraud, and presented it at the Confidential Unlimited R&D summit.
- Developed tools for one click automation of QA regression tests.
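The HBase bulk-load work above can be illustrated with a small sketch of the record-to-row-key step. The pipe-delimited line layout (symbol|timestamp|bid|ask), the `FxQuoteParser` class, and the reversed-timestamp row-key scheme are all hypothetical placeholders for illustration, not the actual IDC format or project code:

```java
import java.util.Locale;

// Sketch: parse a hypothetical pipe-delimited FX quote line into an
// HBase-style row key (symbol + reversed timestamp, so newer quotes
// sort first in a scan). The input format is assumed, not IDC's real one.
public class FxQuoteParser {
    // Ceiling used to reverse 13-digit epoch-millisecond timestamps
    public static final long MAX_TS = 9999999999999L;

    // Input assumed as "EURUSD|1369000000000|1.2934|1.2936"
    public static String toRowKey(String line) {
        String[] f = line.split("\\|");
        if (f.length != 4) {
            throw new IllegalArgumentException("expected 4 fields, got " + f.length);
        }
        String symbol = f[0].toUpperCase(Locale.ROOT);
        long ts = Long.parseLong(f[1]);
        // Zero-pad so keys of equal symbols sort lexicographically by time
        return String.format("%s#%013d", symbol, MAX_TS - ts);
    }
}
```

In a real bulk-load job, a mapper would emit such keys with `Put` objects (or `HFileOutputFormat2` for direct HFile writes); the parsing shown here is the only part that is easy to verify in isolation.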
Technical Architect
Confidential
Environment: Apache Hadoop, HDFS, Hive, MapReduce, Cassandra, Java, JavaScript, Apache Tomcat
Responsibilities:
- Design and develop a data aggregator that handles market data from different exchanges. Import all instrument market data into HDFS and Hive using Sqoop.
- Provide MapReduce jobs that aggregate all instrument quote data (Big Data) into hourly and daily data at the end of each day.
- Provide support for various stock technical-analysis indicators and rule-based alerting on indicators such as standard deviation, Bollinger Bands, RSI, and moving-average crossings.
- Organize data for different types of instruments from various data providers. Provide administration features to provision new instruments, update instrument information, and delete bad data.
- Responsible for analyzing and cleaning bad data by running Hive queries and Pig scripts on data.
- Follow Agile methodology practices with test-driven development.
- Ensure best performance of real time queries and streaming of data on the website.
- Provide REST interfaces for alert management (alert creation/deletion, etc.). The alert-management REST interfaces are used by PHP code to generate HTML.
- Design and develop daily usage-analysis reports via daily cron jobs that run Pig scripts, e.g., number of market data quotes per exchange, top user alerting patterns, etc.
- Maintain system integrity of all Hadoop-related components.
- Write unit tests, integration tests and regression tests.
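One of the indicators named above, Bollinger Bands, can be sketched as follows. The window size and band width used in the test are common textbook defaults (20-period, 2 standard deviations would be typical), not values taken from the project:

```java
import java.util.Arrays;

// Sketch: Bollinger Bands over the most recent `period` closing prices,
// i.e. the simple moving average (SMA) plus/minus k standard deviations.
public class Bollinger {
    // Returns { lowerBand, sma, upperBand }
    public static double[] bands(double[] closes, int period, double k) {
        if (closes.length < period) {
            throw new IllegalArgumentException("need at least " + period + " closes");
        }
        double[] win = Arrays.copyOfRange(closes, closes.length - period, closes.length);
        double sma = Arrays.stream(win).average().orElse(0.0);
        // Population variance of the window around the SMA
        double var = Arrays.stream(win).map(c -> (c - sma) * (c - sma)).average().orElse(0.0);
        double sd = Math.sqrt(var);
        return new double[] { sma - k * sd, sma, sma + k * sd };
    }
}
```

A rule-based alert of the kind described would then fire when the latest close crosses above the upper band or below the lower one.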
Senior Software Engineer
Confidential
Environment: Java, JSP/Struts, Apache Tomcat, MSSQL
Responsibilities:
- Provide web UI support for group management and modify the EPS application to provision groups.
- Provide web UI support for provisioning non-AT&T numbers and develop a separate routing module in the EPS application to support them.
- Design a system to manage groups in core EPS when groups are added/deleted/modified from the web UI.
- Modify various telecom protocol servers (WCTP, SNPP, etc.) to support groups whose members may not be standard mobile numbers.
- Add support for resolving and querying a group, and for sending messages to all active members of the group.
- Design and develop usage and billing reports using stored procedures and queries run on MSSQL Server.
- Develop a cross-carrier gateway server in core Java and route messages for all non-AT&T numbers through this gateway.
Senior Software Engineer
Confidential
Environment: C/C++
Responsibilities:
- Read/decode Unicode SMS from a GSM modem, encode it in Base64 format, and send it back to the SITA ground network.
- Write a testing framework to test the throughput and accuracy of the system.
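The encode step above can be sketched as a round-trippable pair of helpers. This is written in Java for consistency with the rest of this resume rather than the project's C/C++, and the `SmsEncoder` class name is illustrative only:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Sketch: Base64-encode a decoded Unicode SMS payload (as UTF-8 bytes),
// the transformation applied before relaying the message onward.
public class SmsEncoder {
    public static String toBase64(String decodedSms) {
        byte[] utf8 = decodedSms.getBytes(StandardCharsets.UTF_8);
        return Base64.getEncoder().encodeToString(utf8);
    }

    // Inverse, useful for verifying round-trip accuracy in tests
    public static String fromBase64(String b64) {
        return new String(Base64.getDecoder().decode(b64), StandardCharsets.UTF_8);
    }
}
```

A throughput/accuracy test harness like the one described would feed known payloads through the modem path and assert that decode-then-encode round-trips them losslessly.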
Software Engineer
Confidential
Environment: Java, JSP/Struts, JGraph, Tomcat, JBoss, MSSQL
Responsibilities:
- Analyze various types of questions used for collecting feedback
- Develop a dynamic question maker based on the requirements
- Generate graphs for rated questions and a summary of the survey
- Design a system to manage roles (HR, Employee, Administrator, etc.)
Associate Software Engineer
Confidential
Environment: MicroStrategy (BI reporting tool), Erwin Data Modeler
Responsibilities:
- Analyze whether the existing data model fits MicroStrategy's CAM data model
- Review new report requirements and suggest changes to the data model