Software Engineer Resume
San Francisco, CA
SUMMARY:
- 8+ years of professional experience in the design, development, implementation, deployment, and production support of various intranet/internet applications.
- 5+ years of experience in advanced SQL, PL/SQL, ETL, data warehousing (DW), Apache Kafka, Sqoop, and database development.
- 4+ years of experience in designing and developing RESTful web services using Java/J2EE, Spring, Hibernate, and JSON.
- Expertise in developing applications in the financial domain using enterprise technologies: Core Java 1.8, JEE, Servlets 2.2/2.3, JSP 2.0, Struts 2.0, Hibernate 3.0, Spring IoC, Spring MVC, Spring Boot, JMS, XML, JDBC 2.0, JNDI, JAXP, JAXB, WebLogic, WebSphere, and Tomcat.
- Experience in web services using XML, HTML, SOAP, and REST APIs.
- Solid background in object-oriented analysis and design, and in developing and implementing client-server, web, and enterprise applications using n-tier architecture.
- Hands-on experience in data analysis using R and MapReduce, and in deploying applications on Spark.
- Passionate about data and data analysis, with expertise in Apache Kafka, Amazon Redshift, MySQL, and SQL Server database design and development.
- Extensive experience in J2EE architecture; developed server-side applications using Java, JSP, Servlets, Spring, web services, and Hibernate.
- Designed and developed a data pipeline to perform batch ETL from Oracle ERP to HDFS.
- Set up scalable data pipelines using Kafka messaging and pub/sub methodologies, with data models that support ingestion of real-time data streams (see the producer sketch after this summary).
- Designed and developed custom Oracle APIs to sync data from SAP to various Oracle Apps modules, including SCM and Finance.
- Optimized code as part of performance tuning, achieving up to 8x speedups. Working knowledge of MongoDB.
- Development methodologies: Waterfall, Test-Driven Development, Agile, and Scrum.
- Testing: supported application testing using JUnit, including system testing, boundary testing, regression testing, and UAT.
- Experience with relational database systems (MS SQL Server, Oracle, MySQL). Working knowledge of Pig and Hive.
- Good communication, analytical, interpersonal, and presentation skills; able to pick up new technologies with a minimal learning curve.
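Illustrative snippet: a minimal Java Kafka producer matching the pub/sub pipeline work described above. The broker address, topic name, and record contents are hypothetical placeholders, not details from any project listed below.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class EventProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // assumed local broker
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            // try-with-resources flushes and closes the producer on exit.
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Keying by entity id keeps related events on one partition,
                // preserving per-entity ordering for downstream consumers.
                producer.send(new ProducerRecord<>("events", "order-42", "{\"status\":\"CREATED\"}"));
            }
        }
    }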
TECHNICAL SKILLS:
Hadoop Core Services: HDFS, MapReduce, Hadoop YARN.
Hadoop Distribution: Cloudera.
Hadoop Data Services: Apache Hive, Sqoop.
NoSQL Databases: Apache HBase, Cassandra.
Virtualization Software: VMware, Oracle VirtualBox.
Java & JEE Technologies: Core Java 1.7/1.8, Servlets 3.2, JSP 2.0, JDBC.
IDE Tools: Eclipse, NetBeans.
Programming Languages: Java, Unix shell scripting, Python.
Databases: Oracle 11g/10g/9i, DB2, MS SQL Server, MySQL, MS Access.
Application/Web Servers: WebLogic 11, WebSphere 6.1, Apache Tomcat 5.5/6.0.
Tools: SQL Developer, PuTTY.
Frameworks: UML, Hibernate 3.0, Spring 2.5.
Version Control Systems: CVS, PVCS, Git.
Operating Systems: Windows, Linux.
PROFESSIONAL EXPERIENCE:
Confidential, San Francisco, CA
Software Engineer
Responsibilities:
- Analyzed software requirements and designed solutions.
- Implemented server-side functionality using Java/J2EE and JavaScript.
- Implemented RESTful services using Node.js and Express.
- Designed and architected services following SOA principles and a component-based architecture.
- Implemented distributed caching using Redis.
- Worked on Autodesk's RESTful service layer, which implements the services consumed by various other systems.
- Set up continuous integration with Git and Jenkins.
- Implemented SOAP service consumers.
- Developed UI components using the Angular framework.
- Built a web component architecture using Angular directives, producing reusable UI components shared by IPP and eStore.
- Designed and implemented the common service layer using service-oriented architecture (SOA) and RESTful services.
- Deployed applications to the Amazon AWS cloud.
- Implemented the client side of the OpenID authentication protocol for the BIC Store website.
- Worked on legacy systems running on the Java/J2EE platform.
- Wrote user stories and tasks for sprints, using Rally as the tracking tool.
- Maintained a legacy application built on Spring, Hibernate, and MySQL.
- Implemented new services in the service layer using the RESTEasy framework (see the JAX-RS sketch after the environment list below).
- Followed Agile methodology with two-week sprints.
Environment: Node.js, Express, Angular, Mocha, Chai, Sinon, Agile, Rally, Java, J2EE, IoC, WebStorm, AWS EC2, OpenID 2.0, RESTful services, SOA, web component architecture, RESTEasy, JBoss, RabbitMQ.
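Illustrative snippet: a minimal JAX-RS resource of the kind RESTEasy serves, matching the service-layer work above. The resource path, Order type, and placeholder lookup are hypothetical, not taken from the actual service layer.

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;

    // RESTEasy discovers this resource through the standard JAX-RS annotations.
    @Path("/orders")
    public class OrderResource {

        // Simple payload type; the configured JSON provider serializes it.
        public static class Order {
            public long id;
            public String status;
            public Order(long id, String status) { this.id = id; this.status = status; }
        }

        // GET /orders/{id} -> JSON representation of a single order.
        @GET
        @Path("/{id}")
        @Produces(MediaType.APPLICATION_JSON)
        public Order getOrder(@PathParam("id") long id) {
            // Placeholder lookup; a real implementation would call a backing service.
            return new Order(id, "SHIPPED");
        }
    }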
Confidential
Java Developer
Responsibilities:
- Designed and developed Customer Experience projects.
- Used Agile/Scrum methodology, minimizing scope creep by developing in short iterations that deliver small increments of working software.
- Prepared technical design documents for modules, including class, object, and sequence diagrams.
- Worked with JSP, jQuery, XML, JSON, JSTL, HTTP, and RESTful APIs.
- Used jQuery features (AJAX, selectors, attributes, manipulation, CSS, utilities) to meet business requirements.
- Used Struts MVC for the core legacy modules and Spring MVC for the new modules.
- Worked on the catalog administration tool; developed item-level content management (image upload, item descriptions, etc.) using Swing.
- Developed several screens in the user-role and permission admin tool.
- Used JAXB to send and process command requests from the Swing application and to parse the responses (see the JAXB sketch after the environment list below).
- Web services: used Jersey (JAX-RS) to build RESTful web services.
- Testing: served as an interim tester, working with the QTP automation tool.
- Designed and developed test scripts for various web modules and pages.
Environment: Java, J2EE API, HTML, JSP, JSP Standard Tag Library (JSTL), Swing, Struts, Spring MVC, Hibernate, Tomcat 6, STS, SVN, CSS, Scrum, Toad, jQuery, jQuery plugins, AJAX, JSON, GSON, JAXB, JAXP, XPath, MVC, RESTful, Jersey, GlassFish, JUnit, JIRA, Confluence, Hudson, ANT, Oracle 10g, MS SQL Server, Quick Test Professional (QTP).
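Illustrative snippet: a minimal JAXB round trip of the kind used for the Swing command requests above. The Command type and its fields are hypothetical placeholders.

    import java.io.StringReader;
    import java.io.StringWriter;
    import javax.xml.bind.JAXBContext;
    import javax.xml.bind.annotation.XmlRootElement;

    public class JaxbCommandDemo {

        // Hypothetical command payload exchanged with the Swing client.
        @XmlRootElement(name = "command")
        public static class Command {
            public String name;
            public String payload;
        }

        public static void main(String[] args) throws Exception {
            JAXBContext ctx = JAXBContext.newInstance(Command.class);

            // Marshal a request object to XML...
            Command request = new Command();
            request.name = "updateItem";
            request.payload = "item-42";
            StringWriter xml = new StringWriter();
            ctx.createMarshaller().marshal(request, xml);

            // ...then unmarshal the XML back into an object.
            Command parsed = (Command) ctx.createUnmarshaller()
                    .unmarshal(new StringReader(xml.toString()));
            System.out.println(parsed.name + " / " + parsed.payload);
        }
    }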
Confidential
Software Engineer
Responsibilities:
- Developed databases and created SQL scripts.
- Involved in requirements study, UI design, development, implementation, code review, validation, and testing.
- Managed database-related activities; designed tables and indexes.
- Wrote SQL queries to fetch business data; developed views, sequences, and indexes.
- Created joins and subqueries involving multiple tables.
- Analyzed SQL data, identified issues, and modified SQL scripts to fix them.
- Troubleshot and fine-tuned databases for performance and concurrency.
- Fixed bugs and performed various forms of testing, including black-box and white-box testing.
- Handled database, connectivity, and maintenance issues.
- Managed the priorities, deadlines, and deliverables of individual projects and related issues.
- Effectively prioritized work based on business need and urgency.
- Improved web application performance for a more user-friendly experience and resolved a critical production issue.
- Designed and developed enhancements to a webinar recommendation engine to reach the right audience and increase conversion rates.
- Developed an end-to-end data pipeline for dynamic campaign cost calculation, eliminating manual effort for the sales team.
- Distributed data processing across multiple servers, reducing the run time of four critical jobs by 60%.
- Implemented an end-to-end Amazon Redshift data warehouse solution; created dimension and fact tables and loaded data into Redshift.
- Worked with the Pentaho Data Integration (PDI) tool, applying advanced SQL skills to ETL jobs.
- Extracted data from the Salesforce CRM system and loaded it for ETL processing.
- Maintained and modified existing ETL jobs that email users about targeted webinars.
- Created an automated PDI solution to generate performance reports for customers and the sales and marketing teams, saving approximately 20 hours of manual work per week.
- Gathered business requirements, developed data models and entity-relationship (ER) diagrams for the High-Level Design (HLD) document, and performed impact analysis.
- Designed and developed a data pipeline to perform batch ETL from a relational database to HDFS.
- Set up scalable data pipelines using Kafka messaging and pub/sub methodologies, with data models that support ingestion of real-time data streams.
- Worked in a fast-paced Agile team following the SDLC process, using JIRA for workflow tracking and Bitbucket for version control.
- Extensively imported data from relational DBMSs into HDFS using Sqoop for analysis and data processing.
- Created Hive tables on top of HDFS and developed Hive queries to analyze the data.
- Optimized datasets by creating dynamic partitions and buckets in Hive.
- Collected information from web servers and integrated it into HDFS using Flume.
- Used Pig Latin to analyze datasets and perform transformations according to business requirements.
- Stored compressed data in a row-columnar binary file format for efficient processing and analysis.
- Implemented custom Hive UDFs for comprehensive data analysis (see the UDF sketch after the environment list below).
- Loaded data from local file systems into HDFS.
- Developed Spark jobs using PySpark and used Spark SQL for faster data processing.
- Developed Oozie workflows to automate loading data into HDFS and pre-processing with Sqoop scripts, Pig scripts, and Hive queries.
- Exported data from HDFS into an RDBMS using Sqoop for report generation and visualization.
- Wrote shell scripts to automate tasks and continuously monitored jobs.
Environment: Toad, Oracle 9i, PVCS, Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Sqoop.
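Illustrative snippet: a minimal custom Hive UDF in Java, of the kind referenced above. The class name, registered function name, and normalization logic are hypothetical placeholders.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Register in Hive with (assumed jar name):
    //   ADD JAR normalize-status-udf.jar;
    //   CREATE TEMPORARY FUNCTION normalize_status AS 'NormalizeStatus';
    public class NormalizeStatus extends UDF {

        // Hive calls evaluate() once per row; returning null yields SQL NULL.
        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            return new Text(input.toString().trim().toUpperCase());
        }
    }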