
Software Engineer Resume


San Ramon, CA

SUMMARY:

  • 9.5 years of experience in software analysis, design, and implementation; has worked on various application development projects and has experience with database systems.
  • Over 3 years of hands-on experience working with HDFS, MapReduce, and Hadoop ecosystem tools such as Hive, HBase, Sqoop, Flume, and ZooKeeper.
  • Worked on various Hadoop distributions, including Pivotal, Cloudera, and Apache.
  • Developed a UI application using AngularJS, consuming REST data from Apache Solr (search engine).
  • Over 5 years of experience developing applications using creational, structural, and behavioral J2EE design patterns (MVC architecture, Singleton, Factory, etc.).
  • Strong Experience in Internet Technologies with experience in J2EE, JSP, Struts, Servlets, JDBC, Apache Tomcat, JBoss, WebLogic, WebSphere, SOAP Protocol, XML, XSL, XSLT, Java Beans, HTML
  • Involved in migrating a reporting application from Java to Hadoop using HDFS, MapReduce, and Sqoop.
  • Experience writing web services using the Axis framework.
  • In-depth understanding of and exposure to all phases of OOA/OOD and OOP.
  • Strong SQL and PL/SQL experience and very good exposure to MS SQL Server, MySQL, Oracle, and Greenplum databases.
  • Involved in Confidential Central (a Java project) to generate policy summary, system usage, and forensic summary reports by parsing the large logs uploaded by multiple customers.
  • Good experience in all phases of the Software Development Life Cycle.
  • Knowledge of Hibernate persistence technology and web services.
  • Experienced working in Linux, UNIX, and Windows environments.
  • Experience in methodologies such as Agile, Scrum, and test-driven development.
  • Strong program analysis skills, with the ability to follow project standards.
  • Strong ability to understand new concepts and applications.
  • Excellent verbal and written communication skills, proven highly effective in interfacing across business and technical groups.

TECHNICAL SKILLS:

Big Data: Hadoop, MapReduce, HBase, Hive, Sqoop, Flume, YARN, Talend, Apache Solr

Programming Languages: Java, J2EE, Python, PL/SQL, MapReduce, Unix Shell Scripting

Java Technologies: Java 2, JDBC, Spring, Hibernate, Web Services, Struts, JSP, Servlets, ANT, Maven, log4j, XML, AngularJS

Database: Oracle 10g/11g, MS SQL Server, MS Access, MySQL, HBase, GemFire, Greenplum, HAWQ, PostgreSQL

Web/Application Servers: IBM WebSphere, WebLogic, Apache Tomcat, JBoss, Jetty, tc Server

Tools: Eclipse, Maven, Gradle, SBT, Ant, SoapUI, Soap Sonar, JDeveloper, SQL Developer, JIRA, Eclipse MAT, IBM Thread Dump Analyzer, Tableau

Operating Systems: Linux, Unix, Windows, Mac

Version control: Git, Perforce, Subversion.

PROFESSIONAL EXPERIENCE:

Confidential, San Ramon, CA

Responsibilities:

  • Responsible for data mapping after scrubbing for a given set of requirements; used complex SQL queries as part of data mining and performed analysis, design, and data modeling.
  • Designed the overall ETL load process to move data incrementally and transform it into the new data model.
  • Created Talend jobs to extract data from Hadoop and ingest it incrementally into Greenplum (an MPP database).
  • Created procedures/functions in HAWQ and Greenplum to retrieve desired result sets for reporting.
  • Supported all data needs for application development, including data source identification, analysis, performance tuning, and data migration.
  • Worked with business teams to understand and define the business problem, develop working prototypes, and prepare data for analysis.
  • Applied data cleansing/data scrubbing techniques to ensure consistency among data sets.
  • Built REST APIs using Spring Boot.
  • Wrote MapReduce jobs (Java) to implement data science algorithms in Hadoop and to perform data preprocessing.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Oracle into HDFS using Sqoop. Created external tables in Greenplum to load the processed data back into GPDB.
  • Wrote Python scripts for text parsing/mining.
  • Developed a UI application using AngularJS, integrated with Apache Solr to consume REST data.
  • Worked with Tableau developers to help performance-tune visualization graphs/analytics.
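
The data cleansing and scrubbing work described above can be sketched in plain Java. This is a minimal illustration, not the actual project logic: the normalization rules (trim, collapse whitespace, lower-case) and the dedup-by-exact-match step are hypothetical examples of typical scrubbing before data mapping.

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class RecordScrubber {

    // Normalize a raw field: trim edges, collapse internal runs of
    // whitespace, and lower-case so joins and comparisons are consistent.
    // (Hypothetical rules for illustration.)
    static String normalize(String raw) {
        if (raw == null) {
            return "";
        }
        return raw.trim().replaceAll("\\s+", " ").toLowerCase();
    }

    // Scrub a batch of raw records: normalize each one, drop empties and
    // exact duplicates, and preserve first-seen order.
    static List<String> scrub(List<String> rawRecords) {
        Set<String> seen = new LinkedHashSet<>();
        for (String record : rawRecords) {
            String cleaned = normalize(record);
            if (!cleaned.isEmpty()) {
                seen.add(cleaned);
            }
        }
        return new ArrayList<>(seen);
    }

    public static void main(String[] args) {
        List<String> raw = List.of("  San Ramon ", "san ramon", "Santa  Clara", "");
        System.out.println(scrub(raw)); // [san ramon, santa clara]
    }
}
```

In a real pipeline this kind of routine would run as a preprocessing step before the Talend/Sqoop loads, so that duplicate or inconsistently formatted records never reach Greenplum.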

Technologies: HDFS, MapReduce, Hive, HBase, Talend, Java, Spring Boot, JPA, Cloud Foundry, HAWQ, JSON, XML, Python, Pivotal HD, Tableau, Greenplum, GemFire XD, AngularJS, UNIX Shell Scripts.

Confidential, Santa Clara, CA

Software Engineer

Responsibilities:

  • Responsible for support and maintenance: troubleshot issues customers faced with the Adaptive Authentication (On-Premise) product and helped resolve them.
  • Worked closely with customers and the professional services team to troubleshoot product/installation issues and provide resolutions.
  • Helped customers customize/configure the Adaptive Authentication product for their specific needs.
  • Assisted customers in capturing thread/heap dumps on different application servers and analyzed them to identify application hangs, performance issues, and memory-related issues.
  • Analyzed AWR reports to troubleshoot/identify performance bottlenecks.
  • Provided root cause analysis of incidents to stakeholders and clients.
  • Managed communications to customers at all levels to maintain positive relationships.
  • Reported software bugs (after reproducing in local environment) and customer suggestions (enhancement requests) to development and product management teams.
  • Worked closely with the Professional Services team to verify new installs/upgrades and transition them over to support.
  • Created internal environments to reproduce customer reported issues and test/verify hot fixes and upgrades.
  • Created technical notes to contribute to the knowledge base.
  • Performed 24/7 on-call duties once a month.

Environment: Java, Web Services (WSDL, SOAP), Spring, Hibernate, XML, ANT, Maven, Drools, Oracle, MS SQL Server, DB2, JDBC, JSON

Confidential

Java Developer

Responsibilities:

  • Involved in Full Software Development Life Cycle (SDLC).
  • Worked with object-oriented design patterns such as factory classes. Developed several factory classes that act as controllers and divert each HTTP request to a particular request handler class based on the request identification key.
  • Developed interfaces using JSP based on users, roles, and permissions; screen options were displayed according to user permissions. This was coded using custom tags in JSP with tag libraries.
  • Developed code to handle web requests involving request handlers, business objects, and data access objects.
  • Built and deployed the application with the ANT framework; defined ANT tasks for compile, build, deploy, and check-in/checkout from CVS.
  • Worked with DBO classes and used JDBC drivers from different vendors, such as the Microsoft and WebLogic drivers for SQL Server and the MySQL Connector for MySQL.
  • Designed and developed a user usage logging facility using Apache Log4J 1.2.8, with logging levels such as DEBUG, INFO, WARN, ERROR, and FATAL.
  • Installed and administered SQL Server 2000; implemented maintenance plans including backups, security, integrity checks, and optimization, all with documentation.
  • Designed and developed database in Oracle.
  • Involved in Export/Import of data using Data Transformation Services (DTS). Imported data from flat files to various databases and vice-versa.
  • Worked and Modified the Database Schema according to the Client requirement.
  • Adopted three-tier approach consisting of Client Tier, Business Logic Tier, and Data Tier.
  • Tested the entire System according to the Use Cases using JMeter.
  • Involved in tracing and troubleshooting large volumes of source code using logging tools such as Log4J and classes such as PrintWriter.
  • Used XML, XSL and XSLT for developing a dynamic and flexible system for handling data.
  • Packaged and deployed the entire application code to integration testing environment for all the releases.
  • Followed Extreme Programming practices during coding; programmers adhered to all coding standards.
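
The factory-as-controller idea described above can be sketched as follows. This is a hypothetical illustration: the key names ("LOGIN", "REPORT") and the handler behavior are invented for the example, not taken from the actual project.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Each request handler class serves one kind of request.
interface RequestHandler {
    String handle();
}

class LoginHandler implements RequestHandler {
    public String handle() { return "login handled"; }
}

class ReportHandler implements RequestHandler {
    public String handle() { return "report handled"; }
}

// The factory acts as the controller: it maps a request identification
// key to the handler class that should process the request.
public class RequestHandlerFactory {
    private static final Map<String, Supplier<RequestHandler>> REGISTRY = new HashMap<>();

    static {
        // Hypothetical request identification keys.
        REGISTRY.put("LOGIN", LoginHandler::new);
        REGISTRY.put("REPORT", ReportHandler::new);
    }

    // Divert the request to the handler registered for its key.
    public static RequestHandler handlerFor(String requestKey) {
        Supplier<RequestHandler> supplier = REGISTRY.get(requestKey);
        if (supplier == null) {
            throw new IllegalArgumentException("Unknown request key: " + requestKey);
        }
        return supplier.get();
    }

    public static void main(String[] args) {
        System.out.println(handlerFor("LOGIN").handle());  // login handled
        System.out.println(handlerFor("REPORT").handle()); // report handled
    }
}
```

A registry map like this keeps the controller free of long if/else chains: adding a new request type means registering one more key/handler pair.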

Environment: MS SQL Server 2000 (functions, views), Oracle 8i/9i, MySQL, Linux 8, Windows 2000, Apache Tomcat, WebSphere 5.0, WSAD, J2EE (Java 1.4, Servlets, JSP, JDBC/SQL), HTML, XML, UML, Eclipse 3, JMeter 2.0, JavaScript, CVS, ANT 1.5.1, JUnit, Log4J 1.2.8.
