Lead Developer Resume
Summary
- 8+ years of Java/J2EE experience in the areas of requirements analysis, design, development, unit testing, integration, and deployment.
- Extensive experience in the design and development of web-based enterprise projects in the retail, banking, and web portal domains for various clients across the world.
- Good exposure to the design and development of distributed enterprise applications using Java, J2EE (Servlets, JSP, JDBC), and databases (Oracle, MySQL, DB2), with experience in frameworks and technologies such as Struts, the Spring Framework, and the distributed data technologies HBase, Redis, Hadoop, Hive, and Pentaho.
- Good exposure to various web and application servers such as Tomcat, Sun ONE, WebSphere Portal Server 6.1, WebSphere Commerce Server 6.0, WebSphere Commerce Staging Server 6.0, and WebLogic.
- Involved in different projects in different roles, including Delivery Lead, Team Lead, Module Lead, and Team Member.
Education
B.Tech in Electronics and Communication from Confidential University.
Technical Skills
Java Skills : Java, JDBC, JSP, Servlets, JNDI, JMS, Struts, Spring, Hibernate, J2EE, EJB, XML, XSD, XSLT, XPath, JAXB, SOA, Web Services, SOAP, WSDL, UDDI, Swing, JavaBeans.
Development Tools : RAD, Eclipse/MyEclipse, JIdea.
Middleware : EJB 3.0, OpenJMS, Mule ESB.
RDBMS : Oracle (SQL, PL/SQL), MySQL, and SQL Server.
Scripts : HTML, DHTML, JavaScript, CSS, Shell Script, Ant.
Web Servers : Apache Tomcat, Jetty.
Application Servers : BEA WebLogic 9.1, WebSphere Portal Server 6.1, WebSphere Commerce Server 6.0, WebSphere Commerce Staging Server 6.0.
Source Control Tools : Git, CVS, and SVN.
Other Tools : Pentaho ETL.
Distributed Systems : Hadoop, HBase, Hive, Redis, Hue, Sqoop.
Operating Systems : Unix, Sun Solaris, Windows NT/2000/XP.
Experience
Client: Confidential, US March 12 – Till Date
Organization: Cognizant Technology Solutions
Role: Lead Developer, Delivery Lead
VDAT:
VDAT (VDAT Data Access Toolkit) is a set of products to streamline access to clean, integrated, reliable data. It consists of the following components:
- ETL: An ETL (Extract, Transform, Load) process that pulls data from various sources, integrates and cleans it, and stores the resulting data in a compressed binary format within HDFS (Hadoop Distributed File System).
- Hive: A command line tool for running SQL-like queries against the cleaned data. Our Hive library is customized to encapsulate logic from our beacons, logical definitions, and custom functions.
- Scheduler: A web service for programmatically running Hive queries, scheduling queries to run in the future, and retrieving the resulting data in various formats.
- Scheduler Web UI: A web application, which provides a front-end to the Scheduler along with basic tools for retrieving the resulting data.
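For illustration, this is roughly what running one of these SQL-like queries programmatically could look like over Hive's JDBC driver. This is a minimal sketch against a HiveServer1-era endpoint; the host, table, and column names are hypothetical, not VDAT's actual schema:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Minimal Hive JDBC client (HiveServer1-era driver); names are illustrative.
public class VdatQueryExample {
  public static void main(String[] args) throws Exception {
    Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
    Connection conn =
        DriverManager.getConnection("jdbc:hive://hive-host:10000/default", "", "");
    Statement stmt = conn.createStatement();
    // A typical daily summarization over the cleaned data in HDFS.
    ResultSet rs = stmt.executeQuery(
        "SELECT ds, COUNT(1) FROM clean_events GROUP BY ds");
    while (rs.next()) {
      System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
    }
    conn.close();
  }
}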
Responsibilities:
Involved in designing the Pentaho workflow for the daily sync from the Salesforce web services.
Implemented Pentaho components to download Salesforce data, store it in Hadoop, execute Hive summarization queries, and load the results into MySQL tables.
Involved in implementing a SerDe to serialize/deserialize raw logs (see the sketch after this list).
Involved in designing the Adtech workflow for data sync from the Adtech web services.
Introduced Hue for all business users, replacing the Hive CLI.
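A minimal sketch of what such a log SerDe can look like, assuming a Hive 0.x-era Deserializer API and a hypothetical tab-delimited two-column log format; the class and column names are illustrative, not the production code:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Properties;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.serde2.Deserializer;
import org.apache.hadoop.hive.serde2.SerDeException;
import org.apache.hadoop.hive.serde2.SerDeStats;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.apache.hadoop.io.Writable;

// Hypothetical SerDe mapping a tab-delimited raw log line to (event, payload).
public class RawLogSerDe implements Deserializer {

  private ObjectInspector inspector;
  private final List<Object> row = new ArrayList<Object>(2);

  @Override
  public void initialize(Configuration conf, Properties tbl) throws SerDeException {
    // Two string columns; real code would read column names/types from `tbl`.
    List<String> names = Arrays.asList("event", "payload");
    List<ObjectInspector> ois = Arrays.asList(
        (ObjectInspector) PrimitiveObjectInspectorFactory.javaStringObjectInspector,
        (ObjectInspector) PrimitiveObjectInspectorFactory.javaStringObjectInspector);
    inspector = ObjectInspectorFactory.getStandardStructObjectInspector(names, ois);
  }

  @Override
  public Object deserialize(Writable blob) throws SerDeException {
    String line = blob.toString();
    int tab = line.indexOf('\t');
    row.clear();
    if (tab < 0) {          // malformed line: keep raw text, null payload
      row.add(line);
      row.add(null);
    } else {
      row.add(line.substring(0, tab));
      row.add(line.substring(tab + 1));
    }
    return row;
  }

  @Override
  public ObjectInspector getObjectInspector() throws SerDeException {
    return inspector;
  }

  @Override
  public SerDeStats getSerDeStats() {
    return null; // stats not tracked in this sketch
  }
}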
Environment: JDK 1.6, Spring IoC 2.0, Apache tool packages, JUnit, Pentaho ETL, Hadoop, Hive, Shell, Hue, Ubuntu, MySQL, Git, Maven.
Client: Confidential, US May 10 – March 12
Organization: Cognizant Technology Solutions
Role: Lead Developer, Delivery Lead
Kingfisher:
Kingfisher is an infrastructure project initiated to migrate the existing brittle pixel infrastructure to an efficient, error-tolerant, performance-oriented system using Scribe, Hadoop, MapReduce aggregation, and Hive. Kingfisher hosts a set of HTTP loggers that push logs into Scribe, which streams them to HDFS.
For efficiency and validation of these logs, custom MRs were written that serialize and deserialize the logs, validate them, and feed them through to the ETL processing.
The ETL system consisted of Pentaho, Hive, and a web UI. It would receive the valid feed, execute summarizations at 1-hour, 4-hour, and 24-hour intervals into multivariate dimensions, and provide the results to the web UI.
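A simplified sketch of the kind of validation MR described above, assuming plain-text log input; the field count, counter names, and isValid check are placeholders rather than the production logic:

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical mapper: passes well-formed pixel-log lines through to the ETL
// feed and counts the malformed ones it drops.
public class LogValidationMapper
    extends Mapper<LongWritable, Text, Text, NullWritable> {

  private static final int EXPECTED_FIELDS = 12; // placeholder field count

  @Override
  protected void map(LongWritable offset, Text line, Context context)
      throws IOException, InterruptedException {
    if (isValid(line.toString())) {
      context.write(line, NullWritable.get());
      context.getCounter("kingfisher", "valid").increment(1);
    } else {
      context.getCounter("kingfisher", "invalid").increment(1);
    }
  }

  private boolean isValid(String line) {
    // Stand-in validation: the real job would deserialize and check each field.
    return !line.isEmpty() && line.split("\t", -1).length == EXPECTED_FIELDS;
  }
}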
Responsibilities:
Involved in designing the loggers, Scribe integration, and HDFS upload.
Involved in the Hive query design and the Pentaho integration with Hive using the Hive web server.
Implemented the ETL system that summarized the valid logs at regular intervals.
Involved in setting up data validation tools to compare beta release data with legacy system data.
Involved in the capacity assessment of the Hadoop cluster.
Cost:
ETL is an infrastructure project initiated to migrate the existing shell-script-based SEMTools reports to a performance-oriented distributed system on Hadoop. Querying capabilities for these distributed reports were developed using Hive, and the complete workflow was controlled through the Pentaho ETL processing tool.
Responsibilities:
Involved in designing the Pentaho workflow for report processing.
Implemented Pentaho components to download cost data, store it in Hadoop, execute Hive summarization queries, and load the results into MySQL tables.
Involved in setting up data validation tools to compare beta release data with legacy system data.
Implemented search query report analysis, positizing and negatizing keywords, using Hadoop storage and custom Hive MRs.
Environment: JDK 1.6, Spring IoC 2.0, Apache tool packages, JUnit, Pentaho ETL, Hadoop, Hive, Shell, Ubuntu, MySQL, Git VCS, Maven, Jetty web server.
AdSleuth:
Historically, Ask.com has displayed all ads returned by Google without considering their performance on our site. The AdSleuth project measures the performance (CTR) of an advertiser's domain on our site in real time; when Google returns a low-CTR domain for a query, it requests fewer ads for subsequent occurrences of that query in the immediate future (a few minutes). Following this strategy, AdSleuth is expected to show fewer ineffectual ads in response to users' queries.
Our initial experiments indicate that, since many queries and ads repeat within short intervals of time, AdSleuth can remove a large portion of ineffective impressions while forgoing a relatively small number of picks. Our analysis also indicates that the ratio of reduced impressions to lost picks for AdSleuth is much larger than the same ratio achieved by the recent ad reduction on direct US and UK traffic by a raw optimization method that does not consider the domain CTR.
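As an illustration of the real-time bookkeeping this requires, the per-domain CTR could be tracked in Redis with short-lived counters. The following sketch uses the Jedis client, with the key scheme, window, and threshold chosen purely for illustration:

import redis.clients.jedis.Jedis;

// Illustrative real-time CTR tracking: short-TTL impression/click counters
// per advertiser domain, used to decide whether to request fewer ads.
public class DomainCtrTracker {

  private static final int WINDOW_SECONDS = 300;   // "a few minutes"
  private static final double LOW_CTR = 0.01;      // placeholder threshold
  private static final long MIN_IMPRESSIONS = 100; // avoid noisy small samples

  private final Jedis jedis;

  public DomainCtrTracker(Jedis jedis) {
    this.jedis = jedis;
  }

  public void recordImpression(String domain) {
    bump("imp:" + domain);
  }

  public void recordClick(String domain) {
    bump("clk:" + domain);
  }

  // True if this domain's recent CTR is low enough to request fewer ads.
  public boolean isLowCtr(String domain) {
    long imps = parse(jedis.get("imp:" + domain));
    long clicks = parse(jedis.get("clk:" + domain));
    return imps >= MIN_IMPRESSIONS && (double) clicks / imps < LOW_CTR;
  }

  private void bump(String key) {
    long n = jedis.incr(key);
    if (n == 1) {
      jedis.expire(key, WINDOW_SECONDS); // counter resets after the window
    }
  }

  private static long parse(String value) {
    return value == null ? 0L : Long.parseLong(value);
  }
}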
Responsibilities:
Delivery Lead for this project.
Involved in understanding requirements and coordinating with the offshore team.
Involved in designing the architecture of this system.
Environment: HBase, Redis, Struts, JSP, Java, Apache Jetty server.
SEM Tools:
SEM Tools is a tool that helps the Ask.com SEM business unit manage the SEM business. Its capabilities include bidding on/buying keyword traffic on the Google, Yahoo, and MSN search engines; uploading keyword bundles to Google AdWords, MSN adCenter, and Yahoo Search Marketing; updating keyword bids after the initial upload; managing ad IDs and campaigns; retrieving revenue values from AdSense; and campaign-level and search-engine-level report views that give a quick summary of cost, estimated revenue, and actual revenue.
A distributed infrastructure using Hadoop was set up, with custom schedulers that launch the appropriate custom MRs for cost download, ad bundle generation, and upload to AdWords.
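A rough sketch of how such a scheduler entry point might look, pairing a Quartz 1.x-style job with a map-only Hadoop job; the class names, HDFS paths, and download logic below are hypothetical stand-ins:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;

// Hypothetical Quartz job that kicks off a map-only cost-download MR run.
public class CostDownloadJob implements org.quartz.Job {

  @Override
  public void execute(JobExecutionContext ctx) throws JobExecutionException {
    try {
      Job job = new Job(new Configuration(), "cost-download");
      job.setJarByClass(CostDownloadJob.class);
      job.setMapperClass(CostDownloadMapper.class);
      job.setNumReduceTasks(0); // map-only: each mapper fetches its slice
      job.setOutputKeyClass(Text.class);
      job.setOutputValueClass(NullWritable.class);
      FileInputFormat.addInputPath(job, new Path("/sem/cost/requests"));
      FileOutputFormat.setOutputPath(job, new Path("/sem/cost/raw"));
      job.submit(); // fire and forget; a real scheduler would track status
    } catch (Exception e) {
      throw new JobExecutionException(e);
    }
  }

  // Placeholder mapper: the real task called the engine's cost API per line.
  public static class CostDownloadMapper
      extends Mapper<LongWritable, Text, Text, NullWritable> {
    @Override
    protected void map(LongWritable key, Text request, Context context)
        throws IOException, InterruptedException {
      context.write(request, NullWritable.get());
    }
  }
}

In production, a job like this would be registered with a CronTrigger through Quartz's StdSchedulerFactory so each download runs on its own schedule.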
Responsibilities:
Interacting with the business team to understand requirements and coming up with the PRD.
Implementing requirements using Java, Hibernate 3.2 (HQL, native SQL, and relational mapping), the Quartz scheduler, Castor XML binding, and Spring 2.0.
Implementing ad bundle keyword de-duping using HBase (see the sketch after this list).
Migrating ad bundle generation and ad bundle upload to MapReduce-based processes.
Migrating CTR- and CPC-based keyword status changes and bid processing to MapReduce-based processes.
Implementing search engine web service clients to interact with AdWords/AdSense.
Implementing the keyword portfolio report tool using MapReduce tasks.
Designing DB tables to support business requirements.
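A minimal sketch of the HBase-backed keyword de-duping mentioned above: the insert is accepted only when the keyword's row does not already exist, so the first writer wins. This uses the old-style HTable client, and the table, family, and qualifier names are invented for the example:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

// Illustrative keyword de-dupe: checkAndPut succeeds only if the "seen"
// cell is still empty, so each keyword is accepted exactly once.
public class KeywordDeduper {

  private final HTable table;

  public KeywordDeduper(Configuration conf) throws Exception {
    table = new HTable(HBaseConfiguration.create(conf), "adbundle_keywords");
  }

  // Returns true if the keyword was new (and should go into the bundle).
  public boolean accept(String keyword) throws Exception {
    byte[] row = Bytes.toBytes(keyword);
    Put put = new Put(row);
    put.add(Bytes.toBytes("d"), Bytes.toBytes("seen"), Bytes.toBytes(1L));
    // Atomic: apply the Put only when no value exists yet for d:seen.
    return table.checkAndPut(row, Bytes.toBytes("d"), Bytes.toBytes("seen"),
        null, put);
  }
}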
Environment: JDK 1.6, Spring IoC 2.0, Hibernate 3.2, Struts 2, Lucene/Solr, Hadoop, Hive, custom MapReduce, HBase, Castor XML bindings, SAX parser, Apache tool packages, JUnit, HTML/CSS, Oracle 10g, shell scripting, Ubuntu, JIdea, SVN, Maven, Tomcat 5.5.
Client: Confidential, US Oct 09 – April 10
Organization: Cognizant Technology Solutions
Role: Team Lead
Mentor Graphics Corporation:
Mentor Graphics has the broadest industry portfolio of best-in-class products and is the only EDA company with an embedded software solution. To manage pricing for all its components, it uses Vendavo, the leading provider of price management and optimization software for business-to-business companies.
Responsibilities:
Involved in understanding the Vendavo tool framework and customizing it to meet client requirements.
Involved in designing and customizing price list generation and workbook creation for different products.
Involved in the outbound feed of Vendavo data to SAP for generating price lists and price points.
Involved in the Vendavo reporting tool and the Jasper framework for exporting reports from the Vendavo tool.
Understanding the requirements of the project, involved in its development, and developed a use case document and a design document.
Involved in creating stored procedures, database design, normalization, and optimization.
Environment: WebLogic 9.1, Eclipse, Oracle 9i, Java 1.4, J2EE, Jasper, iReport 1.2.4, Toad, Vendavo tool (price generation tool).
Client: Confidential Jan 09 – Oct 09
Organization: Cognizant Technology Solutions
Role: Team Lead
Straightline & Autojunction:
Mjunction Services Limited, operating at the cutting edge of information technology and the Internet, is a 50:50 venture promoted by SAIL and Tata Steel. Founded in February 2001, it is today not only India's largest eCommerce company (having eTransacted worth over Rs. 45,193 crores to date) but also runs the world's largest eMarketplace for steel. The steel and coal supply chain in India has been transformed by mjunction, which has ushered in efficiency, transparency, and convenience to the way steel and coal are bought and sold. A similar transformational change is being sought in the automobile industry and in the sale of fixed-price branded products with the launch of autojunction and straightline, respectively.
Mjunction offers a wide range of selling, sourcing, and knowledge services across diverse industry verticals that empower businesses with greater process efficiencies. The selling and sourcing services that mjunction offers do not stop there; they go all the way to provide fulfillment services such as inspection, logistics, and finance. Mjunction has service offerings spanning the entire eCommerce spectrum and operates through metaljunction.in, buyjunction.in, coaljunction.in, autojunction.in, straightline.in, financejunction.in, and mjunctionedge.
Responsibilities
Involved in understanding the WebSphere Commerce framework and customizing it to meet client requirements.
Enabled the content management system in the portal server and migrated portal-side data from Derby to DB2.
Involved in designing and customizing content management in the portal server to meet client requirements.
Deployed code on the staging and commerce servers and customized the staging side to integrate portal server content management with the staging server; also customized the staging server to pull content from the portal side and push it to the commerce server for display in the front end.
Involved in the outbound feed from commerce to SAP in Autojunction and the B2B service.
Worked with the Mule ESB server to communicate with SAP web services.
Understanding the requirements of the project, involved in its development, and developed a use case document and a design document.
WebSphere Portal security, WAS administration, and build tools such as Ant/Maven.
Involved in creating stored procedures, database design, normalization, and optimization.
Environment: Java 1.4/1.5, J2EE, Struts framework, DB2, RAD, WebSphere Commerce Server 6.0, WebSphere Commerce Staging Server 6.0, WebSphere Portal Server 6.0, Mule ESB Server, OpenJMS.
Client: Confidential, US Jun '08 – Jan '09
Organization: Cognizant Technology Solutions
Role: Team Lead
Web Professional:
The Web Professional destination product is a destination that gives existing Network Solutions customers access to a design library, a business document library, forums, tutorials, and whitepapers.
Network Solutions, LLC is a technology company founded in 1979. The domain name registration business has become the most important division of the company. As of 2006, Network Solutions manages more than 7.6 million domain names. Their size, founding status, and longevity have made them one of the most important corporations affecting domain name price and policy. NSI® products provide a range of resources usually reserved for larger organizations.
Responsibilities
Led a team of more than 6 members.
Involved in developing server-side (JSP, HTML) components for the product.
Involved in architecture and database design.
Involved in preparing unit test cases and testing.
Designed sequence and class diagrams.
Involved in code review and code cleanup activities.
Developed role-based user authorization in the application.
Involved in integrating all modules and bug fixing.
Involved in creating stored procedures, database design, normalization, and optimization.
Environment: Java, J2EE, JSP, Struts framework, Spring framework, MySQL, Validation framework, Eclipse 3.2, Sun ONE App Server, Unix.
Store Front Application:
The Network Solutions Storefront is the retail website and major sales channel. This is the main retail customer-facing website for Network Solutions.
Responsibilities
Involved in developing server-side (JSP, HTML) components for the product.
Involved in preparing unit test cases and testing.
Involved in creating stored procedures, database design, normalization, and optimization.
Involved in code review and code cleanup activities.
Environment: Java, J2EE, JSP, Struts framework, Spring framework, MySQL, Validation framework, Eclipse 3.2, Sun ONE App Server, Unix.