Lead Solr, Java, Big Data Developer Resume
Warren, NJ
SUMMARY:
- Overall 10.5 years of experience designing and developing applications in the Java/J2EE world
- 4+ years of team leading and coaching experience
- 2 years of experience with Apache Solr/Cloudera Search technologies
- Experience with batch and near real-time (NRT) ingestion, and with architecting and tuning application search performance
- Around 2 years of experience in Big data technologies.
- Experience writing MapReduce jobs, Apache Spark applications, Pig scripts, and Hive queries on YARN to analyze data and derive analytics.
- Exposure to Hadoop application-level security
- Extensive work experience in application development, maintenance, and production support involving Java dynamic web and enterprise-level development.
- Experience in the design and development of enterprise, system integration, and distributed applications
- Exposure to all areas of the SDLC, from requirements gathering, analysis, design, and estimation to software development, delivery, and training.
- ITIL Expert, with influence over the key areas of the ITIL Service Lifecycle: Incident, Problem, Change, and IT Operations Management.
- Proficient in secure web application and server-side development using Core Java, JDBC, Servlets, JSP, Hibernate, Spring Framework (Core, AOP, MVC), and web services (Apache CXF JAX-WS, JAX-RS).
- Experience in writing extensive JUnit tests using EasyMock and Mockito libraries.
- Experience in SOAP web services, RESTful web services and SOA architecture
- Strong analytical skills with ability to quickly understand client’s business needs.
- Experienced in requirements gathering, triaging, coordinating with business stakeholders, and application design and architecture.
- Research-oriented, motivated, proactive, self-starter with strong technical, analytical and interpersonal skills.
- Experienced working in complex, multi-vendor, geographically diverse environments.
- Experience working in global teams with an onshore/offshore model; worked with and managed teams spread across multiple locations.
- Effective team player with strong analytical and problem solving skills.
- Communicate effectively (both written and verbal) with people at all levels, including stakeholders, internal teams, and senior management.
TECHNICAL SKILLS:
Programming Skills: MapReduce, Pig scripts, Java, Servlets, JSP, Struts, Spring Core, Spring MVC, Spring Security, AJAX, XML, HTML, AngularJS, jQuery, JavaScript, web services (SOAP/REST), Disney's Tea framework.
RDBMS: Oracle, MySQL, DB2
J2EE Server: WebLogic Application Server 8.1, WebSphere Application Server 8.5
Web Server: Apache Tomcat 7.0, Disney's Barista web server
IDEs: Eclipse 3.4, RSA 8.0
Build Tools: Maven
Continuous Integration Tools: Hudson, Jenkins
Version Control/Configuration Management Tools: Perforce, SVN, CVS, IBM Rational ClearCase
Operating Systems: Windows XP/Windows 9x, UNIX, Linux
Open Source Distributed File System: Hadoop Distributed File System (HDFS)
Data Analysis Tools: Apache Pig, Hive
Data Importing Tools: Sqoop, Flume
Serialization Framework: Apache Avro
Search Engine tools: Apache SOLR
Distributed File System: Hadoop
Distributed Message Broker: Apache Kafka
Cluster coordinating services: Apache Zookeeper
Application Workflow management: Apache Oozie
PROFESSIONAL EXPERIENCE:
Confidential, Warren, NJ
Lead Solr, Java, Big Data Developer
Responsibilities:
- Led the build-out of a Big Data Hadoop platform using Cloudera Distribution for Hadoop (CDH 5) with 5 master nodes on a rack-expandable cluster for redundancy; configured the NameNodes, 45 DataNodes, 1 Secondary NameNode, 1 Job History Server, and 1 YARN (MR2) Resource Manager, and set up Hadoop security and monitoring on the virtualized cluster servers.
- Architected and built dataflow pipelines using Apache NiFi and its built-in processors to move data into HDFS and Hive; created data pipelines using Flume and Kafka.
- Implemented real-time streaming data processing using Apache Flume, aggregating data from several application servers into HDFS using multiple Flume agents.
- Worked on Cloudera Manager monitoring and troubleshooting, configuring custom dashboards using 'tsquery'; configured SNMP trap alerts to capture Hadoop cluster health.
- Deployed and configured the application in dev boxes and resolved integration challenges while assembling and integrating components such as SiteMinder and Kerberos.
- Tuned the search application to provide the best performance.
- Coded and managed data ingestion into Solr using Spark jobs to achieve NRT ingestion (Spark Streaming) as well as batch processing (Spark SQL); a minimal sketch follows this list.
- Worked with business stakeholders, application developers, production teams and across functional units to identify business needs and discuss solution options.
- Developed code matching the prototype and specification as necessary, portable to other environments.
- Implemented real-time data processing with CDC replication using the Attunity and StreamSets tools.
- Designed frameworks to connect AWS instances to store and access data.
- Created native Spark frameworks to solve business use cases that were taking too long in the MapReduce process.
- Created a data profiling framework with custom functions to monitor data quality and produce detailed analysis reports.
- Implemented Hadoop security using the Apache Sentry service plugins; created the configuration for Hive and Impala using Sentry bindings, and configured and tested the policy engine and policy providers for the Hive and Impala components connecting to the backend policy provider database.
- Worked on Docker containers in the cloud.
- Active member of the solution engineering team, providing solutions to production issues in the Talend frameworks.
- Created business-critical packages and functions to support efficient data storage and manipulation.
- Designed and developed real-time data ingestion frameworks to move data from Kafka to Hadoop.
- Optimized existing long-running, high-volume data load jobs to perform better under increased production loads.
- Created Spark-based Talend Big Data Integration jobs to run high-speed analytics over the Spark cluster.
- Coded the recommendation engine to capture patterns for analytics.
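A minimal, illustrative sketch of the Kafka-to-Solr NRT ingestion path described above (Spark Streaming writing through SolrJ). The broker address, topic, ZooKeeper connect string, collection, and field names are hypothetical placeholders, the SolrJ 5.x CloudSolrClient API is assumed, and Java 8 lambdas are used for brevity.

```java
import java.util.*;
import kafka.serializer.StringDecoder;
import org.apache.solr.client.solrj.impl.CloudSolrClient;
import org.apache.solr.common.SolrInputDocument;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka.KafkaUtils;

public class SolrNrtIngestion {
    public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("solr-nrt-ingestion");
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(10));

        Map<String, String> kafkaParams = new HashMap<>();
        kafkaParams.put("metadata.broker.list", "broker1:9092");   // assumed broker list
        Set<String> topics = Collections.singleton("documents");   // assumed topic name

        // Direct stream of (key, value) pairs from Kafka (Spark 1.6 / Kafka 0.8 API).
        KafkaUtils.createDirectStream(jssc, String.class, String.class,
                StringDecoder.class, StringDecoder.class, kafkaParams, topics)
            .foreachRDD(rdd -> rdd.foreachPartition(records -> {
                // One SolrJ client per partition; ZooKeeper hosts and collection are assumptions.
                CloudSolrClient solr = new CloudSolrClient("zk1:2181,zk2:2181/solr");
                solr.setDefaultCollection("docs");
                List<SolrInputDocument> batch = new ArrayList<>();
                while (records.hasNext()) {
                    SolrInputDocument doc = new SolrInputDocument();
                    doc.addField("id", UUID.randomUUID().toString());
                    doc.addField("body_txt", records.next()._2());  // Kafka message value
                    batch.add(doc);
                }
                if (!batch.isEmpty()) {
                    solr.add(batch);  // rely on Solr autoSoftCommit for near-real-time visibility
                }
                solr.close();
            }));

        jssc.start();
        jssc.awaitTermination();
    }
}
```

The batch path mentioned above would follow the same SolrJ write pattern, but read its input with Spark SQL instead of a streaming context.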
Technologies Used: Java 1.7, Spring Boot 1.5.1, Spring MVC 4.2, Cloudera Search 1.3.0 (CDH 5.8.2), Spark 1.6.2, JPA 2.0
Confidential, Dallas, TX
Lead Java/Hadoop Developer
Responsibilities:
- Provided an application demo to the client by designing and developing prototype screens for the search engine, report and trend analysis, and application administration using AngularJS and BootstrapJS.
- Took ownership of the complete application design for the Java portion and the Hadoop integration.
- Apart from normal requirements gathering, participated in business meetings with the client to gather security requirements.
- Assisted the architect in analyzing the existing and future systems.
- Prepared design blueprints and application flow documentation.
- Developed the JAX-RS web services code using the Apache CXF framework to fetch data from Solr when users searched for documents (a minimal sketch follows this list).
- Participated in Solr schema design and ingested data into Solr for indexing. Wrote MapReduce programs to organize the data and prepare it for analytics in the client-specified format (see the skeleton after this list), wrote Pig scripts to query and process the data sets and identify trend patterns using client-specific criteria, and configured Oozie workflows to run these jobs along with the MR jobs.
- Stored the derived analysis results in HBase and made them available to the Solr ingestion pipeline for indexing.
- Involved in the integration of the Java search UI, Solr, and HDFS.
- Involved in code deployments using Jenkins as the continuous integration tool.
- Documented all the challenges and issues involved in dealing with the security system and implemented best practices.
- Created project structures and configurations according to the project architecture and made them available to junior developers to continue their work.
- Handled the onsite coordinator role to deliver work to the offshore team.
- Involved in code reviews and application lead support activities.
- Trained offshore resources to adopt client standards.
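A minimal sketch of the JAX-RS service mentioned above, deployable under Apache CXF and querying Solr through SolrJ. The resource path, Solr collection URL, and the `title` field are hypothetical placeholders, and a JSON provider is assumed to be registered with CXF.

```java
import java.util.ArrayList;
import java.util.List;
import javax.ws.rs.DefaultValue;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.MediaType;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.SolrDocument;

@Path("/search")
public class DocumentSearchResource {

    // Base URL of the Solr collection; an assumption for illustration only.
    private final HttpSolrClient solr =
            new HttpSolrClient("http://solr-host:8983/solr/documents");

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public List<String> search(@QueryParam("q") String q,
                               @QueryParam("rows") @DefaultValue("10") int rows) throws Exception {
        // Fall back to a match-all query when no search term is supplied.
        SolrQuery query = new SolrQuery(q == null || q.isEmpty() ? "*:*" : q);
        query.setRows(rows);
        QueryResponse response = solr.query(query);

        List<String> titles = new ArrayList<>();
        for (SolrDocument doc : response.getResults()) {
            titles.add((String) doc.getFieldValue("title"));  // assumed stored field
        }
        return titles;
    }
}
```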
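And a skeletal example of the kind of data-organizing MapReduce job referenced above: it parses delimited input records and consolidates values by a key field before writing them out. The delimiter, field positions, and input/output paths are assumptions for illustration only.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class OrganizeRecordsJob {

    public static class RecordMapper extends Mapper<LongWritable, Text, Text, Text> {
        @Override
        protected void map(LongWritable offset, Text line, Context context)
                throws IOException, InterruptedException {
            String[] fields = line.toString().split("\\|");   // assumed pipe-delimited input
            if (fields.length > 2) {
                // Emit (key field, payload field); positions are placeholders.
                context.write(new Text(fields[0]), new Text(fields[2]));
            }
        }
    }

    public static class RecordReducer extends Reducer<Text, Text, Text, Text> {
        @Override
        protected void reduce(Text key, Iterable<Text> values, Context context)
                throws IOException, InterruptedException {
            StringBuilder merged = new StringBuilder();
            for (Text value : values) {
                if (merged.length() > 0) merged.append(',');
                merged.append(value);
            }
            context.write(key, new Text(merged.toString()));  // one consolidated row per key
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "organize-records");
        job.setJarByClass(OrganizeRecordsJob.class);
        job.setMapperClass(RecordMapper.class);
        job.setReducerClass(RecordReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));    // input path supplied at runtime
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // output path supplied at runtime
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

An Oozie workflow would then schedule a job of this kind alongside the Pig and ingestion steps, as described in the bullet above.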
Technologies Used: Java, J2EE, Spring 3.2, MVC, HTML5, CSS, AngularJS, BootstrapJS, RESTful services using the CXF web services framework, WAS 8.5, Spring Data, Solr 5.2.1, Pig, Hive, Apache Avro, MapReduce, Sqoop, Zookeeper, SVN, Jenkins, Windows AD, Windows KDC, Hortonworks distribution of Hadoop 2.3, YARN, Ambari
Confidential, St. Louis, MO
Lead Java Developer
Responsibilities:
- Participated in business requirement meetings.
- Involved in transforming functional requirements into the Technical Detail Design, adhering to HIPAA rules and regulations.
- Estimating the time required for each use case.
- Developed REST-based web services for the wellness profile score, content search, and user recommendations.
- Generated prototype classes from copybook using RAD.
- Implemented new enhancements such as sending multiple requests to the backend system (Stratus) at a time and sending the responses back to the application.
- Analyzed the existing code base for backend system calls based on client discussions and enhanced those calls with new requirements (sending a data differentiator).
- Involved in key areas of the portal application, such as portal login and registration.
- Retrieved the differentiator from service/core IDs and distributed it to the entire application.
- Worked with the ILOG JRules engine to trigger business rules in the rules execution engine configured on the WebLogic application server.
- Contributed to defect analysis and fixing.
- Coordinated with offshore and onshore teams.
- Ticket Tracking and QA Support.
- Escalated issues to top management.
- Identified risks and developed a risk mitigation plan.
- Created JUnit test cases.
- Design reviews and code reviews with FindBugs and Crucible.
- Represented technical team in Defect Triage meetings and coordinated the assignment of the bugs.
- Served as the Technical Check Out (TCO) for this project.
- Supported and led release support activities for corporate releases.
Technologies Used: Java, J2EE, Servlets, JSP, JSTL, Spring, Hibernate, Mule framework, AngularJS, BootstrapJS, IBM WebSphere 7.x, RESTful web services, HTML5, CSS, Rational ClearQuest, Harvest CSM, UltraEdit, IBM RAD, MyEclipse Blue, Oracle 10g/11g, Advanced REST Client (Web API), RequisitePro, PuTTY, log4j, Oracle SQL Developer, Microsoft Visio, CA Wily application monitoring tools.
Confidential, St. Louis, MO
Senior Java Developer
Responsibilities:
- Participated in business requirement meetings.
- Involved in transforming functional requirements into Technical Detail Design.
- Involved in Client/BSA meetings to understand the issue and provide the analysis and solution.
- Defect estimation, analysis, and fixing.
- Worked on enhancements per client requests.
- Built the application with Hudson and unit tested it, supporting the HDS/CCS processes for ESI.
- Worked closely with client resources in support of business applications.
- Tracked all production defects and worked on them on a priority basis; coordinated with the offshore production support team and delegated work to the offshore team.
- Code reviews using the Crucible tool.
- Escalated issues to top management.
- Identified risks and developed a risk mitigation plan.
- Attended triage meetings and coordinated defects and new enhancements.
- Prepared knowledge transition presentations and shared knowledge with team members.
- Supported application release activities.
Technologies/Languages: Java 1.5, Spring Core, Mule 2.x, IBM WebSphere 7, Tomcat, SOAP services, HP Quality Center, CVS, UltraEdit, Eclipse, PuTTY, log4j, Remedy, JUnit, Oracle 10g/11g, Oracle SQL Developer.
Confidential, St Louis, MO
Senior Java Developer
Responsibilities:
- Participated in business requirement meetings.
- Involved in transforming functional requirements into Technical Detail Design.
- Involved in Client/BSA meetings to understand the issue and provide the analysis and solution.
- Defect estimation, analysis, and fixing.
- Worked on enhancements per client requests.
- Built the application with Hudson and unit tested it, supporting the HDS/CCS processes for ESI.
- Worked closely with client resources in support of business applications.
- Tracked all production defects and worked on them on a priority basis; coordinated with the offshore production support team and delegated work to the offshore team.
- Code reviews using the Crucible tool.
- Escalated issues to top management.
- Identified risks and developed a risk mitigation plan.
- Involved in writing test conditions and test scripts using the JUnit framework and tools.
- Attended triage meetings and coordinated defects.
- Prepared knowledge transition presentations and shared knowledge with team members.
- Supported application release activities.
Technologies Used: Java 1.7, J2EE, Servlets, JSP, JSTL, Spring Core, MVC and Web Flow, Hibernate, IBM WebSphere 7, Tomcat, SOAP services, HP Quality Center, CVS, UltraEdit, Eclipse, jQuery, PuTTY, log4j, Remedy, AJAX, CSS, HTML, Oracle 10g/11g, JavaScript, JUnit, SQL Developer.
Confidential
Java Developer
Responsibilities:
- Responsible for delivering the MD5 module.
- Performed project estimations using the DBT estimator.
- Coded the UI using Java, Servlets, JSP, and Prototype JavaScript.
- Developed a POC for the client that maintains an XML repository of records as jobs and generates UI elements by accessing that repository.
- Functional design and development for newly implemented change requests.
- Ran code coverage and branch coverage on JUnit test scripts for unit testing.
- Tracked the entire process to identify any noticeable risks.
- Performed effort estimations for the entire project, taking all SDLC phases into account.
- Prepared the low-level requirement documents and low-level design documents.
- Designed the use cases, sequence diagrams, class diagrams, activity diagrams, component diagrams, and deployment diagrams.
- Supported the integration testing phase with fixes for the defects raised.
- Validated the estimated effort hours against the actual effort in a timely manner, with the help of guidelines and metrics.
- Managed, tracked, and reviewed all change requests; supported the system testing phase until the go-live of the application.
- Maintained the centralized CruiseControl build system for building and deploying the application to the integration, system test, and UAT environments.
Technologies Used: Java 1.5, J2EE, Spring Core, IBM WebSphere 7, Tomcat, SOAP services, HP Quality Center, CVS, UltraEdit, Eclipse, PuTTY, log4j, Remedy, JUnit, Oracle SQL Developer.
Confidential
Java Developer
Responsibilities:
- Involved in the Understanding of the Requirements and Design of the Application.
- In the build phase, took ownership of CICS connectivity with Java using CTG, integrating the Java environment with the COBOL runtime environment on the mainframe.
- Wrote test case documents for the Java-COBOL integration connecting to the mainframe.
- Involved in the creation of mock test cases to test the correctness of data transfer between the intended recipients (mock objects acting as the mainframe data source).
- Communicated effectively with the client regarding issues in data transfer between mock objects and resolved them independently.
- Wrote JUnit test cases for the developed code.
Confidential
Java Developer
Responsibilities:
- Developed user interfaces using the Tea language and framework, using data objects stored in the content management server (Go-Publish); those objects are self-contained portions of the GUI screen. Responsible for GUI-to-middleware integration and for the flights module, covering flight booking for different types of users based on category.
- Designed the screen page flows and individual page elements and sub-elements using Disney's proprietary CMS Designer, and converted the screen elements into Java-specific method calls.
- Integrated the GUI with the underlying DLS and DAE services, which in turn communicate with the Goredge (flights) database and LMS.
- Involved in the JUnit testing.
- Used TestDirector 8.0 for bug registration and tracking.