Lead Engineer Resume
Middletown, NJ
SUMMARY:
- Over 12 years of experience developing software applications and web solutions in the IT industry as an Architect/Consultant in the Banking, E-Commerce, and Telecom domains.
- 3+ years of experience leading the architecture, design, and development of Big Data applications involving IBM BigInsights, CDH, MapReduce, HDFS, Hive, Pig, Oozie, Sqoop, Kafka, Spark, Flume, and HBase.
- Twelve years of hands-on experience in Enterprise Application Development, Web Applications, Client-Server Technologies, and Web Programming with languages and frameworks such as Java, J2EE, PL/SQL, Struts, Spring, Hibernate, and SOAP/REST web services.
- Expertise in all phases of Software Development Life Cycle and Testing Life Cycle.
- Helped DevOps with Hadoop cluster setup for environments such as Dev, QA, and Prod; provided cluster and data sizing for hosted and on-premises installations.
- Defined architectural frameworks and patterns for positioning Big Data technologies to handle structured data, unstructured data, and machine learning algorithms.
- Experienced in XP methodology, Agile development, and design patterns.
- Working knowledge of data streaming technologies such as Storm, Spark Streaming, and Kafka.
- Executed various project solutions and PoCs on Big Data (CDH and IBM BigInsights), data mining, statistical methods, predictive analytics, near-real-time analytics, search, visualization, alerts, reports, and dashboards.
TECHNICAL SKILLS:
Hadoop/Big Data: CDH 5.0, IBM BigInsights 2.1, HDFS, YARN, MapReduce, Hive, Pig, HBase, Cassandra, Sqoop, Flume, Kafka, and Spark Streaming
Java & J2EE: Java, Servlets, EJB, JSP, Web Services, PL/SQL, Struts, Spring, Hibernate, and SOA
IDE: Eclipse, JBuilder, Workshop, and JDeveloper
CM Tools: SVN, SCME, Borland StarTeam 6.0, and CVS
UI: HTML5, JavaScript, JSP, JPF, DHTML, AJAX
App Servers: WebLogic 11g, IBM WebSphere 6.0, and Tomcat 7.x
Databases: Oracle 10g/11g, MySQL
Tools: TOAD/SQL Developer, SoapUI, Borland Together Architect 1.1, CaliberRM 8.0, PMD/Sonar, Ant/Maven/Jenkins, Log4j, JUnit 4.x
OS: Sun Solaris, AIX, Windows XP/7, UNIX, Linux, DOS, and cloud environments
PROFESSIONAL EXPERIENCE:
Confidential, Middletown, NJ
Lead Engineer
Development Environment: CDH, MapReduce, Flume, Kafka, Spark Streaming, Pig, Java, J2EE, SOAP/REST web services, Maven, Jenkins, PL/SQL, Unix/Linux, HTML, XML, XSL, JSP, Struts 1.2, Eclipse, SoapUI, Tomcat 7.3, WebLogic Application Server 10.3.6, SCME, Confidential QC, Cassandra, Toad, and Oracle 11g
Responsibilities:
- Built distributed, scalable, and reliable data pipelines for large data sets with high throughput (~10 GB/hour) and low latency (~80 ms).
- Architected, designed, and developed SOAP-based web services.
- Executed various project solutions and PoCs on Big Data (CDH) and Kafka, covering data mining, statistical methods, predictive analytics, near-real-time analytics, search, alerts, reports, and dashboards.
- Prepared a Big Data & Analytics service catalog (use case scenarios) and architectural framework.
- Developed MapReduce programs to cleanse data in HDFS obtained from heterogeneous data sources and make it suitable for ingestion into Hive schemas for analysis (see the MapReduce sketch after this list).
- Migrated CLOB web service transactions to stored procedures.
- Involved in detailed design and in writing core classes and APIs for the application using the MVC pattern.
- Conducted code reviews with the team to improve quality.
- Migrated 1,500 APIs to the CSI (Confidential CDM) framework.
- Used Maven and Jenkins to automate the build and deployment process.
- Developed modules that integrate with web services providing node updates for another system.
- Used Hive to analyze data in HDFS to identify issues and patterns.
- Used Sqoop to efficiently transfer data between databases and HDFS and used Flume to stream the log data.
- Imported and exported data between MySQL/Oracle and HDFS.
- Customized the parser/loader application for data migration to HBase (see the HBase sketch after this list).
- Developed custom UDFs and implemented Pig scripts.
- Used Kafka and the Spark Streaming API for real-time data processing (see the streaming sketch after this list).
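Example (a minimal sketch of the cleansing pattern above, assuming comma-delimited source records and an illustrative seven-field layout; the class name, field count, and delimiter are hypothetical, not the production schema):

    // Hypothetical sketch: map-only cleansing job; schema and paths are illustrative.
    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class CleanseJob {
        public static class CleanseMapper
                extends Mapper<LongWritable, Text, NullWritable, Text> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split(",", -1);
                if (fields.length != 7) {
                    return; // drop malformed rows so Hive sees a fixed schema
                }
                StringBuilder cleaned = new StringBuilder();
                for (int i = 0; i < fields.length; i++) {
                    if (i > 0) cleaned.append('\t'); // Hive's default field delimiter
                    cleaned.append(fields[i].trim());
                }
                context.write(NullWritable.get(), new Text(cleaned.toString()));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "cleanse");
            job.setJarByClass(CleanseJob.class);
            job.setMapperClass(CleanseMapper.class);
            job.setNumReduceTasks(0); // map-only: no shuffle needed for cleansing
            job.setOutputKeyClass(NullWritable.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Running the job map-only (zero reducers) avoids a shuffle when no aggregation is needed.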
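Example (a minimal sketch of loading parsed records into HBase, assuming the HBase 1.x client API; the table name, column family, row key, and values are hypothetical placeholders):

    // Hypothetical sketch: table name, column family, row key, and values are placeholders.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBaseLoader {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create(); // reads hbase-site.xml
            try (Connection conn = ConnectionFactory.createConnection(conf);
                 Table table = conn.getTable(TableName.valueOf("migrated_records"))) {
                // In the real loader, the row key and cells come from the parsed record.
                Put put = new Put(Bytes.toBytes("row-00001"));
                put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("status"),
                              Bytes.toBytes("ACTIVE"));
                table.put(put);
            }
        }
    }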
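Example (a minimal sketch of real-time processing with Kafka and Spark Streaming, assuming the spark-streaming-kafka direct-stream API for Kafka 0.8; the broker address, topic, and batch interval are placeholders):

    // Hypothetical sketch: broker address and topic are placeholders.
    import java.util.Arrays;
    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.Map;
    import java.util.Set;
    import kafka.serializer.StringDecoder;
    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaPairInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;
    import org.apache.spark.streaming.kafka.KafkaUtils;

    public class StreamProcessor {
        public static void main(String[] args) throws Exception {
            SparkConf conf = new SparkConf().setAppName("kafka-stream");
            // 5-second micro-batches
            JavaStreamingContext ssc = new JavaStreamingContext(conf, Durations.seconds(5));

            Map<String, String> kafkaParams = new HashMap<>();
            kafkaParams.put("metadata.broker.list", "broker1:9092"); // placeholder broker
            Set<String> topics = new HashSet<>(Arrays.asList("events")); // placeholder topic

            JavaPairInputDStream<String, String> stream = KafkaUtils.createDirectStream(
                    ssc, String.class, String.class,
                    StringDecoder.class, StringDecoder.class, kafkaParams, topics);

            // Real logic would parse and enrich each record; here we just count per batch.
            stream.map(record -> record._2()).count().print();

            ssc.start();
            ssc.awaitTermination();
        }
    }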
Confidential
Lead Engineer
Development Environment: Java, J2EE, web services, CDH 4, Hive, Pig, MapReduce, Flume, Unix, RHEL, shell scripting, Informatica Big Data Governance Edition, and Datameer.
Responsibilities:
- Participated in functional specification reviews to determine the technical design.
- Developed MapReduce programs to cleanse data in HDFS obtained from heterogeneous data sources and make it suitable for ingestion into Hive schemas for analysis.
- Involved in application design, data architecture, coding, unit testing, deployment, maintenance, and debugging. Used Hive to analyze data in HDFS to identify issues and patterns (see the Hive sketch after this list).
- Monitored system activities and fine-tuned system parameters and configurations to optimize performance and ensure security of systems.
- Experience in installing and configuring CDH4, including NameNode, Secondary NameNode, JobTracker, TaskTrackers, and DataNodes.
- Experience in installing and configuring Hadoop ecosystem components such as HBase, Flume, Pig, and Sqoop, and in cluster tasks such as adding/removing nodes without affecting running jobs or data.
- Used Sqoop to efficiently transfer data between databases and HDFS, and used Flume to stream log data from servers/sensors.
- Mentored and led junior team members in Java MapReduce, Pig Latin, HBase, and Hive.
- Performed data mining (text analytics) and data quality activities, and developed Hive queries and MapReduce code.
- Developed Pig scripts and Java UDFs for extraction and normalization of content (see the UDF sketch after this list).
- Designed and scheduled jobs using the Oozie scheduler.
- Explored Spark, Kafka, and Storm along with other open source projects to create a real-time analytics framework.
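Example (a minimal sketch of the kind of Hive analysis above, issued over JDBC against HiveServer2; the events table, columns, host, and credentials are hypothetical):

    // Hypothetical sketch: host, credentials, table, and columns are placeholders.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveAnalysis {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver"); // HiveServer2 JDBC driver
            try (Connection conn = DriverManager.getConnection(
                        "jdbc:hive2://hiveserver-host:10000/default", "etl_user", "");
                 Statement stmt = conn.createStatement();
                 // Surface error-heavy days to spot issues and patterns.
                 ResultSet rs = stmt.executeQuery(
                        "SELECT event_date, COUNT(*) AS cnt FROM events "
                        + "WHERE status = 'ERROR' GROUP BY event_date ORDER BY cnt DESC")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
                }
            }
        }
    }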
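Example (a minimal sketch of a Java UDF for Pig of the kind described above; the normalization rule is illustrative):

    // Hypothetical sketch: the normalization rule is illustrative.
    import java.io.IOException;
    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    public class NormalizeText extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null; // Pig treats a null return as a null field
            }
            // Lower-case and collapse whitespace so grouping keys match reliably.
            return input.get(0).toString().toLowerCase().replaceAll("\\s+", " ").trim();
        }
    }

In a Pig script the UDF would be registered with REGISTER (e.g., a hypothetical normalize-udfs.jar) and applied inside a FOREACH ... GENERATE expression.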
Confidential
Development Environment: Java, J2EE, PL/SQL, Unix, HTML, XML, JSP 1.1, Struts 1.2, Eclipse 3.2, WLS 9.2, Resin 2.1.16 server, StarTeam 6.0, CaliberRM 8.0, Borland Together Architect 1.1, QC, Toad, and Oracle 10g.
Responsibilities:
- Involved in the SDLC, including requirement gathering using CaliberRM, writing technical specs, drawing sequence and class diagrams with Borland Together Architect, coding, and testing.
- Migrated the application, including Struts, from the Resin web server to WebLogic Application Server 9.2.
- Designed and developed various business components for business logic, using stateless session beans and MDBs (see the MDB sketch after this list).
- Involved in writing core classes and APIs for the application using the MVC and Singleton patterns.
- Developed modules that integrate with web services providing price updates for the catalog.
- Prepared unit test plans and test cases using Confidential Quality Controller.
- Participated in the design and code reviews.
- Built and deployed the application in a WebLogic clustered environment.
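Example (a minimal sketch of an EJB 2.x message-driven bean of the kind described above; the bean name and payload handling are hypothetical, and the ejb-jar.xml/WebLogic descriptor wiring is omitted):

    // Hypothetical sketch: EJB 2.x message-driven bean; destination wiring via
    // ejb-jar.xml and the WebLogic descriptor is omitted.
    import javax.ejb.MessageDrivenBean;
    import javax.ejb.MessageDrivenContext;
    import javax.jms.Message;
    import javax.jms.MessageListener;
    import javax.jms.TextMessage;

    public class OrderEventBean implements MessageDrivenBean, MessageListener {
        private MessageDrivenContext ctx;

        public void setMessageDrivenContext(MessageDrivenContext ctx) { this.ctx = ctx; }
        public void ejbCreate() { }
        public void ejbRemove() { }

        public void onMessage(Message message) {
            try {
                if (message instanceof TextMessage) {
                    // Hand the payload to the business layer asynchronously.
                    String payload = ((TextMessage) message).getText();
                    System.out.println("Processing event: " + payload);
                }
            } catch (Exception e) {
                ctx.setRollbackOnly(); // container-managed tx: trigger redelivery
            }
        }
    }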
Confidential
JAVA Developer
Development Environment: Java, HTML, JSP 1.1, Hibernate 3.0, Eclipse 3.0, WebLogic Application Server 8.1, WebLogic Portal Server 8.1, Tomcat 5.x, and Oracle 9.0
Responsibilities:
- Involved in creating design documents such as the requirement analysis document, and in code development. Involved in detailed design and in writing core classes and APIs for the application using the MVC and Singleton patterns.
- Used the MVC design pattern to architect the application; used JSPs to create the front-end screens for the module, with JPF as the controller.
- Developed Business components to retrieve data and implement the business logic.
Confidential
Sr Software Engineer
Development Environment: J2EE (JSP, Servlets, EJB 2.0, XML), JBuilder 7.0, WebLogic 7.1, and Oracle 9i.
Responsibilities:
- Developed and deployed EJBs, Struts components, and JSPs, and fine-tuned them for better performance.
- Developed Business Objects, Data Access Components in Java.
- Wrote SQL queries to insert, update, and manipulate result sets using JDBC (see the JDBC sketch below).
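Example (a minimal sketch of parameterized JDBC insert/update and result-set handling, assuming an Oracle thin-driver connection; the URL, credentials, table, and columns are hypothetical):

    // Hypothetical sketch: connection URL, credentials, table, and columns are placeholders.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class AccountDao {
        public static void main(String[] args) throws Exception {
            Class.forName("oracle.jdbc.driver.OracleDriver");
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "app_user", "secret")) {
                // Parameterized statements avoid SQL injection and re-parsing.
                try (PreparedStatement ps = conn.prepareStatement(
                        "UPDATE accounts SET status = ? WHERE account_id = ?")) {
                    ps.setString(1, "ACTIVE");
                    ps.setLong(2, 1001L);
                    ps.executeUpdate();
                }
                try (PreparedStatement ps = conn.prepareStatement(
                        "SELECT account_id, status FROM accounts WHERE status = ?")) {
                    ps.setString(1, "ACTIVE");
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            System.out.println(rs.getLong(1) + "\t" + rs.getString(2));
                        }
                    }
                }
            }
        }
    }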
Confidential
Software Engineer
Development Environment: Java, JSP, Servlets, Tibco Portal Builder and Oracle 9i.
Responsibilities:
- Performed analysis, design, and development for the password recovery modules.