Hadoop Big Data Architect Resume
Columbus, OH
OBJECTIVE
- Seeking a challenging position in an organization where I can articulate and provide optimal solutions for Big Data problems involving huge volumes of data.
SUMMARY
- 8+ years of professional experience in Data Analytics, Object Oriented Analysis, and the design and development of multi-tiered applications.
- Around 2 years of experience as a Big Data/Hadoop Architect.
- Architected projects for handling huge volumes of data.
- Provided architectural solutions for projects using Hadoop, Big Data, Java MapReduce, Hive, Sqoop, Flume, Apache Spark, Storm, Zookeeper, Kafka and J2EE technologies.
- Planned and delivered Big Data solutions using the Hadoop ecosystem.
- Skilled at understanding the problem at hand and providing a solution for it.
- Analyze requirements and raise the qualitative questions that are essential for architecting a project.
- Followed TOGAF standards for building the project.
- Sun Certified Java Programmer (SCJP 1.5).
- Hands-on development experience working with huge volumes of data using Big Data technologies such as Hadoop, MapReduce, Hive, Pig, Flume and Sqoop.
- Experience using data visualization and analytics tools such as R and D3.js.
- Wrote machine learning algorithms in R for predictive analytics.
- Provided data warehousing solutions using Hive for huge volumes of data.
- Handled data variety using Hadoop Streaming and data velocity using Flume.
- Used NoSQL technologies such as HBase, MongoDB and Cassandra for storing huge volumes of data.
- Good understanding of Zookeeper and Kafka for monitoring and managing Hadoop jobs and resources.
- Experience writing programs using Apache Spark and Storm for real-time analytics.
- Hands-on development experience using J2EE, Spring, Hibernate, XML, XSLT, GWT, REST and Web Services.
- Experience developing a custom search engine application using Lucene, Tika and Solr.
- Expertise in refactoring code using design patterns and object-oriented programming.
- Good understanding of the Hadoop ecosystem and the application of its technologies.
- Good understanding of the project life cycle process and ability to work in a team as well as an individual contributor.
- Expertise in developing large and highly scalable applications.
- Expertise in performance analysis and automation with JMeter and Dynatrace.
- Expertise in Maven and Ant as build and application tools.
- Excellent understanding of supervised and unsupervised learning algorithms using R.
- Good understanding of linear regression and polynomial regression using R.
- Good understanding of logistic regression, polynomial logistic regression and classification using R.
- Worked on data analytics, web applications, client-server technologies, design and quality methodologies, and business rule management.
- Ability to analyze, design and code programs to generate optimum output.
- Good RDBMS knowledge; worked extensively with Oracle, MySQL and SQL Server.
- Extensive experience collaborating with business analysts, business users and clients to understand requirements.
- Create high-level and low-level design documents and provide solutions for the requirements.
- Adept in handling customers, technically driving projects, leading teams and maintaining strict quality focus.
- Well versed with Agile Scrum methodologies.
- Outstanding ability to initiate action, make decisions and issue resolutions.
- Excellent communication and documentation skills. Adapt quickly and competently to new technologies.
- Highly motivated, dependable and work under minimal supervision to see projects through shipping.
- Excellent exposure to preparing Functional Specification and Technical Application Design documents.
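The linear regression work summarized above can be sketched minimally in Java. This is an illustrative sketch only: the data points are hypothetical, and ordinary least squares is coded directly here rather than through R's modeling functions mentioned in the resume.

```java
import java.util.Arrays;

// Minimal ordinary least squares fit, illustrating the linear regression
// work described above. The sample data in main() is hypothetical.
public class SimpleLinearRegression {
    // Returns {intercept, slope} minimizing squared error for y ≈ a + b*x.
    public static double[] fit(double[] x, double[] y) {
        double meanX = Arrays.stream(x).average().orElse(0);
        double meanY = Arrays.stream(y).average().orElse(0);
        double num = 0, den = 0;
        for (int i = 0; i < x.length; i++) {
            num += (x[i] - meanX) * (y[i] - meanY);
            den += (x[i] - meanX) * (x[i] - meanX);
        }
        double slope = num / den;
        return new double[] {meanY - slope * meanX, slope};
    }

    public static void main(String[] args) {
        // Points lying exactly on y = 1 + 2x, so the fit recovers those coefficients.
        double[] x = {1, 2, 3, 4};
        double[] y = {3, 5, 7, 9};
        double[] coef = fit(x, y);
        System.out.printf("intercept=%.2f slope=%.2f%n", coef[0], coef[1]);
    }
}
```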
TECHNICAL SKILLS
Languages: Core Java 6, XML, Multi-threading, JDBC, UML
Big Data Technologies: MapReduce, HDFS, Hive, Pig, Sqoop, Lucene, Tika, R, Flume, Oozie, YARN, MR2, Solr, Elasticsearch, Streaming, Spark, Storm, Kafka, Zookeeper, AWS
Web Technologies/ APIs: J2EE, JNDI, Servlets, JSP, Spring MVC, JavaScript, CSS, XSD, XPath, XSLT, JAXB, Web Services, REST, GWT, D3.js
Frameworks: Spring 4.0.1 (Core, Context, AOP, JDBC, ORM), Hibernate 4.1.9, ACORD
RDBMS: Oracle 10g, MySQL, Hive
IDEs: Eclipse Galileo/Indigo, STS
Servers: IBM WebSphere Application Server 6, Apache Tomcat 6
Caching Framework: Ehcache 2.7.2
Version Control: SVN, CVS, Tortoise
Continuous Integration: Hudson, Jenkins
Build Tools: Ant, Maven 2.x, 3.x
Issue Tracker System: HP ALM, ServiceNow, Bugzilla
Testing Framework: JUnit, MRUnit
Performance Tools: JMeter, Dynatrace
Operating System: Windows 2000/2003/XP/7, UNIX, Linux
NoSQL Technologies: HBase, Hive, MongoDB, Cassandra, Couchbase
Process Mining Tools: RapidMiner, Disco, ProM 6
PROFESSIONAL EXPERIENCE
Confidential, Columbus, OH
Hadoop Big Data Architect
Responsibilities:
- Architected the project's move from the legacy system to Big Data, as our volume of data was growing huge.
- Provided solutions and architecture for new modules, developed iteratively, that are used for predictive analysis.
- Analyzed the requirements and raised the questions that help architect a project so it addresses the maximum number of business problems.
- Effectively utilized the Hadoop ecosystem in architecting the project.
- Wrote programs for the distributed environment using Java, MapReduce, Hive, Pig, Sqoop, Flume, MySQL, R, Apache Storm, Zookeeper and Kafka.
- Enhanced existing components to move the data-intensive system from a traditional setup to a highly available, scalable one.
- Designed and wrote MapReduce programs/algorithms for different data-related business problems.
- Provided solutions for ad hoc, on-the-go ETL business requests using the Hive and Pig technologies of the Hadoop ecosystem.
- Prepared low-level design documents for all development, covering minor and major changes.
- Prepared high-level design documents to give an overall picture of system integration.
- Prepared a unit test document for each release, clearly indicating the steps followed while unit testing different scenarios and capturing the results.
- Debugged the log files whenever a problem arose in the system and performed root cause analysis.
- Led the team and took the initiative to conduct scrum calls in the absence of the Scrum Master.
- Reviewed code and suggested improvements.
- Used Agile methodology with stories, sprints and scrums.
- After every release, prepared the release document containing the deployment and integration information needed at production time.
- Wrote and used machine learning algorithms for predicting future customer retention and offering discounts.
- Wrote data visualization programs using D3.js for presenting the predictive data.
- Wrote code to infer drivers' driving habits and reach a logical analysis that provides a balanced premium.
- Analyzed the requirements and asked questions for a better understanding, creating the path for development.
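The MapReduce programs mentioned above follow a map → shuffle → reduce pattern; a standalone word count, the canonical example, sketches that shape. This is a conceptual in-memory version only; the actual jobs would implement Hadoop's Mapper/Reducer API rather than Java streams.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Standalone sketch of the map -> shuffle -> reduce pattern behind the
// MapReduce programs described above, using a word count as the example.
// Real jobs would implement Hadoop's Mapper and Reducer classes instead.
public class WordCountSketch {
    public static Map<String, Long> count(List<String> lines) {
        return lines.stream()
                // map phase: emit one token per word in each input line
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(w -> !w.isEmpty())
                // shuffle + reduce phase: group identical words and sum their counts
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts = count(List.of("big data big wins", "data flows"));
        System.out.println(counts);
    }
}
```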
Environment: Core Java 6, MapReduce, YARN, MR2, Hive, HDFS, Pig, Sqoop, Streaming, Flume, Storm, Kafka, XML, XSL, UML, Multi-threading, Servlets, JUnit 4.8, MRUnit, Linux, HP ALM, ServiceNow, Zookeeper, HBase, Ganglia Monitoring, D3.js, JSP, JavaScript, Apache Log4j, ACORD, MVP, MySQL, Artius rating engine, R.
Senior Hadoop Big Data Developer
Confidential
Responsibilities:
- Designed and developed a new module used for predictive analysis and inferring the data in a distributed environment. Used Java, AWS, Elastic MapReduce, Hive, Pig, Sqoop, Spark, HBase, Oracle, Lucene and Tika.
- Enhanced existing components to move the data-intensive system from a traditional setup to a highly available, scalable one.
- Extensively wrote MapReduce programs/algorithms for different data-related problems.
- Effectively utilized Hive and Pig for ad hoc and instant results.
- Prepared low-level design documents for all development, covering minor and major changes.
- Prepared high-level design documents to give an overall picture of system integration.
- Prepared a unit test document for each release, clearly indicating the steps followed while unit testing different scenarios and capturing the results.
- Debugged the log files whenever a problem arose in the system and performed root cause analysis.
- Reviewed code and suggested improvements.
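The Lucene-based search work in this role rests on an inverted index, which maps each term to the documents containing it. The sketch below is a minimal in-memory version of that idea with hypothetical documents; Lucene itself supplies the analyzers, scoring and on-disk segments that a real deployment used.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
import java.util.TreeSet;

// Minimal in-memory inverted index, illustrating the core data structure
// behind the Lucene-based search work described above. Lucene adds
// tokenization analyzers, relevance scoring and on-disk segments on top.
public class InvertedIndexSketch {
    private final Map<String, Set<Integer>> index = new HashMap<>();

    // Tokenize the text naively on non-word characters and record the doc id.
    public void add(int docId, String text) {
        for (String token : text.toLowerCase().split("\\W+")) {
            if (!token.isEmpty()) {
                index.computeIfAbsent(token, k -> new TreeSet<>()).add(docId);
            }
        }
    }

    // Returns the ids of documents containing the term (empty set if none).
    public Set<Integer> search(String term) {
        return index.getOrDefault(term.toLowerCase(), Set.of());
    }

    public static void main(String[] args) {
        InvertedIndexSketch idx = new InvertedIndexSketch();
        idx.add(1, "predictive analysis of driver data");
        idx.add(2, "driver premium rating");
        System.out.println(idx.search("driver")); // [1, 2]
    }
}
```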
Environment: Core Java 6, Elastic MapReduce, AWS, YARN, MR2, Hive, HDFS, Spark, Pig, Sqoop, Streaming, Flume, Cassandra, XML, XSL, UML, Multi-threading, Servlets, JUnit 4.8, MRUnit, Linux, Lucene, Tika, Zookeeper, Ganglia Monitoring, JSP, JavaScript, Apache Log4j, R.
Confidential, Harleysville, PA
Senior Java Developer
Responsibilities:
- Held discussions with business users on business and architectural requirements.
- Designed the architectural flow of data from the application to various back-end systems.
- Created HLDs and LLDs for projects.
- Designed and developed a middleware application to integrate the agent and insurance company systems.
- Developed and executed Java, Lucene, Tika and Spring code to search the index and integrate the code.
- Designed and developed XSLTs to transform one request into the format of another comparator system.
- Created daily builds and deployed them to the remote system.
- Participated in design and code reviews.
- Provided support to Integration testing teams.
- Provided production support on go-live.
Environment: JDK 1.5, Lucene, Tika, Spring Core, ACORD, MySQL, WebLogic, Tomcat, Tortoise, JAX-WS 2.0, Apache Log4j, Hibernate, Oracle, JUnit, SOAP, XML, XSLT, XPath, SQL Developer, JNDI, Mercury Quality Center
Confidential
Senior Java/J2ME/J2EE Developer
Responsibilities:
- Gave input on the design of the project.
- Customized the design of the project by providing canvas-based components.
- Analyzed the requirements and designed per the modifications.
- Tracked the porting issues.
- Interacted with the team, shared knowledge and took the lead in handling the team.
- Java programming.
- Prepared the functional specification.
- Met the requirements per the road map of the project.
Environment: JDK 1.5, Eclipse, J2ME, Oracle SQL Server, KXML, JUnit, Client-Server, RMI, Hibernate, XML, JAI (Java Advanced Imaging), DOM Parser, SVN for version control, Bugzilla for tracking defects, Spring IOC, Spring Core, Spring MVC.
Confidential, Boston
Senior Java/ J2EE Developer
Responsibilities:
- Interacted with the client and understood the requirements.
- Analyzed the requirements and designed accordingly.
- Coded in Java.
- Writing test cases and documenting the work done.
Environment: JDK 1.5, Eclipse, Oracle SQL Server, JUnit, Client-Server, Hibernate, DOM Parser, SVN for version control, Bugzilla for tracking defects, Voiger framework, Spring Core, Spring MVC.
Confidential
Java/ J2EE Developer
Responsibilities:
- Interacted with the Team Lead.
- Analyzed the requirements and designed accordingly.
- Java, Spring and Hibernate programming.
- Writing test cases and documenting the work done.
Environment: JDK 1.5, Eclipse, Oracle SQL Server, JUnit, Client-Server, Hibernate, DOM Parser, SVN for version control, Bugzilla for tracking defects, Spring IOC, Spring Core, Spring MVC.
Confidential
Java/J2ME / J2EE Developer
Responsibilities:
- Analysis of the requirements and design
- Coding and writing the unit test cases
- Preparing the document of the work done
- Support, Maintenance and Enhancement.
Environment: JDK 1.5, Eclipse, J2ME, Oracle SQL Server, KXML, JUnit, Client-Server, RMI, JDBC, XML, JAI (Java Advanced Imaging), DOM Parser, SVN for version control, Bugzilla for tracking defects, Spring IOC, Spring Core, Spring MVC.
Confidential
Responsibilities:
- Analyzed requirements and prototyped various forms.
- Coded Java, J2ME, Servlets, XML, JAXB and database programs.
- Preparing the document of the work done
- Support, Maintenance and Enhancement
Environment: JDK 1.5, Eclipse, J2ME, Oracle SQL Server, KXML, DOM Parser, SVN for version control, Bugzilla for tracking defects, Spring IOC, Spring Core, Spring MVC.