Solution Architect Resume
Hoffman Estates, IL
SUMMARY
- 14+ years of experience with emphasis on Big Data and Java technologies, spanning solution architecture, development, and administration, including analysis, design, development, implementation, testing, deployment, and support of Java-based enterprise applications.
- 4+ years of experience in Hadoop with Spark and Scala as a Solution Architect, covering development, administration, and establishing standards and processes for Hadoop-based application design, implementing data analytics on datasets ranging from 75 TB to 100 TB.
- Hands-on experience installing, configuring, and using ecosystem components such as Hadoop MapReduce, HDFS, HBase, Oozie, Hive, Pig, and Flume.
- Performed data analysis using Hive and Pig and loaded log data into HDFS using Flume. Experienced with Sqoop, ZooKeeper, and Cloudera Manager.
- Solid background in core Java concepts such as threads and the Collections Framework, with hands-on experience building class diagrams, activity diagrams, sequence diagrams, and flow charts using Rational Rose and Visio.
- Application development, maintenance (client/server architecture), and infrastructure support of web-based applications built on Java/J2EE using Servlets, JSP, JSTL, RMI, EJB (session beans, message-driven beans), Struts, Spring, JSF, JavaBeans, JDBC, JMS, Hibernate 3.0, and MVC architecture, deployed on application servers (WebSphere, JBoss/Tomcat, and WebLogic) running on Linux, UNIX, and Solaris.
- Strong database connectivity skills covering Oracle, MySQL, and DB2, including programming with SQL, PL/SQL, stored procedures, triggers, functions, and packages, and writing DDL, DML, and transaction queries with development tools such as SQL Developer.
- Experienced in using design patterns: Singleton, Session Facade, MVC, Business Delegate, Factory, Value Object, DAO, and Data Source.
- Proven expertise in implementing IoC (Inversion of Control) and dependency injection across the Spring Framework (Core, Web, JDBC, MVC, DAO); experienced in integrating Spring Web Flow with frameworks such as JSF, with exposure to SOA using web services (a minimal dependency-injection sketch follows this summary).
- Developed scripts for automating tasks using Ant, Perl, Python, Ruby, and UNIX shell programming.
- Built and deployed EAR, WAR, and JAR files on test, stage, and production servers.
- Experienced in deploying applications on application servers such as BEA WebLogic, IBM WebSphere, JBoss, and Tomcat.
- Extensive knowledge and hands-on experience in web development using HTML, DHTML, AJAX, CSS, JavaScript, ExtJS, XML, XSL, and XSLT, with validation via DTD and XML Schema and file processing with SAX and DOM parsers.
- Good understanding of and working experience with software methodologies such as Agile/Scrum and Waterfall.
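As a minimal sketch of the Spring IoC/dependency-injection experience noted above (the bean names and wiring are hypothetical, not drawn from any listed project, and Spring 4+ is assumed for annotation-based configuration):

```java
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Hypothetical DAO contract; a real project would back it with JDBC or Hibernate.
interface GreetingDao {
    String fetchGreeting();
}

// The service receives its dependency via constructor injection, not a lookup.
class GreetingService {
    private final GreetingDao dao;
    GreetingService(GreetingDao dao) { this.dao = dao; }
    String greet() { return dao.fetchGreeting(); }
}

@Configuration
class AppConfig {
    @Bean
    GreetingDao greetingDao() {
        return () -> "hello from the DAO layer";
    }

    @Bean
    GreetingService greetingService(GreetingDao dao) {
        return new GreetingService(dao); // Spring supplies the DAO bean
    }
}

public class IocDemo {
    public static void main(String[] args) {
        try (AnnotationConfigApplicationContext ctx =
                 new AnnotationConfigApplicationContext(AppConfig.class)) {
            System.out.println(ctx.getBean(GreetingService.class).greet());
        }
    }
}
```

Constructor injection keeps GreetingService testable outside the container: a unit test can pass in a stub GreetingDao directly.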
TECHNICAL SKILLS
Languages: Java (JDK 1.4/1.5/1.6/1.7/1.8), SQL, PL/SQL, C, C++
Operating systems: UNIX, Linux, Sun Solaris, Windows Server 2003, Windows XP, Windows 7.
Databases: Teradata, Oracle 10g/11g, MS SQL Server 2008, DB2, MySQL, Sybase.
Big Data ecosystem: Hadoop 1.x/2.x, Hive, Drill, Pig, Sqoop, Oozie, ZooKeeper, Spark, Scala, Kafka, Flume
Big Data Integration Systems: Talend 5.6
NoSQL: HBase, Cassandra, MongoDB, MarkLogic
Hadoop Distributions: MapR, Cloudera, Hortonworks
Scripting Language: Perl, Python, JavaScript, VB Script, Shell Scripting
Web/J2EE Technologies: Servlets, JSP, JDBC, JNDI, Tag Libraries, MVC, Struts, Spring, DOM, Hibernate, RMI, EJB, CSS, HTML, XML, DHTML, Ajax, JQuery, ExtJS.
Messaging: IBM MQ-Series, JMS, RMI
Web/App. Servers: WebSphere Application Server 6.0/7.0, Apache Tomcat 7.0, WebLogic 8.1, Apache HTTP Server, IBM HTTP Server
Web Services: SOAP, REST, WSDL, UDDI, SOA architecture
IDEs: RAD 7.0, WSAD 5.1, RSA, Eclipse, NetBeans.
Config. Management: Puppet, Chef, Jenkins
Tools: TOAD, DbVisualizer, Aqua Data Studio 15.0, XMLSpy, Ant, PuTTY, EditPlus, PL/SQL Developer, Bugzilla, Enterprise Architect
ETL Tool: Informatica 6.0
Design Patterns: MVC, Singleton, DTO, DAO, Factory Pattern etc.
Frameworks: Struts, JSF, Spring (Dependency Injection), Hibernate, AJAX, Log4J.
Unit Testing: JUnit.
Version Control: Rational ClearCase, CVS, VSS, SVN
Rules Engine: JBoss Rules (Drools)
PROFESSIONAL EXPERIENCE
Confidential, Hoffman Estates, IL
Solution Architect
Responsibilities:
- Analyzed large data sets by running Hive queries and Pig scripts
- Developed complex Pig UDFs for business transformations (an illustrative Java sketch follows this role's environment line)
- Worked with the Data Science team, the Teradata team, and the business to gather requirements for various data sources such as web scrapes and APIs
- Involved in creating Hive/Impala tables and loading and analyzing data using Hive queries
- Developed simple to complex MapReduce jobs using Hive and Pig
- Involved in running Hadoop jobs to process millions of records and applying compression techniques
- Developed multiple MapReduce jobs in java for data cleaning and preprocessing
- Involved in loading data from the Linux file system to HDFS and wrote shell scripts for productionizing the MAP (Member Analytics Platform) project
- Loaded and transformed large sets of structured and semi-structured data
- Loaded the Golden collection into Apache Solr using morphline code for the business team
- Assisted in exporting analyzed data to relational databases using Sqoop
- Designed the HBase data model for large transactional sales data
- Built a proof of concept on Storm for streaming data from one of the sources
- Built a proof of concept on Pentaho for Big Data
- Implemented one of the data source transformations in Spark
- Worked in Agile methodology and used iceScrum for development and project tracking
Environment: CDH 5.0, Hadoop, HDFS, Pig, Hive, Impala, Solr, Morphlines, MapReduce, Sqoop, HBase, shell scripting, Pentaho, Spark, Storm, Teradata, Big Data
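A minimal sketch of the Pig UDF work referenced above, assuming a hypothetical field-normalization rule rather than the actual business logic:

```java
import java.io.IOException;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

// Hypothetical Pig UDF: trims and upper-cases a free-text field.
// The production UDFs carried real business transformations; this
// only illustrates the EvalFunc extension point they are built on.
public class NormalizeField extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null; // Pig treats this as a null field
        }
        return input.get(0).toString().trim().toUpperCase();
    }
}
```

In a Pig script the packaged jar is loaded with REGISTER and the function is then invoked like a built-in, e.g. FOREACH records GENERATE NormalizeField(name).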
Confidential, MN
Solution Architect
Responsibilities:
- Built data integration workflows using Talend and various Big Data components to ingest, enrich, and distribute data in a MapR Hadoop ecosystem.
- Implemented file-, source-, and record-level data quality checks, logging and error handling, and job and workflow scheduling.
- Implemented role-based security across all data storage locations, including file-, table-, row-, and column-level security.
- Compared data across storage locations: from files to Hive to MS SQL Server.
- Evaluated business requirements and prepared detailed specifications that follow project guidelines required to develop written programs.
- Responsible for building scalable distributed data solutions using Talend
- Analyzed large data sets to determine the optimal way to aggregate and report on them.
- Handled importing of data from various data sources, performed transformations using Hive and Talend MapReduce, and loaded data into HDFS.
- Importing and exporting data into HDFS using Sqoop.
- Wrote Talend MapReduce code to convert unstructured data into semi-structured data and loaded it into Hive tables (a hand-written equivalent is sketched after this role's environment line).
- Worked extensively in creating Talend MapReduce jobs to power data for search and aggregation
- Worked extensively with Sqoop for importing metadata from Oracle.
- Created partitioned tables in Hive.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as Talend MapReduce jobs.
- Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting.
- Created Hbase tables to store various data formats of data coming from different portfolios.
Environment: Hadoop, MapReduce, HDFS, Talend, Hive, Oozie, Java (JDK 1.6), NoSQL (HBase), PL/SQL, Toad 9.6, UNIX shell scripting.
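Talend generates MapReduce code under the hood; a hand-written Java equivalent of the unstructured-to-semi-structured step above might look like the following (the log format and regex are hypothetical):

```java
import java.io.IOException;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Sketch of a map-only structuring step: extracts (timestamp, level,
// message) from free-form log lines and emits tab-delimited records
// that a Hive table can sit on top of.
public class LogStructureMapper
        extends Mapper<LongWritable, Text, NullWritable, Text> {

    private static final Pattern LOG_LINE =
        Pattern.compile("^(\\S+ \\S+) \\[(\\w+)\\] (.*)$");

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        Matcher m = LOG_LINE.matcher(value.toString());
        if (m.matches()) {
            String record = m.group(1) + "\t" + m.group(2) + "\t" + m.group(3);
            context.write(NullWritable.get(), new Text(record));
        }
        // Non-matching lines are dropped; a production job would count
        // them with a counter or route them to a reject path.
    }
}
```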
Confidential, Richmond, VA
Hadoop Technical Analyst/Lead
Responsibilities:
- Installed and configured HDFS and developed multiple MapReduce jobs in Java for data cleansing and preprocessing. Involved in loading data from the UNIX file system to HDFS.
- Installed and configured Hive and wrote Hive UDFs (an illustrative UDF sketch follows this role's environment line).
- Involved in complete SDLC activities - Requirement Analysis (BRA), Design, development and in various testing phases (Unit Testing, Integration Testing, System Testing and UAT).
- Implemented architectural designs while building solid relationships with stakeholders at all levels
- Worked with functional analysts, developers, and development managers to ensure that all solutions were deployed within agreed timelines and supported after delivery
- Identified where exceptions to the enterprise architecture standards were required
- Analyzed large data sets to determine the optimal way to aggregate and report on them.
- Provided quick responses to ad-hoc internal and external client requests for data; experienced in creating ad-hoc reports.
- Responsible for building scalable distributed data solutions using Hadoop.
- Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, manage and review data backups, manage and review Hadoop log files.
- Handled importing of data from various data sources, performed transformations using Hive, MapReduce, and loaded data into HDFS.
- Extracted the data from Teradata into HDFS using Sqoop.
- Analyzed the data by performing Hive queries and running Pig scripts to understand user behavior segments such as shopping enthusiasts, travelers, and music lovers.
- Exported the patterns analyzed back into Teradata using Sqoop.
- Continuous monitoring and managing the Hadoop cluster through Cloudera Manager.
- Installed the Oozie workflow engine to run multiple Hive jobs.
- Developed Hive queries to process the data and generate data cubes for visualization.
Environment: Hadoop, MapReduce, HDFS, Hive, Oozie, Java (JDK 1.6), Cloudera, NoSQL, Oracle 11g/10g, PL/SQL, SQL*Plus, Toad 9.6, UNIX shell scripting, Python scripting.
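A minimal sketch of a Hive UDF in the old-style org.apache.hadoop.hive.ql.exec.UDF API (current for the JDK 1.6-era stack listed above); the masking rule itself is hypothetical:

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical Hive UDF: masks all but the last four characters
// of an identifier column.
public final class MaskId extends UDF {
    public Text evaluate(Text id) {
        if (id == null) return null;
        String s = id.toString();
        if (s.length() <= 4) return new Text(s);
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < s.length() - 4; i++) masked.append('x');
        masked.append(s.substring(s.length() - 4));
        return new Text(masked.toString());
    }
}
```

Once packaged, it would be registered in a Hive session with ADD JAR and CREATE TEMPORARY FUNCTION mask_id AS 'MaskId' before use in queries.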
Confidential, Indianapolis, Indiana
Lead developer
Responsibilities:
- Wrote the entire business logic using Core Java 1.6, J2EE, and Spring.
- Developed Struts ActionForms, Action classes, and templates, and performed action mapping in struts-config.xml (a minimal Action sketch follows this role's environment line).
- Involved in complete SDLC activities - Requirement Analysis (BRA), Design, development and in various Testing phases (Unit Testing, Integration Testing, System Testing and UAT).
- Implemented MVC architecture using the WebSphere Studio Application Developer (WSAD) 5.1.2 IDE, which is based on Eclipse and was also used as the deployment tool.
- Worked with service Management Team and Business Teams to get the requirements, developed Class Diagrams, Use case diagrams and sequence diagrams.
- Created a Service-Oriented Architecture (SOA) and web services with XML, XSD, WSDL, and SOAP. Used SoapUI to test web service responses.
- Experienced in application migration and data migration, including user migration from the existing application to the new application. Generated Jasper reports for the project.
- Implemented new projects and provided fixes for tickets and VIDs on critical and high-priority change requests, on time and with minimal defects across all the applications.
- Involved in tuning SQL queries and stored procedures for better performance.
- Developed build scripts to deploy the applications onto WebSphere Application Server and provided production support.
Environment: Java 1.6, WAS 6.1, RAD 7.0.8, ClearCase 7.0, ClearQuest 7.0, JSF 1.1, RichFaces 3.1, JSP, Servlets, Struts 1.2, Maven, HTML, CSS, JavaScript, jQuery, Log4j, SVN, Spring 2.0.6, JPA-Hibernate 3.x, Oracle 9.2, Quartz Scheduler 1.6, JMS, Web Services, SOAP, WSDL, Apache Lucene Search Engine 2.4, Ant scripts, wsadmin scripting (Jython), PL/SQL, Work Manager, Distributed Cache.
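A minimal Struts 1.x sketch of the Action/ActionForm pattern described above; the form bean, credential check, and forward names are hypothetical, and the mapping itself would live in struts-config.xml:

```java
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.struts.action.Action;
import org.apache.struts.action.ActionForm;
import org.apache.struts.action.ActionForward;
import org.apache.struts.action.ActionMapping;

// Hypothetical form bean populated by Struts from the request parameters.
class LoginForm extends ActionForm {
    private String username;
    public String getUsername() { return username; }
    public void setUsername(String username) { this.username = username; }
}

public class LoginAction extends Action {
    @Override
    public ActionForward execute(ActionMapping mapping, ActionForm form,
                                 HttpServletRequest request,
                                 HttpServletResponse response) {
        LoginForm loginForm = (LoginForm) form;
        // Placeholder check; a real Action would delegate to a service layer.
        boolean ok = "admin".equals(loginForm.getUsername());
        return mapping.findForward(ok ? "success" : "failure");
    }
}
```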
Confidential, Atlanta, Georgia
Sr. Software Engineer
Responsibilities:
- Participated in the requirements gathering sessions and the detailed design of the entire architecture.
- Applied MVC architecture using the Struts framework
- Developed the presentation tier using HTML, JSP, Servlets, JavaScript and Tag Libs.
- Responsible for coding SQL statements and stored procedures for back-end communication using JDBC (a minimal JDBC call sketch follows this role's environment line)
- Involved in developing shell scripts for data load into Oracle.
- Involved in Analysis and Design of the object-oriented J2EE application/Project.
- Developed the user interfaces with Struts Tiles framework
- Involved in tuning SQL queries and stored procedures for better performance.
- Creating logs for error tracking and security using Log4J.
- Involved in writing Product Documents.
Environment: Java 1.5, JSP, Servlets, JDBC, Struts 1.2, Hibernate, EJB, IBM ClearCase, CVS, Ant, PL/SQL, Oracle 9i, HTML, CSS, JavaScript, Eclipse, Log4j, SVN, PuTTY, WebLogic 8.1, RAD, UNIX.
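A minimal sketch of the JDBC stored-procedure communication described above; the procedure name, parameters, and connection details are hypothetical placeholders:

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Types;

// Calls a hypothetical get_order_count procedure with one IN and one
// OUT parameter over plain JDBC.
public class OrderCountClient {
    public static void main(String[] args) throws SQLException {
        String url = "jdbc:oracle:thin:@dbhost:1521:orcl"; // placeholder
        try (Connection con = DriverManager.getConnection(url, "user", "pass");
             CallableStatement cs = con.prepareCall("{call get_order_count(?, ?)}")) {
            cs.setString(1, "OPEN");                   // IN: order status
            cs.registerOutParameter(2, Types.INTEGER); // OUT: row count
            cs.execute();
            System.out.println("Open orders: " + cs.getInt(2));
        }
    }
}
```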
Confidential
WAS Admin
Responsibilities:
- Hands-on experience working directly with customers to triage technical issues, provide resolutions on support issues, facilitate testing, and validate deliverables within the scope of support per the Service Level Agreement.
- Provide solutions to a variety of technical problems of varied scope and complexity.
- Managed J2EE Web Application Infrastructure (IIS, iPlanet Web Server) configuration, upgrade, and support.
- Worked with vendor support to resolve application issues (including third-party applications) such as out-of-memory errors and performance degradation.
- Administered active applications and monitored their performance, tuned JVM parameters for optimum performance.
- Experienced in setting up nodes, data sources, and virtual hosts, including planning the installation and configuration of WebSphere Application Server.
- Involved in creating and managing WebSphere environment variables for data source and JDBC driver path.
- Performed monthly Side A/B switching of production and staging application and web servers.
- Built and deployed code for development, QA, staging, and production environments.
- Experienced in applying patches for WebSphere Application Server, IBM HTTP Server.
- Hands on experience in deployment of J2EE applications on WebSphere Application Servers 6.0/5.x.
- Plan and implement the installation of service packs, hot fixes, and upgrades.
- Strong knowledge of WebSphere internals and the interpretation of log files.
- Installed fixes, fix packs, and APARs for WAS.
- Monitored the applications and servers using Compuware Vantage and Wily Introscope.
- Wrote shell scripts and created cron jobs.
- Troubleshoot problems related to WebSphere architecture and applications in conjunction with helpdesk, development team, networking team, and vendor.
- Served as ITSM change manager for IT infrastructure and dealer-based applications, including mission-critical web applications and high-availability and standard applications.
- Remote administration of Solaris Servers.
- Provided 24x7 customer support.
- Troubleshot WebSphere Application Server and application deployment issues.
- Implemented ITSM methodologies.
- Completed ITSM Foundation training classes.
Environment: Java, J2EE, WebSphere 6.0/5.x, Oracle 9i, MyEclipse.
Confidential
Software Engineer
Responsibilities:
- Understanding the architecture of the application.
- Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Filter, Router, Rank, and Sequence Generator.
- Performed status checks every day and troubleshooting in case of issues.
- Performed data manipulation using basic functions and Informatica transformations.
- Involved in the development of complex mappings.
- Used Informatica Workflow Manager to create, schedule and monitor sessions.
- Used pre- and post-session emails to communicate success or failure of session execution.
- Involved in Peer reviews.
- Used debugger for checking data flow in transformations.
- Creating Unit test cases and performing Unit and Integration Testing.
- Created the TDD (Technical Design Document) and UTP (Unit Test Plan).
Environment: Informatica 6.0/7.0, Oracle 9i, and UNIX