
Solutions Architect / Lead Resume


Schaumburg

EXPERTISE SUMMARY:

  • About 20 years of experience in the analysis, architecture, detailed design, development, and administration of software systems, from the development stage through production, with strong skills in data analysis, design, and testing in Hadoop and Java technologies.
  • 3+ years of dedicated experience with Big Data tools in the Hadoop ecosystem, including Hive, Flume, Sqoop, HBase, ZooKeeper, Oozie, and Kafka.
  • Excellent understanding of Hadoop architecture (HDFS, YARN, MapReduce) and its underlying framework.
  • Excellent understanding of NoSQL databases such as Cassandra and HBase.
  • Good experience installing, configuring, and administering Hadoop clusters on Hortonworks and CDH 4.
  • Expertise in analyzing log files and resolving issues in the Hadoop environment.
  • Good experience importing data between HDFS and relational databases using Sqoop.
  • Experience using Flume to integrate log files into HDFS by collecting log data from different sources.
  • Expertise in the Oozie framework; automated the data-import jobs.
  • Experience developing custom UDFs for Hive and Pig core functionalities.
  • Experience managing Hadoop clusters using Cloudera Manager.
  • Good knowledge of Amazon AWS services such as EMR and EC2.
  • Strong skills in SDLC models such as Waterfall and Agile Scrum.
  • Excellent teamwork and communication skills.
  • Focused on effective project delivery.
  • Expertise in building linear/logistic regression models, time-series analysis, and cluster analysis using the R language.
  • Areas of expertise include distributed systems using C/C++ and Java, object-oriented design, and real-time systems.
  • Proficient in the following technologies:

TECHNICAL SKILLS:

Languages: C, C++, Java, J2EE, XML, XSL, XSLT, HTML, Shell Script, R

Database Management: Oracle 7.x/8.0/8i/9i/11i, DB2 7.1/7.3/8.1, Teradata, MySQL, TOAD, and MariaDB

Hadoop/Big Data platform: HDFS, MapReduce, YARN, HBase, Hive, Pig, Oozie, ZooKeeper, Sqoop, Flume, Kafka, Spark, and Scala

Distributed Systems: TUXEDO, CORBA (Iona Orbix), SonicMQ, TIBCO, RMI, IIOP, RPC, POSIX Threads, MQSeries

Development Tools: Eclipse, WSAD, IoC containers (Core, J2EE), and RStudio

Methodologies: UML, OOAD, Agile Software Development Life Cycle, Design patterns

Application Servers and Web Servers: WebSphere, WebLogic, JBoss, Tomcat, and Java Web Server

Enterprise Technologies: EJB, Spring, Servlets, JavaServer Pages, JDBC, JMS, JNI, XML, SOAP, LDAP, JNDI

Operating Systems: Sun Solaris 2.6/2.8, HP-UX, AIX, Linux, CentOS 7, and Windows

Source Code Control Systems: VSS, SCCS, CVS

Protocols: TCP/IP, SNMP, LDAP, SOAP, HTTP/S

PROFESSIONAL EXPERIENCE:

Confidential, Schaumburg

Solutions Architect / Lead

Responsibilities:

  • Installed and configured Hortonworks Hadoop from scratch for development, along with Hadoop tools such as HBase, Sqoop, ZooKeeper, and Flume.
  • Worked with the team to expand the cluster, using the commissioning process to configure additional data nodes.
  • Involved in HDFS maintenance and administration, defining job flows with YARN.
  • Implemented MapReduce jobs in Java for processing the data.
  • Implemented a Kafka message broker to receive messages on a topic and consume them for processing.
  • Responsible for cluster maintenance: adding and removing nodes, monitoring the cluster, troubleshooting, and managing and reviewing backups and log files.
  • Knowledge of performance troubleshooting and tuning of Hadoop clusters.
  • Collected XML data generated by vendors and stored it in HDFS using Apache Flume.
  • Involved in converting MapReduce applications to Spark.
  • Involved in scheduling and managing jobs in the Hadoop cluster.
  • Implemented scripts to transfer data from RDBMS to HDFS using Sqoop.
  • Used ZooKeeper and Oozie operational services for coordinating the cluster and scheduling workflows.
  • Exported the analyzed data from HDFS for visualization and report generation.
  • Involved in migration of the traditional environment to a Cloud environment.
  • Designed the infrastructure for Oracle Data Integrator (ODI).
  • Strong experience troubleshooting production issues and working with problems in ODI.
  • Designed XML-to-Oracle table data transformations.
  • Extensively used shell scripts to run the various ODI jobs.
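As an illustration of the MapReduce pattern behind the jobs above, a minimal sketch in plain Java: the "map" phase emits a count per word and the "reduce" phase sums counts per key. The class name and sample data are hypothetical; a real Hadoop job would express the same logic through Hadoop's Mapper/Reducer API rather than the Streams API used here.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCountSketch {
    // Map phase: split each input line into words (key emission);
    // Reduce phase: group by word and sum the occurrences.
    public static Map<String, Long> wordCount(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts =
                wordCount(List.of("hadoop hdfs yarn", "hadoop spark"));
        System.out.println(counts.get("hadoop")); // 2
    }
}
```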
Confidential

Professional -App/Prod Support

Responsibilities:

  • Involved in migration of the traditional environment to a Cloud environment.
  • Involved in migration from the WAS 7.0 environment to the WAS 8.0 environment to support IBM Connections 4.0.
  • Involved in migration from IBM Connections 3.0 to IBM Connections 4.0.
  • Strong experience troubleshooting production issues and working with problem tickets in tSpace.
  • Extensively used Introscope to test application performance and gather memory-usage information.
  • Increased file space for tSpace users and communities upon user request.
  • Created new user accounts under tSpace and synchronized them with the database.
Confidential

Professional -App/Prod Support

Responsibilities:

  • Involved in migration of the traditional environment to a Cloud environment.
  • Involved in migration from the WAS 7.0 environment to the WAS 8.0 environment.
  • Strong experience troubleshooting production issues and working with problem tickets in BIS and CSI.
  • Extensively used Introscope to test application performance and gather memory-usage information.
  • Deployed BIS applications in the WebSphere environment.
  • Expertise in installation, configuration, and administration of IBM WAS 7.0/6.1/6.0 on Unix and Windows operating systems with DB2 UDB 7.2, Oracle 10i, and JMS.
  • Involved in migration of WebSphere Application Server 6.0/6.1/7.0.
  • Developed a Java program to generate the Topology Monitoring Service report for the MVT tool.
Confidential, Illinois

Technical Architect/ Developer

Responsibilities:

  • Analyzed the requirements and worked with the requirements team on creating Business Requirements (BR) and High-Level Design (HLD) documents.
  • Designed use-case, sequence, and class diagrams for the monitor catalog server web application using Rational Rose.
  • Involved in development of C++ APIs for clients on all platforms.
  • Maintained the promotion-tool and approval-tool Java applications, which are used to approve and promote a business model between different environments.
  • Maintained the persistence and security framework application, which uses role-based access control (RBAC) techniques.
  • Extensively used XML technologies in application development.
  • Developed XSLT-based rendering techniques for the OLDBWA application.
  • Extensively used Ant scripts to automate the build process: retrieving source from the VSS repository and building ASBAM infrastructure components.
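The XSLT-based rendering mentioned above can be sketched with the JDK's built-in javax.xml.transform (JAXP) API, which compiles a stylesheet and applies it to an XML source. The inline stylesheet, class name, and sample document here are hypothetical stand-ins for the application's templates, not the actual OLDBWA code.

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class XsltRenderSketch {
    // A minimal stylesheet: render each /item/name as an HTML paragraph.
    static final String XSL =
        "<xsl:stylesheet version='1.0' "
      + "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
      + "<xsl:output method='html' omit-xml-declaration='yes'/>"
      + "<xsl:template match='/item'>"
      + "<p><xsl:value-of select='name'/></p>"
      + "</xsl:template>"
      + "</xsl:stylesheet>";

    // Compile the stylesheet and transform the given XML string to HTML.
    public static String render(String xml) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(XSL)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)),
                    new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // Renders the <name> element as an HTML <p> element.
        System.out.println(render("<item><name>catalog</name></item>"));
    }
}
```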
