
Application Architect Resume


Syracuse, NY

SUMMARY:

  • Dynamic, collaborative, and self-driven individual with 13 years of Information Technology experience in the design and development of software applications using Hadoop, Unix, and Mainframe technologies.
  • 3 years of comprehensive hands-on experience with Big Data technologies such as HDFS, Hive, Pig, HBase, Sqoop, Flume, MapReduce, and Spark.
  • Hortonworks Certified Hadoop Developer.
  • Strong experience with Hadoop distributions such as Cloudera and Hortonworks.
  • Experience importing data from relational database systems into HDFS, and exporting it back, using Sqoop.
  • Experience in designing and developing applications in Spark using Scala.
  • Performed ETL for structured and semi-structured data using Hadoop ecosystem tools.
  • Performed analysis of data using Pig.
  • Hands-on experience in creating and maintaining Hive databases, tables, views, and partitions.
  • Worked with different data sources such as flat files, XML files, and databases.
  • Good understanding of NoSQL databases such as HBase.
  • Experience with the IBM Bluemix cloud service.
  • Built a cognitive ChatBot using the IBM Watson Conversation service for a utility client.
  • Technical lead of a team in an onsite/offshore delivery model.
  • Strong in Mainframe application development and UNIX shell scripting.
  • Demonstrated success in achieving customer satisfaction.
  • Strong communication skills, both verbal and written.

TECHNICAL SKILLS:

Big Data Technologies: Hadoop, Spark, HDFS, MapReduce, Hive, Pig, HBase, Kafka, Flume, Sqoop

Languages: Scala, Shell scripting, Perl, COBOL, REXX, JCL

Databases: MySQL, HBase, DB2, Oracle, VSAM, IMS DB

SDLC: Waterfall; good understanding of Agile

Tools: VMware Server, GitHub, ServiceNow, Xpeditor, QMF, Platinum

Operating Systems: Linux (CentOS), Solaris, AIX, z/OS, Windows 2000 Server

Middleware tools: IBM WebSphere MQ on z/OS

PROFESSIONAL EXPERIENCE:

Application Architect

Confidential, Syracuse, NY

Responsibilities:
  • Managed data flow for two reports: ESCO Enrollment/Switch/Drop and Low Income Reporting.
  • Set up a tool to feed the latest incoming supplier data into the Hadoop cluster.
  • Analyzed ESCO enrollments, drops, and switches using Spark and stored the results back to HDFS in a specific format.
  • Analyzed the entire Low Income data set for both customer information systems using Spark.
  • Set up a Spark Streaming process to read smart-meter data and analyze it in real time.
  • Processed historical meter usage data, which helped offer time-of-use (TOU) pricing options to customers.
  • Analyzed customer energy usage to detect energy theft and meter tampering.
  • Compared the performance of Hive and Spark jobs.
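The TOU pricing work above can be sketched outside Spark as a plain-Python illustration; the rate bands, band boundaries, and sample readings below are invented for this example and are not the client's actual tariff.

```python
# Hypothetical sketch: bucket hourly smart-meter readings into time-of-use
# (TOU) periods and total the usage per period. Band boundaries are
# illustrative placeholders, not real tariff values.

TOU_BANDS = {
    "off_peak": range(0, 7),    # midnight-7am
    "mid_peak": range(7, 17),   # 7am-5pm
    "on_peak":  range(17, 21),  # 5pm-9pm
    "evening":  range(21, 24),  # 9pm-midnight
}

def band_for_hour(hour: int) -> str:
    for band, hours in TOU_BANDS.items():
        if hour in hours:
            return band
    raise ValueError(f"hour out of range: {hour}")

def usage_by_band(readings):
    """readings: iterable of (hour, kwh) pairs for one meter."""
    totals = {band: 0.0 for band in TOU_BANDS}
    for hour, kwh in readings:
        totals[band_for_hour(hour)] += kwh
    return totals
```

In the actual project this aggregation would run as a Spark job over the full historical data set; the per-band totals are what feed a TOU pricing offer.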

Application Architect

Confidential, Syracuse, NY

Responsibilities:
  • Customers of Confidential can contact a customer service representative through multiple channels for everyday tasks such as starting a new service, stopping an existing utility service, switching ESCOs, and making payments. The existing modes of communication between customers and Confidential were limited to phone and e-mail.
  • Proposed and built a Watson Confidential on the IBM Bluemix platform as an additional mode of communication between customers and Confidential, and presented it to the client as part of an RFC.

Confidential

Technology: Watson Conversation, Cloudant DB

Environment: IBM Bluemix

Responsibilities:
  • Proposed the idea of creating a ChatBot on the IBM Bluemix platform using the Watson Conversation service. This automated ChatBot is available 24/7 and reduces calls to the call center.
  • Architected the ChatBot application and was responsible for identifying the technologies used to build it.
  • Identified the use cases that could be implemented in the ChatBot.
  • Built the ChatBot from scratch using the IBM Watson Conversation service and Cloudant DB on the IBM Bluemix platform.
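As a rough illustration of the use-case dispatch described above (this is a toy keyword matcher, not the Watson Conversation API, and the intents and keywords are invented):

```python
# Toy intent router illustrating the ChatBot's use-case dispatch.
# NOT the Watson Conversation service; intents/keywords are hypothetical.

INTENTS = {
    "start_service": ["start", "new service", "connect"],
    "stop_service":  ["stop", "cancel", "disconnect"],
    "switch_esco":   ["switch", "esco", "supplier"],
    "make_payment":  ["pay", "payment", "bill"],
}

def classify(utterance: str) -> str:
    text = utterance.lower()
    for intent, keywords in INTENTS.items():
        if any(kw in text for kw in keywords):
            return intent
    return "fallback_to_agent"   # hand off to a human representative
```

Watson Conversation replaces this keyword matching with trained intent and entity classification; the fallback branch mirrors handing an unrecognized request to a call-center agent.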

Hadoop Technical Lead/Application Architect

Confidential, Syracuse, NY

Responsibilities:

  • Responsible for bringing customer data from the CSS and CRIS CIS systems to the Big Data platform.
  • Loaded data from relational tables into HDFS using Sqoop.
  • Cleansed data using Pig and stored it back into HDFS.
  • Loaded log data into HDFS using Flume.
  • Wrote and executed Hive queries/scripts to analyze the customer data on Hadoop.
  • Used Kafka to stream real-time data to Hadoop for analysis.
  • Used HBase to store the processed/analyzed data, which was further consumed by UI tools.
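The Sqoop ingestion step can be sketched as the command it composes; the JDBC URL, table name, and HDFS target directory below are placeholders, not actual project values.

```python
# Sketch of composing the Sqoop import invocation used to land a relational
# table in HDFS. All connection details here are placeholders.

def sqoop_import_cmd(jdbc_url, table, target_dir, num_mappers=4):
    return [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", str(num_mappers),
        "--as-textfile",
    ]

cmd = sqoop_import_cmd(
    "jdbc:mysql://db.example.com/customers",  # placeholder host/schema
    "ACCOUNTS",                               # placeholder table
    "/data/raw/accounts",                     # placeholder HDFS path
)
# In production this list would be passed to subprocess.run(cmd, check=True).
```

`--num-mappers` controls how many parallel map tasks Sqoop uses to split the import across the table's primary-key range.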

Hadoop Technical Lead

Confidential, Syracuse, NY

Responsibilities:
  • Responsible for identifying the data sources and bringing that data to the Hadoop platform.
  • Responsible for processing structured and unstructured data using Pig and storing it in Hive managed and external tables.
  • Involved in Analysis, Design, Coding and Testing phases of the POCs.
  • Mentored new team members on the Hadoop Eco System.

Project Lead/Lead IT Specialist

Confidential, New York

Responsibilities:
  • Led the effort to develop a new TIER approach for the Energy Affordability project on the Customer1 application.
  • Worked directly with business leads to understand requirements and provide valuable suggestions.
  • Created a logical technical model for the Energy Affordability project covering all interfaces with the CSS application.
  • Created five new riders to give different TIER discounts to Energy Affordability participants.
  • Designed a new DB2 table to store TIER information, used as a centralized repository for the Energy Affordability program.
  • Architected the Energy Affordability enrollment process and the conversion of around 120,000 low-income customers into the EAP.
  • Automated the creation of many reports, which were used to automate regression testing.
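The five TIER riders above amount to a tiered-discount calculation; a minimal sketch follows, with the tier percentages invented for illustration (the actual rider values are not given here).

```python
# Hypothetical sketch of applying a TIER discount to an Energy Affordability
# Program (EAP) participant's bill. Tier percentages are invented
# placeholders, not the actual rider values.

TIER_DISCOUNTS = {1: 0.10, 2: 0.20, 3: 0.30, 4: 0.40, 5: 0.50}

def discounted_amount(bill_amount: float, tier: int) -> float:
    """Return the bill amount after the participant's TIER discount."""
    if tier not in TIER_DISCOUNTS:
        raise ValueError(f"unknown tier: {tier}")
    return round(bill_amount * (1 - TIER_DISCOUNTS[tier]), 2)
```

In the real system the tier assignment would be looked up from the centralized DB2 TIER table rather than hard-coded.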

Confidential

Responsibilities:
  • Worked on requirements and design for CSS ESCO analytical reporting and delivered error-free reports on time.
  • Developed a very good understanding of the Customer1 (CSS) application as a whole, with strong knowledge of the Supplier Services area and proficiency in the Finance area.
  • Led the effort to convert General Ledger accounting from PeopleSoft to SAP code values on the Customer1 system.
  • Responsible for the architecture of the interfaces between Customer1 (CSS) and SAP for sending General Ledger code values to the SAP ERP system.
  • Automated the ESCO POR (Purchase of Receivables) and MuniTax General Ledger interfaces from CSS to SAP, reducing manual effort and minimizing errors.
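At its core, the PeopleSoft-to-SAP conversion is a code-value translation; a minimal sketch, with the mapping entries and the suspense-account fallback invented for illustration:

```python
# Minimal sketch of a General Ledger code-value translation from PeopleSoft
# to SAP. Mapping entries and the suspense account are invented placeholders.

PS_TO_SAP = {
    "400100": "0000400100",  # placeholder revenue account
    "500200": "0000500200",  # placeholder expense account
}

def translate_gl_code(ps_code: str) -> str:
    try:
        return PS_TO_SAP[ps_code]
    except KeyError:
        # Route unmapped codes to a suspense account for manual review.
        return "0000999999"
```

In the interfaces themselves this lookup would run per GL posting line before the file is sent to the SAP ERP system.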

Confidential

Responsibilities:
  • Worked as a Business Analyst and created functional requirements documents for the CSS application during the CRIS Migration project.
  • Assisted business users in formulating test scenarios for eliminating the PeopleSoft GL on CSS.
  • Responsible for translating requirements into design for some of the major functions of the Confidential project.
  • Worked closely with the change management team to push changes to production.
  • Responsible for developing the core components of the OBR project, which involved developing a VSAM database for storing loans and the infrastructure to process and manage them.
  • Responsible for integrating the Nyserda loan installment into the existing billing platform.
  • Proactively resolved some of the key issues in the project: for example, identified and made the code changes required for the GUI application, where knowledge was limited, and helped the client set up a test environment that did not previously exist.
  • Developed the critical mass-cancel of gas service jobs from MDSI and Advantex for the CAS-to-CSS conversion project.
  • Developed FTP solutions from the third-party Nyserda system to the CAS system.

Confidential

Responsibilities:
  • Implemented an automated way to extract data from multiple external FTP sources using shell scripts. This reduced manual effort and increased the efficiency of the ETL process by 50%; it also helped generate revenue for the client, since the data reached customers more quickly.
  • Implemented a ‘STARTDATE’ report giving statistics on a particular search-engine feature across all databases. This report helped the team fix issues proactively, before customers reported them, and was greatly appreciated by the client.
  • Proposed an alternative solution to the File Sync process running on the UNIX box, reducing CPU utilization by more than 50%.
  • Proposed a solution to use Java on z/OS for activities currently done on individuals' computers, reducing dependency and centralizing information.
  • Increased team morale and the team's understanding of the internal processes in the application.
  • Received direct appreciation from the client for proactively solving issues and communicating them to the client in a timely manner.
  • Helped regain customer confidence in IBM.
  • Used the ESP tool for monitoring and scheduling events.
  • Used the Remedy tool for defect recording and tracking.

Senior Application Developer

Confidential, Florida

Responsibilities:
  • Proposed and implemented an automated way of receiving data from Deutsche Bank and transferring it to the MVS system, using knowledge of shell scripting, SSH, and Connect:Direct. This saved the customer from having to implement SSH on the MVS system.
  • Designed and implemented EDI invoice transfer for new clients.
  • Helped managed operations in setting up SSH with new clients on UNIX and Windows.
  • Used Connect:Direct on the MVS and UNIX platforms.
  • Developed good rapport with US colleagues, which helped in gaining more knowledge of the applications during the transition phase.
  • Worked effectively with IBM Brazil and IBM US colleagues, which helped develop interpersonal skills.
  • Used the Control-M scheduling tool.

Application Developer

Confidential

Responsibilities:
  • Ported a mainframe application to the Windows platform, using object-oriented COBOL in the process.
  • Received an IBM Bravo award in 2006 Q2 for quickly taking up the SME role.
  • Reduced the design and implementation phases of a project by 2 weeks, which generated revenue of $700k for the client.
  • Involved in account-level initiatives for Intellectual Capital and Rational Portfolio Manager.
  • Developed design documents per the system requirement specifications.
  • Implemented modules per the design specifications.
  • Unit tested and system tested the components.
  • Leveraged knowledge of the Mainframe and UNIX platforms for end-to-end development and testing.
  • Used RPM for tracking projects and for recording and tracking defects and risks.
  • Configured Connect:Direct on the Windows platform.
