Integration Architect Resume
SUMMARY:
- Responsible for providing architecture leadership and strategy for the integration, middleware, monitoring, and application domains.
- Create project implementation designs against corporate standards and verify that implementations meet corporate architecture standards.
- Achieve client satisfaction by upholding service quality norms and build the brand image by exceeding customer expectations.
TECHNICAL SKILLS:
- Strong knowledge of Enterprise Architecture Guiding Principles
- Excellent verbal and written communication and presentation skills
- Familiarity with the architecture standards and best practices used in integration projects
- Familiarity with common web service standards and technologies such as DOM, XSLT, XPath, XML, WSDL, UDDI, XQuery, XSD, XML Schema, SAX, SOA, and SOAP
- Hands-on experience developing in the J2EE framework: EJB, JSP, Servlets, JMS, JNDI, JDBC, JMX, RMI
- Hands-on experience in building GUIs using JavaScript, AJAX, HTML, DHTML, CSS2, JSP, Taglibs, JSON, XML, DTD, XSD, DOM, SAX, JAXP, JAXB, and XSLT.
- Experience in developing web applications using SOAP-based web services (SOAP, WSDL, CXF, Axis, JAX-WS) and RESTful web services (JAX-RS, CXF, Jersey); a brief illustrative sketch follows this skills list.
- Hands-on experience in implementing web applications using frameworks like Struts 1.x/2.x, Spring 3.2 (IoC, DI, AOP, Spring MVC, Spring Test module), and JSF 2.1, with integration to ORM tools like Hibernate and iBatis.
- Hands-on experience using Mule connectors such as FTP, File, SFTP, IMAP, Salesforce, and NetSuite as part of integrations.
- Expertise in using Data Modeling, Data Flow Diagrams, and Entity Relationship Diagrams along with SOA based Data Integration
- Highly skilled in enterprise-class modular integration and service-oriented design, with good knowledge of distributed, high-performance, and highly available architectures
- Collaborate with business, solution architecture, business solution leads, and application teams to understand business strategy, requirements, and end-to-end business processes requiring integration
- Estimate integration activities, resources, and timelines
- Communicate integration architectures, designs and implementation approach to business and Functional Teams
- Ensure Functional and System design specifications meet established architectural Standards
- Understand and analyze industry trends and produce standards and guidelines on enterprise integration
- Acquaintance with Federated Database Systems, Stored Procedures, Database Triggers, Data Governance, Master Data Management, Enterprise Application Integration, Multi-threaded Programming, Enterprise Information Integration, Extract Transform & Load, UML & OOAD, Data Warehousing, BPM Workflow, and Middleware Solutions
- Experience in installing and configuring WMQ distributed and clustered networks on the AIX, Windows, and z/OS platforms in addition to designing MQ application solutions.
- Experience in developing middle tier applications using Enterprise Service Bus (ESB) - MULE.
- Strong experience with various Mule connectors/adapters, API development, API management, and developing services on CloudHub.
- Created Mule ESB artifacts, configured the Mule configuration files, and deployed them.
- Experience with the MuleSoft Anypoint API Platform for designing and implementing Mule APIs
- Created clustered environments using Mule ESB with Anypoint Studio and set up ActiveMQ and RabbitMQ with different topologies for enterprise integration.
- Experience in implementing Java/J2EE design patterns such as Singleton, Factory Pattern, Adapter, Front Controller, Business Delegate, Service Locator, Intercepting Filter, Data Transfer Object and Value Object.
- Developed the Identity Policy according to business requirement.
- Developed Provisioning Policy for all the managed applications.
- Creation and configuration of Identity Manager Services
- Monitoring all Identity and Access Management services.
- Creation of Identity group
- Creation of a new Organization role and configuring provisioning policy for the same.
- Creating assembly lines in Tivoli Directory Integrator to manipulate or transfer data between data resources.
- Involved in Migration from WebSphere Message Broker 6.1 to IIB 9
- Extensive experience in design and development of MQSeries applications using MQSeries for Java and MQSeries JMS APIs & MOM (Message Oriented Middleware).
- Experience implementing Advanced Integration Services (AIS) to integrate disparate systems using IID, implementing Human Services and Integration Services in IBM Process Designer, developing mediation flows, and implementing JNDI in clustered environments.
- Involved in a proof-of-concept build using IBM BPM 8.0 as part of a proposal
- Good experience building Java/J2EE, Business Process Management (BPM), and Enterprise Content Management (ECM) business solutions from the business-need and solution perspective rather than the product line, and well experienced in managing the end-to-end project life cycle
- Strong Business Process Management skills and experience including Process Mapping & Modeling, Process Visualization, Business Process Analysis, Business Process Frameworks & Methods, Business Rules, and Business Process Standards such as BPMN, BPEL, BPDM, XPDL
- Proven ability in gathering requirements from the client and translating business details into technical documents; design and development of end-to-end automated business process/workflow solutions using IBM BPM and Lombardi
- Worked as a Business Process Management (BPM) consultant, Lombardi developer, and solution consultant on the Lombardi 7.5 and IBM BPM 8.5.5 platforms.
- Recommend and promote new technologies, best practices, and tools and utilities for the design, development, testing, and management of web-based applications to better support our customers
- Provided the customer with a middleware and microservice solution built on open-source technologies such as Kafka, Hazelcast, Cassandra, MongoDB, and Spring Boot
- Responsible for designing and implementing cloud-based solutions, including private, community and public cloud deployment models
- Played a key role in evaluating, establishing and conducting proof of concepts of IBM API Manager product
- Reduced time-to-market by up to 2 years and lowered cost per data point by 60% by enabling increased data throughput
- Reduced labor costs by 5% by restructuring teams, converting contractor positions to employee status, and renegotiating contractor rates.
- Design new innovative systems with advanced technology utilizing internal resources, external software, independent contractors and third party consulting firms.
- Recruit, train and mentor all IT staff members, developing teams and leaders to ensure that business value is maximized and IT goals are achieved.
- Met all target goals by identifying and tracking high-risk areas, implementing risk mitigation early and contingency plans as needed, conducting proof-of-concept technical and architectural prototyping, and maintaining constant communication with program/project stakeholders
- Introduced and implemented Agile/Feature-Driven Development (Scrum and Kanban)
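Illustrative sketch for the RESTful web service skills listed above: a minimal JAX-RS resource of the kind that could be deployed on CXF or Jersey. The class, path, and DTO names are hypothetical and not taken from any specific project.

```java
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

// Hypothetical JAX-RS resource illustrating a simple RESTful endpoint.
@Path("/policies")
public class PolicyResource {

    @GET
    @Path("/{id}")
    @Produces(MediaType.APPLICATION_JSON)
    public Response getPolicy(@PathParam("id") String id) {
        // In a real service this would delegate to a business/data layer.
        Policy policy = new Policy(id, "TERM_LIFE");
        return Response.ok(policy).build();
    }

    // Simple DTO; with a JSON provider (e.g., Jackson) it serializes to JSON.
    public static class Policy {
        private String id;
        private String type;

        public Policy() { }

        public Policy(String id, String type) {
            this.id = id;
            this.type = type;
        }

        public String getId() { return id; }
        public String getType() { return type; }
    }
}
```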
PROFESSIONAL EXPERIENCE:
Confidential
Integration Architect
- Working as an EAI solution architect on a team hired by Wal-Mart to create a next-generation integration solution that is configurable, loosely coupled, and highly scalable to support multiple integration patterns
- The next-generation architecture delivered to Wal-Mart is based on an orchestration framework that is entirely metadata-driven and allows common services to be executed on business interfaces in a controlled, configurable manner.
- The solution now allows integration resources to rapidly plug in new business interfaces through configuration, letting them focus on the specific mapping requirements of each interface.
- Wrote the complete solution architecture document, presented it to stakeholders and got it approved by the steering committee.
- Worked with onsite designers and offshore delivery team to ensure that the implementation is done as per the specifications.
- The Next Generation Integration Solution provides a configurable, loosely coupled, and highly scalable integration model based on industry standards, built on IIB 9, MQ Series 7.5, XI52 DataPower, and MQ MFT.
- It is completely metadata-driven, allowing resources to plug in new business interfaces seamlessly and focus mainly on the specific message transformation requirements of each interface.
- It enforces a contract on interfacing applications/systems to pass an interface identification with each message, allowing the common routing framework to process all interfaces consistently.
- A new business interface can be quickly configured and deployed once its specific WTX map has been developed.
- Common processing is enforced across all interfaces, based on configuration, through the Stronghold framework.
- A staging feature holds message processing within the Stronghold layer, typically during target system downtime; the source system is not impacted and continues to send messages to Stronghold as a normal course.
- Used the API Manager user interface to define APIs and to securely manage the API environment
- Analyzed API usage with the provided analytics and socialized APIs in a developer portal
- Deployed the XI52 DataPower Integration Appliance in the trusted zone to securely expose enterprise applications to internal consumers and partners
- Used IBM MQ Managed File Transfer to build a customized, scalable, and automated solution for managing, trusting, and securing file transfers, which also eliminates costly redundancies, lowers maintenance costs, and maximizes existing IT investments.
- Implemented complete end-to-end automation and integration of the new “Term Life Policy” business process.
- Performed simulation rule analysis for each process (e.g., the underwriter approval and rejection process); the analysis report shows data about the particular process instance.
- Designed user interfaces for policy creation, quote/policy search, product and plan details, beneficiary information, client details, and quote-level premium screens using Coaches; customized and changed the look and feel of the application.
- Created BPD nested processes and dynamic sub-processes for the agent policy-level process; integrated with email services and reused the components across the enterprise.
- Used message events, timer events, and UCAs in the application; developed generalized email services for sending SLA reminders to policy holders.
- Created highly complex UIs using the Brazos toolkit and custom HTML, including localization into multiple languages
- Developed reusable components in Lombardi that are used across BPM applications.
- Implemented imaging and document management within the BPM implementation; the Portal and Coaches provide native support for document attachments and can include links to external ECM applications
- Process execution and linkage for scanned policy/quote documents can be captured as events; using Teamworks, we can monitor and correlate those events to instantiate new processes.
- Implemented real-time performance metrics for product and plan details, customer details, policy-level details, quote-level details, and family history; at any time, managers can view real-time process performance reports in the Scoreboard portal.
- Built an auditable work-task environment where absolute and precise KPI measurements are possible
- Provided SOA based solution where different systems can interact through well-defined interfaces.
- Modeled task routing rules and created process activity flows using a palette of standard BPMN elements: services, activities, gateways, and timer, message, and exception events.
- Enhanced and troubleshot multiple BPDs for Term Life Policy existing components and new modules in a WebSphere clustered environment.
- Designed an Event Processing Framework, which collects and distributes events.
- Events distributed in the system can then be consumed by applications interested in utilizing the data held within those events.
- It is a robust, extensible framework that allows features such as producer and consumer adapters, APIs, auditing, exception handling, and notification to be plugged in.
- The framework was developed and deployed on an open-source stack (Tomcat) in the Wal-Mart cloud environment; it uses Hazelcast for caching, Kafka and IBM MQ for messaging, XSLT for transformation, and Cassandra/zFam to store transaction audits/exceptions.
- The framework was built and tested using Maven, JUnit, Jenkins, Sonar, and JaCoCo
- Producer and consumer adapters enable event producers and consumers to produce/consume events by implementing a simple Java interface (a sketch of such an adapter contract appears at the end of this section).
- Adapters are reusable for connecting different producers and consumers.
- Common services such as auditing, exception handling, and notification need to be developed only once and are then consistently enforced across all interfaces.
- A producer can simplify/optimize its design because it only needs to implement an adapter to produce events for all outbound interfaces.
- A consumer can likewise simplify/optimize the design of inbound interfaces by implementing an adapter to receive multiple event types.
- The framework on the OneOps cloud scales out horizontally to accommodate new applications/systems as well as new interfaces from existing applications; vertical scaling can be applied when load/volume increases for an existing interface.
- Designed a microservice-based product offering a platform that enables developers to quickly build and deploy production-ready, highly available microservices that perform well and are extensible for future changes.
- The platform allows interested parties to construct simple, scalable services backed by an in-memory data grid.
- The platform provides capabilities for service discovery and composition, offering new ways of thinking about enterprise data, and includes operational components to ensure consistency and availability are maintained.
- Data is stored in memory, providing the capability to quickly process, store, and access it at the speed of RAM.
- The platform is highly available with no single point of failure for cached data; multiple copies are stored on multiple machines for automatic data recovery in case of single or multiple server failures.
- No DSL is involved in creating microservices, as the data model is object-oriented and non-relational.
- Distributed querying: queries run on multiple nodes, each against a relatively small subset of the data, and the results are unified.
- Dynamic horizontal scaling
- Orchestrates microservices
- Used a microservice architecture, with Spring Boot-based services interacting through a combination of REST and RabbitMQ or Apache Kafka message brokers; deployed services to Google Cloud Platform and Microsoft Azure in Docker containers managed by Kubernetes (a minimal service sketch appears at the end of this section).
- Service aggregation: a service combines multiple services into one or more new services and ensures that data is modeled and integrated across all component services.
- Allows greater reuse of existing microservices, providing an easier path for onboarding.
- Service discovery: microservices deployed on the microservices platform self-register with the broker service platform.
- Inner-sourced: teams are free to fork and add new features, removing cross-team dependencies.
- Security, Metrics, Centralized Logging.
- Installation is composed of containers.
- Features of the microservices framework: health checks, metrics, centralized logging, basic authentication, and Swagger configuration.
- Used the Spring framework to inject services, entity services, transaction management, and cross-cutting concerns via factory classes corresponding to the use-case operation being executed.
- Implemented the application using Spring concepts: DI/IoC, AOP, batch implementation, and Spring MVC.
- Developed and consumed web services using Apache CXF, JAX-WS, Axis, WSDL, and SOAP.
- Deployed Mule ESB applications into MMC (Mule Management Console).
- Strong experience with various Mule connectors/adapters, API development, API management, and developing services on CloudHub.
- Created Mule ESB artifacts, configured the Mule configuration files, and deployed them.
- Integrated Mule ESB systems utilizing MQ Series, HTTP, file system, and SFTP transports.
- Developed applications in the Anypoint Studio 5.4.3 IDE and used RAML 0.8 to define API specifications.
- Extensively used Mule components that include File, SMTP, FTP, SFTP, JDBC Connector, and Transaction Manager.
- Migrated Mule ESB 3.5.1 apps to Mule ESB 3.7.3 and updated all the dependencies.
- Migrated deprecated DataMapper mappings to DataWeave in Mule ESB.
- Developed RESTful/SOAP web services in Mule ESB based on SOA architecture.
- Working knowledge of API management using Anypoint API management tools.
- Strong expertise in SOA and ESB, and involved in integrations with Salesforce and SAP.
- Utilized a custom logging framework for the Mule ESB application.
- Used Mule ESB as an integration platform for developing the application, which is based on SOA architecture.
- Involved in converting data formats such as XML, CSV, EDI and JSON.
- Developed RESTful web services using JAX-RS and the CXF framework.
- Integrated the Spring and Hibernate frameworks to develop the end-to-end application.
- Used Hibernate to create the data layer for the services to perform CRUD operations against the database (see the entity/DAO sketch at the end of this section).
- Set up object and relationship mappings with associations, inheritance, named queries, etc., using Hibernate.
- Used core Java concepts such as collections, generics, exception handling, I/O, and concurrency to develop business logic; used JMS for asynchronous messaging.
- Involved in writing SQL and PL/SQL routines to be called by Control-M batch jobs for BOD and EOD processing.
- Introduced and implemented Agile/Feature-Driven Development (Kanban and Scrum)
- Personally coached team on software development Agile best practices.
- Designed and trained the team to follow Kanban or Scrum based on the project or user story.
- Provided periodic executive briefings and reporting on Scrum metrics, burn-up charts, and process improvements.
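Sketch referenced in the Event Processing Framework bullets above: a minimal consumer-adapter contract of the kind producers and consumers would implement. The interface and type names (EventConsumerAdapter, Event) are hypothetical and only indicate the shape of such a contract, not the actual framework API.

```java
import java.util.Map;

/**
 * Hypothetical adapter contract for an event-distribution framework.
 * The framework applies common services (auditing, exception handling,
 * notification) around every adapter, so implementations carry only
 * interface-specific logic.
 */
public interface EventConsumerAdapter {

    /** Event types this consumer is interested in receiving. */
    String[] interestedEventTypes();

    /** Invoked by the framework for each matching event. */
    void onEvent(Event event) throws Exception;

    /** Minimal event envelope: a type plus a payload and headers. */
    final class Event {
        private final String type;
        private final String payload;
        private final Map<String, String> headers;

        public Event(String type, String payload, Map<String, String> headers) {
            this.type = type;
            this.payload = payload;
            this.headers = headers;
        }

        public String getType() { return type; }
        public String getPayload() { return payload; }
        public Map<String, String> getHeaders() { return headers; }
    }
}
```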
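Sketch referenced in the microservice bullets above: a minimal Spring Boot service combining a REST endpoint with an asynchronous Kafka listener. The class, topic, and endpoint names are illustrative, and the snippet assumes spring-boot-starter-web and spring-kafka are on the classpath; in practice Spring Boot Actuator would provide the health endpoint.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical Spring Boot microservice: REST endpoint plus Kafka consumer.
@SpringBootApplication
@RestController
public class OrderEventsService {

    public static void main(String[] args) {
        SpringApplication.run(OrderEventsService.class, args);
    }

    // Lightweight health endpoint for illustration only.
    @GetMapping("/health")
    public String health() {
        return "UP";
    }

    // Consumes events asynchronously from a Kafka topic.
    @KafkaListener(topics = "order-events", groupId = "order-events-service")
    public void onOrderEvent(String message) {
        // In a real service the payload would be transformed, persisted, or forwarded.
        System.out.println("received event: " + message);
    }
}
```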
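Sketch referenced in the Hibernate bullets above: a rough illustration of a JPA entity with a named query and a small DAO performing CRUD via the EntityManager. The Customer entity, query name, and DAO are hypothetical examples, not code from the actual engagement.

```java
import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.NamedQuery;
import javax.persistence.PersistenceContext;
import java.util.List;

// Hypothetical JPA/Hibernate entity with a named query.
@Entity
@NamedQuery(name = "Customer.findByLastName",
            query = "select c from Customer c where c.lastName = :lastName")
public class Customer {

    @Id
    @GeneratedValue
    private Long id;

    private String firstName;
    private String lastName;

    protected Customer() { }          // no-arg constructor required by JPA/Hibernate

    public Customer(String firstName, String lastName) {
        this.firstName = firstName;
        this.lastName = lastName;
    }

    public Long getId() { return id; }
    public String getFirstName() { return firstName; }
    public String getLastName() { return lastName; }
}

// Companion DAO (same file for brevity); in a Spring/Hibernate setup the
// EntityManager is injected and calls run inside a transaction.
class CustomerDao {

    @PersistenceContext
    private EntityManager em;

    Customer save(Customer customer) {
        em.persist(customer);
        return customer;
    }

    List<Customer> findByLastName(String lastName) {
        return em.createNamedQuery("Customer.findByLastName", Customer.class)
                 .setParameter("lastName", lastName)
                 .getResultList();
    }
}
```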