Sr JBPM/Microservice Technical Architect Resume
SUMMARY
- Seventeen years of extensive experience in Java/J2EE technologies
- 7 years of US experience in the Healthcare, Telecom & Insurance domains
- 5 years of experience in technical and solution architecture design
- Responsible for microservices, BPM & AWS architecture solutions
- Excellent work experience in BPMN 2.0 with BPMSuite (JBPM) and IBM WebSphere stacks
- Specialized in Business Rule implementation using Blaze Advisor, BRMS and Drools.
- Expert in providing cloud solutions with AWS, Azure, Google Cloud and SoftLayer
- Prepared Integration Architecture and Standards for use by projects using ESB infrastructure (JBoss Fuse ESB, Mule ESB & IBM Stack)
- Configured and worked with Spring Cloud Netflix OSS components
- Experience in real-time analytics over data ingestion pipelines using Kafka and Spark
- Good work experience in Service-Oriented Architecture and microservices architecture
- Experienced with security & SSO solutions and identity management with OAuth, OpenID & SAML
- Hands-on experience with automated environments using Jenkins and Docker on OpenShift/Cloud Foundry PaaS
- Comprehensive work experience in AWS (EC2, Elasticsearch, ElastiCache, Lambda, S3, MongoDB, Redshift, RDS, Glacier, EMR, IAM, Elastic Beanstalk, CloudFront, Snowball, VPC & CloudWatch)
- Hands-on in writing POCs, code optimization and performance tuning across the presentation, process and service layers
- Good experience in open source frameworks
- API Management setup in Apigee Edge & MuleSoft API Gateway
- Good experience in log management tools including ELK, Splunk & Sumo Logic
- Content management solutions using the Apache stack (Jackrabbit, Sling & Felix)
- Expert in modelling UML diagrams (Class, Sequence, Activity, Component & Deployment)
PROFESSIONAL EXPERIENCE
Confidential, NJ
Sr JBPM/Microservice Technical Architect
Responsibilities:
- Modernizing a legacy system into a responsive web, workflow and microservice-oriented design in the AWS cloud
- Migrating on-premises IBM mainframe and multi-data-source loads into a cloud data pipeline using Kafka streaming auto-ingestion
- Owned end-to-end microservice architecture design and implementation: API policy management with MuleSoft; service registry/discovery with Consul; logging framework with ELK/Sumo Logic; externalized properties with Consul KV/git2consul; secured key values in Vault; latency tracing in Zipkin; system security using keytabs; tokenized data using Protegrity; JWT exchange across all layers; autoscaling and auto-recovery solutions with Lambda and Cloud Aware; RDS managed services with Elasticsearch and PostgreSQL; data streaming with Kafka, Hive, Logstash and MirrorMaker across VPC and on-prem; centralized repo setup using Nexus and JFrog
- Built API/interface specifications; security & policies created using RAML in Mule Gateway
- Proof of concept: demonstrated four use cases (IVR, Search, Calculator and TimeLine)
- Set up and tested ELK (Elasticsearch/Logstash/Kibana) stack infrastructure
- Data hub auto-ingestion into microservice DB instances
- jBPM infrastructure setup with OpenShift
- Platform Architecture setup
- Automated CI/CD pipelines in AWS using PCF and Red Hat OpenStack
- Complex jBPM workflows using work item handlers, human tasks & compensation
- Set up enterprise Consul for service registry/discovery and dynamic configuration
- Configured Sumo Logic/ELK centralized logging with JSON streaming from various server logs
- Guided security management with Active Directory, client IDs, tokens and secrets management using Vault and CyberArk Conjur
- Identity and security management integration with JWT in Spring Boot microservices (see the JWT validation sketch after the technology list below)
- Built data pipelines using Kafka, Elasticsearch and Logstash (see the Kafka producer sketch after the technology list below)
- Implemented Data Tokenization using Protegrity across all layers
Technologies: Java 1.8, Spring Boot & Spring Cloud, Mule API Gateway, Anypoint, Kafka, Logstash & BPM Suite 6.5
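Below is a minimal sketch of the Kafka producer side of the streaming auto-ingestion described above, assuming the standard kafka-clients API; the broker address, topic name and JSON payload are illustrative placeholders rather than project specifics.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class IngestionProducerSketch {

    public static void main(String[] args) {
        // Broker list, topic and payload below are hypothetical placeholders.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("acks", "all"); // wait for full acknowledgement before treating a record as sent

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            // Each legacy record would be serialized (e.g. to JSON) and keyed for partitioning.
            String key = "record-001";
            String value = "{\"source\":\"mainframe\",\"payload\":\"...\"}";
            producer.send(new ProducerRecord<>("ingestion-topic", key, value),
                    (metadata, exception) -> {
                        if (exception != null) {
                            exception.printStackTrace();
                        } else {
                            System.out.printf("Sent to %s-%d@%d%n",
                                    metadata.topic(), metadata.partition(), metadata.offset());
                        }
                    });
        }
    }
}
```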
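A minimal sketch of the JWT exchange in a Spring Boot microservice, assuming the jjwt library for token parsing (an assumption, not confirmed by the resume); the shared secret constant and header handling are illustrative only.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;

import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import io.jsonwebtoken.Claims;
import io.jsonwebtoken.Jwts;
import org.springframework.web.filter.OncePerRequestFilter;

// Validates an incoming bearer token before the request reaches a microservice endpoint.
public class JwtValidationFilter extends OncePerRequestFilter {

    // Hypothetical shared secret; a real setup would load this from Vault or similar.
    private static final String SIGNING_SECRET = "change-me";

    @Override
    protected void doFilterInternal(HttpServletRequest request,
                                    HttpServletResponse response,
                                    FilterChain chain) throws ServletException, IOException {
        String header = request.getHeader("Authorization");
        if (header == null || !header.startsWith("Bearer ")) {
            response.sendError(HttpServletResponse.SC_UNAUTHORIZED, "Missing bearer token");
            return;
        }
        try {
            Claims claims = Jwts.parser()
                    .setSigningKey(SIGNING_SECRET.getBytes(StandardCharsets.UTF_8))
                    .parseClaimsJws(header.substring(7))
                    .getBody();
            // Downstream code can read the caller identity from the subject claim.
            request.setAttribute("caller", claims.getSubject());
            chain.doFilter(request, response);
        } catch (Exception e) {
            response.sendError(HttpServletResponse.SC_UNAUTHORIZED, "Invalid token");
        }
    }
}
```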
Confidential, Irving
Sr Technical Solutions Architect
Responsibilities:
- Migrated an IBM mainframe legacy system into a jBPM/BRMS and Java microservice architecture-oriented system with Netflix OSS
- Designed and executed an automated end-to-end process workflow engine
- Designed architecture and technical governance guidelines
- Built API/Interface specification
- Provided solution approaches for infrastructure and technologies
- Involved in System integration
- Performed load/stress tests on the jBPM process/rules engine with 100K concurrent users
- Proof of concept: demonstrated a generic workflow for Confidential M2M order activation
- Installed and set up staging and production server environments on Linux, including Docker setup
- Configured service discovery in Consul
- Developed a microservices architecture using Spring Boot
- Applied the ELK (Elasticsearch/Logstash/Kibana) stack
- Configured and implemented the Netflix Hystrix circuit breaker pattern across microservices (see the sketch after the technology list below)
- Used Apache Kafka 2.0 and its APIs
- Used Spring Batch to handle billions of transactions on a scheduled-job basis (see the chunk-processing sketch after the technology list below)
Technologies: Java 1.8, Spring Boot & Spring Cloud, JBoss EAP 6.4, BPM Suite, BRMS & Fuse Engine, Cassandra & AMQ
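A minimal sketch of the Hystrix circuit breaker pattern referenced above, using the plain hystrix-core command API; the downstream call and fallback value are hypothetical placeholders.

```java
import com.netflix.hystrix.HystrixCommand;
import com.netflix.hystrix.HystrixCommandGroupKey;

// Wraps a remote call so that repeated failures trip the circuit and fall back gracefully.
public class OrderStatusCommand extends HystrixCommand<String> {

    private final String orderId;

    public OrderStatusCommand(String orderId) {
        super(HystrixCommandGroupKey.Factory.asKey("OrderService"));
        this.orderId = orderId;
    }

    @Override
    protected String run() {
        // Hypothetical downstream call; a real implementation would invoke the
        // order microservice over HTTP and may throw on timeout or 5xx responses.
        return callOrderService(orderId);
    }

    @Override
    protected String getFallback() {
        // Returned when the circuit is open or run() fails/times out.
        return "UNKNOWN";
    }

    private String callOrderService(String orderId) {
        return "ACTIVATED";
    }

    public static void main(String[] args) {
        // execute() runs synchronously; queue()/observe() are also available.
        System.out.println(new OrderStatusCommand("12345").execute());
    }
}
```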
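A minimal sketch of the kind of chunk-oriented Spring Batch job used for scheduled high-volume processing, assuming Spring Batch 3.x/4.x; the in-memory reader, step/job names and chunk size are illustrative, not the actual project configuration.

```java
import java.util.Arrays;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.support.ListItemReader;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class TransactionBatchConfig {

    @Bean
    public Step transactionStep(StepBuilderFactory steps) {
        // In production the reader would stream records from a database or file,
        // and the chunk size would be tuned for throughput; 1000 is illustrative.
        ListItemReader<String> reader =
                new ListItemReader<>(Arrays.asList("txn-1", "txn-2", "txn-3"));
        ItemProcessor<String, String> processor = item -> item.toUpperCase();
        ItemWriter<String> writer = items -> System.out.println("Wrote " + items);

        return steps.get("transactionStep")
                .<String, String>chunk(1000)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
    }

    @Bean
    public Job transactionJob(JobBuilderFactory jobs, Step transactionStep) {
        return jobs.get("transactionJob").start(transactionStep).build();
    }
}
```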
Confidential
Technical Solutions Architect
Responsibilities:
- Prepared conceptual process and future process flows
- Designed and executed an automated end-to-end process workflow engine
- Designed architecture and technical governance guidelines
- Built API/Interface specification
- Provided solution approaches for infrastructure and technologies
- Documented all technical requirements and designed technical solutions utilizing enterprise architecture standards
- Documented all solution components and configurations
- Migrated a legacy Ruby on Rails application into jBPM
- Involved in System integration
- Used a MongoDB data model architecture
- Performed load tests on the jBPM process/rules engine with 1,000 concurrent users
- Proof of concept: demonstrated a generic workflow for the lab ingestion automation process
- Installed and set up QA, staging and production server environments on Linux
- Integrated RabbitMQ messaging for posting lab results to various vendor hubs (see the publisher sketch after the technology list below)
- Configured clustering and failover in RabbitMQ
- Used the AMQP 0-9-1 model in RabbitMQ
- Installed Splunk Enterprise v6.0 in AWS; used Splunk components such as forwarders, indexers & search heads
- Configured Splunk data models, pivots and apps
- Automated Splunk reports and alerts
- Hosted the app on Pivotal Cloud Foundry PaaS
- Used Red Hat OpenShift 3.4 to deploy container-based microservice applications
Technologies: Java 1.8, Ruby on Rails, Splunk Enterprise, JBoss EAP 6.4, BPM Suite, BRMS, RabbitMQ 3.6.3 & Fuse Engine
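A minimal sketch of posting a lab result over AMQP 0-9-1 with the RabbitMQ Java client, as referenced above; the broker host, queue name and payload are illustrative placeholders.

```java
import java.nio.charset.StandardCharsets;

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
import com.rabbitmq.client.MessageProperties;

public class LabResultPublisherSketch {

    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost"); // hypothetical broker host

        Connection connection = factory.newConnection();
        Channel channel = connection.createChannel();

        // Durable queue so queued lab results survive a broker restart.
        String queue = "lab.results";
        channel.queueDeclare(queue, true, false, false, null);

        // Hypothetical JSON payload; published as a persistent message.
        String payload = "{\"labId\":\"L-001\",\"status\":\"FINAL\"}";
        channel.basicPublish("", queue,
                MessageProperties.PERSISTENT_TEXT_PLAIN,
                payload.getBytes(StandardCharsets.UTF_8));

        channel.close();
        connection.close();
    }
}
```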
Confidential
Platform Architect
Responsibilities:
- Prepared the product road map
- Designed architecture and technical governance guidelines
- Built API/Interface specification
- Provided solution approaches for infrastructure and technology
- Involved in System and Technical Architecture design
- Involved in System integration
- Middleware solution to integrate multiple systems
- Created End to End POC
- Oracle ATG Web Commerce 10.2
- Designed External and Internal Applications
- Worked on Customer Facing Cluster
- Implemented Endeca Search, the agent-facing cluster, Asset Management and the ATG DW Load Server
- Used CIM for the ATG e-commerce installation
- Configured Spark 2.0 and used its APIs (see the sketch after the technology list below)
- Hosted the app on Red Hat OpenShift PaaS
- Used Lightning anyDB / Lightning Connect
Technologies: JBoss EAP 6.4, BPM Suite, BRMS & Fuse Engine, Apigee API, Oracle ATG Web Commerce 10.2 and AWS Data Architecture
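A minimal sketch of the Spark 2.0 Java API usage noted above; the local master, input path and column names are illustrative assumptions, not project specifics.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class OrderAnalyticsSketch {

    public static void main(String[] args) {
        // Local master for illustration; a real deployment would submit to a cluster.
        SparkSession spark = SparkSession.builder()
                .appName("order-analytics-sketch")
                .master("local[*]")
                .getOrCreate();

        // Hypothetical JSON input; the "status" and "region" columns are assumptions.
        Dataset<Row> orders = spark.read().json("data/orders.json");

        orders.filter(orders.col("status").equalTo("COMPLETED"))
              .groupBy("region")
              .count()
              .show();

        spark.stop();
    }
}
```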