Senior Consultant Resume
SUMMARY:
- Overall 17+ years of extensive hands-on experience in application development, design, performance tuning, data modeling, and testing, now managing a small, focused team of engineers.
- Extensive application development experience, with a strong interest in development using Node.js, Express.js, Java/J2EE, AWS, MongoDB, and PostgreSQL.
- Drawing on experience with clients in the healthcare, finance, media, food, retail, and manufacturing domains, I have built a strong foundation for analysis, design, and problem resolution. With most clients having globally distributed teams, I have a good understanding of resource utilization and team communication.
TECHNICAL SKILLS:
Software Development: JavaScript, ES6, Node.js, Express.js, React, Java/J2EE, Spring 4.x, TDD, Terraform, AWS CloudFormation, C#, VB.NET, JSON, YAML, HTML, XHTML, XML, XSLT, XPath
Database: MongoDB 4.2/4.0/3.2.9 (NoSQL), MongoDB Atlas (cloud), ElastiCache (Redis), AWS DynamoDB, AWS RDS (PostgreSQL), PostgreSQL 9.4/9.5/9.6 (PL/pgSQL), EnterpriseDB Postgres, EDB Failover Manager, pgpool, pgbench, Redis 4.x, SQL Server 2014/2012/2008 R2/2005/2000/7.0, SSIS (SQL Server Integration Services), SSRS (SQL Server Reporting Services), SSAS (SQL Server Analysis Services), Oracle PL/SQL 8i/9i/10g/11g, expertise in shell scripting (UNIX/AIX/Solaris environments) as well as PowerShell 7.x
Server Products: Kubernetes (AWS EKS), Container Services (AWS ECS), Docker, AWS SNS, SES, AWS Lambda, EC2, AWS Athena, AWS Pinpoint, Apache Kafka, MongoDB Ops Manager, PEM (PostgreSQL Enterprise Manager), Idera DM, RHEL 6.x/7.x/8.x, Windows 2012 Data Center/2008, VMware ESXi 5.5
Other Tools: CircleCI for CI/CD, AWS X-Ray, JMeter 5.x, Redline, AWS CodeDeploy, CloudWatch, Red Gate, database design/ER modeling using ERwin 9.5, DbSchema, Embarcadero Data Architect 10.0, Toad Data Modeler (including PostgreSQL), SAP PowerDesigner 15.0, Enterprise Architect, 4+1 views and UML design with Visio; configuration management with Visual SourceSafe, Git, TFS (Team Foundation Server), and CVS/SVN; application defect tracking with Mercury TestDirector, Bugzilla, and JIRA
PROFESSIONAL EXPERIENCE:
Confidential
Senior Consultant
Responsibilities:
- Captured and documented the current state of the CRM solution used by the marketing team. This included a series of meetings and identifying the MVP needed for success.
- Performed capacity planning for the CRM solution, comparing the full customer base (approx. 368 million customers) against the 52 million in scope (29 million loyalty and 23 million non-loyalty customers), covering roughly 42 profile attributes and 40 CAF attributes. The capacity planning spanned multiple areas, including MongoDB, Snowflake, the AWS services to be used, and the credits to be consumed in Braze.
- Designed and developed REST APIs providing coupons, deals, and filtered deals to Braze, sized to handle 250k calls/min. The solution was tested using Redline/JMeter scripts, with test data generated by ad-hoc Node.js code from a schema. The business driver was to target 20+ million customers within a 3-hour lunch-time window and a dinner-time window across all four time zones.
- Created an end-to-end solution for curbside pickup at Pizza Hut: capturing the order status from SUS/NPC, sending an SMS to the customer, and informing the store when the customer responded with an approved message, while also handling the relevant negative scenarios and keeping an audit trail for legal purposes. The integration used AWS Pinpoint (SDK), Twilio (SDK), and Braze (REST API); all three SMS providers were included because this SMS-handler service was later enhanced to support any corporate need to engage customers via SMS.
- Designed and developed the REST APIs for customer profiles/CRM IDs, capturing customer communication preferences and profile data; this also included designing the MongoDB collections. The API was built to handle up to 5k calls/min and was tuned over time to handle 11k calls/min during Friday peak loads in April 2020. It also published updates/changes to Kafka for downstream consumers.
- Designed and developed a process to sync computed attributes from Snowflake to MongoDB 4.0 for approximately half a million customers each day, with a planned full sync of 50 million customers in under 12 hours. This posed a significant challenge and required an extensive solution to ensure the real-time APIs were not impacted.
- Worked on new project initiatives, including streamlining purchase events into the CRM platform so the marketing team could tweak their campaigns and benefit from orders placed by customers.
- Helped the team implement the text-to-join feature to identify, correlate, and target customers. This was implemented using Lambda layers.
- Redesigned a few services to improve performance using an ElastiCache (Redis) layer, caching essential data about products, coupons, stores, and strategies to better capture essentials and send data to the CRM platform.
- Designed and developed a Kafka message consumer in Node.js to capture profile updates and update Braze accordingly.
- Designed and enhanced a Kafka consumer to capture phone numbers and key fields, add aliases to Braze profiles, and assign them to the SMS subscription group.
- Created Lambda functions to capture unsubscribe email/SMS events coming from Currents and update profiles in MongoDB.
- Created AWS Lambda functions in Java 8 to handle requests for loyalty customers, wherein SUS/NPC or the call center looked up customers by phone number or email address.
- Created AWS Lambda functions in Java 8 to handle requests from external systems, providing CRM IDs based on punch sys id (from the Punchh system) or qo cust guid (from the Quick Order system). This was a quick-and-dirty solution to aid rapid assessment of integration with these systems; the plan was to enhance them into full-fledged microservices.
- Created batch jobs to fetch and load provided "reject-keywords" and "approved-names" files from AWS S3 and tag each profile according to whether its first name is suitable.
- Created a shell-script-based process to monitor and alert if messages on a Kafka topic lag by more than a designated threshold.
- Created a shell-script-based process to monitor and alert if jobs are delayed by more than 10 minutes.
- Created a process to capture a snapshot from Braze and sync it to MongoDB 4.0 and Snowflake, to aid automated testing.
- Created a process to upload coupons (from QO) to MongoDB, loading 20 million coupon codes in under 1 minute.
- Designed and developed a process to offload used coupons from MongoDB to an S3 location.
- Designed and developed a process to capture the NFL schedule from the official website and send relevant campaigns to fans.
- Implemented an end-to-end CI/CD solution using CircleCI (mono-repositories) and Terraform 0.11 to deploy artifacts (ECS services, S3 buckets, and IAM profiles).
- Estimated and planned the migration of MongoDB from 4.0.20 to 4.2.x.
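The Kafka consumers above follow a common consume-transform-push pattern: read a profile-update message, transform it, and push it to Braze. A minimal Node.js sketch of the transform step (the event shape, field names, and the `/users/track` payload fields are illustrative assumptions, not the production schema):

```javascript
// Sketch of the transform step in a Kafka -> Braze profile-update consumer.
// Event shape and attribute names are illustrative assumptions.

// Parse a raw Kafka message value (Buffer or string) into a profile event.
function parseProfileEvent(rawValue) {
  const event = JSON.parse(rawValue.toString());
  if (!event.crmId) throw new Error('profile event is missing crmId');
  return event;
}

// Map a profile-update event to a Braze /users/track request body.
function toBrazeTrackPayload(event) {
  return {
    attributes: [
      {
        external_id: event.crmId, // Braze external user id
        email: event.email,
        loyalty_member: Boolean(event.loyalty),
      },
    ],
  };
}

// Example: one message value as it might arrive from the topic.
const payload = toBrazeTrackPayload(
  parseProfileEvent('{"crmId":"c-1","email":"a@b.com","loyalty":true}')
);
console.log(payload.attributes[0].external_id); // prints "c-1"
```

In production the consumer wiring (e.g. a kafkajs `eachMessage` handler) would call these functions and POST the payload to Braze; keeping the transform pure makes it straightforward to unit-test without a broker.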
Confidential
Lead Application Developer
Responsibilities:
- Worked on new features/stories allocated during sprint cycles, developing backend APIs using Node.js and Sequelize along with unit test cases (using Jest).
- Worked with the team to refine the data model and run stress tests across multiple functional areas; used the stress-test results to define the indexing strategy for the tables.
- As a single point of contact, implemented high-availability/failover PostgreSQL databases from scratch (v9.5). This was initially done with pgpool-II on the QA and stress servers (with streaming replication) and later switched to EDB Failover Manager (streaming replication) once the client procured licenses; it has now been running in production for over four months.
- Implemented routine cron jobs for backup and recovery and for managing logs and WAL files, designed with point-in-time recovery (PITR) in mind.
- Implemented monitoring tools, including PEM (PostgreSQL Enterprise Manager), for two instances (production and non-production).
- Installed and configured PostGIS for the spruce application.
- Installed and configured Redis for the client; in the stress and production environments it was built from source and set up with 2-node replication.
- Supported an implementation of MongoDB (3.4.3) as a DBA/developer for one application in the POC stage, currently used in only 2 pilot stores.
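For the Node.js/Sequelize API work above, one common way to keep the Jest unit tests database-free is to inject the model into the handler so tests can pass a stub. A minimal sketch (the `Store` model and route shape are hypothetical, for illustration only):

```javascript
// Handler factory: the Sequelize model is injected so unit tests can stub it.
// `Store` and the response shape are hypothetical, not the real schema.
function makeGetStoreHandler(Store) {
  return async function getStore(req, res) {
    const store = await Store.findByPk(req.params.id); // Sequelize primary-key lookup
    if (!store) {
      res.status(404).json({ error: 'store not found' });
      return;
    }
    res.status(200).json(store);
  };
}

// In a Jest test the model becomes a stub, so no database is needed:
//   const handler = makeGetStoreHandler({ findByPk: jest.fn().mockResolvedValue(null) });
//   // invoke with fake req/res objects and assert the 404 path

module.exports = { makeGetStoreHandler };
```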
Confidential
Lead Application Developer
Responsibilities:
- At AIM Specialty Health, they have a couple of applications that analyze and automatically approve or disapprove test requests based on certain algorithms. I worked as lead developer and DBA on their 302 project, where we defined the collections (in MongoDB 3.4) and 7 tables (in PostgreSQL 9.5) that were central to RAD - Oncology and completed the capacity planning and cost estimation, including setting up the on-premise infrastructure.
- At VCE, they build virtually converged platforms and deliver ready-to-use vBlocks, VxRails, etc. to clients. To capture requirements and configure, quote, and price these platforms, they have a couple of web applications using SQL Server 2012, PostgreSQL 9.4, and MongoDB on the backend, with Java 8, AngularJS, Node.js, and Express.js on the front end.
- I was also instrumental in supporting REST APIs developed using J2EE 1.7 and the Spring Framework to provide current customer/vendor data for VCE. This was also being developed so that customers could configure their vBlocks much as we configure our laptops, covering not only the hardware but also the networking options they had selected.
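Capacity planning like that mentioned above often starts from simple arithmetic over record counts and average document sizes. A toy helper, with placeholder numbers rather than the engagement's actual figures:

```javascript
// Rough storage estimate for a document collection; every input is a placeholder.
function estimateStorageGiB({ docCount, avgDocBytes, indexOverhead = 0.3, replicas = 3 }) {
  const dataBytes = docCount * avgDocBytes;            // raw document data
  const withIndexes = dataBytes * (1 + indexOverhead); // assumed ~30% index overhead
  return (withIndexes * replicas) / 2 ** 30;           // bytes -> GiB, across the replica set
}

// e.g. 50 million documents at ~2 KiB each:
console.log(estimateStorageGiB({ docCount: 50e6, avgDocBytes: 2048 }).toFixed(0)); // prints "372"
```

The same shape of calculation extends to call-rate sizing (calls/min × payload size) and per-message pricing for credit-based services.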
Confidential
Database Architect
Responsibilities:
- The client intended to implement a complete corporate data warehouse based on the Ralph Kimball methodology.
- At BCBSM, they have implemented the CPDM data mart, which houses data ranging from basic subject areas such as members and providers to data about claims, healthcare indicators, etc., totaling 6+ TB. This data supports many BCBSM initiatives to reduce operating costs and to evaluate and reward the healthcare providers who help keep costs down.
- At JLL, they have close to 200 end-client contracts, with each client's data and transformations managed in a dedicated DB instance. As they acquire more clients, they are standardizing jobs/reports and creating transformation packages for each client. The collective size of all DBs was more than 10 TB at the time, with many tables holding billion+ records. Worked on all aspects of the data, along with the Tracker application that provided insight into the underlying data. The solution was built using SSIS packages, C#, ASP.NET, jQuery, Telerik controls (RadScriptManager, Rad controls), HTML, and CSS.
Mindlance (Client: Motorola)
Confidential
Senior Developer
Responsibilities:
- The client extracts data from GOLD (a retail management system) and pushes it into Point of Sale (POS) and Lawson (accounting) systems. One part was writing jobs in Java to extract data in XML format from multiple endpoints, including warehouses, stores, and the vendor portal, and place it in a shared location; an analytical part pushed data into a data warehouse to develop reports and trend lines for business decisions. Worked extensively on both ends to ensure data consistency and avoid any staleness. This was done using AIX 4.2, J2EE 1.5, Spring, Hibernate, Oracle PL/SQL, SSIS 2008 packages, and SQL Server 2008.
- Worked on the integration and application built in C# and ASP.NET, and on the reports built using SSRS.