Agile Transformation Manager, Resume
Sacramento, CA
SUMMARY
- 14 years of experience at Confidential providing high-quality consultancy services in business areas such as Banking, Insurance, Healthcare, and Product Development.
- Provided technical consultancy in roles including: Data Analytics Lead, Data Migration & Integration Consultant, Agile Transformation Manager, Technical Project Manager & Scrum Master, Product Owner, and SAFe Release Train Engineer (RTE).
- Advanced technical and problem-solving skills; takes a creative approach to answering key business questions, uncovering core issues and discovering opportunities for significant financial impact.
- Experience generating predictive mathematical models from historical data to estimate future resource and product demand.
- Experience with machine learning techniques: supervised vs. unsupervised learning, linear regression with one and multiple variables, polynomial regression, hypothesis generation, and data requirements.
- Extensive experience building interactive visualization dashboards using SAS Visual Analytics (SAS VA) and Tableau.
- Experienced in synthesizing diverse, complex information from business and technical teams to develop a compelling story with data and insights; developed knowledge material for early adopters and end customers.
- Excellent knowledge of data warehouse concepts and dimensional data modeling using the Ralph Kimball methodology.
- Experienced with most software delivery methodologies, including Waterfall, Iterative, Agile (Scrum/Kanban/TDD), and Rapid Application Development (RAD).
- Extensive experience working with OLTP and OLAP databases and designing dimension and fact models as star, snowflake, and galaxy schemas.
- Proficient in normalization/denormalization techniques in relational and dimensional database environments, with hands-on experience applying 1NF/2NF/3NF for optimum performance.
- Technical expertise to engage and lead teams working on technologies such as advanced data analytics and predictive analytics using Big Data: Hadoop, MapReduce, Hive, Sqoop, Python, Perl, Unix/Linux, Informatica, Teradata, SAS Visual Analytics, Base SAS, SAS EG, TIBCO Spotfire, Java, R, and Tableau.
- Experience administering SAS high-performance analytics servers in a distributed architecture using SAS Management Console, SAS Enterprise Guide, and SAS Schedule Manager.
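As a rough illustration of the single-variable linear regression mentioned above, a least-squares fit over historical demand can be sketched in plain Python (the data and variable names are hypothetical, not from any actual engagement):

```python
# Minimal sketch of single-variable linear regression via least squares,
# one of the supervised techniques listed above. Data is illustrative.
def fit_line(xs, ys):
    """Return (slope, intercept) minimizing squared error."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical quarterly demand history (quarter index -> units shipped).
quarters = [1, 2, 3, 4, 5, 6]
demand   = [100, 110, 125, 130, 145, 150]
m, b = fit_line(quarters, demand)
forecast_q7 = m * 7 + b   # projected demand for the next quarter
```

The same closed-form approach extends to multiple variables via the normal equations, and to polynomial regression by fitting on powers of the input.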
PROFESSIONAL EXPERIENCE
Agile Transformation Manager, Confidential, Sacramento, CA
Responsibilities:
- Lead Agile transformation and adoption in the R&D product development division.
- Lead workshop sessions with the product manager and product owners to define the minimum viable product (MVP) and its associated key features.
- Lead sessions on product feature scope, estimation, and prioritization, aligning features to the product MVP and rollout plan.
- Conduct Agile workshops/transformation sessions with Scrum Masters, product owners, and other stakeholders to explain Scaled Agile Framework (SAFe) adoption at the enterprise and team levels.
- Manage Agile Release Train dashboards in IBM RTC (IBM Jazz) showcasing teams' progress: burn-down charts, release burn-ups, feature burn-ups, and velocity charts.
- Lead system demo sessions to present the continuously evolving, integrated product to all stakeholders.
- Address program- and project-level risks in regular Agile ceremonies at a regular cadence.
- Lead Program Increment (PI) planning sessions and PI events with 7 cross-functional teams spread across multiple locations.
- Lead scrum-of-scrums, system demos, catch-up sessions with product and program owners, and sprint review and retrospective sessions.
- Conduct weekly process-improvement sessions with management and key stakeholders.
- Resolve anti-Agile patterns through regular feedback and knowledge-sharing sessions.
- Conduct inspect-and-adapt workshops using fishbone diagrams and 5 Whys techniques.
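The numbers behind the burn-down and velocity charts tracked above reduce to simple arithmetic over story points; a minimal sketch (all point values hypothetical, not taken from any real release train):

```python
# Hypothetical data behind a velocity chart and a release burn-down,
# as plotted by a tool like IBM RTC. Story-point values are illustrative.
completed_per_sprint = [21, 18, 24, 20]       # points accepted each sprint
velocity = sum(completed_per_sprint) / len(completed_per_sprint)

pi_scope = 200                                # total points planned for the PI
burn_down = [pi_scope]
for done in completed_per_sprint:
    burn_down.append(burn_down[-1] - done)    # remaining work after each sprint
# burn_down holds the data points a release burn-down chart would plot.
```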
Confidential, Houston, TX
Responsibilities:
- As Data Analytics Lead, responsible for capturing and assessing business needs to utilize vast amounts of heterogeneous data for visual analytics and predictive analysis.
- Led a team designing and building visualization reports in SAS Visual Analytics using Data Builder/Data Preparation, Report Designer, and Data Exploration.
- Prepared a data presentation strategy for business units to provide insights into fire-sale, up-sell, and cross-sell opportunities.
- Assessed business needs to consolidate data for predictive modelling, providing insights through market basket analysis, affinity analysis, association, classification, and clustering.
- Acted as Agile Scrum Master to deliver continuous business value in sprints by conducting sprint planning, grooming sessions, sprint reviews, retrospectives, and daily scrums.
- Developed a visual dashboard and analytical platform prototype providing BI and analytics capabilities for ad-hoc analysis and insights into enterprise-wide resource and asset allocation.
- Assessed existing dashboards and their underlying business logic to translate them to SAS VA, accessing data directly from Hadoop HDFS instead of from standalone applications/DBs or Excel files.
- Ingested data from ad-hoc SLB applications into a common Hadoop HDFS data lake, then sorted, matched, merged, and cleaned the enterprise data to load dimension and fact tables in a snowflake schema.
- Worked with BPMs to design a predictive module estimating resource allocation for jobs deployed at the wellsite: drilling & measurement, well services, well integration services, asset allocation, and product deployment.
- Prepared logical and physical data models using Erwin, and a common data model for different segments of the organization using a galaxy schema with more than 4 fact tables and more than 10 dimension tables.
- Set up connectivity to Oracle, Hadoop HDFS, and other databases using SAS/ACCESS, PROC SQL, and SQL pass-through to access data directly for implicit and explicit processing.
- Worked extensively on sorting, merging, concatenating, and modifying SAS data sets in Base SAS to generate the datasets loaded into the SAS LASR Analytics Server.
- Refined and pruned machine learning models to generate resource-demand forecasts from market-driven indicators.
- Read data from the Hadoop HDFS data lake and external Oracle tables by writing new SAS programs using the SQL pass-through facility and LIBNAME methods, creating study-specific SAS datasets used as source datasets for report-generating programs in SAS VA.
- Administered SAS products and servers (physical and virtual memory) using SAS Management Console, SAS Enterprise Guide, Unix shell scripts, Unix cron, and SAS Schedule Manager.
- Administered user access to SAS VA reports, managing access levels for secure vs. public reports.
- Administered the SAS LASR Analytics Server's physical and virtual memory utilization across root and worker nodes in a SAS high-performance deployment on Hadoop.
- Started and stopped SAS LASR Analytics servers; loaded, unloaded, registered, and de-registered tables on the LASR servers.
- Administered SAS VA LASR servers across development and production environments, including report and SAS dataset migration to Dev/QA and Prod.
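The dimension-and-fact structure loaded above can be illustrated with a minimal star-schema sketch; the example below uses Python's built-in sqlite3 rather than the actual Hadoop/SAS stack, and all table and column names are hypothetical:

```python
import sqlite3

# Minimal star-schema sketch: one fact table keyed into one dimension,
# aggregated with a join. Names are hypothetical, not from any real model.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_service (service_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_job (job_id INTEGER PRIMARY KEY,
                           service_id INTEGER REFERENCES dim_service,
                           hours REAL);
    INSERT INTO dim_service VALUES (1, 'drilling'), (2, 'well services');
    INSERT INTO fact_job VALUES (10, 1, 8.0), (11, 1, 6.5), (12, 2, 4.0);
""")
rows = con.execute("""
    SELECT d.name, SUM(f.hours)
    FROM fact_job f JOIN dim_service d USING (service_id)
    GROUP BY d.name ORDER BY d.name
""").fetchall()
```

A snowflake schema differs only in that the dimension itself is further normalized into sub-dimension tables.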
Confidential, San Jose, CA
Responsibilities:
- Coached teams to understand Agile values and principles as expressed in the Agile Manifesto.
- Developed roles pertinent to programs/portfolios that accommodate the SAFe framework.
- Facilitated release planning; conducted Scrum of Scrums (SoS), PO sync meetings, Program Increment workshops, system demos, and inspect-and-adapt sessions during Program Increments (PIs).
- Coached each team's Scrum Master on program-level SAFe events, including Program Increment planning, system demos, innovation planning sprints, inspect-and-adapt sessions, Scrum of Scrums, and release train sync-ups with the Product Manager/Product Owner.
- Assessed the Scrum maturity of multiple teams and coached them to higher levels at a pace that was sustainable and comfortable for each team.
- Resolved anti-Scrum patterns that had seeped into the process by conducting workshop sessions with different stakeholders.
- Reorganized portfolio project selection and funding to support Agile Release Trains and program/team alignment.
- Worked with Program Portfolio Management teams on release planning and the cadence of SAFe ceremonies across all levels (Portfolio, Value Stream, Program, and Team).
- Implemented continuous integration (CI) within sprints in a PI cycle, reducing defects by over 40%.
- Guided teams on the benefits of continuous integration: committing code regularly to a common repository within the sprint, reducing post-integration effort in subsequent sprints and increasing overall velocity.
- Resolved issues affecting team velocity by working extensively to reduce technical debt in each sprint, increasing the team's velocity by 30% over a span of 4 sprints.
- Enacted change to remove impediments and helped team to increase quality of deliverables by over 30%.
- Met the goal of addressing and resolving issues within 24 hours by removing impediments faced by the team and managing stakeholders' expectations.
- Coached the Product Owner on backlog refinement and alignment with the project vision and business expectations.
- Conducted spike sessions during the sprint to elicit information on unknown requirements, helping the Product Owner write future requirements as user stories.
- Conducted sprint retrospectives using innovative techniques (dot voting, acceleration techniques) to help overcome recurring issues impacting the team's performance.
- Hands-on experience using CA Agile Central (Rally) to track, manage, and facilitate Scrum ceremonies.
- Worked with teams using a technology stack including Puppet Labs, Jenkins (for continuous integration), Git, TIBCO, OSB, RabbitMQ, OpenStack, MuleSoft, and Layer 7.
Confidential (Intelligence), London
Responsibilities:
- Assessed migration readiness of the existing SAS 9.2 client platform through deep-dive and impact analysis.
- Executed the SAS Migration Utility to check the installed SAS products and the migration readiness of each for version 9.4.
- Prepared a migration plan with resource mapping, cost estimates, a dependency chart, and test schedules.
- Led end-to-end system, validation, and regression testing of SAS programs, reports, and jobs.
- Automated new and existing jobs and performed upgrades, patches, hotfixes, backups, and maintenance-release installations of the SAS environments.
- Created new SAS programs and fine-tuned existing ones for better performance in SAS 9.4 using SAS procedures, SQL, macros, SAS/CONNECT, and SAS/ACCESS.
- Imported data from SQL Server and Excel into SAS datasets; performed data manipulation by merging several datasets and extracting relevant information.
- Developed new or modified SAS programs to load data from the source and create study-specific datasets, used as source datasets for report-generating programs.
- Built SAS datasets from the clinical database and developed programs using the LIBNAME statement and SAS views to extract and report error statistics.
- Developed SAS macros, templates, and utilities for data cleaning and reporting.
- Used various components of the INPUT and INFILE statements to process raw data files.
- Used procedures such as PROC FREQ, PROC MEANS, PROC SORT, and PROC PRINT.
- Prepared Unix shell scripts for starting and stopping SAS services such as the Metadata Server, as well as mid-tier services such as the SAS application service and the WebLogic SAS managed server.
- Successfully deployed a two-tier SAS Development and UAT environment to support smooth future upgrades and code releases.
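The summary steps above (PROC FREQ for categorical counts, PROC MEANS for descriptive statistics) have a direct analog in any language; a hedged Python-stdlib sketch over hypothetical records, not the actual clinical data:

```python
from collections import Counter
from statistics import mean, stdev

# Illustrative analog of PROC FREQ and PROC MEANS: frequency counts of a
# categorical variable and descriptive stats of a numeric one.
# The records and field names are hypothetical.
records = [
    {"site": "A", "value": 12.0},
    {"site": "B", "value": 15.5},
    {"site": "A", "value": 11.0},
    {"site": "A", "value": 13.5},
]
freq = Counter(r["site"] for r in records)      # like PROC FREQ on site
values = [r["value"] for r in records]
stats = {"n": len(values), "mean": mean(values), "std": stdev(values)}
```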
Confidential, London
Responsibilities:
- Responsible for the initial design of data extraction, transfer, and load (ETL) from source systems into the existing Lloyds Internet Banking application and database.
- Designed the end-to-end ETL flow: extracting tables from source into a staging zone, moving refined and consolidated data to a transfer zone, and loading the finalized, consolidated data into the OLTP and OLAP databases.
- Prepared conceptual and physical entity-relationship diagrams to define the enterprise-wide data architecture.
- Held meetings with data architects and database administrators to help define "as-is" data and "to-be" system processes and the proposed databases.
- Planned implementation strategy with DBAs for complex data migration releases.
- Planned data migration, reverse migration, and bulk migration strategies with DBAs and system analysts.
- Led a team of onshore and offshore developers to design, develop, and support ETL, data migration, and data visualization schedules.
- Led the team creating business intelligence reports to generate business value.
- Designed a scheduled migration process merging 6 million active HBOS customers into the existing Lloyds Internet Banking application and database.
- Played an active part in database design, data modeling, and data validation during the ETL process using SAS DI Studio and mainframe batch jobs.
- Prepared detailed designs to migrate customer data from source to target using ETL processes across migration cycles: bulk migration, reverse migration, on-demand migration, and ad-hoc migration.
- Planned bulk data migration, reverse migration, and ad-hoc migration and integration strategies with DBAs for complex data migration releases spread across months.
- Prepared tech spec documents and liaised with test teams to verify QA test cases and test scripts and to plan system testing, integration testing, and UAT.
- Prepared activity diagrams, screen-flow diagrams, and flowcharts for various internet banking modules.
- Used data models to create use cases, workflows, and system designs.
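The staged flow described above (source extract to staging, consolidation in a transfer zone, final load) can be sketched as three functions; the record layout and cleansing rules below are hypothetical illustrations, not the actual migration logic:

```python
# Sketch of a staged ETL flow: extract to staging, refine/consolidate in a
# transfer zone, then load finalized records. Fields are hypothetical.
def extract(source_rows):
    return list(source_rows)                  # staging zone: raw copy

def transform(staged):
    seen, transfer = set(), []
    for row in staged:
        if row["customer_id"] in seen:        # consolidate duplicate customers
            continue
        seen.add(row["customer_id"])
        transfer.append({**row, "name": row["name"].strip().title()})
    return transfer

def load(transfer, target):
    target.extend(transfer)                   # final load into the target store
    return target

source = [{"customer_id": 1, "name": " alice smith "},
          {"customer_id": 1, "name": "ALICE SMITH"},
          {"customer_id": 2, "name": "bob jones"}]
target = load(transform(extract(source)), [])
```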
Confidential
Responsibilities:
- Coordinated with the ETL team, DB administrators, and BI teams to promote data model changes through the system.
- Prepared scripts for model and data migration from DB2 to the new appliance environments.
- Analyzed database performance with SQL Profiler and optimized indexes to significantly improve performance.
- Automated Informatica session load runs through Unix cron and PL/SQL scripts; implemented pre- and post-session scripts and automated load-failure notification via email.
- Developed logical and physical data models capturing current-state/future-state data elements and data flows using Erwin and star-schema design.
- Produced mapping spreadsheets providing the Data Warehouse development (ETL) team with source-to-target data mappings, including logical names, physical names, data types, domain definitions, and corporate metadata definitions.
- Converted logical database models into physical models to build and generate DDL scripts.
- Maintained warehouse metadata, naming standards, and warehouse standards for future application development.
- Used ETL extensively to load data from DB2 and Oracle databases.
- Performed data profiling across multiple sources and answered complex business questions by providing data to business users.
- Worked with data investigation, discovery, and mapping tools to scan data records from many sources.
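The logical-to-physical conversion step above amounts to emitting DDL from a model description; a minimal hedged sketch, with a hypothetical one-table model (the real models were built in Erwin, not hand-coded):

```python
# Hypothetical sketch of generating a DDL script from a logical model,
# as in the logical-to-physical conversion described above.
model = {
    "customer": {"customer_id": "INTEGER PRIMARY KEY",
                 "name": "VARCHAR(100)"},
}

def to_ddl(model):
    """Emit one CREATE TABLE statement per table in the model."""
    stmts = []
    for table, cols in model.items():
        body = ",\n  ".join(f"{col} {ctype}" for col, ctype in cols.items())
        stmts.append(f"CREATE TABLE {table} (\n  {body}\n);")
    return "\n".join(stmts)

ddl = to_ddl(model)
```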
Confidential
Responsibilities:
- Performed analysis, design, and coding for incidents raised.
- Engaged in maintenance and support of existing systems.
- Took up enhancements and changes to existing systems on a periodic basis.
- Prepared unit test plans and test cases and executed them.
- Translated technical specification documents into code.
TECHNICAL SKILLS
- Database Specialties: Database/ETL Architecture & Design, Data Analysis, Enterprise Data Warehouse, DB Design and Modeling, Data Integration and Migration, Data Warehouse, OLTP, OLAP.
- Data Modeling Tools: Erwin Data Modeler, ER Studio, MS Visio, Star-Schema Modeling, Snowflake-Schema Modeling, FACT and Dimension Tables, Pivot Tables.
- Languages/Tools: SAS, Python, Perl, C++, Java, R, SAS Visual Analytics, SAS Enterprise Guide, SAS DI Studio, SAS Management Console, SAS/ACCESS, SAS/CONNECT, SAS ETL, SAS/HDAT, SAS Hadoop, SAS 9.x, SAS scripting, Hue, Sqoop, MapReduce, Hadoop.
- Reporting Tools: SAS BI Platform, MS Visio, Erwin Data Modeler, Power Designer.
- Databases: MS SQL Server, Oracle, Teradata, IBM DB2, Hadoop HDFS, SAS/ACCESS to RDBMS, SAS/CONNECT, SAS/SHARE, SAS OLAP.
- Operating Systems: Windows 95/98/XP/Vista/7/8/10, Unix, Linux, Mainframes, Unix shell scripting.
- Batch/Server: Unix, PuTTY, SSH, SAS scripting, Unix shell scripting, Perl, Linux.