Consulting, Analytics Resume
SUMMARY:
- Over 12 years of industry experience spanning Consulting, Analytics and IT across the USA, Europe and the Asia Pacific region. Professional experience spans strategy through implementation, with Banking / Financial Services as the key industry. Core competencies are in Data Management, Data Quality, MIS & Analytics (Credit Risk & AML). Specialization includes setting up a big data Analytics Centre of Excellence, statistical analysis (techniques: pattern classification, regression, univariate statistics) and synthesis/dissemination of data analyses to general and technical audiences.
TECHNICAL SKILLS:
Platform: Mainframe - TSO, ChangeMan package, CA7 scheduler & NDM; Sun Solaris - Control-M; Linux; Windows Server NT; Windows 2000 & MS-DOS
Business Intelligence: SAS/Base (Macro), SAS-EG, SAS-ETL, SAS/Graph & SAS DI (Enterprise Guide, Information Map Studio, Web Report Studio)
Big Data: SAS with Hadoop (SAS LASR Analytic Server), HBase, Pig, Hive, MapReduce, HDFS & Greenplum
RDBMS: Oracle, Ingres, Teradata & DB2 - UDB V6.1/7.1
Language/Package: C, JCL, COBOL, FoxPro, Visual FoxPro, Developer 2000 & VB
Internet Technologies: ASP, HTML, DHTML, JavaScript, Dreamweaver and MS FrontPage
Office Automation: MS - Office (Word, PowerPoint & Excel)
Statistical knowledge: Multivariate Analysis - Regression analysis, Logistic regression & Cluster analysis
Interest Areas: Risk Analytics, Fraud Analytics, Forecasting, Retail Analytics, Predictive modeling & Statistical research
PROFESSIONAL EXPERIENCE:
Confidential
Responsibilities:
- Running MapReduce jobs in a Hadoop cluster from SAS
- Running Pig scripts in a Hadoop cluster from SAS
- Using HDFS to store SAS data sets
- Accessing Hive data using the SAS/ACCESS Interface to Hadoop (a hedged sketch follows this responsibilities list)
- Worked with NoSQL (Not Only SQL) key/value, schema-less and column-based (HBase) data transformation and validation, alongside Hive SQL
- Leading the team migrating SAS to R across the business, especially on the predictive analytics platform, and developing a framework solution that can be shared across product lines and businesses
- Managing:
- Big Data analytics platform Datameer for data exploration
- Big Data preparation & curation tool - Talend
- The business intelligence and data warehousing development team for big data (Hive) and reporting tools such as SAS, Datameer and Platfora
- Data Management & MIS
- Responsible for data analytics, data extraction, data management and MIS to support the Credit Risk / AML analytics unit, using SAS programming under Windows, UNIX and Mainframe environments
- Implementing Stress Testing
- Defining the stress testing workflow: setting the stage, identifying the risk factors, constructing the stress scenarios, setting the stage for stress measurement, measuring the stress and reporting the stress
- Risk - Retail Analytics
- Responsible for Existing Customer Management - developing and managing credit risk strategies / policies for a $7 Bn retail card portfolio. Policies include line management, authorization (for over-limit and delinquent accounts), inactive closures and re-issue strategies for both Private Label and Cobrand portfolios
- Responsible for creating new challenger strategy processes, including tracking of test strategies and eventual implementation
- Responsible for creating reports, decks and dashboards cutting across various portfolio metrics and important risk drivers (e.g. FICO distribution, vintage loss measurement, strategy performance tracking). In addition, involved in initiatives such as driving efficiency through automation and maintaining proper documentation
- Hands-on experience with the model development process: setting up the definition, selecting data, understanding data, preparing the data sets for modeling, cleaning the data, selecting the variables with the most predictive power, fitting the logistic model by different methods, evaluating the final model, model validation and deriving the business function (a hedged modeling sketch follows this list)
- Responsible for revalidating the models on a periodic basis to ensure that the quality and performance of these models remain at least above the minimum set thresholds
- Population Stability Index (PSI): responsible for analysis of early warnings of model effectiveness, checking whether the scored population differs from the benchmark population
- Characteristic Stability Index and Variable Deviation Index: involved in analysis of how applicant characteristics change over time
- Kolmogorov-Smirnov (KS) statistic: the project verified whether the scorecard can differentiate between the “Good” and the “Bad” distributions; KS measures the score’s effectiveness in separating “good” accounts from “bad” accounts (a hedged sketch of PSI and KS follows this list)
- Responsible for leading the onshore & offshore CRRI team, which performs independent reviews of North America’s retail models for all businesses (Credit Cards, Personal Financial Services and Retail Services)
- Manage and work on both pre- and post-implementation reviews of risk / non-risk retail BAU models
- Led the model monitoring and validation team responsible for all CML (Consumer and Mortgage Lending) risk models
- Developed centralized model validation framework for all the portfolio processes
- Designed and automated the overall validation process for CML models through SAS
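A minimal sketch of the SAS-to-Hadoop work described above, assuming SAS 9.4 with PROC HADOOP and the SAS/ACCESS Interface to Hadoop; the configuration file, credentials, HDFS paths, Pig script and Hive table are placeholders rather than the actual project artifacts.

```sas
/* Sketch only: cluster config, credentials, paths and table names are assumed */
proc hadoop cfg='/opt/sas/hadoop/hadoop-config.xml' username='etl_user' verbose;
   /* Stage a local extract into HDFS for downstream processing */
   hdfs copyfromlocal='/data/extracts/txn_daily.csv'
        out='/user/etl_user/staging/txn_daily.csv';
   /* Run an existing Pig script against the cluster */
   pig code='/data/scripts/prepare_txn.pig';
run;

/* Surface Hive tables through the SAS/ACCESS Interface to Hadoop */
libname hdp hadoop server='hive-node.example.com' port=10000
        schema=risk user='etl_user';

data work.high_value_txn;
   set hdp.transactions;        /* Hive table read as a SAS data set */
   where txn_amount > 10000;    /* illustrative filter only */
run;
```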
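The scorecard modeling steps listed above can be illustrated with a hedged PROC LOGISTIC sketch; the development and validation data sets (model_dev, model_val) and the predictors are assumed names, not the actual scorecard variables.

```sas
/* Sketch only: data sets and predictors are illustrative */
proc logistic data=model_dev outmodel=work.score_model;
   class region / param=ref;
   model bad_flag(event='1') = util_ratio debt_to_income num_inquiries region
         / selection=stepwise slentry=0.05 slstay=0.05;
run;

/* Score a hold-out sample to evaluate and validate the final model */
proc logistic inmodel=work.score_model;
   score data=model_val out=work.val_scored;
run;
```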
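A hedged sketch of the PSI and KS monitoring described above. It assumes the banded score distributions (band_pct_current, band_pct_benchmark, each with score_band and pct columns) and the scored account file (scored_current with score and bad_flag) have already been prepared; all names are illustrative.

```sas
/* PSI = sum over score bands of (actual% - expected%) * ln(actual% / expected%) */
proc sql;
   create table work.psi as
   select a.score_band,
          (a.pct - e.pct) * log(a.pct / e.pct) as psi_term
   from   band_pct_current a
          inner join band_pct_benchmark e
             on a.score_band = e.score_band;

   select sum(psi_term) as psi format=8.4 from work.psi;
quit;

/* KS: maximum separation between the cumulative score distributions of
   "good" and "bad" accounts (two-sample Kolmogorov-Smirnov statistic) */
proc npar1way data=scored_current edf;
   class bad_flag;
   var   score;
run;
```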
Confidential
Responsibilities:
- Transaction Matches with known money-laundering schemes
- Matches with internal or external name lists
- Occurrence of high-risk transactional behaviors
- Suspicious flows of money among accounts
- Redundant personal data
- Statistical analysis of alert thresholds
- Sensitivity analysis of threshold settings (a hedged sketch follows this list)
- Random Client Analysis
- Productive Alert Analysis
- Graphical Representation of alerts
- Project Management
- Responsible for delivery of strategic, cross-functional, highly visible projects within defined time, cost, scope and quality boundaries, in accordance with the company's life cycle methodologies and disciplines.
- Manage complex interactions among multiple technical and business stakeholders spanning multiple businesses, functions and geographies within the company.
- Responsible for facilitating, coordinating and arbitrating cross-functional, macro-level topics, often including key business stakeholders and senior management. Communicate to all stakeholders on a timely basis, often up to senior or executive-level management.
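A minimal sketch of the threshold sensitivity analysis above, assuming an alert-level data set (work.alerts) carrying a transaction amount and a productive-alert flag; the threshold grid and all names are illustrative.

```sas
/* Sketch only: data set, variables and threshold grid are assumptions */
proc sql;
   create table work.sweep (threshold num, alert_cnt num, productive_cnt num);
quit;

%macro threshold_sweep(from=5000, to=20000, by=2500);
   %do t = &from %to &to %by &by;
      proc sql;
         insert into work.sweep
         select &t, count(*), sum(productive_flag)
         from   work.alerts
         where  txn_amount >= &t;
      quit;
   %end;
%mend threshold_sweep;
%threshold_sweep()

/* Graphical representation of total vs productive alerts by threshold */
proc sgplot data=work.sweep;
   series x=threshold y=alert_cnt;
   series x=threshold y=productive_cnt / y2axis;
run;
```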
Environment: Hadoop CDH 5, Hive, HBase, SAS 9.4, SAS Enterprise Guide (EG), SAS/SQL, SAS/GRAPH, SAS/STAT, SAS/Impala, MS Excel, UNIX, Teradata, Oracle database, Datameer, Rev R & Microsoft Project Management (MPP)
Confidential
Responsibilities:
- Data Management & MIS
- Responsible for creating reports based on Confidential customers and providing trend analysis on the various products/sub-products based on various classifications.
- Implement UNIX shell scripts; extract and manipulate data from the database.
- Write SQL and build data inputs in SAS format or other formats for the forecasting model, and conduct data analysis.
- Analyzed the business and user requirements and participated in the creation, preparation and conduct of quality assurance reviews
- Support Cisco's Global Market View Application, an internally derived opportunity sizing and forecasting tool
- Extract, transform and load data from Teradata, Excel sheets, flat files, etc. into SAS datasets using SQL, the FastExport utility and PROC IMPORT (a hedged sketch follows this list)
- Extensively used various SAS data manipulation techniques to validate and cleanse data and conduct monthly trend analysis
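A minimal sketch of the extraction step above, assuming the SAS/ACCESS Interface to Teradata with the FastExport option and PROC IMPORT for flat files; server, credentials, table and file names are placeholders.

```sas
/* Sketch only: connection details, tables and file paths are assumed */
libname td teradata server='tdprod' user='analyst' password='xxxxxxxx'
        database=mktg fastexport=yes;   /* bulk reads via the FastExport utility */

data work.bookings;
   set td.global_bookings;
   where fiscal_qtr = '2009Q3';         /* illustrative filter only */
run;

/* Flat-file inputs via PROC IMPORT */
proc import datafile='/data/inbound/forecast_inputs.csv'
            out=work.forecast_inputs dbms=csv replace;
   getnames=yes;
run;
```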
Environment: SAS 9.1.3, SAS Enterprise Guide, SAS/SQL, SAS/GRAPH, SAS/STAT, MS Excel, UNIX, Oracle database
Confidential
Responsibilities:
- Model Development / Revalidation (Application, Response & Behavioural Scoring - CREDIT RISK)
- Responsible for the final model validation report containing all three types of measures, i.e. stability, strength (KS) and alignment, generated for each scorecard per design
- Vintage Delinquency
- Developed vintage reports on delinquency rates to help top management frame strategies for the future (a hedged sketch follows this list)
- Approval rate and rejection rate
- Generated product-based approval and rejection rates, on a monthly basis, for customers who are solicited and worked
- High- and low-side override rate
- Created override reports to understand the potential hit rate due to policy overrides
- Collection Reporting
- Interacting with the database marketing team and the collections team on new direct mail campaign results and providing MIS for the same
- Tracking charge-off accounts on a vintage basis to identify accounts that charge off early
- Auto Dialer data management for various collection centers
- Campaign Execution
- Worked in the Cross-Sell business unit (part of Consumer and Mortgage Lending Analytics), where the Credit Card, Mortgage Services, Auto Finance, Retail and other markets are analyzed for new opportunities and new products are introduced to customers within these markets
- The models are created according to the end users' requirements. The campaigns are run every month to identify prospective customers, and they generate files and reports which have to be validated and sent to the end business users
- Campaign Response Tracking (CRT)
- This system generates reports on different promotions every week. The weekly reports contain information about the number of mails delivered, responders for each promotion, WTD counts, TUD counts and balance made for each promotion
- Project Management
- Managed the team responsible for stabilization of analytical projects such as model development, model revalidation / monitoring, data management and MIS.
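A minimal sketch of the vintage delinquency reporting above, assuming an account-month performance data set (acct_perf) with an origination vintage, months on book and a 30+ DPD flag; all names are illustrative.

```sas
/* Sketch only: acct_perf, open_vintage, mob and dq30_flag are assumed names */
proc tabulate data=acct_perf format=percent8.2;
   class open_vintage mob;
   var   dq30_flag;
   table open_vintage,              /* origination vintage in rows   */
         mob * dq30_flag * mean;    /* 30+ DPD rate by month on book */
run;
```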
Environment: SAS 8.2, SAS/ACCESS, SAS/BASE, SAS/MACRO, SAS/STAT, SAS/GRAPH, SAS/SQL, DB2, MS Office, UNIX, Mainframe and Windows
Confidential
Responsibilities:
- Data extraction (Oracle and other sources via pass-through facilities)
- Initial data analysis (data audit, exploratory analysis & data profiling)
- Data treatment (outlier treatment, missing value treatment, extreme value treatment & capping issues)
- Developed SAS Macro programs to create Ad hoc reports
- Created a macro program that builds macro variables to pull data from DB2 for monthly data extraction
- Extracted data from different sources such as Oracle, Teradata and text files using SAS/ACCESS and SQL procedures, and created SAS datasets; worked on UNIX and Windows platforms in a server application environment
- Passed macro variables created on Windows to the UNIX server remotely to develop a good user interface for the end user
- Performed performance tuning on the programs created
- Used the SAS PROC SQL pass-through facility to connect to DB2 tables and created SAS datasets using various SQL joins such as left join, right join, inner join and full join (a hedged sketch follows this list)
- Used SAS Macros and SAS SQL to query data and to obtain results which were generally stored as delimited text.
- Created dynamic macro arrays with SQL and with DATA step programs
- Developed iterative loops to retrieve array variables for joining datasets for ad hoc reporting
- Created reports using PROC REPORT, PROC TABULATE, ODS statements and PROC TEMPLATE to generate output formats in HTML.
- Documented the code and algorithms logically
- Restructured SAS datasets using PROC TRANSPOSE.
- Revised and automated daily, weekly, monthly and quarterly reports to run more efficiently and provide more meaningful information vital to the overall strategies.
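A minimal sketch of the DB2 pass-through and dynamic macro array work above; the connection options, schema, tables and macro variable names are assumptions rather than the actual production objects.

```sas
/* Sketch only: connection options, tables and variable names are assumed */
proc sql noprint;
   connect to db2 (database=riskdb user='batch_user' password='xxxxxxxx');

   /* Join executed on DB2 via pass-through; result landed as a SAS dataset */
   create table work.acct_joined as
   select * from connection to db2
      (select a.acct_id, a.open_dt, b.bal_amt
         from risk.accounts a
         left join risk.balances b
           on a.acct_id = b.acct_id);

   disconnect from db2;

   /* Dynamic macro "array": one macro variable per distinct reporting month */
   select distinct put(open_dt, yymmn6.)
     into :rpt_mon1 - :rpt_mon999        /* upper bound is illustrative */
     from work.acct_joined;
quit;

%put NOTE: &sqlobs reporting months loaded, first month = &rpt_mon1;
```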
Environment: SAS 8.2, SAS/ACCESS, SAS/BASE, SAS/MACRO, SAS/STAT, SAS/GRAPH, SAS/SQL, DB2, MS Office, UNIX, Mainframe and Windows