Contractor (Technology Specialist) Resume
SUMMARY
- Mastery of predictive modeling, statistical analysis, backtesting of interrelated models, ongoing monitoring and evaluative data structuring in the operational, budget planning, marketing, macro-economic trend impact, auditing and risk contexts in financial institutions
- Hands-on design, coding, testing and implementation of predictive and prescriptive (i.e. scenario-structured) models
- Many years of intensive hands-on coding experience in a wide variety of languages and contexts (and across all stages of project work, from defining business or scientific needs and requirements to data modeling, metadata, specifications, analysis, budget planning, testing, automation and deployment)
- Expertise in analysis, development and implementation of complete forecasting and analytic architectures using SAS (from SAS 8.2 to SAS 9.4), SAS Enterprise Guide, SAS macros, SAS PROC SQL, stored procedures, Python 2 and 3, SAS IML and SAS statistical and time series procedures on a SASGRID structure and other (generally UNIX-based) environments
- Extensive experience in parallel validation of data derivation, structured data evaluation, statistical and analytic results
- Development of collaborative analytic teams aligned with upper management
- Wide experience of analytic and data structuring/ETL methods in SAS, Python, R and many varieties of SQL such as Impala, Teradata, Oracle (including PL/SQL), SQL Server
- Challenger and preliminary modeling of the effects of new data structures emphasizing the time dimension on banking risk models. These models had been designed to cancel many time-dependent effects, with parameters that emerged from taking industries through the economic cycle; this through-the-cycle approach tends to dampen their predictive behavior during periods of rapid change.
- Monte Carlo models in R, structuring raw data for sensitivity testing (with shocks to variable values) of models and input verification in a scaled agile project environment.
- Testing, automation and synchronization of risk models and Tableau and LaTeX output at Confidential during the development and deployment of Hadoop/HIVE/Impala-based “productionized” segments of a SASGRID-based set of critical risk models for ongoing monitoring of risk scoring and Commercial Real Estate Loss Forecasting
- Mapping, hands-on coding and testing of SAS, Python and Hadoop routines and data structures involved in sending and receiving risk data from Moody’s, including Not-for-Profit entities, Car Dealerships and Real Estate Loss Forecasting, using SAS and Impala
- Presentation at the SAS Analytics Conference in Las Vegas, September 2016, on the use of Markov Chains in operational forecasting, budget planning and metabolic models
- Developed, hands-on coded, tested and implemented a complete forecasting architecture for all loan modification processing at Wells Fargo. SAS 9.3 and 9.4, UNIX scripts, SAS Enterprise Guide, SAS PROC SQL, SAS IML and SAS statistical and time series procedures were used to conduct predictive modeling of operational loan modification inventories and transitions for future periods of up to 3 years. This forecast was based on matrices of stages in Markov Chain transition probability structures. Eventually this was migrated from a standard UNIX server to a SAS Grid analytic server. Monthly systematic presentation of results to upper management was integrated into the system.
- Inventories were predicted to within 1% even for periods beyond 6 months. This previously unheard-of level of accuracy was achieved within the first few months of implementation.
- Code and procedural optimization during the initial phases of this project automated and optimized reporting and analytics, moving them from Excel to SAS
- Developed, hands-on coded, tested and implemented a complete architecture for epidemiological studies using 3 million patients from the GPRD (the UK General Practice Research Database) while at RTI Health Solutions. This included all patient visits, drugs, treatments, events and outcomes and required extensive communication with epidemiological teams across the globe.
- FDA labelling for a common anti-depressant was changed as a result of this successful integration of complex treatment data, including timing of physician visits, drug treatments and behavioral outcomes.
- Worked with serious adverse event groups at RTI Health Solutions to bring their information out of extremely rigid systems in a usable form. This included some unusual types of data extraction, such as using the Java-based SAS XML Mapper to read narratives that were otherwise inaccessible outside of the tracking system.
PROFESSIONAL EXPERIENCE
Contractor (Technology Specialist)
Confidential
Responsibilities:
- Using standard SAS macros, Base SAS DATA steps, PROC CONTENTS and PROC UNIVARIATE with Hadoop/HIVE/Impala to test the implementation of components of critical risk models and give preliminary views of the effects of broad changes in methodology.
- Developed, hands-on coded, tested and implemented a complete architecture for evaluating data constructed from a range of sources
- Challenger and preliminary modeling of the effects of new data structures emphasizing the time dimension on banking risk models that had been designed to cancel many time-dependent effects, with through-the-cycle parameters that tend to dampen predictive behavior during periods of rapid change; also built Monte Carlo models in R, structuring raw data for sensitivity testing and input verification in a scaled agile project environment.
- Revised legacy Python 2.6.6 code that was used to write SAS code in macro-style methods set up in lengthy scripts (originally Bourne-style shell scripts).
- Developed Python 3.5.2 routines to generate SAS code from specifications read in from Excel
- Building and testing LaTeX and Tableau implementations to combine and assess Loss Forecasting models.
- Assessed a Python 3.6.1 Anaconda implementation of pandas and NumPy arrays
- Generally using SAS 9.4, UNIX, SASGRID, various forms of SQL (Impala, Teradata, Oracle, PROC SQL) and SAS Enterprise Guide in a Scaled Agile Framework as the basic environment for most of the above tasks
- Integrating Moody’s RiskCalc facility to do external benchmarking of industrial sector risk using JSON and XML with Python 3.6.1
- Set up Epics in a scaled agile project environment to budget and guide development of very large Tableau arrays describing a population stability indexing model, as well as testing population stability across an industrial sector
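As an illustration of the population stability indexing mentioned above: a population stability index (PSI) measures how far a score distribution has drifted between a baseline and a current period. The sketch below is a generic, minimal Python version; the score values, bin count and drift threshold are illustrative assumptions, not details of the original model.

```python
import numpy as np

def population_stability_index(expected, actual, bins=10):
    """PSI = sum((a_i - e_i) * ln(a_i / e_i)) over bins defined on the
    baseline ('expected') sample; a_i and e_i are bin proportions."""
    # Bin edges from the baseline distribution's quantiles
    edges = np.quantile(expected, np.linspace(0.0, 1.0, bins + 1))
    # Clip the current sample into the baseline range so every value lands in a bin
    actual = np.clip(actual, edges[0], edges[-1])
    e_counts, _ = np.histogram(expected, bins=edges)
    a_counts, _ = np.histogram(actual, bins=edges)
    # Floor proportions to avoid log(0) in sparse bins
    e = np.maximum(e_counts / len(expected), 1e-6)
    a = np.maximum(a_counts / len(actual), 1e-6)
    return float(np.sum((a - e) * np.log(a / e)))

rng = np.random.default_rng(0)
baseline = rng.normal(600, 50, 10_000)  # hypothetical baseline risk scores
current = rng.normal(630, 50, 10_000)   # drifted current-period scores

print(population_stability_index(baseline, baseline))  # identical: 0
print(population_stability_index(baseline, current))   # drifted: large PSI
```

By common convention, PSI below about 0.1 is read as stable and above about 0.25 as a major population shift; those cutoffs are rules of thumb, not part of the original work.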
Contractor Analyst (Meridian)
Confidential
Responsibilities:
- Developing and validating analytic processes to cut waste in Medicare programs.
- Cross-checking procedures in R and SAS for accuracy and optimization in a DB2 SQL environment.
- Controlling input and output using SAS/IntrNet, ODS and ODS tagsets, Excel and R (with LaTeX and Shiny or Tableau, using JSON or XML in some cases).
- Running simulations in R to ensure samples from large data are adequate to determine the 95% confidence interval for use in potential litigation.
- Analyses in R (using Rstudio and statistical, graphics and input/output packages) and SAS 9.4 using large macros and statistical procs.
- Writing UNIX shell scripts (bash and Bourne shell) to automate testing
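The sampling-adequacy simulations described above follow a standard pattern: repeatedly draw samples of a candidate size and check how often the resulting 95% confidence interval actually covers the true population mean. The original work was done in R; the sketch below is an equivalent Python version with an illustrative skewed population standing in for the real claims data.

```python
import numpy as np

def ci_coverage(population, sample_size, n_sims=500, seed=0):
    """Estimate how often a normal-approximation 95% CI built from a
    sample of the given size covers the true population mean."""
    rng = np.random.default_rng(seed)
    true_mean = population.mean()
    hits = 0
    for _ in range(n_sims):
        sample = rng.choice(population, size=sample_size, replace=False)
        se = sample.std(ddof=1) / np.sqrt(sample_size)
        half = 1.96 * se
        hits += (sample.mean() - half) <= true_mean <= (sample.mean() + half)
    return hits / n_sims

# Hypothetical right-skewed "payment amount" population
rng = np.random.default_rng(1)
population = rng.lognormal(mean=5.0, sigma=1.0, size=50_000)

# Coverage approaches the nominal 95% as the sample size grows
for n in (50, 200, 1000):
    print(n, ci_coverage(population, n))
```

For skewed data like this, small samples under-cover the nominal 95% level, which is exactly what such a simulation is meant to expose before a sample size is committed to.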
Analyst Consultant Level 4
Confidential
Responsibilities:
- Developed and implemented a complete forecasting architecture for all loan modification processing and operational budgeting at Wells Fargo. SAS 9.3 and 9.4, SAS Enterprise Guide, SAS PROC SQL, SAS IML and SAS statistical and time series procedures were used to conduct predictive modeling of operational loan modification inventories and transitions for future periods of up to 3 years. This forecast was based on matrices of stages in Markov Chain transition probability structures. Eventually this was migrated from a standard UNIX server to a SAS Grid analytic server. Monthly systematic presentation of results to upper management was integrated into the system.
- Inventories were predicted to within 1% even for periods beyond 6 months. This previously unheard-of level of accuracy was achieved within the first few months of implementation.
- Code and procedural optimization during the initial phases of this project automated and optimized reporting, moving it from Excel to SAS
- Presented analysis and recommendations to senior management (in Excel, PowerPoint and Word).
- Led and coached programming teams to increase analytic accuracy using SAS.
- Ensured adherence to data management regulations and policies. Worked with Data Governance teams to identify the sources, paths, modifications, summarization and definition of data elements used in forecasting and capacity planning in the loan modification space.
- Applied the SAS value-added methodology to determine the cost-effectiveness and operational value of the forecast, and explained and characterized factors that drive the forecast, such as population changes and other types of variability (not well covered in the SAS value-added methodology).
- Used multiple databases and server structures such as SASGRID1 via SAS/CONNECT with UNIX-type methods, T-SQL on SSMS, and SAS Enterprise Guide.
- Assessed needs and team skills for addressing new types of analyses and capabilities development.
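The Markov Chain forecast described in the first bullet above propagates current stage inventories through a matrix of monthly transition probabilities. The sketch below shows the mechanics in Python with hypothetical stage names and rates; the actual Wells Fargo stage definitions and transition probabilities are not reproduced here.

```python
import numpy as np

# Illustrative loan-modification stages; these labels and the rates below
# are placeholder assumptions, not the original model's parameters.
stages = ["application", "trial_mod", "permanent_mod", "closed"]

# Row-stochastic transition matrix: P[i, j] is the monthly probability of
# moving from stage i to stage j (each row sums to 1).
P = np.array([
    [0.60, 0.30, 0.00, 0.10],
    [0.00, 0.50, 0.40, 0.10],
    [0.00, 0.00, 0.90, 0.10],
    [0.00, 0.00, 0.00, 1.00],
])

inventory = np.array([5000.0, 2000.0, 8000.0, 0.0])  # current stage counts

# Forecast stage inventories month by month: v_{t+1} = v_t @ P
for month in range(1, 7):
    inventory = inventory @ P
    print(month, dict(zip(stages, np.round(inventory))))
```

Because each row of P sums to 1, total inventory is conserved as loans flow between stages, and longer horizons come from simply iterating the multiplication.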
Contract Analyst
Confidential, Charlotte, NC
Responsibilities:
- Performed model development and data analysis for all loan investors (FNMA, FHLMC, FHA, VA, Wells Financial, Legacy Wachovia, Wells Owned, etc.)
- Worked with SAS, SQL, SAS IML and SAS Statistical and Time Series Procs.
- Oversaw the automation of Excel VB analyses into SAS analyses.
- Built FDA/Clinical Data Consortium compliant Study Databases for review by independent scientific agencies. This included all forms of analysis and data manipulation, from setting flags to building structures based on the interrelation of dates and times.
- Worked with SAS Base, Macros and Oracle technologies to complete Study Databases with full audit trails and complete analytic variables.
- Advised statisticians on the consistency and implementation of study and database specifications based on my experience working with Data Monitoring Boards.
Senior Statistical Analyst/Lead Programmer
Confidential, NC
Responsibilities:
- Developed and implemented a complete architecture for epidemiological studies using 3 million patients from the GPRD (the UK General Practice Research Database) while at RTI Health Solutions. This included all patient visits, drugs, treatments, events and outcomes and required extensive communication with epidemiological teams across the globe.
- FDA labelling for a common anti-depressant was changed as a result of this successful integration of complex treatment data, including timing of physician visits, drug treatments and behavioral outcomes.
- Wrote Standard Operating Procedures (SOPs) for the range of study types. This required gathering and codifying study requirements and data specifications from clients, clinicians and statisticians. Developed full specs for all stages of analysis based on procedural outlines.
- Worked with serious adverse event groups at RTI Health Solutions to bring their information out of extremely rigid systems in a usable form. This included some unusual types of data extraction, such as using the Java-based SAS XML Mapper to read narratives.
- Directed problem-solving with clients for the complex data problems that arise during programmatic transitions.
- Conducted parallel validation of data, statistical and analytic results for clinical studies.
- Installed SAS 9.1.3 on a UNIX server at RTI Health Solutions and validated the installation to FDA standards.
- Worked with scientific advisory panels to reconstruct and programmatically replicate published studies from final data in order to demonstrate the completeness and accuracy of data in a research repository. This included an extraordinary range of data extraction and manipulation.
- Prepared combined study data reports for Data Monitoring Boards. This included data extraction and modification of analyses from diverse studies.
- Prepared integrated drug safety data reports pulled from multiple studies for submission to the FDA.
- Created exploratory analyses of large databases (such as UNOS organ transplant data, cancer registries and insurance data) in collaboration with research statisticians and epidemiologists.
Senior Statistical Programmer/Lead Programmer
Confidential
Responsibilities:
- Oversaw adherence to FDA regulatory requirements at all stages of clinical trials projects
- Programmed clinical studies in SAS 6.12 and 8.2 in the VAX, UNIX and Windows environments using SAS SQL, macros, ODS and PROC REPORT, among other SAS tools
- Acted as analysis group and project leader for managing the programming side of a number of protocols, including meetings with clients to determine regulatory requirements, actions and specifications for all parts of clinical studies
- Conducted parallel validation of data, statistical and analytic results for clinical studies
- Coordinated testing and introduction of internally developed automated validation software
- Coordinated SAS database management and extraction from specialized data sources to the database, including clinical database management and the export and import of specialized data such as ECG analyses, memory testing and pharmacokinetic data
- Analysed all aspects of clinical trials data for Phase I clients of the CRO while coordinating the efforts of other analysts, statisticians and data managers
- Created a range of procedures including standard tables, listings, graphs and building data structures for pharmacokinetic and pharmacodynamic analyses
- Analysed statistical tables and listings, created Macros and figures for individual and integrated reports
- Conducted ad-hoc analyses as required, and developed standard tools for the reporting and analysis of clinical trial data
- Gathered and codified requirements for analyses, summaries and reports from clients, clinicians and statisticians
- Developed full specs for all stages of analysis from outlines provided by clinicians and statisticians
- Implemented arrays to extrapolate and interpolate data points according to various algorithms
- Developed and maintained format libraries for a number of studies
- Incorporated human genetic data into analytic structures
- Served as a member of the Biostatistics Committee, which oversaw the introduction of Windows-based procedures to eventually replace VAX throughout the entire department, requiring excellent communication and organizational skills