Sr. SAS Programmer / Data Analyst Resume
Warren, NJ
SUMMARY:
- 6 years of extensive IT experience in data analysis, design, and development in the Finance, Banking, and Insurance domains.
- Hands-on experience using SAS as a data management and statistical analysis tool in retail banking, mortgage, credit card, insurance, and other financial services.
- Core technical expertise in designing, developing, and debugging SAS programs and macros to query, extract, and manipulate large datasets, perform statistical model analysis, and produce files, tables, listings, graphs, and reports using SAS/BASE, SAS/STAT, SAS/SQL, SAS/ACCESS, SAS/MACRO, SAS/ODS, SAS/ETS, SAS/GRAPH, SAS/AF, SAS/FSP, SAS/CONNECT, and SAS/DI Studio in Windows and UNIX environments.
- Advanced skills in data cleaning, data manipulation, data migration, and creation of analytical data marts with SAS/DI Studio.
- Strong exposure to all steps of the project life cycle, including original data collection, secondary data queries, data manipulation and analysis, optimization, summarizing findings, developing analytical insights, communicating strategic insights, and presenting results.
- Experience with SAS Visual Analytics (VA) and Tableau.
- Skilled in analytical tools such as R and Python (numpy/scipy/pandas/matplotlib).
- Hands-on experience with data validation, modeling, and forecasting in R and Python.
- Learning data science with R and Python, including predictive modeling, clustering, decision trees, and random forests.
- Developer of consistent, standardized reporting procedures as well as ad-hoc reports as needed.
- Thorough knowledge of and experience with Microsoft Office tools such as MS Access, MS Word, MS PowerPoint, and MS Excel.
- Good experience with the CCAR, DFAST, and PPNR stress-test data models.
- Strong analytical skills, including insight into successfully applying technological solutions and best practices to resolve business issues while improving overall efficiency.
TECHNICAL SKILLS:
Statistical Software: Base SAS, SAS/SQL, SAS/MACRO, SAS/ODS, SAS/ACCESS, SAS/CONNECT, SAS/ETL, SAS/GRAPH, SAS/ENTERPRISE GUIDE & SAS BI Tools, SAS/DI Studio.
Operating System: Windows, Unix.
Languages: SQL, R, Python, Shell Scripting, PL/SQL, Visual Basic, XML, HTML.
Databases: Oracle, Teradata, DB2, SQL-Server, MS Access.
Data Visualization Tools: SAS VA, Tableau, Spotfire.
PROFESSIONAL EXPERIENCE:
Confidential, Warren, NJ
Sr. SAS Programmer / Data Analyst
Responsibilities:
- This project validates BASEL data before models consume it; responsible for downloading data from sources such as Economy.com and BLS.gov, reading the external data, and validating it against internal data.
- Generated periodic comparison reports across the Fed CCAR scenario mnemonics (Base, Adverse, and Severely Adverse).
- Generated comparison reports between Moody's actual mnemonic values and internal forecast mnemonics (illustrated in the sketch below).
- Involved in CCAR and DFAST stress testing by loading the economic data required for the tests.
- Used Python for scenario analysis and sensitivity analysis on models.
- Applied standard statistical methods such as measures of dispersion, central tendency, reliability, and correlation to determine significant factors in the data for modeling.
- Performed data cleaning and explored data visualization techniques on a variety of data stored in spreadsheets and text files using R, plotting the results with ggplot2.
- Identified process needs and improvement opportunities, developed solutions, gained acceptance for them, and coordinated their execution in a continuous improvement environment.
- Helped build a tool-based application to validate the economic data and generate reports on how data values trend over months and years.
Environment: Base SAS 9.3, SAS/Macros, SAS/Reports, SAS/STAT, SAS/GRAPH, SAS/SQL, XML, SAS/Enterprise Guide 5.1, SAS/ODS, SAS/ETL, Oracle 10g, DB2, Teradata, Windows NT, UNIX, Python 3.4, R 3.2.
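Illustrative code sketch (not taken from the project itself): a minimal Base SAS flow for the kind of external-vs-internal mnemonic comparison described above; the file path, libraries, dataset names, and variable names are hypothetical.

/* Hypothetical paths, libraries, and variable names for illustration */
proc import datafile="/data/external/moodys_mnemonics.csv"
    out=work.external_mnem dbms=csv replace;
    getnames=yes;
run;

proc sort data=work.external_mnem; by mnemonic period; run;
proc sort data=fcstlib.internal_mnem out=work.internal_mnem; by mnemonic period; run;

/* Keep only observations where external actuals and internal forecasts differ */
proc compare base=work.external_mnem compare=work.internal_mnem
    out=work.mnem_diff outnoequal noprint;
    id mnemonic period;
    var value;
run;

/* Summarize mismatches by mnemonic for the periodic comparison report */
proc sql;
    create table work.diff_summary as
    select mnemonic, count(*) as n_mismatches
    from work.mnem_diff
    group by mnemonic;
quit;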
Confidential, White Plains, NY
Senior SAS DI / Data Analyst
Responsibilities:
- Responsible for maintaining and validating quarterly reports and delivering GL Pricing, Workers Comp, D&B, and Specialty Auto data to the business team.
- Worked on incorporating requirement modifications as well as new business logic into the existing Workers Compensation project.
- Involved in designing job flows in SAS Data Integration (DI) Studio to extract data from source systems, perform the required transformations, and load the results to target tables.
- Gathered information and created requirement documents for Specialty Auto.
- Used various transformations such as Access, SQL, Lookup, Sort, Splitter, and Transpose in SAS Data Integration Studio (a hand-coded equivalent is sketched below).
- Worked on comparing logs and programs using various UNIX commands.
- Worked on end-to-end validation, verifying policies from the mainframe source system and Document Warehouse through to the SAS output.
- Performed statistical analysis and data management on study data using appropriate statistical methods in SAS and SAS tools.
- Used Python libraries (numpy/scipy/pandas) and R libraries for exploratory data analysis.
- Created functions and lambda functions in Python for recurring processes.
- Created and modified scripts based on business requirements and scheduled the jobs to run for production migration.
- Used SAS Visual Analytics to visualize results and performance.
Environment: SAS Enterprise Guide 9.3, SAS Data Integration (DI) Studio 4.2, SAS Visual Analytics 6.4 (VA), UNIX K-Shell, Windows 7, Putty, FileZilla, Mainframe, AIX, Python 3.4, R
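Illustrative code sketch (library, table, and column names are hypothetical): a hand-coded Base SAS equivalent of the extract / lookup / sort / transpose / load pattern built with the DI Studio transformations listed above.

/* Extract from the source system and resolve a code via a lookup table */
proc sql;
    create table work.policy_extract as
    select p.policy_id, p.state_cd, p.premium_amt, l.state_name
    from srclib.wc_policies as p
    left join reflib.state_lookup as l
        on p.state_cd = l.state_cd;
quit;

proc sort data=work.policy_extract;
    by policy_id;
run;

/* Transpose premiums to one row per policy with one column per state */
proc transpose data=work.policy_extract out=work.policy_wide prefix=prem_;
    by policy_id;
    id state_cd;
    var premium_amt;
run;

/* Load the transformed data to the target library */
data tgtlib.wc_policy_summary;
    set work.policy_wide;
run;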
Confidential, McLean, VA
SAS Developer
Responsibilities:
- This project mainly involved validating data between the DB2 and Teradata databases.
- Involved in reengineering multiple DB2 SQL queries to Teradata SQL and optimizing them to reduce runtime.
- Performed performance tuning, including collecting statistics, analyzing explain plans, and determining which tables needed statistics (see the sketch below); increased performance by 35-40% in some situations.
- Responsible for converting SAS scripts and SQL queries, testing in Dev, tracking UAT testing, and moving code into production.
- Applied expertise in primary index selection for uniform data distribution across the AMPs and in column compression for efficient utilization of available disk space.
- Responsible for creating and testing stored procedures in SAS.
- Created daily activity reports onsite using Teradata.
- Extensively used SAS macros and executed UNIX commands through SAS code.
- Worked with the SAS Add-In for Microsoft Office to bring data into MS Excel from SAS datasets and by executing stored processes (STPs).
Environment: Base SAS 9.3, SAS/Macros, SAS/Reports, SAS/STAT, SAS/GRAPH, SAS/SQL, XML, SAS/Enterprise Guide 5.1, SAS/ODS, SAS/ETL, Oracle 10g, DB2, Teradata, Windows NT, UNIX
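Illustrative code sketch (server, credentials, tables, and columns are placeholders): SQL pass-through to Teradata with a COLLECT STATISTICS step of the kind used in the tuning work above, plus a pipe filename for running a UNIX command from SAS.

/* Pass-through to Teradata: refresh optimizer statistics, then pull data */
proc sql;
    connect to teradata (server=tdprod user=&td_user password=&td_pwd);
    execute (collect statistics on edw.loan_master column (loan_id)) by teradata;
    create table work.loan_check as
    select * from connection to teradata
        (select loan_id, balance_amt, asof_dt
         from edw.loan_master
         where asof_dt = current_date - 1);
    disconnect from teradata;
quit;

/* Run a UNIX command from SAS and echo its output to the log */
filename lscmd pipe 'ls -l /sasdata/extracts';
data _null_;
    infile lscmd;
    input;
    put _infile_;
run;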
Confidential
SAS/Data Analyst
Responsibilities:
- Created SAS datasets from the Oracle database using random sampling techniques and created Oracle tables from SAS datasets using SAS macros (illustrated in the sketch below).
- Created large datasets by combining individual datasets using various inner and outer joins in SAS/SQL and dataset merging techniques of SAS/BASE.
- Analyzed data using various statistical procedures
- Used SAS/Macro facility to create macros for statistical analysis, reporting results and data extraction.
- Worked on SAS data sets in both Windows as well as in UNIX environments.
- Proposed a set of action plans to mitigate the risks identified with the sourcing decision.
- Generated HTML, listing, Excel, and RTF reports to present findings from various statistical procedures.
Environment: SAS 8.0 (BASE, STAT, GRAPH, MACRO, ETS and ODS), Windows NT, Oracle 8i, PL/SQL, UDB, UNIX
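Illustrative code sketch (connection details, macro, table, and column names are hypothetical): pulling a random sample from Oracle through a SAS macro, joining it back in SAS/SQL, and presenting the result via ODS, in the spirit of the work described above.

/* Hypothetical Oracle connection and table/column names */
libname ora oracle user=&ora_user password=&ora_pwd path=orcl schema=cust;

%macro sample_extract(intab=, outtab=, n=1000, seed=12345);
    /* Simple random sample of &n rows from the Oracle table */
    proc surveyselect data=ora.&intab out=work.&outtab
        method=srs sampsize=&n seed=&seed noprint;
    run;
%mend sample_extract;

%sample_extract(intab=accounts, outtab=acct_sample, n=500);

/* Combine the sample with a reference table using SAS/SQL */
proc sql;
    create table work.acct_enriched as
    select a.acct_id, a.open_dt, a.balance, r.region_name
    from work.acct_sample as a
    left join ora.region_ref as r
        on a.region_cd = r.region_cd;
quit;

/* Present the findings as an RTF report */
ods rtf file="acct_summary.rtf";
proc means data=work.acct_enriched n mean std min max;
    class region_name;
    var balance;
run;
ods rtf close;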
Confidential
SAS CONSULTANT
Responsibilities:
- Provided structured programming analysis, design, development, code, testing, documentation, implementation, and quality assurance during enhancement, maintenance, and/or production support activities for various client business application solutions.
- Created ad-hoc programs, macros, and stored procedures for ETL interfaces.
- Used SQL Join, Splitter, Extract, Loader, Sort, Summary, Validation, and SCD Type 2 Loader transformations in SAS DI Studio.
- Used SAS to generate reports employing various SAS procedures.
- Prepared new datasets from raw data files using import techniques and modified existing datasets (see the sketch after this list).
- Coordinated the production of monthly, quarterly, and annual performance reports for senior management.
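Illustrative code sketch (raw-file layout, paths, and dataset names are hypothetical): reading a delimited raw file into a SAS dataset, appending it to an existing table, and producing a summary report of the kind described above.

/* Read a pipe-delimited raw extract into a SAS dataset */
data work.claims_new;
    infile '/data/raw/claims_extract.txt' dlm='|' dsd firstobs=2 truncover;
    input claim_id :$12. policy_id :$10. claim_dt :yymmdd10. paid_amt :comma12.;
    format claim_dt yymmdd10. paid_amt comma12.2;
run;

/* Append the new records to the existing monthly dataset */
proc append base=perm.claims_monthly data=work.claims_new force;
run;

/* Summary report for management */
proc report data=perm.claims_monthly nowd;
    column policy_id paid_amt;
    define policy_id / group 'Policy';
    define paid_amt  / analysis sum 'Paid Amount' format=comma12.2;
run;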
References: Available upon request.