Sr. SAS Programmer Resume
San Francisco, CA
PROFESSIONAL SUMMARY:
- Certified SAS programmer with 8+ years of experience in the clinical and financial industries.
- Worked in the banking sector with strong involvement in all phases of the Software Development Life Cycle (SDLC).
- Partnered with business teams to gather requirements and prepare Business Requirement Documents, and was involved in application design, coding, debugging, integration, documentation, maintenance and deployment.
- Adept in data analysis, requirement analysis, data mining, processing, manipulation, integration and transformation.
- Extensive experience with the Teradata data warehousing database, versions 12, 13 and 14.
- Strong knowledge of Teradata SQL programming using joins, unions, rank and aggregate functions, time functions, subselects, grouping, logical expressions, EXPLAIN plans and statistics.
- Monitored system capacity and growth, and implemented data compression techniques in Teradata SQL to improve application performance.
- Involved in understanding business requirements and translating them into high-level and low-level designs for ETL processes in Teradata.
- Gained good experience with NoSQL databases; worked extensively on DBC tables, aggregates and building efficient views.
- Strong working experience in planning and carrying out Teradata load processes, data warehousing and large-scale database management.
- Extensive experience in SAS programming for data extraction and reporting procedures.
- Worked with various software development life cycle models: Agile, Waterfall and Scrum.
- Strong experience with advanced SAS programming procedures such as PROC SQL, PROC FORMAT, PROC TRANSPOSE, PROC ACCESS, PROC TABULATE, PROC IML, PROC FREQ, PROC APPEND, PROC GPLOT and PROC GCHART, as well as SAS macros.
- Generated reports using the SAS Add-In for Microsoft Office, browser interfaces such as SAS Web Report Studio, and Windows clients such as SAS Enterprise Guide.
- In-depth knowledge of macro programming with SAS/BASE, SAS/MACRO, SAS/CONNECT, SAS/ODS, SAS/SQL, SAS/STAT and SAS/GRAPH on Windows against Oracle and Teradata databases.
- Good experience in table analysis using PROC FREQ with the LIST, OUT= and MISSING options (see the sketch after this summary).
- Hands-on experience in generating ad hoc reports using BTEQ and UNIX.
- Worked with Mainframe SAS and Mainframe JCL.
- Extensive experience in bug tracking and verifying the defects in the existing code.
- Performed data migrations and visual analytics using Tableau.
- Very good understanding of Teradata RDBMS architecture, including its shared-nothing design, nodes, AMPs, BYNET, partitioning and primary indexes. Extensively used Teradata features such as BTEQ, FastLoad, MultiLoad, SQL Assistant, and DDL and DML commands.
- Good knowledge of ETL techniques used for the extraction, transformation and loading of data among different databases.
- Strong command over UNIX shell scripting.
- Experience with extraction, transformation and loading (ETL) processes using SQL and PL/SQL.
- Strong knowledge of tuning PL/SQL code to handle large data volumes and improve performance.
- Hands-on experience using the SAS BI web applications to create dashboards.
- Good experience in Perl scripting.
- Experience in working with R, Hive, Cassandra and Pig.
- Good knowledge of the SAS Migration Utility and SAS migration version analysis.
- Hands-on experience with SAS Grid Manager within SAS Management Console.
- Solid grounding in preparing and analyzing Business Requirements Documents (BRDs), technical documents and Functional Requirements Documents.
- Good experience in working with both business and development teams.
- Versatile team player with good communication and problem-solving skills across all management levels.
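A minimal sketch of the PROC FREQ table analysis noted in the summary above; the data set and variable names are illustrative assumptions, not taken from any actual engagement:

    /* Crosstab frequencies in list format, with missing values counted
       and the counts written to an output data set for further reporting */
    proc freq data=work.accounts;
        tables region*status / list missing out=work.freq_counts;
    run;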
TECHNICAL SKILLS:
Programming Languages: C/C++, SQL, PL/SQL, ORACLE, PHP, XML
Operating systems: Windows (95, 98, 2000, XP), Linux, UNIX, Solaris
Databases: MS SQL Server, MS Access, Oracle 9i/10g/11g, DB2
SAS Expertise: SAS/BASE, SAS/SQL, SAS/MACROS, SAS/CONNECT, SAS/STAT, SAS/SERVER, SAS/REPORTS, SAS/ACCESS, SAS/GRAPH, SAS/ODS
SAS Procedures: Print, SQL, Report, Sort, Format, Import, Export, Tabulate, Summary, Compare, Frequency, Transpose, Gplot, Gchart, Means, Regression, Correlation, Univariate
SAS BI Tools: SAS Enterprise Guide, SAS Data Integration Studio, SAS Management Console, SAS Information Map, SAS Customer Intelligence Studio, SAS BI Dashboard, SAS Web Report Studio.
Web Technologies: HTML, Macromedia Flash, Confidential Photoshop, Dreamweaver.
Scripting: Perl scripting, Shell Scripting.
BI Tools: OLAP, Qlikview, Tableau
PROFESSIONAL EXPERIENCE:
Confidential, San Francisco, CA
Sr. SAS Programmer
Responsibilities:
- Worked in collaboration with the business team to perform data integration with SAS coding.
- Used dynamic LIBNAME statements, PROC APPEND, PROC FORMAT, PROC SQL, PROC IMPORT, PROC SORT, PROC EXPORT and PROC COMPARE.
- Built VBA macros to pull data for specific requirements in recurring jobs.
- Worked with marketing data to create and manage marketing campaigns using SAS Enterprise Guide.
- Applied analytical functions across data from multiple sources (Teradata, Excel).
- Provided SAS programming support and technical assistance across the development phases of the SDLC.
- Developed SAS macros extensively to reduce redundancy and automate SAS code.
- Worked with a team of developers on Python applications for risk management.
- Developed Python/Django application for Google Analytics aggregation and reporting.
- Used Django configuration to manage URLs and application parameters.
- Worked on Python OpenStack APIs.
- Used Python scripts to update content in the database and manipulate files.
- Generated Django forms in Python to record online users' data.
- Used Python and Django for graphics creation, XML processing, data exchange and business logic implementation.
- Developed functional and technical documentation to assist in the design, development and/or maintenance of deliverables.
- Optimized and tuned Teradata SQL to improve batch performance.
- Collected multi-column statistics on all non-indexed columns used in join operations and on all columns used in residual conditions.
- Tuned various queries by collecting statistics on columns in WHERE and JOIN expressions.
- Worked extensively with Teradata utilities such as BTEQ, FastExport and the load utilities.
- Extensively used complex SQL, Indexes, Joins and Triggers.
- Proficient in the administration of BI tools from SAS Management Console.
- Used PROC SQL to retrieve data from various sources, including the data warehouse, SQL Server and UNIX.
- Worked with Teradata to create tables and views according to requirements.
- Developed analytical processes, statistical models and quantitative approaches, leveraging SAS ad hoc reports.
- Debugged, documented and maintained ad hoc reports on demand.
- Used the SAS SQL pass-through facility to convert Oracle tables into SAS data sets and PROC DBLOAD to upload SAS data sets into Oracle tables (see the pass-through sketch after this list).
- Used UNIX scripting for data encryption and file transfer via FTP, SFTP and MFT processes.
- Used SQL queries and created SAS data sets to obtain the desired format for metrics.
- Prepared various metric formats for reporting different business functions like loss prevention and loss mitigation.
- Used the SAS Enterprise Intelligence Platform to deliver intelligence strategies that support the changing day-to-day needs of the enterprise across departments.
- Also used the Enterprise Intelligence Platform to extract data from a variety of operational data sources on multiple platforms when building data warehouses and data marts that integrate the extracted data.
- Created batch files on the QlikView platform to run QlikView scripts automatically, and was involved in the design and development of an integrated, expandable database using QlikView.
- Involved in analyzing the threat reports produced by the SAS Cybersecurity teams.
- Worked closely with the fraud detection team to understand new claim threats, fraudulent claims and other fraudulent activities.
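A minimal sketch of the Oracle-to-SAS SQL pass-through pattern referenced above, assuming SAS/ACCESS Interface to Oracle is licensed; the macro variables, schema and column names are illustrative, not actual project values:

    /* Pull an Oracle table into a SAS data set via explicit pass-through;
       the inner query executes inside Oracle, only the result comes back */
    proc sql;
        connect to oracle (user=&ora_user password=&ora_pwd path="&ora_tns");
        create table work.customers as
            select * from connection to oracle
                (select cust_id, open_dt, balance
                   from crm.customers);
        disconnect from oracle;
    quit;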
Confidential, Herndon, VA
Data Analyst
Responsibilities:
- Worked with the statistical reporting procedures used for regular data collection and data analysis.
- Extensively used SAS macros to create reusable SAS programs that can be modified as requirements change (see the parameterized macro sketch after this list).
- Built SSRS reports from different source formats, delivering reports and presentations that satisfied clients.
- Used options such as DROP, ADD, KEEP and RENAME to modify existing data sets.
- Worked with the Tableau administrator to create users, groups, projects and data connection settings.
- Involved in creating dashboards and report schedules on Tableau Server.
- Used Perl scripting to handle data feeds, implement business logic and communicate with web services through the SOAP::Lite module and WSDL.
- Worked on migrating the existing campaign execution from SAS to Unica.
- Prepared various customer lists using SAS for marketing procedures.
- Daily activities included creating metadata users, groups, roles, server definitions (SAS BI servers and other database server definitions such as Oracle and Teradata), ACTs and server libraries, importing tables, and migrating BI content from lower to higher environments.
- Developed Ab Initio graphs for loading tables in Oracle, helped redesign existing Ab Initio graphs, and documented the changes made according to requirements.
- Used macros to integrate complex business rules.
- Performed various metadata functions involving SAS Information Maps, OLAP and Enterprise Guide.
- Worked with the credit risk analysis reporting system based on FICO scores and prepared reports that support credit risk management policies.
- Involved in moving large datasets across Mainframes, Teradata, UNIX and SAS.
- Modeled SAS data stores such as SAS OLAP and MOLAP cubes to support the reporting teams and the business intelligence group.
- Managed and maintained the application reporting database.
- Responsible for reporting viewership metrics for U-verse and DirecTV usage.
- Created complex stored procedures and developed dynamic SQL queries to adapt to a changing environment.
- Worked with huge databases in Teradata, Vertica and SQL environments.
- Created snapshot reports, report subscriptions, data-driven subscriptions and report caching using Report Manager.
- Involved in dimensional modeling to design the data warehouse.
- Tuned SQL queries for performance improvements in Teradata and SQL environments.
- Hands-on experience with Teradata, working through huge data sets and overcoming spool space errors.
- Prepared test data for test jobs in the User Acceptance Testing environment.
- Prepared summary reports of different models with SAS Factory Miner, which enables detailed data drill-down.
- Prepared interactive variable distribution graphs in SAS Factory Miner to detect data issues.
- Responsible for the migration from SAS 9.2 to SAS 9.4.
- Supported the change manager during the migration by reporting up-to-date progress and tracking costs and benefits.
- Identified threats and provided SWOT analysis reports to the change manager to identify and mitigate possible risks.
- Designed and developed MI reports to meet the specified business requirements by using Teradata SQL.
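A minimal sketch of the kind of reusable, parameterized macro described above; the macro name, parameters and data set are illustrative assumptions:

    /* Reusable summary step: callers change only the parameter values */
    %macro summarize(lib=work, dsn=accounts, classvar=region, anlvar=balance);
        proc means data=&lib..&dsn n mean sum maxdec=2;
            class &classvar;
            var &anlvar;
        run;
    %mend summarize;

    /* Example call */
    %summarize(lib=work, dsn=accounts, classvar=region, anlvar=balance)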
Confidential, DE
SAS Programmer
Responsibilities:
- Created database libraries in both UNIX and Linux environments.
- Worked with SAS EBI tools for application development and customizing dashboards and portals.
- Hands-on experience preparing test cases and test scripts for integration testing; also prepared use cases for client systems.
- Identified missing values and their frequencies in the mortgage data.
- Involved in requirements gathering, analysis, design, deployment and user training.
- Involved in the physical and logical design of the database, and in creating and managing the database and all database objects in SQL Server 2008.
- Developed logical and physical E-R diagrams using Erwin, mapping the data into database objects.
- Designed and developed various DB Objects like Stored Procedures, User defined functions, triggers and used Indexes for accomplishing various tasks.
- Created views to facilitate easy viewing and implementation, and for security.
- Wrote Python routines to log into the websites and fetch data for selected options.
- Created logins and job roles.
- Responsible for database backups.
- Fulfilled random data requests for large quantities of data housed in mainframe files, legacy files, and in the data warehouse using Mainframe SQL, SAS, Access, and BRIO.
- Worked with a team of developers on Python applications for risk management.
- Developed reports, sub-reports, drill-down reports and data-driven subscriptions, using features such as charts, graphs and filters.
- Created stored procedures and used them as the datasets for report design.
- Provided data quality reports to the credit bureau in a timely manner using updated QA metrics.
- Leveraged machine learning to create diagnostic classifiers and perform clustering analysis.
- Analyzed and designed feature sets to maximize model fit with R.
- Implemented the machine learning algorithm in production software using Python.
- Applied the SVM machine learning algorithm to non-linear data for fitting and prediction.
- Wrote algorithms in R and Python, and occasionally MATLAB.
- Developed BTEQ scripts and tuned their performance.
- Performed data extraction and validation, applying transformation rules in the flow of data from source to target.
- Secured data by setting appropriate permissions at both the metadata server and UNIX levels, and created SAS metadata backups using the Backup Wizard.
- Validated source and target data in Teradata per business requirements.
- Created the low-level design document.
- Created test case scripts in Teradata to validate the different stages.
- Involved in post implementation support and resolved product related bugs.
- Eliminated duplicate observations in data sets using the NODUPKEY and NODUPRECS options of PROC SORT (see the sketch after this list).
- Validated the Linux servers with pre-checks: whether all user IDs were created, correct Python and Java versions, and correct file directories and their space requirements.
- Provided documentation of methodologies, data reports and the model results to the project manager.
- Processed data using Hive to develop web reports for insurance claim analysis.
- Generated customized SAS reports using DATA _NULL_ techniques.
- Worked with the Dynamic Data Exchange (DDE) protocol to pull and transfer data from Excel spreadsheets to other applications.
- Prepared various BTEQ scripts to run complex queries on the Teradata database, breaking complex queries into simpler ones using volatile tables.
- Participated in SAS model implementation for home equity loan loss forecasting and in forecast analytics for regulatory stress testing (CCAR/DFAST), as well as BAU forecasts and annual planning.
- Managed and streamlined the analytical workflow by tracking the customer workflow using SAS Model Manager.
- Designed and developed Oracle PL/SQL scripts for importing and exporting data.
- Implemented HDFS and NoSQL techniques in big data environments for extracting and analyzing data.
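A minimal sketch of the PROC SORT de-duplication step mentioned above; the data set and BY variables are illustrative assumptions:

    /* NODUPKEY keeps one observation per BY-variable combination */
    proc sort data=work.claims out=work.claims_dedup nodupkey;
        by member_id claim_dt;
    run;

    /* NODUPRECS drops only records that are identical across all variables */
    proc sort data=work.claims out=work.claims_norecdup noduprecs;
        by member_id claim_dt;
    run;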
Confidential
SAS Programmer
Responsibilities:
- Developed SAS Programs for validation, analysis, data cleaning and generating reports.
- Prepared Case report tabulations.
- Created reports identifying irregular data entries by reviewing the SOPs.
- Generated reports using SAS Output Delivery System destinations such as HTML, PDF and RTF.
- Involved in retrospective validation procedures.
- Prepared the design document.
- Applied statistical programming languages (R, SAS, Python), plus Python and Java for API-based visualization with Tableau, along with relational databases such as SQL and Oracle and non-relational databases such as HBase, MongoDB and Redis.
- Responsible for building a new services BI database with the tables and views required by the business team.
- Administered SQL Server with client/server tools including SQL Server Enterprise Manager, Profiler and Query Analyzer.
- Created ad hoc reports using Report Builder and maintained Report Manager for SSRS.
- Involved in development and modification of existing Financial Reports and Marketing Reports, which include Corporate Consolidation Reports, Financial Reports and Planning Reports for management.
- Created a number of standard and complex reports to analyze data using slice-and-dice, drill-down and drill-through in SSRS.
- Was actively involved in the server migration and the fresh QlikView installation process.
- Developed SAS programs to convert Oracle data for a Phase II study into SAS data sets using the SQL pass-through facility, and used the LIBNAME facility to import external files into the SAS environment.
- Worked on SAS code to import Common Medicare Environment (CME) data into SAS from EBCDIC format (extracted from mainframes) and to develop measures for monitoring administrative functions.
- Mapped raw data to SDTM and generated SAS programs for creating SDTM domains such as Adverse Events (AE), Demographics (DM), Concomitant Medications (CM), Medical History (MH) and Vital Signs (VS).
- Generated tables, listings and graphs, including safety and efficacy profiles, for FDA submissions.
- Responsible for providing SAS programming and analysis support for Phase II and Phase III clinical studies.
- Performed validation programming using SAS macros by defining variables, merging data sets, creating derived data sets and reviewing code.
- Created analysis data sets from raw data sets where new variables were needed.
- Made extensive use of DATA _NULL_ and the SUMMARY, MEANS, FREQ and CHART procedures to maintain code quality and standards. Performed statistical work to develop SAS applications using PROC REG, PROC UNIVARIATE, PROC CORR, PROC ANOVA, PROC MEANS, etc.
- Formatted HTML, RTF and PDF reports using the SAS Output Delivery System (ODS), as in the sketch after this list.
- Good understanding of creating CRT data sets and of MedDRA and WHO drug dictionaries and IND and ICH guidelines.
- Generated tables, listings and graphs, including patient demography and characteristics, adverse events, laboratory results, etc.
- Developed SAS programs using SAS/BASE, SAS/SQL, SAS/STAT and SAS/MACROS for statistical analysis and data display.
- Produced RTF, MS Word and HTML formatted files using SAS ODS to deliver ad hoc reports for presentation and further analysis.
- Developed summary reports and graphs using SAS procedures such as PROC FREQ, PROC GPLOT, PROC SUMMARY and PROC COMPARE.
- Proactively monitored database performance.
- Responsible for meeting with clients and gathering business requirements.
- Created user-defined data types and added them to the model database for future use.
- Involved in Database design and worked with Production DBA.
- Served as the single point of contact between business users and the technical team to ensure they got the desired results; also responsible for giving technical presentations to demo the applications to users.
- Responsible for creating the Data Design Document (DDD), which documented everything in the project.
- Provided reports according to the business team's requirements with minimal risk.
- Generated case report tabulation according to the standards for regulatory submissions.
- Participated in project meetings for updating the research findings.
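A minimal sketch of the ODS report formatting referenced above; the output file names, data set and variable are illustrative assumptions:

    /* Route the same frequency summary to both RTF and PDF destinations */
    ods rtf file="ae_summary.rtf";
    ods pdf file="ae_summary.pdf";

    proc freq data=work.adverse_events;
        tables preferred_term / nocum;
    run;

    ods _all_ close;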