
SAS Programmer Resume


Buffalo, NY

SUMMARY:

  • Over 8 years of experience as a SAS Data Analyst, primarily concerned with the collection, analysis, interpretation, and presentation of quantitative information
  • Proficient in technologies such as SAS, SAS Enterprise Guide, SQL Server 2008, Teradata SQL, relational and dimensional data modeling, TOAD, UNIX, VB, MS Excel, MS Access, Business Objects, DB2, mainframe, MicroStrategy, TSO, and JCL.
  • Excellent knowledge of marketing data analysis, particularly brand analysis to understand the effect of advertising on business outcomes.
  • Evaluated mobile analytics and online brand-effect metrics to identify trends and product insights.
  • Experience in advanced analysis of macroeconomic and financial data using statistical and data mining techniques (SQL, SAS, MS Excel).
  • Excellent programming skills in joining, merging, and extracting data from various databases using SAS/SQL and PROC SQL.
  • Excellent database and BI skills (SQL Server, SSAS, SSIS, Oracle, Sybase, DB2, Teradata, Tableau, QlikView), especially with large databases.
  • Strong knowledge of dynamic programming and list processing using SAS/MACRO.
  • Proficient in creating UML diagrams such as Use Case Diagrams, Class Diagrams, Sequence Diagrams, Collaboration Diagrams, Activity Diagrams, State Diagrams, Deployment Diagrams and developed Business Requirement Documents (BRD) and Functional Requirement Documents (FRD).
  • Experienced in writing Korn shell scripts for automating jobs.
  • Good quantitative and analytical skills; experienced in extracting data, analyzing it, and transforming it into business intelligence. Deft in writing Structured Query Language (SQL) queries and SAS queries.
  • Extensively used the SAS/SQL pass-through facility to connect to relational databases like Oracle, DB2, Sybase, and Teradata (see the sketch at the end of this list).
  • Created analytical models and presented information that enhances the forecasting process; contributed to identifying process improvements related to data quality.
  • Proficient in using SAS Enterprise Guide to access and manipulate databases such as Oracle, DB2, Teradata, and Sybase.
  • Proficient with Excel, including Excel functions and generating graphs with pivot tables.
  • Interest areas include Excel modeling and applied statistical techniques including Exploratory Analysis, Cluster Analysis, Regression Analysis, and Factor Analysis (Principal Component Analysis).
  • Good knowledge of STLC (Software Testing Life Cycle), Test Planning, Test Execution and Test Results Analyses.
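A minimal sketch of the SAS/SQL pass-through approach described above; the connection details, schema, and table names are hypothetical:

    proc sql;
       /* hypothetical connection details and source table */
       connect to oracle (user=report_user password=XXXXXXXX path='DWPROD');
       create table work.accounts as
       select * from connection to oracle
          ( select account_id, open_dt, balance
              from cust.accounts
             where open_dt >= date '2015-01-01' );
       disconnect from oracle;
    quit;

The query inside the parentheses runs on the database server, so only the filtered rows are returned to SAS.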

TECHNICAL SKILLS:

Data Warehousing/ETL Tools: Teradata utilities (BTEQ, TPump, MultiLoad, FastLoad, and FastExport), Data Mover, Business Objects, Teradata Index Wizard, Teradata Statistics Wizard, Teradata SQL Assistant, Tableau, QlikView, Informatica PowerCenter

Languages: UNIX, SQL, COBOL, JCL

RDBMS: Teradata, Microsoft SQL Server, Oracle, DB2

Programming Languages: UNIX Shell scripting, PL/SQL, SQL

Schedulers: WLM, Informatica scheduler, Autosys.

Design Tools: ERWIN modeler

Operating systems: Windows Vista/XP/2000, UNIX (AIX, Solaris), MVS

Version control tools: ClearCase, VSS, and Subversion.

PROFESSIONAL EXPERIENCE:

SAS Programmer

Confidential - Buffalo, NY

Responsibilities:

  • Interacted with Business Analysts for requirements gathering, analysis, and documentation, and to discuss primary model requirements for the project and support business decisions.
  • Documented methodology, data reports, and model results, and communicated with the project team and project manager to share knowledge.
  • Developed complex SAS macros to simplify SAS code and effectively reduce coding time (see the sketch after this list).
  • Imported data from relational databases into SAS files per detailed specifications, using LIBNAME and the PROC SQL pass-through facility to reduce processing time.
  • Created automated DI jobs in SAS DI Studio to generate data from different sources, using almost all of the data transformations available in DI Studio.
  • Performed complex statistical analysis using PROC MEANS and PROC FREQ.
  • Performed extensive data cleansing during the ETL extraction and loading phases by analyzing the raw data, writing SAS programs, and creating complex reusable macros.
  • Carried out data extraction and manipulation using PROC SQL, PROC SORT, and PROC REPORT to create preferred-customer lists per business requirements.
  • Extensively used SAS procedures such as PRINT, REPORT, TABULATE, FREQ, MEANS, SUMMARY, and TRANSPOSE, as well as DATA _NULL_ steps, to produce ad hoc and customized reports and external files.
  • Wrote SAS programs on the Windows platform and generated dashboards using Excel.
  • Created reports in RTF, PDF, and HTML formats using ODS statements and PROC TEMPLATE.
  • Performed data analysis and statistical analysis and generated reports, listings, and graphs using SAS tools such as SAS/GRAPH, SAS/SQL, SAS/CONNECT, and SAS/ACCESS.
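A brief sketch of the reusable-macro and ODS reporting pattern mentioned above; the macro name, dataset, and output path are hypothetical:

    %macro freq_report(ds=, var=, outfile=);
       /* hypothetical reusable macro: frequency report written to PDF via ODS */
       ods pdf file="&outfile";
       proc freq data=&ds;
          tables &var / missing;
       run;
       ods pdf close;
    %mend freq_report;

    /* example call with hypothetical dataset and path */
    %freq_report(ds=work.customers, var=region, outfile=/reports/region_freq.pdf)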

Environment: SAS/BASE, SAS/MACRO, SAS/ODS, SAS/ETL, SAS/SQL, SAS/ACCESS, SAS/CONNECT, SAS Enterprise Guide, SAS DI Studio, MS SQL Server, UNIX, MS Excel, MS Access, and Windows

Confidential, Durham, NC

SAS Data Analyst

Responsibilities:

  • Performed complex programming/analysis aimed at improving portfolio risk and operational performance within Mortgage Services Division
  • Confirm data is accurate, available, and consistent with the physical data model in the Teradata database via SQL queries.
  • Built strong relationships with senior business partners both within and external to department
  • Used SAS ODS to create HTML, RTF, and PDF output files when producing reports (see the sketch after this list).
  • Developed weekly sales leads generation report on global sales performance in selected 40 territories used to make strategic decisions and target more promising sales leads. (SQL, Excel, Shell scripting, Tableau)
  • Identification of Basel II Business Objectives
  • Creation of Basel II as-is and to-be functional and technical architecture for Pillar I and Pillar II.
  • Defining the Basel II Process & Capabilities for various portfolios including Stress Testing
  • Analyzing requirements and helping the customer's BAs prepare BRS and FRS documentation for various Basel rules, such as counterparty credit risk for ETD and centrally cleared OTC derivatives.
  • Oversee, manage, and verify testing of new data going into the Bank Data Warehouse (BDW) as part of User Acceptance Testing (UAT) and Post-Production Validation (PPV).
  • Create a data dictionary and migrate the data when shifting to a new version of SAS.
  • Produce results using SAS programming techniques such as the SAS macro language, advanced data manipulation, and statistical procedures (e.g., PROC FREQ, PROC REPORT).
  • Develop UAT and PPV test cases from source-to-target mapping documentation for the project so that all testing scenarios are covered to ensure a smooth transition to Production.
  • Conduct training for end users to ensure they are knowledgeable about new data available in the Bank Data Warehouse (BDW).
  • Create, track, update, and communicate the status of tasks daily in the version control tool and verbally in the stand-up for every two-week sprint, per Agile methodology.
  • Extensive experience collecting and reporting life-cycle mortgage information, including loan boarding.
  • Wrote many Teradata macros to reduce channel traffic, simplify execution of frequently used SQL operations, and improve performance.
  • Extract the data from a data warehouse and create ad-hoc reports, tables, and graphs according to Business needs.
  • Extensive skills in data mining.
  • Used various procedures such as Proc Tabulate, Proc Means and Proc Report to generate tables, listings, graphs and reports.
  • Extracted data from different data sources, such as Oracle and DB2, and loaded it into the data warehouse according to business logic.
  • Present statistical results using the appropriate graphs, charts, tables and written reports.
  • Used the Enterprise Guide (EG) query builder to run queries against tables and subset the data.
  • Creating ad-hoc reports.
  • Support mortgage servicing cash operations with ad hoc SAS reporting
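A hedged sketch of the ODS reporting pattern referenced above (see the bullet on SAS ODS); the input table work.loan_summary, its columns, and the output path are hypothetical:

    /* hypothetical input table with region, loan_status, and upb columns */
    ods rtf file="/reports/servicing_summary.rtf";
    proc report data=work.loan_summary nowd;
       column region loan_status n upb;
       define region      / group "Region";
       define loan_status / group "Loan Status";
       define n           / "Loan Count";
       define upb         / analysis sum "Unpaid Principal Balance" format=comma16.2;
    run;
    ods rtf close;

Switching the ODS destination to HTML or PDF only changes the ODS statements; the PROC REPORT step stays the same.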

Environment: IBM OS, SAS, SAS/ODS, JD Edwards World, MVS, Hyperion Brio, JDE E1, IBM DB2, IBM iSeries, Tableau, QlikView, IBM iNavigator, Teradata, Business Objects, Oracle Business Intelligence Enterprise Edition 11g (OBIEE), PL/SQL

Confidential, Rockville, MD

SAS Data Analyst

Responsibilities:

  • Responsible for gathering requirements from Business Analysts and identifying the data sources required for the requests.
  • Responsible for creating and delivering recurring as well as ad hoc marketing campaign programming within strict timelines while under constantly changing requirements using SAS, SAS SQL, and Teradata SQL in a UNIX environment with Oracle, Informix, and Teradata relational databases.
  • Dashboarding, visualization, Predictive Modeling, Decision Trees (CART, CHAID), Logistic Regression, and Cluster Analysis using SAS and Tableau
  • Performed data analysis and statistical analysis using SAS/BASE, the SAS macro language, and procedures such as PROC FREQ, PROC UNIVARIATE, and PROC MEANS.
  • Use SAS and VBA to perform data mining on large credit portfolio risk data sets and run simulation trials for stress tests.
  • Performed numerous data pulling requests using SQL for analysis.
  • Maintained campaign analysis reports for end users using SAS and Microsoft Excel.
  • Created SAS PROC SQL code to transform the original datasets into target datasets for complex requirements as part of the ETL process (see the sketch after this list).
  • Manage, create and maintain Access Databases and Excel Macros written in VBA
  • Met with management to determine ad hoc campaign programming needs.
  • Performed data analysis on various campaign related issues
  • Reviewed and verified other programmers’ code (Standard Quality Control and Assurance checks).
  • Used SAS/SQL to extract data from different databases and join tables.
  • Used SAS Enterprise Guide to access and manipulate Teradata and Oracle databases.
  • Tested and debugged all ETL and Teradata objects to evaluate performance and check whether the code meets the business requirements.
  • Extracted data from existing data sources; developed and executed departmental reports for performance and response purposes using Oracle SQL and MS Excel.
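A minimal sketch of the kind of PROC SQL transformation step described above; the source tables, columns, and business rule are hypothetical:

    proc sql;
       /* hypothetical: build a campaign target list from customer and transaction tables */
       create table work.campaign_target as
       select a.cust_id,
              a.segment,
              sum(b.txn_amt) as total_spend
         from work.customers    as a
              inner join
              work.transactions as b
           on a.cust_id = b.cust_id
        where b.txn_dt between '01JAN2014'd and '31DEC2014'd
        group by a.cust_id, a.segment
       having calculated total_spend > 1000;
    quit;

In production the source tables would typically be Teradata or Oracle tables referenced through a LIBNAME or pass-through connection rather than WORK datasets.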

Environment: Teradata, VBA, UNIX shell scripting, MVS, SAS, SAS/ODS, Teradata Viewpoint 13, Oracle 10g, SQL, Tableau, QlikView, UNIX, Erwin, Business Objects XIR3, Teradata QueryMan, Oracle 8i, Teradata loading utilities (BTEQ, MultiLoad, FastLoad, and FastExport), Rochade, HP Service Manager, Clear

Confidential, Wilmington, Delaware

SAS Data Analyst

Responsibilities:

  • Analyzed user requirements and developed, tested, and tuned various scripts, procedures, and database triggers to validate data.
  • Reviewing, analyzing and evaluating business system and user needs.
  • Designing and data mapping of business process from an IT perspective.
  • Design target data structures (data models) in compliance with the corporate data architecture for both OLTP and OLAP environments.
  • Designed scripts in SAS to be compatible with Oracle in order to load and access data from Oracle tables.
  • Connected to the mainframe and Oracle from SAS using macros (see the sketch after this list).
  • Wrote queries on the Teradata database to provide ad-hoc reports using SAS/SQL.
  • Designing a UNIX framework for executing file-system processes to handle ETL processing in a Teradata database environment.
  • Developed a small-scale GUI for users to measure and identify key risk indicators arising from various firm-wide operations. The tool was built using VBA user forms and had the appearance of a miniature software wizard for users to log entries.
  • Transferring and converting data from one platform to another to be used for further analysis (from Oracle and Excel to SAS and vice versa).
  • Profiling data; developing meaningful extracts; recognizing inconsistencies and initiating resolution of problems in data and processes.
  • Analyzing and examining existing SAS procedures and SQL queries/scripts to improve upon.
  • Creating SAS scripts and writing SQL queries based on input from analytics/development team.
  • Executing ad hoc data analyses in response to business requests; devising and maintaining various data mining studies.
  • Extracting data from existing data sources; developing and executing departmental reports for performance and response purposes using Oracle SQL and MS Excel.
  • Implementing and contributing to statistical analysis plans; providing additional expertise in the analysis as it relates to protocol development and data validation.
  • Identifying and implementing improvements in QA processes as needed.
  • Worked on data partitioning to optimize parallel processing with multi-threaded processing.
  • Extensively worked on Teradata SQL Assistant and BTEQ Scripts
  • Extracted data from existing data sources and performed ad-hoc queries.
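A hedged sketch of the LIBNAME-based SAS-to-Oracle access described above; the library name, connection details, schema, table, and columns are hypothetical:

    /* hypothetical connection details; assumes SAS/ACCESS Interface to Oracle */
    libname dwora oracle user=report_user password="XXXXXXXX" path="DWPROD" schema=loans;

    data work.loan_extract;
       set dwora.loan_master(keep=loan_id orig_dt upb);   /* hypothetical columns */
       where orig_dt >= '01JAN2013'd;
    run;

    libname dwora clear;

The same pattern can be wrapped in a macro so the schema and date cutoff are passed as parameters, which is one way the connection macros mentioned above could be structured.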

Environment: Teradata, SAS, SAS/ODS, UNIX, Oracle, MVS, Business Objects XIR2/XIR3, Rochade, Tableau, QlikView, HP Service Manager, Teradata V2R13, Teradata QueryMan, Oracle 8i, Teradata loading utilities (BTEQ, MultiLoad, FastLoad, FastExport), PowerCenter 9, ClearQuest

Confidential, Brandon, FL

SAS Data Analyst

Responsibilities:

  • Develop Complex SQL Queries to fetch data from the Teradata Database and schedule the Jobs in UNIX.
  • Created one-time ad hoc reports and performed testing in Teradata SQL Assistant; provided data to users in Excel for quick decisions.
  • Extracted data from existing data sources and performed ad-hoc queries using SQL.
  • Responsible for creating and delivering recurring as well as ad hoc marketing campaign programming within strict timelines while under constantly changing requirements using SAS, SAS SQL, and Teradata SQL in a UNIX environment with Oracle, Informix, and Teradata relational databases.
  • Write Excel VBA macros to manipulate and manage data
  • Created required financial reports using analysis output and exported them to other environments or the web using various SAS ODS methods for creating delimited text files, HTML, RTF, or XML files.
  • Maintained campaign analysis reports for end users using SAS and Microsoft Excel.
  • Met with management to determine ad hoc campaign programming needs.
  • Performed data analysis on various campaign related issues
  • Reviewed and verified other programmers’ code (Standard Quality Control and Assurance checks)
  • Created risk reports with interactive graphs and charts that allowed users to click buttons and use drop-down features to manipulate the illustrations; the graphs were made interactive through the use of VBA macros and plug-ins/add-ins.
  • Involved in Teradata SQL Development, Unit Testing and Performance Tuning.
  • Used advanced Excel modeling and statistical techniques to build flexible models to forecast and analyze performance variables; performed data validation checks and procedures.
  • Provided data to business analysts in graphical format using MS Office and PowerPoint presentations.
  • Managed, manipulated, and joined data using SAS, Excel tables, Oracle, and the SQL query builder.
  • Extracted/transferred data and performed ad-hoc queries using SAS v9.1 from/to different data sources such as MS SQL Server, Excel, and reporting items.
  • Extracted data from Oracle and loaded it into Teradata for reporting purposes.
  • Created batch processes using Teradata FastLoad, MultiLoad, BTEQ scripts, UNIX shell, and Teradata SQL to transfer, clean up, and summarize data (see the sketch after this list).
  • Gathered data for Customer Support from a SQL Server ODS application into the centralized Teradata system.
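A minimal sketch of loading a SAS table into Teradata through the FastLoad facility mentioned above; the server, credentials, database, and table names are hypothetical, and SAS/ACCESS Interface to Teradata is assumed:

    /* hypothetical connection details */
    libname tdw teradata user=etl_user password="XXXXXXXX" server=tdprod database=rpt;

    /* bulk-load a summarized SAS table into an empty Teradata table via FastLoad */
    data tdw.daily_summary(fastload=yes);
       set work.daily_summary;
    run;

    libname tdw clear;

For very large or multi-step loads, a standalone FastLoad/MultiLoad or BTEQ script driven from a UNIX shell scheduler is the usual alternative.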

Environment: Teradata, SAS v8/v9, SQL Assistant 12.0, Erwin, UNIX Sun Solaris, ClearCase, Hyperion Brio, Oracle, Tableau, QlikView, Excel, Rational Rose, RequisitePro, Agile, Documentum, MS Project 2002, MS Visio, MS Word, MS Excel, Test Director, Java, Cognos, MySQL, Windows NT/2000, PROC SQL, PC SAS, OLAP, PL/SQL, DB.

Confidential

Data Analyst

Responsibilities:

  • Extracted data from Oracle using SQL Pass through facility, Libname Method and generated ad-hoc reports.
  • Transferring and migrating data from Oracle to SAS datasets to be used for further statistical analysis.
  • Responsible for creating new code, utilizing existing code, and maintaining data in SAS.
  • Created SAS datasets from raw data files with different field structures using trailing @ and @@ line-hold specifiers in the DATA step.
  • Built summary reports after identifying the customers, their occupancy period and the revenue generated using PROC SUMMARY, PROC MEANS and PROC FREQ.
  • Scheduled sessions to load data into the warehouse database per business requirements using the Informatica scheduler.
  • Developed Informatica mappings, mapplets, workflows, and sessions to process data from multiple sources and load the data into the data warehouse based on requirements.
  • Developed complex mappings and mapplets using partitions in Informatica.
  • Performed SQL tuning and tuned Informatica mappings to improve job performance and provide maximum efficiency.
  • Developed several Informatica mappings and mapplets to extract data from various sources such as flat files and databases and load the data into the data warehouse, using transformations and reusable transformations such as source qualifier, joiner, rank, update strategy, expression, union, sorter, lookup, sequence generator, and router.
  • Used SAS system macros for error handling, code validation, date-stamping of log files, collecting files to a given directory, and scheduling (see the sketch after this list).
  • Performed data analysis, data migration, data preparation, graphical presentation, statistical analysis, reporting, validation and documentation.
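A small sketch of the date-stamping utility macro pattern mentioned above; the macro name and log directory are hypothetical:

    %macro datestamp_log(logdir=);
       /* hypothetical utility: route the SAS log to a date-stamped file under &logdir */
       %local stamp;
       %let stamp = %sysfunc(today(), yymmddn8.);
       proc printto log="&logdir/daily_extract_&stamp..log" new;
       run;
    %mend datestamp_log;

    /* example call; a plain PROC PRINTTO afterwards restores the default log */
    %datestamp_log(logdir=/logs/dw)
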
Environment: Windows, SAS, SAS/MACRO, SAS/ODS, PROC SQL, Excel, Oracle.
