
SAS BI Developer Resume


Raleigh, NC

SUMMARY:

  • Data enthusiast with 8 years of experience working with Fortune 500 companies in data analytics, product management, project management, business process development, and initiatives and strategies. Highly experienced in market research, statistical modeling, prototyping, devising innovative products, customer engagement, and collaboration.
  • Exceptional problem-solving, presentation, leadership, and communication skills. Passionate about data mining and machine learning. Skilled in providing unified solutions with a central focus on business value, customer satisfaction, and future vision.

TECHNICAL SKILLS:

  • SAS Enterprise Guide
  • SAS Programming
  • SAS DI Studio
  • SAS VA
  • SQL
  • Unix
  • Tableau
  • QlikView
  • QlikSense
  • Python
  • MS Project
  • Oracle & Teradata
  • MS Access
  • Data Analysis
  • Statistical Models for Decision Making
  • Segmentation
  • Time Series Analysis (ARIMA)
  • Decision Trees
  • Neural Networks
  • Regression and ANOVA
  • ETL

PROFESSIONAL EXPERIENCE:

SAS BI Developer

Confidential, Raleigh, NC

Responsibilities:
  • Responsible for understanding business requirements and translating them into technical design documents (TDDs).
  • Created detailed technical design documents, walked the business team through them, and got them baselined before development began.
  • Attended weekly project status calls to clarify business requirements and shared project delivery timelines with the IT project manager.
  • Worked extensively on operational enhancements, maintenance, and support on the SAS Grid platform.
  • Supported existing applications by implementing functionality changes per current business requirements.
  • Involved in application migration from SAS 9.2 to SAS 9.4 in a grid environment.
  • Worked with users to migrate processes from 9.2 to the 9.4 Grid platform, including migration of users, data, and corresponding metadata.
  • Created UNIX shell scripts to execute SAS programs on the UNIX server.
  • Generated ad hoc reports for internal business users as well as client customers using SAS EG.
  • Designed several ETL jobs using SAS DI Studio to port the ODS data into data marts with tasks involving configuring metadata libraries in SMC, developing transformation logic and staging the final data.
  • Promoted unit-tested SAS code and scripts into the production region.
  • Interacted with the client manager on a daily basis to deliver project updates.
  • Worked closely with the Application Operations team to support the system, resolving production issues, faults, and failures within SLAs via service requests in the ServiceNow ticketing tool.
  • Involved in many new development projects as Application Operations Developer to support the application post deployment.
  • Created artifact documents and made them visible on the SharePoint site to other team members to ensure continued support for the launched application or new process.
  • Worked on the Salesforce.com integration update project, where the primary goal was to move from Java XML processing to Informatica.
  • Updated Informix and Lotus Notes databases and performed data transfers.
  • Expertise in extracting and loading data to and from various sources, including Oracle, MS SQL Server, Teradata, IBM DB2, Informix, flat files, and XML files.
  • Responsible for creating and delivering recurring as well as ad hoc marketing campaign programming within strict timelines while under constantly changing requirements using SAS, SQL, and Teradata SQL in a UNIX environment with Oracle, Informix, and Teradata relational databases.
  • Developed many Informatica mappings and workflows using Informatica PowerCenter Designer and Workflow manager.
  • Used Informatica Workflow Monitor to monitor the execution of the workflows and its efficiency.
  • Worked on different database schemas and developed packages, stored procedures, functions, views, and database triggers using SQL and PL/SQL in Oracle.
  • Created intermediate SAS datasets for unit testing and extensively used PROC SQL, PROC SUMMARY, PROC FREQ, PROC PRINT, PROC REPORT, PROC IMPORT, and SAS ODS.
  • Extensive experience with various Tableau Desktop reporting features such as measures, dimensions, folders, hierarchies, extracts, filters, table calculations, calculated fields, sets, groups, parameters, forecasting, blending, and trend lines.
  • Created context filters on existing filters and optimized Tableau data extracts before publishing them to the server to reduce report rendering time.
  • Designed, developed, tested, and maintained Tableau functional reports based on user requirements.
  • Automated many SAS Jobs as per the business needs using ActiveBatch Job scheduler application.
  • Developed PowerShell scripts to automate SAS jobs and to load reports on a weekly, monthly, quarterly, and yearly basis into VistaPlus, a client-based tool that holds all customer-visible reports.
  • Worked with Business Partners, Statisticians, and key stakeholders to provide SAS programming for analyzing data as well as generating reports, tables and listings.
  • Responsible for data accuracy, defect-free deliverables, and timely responses to business users.
  • Created one-off programs to provide ad hoc reports.
  • Coordinated with the QA and BA teams on User Acceptance Testing (UAT).
  • Responsible for code deployment into the production servers and provided post-production support.
  • Extracted data from Oracle and DB2 tables to create business-specific reports.
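
The ad hoc reporting and database-extraction work above can be illustrated with a short SAS sketch; the connection details, schema, dataset, and file names below are hypothetical placeholders, not taken from the actual project.

```sas
/* Sketch of an ad hoc report: pull summarized Oracle data via the
   PROC SQL pass-through facility, then render it with PROC REPORT.
   All names (ods.sales_fact, /reports/...) are illustrative. */
proc sql;
    connect to oracle (user=&ora_user password=&ora_pw path='PRODDB');
    create table work.monthly_sales as
    select * from connection to oracle
        ( select region,
                 trunc(txn_date, 'MM') as txn_month,
                 sum(amount)           as total_amount
          from   ods.sales_fact
          group  by region, trunc(txn_date, 'MM') );
    disconnect from oracle;
quit;

ods pdf file='/reports/monthly_sales.pdf';
proc report data=work.monthly_sales;
    columns region txn_month total_amount;
    define region       / group;
    define txn_month    / group;
    define total_amount / analysis sum format=dollar14.2;
run;
ods pdf close;
```

The pass-through form sends the inner query to Oracle for execution, so aggregation happens in the database rather than in SAS.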

Technologies and Tools: SAS 9.2 & 9.4 | SAS EG 5.1 & 7.1 | Tableau Desktop 9.2 | SQL | UNIX | Oracle | Teradata | Informix | ActiveBatch v8 | Informatica | Salesforce | ServiceNow | SAS Information Map Studio 4.4 | SAS OLAP Cube Studio 4.4 M3 | SAS DI Studio 4.2 & 4.7 | SAS Management Console 9.4 M3

SAS Programmer Analyst

Confidential, Charlotte, NC

Responsibilities:
  • Analyzed advanced analytics platform (AAP) portfolio and Ally Cash Back Credit Card portfolio data and generated analytical reports using SAS and MS SQL.
  • Worked on code migration from PC SAS to the grid (server).
  • Prepared model data and built various predictive models in R, including time series forecasting (ARIMA), linear and logistic regression, and k-means clustering.
  • Implemented Time Series Analysis, Hypothesis Test, ANOVA, RFM analysis, and ARIMA models.
  • Developed SAS macros for data cleaning, data mining, and reporting, and to support routine processing.
  • Extracted data from different sources, such as the "HUB" system (where information is stored in Oracle tables), Excel, Access, and text/CSV files, using SAS/ACCESS and SAS SQL, and created SAS datasets.
  • Used SQL to download data and SAS PROC SQL pass through facility to connect to Oracle tables.
  • Wrote new SAS programs from scratch, automated and fine tuned existing SAS code using SAS Macro and Shell scripts and reduced manual dependencies and generated HTML/PDF/RTF Reports and Graphs/Charts.
  • Merged SAS datasets using SQL joins such as LEFT JOIN, RIGHT JOIN, INNER JOIN, and FULL JOIN, as well as DATA step statements (SET, MERGE) and PROC APPEND.
  • Extensively used Informatica PowerCenter for extracting, transforming, and loading data into different databases.
  • Used SAS DI Studio for data transformation, and scheduled and deployed code from SAS DI Studio.
  • Wrote PL/SQL stored procedures and triggers for implementing business rules and transformation.
  • Created source and target definitions in the repository using the Informatica Designer Source Analyzer and Warehouse Designer.
  • Involved in requirement analysis, ETL design and development for extracting data from the heterogeneous source systems like Oracle, flat files, XML files and loading into Staging and Enterprise Data Vault.
  • Created technical specification for the development of Informatica extraction, transformation and loading (ETL) mappings in order to load data into classified tables.
  • Worked in highly structured division across Informatica developers/Data Architects (data modeling team)/DBAs/Change Management etc.
  • Wrote Python scripts to parse XML documents and load the data in database.
  • Created analytical reports (periodic and ad hoc) for different credit card portfolios using PROC REPORT, PROC TABULATE, PROC SUMMARY, and ODS statements, generating output in HTML, Excel, and RTF formats.
  • Developed SAS macro programs using macro facilities (%LET, CALL SYMPUT, %NRSTR, SYMGET, etc.) to automate business-as-usual (BAU) reporting processes and improve process efficiency.
  • Interacted with business stakeholders to understand their decision-making parameters, analyzed the available data to build Tableau dashboards, and managed the delivery.
  • Prepared summary reports across products (credit cards and loans such as home and vehicle loans).
  • Involved with SAS Visual Analytics Implementation in a Distributed mode with Cloudera.
  • Used SAS Visual Analytics, which renders reports and visualizations that can be easily shared via iPad and Android mobile devices.
  • Designed and developed Tableau visualizations, including dashboards using calculations, parameters, calculated fields, groups, sets, and hierarchies.
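
The macro-driven BAU reporting described above might look roughly like the following SAS sketch; the portfolio, dataset, and variable names are invented for illustration.

```sas
/* Sketch of macro-automated BAU reporting: CALL SYMPUTX lifts a data
   value into a macro variable, and a parameterized macro drives the
   report. All dataset and variable names are hypothetical. */
%let rpt_month = 2016-09;

data _null_;
    set work.portfolio_summary end=last;
    if last then call symputx('n_accts', _n_);   /* record count -> macro var */
run;

%macro run_report(portfolio);
    title "Portfolio &portfolio - &rpt_month (&n_accts accounts)";
    proc tabulate data=work.portfolio_summary;
        where portfolio = "&portfolio";
        class region;
        var balance;
        table region, balance*(n sum mean);
    run;
%mend run_report;

%run_report(CASHBACK)
```

Parameterizing the report this way lets one macro invocation per portfolio replace hand-edited copies of the same program.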

Technologies & Tools: SAS 9.2 | SQL | Oracle | QlikView | QlikSense | SAS Visual Analytics | SAS DI Studio 4.2 | R Programming | Perl Scripting | Python | Informatica

SAS Data Analyst

Confidential

Responsibilities:
  • Wrote advanced SAS programs involving macros, SQL joins, merges, prompts and building Enterprise Guide projects.
  • Worked extensively in Data Extraction from Data Marts/external sources and in Data Analysis, Cleaning, Sorting, Merging and Reporting using R Programming.
  • Hands-on experience with SAS Predictive Maintenance, SAS Visual Analytics, and SAS Enterprise Miner for predictive analytics, clustering, and classification.
  • Built standard and custom reports for distribution to internal and external customers/users.
  • Worked with ActiveRecord using MySQL (mysql2) and Oracle (activerecord-oracle_enhanced-adapter) databases.
  • Implemented EBI to provide reports, dashboard reports, data set and data mart services in response to the client requirements.
  • Performed data analysis and statistical analysis and generated reports, listings, and graphs using SAS tools such as SAS Data Integration Studio, SAS/GRAPH, SAS/SQL, SAS/CONNECT, and SAS/ACCESS.
  • Responsible for credit risk model validation; completed CCAR model validation reports and assisted senior managers with model presentations.
  • Worked with relational databases such as Oracle, SQL Server, and MS Access, and with the SAS Scalable Performance Data Server (SPDS).
  • Performed statistical analysis of data and provided conclusions and predictions based on data summaries.
  • Implemented SAS/IML for programming statistical procedures, matrix operations and to construct graphics.
  • Used the Query Data user interface in ECM, defining processes and creating multiple automated workflows for various case types (fraud, security, etc.).
  • Designed, developed, validated, and deployed applications through QlikView.
  • Used various data sources such as QVD files, Excel, and flat files to gather data for developing QlikView applications.
  • Involved in modeling data per business requirements.
  • Knowledge of many QlikView functions (date and time, KEEP, JOIN, MAPPING, string and input fields, etc.) used in projects to create data models.
  • Used QlikView load functions such as LOAD, resident LOAD, and CONCATENATE.
  • Prepared unit test cases, Perl scripts, load tests, and test data; executed tests, validated results, managed defects, and reported results.
  • Conducted in depth data analysis/mining and interpretation of results of various data sets using SAS and R programming.
  • Performed unit testing on Stored Processes and newly developed SAS programs.
  • Extensively debugged SAS code to identify issues.
  • Used Python's XML parsing APIs (SAX and DOM) to track small amounts of data without requiring a database.
  • Replicated operational tables into staging tables to transform and load data into the enterprise data warehouse using Informatica.
  • Performed data validation, data cleansing, and data transposition using procedures (PROC SORT, PROC MEANS, PROC COMPILE, PROC FREQ, PROC RANK, PROC FCMP, etc.) and statements (KEEP, DROP, RENAME, LABEL, OBS, FIRST., FORMAT, PUT, INPUT, etc.).
  • Retrieved millions of records using various procedures and used PROC FCMP to define multiple subroutines and functions.
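
As a rough illustration of the PROC FCMP usage mentioned above (the function, package, and dataset names are hypothetical):

```sas
/* Sketch: define a reusable function with PROC FCMP and call it from
   a DATA step. Package and dataset names are placeholders. */
proc fcmp outlib=work.funcs.util;
    function pct_change(old, new);
        if old = 0 then return(.);       /* avoid division by zero */
        return((new - old) / old);
    endsub;
run;

options cmplib=work.funcs;               /* make the function visible */

data work.scored;
    set work.balances;                   /* hypothetical input dataset */
    bal_chg = pct_change(prev_bal, curr_bal);
run;
```

Centralizing logic in FCMP functions keeps repeated calculations consistent across the many programs that use them.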

Technologies & Tools: SAS | SQL | UNIX | Oracle | R Programming & Packages | Perl Scripting | Python | SAS DI Studio | SAS VA

Data Analyst

Confidential

Responsibilities:
  • Worked closely with the data modeling team and management, such as marketing, campaign, and financial managers, to analyze customer-related data and generate custom reports, tables, and listings.
  • Developed SAS code using SAS/BASE and SAS macros to clean invalid data while reading data from Oracle, Teradata, Excel, XML, and flat files into SAS.
  • Extracted and transported data from DB2 and Oracle tables in UNIX; extracted huge volumes of data from legacy systems and uploaded them into Oracle using SQL*Loader.
  • Automated SAS jobs running on daily/weekly/monthly basis using SAS/BI, SAS Macro and Unix Shell Scripting.
  • Worked closely with the business on Ab Initio GDE and ACE/BRE (Business Rules Engine) setup, design, coding, and training. Used Metadata Hub for files, DMLs, XFRs, etc.
  • Coordinated with Ab Initio support to resolve peculiar issues encountered.
  • Extensively used SAS procedures such as PRINT, REPORT, TABULATE, FREQ, MEANS, SUMMARY, and TRANSPOSE, and DATA _NULL_ steps, to produce ad hoc and customized reports and external files.
  • Validated Autosys box and command jobs deployed to schedule Ab Initio graphs and scripts, and coordinated with the Autosys admin team to ensure job dependencies were set as expected.
  • Made extensive use of SQL, SAS/ACCESS, and SAS/CONNECT to connect to various databases (Oracle, DB2), including development and production databases (DB2 on UNIX). Also worked on mainframes (MVS/JCL) to read, modify, and create SAS datasets per business needs.
  • Generated custom reports with line sizes, page breaks, Header message, and Bottom message using PROC SORT, PROC PRINT, and PROC REPORT.
  • Created ETL workflows in SAS/ Data Integration Studio to populate the analytical data marts for business reporting.
  • Debugged SAS programs using DATA _NULL_, PUT statements, and the DATA step debugger.
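
The DATA _NULL_ / PUT-statement debugging style noted above can be sketched as follows (the dataset and variable names are illustrative):

```sas
/* Sketch: log suspect records to the SAS log with PUT statements,
   without creating an output dataset. Names are placeholders. */
data _null_;
    set work.stage_accounts;
    if balance < 0 or missing(acct_id) then
        put 'WARNING: suspect record ' _n_= acct_id= balance=;
run;
```

Because DATA _NULL_ writes no dataset, the step costs only a pass over the data and leaves its findings in the log for review.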

Technologies & Tools: SAS | SAS DI Studio | SQL | Teradata | UNIX | Ab Initio

SAS Programmer

Confidential

Responsibilities:
  • Modified data using SAS/BASE and macros.
  • Employed techniques like sorting and merging on the raw data sets and coded them using PROC SQL to get the required output.
  • Worked as a developer creating complex stored procedures, SSIS packages, triggers, cursors, tables, views, and other SQL joins and statements for applications.
  • Used SSIS to transform data into a SQL database via FTP from text files and MS Excel sources.
  • Worked on SSRS to handle Ad-hoc requests.
  • Designed tables, constraints, necessary stored procedures, functions, triggers and packages using T-SQL.
  • Performed data source investigation, developed source-to-destination mappings, and performed data cleansing while loading data into staging/ODS regions.
  • Worked to get the data dynamically into the UI from SQL Database.
  • Used shell programming to run weekly and monthly reports.
  • Used SAS/ACCESS to gather data from RDBMS like TERADATA and DB2.
  • Coordinating the production of monthly, quarterly, and annual performance reports for senior management.
  • Used Python modules (numpy, matplotlib, etc.) to generate complex graphics such as histograms.
  • Extensively used the SAS macro facility to provide reusable programs and created tables, graphs, and listing reports.
  • Used SAS Data Integration Studio to develop various jobs processes for extracting, cleansing, transforming, integrating, and loading data into Data marts and Data warehouse database.
  • Extracted data sets from server using PROC IMPORT and created datasets in SAS libraries.
  • Coded SAS programs using SAS/BASE and SAS macros for ad hoc jobs.
  • Ran SAS scripts on UNIX and exported the output datasets.
  • Created charts showing performance using SAS/GRAPH.
  • Maintained and enhanced existing SAS reporting programs for marketing campaigns.
  • Ran reporting programs, downloaded the results into Excel, and built pivot tables.
  • Presented the results and statistical reports in PowerPoint for marketing staff.
  • Moved datasets across platforms (from PC and mainframe to UNIX and vice versa).
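
The import/export flow described above (reading server extracts with PROC IMPORT, then pushing results to Excel for pivot tables) might be sketched as follows; all paths and dataset names are placeholders:

```sas
/* Sketch: read a CSV extract with PROC IMPORT, then export results
   to Excel with PROC EXPORT (requires SAS/ACCESS to PC Files).
   Paths and dataset names are hypothetical. */
proc import datafile='/data/extract/campaign.csv'
    out=work.campaign dbms=csv replace;
    getnames=yes;
run;

proc export data=work.campaign
    outfile='/reports/campaign.xlsx'
    dbms=xlsx replace;
run;
```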

Technologies & Tools: SAS | UNIX | SAS DI Studio 4.2 | Shell Scripting | Python 2.7 | SQL | BASE SAS | SAS Macro | SAS/STAT | SAS/CONNECT | SAS/ACCESS | DB2 | Teradata | PL/SQL
