
Sas Analyst Resume


Jersey City, NJ

SUMMARY

  • SAS Certified Professional with 7+ years of experience working as a SAS Analyst/Developer in Unix and Windows environments.
  • SAS Analyst experienced in analysis and reporting in the Healthcare domain, dealing with Medicaid, MMIS, Managed Care, Accountable Care Organizations (ACOs), and HIPAA, along with statistical, epidemiological, and bioinformatics data analysis.
  • Good understanding of HIPAA rules and regulations, including claims processing, data analysis, utilization review, billing, and security for electronic transactions.
  • Experienced and certified in Oracle technologies, including the Advanced Oracle SQL certification.
  • Strong experience with Excel using VLOOKUPs, pivot tables, charts, arrays, sparklines, outlining, and customized reports.
  • Worked with different test environments, including DIT, SIT, QA, and UAT phases; experienced with both functional and non-functional testing and with preparing the product or service for acceptance testing, especially non-functional testing.
  • Involved in the development of test plans, test scenarios and test strategies to facilitate the process of testing. Also involved in Analysis, Design, and Functional Specifications to identify Test Requirements, Design Test Cases, Test Scripts, and Test Data with expected output.
  • SAS certifications cleared in Base SAS 9.3 and Predictive Modeling using SAS Enterprise Miner 13.
  • Demonstrated ability to understand business requirements and processes, learn new technical tools and software applications, and work with a variety of data and data sources.
  • Worked with SAS Data Integration (DI) Studio to develop and automate ETL processes and implemented error handling in SAS.
  • Strong experience with ETL and DW concepts. Have been part of Data migration and ETL testing projects.
  • Experience performing QA validation using SAS programming and statistical data analysis, including identifying and removing duplicates and performing data validation and verification.
  • Experience in Data Validation, Data Scrubbing, Data Cleaning, Data warehousing, Data Marts and Statistical reporting using statistical procedures like Proc Freq, Proc Means, Proc Univariate.
  • Experienced with Unix Shell Scripting and creating datasets in SAS using Unix.
  • Developed and modified existing SAS programs and imported data using SQL pass-through and LIBNAME engine methods to create tables and extract data from Teradata and Oracle databases.
  • Trained in Good Manufacturing Practices (GMPs), GS1, regulatory compliance, and FDA guidelines including 21 CFR. Hold a Master's degree in Packaging Science specializing in Food and Pharma Packaging and Logistics.
  • Experienced with the SAS Visual Analytics Data Builder, Explorer, and Reports Designer modules. Used SAS Visual Analytics for preparing, exploring, analyzing, and interpreting data; results supported customer behavior analysis, customer profiling, market segmentation, campaign analytics, and trend analysis. Used the SAS CI tool for customer intelligence and segmentation.
  • Worked with SAS Predictive Modeling for preparing data, building predictive models, assessing models, scoring new data sets, and implementing models. Certified Predictive Modeler with good knowledge of linear and logistic regression, ANOVA, standard deviation, time series models, and pricing and payment models.
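
The SQL pass-through and LIBNAME engine methods mentioned above can be sketched as follows; credentials, paths, and table names are placeholders, not actual project values:

```sas
/* SQL pass-through: push the query to Oracle and pull back results */
proc sql;
   connect to oracle (user=XXXX password=XXXX path="XXXX");
   create table work.claims as
   select * from connection to oracle
      (select claim_id, member_id, paid_amt
         from claims_tbl
        where paid_dt >= DATE '2017-01-01');
   disconnect from oracle;
quit;

/* LIBNAME engine: treat the Oracle schema as a SAS library */
libname ora oracle user=XXXX password=XXXX path="XXXX";
data work.claims2;
   set ora.claims_tbl;
run;
```

Pass-through runs the inner query on the database side, which keeps extraction efficient; the LIBNAME engine is convenient when tables need to be referenced like ordinary SAS datasets.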

TECHNICAL SKILLS

Languages: SQL, SAS, PL/SQL, Java

Tools: Oracle Developer, SQL Server, SAS 9.4, SAS Enterprise Guide 13, SAS Enterprise Miner, SAS Visual Analytics 7.3 - Data Builder, Explorer, Reports module, SAS Management Console, LASR Server, MS Excel, MS Visio, Jira, ALM, SAS/Connect, SAS Risk Dimensions, SAS DI (Data Integration), SAS CI (Customer Intelligence), Tableau.

Operating System: Unix, Linux, Windows

SAS Experience: End-to-end development of SAS infrastructure; manipulating and transforming data; combining data sets; creating summary reports; importing and exporting raw data files; identifying and correcting data and programming logic errors; building and assessing predictive models, scoring new data sets, and implementing models; customer behavioral analysis, customer segmentation, and customer retention; creating Visual Analytics reports, preparing data in Visual Analytics Data Builder, and exploring Visual Analytics data; running and scheduling Unix batch jobs; using SQL pass-through to update records; integrating data between Hadoop and SAS.
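
A typical summary report of the kind listed above can be sketched as follows; dataset and variable names are hypothetical:

```sas
/* Summary statistics by group */
proc means data=work.claims n mean sum maxdec=2;
   class plan_type;
   var paid_amt;
run;

/* Frequency distribution of a categorical variable */
proc freq data=work.claims;
   tables claim_status / nocum;
run;
```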

Production Support: Working between production and development environments. Version control for each release. Fixing production issues. Implementing process enhancements and changes in the development environment and moving them to production.

Data Integration and ETL: Involved in ETL and data migration activities, including Hadoop-SAS integration and linking databases with Teradata and Oracle. Performed QA validation of data, including validation and verification on both source and target databases and SQL and SAS validation in the target data warehouse, and created Data Marts. ETL testing includes knowledge of Type 1 and Type 2 Slowly Changing Dimensions.
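
The source-versus-target QA checks described here can be sketched with PROC SQL; library, table, and key names are placeholders:

```sas
proc sql;
   /* row counts should match after migration */
   select count(*) as src_rows from src.customers;
   select count(*) as tgt_rows from tgt.customers;

   /* duplicate check on the target key */
   select cust_id, count(*) as n
     from tgt.customers
    group by cust_id
   having count(*) > 1;
quit;
```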

PROFESSIONAL EXPERIENCE

SAS Analyst

Confidential, Jersey City NJ

Responsibilities:

  • Extracted and manipulated data and created data sets from various sources like Excel, flat files, Oracle databases, and Access databases using PROC IMPORT techniques and the SQL pass-through facility, and performed SAS data manipulations against the resulting sets. Used the SQL pass-through method to connect the SAS data analytics interface with external systems like Oracle and Teradata.
  • Used SAS/Connect software to divide risk analysis projects into submodules and used macros like %RDCSPAWN to split tasks into multiple parallel subtasks, improving performance. Used Proc Compare to verify the results.
  • Published daily test status reports and defect metrics for the client to review and discuss, and created formulas in QC dashboards to generate specific reporting data per client requirements, e.g., a Total Open Defects summary list.
  • Involved in the development of test plans, test scenarios and test strategies to facilitate the process of testing. Also involved in Analysis, Design, and Functional Specifications to identify Test Requirements, Design Test Cases, Test Scripts, and Test Data with expected output.
  • Reviewed the data mapping document regularly for ETL testing and generated customized queries using SQL to verify the requirements. Generated high level test scenarios for testing each phase and then wrote descriptive test cases for each phase and logged defects in ALM and Jira.
  • Worked with SAS DI (Data Integration) as an ETL tool to integrate and manage data from various sources. Verified data on the source and target sides of the ETL, verified data completeness and transformation rules, tested referential relations and data integrity per the requirement specification documentation, and checked for duplicate records and data errors. Performed QA validation of business rules and transformations, including identifying and deleting duplicates, ensured data was correctly migrated to the data warehouse, and created customized Data Marts. Involved in automating ETL batch jobs and in error handling of SAS and SQL code. Used Type 2 Slowly Changing Dimensions for database and data warehouse migration.
  • Extensively used Excel VLOOKUPs and pivot tables to generate customized client reports and help with immediate firsthand identification of problems.
  • Verified Excel results against results generated from SQL and SAS. Developed, implemented, and facilitated processes for data identification, segregation, and cleansing, and customized results using Excel pivot tables and VLOOKUPs.
  • Ran Unix batch jobs and scheduled reports using cron, updating tables and scheduling reports by timestamp based on minutes, hours, days of the week, days of the month, etc.
  • Created SAS datasets through Unix environment using Unix Shell Scripting.
  • Managed, tracked, and reported efforts and work deliverables, collaborating with data consumers to remediate issues and track data quality risks.
  • Generated tables, listings, and graphs using Proc Means, Proc Freq, Proc Transpose, Proc Report, Proc Tabulate, Proc GCHART, and Proc GPLOT procedures. Results were helpful for generating trend and market analysis. Imported data from various sources including databases, Excel, and CSV, then generated reports in various formats including HTML, PDF, CSV, and RTF using SAS ODS.
  • Created data set specifications or programming specifications in SAS, SQL in accordance with project requirements and good documentation and programming practices.
  • Produced quality customized reports using PROC REPORT and provided descriptive statistics using procedures like Proc Freq, Proc Means, and Proc Univariate.
  • Worked with complex datasets to extract customized reports using PROC SQL, PROC RANK, PROC SORT, PROC REPORT for creating a preferred list of customer reports as per the given requirements from business analysts.
  • Performed root cause analysis, identified patterns, performed trend analysis, and contributed to the implementation of Data Management policies.
  • Involved in data verification, data manipulation and coding activities including the usage of basic and standard SAS procedures and Macro coding. Used SAS Macros coding for handling several repetitive tasks and avoided extensive coding.
  • Good exposure to SAS and PROC SQL, including concatenation, merging, user-defined formats, GROUP BY, joins, ranking, sorting, and removing duplicates.
  • Involved in verification of programming logic by overseeing the preparation of test data, testing and debugging of programs.
  • Generated tables, listings and graphs using PROC MEANS, PROC FREQ, PROC SUMMARY, PROC TRANSPOSE, PROC REPORT, PROC GCHART and PROC GPLOT Procedures.
  • Generated output files in the form of listing, HTML, RTF and PDF formats using SAS ODS.
  • Used SAS Visual Analytics for preparing data in Data Builder and performing various SQL operations there, including creating new SQL data queries to perform joins, add calculated columns, expressions, and aggregation functions, subset and sort data, and re-use SQL subqueries.
  • Performed various data analysis such as forecasting, correlation using SAS Visual Analytics Explorer. Extensively used Visual Analytics Explorer features like Parameters, Rank, Interactions, Display rules to generate customized exploration and data analysis.
  • Used SAS Visual Analytics Designer module for creating reports - inserting objects into reports, using different tables and graphs to display results, used different container objects like vertical, horizontal, prompt containers in reports, using controls to display results, inserting images, using stored process, using gauges to display status of variables, using customer graphs to display results, working with interactions, report links and ranking values in reports.
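
The macro-driven reporting mentioned in the bullets above can be illustrated with a minimal sketch; the macro name, parameters, and data are hypothetical:

```sas
%macro run_report(data=, var=, outfile=);
   /* wrap a PROC REPORT summary in ODS PDF output */
   ods pdf file="&outfile";
   proc report data=&data nowd;
      column region &var;
      define region / group;
      define &var / analysis sum;
   run;
   ods pdf close;
%mend run_report;

%run_report(data=work.sales, var=revenue, outfile=sales_report.pdf)
```

Parameterizing the dataset, analysis variable, and output file this way is what lets one macro replace several near-identical report programs.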

SAS Analyst/Developer

Confidential, San Francisco California

Responsibilities:

  • Worked with Project Managers to design, execute, and document data management and quality assurance procedures for research data collection projects using SAS and other statistical packages.
  • Supported research efforts by reviewing statistical methodologies, recommending methodological best practices and implementing statistical analysis solutions.
  • Involved with Data Migration and ETL testing project using complex SQL queries. Used various SQL queries including Joins, Union, Aggregate functions, Group by, Inline views, transpose for validating data.
  • Using the data mapping document or the requirements document, created detailed test cases for each phase of the ETL process. The test cases checked for the required columns, old versus new changes, data integrity etc. The test cases included detailed description, expected versus actual results comparison etc.
  • Submitted daily status reports to the client using Excel pivot tables and VLOOKUPs. Created Excel pivot tables and charts using worksheet data and external resources, modified pivot tables, sorted items and grouped data, and refreshed and formatted pivot tables.
  • Involved in a Claims Processing System project that provided various reports to management to track how many claims were being processed and what efforts were being made to contain claims costs.
  • Coordinated and facilitated work sessions with Medicare Operations users and IT development to define projects' scope and requirements and manage project expectations.
  • Recommended changes to system design, methods, procedures, policies, and workflows affecting Medicare/Medicaid claims processing, in compliance with government-mandated processes such as HIPAA/EDI formats and ANSI-accredited standards.
  • Tasks also involved analysis and assessment of the current MMIS and EDI claims, documentation of business and technical requirements, preparation of cost analysis, and implementation of a new MMIS automation system.
  • Collaborated with project team members on sample selection, study design, and statistical analysis related to survey methods.
  • Developed datasets and ad hoc reports used for sample analysis using SAS and other statistical packages.
  • Developed and designed methods, procedures, and specifications for collecting, organizing, interpreting, classifying and reporting on complex information for computer input and retrieval, utilizing knowledge of database construction and retrieval methods.
  • Applied a risk-based approach per FDA expectations to determine the level of validation and documentation appropriate for SAS programs.
  • Provided analyses and descriptive reports of project data for management of research projects. Documented, programmed, debugged, and ran a variety of scheduled and ad hoc processes, summaries, statistics, and other electronic reports.
  • Manipulated and managed very large data files in SAS as well as developed SAS Macros for processing, and extensively used proc SQL and Base SAS for analyzing and reporting on trend analysis.
  • Prepared final datasets for project deliverables. Specifically, compiled, cleaned, and ensured the quality of datasets delivered to clients. Documented quality assurance operations performed in final reports and provided basic documentation on datasets.
  • Collaborated with Project Team members and Operational staff to identify and resolve serious technical problems; take the lead in diagnosing problems, including programming, data and procedural errors.
  • Created customized and presentation-quality reports from data analysis results. This includes presenting summary tables and figures.
  • Applied quantitative methods and techniques to manage and analyze research data, as well as perform advanced statistical analysis.
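
A minimal sketch of the descriptive and data-quality checks described above; dataset and variable names are hypothetical:

```sas
/* count missing values before analysis */
proc means data=study.visits n nmiss min max;
   var age bmi sys_bp;
run;

/* detailed distribution of a single variable */
proc univariate data=study.visits;
   var sys_bp;
   histogram sys_bp / normal;
run;
```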

SAS Market Analyst/Developer

Confidential

Responsibilities:

  • Developed and automated daily, weekly, monthly, and seasonal reporting/scorecards using SAS and SQL to monitor the health of the eCommerce business and share business insights with marketing, product development, sales, and finance stakeholders.
  • Used the SURVEYSELECT procedure, along with the SURVEYMEANS and SURVEYFREQ procedures, which provide a variety of methods for probability sampling.
  • Made exceptional use of opinion polls, surveys, questionnaires, demographics, and statistics for collecting relevant market survey data. Used PROC SQL and SAS macros to generate customized reports and avoid repetitive coding.
  • Used Excel pivot tables to manipulate large amounts of data for analysis; the position involved extensive routine operational reporting, ad hoc reporting, and data manipulation to produce routine metrics and dashboards for management.
  • Used predictive modeling analysis for handling probability based complex designs including stratified selection, clustering and unequal weighting.
  • Used sample survey method to obtain information about a large population by selecting and measuring samples from the population. Used imputation methods to take care of missing values from Survey data.
  • Ensured data integrity and testing processes are followed for reporting and research tools. Worked with internal and vendor support teams to report and resolve data discrepancies.
  • Partnered with internal and external Information Technology teams to develop short and long term data analytics infrastructure.
  • Supported data and analytics requests throughout the development cycle including gathering data requirements, sourcing and validating data, analyzing data, building models, synthesizing insights, and presenting results for market research survey data.
  • Supported analysis across multiple large-scale data sources (structured and unstructured) to identify and socialize key facts and insights from within the ecommerce channel.
  • Uncovered insights from exploratory analysis by leveraging data experiments and appropriate measurement tools for market survey data.
  • Applied multivariate statistical tools to help build predictive models, improve customer segmentation, optimize approach to online pricing, and improve elements of the digital marketing mix.
  • Developed, measured, analyzed and reported all key performance indicators for HL’s online sales channels. Developed and executed performance dashboards as well as performed ad hoc and recurring analyses on market survey data.
  • Analyzed the performance and impact of digital marketing and merchandising investments with key e-retailer accounts to determine customer impact and return on investment (ROI) for both online and in-store purchases, using sampling market research strategies and tracking survey data. Proactively identified conversion breakdowns on key e-retailer sites by analyzing clickstream data and purchase funnel metrics; developed and tested new methodologies, technologies, and approaches to find potential conversion improvements through coordinated market survey analysis.
  • Identified business growth opportunities via data driven insights. Developed actionable recommendations and present them to eCommerce and Business Unit leadership.
  • Using SAS Predictive Modeling, created data sources in Enterprise Miner; explored and assessed data sources; built predictive models using regression analysis (linear and logistic), decision trees, and neural networks; used fit statistics for different predictions; used decision processing to adjust for oversampling; and used profit/loss information to assess model performance and compare models for market segmentation, customer behavior analysis, and market research survey data.
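
The stratified probability sampling described above can be sketched with PROC SURVEYSELECT; the dataset, strata variable, and sampling rate are illustrative assumptions:

```sas
/* draw a 10% simple random sample within each region stratum */
proc surveyselect data=work.responses
                  method=srs
                  samprate=0.10
                  seed=1234
                  out=work.sample;
   strata region;
run;

/* design-based summary of the sampled data */
proc surveymeans data=work.sample;
   var spend;
run;
```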
