SAS Data Analyst Resume Profile
Summary
- Joseph has 8 years of total experience in design, development, and implementation using SAS technologies.
- Good knowledge of Data Management Studio, DataFlux transformations, and deduplication techniques.
- Good knowledge of reporting, Business Intelligence, SAS Data Integration, and Data Quality using DataFlux.
- Experience integrating DataFlux jobs into Data Integration Studio.
- Experience with the LSF scheduler, creating flows in SAS Management Console, and monitoring them in Process Manager.
- Experience in Risk Analytics and Database Marketing.
- Developed and performance-tuned ETL jobs.
- Good experience with SAS technology in the telecom, banking, stock market, and life sciences domains.
- Interacted with marketing and business teams for requirement gathering, planning, and implementation.
- Experience with benchmarking activities.
- Experience working with large datasets in the terabyte range.
- Experience in campaign management and Analytical Base Table (ABT) creation for churn models.
- Good knowledge of pass-through concepts in SAS/ACCESS to databases.
- Experience in automation projects using Base SAS and macros.
- Good knowledge of the Loop transformation for parallel execution.
- Experience working in a technical support role with SAS India.
- Experience installing the USPS, DPV, and LACS databases on the DataFlux server.
- Experience with MS Office tools such as MS Excel.
Technical Competencies
- Languages: SAS, SQL
- Operating Systems: Linux, UNIX, Mainframes, Windows
- Databases: Oracle 11g, Netezza, SQL Server
- SAS DI/BI Tools: Data Integration Studio 4.2, DataFlux v8, Data Management Studio 2.4, Web Report Studio 4.2, Information Maps Studio 4.2, SAS OLAP Cube Studio 2.4, Enterprise Guide, SAS Management Console
- Design, Version Control, and Documentation Tools: SharePoint
Work Experience
Confidential
Role: DataFlux Developer
Roles & Responsibilities:
- Involved in requirement gathering, analysis, and design.
- Developed profiling jobs for data analysis.
- Developed reusable cleansing services that can be used across various projects.
- Produced CASS-certified addresses along with geocoding and area-code lookups.
- Developed batch jobs and services for sales representatives to standardize, cleanse, match, and dedupe leads for Salesforce.
- Installed and configured DataFlux, the Quality Knowledge Base, and the USPS and CASS databases on the DataFlux server.
- Produced high-level and detailed designs for the project.
- Updated configuration files for better performance.
- Developed data and process jobs.
- Technologies Used: Data Management Studio 2.4, Linux
Confidential
Role: Sr. SAS Developer
Roles & Responsibilities:
- Designed and developed Data Integration jobs for parallel execution.
- Developed pass-through jobs for better performance.
- Automated monitoring activities using Base SAS and email options.
- Designed and scheduled flows in SAS Management Console.
- Monitored workflows in the LSF scheduler.
- Extracted prepaid campaign data, identified the target population, and applied benefits to customers per the campaign design across all circles.
- Developed UNIX scripts for migration and monitoring.
- Designed and developed the control-base population for the entire telecom base.
- Participated in UAT activities for frequent benefit passing.
- Developed WRS reports and dashboards.
- Technologies Used: Data Integration Studio 4.2, Web Report Studio 4.2, Information Maps Studio 4.2, SAS OLAP Cube Studio 2.4, SAS BI Dashboard, Enterprise Guide, SAS Management Console, UNIX, Oracle 11g
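The monitoring automation mentioned above can be sketched with the Base SAS EMAIL filename engine (a minimal sketch only; the address, subject, and the `work.job_status` dataset are illustrative placeholders, not from the original project):

```sas
/* Hypothetical sketch: email a daily job-status summary from Base SAS.
   Recipient, subject, and the status dataset are placeholders. */
filename outmail email
    to='ops-team@example.com'
    subject='Daily flow status';

data _null_;
    set work.job_status;        /* assumed columns: job_name, status */
    file outmail;
    put job_name $32. +2 status $12.;
run;
```

In practice a scheduler (e.g. LSF) would run such a program after each flow, so failures surface in the team's inbox without manual checks.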
Confidential
Role: Sr. SAS Developer
Roles & Responsibilities:
- Built DI jobs to track the daily load and validate the results.
- Loaded data from dimension and fact tables.
- Performance-tested transformations for optimization.
- Used SQL pass-through for the SQL tables to reduce run time.
- Used bulk loading for better performance.
- Implemented error and exception handling.
- Built profile and architect jobs in DataFlux.
- Ran DataFlux jobs in DI Studio, performing prioritization and deduplication.
- Monitored jobs on the Data Integration Server.
- Technologies Used: Data Integration Studio v4.2, DataFlux, Enterprise Guide, SQL Server
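The SQL pass-through technique referenced above pushes the join down to the database instead of pulling both tables into SAS. A minimal sketch of explicit pass-through via SAS/ACCESS to ODBC follows; the DSN, table, and column names are assumptions for illustration:

```sas
/* Hypothetical sketch: explicit pass-through so the join executes
   inside SQL Server. Connection details and names are placeholders. */
proc sql;
    connect to odbc (dsn='SQLSRV');   /* credentials elided */
    create table work.daily_load as
    select * from connection to odbc
        ( select f.cust_id, f.amount, d.region
          from   fact_sales f
          join   dim_customer d on f.cust_id = d.cust_id );
    disconnect from odbc;
quit;
```

Only the joined result crosses the network, which is the usual reason pass-through "optimizes the time" on large fact tables.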
Confidential
Role: Sr. SAS Developer
Roles & Responsibilities:
- Created an Analytical Base Table (ABT) for identifying the churning population.
- Produced month-on-month and week-on-week analyses for prepaid customers.
- Developed segments to help identify churn.
- Analyzed trends in the data.
- Performed daily loads from various data sources.
- Created summarized reports for analysis.
- Automated code to avoid redundancy and save time.
- Technologies Used: Enterprise Guide, Data Integration Studio v4.2, Oracle 10g
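ABT creation of the kind described above is typically parameterized by period so the same code serves every month. A minimal sketch of such a macro (the source table `usage_detail` and its columns are hypothetical):

```sas
/* Hypothetical sketch: build one month's slice of a churn ABT.
   Table and variable names are placeholders. */
%macro build_abt(yyyymm);
    proc sql;
        create table work.abt_&yyyymm as
        select cust_id,
               sum(recharge_amt) as tot_recharge,
               count(call_id)    as call_cnt
        from   usage_detail
        where  month_key = "&yyyymm"
        group by cust_id;
    quit;
%mend build_abt;

%build_abt(201401)
```

Looping the macro over a list of months is what turns a one-off extract into the automated, redundancy-free code the bullet describes.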
Confidential
Role: DataFlux Developer
Roles & Responsibilities:
- Profiled, cleansed, and standardized the data.
- Identified suitable data types and definitions for the fields.
- Generated match codes and identified the appropriate sensitivity.
- Clustered the data for deduplication.
- Tuned jobs for better performance.
- Used dfPower Customize to analyze the matching algorithm.
- Technologies Used: DataFlux, SQL Server
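Match-code generation and clustering happen inside DataFlux, but the cluster-and-keep-one-survivor pattern has a rough Base SAS analogue, sketched below under the assumption that match codes have already been exported to a `leads` dataset (all names are placeholders):

```sas
/* Hypothetical sketch: keep one survivor per match-code cluster,
   preferring the most recently updated record. */
proc sort data=work.leads out=work.leads_sorted;
    by match_code descending last_updated;
run;

data work.leads_dedup;
    set work.leads_sorted;
    by match_code;
    if first.match_code;   /* first record in each cluster survives */
run;
```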
Confidential
Role: Sr. SAS Developer
Roles & Responsibilities:
- Provided solutions for various customer tracks on Business Intelligence, Data Integration, Data Quality, and installation.
- Created plan files and installed SAS products on clients and servers for single- and multi-machine architectures.
- Worked on clustering for the SAS web applications.
- Provided demos of SAS product features and technical capabilities to clients.
- Worked on migration issues in Enterprise Guide and WRS.
- Worked on customer tracks for Base SAS, Data Integration Studio 4.2, dfPower Studio v8, Enterprise Guide, Web Report Studio, Information Maps Studio, SAS OLAP Cube Studio, and SAS Management Console.
- Technologies Used: SAS, Data Integration Studio 4.2, dfPower Studio v8, Enterprise Guide, SAS BI Dashboard, Web Report Studio, Information Maps Studio, SAS OLAP Cube Studio, SAS Management Console
Confidential
Role: SAS Developer
Roles & Responsibilities:
- Involved in planning and requirement gathering.
- Supported benchmarking activities.
- Developed the ETL strategy.
- Developed the data model.
- Performance-tuned the ETL process for speed and resource optimization.
- Profiled, cleansed, and standardized the data.
- Identified the fields needed for better results.
- Performed validation, gender analysis, and organization identification.
- Generated match codes.
- Clustered the data for deduplication.
- Technologies Used: SAS, Data Integration Studio 4.2, dfPower Studio v8, Netezza, Solaris
Confidential
Role: SAS Analyst
Roles & Responsibilities:
- Planned and prioritized campaigns.
- Developed summary reports on the segments to audit campaign volumes.
- Tested the performance of customer and business strategies and provided SAS reports to the business.
- Audited test campaigns against production.
- Gave lock-and-load approval for campaigns to go live in production.
- Created waterfall reports for different channels.
- Automated and audited reports for the business to minimize time and effort.
- Performed quality checks for campaigns.
- Technologies Used: SAS, Mainframes
Confidential
Role: SAS Data Analyst
Roles & Responsibilities:
- Involved in analyzing the clinical data structure based on the clinical designs applied to conduct the trial.
- Standardized the clinical data based on reporting requirements.
- Developed listing and summary tables.
- Performed quality checks on reports to catch undesired output.
- Developed macros for creating validation reports to minimize errors.
- Created the Business Continuity Plan, Defect Tracker, Issue Tracker, and Peer Review documents for the project.
- Technologies Used: SAS, UNIX, Windows
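The validation macros mentioned above are commonly built around PROC COMPARE, checking production output against an independently programmed QC dataset. A minimal sketch (library, dataset, and ID names are illustrative assumptions):

```sas
/* Hypothetical sketch: double-programming validation via PROC COMPARE.
   Libraries, datasets, and the ID variable are placeholders. */
%macro validate(prod=, qc=, id=);
    proc compare base=&prod compare=&qc criterion=0.00001;
        id &id;
    run;
%mend validate;

%validate(prod=final.ae_summary, qc=qc.ae_summary, id=usubjid)
```

Any mismatch printed by PROC COMPARE flags a discrepancy between the two programs before the listing or summary table is delivered.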