
Sr. SAS Administrator Resume Profile


Jacksonville, FL

Objective

SAS Certified Professional with over fifteen years of IT industry experience spanning the complete project management life cycle, database/ETL management, data quality management, production support and the software development life cycle.

Work Experience

Summary of Skills

Application Management Skills

  • Application development, report development, troubleshooting, estimation, architecture, design documentation, testing, promotion to higher environments, enhancements and production support using SAS BI tools.
  • Application integration testing, training, data conversion and functional enhancements.
  • Definition of business requirements, workflow analysis, and application development.
  • Preparation of project documentation, test plans, test cases, incident reports and sign-off documents.

Database Management Skills

  • Database administration and Datasets development using SAS technology.
  • Extraction, Transformation and Loading (ETL) using advanced SQL, PROC SQL, Base SAS and macros.
  • Combined expertise in development of effective data infrastructure, ETL processing, data warehouse, data modeling, and business intelligence reporting and dashboards.
  • Knowledge of IBM's Netezza and Composite tool.
  • SAS Certified Base Professional with hands-on experience of database technology including SAS 8, 9.1.3, 9.2, 9.3 and 9.4.
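The ETL work described above follows the standard PROC SQL extract-transform-load pattern. A minimal sketch (the target library path and cleaning rules here are hypothetical; the source is a SAS sample table):

```sas
/* Minimal PROC SQL ETL sketch -- library path and rules are illustrative */
libname tgt '/data/warehouse';          /* hypothetical target library */

proc sql;
  create table tgt.class_clean as
  select upcase(name) as name,          /* transform: standardize case */
         age,
         height
  from sashelp.class                    /* extract from a sample source */
  where age is not null;                /* load only complete rows */
quit;
```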

Production Management Skills

  • Production environment management providing high availability and appropriate database troubleshooting.
  • Close coordination with development, systems and business units to implement new enhancements.
  • Preparation and maintenance of well-documented systems and conceptual design documents.

Data Quality Solution Skills

  • Developed complete project plans covering scope and schedule for data quality solutions, data problem discovery, data analysis, and development of both working and production prototypes.
  • Architected, designed and developed data quality solutions using DataFlux Data Management Studio 2.2 and 2.3, DataFlux Web Studio and Data Management Server.
  • Presented in-action batch and real-time DataFlux jobs for profiling, standardization, integration, enrichment and data quality methodologies to upper management and application teams.
  • Installation and configuration of DataFlux Data Management Studio along with the Quality Knowledge Base (QKB) and Data Packs.

Technical Skills

  • Languages: Base SAS, SQL, PL/SQL, Pro*C, C, Java
  • Administration: SAS 9.1.3, 9.2, 9.3 and 9.4
  • Databases: SAS, SPDS, Netezza, SASPS, Teradata, Oracle RDBMS (11g, 10g, 9i, 8i, 8 and 7.x), SQL Server 2005
  • Development Tools: SAS Base v8 and v9, SAS Enterprise Guide 4.3, SAS DI Studio 4.2, SAS Management Console 4.2, SAS Information Map Studio 4.2, SAS OLAP Cube Studio 4.2, SAS Web Report Studio 4.3, SAS Information Delivery Portal 4.3, SAS BI Dashboard 4.3, SAS SQL, SAS Macros, Composite, TOAD, Oracle Workflow, Enterprise Manager, Oracle Designer 6.0, SQL Navigator, DBArtisan, SQL*Plus, SQL*Loader, Visio and JDeveloper
  • Data Quality: DataFlux Data Management Studio 2.2 and 2.3, DataFlux Expression Language 2.3, DataFlux Web Studio 2.3, DataFlux Data Management Server 2.3, DataFlux dfPower Studio 8.0 and Master Data Management (MDM)
  • Architecture: SAS BI Platform 9.1, 9.2, 9.3 and 9.4 on SAS GRID, Client/Server, J2EE and Oracle Application Server 10g
  • Front End: Oracle Forms 4.5, 5.0, 6i, 9i and 10g, Oracle Reports 2.5, 3 and 6i, and Visual Basic 6.0
  • Integration: Oracle Data Integrator (ODI)
  • Operating Systems: Windows 98/2000/XP/Vista/7, UNIX (HP-UX, AIX, Solaris), Linux
  • Others: Microsoft Office (Word, Excel, Access, PowerPoint, Project) and WinSCP

Sr. SAS Administrator - SAS GRID Platform

Confidential

Provided SAS administration, production and development support for Bank of America's risk management platform, covering data definition, connections, promotion, implementation, configuration, troubleshooting and general maintenance of the SAS EBI and SPDS environment.

Responsibilities:

  • Administered and managed the GRID platform using SAS Management Console, SAS Deployment Manager, SAS Web Administration Console, the SAS Metadata Server and application server configuration. Tasks included, but were not limited to: updating the SID file for license renewals, changing host names, rebuilding web applications, updating passwords, uninstalling SAS software from local machines, applying downloaded hot fixes to SAS software, monitoring SAS web applications through the web-based interface, managing metadata repositories through the repository manager, clearing the credentials cache, working in Server Manager and User Manager, and establishing connections using odbc.ini and SPDS Manager.
  • Administered, controlled and managed the SAS GRID environment using EGOSH service commands in PuTTY sessions and the RTM web application.
  • Developed Platform administration and access management related documents for the SAS Admin team.
  • Successfully performed SPDS server and security maintenance on a regular basis for stable and efficient SPDS platform.
  • Performed SAS GRID administration tasks of first priority setup, standard setup and optional setup.
  • Worked on Starting, Stopping, and Checking the Status of Servers on GRID platform.
  • Worked on Monitoring the Activity of SAS Servers and Administering Logging for SAS Servers using SAS Management Console, Server Performance Counters and Information Fields and SAS OLAP Server Monitor.
  • Configured and executed Job Statistics Reports for SAS Data Integration Studio.
  • Worked on best practices for backing up and restoring SAS content: using operating system commands to back up the metadata server, and using the Export SAS Package wizard to back up specific SAS folders. Performed SAS Metadata Server backup tasks.
  • Managed metadata server memory by setting the server's MEMSIZE parameter and balancing input and output.
  • Worked on SAS Metadata Repositories and Folders for Copying, Promoting, Importing, Exporting, and Analyzing Metadata, Moving a Metadata Repository to a New Location on the Same Metadata Server and Registering a Metadata Repository.
  • Used Promotion Tools - Export SAS Package, Import SAS Package Wizards and Batch Export / Import Tools to promote individual metadata objects / groups of objects from one metadata server to another, from one location to another on the same metadata server. Also promoted the physical files that are associated with the metadata.
  • Used the SAS Deployment Manager to Update Host Name References. Also, worked on Troubleshooting the Update Host Name References Tool.
  • Performed SAS 9.3 to SAS 9.3 migration from the Test environment to Prod using the SAS Migration Utility. Worked on designing the migration, performing pre-migration tasks, running the SAS Deployment Wizard to migrate SAS content, completing manual migration steps, and validating and maintaining the new environment.
  • Coded SQL / PROC SQL programs related to platform support using SAS Enterprise Guide.
  • Built, deployed and scheduled LSF Process Manager jobs for SAS flows for production processing.
  • Developed Extract, Transform, Load (ETL) processes related to code promotion and platform administration tasks using Base SAS, SAS Data Integration Studio and SAS Enterprise Guide.
  • Designed data warehouse connection techniques to connect to heterogeneous data sources. Connections were made from SAS to SQL, Oracle, SPDS and Teradata.
  • Provided analysis and coding support for fixing failing SAS and SQL programs on processing high volumes of datasets and SPDS tables.
  • Analyzed and recommended changes to make OLAP cubes efficient to report large volumes of data, hierarchical drill downs, and slice and dice capabilities.
  • Managed SAS software depot and platform licenses.
  • Provided 24/7 support for monthly production processing for data loading as Business Consumable Objects (BCOs).
  • Working knowledge of installing Netezza tools on SAS platform servers.
  • Knowledge of the IBM Netezza tool for designing high-performance data warehouse applications for use in enterprise data warehousing, business intelligence, predictive analytics and business continuity planning.
  • Knowledge of using the Composite Data platform to integrate data from multiple, disparate sources across the extended enterprise into a unified, logical, virtualized data layer for consumption by front-end business solutions including portals, reports and applications.
  • Worked on Autosys jobs and Shell scripts which executes SAS Macros and Programs.
  • Successfully handled work requests related to server access, user access, data access, troubleshooting and performance issues and user profile configuration.
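Connecting SAS to a heterogeneous source such as Oracle, as described above, is typically done with the SQL pass-through facility of SAS/ACCESS. A sketch (user, password, path and the table/column names are all hypothetical):

```sas
/* SQL pass-through sketch -- connection parameters are hypothetical */
proc sql;
  connect to oracle (user=riskusr password=XXXX path=orapath);
  create table work.balances as
  select * from connection to oracle
    ( select acct_id, balance
      from risk.accounts );           /* this query executes inside Oracle */
  disconnect from oracle;
quit;
```

Pass-through pushes the inner query down to the database engine, so only the result set crosses the network into SAS.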

Environment: SAS Platform Administration GRID 9.2, 9.3 and 9.4, SAS Enterprise Guide 4.3, SAS GRID Platform 9.2/9.3/9.4, SAS BASE, SAS Macros, SQL, SAS EBI, SAS Web Report Studio 4.2, SAS DI Studio 4.2, SAS OLAP Cube 4.2, SAS Information Map 4.2, SAS BI Dashboard 4.2, SAS Information Delivery Portal 4.2, LSF Process Manager 7.1, IBM Netezza, Composite Data platform, Autosys, shell scripts, Windows and Linux operating systems.

Sr. SAS / DataFlux Consultant

Confidential

As the primary resource for SAS and DataFlux technology, contributed to architecting and building DCPS Master Data Management, master reference data and BI reports. The incoming data from a SQL data source was subjected to DataFlux data quality methodologies to produce cleaner and more meaningful SAS BI reports.

Responsibilities:

  • Present the end products, with Q&A sessions, for data warehouse datasets, Web Report Studio reports, information maps, OLAP cubes and the Information Delivery Portal to the users.
  • Develop unit/system test cases and perform Unit/system testing on SAS datasets and reports.
  • Deliver SAS code, reports and testing results on time as per the schedule.
  • Develop data profile reports for subject areas which identify bad data along with analysis notes. This document is used for Data Quality methodology implementation.
  • Use the DataFlux DB Viewer to view, analyze and query source data. Used the SQL query builder to build and write SQL code executed against the data pulled into the DataFlux DB Viewer.
  • Data quality and cleansing initiative efforts involved architecting and creating profile jobs to produce profile reports using basic statistics, frequency distribution and pattern frequency distribution analysis.
  • Conduct inventory of data sources and build a data dictionary around subject areas for the Data Quality initiative as a baseline for DataFlux jobs.
  • Document business rules for transformation logic to be used in creating the data warehouse and analytical reports. Create the Mapping Specification Document, which maps the source data to target datasets/reports and contains the detailed transformation/code logic used to transform source data.
  • Gather detailed requirements and conduct requirement analysis through walkthroughs and interviews with business users and SQL DBAs. Create requirement documents for building the data warehouse and data marts for analytical reporting and dashboard needs.
  • Develop SAS code, macros and PROC SQL extensively to perform Extraction, Transformation and Loading (ETL) related to data warehouse and reporting tasks using the SAS BI tools Enterprise Guide, Base SAS and DI Studio.
  • Develop efficient SAS code to perform high-volume data crunching and transformation quickly.
  • Create and manage metadata for source and target data using SAS Management Console (SMC). Enable libraries for consumption by EBI client applications.
  • Create source tables, pre-filters/filters, formats and info maps on the fly for reports using Information Map Studio.
  • Write procedures and functions to perform ETL Extraction, transformation and loading of data to create temporary and permanent SAS datasets.
  • Develop ad-hoc report, Data sources and Data summarization - functionality to be used by business using Web Report Studio, DI Studio and OLAP cubes.
  • Concatenate and merge large volumes of data using SET and MERGE functionality.
  • Use Proc SQL and SQL commands to process the data faster wherever applicable.
  • Develop SAS programs for data cleaning, validation and analysis for ad-hoc reports. Test and debug ETL code, macros and report code.
  • Define and implement SAS coding standards/specifications to read and understand the code easily by other SAS users.
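The SET/MERGE concatenation and match-merge work above follows the standard Base SAS pattern; a sketch with hypothetical dataset and key names:

```sas
/* Concatenate monthly extracts, then match-merge with reference data.
   All dataset names and the custid key are illustrative. */
data work.all_months;
  set work.jan work.feb work.mar;    /* SET concatenates datasets */
run;

proc sort data=work.all_months; by custid; run;
proc sort data=work.reference;  by custid; run;

data work.enriched;
  merge work.all_months (in=a)
        work.reference  (in=b);      /* MERGE joins on the BY key */
  by custid;
  if a;                              /* keep all transaction rows */
run;
```

The `in=` flags make the merge behave like a left join, keeping every row from the transaction side.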

Environment: SAS Platform 9.3, SAS BASE, SAS Macros, SAS SQL, SAS EBI, SAS Enterprise Guide 4.3, SAS Web Report Studio 4.3, SAS DI Studio 4.3, SAS OLAP Cube 4.3, SAS OLAP Server 9.3, SAS Information Map 4.31, SAS BI Dashboard 4.3, SAS Information Delivery Portal 4.31, DataFlux dfPower Studio 8.1, SQL Server 2008, Windows and UNIX operating systems.

SAS Team Lead

Confidential

Campaign Measurement Reports:

Primarily responsible for the complete Software Development Life Cycle (SDLC) for building a campaign measurement SAS database for campaign measurement reports estimating dollars spent on different advertisements via different engines such as Google, Yahoo and MSN. The data source/input files used to create the SAS database were obtained from the Teradata team as multiple text files. The source files were extracted using different formats, transformed by applying programming logic and loaded as SAS datasets. The ETL process was automated by using a Visual Basic script to call the SAS programs and scheduling the script through the Windows job scheduler. The SAS report code was developed with dynamic display and processing of dates within the SAS code itself.

Responsibilities:

  • Developed SAS code to perform ETL (Extraction, Transformation and Loading) and process flow of data to create source SAS datasets using Base SAS and SAS DI Studio.
  • Generated list reports using the PRINT and REPORT procedures using Base SAS and Web Report studio.
  • Created summarized business data using OLAP Cubes which are in turn used by SAS programmers and business.
  • Modified variable attributes using options and statements in the DATA step.
  • Used SAS functions to manipulate character data, numeric data and SAS date values.
  • Used FORMATTED, LIST and COLUMN input to read raw data files.
  • Used INFILE statement options to control processing when reading raw data files.
  • Used various components of an INPUT statement to process raw data files, including column and line pointer controls and trailing controls.
  • Created source tables, pre-filters/filters, formats and maps as on-the-fly reports using Information Map Studio.
  • Created VB script with windows job scheduler to automate the data base creation process.
  • Automated several production and ad-hoc SAS Applications. This improved the performance of the current system and minimized the number of Ad-Hoc requests.
  • Created SAS code to generate reports with dynamically processing of the data for recent data sets to create the Campaign measurement reports.
  • Created test execution plan, test cases, test scenarios and test scripts in SAS for Quality Assurance testing.
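Reading the delimited text files from the Teradata team with INFILE/INPUT options, as described above, might look like the following sketch (the file path, delimiter and column names are hypothetical):

```sas
/* Reading a hypothetical engine-spend text file with INFILE options */
data work.spend;
  infile '/data/in/spend.txt' dlm='|' missover firstobs=2; /* skip header row */
  input engine   :$10.          /* list input with a character informat */
        spend_dt :yymmdd10.     /* read the date as a SAS date value */
        dollars;                /* plain numeric field */
  format spend_dt date9. dollars dollar12.2;
run;
```

`missover` prevents the INPUT statement from jumping to the next record when a line has fewer fields than expected.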

Environment: SAS Enterprise Guide, SAS Platform 9.1.3 and 9.2, SAS EBI 9.2, SAS Web Report Studio, SAS DI Studio, SAS OLAP Cube, SAS Information Map, SAS Macros, SAS SQL, Teradata and Windows XP.

Sr. SAS Systems Analyst

Decision Analytics

Primarily responsible for building new applications for the Decision Analytics team; performed enhancements and development work to add/edit functionality of existing reporting applications; made programming recommendations, changes and enhancements to existing SAS programs for efficiency and best practices; implemented data quality solutions to clean enterprise data; and implemented project management methodology for the software development life cycle (SDLC).

Responsibilities:

  • Extensively involved in SAS programming to create SAS data sets including large SAS data steps
  • Compiled Stored SAS Macros, SAS procedures and reusable SAS include programs.
  • Developed new or modified SAS programs to load data from the source, applied required transformations and loaded the transformed data to the target using SAS base, SAS macros, SAS-SQL, SAS functions and SAS procedures.
  • Developed reports using SAS Add-In for Microsoft Excel and created Stored Process and Macros.
  • Developed complex reports using PROC FREQ, PROC MEANS, PROC SUMMARY, PROC REPORT, PROC TABULATE and PROC TRANSPOSE.
  • Handled status updates and Process Change Requests (PCR).
  • Prepared, documented and tested SAS programs which include creation of Conceptual Design Document, Systems Design Document, Test Plan and Test Cases.
  • Developed and summarized reports using SAS OLAP with Multidimensional Expressions (MDX).
  • Created OLAP cubes using SAS OLAP Cube Studio, SAS OLAP Data Provider 9.1, the Source Designer wizard and the Cube Designer wizard.
  • Provided SAS programming technical support to the data modeling group for SAS Base, SAS macros, SAS Enterprise Miner and SAS EG, covering data transformation, data crunching, data mining tasks and regression model functionality.
  • Created reports using BI Server i.e. Web Report Studio, Enterprise Guide and Microsoft office integration.
  • Designed and developed Physical and Logical data view using SAS Information Map Studio.
  • Administered SAS Management Console for the creation of metadata, user profiles, configuration of client applications for SAS server connectivity, repositories creation and management etc.
  • Creation, management and troubleshooting of SAS Metadata Repository. Involved with higher management for Repository planning, access and space allocation.
  • Performed administrative tasks applying to the platform, including starting and stopping servers, checking the status of servers, setting server logging options, administering the SAS Metadata Server and administering SAS Metadata Repositories.
  • Created metadata objects that are used to establish connectivity to data sources and targets. Also performed setting up shared access to SAS data.
  • Designed and Developed reports using SAS BI Server i.e. Enterprise Guide and Web Report Studio.
  • Used the SQL pass-through data access method of SAS/ACCESS to connect to various databases using ODBC (Open Database Connectivity).
  • Applied the data quality solutions of profiling, standardization, data quality and integration using DataFlux.
  • Used DataFlux batch job processing to clean the incoming data to the database.
  • Coded and built the data quality solutions of profiling, standardization, data quality and integration using SAS DataFlux.
  • Coded and created SAS DataFlux schemas to implement customized business rules.
  • Assisted higher management in implementing technical project management methodology using project management plan which includes scoping, scheduling, risk management and quality assurance.
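The summary-report procedures listed above are commonly chained together; a sketch combining PROC MEANS output with PROC REPORT (dataset, class and analysis variable names are hypothetical):

```sas
/* Summarize a hypothetical loans dataset, then lay it out with PROC REPORT */
proc means data=work.loans noprint;
  class region;
  var balance;
  output out=work.sumry mean=avg_bal sum=tot_bal;
run;

proc report data=work.sumry (where=(_type_=1)) nowd; /* keep per-region rows */
  column region avg_bal tot_bal;
  define region  / group    'Region';
  define avg_bal / analysis 'Average Balance' format=dollar12.2;
  define tot_bal / analysis 'Total Balance'   format=dollar14.2;
run;
```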

Environment: SAS V9.1.3 BASE, SAS SQL, SAS Macros, SAS Enterprise Guide 4.1, Enterprise Miner 5.3, SAS EBI, SAS Web Report Studio, SAS DI Studio, SAS OLAP Cube, SAS Information Map, DB2 SQL, Oracle 10g and DataFlux dfPower Studio 8.0 in a Windows/UNIX environment.

Principal SAS Consultant

Project Management

Built a new application to support Fireman's Insurance, which involved reviewing the project method for Application On Call services (P1, P2 and P3), identifying the stakeholders and creating strategic project plans for the successful execution and implementation of the project. Also participated with the client in the conceptual design document, the detailed project execution plan and the detailed design document. Prepared all project management documentation related to system design approvals and sign-off at each task. Scheduled and implemented User Acceptance Testing, and developed the project plan covering risk analysis, troubleshooting, efficient execution, project sign-off and the maintenance plan for the tasks assigned to the execution team.

Environment: SAS/ Base, SAS/Macro, Unix, Mainframe and Windows

  • Responsible for building a project plan, executing it and providing production support for the application migration. In-depth creation of a project execution plan with tasks divided to build production support capability. Designed the project based on progressive elaboration, with stakeholder sign-off on the project execution plan. Project management responsibilities also included documentation of the production support process: quality management, time management for tickets and risk management. Hiring, training and building the team to address technical as well as functional issues.
  • Project focused on adding enhanced functionality to the Price Change Per Enterprise Unit application. Primarily responsible for creating the system design, detailed estimation and project schedule documents. The project was divided into five phases to address sign-off from stakeholders and project managers. This included quality gate checks, User Acceptance Testing approvals, change request justification with a back-out plan, and deployment of the project to production with sign-off from the product designer. Also responsible for building the project execution team on the basis of application area knowledge, standards, functional domain (insurance) and technical knowledge, and general management/interpersonal skills.

Responsibilities:

  • Developed and implemented new application using SAS, Unix and Mainframe.
  • Monitoring and troubleshooting of daily, weekly and monthly jobs.
  • Handling and resolving tickets related to SAS, DB2, Mainframe and UNIX.
  • Handled status updates and Process Change Requests (PCR).
  • Planned and implemented Knowledge Transfer (KT) from Fireman's Fund Insurance Co, Novato, CA, USA to IBM, Bangalore, India.
  • Strategically planned 24/7 support for the client by scheduling the team's availability.
  • Performing enhancement and development work to add/edit application functionality.
  • Successfully handled and resolved P1, P2 and P3 critical tickets.
  • Successfully trained SAS resources in technical competency.

Environment: SAS V8 BASE, SAS SQL, SAS Macros in a Windows/UNIX environment, DB2 SQL, Mainframe (MVS/ESA, JCL, ISPF) and AS/400.

SAS Team Lead

On Site

Production Support Migration

The project involved managing and running the Consumer Legacy run team, which entailed migrating consumer finance production support to India. Performed a feasibility study for the project plan successfully. Defined and documented application area knowledge (functional and technical), standards and regulations. Created the training and migration plan along with milestones, checkpoints and approvals at each phase of training and migration; successfully conducted risk management on the project and established the strategy for project communications management. Established the process for weekly/monthly status updates, reviews and appraisals. Described and planned the process for project quality management, covering quality planning and quality assurance.

Responsibilities:

  • Monitoring and troubleshooting the daily jobs AMDW, ADS, APPS Genesys and QDW1.
  • Monitoring and troubleshooting the weekly jobs Quick Screen, Experian, Equifax, CLNTSERV and Collection Process.
  • Monitoring and troubleshooting the monthly jobs AMDW, Client Data mart, TS 49 data mart, ADS Dashboard, APPS Dash board, TS 49 Dash board, Data mart Billing Promo, Acquisition Data mart, S2K Dash board, APPS Legacy mainframe extract, Edge Extract, Narex Extract, CSDM Loads, History Mon and Monthly Code2.
  • Handled and Resolved tickets related to SAS, Mainframe and UNIX.
  • Handled status update, Business update, Tickets status calls on daily and weekly basis.
  • Planned and scheduled Knowledge Transfer (KT) from Accenture Corporation, Mumbai.
  • Strategically planned 24/7 support for the client by scheduling the team's availability.
  • Successfully decommissioned AMDW.
  • Successfully handled Emergency Response Team (ERT) calls and troubleshot the problems.
  • Handled the FICO extractions.
  • Maintained the daily checklist for (a) minutes of meeting (MOM) and (b) real-time involvement.
  • Successfully assessed technical competency of SAS resources.

Environment: SAS V8 BASE, SAS SQL, SAS Macros in a Windows/UNIX environment, Oracle 7.x/8 (SQL, PL/SQL program units), Mainframe (MVS/ESA, JCL, ISPF) and AS/400.

Sr. Systems Analyst

The project involved analyzing requested campaigns and executing approved campaigns. This included coordinating with Business Partners (BP) and econometricians for the eligibility file of Card Members (CMs), monitoring compliance issues related to company policies and evaluating the business justification for data exception fields.

Responsibilities:

From the pool of analyzed data, the parameters are used to pull the required Card Members' (CMs') information from the mainframe and UNIX datasets/tables. For last-minute suppressions, the CM list is run through any one of:

  • (1) Standard Policy Suppressions, (2) Non-Standard Policy Suppressions and (3) Legal Compliance Mailing.
  • The data pulled is saved in a mainframe file named ICIM01.LOY . .Trigger. This is run through the last-minute suppressions process, creating a Listauth (private information) file named PRDIN.PUBLIC.LISTAUTH. . From this Listauth file, the final file is created after filtering the variables, in the form of a text or Excel file, using SAS DATA steps.
  • The Card Member (CM) information resides in UNIX, where it is updated monthly; the weekly updated information resides on the mainframe. Depending upon the need, the CM information is pulled from tables in UNIX or the mainframe using SAS DATA steps and PROC SQL.
  • Once the request is received from the Business Partner (BP) via the Campaign Management System (CMS) through Lotus Notes, it is checked for business justification of the request and its purpose. The request is thoroughly checked for business compliance as per the different regions and countries, and finally checked for any extra information needed, the eligibility file, or lack of information. For any of the above, the BP is contacted, keeping the managers in the loop. The start of the project is then communicated to the BP.
  • Before delivery, the final file is checked for quality assurance and delivered to the BP for successful, 100% flawless execution of the campaign.
  • Project documentation is done specific to the country, stating a step-by-step explanation of the execution. Once the document is complete, the project is sent for a Quality Assurance (QA) check. After the QA check, the final file, in whatever format the BP requested, is sent using CMS.
  • Closed 80 projects so far with flawless programming execution. QC checks were done successfully on 40 projects, with no programming or logical errors established for the closed projects.
  • Excellent track record: no project has failed to date.
  • Projects executed: Campaign Management System (CMS) for APA (Asia Pacific Australia), JAPA (Japan Asia Pacific Australia), LAC (Latin America Caribbean) and EMEA (Europe, Middle East and Africa).

Environment: SAS V8 BASE, SAS SQL, SAS Macros in a Windows/UNIX environment, SQL/PL-SQL Oracle 8, Mainframe (MVS/ESA, JCL, ISPF), AS/400 and Lotus Notes.

Team Lead Business Intelligence

Involved in a project focusing on planning, execution and control of the complete software development life cycle. This included monitoring and identification of risks, corrective action and updating the risk response plan. Defined and created the change management document; managed changes to the software version, project scope, project schedule and project costs as per the change management document. Also mentored the team to improve performance by developing team cohesiveness, training and motivation for efficient software development and proper time management. Also responsible for coding, execution and management of a team of programmers for data mining, data warehousing and database management, e-commerce applications, N-tier applications and Business Objects.


Description of the Project Profile:

1. Accessing Data:

  • Used FORMATTED, LIST and COLUMN input to read raw data files
  • Used INFILE statement options to control processing when reading raw data files
  • Used various components of an INPUT statement to process raw data files including column and line pointer controls, and trailing controls.
  • Combined SAS data sets using the DATA step

Creating Data Structures:

  • Created temporary and permanent SAS data sets
  • Created and manipulated SAS date values
  • Used DATA Step statements to export data to standard and comma delimited raw data files
  • Controlled which observations and variables in a SAS data set are processed and output.

2 Managing Data:

  • Investigated SAS data libraries using base SAS utility procedures
  • Sorted observations in a SAS data set
  • Conditionally executed SAS statements
  • Used assignment statements in the DATA step
  • Modified variable attributes using options and statements in the DATA step
  • Accumulated sub-totals and totals using DATA step statements
  • Used SAS functions to manipulate character data, numeric data, and SAS date values
  • Used SAS functions to convert character data to numeric and vice versa
  • Processed data using DO LOOPS
  • Processed data using SAS arrays
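The DO-loop and array processing above can be sketched in a few lines (the input dataset and the m1-m12 variable names are hypothetical):

```sas
/* Using an ARRAY and a DO loop to recode twelve monthly columns */
data work.recoded;
  set work.raw;                      /* hypothetical input dataset */
  array mon{12} m1-m12;              /* map the monthly variables */
  do i = 1 to 12;
    if mon{i} = . then mon{i} = 0;   /* replace missing with zero */
  end;
  drop i;                            /* discard the loop index */
run;
```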

3 Generating Reports:

  • Generated list reports using the PRINT and REPORT procedures
  • Generated summary reports and frequency tables using base SAS procedures
  • Enhanced reports through the use of labels, SAS formats, user-defined formats, titles, footnotes and SAS System reporting options
  • Generated HTML reports using ODS statements
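Routing a list report to HTML with ODS, as in the last item above, follows this pattern (the output path is hypothetical; the source is a SAS sample table):

```sas
/* Generating an HTML list report with ODS statements */
ods html file='/tmp/report.html';    /* hypothetical output path */
title 'Sample Listing';
proc print data=sashelp.class noobs label;
  var name age height;
run;
ods html close;                      /* close the destination to finish the file */
```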

4 Handling Errors:

  • Identified and resolved programming logic errors
  • Recognized and corrected syntax errors
  • Examined and resolved data errors
  • Projects Executed (USA): Direct MAC System, RDG Insurance System, Test Tutor, Florida Tourism Information System (FTIS).

Environment: SAS V8 BASE, SAS SQL, SAS Macros, ActiveX, COM, MTS, SQL Server 7.0, Oracle 7.x/8 (SQL, PL/SQL program units), VB 6.0, ASP, HTML, VBScript, JavaScript and JScript in Windows.

Software Programmer


Responsibilities:

Involved in project analysis, pre-sales requirement reporting, prototype design, coding, testing and coordinating post-sale updates. Responsible for systems analysis; designing client/server N-tier applications with service-oriented layering; designing and coding the GUI and back end; analysis and creation of the database, database objects and queries with SQL and PL/SQL program units; analysis and building of business logic and COM/ActiveX components; and testing with version changes.

Projects: Speedy Math 1.0 and Attorney Assistant.

Environment: Oracle 7.x (SQL, PL/SQL program units) with Developer 2000, VB 6.0, ActiveX, COM, MTS, SQL Server 7.0, ASP, HTML, VBScript, JavaScript and JScript in a Windows environment.
