Sr Data Modeler Resume
Plano, TX
SUMMARY
- Over 10 years of total IT experience in Data Warehousing, Data Modeling, Business Intelligence and Clinical Data Management, with emphasis on business requirements analysis, application design, development, testing, implementation and maintenance of client/server Data Warehouse and Data Mart systems in the Financial, Banking, Media, Health Insurance, Retail, Telecom and Pharmaceutical industries.
- Expertise in SAS with focus on SAS/Base, SAS/Macros, SAS/Stat, SAS/Access, SAS/Connect, SAS/Graph, SAS/ODS, SAS/Share, SAS/EIS, SAS Enterprise Guide, SAS Management Console, SAS/Enterprise Miner, SAS DI Studio, SAS Information Map Studio, SAS Web Report Studio, SAS Information Delivery Portal and SAS Data Quality (DQ).
- Experience with SAS BI tools, Informatica, Teradata, SQL Server, Oracle and UNIX.
- Expertise in JMP, IBM SPSS and Rapid Miner.
- Strong knowledge in developing, designing and implementing Predictive Analytics models using neural networks and logistic regression.
- Specialization in SAS programming, Predictive Analytics and Database Marketing.
- Experience in OLTP modeling and OLAP dimensional modeling (Star and Snowflake schemas) using ERwin (conceptual, logical and physical data models).
- Implemented projects with BASE/SAS, SAS Macros, SAS/STAT, PL/SQL, SAS/GRAPH, OLAP, PROC and DATA steps, FORMAT and ODS in Windows environments involving ETL processes.
- Extensively involved in the creation of SAS datasets, reports, listings, graphs, tables and ad-hoc reports, dealing with large datasets according to the Standard Operating Procedures (SOPs) and reviewing CRFs.
- Extensive experience in using Business Intelligence (BI) applications such as SAS OLAP Cube Studio, SAS Web Report Studio, SAS Information Map Studio, SAS Add-In for MS Office and SAS Enterprise Guide (EG) for creating, editing and managing reports in live projects.
- Strong analytical, modeling, communication and presentation skills.
- Involved in the complete Software Development Life Cycle (SDLC) in large data warehouse environments.
- Experience in software development methodologies such as Agile.
- Experience identifying, researching and resolving ETL production root-cause issues; experience in maintenance, enhancements and performance tuning of ETL code.
- Strong expertise in SAS Base procedures, reporting and summary procedures.
- Experience in using SAS to read, write, import and export other data file formats, including spreadsheets, Microsoft Excel and Access tables.
- Extensively worked on SAS views and SAS indexes to reduce I/O bottlenecks and processing time for large volumes of data.
- Expertise in using SAS MP CONNECT (multiprocessing) to reduce report-generation and application execution time.
- Highly creative and intuitive in developing, designing and debugging SAS programs and SAS macros to extract, modify, merge and analyze financial, sales and demographic data, and to generate output of data analysis in the form of files, tables, listings, graphs and reports.
- Worked on the ARIMA procedure for forecasting applications.
- Highly proficient in dealing with missing values, duplication, dates, validation and other data cleansing issues using SAS tools.
- Experience in databases such as Oracle 8i/9i/10g/11g/12c, SQL Server 2000/2005/2008, DB2 and Teradata V2R6, and in writing SQL, T-SQL and PL/SQL packages, procedures, functions, triggers and cursors as per the business needs.
- Heavily interacted with the business users to gather requirements, design tables, standardize interfaces for receiving data from multiple operational sources, and develop strategies for loading the staging area and data marts.
- Extensively used tools such as MS Visio, Quick Base, PVCS, FileZilla, Toad, SQL Developer and HP OpenView.
- Quick learner with the ability to understand job requirements and employ new ideas, concepts, methods and technologies.
TECHNICAL SKILLS
SAS Skills: SAS Base, SAS/Macro, SAS/SQL, SAS/STAT, SAS/Access, SAS/Graph, SAS/Enterprise Miner, SAS/Connect, SAS/Enterprise Guide, and SAS/ODS.
Programming Languages: SQL, PL/SQL, HTML, Visual Basic, C, Java J2EE, Java Script, C#, ASP .NET
Databases: Oracle, Teradata, SQL-Server, Sybase and DB2
Software Packages: Microsoft Office, UML, MS Project 2000, Crystal Reports, Business Objects, Cognos, MicroStrategy, Xcelsius Dashboards and Tableau.
Operating Systems: Windows (PC), Mainframe (JCL) and Linux, AIX, Sun Solaris
PROFESSIONAL EXPERIENCE
Confidential, Plano, TX
Sr Data Modeler
Responsibilities:
- Analysis: data preparation, analyzing and creating variables, development and comparison of multiple segmentation models, profit calculations and optimization of the selected model, and creating and modifying reports (SAS Enterprise Miner, Base SAS, SAS/Macros, SAS/ODS, PROC MEANS/LOGISTIC/IMPORT/EXPORT/GRAPH, SAS BI tools - SAS Add-In, Web Report Studio).
- Involved in preparing Functional Design and Technical Design documents.
- Performed the ETL process to fetch data from Excel files and load it into SAS datasets, and ran DI jobs during month-end activities.
- Utilized MS Excel to perform advanced functions such as VLOOKUP, HLOOKUP, dynamic lookups, pivot tables, filters and macros.
- Involved in using Python Scripting to extract data from database sources.
- Used Tableau Desktop to analyze and obtain insights into large data sets.
- Analyzed, designed and developed Extraction, Transformation and Load (ETL) processes for Data Warehousing projects using SAS Data Integration Studio, Base and Advance SAS.
- Conceived and designed end-to-end Predictive Analytics solutions to support Confidential's business units and initiatives.
- Performed predictive analysis on money transactions and behavior-based attrition modeling.
- Created a high-quality, statistical-grade data set optimized for modeling and analytics, emphasizing data validation, Predictive Analytics, data cleansing and data reduction.
- Assembled and analyzed the modeling database using Predictive Analytics techniques, including data reduction, data enhancement and data transformations.
- Developed a data mart extracting data from different sources such as SQL Server, Oracle and mainframe flat files.
- Coordinated with business and other IT developers to understand requirements and develop optimized solutions.
- Parsed incoming data from files and Oracle tables, transformed and merged the data, and saved it to Oracle tables.
- Extracted data from databases using SAS/Access and SAS SQL procedures, and created SAS data sets.
- Performed data analysis and created reports with extensive use of Base SAS, macros and default SAS reporting procedures.
- Created new programs and modified existing programs.
- Created reports using SAS/ODS and SAS views from SQL Server database tables in formats such as HTML and RTF.
- Created reports using PROC REPORT, PROC TABULATE, ODS statements and PROC TEMPLATE to generate output in HTML, and generated Crystal Reports using Business Objects.
- Generated HTML, PDF and other reports using SAS/Enterprise Guide and SAS Web Report Studio.
- Restructured SAS datasets using PROC TRANSPOSE procedure.
- Applied SAS macros to automate the code so that reports are generated on invocation of the macro, which saved time and reduced manual intervention.
- Involved in source analysis as part of decommissioning databases.
- Developed UNIX Korn Shell scripts and Crontab to automate recurring processes.
Environment: Python, SAS/Base, SAS/Macro, SAS/Access, SAS/STAT, SAS/ODS, SAS/GRAPH, SAS 9.1.3, SAS Views, SAS Enterprise Guide, SAS Enterprise Miner, SAS Web Report Studio, SAS Information Delivery Portal, Business Objects, DTS, Excel, Windows XP, UNIX.
Confidential, Denver, CO
Sr Data Modeler
Responsibilities:
- Involved in construction of the ETL process: extracted data using SAS/Access and SAS SQL procedures, created SAS data sets, transformed them according to the business requirements, and loaded the data into the appropriate data sets.
- Wrote SAS macros to improve program efficiency and streamline Predictive Analytics applications.
- Provided expertise and developed staff workshops related to statistical analysis and Predictive Analytics.
- Developed SAS solutions for retail and wholesale sales Predictive Analytics.
- Extracted data from Teradata using SAS/Access.
- Wrote Macros to develop a concise and reusable code.
- Converted large volumes of data from Teradata to SAS data sets using the SQL pass-through and LIBNAME facilities.
- Performed data cleansing and data manipulation using DATA steps and procedures.
- Extensively used PROC PRINT, PROC REPORT, PROC GCHART, PROC TABULATE, PROC MEANS, PROC FREQ, PROC SQL and PROC DATASETS.
- Maintained and enhanced existing SAS reporting programs.
- Performed pre-validation and post-validation checks on the weekly process.
- Analyzed and implemented code and table changes to improve performance and enhanced data quality of SAS programs.
- Created different customer segments using PROC RANK.
- Conducted statistical modeling for both continuous and categorical data, including linear regression, nonlinear regression, ANOVA, GLM models, repeated measures, nonparametric regression, classification trees, etc.
- Defined users through SAS Management Console.
- Performed the server bounce process on a weekly basis.
- Used SAS Enterprise Miner to develop neural network, logistic regression and decision tree models.
- Prepared process flow diagrams using MS VISIO.
Environment: SAS 9.1.3, HP Integrated Lights-Out, Teradata 7.2, Toad, Windows Server 2003, UNIX, Base SAS, Macros, SAS/Access, SAS Management Console, MS Visio, SAS Enterprise Miner, SAS/Stat
Confidential, Chicago
Sr. Data Modeler
Responsibilities:
- Understood the legacy system data and generated reports on data anomalies for the EPIC business team.
- Data cleansing and data standardization based on business rules.
- Developed Informatica mappings, sessions, workflows and SQL code.
- Validated data loaded into the target database from source using Informatica mappings.
- Involved in Requirement Analysis, Functional Design and Technical Design.
- Developed mapping specifications from the business requirements and the data model; involved in data modeling and design of the data warehouse.
- Created Informatica mappings to load data using transformations such as Source Qualifier, Sorter, Aggregator, Expression, Joiner, Filter, Sequence Generator, Router, Update Strategy and Lookup.
- Involved in Informatica administrative work such as creating Informatica folders, repositories and managing folder permissions.
- Involved in installation of patches for Informatica PowerCenter.
- Involved in migrating code from Development to Testing and from Testing to Production environments, creating connect strings and synchronizing the connect strings and connection objects across Dev, Test and Prod environments.
- Extensively involved in performance tuning on both the Informatica side and the database side.
- Implemented Type 1 and Type 2 slowly changing dimension (SCD) logic.
- Maintained accuracy; developed Informatica workflows to automate similar types of feeds.
- Modified mappings according to new requirement.
- Performed column profiling to generate basic statistics, such as the number of distinct, duplicate and null values in each column of the customer tables, using Teradata.
- Created summary columns used for analysis.
- Identified data anomalies in all columns of the tables.
- Performed comparative analysis of the tables present in customer-related databases.
- Extensively used functions for data conversion.
- Extensively used SQL to address data quality issues.
- Used Teradata utilities to load data into different tables.
Environment: SAS Enterprise Miner, Informatica 7.1.2, Oracle 9i, NCR Teradata 6.0, SQL, Retail, MS Excel, UNIX, Windows XP, Toad
Confidential, NY
SAS Data Modeler
Responsibilities:
- Business user support: involved in supporting business users in generating new reports or modifying existing reports to meet their goals.
- Extensively involved in data extraction, transformation and loading of data from data marts and the data warehouse.
- Converted large volumes of data from Oracle to SAS data sets using the SQL pass-through facility.
- Prepared flowcharts indicating the input data sets and the sorting and merging techniques, and wrote SAS code accordingly.
- Analyzed descriptive statistics and demographic data using PROC UNIVARIATE, PROC FREQ and PROC TABULATE.
- Developed and executed SAS SQL queries for merging, concatenating, and updating large volumes of data.
- Worked on creating SAS data sets from flat files and merging data.
- Extensively used PROC PRINT, PROC REPORT, PROC GCHART, PROC TABULATE, PROC MEANS, PROC FREQ, PROC UNIVARIATE, PROC SUMMARY and PROC DATASETS.
- Wrote queries on the existing Oracle database to provide ad-hoc reports using SAS/SQL.
Environment: SAS Enterprise Miner, UNIX, Windows XP, Oracle 9i, SAS 8.2, Base SAS, Macros, SAS/Access, Banking & Finance
Confidential
SAS Clinical Data Analyst
Responsibilities:
- Worked with the CDM group and statisticians to clean data, categorize data, code free-text data and process data for analysis.
- Used SAS/Access to import and export data to and from MS applications.
- Prepared data using data management methods such as IF/ELSE statements, DO grouping, SELECT, WHERE statements, ARRAYs and SAS functions.
- Imported ASCII text and RDBMS data using SAS PROC IMPORT and LIBNAME.
- Employed techniques such as sorting and merging on the raw datasets and coded them using PROC SQL and the SAS macro facility to get the required output.
- Used SAS macros and procedures such as PROC SQL, TRANSPOSE, TABULATE, UNIVARIATE, MEANS and FREQ to create summarized tables for reporting.
- Generated listing and PDF reports presenting the findings of various statistical analysis summaries with SAS/ODS.
- Used DATA step programming to permanently assign labels to data.
- Wrote Excel macros using VBA.
- Documented changes and modifications.
- Used SAS/GRAPH to create various types of graphs for analysis and submission.
Environment: BASE SAS, SAS/REPORT, SAS/STAT, Oracle, MS Excel, VBA, SAS/SQL, SAS/ODS, Windows XP, UNIX.