Team Lead Developer Resume
Columbus, OH
PROFESSIONAL SUMMARY:
- Team Lead with 10 years of experience as a Business Analyst/SAS Developer working with Business Intelligence tools on UNIX, Linux and Windows platforms for marketing campaigns in mortgage banking (Super Streamline & FHA loan refinance); experienced in ETL (data extraction, transformation and loading to a data warehouse), data analysis, updating, testing, trending, data cleaning, data aggregation, data mining, mapping analysis and business-driven data validation across various MDM, AML/KYC, CCAR, credit card and classic SSL/FHA campaigns.
- Experience in data analysis and reporting for AML (Anti-Money Laundering), analyzing root causes of AML/KYC compliance risk issues and assisting with daily ad-hoc requests.
- Experience building open and closed accounts for daily AML/KYC compliance programs; assisted in documenting the KYC mapping between the data warehouse and CIS.
- Experience in data analysis and reporting in the health care, finance and marketing sectors. SAS experience includes detailed knowledge of statistical analysis of financial, marketing and health care data and production of reports, tables, graphs and listings.
- Responsible for Master Data Management, data enrichment and maintenance based on compliant master data policies
- Drive compliance to established data management policies, processes and published service level agreements (SLAs)
- Prioritize work area and work on tactical initiatives relating to data records, inter-departmental processes and client related projects
- Prepare and maintain accurate documentation of processes and procedures and support ongoing audit requirements
- Expertise in process automation by scheduling jobs in Control-M.
- Expertise in Big Data analytics: Greenplum, Hadoop HDFS, MapReduce, YARN, Hive, Pig, Sqoop, HBase, Spark and Cloudera Manager
- Hands-on experience in Base SAS 9.4/9.2, SAS/Macros, SAS/SQL, SAS/Assist, SAS/Connect, SAS/STAT, SAS/Access, SAS/GRAPH, SAS/ETS, SAS ETL and SAS/ODS
- Possess excellent skills with PROC SQL, SAS Arrays, SAS Base procedures, summary procedures and especially Data Step programming and reporting
- Extraction and creation of data tables using SAS/ACCESS and SAS/SQL from large-scale databases such as ICDW/Teradata, EDW/DB2, Oracle, Mainframe and Oracle/INFO1
- Expert in using PROC SQL, PROC REPORT, PROC ACCESS, PROC GPLOT, PROC GCHART, PROC FORMAT, PROC TABULATE, DATA step MERGE and PROC SORT to generate ad-hoc and customized reports
- Extensive experience in SAS/STAT procedures such as PROC REG, PROC GLM, PROC FREQ (Chi-Square), PROC MEANS, and PROC UNIVARIATE
- Developed and executed predictive models such as linear regression and logistic regression, including backward elimination and stepwise regression
- Expertise in using SAS Business Intelligence tools such as SAS Enterprise Guide, SAS Stored Processes, SAS OLAP Cube Studio, SAS Information Map Studio, SAS Web Report Studio, SAS BI Dashboard and SAS Information Delivery Portal
- Experience creating dynamic and dependent prompts with SAS BI tools; creating SAS Stored Processes (STPs) via SAS EG and viewing the reports in Web Report Studio and SAS Information Delivery Portal.
- Integrating and managing data mappings from source systems to the SAS analytical data mart using SAS DI Studio (ETL): extracting, sorting, ranking, transforming, designing and developing SQL queries, and implementing DI Studio job flows to load the data.
- Deep Knowledge of probability and statistics for engineering, economics and sciences
- Experience in modeling linear and non-linear regression, testing hypotheses and constructing confidence intervals for statistical values
- Experience with factor analysis, logistic regression, K-means clustering and generalized linear models in both SAS and SPSS
- Modeling and Simulation to understand the behavior of a dynamic system using MATLAB
- Strong leadership skills with ability to work independently
TECHNICAL SKILLS:
SAS Skills: SAS Base, SAS/Macro, SAS/SQL, PL/SQL, SAS/STAT, SAS/Access, SAS/Graph, SAS/ODS, SAS DI Studio, SAS Business Intelligence tools, SAS Stored Processes, SAS OLAP Cube Studio, SAS Information Map Studio, SAS Web Report Studio, SAS BI Dashboard, SAS Information Delivery Portal, SAS Enterprise Miner, SAS Enterprise Guide and Process Manager
Programming Languages: SAS, R, SQL, PL/SQL, HTML, C++, Python, Universal Math solver, Matlab, SPSS and Minitab
Databases: Teradata, DB2, Oracle, Mainframe, Oracle/INFO1 and HBase
Software Packages: Microsoft Office Packages, UML, and MS Project 2000
Operating Systems: Windows, UNIX and Linux; file transfer: FTP, SFTP
Big Data: Hadoop HDFS, MapReduce, YARN, Hive, Pig, Sqoop, HBase, Spark, Cloudera Manager and EMR
Business Intelligence/Analytics: Tableau, Futrix, SAS BI (Web Report Studio)
WORK EXPERIENCE:
Confidential, Columbus, OH
Team Lead Developer
Responsibilities:
- Participate in meetings and projects with business users, project managers and developers to provide monthly, daily and ad-hoc reports and to upgrade or expand applications that accomplish business operations and goals.
- Create ad-hoc reports, including ETL work: extracting, sorting, removing duplicate records, transforming and designing with SQL queries and data sets.
- Develop code based on technical specifications to extract and process targeted account data.
- Validate and test all code for accuracy and efficiency.
- Assist in development of the specification by recommending alternative solutions to technical problems.
- Work with large SAS data sets (up to 900 variables). Pull data from large sources (millions of records) such as Teradata, Oracle or DB2 databases into a SAS environment and perform SAS data manipulations against the resulting sets, including merging data sets, writing SAS macros and generating list outputs/reports.
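A minimal SAS sketch of the pull-and-merge workflow described above; the server, schema, table and column names (tdprod, edw.account_master, campaign_hist, acct_id) are hypothetical placeholders, not from the original projects:

```sas
/* Hypothetical sketch: pull account data from Teradata via
   SQL pass-through, then merge with an existing SAS data set. */
proc sql;
  connect to teradata (user=&td_user password=&td_pass server=tdprod);
  create table work.accounts as
    select * from connection to teradata
      (select acct_id, open_dt, balance
         from edw.account_master
        where open_dt >= date '2015-01-01');
  disconnect from teradata;
quit;

proc sort data=work.accounts;      by acct_id; run;
proc sort data=work.campaign_hist; by acct_id; run;

data work.merged;
  merge work.accounts      (in=a)
        work.campaign_hist (in=b);
  by acct_id;
  if a;  /* keep all pulled accounts, matched or not */
run;
```

The IN= flags on the MERGE statement are what make it easy to control which side's records survive the join.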
- Validate the output by comparing the waterfall, freqs and sample records with the previous run.
- Check the log for errors, warnings and uninitialized variables.
- Perform the Rate Sheet QC and automate the QC validation by creating purpose-built macros.
- Work with marketing and strategy team to approve the audit and new changes.
- Release data to the internal team and send external files to vendors via SFTP.
- Provide programming support of ad-hoc requests and special projects as assigned.
- Support the Business Banking in conducting scheduled compliance monitoring, testing and compliance reporting
- Source a wide selection of client documentation from internal and external data sources in order to extract key information and validate client’s identity and reputation in order that they meet KYC checks
- Communicate with the AML Partners and across jurisdictions throughout the duration of an investigation
- Review exception reports and conduct forensic testing as part of ongoing compliance monitoring program
- Identify account transactions/checks that suggest suspicion of money laundering or terrorist financing.
- Research unusual transactions via the internet and other avenues and clear transactions deemed non-suspicious.
- Responsible for the appropriate collection and examination of documents to assist in identifying unusual transaction patterns.
- Document and report the investigation findings in the case management system
- Validate accuracy of data in KYC platforms and ensure completeness of document package.
- Providing descriptive statistics of the data using PROC MEANS, PROC FREQ and PROC UNIVARIATE.
- Use SAS ODS to create HTML, RTF and PDF output files in the process of producing reports.
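An illustrative sketch of routing one report to the three ODS destinations mentioned above; the file names and the use of the shipped SASHELP.CLASS sample table are assumptions for the example:

```sas
/* Illustrative only: route one PROC PRINT to HTML, RTF and PDF. */
ods html file='report.html';
ods rtf  file='report.rtf';
ods pdf  file='report.pdf';

title 'Monthly Account Summary';
proc print data=sashelp.class noobs;
run;

ods _all_ close;  /* close every open ODS destination */
```

Opening several destinations before the procedure lets a single run produce all three formats at once.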
- Perform statistical analysis and produce graphs using SAS tools such as PROC GPLOT and PROC GCHART with ODS RTF output.
- Integrate big data (millions of records) from large data warehouses (ICDW/Teradata, EDW/DB2, Oracle/INFO1) - CMS, CMSS, CMSS GNP, RM, RM1, RM2, RO - and marketing data marts based on specific business requests
- Retrieve millions of records using different procedures.
- Update and modify the production code according to changing requirements
- Perform data validation, transforming data from Oracle RDBMS to SAS data sets.
- Develop Utility Macros and extensively used existing macros for Validation, Analysis, Report generation and Integration of Data.
- Successfully migrated Model Ready from EDW to TERADATA.
- Clean and optimize the existing process, significantly reducing run time
- Use SAS/MACRO to write complex macros that decrease redundancy in heavily used SAS scripts and increase code flexibility.
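A small sketch of the macro technique described above; the macro name, data set and variables (freq_check, work.accounts, loan_type, region) are hypothetical examples, not from the original code:

```sas
/* Hypothetical utility macro: one definition replaces repeated
   frequency-check blocks scattered across a script. */
%macro freq_check(ds, var);
  title "Frequency check: &var in &ds";
  proc freq data=&ds;
    tables &var / missing;  /* count missing values too */
  run;
%mend freq_check;

/* Each call expands to a full PROC FREQ step */
%freq_check(work.accounts, loan_type)
%freq_check(work.accounts, region)
```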
- Assign libraries with the LIBNAME statement, providing the location of data in the UNIX environment; this can be done in SAS EG or remotely in PC SAS 9.3.
- Identify new fee category and gaps by DATA STEP, IF…THEN … ELSE statement.
- Resolve gaps and issues by filtering data, creating multiple DATA steps, renaming variables and sorting data (PROC SORT).
- Merging, appending and sorting the data sets using various SAS procedures and other SAS tools.
- Build the final summary report using SAS SQL against data sets on UNIX, applying WHERE, GROUP BY, ORDER BY and HAVING clauses as needed.
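A sketch of such a summary query; the data set and column names (work.fees, fee_amt, region) and the threshold are assumptions for illustration:

```sas
/* Hypothetical summary report: WHERE filters rows before grouping,
   HAVING filters the groups themselves. */
proc sql;
  create table work.summary as
    select region,
           count(*)     as n_accounts,
           sum(fee_amt) as total_fees format=dollar12.2
      from work.fees
     where fee_amt > 0            /* drop zero/negative fees */
     group by region
    having count(*) >= 100        /* keep well-populated regions */
     order by total_fees desc;
quit;
```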
- Use PROC EXPORT to deliver data in formats such as Excel spreadsheets when needed.
- Log in to the Metadata Server using SAS Management Console; manage libraries, build objects such as data sets, information maps and web reports, and register data into metadata.
- Create a simple information map by registering data in SAS Information Map Studio: move data from Resources to Information Map Contents; organize the data, including formats (comma, percent, dollar, date, time), numeric, character and dynamic formats; create new variables if needed. Also create complex information maps.
- Build monthly reports in Web Report Studio, by region, reason or both, and present the final report in SAS Information Delivery Portal.
- Move the data from development to production and schedule the process to run monthly via Control-M.
Environment: PC SAS 9.4, SAS Enterprise Guide 7.1, Base SAS, SAS Macros, SAS/Access, SAS/STAT, ODS RTF, SQL, PL/SQL, MS-Office, SAS Management Console, SAS Information Map Studio, SAS Web Report Studio, SAS Information Delivery Portal, Control-M, ETL, ICDW\Teradata, EDW\DB2, Oracle Database, Info1\Oracle, UNIX server, SFTP, Korn Shell, Greenplum
Confidential, San Jose, CA
Team Lead Developer/Data Architect/ETL /MDM
Responsibilities:
- Performed analytical and programming activities including analysis, design, development, testing, implementation, and documentation of integrating the data management solutions
- Identified business priorities on MDM subject areas. Translated business requirements to technical implementation. Designed MDM solution for three domains. Developed MDM integration plan and architecture for customer, Products and services
- Designed logical and physical model and implemented in Oracle/Teradata environment
- Developed customer contact metrics and validated them with business users
- Analyzed and documented all client requirements and business processes
- Implemented master functionality and analysis for existing business
- Maintained detailed documentation of process flow diagrams and operational support
- Troubleshot end-user technical issues, identifying problems and leading solutions
- Developed MDM strategies and solutions utilizing SQL server tools
- Debugging and providing solutions to business partners for RA
- Monitored the RA master flow and other jobs in the Process Manager overview
- Filled in RA stats and prepared RA graphs - MDM, EDW, EDM and SLA tabs (daily, weekly)
- Checked SAS logs and killed long-running jobs using a UNIX secure shell tool; obtained dynamic job status using Toad queries.
- Raised cases in Remedy for SLA misses and job failures; captured TKPROF traces from database sessions and handled P1 cases; ran jobs manually using SAS DI on a remote machine
- Created new job flow, scheduling and rescheduling the jobs using SAS Management Console
- Created new job including extracting, sorting, ranking, transforming, designing, developing SQL queries and updating the rule flow over SAS DI
- Developed workflows and coordinator jobs in Oozie.
- Developed Spark scripts by using Scala shell commands as per the requirement.
- Used Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive.
- Developed Scala scripts and UDFs using both DataFrames/SQL and RDD/MapReduce in Spark for data aggregation and queries, writing data back into the RDBMS through Sqoop.
- Performed transformations, cleaning and filtering on imported data using Hive and MapReduce, and loaded final data into HDFS.
- Experienced in deploying data from various sources into HDFS and building reports using Tableau.
- Loaded data into Spark RDDs and performed in-memory computation to generate the output response.
- Loaded data into HBase using bulk and non-bulk loads
Environment: PC SAS 9.4, SAS Enterprise Guide, Base SAS, SAS Macros, SAS/Access, SAS/STAT, ODS RTF, SQL, PL/SQL, MS-Office, SAS Management Console, SAS DI Studio, ETL, Process Manager, Informatica, EDW\Teradata, Oracle, Teradata SQL, Toad, Dollar U (Control-M), UNIX server, Spark, HDFS, MapReduce, Pig, Hive, Sqoop, HBase, Tableau, Cloudera, Shell Scripting.
Confidential, Columbus, OH
Team Lead Developer
Responsibilities:
- Involved in meetings and projects with business users, project managers & developers to automate, upgrade or expand applications that accomplish business operations and goals.
- Integrated big data (millions of records) from large data warehouses (ICDW/Teradata, EDW/DB2, Oracle/INFO1) - CODDS, CVM, FDM, OTH, RTL, VLS - and marketing data marts based on specific business requests
- Retrieved millions of records using different procedures.
- Updated and modified the production code according to changing requirements
- Supported the production team by developing a UNIX process that invokes different macros to produce monthly data used for CCAR.
- Performed data validation, transforming data from RDBMS oracle to SAS datasets.
- Developed Utility Macros and extensively used existing macros for Validation, Analysis, Report generation and Integration of Data.
- Successfully migrated Model Ready from EDW to TERADATA.
- Cleaned and optimized the existing process, significantly reducing run time
- Used SAS/MACRO for writing complex Macros facility to decrease the redundancy of extensively used SAS scripts and to increase the flexibility of the code.
- Create a temporary table in the SESSION schema using PROC SQL, passing an EXECUTE statement to the Teradata database; grant users authority to access or modify the table using EXECUTE (GRANT)
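An illustrative pass-through sketch of the technique above; the credentials, server, schema, table and role names (tdprod, edw.account_master, mydb.final_rpt, analyst_role) are placeholders, not from the original environment:

```sas
/* Hypothetical pass-through: create a volatile table in the Teradata
   session and grant access on a permanent table. */
proc sql;
  connect to teradata (user=&td_user password=&td_pass server=tdprod);

  /* Statements inside EXECUTE run natively on Teradata */
  execute (
    create volatile table session_accts as (
      select acct_id, status from edw.account_master
    ) with data on commit preserve rows
  ) by teradata;

  execute (grant select on mydb.final_rpt to analyst_role) by teradata;

  disconnect from teradata;
quit;
```

Because EXECUTE ships the SQL to the database unparsed, Teradata-specific syntax such as volatile tables works even though Base SAS does not understand it.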
- Created the preliminary portfolio report and the final report used to build the RPM and other reports. Uploaded final results into the Oracle database using plain SAS SQL or BULKLOAD; verified that the final UNIX files match the original DB2 files before building the finalized reports. For reporting, Futrix, Tableau or Web Report Studio is suggested.
- Moved the data from development to production and scheduled the process to run monthly via Control-M.
Confidential
SAS Business Intelligence
Responsibilities:
- Pull fee data and VLS fees from different data warehouses - Teradata, Oracle/Info1 and SQL sources - using the SQL pass-through facility.
- Prepared new datasets and modified existing datasets using Set, Merge, Sort, Update, Formats and Functions and created Tables and Listings for the same.
- Extensively used various SAS Data Step functions, SAS procedures, and SQL to write reports logics for SAS Stored Processes.
- Advanced querying using SAS Enterprise Guide: calculating computed columns, using filters, manipulating and preparing data for reporting, graphing, summarization and statistical analysis, and finally generating SAS data sets.
- Managing the datasets and making them available on the server using the SAS Management Console, so that the datasets can be used as inputs for creating the Information maps.
- Creating Information maps using the Information Map Studio, and further creating the Web Reports using the Information maps as the input through the Web report Studio, and later make these web reports available on the Information Delivery Portal
- Importing all Reports from web report studio, Stored Process, BI Dashboard and Information Map to Information Delivery Portal
- Building Quick Reports from Web Report Studio using Information maps, create filters, graphs, and customize reports
- Developing SAS Stored Processes that can be easily used by end users using SAS Add-in for Microsoft Office and other SAS BI clients from SAS Information Delivery Portal
- Log in to the Metadata Server using SAS Management Console; manage libraries, build objects such as data sets, information maps and web reports, and register data into metadata
- Create a simple information map by registering data in SAS Information Map Studio: move data from Resources to Information Map Contents; organize the data, including formats (comma, percent, dollar, date, time), numeric, character and dynamic formats; create new variables if needed. Also create complex information maps
- Build monthly reports in Web Report Studio, including dynamic prompts, by region, reason or both; define dashboard reports in SAS BI Dashboard by creating the data model, ranges, indicators and dashboard; present the final report in SAS Information Delivery Portal and export it to Excel sheets
- Developed, organized and maintained training materials for users; provided technical support and guidance and researched user inquiries. Familiar with big data analysis.
- Move the data from operations to production and schedule the process to run via Control-M
Confidential
Statistical Analyst
Responsibilities:
- Extensively used SAS procedures like means, frequency and other statistical calculations for Data validation.
- Developed SAS programs using Base SAS for tabulation counts, correlations and checks for dispersion and normality using PROC MEANS, PROC TABULATE and PROC UNIVARIATE.
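A sketch of these descriptive checks, run against the shipped SASHELP.HEART sample table as a stand-in for the actual study data (which is not shown in the original):

```sas
/* Basic descriptives */
proc means data=sashelp.heart n mean std min max;
  var cholesterol;
run;

/* NORMAL requests normality tests alongside the distribution stats */
proc univariate data=sashelp.heart normal;
  var cholesterol;
  histogram cholesterol;
run;

/* Tabulated summary by a classification variable */
proc tabulate data=sashelp.heart;
  class sex;
  var cholesterol;
  table sex, cholesterol*(n mean std);
run;
```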
- Developed and maintained standardized SAS programs and Macros for regulatory submission reports.
- Produced report output in HTML by default, with Word and Excel options, using SAS/ODS statements.
- Performed data analysis mainly for Regression and ANOVA using PROC ANOVA and PROC GLM
- Performed statistical analysis and produced graphs using SAS/STAT tools such as PROC GPLOT and PROC GCHART with ODS RTF output
- Used PROC REPORT, PROC TABULATE and PROC SQL for reporting
Environment: PC SAS 9.2, SAS Enterprise Guide, Base SAS, SAS Macros, SAS/Access, SAS/STAT, ODS RTF, SQL, PL/SQL, MS-Office, SAS Management Console, SAS Information Map Studio, SAS Web Report Studio, SAS BI Dashboard, SAS Information Delivery Portal, Futrix, Tableau, Control-M (Production), ETL, ICDW\Teradata, EDW\DB2, Oracle Database, Info1/Oracle, UNIX server, SFTP
Confidential, San Antonio, TX
SR. Business Analyst/Analytic/MDM
Responsibilities:
- Confidential is a family-owned bank serving communities in San Antonio, Boerne, New Braunfels and the Texas Hill Country. The aim of the project was to improve accuracy, reduce losses and improve capital management by building standardized reports and dashboards in mortgage banking. Responsible for helping to define and execute the master data management (MDM) vision for the enterprise; produced and distributed operational reports, ad-hoc analyses and pipeline details to help monitor and improve performance; managed the project across data sourcing, validation and production.
- Actively involved in design and implementation of application software using primarily SAS system and analyzed existing application software and recommended improvements.
- Predictive Modeling Using Decision Trees, Predictive Modeling Using Regression, Predictive Modeling Using Neural Networks and other tools
- Model Evaluation and Implementation, Cluster Analysis: K-Means Cluster Analysis, Self-Organizing Maps, and Association and Sequence Analysis
- Integrated and implemented MDM, business intelligence (BI) and data warehousing (DW / EDW) solutions
- Collaborated with data management team and conducted large data management workshop Analyzed, modeled big-data sets and developed conceptual model to handle velocity and variety of big data
- Developed new MDM capability
- Designed and developed customized MDM code based on specifications
- Produced documentation such as Extensions and Additions and Lightweight Services design; designed and developed customized MDM services.
- Managed rules for data standardization and validation, matching and merging
- Maintained MDM jobs and Master Data sequences; built test scripts for unit testing of customized MDM code; created, implemented, packaged, deployed, tested and released modifications
- Designed flowcharts indicating the input data sets and the techniques that would be used (sorting, merging, etc.) to get the desired output.
- Developed SAS macros for data cleaning, reporting and to support routing processing.
- Presented numerical information in various formats; created customized SAS reports using the DATA _NULL_ technique.
- Merging, Indexing, appending and sorting the data sets using various SAS procedures and other SAS tools
- Extensively used SAS Data Step functions and descriptive statistical Procedures to process large amounts of customer response and sales data
- Used the SAS/ACCESS LIBNAME method and PROC SQL pass-through to access data in the Oracle database
- Created customer mailing lists for Direct Mailing and Telemarketing using PROC Forms
- Extensively used PROC IMPORT and PROC EXPORT for importing and exporting PC files and Microsoft Office files such as Excel spreadsheets and Access tables
- Maintained and enhanced existing SAS Reporting programs using Proc Template Styles, PROC Tabulate(Styles), PROC Report(Styles), PROC Print(Styles), ODS RTF, and ODS PDF for marketing campaigns
- Used SAS Enterprise Guide to run statistical tests such as ANOVA, GLM and chi-square
- Created Ad hoc reports for marketing managers using SAS Enterprise Guide
Environment: PC SAS 9.3, SAS Enterprise Guide, Base SAS, SAS Macros, SAS/Access, SAS/STAT, SQL, PL/SQL, ODS RTF, SAS Management Console, SAS Information Map Studio, SAS Web Report Studio, SAS BI Dashboard, SAS Information Delivery Portal, Futrix, SAS DI Studio, ETL/Teradata, UNIX, MS-Office.
Confidential, PA
SAS Analyst
Responsibilities:
- Collaborated with physicians, medical writers and data managers to finalize mock tables
- Designed analyses and selected appropriate SAS procedures for each statistical analysis; test-ran SAS programs on mock data to ensure smooth analysis implementation.
- Used PROC SQL/PROC IMPORT to import data from mainframe Oracle clinical databases and MS Excel sheets.
- Performed Data analysis, statistical analysis, generated reports, listings and graphs using SAS Tools SAS/Base, SAS/Macro and SAS/Graph, SAS/SQL, SAS/Access.
- Produced quality customized reports using PROC TABULATE, PROC REPORT styles and ODS RTF, and provided descriptive statistics using PROC MEANS, PROC FREQ and PROC UNIVARIATE.
- Worked with statisticians to analyze results obtained from statistical procedures such as chi-square, PROC ANOVA, GLM and t-tests.
- Developed and debugged routine SAS macros to create tables, graphs and listings.
- Provided proper validation, including testing and documentation (e.g., requirements document, program validation), in accordance with GCP and company standards.
Environment: SAS/BASE, SAS/MACROS, SAS/SQL, PL/SQL, ODS RTF, SAS/STAT, SAS ETL, SAS/Enterprise Guide, SAS/Grid, SAS Data Integration, SAS OLAP Cube Studio, Excel, Windows, SPSS, Oracle, ICDW/Teradata
Confidential, Chicago, IL
SAS Developer
Responsibilities:
- Developed and designed SAS programs/macros to analyze financial data and to create files, tables, listings, and graphs. Developed ad-hoc reports as per business requirements and created various reports like summary reports, tabular reports
- Cleaned existing data and converted them into useful SAS Datasets, merged datasets and created reports based on Ad-hoc requirements
- Developed applications for measuring financial performance of newly acquired accounts: Forecast vs. Actual and developed SAS programs for generating reports on key financials, income statements, and balance sheet.
- Used SAS procedures like PROC FREQ, PROC MEANS, PROC SORT, PROC PRINT, PROC TABULATE AND PROC REPORT in order to understand and prepare the data
- Performed ETL-Extraction/Transformation/Loading, data migration, sampling, data preparation, graphical presentation, statistical analysis, validation, reporting, and documentation.
- Extracted data from different sources like Oracle and Teradata and text files using SAS/Access, SAS SQL procedures and created SAS datasets
Environment: SAS/Base, SAS Enterprise Guide, SAS Enterprise Miner, SAS/Macro, SAS/STAT, SAS/GRAPH, SAS ETL, SAS/ACCESS, SAS/ODS, PL/SQL, DB2, MS Excel, MS Access, SAS DI Studio, SAS Management Console, SAS Information Map Studio, SAS Web Report Studio, SAS BI Dashboard, SAS Information Delivery Portal, SAS OLAP Cube Studio, Oracle, ICDW/Teradata, Tableau
Confidential
SQL/SAS ANALYST
Responsibilities:
- Involved in Analysis & Marketing Team to make business decisions
- Interacting extensively with end users on requirement gathering, analysis and documentation
- Documented methodology, data reports and model results and communicated with the Project Team / Manager to share the knowledge
- Performed complex statistical analysis using PROC MEANS, PROC FREQ, PROC UNIVARIATE, PROC REG and PROC ANOVA
- Extensively used SAS procedures such as PRINT, REPORT, TABULATE, FREQ, MEANS, SUMMARY, TRANSPOSE and Data Null for producing ad-hoc and customized reports and external files
- Responsible for generating Financial Business Reports using SAS Business Intelligence tools (SAS/BI) and also developed ad-hoc reports using SAS Enterprise Guide
- Performed data analysis, statistical analysis, generated reports, listings and graphs using SAS tools e.g., SAS Integration Studio, SAS/Graph, SAS/SQL, SAS/Connect and SAS/Access
- Created reports in the style format (RTF, PDF and HTML) using ODS statements
Environment: SAS/BASE, SAS/MACRO, SAS/ETS, SAS/CONNECT, SAS Business Intelligence, SAS Enterprise Guide, SAS Enterprise Miner, SAS DI Studio, SAS Management Console, SAS Information Map Studio, SAS Web Report Studio, SAS BI Dashboard, SAS Information Delivery Portal, SAS OLAP Cube Studio, ETL/Teradata, DB2, Oracle, Windows