
Sr. Data Analyst/Developer Resume


Mclean, VA

PROFESSIONAL SUMMARY:

  • Over 9 years of experience in IT as a Programmer/Data Analyst, with extensive experience in the financial, retail, healthcare, and mortgage industries.
  • Certified expertise in Base SAS and SAS macro programming and debugging, with skills in SAS/BASE, SAS/MACRO, SAS/SQL, SAS/GRAPH, SAS/ACCESS, and SAS/ODS.
  • Strong experience in Microsoft Office, Excel (Advanced), VBA, SQL, Visio (process maps), Teradata SQL Assistant, SAS Enterprise Guide (SAS EG 7.1), SAS 9.1, R, Python 2.7.5 & 3.6.2, AWS, GitHub, UNIX, Business Objects (BI), Tableau 10.1, and the Machine Learning Specialization.
  • Utilized SAS Enterprise Guide and UNICA to create project flows covering data access, manipulation, summarization, integration, and report generation to a variety of output formats and forecast models.
  • Strong experience in managing credit risk projects, preventing errors in the Model Inputs and Outputs , Claims and coding using PL/SQL, SAS/SQL, and SAS macros as well as in performing database changes to support data storage and access to the new data.
  • Proven skills in master data management, data cleansing, data archival, data migration to SAS Grid, validating data integrity, and ad-hoc reporting and coding using SAS on UNIX (including shell scripts) and Windows.
  • Experience in quantitative roles in financial services, consumer lending, and credit risk management; created SAS programs for edit checks and financial concepts, performed data validation on raw datasets, and created datasets for analysis.
  • Created and customized macros for data cleaning, initialization, creation of analysis datasets, and report generation for the business.
  • Relevant exposure to business process data modeling and business analytics, including data warehousing and business intelligence tools and technologies.
  • Followed System Development Life Cycle (SDLC) methodology for the design, development, implementation, and testing of various SAS modules.
  • Strong knowledge of methodologies such as net present value (NPV), return on assets (ROA), profit and loss (P&L), and time-series metrics.
  • Experience in both MySQL and MS SQL; designed and wrote mainframe and PC technical programs to gather data, conduct statistical analysis, and create test scenarios to find errors, perform custom analysis, and confirm that programs meet specifications; program documentation and review.
  • Performed gap-analysis evaluation of the agency's data landscape relevant to the industry and domain and recommended next steps.
  • Experience in updating and creating documentation of code, guidelines, and high-level documentation.
  • Excellent analytical, problem-solving, communication, and interpersonal skills, with the ability to interact with individuals at all levels; work effectively both individually and as part of a team.
  • Exceptionally well organized. Enthusiastic, innovative and challenge oriented.
  • Ability to quickly adapt to new applications and platforms.

TECHNICAL SKILLS:

Tools: Teradata SQL Assistant, SAS Enterprise Guide, R, Python 3.6.2, Machine Learning Specialization, Tableau 10.1, SAS 9.x (Base SAS, SAS/SQL, SAS/MACROS), BOBJ (Business Objects), BIRST

Databases: Oracle, SQL, DB2, Teradata, RDBMS, Sybase; PERL/CGI

Languages: Python v2.7.5 & 3.6.2, Power BI, SAS, SQL, Teradata SQL Assistant, Redshift, Snowflake

Platforms: Windows, LINUX/UNIX and AWS

Others: MS office, UNICA, MS Visio, MS Project Management, CKAN, Ultra Editor Studio, VBA macros, Version One, JIRA, and Shell scripting

PROFESSIONAL EXPERIENCE:

Confidential, McLean, VA

Sr. DATA ANALYST /DEVELOPER

Responsibilities:

  • Member of the DART (Data Analysis and Reporting Team) for Corporate Compliance, supporting ICTT (Independent Compliance Transaction Testing), a program designed to establish and maintain a robust, independent transaction-testing program that satisfies regulatory requirements while meeting the business objectives of Risk Management.
  • Locked down the objectives and requirements of each request and identified what additional inputs were needed by scheduling kick-off meetings (e.g., providing the testing team with transaction types to confirm and securing engagement from the business); requested written confirmation (email) of transaction types and other scenarios to ensure the correct inclusions/exclusions were captured.
  • Developed scripts in Teradata SQL Assistant, SAS, and Python; validated results with the validation DA for secondary validation, completed the validation checklist, posted results and documentation on DART's secured file-share drive, and notified testers that results were ready for review.
  • Supported data transformation programs, including pulling data from AWS Redshift and Snowflake using SQL Workbench to maintain and host regulation reports and BAU processes.
  • Developed data structures, charts, and dashboards in Tableau to support the generation of business insights and strategy; maintained data infrastructure and developed scripts for regular processes.
  • Analyzed data-analysis requests from management to determine operational problems for automation using shell scripting with pcron, and defined data-modeling requirements, validation of content, and problem-solving parameters.
  • Focused on innovation projects: developed Python scripts to connect to team sites and download files, and to upload Excel data to Teradata using Python (pyodbc, pandas & NumPy); presented on GitHub efficiency (upload, download, move, rename, remove files, etc.); continued learning on AWS and the Machine Learning Specialization.
  • Identified opportunities to use data to develop new strategies and improve business performance, and utilized knowledge of mathematical modeling and other optimization methods to perform quantitative and qualitative data analysis.
  • Utilized programming and analytical tools including SAS, SQL, Teradata or similar relational database tool, and Unix to formulate mathematical or simulation models.
  • Studied the process of data migration to AWS for future requests using Pulse resources, including high-level data architectures, VPC setup, application management (S3 bucket management), and Redshift clusters.
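The Excel-to-Teradata upload flow described above can be sketched as follows. This is illustrative only: the real pipeline used pyodbc against a Teradata DSN and pandas to read Excel files, which are environment-specific, so stdlib csv and an in-memory sqlite3 database stand in here, and the table and column names are assumptions.

```python
import csv
import io
import sqlite3

# Sample "spreadsheet" data; in the real flow this would come from
# pandas.read_excel() over a file downloaded from a team site.
sheet = io.StringIO("account_id,balance\nA1,100.50\nA2,75.25\n")

# Stand-in for pyodbc.connect(<Teradata DSN>)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (account_id TEXT, balance REAL)")

# Parse the rows and bulk-insert them into the staging table
rows = [(r["account_id"], float(r["balance"])) for r in csv.DictReader(sheet)]
conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)
conn.commit()

# Row count and checksum to validate that the load matched the source
count, total = conn.execute("SELECT COUNT(*), SUM(balance) FROM staging").fetchone()
print(count, total)
```

The same count/checksum validation pattern applies regardless of whether the target is sqlite, Teradata, or Redshift.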

Environment: Python 2.7.5 & 3.6.2, SAS Enterprise Guide 5.1, SAS 9.2, Teradata SQL Assistant v12/v13, Tableau 10.1, Teradata BTEQ, FastLoad, TPump, UNICA, Redshift, AWS, Windows XP, Unix (HP-UX).

Confidential, Richmond, VA

SEGMENTATION / LEAD

Responsibilities:

  • As lead DA, performed the main coding for campaigns in Teradata SQL Assistant and UNICA, and used SAS, Tableau, and Redshift for analysis per the intent given in the Business Intent Rationalization Template (BIRT), following CMC standards.
  • Worked with the Intent Owner to ensure the data execution met the campaign's intent.
  • Performed action-type coding by populating fulfillment tables (action types such as Send Email, Account Offer Enterprise, and Letters) that matched the intent of the campaign.
  • Checked code-execution standards at each stage of a campaign (inclusions/exclusions, segmentation, offer assignment/fulfillment) and assisted product owners with any issues requiring research.
  • Translated the BIRT into code that identified eligible accounts and segmented them into specific populations.
  • Created all relevant control reports, such as Overall Segmentation Control Reports (OSCAR), OTT (Offer Treatment Template), and BIV reports, and drove them to ORT.
  • Worked on the attestation checklist to ensure all documents were uploaded to campaign ORT folders for audits.
  • Helped the Intent Owner with all assessment approvals required for campaigns.
  • Gained hands-on experience with AWS and Redshift through regular team collaborative learning sessions.
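The eligibility-and-segmentation coding described above typically reduces to a filter plus a CASE expression over an accounts table. The sketch below shows the pattern with sqlite3 standing in for Teradata; the eligibility rules, segment names, and thresholds are invented for illustration, not taken from any actual BIRT.

```python
import sqlite3

# Hypothetical intent: accounts open at least 12 months with no active
# offer are eligible, segmented by balance tier (rules are assumptions).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE accounts (id TEXT, months_open INT, has_offer INT, balance REAL)"
)
conn.executemany("INSERT INTO accounts VALUES (?,?,?,?)", [
    ("A1", 24, 0, 5000.0),  # eligible, high tier
    ("A2", 6,  0, 8000.0),  # too new -> excluded
    ("A3", 36, 1, 2000.0),  # active offer -> excluded
    ("A4", 18, 0, 1500.0),  # eligible, low tier
])

# Inclusion/exclusion filter plus segment assignment in one pass
segments = conn.execute("""
    SELECT id,
           CASE WHEN balance >= 3000 THEN 'HIGH' ELSE 'LOW' END AS segment
    FROM accounts
    WHERE months_open >= 12 AND has_offer = 0
    ORDER BY id
""").fetchall()
print(segments)
```

Counting rows per segment from a query like this is also the basis for segmentation control reports such as OSCAR.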

Environment: Python 2.7.5 & 3.6.2, Tableau 10.1, Power BI, SAS Enterprise Guide 5.1, SAS 9.2, Teradata SQL Assistant v12/v13, Teradata BTEQ, FastLoad, TPump, UNICA, Redshift, AWS, Windows XP, Unix (HP-UX).

Confidential, Richmond, VA

Data Analyst

Responsibilities:

  • Participated in defining data requirements and gathered and validated information, applying judgment and statistical tests.
  • Developed data structures to support the generation of business insights and strategy.
  • Maintained data infrastructure and developed scripts for regular processes.
  • Analyzed data-analysis requests from management to determine operational problems and defined data-modeling requirements, validation of content, and problem-solving parameters.
  • Developed programs using analytical tools including Power BI, SAS, SQL, Teradata or similar relational databases, and Unix to formulate mathematical or simulation models.
  • Communicated and presented data to management by developing reports.
  • Led QA redesign projects for process automation: captured data daily from a SQL database and stored it in a user-space Teradata server that serves as a source across projects for reporting Quality Assurance numbers to operations teams across the bank.
  • Worked on the ODW (Oracle Data Warehouse) decommissioning project, rebuilding the same reports from data sources on the BDW (Bank Data Warehouse).

Environment: SAS Enterprise Guide 5.1, SAS 9.2, Tableau, Teradata SQL Assistant v12/v13, Teradata BTEQ, FastLoad, TPump, BO XI 3.2 SP2 / BO 4.1, Windows XP, Unix (HP-UX)

Confidential, Richmond, VA

Data Analyst

Responsibilities:

  • Performed data extraction, aggregation, analysis, and quality checking from multiple data sources in a mainframe environment to create and monitor tests and programs using proven platforms such as Teradata, SQL, and SAS, while applying the newest and most relevant technologies to uncover insights that enable new or better products.
  • Partnered with statistician teams during the formulation, implementation, testing, and validation of predictive models, and used FTP.
  • Validated SAS data sets using PROC MEANS, PROC FREQ and PROC UNIVARIATE.
  • Formatted HTML, RTF and PDF reports using SAS output delivery system ODS.
  • Prepared new datasets from raw files using import techniques and modified existing datasets using data statements such as SET, MERGE, UPDATE, and PROC SORT.
  • Involved in proactive investigations into data issues that impact reporting, analysis, or execution.
  • Performed ad hoc analytics on large/diverse datasets .
  • Helped build business cases and led the definition of data strategies supporting business strategy.
  • Paired with management to drive new reporting or solutions that improve the ability to operate efficiently and improve continuously.
  • Worked with IT and Digital teams to investigate issues that impact reporting, analysis, or execution.
  • Communicated with Brand and Creative teams to develop responsive design email messages.
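The dataset validation mentioned above (PROC MEANS for numeric summaries, PROC FREQ for class-variable frequencies) can be mimicked in plain Python. This is a stdlib sketch of the same checks, not the SAS procedures themselves; the record layout and variable names are invented for illustration.

```python
from collections import Counter
from statistics import mean, median

# Toy records standing in for a SAS dataset (values are illustrative)
records = [
    {"region": "EAST", "amount": 120.0},
    {"region": "WEST", "amount": 80.0},
    {"region": "EAST", "amount": 100.0},
    {"region": "WEST", "amount": None},  # missing value to be flagged
]

amounts = [r["amount"] for r in records if r["amount"] is not None]
n_missing = sum(1 for r in records if r["amount"] is None)

# PROC MEANS-style summary of the numeric variable, including N Miss
summary = {
    "n": len(amounts), "nmiss": n_missing,
    "mean": mean(amounts), "median": median(amounts),
    "min": min(amounts), "max": max(amounts),
}

# PROC FREQ-style one-way frequency table on the class variable
freq = Counter(r["region"] for r in records)
print(summary, freq)
```

Checks like unexpected missing counts or out-of-range min/max are exactly what such validation runs are meant to surface before a dataset feeds a report.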

Environment: SAS 9.1.3/9.2, BASE/SAS, SAS Enterprise Guide, SAS/MACRO, SAS/SQL, SAS/CONNECT, SAS/ACCESS, Tableau, SAS/ODS, Oracle 11g, Sybase, UNIX and Windows XP.

Confidential, Charlotte, NC

SAS Programmer/Analyst

Responsibilities:

  • Converted the weekly, bi-weekly, monthly, and quarterly reports from an ad-hoc database to the new data warehouse using SAS, PL/SQL, and macros, with improved accuracy.
  • Experience in creating reports in SAS Enterprise Guide v4.2/4.3
  • Utilized SAS/ACCESS to extract data from Sybase and other relational databases for analysis.
  • Prepared new datasets from raw files using import techniques and modified existing datasets using data statements such as SET, MERGE, UPDATE, and PROC SORT.
  • Designed, developed, tested and debugged new MACROS and modified the existing macros according to the enhancements in the business.
  • Worked on edit-check programs to maintain data consistency, support data validation, and aid credit risk management.
  • Production Job Scheduling and Monthly Run.
  • Worked on ad-hoc requirements and later converted them as monthly and quarterly reports.
  • Developed and maintained the regular codes and made some amendments as per the business changes.
  • Wrote effective new code to produce the required analytical summary data.
  • Transformed statistical data in various formats (Excel, Access) into SAS datasets; provided statistical programming expertise in the production of analyses, tabulations, reports, and listings.
  • Developed programs for custom analysis, cleaning, transforming and modifying the data daily.
  • Used base SAS and SAS/SQL to generate tables.
  • Generated the production reports and submitted with key observations and recommendations.
  • Validated SAS data sets using PROC MEANS, PROC FREQ and PROC UNIVARIATE .
  • Formatted HTML, RTF and PDF reports using SAS output delivery system ODS.
  • Wrote SAS programs to perform QA analysis/testing.
  • Used PROC COMPARE to compare the data before and after editing.
  • Documented the entire process for future reference.
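The before/after check done with PROC COMPARE amounts to matching two versions of a dataset on a key and reporting field-level differences. A minimal stdlib sketch of that idea follows; the key, fields, and values are invented for illustration and nothing here is the actual PROC COMPARE output format.

```python
# "Before" and "after" versions of a dataset, keyed by account id
before = {"A1": {"status": "OPEN", "limit": 500},
          "A2": {"status": "OPEN", "limit": 300}}
after  = {"A1": {"status": "OPEN", "limit": 500},
          "A2": {"status": "CLOSED", "limit": 300}}

diffs = []
for key in sorted(set(before) | set(after)):
    b, a = before.get(key), after.get(key)
    if b is None or a is None:
        # Row exists in only one version (like PROC COMPARE's unequal obs counts)
        diffs.append((key, "<row added/dropped>", b, a))
        continue
    # Field-by-field comparison for rows present in both versions
    for field in b:
        if b[field] != a[field]:
            diffs.append((key, field, b[field], a[field]))
print(diffs)
```

An empty `diffs` list is the pass condition; anything else is flagged for review, just as unequal values are in a PROC COMPARE run.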

Environment: Base SAS v9.2, SAS/Macros, SAS/SQL, SAS OLAP Cube Studio 4.2, SAS Enterprise Guide v4.2, SAS/ACCESS, SAS/STAT, Teradata, SQL, Sybase, Windows XP, UNIX, Project Management, UE Studio.

Confidential, Phoenix, AZ

SAS Programmer/Analyst

Responsibilities:

  • Utilized SAS/ACCESS to extract data from Oracle and other relational databases for analysis.
  • Prepared new datasets from raw files using import techniques and modified existing datasets using data statements such as SET, MERGE, and PROC SORT.
  • Tested and debugged existing MACROS.
  • Extensively used PROC REPORT and PROC TABULATE to create reports.
  • Generated safety tables, which involved analyzing data and generating reports.
  • Documented the entire process for future reference, and created and published reports using SAS Web Report Studio.
  • Created reports using DATA _NULL_ and PROC REPORT for submission per regulations and company standards.
  • Validated SAS data sets using PROC MEANS, PROC FREQ and PROC UNIVARIATE.
  • Formatted HTML, RTF and PDF reports using SAS output delivery system ODS.
  • Worked with the automation team to automate reports using SAS/DDE and VBA macros.
  • Performed quality-control checks on tables, listings, etc., generated by other team members.
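The PROC REPORT / PROC TABULATE work above boils down to cross-tabulating a flat dataset by class variables and summing an analysis variable, then rendering rows with totals. A stdlib sketch of that pattern follows; the quarter/region data and variable names are invented for illustration.

```python
from collections import defaultdict

# Flat records: (row class var, column class var, analysis var)
rows = [
    ("Q1", "EAST", 10.0), ("Q1", "WEST", 5.0),
    ("Q2", "EAST", 7.0),  ("Q2", "EAST", 3.0),
]

# Accumulate sums into a two-level table, like PROC TABULATE's
# quarter * region, SUM(amount) crossing
table = defaultdict(lambda: defaultdict(float))
for quarter, region, amount in rows:
    table[quarter][region] += amount

# Render each row with a totals column, like a summary report line
for quarter in sorted(table):
    cells = table[quarter]
    print(quarter, dict(cells), "total:", sum(cells.values()))
```

The same accumulate-then-render split is how most report generators, SAS or otherwise, separate computation from layout.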

Environment: SAS/BASE, SAS/MACRO, SAS/ACCESS, Enterprise Guide, SAS/STAT, MS Excel, Visio, Oracle, SQL, Teradata, Windows XP.
