Sr. Data Analyst Resume
Sterling, VA
SUMMARY
- 16 years of professional experience in Information Technology as a Sr. Data Analyst in the Financial, Educational, Banking and Commercial domains, including 12 years in the secondary mortgage industry.
- Extensive exposure to the entire SDLC (requirements, analysis, design, development, implementation, maintenance, production support and testing) in client/server, web and cloud-based applications.
- Experience in data analysis, data profiling, data cleansing and data mining using PL/SQL scripts, Python, Power BI Desktop, Tableau, IDQ, DataFlux and Excel.
- Proficient at improving processes and automating DQ validation scripts to detect potential anomalies in production data.
- Experience with Source to Target Mapping (STTM), report mapping, BRS and BDG documents.
- Provided Data Quality Scorecards, Dashboards and Ad-Hoc reporting using IDQ.
- Experience in statistical and predictive data analysis, visualization and machine learning using Python.
- Developed data visualization reports using SSRS, Power BI Desktop, Python and Business Objects XI (BOXI).
- Extensive experience in writing project Scope documents, Functional/Technical/Transitional/Operational Requirements, Report requirements, Test Strategy, Test Scenarios, Test Plans, Test Cases, RTM, Defect Log, Test Summary Report and mockup data preparation.
- Collaborated with cross-team stakeholders to conduct Integration testing in E2E and UAT environments.
- Extensively used TOAD, Rapid SQL, SQL PLUS and DbVisualizer for data analysis and reporting purposes.
- Created risk assessment and impact analysis documents for new functionality and changes to the existing functionality.
- Developed and maintained EUC applications to automate business functionality using PL/SQL, Stored Procedures and VBA code.
- Created Data Correction Utilities (DCU) and Emergency Fix Scripts to correct the production data.
- Expertise in executing Autosys jobs, verifying error logs and troubleshooting issues.
- Validated BizApps, CEHL, Data Quality and Auto-publish reports using PL/SQL queries.
- Analyzed production source data for known issues and resolved them before sending the data to consumers.
- Experience in creating ETL packages using SSIS to extract, transform and load source/legacy data and to vend consumer data to an SFTP server.
- Experience creating SSRS reports and publishing them to SharePoint for end users.
- Developed applications using ASP.NET, VB.NET, ASP, VB, C++, XML, Business Objects XI and Crystal Reports XI.
- Created expected results database for UAT and SIT validation using PL/SQL scripts.
- Strong presentation skills and the ability to communicate at different levels within the organization with exceptional problem solving and analytical skills.
TECHNICAL SKILLS
Testing: Cucumber, ALM QC11, Quality Center 10.0, Test Director, Win Runner, QTP and Load Runner
Development: Python 3, Hadoop, AbInitio, Informatica, DataFlux, IDQ, MicroStrategy, Power BI, Tableau, Business Objects XI, Crystal Reports XI, JSON, SSIS, SSRS, ASP.NET, VB.NET, C#, C++, VB, ASP, PL/SQL, XML, VBA and UML
Databases: SQL Server 2014, Oracle 11.2.0.1, MongoDB, NETEZZA, Teradata, SAS, IBM DB2, Sybase and MS-Access
Applications: JIRA, VersionOne, TOAD, SQL PLUS, RapidSQL, ER Studio, SharePoint, MS Visio, IBM Rational DOORS, Oracle SOA Suite 11g, MS Visual Studio 2005, DbVisualizer and ClearQuest
Other Tools: PuTTY, Core FTP and PSFTP
PROFESSIONAL EXPERIENCE
Confidential, Sterling VA
Sr. Data Analyst
Responsibilities:
- Practiced Agile Scrum methodology, actively participated and provided timely inputs in sprint planning, review and retrospective meetings as well as daily stand-up sprint meetings.
- Worked on the source to target mapping (STTM) document, including data definitions, transformations, derivations and enumerations.
- Analyzed business requirements and worked with SMEs to understand the functional workflow from source to target systems.
- Worked with Product Owner on backlog grooming, user stories creation and prioritization of sprint stories.
- Performed data profiling, data cleansing and data analysis on target system data using Microsoft Power BI Desktop to provide statistics to business stakeholders.
- Created complex PL/SQL queries to pull large data sets and perform analysis.
- Performed impact analysis on new source system data to be consumed by the EDW.
- Primarily involved in data migration using SQL, SQL Azure, Azure Storage, Azure Data Factory, SSIS and PowerShell.
- Migrated the SSAS model to Azure Analysis Services to connect to Power BI.
- Worked with business users on Change Control Board (CCB) tickets and provided LOEs to implement them in the EDW.
- Worked closely with the development team to ensure the application fit within the architecture and exhibited the required behaviors.
- Created an automation process to merge individual mapping documents into an enterprise-level master mapping document.
- Validated BizApps, CEHL, Data Quality and Auto-publish reports using PL/SQL queries.
- Responsible for reviewing report requirements and providing feedback based on EDW implementation feasibility.
- Provided guidelines and mapping details to create a semantic layer for reports and self-serve usage.
- Produced loan data forecasts for critical elements using a Python linear regression model based on historical data (see the sketch at the end of this list).
- Created data visualization reports presenting the data as scatter, line, bar and pie charts using the Matplotlib, Plotly and Seaborn libraries in Python.
- Analyzed and processed complex data sets using advanced querying in Python.
- Worked on the requirements for Dashboards, Bar/Line charts, Clustered Column Charts, Pie Charts and Tree maps using Power BI Desktop.
- Worked on Data Analysis Expressions (DAX) for accessing data directly from a tabular SSAS database.
- Worked on the data definition standards document so that Enterprise Data Management can consistently define and make standardized data available across the enterprise.
- Worked independently and/or collaboratively with Subject Matter Experts and business POCs.
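A minimal sketch of the forecasting and charting approach described above, using scikit-learn and Matplotlib; the file name and columns (period, loan_volume) are hypothetical placeholders:

```python
# Hypothetical sketch: fit a linear regression on historical loan volumes,
# project the next 12 periods, and chart actuals vs. forecast.
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression

history = pd.read_csv("loan_history.csv")      # hypothetical extract
X = history[["period"]]                        # e.g., month index 1..N
y = history["loan_volume"]                     # critical element to forecast

model = LinearRegression().fit(X, y)
last = int(X["period"].max())
future = pd.DataFrame({"period": range(last + 1, last + 13)})
forecast = model.predict(future)

plt.plot(X["period"], y, label="actual")
plt.plot(future["period"], forecast, linestyle="--", label="forecast")
plt.xlabel("Period")
plt.ylabel("Loan volume")
plt.legend()
plt.savefig("loan_forecast.png")
```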
Environment: MS SQL Server 2014, Python 3, JIRA, SSAS tabular model, Power BI, MS Azure, MS Visual Studio 2014, SSIS, SSRS, SQL Server Data Tools (SSDT), SharePoint, UNIX, MS Access 2016, and MS Project.
Confidential, Washington DC
Sr. Data Analyst
Responsibilities:
- Automated Remote Tele-Worker eligible expenses (hotel stay, travel and transportation) using Confidential's HR data portal, based on primary Confidential offices.
- Analyzed legacy system data in SQL Server to create DQ (Data Quality) rules for pre- and post-validations, executed either At-Rest or In-Line.
- Reconciled all activities necessary to process multi-state payroll and account for related transactions (e.g., salaries, benefits, deductions, taxes and third-party payments).
- Automated employee pay period preview reports (new hires, overtime, paychecks issued to inactive employees, multiple paychecks per pay period, etc.).
- Reconciled ADP Top Row Validations by comparing ADP data with Confidential HR employees' payroll data (see the reconciliation sketch below).
- Ensured compliance with relevant laws and internal policies; established and monitored appropriate controls, policies and procedures within the payroll unit.
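A minimal sketch of that ADP-vs-HR reconciliation in pandas; the file names and columns (emp_id, gross_pay) are hypothetical:

```python
# Hypothetical sketch: outer-join the ADP and HR payroll extracts on employee
# ID, then flag employees missing from one side or paid different amounts.
import pandas as pd

adp = pd.read_csv("adp_extract.csv")
hr = pd.read_csv("hr_payroll.csv")

merged = adp.merge(hr, on="emp_id", how="outer",
                   suffixes=("_adp", "_hr"), indicator=True)

# Present in only one system
orphans = merged[merged["_merge"] != "both"]

# Present in both, but pay disagrees by more than a cent
both = merged[merged["_merge"] == "both"]
mismatches = both[(both["gross_pay_adp"] - both["gross_pay_hr"]).abs() > 0.01]

orphans.to_csv("recon_orphans.csv", index=False)
mismatches.to_csv("recon_mismatches.csv", index=False)
```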
Confidential, McLean VA
Sr. Data Analyst
Responsibilities:
- Worked in an Agile (iterative/Scrum) environment as a core team member; participated in sprint planning, daily stand-ups, sprint demos/reviews, retrospectives and backlog refinement.
- Created end-to-end process flowcharts for the Primary Mortgage Loan APP and the Correspondent/Aggregator Assignment Center.
- Performed impact analysis on upstream and downstream systems for attribute changes in the PML APP.
- Compared data between the Loan APP and CDW legacy systems using a sample set of loans and, using IDQ, reported whether it complied with the data completeness and data accuracy dimensions (a Python equivalent is sketched at the end of this list).
- Provided assistance with ad-hoc requests from business teams and worked with the data engineering and governance teams.
- Extensively used VersionOne to create user stories and tasks and to estimate the time to complete them.
- Worked on Production Data Movement Controls using the Business Activity Monitor (BAM) tool.
- Utilized Azure VMs and a hosted Azure database for faster performance.
- Performed impact analysis for attributes newly added to the PML APP and notified impacted consumer systems.
- Worked on project deliverables such as BRS, UAT Artifacts, Operation Readiness Requirements (ORR), Data Flows, Non-Functional Requirements (NFRs), Operational Requirements (OPRs), STTM, Physical/Logical Data Models and BRD.
- Performed post-production validations and shared the results to support the Go/No-Go decision.
- Responsible for validating DDLs and performing smoke tests for PROD deployments.
- Applied predictive analytics, including machine learning and data mining techniques, to forecast loan eligibility criteria and measure forecast accuracy.
- Responsible for identifying gaps where consumer elements were mapped incorrectly between the Loan APP and PML APP.
- Tracked existing production issues and provided solutions to prevent recurrence.
- Analyzed and documented production data anomalies to notify consumers in a timely manner.
- Worked with Data Modelers to create data warehouse objects, applying relational database concepts such as referential integrity for data accuracy and consistency.
- Performed E2E UAT with production-like data; coordinated with PML APP source and consumer team members to execute test cases and validate data points.
- Created Test Cases and Test Plans and executed them in HP ALM for UAT/E2E scenarios.
- Created UAT Testing artifacts for various releases; Test Strategy & Plan, Test Cases, Test Results, Test Summary Report, Defect Log and Requirements Traceability Matrix (RTM).
- Analyzed source data in MongoDB to implement Target system data models.
- Updated Source To Target Mapping (STTM) documents with legacy systems' constituent attributes, transformation logic and enumeration values to implement them in the APPs.
- Reviewed the SIT scenarios and Test cases to make sure all Functional Requirements were covered.
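A minimal Python equivalent of the IDQ completeness and accuracy checks mentioned above, run over a sample of loans pulled from both systems; the file, key and attribute names are hypothetical:

```python
# Hypothetical sketch: measure per-attribute completeness in the Loan APP
# sample and field-level agreement with the CDW legacy sample.
import pandas as pd

loan_app = pd.read_csv("loan_app_sample.csv").set_index("loan_id")
cdw = pd.read_csv("cdw_sample.csv").set_index("loan_id")
critical = ["note_rate", "upb", "property_state"]   # hypothetical attributes

# Completeness: percentage of non-null values per critical attribute
completeness = loan_app[critical].notna().mean() * 100

# Accuracy: share of common loans where both systems agree, field by field
common = loan_app.index.intersection(cdw.index)
accuracy = (loan_app.loc[common, critical]
            == cdw.loc[common, critical]).mean() * 100

print("Completeness (%):", completeness.round(2).to_dict())
print("Accuracy vs CDW (%):", accuracy.round(2).to_dict())
```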
Environment: IBM DB2, Hadoop, MongoDB, Python, Informatica, VersionOne, Azure Data Lake, Rapid SQL, DOORS, BAM, SharePoint, HP ALM, ER Studio, MS Visio, MicroStrategy and XML Spy.
Confidential, Reston VA
Sr. Data Analyst
Responsibilities:
- As a Team Lead, responsible for guiding a group of Test Engineers to complete the project deliverables on time.
- Involved in and reviewed test data creation to test business functionality, Enumerations and DQ rules.
- Created Data Profiling SQL scripts to identify mock-up data gaps for Enumerations, Transformations and Derivations.
- Responsible for creating business process flowcharts as part of sprint deliverables.
- Generated Test Results using the DTF tool by providing source and target test case details.
- Used Cucumber to automate Autosys data load jobs and acceptance test case execution.
- Created Cucumber Step Definition and Feature files in Ruby using the RubyMine IDE.
- Created data visualization reports and performed data profiling using Tableau.
- Responsible for reconciling source-to-target data movement and CEHL (Common Error Handling language) reports.
- Created the "AutoGenTestcases" tool to generate Vending Test Cases as well as data load count validations across the Staging, Error, Exceptions and Target layers (see the count-validation sketch after this list).
- Conducted daily check-in status meetings with the team and provided UAT status to management covering accomplishments, in-progress tasks and past-due tasks.
- Reviewed Test Strategy, Test cases & Plan and RTM documents.
- Utilized the test case Auto Compare Tool to compare expected with actual results and to load test results to QC.
- Worked on controls using SQL scripts to verify process anomalies before publishing the data to consumers.
- Responsible for executing Autosys jobs and investigating the root cause in case of job failures in the UNIX environment.
- Provided Reconciliation, Frequency and Metric counts for active production data to consumer business teams.
- Validated Data Mart reports in the NETEZZA environment.
- Created change and incident tickets to copy production archive files, including scrambled NPI data.
- Responsible for providing justifications/resolutions for BizApps and CEHL reports.
- Responsible for reviewing the Issue Tracker and opening iCART defects.
- Responsible for coordinating with the Development and Business teams to resolve defects on a daily basis.
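A minimal sketch of the layer-count validation that "AutoGenTestcases" produced; the DSN and table names are hypothetical, and pyodbc stands in for whatever database driver was actually used:

```python
# Hypothetical sketch: row counts must reconcile across load layers
# (staging = error + exceptions + target).
import pyodbc

conn = pyodbc.connect("DSN=edw")     # hypothetical DSN
layers = ["stg.loans", "err.loans", "exc.loans", "tgt.loans"]

cur = conn.cursor()
counts = {}
for table in layers:
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    counts[table] = cur.fetchone()[0]

reconciles = counts["stg.loans"] == (counts["err.loans"]
                                     + counts["exc.loans"]
                                     + counts["tgt.loans"])
print(counts, "-> PASS" if reconciles else "-> FAIL")
```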
Confidential
Sr. Analyst
Responsibilities:
- Involved in creating the Application Logical Data Model (ALDM) and Enterprise Logical Data Model (ELDM).
- Performed Data Profiling on source system data to understand patterns and analyze the data (see the profiling sketch after this list).
- Worked on the Source to Target mapping document for consumer data glossary elements.
- Analyzed legacy system data in SQL Server to create DQ (Data Quality) rules for pre- and post-validations, executed either At-Rest or In-Line.
- Worked on Metadata requirements to maintain the Logical/Physical Data modeling standards and metadata file specifications.
- Worked on Acquisition Loan source enumerations document and Stakeholder fact sheets.
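A minimal profiling pass of the sort described above, in pandas; the extract and the zip_code column are hypothetical:

```python
# Hypothetical sketch: per-column null rate, distinct count, and a
# character-class pattern scan to spot format outliers.
import re
import pandas as pd

src = pd.read_csv("source_extract.csv", dtype=str)

profile = pd.DataFrame({
    "null_pct": src.isna().mean() * 100,
    "distinct": src.nunique(),
})
print(profile)

def pattern(value):
    """Map digits to 9 and letters to A, e.g. '20170-1234' -> '99999-9999'."""
    return re.sub(r"[A-Za-z]", "A", re.sub(r"\d", "9", str(value)))

print(src["zip_code"].map(pattern).value_counts().head())  # hypothetical column
```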
Environment: ALM QC 11/Quality Center 10, NETEZZA, Informatica 9.5, Oracle 9i, SQL Server 2014, Oracle SOA Suite 11g, DTF tool, Cucumber, Tableau, TIBCO, DOORS, DataFlux, Embarcadero ER Studio 9.5.0, Toad for Oracle 9.5, SharePoint, MS Visio, XML Spy, MS Access 2007, Rational ClearQuest, PuTTY, FileZilla and TIBCO GEMS.
Confidential, Vienna VA
Sr. Data Warehouse Analyst
Responsibilities:
- Created SQL scripts to compare pre- and post-migration objects and table data counts for the production server migration (see the row-count comparison sketch after this list).
- Performed Data Profiling on the source systems and provided data characteristics, patterns and allowable values to create Source to Target Mapping (STTM) document.
- Validated the functionality of Informatica workflows and transformations per requirements.
- Verified data completeness and data accuracy to ensure the workflow did not allow invalid or unwanted data to be loaded.
- Reconciled user view reports with source data using SQL scripts.
- Created SIT Process & approach document and System Test scenarios.
- Created Test Cases and Test Scripts to validate data from Source to Landing, Landing to LoadReady and LoadReady to Base layers based on the STTM document.
- Created HP Quality Center dashboard reports for defect life cycle management.
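A minimal sketch of the pre/post migration comparison; the DSNs are hypothetical, pyodbc stands in for the actual driver, and the information-schema query is a generic placeholder for the real scripts:

```python
# Hypothetical sketch: pull row counts per table from the old and new servers
# and report any table whose count drifted during migration.
import pyodbc

def table_counts(dsn):
    cur = pyodbc.connect(f"DSN={dsn}").cursor()
    cur.execute("SELECT table_name FROM information_schema.tables "
                "WHERE table_type = 'BASE TABLE'")
    tables = [row[0] for row in cur.fetchall()]
    counts = {}
    for name in tables:
        cur.execute(f"SELECT COUNT(*) FROM {name}")
        counts[name] = cur.fetchone()[0]
    return counts

pre, post = table_counts("prod_old"), table_counts("prod_new")
for table in sorted(set(pre) | set(post)):
    if pre.get(table) != post.get(table):
        print(f"{table}: pre={pre.get(table)} post={post.get(table)}")
```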
Environment: Informatica 9.5.1, Teradata SQL Assistant, Informatica Power Center DVO, MS SQL Server 2012, HP Quality Center 10, SharePoint, MS Visio, XML Spy and Beyond Compare 3.
Confidential, McLean VA
Sr. Tester/ Sr. Data Analyst/Business Analyst
Responsibilities:
- Responsible for Functional testing, Integration Testing, Defect coordination, Test status, User Acceptance Testing and Signoff with respect to the functionality, as a Test Lead.
- Analyzed the data to identify and interpret patterns and trends, assess data quality and eliminate irrelevant data using DataFlux and PL/SQL Scripts.
- Created Logical and Physical Data models, Data Glossary and BRS documents.
- Led the UAT team for the SQL Server version migration, Complaints Analytics, HAMP Tier 2 and Data Integration projects.
- Responsible for estimating testing effort, preparing test schedules, analyzing risk, and identifying and allocating resources.
- Responsible for prioritizing and assigning testing tasks and monitoring testing stages/levels.
- Facilitated Defect Review Meetings, Management Meetings and Go/No-Go decision meetings.
- Led cross-functional QA efforts in Inter System Testing and conducted User Acceptance Testing.
- Analyzed Confidential-provided HAMP data to create requirements and provide risk assessments to the business team.
- Validated source data XML and XSD files and reported anomalies to Confidential.
- Created BRS and Data models based on source data files and requirements.
- Worked on Scope, Business requirements and Technical requirements for the implementation of new projects.
- Created Test Cases, Test Scripts, Test Results and Dashboard Reports using Quality Center.
- Created and/or reviewed Testing artifacts; Test Strategy, Requirements Traceability Matrix (RTM), Defect Log, Test Results, and Test Summary Report.
- Responsible for reviewing test data to accurately simulate real-world user scenarios.
- Created DCUs and Emergency Fix scripts to update production data.
- Developed an EUC application to process HAMP Complaints data and provide statistics reports to the Business Team.
- Responsible for executing Autosys jobs in UAT, verifying error logs and providing status to the team.
- Created SQL scripts for data accuracy validations and loan sampling tranche logic (see the sampling sketch after this list).
- Responsible for validating post-production data and providing status to the Business and Production teams.
- Evaluated MicroStrategy report data to quickly identify problems, issues and gaps.
- Responsible for improving performance by optimizing SQL scripts to remove bottleneck processes, and for eliminating duplicate data by normalizing database tables.
- Involved in developing Servicer Loan Sampling using ASP.NET.
- Worked closely with the Business to fix data anomalies and provide ad-hoc reports using PL/SQL scripts.
- Published best practices to the enterprise and implemented industry best practices within the team.
- Coordinated with external teams to set up test data and track defects at their end.
- Conducted test status meetings/walkthroughs, resolved issues and escalated when required.
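A minimal sketch of loan sampling tranche logic in pandas, drawing a fixed-size random sample per tranche; the file, column names and tranche sizes are hypothetical:

```python
# Hypothetical sketch: stratified sampling of loans by servicer tranche,
# capped at the stratum size when a tranche is small.
import pandas as pd

loans = pd.read_csv("hamp_loans.csv")            # hypothetical extract
tranche_size = {"large": 100, "medium": 50, "small": 25}

samples = (
    loans.groupby("servicer_tranche", group_keys=False)
         .apply(lambda g: g.sample(
             n=min(tranche_size.get(g.name, 25), len(g)),
             random_state=42))
)
samples.to_csv("loan_sample.csv", index=False)
```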
Environment: MS SQL Server 2008/2005, DataFlux, ER Studio, ASP.NET, ETL - SSIS, SharePoint, MS Visio, XML Spy, ALM QC 11, MicroStrategy, Rational ClearCase, Rational ClearQuest, IBM Rational DOORS and PuTTY.
Confidential, Herndon VA
Sr. Tester/ Data Analyst
Responsibilities:
- Validated Data Quality reports, Data mapping and Auto-publish reports using PL/SQL queries.
- Validated daily jobs for Confidential, Ginnie Mae and non-Confidential REMICs.
- Created PL/SQL scripts to validate Extract process, Auto Recon process, Transformation process and Load process.
- Involved in updating production data using data correction requests.
- Performed Data Profiling on raw sources using technical tools such as TOAD and MS Access.
- Provided Loans/Securities production data to the Deloitte & Touche (D&T) Business Audit Team to perform certain computerized procedures on SAS data sets or delimited text files.
- Validated iUAT with the production data using Ad-hoc queries.
Environment: ETL - AbInitio, Oracle 9i, Sybase, SharePoint, Business Objects XI, Toad for Oracle 9.5, MS Access 2003, Quality Center, Rational ClearQuest, PuTTY, Rational ClearCase and Rational RequisitePro.
Confidential, Herndon, VA
Sr. QA Tester/ Business Analyst
Responsibilities:
- Worked with Business team to understand the detailed impact and business changes needed to support new policies.
- Created Business requirements and developed Test Scenarios, Test Plan, Test Strategies, Test Cases, Data mapping, Data modeling, UAT mock data and created expected results.
- Generated Test Results using an automation tool developed with VBA code in MS Access, and compared Actual with Expected results.
- Executed UAT and iUAT Autosys jobs using PuTTY based on instructions provided by the Development Team.
- Provided support to production applications by tracking production issues and troubleshooting them to sustain application in production.
- Validated PLAE, LAR and BOXI reports using PL/SQL scripts.
- Responsible for updating business requirements based on new enhancements.
- Implemented standards for requirements and scope documentation and ensured quality deliverables.
- Performed data comparison/analysis between current production ADW data warehouse and legacy production Pool Prepay SAS data sets.
- Created SLS (Subledger) expected results using VBA code in MS Access.
- Extracted data from FDW and performed ad-hoc queries by using TOAD.
- Created the ER database using VBA code in MS Access to validate the GFAS Processor engine.
- Responsible for identifying data issues in the ADW and PDR databases before they reached production.
Environment: AbInitio, Oracle 9i, Sybase, SharePoint, Business Objects XI, Toad for Oracle 9.5, MS Access 2003, Quality Center, Rational ClearQuest, PuTTY, Rational ClearCase and Rational RequisitePro.
Confidential, Washington, DC
Sr. Data Management Analyst
Responsibilities:
- Involved in working with clients in gathering requirements, creating use cases, STTM and a logical data model.
- Generated the State and Federal statistics reports (Blackman-Jones, Special Conditions, Child Count, etc.), which measure the timeliness of IEPs and Eligibilities, using SQL Server Reporting Services (SSRS).
- Created SSIS/DTS Packages for data migration between the EasyIEP and DCPS/PCS Secure FTP servers.
- Produced reporting procedures that allowed clients to address data quality issues before loading data into the STAGING database.
- Generated Ad hoc Reports based on Management Requirements using BusinessObjects XI (BOXI).
- Automated daily delivery of specific reports to SharePoint using SSRS subscription schedules.
- Scheduled the SSIS packages to process inbound and outbound data using SQL Server Agent.
- Created automated DQ scripts to detect anomalies in the PROD database, triggered after each data load completed (see the sketch after this list).
- Created process flow diagrams to get a better understanding of the processes involved during data migration.
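A minimal sketch of such a post-load DQ run: each rule is a SQL query expected to return zero rows, and anything returned is an anomaly. The DSN, table and column names are hypothetical, and pyodbc stands in for the actual driver:

```python
# Hypothetical sketch: run each DQ rule after the load and report anomalies.
import pyodbc

RULES = {
    "duplicate_student_ids":
        "SELECT student_id FROM iep GROUP BY student_id HAVING COUNT(*) > 1",
    "iep_missing_eligibility":
        "SELECT student_id FROM iep WHERE eligibility_date IS NULL",
}

cur = pyodbc.connect("DSN=easyiep_prod").cursor()   # hypothetical DSN
for rule, sql in RULES.items():
    rows = cur.execute(sql).fetchall()
    status = "OK" if not rows else f"{len(rows)} anomalies"
    print(f"{rule}: {status}")
```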
Environment: ASP.NET, MS SQL Server 2005/2003, MS Visual Studio 2005, SharePoint, WS FTP Pro, Core FTP, SSIS, DTS, SSRS, Business Objects XI, UNIX, MS Access 2007, Visual Source Safe, Visio, Rational Suite and MS Project 4.1.
Confidential, Herndon, VA
Sr. QA Analyst/ Sr. Data Analyst
Responsibilities:
- Responsible for developing and maintaining appropriate controls to ensure high-quality data is provided.
- Created Expected Results (ERDB) using PL/SQL scripts (Stored Procedures/Packages) to validate GFAS development Data Transformation and Pre Processor systems.
- Extracted data from ADW database and performed ad-hoc queries using PL/SQL Scripts.
- Created the ER database using VBA code in MS Access to validate the GFAS Processor engine.
- Analyzed, designed and coded a new schema for staging tables and views.
- Responsible for identifying data issues in the ADW and PDR databases before they reached production.
- Developed ad-hoc reports per business team, operations team and manager requests.
- Optimized queries using Query Analyzer and index tuning.
- Worked with the D&T, E&Y and EUC teams to help them understand the various data sources.
- Collected and documented policies, calculation methods, business processes as well as business rules.
Environment: ETL - AbInitio, Oracle 9i, MS Access 2003, Business Objects XI, Toad for Oracle 8.6, SQL Plus, DbVisualizer 4.3.1, Rational ClearCase, Rational RequisitePro, MS Excel, MS Project 4.1, Windows XP.