Sr. Loan IQ Data Analyst Resume
NJ
SUMMARY:
- 7+ years of experience in data warehousing projects across phases such as architecture, design, analysis, validation, modeling, extraction, and transformation.
- Extensive experience in project management best practices, processes, & methodologies including Waterfall, Agile, SCRUM, and Rational Unified Process (RUP).
- Responsible for all facets of clinical data management across a multitude of studies, including data management plans, database design, SAS programming, CRF design, edit checks, IVRS data review, data clarifications, training of site personnel, timelines, and budgets; knowledge of PMED.
- Demonstrated competencies in managing clinical trials data by developing and implementing new or improved processes for data management.
- Created dashboards and reports from concept to execution with clinical data from electronic health records, health information exchanges, insurance claims clearinghouses, and customer relationship management (CRM) databases.
- Exceptional skills in T-SQL programming: Dynamic SQL, Stored Procedures, Triggers
- Involved in dimensional modeling, collecting functional requirements, reverse engineering, ER model creation, and logical and physical model creation.
- Experience in ETL using Informatica 9.1/8.6/8.5/7.1, Cognos 8.4, UNIX scripting, Autosys, Control-M, and Tidal.
- Specialized in and developed data integration and automation software for multiple projects using SSIS and .NET Framework.
- Developed numerous ETL mappings, transformations, sessions, and batches using Informatica.
- Very strong understanding of ANSI SQL, Table Joins, RDBMS architecture, Query tuning and very strong knowledge of Data Modeling, data automation, Normalization, Dimensional Modeling, Star schema, Fact and Dimensions.
- Identified collateral and net loan documents and their key data; reviewed collateral documents for the agency.
- Designed and tuned T-SQL scripts and stored procedures as workarounds to standard practices for loading especially large financial record sets (memos and transaction details).
- Extensive experience in the development and implementation of Data Warehouse & Business Intelligence (BI) Solutions for maximum performance and efficiency.
- Evangelized benefits of data warehousing to strategic, tactical and power users, getting buy-in and adoption of process/methodology.
- Expertise in broad range of technologies, including business process tools such as Microsoft Project, MS Excel, MS Access, MS Visio, technical assessment tools, MicroStrategy Data Warehouse Data Modeling and Design.
- Extensive experience in Data Analysis for loading high volumes of data and smooth structural flow of the data.
- Experience with data mining tools like NodeXL, Gephi, and UCINET.
- Created KPI and tactical reports using Pentaho Report Designer.
- Advanced knowledge of the Loan IQ vendor loan agency and trading platform 4.
- Strong business analysis skills, including in-depth knowledge of the Software Development Life Cycle (SDLC), a thorough understanding of phases such as Requirements, Analysis/Design, Development, and Testing, and practical knowledge of the Rational Unified Process (RUP).
- Experienced with the loan origination process and fixed income products such as Treasuries (bonds, notes, and bills), Mortgage-Backed Securities (MBS), municipal bonds, corporate bonds, junk bonds, and convertible bonds.
- Experienced in data warehouses and data marts for business intelligence reporting and data mining along with developing and documenting process flows for business processes.
- Developed extensive SQL queries using Query Manager against the Oracle database for data verification and backend testing.
- Experience with the well-known commercial lending product Misys Loan IQ across all modules, with expertise in Accounting.
- Ensured completeness, correctness, and consistency of clinical data and data structures; identified, tracked, and resolved basic queries.
- Good understanding of the SDLC process; involved in end-to-end implementations of DWH and reporting solutions using Tableau, SSRS, OBIEE, Informatica, SSIS, and DataStage.
- Good understanding of Capital Markets debt, derivatives, and foreign exchange portfolio performance attribution and performance composites, Derivatives (Interest Rate, Credit, Equities), ABS, Equity, Loan and Fixed Income Business Process Management.
- Created SSIS packages to validate, extract, transform, and load data into data warehouse databases.
- Extensively worked with the SSIS tool suite; designed and created mappings using various SSIS transformations.
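The dimensional modeling and star schema work mentioned above can be illustrated with a minimal sketch. All table and column names here are hypothetical, not drawn from any project described in this resume; the real work used T-SQL and Informatica rather than Python/SQLite.

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables (hypothetical names).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, amount REAL);
""")
conn.executemany("INSERT INTO dim_date VALUES (?,?,?)",
                 [(20240101, 2024, 1), (20240201, 2024, 2)])
conn.executemany("INSERT INTO dim_product VALUES (?,?)",
                 [(1, "Widget"), (2, "Gadget")])
conn.executemany("INSERT INTO fact_sales VALUES (?,?,?)",
                 [(20240101, 1, 100.0), (20240101, 2, 50.0), (20240201, 1, 25.0)])

# Typical dimensional query: roll the fact grain up to dimension attributes.
rows = conn.execute("""
SELECT d.month, p.product_name, SUM(f.amount)
FROM fact_sales f
JOIN dim_date d    ON d.date_key = f.date_key
JOIN dim_product p ON p.product_key = f.product_key
GROUP BY d.month, p.product_name
ORDER BY d.month, p.product_name
""").fetchall()
print(rows)
```

The same join-and-aggregate shape is what a star schema is optimized for: narrow fact rows joined to small dimension tables on surrogate keys.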
PROFESSIONAL EXPERIENCE:
Sr. Loan IQ Data Analyst
Confidential, NJ
Responsibilities:
- Conducted JAD sessions with management, SME, brokers, users and other stakeholders for gathering requirements and to discuss open and pending issues.
- Interpret and enforce all terms for facilities as per Loan Agreements.
- Extensive experience in data analysis and ETL techniques for loading high volumes of data and ensuring smooth structural flow of the data.
- Worked extensively with PL/SQL to create and maintain databases for multiple projects.
- Provided recommendations and helped resolve errors and reconcile daily bank control of payment centers and treasury services.
- Modeled the fact tables and developed packages to populate them with T-SQL tasks to aggregate product warranties and their over time amortization.
- Wrote data mapping documents and data transformation rules; maintained the data dictionary, data migration, and interface requirements documents.
- Followed the Software Development Life Cycle using the Agile (Scrum) methodology, which is based on iterative and incremental development.
- Gathered business and functional requirements for new batch and real time interfaces for the 'Global Loan Platform / Misys Loan IQ' conversion project.
- Led project planning and management of internal resources for client risk replacement by formulating strategies to improve business and operational processes by coordinating with the bank market's CIO for public funds, bonds lending, & mutual fund.
- Conducted Scrum meetings, sprint meetings, and sprint planning meetings.
- Tested brokerage and financial services applications such as Equities, Fixed Income, Investment Research, Investment Deposits, Accounts (account activity, balance holdings, portfolio management, e-statements), Retail Consumer Lending, Mortgage modules, and the Integration module.
- Worked with large DB2, Oracle, and SQL Server databases, performing loading, validation, and manipulation of data.
- Prepared BRDs for enhancements and the test environment as developed by Misys Loan IQ.
- For the Confidential 'International Global Lending' and merger/integration projects, performed SQL querying, XML and C# research, and data analysis for the Misys Loan IQ database and Loan IQ downstream interfaces.
- Prepared and reviewed all collateral balances and movements; created daily margin/collateral movement reporting.
- Managed testing and created comprehensive testing scripts for a project involving the addition of new enhancements in Loan IQ. In addition, created general regression scripts in Loan IQ for use in major software upgrades. Assisted in coordinating the consolidation of global testing across multiple projects.
- Developed JSPs to implement the business logic and used JavaBeans to retrieve the data.
- Developed complex T-SQL queries to load data including loops with nested cursors and merges.
- Wrote complex SQL queries to perform the backend testing of the Oracle database using PL/SQL developer and UNIX shell commands.
- Coded conversion programs and prototypes for the new system using C#.Net, WinForms, and stored procedures.
- Responsible for integrating Misys Loan IQ with the Confidential lending platform application, as the go forward solution. Part of a 3-member Project Management team that advises Senior Management on the proper interfaces to be used between LIQ and existing legacy systems.
- Met with Credit Risk Analysts and Loan Officers to gather Credit Approval Process and Reporting Requirements (Credit Exposure Data, Loss Data) for Credit Cards (TSYS), Currency Swaps, FX Forwards, REPO, Equities, Fixed Income instruments like Bonds, US Treasuries, Mortgage Backed Securities, and Asset Based Lending Products according to the Basel II compliance guidelines.
- Supported Loan Operations by reviewing critical issues and breaks, utilizing subject matter expertise in Loan IQ functionality to determine appropriate solutions based on current business practices. When appropriate, worked with development and IT to coordinate challenges that required a more complex solution.
- Performed reviews of collateral and documents to ensure they met agency guidelines or customer agreements in a timely and accurate manner.
- Involved in Analysis of gathered requirements data, verification of requirements and creation of business requirement specifications (BRS). Extensive Knowledge of Loan IQ Data Model.
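The backend-testing bullets above (SQL verification of loaded data) can be sketched as a source-to-target reconciliation: a row-count check plus an anti-join that lists rows missing from the target. Table names are hypothetical, and SQLite stands in here for the Oracle/PL-SQL environment actually used.

```python
import sqlite3

# Reconcile a source table against its loaded target (hypothetical tables).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_loans (loan_id INTEGER, balance REAL);
CREATE TABLE tgt_loans (loan_id INTEGER, balance REAL);
""")
conn.executemany("INSERT INTO src_loans VALUES (?,?)",
                 [(1, 1000.0), (2, 2500.0), (3, 750.0)])
conn.executemany("INSERT INTO tgt_loans VALUES (?,?)",
                 [(1, 1000.0), (2, 2500.0)])  # loan 3 failed to load

# Count check plus an anti-join exposing the missing row.
src_count, tgt_count = conn.execute(
    "SELECT (SELECT COUNT(*) FROM src_loans), (SELECT COUNT(*) FROM tgt_loans)"
).fetchone()
missing = conn.execute("""
SELECT s.loan_id FROM src_loans s
LEFT JOIN tgt_loans t ON t.loan_id = s.loan_id
WHERE t.loan_id IS NULL
""").fetchall()
print(src_count, tgt_count, missing)
```

The anti-join pattern (LEFT JOIN … WHERE target key IS NULL) is the standard way to pinpoint which rows a load dropped, rather than just knowing the counts disagree.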
Sr. Clinical Data Analyst
Confidential, Lowell, MA
Responsibilities:
- Integrated Investigator- and Sponsor-initiated query responses into the clinical databases (paper: Oracle Clinical and Clintrial; EDC: InForm and Medidata Rave).
- Integrated, reviewed, and reconciled data received from external vendors with the clinical database and ensured readiness for analysis.
- Performed raw data review and validation on discrepant data using the Data Validation Specifications.
- Reviewed Sponsor-generated listings to ensure commonality of databases.
- Reviewed edit check output and assisted in revising the DVS, including the addition of new edit checks and Protocol Violation listings.
- Processed all loan transactions: borrowings, repayments, interest, fees, assignments, and rate changes.
- Monitored daily reports, investigated and processed any adjustments, and resolved discrepancies.
- Calculated and collected all fees due, billed customers, paid participants, and maintained ticklers.
- Generated weekly metrics reports and provided the Sponsor with feedback on raw data transfers.
- Created SQL scripts to obtain daily data review and query metrics.
- Interpreted loan documentation to ensure system accuracy.
- Processed deferments, non-accruals, paid-off loans, guarantor changes, rating changes, and address changes.
- Used NodeXL, Gephi, and UCINET to extract data from social websites and performed analysis on the data using these tools.
- Performed study start-up activities including form design, preparation of edit checks, database testing (User Acceptance Testing).
- Created Data Validation plan and Data Cleaning Specifications including Lab and SAE reconciliation plan.
- Assemble collateral documents for review.
- Improved performance of the database by creating clustered and non-clustered indexes and by optimizing T-SQL statements using SQL Profiler.
- Booked and serviced agented, syndicated and bilateral loans in Loan IQ.
- Reading of Credit Agreements, Processing of Assignments, Assumptions, and Amendments.
- Knowledge in reading and entering Borrowing Base Certificates into Loan IQ.
- Drafted and finalized CRF Completion Instructions and Data Handling Guidelines.
- Reviewed and provided feedback of study contract; recognize out of scope activities and communicate to PM / DM Lead.
- Wrote offline-listing specifications for data review, protocol violations, and lab data reconciliation guidelines.
- Reviewed and interpreted Legal Loan Documentation for terms essential to the deal booking in the Loan IQ system.
- Effectively processed all loan transactions including advances, repayments, rollovers, interest and fees in LoanIQ.
- Assisted in the successful migration from Midas to Loan IQ.
- Review and reconcile electronic data from vendors (Central Laboratory data) and Serious Adverse Events (SAE) to ensure the accuracy and correctness between the CRF data and E-data, and issue queries as applicable.
- Executed training sessions for junior team members in Lowell, MA; RTP, India and South Africa offices on data cleaning activities including data review and query integration.
- Facilitated cross-functional team meetings both internally (study team members, coders, biostatisticians and other functional groups on technical requirements for DATABASE DEVELOPMENT and data transfer, and CRF/queries process flow) and externally (sponsor, e-data vendors).
- Verified and loaded collateral data from collateral documents in loan packages into loan accounting.
- Designed and developed T-SQL files that query FICO Debt Manager OLTP tables and a collection of supplementary flat tables containing data about consumers, consumer accounts, payment journals, and multiple types of transactions.
- Created Stored Procedures, T-SQL, Triggers, Views and Functions for the database Application.
- Performed functional QC and provided feedback on raw-data transfer output and export datasets.
- Managed and coordinated team activities worldwide (India, South Africa) from start-up through maintenance and lock of the study.
- Performed Database Quality Assessment (Critical Items audit and Blended audit) of study data at Database lock.
- Created and maintained Central Files for all study-related procedures, guidelines, and correspondence in accordance with SOPs.
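The edit-check and data-validation bullets above can be sketched as a rule that flags out-of-range raw values and emits query text. Field names and range limits here are hypothetical; in practice every check was defined per-study in the Data Validation Specification.

```python
# Hypothetical per-field allowed ranges, standing in for DVS rules.
RANGES = {"systolic_bp": (70, 200), "heart_rate": (40, 180)}

def run_edit_checks(records):
    """Return one query message per value falling outside its allowed range."""
    queries = []
    for rec in records:
        for field, (lo, hi) in RANGES.items():
            value = rec.get(field)
            if value is not None and not lo <= value <= hi:
                queries.append(
                    f"Subject {rec['subject_id']}: {field}={value} "
                    f"outside expected range [{lo}, {hi}]; please verify."
                )
    return queries

data = [
    {"subject_id": "001", "systolic_bp": 120, "heart_rate": 72},
    {"subject_id": "002", "systolic_bp": 250, "heart_rate": 70},  # out of range
]
for q in run_edit_checks(data):
    print(q)
```

Each emitted message corresponds to a data clarification query sent back to the site, the same review-then-query cycle described in the bullets above.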
Sr. Data Analyst
Confidential, Plano, TX
Responsibilities:
- Performed on-call rotation for the Data Warehouse relational database system and UNIX Decision Support System (both consisting of multi-million-row databases/tablespaces).
- Managed ad hoc and recurring reporting, data validation, verbatim analysis and data mining, data automation exception processing, and business rule/process clarification.
- Conceived, designed, coded, and implemented a Business-to-Business (B2B) reporting tool (ART) using the Enterprise Data Warehouse.
- Worked iteratively with business customers to create ad hoc reports from 27 disparate data sources.
- Worked closely with Business and IT on documentation (BRD, FRD, FSD, change requests, mappings, workflows) of requirements.
- Prepared AS-IS and TO-BE system process diagrams, data automation and data mapping documents, business rules documents, the data dictionary, and functional specification documents.
- Worked on data mapping and data automation for various products in Deposits, Securities, FX, and Derivatives, and performed data profiling for various banking portfolios.
- Performed Impact analysis using Informatica Metadata Manager by extracting data lineage information for Data Structures (source/target/target definition), Fields and multiple power center Repositories.
- Performed data profiling, data automation, and data mapping to map various source data systems to the new Basel database.
- Wrote high-level T-SQL procedures to execute Windows command shell programs.
- Involved in data extraction, transformation, and loading (the ETL process) from source to target systems using Informatica PowerCenter.
- Created the JIRA production bug workflow and conducted gap analysis by creating swim-lane diagrams highlighting the differences between the current (AS-IS) and target (TO-BE) environments.
- Mapped the source and target databases by studying the specifications and analyzing the required transforms.
- Developed supporting documentation such as source to target mapping documents, business logic flows, process flows, requirements traceability matrix, etc.
- Implemented change data capture (CDC) using Informatica PowerExchange 8.x/7.x.
- Worked with the Informatica Data Quality (IDQ) 8.6.1 toolkit, covering analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 8.6.1.
- Designed and implemented all stages of the data life cycle; maintained coding and naming standards.
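The data-profiling step that precedes source-to-target mapping can be sketched as a per-column summary of null rates and distinct counts. The sample rows and column names are hypothetical; the actual profiling was done with Informatica and database tooling.

```python
# Data-profiling sketch: per-column null rate and distinct-count summary.
def profile(rows, columns):
    """Return {column: (null_rate, distinct_count)} for a list of row dicts."""
    summary = {}
    n = len(rows)
    for col in columns:
        values = [r.get(col) for r in rows]
        nulls = sum(v is None for v in values)
        distinct = len({v for v in values if v is not None})
        summary[col] = (nulls / n if n else 0.0, distinct)
    return summary

rows = [
    {"cusip": "12345678", "ccy": "USD"},
    {"cusip": None,       "ccy": "USD"},
    {"cusip": "87654321", "ccy": "EUR"},
    {"cusip": "12345678", "ccy": None},
]
print(profile(rows, ["cusip", "ccy"]))
```

A high null rate or an unexpectedly low distinct count is exactly the kind of finding that changes a mapping decision (e.g. whether a column can serve as a join key).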
Data Analyst
Confidential, Houston, TX
Responsibilities:
- Integrated data from external sources such as SWIFT and XML using Informatica's B2B Data Exchange tool, and worked on migration of mappings from DataStage to Informatica.
- Created expression, encryption, and SSN-replacement data masking techniques in the data masking transformation of Informatica PowerCenter.
- Performed data masking for the limited trust zone using the data masking transformation of Informatica PowerCenter.
- Created hierarchies such as World Region - Country in the data model by creating entity objects, entity types, hierarchies, and relationship base objects, then created profiles to be utilized by Data Stewards for Master Data Governance.
- Performed statistical analysis in R of various health indicators to identify patients at high risk of medical non-compliance.
- Developed the “Workflow Tool” website for the department using Java, JavaScript, Oracle, JIRA, SVN, MS Visio (business logic documentation), and Eclipse.
- Used the FastLoad and MultiLoad Teradata utilities wherever standard ETL workflows could not be used.
- Performed scrubbing on data imported from foreign flat files and databases; provided SQL scripts to ETL engineers for implementation of workflows inside SSIS.
- Performed heavy data entry, coding, and entering of basic identifying information into files and databases.
- Validated and analyzed SAS results against the source database and existing data dictionaries.
- Created project artifacts such as business, system, functional and non-functional requirement documents. Augmented documentation with process flow diagrams and a data dictionary.
- Provided a solution to summarize the end-to-end lineage process across Confidential. Coordinated with multiple teams to understand their data and flows; devised a spreadsheet-based end-to-end manual lineage solution that helped data owners and controllership, and managed the end-to-end manual lineage project for multiple portfolios.
- Involved in data analysis and creating data mapping documents to capture source-to-target transformation rules.
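The SSN-replacement masking described above can be sketched as a deterministic, format-preserving substitution. This is a hand-rolled illustration with a hypothetical salt, not Informatica PowerCenter's actual masking transformation, which is what the bullets refer to.

```python
import hashlib

SALT = "not-a-real-secret"  # hypothetical salt; a real deployment would manage this securely

def mask_ssn(ssn: str) -> str:
    """Replace an SSN with a stable, format-preserving surrogate (ddd-dd-dddd)."""
    digest = hashlib.sha256((SALT + ssn).encode()).hexdigest()
    # Map the first 9 hex characters to decimal digits to keep the SSN shape.
    digits = "".join(str(int(c, 16) % 10) for c in digest[:9])
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:9]}"

masked = mask_ssn("123-45-6789")
print(masked)
```

Determinism matters for the limited-trust-zone use case: the same input always masks to the same surrogate, so joins across masked tables still work while the real value never leaves the trusted zone.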
Data Analyst
Confidential, Charlotte, NC
Responsibilities:
- Prepared Test Reports and submitted the Bug Findings to the Bug Tracking system using Quality Center and its modules (Requirements, Test Plan, Test lab and Defects).
- Archived all project documents and implemented the UCM model for activity-based configuration management; used Rational ClearQuest for defect tracking and change management requests and reported them.
- Led and participated in multiple ongoing project meetings using iterative and Agile project methodologies.
- Used TestDirector for bug tracking and reporting.
- Collaborated with peers and Subject Matter Experts (SMEs) to design internal training plans, standard operating procedures, and quick reference guides.
- Accountable for ensuring that the content of all material was relevant, data-driven, and approved by SMEs.
- Worked with business users/SMEs to carry out thorough data analysis (table/column profiling, pattern frequency analysis, alert configuration, etc., using DataFlux and SQL Management Studio) to support the decision-making process.
- Participated and answered questions raised by SMEs in the deep dive sessions.
- Very good knowledge of source control management systems such as SVN, Perforce, TFS, and Git; CI tools such as Jenkins; and build automation tools such as Maven and Ant.
- Performed Data analysis and Data Profiling on clinical data (patient vitals, admitting, discharge, transfer, imaging, lab, pharmacy, etc.) and financial data (like hospital billing, professional billing, etc.) using SQL queries and tools to understand requirements of data and business.
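The pattern-frequency analysis mentioned above can be sketched by collapsing raw values into character-class patterns (digit to 9, letter to A) and counting the resulting shapes. The sample identifiers are hypothetical; the real work used DataFlux and SQL Management Studio.

```python
import re
from collections import Counter

def to_pattern(value: str) -> str:
    """Collapse a value to its character-class shape: digits -> 9, letters -> A."""
    return re.sub(r"[A-Za-z]", "A", re.sub(r"\d", "9", value))

# Hypothetical identifier values with inconsistent formatting.
values = ["MRN-00123", "MRN-00456", "mrn00789", "MRN-0012X"]
freq = Counter(to_pattern(v) for v in values)
print(freq.most_common())
```

Minority patterns in the output are the profiling signal: values whose shape deviates from the dominant pattern are the ones queued for cleansing or alert configuration.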