
Data Modeler/Analyst Resume


NY

SUMMARY:

  • Computer Science & Engineering degree from VNIT Nagpur, India, with a strong academic record. Strong IT and financial-industry background, with experience developing and implementing data projects (REF, MDM, EDM, DWBI, Big Data/Cloud) in AGILE/WATERFALL development models with distributed development teams and extensive cross-organizational coordination.
  • 20 years of total experience across core/investment banking, insurance, telecom, market research, and survey industries; TOGAF and PMP trained.
  • Professional experience with the last 12+ years as a Data Modeler/Lead Developer/Data Analyst working on solution design and delivery of OLTP, ODS, EDW/DWH, and BI platforms for diverse business clients and needs; the prior 8 years as a core programmer.
  • Expertise in designing, developing, and maintaining data platforms, data architecture, data modeling, industry data models, data profiling, business metadata management, data governance, and data quality frameworks, with hands-on experience on multiple large databases, ETL tools, SQL, PL/SQL, UNIX, PERL, Python, ERWIN, etc. Skilled in SQL and PL/SQL performance optimization.
  • Hands-on experience in data cleansing/migration/consolidation and LEGACY platform migration/upgrade/BI implementation projects. Familiar with ingestion to a DATA LAKE cluster and data analytics. Worked on MAINFRAME/COBOL in the past.
  • Expertise in requirements gathering and data modeling: Star Schema and Snowflake dimensional modeling, 3NF normalized relational modeling, fine-tuned physical data modeling, SQL/DDL/DML scripts for target databases, and publishing models and data sets.
  • In-depth knowledge of data models, data management, structured and unstructured data, DDL scripts, keys, indexes, and table partitioning, external tables, data federation, DW/BI methodologies, DWH appliances, OLAP, and BI reporting using data discovery and visualization tools with data analytics capabilities, spreadsheet-driven reports, and both IT-driven and business-driven BI reports.
  • Proven delivery of data strategy, EDW data artifacts, business rules/models/data, GAP analysis, impact analysis, future-state designs, TO-BE processes, process/data governance and data migration strategies, and report, dashboard, and scorecard models/designs.
  • Solid understanding of the software development life cycle (SDLC), change management lifecycles, E2E testing and release procedures, and production support. Proficient in data sourcing/data mapping and development in SQL, ETL, Oracle, Teradata, SQL Server, Golden Source 8.x, and Teradata FSLDM (industry model for the banking and insurance domains).
  • Strong problem-solving and analytical skills; strong business analysis, project management, multi-team coordination, and client-facing skills.
  • Good understanding of the client data lifecycle, trade/payment lifecycle, FO/MO/BO processes/data/systems, Cash EQ, Core Banking, Cash Management, Transaction Banking, Correspondent Banks, payments systems, compliance systems, banking compliance rules, policies and standards, Market/Credit/Operational Risk, and Finance & Risk data (risk metrics, Liquidity Coverage Ratio, CCAR, BASEL).

TECHNICAL SKILLS:

Industry Data Models/Tools: TERADATA FSLDM, GOLDEN SOURCE 8.x, IBM BDW, Data Quality and MDM tools

Major Databases/MPP DWH/EDW & OLTP: Oracle, TERADATA, SQL Server, NETEZZA, MAINFRAME, DB2, COBOL.

ETL Tools and Scripting Languages: INFORMATICA, SSIS, ODI, T-SQL, SQL, PL/SQL, UNIX, PERL, Python; Java (familiar).

Big Data and Cloud based environment: Hortonworks HADOOP, HIVE, HIVEQL, NoSQL DB, Spark, AWS, S3, EMR

Data Modelling: Relational/Dimensional/Canonical, UML, ERD, VISIO PRO, TOAD Data Modeler, ERWIN (LDM, PDM); database normalization/de-normalization techniques, data integration, and complex SQL query performance tuning.

Business Process Modelling: ARIS, SIPOC, VISIO.

Business Intelligence and Reporting: Business Objects, XCELSIUS, MicroStrategy, SSRS, QlikView. Familiar with Tableau.

Data Analysis and profiling: Qualitative and Quantitative Analysis using MS EXCEL, MS ACCESS, TOAD, SQL, PL/SQL, SAS analytics, with good understanding of all structured and unstructured data types and analytical functions.

Well Versed with: MQ, AUTOSYS, CONTROL-M, Real TIME ETL, SOA, REST/SOAP API, POSTMAN, JSON, XML, TIME-SERIES

Other tools: MS WORD/EXCEL/PPT, VISIO, ARIS, JIRA, CLARITY, VERSIONONE, Version control systems. Familiar with Collibra

Other Utilities: Teradata BTEQ, SQL Assistant, FastLoad, MultiLoad, etc.; IBM Sterling Connect:Direct.

Methodologies: Complete Software Development Life Cycle, Waterfall, Agile, Iterative, Scrum, PMP Trained

FUNCTIONAL SKILLS:

Asset classes: Equities (EQ), Fixed Income (FI), FX, Money Market (MM), Financial/Commodities Derivatives (Futures & Options) - OTC and ETD, SF, PF, FUNDS, USPF product data attributes. Worked on Client/Security/Product Master Data (MDM). Knowledge of insurance party and related data. Familiar with financial message models, the canonical model, and data exchange.

Process/Data in BFSI: Client Onboarding, Client Data Migrations, Reference Data, KYC/CDD/EDD/AML Screening business processes and data flows, PRE/POST Trade Data, middle/back office/risk management, Settlements, Reconciliations, Payments, Compliance/Regulatory Reports, Cash Reconciliations, Safekeeping, Corporate Actions, Cash Management Services.

Enterprise data: Party data, instruments, ratings, prices, corporate actions, products, and positions data used for trading, compliance, risk management, settlement, and accounting purposes; finance accounting/REF data.

Client & AML Data: Client, Portfolio, Accounts, COB, KYC, CID, CDD, OFAC, SANCTIONS, REF Data, industry classification

Risk Data: Familiar with FRS GLOBAL, MiFID, Market/Credit/Operational Risk, BASEL, Central Bank Reports (CBR), P&L, VAR.

Vendor systems: LZ FIDESSA, FINACLE Treasury, MISYS, EAGLE, REUTERS/BLOOMBERG, FIRCOSOFT/ACTIMIZE data needs

CURRENT EMPLOYMENT:

Confidential, NY

Data Modeler/Analyst

Responsibilities:

  • Prepare data model design documents (conceptual, logical, and physical) and data solution designs for review as part of several data initiatives and various streams of work.
  • Collaborate with business and technology partners to gather and solidify the business data model covering multiple financial products such as Fixed Income, Money Market (MM), Financial/Commodities Derivatives, HYBRIDs, FUNDS, PROGRAMS, and Structured Finance Products (PF and USPF) from multiple source systems and vendor market data (CUSIP, ANNA, PLATTS, etc.). Consolidate REF/FIN and industry data from multiple systems into MDM and flow it to the DWH.
  • Prepare data flows and a DATA PROFILING/DATA QUALITY checklist on source/target data and formalize DATA GOVERNANCE rules. Present data profiling results from SQL/QlikView for decision making in initial phases.
  • Documented the end-to-end data model and solution design with FULL DATA LINEAGE and traceability for PLATTS commodities data loading to the HUB.
  • Prepare glossary and classifications of data domains, identifying entity types, attribute types, and relationships (cardinality, degree, and nature); distribute models, data sourcing documents, and data flow/architecture changes for discussion.
  • Designed the logical and physical data models (ERWIN LDM & PDM) for discussions, enhanced the DWH physical data model, and created DDL/DML SQL scripts and data seeding scripts.
  • Extensively involved in complex S2T data mapping development, documenting business rules, metadata, and required table structures in multiple layers, and in data modeling. Write SQL queries to reconcile data between source and target.
  • Identified/documented multiple 3NF models of source systems and transformation rules required for populating and maintaining 3NF MDM systems and subsequent flow of filtered data to data warehouse and aggregations in DWH (SQL Queries).
  • Worked with Database/IT team to create a best-fit Physical Data Model from the logical data model.
  • Development of SQL/Python scripts for data profiling and data extractions from multiple source systems and from APIs; a representative profiling query is sketched after this list.
  • Design and develop data applications or pipelines and develop interfaces to acquire and publish data sets for defined use case(s). Transform business requirements into technical specifications
  • Resolve existing data grievances/pain points and data quality issues in the new system and mitigate risk.
  • Enhance Data models/processes that improve efficiency, performance and stability and build re-usable components/API.
  • Review BRDs/user stories, map business processes, and create CDM/LDM and ETL/DQ rules and API designs to cater for new business requirements, incorporating data standardization strategies; present to IT and business.
  • Development of various functional domain service contracts (SOAP and REST APIs) and sample test data snapshots in relational and JSON/XML formats. Provide subject area models in VISIO/PDF as guidelines for API development.
  • Analysis of requirements to identify critical data elements/linkages and establish data lineage and data strategy.
  • Prepare DFDs/models/mappings to the industry Confidential model CMP GOLDEN SOURCE 8.x (Silver/Gold data) REFERENCE DATA PLATFORM and the downstream data warehouse. Develop SQL queries based on the GOLDEN SOURCE 8.x product tables.
  • Proposed data solutions and changes in data models (ERWIN/VISIO) and ETL solutions covering 128 countries for financial data consolidation, reporting, and data distribution requirements. Articulate changes in the DWH data model and coordinate with on-shore and off-shore teams on UAT and monthly releases using ORACLE SQL, SQL Server, PL/SQL, Python, VBA/Macro, INFORMATICA, UNIX, AUTOSYS, and MICROSTRATEGY for various development needs.
  • Document the data flow/mapping and methods for data ingestion to hub as well as data distribution from hub via DVL layer
  • Document the flow and calculation of key REF data, risk metrics/measures (30+ indicators and scores), and associated metadata in the DWH, and explain them to the technology team.
  • Solidify data requirements and identify gaps in the DATA MODEL. Prepare/review process and data flow diagrams.
  • Provide data inputs and assist in POC for ETL pipelines on AWS EMR and Big Data platform pilot implementation
  • Document logic for data collection and aggregation of historical data across full time-series and consolidation in DWH
  • Identify data and process integration impacts for new platform implementation with Real time ETL and overnight batch ETL.
  • Identifying DATA GOVERNANCE issues and formulating refined business process and data flow for long term solution.
  • Identify and analyze new sources for Phase 2, researching API data retrieval mechanisms and data integrations from the BEA website. Analyze and support data loading in the Big Data environment for new sources for the POC.
  • Review and advise on Informatica S2T mappings, mapplets, and reusable transformations. Also used several transformations such as Expression, Aggregator, Lookup, Joiner, Normalizer, Router, Update Strategy, Filter, Union, Sequence Generator, Stored Procedure, Source Qualifier, XML transformation, etc.
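
A minimal sketch of the kind of SQL profiling query behind the data quality checklist above; the table and column names (stg_security_master, cusip, issuer_name) are hypothetical placeholders, not the actual source objects.

```sql
-- Profile a staging table before formalizing DQ rules: row count,
-- distinct-value count, and null rates per critical data element.
SELECT
    COUNT(*)                                       AS total_rows,
    COUNT(DISTINCT cusip)                          AS distinct_cusips,
    SUM(CASE WHEN cusip IS NULL THEN 1 ELSE 0 END) AS null_cusips,
    ROUND(100.0 * SUM(CASE WHEN issuer_name IS NULL THEN 1 ELSE 0 END)
          / NULLIF(COUNT(*), 0), 2)                AS pct_null_issuer
FROM stg_security_master;

-- Flag candidate duplicates on the natural key for the DQ dashboard.
SELECT cusip, COUNT(*) AS dup_count
FROM stg_security_master
GROUP BY cusip
HAVING COUNT(*) > 1;
```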

Confidential

Data Modeler/Architect

Responsibilities:

  • Delivered end-to-end architecture and design and led the data modeling activities taken up from a data architecture deliverables standpoint.
  • Analysis to identify data model changes to cater for critical data elements (for direct feeds from upstream source systems and strategic feeds from the EDW) in replacing the legacy compliance system for screening against Sanctions and Due Diligence lists (PEP, Adverse Media, and AML) for all segments (i.e. CIB, Private Banking, Retail, Commercial, Employees, and Securities) and related-party screening. Discuss and validate the end-to-end solution for each source system's screening by FIRCOSOFT, and model data needs for reporting and for data quality checks/dashboard requirements by source system.
  • Work with the Technology and Architecture teams to design and develop data solutions/models in Anti-Money Laundering (AML Screening GNS) and Trade Surveillance.
  • Provide inputs on creating the Tier 2 data model (Oracle/Teradata) and ETL (INFORMATICA) rules and mappings, and Tier 3 downstream Teradata views based on Tier 1 source data. Assist in reconciliation of data from source to target systems before feeding the screening systems. Feed in the recommendations/changes required for ERWIN model publishing for the releases.
  • Qualitative and quantitative data analysis, validating ETL Rules and DQ rules in line with DQMF (Data Quality Monitoring Framework) for risk management.
  • POC for HADOOP platform implementation for storing, processing, and analyzing large volumes of data and query results
  • Support ETL to RDBMS and HDFS, and perform data analytics using HIVE and HiveQL on customer data stored in Hadoop HDFS; a representative HiveQL query is sketched after this list.
  • Participate in the development and maintenance of corporate data architecture, data management standards and conventions, data dictionaries and data element naming standards and conventions for multiple computing environments
  • Support data review processes by collecting information and fulfil data and reporting requests received from internal stakeholders. Provide inputs on data quality improvements and data/business transformations rules
  • Identify data and process integration impacts for new platform implementation. Enhancing and standardizing the interfaces for the Excel Loader data/list entries integration and Data provisioning for Management Information reports
  • Coordinate new data development and ensure consistency and integration within existing application systems.
  • Partner with technology to “translate” the business requirements into technical requirements and analysis of data gaps.
  • Recommend solutions and options for system performance improvements and Build POC (Proof of Concept) for application design and reporting requirements
  • Solidifying the conceptual/logical model and data governance and formalize migrations and distribution of data to downstream operations systems. Assist in system testing, SIT and UAT and production dry runs.
  • Provide regular updates (PPT), RAID logs, and project plans to stakeholders and management. Manage and track CRs and the issues log.
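
A minimal HiveQL sketch of the kind of analytics run on HDFS-resident customer data mentioned above; the table, columns, and partition key are illustrative assumptions only.

```sql
-- Aggregate customer transactions on Hadoop, restricting the scan to
-- one date partition so only the relevant HDFS files are read.
SELECT customer_segment,
       COUNT(*)        AS txn_count,
       SUM(txn_amount) AS total_amount
FROM customer_txn                 -- hypothetical Hive table
WHERE txn_date = '2016-03-31'     -- partition pruning on txn_date
GROUP BY customer_segment
ORDER BY total_amount DESC
LIMIT 20;
```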

Confidential

Data Modeler/Data Architect

Responsibilities:

  • Worked on client/finance data initiatives to roll out architecture, design, and data model deliverables.
  • Analyzing and scoping requirements from multiple lines of business (middle/back office operations, Finance & Control, Treasury, Global Markets, Compliance) and providing data solutions and options by carrying out structured analysis, assisting with process and data flow improvements in the target state to support multiple work streams for operational data deliveries and report delivery from the core banking DWH across the bank's business lines.
  • Identifying and modeling (ERWIN, Oracle target database) key data elements for compliance and regulatory reports from the core banking HUB.
  • Interface with key user groups, the Architecture team, and Global Head Office functional SMEs to ensure project requirements are accurately documented in line with set standards/expectations for multiple work streams of the project. Inform the head office of any changes required in the Confidential module, data dictionary, logical models, and architecture.
  • Document data solutions & data model changes, design, business rules, data transformations rules (INFORMATICA) mappings, people process tasks, & provision data for ODS/BI reporting requirements & identify System of record (SOR) and data needs.
  • Isolate the impact of data changes on downstream consumers of data & reports and document the details and scope.
  • Enhancing and standardizing the interface for the Excel Loader data source integration and data provisioning for the Business Objects UNIVERSE to cater for BI BOXI reports for the new enhancements from the data warehouse.
  • Prepare models/documents for account-level reconciliation and exception/break reporting using the GL and finance data; a reconciliation query is sketched after this list.
  • Discuss feasibilities & dependencies with technology teams on various bank-wide projects on data migrations, reporting and automations. Provide hands-on management and guidance for the development team from requirements perspective.
  • Prepare design documents which include descriptions of basic functionality along with definitions of host interfaces (layout and mapping), GUI, processing flows, reporting, database changes, test strategy/test cases, UAT planning.
  • Prepare Data artifacts, Data Dictionary, S2T mappings, data flow for working group meetings and explain gaps/risks.
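
A sketch of the account-level GL reconciliation and break reporting referenced above, assuming hypothetical gl_balance and subledger_balance tables keyed by account and business date.

```sql
-- Compare GL and sub-ledger balances per account for one business
-- date and report only the exceptions (breaks), including one-sided
-- rows surfaced by the full outer join.
SELECT COALESCE(g.account_no, s.account_no)            AS account_no,
       g.balance                                       AS gl_amount,
       s.balance                                       AS subledger_amount,
       COALESCE(g.balance, 0) - COALESCE(s.balance, 0) AS break_amount
FROM gl_balance g
FULL OUTER JOIN subledger_balance s
       ON  g.account_no    = s.account_no
       AND g.business_date = s.business_date
WHERE COALESCE(g.business_date, s.business_date) = DATE '2013-06-28'
  AND (g.account_no IS NULL          -- in sub-ledger only
       OR s.account_no IS NULL       -- in GL only
       OR g.balance <> s.balance);   -- amount break
```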

Confidential

Data Architect/Senior Consultant

Responsibilities:

  • Collaborate with various partners (TREASURY, COMPLIANCE, OPERATIONS, CLIENT SERVICES AND FULFILLMENT, Monitoring, and GLOBAL PRODUCT) on BRDs, modeling business requirements, data needs, business processes, and other business rules to ensure alternatives have been carefully considered and the chosen solution meets business objectives in functional areas such as Payments/Collections, Returns, Rejects, Repair, Sanctions Filtering (interface with FIRCOSOFT), COB to the new platform, AML Compliance, Transaction Monitoring, Accounting, Regulatory Reporting, Clearing, Reconciliation, Investigations, and Billing (Netezza).
  • Comprehend existing and new vendor systems' logical and physical models (ERWIN) and features, and consolidate new client data and the data attributes necessary for client onboarding and client preferences in the new system.
  • Translation of business and information requirements to solidified data/report technology requirements for implementation using ORACLE, UNIX, FIRCOSOFT interface, MICROSTRATEGY.
  • Understand the AS-IS and TO-BE architecture and review, capture data flow and clearly define techno-functional and specific information requirements within CLIENT REFERENCE data, ETL & Reporting space. Develop data flow and data sourcing strategy for various Reports (Ops/Central Bank Reporting/Regulatory Reports) from the new repository.
  • Document the reporting data requirements in a catalogue, analyze re-use of existing reports or any incremental changes, obtain sign-off, identify the right data sources to fulfill data requests and ETL methodologies, and ensure data provisioning in the ODS, data mart, or DWH layers of the global BI MicroStrategy reporting platform. Work with the data management and BI technology teams in the UK and India to fill gaps, provide solutions, and leverage existing reports.
  • Collect and maintain formal descriptions of data elements, data definition, data structures and data governance rules.
  • Perform analysis to identify System of record (SOR) and data needs for CLIENT REPORTS, OPERATIONAL, ANALYTICAL and MIS REPORTS from layers of new GOR reporting platform.
  • Analyze data and process gaps (for FI and NON-FI clients and data attributes) between existing legacy system and new system for several countries requirements for interim and final state. Identify country specific requirements.
  • Provide guidance on standardization of data consolidation and interfaces, and assist in creating tactical and strategic models for operational and MIS reporting and the CLIENT DATA MIGRATIONS strategy to the new reporting DWH (GOR) platform.
  • Ensure the Data Architecture team incorporates regional variations and deviations for country requirements.
  • Ensure application of information management recommendations and identify key areas of data quality, data lineage, data security, data retention, data governance and metadata (using INFORMATICA).
  • Compile data attributes and data migration rules for ON-BOARDING of early adopter clients coming via various bank channels and provisioning of the client, product & transaction data (daily & historical) attributes in the target system (Golden Source of Data)
  • Preparing Glossary, data flow diagram, Use Cases, Business Rule and Data mapping specifications & traceability matrix.

Confidential

Data Modeler/Data Analyst

Responsibilities:

  • Analysis and data modeling of all 800+ client/account/market/LE-specific data attributes for 11 entities, jurisdictions, and departments; formalize data governance rules and model them in both the APAC model and the global strategic data model (3NF normalized, extensible data model in ERWIN).
  • Study of the bank's local and global trading and settlement systems for Cash Equities, Listed and OTC Derivatives, FX, and MM to understand the client on-boarding process and data relationships, and all counterparty, portfolio, trading account, and SSI attributes, to assist in documenting workflow automation of client on-boarding and consolidation of client reference data in order to get a single view of the customer portfolio and associated data in the client consolidated repository (CCR - golden data of the bank).
  • Streamlining of COB processes/data workflow in new system, remove data redundancy, optimizing data for business consumption/distribution and regulatory data compliance from both APAC and GLOBAL repositories.
  • Solidifying the conceptual/logical model and data governance and formalize migrations and distribution of data to downstream operations systems for client/portfolio/accounts trading various products (EQ, FI, FX, MM, & DERIVATIVES).
  • Classifications of data domain, identifying type of entities (Category, Attributive, Associate), Key/Non-Key/Derived Attributes, and Relationships (Cardinality, Degree and Nature) for APAC and Global Model.
  • Perform data analysis using MS ACCESS and publish the analysis results to decide on migration strategies.
  • Design and development of the strategic and tactical client data model using ERWIN across 15 subject areas, taking TERADATA FSLDM as a reference model; a simplified party/account sketch appears after this list. This includes business object model development, logical data model development, data attribute definition, and related source-to-target mapping against the source systems.
  • Document descriptions of data elements, data definition, data structures and data governance rules.
  • Record/Report data quality issues in the issue log and work with the governance, business and IT SMEs to resolve them.
  • Analysis of client (party and account) on-boarding processes, manual account opening forms (AOF/e-AOF) for 11 legal entities, KYC, CID and CDD, and regulatory pre-screening; replacement of the manual on-boarding process with an automated process, ensuring data compliance with KYC, AML, FATCA, and other operational reporting requirements.
  • Collaborate with business (COB, FO(RM)/CID, CDMC, RDS, OPERATIONS, LEGAL & COMPLIANCE, PC, Credit Risk, CRM) to carry out system Client onboarding STP workflow analysis and propose various options for process & data model and presentation to stakeholders and highlight gaps. Provisioning of data attributes for FATCA regulatory reporting from APAC and GLOBAL repository. Preparing test cases, BA artifacts, design documents and getting sign-offs from various parties.
  • Preparation of client data integration and data migration approach documents and get approval from various stakeholders.
  • Analyze gaps between existing and target business/data architecture and impacts on the data migrations to CCR and ensure alignment to Enterprise Architecture standards and processes. Assist in data cleansing using data feed from AVOX.
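
A greatly simplified DDL sketch of the 3NF party/account core of the client data model described above; entity and column names are illustrative assumptions standing in for the 15-subject-area ERWIN model.

```sql
-- Party and Account as independent entities in 3NF.
CREATE TABLE party (
    party_id      INTEGER      NOT NULL PRIMARY KEY,
    party_type_cd CHAR(2)      NOT NULL,  -- e.g. individual, legal entity
    party_name    VARCHAR(200) NOT NULL,
    kyc_status_cd CHAR(1)      NOT NULL
);

CREATE TABLE account (
    account_id      INTEGER NOT NULL PRIMARY KEY,
    account_type_cd CHAR(2) NOT NULL,     -- e.g. trading, custody, cash
    open_dt         DATE    NOT NULL
);

-- Associative entity resolving the many-to-many party/account
-- relationship, carrying the role (owner, signatory, etc.).
CREATE TABLE party_account_role (
    party_id     INTEGER NOT NULL REFERENCES party (party_id),
    account_id   INTEGER NOT NULL REFERENCES account (account_id),
    role_cd      CHAR(2) NOT NULL,
    effective_dt DATE    NOT NULL,
    PRIMARY KEY (party_id, account_id, role_cd, effective_dt)
);
```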

Confidential

Data Modeler/Data Architect

Responsibilities:

  • Work with EA to finalize data architecture changes and ERWIN data model changes, and get sign-off for the MY implementation.
  • Understanding the flow of post-trade data from source systems and migration of relevant data attributes into the enterprise DWH (TERADATA FSLDM) and then to the Malaysia FDM with the necessary filtering logic/views. Make necessary updates to upstream and downstream source-to-target mapping documents and data extraction/transformation rules from the upstream TERADATA FSLDM. Development of Python/SQL scripts for data profiling and data extractions from multiple sources.
  • Participate in review of the dimensional data modeling, design, and approach to be used for different data mart components such as hierarchies, XMAP and look-up tables, pre- and post-OFSA instrument tables, Break Funding and Liquidity Premium, GL-related tables, risk metrics, cost allocation tables, YTD tables, and the historical data table for Malaysia MIS reporting, with the Singapore financial data mart tables on the TERADATA FSLDM platform as the baseline. This includes data for CIF (customer details), hierarchies, deposits (savings, current, fixed), loans (mortgage, consumer, commercial), credit cards, capital market securities, FX, derivatives, money market, telegraphic transfer, remittance, and insurance products in the bank. This involved creation of a total of 90+ PDM DDL scripts deployed in different layers of Teradata.
  • Preparing the end-to-end data flow diagrams for GE Insurance (acquired by OCBC Bank) following OCBC's guidelines, and extending the model to accommodate insurance data requirements.
  • Document & Explain the FSLDM upstream subject area model (with main entity PARTY, PARTY ASSET, PRODUCT, AGREEMENT, EVENT, FINANCE/GL) and PDM entities, PK, relationships, S2T mapping approach & Validation Rules.
  • Validating and Finalizing the ERWIN Downstream Data model with about 20 Subject Areas and aligning to upstream TERADATA FSLDM model.
  • Maintain tracker spreadsheet for the changes in LDM, SIBS Retrofit Changes and ensure the PDM script deployments.
  • Publishing the data model comparison reports and results weekly to the team for discussion for any changes.
  • Define FDM entities (instrument, look-up, reference, and user control data) and data attributes, and prepare a glossary of terms for consistent usage. Logical and physical data modeling (ERWIN) in TERADATA FSLDM for the MY downstream implementation and reporting; standardization of data types and methodology for correct REFERENCE data and keys in the FSLDM DWH tables.
  • Contribute data model related aspects to the overall Architecture. Update MIS Malaysia EDW Platform Downstream Data Architecture document and get approval from OCBC stake holders on the SUBJECT AREA.
  • Document and get approval for any table/subject area changes and upstream FSLDM product data extraction criteria changes to bring the data into MY FDM
  • Assist in finding out the correct source system data to be brought into GL related tables at customer account and transaction levels for GL reconciliation and reporting amount on different business accounting segments like (product codes, GL account numbers, cost center codes, customer segments, entity code and sub-ledger codes).
  • Set up appropriate Unique Primary Indexes (UPI), Unique Secondary Indexes (USI), Non-Unique Primary Indexes (NUPI), Non-Unique Secondary Indexes (NUSI), and Join Indexes in the instrument tables as required in Teradata; an indexing sketch appears after this list.
  • Bringing RISK METRICS table data (sourced from source system FERMAT) into the Malaysia data mart with all fields required for the calculation of risk parameters, to cater for the requirement to automate computation of operational risk RWA and required capital in line with the BASEL-II approach and risk calculation scenario. Analysis to find data model changes in several instrument tables to accommodate ORWA, CRWA, RCAP, ECAP, and EL, to be transferred back into the respective instrument tables after computation of risk data elements in the risk metrics table. Upstream data modelled in RISK CALC SCENARIO, BASEL FUNCTION TYPE, BASEL II APPROACH, HISTORY (CREDIT RISK, EQUITY HOLDING, CAPITAL), and other data required for risk calculation.
  • Make changes in the subject area model and views required for FDM GL Reconciliation and Exception reporting
  • Make changes to table structures along with some additional new fields required by ETL processes/jobs at run time.
  • Refine and Publish logical models for Cross Reference tables and Lookup tables used by different areas in the FDM.
  • Discuss with the ETL team, BI Hyperion Reporting team, and testing team to ensure there are no data gaps in the data model, and address any gaps effectively with proper impact analysis. Guide on the test data sources used by the Hyperion OLAP CUBES.
  • Document requirements for SAS data sets and BI dashboards/scorecards using BO and XCELSIUS for senior managers at MOM.
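
A Teradata DDL sketch of the indexing choices listed above (UPI for uniqueness and even AMP distribution, NUSI for a secondary access path, join index for a frequent reconciliation join); all object names are hypothetical, not the actual FSLDM/FDM tables.

```sql
CREATE TABLE gl_account (
    gl_account_no  CHAR(10) NOT NULL,
    cost_center_cd CHAR(6)  NOT NULL
)
UNIQUE PRIMARY INDEX (gl_account_no);        -- UPI

CREATE TABLE fdm_instrument (
    instrument_id INTEGER       NOT NULL,
    gl_account_no CHAR(10)      NOT NULL,
    product_cd    CHAR(4)       NOT NULL,
    balance_amt   DECIMAL(18,2)
)
UNIQUE PRIMARY INDEX (instrument_id);        -- UPI on the instrument key

-- NUSI giving a secondary access path for product-level queries.
CREATE INDEX idx_instr_product (product_cd) ON fdm_instrument;

-- Join index pre-joining instrument balances to GL accounts for
-- GL reconciliation reporting.
CREATE JOIN INDEX ji_instr_gl AS
SELECT g.gl_account_no, g.cost_center_cd, i.instrument_id, i.balance_amt
FROM gl_account g
INNER JOIN fdm_instrument i
        ON g.gl_account_no = i.gl_account_no;
```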

Confidential

Data Modeler/DWBI Architect

Responsibilities:

  • Provide guidance on architectural and data modelling issues for Banking and Insurance domain projects
  • Prepare guidelines/documentation on the ERWIN LDM and PDM approach, FSLDM usage, data modeling artifacts for multiple subject areas, Teradata DDL scripts, and indexing mechanisms.
  • Defining critical business data elements and relationships. Logical and physical data modeling (3NF) using ERWIN r8 for both Oracle and Teradata as target databases. Classification of attribute types and identification of relationships within a subject area and across multiple subject areas. Document the ETL design/specs for SSIS implementation.
  • Mapping business requirements into Conceptual and Logical Models (TERADATA FSLDM) for banking and finance industry.
  • Understand the needs of business units and assist in the development of roadmap for models and releases.
  • Providing support to delivery teams for architectural and performance issues with necessary changes in data model

Confidential

Data Engineer/DWBI Consultant

Responsibilities:

  • Dimensional data modeling for metric calculations and bucket segmentation, historical data, and views required by BI reporting. Created conformed data structures, facts and dimensions, primary (roll-up) metrics, derived metrics (roll-down records), snapshot metrics, and systematic adjustments for multiple metric types and CDR types (wireless or landline).
  • Identify, assess, and document business requirements and work out the strategy for data movement between data sources and targets using the SSIS ETL tool; create a data model to support the requirements and create POCs and mockups. Plan and document the data migration strategy. Delivery of 30+ BI reports with federated queries on Netezza and SQL Server.
  • Creation of design documents, data models, and technical specifications and getting them approved by the COE team in the USA.
  • Provided technical guidance to the team and ensured delivery as per set standards. Created a generic approach using dynamic SQL to cater for the dynamic aggregation requirement for fact metrics at all levels of aggregation for OLAP reports, as sketched below.
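
A T-SQL sketch of the dynamic-SQL aggregation approach mentioned in the last bullet: the grouping column is chosen at run time so a single statement serves every roll-up level of the OLAP report. The table and columns are hypothetical placeholders.

```sql
-- Pick the aggregation level at run time (e.g. region, market, switch).
DECLARE @group_col sysname = N'region_cd';
DECLARE @sql nvarchar(max);

-- Build the aggregate query; QUOTENAME guards the identifier against
-- SQL injection.
SET @sql = N'SELECT ' + QUOTENAME(@group_col) + N',
       SUM(call_count)    AS call_count,
       SUM(dropped_calls) AS dropped_calls
FROM dbo.fact_cdr_metrics
GROUP BY ' + QUOTENAME(@group_col) + N';';

EXEC sys.sp_executesql @sql;
```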

Confidential, NY

Senior Developer/Data Consultant

Responsibilities:

  • Reverse engineering of existing OLTP databases to logical data model using ERWIN for business analysis and discussions.
  • Study of existing OLTP applications and legacy systems. Data modeling (logical to physical design) of the ODS/EDW using ERWIN. Schema design for REFERENCE DATA, INSTRUMENT DATA, MARKET DATA, and PARTY DATA.
  • Logical and Physical data modeling using ERWIN r8 (12 Subject Areas) with Oracle as target database and generate various ERWIN reports and Model Comparison Reports, Model Quality Reports and DDL scripts.
  • Assist in implementation of FIDESSA Latent Zero vendor modules for FI (Minerva, Sentinel, Tesseract, Derivatives) and REAL-TIME loading of market DATA (BENCHMARKS and CREDIT RATINGS).
  • Development of complex generic packages using Oracle, PL/SQL, UNIX and Control-M and testing
  • Identify the key data elements for modelling client accounts party data, Instruments and Bloomberg data for market valuation and regulatory reporting. Database Schema design, DDL scripts, Development of Account Market Valuation Module.
  • Analysis for provisioning of Business Objects UNIVERSE to cater for 20+ BI reports from repository
  • Collaborate with Architect, SME and ETL team to enhance the model for flexibility and performance of data loading.
  • Document the requirements for data extraction from SYBASE/UDB, transformation and loading to ORACLE HUB.
  • Provide inputs on data quality improvements and data/business transformation rules as the data SME, as required for INFORMATICA mappings, sessions, and workflows and the dependencies between workflows.
  • Assist in FULL and DELTA INFORMATICA ETL test cases and CONTROL-M batch timings for the data flow for lookup, transaction and position data, close-of-business events, start-of-business events, market data, and required reference data.
  • Reconciliation of both source and target data sources using complex SQL joins, correlated sub-queries, and other aggregate functions to validate test results, as sketched below.
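
A sketch of the source-to-target reconciliation SQL described in the last bullet, assuming hypothetical src_positions and hub_positions tables: aggregate totals are compared per business date, and a correlated sub-query surfaces target rows with no source counterpart.

```sql
-- Compare row counts and summed amounts between source and target.
SELECT s.trade_date, s.src_rows, t.tgt_rows, s.src_amount, t.tgt_amount
FROM (SELECT trade_date, COUNT(*) AS src_rows,
             SUM(market_value) AS src_amount
      FROM src_positions GROUP BY trade_date) s
JOIN (SELECT trade_date, COUNT(*) AS tgt_rows,
             SUM(market_value) AS tgt_amount
      FROM hub_positions GROUP BY trade_date) t
  ON s.trade_date = t.trade_date
WHERE s.src_rows <> t.tgt_rows
   OR s.src_amount <> t.tgt_amount;

-- Correlated sub-query: target rows missing from the source (orphans).
SELECT h.position_id
FROM hub_positions h
WHERE NOT EXISTS (SELECT 1
                  FROM src_positions p
                  WHERE p.position_id = h.position_id);
```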
