TDM Solutions Architect Resume
New York, NY
SUMMARY:
- Set up Optim server on the Linux platform
- Set up target and source database (Oracle) configurations
- Performed upgrades
- Set up relationship mappings
- Adapted various selection criteria to mappings
- Performed subsetting processes
- Performed masking processes
- Developed and implemented a TDM framework, including customization around IBM Optim
- Triggered custom loads of Optim data using Data Pump to achieve performance through multithreaded/parallel processing
- Automated all load-specific components, including dynamic generation of Data Pump configuration files (a sketch follows this summary)
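A minimal sketch of the dynamic Data Pump configuration-file generation described above; the schema, directory, dump-file names, and connection variables are placeholders, not the actual Optim load configuration:

```ksh
#!/bin/ksh
# Minimal sketch: emit a Data Pump import parameter file sized to the requested parallelism.
# DP_DIR, the dump-file pattern, and DB_USER/DB_PASS/DB_TNS are assumed placeholders.
SCHEMA=${1:-APP_OWNER}
DEGREE=${2:-4}

cat > impdp_${SCHEMA}.par <<EOF
DIRECTORY=DP_DIR
DUMPFILE=${SCHEMA}_%U.dmp
LOGFILE=${SCHEMA}_imp.log
PARALLEL=${DEGREE}
REMAP_SCHEMA=${SCHEMA}:${SCHEMA}_TEST
TABLE_EXISTS_ACTION=TRUNCATE
EOF

impdp "$DB_USER/$DB_PASS@$DB_TNS" parfile=impdp_${SCHEMA}.par
```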
TECHNICAL SUMMARY:
Languages: Java, C, Perl, VBScript, KSH/SED/AWK, PL/SQL, T-SQL, JCL
O/S: Solaris, AIX, Linux, Windows, Z/OS
Technologies: IBM InfoSphere Guardium 9.0, Optim TDM 8.1, MS SSIS, Access, Erwin, PowerDesigner, IBM TWS, FrontRange ITSM, HP QC (Quality Center) 10, HP QTP (QuickTest Professional) 10
Databases: Cassandra, Graph, Spark, Solr, DB2, Teradata, Oracle, SQL Server, Sybase
Hardware: Sun, IBM
PROFESSIONAL EXPERIENCE:
Confidential, New York, NY
TDM Solutions Architect
Responsibilities:
- As chief architect, engaged in R&D efforts with an IP (intellectual property) focus for data solutions/frameworks, including SOA/microservices, NoSQL (DataStax Cassandra, Spark, Graph, Solr), AI and machine learning for pattern recognition, and predictive analytics based on behavioral attributes.
- Served in a lead role as principal TDM architect, making assessments for clients and deriving innovative custom solutions specific to client requirements, including cost/benefit and timeline/resource analysis.
- Interviewed client development teams and leads to learn current practices and derive solutions to test data challenges related to DevOps and test automation efforts.
- Provided the complete solution and design of a TDM framework, which included an AI component to derive data patterns and relations; a database repository to capture metadata of the various test data and message structures from producer and consumer systems; a rules engine to capture rules specific to each data element and attribute and to drive decisions; a service virtualization solution to drive applications/functions with real data; and drivers for CA DataMaker and IBM Optim using XML configurations.
- Designed a TDM framework for a major financial institution on the TPF platform supporting a real-time credit authorization system across mainframe and distributed platforms, including synthetic data (message) generation for ingestion by various message simulators through MQ (see the sketch after this list).
- Developed data model and process architecture for complete TDM framework.
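A minimal sketch of the synthetic message generation described above, assuming IBM MQ's amqsput sample program is installed; the queue, queue manager, and fixed-width layout are illustrative placeholders:

```ksh
#!/bin/ksh
# Minimal sketch: generate synthetic authorization messages and put them on an MQ queue.
# QMGR, QUEUE, and the message layout are illustrative placeholders.
QMGR=QM_TEST
QUEUE=AUTH.REQUEST.IN
COUNT=${1:-100}

i=1
while [ $i -le $COUNT ]; do
    acct=$(printf "%016d" $(( RANDOM * RANDOM )))      # synthetic account number
    amt=$(printf "%08d" $(( (RANDOM % 99999) + 1 )))   # synthetic amount in cents
    ts=$(date +%Y%m%d%H%M%S)
    print "0100${acct}${amt}${ts}"                     # one fixed-width message per line
    i=$(( i + 1 ))
done | /opt/mqm/samp/bin/amqsput "$QUEUE" "$QMGR"       # amqsput reads one message per line
```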
Confidential, New York, NY
Enterprise Data Steward/Solutions Architect
Responsibilities:
- Responsible as principal data architect for redesigning the existing data landscape of two major applications.
- Designed a metadata-based model to replace the existing relational data model, eliminating constraints and limitations on the number of attributes within any table.
- Responsible for data governance and stewardship, enforcing policies and standards across applications, including evaluation of specific data points as assets within the enterprise platform.
- Identified responsible parties for each data point contributing to the enterprise platform; validated data for accuracy and consistency; and architected for accessibility and for bidirectional as well as unidirectional data flow using SOA from the Oracle EBS platform and JSON-based API access over Solr.
- Provided technical evaluation and a POC for implementation of DataStax Cassandra, Graph, and Spark as a NoSQL/Big Data solution for the intelligent selection process and predictive analytics, along with implementation of integrated Solr indexing, including benchmarking relational vs. graph database performance.
- Installed and configured multi-node instances of Cassandra, Graph, Spark, and Solr for distribution by McKinsey's geographic regions worldwide.
- Provided a NoSQL implementation on Cassandra to accommodate person-data text-based attributes, such as comments and evaluations of individuals, for analytics.
- Designed a best-fit team selection process on Cassandra-Spark based on project/engagement characteristics such as geographic attributes, industry, functional expertise, education, language skills, etc.
- Implemented Solr based index optimization.
- Designed microservices and EBS SOA services for sourcing and writing data to multiple database platforms.
- Transformed the relational model and implemented it as a graph model (including vertices and edges) on top of Graph (TinkerPop).
- Developed a mechanism using shell script on Linux to generate the graph schema, including property keys, vertex and edge labels, vertex and edge properties, and vertex-to-edge connections (see the sketch after this list).
- Designed and developed interface to transfer/sync data from Oracle into Cassandra.
- Led the data architecture effort to redesign a legacy application - working closely with development teams - following Agile methodology to evolve and implement the new data architecture.
- Worked with Enterprise Data Architecture team to evaluate, analyze and make assessment of data consistency across application landscape.
- Provided enterprise data architecture, including a consistent path for unified data consumption from, and storage into, the enterprise platform by various applications.
- Responsible for detailed data analysis and identification of enterprise data vs. application-specific localized data across the organization, and for specifying the technical strategy for storing data to and consuming data from the enterprise data platform (Oracle EBS) using the SOA gateway as well as APIs through JSON and Solr.
- Provided evaluation of Stored procedures and views related to reporting and other functions on EBS/Oracle platform.
- Developed and published a complete enterprise level data access security roadmap including evaluation of current state and recommending the future state.
- Detailed a standard data access security framework that includes Policy, Model, and Mechanism, along with the Rules and Primitives that make up the fundamental elements of the policy.
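A minimal sketch of the shell-based graph schema generation described above, assuming the DSE Graph (Gremlin) schema API and a hypothetical column-metadata export; all label and property names are illustrative:

```ksh
#!/bin/ksh
# Minimal sketch: emit graph schema statements from a column-metadata file.
# Input format (hypothetical): table,column -- e.g. exported from ALL_TAB_COLUMNS.
META=${1:-table_columns.csv}

awk -F',' -v q="'" '
  { tables[$1] = 1; props[$2] = 1; cols[$1] = cols[$1] q $2 q "," }
  END {
    for (p in props)  print "schema.propertyKey(" q p q ").Text().create()"
    for (t in tables) {
      sub(/,$/, "", cols[t])
      print "schema.vertexLabel(" q t q ").properties(" cols[t] ").create()"
    }
  }' "$META"

# Edge labels for parent-child relationships would be emitted the same way, e.g.:
#   schema.edgeLabel('has_child').connection('parent_table','child_table').create()
```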
Confidential, New York, NY
Database Engineer
Responsibilities:
- Designed and implemented a metadata-based data repository to capture all aspects of Guardium appliances, including the hierarchical layout from Central Manager to Collector to S-TAP to database engines.
- Designing and implementing a rules-based engine to drive all data collection for analysis and reporting of various security related concerns.
- Developed automation script to capture Guardium diagnostic details.
- Developed dynamic reports including appliance machine tracker capturing space, cpu, memory, etc.
- Developed automation scripts for Guardium patch deployment and installation in Aggregators and Collectors.
- Developing views and stored procedures on the Sybase platform.
- Designed and implemented the physical data model (all metadata-based).
- Configured Central Manager, Aggregators, and Collectors.
- Managed groups and users for Guardium.
- Installed S-TAP agents onto database server machines on Sybase, Oracle, DB2, and SQL Server platforms.
- Managed reports and policies - propagated into various appliances using export/import functions.
- Configured Central Managers and Aggregators to send reports to Hadoop through SYSLOG.
- Developing a CLI interface extension for IBM InfoSphere Guardium security using Perl 5/Expect, with functions to retrieve various details of S-TAP, Collector, and Aggregator.
- Working with IBM to improve the CLI for Confidential's automation paradigm.
Confidential, Jersey City, NJ
TDM/Data Solutions Architect
Responsibilities:
- Developing a TDM (Test Data Management) framework to facilitate automated test data provisioning activities across QA and development platforms (Oracle, Sybase, SQL Server, and DB2) using Java (NetBeans 8 framework), KSH, and T-SQL.
- Designed and implemented a non-deterministic masking algorithm, consistent across all platforms, to obfuscate unsecured production data exposed in development and test platforms.
- Implementing a rules-based engine to support the following:
- Intelligent non-interactive activities and processes (e.g., unattended data-securing masking)
- Dynamic code generation that includes masking logic specific to each database platform (see the sketch after this list)
- Scripts and wrappers for various levels of masking and provisioning activities
- Database functions (where supported) that include masking algorithms
- Stored procedures that drive the masking process update-in-place
- Optional methods (scripts) external to the target database, for cases where the database version does not support functions or the target database is not available for updates
- The provisioning framework consists of a dynamic Self-Learning Engine (SLE) with the capability to learn from various provisioning actions and events and subsequently utilize that knowledge to determine optimal actions.
- A Central Metadata Repository (CMR) is included as the heart of the TDM framework to collect the following:
- All database-related attributes such as DDLs, including column, index, constraint, etc.
- All provisioning activity statistics, including a log of each, to be utilized for audit purposes as well as analytics/reporting.
- Metadata deltas as database changes occur, including DDL changes potentially affecting provisioning activities; Dynamic Adaptation (DA) of these changes prevents process failures by applying artificial intelligence and self-correction policies implemented through the rules-based engine.
- Process- and activity-related data, including latency and event-based exceptions, fed into the rules-based engine to be utilized by the Decision Making Engine (DME).
- Designing the interface between various data feeds and target databases to transform data into masked form.
- Producing a TDM framework with a centralized and integrated database platform to provide data provisioning services on demand through a self-service interface.
- Developed a TDM strategy/document for Confidential, including features such as automation and integration of various data feeds from source to target platforms, as well as functionality such as data masking, sub-setting, and data generation.
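A minimal sketch of the dynamic, platform-specific masking code generation described above; the control-file layout and the MASK_FN function are placeholders for the framework's actual masking routines:

```ksh
#!/bin/ksh
# Minimal sketch: generate update-in-place masking SQL per platform from a control file.
# Control file format (hypothetical): platform|table|column, e.g. "oracle|CLIENT|SSN".
# MASK_FN stands in for the installed masking function on each platform.
CTL=${1:-mask_columns.ctl}

while IFS='|' read platform table column; do
    case $platform in
        oracle)     print "UPDATE ${table} SET ${column} = MASK_FN(${column});" ;;
        sqlserver)  print "UPDATE ${table} SET ${column} = dbo.MASK_FN(${column});" ;;
        sybase|db2) print "UPDATE ${table} SET ${column} = MASK_FN(${column});" ;;
        *)          print -u2 "unknown platform: $platform" ;;
    esac
done < "$CTL" > masking_$(date +%Y%m%d).sql
```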
Confidential, WI
TDM/Data Solutions Architect
Responsibilities:
- Developed award-winning data-related architecture/solutions - 1st prize in 2013 WIPRO Innovations.
- Responsible for assessment of business requirements for multiple divisions/lines of business across entire firm.
- Acquired technical specifications including process flow document, data model, etc.
- Responsible for driving various discovery and requirement-gathering sessions with SMEs and business stakeholders.
- Performed detailed analysis based on business/data requirements - produced analysis, assessment and solutions document including strategy, architecture, time, cost and resource allocation.
- Developed tools/applications across multiple platforms (Z/OS-DB2-Cobol, JCL; Linux-Oracle/Teradata-KSH; Windows-SQL Server-Powershell) supporting testing and implementation efforts.
- Worked with vendors such as IBM and TIBCO to achieve collaborative efforts for various solution implementations.
- Evaluated and implemented various technology solutions: 3rd-party and custom applications/tools.
- Architected/implemented an ETL process from mainframe DB2 to WebSphere MQ and downstream to Oracle and SQL Server.
- Mined retail data for business analysis from DB2 and Teradata feeding to SAS system - utilized SQL/KSH on Linux platform.
- Supported retail audit functions within Z/OS-DB2 platforms.
- Implemented sub-setting, masking, and data generation processes using Informatica PowerCenter 9.5 and ILM-TDM.
- Provided data masking for store credit card, customer identification columns within sales/transaction tables.
- Architected Gold Copy solutions for various projects.
- Produced detailed documents.
- Designed/implemented a data (XML) generation tool using C++ that sends sales data to the message queue, supporting performance testing.
- Designed intelligent ETL tool for Teradata using CLI to replace its Data Mover utility.
- Designed Referential Constraints resolver used in ETL processes that resolved table parent-child relationships.
- Designed and implemented a custom process implementing Oracle Flashback for QA testing efforts (see the sketch after this list).
- Designed automated SQL Server restoration process consisting of application based services, mirroring, version syncing, and production backup images.
- Collaborated with IBM, Toshiba and TIBCO for large application implementation supporting KOHLS eCommerce business that included end-to-end order/transaction coverage and flow through settlements with Banks with integration of Oracle ATG, IBM OMS, GIV inventory system, AJB Settlement application and First Data finance.
- Facilitated integration testing by means of generation of XML-based transactions from ATG through OMS, through the Sales Hub, and then to First Data for payment processing.
- Managed various testing efforts with various teams including functional, performance, regression, UAT, etc.
- Supported large EDW (KOHLS Enterprise Data Warehouse) on Linux/Teradata platform including security implementation using Informatica ILM-TDM masking as well as Protegrity encryption and tokenization implementations.
- Supported IBM OMS (Order Management System) for KOHLS in collaboration with IBM team.
- Coordinated collection and establishment of a repository containing various artifacts for each project/application, including Data Model, Data Flow Diagram, Process Flow Diagrams, etc.
- Collaborated with TIBCO for custom solutions over MDM product applications.
- Provided technical support including machine/database/server related issues.
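A minimal sketch of a Flashback-based QA reset of the kind described above, assuming Flashback Database is enabled and the script runs with SYSDBA privileges; the restore point name is a placeholder:

```ksh
#!/bin/ksh
# Minimal sketch: wrap Oracle Flashback around a QA test cycle.
RP=BEFORE_QA_RUN

case $1 in
  mark)    # take a guaranteed restore point before the test run
    sqlplus -s "/ as sysdba" <<EOF
CREATE RESTORE POINT $RP GUARANTEE FLASHBACK DATABASE;
EOF
    ;;
  rewind)  # roll the database back to the restore point after testing
    sqlplus -s "/ as sysdba" <<EOF
SHUTDOWN IMMEDIATE
STARTUP MOUNT
FLASHBACK DATABASE TO RESTORE POINT $RP;
ALTER DATABASE OPEN RESETLOGS;
DROP RESTORE POINT $RP;
EOF
    ;;
  *) print -u2 "usage: $0 mark|rewind" ;;
esac
```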
Confidential, NJ
Data Solutions Architect
Responsibilities:
- Developed custom ETL (data provisioning) tool integrated with IBM Optim that performs key functions:
- Subsets data based on specific Selection Criteria
- Extracts data in specific order considering Parent-Child relationships
- Extracts data considering table characteristics: reference vs. transactions
- Leverages IBM Optim's multithreading feature for parallel extract
- Utilizes Oracle Data Pump for parallel data import for performance optimization
- Developed Auto-Generation tool for Oracle Data Pump Export and Import functions.
- Developed Database object comparison tool to be integrated with export/import processes.
- Developed an Oracle database/table properties (e.g., tablespace, partitions) discovery tool for integration with export/import processes (see the sketch after this list).
- Implemented masking capabilities for PII data using IBM Optim.
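A minimal sketch of the table-properties discovery step described above, reading Oracle dictionary views; the connect-string variables and schema name are placeholders:

```ksh
#!/bin/ksh
# Minimal sketch: list tablespace and partitioning properties per table to feed the
# Data Pump export/import generation step. Requires access to DBA_* views.
SCHEMA=${1:-APP_OWNER}

sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<EOF
SET PAGESIZE 0 FEEDBACK OFF HEADING OFF
SPOOL table_properties_${SCHEMA}.lst
SELECT table_name || '|' || tablespace_name || '|' || partitioned
FROM   dba_tables
WHERE  owner = UPPER('$SCHEMA')
ORDER  BY table_name;
SPOOL OFF
EOF
```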
Data Analysis/Mining/Masking
- Collaborated with business analysts and the QA team on various data requirements and selection criteria through interview and discovery sessions.
- Evaluated and implemented data selection criteria based on financial advisor, investor, portfolio, and fund specifications provided by the client, used for data mining and data extraction from the enterprise data warehouse to produce replicated platforms for development, sales, demo, QA, and UAT.
- Performed obfuscation (masking/disguising) of real data to hide proprietary and private client information.
- Produced documentation, including process and data flow diagrams, illustrating the hardware infrastructure and the processes that utilize it within and across network domains.
Infrastructure Preparation
- Coordinate efforts to establish the infrastructure necessary for the entire testing platform: source and target database connections (including updating TNS entries), database access (Oracle accounts), and schemas with the privileges required to use Oracle Data Pump for data extraction.
- Specify and acquire Linux accounts and privileges to read/write the Data Pump directory at the OS level, with the help of system administrators on both source and target server machines.
- Secure both Oracle and Linux ports to be available/open so that servers distributed across various firewall-restricted networks can communicate.
- Assist in specifying criteria for WebLogic application server configuration, including the client interface URL.
- Manage Oracle database objects such as constraints, indexes, sequences, stored procedures, and packages, in addition to loading data, in the process of preparing a composite replicated platform.
- Work with DBA, Linux, and WebLogic administrators along with Release teams to coordinate and synchronize Oracle, WebLogic, and Java code based on the required release.
Platform Validation
- Validate data mined based on the selection criteria - including advisor, investor, and fund information - following the mining/extraction process, including reference and price tables.
- Verify obfuscated data for consistency including PII (Personal Identity Information) data.
- Verify application utilizing the URL for connection and running various reports.
- Validate entire Oracle database for validity of objects including stored procedures, packages, DB Links, etc.
- Validate code releases in terms of SQL DDLs as well as Java code.
Troubleshooting
- Debug any issues with the application and data by examining the trace logs from Weblogic application server/instance.
- Review Oracle objects, including stored procedures, to detect issues that prevent applications from functioning properly and/or cause failures in report generation.
Confidential, New York, NY
Data Analyst
Responsibilities:
- Performed Data Analysis using Toad/SQL on Linux/Oracle platform for data validation and quality testing/management of BOA compliance data for publishing government mandated BOA holdings reports.
- Participated in testing and validation of vendor and processed data from Informatica ETL process.
- Reviewed existing Data Model and published Data Dictionary for greater understanding of Oracle table relationships to specify object modifications in technical requirements document for the technology team as well as to develop various SQL queries for analysis process.
- Performed data validation and comparative analysis using various MicroStrategy reports and the data source - Oracle tables and views.
- Maintained various static data sets in Excel (CSV) used to provide to the database team. Also, performed analysis using Excel and its functions such as VLOOKUP, etc.
- Performed Data Profiling of BOA holdings data along with the holding companies and sources which provide the data.
- Worked with technology and testing teams to implement test/production data into UAT platform for various validations and testing activities.
- Prepared test plan for disaster recovery exercise.
Confidential, New York, NY
Solutions Architect
Responsibilities:
- Designed and developed Database Driven Rules Based application using VBScript and SQL on Oracle 10g platform to identify Health Care Professionals and their payment history - rules include combination of services provided extracted from Data Warehouse containing HCP data.
- Designed and developed Vermont Price Disclosure Report application using VBScript and SQL on Oracle 10g platform to generate Vermont state mandated reports (in Excel format) detailing Confidential drug pricing comparing against its competitor pricings.
- Performed analysis for migration of third party product pricing data for reporting and automation including all necessary data mapping.
- Developed a process for migration of product pricing data into Oracle using Informatica PowerCenter 9.
- Designed and developed a complete framework using HP Quality Center 10, QuickTest Professional 10, Oracle 10g, VBScript, and XML, eliminating the requirement for any additional coding.
- Developed various data validation tests incorporated within the framework, each driven by an independent XML file with unique SQL statements customized for each test.
- Test automation includes QC defect generation, automated defect record generation in Excel format, as well as automated email notification to testers.
- Used Informatica PowerCenter 9 for various ETL processes to extract, transform, and load source data into database tables for use by tools developed for the Data Stewardship team.
Confidential, New York, NY
Data Solutions Architect/Infrastructure Automation Engineer
Responsibilities:
- Provided hands-on technical, relationship management, and leadership support to onshore (US)/offshore (Philippines) teams.
- Developed team structure implementing 5x24 support model that included dedicated leads/resources aligned with lines of business.
- Provided mentoring, guidance and leadership related to various technical engagements including architecture and solutions.
- Interacted with client management for various project related requirements and negotiations in terms of resource, time and cost allocation.
- Interacted with various teams including DBAs (DB2, Oracle, SQL Server), infrastructure teams (Windows, UNIX/Linux Server), Hardware, Msg Queues, etc. for access and acquisition/allocation related issues.
- Provided guidance for achieving strategic approach and architecture to various technical/data requirements.
- Managed various testing efforts with various teams including functional, performance, regression, UAT, etc.
- Provided technical support including machine/database/server related issues.
- Provided technology training to individuals.
- Provided guidance for various process automation tools.
- Architected and implemented a Test Data Management System (TDM) used as a central data repository - aiding in establishing MDM and optimizing Data Analyst team activities - automatically depositing test data produced from databases on the Oracle 10g platform based on Use Cases provided by System Analysts.
- Developed spreadsheet Parser and Loader scripts for TDM using KSH - used for loading the TCR (a spreadsheet corresponding to a use case, containing test cases for various scenarios) into TDM (see the sketch after this list).
- Developed dynamic Data Profile Generator along with view and Stored Procedure generators using PL/SQL, KSH, SED and AWK on AIX platform.
- Developed Test Query Generator and Loader used to validate data during test executions based on values which are passed in for each test case.
- Set up and managed PVCS, utilized as the repository and control center for the workflow between System Analysts producing the use cases and the Data Analysts creating test data for spreadsheets containing scenarios and Test Cases based on the Use Cases.
- Architecting a Performance Testing platform using Mercury LoadRunner, including setup of clients/server and integration with Quality Center automation and the SDLC - a database to contain test cases and corresponding parameters for correlation into VuGen scripts.
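A minimal sketch of the spreadsheet Parser and Loader described above, assuming the TCR is exported to CSV and loaded into a hypothetical staging table via SQL*Loader:

```ksh
#!/bin/ksh
# Minimal sketch: load a TCR spreadsheet (CSV export) into a TDM staging table.
# Table and column names are placeholders; DB_USER/DB_PASS/DB_TNS are assumed variables.
CSV=${1:?usage: load_tcr.ksh <tcr.csv>}
CTL=tcr_load.ctl

# Strip the header row and keep only well-formed rows before loading.
sed '1d' "$CSV" | awk -F',' 'NF >= 3' > tcr_clean.dat

cat > "$CTL" <<EOF
LOAD DATA
INFILE 'tcr_clean.dat'
APPEND INTO TABLE TDM_TEST_CASES
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(USE_CASE_ID, TEST_CASE_ID, SCENARIO, EXPECTED_RESULT)
EOF

sqlldr userid="$DB_USER/$DB_PASS@$DB_TNS" control="$CTL" log=tcr_load.log
```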
Confidential, NJ
TDM/Data Solutions Architect
Responsibilities:
- Developed complete automation tools for Selective Data Migration, including an ETL process using pre-specified selection criteria to select, transform, and load data using KSH/SED/AWK on the Linux platform and Oracle 10g.
- Implemented an automated self-analysis process in the migration tool to accommodate referential integrity along with indexes and constraints at load time within target databases containing large volumes of data (sizing up to several terabytes of data).
- Implemented size analysis for batching beyond a specified threshold and for controlling transaction size.
- Developed automated Masking process for Data Privacy to load and prepare databases which include proprietary financial data in disguised form hiding personal and key financial information using Compuware File-Aid and KSH/SED/AWK on Linux platform.
- Implemented automated Primary/Foreign Key Propagation scheme retaining Referential Integrity.
- Developed Data Validation processes for Masked and regular data using Rule Based Relation template.
- Developed various DDL generation and reverse-engineering tools using SQL*Plus, PL/SQL, KSH, SED, and AWK on the Linux platform and Oracle 10g: a tool to reverse engineer database objects (tables, columns, constraints, indexes, sequences, etc.) and a tool for data migration (see the sketch after this list).
- Performed Data Analysis of Confidential Wealth Management System including client account and reference data; portfolio accounting, performance reporting, fee billing; security prices/transactions - current/history, assets/positions/classifications, portfolio performance and corresponding advisors/investors for implementation of data Migration from its Data Warehouse into smaller Data Marts for Development, Integration, and QA platforms.
- Analyzed existing Data Model for the data warehouse containing wealth management data for reporting.
- Developed Logical and Physical Models using Erwin 7.0, implementing a database that supports the data extraction automation process.
- Implemented prototype for EMPS using MS Access 2000.
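A minimal sketch of the DDL reverse-engineering approach described above, using DBMS_METADATA through SQL*Plus; the schema name and connect-string variables are placeholders:

```ksh
#!/bin/ksh
# Minimal sketch: reverse engineer table DDL for one schema via DBMS_METADATA.
SCHEMA=${1:-APP_OWNER}

sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<EOF
SET LONG 1000000 PAGESIZE 0 FEEDBACK OFF HEADING OFF TRIMSPOOL ON
SPOOL ${SCHEMA}_tables.sql
SELECT DBMS_METADATA.GET_DDL('TABLE', table_name, owner)
FROM   dba_tables
WHERE  owner = UPPER('$SCHEMA')
ORDER  BY table_name;
SPOOL OFF
EOF

# Indexes, constraints, and sequences can be extracted the same way with
# DBMS_METADATA.GET_DDL('INDEX', ...), ('CONSTRAINT', ...), and ('SEQUENCE', ...).
```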
Confidential, New York, NY
Data Warehouse Architect
Responsibilities:
- Performed Analysis of new and existing Financial Reporting Systems G/L, A/P, and A/R:
- CODA on VAX platform in terms of current data source: Tables and Data Feeds layout, content, Data Mapping and logic pertaining to the existing reports.
- SAP system in terms of the new accounting structure in order to design and implement new reporting system consisting of an ODS platform and an OLAP platform on SQL Server 2000/2005.
- The Universe in BusinessObjects, through data mapping to the Data Mart.
- Performed Business Analysis in JAD and Discovery sessions in partnership with principals and clients producing Data Flow Diagrams along with Conceptual Model supporting financial data.
- Specified Data Flow Diagrams and Process Flow Diagrams for streamlining data from SAP to ODS to Data Warehouse.
- Designed and implemented both Logical Model and Physical Model using Erwin 7.0 for the ODS platform to accommodate diverse data feeds, data cleansing and a central data repository through both real-time and batch processes from various sources.
- Produced a Dimensional Model (Star Schema) for implementation of Dimension and Fact tables using Erwin 7.0, customized for all reporting requirements within the OLAP platform.
- Implemented De-normalization for performance and efficiency.
- Implemented Measures consisting of various Aggregations and Pre-calculations.