Data Architect / Data Modeler / Data Quality Analyst / ETL Lead Resume
Iselin, NJ
SUMMARY
- Accomplished Business Intelligence professional with over 15 years' experience leading large, complex programs, matrix-managing cross-functional resources, driving process efficiency, and implementing systems and key initiatives.
- Defining and implementing strategy for enterprise-level data integration architecture, business intelligence, enterprise data, and data quality solutions across all divisions.
- Experience includes assessing existing platform architecture frameworks, evaluating and selecting software and hardware technologies, defining processes, and re-engineering and implementing data strategy.
- Envision and create solutions that meet data integration requirements; model the pieces of an infrastructure and their points of integration; prove the feasibility of an architecture design; create the data flow design artifacts required to deliver and maintain the infrastructure; and guide a solution through to completion so that it can be implemented and supported in production.
- Demonstrated experience managing demands of complex projects, developing innovative systems, providing technical support, and performing business systems analysis.
- Offers strong combination of business and technical expertise, ensuring steadfast project management from development of initial concept through design and implementation of solutions.
- Utilized broad technical skills encompassing concrete knowledge of key technologies to streamline design, development, and implementation functions. Skilled project leader, able to provide quality technical support and leverage available and emerging technologies to create innovative business solutions.
- Experience and exposure to various business domains including Manufacturing, Services, Sales, Marketing, Banking, Finance, Investments, Pharmaceutical, AML, Loans, Risk Management, Trade Finance, Accounting, Surveillance & Compliance, Deposit, and Credit Risk.
- Executed data warehousing / OLTP database projects through the full software life cycle, from requirements gathering, data profiling, data modeling, and technical specifications to design, development, and implementation.
- Hands-on ER and dimensional data modeling experience designing OLTP database / Data Vault / Star Schema / Snowflake Schema solutions in Sybase PowerDesigner 15 and ERwin r7.3. In-depth exposure to and understanding of DWH concepts and ETL processes using Informatica.
- Strong working experience in Informatica PowerCenter 9.6/9.1/8.6.x, IDQ/IDE, Informatica Metadata Manager 9.6, Informatica Developer, Informatica Analyst, Power Exchange CDC, Data Transformation Developer, MDM Hierarchy Manager, Web Services, and SAP BW services to extract, transform, and load data.
- Designed ETL solutions to load the data warehouse from various source systems, built using IBM InfoSphere DataStage 7, creating a business-centric view to support the business decision process.
- Experienced in delivering high-impact solutions in enterprise planning, business intelligence, and data integration / data warehousing. Supported both new development projects and production support as the highest-level technology domain expert.
- Performed assessment of business requirements; collection and identification of technical specifications; and subsequent development of technology solutions that require development to be viable (i.e., business applications, workflow systems, purchased applications, developed applications, and applications integration).
TECHNICAL SKILLS
O/S: AIX, Solaris, Linux, Windows, HP-UX 9.x/10.x, z/OS
Languages: Shell Scripting, PL/SQL, HTML, XML
Databases: DB2 V9.7, Oracle 11g, MS SQL Server 11.0, Teradata 14.0
Tools: Rational Team Concert, DataStage 7/8, Informatica 9.6/9.1.0/8.6.x, IDQ/IDE, Power Exchange, Informatica Developer, Informatica Analyst, Data Analyzer, Informatica Metadata Manager 9.6, Informatica MDM, DAC, OBIEE 11g, Business Objects XI R2/3/4, AutoSys, ER/Studio V8/V9, Sybase PowerDesigner 12.0/15.0, ERwin r7.3, DataStage Designer, DataStage Director, DataStage Manager, DataStage Administrator, Universe Designer, Web Intelligence, Cisco Tidal Enterprise Scheduler, ESP, COGNOS.
Other: Six Sigma, Dimensional Data Modeling, EIM, PL/SQL, Hub-Spoke, Bus-Architecture, Bill Inmon and Ralph Kimball methodologies, ER Data Modeling, MDM, SOA, EAI, MVC
Software: DOM, SAX, SQL LIMS, LabWare LIMS, Visio 2003, Siebel DWH, Quality Center, NDM/Connect Direct, WebLogic, Teradata V2R5/V2R6, Crontab, Korn shell, XML, XSD, ICDs, Data Flow Diagrams, DMAIC, Web Services, SOA, Unix, Linux, Data Cleansing, Clarity, SharePoint, Harvest, J2EE, FLOAD, MLOAD, TPUMP, TOGAF, ZACHMAN, RUP
PROFESSIONAL EXPERIENCE
Confidential, Jersey City, NJ
Tech Lead / Informatica MDM Architect / Informatica Developer Tool / Data Analyst / IDQ/IDE / Data Quality Lead / Data Modeler / Solution Architect / IT Lead
Responsibilities:
- Collaborated with Project Managers, System Integrators, Database, Infrastructure, and QA teams as well as business users to design and develop IT solutions that adhere to sound integration architecture and the overall enterprise integration strategy and deliver within an agile development framework.
- Assessed MDM needs for trade finance data integration requirements, designed the MDM architecture, and implemented MDM solutions for customer data.
- Performed Informatica MDM Hub configurations: data modeling and data mappings (landing, staging, and base objects), data validation, match and merge rules, writing and customizing user exits, and customizing and configuring Business Data Director (BDD) and Informatica Data Director (IDD) applications.
- Defined the Base objects, Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, packages and query groups.
- Used Hierarchies tool for configuring entity base objects, entity types, relationship base objects, relationship types, profiles, put and display packages and used the entity types as subject areas in IDD.
- Defined the Trust and Validation rules and set up the match/merge rule sets to derive the right master records based on business needs.
- Implemented the ETL process of populating Landing / Staging / Base tables for the customer data set into Informatica MDM from various source systems.
- Responsible for defining and implementing the strategy for trade finance data integration architecture, business intelligence, data quality, and MDM architecture solutions for T360 data.
- Served as the technical expert responsible for design, development, implementation, and support of trade finance applications on the SOA platform at an enterprise level.
- Interacted with business users and Business Analysts as a bridge between the IT and business teams, and performed requirements analysis of business requirements to prepare functional requirement documents and task lists for each interface so that effort could be estimated for budgeting, planning, and timelines.
- Provided hands-on delivery of integrations. This includes authoring technical documents, designing processes and procedures, defining configuration parameters, developing code and reviewing business requirements to determine appropriate technical solutions.
- Designed and developed Data Flow and Interface Architecture Diagrams to depict involvement of all upstream and downstream systems in order to deliver technical solution of data requirements.
- Performed code reviews with in-house and contracted development teams to ensure proper integration architecture is implemented and standards are followed.
- Developed test cases and test plans; conducted unit and system tests for new and/or modified interfaces; supported the QA team in regression and end-to-end testing scenarios.
- Worked with Enterprise Architects to ensure integration solutions were based on SOA architectural standards and aligned with the enterprise integration strategy.
- Performed data modeling for the DWH following the Star Schema methodology of building the data warehouse.
- Analyzed BI report requirements to design a reporting solution in Business Objects that could meet those requirements.
- Performed impact analysis and submitted a report for each downstream interface identifying the program changes caused by changes in the source system.
- Worked closely with Business Analysts to define conceptual and logical models of Core Banking Hub database design.
- Used Informatica Data Quality 9.1 (IDQ) for Analysis, data cleansing, creating Rules and Data Standardization.
- Responsible for the solution design and development of Informatica development efforts to build the ETL process for loading trade finance data from the T360 system into the Core Banking Hub database.
- Profiled and analyzed the source data using Informatica IDE/IDQ to determine its accuracy and completeness, clarifying the structure, relationships, content, and derivation rules of the data.
Environment: Rational Team Concert, OVS, GPP, AML, HotScan, SWIFT Messaging, Doc Exam, Oracle GL, FAH, DB2, Mainframe, IMMS, ACBS, T360, Trade Finance, Loan, EXIM, Informatica 9.1, Oracle 11g, SAP Business Object XI R2, Microsoft Project, Embarcadero E/R Studio 8.0, Putty, AIX-Unix, Microsoft SharePoint, Tidal, Shell Scripts, Quest Toad, PL/SQL, IDE/IDQ, MDM Hub / BDD / IDD / Data Profile, Power Exchange, Connect Direct, FTP, MQ Broker, T360 Trade Processing System, Metadata Manager, Data Lineage, Audit Reports, Pushdown Optimization, Change Data Capture, XML, XSD, XML Generator, XML Midstream, XML Parser, MQ Middleware, DR testing.
Confidential, Jersey City, NJ
Tech Lead / ETL Architect / Data Analyst / Quality Lead (IDE/IDQ) / Informatica Metadata Manager / Informatica MDM/BDD/IDD Architecture Lead
Responsibilities:
- Worked with Enterprise Architecture Team to perform gap analysis between current Actimize system and overall enterprise architecture requirements.
- Led the technical architecture and design review assignments for a quality solution implementation of the new Data Warehouse integrated with the Actimize system.
- Led and mentored the data quality completeness initiative for the compliance system using Informatica IDE/IDQ ETL architecture.
- Oversaw activities for all project tasks, from planning through delivery and sustaining support, to ensure that projects met established business objectives, time, and budget constraints.
- Performed a deep assessment of the Actimize system to determine whether ETL best practices were followed and submitted recommendations for improvement.
- Designed Data Warehouse / Data Model / MDM and integration architecture solutions for Hornet Data Warehouse requirements to develop the data integration solution.
- Initiated ETL architecture solution assessments to design data quality rules.
- Reverse-engineered process/data flow diagrams from a review of the current Actimize system in order to perform data quality measures.
- Performed data/code quality assessment to identify hard-coded values and rejected records.
- Developed Data Lineage Reports in Metadata Manager and performed data profiling on Actimize data.
- Involved in data governance initiative to standardize the data models (Canonical Models) across divisions.
- Supported FIBO, ISDA, ISO including standard canonical models for governance and used visualization techniques to represent the relationships and related entities.
- Participated in regulatory initiatives (BCBS, Basel, SR-14, SR-117) and created impact analyses using Informatica Metadata Manager.
- Generated data profile reports using Data Explorer in Informatica 8.6.1 to understand the source and target data processed through the ETL processes for Actimize.
- Responsible for working with developers to ensure designs and implementations conform to internal standards, make use of best practices, and perform well.
Environment: Informatica 9.1, Informatica 8.6.1, MDM Hub, IDE/IDQ, Data Analyst, Oracle 11g, 10g, ER, SAP Business Object XI R2, Microsoft Project, Embarcadero E/R Studio 8.0, Putty, Linux, MS SQL Server 2008 R2, AutoSys job scheduling, Informatica 7.1, Metadata Manager, Data Explorer 8.6, Data Quality 8.6, Shell Scripts, Quest Toad, PL/SQL, Mantas, Actimize, CadBatch, Hornet, CBW, Watch List, Restricted List, Erika Engine, Alerts, Java, Data Profile, Power Exchange.
Confidential, Iselin, NJ
Data Architect / Data Modeler / Data Quality Analyst / ETL Lead
Responsibilities:
- Designed architecture solutions for Investments' BI reporting data requirements using data warehousing concepts.
- Worked with the Enterprise Architecture team to ensure that new technology solutions are designed for optimal access and usefulness, and leverage existing technologies for their specific technical domain(s) and understand system-wide impacts.
- Designed Asset Class & Business Unit Hierarchical Data Model using Erwin 7.1 tool.
- Performed logical & physical data modeling for derivative elements to be sourced from the MUREX data system.
- Performed FileNet and ETL integration data modeling to capture report transfer activities.
- Designed the FileNet & ETL integration architecture to FTP OBI reports from OBI to FileNet through the ETL process.
- Designed and implemented MQFTE & AutoSys jobs for the FileNet & ETL integration process.
- Designed DataStage parallel jobs to generate and parse XML documents in order to share metadata information between FileNet & ETL processes.
- Set up the OBI reporting system to schedule reports through iBots and deliver them in PDF format on the DataStage server so that they could be transferred via MQFTE to FileNet for further processing.
Environment: Architectural Patterns (Publish-Subscribe, RESTful API, BizTalk, MSMQ, NServiceBus), IBM DataStage 7, Oracle 11g, ERwin 7.1, OBIEE 11, Microsoft Project, Clarity, Putty, Linux, Microsoft SharePoint, Oracle SQL Developer, AutoSys job scheduling, Mobius, MQ, Connect Direct, StarTeam, FileNet, NAS Storage mount
Confidential, Kalamazoo, MI
Data Architect / Data warehouse Lead / Data Modeler
Responsibilities:
- Designed complex data integration solutions in an inclusive and participatory manner, negotiating with and influencing other design parties to reconcile technical and business considerations and arrive at the optimal solution for the requirements.
- Logical & Physical Data modeling for SQL LIMS and LABWARE systems data requirements
- Managed the complexity of the data integration environment in both a technical and business sense, understanding data and process flow through the data integration layer and upstream/downstream system dependencies.
- Source to target mappings with business rules.
- Data Analysis & Profiling on LIMS and SAP data to understand the relationships and data quality issues.
- Defined and implemented DWH & ETL Architecture using Kimball Methodology for data requirements.
- BO universe design using PGS DWH dimensions and facts for reporting requirements.
- Translated business needs into technical solutions; designed, developed, and documented the overall architecture of systems; and led the architecture solutions to meet performance, usability, scalability, reliability, and security needs.
Environment: Data Fabrication, Data Virtualization, Informatica 9.01, Web services, HA grid, Oracle 10g, Oracle E-Business Suite (Oracle EBS 12.1), Erwin 7.1, SAP, OLAP technologies, Business Objects XI 3.0, Microsoft Project, SQL LIMS, LABWARE LIMS, Share Point, Universe Designer, Web Intelligence, Oracle Fusion Applications, Ralph Kimball Methodology.
Confidential, Pittsburgh, PA
ETL Architect / IDE/IDQ /Data Quality Lead / Data Modeler
Responsibilities:
- Performed logical and physical dimensional modeling for the PNCI DWH using the ERwin Data Modeler 7.3 tool to create the data warehouse table structures needed for ETL implementation.
- Analyzed data from a variety of data sources to present a cohesive view of the data and completed data profiling using IDE on various source systems to understand the data.
- Led data design & ETL efforts to build the data warehouse, including writing design specifications and finalizing the DWH ETL architecture.
- Worked closely with Informatica development team on ETL design and build to ensure the data solution provided meets business expectation and any technical difficulties are resolved on time.
- Designed a three-layer ETL architecture for the PNCI data warehouse, where the middle layer represents a unified layer storing operational data for PNCI reporting and downstream applications.
- Established new extract feeds using the Power Exchange tool to integrate source data with Informatica processing for DWH loads.
- Created, reviewed, and provided recommendations on technical procedures, architecture documentation, and engineering decisions.
Environment: Informatica 8.6x, IDE/IDQ, Power Exchange, Oracle 10g, Erwin 7.3, Shell Script, AIX, Main Frame, OLAP technologies, data migration strategies, Trillium version 12, Harvest Software Change Manager, CA clarity, OBIEE 11g, NDM/Connect Direct, Oracle E-Business Suite (Oracle EBS 12.1), Oracle Fusion Applications, MS Project.
Confidential, Moline, IL
Data Analyst/IDE/IDQ / Data Quality Lead / Data Architect/ Data Modeler
Responsibilities:
- Responsible for the overall design of the data/information architecture, which maps to the enterprise architecture and balances the need for access against security and performance requirements.
- Performed enterprise-wide logical and physical data model design using Sybase PowerDesigner to put the database structures in place for Orders & Products and Dealer Inventory Tracking data, enabling web interface and ETL implementation to begin.
- Managed the program/project roadmap and key milestones, assessing criticality and downstream impact if dates were missed, and determined alternative/mitigating actions.
- Developed cross-functional project plans and managed execution of tasks and completion of all deliverables within the business process and technical areas.
- Developed data/metadata models and technical specifications using standard modeling techniques.
- Analyzed the business requirements to design, architect, develop and implement highly efficient, highly scalable ETL processes for Inventory data.
- Analyzed data from a variety of data sources to present a cohesive view of the data and completed data profiling using IDE on various source systems to understand the data.
- Worked with business and technical partners to perform data analysis and data profiling, identify data quality issues, develop data models, ETL and report specifications, and document enterprise data repositories and repository relationships.
- Prepared, maintained, and published ETL documentation, including source-to-target mappings and business-driven transformation rules.
- Designed and developed the Operational Data Store for EIM & DIT report requirements that will help business users in business decision process.
- Implemented a Data Quality Solutions, which includes Standardization and Matching of Source data using IDQ.
- Actively designed PIN Expansion data model to achieve Interoperability between systems utilizing different PIN formats.
- Architected, designed, and developed enterprise data warehouse components, including databases and ETL, using Informatica.
- Designed and implemented product data vault to extract, transform and load the product data with history of changes on the products attributes.
- Supported the project teams during user acceptance testing activities and assisted in developing the post-deployment data maintenance plan.
- Developed the Data Integration Specifications (ETL and Web), including data cleansing and data transformation rules.
- Defined data/information architecture standards, policies, and procedures for the organization, structure, attributes, and nomenclature of data elements, and applied accepted data content standards to technology projects.
- Identified data quality issues by analyzing data in various systems and working with business users in order to implement data cleansing processes.
Environment: Power Designer 15, Integration Methods (ESB, Web Service, API, ODBC, MOM), Informatica 7.1, IDE, IDQ, DB2 Host, AIX, Shell Script, QMF Windows 8.1, WinSQL 3.5, Microsoft Visio 2003, ESP, JCL, SAS, Oracle 9i, Business Objects, Universe Designer, Web Intelligence, database performance tuning, data migration strategies, SQL Server 2005, SSIS, Trillium version 10, XML, XSLT, XPath and XQuery, DOM and SAX, ClearCase, SharePoint, Quality Center, SQL Server 2008, SSIS, Bill Inmon Methodology.
Confidential, Stamford, CT
ETL Lead / BI Lead
Responsibilities:
- Analyzed database issues and provided recommendations as required to support the Development and System Administration teams in the design, development, testing, and tuning of the supporting applications, queries, and/or the relational database.
- Worked with business users to document Data Migration requirements and criteria, define Master Data and functionality requirements, including data acquisition, quality, approvals and distribution.
- Designed and documented the ETL processes, which used various transformations such as the Aggregator, Transformer, Sort, Complex Flat File, and DB2 stages for DMT Organization data.
- Worked as data modeler to design the logical and physical models needed to begin ETL development.
- Worked with business, functional and technical teams to coordinate deployment activities, constantly monitoring actual vs plan and highlighting anticipated issues or slippage and formulating recovery plans or assessing impact if not recoverable.
- Worked directly with the Functional SMEs, BSAs, and development team to troubleshoot issues, resolve defects and implement system enhancements
Environment: OBIEE 7.8.4/7.5, Trillium version 10, Siebel DWH, Informatica 7.1, Oracle 9i, Tomcat Web Server, HP-UX, DAC Server, Shell Script, MS Project, Ascential DataStage 7.5, Siebel Analytics 7.8.4, ERwin r7.1