Data Modeler / Architect Resume
CA
SUMMARY
- Information Technology professional with more than 13 years of extensive experience in all phases of the software development life cycle, including system analysis, requirements gathering, design, data modeling, development, ETL/BI architecture and implementation of various DW-BI projects.
- Strong experience in designing enterprise-level EDWs and Data Marts based on 3NF and dimensional data modeling (fact tables, dimension tables, star and snowflake schemas) techniques respectively.
- Solid hands-on experience using the ERWIN 9 data modeling tool and working knowledge of PowerDesigner.
- Complete understanding of end-to-end ETL-BI solutions: understanding the business requirement, fitment of the data model, high-level design, low-level design, ETL job development, fact and dimension loading, ETL job dependencies, scheduling, CDC logic, SCD considerations, performance considerations, SQL tuning, Tableau dashboards, and Cognos cube & report development.
- Good knowledge and exposure to Source System Analysis, Data profiling, Data quality, Data Governance and Master Data Management.
- Highly skilled in ETL mapping development using the Informatica and Talend ETL tools, visualization development using Tableau 8, and OLAP solutions using the Cognos and Actuate tools.
- Exposure to Informatica Data Validation Option (DVO), an out-of-the-box automated testing solution from Informatica.
- Experienced in advanced data visualization tools such as Tableau, with knowledge of QlikView & Spotfire for dashboard and report development.
- Strong experience with AWS Redshift, Oracle, DB2, Teradata and Netezza database systems.
- Exposure to newer technologies such as NoSQL databases (MongoDB), HVR and Vectorwise.
- Expertise in test strategy, test data, test plan & test case preparation, and test execution.
- Experience in extracting data from various heterogeneous sources (relational, flat files, Excel, etc.) and loading into data warehouse/data mart targets.
- Extensive experience working with business users/SMEs as well as senior management.
- Strong knowledge of SDLC models such as waterfall, iterative, and agile, with excellent process and system documentation skills.
- Excellent verbal, presentation and written communication skills. Able to communicate with technical, non-technical and senior management audiences. Have played a significant role in requirements planning, client interaction and onsite-offshore coordination.
- Worked on industry-specific data models (Oracle EBS, FSLDM and IIW) and custom data models.
- Worked in the Finance, Healthcare (Services), Insurance, Banking and Logistics domains.
TECHNICAL SKILLS
Data Modeling Tools: ERWIN 9 Data Modeler
BI & DWH Tools: Informatica 9.1, 8.6; DVO, Talend 5.2; Tableau 7, 8; HVR; Cognos 8; Actuate 8; BO XIR2; Crystal Reports
RDBMS: AWS Redshift, Oracle, Teradata, DB2, Vectorwise, Netezza
GUI Development Tools: Centura Builder, Visual Basic 6.0, 2005
Programming Languages: Java 2, C, C++
Operating System: Windows NT, Windows 2000 Prof, Unix
Other Methodology / Tools: Agile, Lean, MS Project, Clear Quest, VSS, TFS, Quality Center, ALM
PROFESSIONAL EXPERIENCE
Confidential
Data Modeler / Architect
Responsibilities:
- Requirements Analysis and Source System Analysis of various Invoice & Payment Transactional systems (Oracle EBS P2P & Timberline) and Projects Master source system (SAM).
- Extensive data analysis in identifying and categorizing Enterprise Reference data, Master data and Transactional data.
- Designed conceptual, logical and physical data models using the ERWIN 9 tool. Responsible for generating PDM scripts and executing them in the Redshift database.
- Followed Inmon’s hub-and-spoke architecture approach, with EDW design based on 3NF (relational) and data mart design based on star-schema (dimensional) design principles.
- Used ERWIN 9 Data Modeler tool for the design and development of all the data models and scripts generation.
- Used ERWIN’s reverse engineering, forward engineering and subject-area-based data model development features.
- Defined Enterprise-level Data Modeling standards and best practices for SunEdison EDW projects.
- Performance-tuned resource-intensive queries in the Amazon Redshift database by defining appropriate distribution keys (DISTKEY), sort keys (SORTKEY) and design strategies.
- Overall responsible for defining the complete Data architecture for SunEdison.
- Also, developed various ETL specs and conducted walk-through sessions with ETL team for development.
- Performed QA validation of ETL-loaded data for various subject areas and reference data.
Environment: AWS Redshift, ERWIN 9, IBM DataStage, Tableau 9, QC, SharePoint, Oracle EBS
Confidential
ETL Architect
Responsibilities:
- Responsible for complete application development lifecycle based on business requirements (Agile methodology); including business function analysis and documentation.
- Designed and developed the data mart star schemas in Teradata based on the reporting/OLAP requirements.
- Defined ETL coding, naming standards, deployment guidelines and industry best practices.
- Responsible for defining the complete system ETL and reporting architecture.
- Responsible for analyzing the legacy source systems and data conversions.
- Performed data standardization and cleansing before loading data into the Data Warehouse.
- Prepared solution architecture documents, including source data analysis, gap analysis, technology recommendations, logical and physical data models, ETL design, and test plan documents.
- Designed and developed slowly changing dimensions and error handling for rejected records.
- Performed performance tuning at various levels: identified bottlenecks, debugged mappings, and handled integration testing, issue tracking and code reviews.
- Mentoring the development team on BI development and industry best practices.
- Set strategy and oversee design and development of EDW staging areas and target tables.
Environment: Informatica 9.1, OBIEE, Teradata 14, HP QC, Unix, Informatica DVO
Confidential
BI Architect
Responsibilities:
- Played the roles of ETL Architect, Data analyst and Data Modeler at different stages of the project.
- Contributed to the source system analysis and the high-level and low-level design specifications.
- Designed the ETL framework in Talend tool including standards and best practices.
- Developed Talend jobs of high functional complexity using an optimal number of transformations and design logic.
- Performance-tuned long-running Talend jobs by applying established best practices.
- Designed strategy for initial and incremental/CDC loads.
- Involved in all Testing phases including UT, SIT & UAT support.
- Contributed towards standards development for Tableau Data Visualization tool.
- Designed various Tableau dashboards depicting business performance KPIs, using chart types such as heat maps, treemaps, packed bubbles and geographic maps.
- Strong knowledge of data visualization architecture and exposure to leading tools such as Spotfire and QlikView.
Environment: Talend 5.2, HVR, Tableau 8, Oracle, Vectorwise, MongoDB
Confidential, CA
BI Architect
Responsibilities:
- Conferred with business units and technical staff to understand data usage, lineage and attributes for enhancements and defect fixes.
- Used data analysis techniques to validate business rules and applied short-term and long-term fixes to resolve issues according to priority.
- Extensively designed data mapping and filtering, consolidation, cleansing, integration, ETLs and customization of data mart.
- Worked with project team representatives to ensure that LDM, PDM and ETLs were developed in line with corporate standards and guidelines.
- Performed impact analysis on downstream modules and effort estimation for CRs/defect fixes.
- Performed data quality profiling on various source systems to better understand the data and surface data quality opportunities, working with SMEs and data stewards.
- Mentored onsite leads & coordinated with offshore team for ensuring the quality of the deliverables as per project timeline.
Environment: Informatica 8.6, DB2, Toad, Unix, Quality Center, ERWIN, Mainframe
Confidential, IL
ETL Architect
Responsibilities:
- Performed requirements analysis, prepared the high-level design, and walked it through with the client for approval.
- Prepared the technical and ETL design documents.
- Gathered information from different data sources and provided detailed data analysis.
- Reviewed data to determine operational impacts and needed actions.
- Conducted gap analysis to assess the variance between system capabilities and business requirements.
- Designed and developed extract-transform-load (ETL) modules, data quality rules, and exception management workflows.
- Improved performance through SQL query optimization and ETL mapping level changes.
- Troubleshot and resolved common operational issues, escalating when required.
- Led a team of 2 onshore leads and 5 offshore resources and managed their activities through daily/weekly calls and status reports.
Environment: Informatica 8.6, Netezza 6, AQT, Autosys, Unix, Quality Center, Mainframe
Confidential, IL
ETL Architect
Responsibilities:
- Involved in planning the technical conversion approach
- Contributed towards doing POC activity of converting initial set of mappings from DB2 to Netezza and checking the performance of the system.
- Involved in documenting the challenges & solutions
- Ensured close communication between onsite and offshore team members in terms of knowledge sharing, challenges faced and solutions.
- Tested the system, troubleshot, and helped fix technical issues.
- Led a team of 2 onshore leads and 8 offshore resources and managed their activities through daily/weekly calls and status reports.
Environment: Informatica 8.6, UDB DB2 9, Netezza 6, AQT, Autosys, Unix, Quality Center
Confidential, IL
ETL Lead
Responsibilities:
- Played the roles of ETL Architect, Data analyst and Data Modeler at different stages of the project.
- Worked with the business team to determine requirements.
- Involved in defining the technical architecture, high-level design and test strategies.
- Closely interacted with various cross-functional teams within Zurich, including client IT management and the source, architecture, environment support, deployment and user teams.
- Conducted feasibility studies of new requirements and enhancements.
- Estimated cost and effort for requests and defined the delivery plan.
- Defined the data model and ETL specifications.
- Led a team of ETL developers, data analysts and Business Intelligence report developers.
- Involved in Configuration Management Plan, Risk Management Plan, Defect Prevention Plan, Quantitative Process Management Plan etc.
- Involved in Implementation planning and Production roll-out
Environment: Informatica 8.6, DB2 9, AQT, Perl, Mainframe, Autosys, Unix, Quality Center, IIW, ZEM, ERWIN
Confidential, IL
ETL Lead
Responsibilities:
- Led the ETL architecture and development efforts.
- Defined the ETL strategy and framework for integration of policy data into the Warehouse.
- Created low-level ETL designs based on the defined ETL framework and business specifications.
- Modeled a data audit solution for the ETL process to capture data loading statistics, including data load rate, load successes, failure rates, data rejections and data quality.
- Architected an intelligent workflow design to optimize the data load strategy.
- Worked closely with the various business analyst teams to determine the ETL requirements.
- Validated ETL specifications; conducted and coordinated unit testing, integration testing and user acceptance testing.
- Managed a pool of resources in an onshore-offshore environment.
Environment: ZODS, Informatica 8.6, UDB DB2 9, AQT, Autosys, Unix, Quality Center, ERWIN
Confidential
Onsite coordinator / Lead
Responsibilities:
- Understanding requirements and proposing Technical architecture and High level design
- Review of low level technical design documents
- Reviewed the mapping of the Teradata FSLDM model against the source systems and developed ELT (Extract, Load, Transform) components using the DataStage and Sunopsis tools.
- Establishing the Unit Testing, System Integration Testing and User Acceptance Testing strategies.
- Review of Test Plans and Test Results
- Interfaced with client management and technical teams, including the source, architecture, environment, deployment and user teams, to fulfill development team requirements.
- Involved in implementation planning and effort & cost estimation of change requests.
Environment: FSLDM, Teradata 6.1, DataStage 7, Sunopsis, Control-M, Unix, Quality Center, ERWIN
Confidential
Tech Lead
Responsibilities:
- Understood the requirements and prepared high-level and low-level design specifications.
- Led a team of developers to ensure code development per project standards.
- Review of low level technical design documents and test cases
- Informatica ETL Code review
- Establishing the Unit Testing, Link Testing and System Testing strategies.
- Review of Test Plans and Test Results. Conducting ad-hoc audits
- Responsible for ensuring the quality and productivity goals of the project while complying with all the standards and processes.
- Successfully implemented Lean principles (Poka-Yoke, VSM, workload leveling, DSM & visual controls) in projects, resulting in significant productivity improvements.
Environment: Informatica PowerCenter 7.1, Oracle 9i, Unix (AIX), Mercury TestDirector 8.0, Informatica Data Explorer 5, Lean methodology
Confidential, Boston
Tech Lead
Responsibilities:
- Led an Actuate development team of around 6 members. Involved in day-to-day client interaction: gathering requirements, participating in meetings and coordinating with the offshore development team for delivery.
- Designed and developed Actuate reports as per the business requirements
- Participated in and provided input on estimation preparation for new Actuate reporting proposals.
- Mentored new team members and conducted many Actuate Training sessions.
- Reviewed reports developed by others and ensured the quality of offshore deliverables.
- Involved in quick fixes and change requests as prioritized by client.
- Traveled to China on a short-term assignment for project-related KT/training.
Environment: Actuate 7.2 & 8.0, Actuate i-Server, Actuate Active Portal, WebSphere 4.0, J2EE, SOAP APIs, Crystal Reports 6.0, Oracle 9i