Senior Solution & Data Architect / Data Modeler / ETL Developer Resume
Pittsburgh, PA
SUMMARY
- Senior Business Intelligence Data Architect with a passion for new technology and the ability to communicate at both a technical level and a business functionality level.
- Exhibit excellent project management skills, with the ability to quickly put structure in place to manage work in a dynamic, complex environment.
- Have a passion for coaching and developing a team of associates able to share and embrace the client's vision.
- Have a proven record of achievement in the areas of Data Architecture, Data Analysis and Modeling, Database Administration and Performance Tuning, Data Warehouse Design, and Transactional (OLTP) and Dimensional (OLAP) Modeling and Implementation.
- Experience in designing solutions for Big Data environments, covering both structured and unstructured data.
- Extensive experience in data warehouse design and development using MS DTS, SSIS, IBM DataStage v9.1, and Informatica PowerCenter 9.5. Extensive experience in implementing Master Data Management (MDM) strategies, requirements, and policies aligned to company goals for data quality, data security, and data integration.
- Have excellent communication and partnership skills for interacting and communicating with key stakeholders at all levels across the company to manage, inform and influence outcomes.
TECHNICAL SKILLS
Hardware: IBM 370 series, 30xx series, PS/2 series and compatibles, Sun Solaris
Software: Microsoft Office Project (EPM), NT/Server, NT/Advanced Server (Clustering and Load Balancing), MS/SqlServer - Utilities, QMF, MS/WORD for Windows, Excel, MS/Mail, Project Management Workbench, VBSQL, OLE Servers, ADO.Net, ASP, ASP.Net, ACTIVE-X, RDO, HTML, DHTML, ODBC, Telerik WEB Controls, MS-SQL Server Stored Procedures and Triggers, Rules, Erwin, WEB Services, SOA, WCF, MS/SqlServer Reporting Services (SSRS), IBM MVS and z/OS operating systems, ORACLE, IBM DataStage v9.1, PL/SQL, IBM Data Archive, Informatica PowerCenter 9.5, VersionOne - Agile
Databases: Hadoop, MongoDB, Oracle 12c, Oracle 11g, Oracle 10g, DB2/LUW, DB2-ZOS, MS/SQL Server 7.0, 2000, 2005, 2008, and 2012; Integration Services (SSIS), Analysis Services (SSAS), Data Mining using SSAS 2012, MS/Access 2013.
Languages: VB.Net, JavaScript, Visual Basic, C#.
PROFESSIONAL EXPERIENCE
Confidential, PITTSBURGH PA
Senior Solution & Data Architect/Data Modeler/ETL Developer
Responsibilities:
- Participated in the architecture and development of a new computer system that helps the client manage their entire internal operation.
- Participated in the conversion of the front-end systems and the installation of a new Online Transaction Processing (OLTP) system and Data Warehouse. Developed ETL procedures to migrate the old warehouse data into the new one.
- Developed ETL procedures to automate the implementation of reference data on both the OLTP side and the OLAP side (a simplified load sketch follows this list).
- Produced Entity Relationship Diagrams (ERDs, star schemas) for all systems using the Erwin data modeling tool.
- Responsible for applying Master Data Management (MDM) strategies, requirements and policies aligned to the company’s goals for data quality, data security and data integration.
- Prepared deliverables such as source-to-target mappings and transformation-rules documentation.
- Designed New Analysis Services Cubes.
- Developed canned SSRS reports for accessing the SSAS cubes as well as the underlying DW tables on the MS BI platform.
- Interfaced with the Network Engineer, Windows Systems Engineer, SQL Server DBA, and business users.
- Made extensive use of Visual Studio Data Tools, Team Foundation Server (TFS).
- Made extensive use of Microsoft SQL Server Integration Services (SSIS) and T-SQL.
- Familiar with Business Intelligence platforms such as QlikView and Tableau.
- Participated in Database Development review sessions and Database Administration review sessions.
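A minimal sketch of the reference-data load referenced above, assuming hypothetical staging and target tables (stg.RefCountry, dbo.RefCountry); the actual procedures were implemented in SSIS against the client's schema.

```sql
-- Hypothetical reference-data upsert: keep the OLTP reference table in sync
-- with staged reference rows so the OLAP side can conform to the same values.
MERGE dbo.RefCountry AS tgt
USING stg.RefCountry AS src
    ON tgt.CountryCode = src.CountryCode
WHEN MATCHED AND tgt.CountryName <> src.CountryName THEN
    UPDATE SET tgt.CountryName = src.CountryName,
               tgt.UpdatedAt   = SYSUTCDATETIME()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CountryCode, CountryName, UpdatedAt)
    VALUES (src.CountryCode, src.CountryName, SYSUTCDATETIME());
```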
Confidential
Senior Solution & Data Architect/ETL Developer
Responsibilities:
- Responsible for architecting a highly scalable SMP data warehouse server capable of processing over 100 terabytes of information.
- The hardware comprised 4-way SMP (64-bit) systems capable of scaling up to 64 nodes.
- Participated in the conversion of the front-end systems and the installation of a new Data Warehouse. Developed ETL procedures to migrate the old warehouse data into the new one.
- Prepared deliverables such as source-to-target mappings and transformation-rules documentation.
- Converted data mining models from SQL Server 2005 to SQL Server 2012.
- Designed new Analysis Services cubes.
- Developed canned SSRS reports for accessing the SSAS cubes as well as the underlying DW tables on the MS BI platform (a sample report dataset query follows this list).
- Interfaced with the Network Engineer, Windows Systems Engineer, SQL Server DBA, and business users.
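A simplified example of the kind of dataset query behind the canned SSRS reports above; the fact and dimension names (dbo.FactSales, dbo.DimDate, dbo.DimProduct) and the report parameter are invented for illustration.

```sql
-- Hypothetical SSRS dataset query: aggregate a fact table across conformed
-- Date and Product dimensions. @ReportYear would normally be supplied by an
-- SSRS report parameter; it is declared here so the script runs standalone.
DECLARE @ReportYear int = 2015;

SELECT
    d.CalendarYear,
    p.ProductCategory,
    SUM(f.SalesAmount) AS TotalSales
FROM dbo.FactSales  AS f
JOIN dbo.DimDate    AS d ON d.DateKey    = f.DateKey
JOIN dbo.DimProduct AS p ON p.ProductKey = f.ProductKey
WHERE d.CalendarYear = @ReportYear
GROUP BY d.CalendarYear, p.ProductCategory
ORDER BY d.CalendarYear, p.ProductCategory;
```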
Confidential, Purchase NY
Senior Data Architect /Data Modeler
Responsibilities:
- Led the planning, requirements gathering, and implementation of a Big Data modeling initiative for the client, covering both structured and unstructured data.
- Designed, developed, and managed project plans in a complex, dynamic environment, revising them to meet changing requirements.
- Developed detailed tasks for accurate project planning and scheduling, as well as performing estimation, forecasting, planning, analysis, issue / risk / change management, escalation management, meeting facilitation, variance analysis, and status reporting.
- Built relationships and collaborated with key stakeholders to ensure delivery of commitments.
- Inventoried data sources for all major systems by reverse engineering the RDBMS databases (Oracle, MySQL, Informix) to produce an Enterprise Data Dictionary, reconciling all disparate data definitions and obtaining business definitions for business-critical data elements (a metadata-extraction sketch follows this list).
- Produced Data Flow Diagrams, showing flows of all major data elements between systems and reporting sources.
- Evaluated data consistency and cleanliness. Developed recommendations for cleaning up the data and for keeping it clean on a go-forward basis.
- Produced Entity Relationship Diagrams (ERDs) for all systems.
- Designed and documented data governance process.
- Developed policy that specifies who is accountable for various portions or aspects of the data, including its accuracy, accessibility, consistency, completeness, and updating.
- Defined processes concerning how the data is to be stored, archived, backed up, and protected from mishaps, theft, or attack.
- Developed a set of standards and procedures that defines how the data is to be used by authorized personnel.
- Put into place a set of controls and audit procedures that ensure ongoing compliance with government regulations.
- Defined rules and policies that govern data quality.
- Responsible for applying Master Data Management (MDM) strategies, requirements and policies aligned to the company’s goals for data quality, data security and data integration.
- Responsible for supporting MDM implementation using Master Data Services and Data Quality Services in MS/SQLServer 2012 Enterprise Edition technologies.
- Identified Data sources in order to determine data ownership.
- Used the Embarcadero Connect web application to browse, analyze, and manage metadata from disparate metadata repositories, and to help understand how information and processes are derived, how they are related, and how they are used.
- Used Embarcadero Connect to extract metadata from application, business intelligence, data integration, data modeling, and relational metadata sources, to browse and search metadata objects, trace data lineage, analyze metadata usage, and perform data profiling on the metadata in the Enterprise Metadata Repository.
- Used Embarcadero Connect to create and manage business glossaries to generate reports on the metadata in the Enterprise Metadata Repository warehouse.
- Established change management policy and procedures to control updates to the data model so that the Enterprise Data Dictionary stays up to date.
- Tools used: ER/Studio, VersionOne - Agile.
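The reverse-engineering work above was done with ER/Studio and Embarcadero Connect; purely as a rough illustration of the starting point for a data dictionary, the sketch below pulls table and column metadata from a SQL Server source using standard INFORMATION_SCHEMA views (Oracle, MySQL, and Informix expose equivalent catalogs of their own).

```sql
-- Sketch: list table/column metadata so it can be loaded into a
-- data-dictionary staging area and enriched with business definitions.
SELECT
    c.TABLE_SCHEMA             AS SchemaName,
    c.TABLE_NAME               AS TableName,
    c.COLUMN_NAME              AS ColumnName,
    c.DATA_TYPE                AS DataType,
    c.CHARACTER_MAXIMUM_LENGTH AS MaxLength,
    c.IS_NULLABLE              AS IsNullable
FROM INFORMATION_SCHEMA.COLUMNS AS c
ORDER BY c.TABLE_SCHEMA, c.TABLE_NAME, c.ORDINAL_POSITION;
```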
Confidential, CA
Senior Data Architect/IBM DataStage Developer
Responsibilities:
- Developed and validated requirements for Enterprise Data Warehouse.
- Collaborated with the PMO to develop a data integration strategy for the project.
- Prepared deliverables such as source-to-target mappings and transformation-rules documentation.
- Developed ETL solutions, architectural standards and rules for using IBM DataStage v9.1.
- Provided input on long-term technology planning for data integration and BI initiatives.
- Determined the optimal approach for obtaining data from diverse source systems.
- Implemented Change Data Capture (CDC) on SQL Server (an enablement sketch follows this list).
- Provided guidance and a best-practices strategy for maintaining SCD Types 1, 2, 3, 4, and 6 using IBM DataStage v9.1.
- Built appropriate logging and tracking systems to support the ongoing management and optimization of the system.
- Used IBM InfoSphere Information Analyzer to collect information on data flows, manage changes within the integrations, and communicate with the user community by generating analytical reports on the objects found in the Metadata Warehouse.
- Used IBM InfoSphere Information Analyzer to browse and search metadata objects, trace data lineage in order to determine dependencies between source and target data, analyze metadata usage, and perform data profiling on the metadata in the Metadata Manager warehouse.
- Demonstrated a high proficiency level with the following: large-scale ETL (Extract, Transform, Load) using IBM DataStage v9.1; DataStage development for XML and various other feeds; performance tuning; clustering; and RDBMS data modeling, design, and partitioning.
- Managed design and implementation of business reporting solutions using IBM Cognos. Highly proficient with IBM Cognos Framework Manager, Report Studio and Analysis Studio.
- Operating systems: Windows Server 2003/2010, Linux. Database servers and tools: IBM DataStage v9.1, IBM DB2/LUW, MS/SqlServer 2012 Enterprise Edition. Data modeling tool: ER/Studio. Knowledge of distributed architecture and design and of scalability techniques. Thorough knowledge of data backup, recovery, and rollback planning/implementation; DB2/LUW optimization, fine-tuning, and scalability techniques; and data security issues.
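Change Data Capture on SQL Server (noted above) is enabled per database and per table; a minimal T-SQL sketch follows, with dbo.Orders standing in for an actual source table consumed by the DataStage jobs.

```sql
-- Enable CDC at the database level (requires sysadmin).
EXEC sys.sp_cdc_enable_db;

-- Enable CDC on a hypothetical source table; SQL Server then records
-- inserts, updates, and deletes in a change table for downstream ETL.
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Orders',
    @role_name     = NULL;   -- no gating role in this sketch

-- Read all changes for the full captured LSN range.
DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn(N'dbo_Orders'),
        @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();

SELECT *
FROM cdc.fn_cdc_get_all_changes_dbo_Orders(@from_lsn, @to_lsn, N'all');
```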
Confidential, Jersey City NJ
Senior Data Architect/Informatica DBA
Responsibilities:
- Developed and validated requirements for Archiving Enterprise Data Warehouse.
- Collaborated with PMO to develop a data archiving strategy for the organization.
- Developed ETL solutions, architectural standards, and rules for using Informatica Data Archive (a simplified illustration of the archive-and-purge pattern follows this list).
- Provided input on long-term technology planning for data archiving, integration and BI initiatives.
- Determined optimal approach for obtaining data from diverse source systems.
- Built appropriate logging and tracking systems to support the ongoing management and optimization of the data archiving system.
- Operating systems: Windows Server 2003/2010, Linux. Database servers and languages: ORACLE 10g, IBM DB2/LUW, IBM DB2/Z-OS, PL/SQL. Knowledge of distributed architecture and design. Thorough knowledge of data backup, recovery, and rollback planning/implementation; ORACLE 10g, DB2/LUW, and DB2/Z-OS optimization, fine-tuning, and scalability techniques; and data security issues.
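The archiving itself was driven by Informatica Data Archive; purely as an illustration of the underlying retention pattern, the T-SQL sketch below moves rows older than an assumed seven-year cutoff from an active table into an archive table in a single atomic statement (table names and the retention window are hypothetical).

```sql
-- Illustration only: archive-and-purge using DELETE ... OUTPUT, so rows are
-- copied to the archive table and removed from the source in one statement.
DELETE FROM dbo.Trade
OUTPUT deleted.TradeId,
       deleted.TradeDate,
       deleted.Amount
INTO archive.Trade (TradeId, TradeDate, Amount)
WHERE TradeDate < DATEADD(YEAR, -7, CAST(GETDATE() AS date));
```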
Confidential, Hoboken NJ
Senior Data Architect
Responsibilities:
- Developed and validated requirements for Enterprise Data Warehouse.
- Collaborated with PMO to develop a data integration strategy for the organization.
- Developed ETL solutions, architectural standards, and rules for using Informatica PowerCenter 9.5.
- Provided input on long-term technology planning for data integration and BI initiatives.
- Provided guidance and a best-practices strategy for maintaining SCD Types 1, 2, 3, 4, and 6 using Informatica (an SCD Type 2 sketch follows this list).
- Determined optimal approach for obtaining data from diverse source systems.
- Built appropriate logging and tracking systems to support the ongoing management and optimization of the system.
- Oversaw projects and educated other Data Services team members on deliverables.
- Made recommendations to improve Data Services environments.
- Demonstrated a high proficiency level with the following: large-scale ETL (Extract, Transform, Load) using Informatica PowerCenter 9.5; Informatica development for XML and various other feeds; performance tuning; clustering; and RDBMS data modeling, design, and partitioning.
- Operating systems: Windows Server 2003/2010. Databases and languages: ORACLE 10g, PL/SQL. Knowledge of distributed architecture and design.
- VersionOne - Agile. Data modeling tool: ERWIN. Thorough knowledge of data backup, recovery, and rollback planning/implementation; ORACLE 10g optimization, fine-tuning, and scalability techniques; and data security issues.
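In PowerCenter the SCD maintenance above is implemented with lookup and update-strategy mappings; the equivalent set-based logic is easier to show in SQL, so the sketch below handles the Type 2 case for a hypothetical dim.Customer dimension fed from stg.Customer (all names and tracked attributes are assumptions).

```sql
-- Step 1: expire the current row when a tracked attribute has changed.
UPDATE d
SET    d.IsCurrent = 0,
       d.EndDate   = CAST(GETDATE() AS date)
FROM   dim.Customer AS d
JOIN   stg.Customer AS s ON s.CustomerId = d.CustomerId
WHERE  d.IsCurrent = 1
  AND (d.City <> s.City OR d.Segment <> s.Segment);

-- Step 2: insert a new current version for changed and brand-new customers
-- (any staged customer that no longer has a current row after step 1).
INSERT INTO dim.Customer (CustomerId, City, Segment, StartDate, EndDate, IsCurrent)
SELECT s.CustomerId, s.City, s.Segment, CAST(GETDATE() AS date), NULL, 1
FROM   stg.Customer AS s
WHERE  NOT EXISTS (SELECT 1
                   FROM dim.Customer AS d
                   WHERE d.CustomerId = s.CustomerId
                     AND d.IsCurrent  = 1);
```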
Confidential, New York NY
Senior Data Architect | Database Administrator
Responsibilities:
- Partnered with the AG team to create an integrated data model of the Enterprise DW.
- Validated the business requirements for the business processes in scope.
- Embraced a practical and pragmatic approach that resulted in an effective design that is implementable and extensible.
- Identified the key business user and IT roles required for a successful dimensional model design effort.
- Facilitated dimensional data modeling workshops.
- Provided guidance and a best-practices strategy for maintaining SCD Types 1, 2, 3, 4, and 6 using SSIS.
- Documented the dimensional data model in an effective set of deliverables to enable downstream implementation efforts to move forward smoothly.
- Provided knowledge transfer of the dimensional design process as well as the final design to relevant AG team members.
- The specific goals and objectives of the AG Data Warehouse Dimensional Model initiative include the following:
- Provided a comprehensive dimensional data model for the new DW that presents a consolidated model with conformed dimensions shared across business processes as appropriate to support integration for the following business areas: Exposures, Premiums, Losses, Roll forwards, Ratings, and Market Data.
- The ETL system relies on an overall infrastructure framework to standardize and automate the ETL process. This framework has five major subsystems that address the management and operations of the ETL systems. The ETL framework includes subsystems that support the instrumentation and tracking of the execution of the system and its ongoing operations. In addition, the ETL system and underlying database require ongoing maintenance.
- The DW/BI system is built with appropriate logging and tracking systems to support the ongoing management and optimization of the system. The ETL process has an auditing system to ensure the integrity of the data being loaded and to support the troubleshooting and optimization needed for any significant production system. The data warehouse database has a monitoring system to support a range of needs including performance tuning, capacity planning, and potentially even user access auditing and compliance. The enterprise reporting environment also has its own monitoring system to track the usage and performance of the organization's standard reports.
- The DW is a collection of about 30 fact tables and 60 dimension tables (info dimensions, bridge tables, surrogate key pipeline, and conformed and degenerate dimensions).
- The DW contains a minimum of 40 years of data, partitioned by year, where each year averages 5,500 MB of data (a partitioning sketch follows this list).
- Managed client expectations by building relationships; communicating project status and open issues; preparing reports; conducting reviews and issue meetings; and discovering new issues.
- Implemented solutions by monitoring project progress; tracking action items; conducting design and implementation reviews; examining, researching, and resolving issues; escalating issues to the appropriate authority; responding to team members' concerns; following production, productivity, quality, and customer-service standards; and identifying work process improvements.
- Updated job knowledge by participating in and coordinating hardware and software evaluations with vendors.
- Responsible for supporting MDM implementation using Master Data Services and Data Quality Services in MS/SQLServer 2012 Enterprise Edition technologies.
- Demonstrated a high proficiency level with the following: large-scale ETL (Extract, Transform, Load) using SQL Server 2012 Enterprise Edition; SSIS development for XML and various other feeds; performance tuning; clustering; and RDBMS data modeling and design. Developed and administered OLAP cubes in SSAS 2012. Data modeling tool: ERWIN.
- VersionOne - Agile
- Operating systems: Windows Server 2003/2010. Installing SQL Server databases; knowledge of Transact-SQL (T-SQL); knowledge of distributed architecture and design. Thorough knowledge of data backup, recovery, and rollback planning/implementation; SQL Server 2005 optimization, fine-tuning, and scalability techniques; and data security issues.
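Partitioning the 40+ years of fact data by year, as described above, maps naturally onto SQL Server partition functions and schemes; the sketch below uses a handful of boundary dates, a single filegroup, and an invented dbo.FactPremium table purely for illustration.

```sql
-- Hypothetical yearly partitioning (RANGE RIGHT on the transaction date).
CREATE PARTITION FUNCTION pfYear (date)
    AS RANGE RIGHT FOR VALUES ('2010-01-01', '2011-01-01', '2012-01-01');

-- Map every partition to PRIMARY for simplicity; production designs usually
-- spread partitions across filegroups for backup and maintenance flexibility.
CREATE PARTITION SCHEME psYear
    AS PARTITION pfYear ALL TO ([PRIMARY]);

-- Fact table created on the partition scheme, partitioned by TransactionDate.
CREATE TABLE dbo.FactPremium
(
    PolicyKey       int   NOT NULL,
    TransactionDate date  NOT NULL,
    PremiumAmount   money NOT NULL
) ON psYear (TransactionDate);
```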
Confidential Midatlantic, Mount Laurel NJ
Data Architect
Responsibilities:
- Designed, developed, monitored, and performance-tuned data warehouse tables, views, indexes, stored procedures, Service Broker components, etc. in a SQL Server 2008 Enterprise Edition environment. Designed and developed ETL processes using SSIS 2008 (a simplified load-procedure sketch follows this list).
- Developed and administered OLAP cubes in an SSAS 2008 environment, handling end-to-end tasks such as dimension, cube, and MDX script development; partitioning and aggregation strategies; and deployment and query optimization techniques.
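A simplified sketch of the kind of load procedure referenced above, with hypothetical staging, fact, and log tables; the production ETL was implemented as SSIS 2008 packages invoking stored procedures along these lines.

```sql
-- Hypothetical incremental fact load: insert only staged rows not yet in the
-- fact table, then record the row count in a (hypothetical) ETL log table.
CREATE PROCEDURE dbo.LoadFactClaim
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @rows int;

    INSERT INTO dbo.FactClaim (ClaimKey, DateKey, ClaimAmount)
    SELECT s.ClaimKey, s.DateKey, s.ClaimAmount
    FROM   stg.Claim AS s
    WHERE  NOT EXISTS (SELECT 1
                       FROM dbo.FactClaim AS f
                       WHERE f.ClaimKey = s.ClaimKey);

    SET @rows = @@ROWCOUNT;

    INSERT INTO dbo.EtlLoadLog (TableName, RowsLoaded, LoadedAt)
    VALUES (N'dbo.FactClaim', @rows, SYSDATETIME());
END;
```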