Technical Consultant Resume
SUMMARY
- Dedicated Cloud Engineer, Data Modeler, and Data Warehouse / Database Architect with over 13 years of experience (19 years of total IT experience), serving global retailers including Confidential, Confidential Supply, The Confidential Company, Confidential, and Confidential Inc., and finance clients such as Confidential and Confidential, US.
- Extensive data modeling and data architecture experience designing OLTP / E-R models and data warehouse applications using Kimball dimensional modeling (star and snowflake schemas) and Bill Inmon's 3NF approach.
- Hands-on experience with private cloud environments and managed/hosted services.
- Worked on all phases of the data warehouse development lifecycle, including requirement gathering, data analysis, development, testing, deployment, and production support.
- Proven ability to understand functional requirements and convert them into conceptual, logical, and physical data models using tools such as ERwin, ER Studio, and MS Visio.
- Demonstrated ETL architecture expertise with IBM InfoSphere Information Server (DataStage), IBM InfoSphere Change Data Capture, SSIS, the IBM Netezza appliance, and Teradata.
- Strong communication skills, with the ability to interface with senior IT management and business users to understand the business needs behind the required IT solutions.
- Proven leader experienced in mentoring junior staff, conducting code reviews, and making performance recommendations for data warehouse and ETL systems.
TECHNICAL SKILLS
Languages: C, C++, SQL, PL/SQL, Pro*C, UNIX shell scripts, Java, and JavaScript.
Database: Oracle 11g/10g/9i/8i/7.x, Teradata V2R5/V13, Netezza 4/6, SQL Server 2008/2012/2013, and Db2 Warehouse on Cloud (dashDB) v11.1.9.0
Tools: ERwin 7.3/8.6/9.5, ER Studio, Aginity, TOAD, SQL*Navigator, SQL Developer, IBM Rational ClearCase, PVCS, Visual SourceSafe, Remedy, HP Service Desk, HP Project and Portfolio Management Workbench (PPM), PTC Integrity 10, WinSQL, and HEAT
Special Software: IBM InfoSphere Information Server 8.x/9.x/11.x, IBM DataStage 7.5.x, IBM InfoSphere Change Data Capture 10.2.0, Cognos, MicroStrategy, Control-M, AutoSys, IBM Connect:Direct, IBM Lift, IBM Data Server Manager V2.1, and Microsoft Project Professional Server 2007
PROFESSIONAL EXPERIENCE
Confidential
Technical Consultant
Key Responsibilities / Achievements:
- Attended IBM Cloud training sessions to understand the cloud architecture and associated components, such as the Bluemix console, IBM Cloud Foundry organizations and spaces, and other cloud platform components.
- Worked with the IBM Cloud Support team to set up EZCORP's IBM Cloud org, space, and identity management.
- Coordinated with the EZCORP internal support team and the IBM Cloud Support team to establish the IBM Cloud Integrated Analytics Environment (ICIAE). Analysed the existing EZCORP environments and prepared a high-level cloud architecture diagram with proposed VPN tunnels for EZCORP's PROD and DR environments.
- Worked with the IBM Cloud offering team to set up the EZCORP Db2 WoC (dashDB) instance and tested connectivity from the management console and Db2 client tools such as IBM Data Server Client and Aginity for dashDB (see the connectivity sketch after this list).
- Working as the Db2 WoC admin: creating new database objects, managing users, and preparing the lower environment with real-time data for the development team. Working with the IBM Cloud Support team to resolve Db2 connectivity and other operational issues and to maintain the IBM Cloud environment.
- Coordinating with the CloudBasic vendor to set up the PROD and DEV environments for data replication.
- Providing support to configure and validate data replication from SQL Server/PeopleSoft to Db2 WoC.
- Actively participating in Db2 WoC DWH design discussions and brainstorming sessions, and helping prepare documentation of the approach and challenges. Prepared the Db2 WoC DWH design approach with high-level task details.
- Analysed the existing General Ledger Interface (GLI) data feed from Netezza and consolidated the business and technical workflow documents. Also created the GLI data feed technical mapping document.
- Created conceptual and logical data models, reviewed them with the Enterprise Architect, and deployed them in Db2 WoC.
- Working with the IBM Cloud offering team to set up the EZCORP DataStage 11.5/11.7 on Cloud environments and testing connectivity and other associated IBM components. Performing DataStage admin tasks to create DataStage projects and new user logins and to maintain security roles. Reviewed and enhanced Db2 stored procedures.
- Developed on-demand DataStage jobs to extract real-time data and load it into the Db2 WoC Dev environment.
- Installed, configured, and tested the IBM Lift tool; developed scripts to extract data from other data sources and load it into Db2 WoC for development team data needs.
- Installed, configured and tested IBM Data Server Manager V2.1 utilities for monitoring and workload management.
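A minimal sketch of the kind of Db2 WoC connectivity check described above, assuming the db2 command-line processor from IBM Data Server Client; the host name, port, credentials, and aliases are placeholders, not the actual EZCORP values:

#!/bin/sh
# Assumes the Db2 client environment (db2profile) is already sourced.
DB_USER=dev_user
DB_PASS='********'   # placeholder credential

# Catalog the remote Db2 WoC node and database once per client machine.
db2 catalog tcpip node dashdb remote example-dashdb-host.bluemix.net server 50000
db2 catalog database BLUDB as DASHDB at node dashdb

# Connect and run a trivial query to confirm connectivity and authorization.
db2 connect to DASHDB user $DB_USER using $DB_PASS || exit 1
db2 "SELECT CURRENT SERVER, CURRENT TIMESTAMP FROM SYSIBM.SYSDUMMY1"
db2 connect reset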
Confidential
BI Consultant
Key Responsibilities / Achievements:
- Analysed the existing SQL Server DWH and Netezza DWH systems; prepared a Netezza data-load inventory list.
- Attended brainstorming sessions with SMEs to understand the existing business process flows.
- Identified long-running Netezza data loads and modified NZ design features to improve performance.
- Worked with the offshore team and supported daily ETL issues and Netezza data load issues.
- Analysed the existing Netezza DWH design and documented design issues, ETL batch process issues, data source availability, the Cognos cube build process, and transaction-level/summary-level data issues.
- Analysed the existing IBM Change Data Delivery (CDD) design and proposed a new design to minimize failures.
- Performed detailed analysis of the existing Netezza DWH design and proposed a new DWH design with subject areas such as Sales, Loan, Layaway, HR, and Finance and their associated data marts.
- Analysed the existing Strategic Companion Reports and prepared a detailed measures list with business rules.
- Worked with the business team to validate the measure rules and documented user stories with technical details.
- Developed and tested data validation scripts against the source and target systems for specific measure lists.
- Performed data validations against the new Netezza data sources and documented the data issues.
- Worked with Product owners to estimate the user stories, tasks and Product Backlog Items for Sprint plans.
- Set up the ERwin 9.0 Model Mart server and performed reverse engineering against the SQL Server and Netezza databases.
- Derived a data model specific to the Strategic Companion reports and reviewed it with the team to confirm the source data.
- Worked as a backup Netezza admin; created new users and assigned roles per EZCORP security policies.
- Performed weekly Netezza maintenance activities to support regular operations (see the maintenance sketch after this list).
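A minimal sketch of the weekly Netezza maintenance pattern referenced above, assuming nzsql access with the password taken from the NZ_PASSWORD environment variable; the database, user, and table names are placeholders, and in practice the table list came from the load inventory:

#!/bin/sh
NZ_DB=EZ_DWH
NZ_USER=nz_maint   # assumes NZ_PASSWORD is set in the environment

for TBL in SALES_FACT LOAN_FACT LAYAWAY_FACT; do
    # Refresh optimizer statistics on each large table.
    nzsql -d $NZ_DB -u $NZ_USER -c "GENERATE STATISTICS ON $TBL;"
    # Reclaim space left behind by deleted and updated rows.
    nzsql -d $NZ_DB -u $NZ_USER -c "GROOM TABLE $TBL RECORDS ALL;"
done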
Confidential
Database Architect
Key Responsibilities / Achievements:
- Analysed the existing legacy databases and proposed a consolidated data model. Analysed existing SSIS packages and updated the ETL technical mapping document and the DB inventory list with application screens and data mappings.
- Studied the existing business process flows and the dependencies between applications.
- Discussed with business analysts and SMEs to understand the user stories and prepared tasks for each sprint.
- Analysed the existing data sources for each user story and prepared high-level data profile documents.
- Discussed with the business and documented the business rules and clean-up approach for data quality.
- Created data migration scripts with the required data conversion rules and performed unit testing (see the reconciliation sketch after this list).
- Created the conceptual data model and reviewed it with the enterprise architecture team for approval.
- Created logical and physical data models using entity-relationship (OLTP) modeling.
- Set up the Identity Server databases with LDAP directories for secure login pages and shared folders.
- Created deployment tasks and tested them with the DBA team to finalize the deployment plan. Reviewed test cases for each sprint and validated the results. Provided support to the development team for data-related clarifications and issues.
- Actively participated with the scrum team in daily scrums, sprint review/retrospective meetings, and sprint planning. Supported the product owner in preparing product backlog items.
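A minimal sketch of the kind of unit-test reconciliation used to validate the migration scripts, assuming SQL Server on both sides, the sqlcmd utility, and a trusted connection; the server, database, and table names are placeholders:

#!/bin/sh
# Placeholder servers and databases; each migrated table was checked the same way.
SRC="sqlcmd -S legacy-sql01 -d LegacyDB -h -1 -W"
TGT="sqlcmd -S consol-sql01 -d ConsolidatedDB -h -1 -W"

for TBL in Customer LoanAccount Payment; do
    SRC_CNT=$($SRC -Q "SET NOCOUNT ON; SELECT COUNT(*) FROM dbo.$TBL")
    TGT_CNT=$($TGT -Q "SET NOCOUNT ON; SELECT COUNT(*) FROM dbo.$TBL")
    if [ "$SRC_CNT" != "$TGT_CNT" ]; then
        echo "MISMATCH $TBL: source=$SRC_CNT target=$TGT_CNT"
    fi
done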
Confidential
Data Architect and ETL Architect
Key Responsibilities / Achievements:
- Worked as a Data Architect and ETL Architect and provided post-implementation support for R2A and R2B1.
- Performed data profiling using IBM InfoSphere Information Analyzer and maintained data lineage. Analysed R2B1 requirements and created the data model for the POEM ODS and the source-to-target mapping document.
- Modified and maintained changes in the Buying subject area's EDW data model (Kimball dimensional model).
- Worked on R2B1-related ETL design, development, testing, deployment, and post-production support activities.
- Designed and developed the Automated Data Integrity Check (ADIC) Summary framework as part of the R2B1 release to provide high-level validation between the source and target databases (see the sketch after this list).
- Worked with other IT teams (POEM/RMS application team, EQA, DBSG, CDC) for system integration testing.
- Created project artefacts, including ETL design documents, deployment build notes, and UAT validation documents, to meet the TJX Integrity deployment process. Prepared and tested the R2B1 back-out strategy for the deployment activities. Created data migration and data conversion scripts per change requests.
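A minimal sketch of the ADIC-style summary check described above, with hypothetical schema, table, and column names: row counts and an amount total are compared between a source (ODS) table and its target (EDW) table, and the result is written to a summary table for reporting.

#!/bin/sh
# Placeholder database, user, and object names; one such check per source/target pair.
nzsql -d EDW_DEV -u adic_batch <<'SQL'
INSERT INTO ADIC_SUMMARY (CHECK_NAME, SRC_CNT, TGT_CNT, SRC_AMT, TGT_AMT, CHECK_TS)
SELECT 'PO_RECEIPTS',
       (SELECT COUNT(*)         FROM ODS.PO_RECEIPTS),
       (SELECT COUNT(*)         FROM EDW.PO_RECEIPT_FACT),
       (SELECT SUM(RECEIPT_AMT) FROM ODS.PO_RECEIPTS),
       (SELECT SUM(RECEIPT_AMT) FROM EDW.PO_RECEIPT_FACT),
       CURRENT_TIMESTAMP;

-- Flag any check where source and target totals disagree.
SELECT * FROM ADIC_SUMMARY
WHERE  SRC_CNT <> TGT_CNT OR SRC_AMT <> TGT_AMT;
SQL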
Confidential
Data Architect
Key Responsibilities / Achievements:
- Worked as a Data Architect and led a 4-member team. Analysed the TJX legacy Merch Payable system (Lawson) and understood the business context. Performed data profiling using IBM InfoSphere Information Analyzer.
- Analysed the business reporting requirements and worked with business SMEs to complete the functional and technical requirements. Also reviewed all requirement documents with the business.
- Prepared the technical mapping document from the Functional Requirement Documents (FRDs), with the required technical transformation rules, and reviewed it with SMEs. Identified and validated master data requirements.
- Prepared conceptual data models, data flow diagrams, and data profiling reports and reviewed them with the DA team.
- Analysed the Oracle RMS/ReIM OLTP data model and prepared multiple solution approaches with a de-normalization process for the EDW Invoice Matching subject area and subsequent data mart requirements.
- Prepared both a 3rd-normal-form model and a Kimball dimensional model with grain details and discussed them with the enterprise architecture team; secured approval for the Kimball dimensional model.
- Designed logical and physical data models using Erwin and reviewed with Enterprise Architect team.
- Prepared the Bus Matrix for the Invoice Matching subject area and updated the document per TJX data governance.
- Created the Invoice Matching data mart using Kimball dimensional modeling to support 50+ BI reports.
- Prepared Source to Target ETL mapping documents with detailed ETL load logic and ETL transformation rules.
- Prepared data volume estimates based on business requirements and proposed high-level database/BI system capacity plans. Also prepared Key Design Documents (KDD) and presented them to the IA Review Board for approval.
- Created the Deployment Kit using the PTC tool per TJX standards and supported the EQA activities. Supported the Metadata Manager in updating the data dictionary and maintained data lineage using InfoSphere Information Governance Catalog.
- Reviewed ETL UTC documents, UTC results and provided support to ETL development activities.
- Developed data validation process for basic data quality checks to adhere with TJX data governance policy.
- Prepared source/target mock-up data for the Invoice Matching data model and supported the data mart requirements.
- Worked with the DBA team to prepare the large-table list and proposed Netezza design recommendations (see the star-schema sketch after this list).
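A heavily simplified sketch of the star-schema and distribution pattern behind the Invoice Matching data mart and the Netezza design recommendations above; all table and column names are illustrative, not the actual model:

#!/bin/sh
nzsql -d EDW_DEV -u dw_arch <<'SQL'
-- Conformed date dimension (illustrative subset of columns).
CREATE TABLE DATE_DIM (
    DATE_KEY      INTEGER NOT NULL,
    CALENDAR_DATE DATE,
    FISCAL_WEEK   INTEGER,
    FISCAL_PERIOD INTEGER
) DISTRIBUTE ON RANDOM;

-- Invoice Matching fact table at invoice-line grain.
CREATE TABLE INVOICE_MATCH_FACT (
    INVOICE_KEY  BIGINT  NOT NULL,
    VENDOR_KEY   INTEGER NOT NULL,
    DATE_KEY     INTEGER NOT NULL,
    MATCH_STATUS CHAR(1),
    INVOICE_AMT  NUMERIC(18,2),
    MATCHED_AMT  NUMERIC(18,2)
) DISTRIBUTE ON (INVOICE_KEY);   -- high-cardinality key for even data-slice distribution
SQL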
Confidential
Data Architect
Key Responsibilities / Achievements:
- Worked as a Data Architect and re-designed the EDW Buying subject area data model.
- Studied the business context and analysed the requirements from business and technical perspectives.
- Evaluated the current ODS data models and database schemas and prepared a gap analysis report.
- Prepared a solution approach document to address the gaps in the data models and database schemas.
- Performed data profiling using IBM InfoSphere Information Analyzer and maintained data lineage using InfoSphere Information Governance Catalog. Discussed the gap analysis report with the Enterprise Architecture team and finalized the approach for changes to the ODS, EDW (Kimball dimensional model), and data marts.
- Prepared and executed clean-up activities to sync the metadata across all lower environments (ODS, EDW, and data marts).
- Developed automated data integration check validation scripts for the Buying subject area.
- Presented the data models to the Information Architecture Review Board and secured approvals.
- Created the Deployment Kit using the PTC tool per TJX standards and supported the EQA activities.
- Developed a data validation process for basic data quality checks to adhere to TJX data governance policy.
- Analysed Netezza distributions on large tables and provided recommendations for performance improvements (see the skew-check sketch after this list).
- Worked with the DBA team to provide the required workgroup-level assignments in all Netezza environments.
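A minimal sketch of the distribution/skew check behind the recommendations above, assuming nzsql access; the table name is a placeholder, and DATASLICEID is the hidden Netezza column that identifies the data slice storing each row:

#!/bin/sh
TBL=BUYING_ORDER_FACT   # placeholder; rerun for each table on the large-table list

nzsql -d EDW_DEV -u dw_arch <<SQL
-- A heavily uneven row count per data slice indicates a poor DISTRIBUTE ON key.
SELECT DATASLICEID, COUNT(*) AS ROW_CNT
FROM   $TBL
GROUP  BY DATASLICEID
ORDER  BY ROW_CNT DESC;
SQL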
Confidential
Data Architect
Key Responsibilities / Achievements:
- Worked as a Data Architect and analysed the business requirements from business and technical perspectives.
- Analysed and reviewed the functional requirement documents to support the Business Analyst team.
- Conducted brainstorming sessions with the BI/DA team on BI report complexities and listed multiple solution options.
- Performed data profiling using IBM InfoSphere Information Analyzer and maintained data lineage using InfoSphere Information Governance Catalog. Prepared a gap analysis report against the functional requirements.
- Prepared Conceptual/Logical data model using Kimball dimensional modeling and reviewed with DA team.
- Created data flow diagrams, and data profiling reports for Planning Analytics Data Mart.
- Prepared Key Design Decisions (KDD) and presented them to the Information Architecture Review Board for approval.
- Developed several ETL proofs of concept to load high volumes of data sourced from large flat files or tables.
- Created data volume estimates and proposed high-level system requirements as part of the SA document preparation.
- Identified Master Data requirements for this project and validated against Master Data Authority.
- Designed logical and physical data models using ERwin and prepared Source to Target ETL mapping document.
- Created Deployment Kit using PTC tool as per TJX standard and supported the EQA activities.
- Developed a data validation process using PL/SQL scripts for data quality, adhering to TJX data governance policy.
- Analysed Netezza distributions on large tables and provided recommendations for performance improvements.
- Prepared custom scripts using NZSQL and nzload for on-demand IT requirements (see the nzload sketch after this list).
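A minimal sketch of an nzload flat-file load of the kind referenced above; the host, database, credentials, file paths, delimiter, and table name are placeholders:

#!/bin/sh
DATAFILE=/data/inbound/plan_sales_20180101.dat
LOGDIR=/data/logs

nzload -host nzhost01 -db PLAN_MART -u etl_load -pw '********' \
       -t PLAN_SALES_STG \
       -df "$DATAFILE" \
       -delim '|' \
       -lf "$LOGDIR/plan_sales.nzlog" \
       -bf "$LOGDIR/plan_sales.nzbad" \
       -maxErrors 10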
Confidential
Data Architect / Technical Architect
Key Responsibilities / Achievements:
- Led a 6-member development and QA team working with ETL and Teradata technologies. Analysed the business problems described in the RFQ and prepared an impact analysis document with proposed solutions.
- Performed Data profiling using IBM InfoSphere Information Analyzer and data analysis with custom SQL scripts.
- Prepared a list of frequently used tables and recommended creating or modifying appropriate indexes such as UPI, NUPI, USI, NUSI, or PPI (see the sketch after this list).
- Created and updated logical and physical data models using ERwin 7.6 and maintained the metadata repository.
- Built end-to-end data integration solutions using IBM DataStage 7.5/8.7 to implement new interfaces.
- Prepared detailed technical design documents for DataStage/Teradata change requests and finalized the end-to-end design for the listed problems and change requests. Prepared the source-to-target data mapping and traceability matrix documents, and provided technical support to development teams for all data-related queries.
- Presented to the GDW panel and secured approvals for the impact analysis and design documents.
- Conducted peer design reviews, code reviews, and recommended best practices.
- Provided Technical training in DataStage, Teradata, and ERwin to build the competency resource pool.
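A minimal sketch of the Teradata index recommendations above, expressed as BTEQ-executed DDL; the logon string, database, table, and columns are placeholders. The table uses a NUPI for row distribution, a date-based PPI for partition elimination, and a NUSI for a common secondary access path:

#!/bin/sh
bteq <<'EOF'
.LOGON tdprod/dw_arch,********;

CREATE TABLE GDW_T.SALES_TXN (
    TXN_ID   DECIMAL(18,0) NOT NULL,
    STORE_ID INTEGER,
    TXN_DATE DATE FORMAT 'YYYY-MM-DD',
    TXN_AMT  DECIMAL(18,2)
)
PRIMARY INDEX (TXN_ID)   /* NUPI: distributes rows evenly across AMPs */
PARTITION BY RANGE_N (TXN_DATE BETWEEN DATE '2012-01-01'
                      AND DATE '2013-12-31' EACH INTERVAL '1' MONTH);

CREATE INDEX (STORE_ID) ON GDW_T.SALES_TXN;   /* NUSI for store-level access */

.LOGOFF;
.QUIT;
EOF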
Confidential
Nielsen Media Programs
Responsibilities:
- Worked as a Data Architect and Technical Architect for DataStage and Netezza in Nielsen Media Programs Team.
- Represented the offshore team as data architect in weekly meetings with the Enterprise Architect team, explaining the architecture design status and open issues for all in-flight projects and finalizing the conceptual/logical data models.
- Used IBM InfoSphere Information Analyzer for data profiling and supported the business analysis team in completing the Business Requirement Document and Functional Requirement Document.
- Prepared the source-to-target ETL data mapping document and provided support to the respective development teams for data mapping and data analysis queries. Actively involved in media-specific proofs of concept.
- Defined the process for developing reusable components, promoted design-for-performance ideas, ensured that client standards were followed by development teams, and captured the lessons-learned report.
- Supported development teams in implementing standards and best practices; conducted code reviews, test scenario reviews, and deployment plan reviews. Created NZ scripts and stored procedures using NZSQL and nzload (see the procedure sketch after this list).
- Provided technical training in DataStage and Netezza to build the competency resource pool for new projects.
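A minimal sketch of a reusable NZSQL stored procedure of the kind referenced above; the database, user, and audit table are hypothetical, and NZPLSQL is Netezza's procedural language:

#!/bin/sh
nzsql -d MEDIA_DWH -u etl_admin <<'SQL'
-- Reusable batch-audit logger; ETL_BATCH_LOG is a hypothetical audit table.
CREATE OR REPLACE PROCEDURE LOG_BATCH_EVENT(VARCHAR(100), VARCHAR(200))
RETURNS INTEGER LANGUAGE NZPLSQL AS
BEGIN_PROC
DECLARE
    P_JOB ALIAS FOR $1;
    P_MSG ALIAS FOR $2;
BEGIN
    INSERT INTO ETL_BATCH_LOG (JOB_NAME, EVENT_MSG, EVENT_TS)
    VALUES (P_JOB, P_MSG, CURRENT_TIMESTAMP);
    RETURN 1;
END;
END_PROC;

-- Example call from a batch script.
CALL LOG_BATCH_EVENT('LOAD_RATINGS_DAILY', 'Load completed');
SQL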
Confidential, Irving, TX
Data Architect & Technical Architect (DataStage/Netezza)
Responsibilities:
- Led a 15-member BI team (onsite/offshore) and delivered 24x7 support. Created project plans using Microsoft Project Professional Server, updated them regularly with project metrics, and generated weekly project status reports. Maintained better than 97% SLA compliance for all critical incidents.
- Reduced incident inflow by an average of 20% per quarter. Converted roughly 100 critical or repeat incidents per quarter into problem tickets; implemented problem tickets by priority, improving performance and data quality. Absorbed approximately 10% growth in production applications while maintaining SLAs with the existing resources, significantly reducing cost to the company and the customer.
- Performed source data analysis and data profiling on RMS (Oracle Retek Merchandising System)
- Prepared conceptual data models, presented them to the Enterprise Architect team, and secured the approvals.
- Designed logical and physical data models using ERwin for new DWH projects and supported development activities.
- Documented the interface processes in the current legacy DWH system and redesigned them using IBM DataStage.
- Automated all the new DataStage ETL jobs using UNIX shell scripts through the Control-M scheduler (see the dsjob wrapper sketch after this list) and added data validation checks, including business rule and referential integrity checks.
- Worked with the BI/DWH practice to create standard operating procedures and implemented processes for maintenance, monitoring, backup, and recovery operations for all IBM DataStage projects.
- Analysed the Netezza query history database, identified long-running batch queries, and fine-tuned the Netezza table distributions. Prepared custom scripts using NZSQL and nzload for project-specific proofs of concept.
- Automated Netezza query history database alerts; prepared alert reports; discussed the output with the Enterprise Architect team; and recommended performance tuning options.
- Prepared the Netezza weekly maintenance process and applied IBM best practice guidelines, such as regular statistics collection and using the GROOM option to recover free space. Worked with IBM support and acted as a backup DBA.
- Worked with IBM support to analyse current workloads, configured different resource groups in Netezza, and used the Netezza workload management features to fine-tune performance.
- Defined various processes / procedures to streamline the batch processes with better business communications.
- Led a 10-member offshore team and provided application support for RMS batch processing.
- Prioritized enhancement requirements and created ETL design documents. Actively participated in development activities and delivered on schedule. Supported updates to the logical and physical data models using ERwin.
- Prepared the Netezza large-table document and worked with the onsite team to improve query performance.
- Prepared custom scripts using NZSQL and nzload for project-specific proofs of concept. Applied best practice guidelines such as regular statistics collection and using the GROOM option to recover free space.
- Prioritized, assigned, and delivered incident tickets in the HEAT incident management system and maintained SLAs.
- Actively involved in all account level trainings and created a resource pool for DataStage and Netezza.
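A minimal sketch of the Control-M-friendly UNIX wrapper pattern referenced above, using the DataStage dsjob client; the install path, project, job, and parameter names are placeholders:

#!/bin/sh
# Source the DataStage environment so dsjob is on the PATH (install path is a placeholder).
. /opt/IBM/InformationServer/Server/DSEngine/dsenv

PROJECT=RMS_DWH
JOB=seq_daily_sales_load
RUN_DATE=$1

# -run starts the job; -jobstatus waits for completion and reflects the final
# job status in the exit code, which Control-M uses to decide success or failure.
dsjob -run -jobstatus -param pRunDate="$RUN_DATE" "$PROJECT" "$JOB"
RC=$?

# Under -jobstatus, dsjob commonly exits 1 for "finished OK" and 2 for "finished with warnings".
if [ "$RC" -eq 1 ] || [ "$RC" -eq 2 ]; then
    echo "$JOB completed (rc=$RC)"
    exit 0
else
    echo "$JOB failed (rc=$RC)" >&2
    exit 1
fi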
Confidential
ETL DataStage Consultant
Responsibilities:
- Involved in the design of EDW Stores architecture; performed code reviews and implemented best practices.
- Created end-to-end process flow diagrams for all ETL jobs.
- Designed and developed DataStage jobs, job sequences, job activities, email notifications, and shared containers.
- Created UNIX shell scripts for file validation and for scheduling DataStage jobs (see the file validation sketch after this list).
- Performed unit testing and system integration testing, and supported deployment activities.
- Prepared detailed ETL specifications based on business requirements and performed data analysis.
- Coordinated with the onsite team and provided support to update the conceptual/logical data models.
- Developed custom BTEQ scripts and used Teradata utilities to load data into Teradata tables.
- Created UNIX shell scripts for file validation and DataStage jobs.
- Developed UNIX shell scripts to execute DataStage jobs.
- Created test cases and performed unit testing and system integration testing.
- Provided support in the post-deployment phase and transferred knowledge to the production support team.
- Worked as a Network and Security administrator and maintained very large LAN/WAN networks.
- Strong hands-on experience with Cisco routers and Check Point firewalls; maintained disaster recovery (DR) environments.
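A minimal sketch of the file-validation pattern referenced above, checking file arrival, non-zero size, and a trailer record count before the DataStage load is triggered; the path and trailer layout (TRL|<count> as the last line) are hypothetical:

#!/bin/sh
FILE=/data/inbound/store_sales_$(date +%Y%m%d).dat

# Fail fast if the feed has not arrived or is empty.
if [ ! -s "$FILE" ]; then
    echo "ERROR: $FILE missing or empty" >&2
    exit 1
fi

# The last line is assumed to be a trailer of the form TRL|<record_count>.
EXPECTED=$(tail -1 "$FILE" | cut -d'|' -f2)
ACTUAL=$(( $(wc -l < "$FILE") - 1 ))   # data rows, excluding the trailer

if [ "$EXPECTED" -ne "$ACTUAL" ]; then
    echo "ERROR: trailer count $EXPECTED does not match data rows $ACTUAL" >&2
    exit 1
fi

echo "File validation passed for $FILE"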