ETL Architect / Data Modeler - Senior Technical Account Manager Resume
Warren, NJ
OBJECTIVE:
- Highly motivated ETL/BI Data Architect / Senior Data Modeler / Hadoop Developer with seventeen years of professional experience designing, developing, and administering software across a range of technologies. Proven project-based leadership experience, technical expertise, problem-solving skills, and an effective communication style; a motivated, diligent team player who stays highly productive in a stressful work environment.
SUMMARY:
- 17 years of IT experience in ETL data architecture, data modeling, data analysis, Oracle SQL & PL/SQL development, and DataStage ETL development.
- Domain knowledge in Banking, Transportation, HR, Pharma, Equity Trading, E-commerce, ERP, and Credit Cards.
- Worked on functional areas such as Store/Data Migration, Enterprise Data Warehouse, Fraud Data Lake, Credit Card systems and their inner workings, TSYS interaction, HR Bonus System implementation, ERP, Legal and Compliance, and KYC.
- Excellent command of conceptual, logical, and physical data modeling.
- Well versed in Normalization/De-normalization techniques for optimum performance in relational and dimensional database environments.
- Worked with Erwin (Model Manager and Model Mart); leveraged the report feature to extract metadata from Erwin data models.
- Proficient in System Analysis, ER/Dimensional Data Modeling, Data Vault Data Modeling, Semantic Data Modeling, Database design and implementing RDBMS specific features.
- Provided guidance in establishing Data Modeling and Database Standards, Policies and Guidelines and best practices to be used across all Data Models for enterprise.
- Contributed to creating Data Dictionaries for multiple siloed systems and incorporating them into a single Enterprise Data Dictionary.
- Provided technical leadership, modeling expertise, and guidance in establishing Data Governance and Data Quality initiatives for the company.
- Solid experience using Meta Data to support Data lineage tracking and operational data governance, enabling systems to provide long-term stability and reliability.
- Experience with Data Migration, Data Conversions, Data Extraction/ Transformation/Loading (ETL) using PL/SQL Scripts.
- Extensive experience in development of SQL, OLAP, Stored Procedures, Triggers, Functions, Packages, Performance tuning and optimization for business logic implementation.
- Experience in Data Transformation, Data Mapping from source to target database schemas, data cleansing procedures.
- Created mapping documents between source and target systems, including attribute-level mappings.
- Expertise in Oracle 9i/10g databases.
- Proficient working experience with SQL, PL/SQL, Stored Procedures, Functions, Packages, DB Triggers and Indexes.
- Proficient in SQL*Loader, SQL*Plus, PL/SQL Developer, and TOAD for managing databases.
- Proficient in working with Netezza as a back-end database/appliance and DataStage as an ETL tool.
- Experienced in Tuning SQL Statements and Procedures for enhancing the load performance in various schemas across databases.
- Technical documentation of different lifecycle deliverables and user manuals.
- Used Erwin tool for developing Entities and relationships in both physical and logical model.
- Familiarity with incorporating various data sources such as Oracle, DB2, XML, and flat files into the staging area.
- Expertise in the Software Development Life Cycle (SDLC) of projects - system study, analysis, physical and logical design, resource planning, coding, testing, and implementing business applications - under both Agile and Waterfall methodologies.
- Used Planning/Scrum Poker methodology to estimate effort size of development goals for software development and plan Sprints.
- Facilitated Scrum ceremonies (grooming, sprint planning, retrospectives, daily stand-ups, etc.).
- Used Rally to create user stories, track burn-down charts, and manage product and release backlogs.
- Hands on experience in preparing User Specification Requirement, Software Requirement, Technical Documentation.
- Conducted meetings with the functional team and the technical team for requirement gathering.
- Well versed with Data quality approach to data analysis using profiling techniques.
- Proposed and led the development effort for an in-house Data Quality Analysis tool at Confidential .
- Executed Data Quality Assessments for specific functional areas of the business.
- Strong in writing Shell Scripts to automate file manipulation and data loading process.
- Excellent team member with problem-solving and trouble-shooting capabilities, Quick Learner, highly motivated, result oriented and an enthusiastic team player.
- Loaded large sets of structured, unstructured and semi structured data from different servers onto Hadoop Distributed File System.
- Migrated existing Hive processes to Spark, drastically improving performance.
- Consolidated data moved onto HDFS into single files using Pig scripts.
- Created Hive tables (both managed and external) using partitioning, dynamic partitioning and bucketing based on application requirement.
- Implemented Spark scripts using Scala and Spark SQL to read Hive tables into Spark for faster data processing.
- Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs, Scala, and Python.
- Loaded data from Linux file system to Hadoop Distributed File System and vice versa.
- Reviewed Hadoop Log files.
- Used YARN for cluster resource management.
- Managed and automated data loading using Oozie.
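The partitioning and bucketing bullets above can be illustrated with a small sketch. This is not code from any engagement described here; it is a plain-Python stand-in for how a Hive table defined with `PARTITIONED BY` and `CLUSTERED BY ... INTO n BUCKETS` distributes rows, and the table and column names are invented for illustration.

```python
# Illustrative sketch (not production code): rows are grouped by the
# partition column's value, then hashed into a fixed number of buckets
# within each partition, mimicking Hive partitioning plus bucketing.

def bucket_id(key: str, num_buckets: int) -> int:
    """Deterministic string hash of the clustering key, modulo bucket count."""
    h = 0
    for ch in key:
        h = (h * 31 + ord(ch)) & 0x7FFFFFFF
    return h % num_buckets

def layout(rows, partition_col, cluster_col, num_buckets):
    """Return {partition_value: {bucket_id: [rows]}}."""
    out = {}
    for row in rows:
        part = row[partition_col]
        b = bucket_id(str(row[cluster_col]), num_buckets)
        out.setdefault(part, {}).setdefault(b, []).append(row)
    return out

# Hypothetical sample data: orders partitioned by date, bucketed by id.
rows = [
    {"order_id": "A1", "order_date": "2015-01-01"},
    {"order_id": "A2", "order_date": "2015-01-01"},
    {"order_id": "B9", "order_date": "2015-01-02"},
]
print(layout(rows, "order_date", "order_id", 4))
```

In Hive itself the same layout would come from DDL such as `CLUSTERED BY (order_id) INTO 4 BUCKETS`; the point of the sketch is only the grouping-then-hashing behavior.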
TECHNICAL SKILLS:
Operating System: Windows NT/95/98, UNIX, MS-DOS
Hadoop Core: HDFS, Map-Reduce, Yarn
Hadoop Ecosystems: Pig, Hive, Sqoop, Spark, Spark SQL, Flume, HBase, MapReduce, Zookeeper, Oozie, Azure, Kafka
Hadoop Cluster: Cloudera CDH3/4/5
NoSQL Databases: HBase, Cassandra, Dynamo DB
Databases: Netezza, Oracle 11g, Oracle 10g/9i/8i/7, Sybase, DB2
Development Tools: Developer 2000, Designer 6i/Developer 6i, Form 6i, Report 6i, Form 4.5, Report 2.5
Databases Query Tools: SQL Navigator, TOAD (Tool for Oracle application Developers), Oracle SQL developer 3.1
ETL Tool: InfoSphere DataStage 8.1, 9.1, Informatica Powercenter 8.1, 8.6
Data Modeler: Erwin 4, 7.2 (Model Mart), Erwin 8 (Model Mart), ER Studio, MS-Visio
Data Upload Tools: SQL Loader, OLLOAD, Oracle Data Pump
Data Governance: Erwin
BI Tools: Cognos Series 7.0/6.0, MS Access Reports, MicroStrategy 9.2.1, Developer 2000 (Reports 2.5/3.0)
Programming Languages: PL/SQL, SQL, C, Pro *C, Shell Script
Version Management: ClearCase (labeling, merging & SCA build), CVS, SCCS, and Visual SourceSafe.
DBA Skills: Object Creation, Security & Access Permission, System performance, Production Release
PROFESSIONAL EXPERIENCE:
Confidential, Warren NJ
ETL Architect / Data Modeler - Senior Technical Account Manager
Responsibilities:
- Worked with the Architecture team supporting the KYC data model, data governance, and ETL architectural issues. As part of the Compliance AML (Anti-Money Laundering) systems group, worked on CitiKYC, the Know Your Customer application.
- Worked on a global Citi KYC initiative for NAM (North America) clients.
- Reviewed Design Documents for different functionality existing within KYC.
- Lead Architect for KYC implementation for Citi Screening, EAP, L1 Reports, L2 Reports and Reconciliation Reports.
- Lead Architect for Implementation of necessary system changes to ensure compliance with FinCEN's Enhanced Customer Due Diligence Rule.
- Reviewed Data Migration ETL code built for migration of cards data from source system (DQIP).
- Reviewed and approved various project artifacts such as the Data Mapping Template, Change Requests (CR), Target Record Layouts (TRL), and an internal Issue/Resolution Worksheet.
- Designed logical and physical data models for system designed and developed for Citi KYC.
- Involved in end-to-end design reviews with all stakeholders.
- Provided suggestions and improvements to the ETL framework for capturing data feeds into the EDW.
- Participated in design decision discussion on Logical Model, database scenario, ETL framework and approach for data load, Error Handling etc.
- Ensured DIT, SIT and UAT testing cycles were managed properly by meeting acceptance criteria.
- Held live sessions with the ETL team to ensure all requirements were implemented in the ETL design document for data migration; validated via a traceability matrix.
- Proactively communicated and collaborated with business, compliance, and IT teams to review deadlines and analyze functional requirements.
- Loaded large sets of structured, unstructured and semi structured data from different places onto Hadoop Distributed File System.
- Migrated existing Hive processes to Spark, drastically improving performance.
- Consolidated data moved onto HDFS into single files using Pig scripts and Python.
- Built and implemented a real-time streaming ETL pipeline using the Kafka Streams API.
- Created Hive tables (both internal and external) using partitioning, dynamic partitioning and bucketing based on application requirement.
- Implemented Spark scripts using Scala and Spark SQL to read Hive tables into Spark for faster data processing.
- Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs, Scala, and Python.
- Wrote MapReduce programs and troubleshot MapReduce job execution issues by inspecting and reviewing log files.
- Loaded data from Linux file system to Hadoop Distributed File System and vice versa.
- Reviewed Hadoop Log files.
- Used YARN for cluster resource management.
- Managed and automated data loading using Oozie.
- Used Planning/Scrum Poker methodology to estimate effort size of development goals for software development and plan Sprints.
- Facilitated Scrum ceremonies (grooming, sprint planning, retrospectives, daily stand-ups, etc.).
- Used Rally to create user stories, track burn-down charts, and manage product and release backlogs.
- Developed the Star Schema / Snowflake Schema for the proposed warehouse models to meet the requirements.
- Resolved multiple Data Governance issues to support data consistency at the enterprise level.
- Determined the transformation for each attribute being mapped, derived, or ETL-generated.
- Analyzed and updated existing logical and physical models for discrepancies with the physical database and business rule documents.
- Played significant role in various phases of project life cycle, such as requirements definition, functional & technical design, testing, Production Support and implementation.
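The "converting Hive/SQL queries into Spark transformations" work mentioned above can be sketched in miniature. This is a hedged illustration only: plain Python stands in for an RDD, and the query and column names are invented, not taken from the project.

```python
# Sketch: a Hive aggregation such as
#   SELECT customer_id, SUM(amount) FROM txns GROUP BY customer_id
# maps onto Spark-style transformations: a map step producing (key, value)
# pairs, then a reduceByKey folding values per key.

def reduce_by_key(pairs, fn):
    """Minimal reduceByKey: fold the values of each key with fn."""
    out = {}
    for k, v in pairs:
        out[k] = fn(out[k], v) if k in out else v
    return sorted(out.items())

# Hypothetical transactions, already shaped as (key, value) pairs.
txns = [("c1", 10.0), ("c2", 5.0), ("c1", 2.5)]
totals = reduce_by_key(txns, lambda a, b: a + b)
print(totals)   # [('c1', 12.5), ('c2', 5.0)]
```

In actual Spark the same shape would be `rdd.map(...).reduceByKey(lambda a, b: a + b)`; the sketch shows only the grouping-and-folding semantics that the SQL `GROUP BY` translates into.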
Environment: Oracle 10g, 11g, PL/SQL, SQL, Toad 9.2, SQL Developer/Modeler, SQL*Loader, Erwin 7.2, Visio, UNIX, Talend 5, Ab Initio, HPQC and Jira for defect tracking, Rally Agile software, Hadoop 0.20.2, Hive 0.2.0, Spark, Pig 0.11.1, Sqoop, Flume, Oozie, YARN, Azure Data Factory, Azure Storage
Confidential
ETL Architect / Data Modeler
Responsibilities:
- Worked with the Architecture team building the Enterprise Data Warehouse, data migration/conversion APIs, and the Fraud Analytical System.
- Worked in the client launch team for stores CBK, CJB, Spanx, PetSmart, Quik, DC Shoes, Nautica, Kipling, Shoe Carnival and different Market Places.
- Involved in various projects to understand data in different source systems and in conversion APIs for migration of OMS orders, webstore customers, wishlists, and CSR (customer service) data, and in an Enterprise Data Warehouse covering Customer, Order, Product, Promotion, and Wishlist.
- Built data migration/conversion APIs for the various data-migration use cases.
- Actively involved in building the Fraud Analytical Data Lake: proposed the data model, built the ETL framework, and designed the S2T (source-to-target) mapping document for data coming from various other systems.
- Designed logical and physical data models for system designed and developed in Confidential .
- Involved in the end-to-end design of Fraud as an analytical system storing aggregated data for fraud identification and checks.
- Worked with the fraud analysts on requirements gathering and maintained a product backlog for two-week sprint cycles.
- Worked on the enterprise data warehouse Data model and ETL framework for capturing data into EDW for store reporting and data feeds. This includes Customer, Order, Product, Promotion, Wishlist and Cart Data.
- Participated in design decision discussion on Logical Model, database scenario, ETL framework and approach for data load, Error Handling etc.
- Used Planning/Scrum Poker methodology to estimate effort size of development goals for software development and plan Sprints.
- Facilitated Scrum ceremonies (grooming, sprint planning, retrospectives, daily stand-ups, etc.).
- Used Rally to create user stories, track burn-down charts, and manage product and release backlogs.
- Developed the Star Schema / Snowflake Schema for the proposed warehouse models to meet the requirements.
- Resolved multiple Data Governance issues to support data consistency at the enterprise level.
- Determined the transformation for each attribute being mapped, derived, or ETL-generated.
- Analyzed and updated existing logical and physical models for discrepancies with the physical database and business rule documents.
- Played significant role in various phases of project life cycle, such as requirements definition, functional & technical design, testing, Production Support and implementation.
- Supported QA, UAT, production releases, and store launches.
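The S2T mapping documents described in this role classify each target attribute as mapped, derived, or ETL-generated. The sketch below shows that pattern in miniature; it is an assumption-laden illustration, and every field name and rule in it is hypothetical rather than drawn from the actual mapping documents.

```python
# Sketch of an S2T (source-to-target) attribute mapping: each target
# attribute is a straight rename ("map"), computed from source fields
# ("derive"), or produced by the ETL itself ("etl"). All names invented.

S2T = {
    "cust_id":   ("map",    "customer_id"),                       # 1:1 rename
    "full_name": ("derive", lambda r: r["first"] + " " + r["last"]),
    "load_dt":   ("etl",    lambda r: "2016-01-01"),              # ETL-generated stamp
}

def transform(row):
    """Apply the S2T rules to one source row, producing a target row."""
    out = {}
    for tgt, (kind, rule) in S2T.items():
        out[tgt] = row[rule] if kind == "map" else rule(row)
    return out

src = {"customer_id": 42, "first": "Ada", "last": "Lovelace"}
print(transform(src))
```

Keeping the mapping as data (rather than hard-coded column logic) mirrors why S2T documents are maintained separately from ETL code: the same document drives both the build and the review.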
Environment: Netezza, Oracle 10g, 11g, PL/SQL, SQL, Toad 9.2, SQL Developer/Modeler, SQL*Loader, Erwin 7.2, Visio, UNIX, InfoSphere DataStage 8.1, Talend 5, Pentaho, Business Objects, Jira for Defect tracking, Rally Agile Software.
Confidential
Lead Data Modeler & Senior ETL \ Oracle Developer
Responsibilities:
- Involved in projects from the requirements-analysis stage onward to better understand the requirements and give the development team a clearer understanding of the data.
- Performed impact analysis for multiple projects from a data modeling perspective.
- Designed logical and physical data models for multiple OLTP and Analytic applications.
- Designed the logical model, ensuring it followed normalization rules and offered the best possible traceability and scalability.
- Developed the Star Schema/Snowflake Schema for proposed warehouse models to meet the requirements.
- Supported the DBA in the physical implementation of the tables in both Oracle and DB2 databases.
- Performed Reverse Engineering of the current application using Erwin, and developed Logical and Physical data models for Central Model consolidation.
- Created business data constraints, indexes, sequences etc. as needed in the Physical data model.
- Resolved the data type inconsistencies between the source systems and the target system using the Mapping Documents and analyzing the database using SQL queries.
- Pulled inter-day versus intraday changes.
- Applied Data Quality Services principles and techniques.
- Applied a data-quality approach to data analysis using profiling techniques.
- Profiled data at the table, column, and domain-value levels.
- Executed Data Quality Assessments for specific functional areas of the business.
- Designed class and activity diagrams using PowerDesigner and UML tools such as Visio.
- Responsible for Enforcing Naming Standards in Erwin Model.
- Designed Data Model for Operational reports and trending reports
- Resolved multiple Data Governance issues to support data consistency at the enterprise level.
- Involved in physical transformation and Load of data for multiple Projects.
- Responsible for ensuring and providing consistent metadata across all the models in Barclaycardus.
- Performed Data Modeling by creating logical and Physical models of the database system.
- Designed and developed Data Models using Erwin to support business functionality.
- Wrote requirements for various reports to be generated and migrated.
- Wrote SQL queries to determine the proper source-to-target attribute mappings.
- Determined the transformation for each attribute being mapped, derived, or ETL-generated.
- Analyzed and updated existing logical and physical models for discrepancies with the physical database and business rule documents.
- Generated scripts and compared different versions of the data model.
- Played significant role in various phases of project life cycle, such as requirements definition, functional & technical design, testing, Production Support and implementation.
- Granted appropriate authority to the respective users.
- Supported UAT and release, and obtained sign-off.
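The table/column/domain-value profiling mentioned in this role can be sketched concretely. This is a minimal illustration under assumed inputs; the sample column and its values are invented, and real profiling tools compute far more than the three metrics shown.

```python
# Sketch of column-level data profiling: row count, null count,
# distinct count, and domain-value frequencies for one column.
from collections import Counter

def profile_column(rows, col):
    """Profile one column of a list-of-dicts table."""
    values = [r.get(col) for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "domain": dict(Counter(non_null)),   # value -> frequency
    }

# Hypothetical sample table with a nullable status column.
rows = [{"status": "A"}, {"status": "A"}, {"status": "C"}, {"status": None}]
print(profile_column(rows, "status"))
```

A table-level profile is just this run over every column; domain-value frequencies are what flag unexpected codes during a Data Quality Assessment.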
Environment: Oracle 9i, 10g, 11g, PL/SQL, SQL*PLUS, SQL, Toad 9.2, C, SQL*Loader, ERwin 7.2.3 - Model Mart, Visio, UNIX, Windows NT/XP, InfoSphere DataStage 8.1, Hyperion 9, MicroStrategy 9.2.1
Confidential
Data Modeler & Senior ETL \ Oracle Developer
Responsibilities:
- Created the complete database model in Erwin and implemented the full schema.
- Created physical and logical tables (APIs, interfaces, etc.).
- Conducted meetings with the functional team and the technical team for requirement gathering.
- Analyzed Source system and Business requirements and developed User and Functional requirements.
- Used Erwin 4.0 to develop tables and relationships in both the physical and logical models.
- Generated scripts and compared different versions of the data model.
- Used shell scripts to automate file manipulation and the data loading process.
- Used PL/SQL, stored procedures, packages, database triggers, and SQL*Loader, converting functional requirements into technical implementations wherever required.
- Played significant role in various phases of project life cycle, such as requirements definition, functional & technical design, testing, Production Support and implementation.
- Granted different authorities to the respective developers.
- Provided UAT support and obtained sign-off.
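The file-manipulation-and-load automation referenced in this role can be sketched as follows. This is not the original shell tooling: the paths are placeholders, and the load step is stubbed (a line count stands in for invoking SQL*Loader or an ETL job).

```python
# Sketch of load automation: pick up landing-zone files, "load" them
# (stubbed as a record count), then move each file to an archive
# directory. Directory names and the load step are placeholders.
import os
import shutil
import tempfile

def load_and_archive(landing, archive):
    """Process every file in landing, returning {filename: record_count}."""
    os.makedirs(archive, exist_ok=True)
    loaded = {}
    for name in sorted(os.listdir(landing)):
        path = os.path.join(landing, name)
        with open(path) as f:
            loaded[name] = sum(1 for _ in f)   # stand-in for sqlldr/ETL load
        shutil.move(path, os.path.join(archive, name))
    return loaded

# Demonstration in a throwaway temp directory.
base = tempfile.mkdtemp()
landing, archive = os.path.join(base, "in"), os.path.join(base, "arch")
os.makedirs(landing)
with open(os.path.join(landing, "cust.dat"), "w") as f:
    f.write("r1\nr2\n")
print(load_and_archive(landing, archive))   # {'cust.dat': 2}
```

Archiving only after a successful load is the property the original scripts would have cared about; a failed load should leave the file in the landing zone for reprocessing.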
Environment: Oracle 9i, 10g, 11g, PL/SQL, SQL*Plus, SQL, PL/SQL Developer, C, SQL*Loader, ERwin 4.0, ETL (Informatica Powercenter 8.1), Cognos 7.2, UNIX, Windows NT, FTP, Autosys
Confidential
Data Modeler & Senior Oracle Developer
Responsibilities:
- Total onsite ownership of the database team (team leader for the database team).
- Actively involved in onshore/offshore coordination for database-related work.
- Created the complete database model in Erwin and implemented the full schema.
- Created physical and logical tables (APIs, interfaces, etc.).
- Conducted meetings with the functional team and the technical team for requirement gathering.
- Analyzed Source system and Business requirements and developed User and Functional requirements.
- Used Erwin 4.0 to develop tables and relationships in both the physical and logical models.
- Generated scripts and compared different versions of the data model.
- Used shell scripts to automate file manipulation and the data loading process.
- Used PL/SQL, stored procedures, packages, database triggers, and SQL*Loader, converting functional requirements into technical implementations.
- Played significant role in various phases of project life cycle, such as requirements definition, functional & technical design, testing, Production Support and implementation.
- Granted different authorities to the respective developers.
- Provided UAT support and obtained sign-off.
Environment: Oracle 9i, PL/SQL, SQL*PLUS, SQL, PL/SQL Developer, C, SQL*Loader, ERwin 4.0, 7.2 Model Mart, MS-Visio, Java, UNIX, Windows NT, FTP, Autosys
Confidential
Data Modeler & Senior Oracle \ ETL Datastage Developer
Responsibilities:
- Interacted with business analysts to develop modeling techniques and restructuring strategies.
- Coordinated with DBA in creating and managing tables, indexes, db links and privileges.
- Performed design, coding, implementation and support.
- Developed back end interfaces using PL/SQL stored packages, procedures, functions, Collections, Object types and Triggers.
- Utilized SQL*Loader to load flat files into database tables.
- Utilized the PL/SQL Developer tool in developing all back-end database interfaces.
- Responsible for performing code reviews.
- Developed user documentation for all the application modules.
Environment: Oracle 9i, 10g, PL/SQL, SQL*PLUS, SQL, PL/SQL Developer, C, SQL*Loader, Erwin 4.0, Java, DataStage, UNIX, Windows NT, FTP
Confidential
Senior Oracle Developer
Responsibilities:
- High Level Design/ Low Level Design, preparation of SRS/HLD/LLD documents.
- Monitoring in the capacity of Module Leader/Reviewer for the following Phases:
- Low Level Design
- Construction/Unit testing.
- Coding of critical transactions.
- Analyzed critical transactions (identified on the basis of transaction impact) to estimate application and database server communication.
- Preparation of System Test Specifications and execution/monitoring of System test Specs as a part of the System Testing Team.
- Performing peer reviews
- Interacted with the client to get the requirements signed off.
- Involved in preparation of user manual documents.
- Resolved PRs.
Environment: Oracle 9i, PL/SQL, SQL*Plus, SQL Loader, Toad, UNIX, D2k forms and Designer 6i
Confidential
Oracle Programmer
Responsibilities:
- On the technical front, worked on the project as a programmer.
- On the managerial front, stayed in constant touch with the client to gather requirements and feedback.
- Independently implemented one module, known as STI Collmen.
- Had in-depth knowledge of the different modules.
Environment: Avalon, Oracle 8i, PL/SQL, SQL*Plus, SQL Loader, Toad, UNIX, D2k forms and Reports.