Sr. Data Architect/Modeler Resume

Lexington, KY

SUMMARY

  • 9+ years of experience as a Data Architect and Data Modeler with excellent knowledge of BI, data warehouse, ETL and big data technologies.
  • 9+ years of reporting and dashboard development; worked extensively on Master Data Management (MDM) and the applications used for MDM.
  • Experience with data modeling for NoSQL databases such as MongoDB, using both document modeling and key-value pair modeling.
  • Good experience with Tableau; involved in creating dashboards and reports in Tableau.
  • Very good experience with ACOs, PCMHs, EOCs and their clinical measures, with enterprise data models around Confidential claims.
  • Extensive experience in the complete data warehouse life cycle using SQL Server SSIS, SSRS and SSAS.
  • Experienced in data warehouse development using AWS, Redshift, MS SQL, SSAS, SSIS and Visual Studio.
  • Excellent understanding of, and 9+ years of working experience with, industry-standard methodologies such as the System Development Life Cycle (SDLC), Rational Unified Process (RUP), Agile and Waterfall.
  • Prepared status reports on project work and work-plan progress with Erwin 9.6.
  • Experienced in developing data governance models and standard operating procedures to manage over 10,000 data elements tracked by dashboards.
  • Excellent experience in data warehousing and OLTP systems; played various roles as a Data Architect.
  • Highly proficient in data modeling, retaining concepts of RDBMS, logical and physical data modeling up to Third Normal Form (3NF), and multidimensional data modeling schemas (star schema, snowflake modeling, facts and dimensions).
  • Very good understanding of AWS technologies such as EC2 and Redshift.
  • Expert-level understanding of Erwin, PowerDesigner and ER/Studio.
  • Strong experience with, and a deep understanding of, Informatica.
  • Complete knowledge of data warehouse methodologies (Ralph Kimball, Bill Inmon), ODS, EDW and metadata repositories.
  • Expertise in data analysis, design, development, implementation and testing using data conversions, extraction, transformation and loading (ETL) with SQL Server, Oracle and other relational and non-relational databases.
  • Experienced in consolidating and auditing metadata from disparate tools and sources, including business intelligence (BI), extract-transform-load (ETL), relational database, modeling-tool and third-party metadata, into a single repository.
  • Expert-level understanding of using different databases in combination for data extraction and loading, joining data extracted from different databases and loading it into a specific database.
  • Expertise in data cleansing for analysis, performing data quality testing for gaps, and liaising with data origination teams.
  • Experienced in using distributed computing architectures such as AWS products (e.g. EC2, Redshift and EMR) and Hadoop, with effective use of MapReduce, SQL and Cassandra to solve big data problems.
  • Experienced in consistently delivering results across all stages of the Software Development Life Cycle (SDLC).
  • Experienced working with various data sources such as Oracle, SQL Server, Teradata and Netezza.
  • Extensive experience with OLTP/OLAP systems and E-R modeling, developing database schemas such as star and snowflake schemas used in relational, dimensional and multidimensional modeling (see the DDL sketch after this list).
  • Extensively experienced in using tools like SQL*Plus, TOAD, SQL Developer and SQL*Loader.
  • Experienced in data modeling using design tools such as Erwin, Oracle SQL Developer, SQL Server Management Studio, MySQL, SQL*Plus and Toad.
  • Expertise in writing SQL queries, dynamic queries, sub-queries and complex joins for generating complex stored procedures, triggers, user-defined functions, views and cursors.
  • Experienced in deploying and scheduling reports using SSRS to generate daily, weekly, monthly and quarterly reports, including current status.
  • Experienced in designing and deploying reports with drill-down, drill-through and drop-down menu options, as well as parameterized and linked reports.
  • Experienced in using MapReduce and big data workloads on Hadoop and other NoSQL platforms.
  • Excellent experience with big data technologies (e.g., Hadoop, BigQuery, Cassandra).
  • Experienced in understanding stored procedures, stored functions, database triggers and packages using PL/SQL.
  • Extensive experience in advanced SQL queries and PL/SQL stored procedures.
  • Excellent grasp of data warehousing concepts, including metadata and data marts.
  • Experienced in business intelligence (SSIS, SSRS), data warehousing and dashboards.
  • Developed and delivered dynamic reporting solutions using SQL Server 2008 Reporting Services (SSRS).
  • Experienced in creating OLAP cubes, identifying dimensions, attributes and hierarchies, and calculating measures and dimension members.
  • Extensive working experience in normalization and de-normalization techniques for both OLTP and OLAP systems, creating database objects such as tables, constraints (primary key, foreign key, unique, default) and indexes.
  • Strong analytical, logical, communication and problem-solving skills, with the ability to quickly adapt to new technologies through self-learning.
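
A minimal sketch of the star-schema modeling referenced above: one fact table keyed to two dimension tables. All table and column names are hypothetical.

    -- Dimension tables (hypothetical names).
    CREATE TABLE date_dim (
        date_key       INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20160131
        calendar_date  DATE NOT NULL,
        fiscal_quarter CHAR(2)
    );

    CREATE TABLE member_dim (
        member_key  INTEGER PRIMARY KEY,      -- surrogate key
        member_id   VARCHAR(20) NOT NULL,     -- natural key from the source system
        member_name VARCHAR(100)
    );

    -- Fact table at the grain of one claim line per member per day.
    CREATE TABLE claim_fact (
        date_key     INTEGER NOT NULL REFERENCES date_dim (date_key),
        member_key   INTEGER NOT NULL REFERENCES member_dim (member_key),
        claim_amount DECIMAL(12,2),
        claim_count  INTEGER
    );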

TECHNICAL SKILLS

Programming Languages: SQL, PL/SQL, UNIX shell scripting, Perl, AWK, sed

Data Modeling: Erwin r9.6/r9.5/7.x/6.x/5.x, ER/Studio 9.7/9.0/8.0/7.x

Databases: Oracle 12c/11g/10g/9i/8i/7.x, Teradata, DB2 UDB 8.x/7.x, DB2 z/OS 9.x/8.2, SQL Server 2008/2005/2000, MySQL, MS Access, flat files, XML files

Programming Skills: SQL, PL/SQL, Shell Scripting.

Operating Systems: Windows 95/NT/98/2000/XP, Linux, Sun Solaris 2.x/5/7/8, IBM AIX 5.3/4.2, HP-UX, MS-DOS

ETL Tools: Informatica 8.x and 9.x

Scheduling Tools: Autosys, Tidal, Maestro (Tivoli)

Other Tools: Teradata SQL Assistant, Toad 9.7.2/7.4/7.3, DB Visualizer 6.0, Microsoft Office, Microsoft Visio, Microsoft Excel, Microsoft Project

PROFESSIONAL EXPERIENCE

Confidential, Lexington, KY

Sr. Data Architect/Modeler

Responsibilities:

  • Gathered business requirements, working closely with business users, project leaders and developers; analyzed the business requirements and designed conceptual and logical data models.
  • Designed a star schema data mart to convert the OLTP model to an OLAP model using Erwin; led a team in the development of the conceptual and physical data models.
  • Owned and managed all changes to the data models; created data models, solution designs and data architecture documentation for complex information systems.
  • Provided valuable input into project plans and schedules, translating business requirements into conceptual, logical and physical data models.
  • Managed data movement throughout the AWS infrastructure, including building data science models and standing up end-user applications.
  • Provided data architecture support to enterprise data management efforts, such as the development of the enterprise data model and master and reference data, as well as to projects such as the development of physical data models, data warehouses and data marts.
  • Conducted research, provided recommendations and implemented new solutions in AWS.
  • Led the strategy, architecture and process improvements for data architecture and data management, balancing long- and short-term needs of the business.
  • Built relationships and trust with key stakeholders to support program delivery and adoption of the enterprise architecture.
  • Designed, developed and tested complex data processes in AWS.
  • Wrote ad-hoc queries based on schema knowledge for various reporting requirements; wrote and tuned data ingestion procedures from external suppliers and partners using PL/SQL, Teradata SQL Assistant, SQL*Loader and third-party tools.
  • Dealt with complex data models and object-relational database mapping while producing complex reports.
  • Worked with technical analysts and software developers to identify and design data models based on requirements definitions and interactive discussions, supporting both new and existing application system processes.
  • Provided technical leadership and mentoring throughout the project life cycle, developing the vision, strategy, architecture and overall design for the assigned domain and its solutions.
  • Defined core and support technology, data entities, business functions and/or subject areas that transcend organizational and functional boundaries.
  • Developed an AWS cloud platform that supported big data processing and analytics.
  • Provided technical and architectural subject matter expertise to the various development teams, including communicating architectural decisions and mentoring other technical staff on the various development technologies and decisions.
  • Conducted strategy and architecture sessions and delivered artifacts such as the MDM strategy (current state, interim state and target state) and the MDM architecture (conceptual, logical and physical) at a detailed level.
  • Designed and developed a distributed data processing platform using big data technologies and constructed a flexible, multi-terabyte data warehouse using Redshift on AWS (see the ingestion sketch after this list).
  • Maintained data mapping documents, business matrices and other data design artifacts that define technical data specifications and transformation rules.
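
A minimal sketch of the Redshift bulk-ingestion pattern behind the multi-terabyte warehouse work above; the S3 bucket, IAM role and table names are hypothetical.

    -- Bulk-load compressed CSV extracts from S3 into a Redshift staging table.
    COPY staging.claims
    FROM 's3://example-bucket/claims/2017/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
    FORMAT AS CSV
    GZIP
    TIMEFORMAT 'auto';

    -- Refresh planner statistics after a large load.
    ANALYZE staging.claims;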

Environment: Erwin r9.6, SQL, Oracle 12c, MDM, AWS, Teradata SQL Assistant, Netezza Aginity, Informatica, PL/SQL, SQL Server, Windows, Hadoop, UNIX, Redshift, EC2.

Confidential, Overland Park, KS

Sr. Data Architect/Modeler

Responsibilities:

  • Participated in the design, development and support of the corporate operational data store and enterprise data warehouse database environment.
  • Used multiple sources for the dashboard design solution and resolved many complex issues.
  • Implemented passing parameters from dashboards to Webi reports to show the detailed data for the values selected in the dashboard.
  • Worked in an Agile data modeling methodology, creating data models in sprints within an SOA architecture.
  • Created demos in Tableau Desktop and published them onto Tableau Server.
  • Documented the whole process of working with Tableau Desktop, installing Tableau Server and evaluating business requirements.
  • Applied Agile principles and practices.
  • Worked on data migration from a SQL Server 2000 database and flat files, loading them into SQL Server 2005/2008/2008 R2/2012 databases.
  • Migrated SSIS packages from SQL Server 2008 R2 to SSIS 2012.
  • Performed sorting, formatting and grouping for custom reports in SSIS and SSRS.
  • Created named sets and calculated members, and designed scopes in SSAS, SSIS and SSRS.
  • Designed a solution using SAP MDM.
  • Performed unit tests and built the MDM repository.
  • Worked on accessing the data from MongoDB.
  • Worked with MongoDB for data modeling on NoSQL.
  • Worked on identifying and acting on cross-selling opportunities in insurance.
  • Worked on pre-qualifying and applying underwriting guidelines to new business clients with insurance.
  • Managed the Master Data Governance queue, including assessment of downstream impacts to avoid failures.
  • Worked on data quality issues in the implementation of changes as a Data Architect.
  • Worked on data architecture and database design with ER/Studio.
  • Involved in dimensional modeling of the data model using ER/Studio Data Architect; tuned and optimized Teradata queries and created the ETL data mapping documents used to design the business process; designed logical, physical and conceptual data model documents between source systems and the target data warehouse.
  • Worked with project and business teams throughout the software development life cycle to deliver a data model that is consistent with business needs and architectural standards.
  • Created, managed, and modified logical and physical data models using a variety of data modeling philosophies and techniques including Inmon or Kimball
  • Involved with the delivery of complex enterprise data solutions, with a comprehensive understanding of architecture, security, performance, scalability and reliability.
  • Profiled source data and met regularly with IT partners to develop complete source-to-target data mappings.
  • Developed requirements and performed data collection, cleansing, transformation and loading to populate facts and dimensions for the data warehouse.
  • Interacted with application development, enterprise architecture, business intelligence, technology services, and vendors on a regular basis
  • Generated DDL from model for RDBMS (Oracle, Teradata)
  • Remained current on database technology trends and upcoming database releases. Provided guidance on database technology direction
  • Coordinated with the database administration group to ensure day-to-day activities (archives, statistics collection, disk defragmentation) were performed in the most efficient manner.
  • Investigated data quality issues to determine root causes, resolved data issues and recommended process changes to prevent recurrence.
  • Involved with performance tuning of SQL.
  • Proactively investigated database query logs to find opportunities to improve the performance of OLAP and OLTP queries executing against the database (see the tuning sketch after this list).
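
A hedged illustration of the query-tuning workflow from the last two bullets, assuming Teradata (listed in the environment below); the table and column names are hypothetical.

    -- Inspect the optimizer plan for a slow OLAP query.
    EXPLAIN
    SELECT d.region_name, SUM(f.claim_amount)
    FROM claim_fact f
    JOIN region_dim d ON f.region_key = d.region_key
    GROUP BY d.region_name;

    -- Collect statistics so the optimizer can choose better join plans.
    COLLECT STATISTICS ON claim_fact COLUMN (region_key);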

Environment: Erwin, Informatica Power Center, IBM Information Analyzer, Teradata, Oracle 11g, DB2, NoSQL, Hadoop, HBase, SQL, PL/SQL, XML, Windows NT 4.0, Sun Solaris Unix 2.6, Unix Shell Scripting.

Confidential, Dayton, OH

Sr. Data Modeler/Architect

Responsibilities:

  • Involved in data mapping specifications to create and execute detailed system test plans; the data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Built complex formulas in Tableau for various business calculations.
  • Used HTML coding within the Geospatial dashboards to customize the features as per the client requirements.
  • Involved in the development of the Publication Report and the Shared services Universe.
  • Built Tableau visualizations against various data sources such as flat files in Excel and Oracle.
  • Defined the primary keys (PKs) and foreign keys (FKs) for the entities; created dimensional-model star and snowflake schemas using the Kimball methodology.
  • Worked on data mart design and creation of cubes using dimensional data modeling, identifying facts and dimensions, star schema and snowflake schema.
  • Created DTS and SSIS packages and SSRS reports on SQL Server 2000, 2005, 2008 and 2008 R2.
  • Installed SQL Server 2005 and applied service packs; migrated databases from SQL Server 2000 to SQL Server 2005.
  • Involved in extracting the data using SSIS from OLTP to pre-staging and then to the data warehouse.
  • Designed SSIS packages to load trading data from different sources (text files, Excel) to staging and from staging to the data warehouse; worked extensively on Informatica Master Data Management (MDM) and the application used for MDM.
  • Conducted MDM unit tests and code reviews with Informatica IDD
  • Worked with the non-relational database MongoDB; knowledge of Hive and HBase.
  • Developed DAO interfaces and implementations for database-related operations and implemented an IDS API for interaction with MongoDB, through which feeds are fetched for security purposes.
  • Worked on making policy changes to property policies per the insured's request.
  • Worked on implementing and assisting with the design of data governance acquisition systems.
  • Developed ER and dimensional models using PowerDesigner advanced features; created physical and logical data models using PowerDesigner.
  • Used Erwin and Visio to create 3NF and dimensional data models and published them to the business users and ETL/BI teams.
  • Involved in data analysis and creating data mapping documents to capture source-to-target transformation rules.
  • Involved in migrating the data model from one database to a Teradata database and prepared a Teradata staging model.
  • Worked with business users to gather requirements and create data flow, process flows and functional specification documents.
  • Worked on creating role-playing dimensions, factless fact tables, snowflake and star schemas.
  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment; developed working documents to support findings and assign specific tasks.
  • Created Talend Mappings to populate the data into Staging, Dimension and Fact tables.
  • Enabled speedy reviews and first-mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data.
  • Prepared test cases based on the technical specification document.
  • Involved in fixing invalid mappings; testing of stored procedures and functions; and unit and integration testing of Informatica sessions, batches and the target data.
  • Involved in the validation of OLAP unit testing and system testing of the OLAP report functionality and the data displayed in the reports.
  • Designed and implemented a Cassandra NoSQL-based database.
  • Extensively worked on Talend Designer components: Data Quality (DQ), Data Integration (DI) and Master Data Management (MDM).
  • Produced dimensional (star schema) data models for data warehouse/OLAP applications using data modeling best practices and the Ralph Kimball methodology.
  • Responsible for testing all new and existing ETL data warehouse components.
  • Moved all mappings and workflows that succeeded in the testing environment from development to production.
  • Created new services for data extraction and SOA transformation.
  • Designed and developed NoSQL solutions for all users.
  • Performed data manipulations using Talend.
  • Worked on importing and cleansing data from various sources like Teradata, Oracle, flat files and SQL Server 2005 with high-volume data.
  • Performed verification, validation and transformations on the input data (text files, XML files) before loading it into the target database.
  • Interacted with functional analysts to understand requirements and write high level test scripts.
  • Logical and Physical design using both Ralph Kimball and Bill Inmon style data warehouse designs.
  • Managed and reviewed Hadoop log files.
  • Reviewed ERD dimensional model diagrams to understand the relationships and cardinalities to help in preparation of integrity test cases.
  • Wrote test cases for data extraction, data transformation and reporting.
  • Responsible for testing schemas, joins, data types and column values among source systems, staging and the data mart (see the validation sketch after this list).
  • Analyzed the objectives and scope of each stage of testing process from the Test plan.
  • Interacted with business analysts to gather the requirements for business and performance testing.
  • Responsible for performing data validation, process flow, dependency, functionality testing and User Acceptance Testing.
  • Extensively used Quality Center to prepare and execute test cases and track bugs.
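
A hedged example of the source-to-staging validation described above, written for Oracle (MINUS); the schema and table names are hypothetical.

    -- Rows present in the source but missing or altered in staging.
    SELECT claim_id, claim_amount, status_cd
      FROM src.claims
    MINUS
    SELECT claim_id, claim_amount, status_cd
      FROM stg.claims;

    -- Quick row-count reconciliation between staging and the data mart.
    SELECT (SELECT COUNT(*) FROM stg.claims)    AS staging_rows,
           (SELECT COUNT(*) FROM dm.claim_fact) AS mart_rows
      FROM dual;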

Environment: Erwin, Informatica Power Center, Informatica IDQ, Oracle 11g, mainframes, DB2, MS SQL Server 2008 R2, NoSQL, SQL, PL/SQL, XML, Windows NT 4.0, Sun Solaris Unix 2.6, Unix Shell Scripting.

Confidential, Marlborough, MA

Sr. Data Architect/Data Modeler

Responsibilities:

  • Designed & Created Test Cases based on the Business requirements (Also referred Source to Target Detailed mapping document & Transformation rules document)
  • Worked on Help creating and maintaining the Sprint Backlog, Sprint Burndown Chart and Task Board in Agile
  • Developed IT Capitol dashboards using dashboard designer.
  • Extensively used QAAWS to develop dashboards.
  • Created and assisted users in Tableau dashboard development
  • Created an ad-hoc reporting process and created different charts in dashboards on Tableau.
  • Collaborated and shared knowledge and experience among the Team, Product Owner, ScrumMaster and Stakeholders in Agile
  • Created parameterized, cascaded, drill-down, cross-tab and drill-through reports using SSRS 2008.
  • Used both star schema and snowflake schema methodologies in building and designing the logical data model, in both Type 1 and Type 2 dimensional models.
  • Created multi-dimensional cubes using PowerPlay Transformer and published the cubes to Cognos Connection.
  • Migrated the data from the legacy Oracle system and from different flat files (CSV, TXT, Excel) to the local SQL Server using SSIS 2008.
  • Worked on Import Manager to import data into MDM, creating custom fields in the MDM Console.
  • Performed lookups for validation with MDM.
  • Performed data manipulations using Informatica
  • Used Informatica reusable components, context variables and globalMap variables.
  • Worked with workflows to automate business validations, definitions and processes to deal with exceptions in Eagle.
  • Completed auditability and system-wide security features via a centralized data repository in Eagle.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
  • Created an architecture stack blueprint for data access with NoSQL, used as the technical basis for new Cassandra projects.
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Used SQL for querying the database in a UNIX environment.
  • Developed separate test cases for the ETL process (inbound and outbound) and reporting.
  • Involved with the design and development team to implement the requirements.
  • Developed and performed manual execution of test scripts to verify the expected results.
  • Designed and developed ETL processes using the Informatica ETL tool for dimension and fact file creation.
  • Transformed data from various sources using SQL Server Integration Services and Talend Open Studio.
  • Shared responsibility for administration of Hadoop, Hive and Pig.
  • Worked with the SOA development team to create large scale business enterprise systems for clients.
  • Primary Data Analyst/Data Modeler performing business area analysis and logical and physical data modeling for a data warehouse using the Bill Inmon methodology; also designed a data mart application using the Ralph Kimball star schema dimensional methodology.
  • Performed data analysis, primarily identifying data sets, source data, source metadata, data definitions and data formats.
  • Created logical and physical data models and metadata to support the requirements; analyzed requirements to develop design concepts and technical approaches, verifying business requirements against manual reports.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Netezza.
  • Developed strategies for data warehouse implementations and data acquisitions; provided technical and strategic advice and guidance to senior managers and technical resources in the creation and implementation of data architecture and data modeling.
  • Defined corporate metadata definitions for the enterprise data-supported databases, including operational source systems, data stores and data marts; developed logical and physical data models and documented source data from multiple sources: internal systems, external source systems and third-party data.
  • Analyzed the needs and requirements of existing and proposed IT systems to support data-related requirements.
  • Designed, tested and debugged external and DB2 native stored procedures.
  • Involved in migrating the data model from one database to a Teradata database and prepared a Teradata staging model.
  • Designed star schemas and bridge tables to control slowly changing dimensions (see the Type 2 dimension sketch after this list); applied the normal forms on the OLTP database.
  • Tuned and optimized Teradata queries; created the ETL data mapping documents between source systems and the target data warehouse.
  • Involved in dimensional modeling of the data model using ER/Studio Data Architect; tuned and optimized Teradata queries and created the ETL data mapping documents used to design the business process; designed logical, physical and conceptual data model documents between source systems and the target data warehouse.
  • Developed ER and dimensional models using PowerDesigner advanced features; created physical and logical data models using PowerDesigner.
  • Involved in the process design documentation of the data warehouse dimensional upgrades.
  • Consolidated and generated database standards and naming conventions to enhance enterprise data architecture processes.
  • Successfully created and managed a conversion testing effort, which included a data quality review, two system test cycles and user acceptance testing.
  • Tested the database to check field size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata.
  • This involved coordination with DBAs and developers to determine the correct requirements for the application changes, preparation of the logical and physical models in accordance with the Enterprise Architecture, and generation of the data definition language (DDL) to create the database.
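
A minimal sketch of the slowly-changing-dimension handling mentioned above, following the standard Kimball Type 2 pattern; the dimension and its columns are hypothetical.

    -- Type 2 customer dimension: each change closes the old row and opens a new one.
    CREATE TABLE customer_dim (
        customer_key   INTEGER     NOT NULL PRIMARY KEY,  -- surrogate key
        customer_id    VARCHAR(20) NOT NULL,              -- natural/business key
        customer_name  VARCHAR(100),
        effective_date DATE        NOT NULL,
        end_date       DATE,                              -- NULL = current row
        current_flag   CHAR(1)     DEFAULT 'Y'
    );

    -- Expire the current row before inserting the changed version.
    UPDATE customer_dim
       SET end_date = CURRENT_DATE, current_flag = 'N'
     WHERE customer_id = '10042' AND current_flag = 'Y';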

Environment: Windows XP, Informatica Power Center 6.1/7.1, QTP 9.2, Test Director 7.x, Load Runner 7.0, Oracle 10g, UNIX AIX 5.2, PERL, Cassandra, Shell Scripting

Confidential, Watertown, MA

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Worked on data analysis, data profiling, data modeling, data mapping and testing for a given task (see the profiling sketch after this list).
  • Provided support for reports and Tableau applications.
  • Created complex dashboards with dynamic visibility as the key component.
  • Developed a universe in IDT for dashboards and detail reports.
  • Involved in system testing and resolved issues in Tableau.
  • Created snapshots, report caching and report history in SSRS 2008 R2 and SSIS.
  • Involved in normalization and building referential integrity constraints with SSIS.
  • Facilitated the transition from the Waterfall methodology to the Agile methodology.
  • Processed data files through Informatica; extensively worked on the Informatica ETL tool.
  • Handled different file types and databases using Informatica components, including exception handling with Die and Java components.
  • Worked on the Teradata environment based on the data from the PDM; conceived, designed, developed and implemented this model from scratch.
  • Worked with the DBA to create the physical model and tables; scheduled multiple brainstorming sessions with DBAs and the production support team to discuss views, partitioning and indexing schemes case by case for the facts and dimensions.
  • Established and maintained comprehensive data model documentation, including detailed descriptions of business entities, attributes and data relationships.
  • Developed mapping spreadsheets for the ETL team with source-to-target data mapping, physical naming standards, data types, volumetrics, domain definitions, transformation rules and corporate metadata definitions.
  • Administrator for Pig, Hive and HBase, installing updates, patches and upgrades.
  • Investigated various NoSQL database alternatives and methods for online database compression.
  • Processed data files through Talend; extensively worked on the Talend ETL tool.
  • Data mart modeling and design for data mart releases using the Ralph Kimball methodology.
  • This involved analysis of a variety of source system data, coordination with subject matter experts, development of standardized business names and definitions, construction of a non-relational data model using the Erwin v9.2 modeling tool, publishing of a data dictionary, review of the model and dictionary with subject matter experts, and generation of data definition language.
  • Performed transfer and loading of files to DB2.
  • Helped the BI and ETL developers, the project manager and end users understand the data model, data flow and the expected output for each model created.
  • Gained comprehensive knowledge and experience in process improvement, normalization/de-normalization, data extraction, data cleansing and data manipulation.
  • Created, documented and maintained logical and physical database models in compliance with enterprise standards and maintained corporate metadata definitions for enterprise data stores within a metadata repository.
  • Managed full SDLC processes involving requirements management, workflow analysis, source data analysis, data mapping, metadata management, data quality, testing strategy and maintenance of the model.
  • Consolidated and audited metadata from disparate tools and sources, including business intelligence (BI), extract-transform-load (ETL), relational database, modeling-tool and third-party metadata, into a single repository.
  • Expert-level understanding of using different databases in combination for data extraction and loading, joining data extracted from different databases and loading it into a specific database.
  • Excellent understanding and working experience of industry-standard methodologies such as the System Development Life Cycle (SDLC), Rational Unified Process (RUP), Agile and Waterfall.
  • Extensively used Erwin as the main modeling tool, along with Visio.
  • Used SQL on a wide scale for analysis, performance tuning and testing.
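
A hedged illustration of the kind of data-profiling query used in this role; the table and column names are hypothetical.

    -- Profile a source column: volume, null rate and cardinality.
    SELECT COUNT(*)                                        AS total_rows,
           SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END)  AS null_emails,
           COUNT(DISTINCT email)                           AS distinct_emails
      FROM src.customer;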

Environment: PDM, SAP, JD Edwards, Teradata 13.10, Microsoft SQL Server 2012, SQL Manager, SAP Logon, Erwin 8.0, Visio, NoSQL, Informatica, Business Objects XI, SOA, Teradata SQL Assistant

Confidential, New York, NY

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Developed working documents to support findings and assign specific tasks.
  • Proposed design solutions for MDM.
  • Worked on fetching data from SAP HANA as a data source for detail reports based on the dashboard.
  • Designed dashboards and built business logic for the creation of reports in Tableau.
  • Worked on the maintenance of Tableau Server.
  • Worked in dimensional data modeling, star schema and snowflake schema, facts and dimensions.
  • Worked on formulating the dimensional model to suit the reporting requirements.
  • Created test cases and test scripts for SIT and UAT on MDM.
  • Extensively worked on Informatica Designer components: Informatica Data Quality (IDQ) and Master Data Management (MDM).
  • Created complex mappings in Informatica 9.x
  • Created Informatica Mappings to populate the data into Staging, Dimension and Fact tables.
  • Created complex mappings in Talend 5.x.
  • Excellent exposure to data profiling in relation to data warehouse and BI development.
  • Analyzed web-based apps for the digital marketing of products over browsers.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Wrote and executed unit, system, integration and UAT scripts in data warehouse projects.
  • Worked on front-end Java applications for data analysis, providing results to business users.
  • Supported application programmers and users in their daily interactions with DB2.
  • Extensively used ETL methodology to support data extraction, transformation and loading processes in a complex EDW, using Informatica.
  • Performed data reconciliation between integrated systems.
  • Metrics reporting, data mining and trend analysis in a help desk environment using Access.
  • Wrote complex SQL queries to validate the data against different kinds of reports generated by Business Objects XI R2.
  • Extensively used MS Access to pull data from various databases and integrate the data.
  • Worked on SAS and IDQ for data analysis.
  • Assisted in the oversight of compliance with the enterprise data standards.
  • Worked on importing and cleansing data from various sources like Teradata, Oracle, flat files and SQL Server 2005 with high-volume data.
  • Worked with Excel pivot tables.
  • Created and monitored workflows using Workflow Designer and Workflow Monitor.
  • Performed data management projects and fulfilled ad-hoc requests according to user specifications, utilizing data management software programs and tools like Perl, Toad, MS Access, Excel and SQL.
  • Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements mapped to test scripts, ensuring that any change control in requirements leads to a test case update.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked with data quality issues (see the reconciliation sketch after this list).
  • Developed regression test scripts for the application; involved in metrics gathering, analysis and reporting to the concerned team; and tested the testing programs.
  • Analyzed mainframe data to generate reports for business users.
  • Identified and recorded defects with the information required for issues to be reproduced by the development team.

Environment: Quality Center 9.2, MS Excel 2007, DB2, PL/SQL, Java, Business Objects XI R2, ETL tools (Informatica 8.6/9.1/9.5, SSIS), Oracle 11g, Teradata R13, Teradata SQL Assistant

Confidential, Minnetonka, MN

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Involved in reviewing business requirements and analyzing data sources from Excel, Oracle and SQL Server for the design, development, testing and production rollover of reporting and analysis projects.
  • Configured SSIS packages using Package configuration wizard to allow packages run on different environments.
  • Worked on a prototype to export summary data from the P2P area to develop a dashboard report based on the Xcelsius front end.
  • Used SSIS to create ETL packages to validate, extract, transform and load data to data warehouse databases, data mart databases, and process SSAS cubes to store data to OLAP databases
  • Modified the automated scripts from time to time to accommodate changes/upgrades in the application interface.
  • Tested the database to check field size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata.
  • Participated in meetings, reviews and user group discussions, as well as communicating with stakeholders and business groups.
  • Documented and published test results; troubleshot and escalated issues.
  • Prepared various test documents for the ETL process in Quality Center.
  • Involved in test scheduling and milestones with their dependencies.
  • Performed functionality testing of email notifications for ETL job failures, aborts or data issue problems.
  • Identified, assessed and communicated potential risks associated with testing scope, product quality and schedule.
  • Created and executed test cases for ETL jobs to upload master data to the repository.
  • Responsible for understanding, and training others on, the enhancements and new features developed.
  • Conducted load testing and provided input into capacity planning efforts.
  • Provided support to the client in assessing how many virtual user licenses would be needed for performance testing, specifically load testing using LoadRunner.
  • Created and executed test scripts, cases and scenarios to determine optimal system performance according to specifications.

Environment: Windows XP, Informatica Power Center 6.1/7.1, QTP 9.2, Test Director 7.x, Load Runner 7.0, Oracle 10g, UNIX AIX 5.2, PERL, Shell Scripting

Confidential, St. Louis, MO

Data Modeler/Data Analyst

Responsibilities:

  • Created and executed test cases for ETL jobs to upload master data to the repository.
  • Created logs for the ETL load at the package and task level to record the number of records processed by each package and each task within a package, using SSIS (see the logging sketch after this list).
  • Configured SSIS packages using the package configuration wizard to allow packages to run in different environments.
  • Tested the database to check field size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata.
  • Involved in reviewing business requirements and analyzing data sources from Excel, Oracle and SQL Server for the design, development, testing and production rollover of reporting and analysis projects.
  • Documented and published test results; troubleshot and escalated issues.
  • Prepared various test documents for the ETL process in Quality Center.
  • Involved in test scheduling and milestones with their dependencies.
  • Performed functionality testing of email notifications for ETL job failures, aborts or data issue problems.
  • Identified, assessed and communicated potential risks associated with testing scope, product quality and schedule.
  • Responsible for understanding, and training others on, the enhancements and new features developed.
  • Conducted load testing and provided input into capacity planning efforts.
  • Provided support to the client in assessing how many virtual user licenses would be needed for performance testing, specifically load testing using LoadRunner.
  • Created and executed test scripts, cases and scenarios to determine optimal system performance according to specifications.
  • Modified the automated scripts from time to time to accommodate changes/upgrades in the application interface.
  • Participated in meetings, reviews and user group discussions, as well as communicating with stakeholders and business groups.
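
A minimal sketch of the package- and task-level ETL load logging described above, in SQL Server syntax; the audit table and sample values are hypothetical.

    -- Audit table populated by each SSIS package/task (e.g. via Execute SQL tasks).
    CREATE TABLE etl_audit_log (
        log_id         INT IDENTITY(1,1) PRIMARY KEY,
        package_name   VARCHAR(100) NOT NULL,
        task_name      VARCHAR(100),
        rows_processed INT,
        started_at     DATETIME,
        finished_at    DATETIME
    );

    -- Example entry written at the end of a task.
    INSERT INTO etl_audit_log (package_name, task_name, rows_processed, started_at, finished_at)
    VALUES ('LoadMasterData', 'StageCustomers', 15230, '2008-06-01 02:00', '2008-06-01 02:07');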

Environment: Windows XP, Informatica Power Center 6.1/7.1, QTP 9.2, Test Director 7.x, Load Runner 7.0, Oracle 10g, UNIX AIX 5.2, PERL, Shell Scripting

Confidential, St. Louis, MO

Data Analyst

Responsibilities:

  • Created and executed test cases for ETL jobs to upload master data to the repository.
  • Implemented SSIS data loads, created maintenance procedures and provided data integrity strategies.
  • Created ETL packages using SSIS to move data from various heterogeneous data sources.
  • Created and monitored workflows using Workflow Designer and Workflow Monitor.
  • Performed data management projects and fulfilled ad-hoc requests according to user specifications, utilizing data management software programs and tools like Perl, Toad, MS Access, Excel and SQL.
  • Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements mapped to test scripts, ensuring that any change control in requirements leads to a test case update.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked with data quality issues.
  • Developed regression test scripts for the application; involved in metrics gathering, analysis and reporting to the concerned team; and tested the testing programs.
  • Modified the automated scripts from time to time to accommodate changes/upgrades in the application interface.
  • Tested the database to check field size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata.
  • Participated in meetings, reviews and user group discussions, as well as communicating with stakeholders and business groups.

Environment: Windows XP, Informatica Power Center 6.1/7.1, QTP 9.2, Test Director 5.x, Load Runner 6.0, Oracle 10g, UNIX AIX 5.1, PERL, Shell Scripting
