Sr. Data Analyst/Modeler Resume
Chicago, IL
SUMMARY
- Over 8 years of extensive Information Technology experience in all phases of the Software Development Life Cycle, including System Analysis, Design, Data Modeling, Dimensional Modeling, Implementation, and Support of various applications in OLTP and Data Warehousing environments.
- Data Modeler with strong Conceptual, Logical, and Physical Data Modeling skills and Data Profiling skills; experienced in maintaining Data Quality, conducting JAD sessions for requirements gathering, creating data mapping documents, and writing functional specifications and queries.
- Experience with Mainframe systems (COBOL, JCL, CICS, VSAM, DB2, IMS, IDMS), as well as conversion of Mainframe data to ETL staging tables.
- Expertise in Gathering and Analyzing Information Requirements, Data Analysis, Data Architecture, Business (E-R) Modeling, Dimensional Modeling, and ETL Design.
- Experience in Informatica PowerCenter 9.x/8.x with Oracle 11g/10g/9i; experience in SSIS and SQL Server Data Warehouse in Microsoft DW/BI environments.
- Experience in implementing Data Warehousing solutions involving Dimensional Modeling and Snowflake schema implementation (3NF).
- Experience in integration of various data sources with multiple Relational Databases like SQL Server, Teradata, Oracle, and DB2.
- Experience on various Software Development Life Cycles including Analysis, Design, Development and Testing to solidify client requirements in conjunction with Software Developers.
- Experience in RDBMS (Oracle) with PL/SQL, SQL, Stored Procedures, Functions, Packages, and Triggers; worked with multi-terabyte databases.
- Hands-on experience in migrating database applications from legacy to newer technology, data movement, and data mapping.
- Good experience in data transformation, data mapping from source to target database schemas, and data cleansing.
- Used the Teradata FastExport utility to export large volumes of data from Teradata tables and views for processing and reporting needs.
- Experience in modeling with both OLTP/OLAP systems and Kimball and Inmon Data Warehousing environments.
- Expertise in Extract, Transform, and Load (ETL) of data from spreadsheets, database tables, and other sources using Microsoft Data Transformation Services (DTS) and Informatica.
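The source-to-target transformation and cleansing work described above can be sketched as a minimal extract-transform-load step. All column names, rules, and sample rows below are illustrative, not taken from any actual project:

```python
# Minimal ETL sketch: extract raw rows, apply cleansing/mapping rules,
# load into a target structure. Names and rules are hypothetical.
raw_rows = [
    {"CUST_NM": "  alice  ", "BAL": "100.50"},
    {"CUST_NM": "BOB", "BAL": "  75.00"},
]

def transform(row):
    # Cleansing: trim whitespace, normalize case, cast numeric text
    return {
        "customer_name": row["CUST_NM"].strip().title(),
        "balance": float(row["BAL"]),
    }

target = [transform(r) for r in raw_rows]
print(target)
# [{'customer_name': 'Alice', 'balance': 100.5},
#  {'customer_name': 'Bob', 'balance': 75.0}]
```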
TECHNICAL SKILLS
Data Modeling Tools: Erwin r9, Erwin r8, Erwin r7.1/7.2, Rational Rose 2000, ER Studio, and Oracle Designer
OLAP Tools: Microsoft Analysis Services, Business Objects, and Crystal Reports 9
ETL Tools: Microsoft DTS/SSIS, SSRS and Informatica 7.1.3
Programming Languages: SQL, T-SQL, Base SAS and SAS/SQL, HTML, XML, VB.NET
Database Tools: Microsoft SQL Server 2000/2008, Teradata, Oracle 10g/9i, and MS Access
Packages: Microsoft Office 2010, Microsoft Project 2010, SAP, Microsoft Visio, and SharePoint Portal Server 2003/2007
Operating Systems: Microsoft Windows 9x/NT/2000/XP/Vista/7 and UNIX
Quality Assurance Tools: WinRunner, LoadRunner, TestDirector, QuickTest Pro, Quality Center, Rational Functional Tester
PROFESSIONAL EXPERIENCE
Confidential, Chicago, IL
Sr. Data Modeler/Analyst
Responsibilities:
- Part of the team responsible for the analysis, design and implementation of the business solution.
- Developed the logical data models and physical data models that capture current-state and future-state data elements and data flows using ER Studio.
- Developed a Conceptual Model using Erwin based on requirements analysis.
- Used Erwin for reverse engineering to connect to the existing database and ODS, create a graphical representation in the form of Entity Relationships, and elicit more information.
- Involved in Data Mapping activities for the data warehouse.
- Produced PL/SQL statements and stored procedures in DB2 for extracting as well as writing data.
- Ensured production data was replicated into the data warehouse without any data anomalies from the processing databases.
- Developed Star and Snowflake schema based dimensional models for the data warehouse.
- Created the Physical Data Model from the Logical Data Model using the Compare and Merge Utility in ER/Studio and worked with the naming standards utility.
- Reverse Engineered DB2 databases and then forward engineered them to Teradata using ER Studio.
- Provided source-to-target mappings to the ETL team to perform initial, full, and incremental loads into the target data mart.
- Responsible for migrating the data and data models from the SQL Server environment to the Oracle 10g environment.
- Worked closely with the ETL SSIS Developers to explain the complex Data Transformation logic. Extensively worked on the naming standards which incorporated the enterprise data modeling.
- Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
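The Star/Snowflake dimensional models above follow the usual fact-plus-dimension pattern; a minimal sketch using SQLite, with hypothetical table and column names:

```python
import sqlite3

# Star-schema sketch: one dimension table, one fact table keyed to it.
# All names and sample data are illustrative, not from the project.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per product
cur.execute("""CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    product_name TEXT,
    category TEXT)""")

# Fact table: grain = one sale, with a foreign key to the dimension
cur.execute("""CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    product_key INTEGER REFERENCES dim_product(product_key),
    amount REAL)""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0)])

# Typical star-schema query: aggregate facts by a dimension attribute
rows = cur.execute("""
    SELECT d.product_name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.product_name
    ORDER BY d.product_name""").fetchall()
print(rows)  # [('Gadget', 75.0), ('Widget', 150.0)]
```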
Environment: Erwin r9, SQL Server 2008, SQL Server Analysis Services 2008, SSIS 2008, SSRS 2008, Oracle 10g, Business Objects XI, Rational Rose, MS Office, MS Visio
Confidential, Chicago, IL
Sr. Data Analyst/Modeler
Responsibilities:
- Designed the LDM/PDM for the Coverage Model / Online Watch List Management using Erwin for Global Anti-Money Laundering & Economic Sanctions.
- Involved in preparing Logical Data Models and Physical Data Models.
- Identified source systems, their connectivity, and related tables and fields, and ensured data suitability for mapping.
- Designed the Logical/Physical Data Model for the E-Delivery System using ER Studio.
- Created SQL*Loader scripts; performed table sizing, indexing, table partitioning, and SQL tuning.
- Created and tuned PL/SQL procedures and SQL queries for Data Validation in the ETL process.
- Created validation reports and system integration reports using Oracle Developer Suite 10g.
- Extracted data from various sources like Oracle, Netezza, and flat files and loaded it into the target Netezza database.
- Created DDL and DML scripts; created and worked with X-Reference tables for data validation between different data marts/databases.
- Created PL/SQL procedures and triggers, generated application data, created users and privileges, and used Oracle import/export utilities.
- Extracted data from various sources like Oracle, Mainframes, flat files and loaded into the target Netezza database.
- Designed data models to GE-ERC standards up to 3NF (OLTP/ODS) and denormalized (OLAP) data marts with Star & Snowflake schemas.
- Created more than 20 new models with more than 100 tables; used Star and Snowflake schemas for data marts and the Data Warehouse.
- Created DHTML using PL/SQL Toolkit packages and Dynamic SQL; used Apache.
- Provided conceptual and technical modeling assistance to developers and DBAs using Erwin and Model Mart; validated Data Models with IT team members and clients.
- Worked on IBM Information Analyzer to profile the data and generate various reports
- Performed data analysis and application tuning using ANALYZE and EXPLAIN PLAN.
- Extracted the source data from Oracle tables, MS SQL Server, sequential files, and Excel sheets.
- Developed mappings in Informatica to load the data from various sources, including SQL Server, DB2, Oracle, and flat files, into the Data Warehouse, using transformations such as Source Qualifier.
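The X-Reference validation work described above typically reconciles row counts and per-key values between a source and a target. A minimal sketch, with hypothetical keys and values:

```python
# Cross-reference validation sketch: compare row counts and per-key
# values between a source and a target dataset. Data is illustrative.
source = {"A001": 150.00, "A002": 75.50, "A003": 20.00}
target = {"A001": 150.00, "A002": 75.50, "A003": 20.00}

def validate(src, tgt):
    issues = []
    if len(src) != len(tgt):
        issues.append(f"row count mismatch: {len(src)} vs {len(tgt)}")
    for key, value in src.items():
        if key not in tgt:
            issues.append(f"missing key in target: {key}")
        elif tgt[key] != value:
            issues.append(f"value mismatch for {key}: {value} vs {tgt[key]}")
    return issues

print(validate(source, target))  # [] -> source and target reconcile
```

An empty issue list means the two data marts agree on this slice; any mismatch is reported per key so it can be traced back to the ETL step that produced it.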
Environment: Oracle 8i, PL/SQL, SQL*Loader 8.1, Toad 4.0, UML, Erwin 3.5, K-Shell, PL/SQL Toolkit, Perl, JDBC
Confidential, Colorado Springs, CO
Sr. Data Modeler/Analyst
Responsibilities:
- Interacted with business users to analyze business processes and requirements, transformed requirements into Conceptual, Logical, and Physical Data Models, designed the database, and documented and rolled out the deliverables.
- Conducted analysis and profiling of potential data sources, upon high-level determination of sources during initial scoping by the project team.
- Developed logical/physical data models using the Erwin tool across the subject areas based on the specifications, and established referential integrity of the system.
- Worked with the ETL team to create source-to-target mappings and performed validation of the mappings.
- Analyzed a large number of COBOL copybooks from multiple mainframe sources (16) to understand existing constraints, relationships, and business rules in the legacy data.
- Involved in data model reviews with the internal data architect, business analysts, and business users, with in-depth explanation of the data model to make sure it was in line with business requirements.
- Applied business analysis concepts to logical data modeling, data flow processing, and database design.
- Created the Physical Data Model (PDM) for the OLAP application using ER Studio.
- Participated in JAD sessions with business users and sponsors to understand and document the business requirements in alignment with the financial goals of the company.
- Worked with DBA to create the physical model and tables.
- Developed and maintained a data dictionary to create metadata reports for technical and business purposes.
- Developed Data Mapping, Data Governance, Transformation, and Cleansing rules for the Master Data Management Architecture involving OLTP, ODS, and OLAP.
- Collaborated with the Reporting Team to design Monthly Summary Level Cubes to support the aggregated level of detailed reports; worked on snowflaking the Dimensions to remove redundancy.
- Collaborated with ETL, BI, and DBA teams to analyze and provide solutions to data issues and other challenges while implementing the OLAP model.
- Worked on identifying facts, dimensions, and various other dimensional modeling concepts used in data warehousing projects.
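Source profiling of the kind mentioned above usually starts with null and distinct counts per column, feeding the data dictionary. A minimal sketch over illustrative sample records:

```python
# Column-profiling sketch: null counts and distinct counts per column,
# the first pass when assessing a candidate data source.
# Sample records and column names are hypothetical.
records = [
    {"cust_id": "C1", "state": "IL", "zip": "60601"},
    {"cust_id": "C2", "state": "IL", "zip": None},
    {"cust_id": "C3", "state": "CO", "zip": "80901"},
]

def profile(rows):
    stats = {}
    for col in rows[0]:
        values = [r[col] for r in rows]
        non_null = [v for v in values if v is not None]
        stats[col] = {
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
        }
    return stats

print(profile(records))
# {'cust_id': {'nulls': 0, 'distinct': 3},
#  'state': {'nulls': 0, 'distinct': 2},
#  'zip': {'nulls': 1, 'distinct': 2}}
```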
Environment: Erwin 4.0, Erwin 3.5.2, Toad, PL/SQL, Oracle 9i, SQL Server 2000, Windows 2005, ER Studio 7.1.1, Quest Central for DB2 v4.8, COBOL, Teradata, Microsoft SQL Server 2008 Reporting Services
Confidential, Lincolnshire, IL
Sr. Data Modeler/Analyst
Responsibilities:
- Involved in creating Physical and Logical models using Erwin.
- Created and maintained Database Objects (Tables, Views, Indexes, Partitions, Synonyms, Database Triggers, Stored Procedures) in the data model.
- Designed the ER diagrams, the logical model (relationships, cardinality, attributes, and candidate keys), and the physical database (capacity planning, object creation, and aggregation strategies) for Oracle and Teradata as per business requirements using Erwin.
- Involved in extensive DATA validation using SQL queries and back-end testing
- Used SQL for Querying the database in UNIX environment
- Wrote complex SQL queries for validating the data against different kinds of reports generated by Business Objects XI R2.
- Extensively used ETL methodology to support data extraction, transformation, and loading processes in a complex EDW using Informatica.
- Created tables and queries to produce additional ad hoc reports.
- Responsible for analyzing various heterogeneous data sources such as flat files, ASCII data, EBCDIC data, and relational data (Oracle, DB2 UDB, MS SQL Server).
- Performed data mining on Claims data using very complex SQL queries and discovered claims pattern.
- Used Teradata OLAP functions like RANK, ROW_NUMBER, QUALIFY, CSUM, and SAMPLE.
- Designed and developed cubes using SQL Server Analysis Services (SSAS) in Microsoft Visual Studio 2008.
- Resolved data type inconsistencies between the source systems and the target system using the Mapping Documents.
- Experience in Data Transformation and Data Mapping from source to target database schemas, and in data cleansing.
- Resolved outstanding issues by conducting and participating in JAD sessions with the users, modelers, and developers.
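The Teradata QUALIFY/ROW_NUMBER pattern mentioned above keeps the top-ranked row per partition. A pure-Python equivalent of that logic, over hypothetical claims-style rows:

```python
from itertools import groupby
from operator import itemgetter

# Sample rows: (claim_id, member_id, amount). Illustrative only.
rows = [
    ("CL1", "M1", 200.0),
    ("CL2", "M1", 500.0),
    ("CL3", "M2", 300.0),
]

# Equivalent of the Teradata idiom:
#   SELECT ... QUALIFY ROW_NUMBER() OVER
#     (PARTITION BY member_id ORDER BY amount DESC) = 1
# i.e. keep the highest-amount claim per member.
ordered = sorted(rows, key=lambda r: (r[1], -r[2]))
top_per_member = [next(grp) for _, grp in groupby(ordered, key=itemgetter(1))]
print(top_per_member)  # [('CL2', 'M1', 500.0), ('CL3', 'M2', 300.0)]
```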
Environment: PL/SQL, Business Objects XIR2, ETL Tools Informatica, Oracle, Teradata V2R13/R14.10, Teradata SQL Assistant 12.0, Erwin
Confidential, Birmingham, AL
Data Analyst
Responsibilities:
- Involved in Data mapping specifications to generate mapping documents containing various transformation rules to be consumed by ETL teams.
- Involved in compiling, organizing, mining and reporting financial data
- Involved in development, implementation, and roll-out of dashboards for business objective metrics
- Created and executed test scripts, scenarios and test plans that validated initial business requirements and desired analytical reporting capabilities
- Produced data maps, collaborated in designing and validating the logical data model design and prototyping
- Led documentation efforts for interviews, data warehousing requirements, and application data requirements.
- Analyzed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment.
- Developed working documents to support findings and assign specific tasks
- Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
- Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
- Performed data mining on Claims data using very complex SQL queries and discovered claims pattern.
- Responsible for different Data mapping activities from Source systems to EDW, ODS & data marts.
- Created the test environment for Staging area, loading the Staging area with data from multiple sources.
- Responsible for analyzing various heterogeneous data sources such as flat files, ASCII data, EBCDIC data, and relational data (Oracle, DB2 UDB, MS SQL Server).
- Delivered files in various formats (e.g., Excel files, tab-delimited text, comma-separated text, pipe-delimited text, etc.).
- Performed ad hoc analyses as needed.
- Tested several stored procedures and wrote complex SQL syntax using CASE, HAVING, CONNECT BY, etc.
- Tested the ETL process both before and after data validation; tested the messages published by the ETL tool and the data loaded into various databases.
- Created UNIX scripts for file transfer and file manipulation.
- Provided support to the client in assessing how many virtual user licenses would be needed for performance testing.
- Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked on data quality issues.
- Worked with end users to gain an understanding of information and core data concepts behind their business.
- Assisted in defining business requirements for the IT team and created BRD and functional specification documents, along with mapping documents, to assist the developers in their coding.
- Tested the database to check field size validation, check constraints, and stored procedures, cross-verifying the field sizes defined within the application against the metadata.
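Producing the delimited file formats listed above comes down to writing the same records with different delimiters. A minimal sketch using Python's `csv` module, with hypothetical field names:

```python
import csv
import io

# Render the same records as comma-, tab-, or pipe-delimited text.
# Field names and values are illustrative.
records = [["claim_id", "amount"], ["CL1", "200.00"], ["CL2", "500.00"]]

def render(rows, delimiter):
    buf = io.StringIO()
    csv.writer(buf, delimiter=delimiter, lineterminator="\n").writerows(rows)
    return buf.getvalue()

pipe_text = render(records, "|")
print(pipe_text)
# claim_id|amount
# CL1|200.00
# CL2|500.00

# Round-trip check: parsing the pipe-delimited text recovers the records
parsed = list(csv.reader(io.StringIO(pipe_text), delimiter="|"))
assert parsed == records
```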