Sr. Data Analyst/Data Modeler Resume
Santa Fe, NM
SUMMARY
- Around 7 years of relevant IT experience, primarily in Data Modeling, working as a Data Modeler/Data Analyst on Analysis, Design, and Data Modeling (Conceptual, Logical, and Physical) for Online Transaction Processing systems.
- Experience with Software Development Life Cycle (SDLC) methodologies such as Agile, RAD, Waterfall, and V-Model, with good working knowledge of their disciplines, tasks, resources, and scheduling.
- Very good understanding of Data Warehousing concepts, Data Analysis, and Data Warehouse Architecture and Design.
- Worked with most of the popular databases, including DB2, Teradata, Oracle, SQL Server, and Informix.
- Strong working experience in the Analysis, Design, Development, and Implementation of Data Warehousing using Data Conversion, Data Extraction, Data Transformation, and Data Loading (ETL).
- Extensively used ETL methodology to support data extraction, transformation, and loading in a corporate-wide ETL solution using DataStage.
- Extensive experience in Data Analysis, Data Cleansing, Requirements gathering, Business Analysis, Data Mapping, Entity Relationship diagrams (ERD), Architectural design docs, Functional and Technical design docs, and Process Flow diagrams
- Expertise in developing PL/SQL Packages, Stored Procedures, Functions, and Triggers.
- Expertise in the Oracle SQL*Loader utility and in TOAD for developing Oracle applications.
- Industry experience as a Data Analyst/ Data Modeler with solid understanding of Data Modeling, Evaluating Data Sources and strong understanding of Data Warehouse/DataMart Design, ETL, BI, OLAP, Client/Server applications.
- Worked on migration projects, migrating data warehouses from different source platforms, e.g., Oracle to Teradata and Oracle to Netezza.
- Extensive experience in identifying the sensitive data and the systems where the critical data resides.
- Experience in testing Business Intelligence reports generated by various BI Tools like Cognos and Business Objects
- Strong skills in database programming using SQL, PL/SQL, T-SQL, Stored Procedures
- Extensive Working experience in applying Relational Database concepts, Entity Relation diagrams and Normalization concepts.
- Participated in the development and maintenance of conceptual, logical, and physical entity/data models, and in the development and deployment of modeling standards and Metadata governance.
- Experience with the SAS suite of reporting and/or BI tools.
- Experience in Performance Tuning of sources, targets, mappings and sessions.
- Experience in Data Modeling using Erwin in Client/Server and distributed application development.
- Ability to quickly adapt to different project environments, work in teams, and accomplish difficult tasks independently within the given time frame.
- Strong in developing Stored Procedures, Functions, Triggers and packages utilizing PL/SQL
- Good analytical and presentation skills, with strong communication skills.
TECHNICAL SKILLS
Business Modeling Tools: MS Visio, Power Designer
Database Management: DB2, Oracle 9i/10g, SQL Server 2000/2005, Sybase 10.x/11.1, Netezza 3.1, Teradata V2R6/V2R5, ADABAS
Data Warehouse Tool: Informatica 8.1/7.1, DataStage 8.1/7.5, Ab Initio, SSIS
Languages: C, C++, PL/SQL, T-SQL, SQL, Visual Basic 6.0, SAS 9.1, Shell Scripting, PERL, AWK, SED
Office Tools: MS Office Suite, MS Project
Operating System: IBM AIX, UNIX, Windows
Reporting Tool: Cognos, Business Objects, Crystal Reports, OBIEE, SSRS, MicroStrategy
Standards: RUP, CMM, ISO 9001
Tools: IMS, XMLSpy, TOAD, SQL Enterprise Manager, SQL Query Analyzer, Vontu, Informatica Data Explorer, WinSQL, Citrix, Control Center
Data Modeling Tools: ER Studio 8.0, Erwin 4.0/3.5
Web Technologies: XML, XSL, XHTML, HTML, CSS, JavaScript, VBScript
PROFESSIONAL EXPERIENCE
Confidential, Santa Fe, NM
Sr. Data Analyst/Data Modeler
Responsibilities:
- Analyzed and gathered business requirements, system requirements, and data mapping requirement specifications; responsible for documenting functional requirements. Involved in data governance; defined data modeling and model maintenance standards.
- Queried data from different database tables as per requirements and executed the source queries from workflows. Involved in data migration from DataStage to Informatica.
- Demonstrated expertise in data conversion, data de-duplication, delta data load processes, and data cleansing methodologies.
- Performed verification and validation of applications that included mainframe data retrieval and selection and XML conversion processes.
- Analyzed data interactively using SQL in the AWS cloud-based integrated workspace.
- Validated Informatica mappings to verify data from source to target database and recorded test results.
- Performed testing in mainframe environment.
- Interacted with end users and SMEs and gathered metadata; created a data analysis repository (spreadsheet) that includes the following metadata: data source, hierarchy, fact, dimension, data attribute, data integration, data model, data acquisition, etc.
- Conducted source-system data analysis to understand current state, quality and availability of existing data.
- Proficient in Base SAS, with intermediate-to-expert SQL skills and experience with large data warehouses.
- Involved in Designing Logical and Physical Data model using Universe.
- Resolved text truncation error during metadata distribution
- Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
- Used various checkpoints in the Informatica designer to check the data being transformed.
- Integrated mainframe and UNIX environments in data warehousing.
- Identified business rules for data migration and performed data validations.
- Maintained data models with detailed metadata using Erwin Data Modeler and Model Manager.
- Involved in extensive data validation using SQL queries and back-end testing.
- Developed Oracle PL/SQL triggers and procedures; set up an Oracle PL/SQL package to analyze tables and indexes, reorganize tables, and rebuild indexes.
- Visualized, interpreted, and reported findings, and developed strategic uses of data using AWS to create interactive dashboards.
- Extensively used Quest Software's TOAD for Oracle to query the Oracle database and wrote scripts in the query pane to validate the tables.
- Tested Informatica ETL mappings that transfer data from Oracle source systems to the Data Mart.
- Created database objects such as tables and indexes, and wrote procedures and SQL scripts.
- Created the test environment for Staging area, loading the Staging area with data from multiple sources.
- Wrote extensive SQL and PL/SQL scripts to validate the database systems and for back-end database validation.
- Extracted mainframe data coming from VSAM files in fixed-length format and created data patterns based on mapping rules.
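The fixed-length VSAM extraction above can be sketched in a few lines of Python. The field names and column offsets here are hypothetical illustrations; on a real engagement they would come from the COBOL copybook for the file.

```python
# Sketch: parsing fixed-length mainframe extract records by column offset.
# LAYOUT is a made-up example layout -- real offsets come from the copybook.
LAYOUT = [            # (field, start, end) -- zero-based, end-exclusive
    ("acct_id",  0, 8),
    ("name",     8, 28),
    ("balance", 28, 38),  # numeric, implied two decimal places
]

def parse_record(line):
    """Slice one fixed-length record into a dict of trimmed fields."""
    rec = {field: line[s:e].strip() for field, s, e in LAYOUT}
    rec["balance"] = int(rec["balance"]) / 100  # apply the implied decimal
    return rec

record = "00001234JOHN Q PUBLIC       0000012599"
parsed = parse_record(record)
print(parsed)
```

For EBCDIC-encoded files, the same slicing applies after decoding each record with a codec such as `cp037`.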
Environment: Informatica, Oracle 10g, SAP Data Services, PL/SQL, SED, AWS, SAS, MVS, SSAS, COBOL II, VSAM files, Teradata, JavaScript, Business Objects, MS Access, MS Excel, Erwin, DB2, TOAD, VBScript, XML, XSD, AIX UNIX, Shell Scripting
Confidential, Nashville, TN
Data Analyst/Data Modeler
Responsibilities:
- Interacted with clients and SMEs to gather requirements and analyzed the source data to improve ongoing data quality.
- Responsible for data analysis, business/systems analysis, ETL process documentation, user requirements/functional specs, and technical specifications.
- Involved in Client Interaction and knowledge Transfer sessions.
- Experience in data analysis, data integration, conceptual data modeling, and metadata creation.
- Involved in maintaining Data Privacy, improving Data Security and identifying the systems where the critical data resides.
- Created schedules in AppWorx for the Informatica mappings and for the procedures.
- Resolved data quality issues for multiple feeds coming from Mainframe Flat Files.
- Coded UNIX awk and sed shell scripts for data migration for the Department of Credit Risk Warehouse and Capital Funding (CAPFUN) Database
- Directed the development, testing, and maintenance of SAS EBI reports.
- Provided support for, and automation of, routine SAS EBI stored procedures.
- Created a Data Dictionary and migrated the data when shifting to a new version of SAS.
- Performed statistical analyses and QC'd statistical output.
- Produced results using SAS programming techniques such as the SAS Macro language, advanced data manipulation, and statistical procedures (e.g., PROC FREQ, PROC REPORT).
- Wrote complex regular expressions using Perl and shell scripting for various data needs.
- Tested several data migration applications for security, data protection, and data corruption during transfer.
- Defined and imported metadata from various data sources into the database.
- Created SSIS packages using package logging, Breakpoints, Checkpoints and Event handling to fix the errors.
- Personalized Cognos Connection display options to test regional settings and personal information.
- Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing).
- Used SQL Profiler for troubleshooting, monitoring, optimization of SQL Server and non-production database code as well as T-SQL code from developers and QA
- Experience in creating UNIX scripts for file transfer and file manipulation.
- Responsible for improving the data security and accountable for overall quality of the data.
- Involved in monitoring and reconciliation of data between Data Hub and its sources.
- Acted as a data steward in maintaining the quality metrics on a continuous basis
- Wrote SQL scripts to validate the data stage by stage, i.e., source to staging and staging to pre-landing.
- Performed day-to-day Cognos administration activities, such as monitoring scheduled jobs (cube refreshes, Impromptu scheduled reports) and backup and recovery maintenance.
- Tested PL/SQL programs to prepare and clean data for data migration and to retrieve data from the database to compare against the input sets.
- Wrote Transact-SQL statements that use joins or subqueries to combine data from multiple tables.
- Created scripts using T-SQL, with programming elements including control-of-flow techniques, local and global variables, functions, and error-handling techniques.
- Involved in designing of Informatica mappings by translating the business requirements.
- Performed all testing scenarios for transformation rules, referential integrity, and data integrity.
- Performed Testing as per Quality Standards.
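The join-and-subquery pattern mentioned above can be illustrated with a small self-contained example. The table names and data are invented, and SQLite stands in for SQL Server here; the join/subquery structure is the same in T-SQL.

```python
import sqlite3

# Illustration: combining data from multiple tables with a join, then
# filtering on a subquery. Tables and values are hypothetical examples.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 90.0), (12, 2, 40.0);
""")

# Join customers to orders, keeping only customers whose total spend
# exceeds the overall average order amount (the subquery).
rows = con.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    HAVING SUM(o.amount) > (SELECT AVG(amount) FROM orders)
""").fetchall()
print(rows)
```

In T-SQL proper, the same query would additionally have access to control-of-flow constructs (`IF`, `WHILE`), local variables, and `TRY...CATCH` error handling, as the bullets above describe.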
Environment: Informatica, Oracle, SAP Data Services, JavaScript, Cognos Reports, SQL Server, Erwin, Shell Scripting, OLAP, PL/SQL, SAS, SQL, T-SQL, VSAM, CDC, Windows XP/2000, TOAD, UNIX
Confidential, Marlborough, MA
Data Analyst/ Data Modeler
Responsibilities:
- Created test case scenarios, executed test cases and maintained defects in internal bug tracking systems
- Created detailed test plans based on analysis of requirements and design documents.
- Executed System and data quality testing to ensure all development deliverables are production ready.
- Validated Business Analytical Reports created in Business Objects.
- Used Query Studio to test ad hoc reports
- Planned, coordinated, and executed Business Objects deployment for end users and documented the entire project.
- Involved in collecting Business Intelligence reporting data from managers, staff across divisions and across geographies
- Involved in executing the ETL Test Jobs from the Informatica power center workflow Manager to verify different test cases and validate the data.
- Published and tested Impromptu reports to Upfront using the IWR server.
- Published and tested PowerPlay cubes to Upfront using PowerPlay Enterprise Server.
- Prepared extensive set of validation test cases to verify the data.
- Worked on Snowflake schemas.
- Tested different detail, summary reports and on demand reports.
- Wrote test case scripts and documented detailed results and summary reports.
- In collaboration with other team members, analyzed raw data to ensure it would meet the business need.
- Analyzed several business reports developed using Business Objects including dashboard, drill-down, summarized, master-detail & Pivot reports.
- Participated in defining the process for test team.
- Strong working experience in Informatica PowerCenter, with a good understanding of new concepts.
- Worked with data reconciliation for the entire data migration project for both target system and rejected records.
- Generated test data (on UNIX box using Perl scripts) to support development.
- Experience in Data Inspection/analysis of tables as well as outbound files (data files in EBCDIC & ASCII format).
- Verified layout of data files and control file entries as per business requirement.
- Based on the generated test data, wrote test cases to demonstrate the testing approach, with detailed explanations of the cases for SORs (systems of record).
- Used Clear Quest for defect tracking.
- Used Mercury Quality Center to store Test Cases and Test Scripts.
- Wrote Informatica ETL design documents, established ETL coding standards and performed Informatica mapping reviews.
- Interacted with the designer to discuss design related issues.
- Responsible for creating data and loading it into tables using UNIX scripts.
- Good exposure to working and coordinating with offshore teams.
- Developed and executed various manual testing scenarios and thoroughly documented the process to perform functional testing of the application.
- Performed extensive data validations against Data Warehouse
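The reconciliation work described above (matching target-system records against the source and accounting for rejected records) can be sketched as a key-set comparison. The record shapes and key name below are hypothetical.

```python
# Sketch: source-to-target data reconciliation by key comparison.
# Row shapes and the "id" key are invented for illustration; a real run
# would pull keys from the source extract and the migrated target table.
def reconcile(source_rows, target_rows, key="id"):
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    return {
        "missing_in_target": sorted(src_keys - tgt_keys),   # dropped/rejected
        "unexpected_in_target": sorted(tgt_keys - src_keys),
        "matched": len(src_keys & tgt_keys),
    }

source = [{"id": 1}, {"id": 2}, {"id": 3}]
target = [{"id": 1}, {"id": 3}, {"id": 4}]
print(reconcile(source, target))
```

Rows reported in `missing_in_target` would then be cross-checked against the ETL tool's reject files to explain each discrepancy.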
Environment: Informatica, Oracle 8i, SQL, PL/SQL, Business Objects, JavaScript, Erwin, HTML/CSS, WinSQL, Tomcat, CVS, Rational Suite
Confidential
Data Analyst
Responsibilities:
- Reviewed the Business Requirement Documents and the Technical Specification.
- Generated automated test cases on first build of the module using QTP, Rational Tools.
- Inserted Database Checkpoints to verify consistency of the database workflow using Win Runner
- Expert in writing SQL and PL/SQL statements for various data needs.
- Involved in writing the Test cases and Test plans based on Source to Target Mapping documents and Data Model diagrams.
- Wrote SQL queries for each Test case and executed in SQL Plus to validate the data between Enterprise Data Warehousing and Data Mart Staging Tables.
- Used various checkpoints in the Informatica designer to check the data being transformed.
- Wrote and ran Unix Scripts for batch jobs.
- Performed backend database testing by writing SQL and PL/SQL scripts to verify data integrity.
- Performed Integration testing, System testing, and user acceptance testing.
- Designed and documented all the issues and defects to ensure application software functionality for present and future builds.
- Used Test Director to report and track defects and known issues in the application.
- Conducted Integration, System, Functional, GUI, Regression, Smoke, Database Integrity, User Acceptance (UAT), and Ad hoc testing
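A typical back-end data-integrity check from the testing above is an orphan-record query: find child rows whose foreign key has no matching parent. The tables and values below are invented, with SQLite standing in for the project databases.

```python
import sqlite3

# Sketch of a database-integrity test: detect orphaned foreign keys.
# Table names and data are hypothetical examples.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dept (dept_id INTEGER PRIMARY KEY);
    CREATE TABLE emp (emp_id INTEGER, dept_id INTEGER);
    INSERT INTO dept VALUES (10), (20);
    INSERT INTO emp VALUES (1, 10), (2, 20), (3, 99);  -- dept 99 missing
""")

# A LEFT JOIN keeps every emp row; parents that don't exist come back NULL,
# so the WHERE clause isolates the referential-integrity violations.
orphans = con.execute("""
    SELECT e.emp_id
    FROM emp e
    LEFT JOIN dept d ON d.dept_id = e.dept_id
    WHERE d.dept_id IS NULL
""").fetchall()
print(orphans)
```

An empty result set means the check passes; any rows returned are defects to log in the tracking tool.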
Environment: Informatica, TOAD, SQL, PL/SQL, LoadRunner 6.0/7.0, WinRunner, TestDirector, Shell Scripting, Sun Solaris 5.8, Windows NT 4.0, Erwin, Test Cases, Test Plans, Test Scripts, AWK, SED