Data Analyst/DBA Resume
Alpharetta, GA
SUMMARY:
- Over 8 years of working experience as a Data Modeler and Data Analyst with high proficiency in requirements gathering and data modeling, including design and support of various applications in OLTP, Data Warehousing, OLAP and ETL environments.
- Excellence in delivering quality conceptual, logical and physical data models for multiple projects involving various new and existing enterprise applications and data warehouses.
- Experienced in identifying entities, attributes, metrics, and relationships; also assigning keys and optimizing the model.
- Proficient in data mart design, creation of cubes, identifying facts & dimensions, star & snowflake schemas and canonical models.
- Experience in formal software development in an Agile environment.
- Extensive experience in using ER modeling tools such as ERwin and ER/Studio.
- Excellent SQL programming skills; developed stored procedures, triggers, functions and packages using SQL/PL SQL, and applied performance tuning and query optimization techniques in transactional and data warehouse environments.
- Involved in the maintenance of the repositories for the metadata.
- Familiar with big data technologies such as Hadoop and Datameer.
- Experience in logical/physical database design and review sessions to determine and describe data flow and data mapping from source to target databases, coordinating with end users, Business Analysts, DBAs and Application Architects.
- Experience in generating DDL scripts and creating indexing strategies.
- Oracle administration on 11g, 10g and 9i versions; backup and recovery strategies using RMAN, cold/hot backups, replication and other data migration strategies.
- Proficient in Data Cleansing and Data Profiling based on standards.
- Experience in the Oracle EBS suite and AR.
- Good interpersonal and human relations skills to interact with clients and co-workers.
TECHNICAL SKILLS:
Databases: MS SQL Server 2008/2005/2000, Oracle11g/10g/9i/8i, DB2 V9.5, MS Access 2000
Programming Languages: SQL, PL/SQL, T-SQL, XML, HTML, UNIX shell Scripting, VBScript, PERL, AWK, SED
Tools: MS-Office suite (Word, Excel, MS Project and Outlook), VSS
Operating Systems: Windows XP/2000/98, UNIX (Sun Solaris 10)
ETL Tools: Informatica PowerCenter 8.6.1/8.1/7.1, DataStage 7.5, Ab Initio
Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables, Erwin
Tools & Software: MS Office, BTEQ, Teradata SQL Assistant
OLAP Tools: MS SQL Analysis Manager, DB2 OLAP, Cognos PowerPlay
Analysis and Modeling Tools: ERwin 4.1/7.0/7.2/8.0/9.1/9.6, Sybase Power Designer, Oracle Designer, BPwin, Rational Rose, ER/Studio, MS Visio, Hadoop, Datameer
PROFESSIONAL EXPERIENCE:
Confidential, Alpharetta, GA
Data Analyst/DBA
Responsibilities:
- Performed analysis of business requirements and determined appropriate database structure.
- Worked directly with the clients and business analysts to define and document data requirements for data integration.
- Demonstrated insightful exposure to data quality, data cleansing and data transformation methodologies, with the ability to conceptualize issues and develop well-reasoned resolutions.
- Determined and documented data mapping rules for the flow of data between applications.
- Performed Reverse Engineering of the legacy application using DDL scripts in Erwin, and developed Logical and Physical data models for Central Model consolidation.
- Actively participating in design, grooming, internal technical discussions and AID review calls.
- Working closely with the developers to create database objects such as tables, procedures, functions, packages and triggers.
- Performing stats collection on tables/schemas on-demand or as per schedule.
- Worked on two major projects where existing AIX databases were migrated to Oracle Linux Cloud (U2L Projects).
- Performed SQL tuning and rebuilt indexes for faster access and to reduce disk I/O.
- Performing logical backups using the Export utility and recovering a specific object or schema using the Import utility.
- Worked on building automated shell scripts for gathering stats and database sessions monitoring.
- Involved in creation of users, allocating appropriate tablespace quotas with necessary privileges and roles for all databases.
- Assisting developers in identifying Oracle database related issues and implementing corrective measures.
- Demonstrated strengths in troubleshooting and resolving technical issues; developed competence in collaborating effectively with both technical and non-technical teams.
- Supporting and maintaining applications in production, testing and development for 20+ critical databases across multiple applications.
- Facilitated resolution of operational issues and supported active services by engaging with business users and IT service providers in an active, solutions-oriented framework.
- Experienced in installation and configuration of SWM.
- Working on a Virtual Project Management Office (VPMO) using Agile methodologies.
Environment: Oracle 11gR2, Erwin r9.6, Toad, Unix, Linux, PL/SQL, Agile, TDP, MS Office 365, Putty, SVN, OEM 13C.
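The indexing and statistics work described above is Oracle-specific, but the underlying idea can be sketched portably. In this illustrative snippet the table and index names are invented and sqlite3 stands in for Oracle; it shows how adding an index (and refreshing optimizer statistics) changes the access path from a full scan to an index search:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, i % 100) for i in range(1000)])

def access_path(conn):
    # Concatenate the "detail" column of the EXPLAIN QUERY PLAN rows.
    rows = conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
    ).fetchall()
    return " ".join(row[-1] for row in rows)

before = access_path(conn)  # no index yet: a full scan of orders
conn.execute("CREATE INDEX idx_orders_cust ON orders (customer_id)")
conn.execute("ANALYZE")     # rough analogue of gathering optimizer stats
after = access_path(conn)   # now a search via idx_orders_cust
```

On Oracle the equivalent checks would use EXPLAIN PLAN and DBMS_STATS rather than sqlite's EXPLAIN QUERY PLAN and ANALYZE.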
Confidential, NYC, NY
Sr. Data Modeler
Responsibilities:
- Part of the team responsible for the analysis, design and implementation of the business solution.
- The Data Mart (BMDM) system stores marketing data in a DB2 database and is interrogated using the PowerPlay/Impromptu end-user reporting tools.
- Responsible for data modeling and building a star schema model in ER/Studio.
- During creation of the data mart, new functionality had to be added, such as trend data inclusion, product profitability calculation/reporting, concept reporting, size-of-line reporting and global brand grouping/reporting.
- Assisted data warehouse project team in extracting business rules
- Experience working with Netezza.
- Created an enterprise data dictionary and maintained standards documentation
- Analysis of data requirements and translation of these into dimensional as well as relational data models
- Extensively involved in source system analysis, source to target mapping, data parsing and profiling
- Analyzing market trends and suggesting techniques for improvement.
- Wrote stored procedures and supported the in-house applications.
- Tuned all databases via indexing of tables, MS SQL Server 2000 configuration parameters and stored procedure SQL code optimization.
- Experience working in a formal software development, Agile environment.
- Participated in the tasks of data migration from legacy to new database system
- Generation of Data Dictionary from CASE tool
- Worked on Metadata exchange among various proprietary systems using XML
- Handled performance requirements for databases in OLTP and OLAP models.
- Created entity/process association matrices, entity-relationship diagrams, functional decomposition diagrams and data flow diagrams.
- Experience with big data concepts such as Hadoop and Datameer.
- Created views, stored procedures, functions, packages, joins and hash indexes in the Teradata database.
- Expertise in creating databases, users, tables, triggers and macros, and in VBScript.
- Extensively worked with Teradata utilities like BTEQ, FastExport, FastLoad and MultiLoad to export and load data to/from different source systems including flat files.
- Hands-on experience using query tools like TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant and Queryman.
- Experience in designing big data tables, column families, and Hive and HBase database design within HDFS.
- Practical understanding of HDFS data partitioning, bucketing, Apache Hadoop and Hortonworks.
- Performed root cause analysis of data quality issues and partnered wif contract manufacturers to provide accurate and timely manufacturing data to client datamart applications.
- Created the logical data model and worked closely with the Database Administrator on the translation of the logical model into a workable physical model using a CASE tool.
- Generation and testing of DDL from the CASE tool.
- Developed the logical data models and physical data models that capture current state/future state data elements and data flows using ER Studio.
- Thorough SQL data analyst with extensive experience in maintaining and analyzing all types of SQL databases; adept at database management, mining specific data from SQL information and working closely with departmental managers to create useful reports. Specializes in SQL Enterprise Server and SQL 2008.
Environment: ERwin 9.6, ER Studio, SQL Server 2000, SQL Enterprise Server, Tableau 7.0, Crystal Reports 8.0, Sybase Power Designer, Hadoop, HDFS data partitioning, Case Tool.
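The flat-file loads above used Teradata's BTEQ/FastExport/FastLoad/MultiLoad utilities. As a hypothetical sketch of the pattern they implement, parse a delimited file and bulk-insert through a single prepared statement, here is a stdlib-only version (file contents and table name invented; sqlite3 stands in for Teradata):

```python
import csv
import io
import sqlite3

# An invented pipe-delimited flat file, as FastLoad might consume.
flat_file = io.StringIO("cust_id|name\n1|Acme\n2|Globex\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (cust_id INTEGER, name TEXT)")

reader = csv.reader(flat_file, delimiter="|")
next(reader)  # skip the header row
# One prepared statement, many rows: the bulk-load shape of the utilities.
conn.executemany("INSERT INTO customer VALUES (?, ?)", list(reader))

loaded = conn.execute("SELECT COUNT(*) FROM customer").fetchone()[0]
```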
Confidential, Fairfax, VA
Sr. Data Modeler/Analyst
Responsibilities:
- Gathered and translated business requirements; worked with the Business Analyst and DBA on requirements gathering, business analysis, testing and project coordination.
- Worked with DBAs to create a best-fit Physical Data Model from the Logical Data Model using Erwin.
- Administered and maintained the model updates using ModelMart for the entire project team.
- Employed process and data analysis to model a Customer Information Business System.
- Conducted logical data model walkthroughs and validation.
- Conducted team meetings and JAD sessions.
- Developed data mapping documents for integration into a central model and depicting data flow across systems.
- Worked on Conceptual, Logical Modeling and Physical Database design for OLTP and OLAP systems.
- Developed, managed and updated data repositories, i.e. the operational data store, data warehouse and data marts.
- Used reverse engineering for a wide variety of relational DBMSs, including MS Access, Oracle and Teradata, to connect to existing databases and create graphical representations using Erwin.
- Extensively used Star Schema methodologies in building and designing the logical data model into Dimensional Models.
- Used Sybase Power Designer in designing the data model.
- Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables and columns as part of Data Analysis responsibilities.
- Performed legacy application data cleansing, data anomaly resolution and developed cleansing rule sets for ongoing cleansing and data synchronization.
- Designed mappings in Informatica from different sources, such as databases and flat files, into the Oracle staging area.
- Developed the Data warehouse model (star schema) for the proposed central model for the Project.
- Developed Process Methodology for the Reverse Engineering phase of the project.
- Used Teradata utilities for handling various tasks.
- Developed the project plan with the Project Manager's assistance for the first two phases of the project.
- Developed and maintained the central repository, populated it wif the metadata.
Environment: ERwin 9.1, Sybase Power Designer, Informatica, ModelMart, Visio 2000, MS Office XP, PL/SQL, SQL, Teradata, Oracle 9i.
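A minimal sketch of the star-schema shape described above: one fact table joined to its dimensions and aggregated, the query pattern a star schema is designed for. All table names are invented and sqlite3 stands in for the warehouse platform:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Two dimensions plus a fact table keyed to both (invented example data).
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales  (product_key INTEGER, date_key INTEGER,
                          amount INTEGER);
INSERT INTO dim_product VALUES (1, 'Widget');
INSERT INTO dim_date    VALUES (20240101, 2024);
INSERT INTO fact_sales  VALUES (1, 20240101, 10), (1, 20240101, 5);
""")

# The classic star query: fact-to-dimension joins, then aggregation.
total = conn.execute("""
    SELECT p.name, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date    d ON d.date_key    = f.date_key
    GROUP BY p.name, d.year
""").fetchone()
```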
Confidential, Uniondale, NY
Sr. Data Analyst/Modeler
Responsibilities:
- Created logical and physical Data Models of the Confidential database using reverse engineering in ER/Studio
- Divided the whole Confidential Database into functional areas based on Confidential application, to make it user-friendly.
- Documented the data dictionary/metadata in detail for the different functional areas like Products, Marketing, Sales Force Automation, Expense Management, Orders, Customer Service, Accounting, etc.
- Worked on GL applications (Accounts, Transactions) for Banking Database
- Experience in performing Application Maintenance functional and technical design activities.
- Generated HTML reports for presentation and distribution
- Documented database changes based on specs that are in QA.
- Created Entity-Relationship Diagrams based on the functional areas in the Confidential application; updated data models and ERD diagrams periodically (based on major releases or upgrades to the application).
- HTML reports were generated upon request (client or internal).
- Interacted with users, Business Analysts and database administrators in the process of updating the data models in 3NF and the HTML reports.
- Created DDL scripts using Forward Engineering in ER/Studio for the Confidential internal use.
- Worked on providing reports to users using Crystal Reports.
Environment: Oracle 10g, Embarcadero ER/Studio 7.5, Crystal Reports v8, MS Visio, ERwin Data Modeler r7
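DDL of the kind a forward-engineering run emits from an ER/Studio model might look like this sketch. The table names are invented, loosely echoing the GL accounts/transactions work above; sqlite3 stands in for Oracle and is used to confirm that the declared foreign key rejects an orphan row:

```python
import sqlite3

# Illustrative forward-engineered DDL: parent/child tables with keys.
ddl = """
CREATE TABLE account (account_id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE gl_txn (
    txn_id     INTEGER PRIMARY KEY,
    account_id INTEGER NOT NULL REFERENCES account(account_id),
    amount     INTEGER NOT NULL
);
"""
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # sqlite enforces FKs only if asked
conn.executescript(ddl)

conn.execute("INSERT INTO account VALUES (1, 'Cash')")
conn.execute("INSERT INTO gl_txn VALUES (100, 1, 250)")
try:
    # account_id 99 does not exist, so the constraint should fire.
    conn.execute("INSERT INTO gl_txn VALUES (101, 99, 10)")
    orphan_rejected = False
except sqlite3.IntegrityError:
    orphan_rejected = True
```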
Confidential, Philadelphia, PA
Data Modeler
Responsibilities:
- Involved in requirements gathering along with the Project Manager and users.
- Gathered all the customer services report prototypes from the Project manager including the dispatch and employee vehicle information.
- Participated in JAD sessions
- Conducted design discussions and meetings to arrive at the appropriate data mart.
- Understood the existing OLTP data model and developed OLAP data marts that satisfied user requirements without degrading the performance of the system.
- Designed the data marts (logical and physical data models) by identifying facts and dimensions from the requirements using the Erwin tool, especially star schemas.
- Prepared the metadata documentation (in Microsoft Excel) for these data marts.
- Conducted design reviews with the business analysts and content developers.
- Created tables in the database using DDL scripts, provided grants on the tables based on user requests and analyzed the tables periodically.
- Worked with the Implementation team to ensure a smooth transition from the design to the implementation phase.
- Used the Oracle Warehouse Builder tool for the ETL process and worked on Oracle aggregations.
- Extracted the required data from the source databases (flat files/MS Access/Oracle/SQL Server), transformed it with the required transformations and loaded the transformed data into the target database.
- Developed packages, procedures and functions in PL/SQL, and wrote triggers for implementing data integrity.
- Scheduled the daily transactions using UNIX shell scripting and tested the data for frontend VB 6.0.
- Extracted data from the databases (Oracle and SQL Server) using Informatica to load it into a single data mart.
- Created data mapping documents from source to target systems.
- Used Star Schema and Snowflake Schema methodologies in building and designing the logical data model into dimensional models.
- Created the logical and physical design of the centralized relational database.
- Applied normalization to 3NF and denormalization techniques for optimum performance in relational and dimensional database environments.
- Provided database normalization up to 3NF for the database schemas.
- Updated the database with daily transactions in order to meet user requirements.
Environment: Oracle 10g, MS Access, Windows NT, ERwin Data Modeler r7, UNIX, VBA, Oracle Warehouse Builder 10g Release 2, SSRS.
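The 3NF normalization mentioned above can be illustrated with a toy example (all names invented): customer attributes that repeat across denormalized order rows are moved into a customer table keyed by customer_id, so each fact is stored exactly once:

```python
import sqlite3

# Denormalized input: (order_id, customer_id, customer_name, state).
# Customer 101's name and state repeat on every one of its orders.
denormalized = [
    (1, 101, 'Acme', 'GA'),
    (2, 101, 'Acme', 'GA'),
    (3, 202, 'Globex', 'VA'),
]

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (customer_id INTEGER PRIMARY KEY,
                       name TEXT, state TEXT);
CREATE TABLE orders   (order_id INTEGER PRIMARY KEY,
                       customer_id INTEGER
                           REFERENCES customer(customer_id));
""")
for order_id, cust_id, name, state in denormalized:
    # OR IGNORE collapses the repeated customer rows onto one key.
    conn.execute("INSERT OR IGNORE INTO customer VALUES (?, ?, ?)",
                 (cust_id, name, state))
    conn.execute("INSERT INTO orders VALUES (?, ?)", (order_id, cust_id))

customers = conn.execute("SELECT COUNT(*) FROM customer").fetchone()[0]
orders = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```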
Confidential, Abbott Park, IL.
Data Analyst
Responsibilities:
- Understood basic business analysis concepts for logical data modeling, data flow processing and database design.
- Interfaced with users and business analysts to gather information and requirements.
- Responsible for logical and physical data modeling, database design, star schema and snowflake schema design, cubes, data analysis, documentation, implementation and support.
- Involved in Data modeling and design of data marts using ER/Studio
- Involved in the study of the business logic and understanding of the physical system and the terms and conditions for the sales data mart.
- Struck a balance between the scope of the project and user requirements by ensuring that it covered minimum user requirements without making it too complex to handle.
- Gathered requirements from the users by participating in JAD sessions; a series of meetings was conducted with the business system users to gather the requirements for reporting. Worked on reporting using Crystal Reports.
- Requirement analysis was done to determine how the proposed enhancements would affect the current system.
- Created LDM, PDM in 3NF using ER/Studio tool and converted the logical models to the physical design.
- Designed and developed the changes and enhancements; the existing submission and refining programs were enhanced to incorporate the new calculations.
- Interacted with end users to identify key dimensions and measures that were relevant and quantitative.
- Used reverse engineering to connect to the existing database and create a graphical representation (E-R diagram) using Erwin 4.0.
- Designed a simple dimensional model of a business that sells products in different markets and evaluates business performance over time.
- Involved in unit testing and Performance tuning.
Environment: Oracle, DB2, SQL Server, MS Access, Embarcadero ER/Studio, DataStage, UNIX, Crystal Reports, Caliber RM, VBA, Win NT/2000, Cognos 6, PL/SQL.
Confidential
Data Modeler and Development
Responsibilities:
- Architected, designed and implemented the data acquisition process for a new project; designed the schema, dimensions and transformations in Informatica for loading the data from the public domain.
- Created base tables, views and indexes. Built a complex Oracle procedure in PL/SQL for extracting, loading and transforming the internal data into the warehouse via DBMS Scheduler. Helped in designing views to pivot and aggregate the data to create meaningful reports for business users.
- Involved in designing and implementing the data extraction (XML DATA stream) procedures.
- Worked on aggregate dimensions, late-arriving facts and dimensions, staging tables and the ODS. Performance tuning:
- Identified the bottlenecks in the data extraction and transformation; removed the bottlenecks due to data lookup and complex computation by caching the master data, with all the necessary transformations pushed into the database (Informatica pushdown). Moved various string operations and derived values to the pre-extraction layer and persisted the data in tables to boost performance. Data was pivoted to boost performance with the Router in Informatica.
- Worked with the query optimizer (Cost-Based Optimizer) and Explain Plan. Batch loading:
- Refactored the batch loading process to work optimally: removed redundant calls, cached the metadata, pushed calculations to the database and replaced row-level processing with bulk operations. Made several processes able to stop and start at will; created control jobs and made the processes self-recover from failures. Order entry management:
- Helped in separating inventory management from the core functionality: identified tables, views, packages, objects and existing DBMS jobs, cron jobs and data migration routines, and made the separation transparent to other systems via public synonyms.
- Architected and designed a high-speed, fast-response star schema for an eCommerce multi-tenant on-demand data model using Toad Data Modeler.
- Built a highly successful multilevel tree model for forecasting at any level and aggregation at any level in a SQL environment, and ported it to the Oracle environment.
- Built a highly scalable data warehouse data model and built complex ETL procedures to feed the data to warehouse applications and to the variance engine. Informatica transformations were designed to extract the data and load it into the Oracle server DB; once the data is loaded in the analytical server, a complex rollup is performed.
- Defined the data architecture strategy, data management strategy, data standards and data architecture.
- Built a high-performance variance engine, the heart of the main system, which produces on-time/on-demand alerts, several canned reports and user-defined reports, and creates aggregates at various levels. Used CUBE and ROLLUP to create the aggregates for several analytical reports.
- Established the performance standards and measures; used TKPROF, Explain Plan and DBMS packages in Oracle to study the performance of the system. Tuned several complex SQL statements with indexes, partition logic and organizing the data in a way that makes it easy to locate and load.
- Built several batch jobs in Oracle. Built QA environment and supported the test process and delivered to Production roll out process. Various responsibilities included working on resolving production issues on time and support.
- Worked on Data Profiling.
- Working closely with the PowerBuilder developer team on building server-side components.
Environment: PowerBuilder, Oracle 10gR2, Sybase, DB2, Teradata, Sybase Adaptive Server 12.5/11.9.2, SQL Server, Informatica 8.1, XML, Windows 2000, Erwin, Power Designer.
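The CUBE/ROLLUP aggregates above were built in Oracle SQL. As a hypothetical illustration (rows and names invented), the same rollup shape, totals at the detail level, per region, and grand total, can be produced in plain Python:

```python
from collections import defaultdict

# Invented detail rows: (region, product, amount).
rows = [("East", "A", 10), ("East", "B", 5), ("West", "A", 7)]

# GROUP BY ROLLUP(region, product) yields three grouping levels:
#   (region, product) detail, (region) subtotal, and the grand total.
# None marks a rolled-up column, like NULL in the SQL result.
totals = defaultdict(int)
for region, product, amount in rows:
    for key in [(region, product), (region, None), (None, None)]:
        totals[key] += amount
```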