
BI Developer Resume


Phoenix, AZ

SUMMARY

  • Strong Experience in Microsoft Business Intelligence (MSBI), ETL Data Warehouse, OLTP & OLAP using MS SQL Server Integration Services (SSIS) 2012/2008R2/2008, MS SQL Server Reporting & Analysis Services (SSRS & SSAS) 2012/2008R2/2008 and MS SQL Server 2012/2008R2/2008/2005.
  • Extensive Experience with Ralph Kimball Data Warehouse methodologies supporting Data Extraction, Transformation and Loading (ETL) using MS SQL Server Integration Services (SSIS) 2012/2008R2/2008; loaded data from Heterogeneous and Homogeneous Data Sources and performed Data Verification, Data Cleansing, Data Integration, and Data Import & Export. Applied various Transformations such as Lookup, Fuzzy Lookup, Merge, Merge Join, Conditional Split, Multicast, Derived Column, Cache, and Slowly Changing Dimension (SCD Type 1, Type 2, Type 3).
  • Rich Experience in Multidimensional Modeling, including Star, Snowflake, Galaxy, and Starflake Schemas.
  • Experience with the Informatica Power Center 9.1/8.6/8.0/7.1 ETL tool; built various ETL Transformations such as Aggregator, Rank, Sorter, Joiner, Filter, Router, Normalizer, and Lookup (connected & unconnected). Familiarity with the Talend Open Studio 5.0 ETL tool.
  • Experience with the Tableau Desktop 9.0/8.0 Data Visualization tool; imported multiple data sources into Tableau and created charts, tables, and visualization designs.
  • Proficient in Multidimensional Data Analysis and Data Modeling. Built and deployed Cubes using SQL Server Analysis Services (SSAS) 2012/2008R2/2008, including dimensions, measures, hierarchies, calculated members, KPIs, perspectives, partitions, actions, and MDX queries.
  • Exposure to building various Reports such as Drill-down, Drill-through, Drill-across, Sub-reports, Dashboards, and Parameterized reports with Tables, Matrices, Gauges, Indicators, Bar Graphs & Line Graphs using MS SQL Server Reporting Services (SSRS) 2012/2008R2/2008.
  • Experience in Data Warehouse/Data Mart, OLTP & OLAP implementations spanning project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation & production support. Developed & implemented policies and standards for preserving the integrity & security of data. Hands-on experience with MS SQL Server 2012 installation, configuration, and migration across different editions.
  • Experience in Relational Data Modeling (RDM); used RDM concepts to create ER diagrams and Logical and Physical models. Strong understanding of Primary & Foreign Key constraints, Surrogate Keys, attributes, and normalization & de-normalization concepts.
  • Experience in MS SQL Server database development and administration including server installation, configuration, maintenance, tuning, optimization, migration, and monitoring.
  • Extensive database Experience using MS SQL Server 2012/2008R2/2008, Oracle 11g/10g/9i, MS Access 2007, and MySQL.
  • Experience in Writing and Tuning Complex SQL Queries, Tables, Joins, Indexes, Group Functions, Sub-queries, Views, and Global Temporary Tables; T-SQL & PL/SQL Database Programming including Stored Procedures, Functions, Triggers, Cursors, Exception/Error Handling, Dynamic SQL and Packages (Specification & Body).
  • Extensive Experience in Software Development Life Cycle (SDLC) Methodologies such as Waterfall, AGILE, SCRUM, RUP (Rational Unified Process), Extreme Programming, Iterative Development, and System Analysis & Design concepts. Used UML in MS Visio to generate Data Flow, Use Case, Class, and Sequence diagrams to illustrate system development.
  • Excellent knowledge of Financial Economics, Insurance, Risk Management, Macroeconomics, Microeconomics, Banking, and Healthcare (HIPAA, HL7, EDI 834/835/837, ICD 9/10, HMO, PPO).
  • Experience in SharePoint 2010, including sharing documents and managing file versions. Familiarity with PerformancePoint dashboard functionality.
  • Experience in C# & ASP.NET Programming, HTML5/HTML, CSS3/CSS, XML, Web Services, WSDL, UDDI, Cloud Enabling, Java Programming, Statistical Analysis.
  • Familiarity with Big Data, Hadoop, HDFS Architecture, MapReduce, NoSQL Databases, Cloud Computing Deployment Models (Public, Private, Community, Hybrid), Cloud Computing Service Models such as IaaS, NaaS, PaaS, and SaaS, and RDBMS Architectures such as Teradata and Netezza.

TECHNICAL SKILLS

OLAP, BI Data Visualization Tools: MS-SQL Server (2012/2008R2/2008/2005), MS SQL Server Reporting Services (SSRS 2012/2008R2/2008), MS SQL Server Analysis Services (SSAS 2012/2008R2/2008), Query & Analysis & Report Studio, PerformancePoint, SharePoint 2013/2010, Tableau Desktop 9.0/8.1, MS Excel PowerPivot, Power View

ETL Data Warehouse Tools: Microsoft SQL Server Integration Services (SSIS 2012/2008R2/2008), Informatica Power Center 9.1/8.6/8.0/7.1, Informatica Power Mart, Informatica Repository Manager 9.1/8.6/8.0/7.1, Informatica Designer 8.6/8.0/7.1, Informatica Workflow Manager 9.1/8.6/8.0/7.1, Talend Open Studio 5.2

Statistical Analysis Tools: SQL, R, SPSS 19

Data Process and Modeling tools: ERwin 8.2/8.0/4.0, UML 2.0/1.0

Databases: MS SQL Server (2012/2008R2/2008), Oracle 11g/10g/9i, MS Access 10/07, MySQL

DB Languages: MS SQL, T-SQL, Oracle SQL, PL/SQL

Platform Tools & other languages: SharePoint 2013/2010, JAVA

Development Tools: MS SSMS, SQL, SQL*Plus, TOAD 10.6, SQL Developer, ERwin 8.2, SQL*Loader

Workflow Tools: MS-VISIO, MS-Excel, MS-Word, MS - PowerPoint, OmniGraffle 6

Operating Systems: Windows 7/Vista, Mac OS X 10.9/10.8/10.7

Methodologies: Ralph Kimball Dimensional Modeling, Star, Starflake, Snowflake & Galaxy Schemas, SDLC (Waterfall, AGILE, Scrum, RUP, Iterative)

Web Technologies: HTML5/HTML, CSS3/CSS, C#.NET, ASP.NET, XML

Big Data Tools: Hadoop, HDFS, Map Reduce, NoSQL

PROFESSIONAL EXPERIENCE

BI Developer

Confidential, Phoenix, AZ

Responsibilities:

  • Actively engaged in Team Meetings to collect and understand the Business Requirements for building Cubes and generating various BI Reports; performed thorough Analysis and prepared the Technical Specification & Report Mapping documents, verifying the data sources, data integrity, data types, constraints and business rules.
  • Designed the Multidimensional data model; identified dimensions, measures, and Slowly Changing Dimensions (SCD) and built dimension hierarchies.
  • Created multiple relationships between Fact Tables, Dimension Tables and Lookup Tables. These relationships include regular, referenced, fact, many to many and no relationship.
  • Built Cubes using SQL Server Analysis Services (SSAS); defined the data sources and data source views and deployed the cubes to the server.
  • Developed multiple SSAS Objects such as calculated members, KPIs, perspectives, partitions, translations, and actions, and wrote and enhanced MDX queries.
  • Created different types of SSAS actions such as Standard, Reporting & Drill-through actions; these actions extended the cube by returning web pages, data sets, or existing SSRS reports.
  • Improved Cube Performance by adjusting dimension tables, partitioning tables, and removing unnecessary measures; examined the Cube Structure using the SSAS Cube Browser, added measure and dimension combinations to the cube, and adjusted the cube structure based on requirements.
  • Applied security on SSAS cubes by assigning roles to users and groups and limiting each user's access to cubes; performed cube debugging, deployment, and performance tuning. Used the cubes as a data source for SQL Server Reporting Services (SSRS).
  • Created reports from multiple data sources such as SQL Queries and the Cube; implemented shared data sources, embedded data sources, and data sets, and developed Drill-through, Drill-down, Sub and Ad hoc reports. Built multiple SSRS Objects such as Tables, Matrices, Gauges, Indicators, and graphs using Report Builder.
  • Used SSRS expressions and parameters within expressions to control report content and appearance; customized expression code, creating and modifying expressions such as Ceiling, RowNumber, and various date/time expressions.
  • Developed Parameterized reports and created parameter types such as non-queried, queried, multi-valued, and cascading parameters.
  • Formatted reports using different options such as changing the report layout, adding borders, footers and headers, changing text fonts, and setting object properties.
  • Tuned various SQL Statements and optimized queries extensively to improve data and report performance; deployed completed reports to the Report Server and used Report Manager to access, view, and manage reports.
  • Engaged in multiple Team Meetings to collect and understand the requirements for the ETL Data Warehouse, Data Store, Data Mart and Business Intelligence system. Understood the source data, business rules, data relationships, and logical & physical data models, and prepared the ETL Mapping document between Source and Target.
  • Performed procedure analysis and data modeling, and created Dimensional models in a Star Schema with Conformed Dimension and Fact tables.
  • Planned the ETL Structure and methods for processing source data; created a smaller sample database for testing the ETL process, designed the ETL Process with a clear approach, and created checkpoints and breakpoints to improve debugging.
  • Tested different ETL approaches using staging tables: pulled data from the data sources into staging tables and then executed the data transformations.
  • Handled raw data files and performed Extraction, Transformation and Loading (ETL) of the data into the database using SSIS packages; performed Data Cleansing using Data Transformations such as Fuzzy Grouping for possible duplicate data and Fuzzy Lookup.
  • Created different Transformations for loading data from Heterogeneous Sources, including Flat files, Excel, OLE DB, and XML, into Targets using Conditional Split, Derived Column, Fuzzy Lookup, Lookup, Aggregate, Sort, Multicast, Merge, Merge Join, and Pivot in SSIS.
  • Developed SSIS Packages using control flow, data flow, event handler and package explorer.
  • Executed SSIS packages using execute package task.
  • Implemented Precedence Constraints in the ETL for error data handling, directing error rows to other steps for further transformation. Handled different types of data errors such as data conversion errors, expression evaluation errors, and lookup errors.
  • Loaded the latest data into the Data Warehouse using the Slowly Changing Dimension (Type 2) Transformation in SSIS (an illustrative T-SQL sketch of this pattern follows this list). Used control flow tasks such as the For Each Loop container, For Loop container, Sequence container, Bulk Insert task, Execute SQL task, Execute Package task, and Script task; built and debugged packages, used package variables for dynamically driven data extraction and loading, and deployed SSIS packages to MS SQL Server.
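
The SCD handling above was done with the SSIS Slowly Changing Dimension transformation; the following is only a minimal, illustrative T-SQL sketch of the same Type 2 expire-and-insert pattern. The table and column names (dbo.DimCustomer, stg.Customer, CustomerID, City, EffectiveDate, ExpirationDate, IsCurrent) are hypothetical placeholders, not the project's actual schema.

    -- Hypothetical staging and dimension tables; names are placeholders only.
    -- Step 1: expire the current row when a tracked attribute has changed.
    UPDATE d
    SET    d.ExpirationDate = GETDATE(),
           d.IsCurrent      = 0
    FROM   dbo.DimCustomer AS d
    JOIN   stg.Customer    AS s ON s.CustomerID = d.CustomerID
    WHERE  d.IsCurrent = 1
      AND  s.City <> d.City;

    -- Step 2: insert a new current row for changed or brand-new customers.
    INSERT INTO dbo.DimCustomer (CustomerID, City, EffectiveDate, ExpirationDate, IsCurrent)
    SELECT s.CustomerID, s.City, GETDATE(), NULL, 1
    FROM   stg.Customer AS s
    LEFT   JOIN dbo.DimCustomer AS d
           ON d.CustomerID = s.CustomerID AND d.IsCurrent = 1
    WHERE  d.CustomerID IS NULL      -- new customer
       OR  s.City <> d.City;         -- changed attribute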

Environment: MS SQL Server 2012, SSMS, SQL, T-SQL, MS SQL Server Integration Services (SSIS) 2012, MS SQL Service Analysis Services (SSAS) 2012, MS SQL Server Reporting Services (SSRS) 2012, MS Visual Studio 2012, Informatica Power Center 9.1, Informatica Repository Manager 9.1, Informatica Designer 9.1, Informatica Workflow Manager 9.1, Tableau Desktop 9.0, Windows 7

BI Developer

Confidential, Gainesville, FL

Responsibilities:

  • Attended meetings to identify the GPA Project requirements and scope and developed the project outline and proposals. Collected student GPA-related information such as work status, family income, place of residence, and demographics.
  • Verified & Validated information against the predefined statistical standards. Created Excel sheets, entered data into data sets, and ensured Data Quality & Data Integrity by eliminating invalid data. Imported data into STATA for further analysis and into MS SQL Server for optional multidimensional analysis; performed Linear Regression and Two-way Scatter plots with fitted regression lines in STATA to analyze data relationships.
  • Created tables in MS SQL Server for storing data and implemented SQL objects such as references, constraints, and indexes. Developed a Star Schema model for multidimensional analysis, researched the data structure, and analyzed data across multiple dimensions to find correlations between factors and student GPA (see the query sketch after this list). Developed T-SQL objects such as Stored Procedures and UDFs.
  • Partitioned data into training and validation sets. Executed PCA and the backward elimination method using MS Excel with the XLMiner add-in to find the most explanatory data model.
  • Compared the analysis outcomes of the two models from multiple perspectives and combined both for presentation. Used the Tableau Data Visualization tool to develop dashboards and reports from multiple data sources such as Excel and SQL Server; created charts such as line, bar, and bubble charts for presenting reports.
  • Created multidimensional analysis charts in Tableau by altering dimensions. Built charts in different forms and perspectives using multiple Tableau elements such as filters, marks, columns and rows, the Show Me function, and data grouping.
  • Developed Tableau dashboards that combined multiple tables and charts using the Tableau dashboard function; used dashboard objects such as titles, text, images, and local/global filters.
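
As a rough illustration of the star-schema analysis described above, here is a minimal T-SQL sketch of the kind of aggregate query used to look for correlations between demographic factors and GPA; the tables and columns (dbo.FactStudentGPA, dbo.DimDemographics, IncomeBracket, WorkStatus) are hypothetical names, not the actual project schema.

    -- Hypothetical star-schema query: average GPA by demographic group.
    SELECT   d.IncomeBracket,
             d.WorkStatus,
             COUNT(*)      AS StudentCount,
             AVG(f.GPA)    AS AvgGPA,
             STDEV(f.GPA)  AS GPAStdDev
    FROM     dbo.FactStudentGPA  AS f
    JOIN     dbo.DimDemographics AS d ON d.DemographicsKey = f.DemographicsKey
    GROUP BY d.IncomeBracket, d.WorkStatus
    ORDER BY AvgGPA DESC;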

Environment: STATA 12, Tableau Desktop 8.1, MS SQL Server 2012, SSMS, SQL, T-SQL, MS Windows 7, MS Word 2013, MS Excel 2013, MS PowerPoint 2013

Reports Developer

Confidential, Gainesville, FL

Responsibilities:

  • Created tables in MS SQL Server and defined columns, data types, data lengths, and data constraints for report data storage; built table references and relationships.
  • Imported SQL Server data into SSRS Report Builder by configuring SQL data sources and datasets. Implemented shared and embedded data sources and optimized data input by customizing the SQL queries behind the datasets (see the parameterized query sketch after this list).
  • Developed drill-through reports by configuring text box actions, locating the target drill-through report, and linking reports together; built drill-down reports by adding expand/collapse actions on items and hid specific report items where needed.
  • Implemented SSRS dashboards by combining multiple report objects such as charts, tables, and matrices. Specified paths for external report items.
  • Used report variables to change the report display based on dynamic input. Implemented expressions to display, group, and sort data in the report.
  • Developed multiple other report forms such as Linked, Snapshot, Cached, and Click-through reports; built ad hoc reports, added drill-through functionality, and added bookmarks and document maps to provide navigation options within large reports.
  • Created multiple chart types such as bar, line, area, scatter, and pie; added tables, matrices, and other objects to reports from the toolbox; deployed reports to the local report server and exported reports in formats such as PDF and Excel.
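
A minimal sketch of the kind of parameterized dataset query that can sit behind such SSRS reports; @StartDate/@EndDate map to report parameters, and the rpt.ActivitySummary table and its columns are hypothetical placeholders rather than the actual report schema.

    -- Hypothetical SSRS dataset query; @StartDate and @EndDate come from report parameters.
    SELECT   a.Department,
             a.ReportDate,
             SUM(a.Amount) AS TotalAmount
    FROM     rpt.ActivitySummary AS a
    WHERE    a.ReportDate >= @StartDate
      AND    a.ReportDate <  DATEADD(DAY, 1, @EndDate)
    GROUP BY a.Department, a.ReportDate
    ORDER BY a.Department, a.ReportDate;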

Environment: MS SQL Server 2012, SSMS, SQL, TSQL, MS SQL Server Reporting Services (SSRS) 2012, MS Office 2013, Windows

SQL Developer

Confidential, Gainesville, FL

Responsibilities:

  • Collected the business and database requirements and designed the MS SQL Server 2012 database; identified business areas and major business attributes and created Entities and Entity Relationships using MS Visio.
  • Used ERwin 8.2 to develop Conceptual Data Models (CDMs) and Logical Data Models (LDMs), including data entities and all attributes; chose qualified attributes for primary & foreign keys, defined the reference relationships among tables, and normalized & de-normalized tables.
  • Developed Physical Data Models (PDMs) using ERwin 8.2: converted entities into tables and attributes into columns, specified all tables and columns, implemented primary key/foreign key relationships, defined data types and lengths for each column, and modified the physical data model based on physical requirements.
  • Added data constraints to enforce data integrity, including referential integrity and unique, not-null, and valid-value (check) constraints (see the DDL sketch after this list).
  • Created objects and SQL Queries such as Indexes, Views, Joins, and Sub-queries to improve database performance, and created T-SQL Stored Procedures, Triggers, User Defined Functions, and Cursors to implement complex business rules.
  • Imported existing data from multiple data sources into the database using the SQL Server Import & Export wizard; tuned SQL Queries using Indexes, Table Partitioning, and Table Denormalization, and developed error handling routines to automatically handle possible errors.
  • Used PerformancePoint to develop Dashboards, Scorecards, and Key Performance Indicators (KPIs), deployed and published through SharePoint 2010.
  • Shared diagrams, design ideas, and related documents through SharePoint 2010.
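
A minimal T-SQL sketch of the constraint pattern described above, assuming hypothetical dbo.Department and dbo.Employee tables; it only illustrates primary/foreign key, unique, not-null, and check (valid-value) constraints plus a supporting index.

    -- Hypothetical tables illustrating referential, unique, not-null, and check constraints.
    CREATE TABLE dbo.Department (
        DepartmentID   INT          NOT NULL CONSTRAINT PK_Department PRIMARY KEY,
        DepartmentName VARCHAR(100) NOT NULL CONSTRAINT UQ_Department_Name UNIQUE
    );

    CREATE TABLE dbo.Employee (
        EmployeeID   INT           NOT NULL CONSTRAINT PK_Employee PRIMARY KEY,
        DepartmentID INT           NOT NULL
            CONSTRAINT FK_Employee_Department REFERENCES dbo.Department (DepartmentID),
        HireDate     DATE          NOT NULL,
        Salary       DECIMAL(10,2) NOT NULL
            CONSTRAINT CK_Employee_Salary CHECK (Salary >= 0)  -- valid-value rule
    );

    -- Supporting index to speed joins on the foreign key.
    CREATE NONCLUSTERED INDEX IX_Employee_DepartmentID ON dbo.Employee (DepartmentID);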

Environment: MS SQL Server 2012, SSMS, SQL, T-SQL, ERwin 8.2, MS SQL Server Reporting Services (SSRS) 2012, MS Visio 2013, Performance Point, SharePoint 2010, Windows

Statistical Data Analyst

Confidential, Gainesville, FL

Responsibilities:

  • Examined wine quality key indicators and collected data from team members to set the project outline; preprocessed data in MS Excel, imported it into SPSS, and created Histograms and Box Plots to show the variable distributions. Performed correlation analysis and identified outliers using SPSS.
  • Performed multiple Regression, refined the regression by observing adjusted R-squared values and eliminating variables, and developed the final data model.
  • Implemented an alternative model to test whether the model fits. Calculated statistical indicators such as residual standard error, p-value, and adjusted R-squared using SPSS.
  • Applied tree-based methods to classify wine into three categories and calculated the misclassification rate; trained the model on randomly partitioned data, adjusted it, and improved the forecasting accuracy. Validated the data to increase overall data quality and model efficiency, tested the model on a different set of data, and finalized the model.
  • Applied the data model to forecast wine quality on a new set of data. Calculated forecast accuracy and developed reports presenting the project procedure using MS Excel & Power Pivot, building Tables and Pivot Tables.

Environment: SPSS 19, MS Excel 2013, MS Power Pivot 2013, MS Word 2013, MS PowerPoint 2013, MS Outlook 2013

PL/SQL Developer

Confidential

Responsibilities:

  • Managed Lab Instruments and IT assets belonging to the Medical School using a database, providing an advanced and efficient management method.
  • Developed a relational database using Oracle 10g; designed Entity Relationship (ER) Diagrams for the proposed database and used ERwin 8.2 to design Conceptual (CDM), Logical (LDM) and Physical (PDM) Data Models.
  • Implemented the physical database and created database objects such as tables, views, indexes, and stored procedures; maintained Data Integrity (Referential, Domain & Column integrity) using available options such as Primary Key & Foreign Key constraints.
  • Reviewed SQL queries to ensure they were written securely and checked for SQL Injection to avoid security issues; managed the relational database using TOAD, reading tables, checking table relationships and data integrity, and developing & optimizing SQL Queries.
  • Developed and tuned various Static & Dynamic Cursors, Exception/Error Handling, Control Structures, Stored Procedures, Functions, Triggers, and Packages (Specification and Body) using PL/SQL (a cursor sketch follows this list); updated the database by collecting datasets from multiple sources and cleaning and transforming data using MS Excel and SQL*Loader (Data File, Control File).
  • Extensively used Informatica PowerCenter designer tools including Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Utilized ETL for Migration/Conversion from Heterogeneous Data Sources, using Transformations such as Normalizer, Update Strategy, Aggregator, Sorter, Joiner, Filter, Router, Connected & Unconnected Lookup, Stored Procedure, Expression, Rank, and Sequence Generator.
  • Created ETL Informatica Mappings for various sources such as MS SQL Server, MS Access, MS Excel, & Oracle to load and integrate the Lab Items and Asset details into the Warehouse.
  • Used Workflow Manager for creating, validating, testing, and running sequential & concurrent Worklets & Sessions and scheduling them to run at specified times. During the implementation phase, tuned Informatica Mappings for optimum performance and was responsible for daily loads and handling rejected data.
  • Performed extensive debugging and Performance Tuning of Mappings, Sessions and Workflows, including Partitioning, memory tuning, and cache management; developed Ad hoc Reports using MS Excel with Pie, Line, and Bar charts and Pivot Tables.
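
A minimal PL/SQL sketch of the cursor and exception-handling style referenced above; the lab_asset table, its columns, and the calibration rule are hypothetical placeholders rather than the actual asset-management schema.

    -- Hypothetical block: flag active lab assets overdue for calibration.
    DECLARE
        CURSOR c_assets IS
            SELECT asset_id, last_calibrated
            FROM   lab_asset
            WHERE  status = 'ACTIVE';
    BEGIN
        FOR r IN c_assets LOOP
            IF r.last_calibrated < ADD_MONTHS(SYSDATE, -12) THEN
                UPDATE lab_asset
                SET    status   = 'CALIBRATION_DUE'
                WHERE  asset_id = r.asset_id;
            END IF;
        END LOOP;
        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            ROLLBACK;
            DBMS_OUTPUT.PUT_LINE('Asset review failed: ' || SQLERRM);
            RAISE;
    END;
    /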

Environment: ERwin 8.2, Oracle 10g, TOAD 10.6, SQL, PL/SQL, SQL*Loader, Informatica Power Center 8.6, Informatica Repository Manager 8.6, Informatica Designer 8.6, Informatica Workflow Manager 8.6, MS Excel 2010, MS office 2010, Windows 7

PL/SQL & BI Developer

Confidential

Responsibilities:

  • Collected & identified risk management requirements, researched the DW structure, and planned the project, designing Multidimensional DW models as Star & Snowflake schemas.
  • Created a sample database from the data warehouse; implemented and adjusted the data model, tables, and attributes, and grouped multiple dimensions to research fraud characteristics.
  • Wrote SQL Queries using DDL, DML, and DCL statements & sub-queries; used SQL clauses such as DISTINCT, GROUP BY, and HAVING along with SQL functions and conditions (see the aggregation sketch after this list). Managed tables using Indexes, PKs, FKs, and Data Constraints; developed daily-running SQL Jobs to generate risk data for processing, and managed table access privileges using GRANT and REVOKE.
  • Implemented PL/SQL Programming using variables, literals, Cursors, Loops, Conditional Statements and views. Implemented Functions, Triggers, Stored Procedures & Packages.
  • Developed and optimized SQL Queries using TOAD; read tables and checked table relationships, data types, and data integrity using TOAD.
  • Identified a common characteristic of fraudulent card transactions, then researched & designed a method to raise alerts when a high-risk transaction happens. Tested the risk control method by comparing risky & normal transaction traits, combined with the fraud detector's processing capacity, to find the best solution; implemented the new fraud transaction control method by extracting high-risk cards, importing risk data into the detection system, and updating risk data daily.
  • Collected and identified report requirements, designed the report structure including tables and charts, and extracted data from the data warehouse. Used group functions and user-defined functions to aggregate data and store the information into tables.
  • Created multiple charts such as line, pie, and bar charts in MS Excel to illustrate fraud application status across different dimensions. Compared fraudulent and normal applications to show fraud characteristics; built tables and pivot tables in MS Excel and used logical, statistical, text, date/time, and cell-reference functions.
  • Imported data into Excel Power Pivot and created links between data from different sources; created pivot tables and charts and added slicers to them using MS Power Pivot.
  • Created MS Power View reports and dashboards; built pages, views, and tiles and added visualizations. Performed value sorting and table arranging and converted tables into cards.
  • Developed Ad hoc reports using MS PowerPoint and MS Excel showing the fraud status and trends; gave suggestions on how to respond and predicted the results.
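
A hedged SQL sketch of the GROUP BY/HAVING style of daily risk aggregation mentioned in this role; the card_transaction table, its columns, and the thresholds are hypothetical illustrations, not the actual fraud rules.

    -- Hypothetical daily job: flag cards whose recent activity looks high-risk.
    SELECT   t.card_id,
             COUNT(*)                            AS txn_count,
             SUM(t.amount)                       AS total_amount,
             COUNT(DISTINCT t.merchant_country)  AS country_count
    FROM     card_transaction t
    WHERE    t.txn_date >= TRUNC(SYSDATE) - 1
    GROUP BY t.card_id
    HAVING   COUNT(*) > 20                           -- unusually many transactions
          OR SUM(t.amount) > 10000                   -- unusually large spend
          OR COUNT(DISTINCT t.merchant_country) > 3; -- spread across many countries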

Environment: Oracle SQL 10g, PL/SQL, TOAD 10.6, MS Excel 2010, Excel Power Pivot 2010, Excel Power View 2010, MS Office 2010, MS Windows 7
