ETL / BI Developer Resume

Atlanta, GA

SUMMARY:

  • Over six years of professional IT experience in ETL, Business Intelligence and Data Warehousing. Extensively worked on MS SQL Server 2005/2008 and MSBI technologies such as SSIS, SSAS and SSRS.
  • Experience in the software development life cycle, business requirement analysis, design, programming, database design, data warehousing and business intelligence concepts, and Star Schema and Snowflake Schema methodologies (a minimal star schema sketch follows this list).
  • Expert skill in data migration and ETL from various data sources such as Excel, SQL Server and flat files using SSIS packages and SQL commands. Monitoring, debugging and tuning ETL jobs and workflows.
  • Created SSIS packages to transfer data between OLTP and OLAP Databases with different types of control flow tasks and data flow transformations, securing and deploying the packages.
  • Experience in Validating and testing the SSIS packages on the development server.
  • Experience in designing, creating and processing cubes using SSAS. Created and configured Data Sources, Data Source Views, Dimensions, Cubes, Measures, Partitions and KPIs using SQL Server Analysis Services (SSAS 2005/2008).
  • Expert in calculating measures and dimension members using Multidimensional Expressions (MDX) and mathematical formulas.
  • Experience in report generation using the authoring and managing components of SSRS, from both relational databases and OLAP cubes, including MDX development.
  • Experience in generating on-demand and scheduled reports for business analysis and management decisions using SQL Server Reporting Services (SSRS).
  • Experienced in creating test data and unit test cases. Writing test cases and system plans to ensure successful data loading process.
  • In-depth knowledge of Relational Data Modeling, Dimensional data modeling and design. Extensive experience in data analysis using SQL and MS-Excel.
  • Experience in performance tuning, Query optimization and database consistency checks.
  • Experience in installation, upgrade and configuration of Microsoft SQL Server and databases.
  • Good work experience in system analysis, design, development, testing and implementation of projects. Ability to profile and understand different source systems. Very good skills in documenting different kinds of metadata.
  • Excellent communication and presentation skills, with strong analytical and problem-solving abilities. Liaise with developers and user representatives in application design and document reviews.
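
A minimal T-SQL sketch of the star schema pattern referenced above; the table and column names (DimDate, DimCustomer, FactSales) are illustrative, not taken from any specific project.

    -- Dimension tables carry descriptive attributes; the fact table carries
    -- measures plus foreign keys to each dimension (star schema).
    CREATE TABLE DimDate (
        DateKey      INT      NOT NULL PRIMARY KEY,   -- e.g. 20101031
        CalendarDate DATETIME NOT NULL,
        CalendarQtr  TINYINT  NOT NULL
    );

    CREATE TABLE DimCustomer (
        CustomerKey INT IDENTITY(1,1) NOT NULL PRIMARY KEY,  -- surrogate key
        CustomerID  VARCHAR(20)       NOT NULL,              -- business key
        Region      VARCHAR(50)       NOT NULL,
        Segment     VARCHAR(50)       NOT NULL
    );

    CREATE TABLE FactSales (
        DateKey     INT           NOT NULL REFERENCES DimDate (DateKey),
        CustomerKey INT           NOT NULL REFERENCES DimCustomer (CustomerKey),
        SalesAmount DECIMAL(18,2) NOT NULL,
        UnitsSold   INT           NOT NULL
    );
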
TECHNICAL SKILLS:

Business Intelligence:

MS SSRS 2005/2008, MS SSAS 2005/2008, Actuate 5.

ETL SKILLS:

MS SQL Server SSIS 2005/2008, Informatica PowerCenter 7.x, SQL, PL/SQL, ETL coding through PL/SQL, Export, Import, Unix Shell Scripting, Cron Job Development.

Databases:

MS SQL Server 2005/2008, Oracle 10g and 9i, Microsoft Access, DB2 and Sybase.

Tools:

Callidus 5, Toad, SQL Navigator, Erwin, MS-Excel and MS-Access.

Operating Systems:

Windows XP/2003 Server/2000 Pro, Unix (Sun Solaris/HP/AIX) and Linux.

Data Warehousing Methodologies/Processes:

Dimensional Modeling (Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables), Kimball Methodology, Inmon Methodology, Maintenance of Operational Data Store (ODS) and Enterprise Data Warehouse (EDW), OLAP, Metadata Management, Data Migration and Data Cleansing Techniques, Data Profiling, Data Quality and Data Validation Scripts.

Internet Software/Other Tools:

HTML, XML, UML, Clear Quest, Case Studio and Harvest.

PROFESSIONAL EXPERIENCE:

Confidential, Atlanta, GA, Oct 2010 - Till Date.
Campaign Effectiveness Reporting (CER) - A web-based reporting tool that uses Microsoft SQL Server Analysis Services, Integration Services and Reporting Services to report on and analyze the effectiveness of different marketing campaigns by segment (for targeted people) and by region, for direct and indirect customers. It is used by various business groups for campaign reporting and analysis.

Responsibilities:

  • As an ETL/BI Developer for the CER data cube project, involved in all phases of CER development.
  • Understood the existing business model and customer requirements. Performed a detailed study and data profiling of the underlying application systems for sales of the different promoted campaigns.
  • Elicit requirements using interviews, document analysis, business process descriptions, use cases, scenarios, business analysis, task and workflow analysis.
  • Developed the proto-type of CER and delivered six months of historical data in development environment to the business users for testing purpose.
  • Filtered bad data from the legacy system using T-SQL and implemented constraints and triggers in the new system for data consistency (see the T-SQL sketch after this list).
  • Identify the field/column level data quality issues of the source systems to drive the data quality, data cleansing and error checking in the ETL mappings. Identify and document the source to target mapping data lineage.
  • Prepared detailed ETL mapping specification documents describing the algorithms and flowcharts and covering, for each mapping in the datamart, the source systems, change data capture logic, field-level transformation logic, lookup tables and lookup logic used, and the target systems. Also documented the important SSIS session properties to be used when executing SSIS packages that load data from Teradata into the SSIS database.
  • Designed and documented the error-handling strategy in the ETL load process using event handlers. Consistently used these error-handling techniques in all the mappings.
  • Designed and built the CER cube using SQL Server Analysis Services (SSAS 2008) with 10 dimensions, built daily partitions spanning 13 months, and developed aggregations and calculated members for the cube. Generated daily reports to measure sales to the segmented population across different regions for different campaigns.
  • Designed and developed the ETL data flow to populate the CER cube analysis database using SSIS packages (SQL Server Integration Services). Scheduled and automated the packages to keep the cube data up to date daily for business reporting.
  • Designed and deployed direct and indirect customer reports for targeted segments of people across different regions with a drill-down approach. Also developed drill-through reports to measure the performance detail of different distribution channels.
  • Created quarterly sales reports with Chart controls to measure the progress of sales every quarter for different distribution channels.
  • Created test data and unit test cases to ensure successful data loading process.
  • Used Notification services to generate error messages and send them to the user through e-mails. Created report snapshots to improve performance of SSRS.
  • Responsible for helping manage the daily operations of the company, meeting with business analysts and end users to resolve issues.
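
A hedged T-SQL sketch of the bad-data filtering and data-consistency checks mentioned in the list above; the tables (Legacy_Orders, Stage_Orders, Customers) and columns are hypothetical, not from the actual system.

    -- Load only rows from the legacy extract that pass basic quality rules.
    INSERT INTO Stage_Orders (OrderID, CustomerID, OrderDate, Amount)
    SELECT OrderID, CustomerID, OrderDate, Amount
    FROM   Legacy_Orders
    WHERE  OrderID    IS NOT NULL
      AND  CustomerID IS NOT NULL
      AND  Amount     >= 0;
    GO

    -- Enforce the same rule declaratively in the new system.
    ALTER TABLE Stage_Orders
        ADD CONSTRAINT CK_Stage_Orders_Amount CHECK (Amount >= 0);
    GO

    -- Trigger that rejects inserts referencing unknown customers.
    CREATE TRIGGER trg_Stage_Orders_Customer
    ON Stage_Orders
    AFTER INSERT
    AS
    BEGIN
        IF EXISTS (SELECT 1
                   FROM   inserted i
                   LEFT JOIN Customers c ON c.CustomerID = i.CustomerID
                   WHERE  c.CustomerID IS NULL)
        BEGIN
            RAISERROR('Order references an unknown customer.', 16, 1);
            ROLLBACK TRANSACTION;
        END
    END;
    GO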

Environment: SQL Server 2008, SSAS 2008, SSRS 2008, SSIS 2008, Teradata V2R6.2.

Confidential, Atlanta, GA, Sep 2007 - Sep 2010.

Callidus Sales Compensation System for Small Business Services (SBS) - An Enterprise Incentive Management (EIM) application for BellSouth's Small Business Services, used to develop and manage incentive compensation linked to the achievement of strategic business objectives. It supports sales compensation for SBS outbound employees and vendors, distribution employees and SBS inbound managers, and represents the largest deployment of a packaged incentive software product at BellSouth, with almost 6,000 payees eligible for monthly and quarterly payments through Callidus. Used by compensation coordinators, HR pay administrators, employees and partners.

Responsibilities:

  • Data warehouse developer on the Callidus Sales Compensation Small Business Services and BI projects team. Involved in all phases of the Sales and Billing datamart project, end to end from scratch, and performed different roles in the different phases of the project.
  • Detailed study and data profiling of all the underlying transaction application systems for the Sales and Billing subject areas and understand the data models designed by the architect. Identify and capture the right metadata from source systems.
  • Developed the proto-type and delivered three months of historical data in development environment to the business users for a proof of concept.
  • Identify the field/column level data quality issues of the source systems to drive the data quality, data cleansing and error checking in the ETL mappings. Identify and document the source to target mapping data lineage.
  • Prepared detailed ETL mapping specification documents describing the algorithms and flowcharts and covering, for each mapping in the datamart, the source systems, change data capture logic, field-level transformation logic, lookup tables and lookup logic used, and the target systems. Also documented the important SSIS session properties to be used for executing the SSIS packages.
  • Designed and developed mappings for loading the source data into the staging area and for loading dimension, fact and aggregate tables.
  • Prepared test scripts, including complex SQL scripts to compare source and target data (a sample reconciliation query is sketched after this list). Executed the test scripts and validated the results. Also coordinated with business users to perform User Acceptance Testing (UAT). Prepared migration and deployment plans and coordinated deployments.
  • Scheduled and monitored daily ETL package loads. Developed an exception-handling process for each SSIS package. Tested the data with complex queries, joins and subqueries.
  • Provided Technical and Development Solutions for requirements raised by Clients. Used Backup/Restore, Log shipping, and Normalization.
  • Involved in the enhancement/maintenance of the datamart. Performed impact analysis and designed, developed, tested and deployed functionality to bring a new source system into the existing datamart.
  • Monitored historical data loads and ongoing loads. Performed root cause analysis of data warehouse production issues.
  • Provided on-call production support for the regular data warehouse load process and off-shore coordination.
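
An illustrative T-SQL reconciliation query of the kind used in the test scripts described above; the staging and fact table names (stg_payment, fact_payment) and their columns are hypothetical.

    -- Compare row counts and a summed measure per load date between the
    -- staging (source) and fact (target) tables; any row returned marks
    -- a load discrepancy to investigate.
    SELECT COALESCE(s.LoadDate, t.LoadDate) AS LoadDate,
           s.SrcRows, t.TgtRows, s.SrcAmount, t.TgtAmount
    FROM  (SELECT LoadDate, COUNT(*) AS SrcRows, SUM(PaymentAmount) AS SrcAmount
           FROM   stg_payment
           GROUP  BY LoadDate) s
    FULL OUTER JOIN
          (SELECT LoadDate, COUNT(*) AS TgtRows, SUM(PaymentAmount) AS TgtAmount
           FROM   fact_payment
           GROUP  BY LoadDate) t
      ON  s.LoadDate = t.LoadDate
    WHERE s.LoadDate IS NULL
       OR t.LoadDate IS NULL
       OR s.SrcRows   <> t.TgtRows
       OR s.SrcAmount <> t.TgtAmount;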

Environment: SQL Server 2000/2005, SSIS 2005, Windows XP, MS Visio, Erwin.

Confidential, Irving, TX, Apr 2005 - Aug 2007.
Verizon DSL Metrics Project - A dashboard and scorecard analytical application based on a Star Schema data mart to measure business metrics for the Sales and Billing subject areas, used by the Verizon DSL decision support team.

Responsibilities:

  • SQL Server Developer on the Verizon EDW and BI projects team. Involved in all phases of the Sales and Billing project, end to end from scratch, and performed different roles in the different phases of the project.
  • Detailed study and data profiling of all the underlying transaction application systems for the Sales and Billing subject areas and understand the data models designed by the architect. Identify and capture the right metadata from source systems.
  • Creation of Tables, Views, Stored Procedures, UDFs, Triggers according to the User Requirements. Maintained design structure (schema) by directing implementation of SQL standards and guidelines.
  • Developed the proto-type and delivered three months of historical data in development environment to the business users for a proof of concept.
  • Identify the field/column level data quality issues of the source systems to drive the data quality, data cleansing and error checking in the ETL mappings. Identify and document the source to target mapping data lineage.
  • Also, document the important Informatica session properties that should be used for executing Informatica workflows.
  • Designed and documented the error-handling strategy in the ETL load process. Consistently used these error-handling techniques in all the mappings.
  • Designed and developed Informatica mappings for loading the source data into the staging area and for loading dimension, fact and aggregate tables (an illustrative aggregate-table load is sketched after this list). Also developed Unix shell scripts to copy the source system files and execute the Informatica workflows and sessions.
  • Prepared the test scripts, executed them and validated the results in coordination with business users.
  • Involved in troubleshooting and resolving complex technical issues, and in performance tuning of databases.
  • Optimized the performance of various SQL scripts, Stored Procedures and triggers.
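
An illustrative SQL sketch of the aggregate-table load referenced in the list above; in practice this logic lived in Informatica mappings, and the table names (sales_fact, sales_daily_agg) are hypothetical.

    -- Rebuild a daily aggregate from the detail-level fact table.
    -- A full delete-and-reload is the simplest strategy; an incremental
    -- load would instead restrict on the current load date.
    DELETE FROM sales_daily_agg;

    INSERT INTO sales_daily_agg (sale_date, region_key, order_cnt, sales_amt)
    SELECT sale_date,
           region_key,
           COUNT(*)       AS order_cnt,
           SUM(sales_amt) AS sales_amt
    FROM   sales_fact
    GROUP  BY sale_date, region_key;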

Environment: Sun Solaris, Oracle 9i, Callidus TrueComp Manager 5.1, Informatica PowerCenter 7.1, ODBC, Actuate, Connect Direct.

EDUCATION:

    • Master of Business Administration (M.B.A.), Information Systems
