
BI / Data Analyst Resume


Phoenix, AZ

SUMMARY

  • More than 8 years of experience performing roles as a Business Intelligence (BI) Developer and Data Analyst, combining technical expertise, business experience, and communication skills to drive high-impact business outcomes through data-driven innovations and decisions.
  • Proficiency in Microsoft Business Intelligence technologies like MS SQL Server Integration Services (SSIS), MS SQL Server Analysis Services (SSAS) and MS SQL Server Reporting Services (SSRS).
  • Experience in integration of various data sources such as SQL Server, Oracle, MS Access, flat files, and Excel files.
  • Experience in complete Software Development Life Cycle (Analysis, Design, Development, Implementation & Testing), ETL Development & Exploratory Data Analytics.
  • Hands-on experience with star schemas, snowflake schemas, dimensional modeling, reporting tools, Operational Data Store (ODS) concepts, Data Marts, and OLAP technologies.
  • Excellent backend skills in creating SQL objects such as Tables, Stored Procedures, Functions, Views, Indexes, Triggers, Rules and SQL Performance Tuning to facilitate efficient data manipulation and consistency.
  • Experience with Data Analytics, Data Reporting, Ad-hoc Reporting, Scales, Pivot Tables and OLAP reporting.
  • Excellent understanding and thorough knowledge of Hadoop architecture and components such as MapReduce and the Hadoop Distributed File System (HDFS).
  • Hands-on experience with version control tools Git and TFS.
  • Experience in OLTP/OLAP systems, developing Star and Snowflake schemas (Fact tables, Dimension tables) used in relational, dimensional, and multidimensional modeling, as well as physical and logical data modeling.
  • Experienced in data integration validation and data quality controls for ETL processes and data warehousing using MS Visual Studio, SSIS, SSAS, and SSRS.
  • Solid ability to write and optimize diverse SQL queries, working knowledge of RDBMS like SQL Server, Oracle, Teradata.
  • Expert in SQL with writing Queries, Temp tables, CTE, Stored Procedures, User-Defined Functions, Views, Indexes.
  • Experience working on Windows, Linux, and UNIX platforms, including programming and debugging skills in UNIX shell scripting.
  • Expertise in Excel Macros, Pivot Tables, VLOOKUPs and other advanced functions.
  • Experience with working in Agile/SCRUM software environments.

TECHNICAL SKILLS

Languages: C, C++, Python, SQL

Database: Oracle, SQL Server, MS Access, Teradata

Software and Tools: SSIS, SSAS, SSRS, Teradata SQL Assistant, Hadoop Ecosystem, MS Project, Visio, VMware, Git, Team Foundation Server (TFS)

Operating Systems: Windows, Linux, Mac OS X

Scripting: JavaScript, SQL, UNIX Shell scripting, HiveQL

Visualization: Tableau, Power BI, R

PROFESSIONAL EXPERIENCE

Confidential - Phoenix, AZ

BI / Data Analyst

Responsibilities:

  • Worked with the Financial Intelligence Unit (FIU) to identify requirements and implement the business logic.
  • Participated in all phases of data mining, including data collection, data cleaning, model development, validation, and visualization, and performed gap analysis.
  • Performed data loading, data validation, data fixing, data mapping, data transformation, and data migration.
  • Identified data issues and provided recommendations for resolution to ensure optimal performance.
  • Developed SQL Server Integration Services (SSIS) packages for different data sources and destinations, using components such as the Execute SQL task, OLE DB source, File System task, and Script task.
  • Generated reports in Table and Matrix formats using Reporting Services (SSRS), customized per business requirements.
  • Extensively worked on table design, creating clustered and non-clustered indexes, primary key/foreign key relationships, complex joins, and SQL scripts for views and materialized views.
  • Interfaced with other technology teams to extract, transform, and load (ETL) data from a wide variety of data sources.
  • Involved in writing complex Teradata SQL / BTEQ scripts and stored procedures involving joins and unions to generate the reports and alerts that monitor money laundering activity (see the Teradata sketch after this list).
  • Used Teradata EXPLAIN extensively to performance-tune queries, reducing spool space usage and CPU impact.
  • Created stored procedures for commonly used complex queries involving joins and unions of multiple tables.
  • Created views to enforce security and data customization.
  • Utilized a broad variety of OLAP functions such as COUNT, SUM, and CSUM, and worked in MS Excel using pivot tables and graphs.
  • Performed Data analysis and Data Profiling using complex SQL.
  • Developed comprehensive data visualizations in Tableau to illustrate complex ideas to various stakeholders.
  • Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
  • Developed Tableau workbooks from multiple data sources using Data Blending.
  • Built dashboards for measures with forecast, trend line and reference lines.
  • Extracted data from databases, copied it into the HDFS file system, and used Hadoop tools such as HiveQL to retrieve the data required for building models (see the HiveQL sketch after this list).
  • Held weekly meetings with the development team to ensure that business, technical, and testing requirements were adequately defined and documented.
  • Participated in code review sessions with senior and junior developers.
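
A minimal sketch of the kind of Teradata SQL behind the alerting reports above, assuming a hypothetical transaction table (fiu_db.txn) and an illustrative threshold; CSUM is Teradata's older form of the ANSI SUM ... OVER used here:

    -- Flag accounts whose cumulative transaction amount crosses a
    -- reporting threshold (all names and the threshold are hypothetical).
    SELECT acct_id,
           txn_date,
           txn_amt,
           SUM(txn_amt) OVER (PARTITION BY acct_id
                              ORDER BY txn_date
                              ROWS UNBOUNDED PRECEDING) AS cum_amt
    FROM fiu_db.txn
    QUALIFY cum_amt > 10000;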
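
And a HiveQL sketch of the HDFS retrieval step, assuming delimited files already copied into a hypothetical /data/staging/txn directory:

    -- External table over the HDFS files, then per-account aggregates
    -- used as model inputs (names and paths are hypothetical).
    CREATE EXTERNAL TABLE IF NOT EXISTS staging_txn (
        acct_id   STRING,
        txn_date  STRING,
        txn_amt   DOUBLE
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/data/staging/txn';

    SELECT acct_id,
           COUNT(*)     AS txn_cnt,
           SUM(txn_amt) AS total_amt,
           AVG(txn_amt) AS avg_amt
    FROM staging_txn
    GROUP BY acct_id;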

Confidential - Union, NJ

BI Developer

Responsibilities:

  • Helped drive the Software Development Life Cycle (SDLC) processes, including Analysis, Design, Programming, Testing, and Deployment.
  • Created SQL objects such as tables, stored procedures, functions, user-defined data types, rules, and defaults (see the T-SQL sketch after this list).
  • Created mappings/workflows to extract data from SQL Server.
  • Performed data migration and transformation from Access databases and Excel sheets using SQL Server Integration Services (SSIS).
  • Improved the performance of T-SQL queries and stored procedures using SQL Profiler, execution plans, SQL Performance Monitor, and the Index Tuning Advisor (an index sketch follows this list).
  • Used SSIS features such as event handlers, property expressions, package configurations, transaction options, checkpoints, and protection levels in the packages.
  • Scheduled daily, weekly, and monthly reports in SSRS for executives, business analysts, and customer representatives across various categories and regions, based on business needs.
  • Worked with SQL Server Analysis Services (SSAS) that delivers online analytical processing (OLAP) and data mining functionality for business intelligence applications.
  • Created ETL transformations to load data from a SQL Server database into an Oracle database.
  • Designed logical and physical data models for multiple OLTP and Analytic applications.
  • Used the Erwin design tool & Erwin model manager to create and maintain the Data Mart.
  • Performed database performance tuning, including indexing, optimizing SQL statements, and monitoring the server.
  • Wrote simple and advanced SQL queries and scripts to create standard and Ad-hoc reports for senior managers.
  • Converted and consolidated legacy Visual Basic batch processes into PL/SQL stored procedures operating on Oracle data-mart tables.
  • Collaborated on the data mapping document from source to target and on data quality assessments for the source data.
  • Worked with developers, DBAs, and configuration management personnel to release code to production successfully.
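
A minimal T-SQL sketch of the kind of stored procedure described above; the schema and names are hypothetical:

    -- Parameterized procedure joining two tables (names hypothetical).
    CREATE PROCEDURE dbo.usp_GetRegionSales
        @Region   NVARCHAR(50),
        @FromDate DATE
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT s.SaleID, s.SaleDate, s.Amount, c.CustomerName
        FROM dbo.Sales AS s
        INNER JOIN dbo.Customers AS c
            ON c.CustomerID = s.CustomerID
        WHERE c.Region = @Region
          AND s.SaleDate >= @FromDate;
    END;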
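
And the sort of covering index that execution-plan analysis or the Index Tuning Advisor might suggest for that query, again with hypothetical names:

    -- Covering non-clustered index: seeks on the join/filter columns and
    -- includes Amount so the query avoids key lookups.
    CREATE NONCLUSTERED INDEX IX_Sales_CustomerID_SaleDate
        ON dbo.Sales (CustomerID, SaleDate)
        INCLUDE (Amount);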

Confidential

Business Intelligence Developer

Responsibilities:

  • Participated in Joint Requirements Development (JRD) sessions with technicians to gather user requirements, which formed the basis of the conceptual design.
  • Responsible for identifying sources, creating the staging database, and data warehouse dimensional modeling (facts and dimensions); a dimensional-model sketch follows this list.
  • Involved in optimizing SQL and PL/SQL through indexes and other basic tuning techniques.
  • Debugged PL/SQL objects and refactored PL/SQL code using best practices for better performance (a PL/SQL sketch follows this list).
  • Utilized Oracle tools such as AWR and ASH reports and the OEM dashboard for continuous monitoring of data flows across applications.
  • Designed the entire ETL process involving the access, manipulation, analysis, interpretation and presentation of information from both internal and secondary data sources to the business.
  • Analyzed query performance using execution-plan analysis, SQL Profiler, and the Database Engine Tuning Advisor, and optimized queries by creating various clustered and non-clustered indexes.
  • Designed high level ETL architecture for overall data transfer from the source server to the Enterprise Services Warehouse which encompasses server name, database name, accounts, tables and direction of data flow, Column Mapping, Data dictionary and Metadata.
  • Performed quality checks on data: identifying outliers, checking normality, and standardizing the data.
  • Extensive knowledge of SSAS storage, partitions, and aggregations; querying cubes with MDX; data mining models; and developing reports using MDX and SQL.
  • Used various SSIS transformations such as Lookup, Fuzzy Grouping, and Row Count, and supported performance tuning, parallel query execution, and data mining by writing stored procedures.
  • Configured and maintained Report Manager and Report Server for SSRS.
  • Performance-tuned and optimized stored procedures; responsible for creating ad-hoc reports using SSRS.
  • Developed and maintained production/development logs to monitor the data changes.
  • Prepared multiple dashboards using Tableau to reflect data behavior over time.
  • Created tasks and workflows in the Workflow Manager and monitored the sessions in the Workflow Monitor.
  • Converted Excel reports to Tableau dashboards with richer visualization and greater flexibility.
  • Used SQL Server extensively for source data.
  • Worked with systems engineering team for planning new Hadoop environment deployments, expansion of existing Hadoop clusters.
  • Performed maintenance, including managing space, removing bad files, removing cache files, and monitoring services.
  • Set up permissions for groups and users in all development environments.
  • Migrated developed objects across different environments.
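
A minimal dimensional-model sketch of the staging-to-warehouse pattern above, in ANSI-style SQL with hypothetical names:

    -- One dimension and one fact table; the fact references the
    -- dimension through a surrogate key (all names hypothetical).
    CREATE TABLE dim_customer (
        customer_key  INT PRIMARY KEY,
        customer_id   VARCHAR(20),
        customer_name VARCHAR(100),
        region        VARCHAR(50)
    );

    CREATE TABLE fact_sales (
        sale_key     INT PRIMARY KEY,
        customer_key INT REFERENCES dim_customer (customer_key),
        date_key     INT,
        amount       DECIMAL(12,2)
    );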
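
And a PL/SQL sketch of the kind of refactoring described above: replacing row-by-row cursor logic with a set-based statement, a standard best practice (table and column names are hypothetical):

    -- Set-based insert/delete instead of a cursor loop.
    CREATE OR REPLACE PROCEDURE archive_closed_orders AS
    BEGIN
        INSERT INTO orders_archive (order_id, closed_date, amount)
        SELECT order_id, closed_date, amount
        FROM   orders
        WHERE  status = 'CLOSED';

        DELETE FROM orders
        WHERE  status = 'CLOSED';

        COMMIT;
    END archive_closed_orders;
    /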
