
Data Analyst Resume


Chicago, IL

SUMMARY:

  • Over 7 years of strong experience in Business and Data Governance, Data Integration, MDM, NoSQL, Metadata Management services, and Configuration Management.
  • Skilled in implementing SQL tuning techniques such as Join Indexes (JI), Aggregate Join Indexes (AJIs), statistics collection, and table changes including indexes.
  • Expertise in Enterprise Data Warehousing, including Data Modeling, Data Architecture, Data Integration (ETL/ELT), and Business Intelligence.
  • Experienced in using various Teradata utilities such as Teradata Parallel Transporter (TPT), MultiLoad, BTEQ, FastExport, and FastLoad.
  • Experienced in dimensional and relational data modeling using ER/Studio, Erwin, and Sybase PowerDesigner; Star/Snowflake schema modeling; FACT and dimension tables; and conceptual, logical, and physical data models.
  • Expertise in data analysis, design, development, implementation, and testing using data conversion, Extraction, Transformation and Loading (ETL), and SQL Server, Oracle, and other relational and non-relational databases.
  • Worked on creating various types of indexes on different collections to get good performance in MongoDB database.
  • In-depth knowledge of Hadoop ecosystem components such as Pig, Hive, Sqoop, Flume, Oozie, Zookeeper, and Cloudera Manager.
  • Experience in deploying Hadoop clusters on public and private cloud environments such as Amazon AWS, Rackspace, and OpenStack.
  • Extensive experience in development of T-SQL, OLAP, PL/SQL, Stored Procedures, Triggers, Functions, Packages, performance tuning and optimization for business logic implementation.
  • Responsible for architecture design, data modeling, and implementation of Big Data platform and analytic applications.
  • Diversified experience in Business Intelligence development, data analysis, business analysis, ETL scripting, ETL batch verification, ad-hoc reporting, and DWH design and development.
  • Expertise in the MS SQL Server suite of products, including SSRS, SSIS, and SSAS.
  • Proficient in handling complex processes using SAS/Base, SAS/SQL, SAS/STAT, SAS/Graph, and SAS/ODS, including Merge, Join, and Set statements.
  • Experience in designing of on-line transactional processing (OLTP), operational data store (ODS) and decision support system (DSS) (e.g., Data Warehouse) databases, utilizing Data vault (hub and spoke), dimensional and normalized data designs as appropriate for enterprise-wide solutions.
  • Data analysis: data collection, transformation, and loading using different ETL tools such as SSIS and Informatica.
  • Researched, designed, and prototyped the UX and UI through user interviews and Sketch, Photoshop, and InVision mockups.
  • Extensive experience working with Business Intelligence data visualization tools with specialization in Tableau Desktop, and Tableau Server.
  • Extensive experience with business intelligence (BI) tools such as OLAP, data warehousing, reporting and querying tools, data mining, and spreadsheets.
  • Implemented Ralph Kimball's data warehouse methodologies (Star schema & Dimensional Modeling) and aware of Data Vault.
  • Sound knowledge on SDLC process - Involved in all phases of Software Development Life Cycle - analysis, design, development, testing, implementation and maintenance of applications.
  • Thorough knowledge of creating DDL, DML, and transaction queries in SQL for Oracle and Teradata databases.
  • Technically proficient and customer-dedicated, with remarkable experience in Mainframe development and maintenance projects built with Teradata, JCL, and IBM tools.
  • Experienced working on modification of views on Databases, Performance Tuning and Workload Management.
  • Worked on multiple projects involving cross-platform development and testing (Mainframe, UNIX).
  • Worked with and extracted data from various database sources such as Oracle, SQL Server, DB2, and Teradata/Big Data.
  • Hands on experience with modeling using ERWIN in both forward and reverse engineering cases.
  • Good experience with Waterfall and Agile methodologies.
  • Involved in writing UNIX shell scripts for Teradata ETL tooling and data validation.
  • Working knowledge of Amazon Web Services (AWS) and Cloud Data Management.
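The DDL, DML, and transaction work noted above can be sketched with a minimal, hypothetical example. An in-memory SQLite database stands in for the Oracle/Teradata systems, and all table and column names are illustrative, not taken from any actual project:

```python
import sqlite3

# In-memory SQLite stands in for an Oracle/Teradata database (illustration only).
conn = sqlite3.connect(":memory:", isolation_level=None)  # autocommit; we manage transactions
cur = conn.cursor()

# DDL: create a hypothetical dimension table and an index on it.
cur.execute("""
    CREATE TABLE customer_dim (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        region      TEXT
    )
""")
cur.execute("CREATE INDEX idx_customer_region ON customer_dim (region)")

# DML wrapped in an explicit transaction: all statements commit together or not at all.
try:
    cur.execute("BEGIN")
    cur.execute("INSERT INTO customer_dim VALUES (1, 'Acme Corp', 'Midwest')")
    cur.execute("UPDATE customer_dim SET region = 'Central' WHERE customer_id = 1")
    cur.execute("COMMIT")
except sqlite3.Error:
    cur.execute("ROLLBACK")
    raise

cur.execute("SELECT region FROM customer_dim WHERE customer_id = 1")
print(cur.fetchone()[0])  # Central
```

The same create/insert/update/rollback pattern applies on Oracle or Teradata, with vendor-specific index options layered on top.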

TECHNICAL SKILLS:

Languages: SQL, PL/SQL, C, C++, Java/JEE, .NET

BI Tools: QlikView, Qlik Sense, Tableau, Business Objects XI R2/R3, Crystal Reports XI R2, Xcelsius 2008, Cognos Reports

Databases: Oracle 11g/10g/9i, MS SQL Server, DB2, NoSQL (MongoDB, MarkLogic), Teradata R12/R13/R14.10, Netezza

Other Tools: MS Office suite (Word, Excel, MS Project, Outlook), VSS, MS Visio, Query-Man, TOAD, Adobe Flex, Informatica 8.x, Maestro, ETL

Operating Systems: Windows, Linux

PROFESSIONAL EXPERIENCE:

Confidential, Chicago, IL

Data Analyst

Responsibilities:

  • Interviewed Business Users to gather Requirements and analyzed the feasibility of their needs by coordinating with the project manager and technical lead.
  • Developed forms, reports, queries, macros, VBA code and tables to automate data importation and exportation to a system created in MS Access.
  • Generated tools in MS Excel using VBA for users to extract data automatically from Teradata and SQL Server databases.
  • Created Tableau Dashboard for the top key Performance Indicators for the top management by connecting various data sources like Excel, Flat files and SQL Database.
  • Exposure to NoSQL databases MongoDB and Cassandra.
  • Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, geographical maps, and Gantt charts via the Show Me functionality.
  • Designed and developed various analytical reports from multiple data sources by blending data on a single worksheet in Tableau Desktop.
  • Designed the SSAS (SQL Server Analysis Services) cubes, SSRS (SQL Server Reporting Services) reports, and ad-hoc querying facilities.
  • Developed basic to complex SQL queries to research, analyze, troubleshoot data and to create business reports.
  • Used Informatica power center for (ETL) extraction, transformation and loading data from heterogeneous source systems.
  • Converted wireframes and UX components into semantic HTML pages, improving accessibility and search results and decreasing site load time.
  • Utilized SSIS .NET scripting to interface with third-party websites.
  • Successfully implemented complex ETL systems and data warehouses from scratch (using SSIS and SSAS) within a tight three-month timeline.
  • Enabled speedy reviews and first mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and PIG to pre-process the data.
  • Scheduled Cube Processing from Staging Database Tables using SQL Server Agent in SSAS
  • Employed advanced T-SQL and Crystal Reports data analytic tools to develop insights into Confidential outcomes and the drivers of the outcomes.
  • Performed manual testing on Web-based, Client/Server, Windows, and UNIX environments.
  • Data analysis: data collection, transformation, and loading using different ETL tools such as SSIS and Informatica.
  • Created Stub Data in XML and provided to UI developers to progress the web page development.
  • Involved in SOAP testing and created a blue print for web orchestration.
  • Performed Data audit, QA of SAS code/projects and sense check of results.
  • Conducted cash flow analysis to prepare summarized reports on cash inflows and outflows.
  • Track and analyze on a per-project basis all production funding related to original programming in order to provide annual budget and quarterly forecasts.
  • Created new database objects like tables, procedures, Functions, Indexes and Views.
  • Designed constraints and rules; set primary, foreign, unique, and default keys; and designed hierarchical database structures.
  • Developed stored procedures in SQL Server to standardize DML transactions such as insert, update and delete from the database.
  • Created SSIS package to load data from Flat files, Excel and Access to SQL server using connection manager.
  • Created data transformation task such as BULK INSERT to import data.
  • Created SSRS reports in BI Studio, including prompt-generated/parameterized reports, using SSRS 2008.
  • Created reports from OLAP, sub reports, bar charts and matrix reports using SSRS.
  • Worked on generating various dashboards in Tableau Server using different data sources such as Teradata, Oracle, Microsoft SQL Server, and Microsoft Analysis Services.
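The flat-file loads described above (SSIS packages and BULK INSERT moving CSV/Excel data into SQL Server) follow a simple stage-and-insert pattern. A minimal sketch, with an in-memory CSV standing in for the flat files and SQLite standing in for SQL Server; all names and values are made up for illustration:

```python
import csv
import io
import sqlite3

# An in-memory CSV stands in for the flat-file/Excel feeds loaded via SSIS,
# and SQLite stands in for SQL Server (all names and data are illustrative).
flat_file = io.StringIO("order_id,amount\n101,250.00\n102,99.50\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")

# Parse and type-convert each row, then bulk insert in one executemany call.
rows = [(int(r["order_id"]), float(r["amount"])) for r in csv.DictReader(flat_file)]
conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
conn.commit()

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 349.5
```

In SSIS the connection manager, data-flow task, and destination component play the roles of the connection, the row loop, and the `executemany` call here.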

Confidential, Wilmington, DE

Data Analyst

Responsibilities:

  • Analysis of functional and non-functional categorized data elements for data profiling and mapping from source to target data environment. Developed working documents to support findings and assign specific tasks
  • Accomplished financial tests to ensure compliance with CCAR, BASEL, Dodd-Frank, and Sarbanes-Oxley using SQL, Oracle, SAS, DB2, Teradata, and MS Access; proficient with business intelligence tools such as SSIS, SSRS, TOAD, SAS Enterprise Guide, Teradata SQL Assistant, VBA, Tableau, and Actimize.
  • Successfully worked on data visualization with tools such as Excel and QlikView.
  • Worked in importing and cleansing of data from various sources like DB2, Oracle, flat files onto SQL Server with high volume data
  • Experience with writing scripts in Oracle, SQL Server and Netezza databases to extract data for reporting and analysis
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Expert in dimensional data modeling using SSAS and Erwin, cube partitioning, optimization, creating aggregations.
  • Created multiple, very complex reports in SSRS and Power BI that run on high volumes of data with response times under a few seconds, pulling data from SQL Server as well as MongoDB.
  • Managed and reviewed Hadoop log files.
  • Developed automated data pipelines in Python from various external data sources (web pages, APIs, etc.) to the internal data warehouse (SQL Server, AWS), then exported to reporting tools such as Datorama.
  • Used Informatica power center for (ETL) extraction, transformation and loading data from heterogeneous source systems.
  • Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
  • Studied and reviewed the application of the Kimball data warehouse methodology, as well as the SDLC, across various industries to work successfully with data-handling scenarios.
  • Connected to RedShift through Tableau to extract live data for real time analysis.
  • Worked on Normalization and De-normalization concepts and design methodologies like Ralph Kimball and Bill Inmon's Data Warehouse methodology.
  • Developed normalized Logical and Physical database models to design OLTP system for finance applications.
  • Experience with UX/UI optimization.
  • Retrieved and uploaded data and created customized reports from a Microsoft SQL Server database; also created required web applications on the ASP.NET framework.
  • Extensively used ERwin for developing data model using star schema methodologies
  • Collaborated with other data modeling team members to ensure design consistency and integrity.
  • Involved in Planning, Defining and Designing data base using Erwin on business requirement and provided documentation.
  • Validated the data of reports by writing SQL queries in PL/SQL Developer against ODS.
  • Involved in user training sessions and assisting in UAT (User Acceptance Testing).
  • Strong ability in developing advanced SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements including identifying the tables and columns from which data is extracted
  • Excellent experience with the Teradata Appliance Backup Utility (ABU) and ARC to back up data to and from Teradata nodes.
  • Developed mappings/sessions using Informatica PowerCenter 8.6 for data loading.
  • Implemented Referential Integrity using primary key and foreign key relationships.
  • Worked with the Business Analyst, QA team in their testing and DBA for requirements gathering, business analysis, testing and project coordination.
  • Worked with Database Administrators, Business Analysts and Content Developers to conduct design reviews and validate the developed models.
  • Used SQL for querying the database in a UNIX environment.
  • Designed, configured and deployed Amazon Web Services (AWS) for a multitude of applications
  • Worked closely with the ETL SSIS Developers to explain the complex Data Transformation using Logic.
  • Designed and developed cubes using SQL Server Analysis Services(SSAS) using Microsoft Visual Studio 2008
  • Reverse engineered DB2 databases and then forward engineered them to Teradata using ER/Studio.
  • Created and deployed reports using SSRS.
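The data profiling and cleansing work above (scanning records from DB2, Oracle, and flat files before loading) boils down to per-column checks like null counts and distinct-value counts. A minimal sketch on a tiny, entirely hypothetical extract:

```python
import csv
import io

# A sample extract standing in for a DB2/Oracle source feed (hypothetical data).
source = io.StringIO(
    "id,state,balance\n"
    "1,IL,100\n"
    "2,,250\n"
    "3,IL,\n"
    "4,DE,75\n"
)

reader = csv.DictReader(source)
rows = list(reader)

# Basic profile: null count and distinct non-null values per column.
profile = {}
for col in reader.fieldnames:
    values = [r[col] for r in rows]
    profile[col] = {
        "nulls": sum(1 for v in values if v == ""),
        "distinct": len({v for v in values if v != ""}),
    }

print(profile["state"])  # {'nulls': 1, 'distinct': 2}
```

Columns with unexpected null rates or cardinality are the ones flagged for cleansing before they reach the warehouse.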

Confidential, Chicago, IL

Business Analyst/Data Analyst

Responsibilities:

  • Provided hands-on Database, Data Modeling, Data Warehousing, and ETL expertise and mentoring.
  • Experienced in gathering/analyzing the business/user requirements, designing ODS tables in consultation with the Data Architect, analyzing the source/target dependencies & Production Troubleshooting.
  • Used Informatica reusability at various levels of development.
  • Developed Stored Procedures, Functions & Packages to implement the logic at the server end on Oracle 9.2. Performed Application/ SQL Tuning using Explain Plan, SQL Tracing & TKPROF. Also, used Materialized View for the Reporting Requirement.
  • Responsibilities also include Gathering the Business requirements, Business Analysis, Design, QA testing and final promotion to the production as well.
  • Created data masking mappings to mask the sensitive data between production and test environment.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle 10g/11g and Teradata.
  • Created multiple SSIS packages to import data from the legacy (mainframe) system, Oracle, and MongoDB to the target SQL Server DB for report consumption and other uses.
  • Assisted in the creation, verification, and publishing of metadata, including identification, prioritization, definition, and data lineage capture for Key Data Elements (KDEs) and Important Data Elements (IDEs).
  • Wrote SQL scripts to test the mappings and developed a Traceability Matrix mapping business requirements to test scripts, ensuring any change control in requirements leads to test case updates.
  • Involved in the design of the new Data Mart for Finance Department and working with Data Architect/ Designer using Erwin Data Modeling Tool.
  • Involved in Data profiling and performed Data Analysis based on the requirements, which helped in catching many Sourcing Issues upfront.
  • Collaborated with the horizontal Data Analyst community to develop analytical tools using Big Data platforms such as Hadoop.
  • Used UNIX shell scripts to schedule workflows.
  • Designed and developed an end to end data warehousing and OLAP solution using SSIS, SSAS, SSRS, and SQL Server.
  • Assist Business Objects & Tableau Report Developers to develop reports based on the requirements.
  • Worked with supporting business analysis and marketing campaign analytics with data mining, data processing, and investigation to answer complex business questions.
  • Designed SSIS Packages to Extract, Transfer and Load (ETL) existing data into SQL Server from different environments for the SSAS cubes
  • Created data models for AWS Redshift and Hive from dimensional data models.
  • Worked on data modeling and advanced SQL with columnar databases on AWS.
  • Used the ETL tool Informatica to populate the database and to transform data from the old database to the new Oracle database.
  • Developed Tableau data visualizations using Pareto charts, combo charts, heat maps, box-and-whisker plots, scatter plots, geographic maps, cross tabs, and histograms.
  • Published and shared dashboards and views on Tableau server.
  • Developed Data mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture involving OLTP, ODS and OLAP.
  • Performed logical data modeling, physical Data Modeling (including reverse engineering) using the ERWIN Data Modeling tool.
  • Extracted and imported data from MongoDB and configured the MongoDB BI Connector and the ODBC driver to enable smooth communication between MongoDB and SSRS and Power BI.
  • Created reports from OLAP, sub reports, bar charts and matrix reports using SSRS.
  • Provided tollgate SQL and data quality checks to the ETL team to test data inserted by the Java rules engine into staging tables per requirements, along with logic to automate the process.
  • Developed detailed ER diagrams and data flow diagrams using modeling tools, following the SDLC structure.
  • Provided PL/SQL queries to developers as source queries to identify the data, along with the logic to assign it.
  • After sign-off from the client on the technical brief, started developing the SAS code.
  • Wrote data validation SAS code with the help of the UNIVARIATE and FREQ procedures.
  • Extensively used SAS procedures like IMPORT, EXPORT, SORT, FREQ, MEANS, FORMAT, APPEND, UNIVARIATE, DATASETS and REPORT.
  • Implemented cluster analysis (PROC CLUSTER and PROC FASTCLUS) iteratively.
  • Performed Data audit, QA of SAS code/projects and sense check of results.
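The validation checks run via SAS PROC FREQ and PROC MEANS above amount to frequency counts on a classification variable and summary statistics on an analysis variable. A rough Python-stdlib analogue on hypothetical records (region names and amounts are invented for illustration):

```python
from collections import Counter
from statistics import mean

# Hypothetical records standing in for a SAS dataset (illustrative values only).
records = [
    {"region": "Midwest", "amount": 120.0},
    {"region": "Midwest", "amount": 80.0},
    {"region": "East", "amount": 100.0},
]

# PROC FREQ analogue: frequency counts for the classification variable.
freq = Counter(r["region"] for r in records)

# PROC MEANS / UNIVARIATE analogue: a summary statistic on the analysis variable.
avg = mean(r["amount"] for r in records)

print(dict(freq), avg)  # {'Midwest': 2, 'East': 1} 100.0
```

Comparing such counts and means against expected totals is the "sense check of results" step: a mismatch points at a data or code problem upstream.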

Confidential

Business Analyst/ Developer

Responsibilities:

  • Involved in designing conceptual, logical and physical models using Erwin and build data marts using hybrid Inmon and Kimball DW methodologies.
  • Worked closely with Business team, Data Governance team, SMEs, and Vendors to define data requirements.
  • Used Microsoft Excel for formatting data as tables, visualization, and analysis using methods such as conditional formatting, removing duplicates, pivot and unpivot tables, creating charts, and sorting and filtering data sets.
  • Created SQL tables with referential integrity and developed queries using SQL, SQL*PLUS and PL/SQL.
  • Design, coding, unit testing of ETL package source marts and subject marts using Informatica ETL processes for Oracle database.
  • Worked on Migration of projects from Forms 3.0 on UNIX to Forms 6.0 on Windows NT 4.0.
  • Utilized Hibernate for Object/Relational Mapping purposes for transparent persistence onto the SQL server.
  • Developed a portlet-style user experience using Ajax and jQuery.
  • Used Spring IoC to create the beans to be injected at run time.
  • Modified the existing JSP pages using JSTL.
  • Expertise in web designing using HTML5, XHTML, XML, CSS3, JavaScript, jQuery, AJAX and Angular JS.
  • Integrated Spring Dependency Injection among different layers of an application with Spring and O/R mapping tool of Hibernate for rapid development and ease of maintenance.
  • Gathering, reviewing business requirements and Analyzing data sources from Excel/SQL for design. Development, testing, and production rollover of reporting and analysis projects within Tableau Desktop.
  • Designed both 3NF data models for ODS, OLTP systems and dimensional data models using Star and Snowflake Schemas.
  • Worked on MVC reporting and JavaScript.
  • Developed RESTful web services using Spring IoC to give users a way to run the job and generate daily status reports.
  • Developed and exposed SOAP web services using JAX-WS, WSDL, AXIS, JAXP, and JAXB.
  • Involved in developing business components using EJB Session Beans and persistence using EJB Entity beans.
  • Used data analysis techniques to validate business rules and identify low-quality or missing data in the existing data warehouse.
  • Involved in defining the business/transformation rules applied for ICP data.
  • Define the list codes and code conversions between the source systems and the data mart.
  • Worked with internal architects, assisting in the development of current- and target-state data architectures.
  • Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines.
  • Evaluated data profiling, cleansing, integration, and extraction tools (e.g., Informatica).
  • Supported SOA, Data Warehousing, Data Mining, and Enterprise Service model standards in designing and developing standardized processes such as configuration management.
  • Created data transformations from internal and third party data sources into data suitable for handheld devices, including XML.
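The source-to-XML transformations above (preparing third-party data for handheld devices) can be sketched in a few lines with the standard library. The record and its field names are hypothetical, purely for illustration:

```python
import xml.etree.ElementTree as ET

# A hypothetical record from a third-party source (field names are illustrative).
record = {"sku": "A-100", "price": "19.99", "stock": "42"}

# Build an XML <item> element with one child per field, as a device-friendly payload.
item = ET.Element("item")
for field, value in record.items():
    ET.SubElement(item, field).text = value

xml_text = ET.tostring(item, encoding="unicode")
print(xml_text)  # <item><sku>A-100</sku><price>19.99</price><stock>42</stock></item>
```

In practice the same loop runs per source row, and the serialized documents are handed to the delivery layer unchanged.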
