
Data Governance / Data Analyst Resume


Michigan

SUMMARY

  • Around 8 years of overall IT experience in Data Analytics and Business Intelligence, predominantly in the Health Care, Automotive, and Human Capital industries.
  • Hands-on experience with data monitoring, model creation, data lineage, metadata management, source-to-target mapping, data catalog management, data quality controls, data security, and data reporting, providing organizational management with guidance on streamlining data management.
  • Excellent practice with data visualization tools such as Tableau, Power BI, and SAS. Experienced with data modeling, data warehousing (snowflake schema), and building ETL pipelines.
  • Created action filters, parameters, and calculated fields to build complex Tableau dashboards. Very good at Query Editor (cleaning data), Relationships and DAX (building models), and visualization (delivering data to end users) with Power BI.
  • Over 5 years of IT experience specializing in Data Warehousing and Decision Support Systems, with extensive experience implementing full life cycle data warehousing projects.
  • Analytical knowledge of advanced Pivot Tables and LOOKUP functions in Excel for capturing and transforming data into a simplified form, producing charts, graphs, and tables to discover insights; conducted data migration using Microsoft Access.
  • Strong understanding of cloud services such as AWS and Microsoft Azure. Knowledgeable in designing, deploying, and operating highly available, scalable, and fault-tolerant systems using Amazon Web Services.
  • Assisted in the process for creating Weekly, Monthly & Quarterly Business Reviews. Provided analytical insights, consolidating format, proofreading, and pulling standard reports.
  • Developed ETL Mappings, Mapplets, Workflows, and Worklets using Informatica PowerCenter 9.6.x.
  • Experience working with different databases such as MS SQL, Oracle and SAP.
  • Experienced in analysis, design, development, implementation, and testing of interfaces using an ETL tool (Informatica) and database applications (Microsoft Access/SQL Server Management Studio).
  • Expert in data extraction, transformation, and loading (ETL) using various tools such as SQL Server Integration Services (SSIS), Log Shipping, DTS, Bulk Insert, and BCP.
  • Completed a large volume of ad-hoc report requests from clients using multiple vendor-managed systems, helping customers understand the impact of their business operations.
  • Extensively worked on designing ETL data flows using SSIS, creating mappings/workflows to extract data from SQL Server, and on data migration and transformation from Oracle/Access/Excel sheets using SSIS.
  • Extracted data from Excel files and high-volume data sets in flat files, Oracle, DB2, and Salesforce.com (SFDC) using Informatica ETL mappings and SQL/PLSQL scripts, and loaded it into the data store area.
  • Experience in developing a batch processing framework to ingest data into HDFS, Hive, and HBase.
  • Developed Sqoop jobs to import and store massive volumes of data in HDFS and Hive.
  • Maintained team organizational channels like SharePoint (Remote Repository) and Microsoft Teams (Communication Channel) in parallel.
  • Provided Executives with analytics and decision-support tools, used as the basis for decision making and their business operational reviews.
  • Excellent knowledge of operating systems Windows, UNIX and databases including Teradata, Oracle, Netezza, Hive, Impala, SQL Server and DB2.
  • Performed ad-hoc and financial reporting in Teradata using Basic Teradata Query (BTEQ).
  • Part of the Global IMS (Information Managed Services) team supporting vendor-managed services for global clients, with knowledge of different reporting suites.
  • Worked with the Stakeholder Strategy, Sustainability, and Foundation team to create and visualize data for business operations reviews using Power BI and a SQL Server database.
  • Ability to explore options and suggest new solutions and visualization techniques to the customer.
  • Strong data analysis experience, including data profiling, building customized views, writing SQL queries, understanding DW data model.
  • Excellent analytical and technical skills to quickly adapt and work with challenging environments.
  • Experienced in design, implementation and management of business intelligence systems using a wide range of technologies.
  • Capable of meeting deadlines and delivering high-quality work under all circumstances.
  • Ability to work as part of a team or individually and produce high quality work.
  • Experienced in Project Management and delivering solutions in Onsite/Offshore model.
  • Ability to work in multiple technologies integrating different application areas in cross functional environment.
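Several of the bullets above (building ETL pipelines, cleaning data, loading to a data store) describe the same extract-transform-load pattern. A minimal, self-contained Python sketch of that pattern is below; the table and field names are invented for illustration and are not from any actual project:

```python
import csv
import io
import sqlite3

def run_etl(csv_text, conn):
    """Extract rows from CSV text, clean them, and load them into a target table."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    # Transform: drop rows missing an id, trim whitespace, cast the amount.
    clean = [
        {"id": int(r["id"]), "name": r["name"].strip(), "amount": float(r["amount"])}
        for r in rows
        if r.get("id") and r.get("name", "").strip()
    ]
    # Load: create the target table if needed and bulk-insert the clean rows.
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER, name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (:id, :name, :amount)", clean)
    conn.commit()
    return len(clean)

conn = sqlite3.connect(":memory:")
# The third row has no id, so the transform step filters it out.
loaded = run_etl("id,name,amount\n1, Alice ,10.5\n2,Bob,3.0\n,missing,1.0\n", conn)
```

In a production pipeline the in-memory SQLite target would be replaced by the actual warehouse connection, but the extract/transform/load stages keep the same shape.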

TECHNICAL SKILLS

Technologies/Languages: Data warehousing (ETL), Shell Scripts and Python

Databases: Teradata, Bteq scripts, SQL Server, Oracle and MYSQL

Operating Systems: Windows& DOS, UNIX/LINUX/AIX and Solaris

BI Tools: Tableau, SAS, Alteryx, QlikView

Data Modeling: Erwin and Microsoft Visio

Configuration Management: Microsoft Team Foundation Server (MSTFS)

Programming Languages: SQL, PL/SQL, Java, UNIX commands and Shell/Bash Scripting

Domains worked: Healthcare, Automotive and Human Capital

Scheduling Tools: Control-M and Autosys

Tools Used: Tableau, Python, DataStage 11.5 (Server and Parallel jobs), Informatica CDC, Excel, Teradata SQL Assistant, Oracle SQL Developer, IBM DB2 clients, PL/SQL, Hadoop file system, Cloudera, Microsoft SQL Server, MSTFS, PuTTY, DBeaver, Striim, Toad and MS Visio.

PROFESSIONAL EXPERIENCE

Confidential

Data Governance / Data Analyst

Responsibilities:

  • As a data governance analyst, specialized in managing the security, integrity, and availability of data within the Enterprise Data Platform. Moved data from on-prem Hadoop 3.1 to Google Cloud Storage (GCS).
  • Worked with the data replication tool Striim for data migration from the on-prem Hadoop environment to Google Cloud Storage buckets, as well as to Dataproc and BigQuery.
  • Analyzed and oversaw strategies designed to enhance data reliability and minimize redundancies.
  • Applied data governance standards within the context of data ingestion and data discovery processes.
  • Worked on Jira tickets for creating and overseeing data access requests processes.
  • Processed data decryption authorization based on PHI and PII standard protocols.
  • Provided operational support for new Hive data requests, data egress, and data ingestion processes.
  • Reviewed, developed, and implemented various standards involving data governance concepts.
  • The main scope of the operational support model was the periodic creation and revision of data governance procedural documentation.
  • Ensuring metadata is captured correctly.
  • Created and utilized methods for monitoring and reporting any incidents of misused or unsecured data.
  • Participated in Scaled Agile Framework (SAFe) efforts as needed for projects. Worked on different projects within the EDP model.
  • Created and maintained data dictionaries, target mapping and catalogs.
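The data dictionary and catalog work mentioned above can be illustrated with a small Python sketch that profiles a dataset's columns (types, null counts, distinct counts). The record and column names here are made up for the example:

```python
from collections import OrderedDict

def build_data_dictionary(rows):
    """Profile a list of record dicts into a simple column-level data dictionary."""
    profile = OrderedDict()
    for row in rows:
        for col, value in row.items():
            entry = profile.setdefault(col, {"types": set(), "nulls": 0, "distinct": set()})
            if value is None:
                entry["nulls"] += 1
            else:
                entry["types"].add(type(value).__name__)
                entry["distinct"].add(value)
    # Flatten the working sets into readable per-column summaries.
    return {
        col: {
            "types": sorted(e["types"]),
            "null_count": e["nulls"],
            "distinct_count": len(e["distinct"]),
        }
        for col, e in profile.items()
    }

records = [
    {"member_id": 1, "state": "MI"},
    {"member_id": 2, "state": None},
    {"member_id": 3, "state": "MI"},
]
dd = build_data_dictionary(records)
```

A real catalog entry would add business definitions, ownership, and source-to-target lineage on top of this kind of automated column profile.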

Environment: Hadoop 2.6 & 3.1, Voltage, Dataguise, Dataproc/Hive, SecureCRT (shell), SecureFX, Google Cloud Platform (GCS/Dataproc/BigQuery), Aginity Pro, PuTTY, Striim, Eclipse (Java).

Confidential, Michigan

Sr. Business Analyst

Responsibilities:

  • Analyzed one of our most complex packages surrounding customer contacts, then applied a new business rule architecture to make it more flexible and easier to modify
  • Introduced Oracle arrays to provide numerous overall benefits from a performance, readability, and business rule footprint standpoint
  • Coached Jr. team members on internal system knowledge as well as technical skills related to performance tuning, organization, readability and usability
  • Facilitated team meetings to discuss direction, issues, knowledge sharing and project work
  • Reviewed with team members any new technology utilized, resolutions to issues or any other relevant knowledge sharing items.
  • Developed a comprehensive search tool that allows us to quickly search for items such as database objects or strings across all team code sets
  • Involved in designing, developing and deploying reports in an MS SQL Server environment using SSRS 2008 and SSIS in Business Intelligence Development Studio (BIDS).
  • Assisted in the creation and development of various projects, by planning with Management the resources needed to meet the timelines set for the finalization of the project.
  • Identified KPIs that are most appropriate to the business and monitored business performance. Developed and implemented new reporting strategies and procedures to enhance the effectiveness of company operations.
  • Identified and refined data quality checks and data monitoring supporting the business rules and changes; defined and refined data quality measures, reports, and dashboards.
  • Developed Tableau data visualization using Cross Map, Scatter Plots, Geographic Map, Pie Charts and Bar Charts, Page Trails, and Density Chart.
  • Created detailed level summary report using Trend lines, Statistics, log axes, groups, and hierarchies.
  • Prepared Dashboards using calculations, parameters in Tableau and created calculated fields, groups, sets and hierarchies etc.
  • Expert level capability in Tableau calculations and applied complex, compound calculations to large, complex data sets.
  • Involved in end to end implementation and enhancements on Web based dashboard platform starting from requirement gathering and dashboards development in Tableau Desktop.
  • Developed, maintained and produced individualized monthly reporting packages for MSP clients based on the specific needs of each customer organization, as well as ad hoc reporting.

Environment: Tableau, Informatica, Pivotal Cloud, Microsoft SQL Server, SSIS, SSRS, SSAS, PL/SQL and Python

Confidential, Michigan

Data Engineer/ Sr. ETL Developer

Responsibilities:

  • Performed migration review testing and reviewed database tables for encryption of PII and PHI data attributes.
  • Worked on data migration projects from platform Hadoop 2.6 to version 3.1. The data replication tool Striim was utilized during this process.
  • Analyzed large datasets to identify metrics, drivers, performance gaps and opportunities for improvement.
  • Conducted weekly/monthly business reviews to provide business updates to the executive team for closing operational gaps.
  • Worked with developers and business representatives on the resolution of testing incidents.
  • Responsible for communicating status, issues, and risks to management and the relevant project teams.
  • Responsible for tracking and escalating incomplete issues to closure.
  • Developed implementation plans, training plans, and user documentation as needed.
  • Designed customer solutions that include various Transcend Insights products.
  • Served as the Transcend Insights Technical Subject Matter Expert, creating solution architecture plans to ensure environments adhere to Transcend Insights’ technical best practices, client requirements, sandbox testing results, and Transcend Insights’ business and strategic objectives.
  • Migrated clients' reports from static Excel-based solutions to an interactive service.
  • Gathered data from multiple data sources to provide enterprise-wide analytics for executive management.
  • Involved in extraction, transformation and loading of data directly from different source systems like flat files, Excel, Oracle and SQL Server.

Environment: Hadoop 2.6 & 3.1, Aginity Pro, Tableau, Qlikview, Microsoft Excel, IBM DB2, Informatica (Data Migration) and Python

Confidential, Michigan

Data Analyst

Responsibilities:

  • Prepared monthly and quarterly senior management reporting of worldwide consolidated results including biweekly forecasts, variance analysis, and executive summary reports
  • Involved with the Global Data & Analytics team in an automation effort, partnering with database engineers to ingest, cleanse, and predictably expose client data for reporting/dashboards.
  • Assisted with the development of business rules to impose on raw data and documentation of client metadata.
  • Provided support for, and developed a thorough understanding of, the Envision platform and its advanced analytics modules.
  • Leveraged Envision to support Program team requests, participated in weekly Envision meetings, and engaged in the process by which data is refreshed in the Envision platform through extract development and processing.
  • Acted as a point of contact between the end customer, Program team, and reporting vendors in addressing issues, evaluating new standard requests, or modifying current processes for proper completion and report delivery. Adhered to GRI's standards for professional communication and responsiveness to all inquiries.
  • Developed and maintained the current GRI analytical offering, ensuring that the delivery of reporting and ad-hoc analysis was on time, adhered to the quality standards of the department, and aligned with the analytical needs of clients and/or the Program team.
  • Executed ad-hoc reports for management to support critical business decisions that improve the efficiency of the company.
  • Cleaned, maintained, and updated databases for accuracy of all reports.
  • Consulted with end users on mid-level to complex reporting solutions. Consulted with end user to determine client reporting needs as to how data can be provided to support business decisions.

Environment: Tableau, Microsoft Excel, Microsoft SQL Server, MS SSIS, SSAS

Confidential

Data Analyst

Responsibilities:

  • Created and designed new standard reports and process definitions.
  • Hands-on experience installing, configuring, and using Hadoop ecosystem components such as Hadoop MapReduce, HDFS, HBase, Hive, Sqoop, Pig, ZooKeeper, and Flume.
  • Good exposure to Apache Hadoop MapReduce programming, Pig scripting, distributed applications, and HDFS.
  • Good knowledge of Hadoop cluster architecture and monitoring the cluster.
  • Worked with users and report developers to develop functional specifications and data mappings for report development and system improvements.
  • Analyzed and transformed data leveraging tools such as SQL Server and Excel, and developed and executed SQL scripts to create data extracts and reports.
  • Worked with company management to ensure mission critical reporting and business intelligence reporting is being developed and delivered.
  • Responded and acted on incoming metrics and reporting requests from senior management and the global team.
  • Utilized excel functionality to gather, compile and analyze data from pivot tables and created graphs/charts. Provided analysis on any claims data discrepancies in reports or dashboards.
  • Gather data from multiple data sources to provide enterprise-wide analytics for executive management.
  • Conducted analysis on data from different sources, identifying data trends, gaps, order, and patterns that help improve business decisions.
  • Transitioned older versions of reports to visually appealing reports by building enhanced analytic reports and dashboards in Tableau. Predominant use of advanced Pivots and VLOOKUP in Excel.
  • Periodically built metric reports to display the progress and summarize the activity of the business.
  • Made some recommendations in establishing new or modified reporting methods and procedures to improve report content and completeness of information.
  • Acted as a point of contact between end users, cross functional team members and offshore team in addressing reporting issues, evaluating new standard request or modifying current processes for proper completion.
  • Regularly updated the dashboards and maintained internal reporting by including the correct data and removing invalid entries; scheduled and ensured that deliverables were completed on time.
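The pivot-table and VLOOKUP work described in the bullets above has a direct analogue in code. A hedged Python sketch follows; the claim fields and region names are invented purely for illustration:

```python
from collections import defaultdict

def pivot_sum(rows, index, value):
    """Pivot-table-style aggregation: sum of `value` grouped by `index`."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[index]] += row[value]
    return dict(totals)

def vlookup(key, table, key_field, return_field):
    """VLOOKUP-style exact match: return_field of the first row whose key_field equals key."""
    for row in table:
        if row[key_field] == key:
            return row[return_field]
    return None  # Excel would show #N/A here

claims = [
    {"region": "East", "paid": 100.0},
    {"region": "West", "paid": 40.0},
    {"region": "East", "paid": 60.0},
]
by_region = pivot_sum(claims, "region", "paid")
lookup_table = [{"region": k, "total": v} for k, v in by_region.items()]
east_total = vlookup("East", lookup_table, "region", "total")
```

The same grouping and exact-match lookup semantics are what the Excel Pivot Table and VLOOKUP features provide interactively.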

Environment: Informatica (Data Migration), Teradata, Hadoop, HiveQL, HBase, Tableau, MS Excel and UNIX

Confidential

Database/ETL Developer

Responsibilities:

  • Provided process improvement analysis using process modeling and simulation techniques to understand the impact of process changes.
  • Provided operational analytics to support better management of MIS.
  • Attended technology design meetings, recommended security and data improvements, and was empowered to work on those prioritized changes.
  • Developed scripts to automate the execution of ETL using shell scripts in a Unix environment.
  • Provided routine process/operational control reports and Controls Dashboard support.
  • Designed database, tables and views structure for the application.
  • Used SQL to connect the database with other Oracle versions.
  • Conducted data migrations from third party databases to create consolidated database system.
  • Successfully transferred inbound data from various sources like flat files, MS Access, and Excel into database using SSIS Packages.
  • Optimized query performance and populated test data
  • Maintained the database using Procedures, Functions, Cursors and Triggers
  • Performed quality assurance and testing of SQL server environment
  • Assisted clients in using implemented software projects
  • Coordinated with the management to discuss business needs and designed database solutions accordingly.
  • Implemented batch job processing, scheduling and production support.
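The batch job processing and automation bullets above amount to a run-jobs-in-order, retry-on-failure pattern. A simplified Python sketch of that idea is below; the job names and the deliberately flaky job are hypothetical:

```python
import logging

def run_batch(jobs, max_retries=2):
    """Run each (name, callable) job in order, retrying failures up to max_retries times."""
    results = {}
    for name, job in jobs:
        for attempt in range(1, max_retries + 2):
            try:
                job()
                results[name] = "success"
                break
            except Exception as exc:
                # Log the failure; a scheduler like Control-M would alert here.
                logging.warning("job %s attempt %d failed: %s", name, attempt, exc)
                results[name] = "failed"
    return results

calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 2:  # fail on the first attempt, succeed on the retry
        raise RuntimeError("transient source error")

status = run_batch([("extract", flaky_extract), ("load", lambda: None)])
```

In production the callables would wrap the actual ETL steps (or shell out to them), and the retry/alert policy would come from the scheduler rather than being hard-coded.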

Environment: MS SQL Server, MS SSIS, MS SSAS, MS SSRS, Informatica (ETL).
