
Senior Associate Resume


SUMMARY:

  • Permanent Resident of Canada (Work Authorization)
  • 9+ years of hands-on experience in the Data Warehousing and Business Intelligence domain, spanning the complete life cycle of data warehousing projects.
  • Worked with the following ETL tools:
  • Informatica Powercenter & Informatica Cloud
  • SSIS
  • Alteryx
  • Talend Open Studio for Data Integration & SnapLogic
  • Extensively worked on writing complex SQL queries (SQL Server 2008/2012/2016), tuning queries, and optimizing performance using query execution plans and SQL Profiler (see the sketch after this list).
  • Worked on data warehousing both in the cloud (Amazon Redshift) and on-premises (Teradata)
  • Experience in writing basic UNIX scripts and maintaining data quality.
  • Worked briefly on ASP.NET (C#)
  • Bilingual in English and French; DELF B1 certified (French)
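
As a minimal illustration of the query-tuning work above (all table and column names here are hypothetical, not from any client engagement), the sketch below rewrites a non-SARGable predicate so SQL Server can seek an index, verified by comparing execution plans:

    -- Before: the function call on OrderDate hides the column from the
    -- optimizer, forcing a scan.
    SELECT OrderID, CustomerID
    FROM dbo.Orders
    WHERE YEAR(OrderDate) = 2016;

    -- After: a bare range predicate is SARGable, so an index on OrderDate
    -- can be seeked. Compare both in SSMS with "Include Actual Execution
    -- Plan" or SET STATISTICS IO ON.
    SELECT OrderID, CustomerID
    FROM dbo.Orders
    WHERE OrderDate >= '2016-01-01'
      AND OrderDate <  '2017-01-01';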

TECHNICAL SKILLS:

Languages: SQL, PL/SQL, C#.

Databases: MS SQL Server, Oracle, IBM DB2, Amazon Redshift

ETL & Data Tools: Informatica Cloud and Powercenter BDE, Talend Open Studio, Microsoft Visio, DB2 Command Editor, IBM Data Studio, SSMS, SQL Developer, SAP HANA Data Modeler, Teradata SQL Assistant, MS PDW

Job Scheduling: Autosys

Issue Tracking & Version Control: JIRA, GitHub

WORK EXPERIENCE:

Confidential

Responsibilities:

  • Worked as an ETL Developer creating workflows for a high-visibility summary report containing metrics across domains.
  • Coordinated with the Visualization team on report generation
  • Worked with the business to gather business rules and exceptions
  • Used GitHub as source control to manage workflows and JIRA as the scrum board (Agile).

Tools used: Informatica Powercenter 9.6, Alteryx 11.7, Tableau, SSIS, Teradata SQL Assistant, MS PDW, GitHub, JIRA Scrum, MITI, Hadoop

Confidential

Senior Associate

Responsibilities:

  • Ingested different streams of data using SSIS and Alteryx packages for analysis
  • Profiled and cleansed data during the ingestion phase (see the profiling sketch after this list)
  • Worked in the SAP HANA data modeler to create calculation views for visualization
  • Supported the Visual Design team in creating dashboards in SAP Lumira.
  • Conducted interviews with key stakeholders and data owners
  • Leveraged existing material created by the Tech BA team
  • Prepared business flows as swim-lane diagrams and validated them with the client
  • Identified gaps in the existing system
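
A minimal sketch of the profiling queries used during ingestion (hypothetical staging table and column names, not the client's actual schema):

    -- Hypothetical profiling query: row count, null rate, cardinality, and
    -- date range for a staging table, surfaced back to the client.
    SELECT
        COUNT(*)                                               AS total_rows,
        SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END)   AS null_customer_ids,
        COUNT(DISTINCT customer_id)                            AS distinct_customer_ids,
        MIN(transaction_date)                                  AS earliest_txn,
        MAX(transaction_date)                                  AS latest_txn
    FROM stg.transactions;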

Tools used: Erwin Data Modeler, Microsoft Visio, SSIS, MS SQL Server

Confidential

Responsibilities:

  • Designed a standard SSIS template and created packages to ingest data from 6 different sources; implemented automation for the daily feed.
  • Created ETL process for data cleansing.
  • Designed and Implemented logical and physical data models following Data Architecture best practices and standards.
  • Worked on Google BigQuery to bring in Google Analytics (GA) session-level data
  • Worked with SQL Server 2016 T-SQL to parse JSON files (see the OPENJSON sketch after this list)
  • Worked with SQL Server Master Data Services to leverage its regular-expressions functionality
  • Worked with the agency to import marketing cost data
  • Trained IJV (India Joint Venture) resources to handle managed services for 24/7 availability
  • Worked on the Customer Journey, Marketing Performance, and Channel aspects of the data
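
To illustrate the JSON-parsing bullet, a minimal sketch using SQL Server 2016's OPENJSON (the JSON shape and field names are hypothetical):

    -- Shred a JSON array into rows and typed columns with OPENJSON.
    -- Requires SQL Server 2016+ (compatibility level 130 or higher).
    DECLARE @json NVARCHAR(MAX) = N'[
      {"sessionId": "a1", "channel": "organic", "pageViews": 5},
      {"sessionId": "a2", "channel": "paid",    "pageViews": 2}
    ]';

    SELECT s.sessionId, s.channel, s.pageViews
    FROM OPENJSON(@json)
    WITH (
        sessionId NVARCHAR(50) '$.sessionId',
        channel   NVARCHAR(20) '$.channel',
        pageViews INT          '$.pageViews'
    ) AS s;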

Tools used: Informatica Powercenter, SSIS, Google BigQuery, MS SQL Server, Oracle, JSON, MS SQL Master Data Services

Confidential

Responsibilities:

  • Imported member, transaction, and product data into SQL Server
  • Performed data profiling to report data quality back to the client
  • Authored multiple SQL scripts for data cleaning and preparation ahead of analytical-model ingestion
  • Built SSIS packages to assemble the analytical data set after customer de-duping and product aggregation (see the de-duplication sketch after this list).
  • Prepared the QC document and unit test cases.
  • Helped the data modeling team validate data findings.
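
A minimal sketch of the customer de-duping step (hypothetical table and tie-breaking rule; the production logic ran in SSIS against the client's schema):

    -- Keep the most recent record per customer, delete the rest.
    WITH ranked AS (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_email   -- assumed duplicate key
                   ORDER BY last_updated DESC    -- newest record wins
               ) AS rn
        FROM stg.customers
    )
    DELETE FROM ranked
    WHERE rn > 1;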

Tools used: SSIS, SQL, MS SQL Server, Oracle, MS SQL Master Data Services, GitHub

Confidential

Responsibilities:

  • Worked on Informatica Powercenter BDE as part of the ETL process to load data into a Kerberos-authenticated Cloudera (Hadoop) cluster.
  • Created Oracle PL/SQL scripts to clean and aggregate the data
  • Created analytical data sets at different granularities for the Modeling and Visualization teams.
  • Created scripts to QC the staging and ADS tables.
  • Used online APIs such as Zipwise, the Bing Maps API, and the Google Maps API to retrieve coordinates from postal codes; automated the process to pull data for over 80,000 postal codes.
  • Used a Java transformation in some Powercenter mappings to calculate the distance between two coordinates (a SQL equivalent is sketched after this list).
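
The distance calculation itself ran in a Java transformation; an equivalent haversine formula, sketched here in SQL against a hypothetical coordinates table (result in kilometres):

    -- Haversine distance between two latitude/longitude points.
    -- 6371 km is the mean Earth radius.
    SELECT
        a.postal_code AS from_postal,
        b.postal_code AS to_postal,
        2 * 6371 * ASIN(SQRT(
            POWER(SIN(RADIANS(b.lat - a.lat) / 2), 2) +
            COS(RADIANS(a.lat)) * COS(RADIANS(b.lat)) *
            POWER(SIN(RADIANS(b.lon - a.lon) / 2), 2)
        )) AS distance_km
    FROM postal_coords a
    CROSS JOIN postal_coords b;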

Tools used: Informatica Powercenter, Hadoop, SQL, MS SQL Server, Oracle, MS SQL Master Data Services, GitHub

Confidential

ETL Developer

Responsibilities:

  • Worked on Informatica Powercenter to load strategic source data, building data marts and an operational data store through daily/weekly ETL jobs.
  • Used various transformations such as Aggregator, Router, Expression, Source Qualifier, Filter, Lookup, Joiner, Sorter, XML Source Qualifier, Rank, Stored Procedure, and Update Strategy to load the data.
  • Monitored workflows and investigated session and workflow logs for slowness and bottlenecks using thread statistics.
  • Worked with the DBA to create new and modify existing table schemas, indexes/keys, and other database objects based on query execution plans to speed up data retrieval.
  • Maintained the cloud data warehouse (Amazon Redshift), handling ETL failures and other data/performance issues; implemented distribution and sort keys for faster data retrieval (see the DDL sketch after this list).
  • Created a dimensional model using Erwin Data Modeler.
  • Worked on a POC of the SnapLogic ETL tool, implementing various complex use cases.
  • Wrote UNIX shell scripts and maintained data quality.
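
A minimal Redshift DDL sketch of the distribution/sort-key work (hypothetical fact table; the real keys were chosen from the join and filter patterns of the workload):

    -- Distribute on the join key so co-located joins avoid network
    -- redistribution; sort on the date column most queries filter on.
    CREATE TABLE fact_sales (
        sale_id     BIGINT,
        customer_id BIGINT,
        sale_date   DATE,
        amount      DECIMAL(12,2)
    )
    DISTSTYLE KEY
    DISTKEY (customer_id)
    SORTKEY (sale_date);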

Tools used: Informatica Powercenter BDE, Informatica Cloud, Amazon Redshift, SSIS, Amazon S3, Amazon DynamoDB, SnapLogic, SQL, MS SQL Server

Confidential

ETL Developer

Responsibilities:

  • Worked closely with business analysts on requirements gathering, analysis, process design, data design, development, unit testing, and implementation of load and data transformation processes.
  • Analyzed source data coming from different sources (flat files, MS SQL Server)
  • Prepared technical documentation of transformation components and participated in design and development reviews.
  • Created various documents across the project life cycle, including the functional specification, high- and low-level design documents, and the mapping document.
  • Created Informatica mappings to implement business rules and load data into different targets.
  • Extensively used Informatica PowerCenter Designer, Workflow Manager, and Workflow Monitor to develop, manage, and monitor workflows and sessions.
  • Designed and developed mappings using diverse transformations like Unconnected/Connected Static/Dynamic Lookup, Expression, Router, Rank, Joiner, Sorter, Aggregator, Normalizer, Transaction control, SQL, Source Qualifier transformations.
  • Worked on data cleansing, data mining, data profiling, created stored procedures/triggers and necessary Test plans to ensure the successful execution of the data loading processes.
  • Expert in SQL query optimization using query execution plans and SQL Profiler.
  • Worked as Informatica developer at onshore location.
  • Worked closely with business partners on requirements gathering.
  • Involved in designing Informatica mappings by translating business requirements; implemented SCD1, SCD2, and SCD3 (see the SCD2 sketch after this list).
  • Worked on mappings with constraint-based loading.
  • Worked as an offshore designer with the onsite team to prepare the design of a high-priority module.
  • Designed code following discussions with the requirements team.
  • Wrote the application code in ASP.NET from scratch.
  • Worked on bug fixing.
  • Worked on 3 complex releases to date.
  • Analyzed defects and provided root cause analysis.
  • Created the unit test plan and performed unit testing for the project.
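
The SCD logic was built in Informatica mappings; as an illustrative sketch, the Type 2 (SCD2) pattern expressed in T-SQL against a hypothetical customer dimension (assumes non-null attribute columns):

    -- Step 1: expire the current version of rows whose attributes changed.
    UPDATE d
    SET d.effective_to = GETDATE(),
        d.is_current   = 0
    FROM dim_customer d
    JOIN stg_customer s
      ON s.customer_id = d.customer_id
    WHERE d.is_current = 1
      AND (d.city <> s.city OR d.segment <> s.segment);

    -- Step 2: insert a fresh current version for new and changed customers
    -- (changed ones no longer have a current row after step 1).
    INSERT INTO dim_customer (customer_id, city, segment,
                              effective_from, effective_to, is_current)
    SELECT s.customer_id, s.city, s.segment, GETDATE(), NULL, 1
    FROM stg_customer s
    LEFT JOIN dim_customer d
           ON d.customer_id = s.customer_id
          AND d.is_current = 1
    WHERE d.customer_id IS NULL;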

Tools used: Informatica Powercenter, SSIS, Oracle, Teradata, SQL, MS SQL Server, DB2, IBM Data Studio, SQL Developer, SSMS
