
Senior Azure Data Engineer Resume


SUMMARY

  • A Cloud and ETL professional with over 12 years of IT experience as Team Lead, Senior Data Engineer/Developer, Data Modeler, and post-production support engineer. Extensive experience with Snowflake and Azure Data Factory. Hands-on experience in the design, development, and testing of data warehousing solutions; architect, design, and implement solutions for Microsoft Azure cloud applications (strong hands-on experience with Azure Data Factory, Snowflake, and Python) and internal infrastructure.
  • Extensive ETL experience using Informatica PowerCenter and SSIS.
  • Architect, design, and implement best-practice security patterns.
  • Ensure success in building and migrating applications, software, and services on the Azure platform and on internal infrastructure.
  • Strong hands-on experience with Azure Data Factory: creating pipelines, loading data into Snowflake, moving data between Blob storage containers, and moving data from on-premises sources to the cloud (see the Snowflake load sketch after this summary).
  • Good understanding of and hands-on experience with Snowflake (cloud data warehousing), including developing and implementing complex applications.
  • Participate in deep architectural discussions to ensure solutions are designed for successful deployment in the cloud
  • Troubleshooting production issues
  • 3 or more years’ experience in Cloud architecture (Azure specifically)
  • Experienced in working with Agile delivery processes and associated tooling (code review, unit test automation, environment, service, incident, and change management).
  • Experience in branching, tagging, and maintaining versions across environments using SCM tools such as Git.
  • Strong DWH and ETL experience in Informatica PowerCenter development, design, process, and best practices as an ETL Analyst and Developer.
  • Hands-on experience designing source-to-target maps, code migration, version control, scheduling tools, auditing, shared folders, data movement, and naming in accordance with ETL best practices, standards, and procedures.
  • Experience in dimensional data modeling using Star and Snowflake schemas.
  • Extensive experience in extraction, transformation, and loading of data from multiple sources into a data warehouse.
  • Proficient with databases such as SQL Server, Oracle, and DB2, as well as flat-file sources.
  • Automated and scheduled Informatica jobs using Control-M and monitored jobs submitted by the production team through Control-M.
  • Proficient in analyzing and translating business requirements to technical requirements and architecture.
  • Competent in team management. Handled multiple roles, including Module Lead and Application Developer.
  • Expertise in tools such as File-AID, Xpediter, Endevor, and ChangeMan.
  • Experience with Clarity, ClearQuest, and RequisitePro for updating documents and tracking projects.
  • Well-versed in IBM utilities/tools, RDBMS, DB2, and SQL. In-depth experience in unit, integration, and end-to-end testing, and in documenting and resolving defects during all testing phases.
  • Excellent analytical and interpersonal skills, with experience interacting with clients and user groups and the ability to work in a fast-paced environment under tight deadlines.
  • Knowledge of data warehousing concepts such as Star schema, Snowflake schema, fact tables, dimension tables, logical data modeling, physical modeling, dimensional data modeling, data profiling, and data cleansing.
  • Worked on the Repository Server Administration Console and Informatica PowerCenter transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Normalizer, Union, and XML Source Qualifier.
  • Extensively used Repository Manager, Workflow Manager, and Workflow Monitor, and worked with Informatica Designer components such as Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
  • Extensive experience using Informatica and SSIS packages to implement ETL methodology for data extraction, transformation, and loading.
  • Expertise in developing mappings, defining workflows and tasks, monitoring sessions, exporting and importing mappings and workflows, and backup and recovery.
  • Extensive experience in extraction, transformation, and loading of data directly from heterogeneous source systems such as flat files (fixed-width and delimited), XML files, COBOL files, VSAM, IBM DB2 UDB, Excel, Oracle, Sybase, MS SQL Server, Teradata, and Netezza.
  • Experience handling unstructured sources in Hadoop, installing and configuring Hadoop and its ecosystem tools, and designing new applications in the Hadoop ecosystem for big data analysis.
  • Ran Informatica workflows as daily batch jobs.
  • Excellent skills in understanding business needs and converting them into technical solutions.
  • Excellent problem-solving skills with a strong technical background; a results-oriented team player with excellent communication and interpersonal skills.
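
As a concrete illustration of the Blob-to-Snowflake loading mentioned in the summary, below is a minimal Python sketch using the snowflake-connector-python package. It assumes an external stage already points at the Azure Blob container; the connection values, stage, and table names are placeholders, not objects from an actual engagement.

```python
# Minimal sketch: load staged Azure Blob files into Snowflake.
# Assumes an external stage (@azure_stage) already points at the Blob
# container; connection values and object names are placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    warehouse="ETL_WH",   # placeholder warehouse
    database="EDW",       # placeholder database
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # COPY INTO pulls each staged CSV file into the target table.
    cur.execute(
        "COPY INTO STAGING.CUSTOMER "
        "FROM @azure_stage/customer/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    print(cur.fetchall())  # one status row per loaded file
finally:
    conn.close()
```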

TECHNICAL SKILLS

Operating System: TSO/ISPF, OS/390, MVS/XA/ESA, Windows

Languages: Easytrieve, Unix shell scripting, Perl, Python, and PL/SQL

Data Warehousing/ETL/Cloud Tools: Azure Data Factory, Informatica PowerCenter (Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager), Erwin for data modeling

Database: Snowflake, SQL Server, Oracle, DB2, VSAM, Teradata

Utilities: UltraEdit, iView, DB Compare, iValidator

Tools: Control-M, RequisitePro, ClearCase, ClearQuest, Informatica Workflow Manager, SQL Visual Studio, MS Visio

Hadoop Ecosystem: Hive, YARN, MapReduce, HDFS, Sqoop, Pig, Hue, Oozie, Flume, Spark, and ZooKeeper

Domain Knowledge: Insurance

PROFESSIONAL EXPERIENCE

Confidential

Senior Azure Data Engineer

Responsibilities:

  • Data analytics and engineering experience across multiple Azure platforms, such as Azure SQL, Azure SQL Data Warehouse, Azure Data Factory, and Azure Storage accounts, for source stream extraction, cleansing, consumption, and publishing across multiple user bases.
  • Created an Azure Data Factory pipeline to insert flat-file data into Azure SQL.
  • Cloud-based report generation, development, and implementation using SCOPE constructs and Power BI. Expert in U-SQL constructs for interacting with multiple source streams within Azure Data Lake.
  • Involved in data analysis; performed data quality checks and prepared a data quality assessment report.
  • Designed source-to-target mapping sheets for data loads and transformations.
  • Developed pipelines to transform data using activities such as U-SQL scripts on Azure Data Lake Analytics.
  • Transformed data using the Hadoop Streaming activity in Azure Data Factory.
  • Developed Informatica mappings to load data from on-premises sources to an Azure cloud database.
  • Loaded JSON input files into Azure SQL Data Warehouse.
  • Developed pipelines in Azure Data Factory using the Copy activity to load data.
  • Designed and developed stored procedures in SQL Server.
  • Developed pipelines in Azure Data Factory that call stored procedures to transform data for reporting and analytics (see the sketch after this list).
  • Developed Power BI reports on top of views in Azure SQL.
  • Scheduled pipelines in Azure Data Factory.
  • Performed daily health checks and monitored batch jobs.
  • Worked extensively on SSIS, ADF, SQL DWH, and Snowflake on Azure.
  • Involved in performance tuning of the process and established platform standards.
  • Tested all applications and transported data to the target SQL/Snowflake data warehouse.
  • Used the Debugger to test data flow between source and target and to fix issues in mappings.
  • Interacted with the offshore team to assign tasks and resolve issues.
  • Prepared the system test job flow document.
  • Involved in unit, integration, and system testing reviews and provided suggestions to the offshore team.
  • Created ad-hoc programs to prepare test data for system and UAT test scenarios.
  • Responsible for UAT acceptance and post-production cycle support.
  • Responsible for converting business requirements into technical requirements, sharing details with the offshore team, and coordinating deliverables.
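
To make the stored-procedure transform step above concrete, here is a minimal Python sketch that invokes such a procedure via pyodbc, the way one might test it outside the ADF Stored Procedure activity. The server, database, and procedure name (usp_refresh_reporting) are hypothetical placeholders.

```python
# Minimal sketch: call a SQL Server transform stored procedure the way an
# ADF Stored Procedure activity would, here via pyodbc for local testing.
# Connection details and dbo.usp_refresh_reporting are placeholders.
import os
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    f"SERVER={os.environ['SQL_SERVER']};"
    "DATABASE=ReportingDW;"  # placeholder database
    f"UID={os.environ['SQL_USER']};PWD={os.environ['SQL_PASSWORD']}",
    autocommit=True,
)
cur = conn.cursor()
# Pass the load date so the procedure transforms only the current batch.
cur.execute("EXEC dbo.usp_refresh_reporting @load_date = ?", "2020-01-31")
conn.close()
```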

Confidential

Senior Developer/ Data Modeler

Responsibilities:

  • Provided technical and user documentation. Analyzed input record layouts against the DW layouts to prepare technical documentation.
  • Worked on data quality issues through in-depth analysis.
  • Worked closely with data modelers and the database architect during design and development of the ETL technical specification document.
  • Participated in gathering user requirements and carried out suitable ETL procedures to develop mappings.
  • Independently responsible for analysis, ETL design, and development.
  • Extensively used mapplets, reusable transformations, parameters, and variables to build data marts.
  • Involved in performance tuning of Informatica mappings using components such as parameter files, variables, and dynamic cache.
  • Tested all applications and transported data to the target warehouse SQL Server and Oracle tables; scheduled and ran extraction and load processes and monitored sessions and batches using Informatica Workflow Manager.
  • Used the Debugger to test data flow between source and target and to fix issues in mappings.
  • Interacted with the offshore team to assign tasks and resolve issues.
  • Prepared the system test job flow document.
  • Involved in unit, integration, and system testing reviews and provided suggestions to the offshore team.
  • Created ad-hoc programs to prepare test data for system and UAT test scenarios (a minimal sketch follows this list).
  • Responsible for UAT acceptance and post-production cycle support.
  • Responsible for converting business requirements into technical requirements, sharing details with the offshore team, and coordinating deliverables.
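
As a sketch of the kind of ad-hoc test-data program mentioned above, the following Python script writes a pipe-delimited flat file of randomized rows for a UAT scenario; the column names, values, and row count are illustrative assumptions only.

```python
# Minimal sketch: generate a pipe-delimited flat file of synthetic
# customer rows for a UAT scenario. Schema and values are placeholders.
import csv
import random

COLUMNS = ["CUST_ID", "STATE", "POLICY_TYPE", "PREMIUM"]

with open("uat_customer_feed.txt", "w", newline="") as f:
    writer = csv.writer(f, delimiter="|")
    writer.writerow(COLUMNS)
    for cust_id in range(1, 501):  # 500 synthetic rows
        writer.writerow([
            cust_id,
            random.choice(["NY", "NJ", "CT"]),
            random.choice(["AUTO", "HOME", "LIFE"]),
            round(random.uniform(100, 2500), 2),
        ])
```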

Confidential

Senior Informatica Developer

Responsibilities:

  • Provided technical and user documentation. Analyzed input record layouts against the DW layouts to prepare technical documentation.
  • Worked on data quality issues through in-depth analysis.
  • Worked closely with data modelers and the database architect during design and development of the ETL technical specification document.
  • Participated in gathering user requirements and carried out suitable ETL procedures to develop mappings.
  • Independently responsible for analysis, ETL design, and development.
  • Worked on data modeling, contributing to the design of the Subscription Data Mart and the staging Customer & Contact model.
  • Extensively used mapplets, reusable transformations, parameters, and variables to build data marts.
  • Involved in performance tuning of Informatica mappings using components such as parameter files, variables, and dynamic cache.
  • Developed UNIX scripts to automate data warehouse loading (see the sketch after this list).
  • Tested all applications and transported data to the target warehouse DB2 tables; scheduled and ran extraction and load processes and monitored sessions and batches using Informatica Workflow Manager.
  • Used the Debugger to test data flow between source and target and to fix issues in mappings.
  • Interacted with the offshore team to assign tasks and resolve issues.
  • Gathered requirements from users and prepared high-level user requirement documents.
  • Prepared functional design documents.
  • Reviewed all deliverables received from third-party associates.
  • Created a unit test environment for all impacted applications.
  • Prepared the system test job flow document.
  • Scheduled the system test jobs for system/UAT testing.
  • Involved in unit, integration, and system testing reviews and provided suggestions to the offshore team.
  • Created ad-hoc programs to prepare test data for system and UAT test scenarios.
  • Responsible for UAT acceptance and post-production cycle support.
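
The load-automation scripts above were UNIX shell; as a hedged illustration, here is an equivalent Python wrapper around Informatica's pmcmd command line. The service, domain, folder, and workflow names are placeholders, and the startworkflow flags follow pmcmd's standard usage but should be verified against the local PowerCenter version.

```python
# Minimal sketch: start a PowerCenter workflow via pmcmd and fail the
# batch on a non-zero exit code. Names and credentials are placeholders.
import os
import subprocess
import sys

cmd = [
    "pmcmd", "startworkflow",
    "-sv", "INT_SVC",        # integration service (placeholder)
    "-d", "Domain_ETL",      # domain (placeholder)
    "-u", os.environ["INFA_USER"],
    "-p", os.environ["INFA_PASSWORD"],
    "-f", "DW_LOADS",        # repository folder (placeholder)
    "-wait",                 # block until the workflow completes
    "wf_daily_dw_load",      # workflow name (placeholder)
]
result = subprocess.run(cmd)
if result.returncode != 0:
    sys.exit(f"pmcmd failed with exit code {result.returncode}")
```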
