
Data Engineer Resume


SUMMARY

  • 14 years of IT experience across the automotive, financial, healthcare, insurance, and telecom industries.
  • Delivered large-scale data integration, real-time integration, and multi-tenant data warehousing projects; created enterprise data frameworks and solid data architecture solutions for cloud and on-premises environments.
  • Experience in cloud platforms (Azure, AWS, GCP), DevOps, configuration management, infrastructure automation, and continuous integration and delivery (CI/CD).
  • Implemented effective strategies for N-tier application development in both cloud and on-premises environments.
  • Experience in infrastructure development and operations involving AWS cloud services: EC2, EBS, VPC, RDS, SES, ELB, Auto Scaling, CloudFront, CloudFormation, ElastiCache, API Gateway, Route 53, CloudWatch, and SNS.
  • Expertise in building CI/CD on AWS using AWS CodeCommit, CodeBuild, CodeDeploy, and CodePipeline, and experience using AWS CloudFormation, API Gateway, and AWS Lambda to automate and secure infrastructure on AWS.
  • Hands-on experience bootstrapping nodes using Knife and automating testing of Chef recipes and cookbooks with Test Kitchen and ChefSpec; refactored Chef and OpsWorks in the AWS cloud environment.
  • Expert in setting up continuous integration (CI) by configuring build, code, deploy, and test automation jobs in Jenkins for different applications; automated branch and project creation in Git with Groovy in Jenkinsfiles, handled deployments with Codeship, and automated configuration using Ansible.
  • Proficient in deploying and configuring Elasticsearch, Logstash, and Kibana (ELK) and AWS Kinesis for log analytics; skilled in monitoring servers using Nagios, Splunk, AWS CloudWatch, Azure Monitor, and ELK.
  • Experience converting existing AWS infrastructure to a serverless architecture built on AWS Lambda, AWS Kinesis, API Gateway, Route 53, and S3 buckets.
  • Experience creating the methodologies and technologies that depict the flow of data within and between application systems and business functions/operations, including developing data flow diagrams.
  • Extensive working experience with large-scale process flows, relational/MPP databases, dimensional modeling, BI dashboards, and reporting frameworks to build an efficient data integration strategy (data mapping, standardization, and transformation) across multiple systems.
  • Performed the requirements gathering, analysis, design, development, testing, implementation, support, and maintenance phases of data integration projects.
  • Data integration in large-scale implementation environments.
  • Experience working with other IT team members, business partners, data stewards, stakeholders, steering committee members, and executive sponsors on BI and data governance activities.
  • Experience in data profiling methods, data mining, and defining data quality rules.
  • Hands-on experience implementing and supporting the Informatica suite of products.
  • Hands-on experience writing Korn shell scripts, batch scripts, SQL, PL/SQL, T-SQL, and Python to automate ETL loads and to process large, unstructured, and real-time data sets (a minimal Python sketch follows this summary).
  • Working knowledge of processing XML and JSON files.
  • Broad knowledge of technologies and best practices in business analytics, data management and data virtualization.
  • Hands-on experience in infrastructure management including database management and storage management.
  • Experience in validating different enterprise software products and experience with agile development methodologies.
  • Developed and led analytic tools and modeling frameworks to enhance decision-making and optimize technologies to address high-value business problems.
  • Experience working on MPP Databases.
  • Worked with big data systems on the Hadoop platform using Spark, HiveQL, Sqoop, Pig Latin, and Python scripting, as well as NoSQL databases.
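
To illustrate the ETL scripting noted above, here is a minimal Python sketch of an automated load; the file path, table name, and SQLite target are hypothetical placeholders standing in for the actual warehouse connection.

import json
import sqlite3  # stand-in for the real warehouse connection (Oracle/SQL Server/Snowflake)

SOURCE_FILE = "daily_extract.jsonl"   # placeholder: newline-delimited JSON from an upstream system
TARGET_TABLE = "stg_customer_events"  # placeholder staging table

def transform(record):
    """Standardize a raw record into the staging-table column order."""
    return (
        record.get("customer_id"),
        record.get("event_type", "UNKNOWN").upper(),
        record.get("event_ts"),
    )

with open(SOURCE_FILE, encoding="utf-8") as fh:
    rows = [transform(json.loads(line)) for line in fh if line.strip()]

with sqlite3.connect("staging.db") as conn:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_customer_events "
        "(customer_id TEXT, event_type TEXT, event_ts TEXT)"
    )
    conn.executemany(
        "INSERT INTO stg_customer_events (customer_id, event_type, event_ts) VALUES (?, ?, ?)",
        rows,
    )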

PROFESSIONAL EXPERIENCE

Confidential

DATA Engineer

Responsibilities:

  • Worked on the Data Market Place project, a custom-built product that gives users a shopping experience for data.
  • Analyzed and understood data from different domains in order to integrate it into the Data Market Place.
  • Worked with different business units to drive the design and development strategy.
  • Created functional specification and technical design documentation.
  • Coordinated with teams such as Cloud Security, Identity and Access Management, Platform, Network, and DevOps to complete the necessary accreditations and the intake process.
  • Wrote Terraform scripts to automate AWS services including ELB, CloudFront distributions, RDS, EC2, database security groups, Route 53, VPC, subnets, security groups, and S3 buckets, and converted existing AWS infrastructure to AWS Lambda deployed via Terraform and AWS CloudFormation.
  • Implemented the AWS Elastic Container Service (ECS) scheduler to automate application deployment in the cloud using Docker automation techniques.
  • Architected and configured a virtual data center in the AWS cloud to support enterprise data warehouse hosting, including Virtual Private Cloud (VPC), public and private subnets, security groups, and route tables.
  • Designed various Jenkins jobs to continuously integrate the processes and executed the CI/CD pipeline using Jenkins.
  • Worked with API gateways, reports, dashboards, databases, security groups, data science models, and Lambda functions (a minimal handler sketch follows this list).
  • Created automation and deployment templates for relational and NoSQL databases.
  • Used Git for source code version control, integrated Git with Jenkins to support build automation, and integrated with Jira to monitor commits.
  • Created and documented process flow diagrams to share with cross-functional teams.
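
As a companion to the Lambda work above, here is a minimal Python handler sketch for serving dataset metadata through API Gateway; the table name, key schema, and route parameter are illustrative assumptions rather than the project's actual design.

import json
import os

import boto3

# Hypothetical catalog table supplied through the Lambda environment configuration.
TABLE_NAME = os.environ.get("DATASET_TABLE", "data_marketplace_catalog")
table = boto3.resource("dynamodb").Table(TABLE_NAME)

def handler(event, context):
    """Return catalog metadata for one dataset requested via API Gateway proxy integration."""
    dataset_id = (event.get("pathParameters") or {}).get("dataset_id")
    if not dataset_id:
        return {"statusCode": 400, "body": json.dumps({"error": "dataset_id is required"})}

    item = table.get_item(Key={"dataset_id": dataset_id}).get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "dataset not found"})}

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(item, default=str),
    }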

Environment: AWS, Power BI, Tableau 2020.1, PostgreSQL, AngularJS, Python, Snowflake, DynamoDB, Informatica EDC, Axon, Azure, Mainframe, DB2, IMS, MySQL, Salesforce, JIRA, Terraform, Jenkins

Confidential

BI DATA Engineer

Responsibilities:

  • Worked on different projects such as legacy conversion, Open Lane integration, Salesforce, the sales data mart, performance tuning, Whole Car, PWI, and the Cognos audit data model.
  • Responsible for bridging technology and business and for driving the design strategy for data sourcing, data hosting, and provisioning to various surrounding application systems.
  • Involved in all phases of the SDLC (software development life cycle), starting from requirements gathering, and participated in meetings with business users along with business analysts. Converted functional specifications from the BAs into technical specifications and created high-level project documentation covering items such as the point of contact, the required database space, and the SLA (service level agreement).
  • Implemented a CI/CD pipeline with Docker, Jenkins (with the TFS plugin installed), Team Foundation Server (TFS), GitHub, and Azure Container Service; whenever a new TFS/GitHub branch is started, Jenkins, the continuous integration (CI) server, automatically attempts to build a new Docker container from it.
  • Worked with Terraform templates to automate Azure IaaS virtual machines using Terraform modules and deployed virtual machine scale sets in the production environment.
  • Managed Azure infrastructure including Azure Web Roles, Worker Roles, VM Roles, Azure SQL, Azure Storage, and Azure AD licenses; backed up and recovered virtual machines from a Recovery Services vault using Azure PowerShell and the Azure Portal.
  • Wrote templates for Azure infrastructure as code using Terraform to build staging and production environments. Integrated Azure Log Analytics with Azure VMs to monitor and store log files and track metrics, and used Terraform to manage different infrastructure resources across cloud, VMware, and Docker containers.
  • Virtualized servers using Docker for the test and development environments and performed configuration automation using Docker containers; responsible for architecting solutions on Microsoft Azure.
  • Created a process for migrating data to Azure Data Lake and Azure SQL Data Warehouse.
  • Architected and documented Azure SQL Data Warehouse (Synapse Analytics).
  • Created and implemented Azure Data Factory data flow ETL and pipelines.
  • Designed complex ETL processes to load fact tables.
  • Created custom Azure Functions to automate repetitive maintenance tasks.
  • Utilized Azure APIs to automate functionality in Data Lake, Data Factory, and other sources.
  • Set up Azure Repos and Pipelines for CI/CD deployment of objects.
  • Set up security and access to Azure resources with Azure AD and Key Vault.
  • Set up Azure SQL Database and other PaaS and IaaS resources as needed in Azure.
  • Tuned the performance of mappings by following Informatica best practices and applied several methods to achieve the best performance by decreasing workflow run times.
  • Worked extensively with Informatica partitioning when dealing with huge volumes of data and partitioned Oracle tables for optimal performance.
  • Worked on regular database capacity planning related to database growth and system utilization, trend analysis, and predicting future database resource requirements as warehouse data volumes grow.
  • Experience installing and configuring Informatica PowerCenter 9.6.1, Metadata Manager, and Informatica Data Validation Option (DVO).
  • Worked on handling semi-structured data.
  • Processed JSON and Parquet files using Snowflake (a minimal loading sketch follows this list).
  • Created .tde (Tableau Data Extract) objects in Tableau.
  • Experience building Tableau bar charts, text tables, line charts, scatter plots, heat maps, histograms, Gantt charts, pie charts, treemaps, box plots, packed bubble charts, and map views.
  • Created and maintained Informatica users and privileges.
  • Worked on Salesforce target objects and Salesforce lookups for the integration of PWI (Preferred Warranties).
  • Migrated Informatica mappings, sessions, and workflows from Dev and QA to Prod environments, and created documentation for the entire process.
  • Led the efforts for source system analysis, data discovery, enterprise data modeling, custody systems integration, data quality, and automation.
  • Deep understanding of agile methodologies and their challenges; coordinated with the Scrum Master and development/QA teams to ensure resource availability and allocation.
  • Worked with JIRA to follow the agile approach.
  • Documented process flow diagrams and data strategies for informative presentations with cross-functional teams.
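
To make the Snowflake JSON processing above concrete, here is a minimal Python sketch using the Snowflake connector; the connection values, stage, table, and column paths are illustrative assumptions, not the project's actual objects.

import os

import snowflake.connector  # pip install snowflake-connector-python

# All connection values and object names below are placeholders.
conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    warehouse="ETL_WH",
    database="SALES_DM",
    schema="STAGE",
)

try:
    cur = conn.cursor()
    # Land raw JSON into a VARIANT column via the table stage, then query it with path notation.
    cur.execute("CREATE TABLE IF NOT EXISTS RAW_ORDERS (payload VARIANT)")
    cur.execute("PUT file:///tmp/orders.json @%RAW_ORDERS OVERWRITE = TRUE")
    cur.execute("COPY INTO RAW_ORDERS FROM @%RAW_ORDERS FILE_FORMAT = (TYPE = 'JSON')")
    cur.execute("SELECT payload:order_id::STRING, payload:amount::NUMBER FROM RAW_ORDERS LIMIT 10")
    for order_id, amount in cur.fetchall():
        print(order_id, amount)
finally:
    conn.close()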

Environment: Informatica PowerCenter 10.2/9.6.1/9.5/8.6.1/8.5.1, Informatica Data Quality, Snowflake, Informatica Metadata Manager, Informatica DVO (Data Validation Option), Informatica Proactive Monitoring, ER Studio 9.1, Azure, AWS, Windows XP, Unix, Oracle 11g, SQL Server 2012/2014, SQL*Loader, IBM Mainframe, SQL Navigator, SQL, PL/SQL, Korn shell, Appworx, Toad, MS Office tools, flat files, IBM AIX 4.2, Cognos 10.2.1, Report Studio, Query Studio, Metric Designer, Framework Manager, T-SQL, SQL Agent, Tableau 10, JIRA, Terraform

Confidential

BI Consultant

Responsibilities:

  • Worked with business users to understand the requirements for Elavon Scoreboard and Collections.
  • Analyzed data coming from source systems such as TCS, DIALLER, SWITCH, and TRIAD.
  • Designed the table structure based on the collector data and credit card data provided by the users.
  • Designed and normalized the table structure to conform to third normal form for TCS files, master files, and ICS data, which is the current-to-date data for credit card customers.
  • Designed the data model and created several facts and dimensions based on the reporting needs of the projects.
  • Worked closely with Elavon Financial Services and coordinated with the team on creating reports for Merchant Connect and Small Business by providing a data mart for Elavon.
  • Designed several complex ETL processes to load incoming files from ICS, TCS, Switch, Triad, mainframes, and Informix into the pre-staging and staging areas (a minimal loading sketch follows this list).
  • Built out best practices for data staging, data cleansing, and data transformation routines within the Informatica solution.
  • Led the team of ETL developers by distributing the modules.
  • Explained the requirements to the team of developers by handing over design and source-to-target documentation.
  • Created base tables and staging tables based on the data model and the number of source systems.
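
For the file-to-staging loads described above, here is a minimal Python sketch using cx_Oracle; the file layout, delimiter, staging table, and connection settings are hypothetical placeholders standing in for the actual ICS/TCS feeds.

import csv
import os

import cx_Oracle  # pip install cx_Oracle

SOURCE_FILE = "tcs_daily.txt"  # placeholder: pipe-delimited extract from a source system
INSERT_SQL = (
    "INSERT INTO STG_TCS_ACCOUNTS (account_no, customer_name, balance, file_date) "
    "VALUES (:1, :2, :3, TO_DATE(:4, 'YYYY-MM-DD'))"
)

def read_rows(path):
    """Parse the delimited file into tuples matching the staging-table columns."""
    with open(path, newline="", encoding="utf-8") as fh:
        for rec in csv.reader(fh, delimiter="|"):
            yield rec[0].strip(), rec[1].strip(), float(rec[2]), rec[3].strip()

conn = cx_Oracle.connect(
    user=os.environ["ORA_USER"],
    password=os.environ["ORA_PASSWORD"],
    dsn=os.environ["ORA_DSN"],  # e.g. "dbhost:1521/edwprd"
)
try:
    cur = conn.cursor()
    cur.executemany(INSERT_SQL, list(read_rows(SOURCE_FILE)))
    conn.commit()
finally:
    conn.close()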

Environment: Informatica PowerCenter 8.6.1/8.5.1, Informatica PowerAnalyzer, Informatica Metadata Manager, Informatica PowerExchange, Informatica Data Profiler, Erwin 4.5, Oracle 11g, DB2 9.5/8.2, SQL*Loader, IBM Mainframe, SQL Navigator, SQL, PL/SQL, Korn shell, SQL Server 2005/2000, Appworx, Toad, MS Office tools, flat files, IBM AIX 4.2, Cognos 8.4/8.1, Report Studio, Query Studio, Metric Designer, Framework Manager, SQL Server 2008 R2, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), T-SQL, Microsoft Office SharePoint Server, MS Access 2010, Subversion Source Control.
