
Lead Data Engineer / Big Data Cloud Solution Architect Resume


SUMMARY

  • Over 16 years of strong experience in all aspects of software development methodology, including gathering and analyzing system requirements and designing and developing systems, with very good experience in all phases of the Software Development Life Cycle (SDLC).
  • 6 years of experience as a Big Data Hadoop Lead/Architect, with end-to-end experience developing applications in the Hadoop ecosystem.
  • Experience as an AWS and Azure cloud solution architect.
  • Experience with cloud-based architecture, data management, and integration.
  • Experience in creating data pipelines and Spark Streaming applications with Java, Scala, Python, PySpark, Kafka, and Unix shell scripting.
  • Worked on data modeling, DWH tools, and NoSQL databases.
  • Experience in understanding clients' Big Data business requirements and translating them into Hadoop-centric solutions.
  • Experience in designing architecture for Big Data applications.
  • Worked on creating Hive and Impala scripts to analyze the data efficiently.
  • Hands on NoSQL database experience with HBase and Cassandra
  • Experience with Apache Hadoop components such as HDFS, MapReduce, HiveQL, Impala, and Pig, and with Big Data analytics.
  • Experience in integrating Hadoop with Informatica BDM and Talend, and in building the pipelines for processing data.
  • Imported and exported data between RDBMS and HDFS/Hive using Sqoop.
  • Professionally certified and experienced in the Dell Boomi integration tool.
  • Worked closely with Tableau, Power BI, and Qlik BI reporting teams.
  • Worked on the Snowflake data warehouse.
  • Good working knowledge of development tools such as Eclipse IDE, PyCharm, Toad, and Oracle SQL Developer.
  • Experience in Scrum, Agile and Waterfall models.
  • Experience in ETL, data warehousing, and data integration projects using the Pentaho tool.
  • Strong experience in database design, writing complex SQL Queries and Stored Procedures.
  • Working knowledge of databases such as Oracle 8i/9i/10g, Microsoft SQL Server, and DB2.
  • Possess excellent communication, interpersonal, and analytical skills along with a positive attitude.

TECHNICAL SKILLS

Big Data Ecosystems: Apache Hadoop, YARN, Hue, HDFS, Hive, Pig, Impala, Sqoop, Oozie, Kafka, Spark, Databricks, ADF

Languages: Java, C#.NET, Python, Scala

NoSQL: HBase, Cassandra

Cloud: AWS, Azure

ETL Tools: Pentaho, Talend, Informatica BDM

Integration tools: Dell Boomi

RDBMS/Databases: SQL Server, Oracle 11g, and MySQL

Version Control: VSS, GitHub, SVN

Operating Systems: Windows, Unix, Linux

Data Warehouse: Snowflake

PROFESSIONAL EXPERIENCE

Confidential

Lead Data Engineer / Big Data Cloud Solution Architect

Responsibilities:

  • Contributed to design-related activities and reviewed complex designs for scalability, performance, and maintainability.
  • Migrated data from on-premises systems to AWS S3 and Azure Blob Storage.
  • Worked on Data ingestion patterns.
  • Handled team and client interfacing for complex reviews; conducted requirement gathering, status, technical review, and design meetings.
  • Provided regular status reports to management on application status and other metrics.
  • Worked with developers and business customers to ensure that customer requirements are correctly translated to appropriate data design specifications.
  • Performed design reviews, quality reviews and performance reviews.
  • Developed PySpark code to perform data transformations and insertions.
  • Optimized PySpark jobs for faster data processing.
  • Designed and implemented ETL pipelines using PySpark on AWS EMR (a sketch follows this list).
  • Developed Airflow DAGs for workflow orchestration (a DAG sketch also follows this list).
  • Worked on CI/CD and Terraform.
  • Worked on Docker and Kubernetes
  • Retrieved data using Athena for the client success team.
  • Interacted with teams like infrastructure, release management, change management, QA and Client success team.
  • Worked on AWS Data Pipeline and Redshift.
  • Completed a PoC on AWS Glue.
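
A minimal sketch of the kind of PySpark transformation-and-load step described above, as it might run on AWS EMR (bucket paths, dataset, and column names are hypothetical illustrations, not the actual project objects):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Read raw data landed in S3 (path and schema are illustrative).
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Basic cleansing and derivation of a partition column.
transformed = (
    orders
    .filter(F.col("order_status").isNotNull())
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("order_amount", F.col("order_amount").cast("decimal(18,2)"))
)

# Write the curated layer back to S3, partitioned by date for downstream queries.
(
    transformed
    .repartition("order_date")
    .write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/orders/")
)
```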
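
The Airflow DAGs mentioned above scheduled jobs of this kind; a minimal sketch using standard Airflow 2.x operators (the DAG id, schedule, and spark-submit command are hypothetical):

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {"owner": "data-eng", "retries": 1, "retry_delay": timedelta(minutes=5)}

with DAG(
    dag_id="orders_daily_etl",              # hypothetical DAG name
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    # Submit the PySpark job to the EMR cluster (command is illustrative).
    run_etl = BashOperator(
        task_id="run_orders_etl",
        bash_command="spark-submit --deploy-mode cluster s3://example-bucket/jobs/orders_etl.py",
    )
```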

Confidential, NE, Omaha

Lead Data Engineer / Big Data Cloud Solution Architect

Responsibilities:

  • Gathered requirements from multiple business users and prepared mapping documents for business logic.
  • Created PowerShell script runbooks to automate creating and configuring AWS and Azure clusters.
  • Migrated data from on premises to AWS S3 and Azure Blob storage.
  • Contributed to design-related activities and reviewed complex designs for scalability, performance, and maintainability.
  • Worked on Azure Storage and Azure Data Lake Storage Gen2.
  • Worked with developers and business customers to ensure that customer requirements are correctly translated to appropriate data design specifications.
  • Worked with Talend for ETL processes.
  • Performed design reviews, quality reviews and performance reviews.
  • Designed HBase tables to store all required details.
  • Worked on Cloud data warehouse technologies like Snowflake, Redshift (PoC)
  • Developed Spark and Scala code to perform transformations and insert data into HBase tables.
  • Created Hive tables and ingested data into multiple zones.
  • Implemented partitioning, dynamic partitioning, and bucketing in Hive (a sketch follows this list).
  • Prepared reports for the visualization team.
  • Interacted with teams like infrastructure, release management, QA and DBA.
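
A minimal sketch of the dynamic-partition load pattern referenced above, issued through Spark SQL with Hive support (database, table, and column names are hypothetical; bucketing would be declared in the Hive DDL with a CLUSTERED BY ... INTO n BUCKETS clause):

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive_dynamic_partition_load")
    .enableHiveSupport()
    .getOrCreate()
)

# Allow the partition value to be derived from the data itself.
spark.sql("SET hive.exec.dynamic.partition = true")
spark.sql("SET hive.exec.dynamic.partition.mode = nonstrict")

# Partitioned target table in the curated zone (names are illustrative).
spark.sql("""
    CREATE TABLE IF NOT EXISTS curated.claims (
        claim_id     STRING,
        member_id    STRING,
        claim_amount DECIMAL(18,2)
    )
    PARTITIONED BY (claim_date STRING)
    STORED AS ORC
""")

# Dynamic-partition insert from the staging zone into the curated zone.
spark.sql("""
    INSERT OVERWRITE TABLE curated.claims PARTITION (claim_date)
    SELECT claim_id, member_id, claim_amount, claim_date
    FROM staging.claims_raw
""")
```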

Confidential, NJ

Big Data Lead Consultant / Technology Architect

Responsibilities:

  • Developed Spark and Java code to perform transformations and insert data into HBase and Cassandra tables (a sketch of the Cassandra write pattern follows this list).
  • Involved in the development of all modules and led the offshore team.
  • Involved in code review and performance tuning activities.
  • Prepared mapping documents for business logic.
  • Contributed to design-related activities and reviewed complex designs for scalability, performance, and maintainability.
  • Handled the team and client interfacing for complex reviews.
  • Designed Cassandra tables to store all required details.
  • Created Hive tables and ingested data into BDR.
  • Conducted requirement gathering meetings, status meetings, technical review meetings and Design meetings.
  • Interacted with teams like infrastructure, release management, change management, QA, DBA and application.
  • Worked with developers and business customers to ensure that customer requirements are correctly translated to appropriate data design specifications.
  • Provided regular status reports to management on application status and other metrics.
  • Performed design reviews, quality reviews and performance reviews.
  • Used Informatica BDM for transforming and moving data from the ASIS layer through the Govern, Gold, and Provision layers.
  • Implemented partitioning, dynamic partitioning, and bucketing in Hive.
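
A minimal PySpark sketch of the Spark-to-Cassandra insert pattern described above (the project code was written in Java and Scala; the keyspace, table, host, and column names here are hypothetical, and the DataStax spark-cassandra-connector is assumed to be on the classpath):

```python
from pyspark.sql import SparkSession, functions as F

# Assumes the connector was supplied at submit time, e.g.
#   spark-submit --packages com.datastax.spark:spark-cassandra-connector_2.12:<version> ...
spark = (
    SparkSession.builder
    .appName("cassandra_load")
    .config("spark.cassandra.connection.host", "cassandra-host")  # hypothetical host
    .getOrCreate()
)

# Read a source extract and shape it to the target table's columns.
accounts = spark.read.parquet("/data/raw/accounts/")  # illustrative path
curated = (
    accounts
    .withColumn("load_ts", F.current_timestamp())
    .select("account_id", "customer_id", "balance", "load_ts")
)

# Append into a (hypothetical) keyspace and table through the connector's DataSource API.
(
    curated.write
    .format("org.apache.spark.sql.cassandra")
    .options(keyspace="banking", table="accounts_by_id")
    .mode("append")
    .save()
)
```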

Confidential, NJ

Big Data Lead Consultant / Technology Architect

Responsibilities:

  • Gathered requirements from multiple product processor teams.
  • Designed HBase tables to store all transaction details.
  • Developed Spark Streaming code to consume Kafka topics (a sketch follows this list).
  • Conducted requirement gathering meetings, status meetings, technical review meetings and Design meetings.
  • Prepared mapping document for multiple product processors.
  • Performed design reviews, quality reviews and performance reviews.
  • Worked with developers and business customers to ensure that customer requirements are correctly translated to appropriate data design specifications.
  • Provided technical support for production cycle jobs and troubleshot issues.
  • Interacted with teams like infrastructure, release management, change management, QA, DBA and application.
  • Provided regular status reports to management on application status and other metrics.
  • Developed Spark and Scala code to perform transformations and insert data into HBase tables.
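
A minimal Structured Streaming sketch of consuming a Kafka topic as described above (the original implementation was in Scala; the broker, topic, and path names are hypothetical, and the spark-sql-kafka package is assumed to be available):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("txn_stream").getOrCreate()

# Subscribe to a (hypothetical) transactions topic.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "transactions")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast the payload to string before parsing.
events = raw.select(F.col("value").cast("string").alias("payload"))

# Land the stream for downstream loads (sink and paths are illustrative; the
# production job wrote into HBase instead).
query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/streams/transactions/")
    .option("checkpointLocation", "/data/checkpoints/transactions/")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```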

Confidential, CA

Big Data Lead Consultant / Technology Architect

Responsibilities:

  • Conducted requirement gathering meetings, status meetings, technical review meetings and Design meetings.
  • Performed design reviews, quality reviews and performance reviews.
  • Worked with developers and business customers to ensure that customer requirements are correctly translated to appropriate data design specifications.
  • Provided technical support for production cycle jobs and troubleshot issues.
  • Interacted with teams like infrastructure, release management, change management, QA, DBA and application.
  • Provided regular status reports to management on application status and other metrics.
  • Implemented HBase tables to store trade statistics reports.
  • Developed Spark and Scala code to perform transformations and insert data into HBase tables.
  • Implemented partitioning, dynamic partitioning, and bucketing in Hive.
  • Developed a Java API and Sqoop jobs for handling special characters.
  • Implemented HiveQL for inserting data from the raw layer to the stage layer.
  • Implemented Spark SQL queries for processing data from the stage layer to the processed layer (a sketch follows this list).
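
A minimal sketch of the stage-to-processed Spark SQL step referenced above (database, table, and column names are hypothetical stand-ins for the trade data model):

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("stage_to_processed")
    .enableHiveSupport()
    .getOrCreate()
)

# Aggregate stage-layer trades into the processed layer used for reporting.
spark.sql("""
    INSERT OVERWRITE TABLE processed.trade_stats
    SELECT
        trade_date,
        symbol,
        COUNT(*)          AS trade_count,
        SUM(trade_amount) AS total_amount
    FROM stage.trades
    GROUP BY trade_date, symbol
""")
```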

Confidential, RI

Big Data Lead Consultant / Technology Architect

Responsibilities:

  • Conducted requirement gathering meetings, status meetings, technical review meetings and Design meetings.
  • Performed design reviews, quality reviews and performance reviews.
  • Worked with developers and business customers to ensure that customer requirements are correctly translated to appropriate data design specifications.
  • Provided technical support for production cycle jobs and troubleshot issues.
  • Interacted with teams like infrastructure, release management, change management, QA, DBA and application.
  • Provided regular status reports to management on application status and other metrics.
  • Implemented partitioning, dynamic partitioning, and bucketing in Hive.
  • Developed Hive queries to process the data and generate reports for the PMA team.
  • Developed Avro-format Sqoop jobs for handling special characters.

Confidential, TX

Lead Consultant / Sr. Software Developer

Responsibilities:

  • Involved in code development and code review.
  • Involved in performance tuning actions
  • Involved in deployment activities
  • Coordinated with other developers and software professionals
  • Streamlined deployment of the application on test and production servers.
  • Performed complex analysis, designing and programming to meet business requirements.
  • Developed stored procedures, functions, views, triggers on SQL Server to accomplish the desired functionalities

Confidential, OH

Associate / Application Developer

Responsibilities:

  • Involved in code development and code review.
  • Built flexible data models and seamless integration points
  • Involved in performance tuning actions
  • Involved in deployment activities
  • Coordinated with other developers and software professionals
  • Streamlined deployment of the application on test and production servers.
  • Performed complex analysis, designing and programming to meet business requirements.
  • Closely involved with the database team in developing stored procedures, functions, views, triggers on SQL Server to accomplish the desired functionalities

Environment: SQL Server, Oracle, C#, Java, JavaScript, Web Services, VSS

Confidential

Lead Consultant

Responsibilities:

  • Integrated components across the project's key modules.
  • Involved in development and support
  • Engaged in code review and testing
  • Involved in documentation activities
  • Involved in performance tuning actions
  • Involved in supporting other developers throughout the project phase
  • Involved in deployment activities

Environment: Java, Oracle, jQuery, C#.NET, SOAP UI, Web Services

Confidential

Sr. Software Developer

Responsibilities:

  • Engaged in coding, review and testing
  • Responsible for report generation
  • Integrated components across the project's key modules.
  • Supported other developers throughout the project phase
  • Involved in deployment document creation

Environment: Java, C#.NET, jQuery, and Oracle 10g
