Job Seekers, please send resumes to resumes@hireitpeople.com
Job Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes on Google Cloud Platform (GCP).
- Implement data ingestion, transformation, and storage solutions using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Collaborate with cross-functional teams to gather requirements and design data models that meet business needs.
- Optimize data pipelines for performance, reliability, and cost-effectiveness.
- Ensure data quality and integrity throughout the data lifecycle by implementing data validation and monitoring processes.
- Troubleshoot and resolve issues related to data pipelines, infrastructure, and performance.
- Stay up to date with industry trends and best practices in data engineering and GCP technologies.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Active certification in Google Cloud Platform (GCP), such as Google Certified Professional Data Engineer.
- Proven experience in designing and implementing data pipelines and ETL processes.
- Strong proficiency in GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Experience with programming languages such as Python, Java, or Scala.
- Familiarity with data modeling concepts and database technologies.
- Excellent problem-solving skills and attention to detail.
- Effective communication and collaboration skills.