Job Seekers, please send resumes to resumes@hireitpeople.com
Job Duties:
- Develop and maintain Python scripts to automate data ingestion and processing tasks for the Analytical Data Warehouse, leveraging Snowflake and Databricks.
- Collaborate with stakeholders to understand data requirements and design database schemas accordingly.
- Use SQL to write efficient queries, perform data manipulation, and optimize database performance.
- Implement ETL processes to load data into the data warehouse, ensuring data quality and integrity.
- Work closely with the Business Intelligence team to support their reporting and analytics needs.
- Troubleshoot and resolve issues related to data pipelines, ETL processes, and database performance.
- Monitor and optimize data pipelines for efficiency and scalability.
- Collaborate with cross-functional teams to integrate data from various sources into the data warehouse.
- Provide technical expertise and guidance to junior members of the team.
- Stay up to date with industry best practices and emerging technologies in data engineering.
- Participate in Agile development processes, including sprint planning, daily stand-ups, and retrospectives.
- Contribute to the design and architecture of the data warehouse infrastructure.
- Document data workflows, processes, and procedures for future reference and knowledge sharing.
- Assist in the evaluation and implementation of new tools and technologies to enhance data processing capabilities.
Education: The minimum qualification required for the performance of the above specialty occupation duties is a bachelor's degree (or foreign equivalent) in Computer Science or a closely related field, with relevant experience.