Job ID: 41291
Company: Internal Postings
Location: Trenton, NJ
Type: Contract
Duration: 12 Months
Salary: DOE
Status: Active
Openings: 1
Posted: 04 Feb 2025
Job seekers, please send resumes to resumes@hireitpeople.com

Job Description:

  • Design, implement, and optimize data solutions using the Snowflake cloud data platform.
  • This role is pivotal in architecting data warehouses, data lakes, and ETL processes that enable efficient data storage, processing, and analytics for the DOE.

Job Responsibilities:

Data Architecture:

  • Collaborate with data architects to design and develop Snowflake data models and schemas.
  • Create and maintain a well-structured data warehouse and data lake architecture.
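
A warehouse layout for this kind of work might be sketched as below; the database, schema, and table names are purely illustrative, not part of the posting.

```sql
-- Illustrative layout: one database with a raw landing schema and a
-- modeled analytics schema holding a simple star-schema pair.
CREATE DATABASE IF NOT EXISTS edu_dw;
CREATE SCHEMA IF NOT EXISTS edu_dw.raw;        -- landing zone for ingested data
CREATE SCHEMA IF NOT EXISTS edu_dw.analytics;  -- query-ready, modeled layer

CREATE TABLE IF NOT EXISTS edu_dw.analytics.dim_school (
    school_id   NUMBER AUTOINCREMENT PRIMARY KEY,
    school_name STRING,
    district    STRING
);

CREATE TABLE IF NOT EXISTS edu_dw.analytics.fact_enrollment (
    school_id       NUMBER REFERENCES edu_dw.analytics.dim_school (school_id),
    enrollment_date DATE,
    student_count   NUMBER
);
```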

Data Integration:

  • Develop ETL (Extract, Transform, Load) processes to ingest data from various sources into Snowflake.
  • Ensure data integration processes are efficient, reliable, and scalable.
  • Design and implement data pipelines using Snowflake features like tasks and streams.
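
A minimal stream-plus-task pipeline of the kind described above could look like this (warehouse, table, and stream names are hypothetical):

```sql
-- A stream captures row changes on the landing table; a scheduled task
-- merges those changes into the modeled table whenever the stream has data.
CREATE OR REPLACE STREAM raw.enrollment_stream ON TABLE raw.enrollment_landing;

CREATE OR REPLACE TASK raw.load_enrollment
  WAREHOUSE = etl_wh
  SCHEDULE  = '15 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw.enrollment_stream')
AS
  MERGE INTO analytics.fact_enrollment t
  USING raw.enrollment_stream s
    ON t.school_id = s.school_id AND t.enrollment_date = s.enrollment_date
  WHEN MATCHED THEN UPDATE SET t.student_count = s.student_count
  WHEN NOT MATCHED THEN INSERT (school_id, enrollment_date, student_count)
    VALUES (s.school_id, s.enrollment_date, s.student_count);

ALTER TASK raw.load_enrollment RESUME;  -- tasks are created in a suspended state
```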

Performance Optimization:

  • Optimize query performance by creating and maintaining appropriate clustering keys, materialized views, and search optimization configurations.
  • Identify and resolve performance bottlenecks in data processing.
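
The usual Snowflake tuning levers are clustering keys and materialized views; a sketch (table names are illustrative, and note that Snowflake materialized views must be defined over a single table):

```sql
-- Co-locate micro-partitions by date so date-filtered scans prune well.
ALTER TABLE analytics.fact_enrollment CLUSTER BY (enrollment_date);

-- Check how well-clustered the table is on that key.
SELECT SYSTEM$CLUSTERING_INFORMATION('analytics.fact_enrollment');

-- Precompute a common aggregate as a single-table materialized view.
CREATE OR REPLACE MATERIALIZED VIEW analytics.mv_daily_enrollment AS
  SELECT enrollment_date, SUM(student_count) AS students
  FROM analytics.fact_enrollment
  GROUP BY enrollment_date;
```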

SQL Development:

  • Write complex SQL queries, stored procedures, and user-defined functions (UDFs) to support data analytics and reporting needs.
  • Ensure SQL code follows best practices for readability and performance.
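
A scalar SQL UDF and a small Snowflake Scripting stored procedure, as examples of the objects this duty covers (names and logic are hypothetical):

```sql
-- Safe-percentage UDF: returns NULL instead of dividing by zero.
CREATE OR REPLACE FUNCTION analytics.pct(numerator NUMBER, denominator NUMBER)
RETURNS NUMBER
AS
$$
  IFF(denominator = 0, NULL, ROUND(100.0 * numerator / denominator, 2))
$$;

-- Procedure that deletes rows older than a caller-supplied retention window.
CREATE OR REPLACE PROCEDURE analytics.purge_old_rows(days NUMBER)
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
  DELETE FROM analytics.fact_enrollment
   WHERE enrollment_date < DATEADD(day, -days, CURRENT_DATE());
  RETURN 'purged';
END;
$$;
```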

Security and Access Control:

  • Implement and manage security measures, including role-based access control (RBAC) and data encryption, to protect sensitive data.
  • Audit and monitor data access and user activities.
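
In Snowflake terms, RBAC plus auditing might be sketched as follows (role, user, and object names are illustrative):

```sql
-- Read-only analyst role granted only the privileges it needs.
CREATE ROLE IF NOT EXISTS analyst_ro;
GRANT USAGE  ON DATABASE edu_dw          TO ROLE analyst_ro;
GRANT USAGE  ON SCHEMA   edu_dw.analytics TO ROLE analyst_ro;
GRANT SELECT ON ALL TABLES    IN SCHEMA edu_dw.analytics TO ROLE analyst_ro;
GRANT SELECT ON FUTURE TABLES IN SCHEMA edu_dw.analytics TO ROLE analyst_ro;
GRANT ROLE analyst_ro TO USER jdoe;

-- Audit recent data access via the ACCOUNT_USAGE views.
SELECT user_name, query_text, start_time
FROM snowflake.account_usage.query_history
WHERE start_time > DATEADD(day, -7, CURRENT_TIMESTAMP());
```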

Data Quality Assurance:

  • Define and implement data quality checks and validation processes to maintain data accuracy.
  • Establish data quality rules and alerts to proactively identify issues.
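
One simple shape such a validation check can take (the rules and tables are hypothetical; in practice a query like this would run as a scheduled task writing to an audit table):

```sql
-- Count rows that violate basic quality rules: negative counts,
-- missing dates, or orphaned foreign keys.
SELECT 'fact_enrollment' AS table_name, COUNT(*) AS bad_rows
FROM analytics.fact_enrollment
WHERE student_count < 0
   OR enrollment_date IS NULL
   OR school_id NOT IN (SELECT school_id FROM analytics.dim_school);
```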

Documentation:

  • Create and maintain technical documentation for data models, ETL processes, and data dictionaries.
  • Document best practices, standards, and guidelines for Snowflake development.

Version Control and Deployment:

  • Use version control systems (e.g., Git) for managing Snowflake SQL scripts and objects.
  • Coordinate the deployment of changes to Snowflake environments.

Monitoring and Alerts:

  • Set up monitoring and alerting for Snowflake environments to proactively detect and respond to issues.
  • Troubleshoot and resolve incidents related to data processing and performance.
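
Within Snowflake itself, a resource monitor is one built-in alerting mechanism (quota and names are illustrative):

```sql
-- Notify at 80% of the monthly credit quota; suspend the warehouse at 100%.
CREATE OR REPLACE RESOURCE MONITOR etl_monitor
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS
    ON 80  PERCENT DO NOTIFY
    ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = etl_monitor;
```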

Backup and Recovery:

  • Implement backup and recovery strategies to ensure data integrity and availability.
  • Develop and test data recovery procedures.
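
In Snowflake, Time Travel and zero-copy cloning cover most of these recovery scenarios; a sketch with illustrative names:

```sql
-- Extend Time Travel retention on a critical table (Enterprise edition
-- supports up to 90 days).
ALTER TABLE analytics.fact_enrollment SET DATA_RETENTION_TIME_IN_DAYS = 30;

-- Recover the table's state from one hour ago into a new table.
CREATE OR REPLACE TABLE analytics.fact_enrollment_restored
  CLONE analytics.fact_enrollment AT (OFFSET => -3600);

-- Restore an accidentally dropped table.
UNDROP TABLE analytics.fact_enrollment;
```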

Collaboration:

  • Collaborate with data engineers, data scientists, and business analysts to understand data requirements and provide data solutions.
  • Work with cross-functional teams to support data-related projects and initiatives.

Qualifications:

  • Bachelor's or master's degree in computer science, data engineering, or a related field.
  • 7+ years of experience as a Snowflake developer or data engineer with a focus on data warehousing and ETL.

  • Snowflake certification(s) is a plus.

  • Strong SQL skills and proficiency in data modeling and database design.
  • Knowledge of cloud data warehousing concepts and best practices.
  • Familiarity with data integration tools and technologies.
  • Solid understanding of data governance, data security, and compliance requirements.
  • Experience with version control systems and deployment processes.
  • Excellent problem-solving and troubleshooting skills.
  • Strong communication and collaboration abilities.
  • Ability to work in an Agile or iterative development environment.