Job seekers: please send resumes to resumes@hireitpeople.com
Primary Duties and Responsibilities:
- Establish Data Engineering architecture strategy, best practices, standards, and roadmap.
- Experience developing ETL pipelines using Python, Snowflake, and IDMC.
- Experience loading both batch data and streaming data via Kafka.
- Build data flows mapping source systems and process flows.
- Assemble large, complex data sets that meet functional and nonfunctional business requirements.
- Mentor team members on best practices, efficient implementations, and delivery of high-quality data products.
- Lead onshore and offshore teams.
- Perform code reviews and assist developers in optimization and troubleshooting.
- Expertise in advanced Snowflake concepts such as resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, and Time Travel, and an understanding of when to apply each feature.
- Knowledge of AWS and its storage and management services, such as S3.
- Strong communication skills; effective and persuasive in both written and oral communication.
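
The Snowflake features named above (zero-copy cloning, Time Travel, resource monitors, RBAC) can be sketched in a few SQL statements. The database, warehouse, and role names below are hypothetical placeholders, not part of any specific environment:

```sql
-- Zero-copy clone: an instant, storage-free copy of a database for dev/testing.
CREATE DATABASE analytics_dev CLONE analytics_prod;

-- Time Travel: query a table as it existed one hour ago.
SELECT * FROM orders AT(OFFSET => -3600);

-- Resource monitor: suspend a warehouse once it consumes its monthly credit quota.
CREATE RESOURCE MONITOR monthly_cap WITH CREDIT_QUOTA = 100
  TRIGGERS ON 100 PERCENT DO SUSPEND;
ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = monthly_cap;

-- RBAC: grant read-only access on a schema to an analyst role.
GRANT USAGE ON DATABASE analytics_prod TO ROLE analyst;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics_prod.public TO ROLE analyst;
```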