Job Seekers: please send resumes to resumes@hireitpeople.com
Must Have Skills:
- 8+ years of development and administration experience with AWS, Kafka, and Snowflake
Detailed Job Description:
- Configure replication software (e.g., Attunity Replicate, StreamSets) to capture CDC from OSS and BSS systems into AWS S3 and Snowflake/Redshift
- Build streaming data pipelines from Kafka/Kinesis to AWS S3 and Snowflake/Redshift
- Build an API framework covering API ingestion use cases into AWS and Snowflake/Redshift
- Build batch data pipelines per end-user requirements
- Build a pipeline for tokenizing data in AWS S3 using Databricks
- Build a data curation framework on AWS S3 using Databricks
- Build a data curation framework on Snowflake/Redshift (e.g., using Snowpipe, streams, and tasks)
- Unit test data from OSS, BSS, and other sources using the above frameworks
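As a rough illustration of the tokenization duty listed above, the sketch below tokenizes sensitive fields in a record before it lands in S3. The HMAC scheme, field names, key handling, and record shape are all illustrative assumptions, not requirements from this posting; a production pipeline would run a similar transform at scale in Databricks with a key sourced from a secrets manager.

```python
import hmac
import hashlib

# Illustrative assumptions: the key would come from a secrets manager,
# and the sensitive field list from a data-classification policy.
SECRET_KEY = b"rotate-me-via-a-real-vault"
SENSITIVE_FIELDS = {"customer_id", "msisdn"}

def tokenize_value(value: str) -> str:
    """Map a sensitive value to a stable, irreversible token."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

def tokenize_record(record: dict) -> dict:
    """Return a copy of the record with sensitive fields tokenized."""
    return {
        k: tokenize_value(v) if k in SENSITIVE_FIELDS else v
        for k, v in record.items()
    }

record = {"customer_id": "C-1001", "msisdn": "15551234567", "plan": "5G-unlimited"}
tokenized = tokenize_record(record)
```

Because the tokens are keyed HMAC digests, the same input always yields the same token (so joins across datasets still work), while the original value cannot be recovered without the key.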
Minimum years of experience: 5+
Certifications: No
Top 3 responsibilities you would expect the Subcon to shoulder and execute:
- Build streaming data pipelines from Kafka/Kinesis to AWS S3 and Snowflake/Redshift
- Build a data curation framework on Snowflake/Redshift
- Build a pipeline for tokenizing data in AWS S3 using Databricks
Interview Process (Is face-to-face required?): No
Does this position require Visa independent candidates only? No