Job seekers, please send resumes to resumes@hireitpeople.com

Must Have Skills:
- Python, Spark
- Snowflake
- AWS, Docker, Kubernetes
- Proficiency in script development using Python and Spark
- Experience with data pipeline and workflow management using AWS-based cloud-native tools such as AWS Glue, S3, Kinesis, Lambda, Step Functions (state machines), CloudWatch, Athena, and Redshift
- Experience with continuous delivery automation and the integration of cloud-native services to create fully functioning, cohesive delivery pipelines
- Experience with tools such as Jenkins and Bitbucket
- Experience with AWS build tools, including AWS CodeCommit, AWS CodeBuild, and AWS CodePipeline
- Hands-on experience building containers and automating container orchestration (deployment, management, and scaling) using tools such as Docker, Kubernetes, and Amazon ECS/EKS
- Demonstrated expertise in Snowflake data modeling and ELT using Snowflake SQL, including implementing complex stored procedures and applying data warehouse and ETL best practices
- Experience delivering data migration projects in AWS and Snowflake
- Experience with open-source web services environments (Java, REST, and SOAP)
- Experience with traditional and cloud-based API development
Minimum years of experience: 8-10 years
Certifications Needed: No
Top 3 responsibilities you would expect the Subcon to shoulder and execute:
- Script development.
- Creation of data pipelines and workflow management.
- Work on continuous delivery automation and the integration of cloud-native services to create fully functioning, cohesive delivery pipelines.
Interview Process (Is face to face required?): No
Does this position require Visa independent candidates only? No