Job ID :
28364
Company :
Internal Postings
Location :
San Francisco, CA
Type :
Contract
Duration :
6 Months
Salary :
DOE
Status :
Active
Openings :
1
Posted :
28 Sep 2020
Job Seekers, please send resumes to resumes@hireitpeople.com
Must Have Skills (Top 3 technical skills only)*:
  1. Azure Data Factory (ADF) Architect
  2. Databricks
  3. Spark
Detailed Job Description:
  • Responsible for the build, deployment, onboarding, and operations of multi-tenant data platform components/tools.
  • Experience building cloud-native, production-grade enterprise data lake(s) leveraging streaming/micro-batch and batch processing engines
  • Experience with best practices, design, and operational practices for data platform components and capabilities, preferably in an Azure ecosystem
  • Experience with the design, deployment, and operations of elastic Spark clusters (preferably Azure Databricks)
  • Experience with, and working knowledge of, the design and development of Databricks Delta Lake on a cloud data lake (preferably ADLS Gen2); see the sketch after this list
  • Experience with the design, deployment, integration and operational practices of Data Catalog and Governance tooling (such as Azure Data Catalog, Collibra, Alation)
  • Experience building and deploying production-grade data pipelines with Azure Data Factory.
  • Automation of deployment, configuration and operations of Cloud PaaS data services (preferably Azure)
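
For illustration only, here is a minimal PySpark sketch of the kind of micro-batch ingestion into a Databricks Delta Lake table on ADLS Gen2 referenced above. It assumes a recent Databricks Runtime (Spark 3.3 or later); the storage account, container names, paths, and schema are hypothetical placeholders, not details of this engagement.

    # Minimal micro-batch ingestion sketch: land raw JSON files as a Delta table on ADLS Gen2.
    # All names below (storage account, containers, schema) are hypothetical placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # provided by the Databricks Runtime

    RAW_PATH = "abfss://raw@examplelake.dfs.core.windows.net/events"        # hypothetical source
    DELTA_PATH = "abfss://curated@examplelake.dfs.core.windows.net/events"  # hypothetical target

    # Read newly arrived JSON files as a stream with an explicit schema.
    stream = (
        spark.readStream
        .format("json")
        .schema("event_id STRING, event_ts TIMESTAMP, payload STRING")
        .load(RAW_PATH)
    )

    # Append each micro-batch to the Delta table, then stop once the backlog is processed.
    query = (
        stream.writeStream
        .format("delta")
        .outputMode("append")
        .option("checkpointLocation", DELTA_PATH + "/_checkpoints")
        .trigger(availableNow=True)
        .start(DELTA_PATH)
    )
    query.awaitTermination()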

Minimum years of experience*: 12+

Certifications Needed: No

Top 3 responsibilities you would expect the Subcon to shoulder and execute*:

  1. Participate in building architecture and design
  2. Develop, test, and deploy data integration solutions (see the sketch after this list)
  3. Provide data management knowledge to deliver the most architecturally efficient solutions
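
As a hedged illustration of the deployment-automation side of these responsibilities, the sketch below uses the Azure SDK for Python (azure-identity and azure-mgmt-datafactory) to deploy and run a trivial Azure Data Factory pipeline. The subscription, resource group, and factory names are hypothetical placeholders, and the single Wait activity stands in for real integration logic.

    # Sketch: deploy a trivial ADF pipeline and trigger a run, as a deployment smoke test.
    # Subscription, resource group, and factory names are hypothetical placeholders.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import PipelineResource, WaitActivity

    SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
    RESOURCE_GROUP = "rg-dataplatform"      # hypothetical
    FACTORY_NAME = "adf-dataplatform"       # hypothetical

    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    # A one-activity pipeline; a real pipeline would chain copy/Databricks activities instead.
    pipeline = PipelineResource(
        activities=[WaitActivity(name="WaitOneMinute", wait_time_in_seconds=60)]
    )
    client.pipelines.create_or_update(RESOURCE_GROUP, FACTORY_NAME, "pl_smoke_test", pipeline)

    # Kick off a run and print its id so it can be monitored in the ADF portal.
    run = client.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, "pl_smoke_test")
    print(run.run_id)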

Interview Process (Is face-to-face required?) No

Does this position require Visa independent candidates only? No