Job ID: 11490
Company: Internal Postings
Location: San Francisco, CA
Type: Contract
Duration: 6 months
Salary: Open
Status: Active
Openings: 1
Posted: 26 Feb 2018
Job Seekers, Please send resumes to resumes@hireitpeople.com
Senior Contractor Profile:

Responsibilities:
  • Design, develop, and deploy data engineering solutions in the Hadoop ecosystem and Microsoft Azure.
  • Work with cross-functional teams, including Product Owners, Solution Architects, and Developers, to implement BI projects.
  • Collaborate with peers to solve complex data integration challenges, and share knowledge with peers and junior engineers.
Requirements:
  • 7+ years of overall Software Development experience
  • 2+ years of experience with the Hadoop technology stack - MapReduce concepts, distributed computing concepts, HDFS, Hive, Pig, and Spark.
  • 1+ year of hands-on project implementation experience in the Microsoft Azure ecosystem (Azure Functions, Azure Web Apps, Stream Analytics, Azure Analysis Services, etc.)
  • Proven track record of building software solutions that handle high data volumes
  • Solid understanding of an RDBMS, data warehousing concepts, and data modeling concepts
  • Strong knowledge of building data pipelines, ETL concepts, and near-real-time integration patterns and technologies such as Kafka and RabbitMQ
  • Familiarity with Service-Oriented Architecture principles - microservices, JSON structures, and SOA integration patterns
  • Proficiency in one or more programming languages - Python, Java, JavaScript, Perl, Scala, etc.
  • Strong knowledge of at least one BI reporting tool - MicroStrategy, Power BI, Tableau, QlikView, etc.
  • Ability to adapt to new tools and technologies in the traditional BI space and the Hadoop ecosystem
  • Experience developing software using Agile methodologies
  • Strong verbal and written communication skills, including the ability to explain technical concepts to non-technical audiences
  • Experience in Hadoop data ingestion would be a big plus
  • Retail domain experience would be a big plus
Junior Contractor Profile:

Responsibilities:
  • Develop, test, and deploy data engineering solutions in the Hadoop ecosystem and/or Microsoft Azure.
  • Work with cross-functional teams, including Product Owners, Solution Architects, and Developers, to implement BI and Analytics projects.
Requirements:
  • 3+ years of overall Software Development experience
  • Strong knowledge of the Hadoop technology stack - MapReduce concepts, HDFS, Hive, and Spark.
  • Proficiency in one or more programming languages - Python, Java, JavaScript, Perl, Scala, etc.
  • Good understanding of an RDBMS, data modeling, data warehousing, and ETL concepts
  • Experience with the Azure stack (Azure Functions, Azure Web Apps, Stream Analytics, Azure Analysis Services, etc.)
  • Strong verbal and written communication skills
  • Experience with at least one BI reporting tool such as MicroStrategy, Power BI, Tableau, or QlikView
  • Experience with Kafka or RabbitMQ
  • Experience with Hadoop data ingestion
  • Familiarity with Service-Oriented Architecture principles and JSON structures
  • Experience developing software using Agile methodologies would be a plus
  • Retail domain experience would be a plus