Job Seekers, please send resumes to resumes@hireitpeople.com
Must Have Skills:
- Drive AI initiatives
- Strong understanding of data modeling, SDLC, and generative AI tools
- Knowledge of DevSecOps concepts such as test automation, GitHub Actions pipelines, and AWS infrastructure (a big plus)
- Experience with natural language processing (NLP) and text data
Detailed Job Description: Data Engineer/SDET resource. The primary responsibility of this role is to drive AI initiatives by leveraging data engineering skills and working with the NEE AI platform. The successful candidate will have a strong understanding of data modeling, the SDLC, and generative AI tools, and will play a key role in standardizing prompts and best practices across our application teams. Knowledge of DevSecOps concepts such as test automation, GitHub Actions pipelines, and AWS infrastructure is a big plus.
Responsibilities:
- Collaborate with cross-functional teams to identify AI initiatives/opportunities and understand their data engineering requirements.
- Analyze and preprocess data sets to ensure they are compatible with API calls and generative AI models.
- Develop data pipelines and workflows to automate data collection, processing, and integration tasks.
- Implement data cleaning, transformation, and quality assurance techniques to ensure high-quality input data.
- Standardize and document best practices for using the NEE AI platform, including prompt development and model selection.
- Work closely with the NEE AI platform team to customize and optimize platform functionalities for specific AI projects.
- Stay up to date with the latest advancements in AI, data engineering, and related fields.
- Provide technical guidance and support to other team members working on AI initiatives.
Technology Requirements:
- Bachelor's or master's degree in computer science, engineering, or a related field.
- Strong understanding of generative AI algorithms and techniques.
- Proficiency in a major programming language such as Java, Python, JavaScript, or C#.
- Knowledge of cloud computing platforms and services, such as AWS or Azure.
- Excellent problem-solving skills and ability to work independently or in a team.
- Strong communication and collaboration skills.
- Ability to multitask and prioritize tasks in a fast-paced environment.
- Experience with natural language processing (NLP) and text data.
Minimum Years of Experience: 7 years
Top 3 responsibilities you would expect the Subcon to shoulder and execute:
- Collaborate with cross-functional teams to identify AI initiatives/opportunities and understand their data engineering requirements
- Analyze and preprocess data sets to ensure they are compatible with API calls and generative AI models
- Develop data pipelines and workflows to automate data collection, processing, and integration tasks