Job Seekers, please send resumes to resumes@hireitpeople.com

Detailed Job Description:
10-14 years of experience:
- Create and maintain data flow design and technical requirements documentation using defined documentation templates that meet Agile product development standards (e.g., data analysis methodology, MS Excel calculations).
- Understand business objectives and develop models that help achieve them, along with metrics to track their progress.
- Demonstrate the results of various algorithmic approaches and evaluate their performance.
- Leverage a broad set of modern technologies, including Python, R, Scala, and Spark, to analyze and gain insights from large data sets, and implement systems for automatic data collection, curation, and model training.
- Analyze diverse sources of data, extract features from data sources, train and test models, and productionize the models that significantly improve business outcomes.
- Work closely with Data Scientists and Data Engineers to develop predictive algorithms.
- Train models and tune their hyperparameters (see the illustrative sketch after this list).
- Collaborate with other team members, subject matter experts, and delivery teams to deliver strategic, advanced analytics-based solutions from design to deployment.
- Develop and maintain an understanding of relevant industry standards, best practices, business processes, and technology used in modeling and within the financial services industry.
- Manage own work with minimal oversight and proactively communicate status and risks to leadership.
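For illustration only (not part of the job requirements): a minimal sketch of the kind of model training and hyperparameter tuning described above, assuming scikit-learn as named in the qualifications; the dataset, model choice, and parameter grid are hypothetical placeholders.

# Minimal sketch: train a classifier and tune its hyperparameters with scikit-learn.
# Synthetic data and a small parameter grid stand in for real business features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Cross-validated grid search over a small hyperparameter space.
param_grid = {"n_estimators": [100, 300], "max_depth": [5, 10, None]}
search = GridSearchCV(RandomForestClassifier(random_state=42), param_grid, cv=5)
search.fit(X_train, y_train)

print("Best parameters:", search.best_params_)
print("Held-out accuracy:", search.best_estimator_.score(X_test, y_test))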
Required Qualifications:
- 10+ years of overall experience, with 1-3 years in an ESG (Environmental, Social, and Governance) technology-focused role within the asset management or financial services industry.
- Proficiency with Python and basic libraries for machine learning such as scikit-learn and pandas.
- Experience with machine learning tools such as scikit-learn, R, Theano, TensorFlow, Spark ML, or Foundry.
- Experience in using two or more of the following modeling types to solve business problems: classification, regression, time series, clustering, text analytics, survival, association, optimization, reinforcement learning.
- Understanding of data models, large datasets, business/technical requirements, BI tools, and statistical programming languages and libraries.
- Demonstrated functional knowledge of data visualization libraries such as matplotlib or ggplot2; knowledge of other visualization tools such as Microsoft Power BI, QuickSight, or Tableau (see the illustrative sketch after this list).
- Knowledge of cloud and computing technologies such as Hadoop, Apache Spark, AWS, Microsoft Azure, or Google Cloud.
- Bachelor's or Master's degree in computer science, data science, statistics, mathematics, or a related field.
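For illustration only (not part of the job requirements): a small sketch of the data visualization skill described above, assuming matplotlib as named in the qualifications; the model names and scores are synthetic placeholders.

# Minimal sketch: compare candidate model scores with matplotlib.
import matplotlib.pyplot as plt

models = ["Logistic Regression", "Random Forest", "Gradient Boosting"]
cv_scores = [0.81, 0.88, 0.90]  # hypothetical cross-validation accuracies

fig, ax = plt.subplots(figsize=(6, 4))
ax.bar(models, cv_scores)
ax.set_ylabel("Cross-validated accuracy")
ax.set_ylim(0, 1)
ax.set_title("Candidate model comparison (synthetic example)")
plt.tight_layout()
plt.show()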