- Good understanding of ML/AI concepts: types of algorithms, machine learning frameworks, model efficiency metrics, the model lifecycle, and AI architectures.
- Good understanding of cloud concepts and architectures, as well as working knowledge of selected cloud services.
- Good understanding of CI/CD and DevOps concepts, and experience working with selected tools (preferably GitHub Actions, GitLab or Azure DevOps).
- Strong experience in at least one of the following domains: Data Warehouse, Data Lake, Data Integration, Data Governance, Machine Learning, Deep Learning.
- At least 4 years of experience developing production-ready code.
- Experience in designing and implementing data pipelines.
- Good communication skills.
- Ability to work in a team and support others.
- Willingness to take responsibility for tasks and deliverables.
- Great problem-solving skills and critical thinking.
- Excellent English skills.
Nice-to-have skills & knowledge:
- Practical experience with AutoML tools such as H2O, DataRobot, Azure AutoML, GCP AutoML, TPOT or AutoSKLearn.
- Practical experience with MLOps tools such as MLflow, Azure ML, GCP ML or AWS SageMaker.
- Experience with TensorFlow or PyTorch.
- Experience implementing ML algorithms and data-processing pipelines in Python, Java, Scala, C++, JavaScript or R.