Hello everyone and welcome!
We are happy to share our newest release, DeepPavlov Library v0.16.0! In this release we've upgraded the underlying Transformers library to 4.6.0, delivered DistilRuBERT for the Russian language, and continued the Spring Cleaning of the DeepPavlov Library. We've also added experimental multi-GPU support.
We are eager to hear your thoughts about the new version!
To try the experimental multi-GPU support, list the GPUs to use via CUDA_VISIBLE_DEVICES and run training as usual, for example:

CUDA_VISIBLE_DEVICES=0,1,2 python -m deeppavlov train glue_mnli_roberta