BERT models from Hugging Face - Transformers.jl



I am planning to use an already pre-trained BERT model for Greek published on Hugging Face. As we know, using this type of pre-trained model usually depends on Python's Hugging Face library and PyTorch. I know that @chengchingwen has been doing some exciting work with Transformers.jl. I really appreciate the effort he puts into this!

The package's page states: “Current we only support a few model and the tokenizer part is not finished yet.” Any news on that front?





If it is a BERT model with different weights, it should be workable, but not fully automatic. As for the question, there is some progress taking place in different repos, but it hasn't been integrated yet.
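To illustrate the "workable, but not fully automatic" part: below is a minimal sketch of loading one of the pre-trained BERT models that Transformers.jl already ships with, based on the `pretrain""` interface shown in the package's README at the time. The model name `"bert-uncased_L-12_H-768_A-12"` is one of the bundled English checkpoints, not the Greek model from Hugging Face; loading arbitrary Hugging Face weights would still require converting them into this format by hand.

```julia
# Sketch, assuming the Transformers.Pretrain API from the package README.
using Transformers
using Transformers.Basic
using Transformers.Pretrain

# Accept the DataDeps download prompt non-interactively.
ENV["DATADEPS_ALWAYS_ACCEPT"] = true

# Download and load a bundled pre-trained BERT model together with
# its WordPiece vocabulary and tokenizer.
bert_model, wordpiece, tokenizer = pretrain"bert-uncased_L-12_H-768_A-12"
vocab = Vocabulary(wordpiece)

# Tokenize a sentence into WordPiece sub-words.
text = "Peter Piper picked a peck of pickled peppers"
tokens = text |> tokenizer |> wordpiece
```

For a Greek BERT published on Hugging Face, the weights would have to be extracted from the PyTorch checkpoint and mapped onto the corresponding Transformers.jl layers manually, which is exactly the not-yet-automated step mentioned above.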