ParsBERT is a monolingual language model based on Google's BERT architecture. It is pre-trained on large Persian corpora covering a variety of writing styles.
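As a minimal sketch of basic usage (assuming the HooshvareLab/bert-fa-base-uncased checkpoint referenced below and the Hugging Face `transformers` library; the example sentence is illustrative only), the model and its tokenizer can be loaded to produce contextual embeddings:

```python
# Sketch: load ParsBERT and extract contextual embeddings for a Persian sentence.
from transformers import AutoTokenizer, AutoModel

model_name = "HooshvareLab/bert-fa-base-uncased"  # checkpoint referenced later on this page
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a short Persian sentence and run it through the encoder.
inputs = tokenizer("تهران پایتخت ایران است.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```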
Like the original BERT, the model is pretrained with a masked language modeling (MLM) objective: tokens in the input are randomly masked and the model learns to predict them from their bidirectional context. It was introduced in the ParsBERT paper and first released by HooshvareLab.
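The MLM objective can be exercised directly through the fill-mask pipeline. The snippet below is a small sketch under the same checkpoint assumption as above, with the example sentence chosen purely for illustration:

```python
# Sketch: predict the masked token with ParsBERT's MLM head.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="HooshvareLab/bert-fa-base-uncased")

# The model scores candidate tokens for the [MASK] position.
for prediction in fill_mask("تهران پایتخت [MASK] است."):
    print(prediction["token_str"], round(prediction["score"], 3))
```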
See the [model hub](https://huggingface.co/models) to look for fine-tuned versions of the model. The base checkpoint and its tokenizer are published as HooshvareLab/bert-fa-base-uncased, and task-specific checkpoints such as HooshvareLab/bert-fa-base-uncased-ner-arman (named-entity recognition) are available alongside it; eval results for the downstream tasks are reported in the model card.
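As a hedged example of using one of those fine-tuned checkpoints, the NER model named above can be run through the token-classification pipeline (the example sentence is illustrative only):

```python
# Sketch: named-entity recognition with the fine-tuned ParsBERT NER checkpoint.
from transformers import pipeline

ner = pipeline(
    "ner",
    model="HooshvareLab/bert-fa-base-uncased-ner-arman",
    aggregation_strategy="simple",  # merge word pieces into whole entities
)

for entity in ner("شرکت هوشواره در تهران فعالیت می‌کند."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```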
ParsBERT uses the same configuration as BERT-Base; the paper presenting ParsBERT is available on arXiv.
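The BERT-Base claim can be checked by inspecting the published configuration. This short sketch assumes the base checkpoint named above and prints the standard BERT-Base dimensions (12 layers, 12 attention heads, hidden size 768):

```python
# Sketch: confirm the BERT-Base-sized configuration of ParsBERT.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("HooshvareLab/bert-fa-base-uncased")
print(config.num_hidden_layers, config.num_attention_heads, config.hidden_size)
# Expected for a BERT-Base configuration: 12 12 768
```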
A sentiment-analysis checkpoint fine-tuned on Digikala data is also published at https://huggingface.co/HooshvareLab/bert-fa-base-uncased-sentiment-digikala.
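As a small illustrative sketch, that sentiment checkpoint can be used through the text-classification pipeline (label names come from the checkpoint's own config; the review text is made up for illustration):

```python
# Sketch: sentiment analysis with the Digikala fine-tuned ParsBERT checkpoint.
from transformers import pipeline

sentiment = pipeline(
    "text-classification",
    model="HooshvareLab/bert-fa-base-uncased-sentiment-digikala",
)

print(sentiment("کیفیت محصول عالی بود و به موقع رسید."))
```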