
[BERT/PyT] Update pretrained checkpoint links (#660)

* Update pretrained checkpoint links

* update link
Sharath T S 5 years ago
parent commit 5cc03caa15
1 changed file with 1 addition and 1 deletion

+ 1 - 1
PyTorch/LanguageModeling/BERT/README.md

@@ -245,7 +245,7 @@ To train your model using mixed or TF32 precision with Tensor Cores or using FP3
  
 2. Download the NVIDIA pretrained checkpoint.
  
-If you want to use a pre-trained checkpoint, visit [NGC](https://ngc.nvidia.com/catalog/models/nvidia:bert_large_pyt_amp_ckpt_pretraining_lamb). This downloaded checkpoint is used to fine-tune on SQuAD. Ensure you unzip the downloaded file and place the checkpoint in the `checkpoints/` folder. For a checkpoint already fine-tuned for QA on SQuAD v1.1 visit [NGC](https://ngc.nvidia.com/catalog/models/nvidia:bert_large_pyt_amp_ckpt_squad_qa1_1).
+If you want to use a pre-trained checkpoint, visit [NGC](https://ngc.nvidia.com/catalog/models/nvidia:bert_pyt_ckpt_large_pretraining_amp_lamb/files). This downloaded checkpoint is used to fine-tune on SQuAD. Ensure you unzip the downloaded file and place the checkpoint in the `checkpoints/` folder. For a checkpoint already fine-tuned for QA on SQuAD v1.1 visit [NGC](https://ngc.nvidia.com/catalog/models/nvidia:bert_pyt_ckpt_large_qa_squad11_amp/files).
  
 3. Build BERT on top of the  NGC container.
 `bash scripts/docker/build.sh`
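The "unzip and place the checkpoint in `checkpoints/`" step from the README hunk above can be sketched in Python. The archive and checkpoint filenames here (`bert_ckpt.zip`, `ckpt.pt`) are placeholders, not the actual names served by the NGC download; a dummy archive is created first so the sketch runs self-contained.

```python
import zipfile
from pathlib import Path

# Placeholder archive name; in real use this is the file downloaded
# from the NGC link in the README.
archive = Path("bert_ckpt.zip")

# Create a dummy archive so this sketch is self-contained (skip this
# step when working with the real NGC download).
Path("ckpt.pt").write_text("dummy weights")
with zipfile.ZipFile(archive, "w") as zf:
    zf.write("ckpt.pt")

# Unzip the downloaded file and place the checkpoint in the
# checkpoints/ folder, as the README instructs.
dest = Path("checkpoints")
dest.mkdir(exist_ok=True)
with zipfile.ZipFile(archive) as zf:
    zf.extractall(dest)

print(sorted(p.name for p in dest.iterdir()))  # ['ckpt.pt']
```

After this, the fine-tuning scripts in the repository expect to find the checkpoint under `checkpoints/`, per the README.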