From c6a01c691dfd53f57b5fa3688a15f54b3d0af881 Mon Sep 17 00:00:00 2001
From: wizardforcel <562826179@qq.com>
Date: Mon, 19 Oct 2020 21:54:21 +0800
Subject: [PATCH] 2020-10-19 21:54:21

---
 {img/tf_2.0 => docs/TensorFlow2.x/img}/bert.png | Bin
 docs/TensorFlow2.x/实战项目_5_bert.md            |   2 +-
 2 files changed, 1 insertion(+), 1 deletion(-)
 rename {img/tf_2.0 => docs/TensorFlow2.x/img}/bert.png (100%)

diff --git a/img/tf_2.0/bert.png b/docs/TensorFlow2.x/img/bert.png
similarity index 100%
rename from img/tf_2.0/bert.png
rename to docs/TensorFlow2.x/img/bert.png
diff --git a/docs/TensorFlow2.x/实战项目_5_bert.md b/docs/TensorFlow2.x/实战项目_5_bert.md
index 99b53dc7..68c6105e 100644
--- a/docs/TensorFlow2.x/实战项目_5_bert.md
+++ b/docs/TensorFlow2.x/实战项目_5_bert.md
@@ -17,7 +17,7 @@ BERT stands for Bidirectional Encoder Representation from Transformers; it is a
 
 2. After pre-training, only an extra output layer needs to be added for fine-tuning to achieve state-of-the-art results on a wide range of downstream tasks. No task-specific architectural changes to BERT are required in this process.
 
-![](/img/tf_2.0/bert.png)
+![](img/bert.png)
 
 To summarize: