
Commit 8a3342b

[docs] Update a removed article with a new source (#3309)
* Update README.md: the Towards Data Science article no longer exists; the same article was found on a different site.
* Also update the 2nd reference to the removed TDS article.

Co-authored-by: Tom Aarsen <[email protected]>
1 parent 8d73d4f commit 8a3342b

File tree

2 files changed: +3 −3 lines changed


docs/publications.md

+1-1
@@ -113,7 +113,7 @@ When you use GPL, please have a look at: [GPL: Generative Pseudo Labeling for U
 In the following you find a (selective) list of articles / applications using SentenceTransformers to do amazing stuff. Feel free to contact me ([email protected]) to add you application here.
 - **December 2021 - [Sentence Transformer Fine-Tuning (SetFit): Outperforming GPT-3 on few-shot Text-Classification while being 1600 times smaller](https://towardsdatascience.com/sentence-transformer-fine-tuning-setfit-outperforms-gpt-3-on-few-shot-text-classification-while-d9a3788f0b4e?gi=4bdbaff416e3)**
 - **October 2021: [Natural Language Processing (NLP) for Semantic Search](https://www.pinecone.io/learn/nlp)**
-- **January 2021 - [Advance BERT model via transferring knowledge from Cross-Encoders to Bi-Encoders](https://towardsdatascience.com/advance-nlp-model-via-transferring-knowledge-from-cross-encoders-to-bi-encoders-3e0fc564f554)**
+- **January 2021 - [Advance BERT model via transferring knowledge from Cross-Encoders to Bi-Encoders](https://resources.experfy.com/ai-ml/bert-model-transferring-knowledge-cross-encoders-bi-encoders/)**
 - **November 2020 - [How to Build a Semantic Search Engine With Transformers and Faiss](https://towardsdatascience.com/how-to-build-a-semantic-search-engine-with-transformers-and-faiss-dcbea307a0e8)**
 - **October 2020 - [Topic Modeling with BERT](https://towardsdatascience.com/topic-modeling-with-bert-779f7db187e6)**
 - **September 2020 - [Elastic Transformers -

examples/sentence_transformer/training/data_augmentation/README.md

+2-2
@@ -8,7 +8,7 @@ Bi-encoders (a.k.a. sentence embeddings models) require substantial training dat
 
 For more details, refer to our publication - [Augmented SBERT: Data Augmentation Method for Improving Bi-Encoders for Pairwise Sentence Scoring Tasks](https://arxiv.org/abs/2010.08240) which is a joint effort by Nandan Thakur, Nils Reimers and Johannes Daxenberger of UKP Lab, TU Darmstadt.
 
-Chien Vu also wrote a nice blog article on this technique: [Advance BERT model via transferring knowledge from Cross-Encoders to Bi-Encoders](https://towardsdatascience.com/advance-nlp-model-via-transferring-knowledge-from-cross-encoders-to-bi-encoders-3e0fc564f554)
+Chien Vu also wrote a nice blog article on this technique: [Advance BERT model via transferring knowledge from Cross-Encoders to Bi-Encoders](https://resources.experfy.com/ai-ml/bert-model-transferring-knowledge-cross-encoders-bi-encoders/)
 
 ## Extend to your own datasets
 

@@ -97,4 +97,4 @@ If you use the code for augmented sbert, feel free to cite our publication [Augm
     year = "2020",
     url = "https://arxiv.org/abs/2010.08240",
 }
-```
+```

0 commit comments
