github UKPLab/sentence-transformers v0.3.7
v0.3.7 - Upgrade transformers, Model Distillation Example, Multi-Input to Transformers Model

  • Upgraded the transformers dependency; transformers 3.1.0, 3.2.0, and 3.3.1 are supported.
  • Added example code for model distillation: sentence embedding models can be drastically reduced, e.g. to only 2-4 layers, while keeping 98+% of their performance. Code can be found in examples/training/distillation.
  • Transformer models can now accept two inputs (['sentence 1', 'context for sent1']), which are encoded as the two segments of a BERT input.
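
The full distillation training script lives in examples/training/distillation; as a minimal, library-free illustration of the core objective, the sketch below fits a small linear "student" to reproduce a "teacher's" embeddings by minimizing the mean squared error between them. All names and dimensions here are toy assumptions, not the actual API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: a fixed "teacher" projection producing 8-dim embeddings,
# and a linear "student" trained to mimic those embeddings via MSE,
# which is the core idea behind embedding distillation.
n_samples, in_dim, emb_dim = 200, 16, 8
X = rng.normal(size=(n_samples, in_dim))          # toy "sentence" features
teacher_W = rng.normal(size=(in_dim, emb_dim))
teacher_emb = X @ teacher_W                       # teacher sentence embeddings

# Fit the student by minimizing ||X @ W - teacher_emb||^2: least squares
# is the closed-form solution of this MSE distillation objective.
student_W, *_ = np.linalg.lstsq(X, teacher_emb, rcond=None)
student_emb = X @ student_W

mse = float(np.mean((student_emb - teacher_emb) ** 2))
```

In the real example the student is a truncated transformer (a few layers of the teacher) rather than a linear map, but the training signal is the same: match the teacher's embeddings under an MSE loss.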
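
To show what "encoded as the two segments of a BERT input" means, here is a hypothetical sketch of pair encoding: the two texts are concatenated as [CLS] sentence [SEP] context [SEP], and the token type ids mark which segment each token belongs to. The function name and token lists are illustrative, not the library's tokenizer API.

```python
def encode_pair(sent_tokens, ctx_tokens):
    # BERT-style pair encoding: [CLS] sentence [SEP] context [SEP].
    tokens = ["[CLS]"] + sent_tokens + ["[SEP]"] + ctx_tokens + ["[SEP]"]
    # Segment 0 covers [CLS] + sentence + first [SEP];
    # segment 1 covers context + final [SEP].
    token_type_ids = [0] * (len(sent_tokens) + 2) + [1] * (len(ctx_tokens) + 1)
    return tokens, token_type_ids

tokens, type_ids = encode_pair(["sentence", "1"], ["context", "for", "sent1"])
```

A real tokenizer handles subwords, truncation, and padding on top of this, but the segment layout is the same.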

Minor changes:

  • Tokenization in the multi-processes encoding setup now happens in the child processes, not in the parent process.
  • Added models.Normalize() to allow normalizing embeddings to unit length.
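
Normalizing to unit length means scaling each embedding so its L2 norm is 1, which makes the dot product of two embeddings equal to their cosine similarity. A minimal numpy sketch of that operation (the helper name is illustrative, not the module's API):

```python
import numpy as np

def normalize(embeddings):
    # Scale each row (one embedding) to unit L2 norm.
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    return embeddings / norms

emb = np.array([[3.0, 4.0],
                [0.0, 2.0]])
unit = normalize(emb)  # rows now have norm 1, e.g. [3, 4] -> [0.6, 0.8]
```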
