UKPLab/sentence-transformers
v0.3.5 - Automatic Mixed Precision & Bugfixes

  • The old FP16 training code in model.fit() was replaced with PyTorch 1.6.0 automatic mixed precision (AMP). Pass model.fit(use_amp=True) to enable it. On suitable GPUs, this leads to a significant speed-up while requiring less memory (see the training sketch after this list).
  • Performance improvements in paraphrase mining & semantic search by replacing np.argpartition with torch.topk (a top-k retrieval sketch follows below).
  • If a sentence-transformers model is not found, the library falls back to the Hugging Face Transformers model hub and builds a model with mean pooling (equivalent to the manual construction shown below).
  • huggingface transformers is pinned to version 3.0.2. The next release will be compatible with huggingface transformers 3.1.0.
  • Several bugfixes: downloading of files, multi-GPU encoding
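
A minimal training sketch with AMP enabled. The model name, training pairs, loss, and hyperparameters are illustrative placeholders, not a prescribed setup:

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, SentencesDataset, InputExample, losses

model = SentenceTransformer('bert-base-nli-mean-tokens')  # illustrative model choice

# Tiny illustrative training set: sentence pairs with similarity labels
train_examples = [
    InputExample(texts=['A man is eating food.', 'A man is eating a meal.'], label=0.9),
    InputExample(texts=['A man is eating food.', 'The girl is playing guitar.'], label=0.1),
]
train_dataset = SentencesDataset(train_examples, model)
train_dataloader = DataLoader(train_dataset, shuffle=True, batch_size=2)
train_loss = losses.CosineSimilarityLoss(model=model)

# use_amp=True switches training to PyTorch 1.6 automatic mixed precision
model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    evaluator=None,   # optionally pass an evaluator from sentence_transformers.evaluation
    epochs=1,
    warmup_steps=10,
    use_amp=True,
)
```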
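For context, this is the shape of the top-k operation: compute a cosine-similarity matrix between query and corpus embeddings, then let torch.topk pick the best hits per query. The corpus, query, and model name below are made up for illustration:

```python
import numpy as np
import torch
from sentence_transformers import SentenceTransformer

model = SentenceTransformer('bert-base-nli-mean-tokens')  # illustrative model choice

corpus = ['A man is eating food.', 'A man is riding a horse.', 'The girl is playing guitar.']
queries = ['Someone is eating a meal.']

# encode() returns numpy embeddings; stack them into tensors
corpus_emb = torch.from_numpy(np.asarray(model.encode(corpus)))
query_emb = torch.from_numpy(np.asarray(model.encode(queries)))

# Cosine-similarity matrix of shape (num_queries, num_corpus)
cos_scores = torch.nn.functional.normalize(query_emb, dim=1) @ \
             torch.nn.functional.normalize(corpus_emb, dim=1).t()

# torch.topk selects the k best corpus entries per query
top_scores, top_indices = torch.topk(cos_scores, k=2, dim=1)
for score, idx in zip(top_scores[0], top_indices[0]):
    print(corpus[int(idx)], '(score: %.4f)' % float(score))
```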
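The fallback described above corresponds roughly to building the model by hand from a Transformers checkpoint plus a mean-pooling layer; bert-base-uncased here is only an example checkpoint:

```python
from sentence_transformers import SentenceTransformer, models

# Any Hugging Face Transformers checkpoint name works here; this one is illustrative
word_embedding_model = models.Transformer('bert-base-uncased')

# Mean pooling over the token embeddings
pooling_model = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(),
    pooling_mode_mean_tokens=True,
    pooling_mode_cls_token=False,
    pooling_mode_max_tokens=False,
)

model = SentenceTransformer(modules=[word_embedding_model, pooling_model])
embeddings = model.encode(['This sentence is embedded with mean pooling.'])
```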
