# mlx-community/bge-small-en-v1.5-8bit
The model mlx-community/bge-small-en-v1.5-8bit was converted to MLX format from BAAI/bge-small-en-v1.5 using mlx-lm version 0.0.3.
## Use with mlx
```bash
pip install mlx-embeddings
```

```python
from mlx_embeddings import load, generate
import mlx.core as mx

model, tokenizer = load("mlx-community/bge-small-en-v1.5-8bit")

# For text embeddings
output = generate(model, tokenizer, texts=["I like grapes", "I like fruits"])
embeddings = output.text_embeds  # Normalized embeddings

# Compute dot product between normalized embeddings
similarity_matrix = mx.matmul(embeddings, embeddings.T)

print("Similarity matrix between texts:")
print(similarity_matrix)
```
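Because the embeddings come back normalized, the dot products above are cosine similarities. For retrieval-style use, the upstream BGE v1.5 model card recommends prepending a short instruction to the query (passages are embedded as-is). Below is a minimal sketch using the same `load`/`generate` API as above; the query and passages are illustrative placeholders:

```python
from mlx_embeddings import load, generate
import mlx.core as mx

model, tokenizer = load("mlx-community/bge-small-en-v1.5-8bit")

# BGE v1.5 query instruction for retrieval; passages are embedded without it.
instruction = "Represent this sentence for searching relevant passages: "
query = "Which fruit is purple?"
passages = ["Grapes are often purple or green.", "Bananas are yellow when ripe."]

output = generate(model, tokenizer, texts=[instruction + query] + passages)
embeds = output.text_embeds  # normalized embeddings, shape (3, 384)

# Score each passage against the query by dot product (cosine similarity here)
scores = mx.matmul(embeds[:1], embeds[1:].T)
print(scores)  # higher score = more relevant passage
```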
## Evaluation results

All scores are self-reported MTEB test-set results.

| Dataset | Metric | Value |
|---|---|---|
| MTEB AmazonCounterfactualClassification (en) | accuracy | 73.791 |
| MTEB AmazonCounterfactualClassification (en) | ap | 37.219 |
| MTEB AmazonCounterfactualClassification (en) | f1 | 68.091 |
| MTEB AmazonPolarityClassification | accuracy | 92.754 |
| MTEB AmazonPolarityClassification | ap | 89.468 |
| MTEB AmazonPolarityClassification | f1 | 92.739 |
| MTEB AmazonReviewsClassification (en) | accuracy | 46.986 |
| MTEB AmazonReviewsClassification (en) | f1 | 46.559 |
| MTEB ArguAna | map_at_1 | 35.846 |
| MTEB ArguAna | map_at_10 | 51.388 |
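These figures were reported for the upstream BAAI/bge-small-en-v1.5 weights; the 8-bit conversion may score marginally differently. If you want to spot-check a task locally, the sketch below uses the `mteb` package's classic interface. The `MLXEncoder` wrapper is an illustrative assumption (it is not part of mlx-embeddings), and batching is omitted for brevity:

```python
import numpy as np
from mlx_embeddings import load, generate
from mteb import MTEB  # pip install mteb

class MLXEncoder:
    """Illustrative wrapper exposing the encode() interface mteb expects."""

    def __init__(self, repo):
        self.model, self.tokenizer = load(repo)

    def encode(self, sentences, **kwargs):
        # Embed all sentences in one call (no batching) and hand back NumPy arrays.
        output = generate(self.model, self.tokenizer, texts=list(sentences))
        return np.array(output.text_embeds)

evaluation = MTEB(tasks=["AmazonCounterfactualClassification"])
evaluation.run(MLXEncoder("mlx-community/bge-small-en-v1.5-8bit"),
               output_folder="results/bge-small-en-v1.5-8bit")
```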