BGE-Base-EN v1.5
This model maps any English text to a 768-dimensional dense vector using FlagEmbedding.
About the model
BGE-Base-EN v1.5 generates English text embeddings using a BERT-based encoder architecture, achieving strong retrieval performance at a compact 109M-parameter size. It retrieves well without requiring query instructions and is widely used for its balance of embedding quality and efficiency, making it well suited to developers building search and retrieval-augmented generation (RAG) applications.
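As a sketch of how embeddings like these are typically used at retrieval time, the snippet below ranks documents against a query by cosine similarity. The vector values are made-up toy placeholders, not real model output, and are 4-dimensional rather than the model's 768.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity = dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dim vectors standing in for the model's 768-dim embeddings.
query = [0.1, 0.3, -0.2, 0.5]
docs = {
    "doc_a": [0.1, 0.29, -0.21, 0.5],  # near-duplicate of the query
    "doc_b": [-0.4, 0.1, 0.3, -0.2],   # unrelated direction
}

# Rank documents by similarity to the query, highest first.
ranked = sorted(docs, key=lambda d: cosine_similarity(query, docs[d]), reverse=True)
print(ranked[0])  # doc_a ranks first
```

In a real pipeline, both the query and the documents would first be embedded by the model, and the similarity search would usually run in a vector index rather than a Python loop.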
API usage
Endpoint:
- Model provider: BAAI
- Type: Embeddings
- Main use cases: Embeddings
- Deployment: Serverless, Monthly Reserved
- Endpoint
- Parameters: 109M
- Context length: 512 tokens
- Input price: $0.01 / 1M tokens
- Input modalities: Text
- Output modalities: Structured Data
- External link
- Category: Embeddings
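Since the card leaves the endpoint URL blank, the request shape below is an assumption for illustration only (the URL and payload field names are hypothetical). The snippet also works out the cost arithmetic implied by the $0.01 / 1M-token input price and the 512-token context length:

```python
import json

# Hypothetical endpoint and payload shape -- the card does not specify them.
ENDPOINT = "https://api.example.com/v1/embeddings"
payload = {
    "model": "BAAI/bge-base-en-v1.5",
    "input": ["What is retrieval-augmented generation?"],
}
body = json.dumps(payload)  # would be POSTed as the JSON request body

# Cost at $0.01 per 1M input tokens; 512 is the model's context length,
# i.e. the most tokens a single input can contain.
price_per_token = 0.01 / 1_000_000
max_cost_per_request = 512 * price_per_token
print(f"{max_cost_per_request:.8f}")  # 0.00000512
```

Even a fully saturated 512-token request costs well under a hundredth of a cent, which is why per-request cost is rarely the bottleneck for embedding workloads at this price point.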