Embedding
In one line: A list of numbers that represents a piece of text in a way that lets computers measure 'similarity' mathematically. The foundation of semantic search and RAG.
An embedding is a vector — typically 768 to 3072 numbers — that represents a piece of text in a high-dimensional space. Texts with similar meaning end up close to each other in that space.
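"Close to each other" is usually measured with cosine similarity: the cosine of the angle between two vectors, which is near 1.0 for similar texts and lower for unrelated ones. Here is a minimal sketch with made-up 4-dimensional vectors (real embeddings have hundreds to thousands of dimensions):

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of the vectors' magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" — the values are invented for illustration.
climate_change = [0.8, 0.1, 0.6, 0.2]
global_warming = [0.7, 0.2, 0.5, 0.3]
chocolate_cake = [0.1, 0.9, 0.0, 0.8]

print(cosine_similarity(climate_change, global_warming))  # high — similar meaning
print(cosine_similarity(climate_change, chocolate_cake))  # low — unrelated
```

An embedding model's job is to produce vectors where this simple geometric measure lines up with human judgments of meaning.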
Because similar texts map to nearby vectors, embeddings power:
- Semantic search — find documents about 'climate change' even if they say 'global warming'.
- RAG — retrieving relevant context before generating an answer.
- Recommendation systems.
- Clustering and topic discovery.
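Semantic search, the first item above, reduces to a simple loop: embed the query, then rank documents by similarity to it. A sketch with hand-written vectors standing in for an embedding API's output (all values here are invented):

```python
import math

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Pretend these vectors came back from an embedding API.
docs = {
    "Sea levels are rising due to global warming": [0.9, 0.1, 0.3],
    "How to bake sourdough bread at home": [0.1, 0.8, 0.2],
    "Electric cars reduce carbon emissions": [0.7, 0.2, 0.4],
}
query_vec = [0.8, 0.1, 0.35]  # stand-in embedding for the query "climate change"

# Rank documents by similarity to the query — no keyword overlap required.
ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
print(ranked[0])
```

Note that the top result never contains the words "climate change" — the match happens in vector space, not on keywords. RAG systems use exactly this retrieval step before handing the top documents to a language model.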
OpenAI, Voyage, and Cohere all sell embedding APIs. AskAI.free uses embeddings internally for chat history search.
See it in action — ask any AI about embedding on AskAI.free.
Try it free →