BERT
In one line: A 2018 Google model that was a major step on the road to today's LLMs. It's now used mostly inside Google Search, not for chat.
BERT (Bidirectional Encoder Representations from Transformers) was Google's 2018 transformer model that pioneered bidirectional training — looking at words from both directions when learning context.
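To make "both directions" concrete, here is a minimal sketch (assuming the Hugging Face transformers library and the public bert-base-uncased checkpoint, which this page doesn't otherwise mention): BERT's pre-training task is to fill in a masked word, and it can use clues that come after the blank as well as before it.

```python
# Sketch only: assumes the "transformers" package (plus a backend such as
# PyTorch) is installed, and uses the public "bert-base-uncased" checkpoint.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# The clue ("barked") appears *after* the blank, so a purely left-to-right
# model couldn't use it; BERT reads both sides of the sentence.
for guess in fill("The [MASK] barked at the mailman all night."):
    print(guess["token_str"], round(guess["score"], 3))
```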
BERT isn't a generative chatbot like ChatGPT. It's an encoder — it understands text but doesn't write new text. BERT and its descendants (RoBERTa, DeBERTa) are still widely used inside Google Search, content classification systems, and embedding models.
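As a rough illustration of BERT-as-encoder (again a sketch assuming the Hugging Face transformers and PyTorch libraries), the model doesn't generate a reply; it turns text into vectors that search, ranking, and classification systems can compare.

```python
# Sketch only: turns a sentence into a fixed-size embedding vector.
# Assumes "transformers" and "torch" are installed and uses the public
# "bert-base-uncased" checkpoint.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("BERT encodes text, it doesn't chat.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One 768-dimensional vector per token; a simple mean gives a sentence
# embedding a downstream search or classification system could use.
sentence_embedding = outputs.last_hidden_state.mean(dim=1)
print(sentence_embedding.shape)  # torch.Size([1, 768])
```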
For everyday users, BERT has been replaced by LLMs like Gemini and ChatGPT.
See it in action: ask any AI about BERT on AskAI.free.