Lab 3: What Are Embeddings?
See how text becomes numbers — and how those numbers capture meaning.
The Foundation: Text → Math
An embedding model converts any text into a list of numbers (a vector). Texts with similar meanings produce vectors that are close together. This is what makes semantic search possible.
Section 1: Text → Vector
Type any text and watch it become a 1536-dimensional vector.
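Real embedding models are learned neural networks, so their vectors can't be reproduced here, but the text → fixed-length-vector shape can be. The sketch below is a toy stand-in (`toy_embed` is a made-up name, and hashing words into buckets is nothing like a real model) that only illustrates how any input text becomes a vector of the same dimensionality:

```python
import hashlib

DIM = 1536  # matches the dimensionality shown in this lab

def toy_embed(text: str) -> list[float]:
    """Toy stand-in for an embedding model: hash each word into one
    of DIM buckets. Real models learn these dimensions from data;
    this only shows the text -> fixed-length-vector shape."""
    vec = [0.0] * DIM
    for word in text.lower().split():
        idx = int(hashlib.md5(word.encode()).hexdigest(), 16) % DIM
        vec[idx] += 1.0
    return vec

v = toy_embed("the cat sat on the mat")
print(len(v))  # 1536 — every input, short or long, maps to the same size
```

Note that the output length never depends on the input length: a word and a paragraph both become 1536 numbers, which is what lets them be compared.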
Section 2: How Similar Are Two Sentences?
Enter two sentences and see a cosine similarity score (close to 1 = nearly identical meaning, close to 0 = unrelated; negative values are mathematically possible but uncommon for text embeddings).
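The score itself is simple arithmetic: the dot product of the two vectors divided by the product of their lengths. A minimal implementation:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between vectors a and b:
    dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Vectors pointing the same way score ~1; orthogonal vectors score 0.
print(cosine_similarity([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))  # ≈ 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))            # 0.0
```

Because the norms divide out, cosine similarity measures direction, not magnitude: scaling a vector doesn't change its score, which is why it compares meanings rather than text lengths.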
Try these revealing pairs
Takeaway: Embeddings turn text into vectors where similar meanings cluster together in space. Cosine similarity measures the angle between two vectors — a small angle means similar meaning. This is the mathematical engine behind semantic search in Lab 2, and behind RAG retrieval in Lab 5.
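The "small angle means similar meaning" claim can be made literal: cosine similarity is exactly the cosine of the angle between the two vectors, so the angle itself can be recovered. A short self-contained sketch:

```python
import math

def angle_degrees(a: list[float], b: list[float]) -> float:
    """Angle between two vectors, in degrees. Cosine similarity
    is cos() of this angle, so similarity 1.0 means angle 0."""
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    cos = max(-1.0, min(1.0, dot / norms))  # clamp for float safety
    return math.degrees(math.acos(cos))

print(angle_degrees([1.0, 0.0], [1.0, 1.0]))  # ≈ 45.0
print(angle_degrees([1.0, 0.0], [0.0, 1.0]))  # ≈ 90.0
```

A similarity of 1 corresponds to an angle of 0°, and a similarity of 0 to 90° — so "similar meanings cluster together in space" means their vectors point in nearly the same direction.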