Update README.md
README.md CHANGED
@@ -12,9 +12,15 @@ short_description: Logic Gate Learning with Neural Networks
license: mit
---

# **Embedding Dimension Visualizer**
The **Embedding Dimension Visualizer** is an interactive Streamlit tool for teaching and experimenting with modern transformer embeddings. It lets you:
* **Tokenize** any input text with tiktoken or a Hugging Face BPE tokenizer, showing each subword token and its ID (see the pipeline sketch after this list).
* **Visualize embeddings** by generating a demo embedding vector for every token.
* **Compute and display sinusoidal positional encodings** (sin/cos) for each token position (the defining equations are reproduced below).
* **Combine embeddings + positional encodings** and present the final per-token vectors exactly as they’d be fed into attention heads.
* **Expose the theory** in an expandable section, complete with LaTeX formulas, covering tokenization, BPE, and the positional-encoding equations.
* **Lock sliders** into read-only mode so learners can observe values without accidentally altering them (a minimal example is sketched below).
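
The positional-encoding equations referenced above follow the standard sinusoidal form (assuming the app uses the original Transformer definition), where `pos` is the token position, `i` indexes the embedding dimension, and `d_model` is the embedding size:

```latex
PE_{(pos,\,2i)}   = \sin\!\left(\frac{pos}{10000^{2i/d_{\mathrm{model}}}}\right),
\qquad
PE_{(pos,\,2i+1)} = \cos\!\left(\frac{pos}{10000^{2i/d_{\mathrm{model}}}}\right)
```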
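
For readers who want to see the whole flow end to end, here is a minimal, self-contained sketch of the pipeline the app visualizes. It is illustrative only: the `cl100k_base` encoding, the 16-dimensional embedding size, and the seeded random "demo" embeddings are assumptions, not values taken from the app.

```python
# Minimal sketch of the pipeline the app demonstrates (not the app's actual code).
# Assumptions: tiktoken's "cl100k_base" encoding, d_model = 16, and per-token
# "demo" embeddings drawn from an RNG seeded with the token ID for reproducibility.
import numpy as np
import tiktoken

D_MODEL = 16  # embedding dimension (assumed for the demo)


def sinusoidal_pe(position: int, d_model: int) -> np.ndarray:
    """Standard sin/cos positional encoding for a single token position."""
    pe = np.zeros(d_model)
    div = np.power(10000.0, np.arange(0, d_model, 2) / d_model)
    pe[0::2] = np.sin(position / div)  # even dimensions -> sin
    pe[1::2] = np.cos(position / div)  # odd dimensions  -> cos
    return pe


def demo_embedding(token_id: int, d_model: int) -> np.ndarray:
    """Stand-in embedding: a reproducible random vector keyed on the token ID."""
    rng = np.random.default_rng(token_id)
    return rng.normal(size=d_model)


enc = tiktoken.get_encoding("cl100k_base")
text = "Transformers add position to meaning"
token_ids = enc.encode(text)

for pos, tok_id in enumerate(token_ids):
    token = enc.decode([tok_id])
    emb = demo_embedding(tok_id, D_MODEL)
    pe = sinusoidal_pe(pos, D_MODEL)
    final = emb + pe  # the per-token vector that would be fed to the attention heads
    print(f"{pos:>2} {token!r:>14} id={tok_id:<6} final[:4]={np.round(final[:4], 3)}")
```

Summing the two vectors per token mirrors the final "embeddings + positional encodings" view the app presents before the vectors reach the attention heads.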
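
The slider lock can be reproduced with Streamlit's `disabled` widget flag. The snippet below is one plausible sketch; the widget labels and ranges are illustrative and not taken from the app.

```python
# Sketch of a "locked sliders" pattern in Streamlit (illustrative, not the app's code).
# The `disabled` argument is a standard parameter of st.slider in recent releases.
import streamlit as st

lock = st.checkbox("Lock sliders (read-only)", value=True)

d_model = st.slider("Embedding dimension (d_model)",
                    min_value=4, max_value=64, value=16, step=4,
                    disabled=lock)
positions = st.slider("Token positions to display",
                      min_value=1, max_value=32, value=8,
                      disabled=lock)

st.write(f"d_model = {d_model}, positions shown = {positions}")
```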
This app is ideal for workshops, live demos, or self-study when you want a hands-on, visual understanding of how embeddings and positional information come together inside a transformer model.