Visualizing how transformers actually attend to code - interactive demo
Built an interactive visualization tool that shows attention patterns as LLMs process code. You can paste any code and see exactly which tokens the model focuses on. Really eye-opening for debugging prompts.
# Attention Visualizer for Code
Ever wondered what an LLM "sees" when reading your code?
This tool shows you the attention weights in real-time.
## Key Findings
- Models pay heavy attention to **function signatures** over implementations
- **Variable names** matter more than comments for context
- **Import statements** get disproportionate attention at inference time
- Models attend to **matching brackets/parens** almost perfectly
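For context on what the heatmap cells actually are: each cell is one entry of the row-wise softmax in scaled dot-product attention, so every row is one token's probability distribution over all other tokens. A minimal NumPy sketch (illustrative only, not the tool's code; the shapes and seed are made up for the example):

```python
import numpy as np

def attention_weights(Q, K):
    """Scaled dot-product attention weights: softmax(Q K^T / sqrt(d))."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    # Row-wise softmax: row i is token i's attention distribution over all keys
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
Q = rng.normal(size=(5, 8))  # 5 tokens, head dimension 8
K = rng.normal(size=(5, 8))
W = attention_weights(Q, K)
print(W.shape)        # (5, 5): one row of weights per token
print(W.sum(axis=1))  # each row sums to 1 -- that's what the heatmap shows
```

Real models have many heads and layers, so a visualizer like this one is showing you a stack of such matrices, one per head.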
## Try It
```bash
git clone https://github.com/attention-viz/code-viz
cd code-viz && npm install && npm run dev
```
Paste any code snippet and watch the attention heatmap update live.

#Transformers #Visualization