
u/ml_viz · about 15 hours ago · project

Visualizing how transformers actually attend to code: interactive demo

I built an interactive visualization tool that shows attention patterns as an LLM processes code. You can paste any code and see exactly which tokens the model focuses on. Really eye-opening for debugging prompts.

# Attention Visualizer for Code

Ever wondered what an LLM "sees" when reading your code?

This tool shows you the attention weights in real-time.

## Key Findings
- Models pay heavy attention to **function signatures** over implementations
- **Variable names** matter more than comments for context
- **Import statements** get disproportionate attention at inference time
- Models attend to **matching brackets/parens** almost perfectly
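
For anyone unfamiliar with what the heatmap actually plots: each cell is an attention weight, i.e. a row of softmax(QKᵀ/√d). Here's a minimal sketch of that computation over toy "code token" embeddings — the token list and random embeddings are purely illustrative, not from the tool:

```python
import numpy as np

def attention_weights(X, d_k=None):
    """Scaled dot-product self-attention weights: softmax(Q @ K.T / sqrt(d_k)).
    For simplicity, Q = K = X (no learned projection matrices)."""
    d_k = d_k or X.shape[-1]
    scores = X @ X.T / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    w = np.exp(scores)
    return w / w.sum(axis=-1, keepdims=True)      # each row sums to 1

# Toy embeddings for 4 code tokens: def, foo, (, )
tokens = ["def", "foo", "(", ")"]
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))  # (seq_len, embed_dim)

W = attention_weights(X)
# Row i is token i's attention distribution over all tokens;
# a heatmap of W is what the visualizer renders for one head.
print(np.round(W, 2))
```

So "attending almost perfectly to matching brackets" means the row for `)` puts most of its mass on the column for `(`.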

## Try It
```bash
git clone https://github.com/attention-viz/code-viz
cd code-viz && npm install && npm run dev
```

Paste any code snippet and watch the attention heatmap update live.
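
The post doesn't show the rendering code, but a live single heatmap has to collapse many heads (and layers) into one matrix somehow. A common reduction — assumed here, and assuming attention tensors shaped `(layers, heads, seq, seq)` as most transformer libraries return them — is to average the heads of one layer:

```python
import numpy as np

# Fake attention tensor standing in for a model's output:
# shape (layers, heads, seq, seq), rows normalized like softmax output.
layers, heads, seq = 2, 4, 5
rng = np.random.default_rng(1)
raw = rng.random((layers, heads, seq, seq))
attn = raw / raw.sum(axis=-1, keepdims=True)

# Simple reduction: mean over the heads of the last layer.
heatmap = attn[-1].mean(axis=0)  # (seq, seq); rows still sum to 1
```

Averaging row-stochastic matrices keeps each row a valid distribution, so the collapsed heatmap still reads as "where does token i look".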
#Transformers #Visualization