Upload 2 files

- Dockerfile +24 -13
- README_HF.md +49 -0

Dockerfile
CHANGED
@@ -1,13 +1,24 @@
-FROM python:3.11
-
-WORKDIR /code
-
-
-
-
-
-
-
-
-
-
+FROM python:3.11-slim
+
+WORKDIR /code
+
+# Install system dependencies
+RUN apt-get update && apt-get install -y --no-install-recommends \
+    build-essential \
+    && rm -rf /var/lib/apt/lists/*
+
+# Copy requirements
+COPY requirements.txt .
+
+# Install dependencies
+RUN pip install --no-cache-dir --upgrade pip && \
+    pip install --no-cache-dir -r requirements.txt
+
+# Copy application code
+COPY . .
+
+# Expose port
+EXPOSE 8000
+
+# Run application
+CMD ["shiny", "run", "app.py", "--host", "0.0.0.0", "--port", "8000"]
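For local testing, the updated Dockerfile can be built and run with standard Docker commands (the image tag `attention-atlas` is only an illustrative name, not one defined by this repo):

```shell
# Build the image from the repository root (tag is illustrative)
docker build -t attention-atlas .

# Run it, mapping the container's exposed port 8000 to the host;
# the app is then reachable at http://localhost:8000
docker run --rm -p 8000:8000 attention-atlas
```

Note that `app_port: 8000` in the README front matter must match the `EXPOSE`/`--port` values for the Space to route traffic correctly.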
README_HF.md
ADDED

@@ -0,0 +1,49 @@
+---
+title: Attention Atlas
+emoji: π
+colorFrom: pink
+colorTo: blue
+sdk: docker
+pinned: false
+license: mit
+short_description: Tool for exploring attention patterns, assessing bias, etc.
+app_port: 8000
+---
+
+# Attention Atlas π
+
+An interactive application for visualizing and exploring **Transformer architectures** (BERT, GPT-2) in detail, with a special focus on **multi-head attention patterns**, **head specializations**, **bias detection**, and **inter-sentence attention analysis**.
+
+## Overview
+
+Attention Atlas is an educational and analytical tool that lets you visually explore every component of the BERT and GPT-2 architectures:
+
+- **Token Embeddings & Positional Encodings**
+- **Q/K/V Projections** & **Scaled Dot-Product Attention**
+- **Multi-Head Attention** (interactive maps & flow)
+- **Head Specialization Radar** (syntax, semantics, etc.)
+- **Bias Detection** (token-level & attention interaction)
+- **Token Influence Tree** (hierarchical dependencies)
+- **Inter-Sentence Attention (ISA)**
+
+## Features
+
+- **Interactive Visualizations**: powered by Plotly and D3.js.
+- **Real-Time Inference**: uses a PyTorch backend to run BERT/GPT-2 models on the fly.
+- **Bias Analysis**: detects generalizations, stereotypes, and unfair language, and analyzes how attention mechanisms process them.
+- **Full Architecture Explorer**: inspect every layer, head, and residual connection.
+
+## Technologies
+
+- **Shiny for Python**
+- **Transformers (Hugging Face)**
+- **PyTorch**
+- **Plotly**
+
+## Usage
+
+Enter a sentence in the input box, select a model (BERT or GPT-2), and click **Generate** / **Analyze Bias**.
+
+---
+
+*Part of a Master's thesis on Interpretable Large Language Models.*
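As a rough illustration of the scaled dot-product attention the README's Overview lists, here is a from-scratch sketch on plain Python lists. This is not the app's actual code (which runs real BERT/GPT-2 models via PyTorch); the function names are illustrative only.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q, K, V are lists of row vectors (one per token); d_k is the
    key dimension. Returns (outputs, attention_weights); the
    weights matrix is what attention-map visualizations plot.
    """
    d_k = len(K[0])
    outputs, weights_all = [], []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        weights_all.append(weights)
        # Output = attention-weighted sum of the value vectors
        outputs.append([sum(w * v[j] for w, v in zip(weights, V))
                        for j in range(len(V[0]))])
    return outputs, weights_all
```

Each row of the returned weight matrix sums to 1, so it can be read as a distribution over which tokens a given query token "attends" to; that distribution is exactly what an attention heatmap renders.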