---
title: "GPT Transformer Text Generator"
emoji: "🤖"
colorFrom: "blue"
colorTo: "green"
sdk: "gradio"
sdk_version: "3.0.0"
app_file: "app.py"
pinned: false
---

# GPT Transformer Model

This repository contains a GPT-like transformer model built with PyTorch for natural language generation. The model follows the architecture introduced in GPT-2 and has been trained on a custom dataset for text generation.

## Model Overview

The model is a multi-layer transformer-based neural network composed of the following components (a minimal sketch of how they fit together follows the list):

- **Causal Self-Attention:** Masked multi-head self-attention in which each token attends only to itself and earlier positions, enforcing left-to-right generation.
- **MLP (Feedforward Layer):** A position-wise feedforward network applied in every transformer block, letting the model capture non-linear relationships.
- **Layer Normalization:** Applied before the attention and feedforward sublayers (pre-norm) to stabilize training.
- **Embedding Layers:** Learned token embeddings over the vocabulary plus learned positional embeddings for token order.
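
As a minimal sketch of how these components compose into one transformer block, assuming the standard GPT-2 pre-norm layout (class names and details are illustrative, not necessarily this repository's exact code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    """Multi-head self-attention with a causal (left-to-right) mask."""
    def __init__(self, n_embd: int, n_head: int):
        super().__init__()
        assert n_embd % n_head == 0
        self.n_head = n_head
        self.qkv = nn.Linear(n_embd, 3 * n_embd)  # joint Q, K, V projection
        self.proj = nn.Linear(n_embd, n_embd)     # output projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape
        q, k, v = self.qkv(x).split(C, dim=2)
        # reshape each to (B, n_head, T, head_dim)
        q = q.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        k = k.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        v = v.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        # is_causal=True masks attention to future positions (needs PyTorch 2.x)
        y = F.scaled_dot_product_attention(q, k, v, is_causal=True)
        y = y.transpose(1, 2).contiguous().view(B, T, C)
        return self.proj(y)

class MLP(nn.Module):
    """Position-wise feedforward layer with the usual 4x expansion."""
    def __init__(self, n_embd: int):
        super().__init__()
        self.fc = nn.Linear(n_embd, 4 * n_embd)
        self.proj = nn.Linear(4 * n_embd, n_embd)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.proj(F.gelu(self.fc(x)))

class Block(nn.Module):
    """Pre-norm block: LayerNorm before attention and before the MLP."""
    def __init__(self, n_embd: int, n_head: int):
        super().__init__()
        self.ln_1 = nn.LayerNorm(n_embd)
        self.attn = CausalSelfAttention(n_embd, n_head)
        self.ln_2 = nn.LayerNorm(n_embd)
        self.mlp = MLP(n_embd)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x + self.attn(self.ln_1(x))  # residual connection around attention
        x = x + self.mlp(self.ln_2(x))   # residual connection around MLP
        return x
```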
### Architecture
- **Embedding Dimension (`n_embd`)**: 768
- **Number of Attention Heads (`n_head`)**: 12
- **Number of Layers (`n_layer`)**: 12
- **Vocabulary Size (`vocab_size`)**: 50,257
- **Max Sequence Length (`block_size`)**: 1024
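
These values match the GPT-2 "small" configuration. Collected as a configuration object (the dataclass name is illustrative):

```python
from dataclasses import dataclass

@dataclass
class GPTConfig:
    # Hyperparameters listed above (GPT-2 "small" sizing)
    block_size: int = 1024   # max sequence length
    vocab_size: int = 50257  # GPT-2 BPE vocabulary size
    n_layer: int = 12        # number of transformer blocks
    n_head: int = 12         # attention heads per block
    n_embd: int = 768        # embedding / hidden dimension
```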
The model is trained for text generation and can be fine-tuned on custom data.
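
For illustration, autoregressive sampling with such a model typically looks like the sketch below. It assumes the GPT-2 `tiktoken` encoding and a model whose forward pass returns logits of shape `(batch, sequence, vocab_size)`; the `generate` helper is an assumption for this example, not the repository's API:

```python
import torch
import tiktoken

enc = tiktoken.get_encoding("gpt2")  # GPT-2 BPE tokenizer

@torch.no_grad()
def generate(model, prompt: str, max_new_tokens: int = 50, top_k: int = 50) -> str:
    """Sample one token at a time, feeding each back into the model."""
    idx = torch.tensor([enc.encode(prompt)], dtype=torch.long)
    for _ in range(max_new_tokens):
        idx_cond = idx[:, -1024:]               # crop context to block_size
        logits = model(idx_cond)[:, -1, :]      # logits for the last position
        v, _ = torch.topk(logits, top_k)
        logits[logits < v[:, [-1]]] = -float("inf")  # keep only top-k tokens
        probs = torch.softmax(logits, dim=-1)
        next_id = torch.multinomial(probs, num_samples=1)
        idx = torch.cat([idx, next_id], dim=1)  # append and continue
    return enc.decode(idx[0].tolist())
```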
## Requirements

To run the model and perform inference, you will need the following dependencies:
- Python 3.7+
- PyTorch
- Gradio
- Transformers
- tiktoken (GPT-2 BPE tokenizer)

You can install the required libraries using:
```bash
pip install torch gradio transformers tiktoken
```
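
Since the Space metadata above declares `app_file: "app.py"` with the Gradio SDK, the serving app plausibly looks something like this minimal sketch; `load_model` and `generate` are hypothetical stand-ins for the repository's own loading and sampling code:

```python
import gradio as gr

# Hypothetical helpers standing in for the repository's own code;
# the import path is illustrative, not a real module in this repo.
from model import load_model, generate

model = load_model()

def complete(prompt: str, max_new_tokens: int) -> str:
    return generate(model, prompt, max_new_tokens=int(max_new_tokens))

demo = gr.Interface(
    fn=complete,
    inputs=[
        gr.Textbox(label="Prompt"),
        gr.Slider(1, 200, value=50, step=1, label="Max new tokens"),
    ],
    outputs=gr.Textbox(label="Generated text"),
    title="GPT Transformer Text Generator",
)

if __name__ == "__main__":
    demo.launch()
```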