# Reddit Text Generation Model

A GPT-2-based model fine-tuned on Reddit posts to generate Reddit-style content.
## Model Details
- Architecture: GPT-2 small (~124M parameters)
- Training Data: Reddit posts with 1000+ upvotes
- Use Cases: Generate Reddit-style text, social media content
- Training Framework: HuggingFace Transformers + DeepSpeed
## Usage
```python
from transformers import GPT2LMHeadModel, GPT2TokenizerFast
import torch

# Load model and tokenizer
model_name = "chimcis/reddit-text-model"
tokenizer = GPT2TokenizerFast.from_pretrained(model_name)
model = GPT2LMHeadModel.from_pretrained(model_name)

# Generate text
prompt = "What I learned after 5 years of programming:"
inputs = tokenizer.encode(prompt, return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(
        inputs,
        max_length=200,
        temperature=0.8,
        do_sample=True,
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )

generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(generated_text)
```
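The `top_p=0.9` argument enables nucleus sampling: at each step, sampling is restricted to the smallest set of tokens whose cumulative probability exceeds 0.9. A minimal sketch of that filtering step, written independently of the model (the `top_p_filter` helper is illustrative, not part of this repository):

```python
import torch

def top_p_filter(logits: torch.Tensor, top_p: float) -> torch.Tensor:
    """Mask out tokens outside the smallest set whose cumulative
    probability exceeds top_p (nucleus sampling)."""
    sorted_logits, sorted_indices = torch.sort(logits, descending=True)
    cumulative = torch.softmax(sorted_logits, dim=-1).cumsum(dim=-1)
    # Flag tokens past the nucleus, shifting by one so the token
    # that crosses the threshold is still kept
    remove = cumulative > top_p
    remove[..., 1:] = remove[..., :-1].clone()
    remove[..., 0] = False
    # Map the flags back to the original (unsorted) token positions
    mask = remove.scatter(-1, sorted_indices, remove)
    return logits.masked_fill(mask, float("-inf"))
```

Passing the filtered logits through `torch.softmax` and `torch.multinomial` then draws the next token from the nucleus only.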
## Training Details
- Training Steps: 50,000
- Batch Size: 64 (global)
- Learning Rate: 1e-4
- Hardware: 4x NVIDIA H100 80GB
- Training Time: ~5 hours
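The exact DeepSpeed configuration is not published. A plausible sketch consistent with the numbers above, assuming a per-GPU micro-batch of 8 with 2 gradient-accumulation steps across the 4 GPUs (8 × 2 × 4 = 64 global), ZeRO stage 2, and bf16 on H100s; all of these choices beyond the global batch size and learning rate are assumptions:

```json
{
  "train_batch_size": 64,
  "train_micro_batch_size_per_gpu": 8,
  "gradient_accumulation_steps": 2,
  "bf16": { "enabled": true },
  "zero_optimization": { "stage": 2 },
  "optimizer": {
    "type": "AdamW",
    "params": { "lr": 1e-4 }
  }
}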
## Limitations
- Model generates general Reddit-style content
- May produce inconsistent or off-topic text
- Should not be used for harmful content generation
## Citation
```bibtex
@misc{reddit-text-model,
  author    = {Reddit Model},
  title     = {Reddit Text Generation Model},
  year      = {2025},
  publisher = {Hugging Face},
  url       = {https://huggingface.co/chimcis/reddit-text-model}
}
```
## Base Model

- [openai-community/gpt2](https://huggingface.co/openai-community/gpt2)