---
license: mit
base_model: gpt2
tags:
  - text-generation
  - reddit
  - social-media
  - gpt2
language:
  - en
datasets:
  - custom
widget:
  - text: 'What I learned after 5 years of programming:'
  - text: 'TIL that'
  - text: 'DAE think that'
---

# Reddit Text Generation Model

This is a GPT-2 based model fine-tuned on Reddit posts to generate Reddit-style content.

## Model Details

- Architecture: GPT-2 (125M parameters)
- Training Data: Reddit posts with 1000+ upvotes
- Use Cases: Generate Reddit-style text, social media content
- Training Framework: HuggingFace Transformers + DeepSpeed

## Usage

```python
from transformers import GPT2LMHeadModel, GPT2TokenizerFast
import torch

# Load model and tokenizer
model_name = "chimcis/reddit-text-model"
tokenizer = GPT2TokenizerFast.from_pretrained(model_name)
model = GPT2LMHeadModel.from_pretrained(model_name)
model.eval()

# Tokenize the prompt; calling the tokenizer directly also returns the
# attention mask, which generate() expects alongside the input IDs
prompt = "What I learned after 5 years of programming:"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_length=200,
        do_sample=True,
        temperature=0.8,
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )

generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(generated_text)
```
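The `top_p=0.9` argument enables nucleus sampling: at each step, only the smallest set of tokens whose probabilities sum to at least 0.9 is kept, and the next token is sampled from that renormalized set. A minimal plain-Python sketch of the filtering step (illustrative only, not the actual `transformers` implementation):

```python
def nucleus_filter(probs, top_p=0.9):
    """Keep the smallest set of tokens whose cumulative probability
    reaches top_p; drop the rest and renormalize the survivors."""
    # Rank token probabilities from highest to lowest, remembering indices
    ranked = sorted(enumerate(probs), key=lambda kv: kv[1], reverse=True)
    kept, cumulative = {}, 0.0
    for idx, p in ranked:
        kept[idx] = p
        cumulative += p
        if cumulative >= top_p:
            break  # the nucleus is complete
    total = sum(kept.values())
    return {idx: p / total for idx, p in kept.items()}

# Toy next-token distribution over 4 tokens: the two least likely
# tokens fall outside the 0.9 nucleus and are never sampled
filtered = nucleus_filter([0.5, 0.3, 0.15, 0.05], top_p=0.9)
print(filtered)
```

Lower `top_p` values make output more conservative; combined with `temperature=0.8`, this trades some diversity for coherence.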

## Training Details

- Training Steps: 50,000
- Batch Size: 64 (global)
- Learning Rate: 1e-4
- Hardware: 4x NVIDIA H100 80GB
- Training Time: ~5 hours
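A global batch size is the product of the per-device batch, gradient-accumulation steps, and GPU count. The split below (16 per device, no accumulation) is an assumption for illustration; the card only publishes the global figure:

```python
# Hypothetical decomposition of the stated global batch size of 64.
per_device_batch = 16   # assumed micro-batch per GPU (not published)
grad_accum_steps = 1    # assumed gradient accumulation (not published)
num_gpus = 4            # 4x NVIDIA H100, from the card

global_batch = per_device_batch * grad_accum_steps * num_gpus
print(global_batch)  # 64

# At 50,000 steps, the model saw 3.2M training sequences in total
training_steps = 50_000
sequences_seen = training_steps * global_batch
print(sequences_seen)  # 3200000
```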

## Limitations

- Generates general Reddit-style content and may not match the tone of any specific subreddit
- May produce inconsistent, repetitive, or off-topic text
- Should not be used to generate harmful or misleading content

## Citation

```bibtex
@misc{reddit-text-model,
  author = {Reddit Model},
  title = {Reddit Text Generation Model},
  year = {2025},
  publisher = {Hugging Face},
  url = {https://huggingface.co/chimcis/reddit-text-model}
}
```