---
license: mit
language:
  - en
pipeline_tag: text-generation
tags:
  - phi
  - nlp
  - math
  - code
  - chat
  - conversational
inference:
  parameters:
    temperature: 0
widget:
  - messages:
      - role: user
        content: How should I explain the Internet?
library_name: transformers
---

# Phi-4 Model Card

[Phi-4 Technical Report](https://arxiv.org/pdf/2412.08905)

## Model Summary

| | |
|---|---|
| **Developed by** | Microsoft |
| **Description** | phi-4 is a state-of-the-art open model built upon a blend of synthetic datasets, data from filtered public domain websites, and acquired academic books and Q&A datasets. The goal of this approach was to ensure that small capable models were trained with data focused on high quality and advanced reasoning.<br><br>phi-4 underwent a rigorous enhancement and alignment process, incorporating both supervised fine-tuning and direct preference optimization to ensure precise instruction adherence and robust safety measures. |
| **Architecture** | 14B parameters, dense decoder-only Transformer model |
| **Inputs** | Text, best suited for prompts in the chat format |
| **Context length** | 16K tokens |
| **GPUs** | 1920 H100-80G |
| **Training time** | 21 days |
| **Training data** | 9.8T tokens |
| **Outputs** | Generated text in response to input |
| **Dates** | October 2024 – November 2024 |
| **Status** | Static model trained on an offline dataset with cutoff dates of June 2024 and earlier for publicly available data |
| **Release date** | March 17, 2025 |
| **License** | MIT |
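
The summary above notes that the model is a decoder-only text-generation model served through the transformers library and is best suited to chat-format prompts. The snippet below is a minimal usage sketch under those assumptions; the repository id `microsoft/phi-4` and generation settings such as `max_new_tokens` are illustrative placeholders, not values taken from this card.

```python
# Minimal sketch: chat-format prompting through a transformers
# text-generation pipeline (per `library_name` and `pipeline_tag`
# in the metadata above).
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="microsoft/phi-4",  # assumed repo id, not confirmed by this card
    model_kwargs={"torch_dtype": "auto"},
    device_map="auto",
)

# Chat-format input, mirroring the widget example in the metadata.
messages = [
    {"role": "user", "content": "How should I explain the Internet?"},
]

# `temperature: 0` in the inference settings corresponds to greedy
# decoding, i.e. do_sample=False.
outputs = generator(messages, max_new_tokens=256, do_sample=False)
print(outputs[0]["generated_text"][-1]["content"])
```

Setting `do_sample=False` gives deterministic greedy decoding, matching the `temperature: 0` inference parameter declared in the metadata.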