---
license: mit
language:
- en
pipeline_tag: text-generation
---
# OpenSML: A Family of Open Small Language Models (coming soon)
William Zebrowski
We introduce OpenSML, a series of Open SMaLl Language Models. The model architectures are built entirely with Apple's MLX framework.
The pre-training dataset is a slice of the OpenWebText dataset containing approximately 2.3 billion tokens.
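Once checkpoints are released, one natural way to run them would be through the `mlx-lm` package, which loads MLX-format models and tokenizers from the Hugging Face Hub. The sketch below is illustrative only; the repository id is a placeholder, and the models are not yet available.

```python
# A minimal usage sketch, assuming the checkpoints are published in an
# MLX-compatible format and loadable with the mlx-lm package.
# The repo id "wzebrowski/opensml-base" is hypothetical; the models are not yet released.
from mlx_lm import load, generate

# Load model weights and tokenizer from a (hypothetical) Hugging Face repo.
model, tokenizer = load("wzebrowski/opensml-base")

# Generate a short continuation for a prompt.
prompt = "Small language models are useful because"
text = generate(model, tokenizer, prompt=prompt, max_tokens=64)
print(text)
```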
## Bias, Risks, and Limitations
OpenSML is shared to advance open research by granting access to cutting-edge language models. However, because it is trained on publicly sourced data and released without safety warranties, it may produce content that is inaccurate, harmful, biased, or otherwise objectionable. Users and developers should therefore conduct rigorous safety evaluations and put in place filtering or other safeguards that suit their specific use cases.
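As one deliberately simplistic illustration of such a safeguard, a downstream application could screen generated text against a blocklist before surfacing it to users. The sketch below is an assumption about how an integrator might start; it is not part of OpenSML, and real deployments should rely on proper safety classifiers and human evaluation.

```python
# A minimal, hypothetical post-generation filter. Not part of OpenSML;
# the blocklist terms below are placeholders.
BLOCKLIST = {"example_slur", "example_unsafe_term"}

def passes_basic_filter(text: str) -> bool:
    """Reject generations containing any blocklisted term (case-insensitive)."""
    lowered = text.lower()
    return not any(term in lowered for term in BLOCKLIST)

output = "Small language models are useful because ..."
if passes_basic_filter(output):
    print(output)
else:
    print("[generation withheld by filter]")
```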
## Citation
If you find our work useful, please cite:
@misc{zebrowski2025opensml,
  title={OpenSML: A Family of Small Language Models},
  author={William Zebrowski},
  year={2025},
  howpublished={\url{https://github.com/wzebrowski/opensml}}
}