Improve model card with Hugging Face paper link

#1
by nielsr (HF Staff) · opened
Files changed (1)
README.md +8 -6
README.md CHANGED
@@ -1,18 +1,20 @@
  ---
- license: apache-2.0
  language:
  - zh
  - en
- pipeline_tag: text-generation
  library_name: transformers
+ license: apache-2.0
+ pipeline_tag: text-generation
  ---
+
  <div align="center">
  <img src="https://github.com/OpenBMB/MiniCPM/blob/main/assets/minicpm_logo.png?raw=true" width="500em" ></img>
  </div>

  <p align="center">
  <a href="https://github.com/OpenBMB/MiniCPM/" target="_blank">GitHub Repo</a> |
- <a href="https://github.com/OpenBMB/MiniCPM/tree/main/report/MiniCPM_4_Technical_Report.pdf" target="_blank">Technical Report</a>
+ <a href="https://github.com/OpenBMB/MiniCPM/tree/main/report/MiniCPM_4_Technical_Report.pdf" target="_blank">Technical Report</a> |
+ <a href="https://huggingface.co/papers/2506.07900" target="_blank">Hugging Face Paper</a>
  </p>
  <p align="center">
  ๐Ÿ‘‹ Join us on <a href="https://discord.gg/3cGQn9b3YM" target="_blank">Discord</a> and <a href="https://github.com/OpenBMB/MiniCPM/blob/main/assets/wechat.jpg" target="_blank">WeChat</a>
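For readability, this is the card's metadata block as it reads once the hunk above is applied (assembled purely from the diff lines; no fields beyond those shown):

```yaml
---
language:
- zh
- en
library_name: transformers
license: apache-2.0
pipeline_tag: text-generation
---
```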
@@ -45,7 +47,7 @@ BitCPM4 are ternary quantized models derived from the MiniCPM series models thro
  ## Usage
  ### Inference with Transformers
  BitCPM4's parameters are stored in a fake-quantized format, which supports direct inference within the Huggingface framework.
- ```
+ ```python
  from transformers import AutoModelForCausalLM, AutoTokenizer
  import torch

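The hunk only shows the first two lines of the card's snippet. For context, a minimal sketch of what fake-quantized BitCPM4 inference through plain Transformers typically looks like follows; the repo id `openbmb/BitCPM4-1B`, the `trust_remote_code=True` flag, and the chat-template call are assumptions for illustration, not quoted from the card:

```python
# Minimal sketch, not the card's full snippet: plain Transformers inference
# on a fake-quantized BitCPM4 checkpoint (weights are stored at full width,
# so no special low-bit kernel is needed).
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

path = "openbmb/BitCPM4-1B"  # assumed repo id; substitute the actual checkpoint
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(path, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    path,
    torch_dtype=torch.bfloat16,  # fake-quantized values load as ordinary tensors
    trust_remote_code=True,      # MiniCPM-family repos typically ship custom code
).to(device)

# Build a single-turn prompt with the tokenizer's chat template.
messages = [{"role": "user", "content": "Explain ternary quantization in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```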
 
@@ -89,7 +91,7 @@ BitCPM4's performance is comparable with other full-precision models in same mod
  - This repository and MiniCPM models are released under the [Apache-2.0](https://github.com/OpenBMB/MiniCPM/blob/main/LICENSE) License.

  ## Citation
- - Please cite our [paper](https://github.com/OpenBMB/MiniCPM/tree/main/report/MiniCPM_4_Technical_Report.pdf) if you find our work valuable.
+ - Please cite our [paper](https://huggingface.co/papers/2506.07900) if you find our work valuable.

  ```bibtex
  @article{minicpm4,
@@ -97,4 +99,4 @@ BitCPM4's performance is comparable with other full-precision models in same mod
  author={MiniCPM Team},
  year={2025}
  }
- ```
+ ```