moelanoby committed
Commit 968db96 · verified · 1 Parent(s): f39f7ce

Update README.md

Files changed (1):
1. README.md +2 -4
README.md CHANGED
@@ -1,4 +1,3 @@
-
 ---
 language:
 - ar
@@ -14,7 +13,6 @@ base_model:
 library_name: transformers
 tags:
 - code
-- open-source
 ---
 # M3-V2: An Open Source Model for State-of-the-Art Code Generation
 
@@ -60,7 +58,7 @@ This model is licensed under the **Apache 2.0 License**. You are free to use, mo
 
 ## Ethical Considerations
 
-While this model is open source, users are encouraged to use it responsibly. Finetuning the model to generate harmful, illegal, or unethical content is strongly discouraged. We advocate for the use of this technology to build positive and safe applications.
+While this model is open source, users are encouraged to use it responsibly. Finetuning the model to generate harmful, illegal, or unethical content is strongly discouraged. I advocate for the use of this technology to build positive and safe applications.
 
 ---
 
@@ -125,4 +123,4 @@ except AttributeError:
 
 - The base of this model utilizes the **Phi-3** architecture developed by Microsoft.
 - The benchmark results were obtained using the **HumanEval** dataset from OpenAI.
-- We thank the open-source community for their continuous contributions to AI research.
+- I thank the open-source community for their continuous contributions to AI research.
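The README frontmatter touched by this commit declares `library_name: transformers`, and the acknowledgements mention a **Phi-3** base, so the model is presumably loaded through the Hugging Face `transformers` API. The following is a minimal sketch only: the repo id `moelanoby/M3-V2` and the `trust_remote_code=True` flag are assumptions inferred from the committer name and the custom-architecture handling hinted at by the README's `except AttributeError:` block, not details confirmed by this diff.

```python
# Minimal loading sketch, not the repository's documented usage.
# Assumptions: the repo id "moelanoby/M3-V2" and trust_remote_code=True
# (for a customized Phi-3-based architecture) are inferred, not confirmed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "moelanoby/M3-V2"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "Write a Python function that checks whether a number is prime."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```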