chansung committed on
Commit 7ecefbc · 1 Parent(s): 10d8562

Update strings.py

Files changed (1): strings.py (+1 -2)
strings.py CHANGED
@@ -1,7 +1,7 @@
 TITLE = "Alpaca-LoRA Playground"
 
 ABSTRACT = """
-Thanks to [tolen](https://github.com/tloen/alpaca-lora), this application runs Alpaca-LoRA which is instruction fine-tuned version of [LLaMA](https://ai.facebook.com/blog/large-language-model-llama-meta-ai/). This demo currently runs 30B version on a 3*A6000 instance at [Jarvislabs.ai](https://jarvislabs.ai/).
+Thanks to [tolen](https://github.com/tloen/alpaca-lora), this application runs Alpaca-LoRA which is instruction fine-tuned version of [LLaMA](https://ai.facebook.com/blog/large-language-model-llama-meta-ai/). This demo currently runs 8Bit 13B version on a A10 instance.
 
 NOTE: too long input (context, instruction) will not be allowed. Please keep them < 150
 """
@@ -11,7 +11,6 @@ This demo application runs the open source project, [Alpaca-LoRA-Serve](https://
 
 Alpaca-LoRA is built on the same concept as Standford Alpaca project, but it lets us train and inference on a smaller GPUs such as RTX4090 for 7B version. Also, we could build very small size of checkpoints on top of base models thanks to [🤗 transformers](https://huggingface.co/docs/transformers/index), [🤗 peft](https://github.com/huggingface/peft), and [bitsandbytes](https://github.com/TimDettmers/bitsandbytes/tree/main) libraries.
 
-We are thankful to the [Jarvislabs.ai](https://jarvislabs.ai/) who generously provided free GPU instances.
 """
 
 DEFAULT_EXAMPLES = {
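The changed strings mention that LoRA adapter checkpoints are very small relative to the base model. A minimal sketch of why, in plain Python with hypothetical toy dimensions (this is an illustration of the LoRA idea, not the actual Alpaca-LoRA or peft code):

```python
# LoRA in miniature: instead of updating the full frozen weight W
# (d_out x d_in), train two small matrices B (d_out x r) and A (r x d_in)
# and use W + B @ A at inference. The saved adapter holds only A and B,
# which is why the checkpoint is a tiny fraction of the full layer.

def matmul(X, Y):
    """Plain-Python matrix multiply, enough for a sketch."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

def add(X, Y):
    """Element-wise matrix addition."""
    return [[a + b for a, b in zip(rx, ry)] for rx, ry in zip(X, Y)]

# Toy sizes; a real LLaMA layer is closer to d = 4096 with r around 8-16.
d_out, d_in, r = 4, 4, 1

# Frozen base weight (identity here, just for the example).
W = [[1.0 if i == j else 0.0 for j in range(d_in)] for i in range(d_out)]
B = [[0.5] for _ in range(d_out)]   # d_out x r, trained
A = [[0.1] * d_in]                  # r x d_in, trained

# Effective weight applied at inference time.
W_eff = add(W, matmul(B, A))

full_params = d_out * d_in              # what a full fine-tune would store
lora_params = d_out * r + r * d_in      # what the adapter checkpoint stores
```

With realistic sizes the gap is dramatic: for d = 4096 and r = 16, the adapter holds 2 × 4096 × 16 values against 4096² for the full matrix, under 1% of the layer's parameters.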