Space: FallnAI/LLM-Inference (duplicated from huggingface/inference-playground)
Path: LLM-Inference/src/lib/components/InferencePlayground (at commit cefaaf5)
5 contributors · History: 105 commits

Latest commit by mishig (HF Staff): "Rm advanced options for config" (cefaaf5, about 1 year ago)
| File | Size | Last commit message | Last updated |
| --- | --- | --- | --- |
| InferencePlayground.svelte | 9.84 kB | make tokens count working for non-streaming as well | about 1 year ago |
| InferencePlaygroundCodeSnippets.svelte | 8.64 kB | Better escape of contents of msg | about 1 year ago |
| InferencePlaygroundConversation.svelte | 1.61 kB | wip | about 1 year ago |
| InferencePlaygroundGenerationConfig.svelte | 2.09 kB | Rm advanced options for config | about 1 year ago |
| InferencePlaygroundHFTokenModal.svelte | 4.26 kB | wip | about 1 year ago |
| InferencePlaygroundMessage.svelte | 1.54 kB | order imports | about 1 year ago |
| InferencePlaygroundModelSelector.svelte | 2.05 kB | types file | about 1 year ago |
| InferencePlaygroundModelSelectorModal.svelte | 3.57 kB | make search working | about 1 year ago |
| generationConfigSettings.ts | 933 Bytes | Rm advanced options for config | about 1 year ago |
| inferencePlaygroundUtils.ts | 2.16 kB | make tokens count working for non-streaming as well | about 1 year ago |
| types.ts | 607 Bytes | System message as part of Conversation | about 1 year ago |