Space: FallnAI / LLM-Inference (duplicated from huggingface/inference-playground)
Status: Sleeping
Path: LLM-Inference / src / lib / components / InferencePlayground
5 contributors · History: 110 commits
Latest commit: d6c1b24 ("padding") by victor (HF Staff), about 1 year ago
All files are marked Safe.

File                                          Size       Last commit message                                   Last updated
InferencePlayground.svelte                    11.4 kB    padding                                               about 1 year ago
InferencePlaygroundCodeSnippets.svelte        8.65 kB    padding                                               about 1 year ago
InferencePlaygroundConversation.svelte        1.61 kB    wip                                                   about 1 year ago
InferencePlaygroundGenerationConfig.svelte    2.08 kB    handle when /api/model err                            about 1 year ago
InferencePlaygroundHFTokenModal.svelte        4.26 kB    wip                                                   about 1 year ago
InferencePlaygroundMessage.svelte             1.54 kB    order imports                                         about 1 year ago
InferencePlaygroundModelSelector.svelte       2.07 kB    quick fixes                                           about 1 year ago
InferencePlaygroundModelSelectorModal.svelte  3.6 kB     quick fixes                                           about 1 year ago
generationConfigSettings.ts                   933 Bytes  Rm advanced options for config                        about 1 year ago
inferencePlaygroundUtils.ts                   2.16 kB    make tokens count working for non-streaming as well   about 1 year ago
types.ts                                      607 Bytes  System message as part of Conversation                about 1 year ago