Hugging Face Space: FallnAI/LLM-Inference
Duplicated from huggingface/inference-playground
Likes: 0 · Status: Sleeping
Path: LLM-Inference/src/lib/components/InferencePlayground
5 contributors · History: 115 commits
Latest commit: da972f7 by mishig (HF Staff), "save hf token for later", 10 months ago
File                                          Size       Last commit message                                    Age
InferencePlayground.svelte                    11.8 kB    save hf token for later                                10 months ago
InferencePlaygroundCodeSnippets.svelte        9.44 kB    snippets "showToken" feature                           10 months ago
InferencePlaygroundConversation.svelte        1.64 kB    snippets "showToken" feature                           10 months ago
InferencePlaygroundGenerationConfig.svelte    2.08 kB    handle when /api/model err                             11 months ago
InferencePlaygroundHFTokenModal.svelte        4.81 kB    save hf token for later                                10 months ago
InferencePlaygroundMessage.svelte             1.54 kB    order imports                                          11 months ago
InferencePlaygroundModelSelector.svelte       2.07 kB    quick fixes                                            11 months ago
InferencePlaygroundModelSelectorModal.svelte  3.62 kB    Model selector w-full                                  10 months ago
generationConfigSettings.ts                   933 Bytes  Rm advanced options for config                         11 months ago
inferencePlaygroundUtils.ts                   2.16 kB    make tokens count working for non-streaming as well    11 months ago
types.ts                                      607 Bytes  System message as part of Conversation                 11 months ago