Add 2 files
- index.html +200 -0
- prompts.txt +2 -1
index.html
CHANGED
@@ -20,6 +20,10 @@
         .pulse-animation {
             animation: pulse 2s infinite;
         }
+        .memory-object {
+            background: linear-gradient(145deg, #ffffff, #f5f3ff);
+            box-shadow: 0 4px 6px rgba(0, 0, 0, 0.05);
+        }
         @keyframes pulse {
             0% { opacity: 0.6; }
             50% { opacity: 1; }
@@ -30,6 +34,21 @@
             background: linear-gradient(white, white) padding-box,
                         linear-gradient(to right, #6366f1, #8b5cf6) border-box;
         }
+        .processing-step {
+            position: relative;
+        }
+        .processing-step:not(:last-child):after {
+            content: "";
+            position: absolute;
+            right: -20px;
+            top: 50%;
+            transform: translateY(-50%);
+            width: 0;
+            height: 0;
+            border-top: 10px solid transparent;
+            border-bottom: 10px solid transparent;
+            border-left: 10px solid #8b5cf6;
+        }
     </style>
 </head>
 <body class="bg-gray-50 min-h-screen">
@@ -152,6 +171,95 @@
             </div>
         </div>
     </div>
+
+    <!-- AI Processing Demo Section -->
+    <div class="mt-12 bg-white rounded-xl shadow-md overflow-hidden">
+        <div class="bg-gradient-to-r from-indigo-600 to-purple-600 px-6 py-4">
+            <h3 class="text-xl font-semibold text-white flex items-center">
+                <i class="fas fa-microchip mr-3"></i> AI Processing Demonstration
+            </h3>
+        </div>
+        <div class="p-6">
+            <p class="text-gray-600 mb-6">
+                This section demonstrates how AuraRecall processes your memory fragments to create structured memory objects.
+            </p>
+
+            <div class="grid grid-cols-1 md:grid-cols-3 gap-4 mb-8">
+                <div class="processing-step bg-indigo-50 p-4 rounded-lg flex flex-col items-center">
+                    <div class="w-12 h-12 rounded-full bg-indigo-100 flex items-center justify-center mb-3">
+                        <i class="fas fa-file-audio text-indigo-600"></i>
+                    </div>
+                    <h4 class="font-medium text-indigo-800 text-center">Audio Transcription</h4>
+                    <p class="text-sm text-gray-600 text-center mt-1">Convert speech to text</p>
+                    <div id="transcription-result" class="mt-3 text-sm text-gray-700 hidden"></div>
+                </div>
+
+                <div class="processing-step bg-purple-50 p-4 rounded-lg flex flex-col items-center">
+                    <div class="w-12 h-12 rounded-full bg-purple-100 flex items-center justify-center mb-3">
+                        <i class="fas fa-image text-purple-600"></i>
+                    </div>
+                    <h4 class="font-medium text-purple-800 text-center">Visual Analysis</h4>
+                    <p class="text-sm text-gray-600 text-center mt-1">Detect objects and faces</p>
+                    <div id="visual-result" class="mt-3 text-sm text-gray-700 hidden"></div>
+                </div>
+
+                <div class="processing-step bg-blue-50 p-4 rounded-lg flex flex-col items-center">
+                    <div class="w-12 h-12 rounded-full bg-blue-100 flex items-center justify-center mb-3">
+                        <i class="fas fa-link text-blue-600"></i>
+                    </div>
+                    <h4 class="font-medium text-blue-800 text-center">Context Linking</h4>
+                    <p class="text-sm text-gray-600 text-center mt-1">Connect related elements</p>
+                    <div id="linking-result" class="mt-3 text-sm text-gray-700 hidden"></div>
+                </div>
+            </div>
+
+            <div id="memory-object-demo" class="hidden">
+                <h4 class="font-medium text-gray-700 mb-3 flex items-center">
+                    <i class="fas fa-cube text-indigo-500 mr-2"></i> Generated Memory Object
+                </h4>
+                <div class="memory-object p-4 rounded-lg border border-gray-200">
+                    <div class="grid grid-cols-1 md:grid-cols-3 gap-4">
+                        <div class="bg-indigo-50 p-3 rounded">
+                            <h5 class="font-medium text-indigo-700 mb-2 flex items-center">
+                                <i class="fas fa-align-left mr-2 text-sm"></i> Transcription
+                            </h5>
+                            <p id="memory-transcription" class="text-sm text-gray-700"></p>
+                        </div>
+                        <div class="bg-purple-50 p-3 rounded">
+                            <h5 class="font-medium text-purple-700 mb-2 flex items-center">
+                                <i class="fas fa-tags mr-2 text-sm"></i> Detected Entities
+                            </h5>
+                            <div id="memory-entities" class="text-sm text-gray-700"></div>
+                        </div>
+                        <div class="bg-blue-50 p-3 rounded">
+                            <h5 class="font-medium text-blue-700 mb-2 flex items-center">
+                                <i class="fas fa-project-diagram mr-2 text-sm"></i> Context Links
+                            </h5>
+                            <div id="memory-links" class="text-sm text-gray-700"></div>
+                        </div>
+                    </div>
+                    <div class="mt-4 pt-3 border-t border-gray-200">
+                        <div class="flex justify-between items-center">
+                            <div class="flex items-center">
+                                <i class="fas fa-fingerprint text-gray-500 mr-2"></i>
+                                <span class="text-xs text-gray-500">Bio-signature: <span class="text-gray-700">user_authenticated</span></span>
+                            </div>
+                            <div class="text-xs text-gray-500">
+                                <i class="fas fa-calendar-alt mr-1"></i>
+                                <span id="memory-timestamp"></span>
+                            </div>
+                        </div>
+                    </div>
+                </div>
+            </div>
+
+            <div class="mt-6 text-center">
+                <button id="demo-processing" class="px-4 py-2 bg-gradient-to-r from-indigo-500 to-purple-500 text-white rounded-lg hover:from-indigo-600 hover:to-purple-600 transition">
+                    <i class="fas fa-play-circle mr-2"></i> Run Processing Demo
+                </button>
+            </div>
+        </div>
+    </div>
 
     <!-- System Status -->
     <div class="mt-12 bg-white rounded-xl shadow-md overflow-hidden">
@@ -350,6 +458,98 @@
                 }
            }, 1500 + Math.random() * 2000); // Random delay between 1.5-3.5 seconds
        }
+
+        // AI Processing Demo
+        const demoProcessingBtn = document.getElementById('demo-processing');
+        const transcriptionResult = document.getElementById('transcription-result');
+        const visualResult = document.getElementById('visual-result');
+        const linkingResult = document.getElementById('linking-result');
+        const memoryObjectDemo = document.getElementById('memory-object-demo');
+        const memoryTranscription = document.getElementById('memory-transcription');
+        const memoryEntities = document.getElementById('memory-entities');
+        const memoryLinks = document.getElementById('memory-links');
+        const memoryTimestamp = document.getElementById('memory-timestamp');
+
+        demoProcessingBtn.addEventListener('click', runProcessingDemo);
+
+        function runProcessingDemo() {
+            // Reset previous demo
+            transcriptionResult.classList.add('hidden');
+            visualResult.classList.add('hidden');
+            linkingResult.classList.add('hidden');
+            memoryObjectDemo.classList.add('hidden');
+
+            // Disable button during demo
+            demoProcessingBtn.disabled = true;
+            demoProcessingBtn.innerHTML = '<i class="fas fa-cog fa-spin mr-2"></i> Processing...';
+
+            // Simulate transcription step
+            setTimeout(() => {
+                transcriptionResult.textContent = '"We had an amazing dinner at Le Petit Jardin last night. The duck confit was incredible!"';
+                transcriptionResult.classList.remove('hidden');
+
+                // Simulate visual analysis step
+                setTimeout(() => {
+                    visualResult.innerHTML = `
+                        <div class="flex flex-wrap gap-1">
+                            <span class="px-2 py-1 bg-purple-100 text-purple-800 text-xs rounded">Restaurant</span>
+                            <span class="px-2 py-1 bg-purple-100 text-purple-800 text-xs rounded">Duck dish</span>
+                            <span class="px-2 py-1 bg-purple-100 text-purple-800 text-xs rounded">Wine bottle</span>
+                            <span class="px-2 py-1 bg-purple-100 text-purple-800 text-xs rounded">2 people</span>
+                        </div>
+                    `;
+                    visualResult.classList.remove('hidden');
+
+                    // Simulate context linking step
+                    setTimeout(() => {
+                        linkingResult.innerHTML = `
+                            <div class="text-center">
+                                <div class="inline-block px-3 py-1 bg-blue-100 text-blue-800 text-xs rounded-full">
+                                    <i class="fas fa-link mr-1"></i> Matched location
+                                </div>
+                                <div class="mt-1 text-xs">Paris, France</div>
+                            </div>
+                        `;
+                        linkingResult.classList.remove('hidden');
+
+                        // Show final memory object
+                        setTimeout(() => {
+                            memoryTranscription.textContent = '"We had an amazing dinner at Le Petit Jardin last night. The duck confit was incredible!"';
+                            memoryEntities.innerHTML = `
+                                <div class="flex flex-wrap gap-1">
+                                    <span class="px-2 py-1 bg-purple-100 text-purple-800 text-xs rounded">Restaurant</span>
+                                    <span class="px-2 py-1 bg-purple-100 text-purple-800 text-xs rounded">Duck dish</span>
+                                    <span class="px-2 py-1 bg-purple-100 text-purple-800 text-xs rounded">Wine bottle</span>
+                                    <span class="px-2 py-1 bg-purple-100 text-purple-800 text-xs rounded">2 people</span>
+                                </div>
+                            `;
+                            memoryLinks.innerHTML = `
+                                <div class="space-y-1">
+                                    <div class="flex items-center">
+                                        <i class="fas fa-map-marker-alt text-blue-500 mr-2 text-xs"></i>
+                                        <span class="text-xs">Le Petit Jardin, Paris</span>
+                                    </div>
+                                    <div class="flex items-center">
+                                        <i class="fas fa-utensils text-blue-500 mr-2 text-xs"></i>
+                                        <span class="text-xs">Duck confit mentioned in audio</span>
+                                    </div>
+                                    <div class="flex items-center">
+                                        <i class="fas fa-user-friends text-blue-500 mr-2 text-xs"></i>
+                                        <span class="text-xs">Dining companion: Sarah</span>
+                                    </div>
+                                </div>
+                            `;
+                            memoryTimestamp.textContent = new Date().toLocaleDateString() + ' ' + new Date().toLocaleTimeString([], {hour: '2-digit', minute: '2-digit'});
+                            memoryObjectDemo.classList.remove('hidden');
+
+                            // Re-enable button
+                            demoProcessingBtn.disabled = false;
+                            demoProcessingBtn.innerHTML = '<i class="fas fa-play-circle mr-2"></i> Run Processing Demo';
+                        }, 800);
+                    }, 800);
+                }, 800);
+            }, 800);
+        }
     });
     </script>
 <p style="border-radius: 8px; text-align: center; font-size: 12px; color: #fff; margin-top: 16px;position: fixed; left: 8px; bottom: 8px; z-index: 10; background: rgba(0, 0, 0, 0.8); padding: 4px 8px;">Made with <img src="https://enzostvs-deepsite.hf.space/logo.svg" alt="DeepSite Logo" style="width: 16px; height: 16px; vertical-align: middle;display:inline-block;margin-right:3px;filter:brightness(0) invert(1);"><a href="https://enzostvs-deepsite.hf.space" style="color: #fff;text-decoration: underline;" target="_blank" >DeepSite</a> - 🧬 <a href="https://enzostvs-deepsite.hf.space?remix=privateuserh/aurarecall1" style="color: #fff;text-decoration: underline;" target="_blank" >Remix</a></p></body>
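As a side note on the demo script added above: the staged pipeline chains four nested `setTimeout` callbacks at 800 ms intervals, which grows rightward with every stage. The same sequencing could be flattened with a promise-based delay helper. A minimal sketch, where `delay`, `runSteps`, and the `showStep` callback are hypothetical names (only the four stage names and the 800 ms pacing come from the page):

```javascript
// Hypothetical restructuring of the demo's nested setTimeout chain.
// delay() wraps setTimeout in a Promise so stages can be awaited in order.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Run the demo's stages sequentially, invoking showStep(name) for each,
// pausing stepMs between stages (the page uses 800 ms).
async function runSteps(showStep, stepMs) {
  const stages = ['transcription', 'visual', 'linking', 'memory-object'];
  for (const stage of stages) {
    await delay(stepMs);
    showStep(stage); // in the page this would reveal the matching panel
  }
  return stages.length;
}
```

With this shape, the reset/disable logic runs once before `runSteps`, and the button is re-enabled after the returned promise resolves, instead of inside the innermost callback.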
prompts.txt
CHANGED
@@ -1 +1,2 @@
-Modify front end: AuraRecall Frontend with Cloudflare AI Gateway Integration. This document presents an updated React frontend application. It retains the file upload functionality (targeting your backend's ingestion service) and adds a new section to demonstrate interaction with the Cloudflare AI Gateway for sending text queries to an AI model. Key features: file upload for memory fragments (audio/visual); a section for sending text queries to an LLM via the Cloudflare AI Gateway; display of the response from the AI model; basic styling using Tailwind CSS. Note: the file upload still targets your custom backend ingestion endpoint (YOUR_BACKEND_INGESTION_URL). The text query targets your Cloudflare AI Gateway endpoint (YOUR_CLOUDFLARE_GATEWAY_URL). You need to replace the placeholder URLs and add your Hugging Face token (YOUR_HF_TOKEN). The response from the AI query is simply displayed as text; a real application would use this response to refine a query sent to your backend's Memory Retrieval Service.
+Modify front end: AuraRecall Frontend with Cloudflare AI Gateway Integration. This document presents an updated React frontend application. It retains the file upload functionality (targeting your backend's ingestion service) and adds a new section to demonstrate interaction with the Cloudflare AI Gateway for sending text queries to an AI model. Key features: file upload for memory fragments (audio/visual); a section for sending text queries to an LLM via the Cloudflare AI Gateway; display of the response from the AI model; basic styling using Tailwind CSS. Note: the file upload still targets your custom backend ingestion endpoint (YOUR_BACKEND_INGESTION_URL). The text query targets your Cloudflare AI Gateway endpoint (YOUR_CLOUDFLARE_GATEWAY_URL). You need to replace the placeholder URLs and add your Hugging Face token (YOUR_HF_TOKEN). The response from the AI query is simply displayed as text; a real application would use this response to refine a query sent to your backend's Memory Retrieval Service.
+Keep all components, and consider these suggestions for the AuraRecall Hugging Face Space. Assuming the Space privateuserh/aurarecall1 is intended as a demo or prototype for the AuraRecall personal AI memory system, here are features and content it could include to effectively showcase the project. Demonstrate core AI processing: allow multi-modal input (a short audio file plus a corresponding image or short video clip) and show a simplified simulation of what the AI backend would do, e.g. display the transcription of the audio, list detected objects or faces in the image/video, and show a simple "contextual link" created between the audio and visual based on timestamp or identified elements; even if the full complex synthesis isn't possible in the Space, demonstrating these individual processing steps (transcription, object detection) is valuable. Illustrate memory synthesis conceptually: after processing, show a conceptual "Memory Object" created from the inputs, a simple visual representation linking the text transcription, detected entities, and the original media; use a simple diagram or structured text output to represent the linked data, explaining how the AI connects the pieces. Showcase retrieval (simulated): include a text input field where a user can type a simple query (e.g., "Show memories with [Person Name]" or "Find audio about [Topic]"); since the full memory database isn't likely in the Space, simulate retrieval by highlighting or displaying the relevant simulated "Memory Object" or its components based on the query and the previously uploaded and processed data; keep a few pre-loaded example "memories" that the query function can retrieve, alongside the ability to query newly uploaded (and processed) fragments. Address the bio-signature conceptually: include a placeholder section such as "Authenticated User: [Simulated User ID]" or a brief explanation of its role in personalization and security; avoid implementing actual biometric capture in a public Space for privacy reasons, but explain the concept clearly. Include clear explanations in the Space's README or description area: the AuraRecall concept, the purpose of the demo, what the user can do and see, and which parts are fully functional demos versus simulations or conceptual representations. Keep it focused and simple: Spaces work best for demonstrating specific aspects or a simplified workflow, so don't try to build the entire SaaS in the Space; focus on showcasing the unique AI processing and linking capabilities. Provide links back to any relevant repositories (such as the backend code, if public) or documentation. If the Space currently contains something different, or if you gain access to view its contents, I would be happy to provide more specific feedback based on its actual implementation.
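The prompt above describes the text-query path only in terms of its placeholders (YOUR_CLOUDFLARE_GATEWAY_URL, YOUR_HF_TOKEN). A minimal sketch of what that call might look like follows; the JSON `{ inputs }` payload, the bearer-token header, and the plain-text response handling are assumptions about the gateway and Hugging Face inference setup, not details confirmed by the source, and the placeholders must be replaced with real values before use:

```javascript
// Placeholders from the prompts.txt note; replace before deploying.
const GATEWAY_URL = 'YOUR_CLOUDFLARE_GATEWAY_URL';
const HF_TOKEN = 'YOUR_HF_TOKEN';

// Build the fetch options separately so the request shape can be inspected
// (and unit-tested) without performing any network I/O.
function buildQueryRequest(query) {
  return {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${HF_TOKEN}`,
      'Content-Type': 'application/json',
    },
    // Assumed payload shape for a Hugging Face text-generation call.
    body: JSON.stringify({ inputs: query }),
  };
}

// Send a text query through the AI Gateway and return the raw response body.
async function queryModel(query) {
  const res = await fetch(GATEWAY_URL, buildQueryRequest(query));
  if (!res.ok) throw new Error(`Gateway error: ${res.status}`);
  return res.text();
}
```

As the prompt notes, a real application would not just display this response but use it to refine a query sent to the backend's Memory Retrieval Service.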