<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <meta name="description"
        content="MPCA is a novel, bio-inspired AI framework that moves beyond traditional machine learning models.">
  <meta name="keywords" content="MPCA, MycoPhysarum, Cognitive Architecture, Bio-inspired AI, Slime Mold, Mycelium, Graph AI, Efficient AI">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>MycoPhysarum Cognitive Architecture (MPCA)</title>

  <link href="https://fonts.googleapis.com/css?family=Google+Sans|Noto+Sans|Castoro"
        rel="stylesheet">

  <link rel="stylesheet" href="./static/css/bulma.min.css">
  <link rel="stylesheet" href="./static/css/bulma-carousel.min.css">
  <link rel="stylesheet" href="./static/css/bulma-slider.min.css">
  <link rel="stylesheet" href="./static/css/fontawesome.all.min.css">
  <link rel="stylesheet"
        href="https://cdn.jsdelivr.net/gh/jpswalsh/academicons@1/css/academicons.min.css">
  <link rel="stylesheet" href="./static/css/index.css">
  <link rel="icon" href="./static/images/favicon.svg">

  <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.5.1/jquery.min.js"></script>
  <script defer src="./static/js/fontawesome.all.min.js"></script>
  <script src="./static/js/bulma-carousel.min.js"></script>
  <script src="./static/js/bulma-slider.min.js"></script>
  <script src="./static/js/index.js"></script>
</head>
<body>

<section class="hero">
  <div class="hero-body">
    <div class="container is-max-desktop">
      <div class="columns is-centered">
        <div class="column has-text-centered">
          <h1 class="title is-1 publication-title">MycoPhysarum Cognitive Architecture (MPCA)</h1>
          <div class="is-size-5 publication-authors">
            <span class="author-block">
              A Novel, Bio-Inspired AI Framework
            </span>
          </div>

          <div class="is-size-5 publication-authors">
            <span class="author-block">Inspired by the decentralized intelligence of slime molds and mycorrhizal networks.</span>
          </div>

          <div class="column has-text-centered">
            <div class="publication-links">
              <!-- Code Link. -->
              <span class="link-block">
                <a href="https://github.com/bahira/myco-physarum" target="_blank"
                   class="external-link button is-normal is-rounded is-dark">
                  <span class="icon">
                      <i class="fab fa-github"></i>
                  </span>
                  <span>Code</span>
                  </a>
              </span>
            </div>
          </div>
        </div>
      </div>
    </div>
  </div>
</section>

<section class="hero teaser">
  <div class="container is-max-desktop">
    <div class="hero-body">
      <!-- Replaced video with a conceptual diagram -->
      <img src="https://huggingface.co/bahira/myco-physarum/resolve/main/mpca.png" alt="MPCA Architecture Diagram" style="width: 100%; max-width: 800px; margin: auto; display: block; border-radius: 10px;">
      <h2 class="subtitle has-text-centered" style="margin-top: 2rem;">
        MPCA represents knowledge not as static data, but as a dynamic, living graph that evolves through interaction and self-reflection.
      </h2>
    </div>
  </div>
</section>


<section class="hero is-light is-small">
  <div class="hero-body">
    <div class="container">
      <div id="results-carousel" class="carousel results-carousel">
        <div class="item">
          <div class="content" style="padding: 2rem;">
            <h3 class="title is-4">Mycelium</h3>
            <p>The heart of the system. A <code>networkx</code> directed graph where nodes are concepts and edges represent the relationships between them. The strength of these connections is dynamic, changing based on usage and learning.</p>
          </div>
        </div>
        <div class="item">
           <div class="content" style="padding: 2rem;">
            <h3 class="title is-4">Builder</h3>
            <p>The architect of the Mycelium. The Builder ingests raw text, uses <code>spaCy</code>'s dependency parser to understand grammatical structure, and translates it into a rich graph of nodes and relationships.</p>
          </div>
        </div>
        <div class="item">
           <div class="content" style="padding: 2rem;">
            <h3 class="title is-4">Solver</h3>
            <p>The "consciousness" of MPCA. It traverses the Mycelium to find relevant pathways to answer queries. Implements Hebbian learning to reinforce successful paths and "dreaming" to form new speculative connections.</p>
          </div>
        </div>
        <div class="item">
           <div class="content" style="padding: 2rem;">
            <h3 class="title is-4">Node</h3>
            <p>The fundamental unit of knowledge. Each node represents a word and has a <code>type</code> (concept, action, property) and a <code>strength</code>, indicating its importance in the network.</p>
          </div>
        </div>
        <div class="item">
           <div class="content" style="padding: 2rem;">
            <h3 class="title is-4">Spore</h3>
            <p>A highly efficient persistence mechanism. A "spore" is a serialized (<code>pickle</code>) and compressed snapshot of the Mycelium, allowing the system's learned state to be saved and loaded from a tiny file (~9 MB).</p>
          </div>
        </div>
      </div>
    </div>
  </div>
</section>


<section class="section">
  <div class="container is-max-desktop">
    <!-- Abstract -->
    <div class="columns is-centered has-text-centered">
      <div class="column is-four-fifths">
        <h2 class="title is-3">Core Philosophy</h2>
        <div class="content has-text-justified">
          <p>
            The core of MPCA is the "Cognitive Mycelium," a graph-based knowledge structure. Unlike rigid, pre-trained models, the Mycelium is built from the ground up to understand the grammatical and conceptual relationships in language. It learns, forgets, and even "dreams" to form new connections, creating a resilient and emergent form of intelligence with a fraction of the computational overhead of conventional architectures.
          </p>
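          <p>
            To make the structure concrete, the sketch below shows how the components described above could be laid out on a <code>networkx</code> directed graph, with each node carrying a <code>type</code> and a <code>strength</code>. It is purely illustrative; method names such as <code>add_concept</code> and <code>connect</code> are hypothetical, not the project's actual API.
          </p>
          <pre><code>import networkx as nx

class Mycelium:
    """Illustrative sketch: concepts as nodes, weighted relations as edges."""
    def __init__(self):
        self.graph = nx.DiGraph()

    def add_concept(self, word, node_type="concept", strength=1.0):
        # Each node carries a type (concept, action, property) and a strength.
        self.graph.add_node(word, type=node_type, strength=strength)

    def connect(self, source, target, relation, weight=1.0):
        # Edge weights are dynamic: they change with usage and learning.
        self.graph.add_edge(source, target, relation=relation, weight=weight)

m = Mycelium()
m.add_concept("car")
m.add_concept("red", node_type="property")
m.connect("car", "red", relation="has_property")</code></pre>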
          <h2 class="title is-3" style="margin-top: 2rem;">Radical Efficiency: The Spore Advantage</h2>
           <p>
            A key breakthrough of MPCA is its efficiency. A fully formed Cognitive Mycelium, built from a large dataset (~100k entries, >2.9M sentences), can be compressed into a <code>spore</code> file of only <strong>~9 MB</strong>. This stands in stark contrast to conventional AI models such as Transformers (GPT-2's weights alone are roughly 500 MB; modern models run to many gigabytes).
          </p>
          <p>
            This efficiency is a direct result of the architecture's design. Instead of storing billions of statistical weights to predict tokens, the Mycelium stores a compressed graph of concepts and their relationships. It captures knowledge, not just statistical patterns, leading to a powerful, lightweight, and truly novel form of intelligence.
          </p>
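          <p>
            The spore format itself is internal to the project, but the principle, a serialized (<code>pickle</code>) and compressed snapshot of the graph, can be pictured in a few lines of Python. The function names here are illustrative assumptions, not the actual implementation.
          </p>
          <pre><code>import gzip
import pickle

def save_spore(graph, path="mycelium.spore"):
    # Serialize the graph and compress it into a compact snapshot.
    with gzip.open(path, "wb") as f:
        pickle.dump(graph, f, protocol=pickle.HIGHEST_PROTOCOL)

def load_spore(path="mycelium.spore"):
    # Restore the learned state from disk.
    with gzip.open(path, "rb") as f:
        return pickle.load(f)</code></pre>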
        </div>
      </div>
    </div>
    <!--/ Abstract. -->

    <!-- Paper video. -->
    <div class="columns is-centered has-text-centered">
      <div class="column is-four-fifths">
        <h2 class="title is-3">How to Run</h2>
        <div class="content has-text-justified">
          <p><strong>1. Install Dependencies:</strong></p>
          <pre><code>pip install -r requirements.txt
python -m spacy download en_core_web_sm</code></pre>
          <p><strong>2. Build a New Mycelium:</strong></p>
           <p>To create a new knowledge graph from a dataset, run the main script with the <code>--build</code> flag. The default dataset is <code>mlabonne/FineTome-100k</code>.</p>
          <pre><code>python main.py --build --spore-file mycelium_new.spore --limit 1000</code></pre>
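           <p>Under the hood, the <code>Builder</code> uses <code>spaCy</code>'s dependency parser to turn sentences into nodes and relationships. The snippet below is a rough, illustrative sketch of that idea (subject and object tokens linked to their governing verb); the real Builder constructs a much richer graph.</p>
          <pre><code>import spacy
import networkx as nx

nlp = spacy.load("en_core_web_sm")
graph = nx.DiGraph()

def ingest(text):
    # Link grammatical subjects and objects to the verbs that govern them.
    for token in nlp(text):
        if token.dep_ in ("nsubj", "dobj"):
            graph.add_node(token.lemma_, type="concept", strength=1.0)
            graph.add_node(token.head.lemma_, type="action", strength=1.0)
            graph.add_edge(token.lemma_, token.head.lemma_,
                           relation=token.dep_, weight=1.0)

ingest("The slime mold explores the maze.")</code></pre>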
          <p><strong>3. Interact with an Existing Mycelium:</strong></p>
           <p>To chat with a pre-built Mycelium, use the <code>--interactive</code> flag.</p>
          <pre><code>python main.py --interactive --spore-file mycelium.spore</code></pre>
        </div>
      </div>
    </div>
    <!--/ Paper video. -->
  </div>
</section>


<section class="section">
  <div class="container is-max-desktop">

    <div class="columns is-centered">
      <!-- Lifecycle -->
      <div class="column">
        <div class="content">
          <h2 class="title is-3">The MPCA Lifecycle</h2>
           <ol>
                <li><strong>Genesis (Building):</strong> The <code>Builder</code> creates a Mycelium from a data source, performing grammatical parsing to construct a graph of nodes and relationships, which is then saved as a <code>.spore</code> file.</li>
                <li><strong>Awakening (Loading):</strong> The system loads a <code>.spore</code> file into memory, awakening the Cognitive Mycelium.</li>
                <li><strong>Interaction (Solving):</strong> A user asks a question. The <code>Solver</code> takes the core concepts and finds a thought-path through the Mycelium to construct an answer.</li>
                <li><strong>Evolution (Learning):</strong> Successful thought-paths are reinforced via Hebbian learning, strengthening the system's knowledge.</li>
                <li><strong>Introspection (Dreaming):</strong> During downtime, the system can dream to form new, speculative connections, expanding its creative potential (the learning and dreaming steps are sketched in code just after this list).</li>
           </ol>
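           <p>Steps 4 and 5 can be pictured in a few lines of graph code. The sketch below is illustrative only (the edge attribute names and the learning rate are assumptions): Hebbian reinforcement nudges up the weight of every edge along a successful thought-path, and a "dream" adds a weak, speculative edge between two previously unconnected nodes.</p>
          <pre><code>import random

def reinforce(graph, path, rate=0.1):
    # Hebbian learning: strengthen every edge along a successful thought-path.
    for a, b in zip(path, path[1:]):
        graph[a][b]["weight"] += rate

def dream(graph, strength=0.05):
    # Introspection: connect two unrelated nodes with a weak, speculative edge.
    a, b = random.sample(list(graph.nodes), 2)
    if not graph.has_edge(a, b):
        graph.add_edge(a, b, relation="dreamed", weight=strength)</code></pre>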
        </div>
      </div>
      <!--/ Lifecycle. -->
    </div>

    <!-- The Path Forward -->
    <div class="columns is-centered">
      <div class="column is-full-width">
        <h2 class="title is-3">The Path Forward: Extending MPCA to a Multi-Modal World</h2>
        <div class="content has-text-justified">
          <p>
            The current architecture is a powerful foundation for understanding language. Its true potential lies in extending this conceptual graph to understand and generate other forms of data. The core principle is that MPCA acts as a <strong>central orchestrator</strong>, connecting its abstract conceptual understanding to specialized external models for processing and generation.
          </p>
        </div>
        
        <h3 class="title is-4">Image Understanding and Generation</h3>
        <div class="content has-text-justified">
            <p><strong>Understanding:</strong> A Vision-Language Model (VLM) like CLIP analyzes an image and outputs concept tags (e.g., "a red car on a street"). The <code>Builder</code> integrates these concepts into the Mycelium, linking an <code>ImageNode</code> to existing nodes like <code>Node('car')</code> and <code>Node('red')</code>. The graph learns <em>what's in the image</em>, not the pixels themselves.</p>
            <p><strong>Generation:</strong> The <code>Solver</code> assembles a conceptual blueprint (e.g., <code>Node('boat') → Node('blue') → Node('ocean')</code>). This blueprint is passed as a highly structured prompt to an external image generation model (such as a VAE or diffusion model) to render the final image.</p>
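            <p>As a purely illustrative example of the generation step (the function name and prompt format are assumptions, not the project's API), a thought-path can be flattened into a prompt for whichever external image model is attached:</p>
            <pre><code>def blueprint_to_prompt(graph, path):
    # Flatten a conceptual thought-path into a prompt for an external
    # image generator (for example, a diffusion model behind an API).
    fragments = []
    for node in path:
        node_type = graph.nodes[node].get("type", "concept")
        fragments.append(f"{node} ({node_type})")
    return "A scene containing: " + ", ".join(fragments)

# blueprint_to_prompt(graph, ["boat", "blue", "ocean"]) might yield:
# "A scene containing: boat (concept), blue (property), ocean (concept)"</code></pre>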
        </div>

        <h3 class="title is-4">Audio & Video</h3>
        <div class="content has-text-justified">
          <p>
            A similar approach applies to audio and video. For audio, speech-to-text models provide text for integration, while event detection models can identify non-speech sounds ("dog barking"). For video, an analysis model tracks objects and actions over time, which the <code>Builder</code> represents as a complex, time-stamped sub-graph.
          </p>
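          <p>
            As a small illustration (the attribute names are assumptions), a detected video event could be stored as a time-stamped edge in such a sub-graph:
          </p>
          <pre><code>import networkx as nx

video_graph = nx.DiGraph()

def add_video_event(graph, subject, action, t_start, t_end):
    # Represent an observed event as a time-stamped edge in a sub-graph.
    graph.add_node(subject, type="concept", strength=1.0)
    graph.add_node(action, type="action", strength=1.0)
    graph.add_edge(subject, action, relation="performs",
                   t_start=t_start, t_end=t_end, weight=1.0)

# A detector reports "dog barking" between seconds 3 and 5 of a clip.
add_video_event(video_graph, "dog", "bark", 3.0, 5.0)</code></pre>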
        </div>
        
        <h3 class="title is-4">Taking Action (Agency)</h3>
        <div class="content has-text-justified">
          <p>
            Actions are a native <code>Node</code> type in MPCA. To enable agency, these action nodes can be linked to real-world API calls or robotic functions. When the <code>Solver</code>'s thought-path traverses an <code>ActionNode</code> linked to an external function (e.g., <code>Node('turn_on_light')</code>), it triggers that function. This turns the MPCA from a passive knowledge base into an active agent that can perceive, reason about, and act upon its environment.
          </p>
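          <p>
            A minimal way to picture this wiring (the names below are hypothetical, not the project's API) is a registry that maps action nodes to callables and fires them whenever the thought-path reaches one:
          </p>
          <pre><code>ACTIONS = {}

def register_action(name, fn):
    # Bind an action node to a real-world callable (API call, robot command, ...).
    ACTIONS[name] = fn

def execute_path(graph, path):
    # Walk the thought-path; when an action node is reached, trigger it.
    for node in path:
        if graph.nodes[node].get("type") == "action" and node in ACTIONS:
            ACTIONS[node]()

register_action("turn_on_light", lambda: print("light on"))</code></pre>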
        </div>
      </div>
    </div>
    <!--/ The Path Forward -->

  </div>
</section>


<footer class="footer">
  <div class="container">
    <div class="content has-text-centered">
      <a class="icon-link" href="https://github.com/bahira/myco-physarum" target="_blank" class="external-link" disabled>
        <i class="fab fa-github"></i>
      </a>
    </div>
    <div class="columns is-centered">
      <div class="column is-8">
        <div class="content">
          <p>
            This website is licensed under a <a rel="license" target="_blank"
                                                href="http://creativecommons.org/licenses/by-sa/4.0/">Creative
            Commons Attribution-ShareAlike 4.0 International License</a>.
          </p>
          <p>
            This page template was borrowed from the <a target="_blank" href="https://github.com/nerfies/nerfies.github.io">Nerfies</a> project website.
          </p>
        </div>
      </div>
    </div>
  </div>
</footer>

</body>
</html>