---
license: afl-3.0
datasets:
  - 0xZee/dataset-CoT-Advanced-Calculus-268
language:
  - en
base_model:
  - Qwen/Qwen3-14B
pipeline_tag: text-generation
library_name: transformers
tags:
  - qwen3
  - symbiotic
  - symbioticai
  - llm
  - Symbols
---

# SymbioticLM-14B

**Model Type:** Hybrid Symbolic–Transformer with Persistent Memory
**Base Model:** Qwen3-14B
**Framework:** PyTorch + Hugging Face Transformers
**Purpose:** Full-scale cognitive reasoning model with self-organizing memory and generative symbolic evolution


## Overview

SymbioticLM-14B is a 17.8-billion-parameter symbolic–transformer hybrid that tightly couples high-capacity neural representation with structured symbolic cognition. Designed to match or exceed the performance of top-tier LLMs in symbolic domains, it supports persistent memory, entropic recall, multi-stage symbolic routing, and self-organizing knowledge structures.

This model is ideal for advanced reasoning agents, research assistants, and symbolic math/code generation systems.
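
A minimal loading sketch follows. The repo id and the `trust_remote_code=True` flag are assumptions (hybrid symbolic modules are typically shipped as remote code rather than stock `transformers` classes), so treat this as a starting point, not the canonical recipe:

```python
# Minimal loading sketch. The repo id and trust_remote_code=True are
# assumptions, not confirmed by this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "reaperdoesntknow/Symiotic-14B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # lower VRAM; the symbolic memory bank itself is FP32
    device_map="auto",
    trust_remote_code=True,
)
```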


## Architecture Highlights

- **Backbone:** Qwen3-14B transformer with rotary embeddings + FlashAttention
- **Symbolic Dim:** 8192
- **Symbolic Modules:**
  - ThoughtDynamicsLNN (multi-head LSTM attention)
  - LiquidThoughtProcessor
  - CrystallineProcessor (DNAConv GNN)
  - HelicalDNAProcessor (linear helical encoding)
- **Memory:** 4096 symbolic states in FP32, retrieved using entropy + contextual similarity (see the sketch after this list)
- **Dream Mode:** background symbolic simulation for open-ended cognition
- **Router:** intent classifier + entropy gating for processor path selection
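
The entropy-plus-similarity recall can be pictured with a small sketch. The code below is illustrative only; the function name, threshold, and gating rule are assumptions rather than the shipped implementation. The idea: score each stored state against the current context, then suppress recall when the score distribution is too diffuse to be informative.

```python
# Illustrative sketch of entropy-gated memory retrieval (assumed logic,
# not the model's actual code).
import torch
import torch.nn.functional as F

def retrieve(memory: torch.Tensor, query: torch.Tensor, k: int = 8,
             entropy_threshold: float = 3.0):
    """memory: [4096, 8192] FP32 bank; query: [8192] context vector."""
    scores = F.cosine_similarity(memory, query.unsqueeze(0), dim=-1)  # [4096]
    probs = torch.softmax(scores, dim=-1)
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum()
    if entropy > entropy_threshold:   # scores too diffuse: skip recall
        return None
    topk = scores.topk(k)
    return memory[topk.indices]       # [k, 8192] recalled symbolic states
```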

## Files Included

| File | Description |
| --- | --- |
| `model.bin` | Transformer weights (LFS) |
| `model.safetensors` | Memory-safe weights, optimized for loading |
| `memory.pt` | 4096-entry symbolic vector bank |
| `config.json` | Model and architectural metadata |
| `generation_config.json` | Top-p, temperature, and other decoding settings |
| `tokenizer.json` | Full tokenizer with symbolic tag support |
| `added_tokens.json` | Tags like `<D_LIM>`, `<PROOF>`, `<BY_MEASURE>`, etc. |
| `special_tokens_map.json` | Special token mapping for the tokenizer |
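
To inspect the memory bank before wiring it into a pipeline, a plain `torch.load` is enough. The expected shape `[4096, 8192]` follows from the Symbolic Dim and memory-size figures above; whether `memory.pt` holds a bare tensor or a dict is an assumption, so this sketch checks both:

```python
# Assumed layout: memory.pt stores the FP32 symbolic bank as a single
# tensor, or a dict containing one. Inspect before use.
import torch

bank = torch.load("memory.pt", map_location="cpu")
if isinstance(bank, dict):
    print({k: getattr(v, "shape", type(v)) for k, v in bank.items()})
else:
    print(bank.shape, bank.dtype)  # expected: torch.Size([4096, 8192]) torch.float32
```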

## Intended Uses

- Multi-step conversational agents with persistent memory
- Long-form symbolic theorem generation and proof planning (see the usage sketch below)
- Scientific dialogue, symbolic simulations, math/code synthesis
- Reasoning in fuzzy, discontinuous, or non-smooth problem domains
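
Continuing from the loading sketch in the Overview, a hypothetical proof-planning call might prepend one of the symbolic tags from `added_tokens.json`. The tag semantics and the sampling values here are assumptions; the authoritative decoding settings ship in `generation_config.json`:

```python
# Hypothetical usage: <PROOF> is one of the added symbolic tags; its exact
# effect on routing is an assumption. Sampling values are placeholders.
prompt = "<PROOF> Show that every bounded monotone sequence converges."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=512, do_sample=True,
                        temperature=0.7, top_p=0.9)
print(tokenizer.decode(output[0], skip_special_tokens=False))
```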

## Limitations

- Memory requires curation and seeding for maximum utility
- Symbolic cognition is not instruction-tuned for general QA
- FlashAttention and the symbolic modules increase VRAM usage during generation

## Citations

Please cite "SymbioticLM" when using the symbolic memory components in research or applications.