---
title: EvoTransformer Demo
emoji: 🧬
colorFrom: pink
colorTo: green
sdk: gradio
sdk_version: 5.36.2
app_file: app.py
pinned: false
license: mit
---
# 🧬 EvoTransformer Demo
Welcome to the official demo of EvoTransformer, an evolving Transformer architecture that adapts itself during training using principles inspired by evolutionary algorithms.
This project showcases a lightweight, in-training neural architecture search (NAS) system that mutates key traits such as:
- Number of layers
- Attention heads
- Feed-forward dimension
- Dropout
- Memory module toggle
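To make the trait representation concrete, here is a minimal sketch of how such a genome and its mutation operator could look. The trait names and value ranges are illustrative assumptions, not the actual EvoTransformer implementation:

```python
import random

# Hypothetical trait space for the architecture genome; the field names
# mirror the traits listed above, but the value ranges are assumptions.
TRAIT_SPACE = {
    "num_layers": [2, 4, 6, 8],
    "num_heads": [2, 4, 8],
    "ffn_dim": [256, 512, 1024, 2048],
    "dropout": [0.0, 0.1, 0.2, 0.3],
    "use_memory": [False, True],
}

def random_genome(rng=random):
    """Sample one architecture configuration from the trait space."""
    return {trait: rng.choice(options) for trait, options in TRAIT_SPACE.items()}

def mutate(genome, rate=0.3, rng=random):
    """Re-sample each trait independently with probability `rate`."""
    child = dict(genome)
    for trait, options in TRAIT_SPACE.items():
        if rng.random() < rate:
            child[trait] = rng.choice(options)
    return child
```

Because every trait is drawn from a small discrete set, mutation stays cheap enough to run inside the training loop.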
Developed by Dr. Heman Mohabeer, Intelligent Africa Ltd
Submitted to JMLR 2025 | Built in Mauritius
## Try It Live
Use the Gradio interface to simulate architectural evolution across generations.
Watch how traits adapt, and get a simulated accuracy and parameter estimate.
## Behind the Scenes
EvoTransformer includes:
- Genetic operators: mutation and crossover (this demo uses mutation only)
- Structural traits representation
- Online evolution loop
- Lightweight scoring and parameter estimation
This demo is a simplified, live-running version of the full EvoTransformer system submitted for peer review.
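The pieces above can be sketched as a mutation-only (1+1) evolution loop. Everything here is an illustrative assumption: the trait space is trimmed to two traits, the model width, scoring function, and parameter formula are stand-ins, not the demo's actual code:

```python
import random

# Assumed trimmed-down trait space and a fixed model width.
TRAITS = {"num_layers": [2, 4, 6], "ffn_dim": [256, 512, 1024]}
D_MODEL = 512

def estimate_params(g):
    # Rough Transformer count per layer: attention ~4*d^2, feed-forward ~2*d*ffn.
    return g["num_layers"] * (4 * D_MODEL**2 + 2 * D_MODEL * g["ffn_dim"])

def score(g):
    # Stand-in fitness: reward capacity, penalize parameter count.
    return 0.1 * g["num_layers"] + g["ffn_dim"] / 4096 - estimate_params(g) / 1e8

def evolve(generations=20, mutation_rate=0.3, seed=0):
    """Keep the current best genome; accept a mutated child if it scores >= best."""
    rng = random.Random(seed)
    best = {k: rng.choice(v) for k, v in TRAITS.items()}
    for _ in range(generations):
        child = {k: rng.choice(v) if rng.random() < mutation_rate else best[k]
                 for k, v in TRAITS.items()}
        if score(child) >= score(best):
            best = child
    return best
```

A (1+1) scheme like this needs no population, which is what keeps the evolution loop lightweight enough to run alongside training.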
## Citation
```bibtex
@misc{mohabeer2024evotransformer,
  title={EvoTransformer: In-Training Evolution of Transformer Architectures for Adaptive and Efficient NLP},
  author={Heman Mohabeer},
  year={2024},
  note={Hugging Face Demo},
  url={https://huggingface.co/spaces/HemanM/EvoTransformer-Demo}
}
```
---
## Links
- [JMLR Submission PDF (coming soon)]()
- [Colab Notebook (in progress)]()
- [More from Dr. Heman Mohabeer](https://linkedin.com/in/hemanmohabeer)
---
## License
MIT License. Feel free to use, fork, and build upon this demo.