---
title: EvoTransformer V2.1
emoji: πŸ”₯
colorFrom: pink
colorTo: blue
sdk: gradio
sdk_version: 5.36.2
app_file: app.py
pinned: false
license: apache-2.0
---

# EvoTransformer v2.1 🧠✨

EvoTransformer is an evolving neural architecture built from scratch to tackle reasoning tasks with minimal compute.

## πŸ“Œ What It Does
This model answers PIQA-style commonsense reasoning questions. Given a goal and two solution choices, EvoTransformer chooses the more logical one.
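The selection step can be sketched as below; `score` is a stand-in for the model's plausibility output for a (goal, solution) pair, not EvoTransformer's actual API:

```python
def choose(goal, sol1, sol2, score):
    """Pick the more plausible solution for a goal.

    `score(goal, solution)` is a placeholder for the model's
    plausibility logit for a (goal, solution) pair.
    """
    return sol1 if score(goal, sol1) >= score(goal, sol2) else sol2

# Toy scorer for illustration only: prefers the longer solution text.
toy_score = lambda goal, sol: len(sol)
choice = choose("keep bread fresh",
                "store it in a sealed bag",
                "leave it on the counter",
                toy_score)
```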

## πŸ”§ Architecture
- Built with 4 Transformer encoder layers
- ~13 million parameters
- Custom embedding, pooling, and classifier layers
- Fully open and adaptable for NAS or self-evolving tasks
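A minimal PyTorch sketch of that stack. Every hyperparameter below (vocabulary size, hidden width, head count, sequence length) is an assumption chosen to land near the ~13M-parameter range, not the model's actual configuration:

```python
import torch
import torch.nn as nn

class EvoTransformer(nn.Module):
    """4-layer encoder with custom embedding, pooling, and classifier.

    All sizes are illustrative assumptions, not the released config.
    """
    def __init__(self, vocab_size=30522, d_model=256, nhead=8,
                 num_layers=4, dim_ff=1024, max_len=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model, nhead, dim_ff, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.classifier = nn.Linear(d_model, 1)  # one plausibility logit

    def forward(self, input_ids):
        x = self.embed(input_ids) + self.pos[:, :input_ids.size(1)]
        x = self.encoder(x)
        pooled = x.mean(dim=1)          # mean-pool over token positions
        return self.classifier(pooled)  # (batch, 1)
```

Each (goal, solution) pair is scored independently; the two logits for an example are then compared to pick the answer.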

## πŸ‹οΈβ€β™‚οΈ Training Details
- Dataset: PIQA (1000 training, 500 validation examples)
- Optimizer: Adam
- Loss: CrossEntropy
- Epochs: 5
- Hardware: Colab GPU
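One training step under those settings can be sketched as follows, assuming each PIQA example is tokenized into two (goal, solution) sequences and the model emits one logit per sequence; CrossEntropy then compares the paired logits against the index of the correct choice. The batch layout is an assumption for illustration:

```python
import torch
import torch.nn as nn

def train_step(model, input_ids, labels, optimizer, loss_fn):
    """One Adam step on a batch of two-choice examples.

    input_ids: (batch, 2, seq_len) token ids, one row per solution choice
    labels:    (batch,) index (0 or 1) of the correct solution
    """
    b, n_choices, seq_len = input_ids.shape
    # Score each (goal, solution) pair, then regroup logits per example.
    logits = model(input_ids.view(b * n_choices, seq_len)).view(b, n_choices)
    loss = loss_fn(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```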

## πŸš€ Live Demo
Try it on Hugging Face Spaces:
πŸ‘‰ [Demo Link](https://huggingface.co/spaces/YOUR_USERNAME/evo-transformer-demo)

## πŸ’‘ Why EvoTransformer?
- Lean, fast, and efficient
- Custom-built from scratch (no pretraining dependencies)
- Can evolve structurally in future versions

## πŸ“œ License
Apache 2.0