---
title: EvoTransformer V2.1
emoji: 🔥
colorFrom: pink
colorTo: blue
sdk: gradio
sdk_version: 5.36.2
app_file: app.py
pinned: false
license: apache-2.0
---
# EvoTransformer: An Evolving Reasoning Engine

A custom-built Transformer that learns how to reason, and evolves its own architecture with real-time feedback.
## 🚀 What It Does

- ✅ Compares two options and picks the more logical one
- ✅ Trained on PIQA-style physical reasoning
- ✅ Competes live with GPT-3.5 on real-world prompts
- ✅ Accepts user feedback and evolves post-deployment
- ✅ Visualizes model performance generation by generation
## ⚙️ Architecture Overview

| Component        | Spec                              |
|------------------|-----------------------------------|
| Layers           | 6 × TransformerEncoder layers     |
| Hidden Size      | 384                               |
| Attention Heads  | 6                                 |
| FFN Dim          | 1024                              |
| Memory Module    | Optional (🔒 disabled in v2.1)    |
| Total Parameters | ~13M                              |
| Built With       | `torch.nn.TransformerEncoder`     |
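The spec above can be sketched directly with `torch.nn.TransformerEncoder`. This is a minimal illustration, not the repo's `evo_model.py`: the vocabulary size, mean pooling, and 2-way classification head are assumptions chosen to land near the stated ~13M parameters.

```python
import torch
import torch.nn as nn

class EvoTransformerSketch(nn.Module):
    """Sketch of the table above. vocab_size and the classifier
    head are illustrative assumptions, not the repo's exact code."""
    def __init__(self, vocab_size=10_000, d_model=384, nhead=6,
                 dim_ff=1024, num_layers=6, num_options=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead,
            dim_feedforward=dim_ff, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.classifier = nn.Linear(d_model, num_options)

    def forward(self, token_ids):
        h = self.encoder(self.embed(token_ids))  # (batch, seq, d_model)
        return self.classifier(h.mean(dim=1))    # pool, then score options

model = EvoTransformerSketch()
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")  # ~12.1M with this vocab size
```

With a larger tokenizer vocabulary the embedding table pushes the total toward the ~13M quoted above; the encoder stack itself accounts for roughly 8.3M parameters regardless of vocabulary.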
## 🧪 Training Configuration

- Task: Commonsense reasoning (PIQA benchmark)
- Dataset: 1,000 training / 500 validation examples
- Loss: `CrossEntropyLoss`
- Optimizer: Adam
- Epochs: 5
- Environment: Google Colab (T4 / A100 GPU)
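The configuration above corresponds to a standard PyTorch training loop. The sketch below uses a tiny stand-in scorer in place of the real encoder, and the learning rate and batch contents are illustrative assumptions; only the loss, optimizer, and epoch count come from the config.

```python
import torch
import torch.nn as nn

# Stand-in scorer over pooled 384-dim features; the real loop
# trains the EvoTransformer encoder instead.
model = nn.Sequential(nn.Linear(384, 128), nn.ReLU(), nn.Linear(128, 2))
criterion = nn.CrossEntropyLoss()          # Loss from the config above
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # lr is an assumption

def train_epoch(batches):
    model.train()
    total = 0.0
    for features, labels in batches:       # labels: 0 or 1 (which option wins)
        optimizer.zero_grad()
        loss = criterion(model(features), labels)
        loss.backward()
        optimizer.step()
        total += loss.item()
    return total / len(batches)

# Dummy batch shaped like pooled features for a two-choice task.
batches = [(torch.randn(8, 384), torch.randint(0, 2, (8,)))]
for epoch in range(5):                     # Epochs: 5, per the config
    avg = train_epoch(batches)
```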
## 🔁 Live Feedback Loop

Every user interaction fuels EvoTransformer's growth:

- ✅ Feedback is logged to Firebase
- 🔁 Model can be retrained via a button in the app
- 📈 Accuracy and architecture history are visualized
- 🧬 Architecture can mutate across generations
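The generation-to-generation mutation step can be pictured as perturbing one architectural knob at a time. This is a hypothetical sketch: the search space and `mutate` helper below are illustrative, not the repo's actual evolution logic (which lives in files like `watchdog.py`).

```python
import random

# Hypothetical search space; d_model values are kept divisible by
# every nhead option so any combination yields a valid attention split.
SEARCH_SPACE = {
    "num_layers": [4, 6, 8],
    "d_model": [192, 384, 576],
    "nhead": [4, 6, 8],
    "dim_ff": [512, 1024, 2048],
}

def mutate(config, rng=random):
    """Copy the parent config and change exactly one knob."""
    child = dict(config)
    knob = rng.choice(list(SEARCH_SPACE))
    child[knob] = rng.choice(
        [v for v in SEARCH_SPACE[knob] if v != config[knob]])
    return child

parent = {"num_layers": 6, "d_model": 384, "nhead": 6, "dim_ff": 1024}
child = mutate(parent)
changed = [k for k in parent if child[k] != parent[k]]
print(changed)  # exactly one knob differs per generation
```

Each child is then trained and scored against the parent; only the better generation survives, which is the usual evolutionary-search loop this section describes at a high level.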
## 🌐 Try the Live Demo

▶️ Launch on Hugging Face Spaces

Ask a question, give two options, and watch Evo reason, decide, and grow.
## 💡 Why EvoTransformer?

| Feature                      | Benefit                                  |
|------------------------------|------------------------------------------|
| ✅ Fully Custom Architecture | No reliance on pretrained backbones      |
| ✅ Lightweight & Fast        | Runs on Colab or entry-level GPUs        |
| ✅ Evolves with Feedback     | Learns even after deployment             |
| ✅ Transparent               | You can inspect, retrain, and mutate it  |
| ✅ GPT-Competitive           | Performs well on commonsense benchmarks  |
## 📂 Repository Structure

```text
📦 root
├── app.py             # Main Gradio app
├── evo_model.py       # EvoTransformer architecture
├── inference.py       # Generates model outputs
├── logger.py          # Logs user feedback to Firebase
├── watchdog.py        # Retraining + logging engine
├── dashboard.py       # Accuracy/evolution plots
├── init_model.py      # Weight & config initializer
├── firebase_key.json  # 🔒 Private Firebase credentials
└── trained_model/     # Fine-tuned weights + logs
```
## 📜 License

Apache 2.0

Open to experimentation, extension, and collaboration.
Fork it. Mutate it. Evolve it.
## 👤 Author

**Dr. Heman Mohabeer**
Founder, Intelligent Africa Ltd
🌍 AI Strategist • Researcher • Visionary
📧 heman.m@iafrica.solutions
🔗 LinkedIn
## 🤝 Contribute

Want to push Evo further?

- 🧬 Propose new mutations
- 🧪 Submit architecture variants
- 🧠 Benchmark against other reasoning datasets
- 📦 PRs welcome for live retraining, visual dashboards, or token optimization

Let's build the world's most adaptable reasoning AI, one mutation at a time.