---
title: EvoTransformer V2.1
emoji: 🔥
colorFrom: pink
colorTo: blue
sdk: gradio
sdk_version: 5.36.2
app_file: app.py
pinned: false
license: apache-2.0
---

# EvoTransformer – An Evolving Reasoning Engine

A custom-built Transformer that learns how to reason and evolves its own architecture with real-time feedback.

πŸ” What It Does βœ… Compares two options and picks the more logical one βœ… Trained on PIQA-style physical reasoning βœ… Competes live with GPT-3.5 on real-world prompts βœ… Accepts user feedback and evolves post-deployment βœ… Visualizes model performance generation by generation

βš™οΈ Architecture Overview Component Spec Layers 6 Γ— TransformerEncoder Hidden Size 384 Attention Heads 6 FFN Dim 1024 Memory Module Optional (πŸ”• disabled in v2.1) Total Parameters ~13M Built With torch.nn.TransformerEncoder

## 🧪 Training Configuration

- Task: Commonsense Reasoning (PIQA benchmark)
- Dataset: 1,000 training | 500 validation
- Loss: CrossEntropyLoss
- Optimizer: Adam
- Epochs: 5
- Environment: Google Colab (T4 / A100 GPU)
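The configuration above maps onto a standard PyTorch loop. A sketch with a stand-in two-way classifier and random tensors in place of the real model and PIQA features (the learning rate is an assumption, not stated in the config):

```python
import torch
import torch.nn as nn

# Stand-in two-way classifier; the real model is EvoTransformer.
model = nn.Sequential(nn.Linear(384, 128), nn.ReLU(), nn.Linear(128, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # lr assumed
criterion = nn.CrossEntropyLoss()

features = torch.randn(1000, 384)       # 1,000 training examples (per config)
labels = torch.randint(0, 2, (1000,))   # index of the correct option

for epoch in range(5):                  # 5 epochs (per config)
    optimizer.zero_grad()
    loss = criterion(model(features), labels)
    loss.backward()
    optimizer.step()
```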

πŸ” Live Feedback Loop Every user interaction fuels EvoTransformer's growth:

βœ… Feedback is logged to Firebase

πŸ”„ Model can be retrained via a button in the app

πŸ“ˆ Accuracy and architecture history is visualized

🧬 Architecture can mutate across generations
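A generation-to-generation mutation can be sketched as re-sampling one hyperparameter from a small search space. The space and operator below are illustrative only, not the repo's actual evolution logic:

```python
import random

# Hypothetical search space over the architecture hyperparameters.
SEARCH_SPACE = {
    "num_layers": [4, 6, 8],
    "hidden_size": [256, 384, 512],
    "num_heads": [4, 6, 8],
    "ffn_dim": [512, 1024, 2048],
}

def mutate(config, rng=random):
    """Return a child config with one hyperparameter re-sampled.
    (The child may equal the parent if the same value is drawn.)"""
    child = dict(config)
    gene = rng.choice(list(SEARCH_SPACE))
    child[gene] = rng.choice(SEARCH_SPACE[gene])
    return child

parent = {"num_layers": 6, "hidden_size": 384, "num_heads": 6, "ffn_dim": 1024}
child = mutate(parent)
```

In practice the child would be trained and kept only if its validation accuracy beats the parent's.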

## 🚀 Try the Live Demo

▶️ Launch on Hugging Face Spaces

Ask a question, give two options, and watch Evo reason, decide, and grow.

## 💡 Why EvoTransformer?

| Feature | Benefit |
|---|---|
| ✅ Fully Custom Architecture | No reliance on pretrained backbones |
| ✅ Lightweight & Fast | Runs on Colab or entry-level GPUs |
| ✅ Evolves with Feedback | Learns even after deployment |
| ✅ Transparent | You can inspect, retrain, and mutate it |
| ✅ GPT-Competitive | Performs well on commonsense benchmarks |

πŸ“ Repository Structure bash Copy Edit πŸ“¦ root β”œβ”€β”€ app.py # Main Gradio app β”œβ”€β”€ evo_model.py # EvoTransformer architecture β”œβ”€β”€ inference.py # Generates model outputs β”œβ”€β”€ logger.py # Logs user feedback to Firebase β”œβ”€β”€ watchdog.py # Retraining + logging engine β”œβ”€β”€ dashboard.py # Accuracy/evolution plots β”œβ”€β”€ init_model.py # Weight & config initializer β”œβ”€β”€ firebase_key.json # πŸ” Private Firebase credentials └── trained_model/ # Fine-tuned weights + logs πŸ“œ License Apache 2.0 Open to experimentation, extension, and collaboration. Fork it. Mutate it. Evolve it.

## 👤 Author

**Dr. Heman Mohabeer**
Founder, Intelligent Africa Ltd
🚀 AI Strategist • Researcher • Visionary
📧 heman.m@iafrica.solutions
🌍 LinkedIn

## 🙌 Contribute

Want to push Evo further?

- 🧬 Propose new mutations
- 🧪 Submit architecture variants
- 🧠 Benchmark against other reasoning datasets
- 📦 PRs welcome for live retraining, visual dashboards, or token optimization

Let's build the world's most adaptable reasoning AI, one mutation at a time.