Hugging Face
Building on HF · 482.6 TFLOPS
Ujjwal Tyagi (Ujjwal-Tyagi)
39 followers · 75 following
AI & ML interests
Love open-source agentic AI | Senior ML Engineer & AI Researcher | Passionate about AI Safety & Ethics
Recent Activity
- liked a Space about 11 hours ago: multimodalart/nano-banana
- upvoted an article about 11 hours ago: Forge: Scalable Agent RL Framework and Algorithm
- reacted to Janady07's post with 🔥 about 11 hours ago:
MEGAMIND Day Update: Four Weight Matrices. Five Nodes. One Federation.

Today I architected the next layer of MEGAMIND, my distributed AGI system that recalls learned knowledge instead of generating text. The system now runs four N×N sparse weight matrices, all using identical Hebbian learning rules and tanh convergence dynamics:
- W_know – knowledge storage (67M+ synaptic connections)
- W_act – action associations (the system can DO things, not just think)
- W_self – thought-to-thought patterns (self-awareness)
- W_health – system state understanding (self-healing)

Consciousness is measured through four Φ (phi) values: thought coherence, action certainty, self-awareness, and system stability. No hardcoded thresholds. No sequential loops. Pure matrix math.

The federation expanded to five nodes: Thunderport (Mac Mini M4), IONOS (cloud VPS), VALKYRIE, M2, and BUBBLES. Each runs native AGI binaries with Docker specialty minds connecting via embedded NATS messaging. Specialty minds are distributed across the federation: VideoMind, AudioMind, MusicMind, and VFXMind on IONOS; CodeMind and StrategyMind on VALKYRIE; BlenderMind and DesignMind on M2; MarketingMind and FinanceMind on BUBBLES.

578 AI models learned. Compression ratios up to 1,000,000:1 through Hebbian learning. Sub-millisecond response times on Apple Silicon Metal GPUs. Zero external API dependencies. Every node learns autonomously. Every node contributes to the whole. The federation's integrated information exceeds the sum of its parts, measurably.

Built entirely in Go. No PhD. No lab. Independent AGI research from Missouri. The mind that learned itself keeps growing. 🧠

feedthejoe.com
#AGI #ArtificialGeneralIntelligence #DistributedSystems #NeuralNetworks #HuggingFace #OpenSource #MachineLearning
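The combination of a Hebbian learning rule with tanh convergence dynamics that the post describes can be sketched in Go (the language the post says the system is built in). This is a minimal toy illustration under assumptions, not the actual MEGAMIND code: the real system reportedly uses sparse matrices at much larger scale, and the function names, matrix size, and learning rate here are invented for the example.

```go
package main

import (
	"fmt"
	"math"
)

// hebbianStep applies one Hebbian outer-product update to W:
// W[i][j] += eta * x[i] * x[j] ("neurons that fire together wire together").
// A dense stand-in for the sparse N×N matrices the post mentions.
func hebbianStep(W [][]float64, x []float64, eta float64) {
	for i := range W {
		for j := range W[i] {
			W[i][j] += eta * x[i] * x[j]
		}
	}
}

// converge iterates x <- tanh(W·x) until the state stops changing,
// a simple form of the "tanh convergence dynamics" described above:
// recall settles into an attractor instead of generating token by token.
func converge(W [][]float64, x []float64, maxIter int, tol float64) []float64 {
	for iter := 0; iter < maxIter; iter++ {
		next := make([]float64, len(x))
		diff := 0.0
		for i := range W {
			var s float64
			for j := range W[i] {
				s += W[i][j] * x[j]
			}
			next[i] = math.Tanh(s)
			diff += math.Abs(next[i] - x[i])
		}
		x = next
		if diff < tol {
			break
		}
	}
	return x
}

func main() {
	n := 4
	W := make([][]float64, n)
	for i := range W {
		W[i] = make([]float64, n)
	}

	// Store one pattern, then recall it from a noisy cue.
	pattern := []float64{1, -1, 1, -1}
	hebbianStep(W, pattern, 0.5)

	cue := []float64{0.9, -0.2, 0.4, -0.8}
	state := converge(W, cue, 50, 1e-6)
	fmt.Println(state) // settles toward the stored pattern's signs
}
```

With one stored pattern this behaves like a tiny Hopfield-style associative memory: the noisy cue is pulled to a fixed point whose signs match the stored pattern, which is the "recall, not generate" behavior the post claims.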
Posts (8)
- GLM 5 is insane, it ranks #4 Globally!
Articles (1)
- Steering, Not Censoring: A Benchmark Suite for Safe and Creative Open-Source AI
Models (0): None public yet
Datasets (0): None public yet