Abstract
Tensor logic unifies neural and symbolic AI by using tensor equations, enabling scalable, learnable, and transparent AI systems.
Progress in AI is hindered by the lack of a programming language with all the requisite features. Libraries like PyTorch and TensorFlow provide automatic differentiation and efficient GPU implementation, but are additions to Python, which was never intended for AI. Their lack of support for automated reasoning and knowledge acquisition has led to a long and costly series of hacky attempts to tack them on. On the other hand, AI languages like LISP and Prolog lack scalability and support for learning. This paper proposes tensor logic, a language that solves these problems by unifying neural and symbolic AI at a fundamental level. The sole construct in tensor logic is the tensor equation, based on the observation that logical rules and Einstein summation are essentially the same operation, and all else can be reduced to them. I show how to elegantly implement key forms of neural, symbolic, and statistical AI in tensor logic, including transformers, formal reasoning, kernel machines, and graphical models. Most importantly, tensor logic makes new directions possible, such as sound reasoning in embedding space. This combines the scalability and learnability of neural networks with the reliability and transparency of symbolic reasoning, and is potentially a basis for the wider adoption of AI.
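To make the core observation concrete, here is a minimal NumPy sketch (not the paper's own tensor logic syntax; the relation names, shapes, and data are illustrative assumptions) showing that applying a logical rule is an Einstein summation over Boolean tensors, and that a transformer's attention step reduces to the same einsum primitive:

```python
import numpy as np

# Hypothetical relations over a domain of 4 constants, encoded as
# Boolean tensors: sister[x, y] = 1 iff Sister(x, y) holds, etc.
n = 4
sister = np.zeros((n, n))
parent = np.zeros((n, n))
sister[0, 1] = 1   # Sister(0, 1)
parent[1, 2] = 1   # Parent(1, 2)

# The rule  Aunt(x, z) <- Sister(x, y), Parent(y, z)  is an Einstein
# summation over the shared index y, followed by a step function that
# maps positive counts back to Boolean truth values.
aunt = np.einsum('xy,yz->xz', sister, parent) > 0
assert aunt[0, 2]  # Aunt(0, 2) is derived

# The same primitive expresses neural components: scaled dot-product
# attention is two einsums around a softmax.
d, m = 8, 5
rng = np.random.default_rng(0)
Q, K, V = rng.standard_normal((3, m, d))
scores = np.einsum('qd,kd->qk', Q, K) / np.sqrt(d)
weights = np.exp(scores - scores.max(-1, keepdims=True))
weights /= weights.sum(-1, keepdims=True)      # softmax over keys
output = np.einsum('qk,kd->qd', weights, V)    # shape (m, d)
```

In tensor logic itself, the rule and the attention layer would each be written as a single tensor equation; the NumPy calls above only illustrate the shared einsum structure the abstract refers to.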
Community
New paper exploring neural-symbolic AI
This is an automated message from the Librarian Bot. The following similar papers were recommended by the Semantic Scholar API:
- Neuro-Symbolic Frameworks: Conceptual Characterization and Empirical Comparative Analysis (2025)
- The Syntax and Semantics of einsum (2025)
- The DeepLog Neurosymbolic Machine (2025)
- GPU-Accelerated Loopy Belief Propagation for Program Analysis (2025)
- Learning words in groups: fusion algebras, tensor ranks and grokking (2025)
- What do language models model? Transformers, automata, and the format of thought (2025)
- Constructing Invariant and Equivariant Operations by Symmetric Tensor Network (2025)