Mathematical Colloquium: Symbolic neural network learning
- Thursday, 30 October 2025, 16:15 - 17:15
- Hörsaal Mathematikon
- Prof. Dr. Helmut Bölcskei (ETH Zurich)
A central challenge for AI systems is symbolic reasoning. This is illustrated by the Abstraction and Reasoning Corpus (ARC) benchmark, on which humans achieve 84% performance compared to 10% for GPT-5. Recent successes in AI reasoning, such as DeepMind’s AlphaEvolve and AlphaGeometry, as well as systems achieving IMO gold-medal-level performance, rely on hybrid methods that combine neural networks with symbolic or geometric reasoning modules. In this talk, we consider the problem of learning the transition rules of cellular automata (CA) from observed evolution traces, a symbolic learning challenge that is even more demanding than ARC. While it has long been known that binary CA are essentially machines realizing operations in Boolean logic, we show that, in fact, all CA are logical machines, specifically in Łukasiewicz propositional many-valued logic. This is accomplished by interpolating CA transition functions to continuous piecewise-linear maps and invoking the McNaughton theorem. Since deep ReLU networks realize continuous piecewise-linear functions, they are naturally suited to extracting these logical rules from CA evolution traces. We show that all CA can be learned by recurrent neural networks. Moreover, the formula in many-valued logic characterizing the CA transition function can be extracted from the learned recurrent neural network. The talk builds a bridge between symbolic logic, cellular automata, and deep learning, pointing toward a path for endowing neural networks with genuine symbolic reasoning capability.
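To give a flavor of the interpolation idea (this sketch is an illustration, not material from the talk): the transition function of elementary CA Rule 90, new state = left XOR right, extends to the continuous piecewise-linear map |a - b|, which a two-unit ReLU layer realizes exactly via relu(a - b) + relu(b - a). The code below applies this ReLU form of the rule to a binary configuration with periodic boundary conditions.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def rule90_relu(left, right):
    # Rule 90: new cell = left XOR right.
    # On {0, 1}, XOR interpolates to the continuous piecewise-linear map
    # |a - b| = relu(a - b) + relu(b - a), i.e. a tiny ReLU network.
    return relu(left - right) + relu(right - left)

def step(state):
    # One synchronous CA update with periodic (wrap-around) boundaries.
    left = np.roll(state, 1)
    right = np.roll(state, -1)
    return rule90_relu(left, right)

state = np.array([0.0, 0.0, 1.0, 0.0, 0.0])
print(step(state))  # [0. 1. 0. 1. 0.]
```

Iterating `step` produces the familiar Sierpinski-triangle trace of Rule 90, now computed entirely by ReLU operations, which hints at why traces of such systems are learnable by ReLU-based recurrent networks.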
Address
Hörsaal Mathematikon
Organizer
Petra Schwer, Johannes Walcher
Event Type
Talk
Contact
Robert Scheichl