A minimal DSL + IR that compiles logical rules (predicates, quantifiers, implications) into optimized einsum graphs. Full neurosymbolic integration with differentiable fuzzy/probabilistic semantics — now available as 0.1.0-rc.1.
A quiet but powerful addition to the COOLJAPAN ecosystem.
On March 7 we made TensorLogic 0.1.0-rc.1 available — a new layer that turns symbolic logic into native tensor computation.
This is not just another logic library.
It is a compiler that translates arbitrary logical expressions into highly optimized einsum graphs, enabling seamless hybrid neural + symbolic + probabilistic reasoning inside a single tensor runtime.
Traditional neurosymbolic systems struggle with two problems: symbolic inference is not differentiable, so it cannot be trained end to end alongside neural components, and logic engines do not exploit modern tensor hardware.
TensorLogic solves both by treating logic as first-class tensor algebra.
You write rules in a tiny DSL:
```
let rule = exists(x, forall(y, implies(P(x), Q(x, y))));
```
The compiler turns it into an optimized EinsumGraph that runs on SciRS2 (or ToRSh) at full SIMD/GPU speed — with gradients if you want them.
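To make the pipeline concrete, here is a minimal NumPy sketch of the tensor computation such a rule could compile down to. The `max(1 - a, b)` implication follows the mapping table below; the min/max quantifiers are the Gödel-style variants, assumed here for illustration — the graph TensorLogic actually emits depends on the chosen `CompilationStrategy`.

```python
import numpy as np

# Fuzzy truth values in [0, 1]: P(x) over 3 entities, Q(x, y) over 3 x 2 pairs.
P = np.array([0.9, 0.2, 0.7])
Q = np.array([[0.8, 1.0],
              [0.1, 0.3],
              [0.9, 0.6]])

# implies(P(x), Q(x, y))  ->  max(1 - P, Q), broadcast over the y axis
impl = np.maximum(1.0 - P[:, None], Q)   # shape (3, 2)

# forall(y, ...)  ->  min over the y axis (Godel-style quantifier)
inner = impl.min(axis=1)                 # shape (3,)

# exists(x, ...)  ->  max over the x axis
truth = inner.max()
print(truth)                             # → 0.8
```

Because every step is an elementwise op or an axis reduction, the whole rule evaluates as one fused tensor graph, and gradients flow through it for free.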
DSL → AST
TLExpr with predicates, quantifiers (∀, ∃), connectives, and custom operators.
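As a rough illustration of the shape such an AST takes — the class names below are hypothetical stand-ins, not TensorLogic's actual `TLExpr` API — the rule from the DSL example decomposes into a small tree of node types:

```python
from dataclasses import dataclass

# Hypothetical TLExpr-style AST nodes; names are illustrative only.
@dataclass(frozen=True)
class Pred:
    name: str
    args: tuple          # bound variable names, e.g. ("x", "y")

@dataclass(frozen=True)
class Implies:
    lhs: object
    rhs: object

@dataclass(frozen=True)
class Forall:
    var: str
    body: object

@dataclass(frozen=True)
class Exists:
    var: str
    body: object

# exists(x, forall(y, implies(P(x), Q(x, y))))
rule = Exists("x", Forall("y", Implies(Pred("P", ("x",)),
                                       Pred("Q", ("x", "y")))))
print(rule.var, rule.body.var)  # → x y
```

Each quantifier node binds one variable, which the IR stage later resolves to a concrete tensor axis.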
IR Generation
Static analysis produces an EinsumGraph IR.
Logic → Tensor Mapping (configurable via CompilationStrategy)
| Logical Operator | Tensor Operation (default) | Differentiable variant |
|---|---|---|
| AND(a, b) | a * b (Hadamard) | soft product |
| OR(a, b) | max(a, b) | soft max / logsumexp |
| NOT(a) | 1.0 - a | sigmoid-based |
| ∃x.P(x) | sum(P, axis=x) | logsumexp |
| ∀x.P(x) | 1.0 - sum(1 - P, axis=x) | soft min |
| a → b | max(1 - a, b) | ReLU(b - a) |
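The default column of this table translates directly into a few lines of array code. The sketch below is plain NumPy written for illustration (not taken from the library); it also shows that the ∀ row is exactly the De Morgan dual of the ∃ row, ∀x.P(x) = ¬∃x.¬P(x):

```python
import numpy as np

def t_and(a, b): return a * b                  # Hadamard product
def t_or(a, b):  return np.maximum(a, b)
def t_not(a):    return 1.0 - a
def exists(p, axis): return p.sum(axis=axis)   # hard variant (logsumexp when differentiable)
def forall(p, axis): return 1.0 - (1.0 - p).sum(axis=axis)
def implies(a, b):   return np.maximum(1.0 - a, b)

p = np.array([1.0, 1.0, 0.0])                  # P(x) over three entities
print(float(forall(p, 0)))                     # → 0.0 (one counterexample kills ∀)
# De Morgan duality holds by construction:
assert np.allclose(forall(p, 0), t_not(exists(t_not(p), 0)))
```

Note that the hard sum-based quantifiers are not clamped to [0, 1] (here `exists(p, 0)` is 2.0); the differentiable variants (logsumexp, soft min) stay bounded and behave better under gradient descent.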
Six built-in strategies: hard_boolean, soft_differentiable, fuzzy_godel, fuzzy_product, fuzzy_lukasiewicz, probabilistic.
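The three fuzzy strategies correspond to the three classical t-norms for conjunction. The definitions below are the standard textbook ones; that TensorLogic's implementations match them exactly is an assumption on my part:

```python
import numpy as np

def and_godel(a, b):       return np.minimum(a, b)              # Godel (minimum) t-norm
def and_product(a, b):     return a * b                         # product t-norm
def and_lukasiewicz(a, b): return np.maximum(0.0, a + b - 1.0)  # Lukasiewicz t-norm

a, b = 0.7, 0.6
for f in (and_godel, and_product, and_lukasiewicz):
    print(f.__name__, round(float(f(a, b)), 2))
# and_godel 0.6
# and_product 0.42
# and_lukasiewicz 0.3
```

Gödel is the most permissive (and idempotent), Łukasiewicz the strictest, and product sits between the two, matching the semantics of independent probabilities.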
TensorLogic (with Python bindings as pytensorlogic) is now the neurosymbolic glue for the entire COOLJAPAN stack.
Availability
→ https://github.com/cool-japan/tensorlogic
Star the repo if you want to train neural networks that actually obey logic — with gradients.
The boundary between symbolic and neural is gone.
It’s all tensors now.
— KitaSan at COOLJAPAN OU
March 7, 2026