Paper List
-
A Unified Variational Principle for Branching Transport Networks: Wave Impedance, Viscous Flow, and Tissue Metabolism
This paper solves the core problem of predicting the empirically observed branching exponent (α≈2.7) in mammalian arterial trees, which neither Murray...
-
Household Bubbling Strategies for Epidemic Control and Social Connectivity
This paper addresses the core challenge of designing household merging (social bubble) strategies that effectively control epidemic risk while maximiz...
-
Empowering Chemical Structures with Biological Insights for Scalable Phenotypic Virtual Screening
This paper addresses the core challenge of bridging the gap between scalable chemical structure screening and biologically informative but resource-in...
-
A mechanical bifurcation constrains the evolution of cell sheet folding in the family Volvocaceae
This paper addresses the core problem of why there is an evolutionary gap in species with intermediate cell numbers (e.g., 256 cells) in Volvocaceae, ...
-
Bayesian Inference in Epidemic Modelling: A Beginner’s Guide Illustrated with the SIR Model
This guide addresses the core challenge of estimating uncertain epidemiological parameters (like transmission and recovery rates) from noisy, real-wor...
-
Geometric framework for biological evolution
This paper addresses the fundamental challenge of developing a coordinate-independent, geometric description of evolutionary dynamics that bridges gen...
-
A multiscale discrete-to-continuum framework for structured population models
This paper addresses the core challenge of systematically deriving uniformly valid continuum approximations from discrete structured population models...
-
Whole slide and microscopy image analysis with QuPath and OMERO
Enables QuPath to analyze images stored on an OMERO server directly, without downloading the entire dataset, overcoming the local-storage limits of large-scale studies.
-
Equivalence of approximation by networks of single- and multi-spike neurons
Faculty of Mathematics and Research Network DataScience @ Uni Vienna, University of Vienna
30-Second Summary
IN SHORT: This paper resolves the fundamental question of whether single-spike spiking neural networks (SNNs) are inherently less expressive than multi-spike SNNs, proving their theoretical equivalence in approximation capabilities.
Core Innovations
- Theory Established a formal transference principle (Theorem 1) proving that approximation bounds for multi-spike SNNs directly translate to single-spike SNNs with at most N_s·n neurons, and vice versa.
- Methodology Developed constructive proofs showing how to replace any multi-spike neuron with N_s single-spike neurons (by threshold adjustment) and any single-spike neuron with αN_s multi-spike neurons (via spike cancellation).
- Theory Extended the equivalence to include lower bounds (Corollary 1) and common input encoders (Corollary 2), making existing theoretical results for one paradigm immediately applicable to the other.
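The threshold-adjustment construction from the second bullet can be sketched in the simplest setting: a non-leaky integrate-and-fire neuron with subtractive reset (a simplifying assumption for clarity; the paper's Theorem 1 covers a broader class, including LIF). The function names and toy simulation below are illustrative, not the authors' code. The key observation is that with subtractive reset, the j-th spike fires when the accumulated input first reaches j·θ, so one multi-spike neuron firing N_s times is replicated exactly by N_s single-spike copies whose thresholds are raised to θ, 2θ, ..., N_s·θ.

```python
# Hedged sketch of the threshold-adjustment construction, assuming a
# non-leaky integrate-and-fire neuron with subtractive reset.

def multi_spike_times(inputs, theta):
    """Spike steps of one neuron with threshold theta and subtractive reset."""
    v, spikes = 0.0, []
    for t, c in enumerate(inputs):
        v += c
        while v >= theta:        # may fire repeatedly, subtracting theta per spike
            spikes.append(t)
            v -= theta
    return spikes

def single_spike_times(inputs, theta, n_s):
    """Spike steps of n_s single-spike neurons with thresholds theta, 2*theta, ..."""
    spikes = []
    for j in range(1, n_s + 1):  # copy j fires once, at raised threshold j*theta
        v = 0.0
        for t, c in enumerate(inputs):
            v += c
            if v >= j * theta:
                spikes.append(t)
                break
    return sorted(spikes)

# Dyadic input values keep the float arithmetic exact in this toy example.
inputs = [0.25, 0.75, 0.5, 0.75, 0.25, 1.0, 0.5, 0.25, 0.5, 0.75]
multi = multi_spike_times(inputs, theta=1.0)
single = single_spike_times(inputs, theta=1.0, n_s=len(multi))
print(multi == single)   # True: identical spike trains using N_s single-spike copies
```

This mirrors the linear neuron-count scaling of the theorem: one multi-spike neuron that fires N_s times costs N_s single-spike neurons.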
Key Conclusions
- Single-spike and multi-spike SNNs are theoretically equivalent in approximation capabilities for a large class of neuron models including LIF with subtractive reset.
- Any approximation bound for multi-spike SNNs with n neurons translates to single-spike SNNs with at most N_s·n neurons (linear scaling in maximum spike count).
- The reverse direction holds with prefactor α ≤ min(1, 6/π² + 1/√N_s) for N_s ≥ 1, and α < 6/π² + 1/(2√N_s) for N_s ≥ 8.
Abstract: In a spiking neural network, is it enough for each neuron to spike at most once? In recent work, approximation bounds for spiking neural networks have been derived, quantifying how well they can fit target functions. However, these results are only valid for neurons that spike at most once, which is commonly thought to be a strong limitation. Here, we show that the opposite is true for a large class of spiking neuron models, including the commonly used leaky integrate-and-fire model with subtractive reset: for every approximation bound that is valid for a set of multi-spike neural networks, there is an equivalent set of single-spike neural networks with only linearly more neurons (in the maximum number of spikes) for which the bound holds. The same is true for the reverse direction, showing that regarding their approximation capabilities in general machine learning tasks, single-spike and multi-spike neural networks are equivalent. Consequently, many approximation results in the literature for single-spike neural networks also hold for the multi-spike case.