Journal: ArXiv Preprint
Published: Unknown
Network Science · Machine Learning

Approximate Bayesian Inference on Mechanisms of Network Growth and Evolution

Harvard T.H. Chan School of Public Health

Maxwell H Wang, Till Hoffmann, Jukka-Pekka Onnela

The 30-Second View

IN SHORT: This paper addresses the core challenge of inferring the relative contributions of multiple, simultaneous generative mechanisms in network formation when the true likelihood is intractable.

Innovation (TL;DR)

  • Methodology: Proposes an event-wise mixture-of-mechanisms model that assigns generative rules (e.g., Preferential Attachment, Random Attachment) to each edge formation event, rather than to nodes, increasing model flexibility and realism (see the simulation sketch after this list).
  • Methodology: Introduces a novel GNN-MDN (Graph Neural Network - Mixture Density Network) architecture that automatically learns informative, low-dimensional network embeddings for conditional density estimation, bypassing the need for manually specified summary statistics.
  • Theory: Formalizes a unified framework that incorporates both growth mechanisms (adding nodes/edges) and evolution mechanisms (modifying existing edges), allowing the model to capture a wider range of network dynamics, such as triangle formation.
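
To make the event-wise idea concrete, here is a minimal simulation sketch. The three mechanism names follow the paper (Random Attachment, Preferential Attachment, Triangle Formation), but the specific attachment rules, the seed graph, and the one-mechanism-per-new-node simplification are illustrative assumptions rather than the authors' exact model.

```python
# Minimal sketch of an event-wise mixture-of-mechanisms growth model.
# The concrete sampling rules below are illustrative assumptions.
import numpy as np
import networkx as nx

def grow_network(n_nodes, weights, seed=0):
    """Grow a network, drawing one mechanism per edge-formation event."""
    rng = np.random.default_rng(seed)
    G = nx.complete_graph(3)  # small seed graph
    for new in range(3, n_nodes):
        G.add_node(new)
        mech = rng.choice(["random", "preferential", "triangle"], p=weights)
        existing = [v for v in G if v != new]
        if mech == "random":            # uniform target
            target = rng.choice(existing)
        elif mech == "preferential":    # degree-proportional target
            deg = np.array([G.degree(v) for v in existing], float)
            target = rng.choice(existing, p=deg / deg.sum())
        else:                           # close a triangle via a two-path
            anchor = rng.choice(existing)
            target = rng.choice(list(G.neighbors(anchor)))
            G.add_edge(new, anchor)
        G.add_edge(new, target)
    return G

# Example: Preferential Attachment dominates, as in the simulated scenario
# discussed under "Key conclusions" (weight order here is RA, PA, TF).
G = grow_network(500, weights=[0.025, 0.95, 0.025])
```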

Key conclusions

  • The proposed GNN-MDN method provides valid approximate Bayesian inference, demonstrated via simulation studies in which the 95% credible intervals achieve nominal coverage of the true parameter values.
  • The event-wise model successfully infers dominant mechanisms in simulated scenarios; for instance, it accurately recovers a weight vector of (0.95, 0.025, 0.025) for a scenario where Preferential Attachment is the primary growth mechanism.
  • The method is applicable to real-world networks, providing interpretable decompositions of their formation processes into quantifiable contributions from mechanisms like Random Attachment, Preferential Attachment, and Triangle Formation.

Background and Gap: Existing methods for inferring mechanism weights in network growth models rely on manually selected, often high-dimensional summary statistics (e.g., clustering coefficient, motif counts), which are subject to the curse of dimensionality, may not be Bayes-sufficient, and cannot discover more informative, lower-dimensional transformations of the data.
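
This contrast can be made concrete with a small sketch. Assuming the conditional density estimator is a mixture density network over the (suitably transformed) mechanism weights, a minimal head might look as follows; the Gaussian-mixture parameterization and all layer sizes are illustrative assumptions, and the graph-level embedding is taken as given here (in the paper it is learned by a GNN rather than hand-crafted).

```python
# Minimal sketch of a mixture density network (MDN) head, assuming a
# graph-level embedding has already been produced by a GNN encoder.
# The Gaussian-mixture family is an illustrative choice, not necessarily
# the exact density family used in the paper.
import torch
import torch.nn as nn

class MDNHead(nn.Module):
    def __init__(self, embed_dim, param_dim, n_components=5, hidden=64):
        super().__init__()
        self.param_dim = param_dim
        self.n_components = n_components
        self.trunk = nn.Sequential(nn.Linear(embed_dim, hidden), nn.ReLU())
        self.logits = nn.Linear(hidden, n_components)                  # mixture weights
        self.means = nn.Linear(hidden, n_components * param_dim)       # component means
        self.log_sigmas = nn.Linear(hidden, n_components * param_dim)  # component scales

    def log_prob(self, embedding, theta):
        """Log-density of mechanism weights theta given a graph embedding."""
        h = self.trunk(embedding)
        log_w = torch.log_softmax(self.logits(h), dim=-1)
        mu = self.means(h).view(-1, self.n_components, self.param_dim)
        sigma = self.log_sigmas(h).exp().view(-1, self.n_components, self.param_dim)
        comp = torch.distributions.Normal(mu, sigma)
        # sum over parameter dimensions, log-sum-exp over mixture components
        log_comp = comp.log_prob(theta.unsqueeze(1)).sum(-1)
        return torch.logsumexp(log_w + log_comp, dim=-1)

# Training would minimize -log_prob on (embedding, true-weight) pairs
# simulated from the generative model.
```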

Abstract: Mechanistic models can provide an intuitive and interpretable explanation of network growth by specifying a set of generative rules. These rules can be defined by domain knowledge about real-world mechanisms governing network growth or may be designed to facilitate the appearance of certain network motifs. In the formation of real-world networks, multiple mechanisms may be simultaneously involved; it is then important to understand the relative contribution of each of these mechanisms. In this paper, we propose the use of a conditional density estimator, augmented with a graph neural network, to perform inference on a flexible mixture of network-forming mechanisms. This event-wise mixture-of-mechanisms model assigns mechanisms to each edge formation event rather than stipulating node-level mechanisms, thus allowing for an explanation of the network generation process, as well as the dynamic evolution of the network over time. We demonstrate that our approximate Bayesian approach yields valid inferences for the relative weights of the mechanisms in our model, and we utilize this method to investigate the mechanisms behind the formation of a variety of real-world networks.
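
Putting the pieces together, a hypothetical end-to-end training loop for the approximate Bayesian procedure might draw mechanism weights from a Dirichlet prior, simulate a network, encode it with a small message-passing network, and fit the density estimator by maximum likelihood. The sketch below reuses grow_network and MDNHead from the earlier sketches; the Dirichlet prior, the tiny encoder, and all hyperparameters are assumptions for illustration, not the authors' specification.

```python
# Hypothetical simulation-based training loop (illustrative only).
import numpy as np
import networkx as nx
import torch
import torch.nn as nn

class TinyGNNEncoder(nn.Module):
    """Two rounds of mean-neighbor aggregation followed by global mean pooling."""
    def __init__(self, embed_dim=32):
        super().__init__()
        self.lin1 = nn.Linear(1, embed_dim)
        self.lin2 = nn.Linear(embed_dim, embed_dim)

    def forward(self, adj):
        deg = adj.sum(1, keepdim=True).clamp(min=1.0)
        x = torch.ones(adj.shape[0], 1)           # constant initial node features
        x = torch.relu(self.lin1(adj @ x / deg))  # round 1: aggregate + transform
        x = torch.relu(self.lin2(adj @ x / deg))  # round 2
        return x.mean(0, keepdim=True)            # graph-level embedding

encoder, mdn = TinyGNNEncoder(), MDNHead(embed_dim=32, param_dim=3)
opt = torch.optim.Adam(list(encoder.parameters()) + list(mdn.parameters()), lr=1e-3)

for step in range(2000):
    w = np.random.dirichlet([1.0, 1.0, 1.0])        # prior over mechanism weights
    G = grow_network(200, weights=w, seed=step)     # simulator from the sketch above
    adj = torch.tensor(nx.to_numpy_array(G), dtype=torch.float32)
    # Raw simplex weights are used directly here for simplicity; a
    # simplex-aware transform would be more principled.
    theta = torch.tensor(w, dtype=torch.float32).unsqueeze(0)
    loss = -mdn.log_prob(encoder(adj), theta).mean()  # negative log-likelihood
    opt.zero_grad()
    loss.backward()
    opt.step()
```

After training, evaluating mdn.log_prob over a grid of candidate weights for an observed network's embedding gives an approximate posterior over the mechanism contributions.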