Efforts to understand structure-function relationships in networks have
intensified across scientific disciplines. However, which network architectures
are optimal, particularly for complex information processing, remains elusive.
Here, we investigate how specific, optimal network structures emerge to solve
distinct tasks efficiently, using a novel framework of performance-dependent
network evolution that builds on reservoir computing principles. We demonstrate
that the task-specific minimal network
structures obtained through this framework consistently outperform networks
generated by alternative growth strategies and Erd\H{o}s-R\'enyi random
networks. The evolved networks are unexpectedly sparse, obey scaling laws in
node-density space, and display a distinctive asymmetry in the distribution of
input and information-readout nodes. Building on these observations, we propose
a heuristic for quantifying task complexity from networks evolved under
performance-dependent selection, offering valuable insights into the
evolutionary dynamics of network
structure-function relationships. Our findings not only advance the fundamental
understanding of process-specific network evolution but also shed light on the
design and optimization of complex information processing mechanisms, notably
in machine learning.
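For readers unfamiliar with the reservoir computing principles the framework leverages, the following is a minimal, illustrative sketch of an echo state network: a fixed random reservoir driven by an input sequence, with only a ridge-regression readout trained. All dimensions, hyperparameters, and the toy task below are assumptions for illustration and do not reflect the paper's actual implementation or evolved networks.

```python
# Minimal echo-state-network sketch (illustrative only; parameters and task are
# assumptions, not the implementation used in the paper).
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100                   # assumed dimensions
spectral_radius, ridge = 0.9, 1e-6     # assumed hyperparameters

# Fixed random input and reservoir weights (only the readout is trained).
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))  # rescale spectral radius

def run_reservoir(u_seq):
    """Drive the reservoir with an input sequence and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task (assumed): predict the next value of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u_seq, y_seq = np.sin(t[:-1]), np.sin(t[1:])

X = run_reservoir(u_seq)
washout = 100                          # discard initial transient states
X, Y = X[washout:], y_seq[washout:]

# Ridge-regression readout: solve (X^T X + ridge * I) w = X^T Y.
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)
print("training MSE:", np.mean((X @ W_out - Y) ** 2))
```

In this sketch the recurrent weights are never trained; only the linear readout is fit, which is the defining simplification of reservoir computing that the abstract's evolutionary framework leverages.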