Organic Design of Massively Distributed Systems: A Complex Networks Perspective
The vision of Organic Computing addresses challenges that arise in the design
of future information systems that are comprised of numerous, heterogeneous,
resource-constrained and error-prone components or devices. Here, the notion
organic particularly highlights the idea that, in order to be manageable, such
systems should exhibit self-organization, self-adaptation and self-healing
characteristics similar to those of biological systems. In recent years, the
principles underlying many of the interesting characteristics of natural
systems have been investigated from the perspective of complex systems science,
particularly using the conceptual framework of statistical physics and
statistical mechanics. In this article, we review some of the interesting
relations between statistical physics and networked systems and discuss
applications in the engineering of organic networked computing systems with
predictable, quantifiable and controllable self-* properties.

Comment: 17 pages, 14 figures, preprint of submission to Informatik-Spektrum published by Springer
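The statistical-ensemble view of networks that the abstract invokes can be made concrete with the simplest such ensemble, the Erdős–Rényi random graph G(n, p): each possible edge exists independently with probability p, and ensemble-level observables such as the degree distribution are what statistical-physics treatments of networked systems reason about. The following is a minimal illustrative sketch (the parameters n = 200, p = 0.05 are arbitrary choices, not from the article):

```python
import random
from collections import Counter

def erdos_renyi(n, p, seed=42):
    """Sample one network from the G(n, p) ensemble: each of the
    n*(n-1)/2 possible undirected edges exists independently with
    probability p."""
    rng = random.Random(seed)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p]

def degree_distribution(n, edges):
    """Fraction of nodes having each degree -- an ensemble-level
    observable of the kind statistical-physics approaches predict."""
    deg = Counter()
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    counts = Counter(deg.get(v, 0) for v in range(n))
    return {k: c / n for k, c in sorted(counts.items())}

n, p = 200, 0.05
edges = erdos_renyi(n, p)
dist = degree_distribution(n, edges)
# For G(n, p) the mean degree concentrates around p * (n - 1) ≈ 10.
mean_degree = sum(k * f for k, f in dist.items())
```

For large n the degree distribution of this ensemble is approximately Poisson with mean p(n-1); heavy-tailed distributions observed in real systems require richer ensembles, which is part of what motivates the complex-networks perspective.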
QAmplifyNet: Pushing the Boundaries of Supply Chain Backorder Prediction Using Interpretable Hybrid Quantum-Classical Neural Network
Supply chain management relies on accurate backorder prediction for
optimizing inventory control, reducing costs, and enhancing customer
satisfaction. However, traditional machine-learning models struggle with
large-scale datasets and complex relationships, limiting their usefulness on
real-world data. This research introduces a novel methodological framework for
supply chain backorder prediction, addressing the challenge of handling large
supply chain backorder prediction, addressing the challenge of handling large
datasets. Our proposed model, QAmplifyNet, employs quantum-inspired techniques
within a quantum-classical neural network to predict backorders effectively on
short and imbalanced datasets. Experimental evaluations on a benchmark dataset
demonstrate QAmplifyNet's superiority over classical models, quantum ensembles,
quantum neural networks, and deep reinforcement learning. Its proficiency in
handling short, imbalanced datasets makes it an ideal solution for supply chain
management. To enhance model interpretability, we use Explainable Artificial
Intelligence techniques. Practical implications include improved inventory
control, reduced backorders, and enhanced operational efficiency. QAmplifyNet
seamlessly integrates into real-world supply chain management systems, enabling
proactive decision-making and efficient resource allocation. Future work
involves exploring additional quantum-inspired techniques, expanding the
dataset, and investigating other supply chain applications. This research
unlocks the potential of quantum computing in supply chain optimization and
paves the way for further exploration of quantum-inspired machine learning
models in supply chain management. Our framework and QAmplifyNet model offer a
breakthrough approach to supply chain backorder prediction, providing superior
performance and opening new avenues for leveraging quantum-inspired techniques
in supply chain management.
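The abstract's emphasis on short, imbalanced backorder datasets reflects a general evaluation pitfall: when backorders are rare, plain accuracy rewards a model that never predicts them. A small sketch, using an invented toy label set rather than the paper's benchmark data, of why precision/recall/F1 on the rare class is the right lens:

```python
def evaluate(y_true, y_pred):
    """Precision, recall and F1 for the rare 'backorder' class (label 1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical imbalanced labels: 2 backorders among 20 samples.
y_true = [0] * 18 + [1, 1]
majority = [0] * 20              # always predict "no backorder"
informed = [0] * 18 + [1, 0]     # catches one of the two backorders

# The majority baseline reaches 90% accuracy yet F1 = 0 on backorders.
acc_majority = sum(t == p for t, p in zip(y_true, majority)) / len(y_true)
_, _, f1_majority = evaluate(y_true, majority)
_, _, f1_informed = evaluate(y_true, informed)
```

This is why rare-class metrics, rather than accuracy, are the meaningful yardstick when comparing models such as QAmplifyNet against classical baselines on imbalanced data.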
Neurons and symbols: a manifesto
We discuss the purpose of neural-symbolic integration including its principles, mechanisms and applications. We outline a cognitive computational model for neural-symbolic integration, position the model in the broader context of multi-agent systems, machine learning and automated reasoning, and list some of the challenges for the area of
neural-symbolic computation to achieve the promise of effective integration of robust learning and expressive reasoning under uncertainty.
From Relational Data to Graphs: Inferring Significant Links using Generalized Hypergeometric Ensembles
The inference of network topologies from relational data is an important
problem in data analysis. Exemplary applications include the reconstruction of
social ties from data on human interactions, the inference of gene
co-expression networks from DNA microarray data, or the learning of semantic
relationships based on co-occurrences of words in documents. Solving these
problems requires techniques to infer significant links in noisy relational
data. In this short paper, we propose a new statistical modeling framework to
address this challenge. It builds on generalized hypergeometric ensembles, a
class of generative stochastic models that give rise to analytically tractable
probability spaces of directed, multi-edge graphs. We show how this framework
can be used to assess the significance of links in noisy relational data. We
illustrate our method in two data sets capturing spatio-temporal proximity
relations between actors in a social system. The results show that our
analytical framework provides a new approach to infer significant links from
relational data, with interesting perspectives for the mining of data on social
systems.

Comment: 10 pages, 8 figures, accepted at SocInfo201
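The core idea of testing link significance against a stochastic ensemble can be illustrated with the plain (not generalized) hypergeometric distribution: under a null model, the number of interaction events landing on a given node pair follows a hypergeometric law, and an upper-tail p-value flags pairs whose observed multiplicity is unlikely by chance. The following is a simplified sketch, not the paper's generalized-ensemble formulation, with made-up numbers:

```python
from math import comb

def hypergeom_pval(N, K, n, k):
    """Upper-tail p-value P(X >= k) for X ~ Hypergeometric(N, K, n):
    n events drawn from N possible event slots, K of which belong to
    the node pair under test. A small p-value marks the observed link
    weight as significant relative to the null ensemble."""
    denom = comb(N, n)
    return sum(comb(K, x) * comb(N - K, n - x)
               for x in range(k, min(K, n) + 1)) / denom

# Hypothetical example: 100 event slots, 10 of them on pair (i, j);
# we observe 20 events, 8 of which fall between i and j.
# Under the null the expectation is n * K / N = 2, so 8 is far in the tail.
p = hypergeom_pval(N=100, K=10, n=20, k=8)
significant = p < 0.01
```

The generalized hypergeometric ensembles of the paper extend this test to directed multi-edge graphs with node-specific propensities, but the significance logic, comparing an observed count against an analytically tractable null distribution, is the same.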
Optimal Ensemble Control of Loads in Distribution Grids with Network Constraints
Flexible loads, e.g. thermostatically controlled loads (TCLs), are
technically feasible to participate in demand response (DR) programs. On the
other hand, a number of challenges need to be resolved before this can be
implemented in practice en masse. First, individual TCLs must be aggregated
and operated in sync to scale up DR benefits. Second, the uncertainty of TCL
behavior needs to be accounted for. Third, exercising the flexibility of TCLs
needs to be coordinated with distribution system operations to avoid
unnecessary power losses and to comply with power flow and voltage limits.
This paper addresses these challenges. We propose a network-constrained,
open-loop, stochastic optimal control formulation. The first part of this
formulation represents ensembles of collocated TCLs modelled by an aggregated
Markov Process (MP), where each MP state is associated with a given power
consumption or production level. The second part extends MPs to a multi-period
distribution power flow optimization. In this optimization, the control of TCL
ensembles is regulated by transition probability matrices and physically
enabled by local active and reactive power controls at TCL locations. The
optimization is solved with a Spatio-Temporal Dual Decomposition (ST-D2)
algorithm. The performance of the proposed formulation and algorithm is
demonstrated on the IEEE 33-bus distribution model.

Comment: 7 pages, 6 figures, accepted at PSCC 201
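The aggregated Markov Process model described above, where each state carries a power level and control acts through the transition probability matrix, can be sketched for a two-state TCL ensemble. All numbers below (unit count, power levels, transition probabilities) are hypothetical illustrations, not values from the paper:

```python
import random

# Two-state TCL: state 0 = compressor OFF (0 kW), state 1 = ON (5 kW).
# The controller shapes aggregate demand by choosing the transition
# probability matrix P for each period.
POWER_KW = [0.0, 5.0]

def step(counts, P, rng):
    """Advance the ensemble one period. counts[s] is the number of
    units in state s; P[s][s2] is the transition probability.
    (Two-state chain: draw against P[s][0], else move to state 1.)"""
    new = [0, 0]
    for s, n in enumerate(counts):
        for _ in range(n):
            s2 = 0 if rng.random() < P[s][0] else 1
            new[s2] += 1
    return new

def aggregate_power(counts):
    """Ensemble power consumption implied by the state occupancies."""
    return sum(n * POWER_KW[s] for s, n in enumerate(counts))

rng = random.Random(0)
counts = [800, 200]            # 1000 units, 20% currently ON
P = [[0.7, 0.3], [0.4, 0.6]]   # control action held fixed here
for _ in range(50):
    counts = step(counts, P, rng)
# The stationary ON-fraction of this chain is 0.3 / (0.3 + 0.4) ≈ 0.43,
# so aggregate power settles near 0.43 * 1000 * 5 kW ≈ 2140 kW.
power = aggregate_power(counts)
```

In the paper's formulation the matrix P becomes a decision variable inside a multi-period power flow optimization, so the steady state it induces can be steered subject to network constraints; this sketch only shows the forward dynamics of one ensemble under a fixed P.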
Secure, performance-oriented data management for nanoCMOS electronics
The EPSRC pilot project Meeting the Design Challenges of nanoCMOS Electronics (nanoCMOS) is focused upon delivering a production-level e-Infrastructure to meet the challenges facing the semiconductor industry in dealing with the next generation of ‘atomic-scale’ transistor devices. This scale means that previous assumptions on the uniformity of transistor devices in electronics circuit and systems design are no longer valid, and the industry as a whole must deal with variability throughout the design process. Infrastructures to tackle this problem must provide seamless access to very large HPC resources for computationally expensive simulation of statistical ensembles of microscopically varying physical devices, and manage the many hundreds of thousands of files and metadata associated with these simulations. A key challenge in undertaking this is in protecting the intellectual property associated with the data, simulations and design process as a whole. In this paper we present the nanoCMOS infrastructure and outline an evaluation undertaken on the Storage Resource Broker (SRB) and the Andrew File System (AFS), considering in particular the extent to which they meet the performance and security requirements of the nanoCMOS domain. We also describe how metadata management is supported and linked to simulations and results in a scalable and secure manner.