Optimal Control of Transient Flow in Natural Gas Networks
We outline a new control system model for the distributed dynamics of
compressible gas flow through large-scale pipeline networks with time-varying
injections, withdrawals, and control actions of compressors and regulators. The
gas dynamics PDE equations over the pipelines, together with boundary
conditions at junctions, are reduced using lumped elements to a sparse
nonlinear ODE system expressed in vector-matrix form using graph theoretic
notation. This system, which we call the reduced network flow (RNF) model, is a
consistent discretization of the PDE equations for gas flow. The RNF forms the
dynamic constraints for optimal control problems for pipeline systems with
known time-varying withdrawals and injections and gas pressure limits
throughout the network. The objectives include economic transient compression
(ETC) and minimum load shedding (MLS), which involve minimizing compression
costs or, if that is infeasible, minimizing the unfulfilled deliveries,
respectively. These continuous functional optimization problems are
approximated using the Legendre-Gauss-Lobatto (LGL) pseudospectral collocation
scheme to yield a family of nonlinear programs, whose solutions approach the
optima with finer discretization. Simulation and optimization of time-varying
scenarios on an example natural gas transmission network demonstrate the gains
in security and efficiency over methods that assume steady-state behavior.
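The LGL pseudospectral scheme mentioned above rests on a standard construction: the collocation points are the endpoints ±1 together with the roots of the derivative of the degree-N Legendre polynomial, and the quadrature weights are w_j = 2 / (N(N+1) P_N(x_j)^2). A minimal sketch of that construction (the function name is ours, for illustration; this is not code from the paper):

```python
import numpy as np
from numpy.polynomial.legendre import Legendre

def lgl_nodes_weights(N):
    """Legendre-Gauss-Lobatto nodes and quadrature weights on [-1, 1].

    Nodes: -1, +1, and the roots of P_N'(x).
    Weights: w_j = 2 / (N * (N + 1) * P_N(x_j)^2).
    The resulting rule integrates polynomials up to degree 2N - 1 exactly.
    """
    PN = Legendre.basis(N)
    interior = PN.deriv().roots().real        # roots of P_N' lie in (-1, 1)
    x = np.sort(np.concatenate(([-1.0], interior, [1.0])))
    w = 2.0 / (N * (N + 1) * PN(x) ** 2)
    return x, w
```

In a pseudospectral optimal control method, the state and control trajectories are represented by their values at these nodes, and integrals in the objective are replaced by the corresponding weighted sums.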
Network Applications and the Utah Homeless Network
Graph theory is the foundation on which social network analysis (SNA) is built. With the flood of big data, graph theoretical concepts and their linear algebraic counterparts are essential tools for analysis in the burgeoning field of network data analysis, of which SNA is a subset. Here we begin with an overview of SNA. We then discuss the common descriptive measures taken on network data, as well as proposing new measures specific to homeless networks. We also define a new data structure, which we call the location sequence matrix; this data structure makes certain computational network analyses particularly easy. Finally, we apply pulse processes in a new way to the homeless network in Utah. We believe the new data structure and pulse processes will be useful in the analysis of the Utah homeless services. In particular, pulse processes, first introduced by Brown, Roberts, and Spencer to analyze energy demand, form a dynamic population model that can provide a measure of the stability in a network and of the patterns of action of individuals experiencing homelessness.
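The pulse-process dynamics can be sketched in a few lines. In our reading of the Brown-Roberts-Spencer model, a pulse vector propagates along the weighted edges of a digraph at each time step while vertex values accumulate the pulses that arrive; the weights and initial pulse below are hypothetical, for illustration only:

```python
import numpy as np

def pulse_process(A, p0, steps):
    """Simulate a simple pulse process on a weighted digraph.

    A[i, j] is the weight of edge i -> j, p0 is the initial pulse vector.
    Each step, current pulses are added into the vertex values, then
    propagate along outgoing edges (p <- p @ A). Vertex values start at 0.
    """
    p = np.asarray(p0, dtype=float)
    v = np.zeros_like(p)
    for _ in range(steps):
        v = v + p      # pulses accumulate into vertex values
        p = p @ A      # pulses propagate along weighted edges
    return v
```

Whether the pulses die out, settle, or grow without bound under repeated propagation is what gives the stability measure referred to in the abstract.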
Identifiability and transportability in dynamic causal networks
In this paper we propose a causal analog to the purely observational Dynamic Bayesian Networks, which we call Dynamic Causal Networks.
We provide a sound and complete algorithm for the identification of Dynamic Causal Networks, namely, for computing the effect of an intervention or experiment from passive observations alone, whenever this is possible. We note the existence of two types of confounder variables that affect the identification procedures in substantially different ways, a distinction with no analog in either Dynamic Bayesian Networks or standard causal graphs. We further propose a procedure for the transportability of causal effects in Dynamic Causal Network settings, where the results of causal experiments in a source domain may be used for the identification of causal effects in a target domain.
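The paper's identification algorithm is not reproduced here, but the flavor of computing an interventional effect from passive observations can be illustrated with the classic back-door adjustment on a toy three-variable model, where Z confounds the effect of X on Y (all probabilities below are made-up numbers, for illustration only):

```python
# Hypothetical conditional distributions for binary Z (confounder),
# X (treatment), Y (outcome), all numbers invented for this sketch.
Pz = {0: 0.6, 1: 0.4}                                   # P(Z = z)
Py1_xz = {(0, 0): 0.1, (1, 0): 0.5,                     # P(Y = 1 | X = x, Z = z)
          (0, 1): 0.4, (1, 1): 0.9}

def p_y_do_x(y, x):
    """Back-door adjustment: P(y | do(x)) = sum_z P(y | x, z) P(z).

    Z blocks the back-door path between X and Y, so adjusting for it
    identifies the causal effect from observational quantities alone.
    """
    total = 0.0
    for z in (0, 1):
        py1 = Py1_xz[(x, z)]
        total += (py1 if y == 1 else 1.0 - py1) * Pz[z]
    return total
```

The point of the identification problem is deciding, from the graph alone, whether such an adjustment (or a more elaborate do-calculus derivation) exists; the two confounder types noted in the abstract determine which derivations go through.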
Faster Worst Case Deterministic Dynamic Connectivity
We present a deterministic dynamic connectivity data structure for undirected graphs with worst case update time O(sqrt(n (log log n)^2 / log n)) and constant query time. This improves on the previous best deterministic worst case algorithm of Frederickson (STOC 1983) and Eppstein, Galil, Italiano, and Nissenzweig (J. ACM 1997), which had update time O(sqrt(n)). All other algorithms for dynamic connectivity are either randomized (Monte Carlo) or have only amortized performance guarantees.
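For contrast, the insertions-only version of the problem is easy: a union-find structure answers connectivity queries in near-constant amortized time, and it is precisely edge deletions that make worst case fully dynamic connectivity hard. A minimal sketch of the easy incremental case (not the paper's data structure):

```python
class IncrementalConnectivity:
    """Union-find with union by rank and path halving.

    Supports edge insertions and connectivity queries in near-constant
    amortized time. Edge deletions are NOT supported; handling them is
    what the fully dynamic structures in the abstract are for.
    """

    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, u):
        while self.parent[u] != u:
            self.parent[u] = self.parent[self.parent[u]]  # path halving
            u = self.parent[u]
        return u

    def insert_edge(self, u, v):
        ru, rv = self.find(u), self.find(v)
        if ru == rv:
            return
        if self.rank[ru] < self.rank[rv]:
            ru, rv = rv, ru
        self.parent[rv] = ru          # attach shorter tree under taller
        if self.rank[ru] == self.rank[rv]:
            self.rank[ru] += 1

    def connected(self, u, v):
        return self.find(u) == self.find(v)
```

Deleting an edge can split a component, and union-find has no cheap way to undo a merge; fully dynamic algorithms instead maintain spanning forests with machinery for finding replacement edges after a deletion.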
Computer-aided programming for multiprocessing systems
As both the number of processors and the complexity of problems to be solved increase, programming multiprocessing systems becomes more difficult and error-prone. This report discusses parallel models of computation and tools for computer-aided programming (CAP). Program development tools are necessary since programmers are not able to develop complex parallel programs efficiently. In particular, a CAP tool named Hypertool is described here. It performs scheduling and handles communication primitive insertion automatically, so that many errors are eliminated. It also generates performance estimates and other program quality measures to help programmers improve their algorithms and programs. Experiments have shown that up to a 300% performance improvement can be achieved by computer-aided programming.
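The automatic scheduling such a tool performs can be illustrated with a simple greedy list scheduler that maps a task DAG onto identical processors. This is a generic sketch, not Hypertool's actual algorithm; the task costs and dependencies in the test are invented, and communication delays are ignored:

```python
import heapq

def list_schedule(tasks, deps, p):
    """Greedy list scheduling of a task DAG onto p identical processors.

    tasks: {name: cost}; deps: list of (u, v) meaning u must finish before v.
    Repeatedly takes the earliest-ready task and places it on the
    earliest-free processor. Returns ({task: (proc, start, finish)}, makespan).
    """
    succ = {t: [] for t in tasks}
    indeg = {t: 0 for t in tasks}
    for u, v in deps:
        succ[u].append(v)
        indeg[v] += 1
    ready_at = {t: 0.0 for t in tasks}        # earliest start from dependencies
    free = [(0.0, i) for i in range(p)]       # (time processor is free, id)
    heapq.heapify(free)
    ready = [t for t in tasks if indeg[t] == 0]
    sched = {}
    while ready:
        t = min(ready, key=lambda u: ready_at[u])   # earliest-ready task first
        ready.remove(t)
        f, proc = heapq.heappop(free)
        start = max(f, ready_at[t])
        finish = start + tasks[t]
        sched[t] = (proc, start, finish)
        heapq.heappush(free, (finish, proc))
        for s in succ[t]:                     # successors may become ready
            ready_at[s] = max(ready_at[s], finish)
            indeg[s] -= 1
            if indeg[s] == 0:
                ready.append(s)
    makespan = max(fin for _, _, fin in sched.values())
    return sched, makespan
```

A real CAP tool would refine the task priority (e.g. by critical-path length) and account for inter-processor communication costs when placing tasks, which is where the reported error elimination and performance gains come from.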