Certainty Closure: Reliable Constraint Reasoning with Incomplete or Erroneous Data
Constraint Programming (CP) has proved an effective paradigm to model and
solve difficult combinatorial satisfaction and optimisation problems from
disparate domains. Many such problems arising from the commercial world are
permeated by data uncertainty. Existing CP approaches that accommodate
uncertainty are less suited to uncertainty arising from incomplete and
erroneous data, because they do not build reliable models and solutions
guaranteed to address the user's genuine problem as she perceives it. Other
fields such as reliable computation offer combinations of models and associated
methods to handle these types of uncertain data, but lack an expressive
framework characterising the resolution methodology independently of the model.
We present a unifying framework that extends the CP formalism in both model
and solutions, to tackle ill-defined combinatorial problems with incomplete or
erroneous data. The certainty closure framework brings together modelling and
solving methodologies from different fields into the CP paradigm to provide
reliable and efficient approaches for uncertain constraint problems. We
demonstrate the applicability of the framework on a case study in network
diagnosis. We define resolution forms that give generic templates, and their
associated operational semantics, to derive practical solution methods for
reliable solutions.

Comment: Revised version
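The notion of a reliable closure can be illustrated with a toy constraint problem (the constraint, domains, and uncertainty set below are invented for illustration, not taken from the paper): each realisation of the uncertain data induces its own solution set, and the union of these sets contains every assignment supported by at least one realisation, so the user's genuine solution is never discarded.

```python
from itertools import product

# Hypothetical toy problem: constraint x + y == d over x, y in {0..4},
# where the datum d is uncertain: only known to lie in {3, 4, 5}.
DOMAIN = range(5)
UNCERTAIN_D = {3, 4, 5}

def solutions(d):
    """Solution set of the (now certain) problem for one realisation d."""
    return {(x, y) for x, y in product(DOMAIN, repeat=2) if x + y == d}

# Closure: every assignment supported by at least one realisation of the
# data, so whichever value d truly has, its solutions are all retained.
closure = set().union(*(solutions(d) for d in UNCERTAIN_D))
print(sorted(closure))
```

Any single guessed value of d would silently drop solutions of the other realisations; the closure keeps them all, at the price of a larger solution set.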
Network coding meets TCP
We propose a mechanism that incorporates network coding into TCP with only
minor changes to the protocol stack, thereby allowing incremental deployment.
In our scheme, the source transmits random linear combinations of packets
currently in the congestion window. At the heart of our scheme is a new
interpretation of ACKs: the sink acknowledges every degree of freedom (i.e., a
linear combination that reveals one unit of new information) even if it does
not reveal an original packet immediately. Such ACKs enable a TCP-like
sliding-window approach to network coding. Our scheme has the nice property
that packet losses are essentially masked from the congestion control
algorithm. Our algorithm therefore reacts to packet drops in a smooth manner,
resulting in a novel and effective approach for congestion control over
networks involving lossy links such as wireless links. Our experiments show
that our algorithm achieves higher throughput compared to TCP in the presence
of lossy wireless links. We also establish the soundness and fairness
properties of our algorithm.

Comment: 9 pages, 9 figures, submitted to IEEE INFOCOM 200
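The ACK-per-degree-of-freedom idea can be sketched in a few lines. This is a deliberate simplification: practical network coding typically works over a larger field such as GF(256), while this toy uses GF(2), representing each coded packet only by its coefficient vector packed into an int; the window size and loss rate are invented.

```python
import random

random.seed(1)

# Each original packet in the congestion window is one unit vector over
# GF(2): packet i is the int 1 << i.  A coded packet is a random linear
# combination, here the XOR of a random non-empty subset of the window.
WINDOW = 8  # hypothetical window of 8 packets

def coded_packet():
    return random.getrandbits(WINDOW) or 1  # coefficient vector as an int

def receives_new_dof(basis, pkt):
    """Sink-side test: does this combination reveal a new degree of freedom?
    Standard GF(2) basis insertion: reduce pkt by every stored vector."""
    for b in basis:
        pkt = min(pkt, pkt ^ b)
    if pkt:
        basis.append(pkt)
        return True   # ACK one new degree of freedom
    return False      # linearly dependent: nothing new to ACK

basis, acks, sent = [], 0, 0
while acks < WINDOW:           # until the whole window is decodable
    sent += 1
    if random.random() < 0.2:  # 20% loss on the wireless link
        continue               # a loss only delays the next useful combination
    if receives_new_dof(basis, coded_packet()):
        acks += 1

print(f"{acks} degrees of freedom after {sent} transmissions")
```

Because any sufficiently many received combinations span the window with high probability, a loss does not single out one missing packet; the ACK stream keeps advancing, which is what masks losses from the congestion control algorithm.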
Random Linear Network Coding for 5G Mobile Video Delivery
An exponential increase in mobile video delivery will continue with the
demand for higher resolution, multi-view and large-scale multicast video
services. The novel fifth generation (5G) 3GPP New Radio (NR) standard will bring a
number of new opportunities for optimizing video delivery across both 5G core
and radio access networks. One of the promising approaches for video quality
adaptation, throughput enhancement and erasure protection is the use of
packet-level random linear network coding (RLNC). In this review paper, we
discuss the integration of RLNC into the 5G NR standard, building upon the
ideas and opportunities identified in 4G LTE. We explicitly identify and
discuss in detail novel 5G NR features that provide support for RLNC-based
video delivery in 5G, thus pointing to promising avenues for future research.

Comment: Invited paper for Special Issue "Network and Rateless Coding for Video Streaming" - MDPI Information
An efficient null space inexact Newton method for hydraulic simulation of water distribution networks
Null space Newton algorithms are efficient in solving the nonlinear equations
arising in hydraulic analysis of water distribution networks. In this article,
we propose and evaluate an inexact Newton method that relies on partial updates
of the network pipes' frictional headloss computations to solve the linear
systems more efficiently and with numerical reliability. The update set
parameters are studied to propose appropriate values. Different null space
basis generation schemes are analysed to choose methods for sparse and
well-conditioned null space bases resulting in a smaller update set. The Newton
steps are computed in the null space by solving sparse, symmetric positive
definite systems with sparse Cholesky factorizations. By using the constant
structure of the null space system matrices, a single symbolic factorization in
the Cholesky decomposition is used multiple times, reducing the computational
cost of linear solves. The algorithms and analyses are validated using medium
to large-scale water network models.

Comment: 15 pages, 9 figures. Preprint extension of Abraham and Stoianov, 2015 (https://dx.doi.org/10.1061/(ASCE)HY.1943-7900.0001089), September 2015. Includes extended exposition, additional case studies and new simulations and analysis.
A parallel interaction potential approach coupled with the immersed boundary method for fully resolved simulations of deformable interfaces and membranes
In this paper we show and discuss the use of a versatile interaction
potential approach coupled with an immersed boundary method to simulate a
variety of flows involving deformable bodies. In particular, we focus on two
kinds of problems, namely (i) deformation of liquid-liquid interfaces and (ii)
flow in the left ventricle of the heart with either a mechanical or a natural
valve. Both examples have in common the two-way interaction of the flow with a
deformable interface or a membrane. The interaction potential approach (de
Tullio & Pascazio, J. Comput. Phys., 2016; Tanaka, Wada and Nakamura,
Computational Biomechanics, 2016) with minor modifications can be used to
capture the deformation dynamics in both classes of problems. We show that the
approach can be used to replicate the deformation dynamics of liquid-liquid
interfaces through the use of ad-hoc elastic constants. The results from our
simulations agree very well with previous studies on the deformation of drops
in standard flow configurations such as deforming drop in a shear flow or a
cross flow. We show that the same potential approach can also be used to study
the flow in the left ventricle of the heart. The flow imposed into the
ventricle interacts dynamically with the mitral valve (mechanical or natural)
and the ventricle which are simulated using the same model. Results from these
simulations are compared with ad hoc in-house experimental measurements.
Finally, a parallelisation scheme is presented, as parallelisation is
unavoidable when studying large-scale problems involving several thousands of
simultaneously deforming bodies on hundreds of distributed-memory computing
processors.
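The core of an interaction potential approach is that the interface is discretised into a network of Lagrangian nodes whose edges behave as elastic springs; the resulting nodal forces are what an immersed boundary method would then spread onto the fluid grid. A minimal sketch, with invented geometry and an invented elastic constant k_el (the ad-hoc constants mentioned in the abstract), is:

```python
import numpy as np

# Small triangular membrane patch: three Lagrangian nodes, three edges.
nodes = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
edges = [(0, 1), (1, 2), (2, 0)]
rest = {e: np.linalg.norm(nodes[e[0]] - nodes[e[1]]) for e in edges}
k_el = 10.0  # illustrative elastic constant, not a value from the paper

def elastic_forces(x):
    """Nodal forces from the edge potential U = 0.5 * k_el * (|d| - L0)**2."""
    f = np.zeros_like(x)
    for (i, j) in edges:
        dvec = x[i] - x[j]
        dist = np.linalg.norm(dvec)
        fmag = -k_el * (dist - rest[(i, j)])  # restoring, toward rest length
        f[i] += fmag * dvec / dist
        f[j] -= fmag * dvec / dist            # equal and opposite reaction
    return f

print(elastic_forces(nodes))                  # undeformed patch: ~zero force
stretched = nodes.copy()
stretched[2, 1] = 1.5                         # pull one node outward
print(elastic_forces(stretched))              # restoring forces appear
```

Tuning k_el (and companion bending or area terms) against a target constitutive behaviour is how the same potential can mimic both a liquid-liquid interface and a valve membrane.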
ESTIMATION ISSUES IN SINGLE COMMODITY GRAVITY TRADE MODELS
Gravity trade models have recently been applied to disaggregated trade data, in which many zero flows are characteristic. In the presence of excess zeros, the usual Poisson Pseudo Maximum Likelihood (PPML) estimator is still consistent, but its variance-covariance matrix is invalid. Correct economic interpretation, however, requires the latter as well, so alternative estimators are sought. STAUB & WINKELMANN (2010) argue that zero-inflated count data models (i.e., zero-inflated Poisson / Negative Binomial Pseudo Maximum Likelihood, ZIPPML / ZINBPML) are no alternative, since these estimators are inconsistent under model misspecification. Yet zero-inflated Poisson Quasi-Likelihood (PQL) is a reliable alternative: it is consistent even under model misspecification and, beyond that, robust against unobserved heterogeneity. Another alternative is a log-skew-normal Two-Part Model (G2PM), which generalises the standard log-normal Two-Part Model (2PM). It is advantageous in that it adjusts for (negative) skewness while regression coefficients retain their usual interpretation as in log-normal models. PQL is useful for multiplicative gravity model estimation and G2PM for log-linear gravity model estimation. As an example, the estimators are applied to intra-European piglet trade to assess their empirical performance and applicability for single-commodity trade flow analysis. The empirical part favours PQL, but G2PM is a reliable alternative for other trade flow analyses. PQL and G2PM should become standard tools for single-commodity trade flow analysis.

Keywords: Gravity Model, Excess Zeros, Poisson Quasi Likelihood, Generalised Two Part Model, Agribusiness, Agricultural and Food Policy, Agricultural Finance, Demand and Price Analysis, Financial Economics
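The abstract's starting point, that PPML point estimates stay consistent under excess zeros while the default covariance is invalid, can be illustrated with a numpy-only sketch (the data-generating process, sample size, and coefficients below are invented): PPML is fitted by Fisher scoring and paired with a robust sandwich covariance.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Hypothetical single-commodity gravity data with excess zeros: trade
# depends log-linearly on distance (true elasticity -1), and independent
# zero inflation adds many more zeros than a Poisson model predicts.
log_dist = rng.uniform(0.0, 2.0, n)
y = rng.poisson(np.exp(2.0 - log_dist)) * rng.binomial(1, 0.6, n)
X = np.column_stack([np.ones(n), log_dist])

# PPML via iteratively reweighted least squares (Fisher scoring).
beta = np.zeros(2)
for _ in range(50):
    mu = np.exp(X @ beta)
    step = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (y - mu))
    beta += step
    if np.max(np.abs(step)) < 1e-10:
        break

# Sandwich (robust) covariance: the model-based Poisson covariance is
# invalid under excess zeros, but this estimator remains valid.
mu = np.exp(X @ beta)
bread = np.linalg.inv(X.T @ (mu[:, None] * X))
meat = X.T @ (((y - mu) ** 2)[:, None] * X)
robust_cov = bread @ meat @ bread

print("distance elasticity:", beta[1], "+/-", np.sqrt(robust_cov[1, 1]))
```

Because the zero inflation here is independent of distance, the PPML slope still recovers the true elasticity; only the intercept absorbs the inflation, which is exactly the consistency property the abstract appeals to.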