431 research outputs found
A critical discussion on the usefulness and reliability of mathematical modeling for service life design of infrastructure
In view of the increasing age of existing structures, asset managers are increasingly interested in obtaining a clearer picture of the actual condition of their complete stock of infrastructure, so that they can anticipate maintenance in the planning and allocation of financial resources. Consequently, a clear need is emerging for predicting the condition level over time using mathematical models. For the design of new structures, the current codes are based on traditional solutions and thus leave little room for alternative options; for instance, at present the significantly different performance of binders is not taken into account. It is therefore not surprising that in recent years a clear trend can be observed towards the application of mathematical modelling with a probabilistic approach to durability, e.g. the fib Model Code for Service Life Design. In order to predict the condition of a structural component over time, or to demonstrate equal performance of design solutions, widely accepted mathematical models that describe degradation processes are required. Ideally, such models should be mathematically and physically sound, provide logical and realistic results, be understandable and usable for practitioners, and thus be, to a considerable extent, foolproof. However, most models involve significant pitfalls and limitations that are either not mentioned or not known even to their developers. In addition, in most cases the quantification of the input parameters is not addressed, which will undoubtedly result in 'shopping' for favourable values; input values based purely on expert opinion should therefore be treated with serious caution. It should also be noted that most models have been calibrated on results from laboratory experiments performed under ideal conditions that do not reflect the situations encountered in practice.
Experience has also shown that probabilistic approaches are frequently misused to justify a wrong decision or an execution error (e.g. shallow cover depths).
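The probabilistic service-life approach the abstract refers to can be illustrated with a minimal Monte Carlo sketch of a chloride-ingress check in the spirit of the fib Model Code, using a simplified Fick's-second-law profile. All distributions and parameter values below are illustrative assumptions, not taken from the paper:

```python
import math
import random

random.seed(5)

def chloride(x_mm, t_yr, Cs, D):
    # Chloride content at depth x after time t (Fick's second law with a
    # constant surface content; D in mm^2/yr). Illustrative model only.
    return Cs * (1 - math.erf(x_mm / (2 * math.sqrt(D * t_yr))))

def prob_initiation(t_yr=50, n=20_000):
    # Monte Carlo estimate of the probability that the chloride content at
    # the rebar exceeds the critical value within the design life.
    failures = 0
    for _ in range(n):
        cover = random.gauss(50, 8)                    # cover depth, mm
        D = random.lognormvariate(math.log(20), 0.4)   # diffusion coeff.
        Cs = random.gauss(0.6, 0.1)                    # surface content
        Ccrit = random.gauss(0.4, 0.05)                # critical content
        if cover > 0 and chloride(cover, t_yr, Cs, D) >= Ccrit:
            failures += 1
    return failures / n

print(f"P(corrosion initiation within 50 yr) ~ {prob_initiation():.3f}")
```

Shifting the cover-depth mean downwards (shallow covers) drives the estimated failure probability sharply up, which is exactly the kind of input sensitivity the abstract warns can be exploited by 'shopping' for parameter values.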
Non-Backtracking Spectrum of Degree-Corrected Stochastic Block Models
Motivated by community detection, we characterise the spectrum of the
non-backtracking matrix in the Degree-Corrected Stochastic Block Model.
Specifically, we consider a random graph on vertices partitioned into two
equal-sized clusters. The vertices have i.i.d. weights
with second moment . The intra-cluster connection probability for
vertices and is and the inter-cluster
connection probability is .
We show that with high probability, the following holds: The leading
eigenvalue of the non-backtracking matrix is asymptotic to . The second eigenvalue is asymptotic to when , but asymptotically bounded by
when . All the remaining eigenvalues are
asymptotically bounded by . As a result, a clustering
positively correlated with the true communities can be obtained from the
second eigenvector of in the regime where .
In previous work we showed that detection is impossible when , meaning that
a phase transition occurs in the sparse regime of the
Degree-Corrected Stochastic Block Model.
As a corollary, we obtain that Degree-Corrected Erdős–Rényi graphs
asymptotically satisfy the graph Riemann hypothesis, a quasi-Ramanujan
property.
A by-product of our proof is a weak law of large numbers for
local functionals on Degree-Corrected Stochastic Block Models, which could be
of independent interest.
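A toy illustration of the clustering step described above — building the non-backtracking matrix of a small two-community graph and splitting the vertices by the sign of its second eigenvector — might look as follows. This is a sketch with unit vertex weights, so the DC-SBM degenerates to the classical SBM, and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-community graph (unit weights: the classical planted partition).
n = 100                      # vertices, two equal-sized clusters
p_in, p_out = 0.20, 0.02     # intra- / inter-cluster edge probabilities
labels = np.repeat([0, 1], n // 2)
P = np.where(labels[:, None] == labels[None, :], p_in, p_out)
A = rng.random((n, n)) < P
A = np.triu(A, 1)
A = (A | A.T).astype(int)

# Non-backtracking matrix B on directed edges:
# B[(u,v),(v,w)] = 1 whenever w != u.
edges = [(u, v) for u in range(n) for v in range(n) if A[u, v]]
index = {e: i for i, e in enumerate(edges)}
B = np.zeros((len(edges), len(edges)))
for (u, v), i in index.items():
    for w in np.flatnonzero(A[v]):
        if w != u:
            B[i, index[(v, w)]] = 1.0

# Split vertices by the sign of the second eigenvector of B,
# aggregated over incoming edges.
vals, vecs = np.linalg.eig(B)
order = np.argsort(-vals.real)
v2 = vecs[:, order[1]].real
score = np.zeros(n)
for (u, v), i in index.items():
    score[v] += v2[i]
pred = (score > 0).astype(int)

# Agreement with the planted partition, up to a global label swap.
acc = max(np.mean(pred == labels), np.mean(pred != labels))
print(f"clustering accuracy: {acc:.2f}")
```

The dense eigendecomposition is only viable for toy sizes; at scale one would use sparse eigensolvers or the related Bethe-Hessian matrix.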
A spectral method for community detection in moderately-sparse degree-corrected stochastic block models
We consider community detection in Degree-Corrected Stochastic Block Models
(DC-SBM). We propose a spectral clustering algorithm based on a suitably
normalized adjacency matrix. We show that this algorithm consistently recovers
the block membership of all but a vanishing fraction of nodes, in the regime
where the lowest degree is of order log(n) or higher. Recovery succeeds even
for very heterogeneous degree distributions. The algorithm does not rely on
parameters as input; in particular, it does not need to know the number of
communities.
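A minimal sketch of spectral clustering on a degree-normalized adjacency matrix for a heterogeneous two-community graph is shown below. This is a generic variant for illustration, not the paper's exact normalization or its parameter-free algorithm, and all parameter values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy DC-SBM: two planted communities with heterogeneous degree weights.
n = 400
labels = np.repeat([0, 1], n // 2)
theta = rng.uniform(0.5, 2.0, size=n)     # degree-correction weights
a, b = 0.15, 0.02                         # intra / inter base rates
P = np.outer(theta, theta) * np.where(
    labels[:, None] == labels[None, :], a, b)
A = rng.random((n, n)) < np.clip(P, 0, 1)
A = np.triu(A, 1)
A = (A | A.T).astype(float)

# Degree-normalized adjacency L = D^{-1/2} A D^{-1/2}.
deg = A.sum(axis=1)
d_isqrt = 1.0 / np.sqrt(np.maximum(deg, 1))
L = d_isqrt[:, None] * A * d_isqrt[None, :]

# The second eigenvector separates the communities even though raw
# degrees vary by a factor of ~4 across vertices.
vals, vecs = np.linalg.eigh(L)
pred = (vecs[:, -2] > 0).astype(int)
acc = max(np.mean(pred == labels), np.mean(pred != labels))
print(f"accuracy: {acc:.2f}")
```

The normalization is what makes the split robust to degree heterogeneity: on the raw adjacency matrix, the leading eigenvectors would mostly sort vertices by degree rather than by community.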
The effect of perception anisotropy on particle systems describing pedestrian flows in corridors
We consider a microscopic model (a system of self-propelled particles) to
study the behaviour of a large group of pedestrians walking in a corridor. Our
point of interest is the effect of anisotropic interactions on the global
behaviour of the crowd. The anisotropy we have in mind reflects the fact that
people do not perceive (i.e. see, hear, feel or smell) their environment
equally well in all directions. The dynamics of the individuals in our model
follow from a system of Newton-like equations in the overdamped limit. The
instantaneous velocity is modelled in such a way that it accounts for the angle
under which an individual perceives another individual. We investigate the
effects of this perception anisotropy by means of simulations, very much in the
spirit of molecular dynamics. We define a number of characteristic quantifiers
(including the polarization index and the Morisita index) that serve as
measures for, e.g., organization and clustering, and we use these indices to
investigate
the influence of anisotropy on the global behaviour of the crowd. The goal of
the paper is to investigate the potential of this model; extensive
statistical analysis of simulation data and the reproduction of any specific
real-life situation are beyond its scope.
Comment: 24 pages
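A stripped-down version of such a particle system — overdamped dynamics, an angular weighting of the pairwise repulsion, and the polarization index as an output — could be sketched as follows. The specific anisotropy factor and all parameter values are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Corridor and model parameters (all values illustrative).
L_x, L_y = 20.0, 5.0   # corridor length / width
N = 50                 # number of pedestrians
v0 = 1.0               # preferred walking speed, in the +x direction
eps = 0.3              # repulsion strength
r_cut = 1.5            # interaction range
lam = 0.4              # anisotropy: 1 = isotropic, <1 damps what is behind

pos = np.column_stack([rng.uniform(0, L_x, N),
                       rng.uniform(0.2, L_y - 0.2, N)])
dt, steps = 0.05, 200

def step(pos):
    # Overdamped update: velocity = desired velocity + weighted repulsions.
    vel = np.tile([v0, 0.0], (N, 1))
    for i in range(N):
        d = pos - pos[i]
        d[:, 0] = (d[:, 0] + L_x / 2) % L_x - L_x / 2   # periodic in x
        r = np.linalg.norm(d, axis=1)
        for j in np.flatnonzero((r > 1e-9) & (r < r_cut)):
            e = d[j] / r[j]
            # Perception weight: full ahead (+x), reduced behind (-x).
            w = lam + (1 - lam) * (1 + e[0]) / 2
            vel[i] -= eps * w * e / r[j]
    new = pos + dt * vel
    new[:, 0] %= L_x
    new[:, 1] = np.clip(new[:, 1], 0.1, L_y - 0.1)      # corridor walls
    return new, vel

for _ in range(steps):
    pos, vel = step(pos)

# Polarization index: 1 means all instantaneous velocities are aligned.
norms = np.linalg.norm(vel, axis=1, keepdims=True)
u = vel / np.maximum(norms, 1e-12)
polarization = float(np.linalg.norm(u.mean(axis=0)))
print(f"polarization index: {polarization:.2f}")
```

Sweeping `lam` between 0 and 1 and recording the polarization (and a clustering measure such as the Morisita index) is the kind of experiment the abstract describes.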
Adaptive Matching for Expert Systems with Uncertain Task Types
A matching in a two-sided market often incurs an externality: a matched
resource may become unavailable to the other side of the market, at least for a
while. This is especially an issue in online platforms involving human
experts, as expert resources are often scarce. The efficient utilization of
experts
in these platforms is made challenging by the fact that the information
available about the parties involved is usually limited.
To address this challenge, we develop a model of a task-expert matching
system where a task is matched to an expert using not only the prior
information about the task but also the feedback obtained from the past
matches. In our model the tasks arrive online while the experts are fixed and
constrained by a finite service capacity. For this model, we characterize the
maximum task-resolution throughput a platform can achieve. We show that the
natural greedy approach, in which each expert is assigned the task most
suitable to her skill, is suboptimal, as it does not internalize the above
externality. We develop a throughput-optimal backpressure algorithm which does
so by accounting for the 'congestion' among different task types. Finally, we
validate our model and confirm our theoretical findings with data-driven
simulations using logs of Math.StackExchange, a Stack Exchange forum dedicated
to mathematics.
Comment: a part of this work was presented at the Allerton Conference 2017, 18 pages
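The contrast between greedy matching and a queue-aware backpressure rule can be shown in a toy simulation. The task types, success probabilities, and arrival rates below are invented for illustration and are not the paper's model:

```python
import random

random.seed(3)

# Toy setup: p[e][t] = probability that expert e resolves a task of type t.
p = [[0.9, 0.2],          # expert 0: specialised in type 0
     [0.8, 0.8]]          # expert 1: versatile
arrival = [0.45, 0.45]    # per-slot arrival probability of each task type
queues = [0, 0]           # unresolved tasks waiting, per type

def backpressure_choice(e):
    # Serve the type maximising queue length x success rate. A purely
    # greedy rule would always pick argmax_t p[e][t], ignoring congestion
    # and starving the type only the versatile expert can handle well.
    return max(range(2), key=lambda t: queues[t] * p[e][t])

resolved = 0
for _ in range(10_000):
    for t in range(2):
        if random.random() < arrival[t]:
            queues[t] += 1
    for e in range(2):
        t = backpressure_choice(e)
        if queues[t] > 0:
            queues[t] -= 1                 # expert attempts the task
            if random.random() < p[e][t]:
                resolved += 1
            else:
                queues[t] += 1             # failed attempt: task re-queued

print(f"resolved: {resolved}, backlog: {sum(queues)}")
```

Because the choice weighs queue lengths, the versatile expert is pulled towards whichever task type is congested, which is the externality-internalizing behaviour the abstract describes.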
Explosiveness of Age-Dependent Branching Processes with Contagious and Incubation Periods
We study explosiveness of age-dependent branching processes describing the
early stages of an epidemic spread: both the forward and the backward process
are analysed. For the classical age-dependent branching process , where the
offspring has probability generating function and all individuals have
life-lengths independently picked from a distribution , we focus on the
setting , with a function varying slowly at infinity and
. Here, as . For a fixed , the process
explodes either for all or for no , regardless of . Next, we add contagious periods to all
individuals and let their offspring survive only if their life-length is
smaller than the contagious period of their mother: a forward process. An
explosive process , as above, stays explosive when adding a
non-zero contagious period. We extend this setting to backward processes with
contagious periods. Further, we consider processes with incubation periods
during which an individual has already contracted the disease but is not able
yet to infect her acquaintances. We let these incubation periods follow a
distribution . In the forward process , every
individual possesses an incubation period and only her offspring with life-time
larger than this period survives. In the backward process
, individuals survive only if their life-time exceeds
their own incubation period. These two processes are the content of the third
main result that we establish: under a mild condition on and ,
explosiveness of both and is necessary and sufficient for
processes and to explode.
Comment: References added
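A forward process with contagious periods, as described above, can be simulated directly. The sketch below uses illustrative distributions (a heavy-tailed offspring law, exponential life-lengths and contagious periods), not the ones analysed in the paper:

```python
import heapq
import random

random.seed(4)

# Illustrative heavy-tailed offspring law (Pareto tail, mean ~ 10).
def offspring():
    return int(random.paretovariate(1.1))

def simulate(horizon=3.0, cap=50_000):
    # Min-heap of birth times; the root individual is born at time 0.
    births = [0.0]
    born = 0
    while births and born < cap:
        t = heapq.heappop(births)
        if t > horizon:
            break
        born += 1
        contagious = random.expovariate(1.0)   # mother's contagious period
        for _ in range(offspring()):
            life = random.expovariate(1.0)     # child's life-length
            if life < contagious:              # forward rule: child survives
                heapq.heappush(births, t + life)
    return born

print("individuals born by the horizon:", simulate())
```

Runs that hit the `cap` of individuals within a finite horizon are the simulation-level signature of explosiveness; with light-tailed offspring or a very short contagious period, the count stays small instead.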