Image processing for plastic surgery planning
This thesis presents image processing tools for plastic surgery planning. In particular,
it presents a novel method that combines local and global context in a probabilistic
relaxation framework to identify cephalometric landmarks used in maxillofacial plastic
surgery. It also presents a method that utilises global and local symmetry to identify
abnormalities in frontal CT images of the human body. The proposed methodologies are
evaluated on several sets of clinical data supplied by collaborating plastic surgeons.
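The symmetry idea behind the abnormality-detection method can be illustrated with a toy sketch: reflect the image about its vertical midline and flag pixels whose intensity differs markedly from their mirror counterpart. This is only an illustration of the general principle, not the thesis's actual algorithm; the function name and the fixed threshold are assumptions.

```python
import numpy as np

def asymmetry_map(image, threshold=0.2):
    """Flag pixels whose intensity differs markedly from their mirror
    counterpart about the vertical midline (illustrative sketch only).

    Returns a boolean map; True marks a candidate abnormality.
    """
    mirrored = image[:, ::-1]                 # reflect about the vertical axis
    diff = np.abs(image.astype(float) - mirrored.astype(float))
    scale = np.ptp(image) or 1.0              # normalise by the dynamic range
    return diff / scale > threshold

# Toy example: a symmetric field with one asymmetric "lesion".
img = np.zeros((8, 8))
img[3, 1] = 1.0                               # present on the left side only
flags = asymmetry_map(img)
print(flags[3, 1], flags[3, 6])               # True True: both mirror positions flagged
```

Any real pipeline would first have to align the image so the anatomical symmetry axis actually coincides with the image midline; the sketch assumes that registration has already been done.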
Significant edges in the case of a non-stationary Gaussian noise
In this paper, we propose an edge detection technique based on local
smoothing of the image followed by a statistical hypothesis test on the
gradient. An edge point is defined as a zero-crossing of the Laplacian; it
is said to be a significant edge point if the gradient at this point is larger
than a threshold $s(\varepsilon)$ defined by: if the image is pure noise, then
$\mathbb{P}(\|\nabla I\| \geq s(\varepsilon) \mid \Delta I = 0) \leq \varepsilon$. In other words,
a significant edge is an edge that has a very low probability of being caused
by noise. We show that the threshold $s(\varepsilon)$ can be computed explicitly
in the case of stationary Gaussian noise. In the images we are
interested in, which are obtained by tomographic reconstruction from a
radiograph, this approach fails because the Gaussian noise is no longer
stationary. In this case, however, we are still able to give the law of the
gradient conditionally on the zero-crossing of the Laplacian, and thus to compute
the threshold $s(\varepsilon)$. We end the paper with some experiments and
compare the results with those obtained by other edge detection methods.
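The significance test can be sketched as follows. Here $s(\varepsilon)$ is estimated by Monte Carlo on a pure-noise image rather than by the paper's closed-form computation, and the function name and all parameters are illustrative assumptions, not the authors' code.

```python
import numpy as np
from scipy import ndimage

def significant_edges(image, sigma=2.0, eps=0.01, noise_std=1.0, seed=0):
    """Illustrative significance test for edges: an edge point is a
    zero-crossing of the Laplacian of the smoothed image; it is kept only
    if the gradient magnitude there exceeds s(eps), calibrated so that on
    pure noise P(|grad I| >= s(eps) | Laplacian I = 0) <= eps."""
    rng = np.random.default_rng(seed)

    def grad_and_laplacian(img):
        smooth = ndimage.gaussian_filter(img, sigma)
        gy, gx = np.gradient(smooth)
        return np.hypot(gx, gy), ndimage.laplace(smooth)

    def zero_crossings(lap):
        s = np.sign(lap)
        zc = np.zeros(lap.shape, dtype=bool)
        zc[:-1, :] |= s[:-1, :] * s[1:, :] < 0   # vertical sign changes
        zc[:, :-1] |= s[:, :-1] * s[:, 1:] < 0   # horizontal sign changes
        return zc

    # Calibrate s(eps) empirically on stationary Gaussian noise
    # (the paper derives this threshold in closed form instead).
    noise = rng.normal(0.0, noise_std, image.shape)
    g_noise, lap_noise = grad_and_laplacian(noise)
    s_eps = np.quantile(g_noise[zero_crossings(lap_noise)], 1.0 - eps)

    grad, lap = grad_and_laplacian(image)
    return zero_crossings(lap) & (grad >= s_eps)

# A vertical step edge of amplitude 10 buried in unit-variance noise.
img = np.zeros((64, 64))
img[:, 32:] = 10.0
img += np.random.default_rng(1).normal(0.0, 1.0, img.shape)
edges = significant_edges(img)
print(edges[:, 28:36].any())  # the step around column 32 is detected
```

The non-stationary case treated in the paper would require calibrating the threshold locally instead of with a single global quantile.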
Optimal Probabilistic Ring Exploration by Asynchronous Oblivious Robots
We consider a team of identical, oblivious, asynchronous mobile robots
that are able to sense (\emph{i.e.}, view) their environment, yet are unable to
communicate, and evolve on a constrained path. Previous results in this weak
scenario show that initial symmetry yields high lower bounds when problems are
to be solved by \emph{deterministic} robots. In this paper, we initiate
research on probabilistic bounds and solutions in this context, and focus on
the \emph{exploration} problem of anonymous unoriented rings of any size. It is
known that deterministic robots can solve the problem only when the number of
robots and the size of the ring are coprime.
By contrast, we show that \emph{four} identical probabilistic robots are
necessary and sufficient to solve the same problem, also removing the coprime
constraint. Our positive results are constructive.
Complex Networks and Symmetry I: A Review
In this review we establish various connections between complex networks and
symmetry. While special types of symmetries (e.g., automorphisms) are studied
in detail within discrete mathematics for particular classes of deterministic
graphs, the analysis of more general symmetries in real complex networks is far
less developed. We argue that real networks, as any entity characterized by
imperfections or errors, necessarily require a stochastic notion of invariance.
We therefore propose a definition of stochastic symmetry based on graph
ensembles and use it to review the main results of network theory from an
unusual perspective. The results discussed here and in a companion paper show
that stochastic symmetry highlights the most informative topological properties
of real networks, even in noisy situations inaccessible to exact techniques.
A Combinatorial Approach to Nonlocality and Contextuality
So far, most of the literature on (quantum) contextuality and the
Kochen-Specker theorem seems either to concern particular examples of
contextuality or to be framed as quantum logic. Here, we develop a general
formalism for contextuality scenarios based on the combinatorics of hypergraphs
which significantly refines a similar recent approach by Cabello, Severini and
Winter (CSW). In contrast to CSW, we explicitly include the normalization of
probabilities, which gives us a much finer control over the various sets of
probabilistic models like classical, quantum and generalized probabilistic. In
particular, our framework specializes to (quantum) nonlocality in the case of
Bell scenarios, which arise very naturally from a certain product of
contextuality scenarios due to Foulis and Randall. In the spirit of CSW, we
find close relationships to several graph invariants. The recently proposed
Local Orthogonality principle turns out to be a special case of a general
principle for contextuality scenarios related to the Shannon capacity of
graphs. Our results imply that it is strictly dominated by a low level of the
Navascu\'es-Pironio-Ac\'in hierarchy of semidefinite programs, which we also
apply to contextuality scenarios.
We derive a wealth of results in our framework, many of these relating to
quantum and supraquantum contextuality and nonlocality, and state numerous open
problems. For example, we show that the set of quantum models on a
contextuality scenario can in general not be characterized in terms of a graph
invariant.
In terms of graph theory, our main result is this: there exist two graphs
$G_1$ and $G_2$ with the properties
\begin{align*}
\alpha(G_1) &= \Theta(G_1), & \alpha(G_2) &= \vartheta(G_2), \\[6pt]
\Theta(G_1\boxtimes G_2) &> \Theta(G_1)\cdot \Theta(G_2), &
\Theta(G_1 + G_2) &> \Theta(G_1) + \Theta(G_2).
\end{align*}
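The result above involves three standard graph invariants: the independence number $\alpha$, the Shannon capacity $\Theta$, and the Lovász number $\vartheta$. A minimal brute-force sketch for the pentagon $C_5$, the classic example with $\alpha = 2 < \Theta = \vartheta = \sqrt{5}$, illustrates how they relate (these are textbook facts, not results of the paper, and the code is an illustrative assumption):

```python
from itertools import combinations

def independence_number(n, edges):
    """Brute-force alpha(G): size of the largest set of pairwise
    non-adjacent vertices (feasible for tiny graphs only)."""
    adj = {frozenset(e) for e in edges}
    for size in range(n, 0, -1):
        for cand in combinations(range(n), size):
            if all(frozenset(p) not in adj for p in combinations(cand, 2)):
                return size
    return 0

# The 5-cycle C5: alpha(C5) = 2, while Theta(C5) = theta(C5) = sqrt(5).
c5 = [(i, (i + 1) % 5) for i in range(5)]
print(independence_number(5, c5))  # 2

# Shannon's independent set in the strong product C5 (x) C5 shows
# alpha(C5 (x) C5) >= 5, hence Theta(C5) >= sqrt(5) via one product step.
adj = {frozenset(e) for e in c5}
near = lambda a, b: a == b or frozenset((a, b)) in adj
shannon_set = [(i, (2 * i) % 5) for i in range(5)]
independent = all(
    not (near(p[0], q[0]) and near(p[1], q[1]))  # strong-product adjacency
    for p, q in combinations(shannon_set, 2)
)
print(independent)  # True
```

Computing $\vartheta$ itself requires a semidefinite program, which is exactly why it pairs naturally with the Navascués-Pironio-Acín hierarchy mentioned above.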
Practical applications of probabilistic model checking to communication protocols
Probabilistic model checking is a formal verification technique for the analysis of systems that exhibit stochastic behaviour. It has been successfully employed in an extremely wide array of application domains including, for example, communication and multimedia protocols, security and power management. In this chapter we focus on the applicability of these techniques to the analysis of communication protocols. An analysis of the performance of such systems must successfully incorporate several crucial aspects, including concurrency between multiple components, real-time constraints and randomisation. Probabilistic model checking, in particular using probabilistic timed automata, is well suited to such an analysis. We provide an overview of this area, with emphasis on an industrially relevant case study: the IEEE 802.3 (CSMA/CD) protocol. We also discuss two contrasting approaches to the implementation of probabilistic model checking, namely those based on numerical computation and those based on discrete-event simulation. Using results from the two tools PRISM and APMC, we summarise the advantages, disadvantages and trade-offs associated with these techniques.
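The contrast between the two implementation approaches can be shown on a toy chance model: the probability that a station transmits successfully within k attempts when each attempt collides independently with probability p. This is not the IEEE 802.3 model and not PRISM or APMC code; it is a hypothetical sketch of numerical computation versus discrete-event simulation answering the same query.

```python
import random

def exact_success_within(k, p_collision):
    """Numerical route: for i.i.d. attempts that each collide with
    probability p, P(success within k attempts) = 1 - p**k."""
    return 1.0 - p_collision ** k

def simulated_success_within(k, p_collision, runs=100_000, seed=0):
    """Discrete-event route: Monte Carlo estimate over the same model."""
    rng = random.Random(seed)
    hits = sum(
        any(rng.random() >= p_collision for _ in range(k))  # one run
        for _ in range(runs)
    )
    return hits / runs

exact = exact_success_within(3, 0.5)        # 1 - 0.5**3 = 0.875
estimate = simulated_success_within(3, 0.5)
print(abs(exact - estimate) < 0.01)         # the two routes agree to ~1%
```

The trade-off mirrors the one the chapter discusses: the numerical route gives exact probabilities but only scales to models whose state space fits in memory, while simulation scales further at the cost of statistical error bounds.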