Developing a Mathematical Model for Bobbin Lace
Bobbin lace is a fibre art form in which intricate and delicate patterns are
created by braiding together many threads. An overview of how bobbin lace is
made is presented and illustrated with a simple, traditional bookmark design.
Research on the topology of textiles and braid theory forms a basis for the
current work and is briefly summarized. We define a new mathematical model that
supports the enumeration and generation of bobbin lace patterns using an
intelligent combinatorial search. Results of this new approach are presented
and, by comparison to existing bobbin lace patterns, it is demonstrated that
this model reveals patterns that have never been seen before. Finally, we
apply our new patterns to an original bookmark design and propose future areas
for exploration.
Comment: 20 pages, 18 figures; intended audience includes Artists as well as
Computer Scientists and Mathematicians.
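The enumeration idea can be illustrated with a generic backtracking sketch. Everything below is an assumption for illustration: the action alphabet, the pruning rule, and the pattern length are invented, not the paper's actual model of bobbin lace grounds. The point is only that pruning invalid partial patterns early keeps a combinatorial search tractable.

```python
# Hypothetical sketch of an "intelligent" combinatorial search: patterns
# are modelled as fixed-length strings over a small action alphabet, and
# partial patterns violating a validity rule are pruned immediately.
# Alphabet, rule, and length are all invented for illustration.

ACTIONS = "CTLR"  # illustrative action names (e.g. cross, twist, ...)

def valid_prefix(prefix):
    # Toy pruning rule: never perform the same action three times in a row.
    return not any(prefix[i] == prefix[i + 1] == prefix[i + 2]
                   for i in range(len(prefix) - 2))

def search(length, prefix=""):
    """Backtracking enumeration that abandons invalid prefixes early."""
    if not valid_prefix(prefix):
        return []
    if len(prefix) == length:
        return [prefix]
    found = []
    for action in ACTIONS:
        found.extend(search(length, prefix + action))
    return found

print(len(search(4)))  # -> 228, versus 4**4 = 256 without the rule
```

Because the validity check runs on every prefix, invalid branches are cut off before they are fully expanded, which is what makes longer pattern lengths feasible.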
A version of the random directed forest and its convergence to the Brownian web
Several authors have studied convergence in distribution to the Brownian web
under diffusive scaling of Markovian random walks. In a paper by R. Roy, K.
Saha and A. Sarkar, convergence to the Brownian web is proved for a system of
coalescing random paths - the Random Directed Forest - which are not Markovian.
Paths in the Random Directed Forest do not cross each other before coalescence.
Here we study a generalization of the non-Markovian Random Directed Forest
where paths can cross each other and prove convergence to the Brownian web.
This provides an example of how the techniques to prove convergence to the
Brownian web for systems allowing crossings can be applied to non-Markovian
systems.
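As a toy illustration of the objects involved (not the Random Directed Forest itself), one can simulate coalescing simple random walks on the integers: walkers started on a common parity can meet but not cross, and walkers that meet move together forever. Systems of coalescing paths of this kind are the ones known to converge to the Brownian web under diffusive scaling.

```python
import random

# Toy simulation of coalescing simple random walks on Z (an assumption
# for illustration).  Walkers started at even sites keep a common
# parity, so they meet rather than cross, and walkers occupying the
# same site coalesce permanently.

def coalescing_walks(starts, steps, rng):
    """Evolve walkers from `starts`; walkers on the same site merge."""
    positions = sorted(set(starts))
    history = [positions[:]]
    for _ in range(steps):
        # Each occupied site takes an independent +/-1 step; merged
        # walkers share a site and therefore share all future steps.
        positions = sorted({x + rng.choice((-1, 1)) for x in positions})
        history.append(positions[:])
    return history

hist = coalescing_walks(range(0, 20, 2), steps=200, rng=random.Random(0))
print(len(hist[0]), "->", len(hist[-1]))  # the number of distinct paths never grows
```

Representing the state as a set makes coalescence automatic: once two walkers land on the same site they are a single element and receive a single increment thereafter.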
Path-tracing Monte Carlo Library for 3D Radiative Transfer in Highly Resolved Cloudy Atmospheres
Interactions between clouds and radiation are at the root of many
difficulties in numerically predicting future weather and climate and in
retrieving the state of the atmosphere from remote sensing observations. The
large range of issues related to these interactions, and in particular to
three-dimensional interactions, motivated the development of accurate radiative
tools able to compute all types of radiative metrics, from monochromatic, local
and directional observables, to integrated energetic quantities. In the
continuity of this community effort, we propose here an open-source library for
general use in Monte Carlo algorithms. This library is devoted to the
acceleration of path-tracing in complex data, typically high-resolution
large-domain grounds and clouds. The main algorithmic advances embedded in the
library are those related to the construction and traversal of hierarchical
grids accelerating the tracing of paths through heterogeneous fields in
null-collision (maximum cross-section) algorithms. We show that with these
hierarchical grids, the computing time is only weakly sensitive to the
refinement of the volumetric data. The library is tested with a rendering
algorithm that produces synthetic images of cloud radiances. Two other examples
are given as illustrations, which are respectively used to analyse the
transmission of solar radiation under a cloud together with its sensitivity to
an optical parameter, and to assess a parametrization of 3D radiative effects
of clouds.
Comment: Submitted to JAMES, revised and submitted again (this is v2).
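The null-collision idea can be sketched in one dimension. Assume a toy extinction field k(x) bounded by a majorant K_MAX (both invented here): tentative collisions are sampled at the constant majorant rate, and each is accepted as a real collision with probability k(x)/K_MAX, otherwise it is a null collision and tracking continues. The library's contribution is accelerating this in 3D with hierarchical grids that supply tight local majorants; this sketch uses a single global one.

```python
import math
import random

# Minimal 1D sketch of null-collision (maximum cross-section, a.k.a.
# Woodcock/delta) tracking.  The extinction field k and the majorant
# K_MAX are toy assumptions; the library itself handles 3D cloud
# fields with hierarchical grids providing local majorants.

def k(x):
    """Heterogeneous extinction coefficient along the ray (toy field)."""
    return 0.5 + 0.4 * math.sin(x)

K_MAX = 0.9  # global majorant: k(x) <= K_MAX for all x

def sample_free_path(rng, x=0.0):
    """Sample the distance to the next real collision."""
    while True:
        # Tentative collision at the constant majorant rate.
        x += -math.log(1.0 - rng.random()) / K_MAX
        # Accept as a real collision with probability k(x)/K_MAX;
        # otherwise it is a null collision and we keep tracing.
        if rng.random() < k(x) / K_MAX:
            return x

rng = random.Random(42)
paths = [sample_free_path(rng) for _ in range(10_000)]
print(round(sum(paths) / len(paths), 2))  # mean free path of the toy medium
```

The appeal of the scheme is that the heterogeneous field is only ever queried pointwise, never integrated along the ray, which is what makes it insensitive to how finely the volumetric data is refined.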
Using tracked mobile sensors to make maps of environmental effects
We present the results of a study of environmental carbon monoxide pollution that uses a set of
tracked, mobile pollution sensors. The motivating concept is that we will be able to map pollution and other
properties of the real world at a fine scale if we can deploy a large set of sensors with members of the general
public who would carry them as they go about their normal everyday activities. To prove the viability of this
concept we have to demonstrate that data gathered in an ad hoc manner is reliable enough to allow us to
build interesting geo-temporal maps.
We present a trial using a small number of global positioning system-tracked CO sensors. From analysis of raw
GPS logs we find some well-known spatial and temporal properties of CO. Further, by processing the GPS logs
we can find fine-grained variations in pollution readings, such as when crossing roads. We then discuss the space
of possibilities that may be enabled by tracking sensors around the urban environment – both in getting at personal
experience of properties of the environment and in making summative maps to predict future conditions.
Although we present a study of CO, the techniques will be applicable to other environmental properties such as
radio signal strength, noise, weather, and so on.
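The map-building step can be sketched as a simple spatial aggregation: each GPS-tagged reading is assigned to a grid cell and the readings in a cell are averaged. The cell size, record layout, and coordinates below are assumptions for illustration, not the study's actual processing.

```python
from collections import defaultdict

# Hypothetical sketch of turning tracked sensor logs into a map: bin
# GPS-tagged CO readings into 0.001-degree grid cells (roughly 100 m,
# an assumed resolution) and average the readings per cell.

def grid_map(readings):
    """Map (lat, lon, co) readings to per-cell mean CO levels."""
    sums = defaultdict(lambda: [0.0, 0])
    for lat, lon, co in readings:
        cell = (int(lat * 1000), int(lon * 1000))  # 0.001-degree bins
        acc = sums[cell]
        acc[0] += co
        acc[1] += 1
    return {cell: total / n for cell, (total, n) in sums.items()}

readings = [
    (51.4545, -2.5879, 1.2),  # two readings falling in the same cell
    (51.4545, -2.5878, 1.8),
    (51.4600, -2.5900, 0.4),  # a reading in a different cell
]
co_map = grid_map(readings)
print(len(co_map))  # -> 2 cells, the first averaging to 1.5
```

Adding a time dimension to the cell key would give the geo-temporal maps the abstract describes; averaging per cell is the simplest way to smooth over the noise of ad hoc collection.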
Spectral networks
We introduce new geometric objects called spectral networks. Spectral
networks are networks of trajectories on Riemann surfaces obeying certain local
rules. Spectral networks arise naturally in four-dimensional N=2 theories
coupled to surface defects, particularly the theories of class S. In these
theories spectral networks provide a useful tool for the computation of BPS
degeneracies: the network directly determines the degeneracies of solitons
living on the surface defect, which in turn determine the degeneracies for
particles living in the 4d bulk. Spectral networks also lead to a new map
between flat GL(K,C) connections on a two-dimensional surface C and flat
abelian connections on an appropriate branched cover Sigma of C. This
construction produces natural coordinate systems on moduli spaces of flat
GL(K,C) connections on C, which we conjecture are cluster coordinate systems.
Comment: 87 pages, 48 figures; v2: typos, correction to general rule for signs
of BPS counts.
Invisible Pixels Are Dead, Long Live Invisible Pixels!
Privacy has deteriorated in the world wide web ever since the 1990s. The
tracking of browsing habits by different third-parties has been at the center
of this deterioration. Web cookies and so-called web beacons have been the
classical ways to implement third-party tracking. Due to the introduction of
more sophisticated technical tracking solutions and other fundamental
transformations, the use of classical image-based web beacons might be expected
to have lost its appeal. According to a sample of over thirty thousand images
collected from popular websites, this paper shows that such an assumption is a
fallacy: classical 1 x 1 images are still commonly used for third-party
tracking in the contemporary world wide web. While it seems that ad-blockers
are unable to fully block these classical image-based tracking beacons, the
paper further demonstrates that even limited information can be used to
accurately classify the third-party 1 x 1 images from other images. An average
classification accuracy of 0.956 is reached in the empirical experiment. With
these results the paper contributes to the ongoing attempts to better
understand the lack of privacy in the world wide web, and the means by which
the situation might eventually be improved.
Comment: Forthcoming in the 17th Workshop on Privacy in the Electronic Society
(WPES 2018), Toronto, CA.
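The classification result can be illustrated with a deliberately simple rule-based scorer over the kind of limited information the abstract refers to: image dimensions, whether the image is served from a third party, and hints in its URL. The features, weights, and threshold below are assumptions for illustration; they are not the paper's actual classifier or its 0.956-accuracy model.

```python
# Hypothetical rule-based scorer: even coarse metadata can separate
# beacon-like images from ordinary content images.  All features,
# weights, and the threshold are invented for illustration.

TRACKING_HINTS = ("pixel", "beacon", "track", "impression")

def beacon_score(width, height, third_party, url):
    """Higher scores mean more beacon-like."""
    score = 0
    if width <= 1 and height <= 1:
        score += 2  # the classic 1 x 1 (or smaller) tell
    if third_party:
        score += 1  # served from a domain other than the page's
    if any(hint in url.lower() for hint in TRACKING_HINTS):
        score += 1
    return score

def is_beacon(width, height, third_party, url, threshold=3):
    return beacon_score(width, height, third_party, url) >= threshold

print(is_beacon(1, 1, True, "https://ads.example/pixel.gif"))   # -> True
print(is_beacon(640, 480, False, "https://example/photo.jpg"))  # -> False
```

A learned model over the same features would replace the hand-set weights, but the sketch shows why such limited inputs already carry most of the signal.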