Set-Codes with Small Intersections and Small Discrepancies
We are concerned with the problem of designing large families of subsets over
a common labeled ground set that have small pairwise intersections and the
property that the maximum discrepancy of the label values within each of the
sets is less than or equal to one. Our results, based on transversal designs,
factorizations of packings and Latin rectangles, show that by jointly
constructing the sets and labeling scheme, one can achieve optimal family sizes
for many parameter choices. Probabilistic arguments akin to those used for
pseudorandom generators lead to significantly suboptimal results when compared
to the proposed combinatorial methods. The design problem considered is
motivated by applications in molecular data storage and theoretical computer
science.
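A minimal sketch of the two properties such a family must satisfy, with sets, labels, and the intersection bound invented purely for illustration (these are not the paper's constructions):

```python
# Hypothetical checker: verify that a family of labeled sets has pairwise
# intersections of bounded size and within-set label discrepancy at most one.
from itertools import combinations

def is_valid_family(sets, labels, max_intersection):
    """True iff all pairwise intersections have size <= max_intersection
    and each set's label values differ by at most one."""
    for a, b in combinations(sets, 2):
        if len(a & b) > max_intersection:
            return False
    for s in sets:
        vals = [labels[x] for x in s]
        if max(vals) - min(vals) > 1:   # discrepancy at most one
            return False
    return True

labels = {0: 1, 1: 1, 2: 2, 3: 2, 4: 1, 5: 2}   # toy labeling of {0,...,5}
family = [{0, 1, 2}, {2, 3, 4}, {4, 5, 0}]      # pairwise intersections of size 1
print(is_valid_family(family, labels, max_intersection=1))
```

The point of the joint construction described above is that the labeling and the sets are chosen together; a checker like this only verifies a candidate family after the fact.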
Series of experiments for empirical validation of solar gain modelling in building energy simulation codes - experimental setup, test cell characterization, specifications and uncertainty analysis
Empirical validation of building energy simulation codes is an important component in understanding the capacity and limitations of the software. Within the framework of Task 34/Annex 43 of the International Energy Agency (IEA), a series of experiments was performed in an outdoor test cell. The objective of these experiments was to provide a high-quality data set for code developers and modelers to validate their solar gain models for windows with and without shading devices. A description of the necessary specifications for modeling these experiments is provided in this paper, which includes information about the test site location, the experimental setup, and the geometrical and thermophysical cell properties, including estimated uncertainties. Computed overall thermal cell properties were confirmed by conducting a steady-state experiment without solar gains. A transient experiment, also without solar gains, and corresponding simulations from four different building energy simulation codes showed that the provided specifications result in accurate thermal cell modeling. A good foundation for the following experiments with solar gains was therefore accomplished.
Security of GPS/INS based On-road Location Tracking Systems
Location information is critical to a wide variety of navigation and tracking
applications. Today, GPS is the de-facto outdoor localization system but has
been shown to be vulnerable to signal spoofing attacks. Inertial Navigation
Systems (INS) are emerging as a popular complementary system, especially in
road transportation systems as they enable improved navigation and tracking as
well as offering resilience to wireless signal spoofing and jamming attacks. In
this paper, we evaluate the security guarantees of INS-aided GPS tracking and
navigation for road transportation systems. We consider an adversary who is
required to travel from a source location to a destination and is monitored by an INS-aided
GPS system. The goal of the adversary is to travel to alternate locations
without being detected. We developed and evaluated algorithms that achieve this
goal, giving the adversary significant latitude. Our algorithms build a
graph model for a given road network and enable us to derive potential
destinations an attacker can reach without raising alarms even with the
INS-aided GPS tracking and navigation system. The algorithms render the
gyroscope and accelerometer sensors useless as they generate road trajectories
indistinguishable from plausible paths (in terms of both turn angles and road
curvature). We also designed and built a combination of carefully controlled
coils and demonstrated that it can actively spoof the magnetometer. We
implemented and evaluated the impact of the attack using both real-world and
simulated driving traces in more than 10 cities located around the world. Our
evaluations show that it is possible for an attacker to reach destinations that
are as far as 30 km away from the true destination without being detected. We
also show that it is possible for the adversary to reach 60-80% of the
possible points within the target region in some cities.
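A hedged illustration of the graph-based idea, not the authors' algorithms: search a toy road graph for destinations reachable via paths whose turn-angle sequence matches the monitored route's within a tolerance, so that gyroscope readings stay plausible. The graph, node positions, and tolerance below are invented for this sketch.

```python
# Toy road-graph search for alternate destinations with matching turn angles.
import math

def turn_angle(p, q, r):
    """Signed turn angle in degrees at q along the path p -> q -> r."""
    a1 = math.atan2(q[1] - p[1], q[0] - p[0])
    a2 = math.atan2(r[1] - q[1], r[0] - q[0])
    return math.degrees((a2 - a1 + math.pi) % (2 * math.pi) - math.pi)

def matching_destinations(graph, pos, start, target_turns, tol=10.0):
    """Depth-first search for simple paths reproducing target_turns."""
    found, stack = set(), [(start,)]
    while stack:
        path = stack.pop()
        if max(len(path) - 2, 0) == len(target_turns):
            found.add(path[-1])           # path has made every required turn
            continue
        for nxt in graph[path[-1]]:
            if nxt in path:
                continue
            if len(path) >= 2:
                ang = turn_angle(pos[path[-2]], pos[path[-1]], pos[nxt])
                if abs(ang - target_turns[len(path) - 2]) > tol:
                    continue              # this turn would look implausible
            stack.append(path + (nxt,))
    return found

pos = {'A': (0, 0), 'B': (1, 0), 'C': (1, -1), 'D': (2, 0),
       'E': (0, 1), 'F': (1, 1)}
graph = {'A': ['B', 'E'], 'B': ['C', 'D'], 'E': ['F'],
         'C': [], 'D': [], 'F': []}
# The true route A -> B -> C makes one -90 degree turn; the attacker can
# instead reach F via A -> E -> F with an indistinguishable turn sequence.
print(sorted(matching_destinations(graph, pos, 'A', [-90.0])))
```

In the abstract's setting the same idea runs over real road networks, where many distinct destinations share near-identical turn-angle and curvature profiles.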
Automated linking of historical data
The recent digitization of complete count census data is an extraordinary opportunity for social scientists to create large longitudinal datasets by linking individuals from one census to another or from other sources to the census. We evaluate different automated methods for record linkage, performing a series of comparisons across methods and against hand linking. We have three main findings that lead us to conclude that automated methods perform well. First, a number of
automated methods generate very low (less than 5%) false positive rates. The automated methods trace out a frontier illustrating the tradeoff between the false positive rate and the (true) match rate. Relative to more conservative automated algorithms, humans tend to link more observations, but at the cost of higher rates of false positives. Second, when human linkers and algorithms use the same linking variables, there is relatively little disagreement between them. Third, across a number of plausible analyses, coefficient estimates and parameters of interest are very similar when using linked samples based on each of the different automated methods. We provide code and Stata commands to implement the various automated methods.
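An illustrative sketch only, with records and thresholds invented here rather than taken from the paper: a thresholded string-similarity linker with exact blocking on birth year. Raising `threshold` trades match rate for a lower false positive rate, which is the frontier the abstract describes.

```python
# Toy record linker: block on birth year, link by best name similarity.
from difflib import SequenceMatcher

def link(records_a, records_b, threshold=0.9):
    """Link (name, birth_year) records across two lists by best name match."""
    links = []
    for i, (name_a, yob_a) in enumerate(records_a):
        best, best_score = None, threshold
        for j, (name_b, yob_b) in enumerate(records_b):
            if yob_a != yob_b:            # blocking: require same birth year
                continue
            score = SequenceMatcher(None, name_a, name_b).ratio()
            if score >= best_score:
                best, best_score = j, score
        if best is not None:
            links.append((i, best))
    return links

census_1900 = [("john smith", 1870), ("jon smyth", 1880), ("mary jones", 1875)]
census_1910 = [("john smith", 1870), ("maria jones", 1875)]
print(link(census_1900, census_1910, threshold=0.85))
```

At threshold 0.85 both "john smith" and the near-variant "mary jones"/"maria jones" pair link; tightening to 0.9 drops the fuzzier match, keeping fewer links but with higher precision.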
The Cauchy-Lagrangian method for numerical analysis of Euler flow
A novel semi-Lagrangian method is introduced to solve numerically the Euler
equation for ideal incompressible flow in arbitrary space dimension. It
exploits the time-analyticity of fluid particle trajectories and requires, in
principle, only limited spatial smoothness of the initial data. Efficient
generation of high-order time-Taylor coefficients is made possible by a
recurrence relation that follows from the Cauchy invariants formulation of the
Euler equation (Zheligovsky & Frisch, J. Fluid Mech. 2014, 749, 404-430).
Truncated time-Taylor series of very high order allow the use of time steps
vastly exceeding the Courant-Friedrichs-Lewy limit, without compromising the
accuracy of the solution. Tests performed on the two-dimensional Euler equation
indicate that the Cauchy-Lagrangian method is more - and occasionally much more
- efficient and less prone to instability than Eulerian Runge-Kutta methods,
and less prone to rapid growth of rounding errors than the high-order Eulerian
time-Taylor algorithm. We also develop tools of analysis adapted to the
Cauchy-Lagrangian method, such as the monitoring of the radius of convergence
of the time-Taylor series. Certain other fluid equations can be handled
similarly.
Comment: 30 pp., 13 figures, 45 references. Minor revision. In press in
Journal of Scientific Computing.
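The mechanism can be seen in a one-dimensional cartoon that is emphatically not the paper's solver: for the toy ODE y' = y^2, the time-Taylor coefficients obey the recurrence c_{n+1} = (sum_{k=0}^{n} c_k c_{n-k}) / (n+1), so arbitrarily high orders come cheaply, and the ratio of successive coefficients monitors the radius of convergence of the series (here 1/y0), just as the abstract describes for the Cauchy-invariants recurrence.

```python
# Toy high-order time-Taylor stepper for y' = y^2 (exact: y0 / (1 - y0*t)).
def taylor_step(y0, dt, order=30):
    c = [y0]
    for n in range(order):
        conv = sum(c[k] * c[n - k] for k in range(n + 1))  # coeffs of y^2
        c.append(conv / (n + 1))
    radius = abs(c[-2] / c[-1])      # crude radius-of-convergence estimate
    y = sum(cn * dt**n for n, cn in enumerate(c))
    return y, radius

y, radius = taylor_step(1.0, 0.5, order=40)
print(y, radius)   # exact solution at t = 0.5 is y0 / (1 - y0*t) = 2
```

With y0 = 1 every coefficient equals 1, the estimated radius is 1, and a single step of size 0.5 (half the blow-up time) is accurate to roughly 0.5^40, which is the sense in which a high-order series permits time steps far beyond what a low-order scheme would allow.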
Surface tension and interfacial fluctuations in d-dimensional Ising model
The surface tension of rough interfaces between coexisting phases in the 2D and
3D Ising models is discussed in view of known results and some original
calculations presented in this paper. The results are summarised in a formula
which allows one to interpolate the corrections to finite-size scaling between two
and three dimensions. The physical meaning of an analytic continuation to
noninteger values of the spatial dimensionality d is discussed. Lattices and
interfaces with properly defined fractal dimensions should fulfil certain
requirements to possibly have properties of an analytic continuation from
d-dimensional hypercubes. Here 2 appears as the marginal value of d below which
the (d-1)-dimensional interface splits into disconnected pieces. Some
phenomenological arguments are proposed to describe such interfaces. They show
that the character of the interfacial fluctuations at d < 2 is not the same as
that provided by a formal analytic continuation from d-dimensional hypercubes
with d >= 2. This probably also holds for the related critical exponents.
Comment: 10 pages, no figures. In the second version changes were made to make
it consistent with the published paper (Sec. 2 is completed).
Student Identity Disclosed: Analysis of an Online Student Profile Tool
In the University of Minnesota’s Student Writing Support program,
we gather, record, and share student and course information in
order to support consultants in their work with writers; to assess
and improve our own practice; and to make compelling, data-driven
arguments for the center’s continued existence. Recognizing
moments when these data-collection practices worked against the
relationships we wanted to build with student writers, we began to
critique these practices, with the goal of creating more intentional
criteria and methods for soliciting client information. In Fall 2013,
we developed and introduced an online Student Profile tool where
clients could indicate their preferred name, provide a guide to
pronouncing their name, include their gender pronouns, list any
language(s) they speak and/or write, and indicate anything else they
would like our consultants to know about them as writers/learners.
We have become particularly interested in what students choose to
share about themselves in that last open-ended prompt: When we
give students opportunities to disclose aspects of their identity,
what do we learn about them and about how they construct their
identities in the context of a writing consultation? In this article we
share our analysis of client data we collected in 2016–17, which
reveals students’ awareness of their identities as writers, students,
and learners as well as the complexities of these identities in a
writing center context. Our findings also speak to larger
conversations about the ways student identities are constructed and
created within higher education.
Solar stereoscopy - where are we and what developments do we require to progress?
Observations from the two STEREO spacecraft give us, for the first time, the
possibility to use stereoscopic methods to reconstruct the 3D solar corona.
Classical stereoscopy works best for solid objects with clear edges.
Consequently an application of classical stereoscopic methods to the faint
structures visible in the optically thin coronal plasma is by no means
straightforward, and several problems have to be treated adequately: 1.) First,
there is the problem of identifying one-dimensional structures - e.g., active
region coronal loops or polar plumes - in the two individual EUV-images observed
with STEREO/EUVI. 2.) As a next step, one has the association problem of finding
corresponding structures in both images. 3.) In the reconstruction problem,
stereoscopic methods are used to compute the 3D-geometry of the identified
structures. Without any prior assumptions, e.g., regarding the footpoints of
coronal loops, the reconstruction problem does not have a unique solution. 4.) One
has to estimate the reconstruction error or accuracy of the reconstructed
3D-structure, which depends on the accuracy of the identified structures in 2D
and on the separation angle between the spacecraft, but also on the location: e.g.,
for east-west directed coronal loops the reconstruction error is highest close
to the loop top. 5.) Finally, we are interested not only in the 3D-geometry
of loops or plumes, but also in physical parameters like density, temperature,
plasma flow, magnetic field strength etc. Helpful for treating some of these
problems are coronal magnetic field models extrapolated from photospheric
measurements, because observed EUV-loops outline the magnetic field. This
feature has been used for a new method dubbed 'magnetic stereoscopy'. As
examples we show recent applications to active region loops.
Comment: 12 pages, 9 figures; a review article.
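A hedged sketch of the reconstruction step (problem 3.) in its most generic form, not the authors' method: triangulate a 3D point from two lines of sight by taking the midpoint of the shortest segment connecting the two rays. The observer positions and viewing directions below are toy values, not STEREO data.

```python
# Generic two-view triangulation via the midpoint of the common perpendicular.
def triangulate(p1, d1, p2, d2):
    """Midpoint of the common perpendicular of rays p1 + t*d1 and p2 + s*d2."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    r = [x - y for x, y in zip(p1, p2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, r), dot(d2, r)
    denom = a * c - b * b                 # zero iff the rays are parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = [p + t * v for p, v in zip(p1, d1)]   # closest point on ray 1
    q2 = [p + s * v for p, v in zip(p2, d2)]   # closest point on ray 2
    return [(u + v) / 2 for u, v in zip(q1, q2)]

# Two toy observers at (0,0,0) and (10,0,0), both sighting the point (1,2,3).
point = triangulate([0.0, 0.0, 0.0], [1.0, 2.0, 3.0],
                    [10.0, 0.0, 0.0], [-9.0, 2.0, 3.0])
print(point)
```

The distance between the two closest points q1 and q2 gives a natural residual, which is one way to make the reconstruction-error dependence on separation angle and loop orientation (problem 4.) concrete.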