22,923 research outputs found
Development of a noise annoyance sensitivity scale
Examining the problem of noise pollution from the psychological rather than the engineering view, a test of human sensitivity to noise was developed against the criterion of noise annoyance. Test development evolved from a previous study in which biographical, attitudinal, and personality data were collected on a sample of 166 subjects drawn from the adult community of Raleigh. Analysis revealed that only a small subset of the data collected was predictive of noise annoyance. Item analysis yielded 74 predictive items that composed the preliminary noise sensitivity test. This was administered to a sample of 80 adults who later rated the annoyance value of six sounds (equated in terms of peak sound pressure level) presented in a simulated home living-room environment. A predictive model involving 20 test items was developed using multiple regression techniques, and an item weighting scheme was evaluated.
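The abstract names multiple regression and an item-weighting scheme but gives no computational detail; the following is a minimal sketch, using hypothetical item-response and annoyance data, of how weighted and unit-weighted item models could be compared. It is illustrative only, not the authors' procedure.

    import numpy as np

    # Hypothetical data: 80 subjects responding to 20 retained test items (assumed
    # 0-4 rating scale) and a synthetic annoyance criterion for each subject.
    rng = np.random.default_rng(0)
    X = rng.integers(0, 5, size=(80, 20)).astype(float)
    y = X @ rng.normal(size=20) + rng.normal(size=80)

    # Ordinary least-squares multiple regression: annoyance ~ weighted sum of items.
    X1 = np.column_stack([np.ones(len(X)), X])      # add an intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)   # beta[1:] are the item weights

    def r2(y_true, y_pred):
        ss_res = np.sum((y_true - y_pred) ** 2)
        ss_tot = np.sum((y_true - y_true.mean()) ** 2)
        return 1.0 - ss_res / ss_tot

    # Compare the fitted weights against a simple unit-weighting of the same items.
    print("R^2, regression weights:", round(r2(y, X1 @ beta), 3))
    r_unit = np.corrcoef(X.sum(axis=1), y)[0, 1]
    print("R^2, unit weights      :", round(r_unit ** 2, 3))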
Ames ER-2 ozone measurements
The objective of this research is to study ozone (O3) in the stratosphere. Measurements of the ozone mixing ratio are obtained at 1 s intervals with an ultraviolet photometer flown on the ER-2 aircraft. The photometer determines the amount of ozone in air by measuring the transmission of ultraviolet light through a fixed path with and without ambient O3 present.
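The abstract states only that ozone is retrieved from UV transmission measured with and without ambient O3 in a fixed path; the sketch below shows the underlying Beer-Lambert calculation with assumed example numbers (the cell length, cross section, signals, and flight-level conditions are illustrative, not instrument values).

    import math

    # Beer-Lambert law: I = I0 * exp(-sigma * n * L), so the O3 number density is
    #   n = ln(I0 / I) / (sigma * L)
    sigma = 1.15e-17   # cm^2, approximate O3 cross section near 254 nm (assumed)
    L = 50.0           # cm, illustrative absorption-path length (assumed)
    I0 = 1.000         # detector signal with O3-scrubbed air in the path (relative)
    I = 0.995          # detector signal with ambient air in the path (relative)

    n_o3 = math.log(I0 / I) / (sigma * L)   # O3 molecules per cm^3

    # Convert to a mixing ratio using the ambient air number density at an
    # assumed stratospheric pressure and temperature.
    k_B = 1.380649e-23                       # J/K
    p, T = 5000.0, 220.0                     # Pa, K (illustrative flight level)
    n_air = p / (k_B * T) * 1e-6             # molecules per cm^3
    print(f"O3 number density ~ {n_o3:.2e} cm^-3")
    print(f"O3 mixing ratio   ~ {n_o3 / n_air * 1e6:.1f} ppmv")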
The Cluster Distribution as a Test of Dark Matter Models. IV: Topology and Geometry
We study the geometry and topology of the large-scale structure traced by
galaxy clusters in numerical simulations of a box of side 320 Mpc, and
compare them with available data on real clusters. The simulations we use are
generated by the Zel'dovich approximation, using the same methods as we have
used in the first three papers in this series. We consider the following models
to see if there are measurable differences in the topology and geometry of the
superclustering they produce: (i) the standard CDM model (SCDM); (ii) an open,
low-density CDM model (OCDM); (iii) a CDM model with a 'tilted' power
spectrum (TCDM); (iv) a CDM model with a very low Hubble constant (LOWH);
(v) a model with mixed CDM and HDM (CHDM); (vi) a flat, low-density CDM model
with a non-zero cosmological term (ΛCDM). We analyse these models using a
variety of
statistical tests based on the analysis of: (i) the Euler-Poincaré
characteristic; (ii) percolation properties; (iii) the Minimal Spanning Tree
construction. Taking all these tests together we find that the best fitting
model is ΛCDM and, indeed, the others do not appear to be consistent
with the data. Our results demonstrate that despite their biased and extremely
sparse sampling of the cosmological density field, it is possible to use
clusters to probe subtle statistical diagnostics of models which go far beyond
the low-order correlation functions usually applied to study superclustering. Comment: 17 pages, 7 postscript figures, uses mn.sty, MNRAS in press.
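As a loose illustration of the third diagnostic listed above, the sketch below builds a minimal spanning tree over a toy catalogue of cluster positions in a 320 Mpc box and reports its edge-length statistics; it is a generic construction, not the specific MST statistics computed in the paper.

    import numpy as np
    from scipy.spatial import distance_matrix
    from scipy.sparse.csgraph import minimum_spanning_tree

    # Toy cluster catalogue: random positions in a 320 Mpc box (illustrative only).
    rng = np.random.default_rng(1)
    clusters = rng.uniform(0.0, 320.0, size=(200, 3))   # Mpc

    # Reduce the complete pairwise-separation graph to its minimal spanning tree.
    separations = distance_matrix(clusters, clusters)
    mst = minimum_spanning_tree(separations)             # sparse matrix of tree edges

    # The distribution of MST edge lengths is one way to quantify filamentary
    # superclustering and compare models against an observed catalogue.
    edges = mst.data                                     # N-1 edge lengths, in Mpc
    print(f"{edges.size} MST edges: mean = {edges.mean():.1f} Mpc, "
          f"max = {edges.max():.1f} Mpc")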
The "zeroth law" of turbulence: Isotropic turbulence simulations revisited
The dimensionless kinetic energy dissipation rate C_epsilon is estimated from
numerical simulations of statistically stationary isotropic box turbulence that
is slightly compressible. The Taylor microscale Reynolds number Re_lambda range
is 20 < Re_lambda < 220, and statistical stationarity is achieved with a
random phase forcing method. The strong Re_lambda dependence of C_epsilon
abates for Re_lambda above approximately 100, after which C_epsilon slowly approaches
approximately 0.5, a value slightly different from previously reported simulations but in
good agreement with experimental results. If C_epsilon is estimated at a
specific time step from the time series of the quantities involved it is
necessary to account for the time lag between energy injection and energy
dissipation. Also, the resulting value can differ from the ensemble averaged
value by up to ±30%. This may explain the spread in results from previously
published estimates of C_epsilon. Comment: 7 pages, 7 figures. Submitted to Phys. Rev.
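For reference, C_epsilon is conventionally defined as C_epsilon = epsilon * L / u'^3, with epsilon the mean kinetic energy dissipation rate, L the integral length scale and u' the rms velocity. The sketch below evaluates it from hypothetical stationary time series and applies a simple fixed time shift between the large-scale quantities and the dissipation, as a rough illustration of the injection-to-dissipation lag mentioned above; the series and the lag are assumed, not the authors' data or method.

    import numpy as np

    def c_epsilon(u_rms, L_int, eps):
        # Dimensionless dissipation rate: eps * L / u'^3.
        return eps * L_int / u_rms ** 3

    # Hypothetical stationary time series of rms velocity, integral scale, dissipation.
    rng = np.random.default_rng(2)
    n = 2000
    u_rms = 1.0 + 0.05 * rng.standard_normal(n)
    L_int = 0.5 + 0.02 * rng.standard_normal(n)
    eps   = 0.9 + 0.10 * rng.standard_normal(n)

    # Single-time-step estimate versus a lag-corrected time average: the dissipation
    # is sampled a fixed number of steps after the large-scale quantities (lag assumed).
    lag = 25
    single_step = c_epsilon(u_rms[1000], L_int[1000], eps[1000])
    lagged = c_epsilon(u_rms[:-lag], L_int[:-lag], eps[lag:])
    print(f"single-step estimate : {single_step:.3f}")
    print(f"lag-corrected average: {lagged.mean():.3f} +/- {lagged.std():.3f}")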
Selecting children for head CT following head injury
OBJECTIVE: Indicators for head CT scan defined by the 2007 National Institute for Health and Care Excellence (NICE) guidelines were analysed to identify CT uptake, influential variables and yield. DESIGN: Cross-sectional study. SETTING: Hospital inpatient units in England, Wales, Northern Ireland and the Channel Islands. PATIENTS: Children admitted following head injury. RESULTS: Children aged over 3 years were much more likely to have CT than those under 3 years (OR 2.35, 95% CI 2.08 to 2.65). CONCLUSION: Compliance with guidelines and diagnostic yield were variable across age groups, the type of hospital and the region where children were admitted. With this pattern of clinical practice, the risks of both missing intracranial injury and overusing CT are considerable.
Oblivion: Mitigating Privacy Leaks by Controlling the Discoverability of Online Information
Search engines are the prevalently used tools to collect information about
individuals on the Internet. Search results typically comprise a variety of
sources that contain personal information -- either intentionally released by
the person herself, or unintentionally leaked or published by third parties,
often with detrimental effects on the individual's privacy. To grant
individuals the ability to regain control over their disseminated personal
information, the European Court of Justice recently ruled that EU citizens have
a right to be forgotten, in the sense that indexing systems must offer them
technical means to request removal of links from search results that point to
sources violating their data protection rights. As of now, these technical
means consist of a web form that requires a user to manually identify all
relevant links upfront and to insert them into the web form, followed by a
manual evaluation by employees of the indexing system to assess if the request
is eligible and lawful.
We propose a universal framework Oblivion to support the automation of the
right to be forgotten in a scalable, provable and privacy-preserving manner.
First, Oblivion enables a user to automatically find and tag her disseminated
personal information using natural language processing and image recognition
techniques and file a request in a privacy-preserving manner. Second, Oblivion
provides indexing systems with an automated and provable eligibility mechanism,
asserting that the author of a request is indeed affected by an online
resource. The automated eligibility proof ensures censorship resistance, so that
only legitimately affected individuals can request the removal of corresponding
links from search results. We have conducted comprehensive evaluations, showing
that Oblivion is capable of handling 278 removal requests per second, and is
hence suitable for large-scale deployment.
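As a loose illustration of the first step described above (automatically finding a user's disseminated personal information in an online resource), the sketch below scans page text for user-supplied identifiers. It is a deliberately simplified stand-in using plain pattern matching; Oblivion's actual NLP/image-recognition pipeline and its cryptographic eligibility proof are not reproduced here.

    import re

    def find_personal_mentions(page_text, identifiers):
        # Toy matcher: case-insensitive whole-word search for each identifier
        # (name, email address, etc.) supplied by the user.
        hits = []
        for ident in identifiers:
            pattern = r"\b" + re.escape(ident) + r"\b"
            if re.search(pattern, page_text, flags=re.IGNORECASE):
                hits.append(ident)
        return hits

    # Hypothetical example: a user checks whether a page mentions her identifiers.
    page = "Contact Jane Example at jane.example@mail.invalid for the 2013 case files."
    user_identifiers = ["Jane Example", "jane.example@mail.invalid", "+49 30 0000000"]
    print(find_personal_mentions(page, user_identifiers))
    # -> ['Jane Example', 'jane.example@mail.invalid']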
Diffusion, peer pressure and tailed distributions
We present a general, physically motivated non-linear and non-local advection
equation in which the diffusion of interacting random walkers competes with a
local drift arising from a kind of peer pressure. We show, using a mapping to
an integrable dynamical system, that on varying a parameter, the steady state
behaviour undergoes a transition from the standard diffusive behavior to a
localized stationary state characterized by a tailed distribution. Finally, we
show that recent empirical laws on economic growth can be explained as a
collective phenomenon due to peer pressure interaction. Comment: RevTeX: 4 pages + 3 eps figures. Minor revision and figure 3
replaced. To appear in Phys. Rev. Letters.
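The abstract does not reproduce the advection equation itself, so the sketch below simulates a toy ensemble of interacting random walkers in which diffusion competes with a drift toward the instantaneous ensemble mean, a stand-in for the peer-pressure term. It only illustrates the qualitative competition, with assumed parameters, and is not the paper's non-linear, non-local model or its tailed stationary state.

    import numpy as np

    # Toy ensemble: each walker diffuses with constant D and is pulled toward the
    # current ensemble mean with strength g ("peer pressure"); all values assumed.
    rng = np.random.default_rng(3)
    N, steps, dt = 10_000, 5_000, 0.01
    D, g = 1.0, 0.5

    x = rng.normal(0.0, 5.0, size=N)
    for _ in range(steps):
        drift = -g * (x - x.mean())          # non-local pull toward the ensemble mean
        x += drift * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(N)

    # With g > 0 the spread saturates at a localized stationary state instead of
    # growing diffusively; for this linear toy drift the width is sqrt(D/g).
    print(f"stationary std ~ {x.std():.2f}  (sqrt(D/g) = {np.sqrt(D / g):.2f})")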
- …