Music, Musicians and Barroom Aggression
The purpose of this study was to explore the relationship between live bands, the music they play, and aggression in barrooms catering to young, college-aged patrons. Twenty musicians representing 14 different cover bands playing in licensed drinking establishments throughout Northeast Pennsylvania were interviewed about their influence on the behaviors of bar patrons. Content analysis of completed interviews revealed several important findings. Most notably, each of the musicians interviewed in this study reported being able to control and manipulate patron behavior, not just through the music they play, but also through their stage presence, their physical appearance and attire, and the way they interact with patrons while on and away from the stage. While none of the musicians reported ever deliberately trying to push bar patrons towards aggression, most agreed that they had the power to do so if desired. Conversely, musicians identified themselves as potentially important agents of social control within bars. Implications for future research and policy are discussed.
Conidial harvest from solid media using fiberglass screening.
On Verifying Causal Consistency
Causal consistency is one of the most adopted consistency criteria for
distributed implementations of data structures. It ensures that operations are
executed at all sites according to their causal precedence. We address the
issue of verifying automatically whether the executions of an implementation of
a data structure are causally consistent. We consider two problems: (1)
checking whether one single execution is causally consistent, which is relevant
for developing testing and bug finding algorithms, and (2) verifying whether
all the executions of an implementation are causally consistent.
We show that the first problem is NP-complete. This holds even for the
read-write memory abstraction, which is a building block of many modern
distributed systems. Indeed, such systems often store data in key-value stores,
which are instances of the read-write memory abstraction. Moreover, we prove
that, surprisingly, the second problem is undecidable, and again this holds
even for the read-write memory abstraction. However, we show that for the
read-write memory abstraction, these negative results can be circumvented if
the implementations are data independent, i.e., their behaviors do not depend
on the data values that are written or read at each moment, which is a
realistic assumption.
Comment: extended version of POPL 201
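As a rough illustration of problem (1), the per-site check for the read-write memory abstraction can be brute-forced on a toy execution. This is a simplified sketch with my own encoding (not the paper's algorithm): the search enforces only program order, and its exhaustive enumeration is exponential, in line with the NP-completeness result.

```python
from itertools import permutations

# One execution of the read-write memory abstraction; an operation is
# (site, kind, variable, value).  Sites 2 and 3 observe the two
# concurrent writes in opposite orders, which causal consistency allows.
ops = [
    (0, "w", "x", 1),
    (1, "w", "x", 2),
    (2, "r", "x", 1), (2, "r", "x", 2),
    (3, "r", "x", 2), (3, "r", "x", 1),
]
POS = {op: i for i, op in enumerate(ops)}

def respects_program_order(seq):
    """True iff same-site operations appear in their original order."""
    for i, a in enumerate(seq):
        for b in seq[i + 1:]:
            if a[0] == b[0] and POS[a] > POS[b]:
                return False
    return True

def site_serializable(site):
    """Search for a serialization of all writes plus `site`'s reads
    in which every read returns the most recent preceding write."""
    mine = [op for op in ops if op[1] == "w" or op[0] == site]
    for seq in permutations(mine):
        if not respects_program_order(seq):
            continue
        last = {}
        for _, kind, var, val in seq:
            if kind == "w":
                last[var] = val
            elif last.get(var) != val:
                break           # read of a stale or unwritten value
        else:
            return True
    return False

print(all(site_serializable(s) for s in range(4)))  # → True
```

Under data independence (unique written values, as the abstract suggests), the reads-from relation is fixed by the values and such checks become tractable.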
Testing three hypotheses about effects of sensitive-insensitive parenting on telomeres.
Telomeres are the protective DNA-protein sequences appearing at the ends of chromosomes; they shorten with each cell division and are considered a biomarker of aging. Shorter telomere length and greater erosion have been associated with compromised physical and mental health and are hypothesized to be affected by early life stress. In the latter case, most work has relied on retrospective measures of early life stressors. The Dutch research (n = 193) presented herein prospectively tested three hypotheses regarding effects of sensitive-insensitive parenting during the first 2.5 years on telomere length at age 6, when first measured, and change over the following 4 years. It was predicted that (1) less sensitive parenting would predict shorter telomeres and greater erosion, and that such effects would be most pronounced in children (2) exposed to prenatal stress and/or (3) who were highly negatively emotional as infants. Results revealed only that prenatal stress amplified parenting effects on telomere change, in a manner consistent with differential susceptibility: prenatally stressed children displayed more erosion when they experienced insensitive parenting and less erosion when they experienced sensitive parenting. Mechanisms that might initiate greater postnatal plasticity as a result of prenatal stress are highlighted and future work is outlined.
A Formal Account of the Open Provenance Model
On the Web, where resources such as documents and data are published, shared, transformed, and republished, provenance is a crucial piece of metadata that would allow users to place their trust in the resources they access. The Open Provenance Model (OPM) is a community data model for provenance that is designed to facilitate the meaningful interchange of provenance information between systems. Underpinning OPM is a notion of directed graph, where nodes represent data products and processes involved in past computations, and edges represent dependencies between them; it is complemented by graphical inference rules allowing new dependencies to be derived. Until now, however, the OPM model was a purely syntactical endeavor. The present paper extends OPM graphs with an explicit distinction between precise and imprecise edges. Then a formal semantics for the thus enriched OPM graphs is proposed, by viewing OPM graphs as temporal theories on the temporal events represented in the graph. The original OPM inference rules are scrutinized in view of the semantics and found to be sound but incomplete. An extended set of graphical rules is provided and proved to be complete for inference. The paper concludes with applications of the formal semantics to inferencing in OPM graphs, operators on OPM graphs, and a formal notion of refinement among OPM graphs.
A vector for Aspergillus transformation conferring phleomycin resistance.
Recently, transformation of Aspergillus species with vector pAN7-1, conferring resistance to hygromycin B, was reported (Punt et al. 1987, Gene 56:117-124).
The gaseous pixel device
The Gaseous Pixel Chamber is a new device developed during the last year within the LAA project at CERN. Basically, we print electrodes onto a flexible Kapton foil with the standard printed-circuit technology used in the CERN workshops. We have found a design that allows us to operate the foil as a particle detector working in the gaseous limited streamer mode. This work has been previously reported. We are well satisfied with the operational characteristics that this device has reached so far (efficiency, ease of construction and operation). However, the demands imposed on any detector device at future hadron colliders are very stringent, and many improvements are still needed to meet the technical challenge of operating at the LHC, SSC or Eloisatron hadron colliders (such as time response, space resolution, energy proportionality). We therefore propose an R&D programme for studying the aspects that are relevant for application of this kind of detector within a hadron collider environment.
Link Prediction with Social Vector Clocks
State-of-the-art link prediction utilizes combinations of complex features
derived from network panel data. We here show that computationally less
expensive features can achieve the same performance in the common scenario in
which the data is available as a sequence of interactions. Our features are
based on social vector clocks, an adaptation of the vector-clock concept
introduced in distributed computing to social interaction networks. In fact,
our experiments suggest that by taking into account the order and spacing of
interactions, social vector clocks exploit different aspects of link formation
so that their combination with previous approaches yields the most accurate
predictor to date.
Comment: 9 pages, 6 figures
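A minimal sketch of the vector-clock idea adapted to an interaction sequence (my own simplified encoding, not necessarily the authors' exact definition): after each directed interaction (u, v, t), node v's clock records, per source node, the latest time at which information originating at that node could have reached v.

```python
def social_vector_clocks(interactions, nodes):
    # clock[v][w]: latest time at which information originating at node w
    # could have reached node v (-inf means no chain of interactions yet)
    clock = {v: {w: float("-inf") for w in nodes} for v in nodes}
    for u, v, t in interactions:
        clock[u][u] = t                  # u emits fresh information at t
        for w in nodes:                  # v learns everything u knows
            clock[v][w] = max(clock[v][w], clock[u][w])
    return clock

nodes = ["a", "b", "c"]
events = [("a", "b", 1), ("b", "c", 2), ("c", "a", 3)]
clocks = social_vector_clocks(events, nodes)
# a link-prediction feature could then score a pair by the recency of
# such indirect contact, e.g.:
print(clocks["c"]["a"])  # → 1 (information from a reached c via b)
```

Because the update depends on the order and timing of the interactions, the resulting features capture temporal paths that static network features cannot.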
A new discrete dipole kernel for quantitative susceptibility mapping
PURPOSE: Most approaches for quantitative susceptibility mapping (QSM) are based on a forward model approximation that employs a continuous Fourier transform operator to solve a differential equation system. Such a formulation, however, is prone to high-frequency aliasing. The aim of this study was to reduce such errors using an alternative dipole kernel formulation based on the discrete Fourier transform and discrete operators. METHODS: The impact of this approach on forward model calculation and susceptibility inversion was evaluated against the continuous formulation, using both synthetic phantoms and in vivo MRI data. RESULTS: The discrete kernel demonstrated systematically better fits to analytic field solutions, and showed fewer over-oscillations and aliasing artifacts while preserving low- and medium-frequency responses relative to those obtained with the continuous kernel. In the context of QSM estimation, the use of the proposed discrete kernel resulted in error reduction and increased sharpness. CONCLUSION: This proof-of-concept study demonstrated that discretizing the dipole kernel is advantageous for QSM. The impact on small or narrow structures such as the venous vasculature might be particularly relevant to high-resolution QSM applications with ultra-high field MRI, a topic for future investigations. The proposed dipole kernel admits a straightforward implementation within existing QSM routines.
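For context, the continuous-Fourier dipole kernel that the proposed discrete kernel replaces is commonly written as D(k) = 1/3 - kz^2/|k|^2. A minimal NumPy sampling of it (the grid size, field direction along z, and zero-at-origin convention are my illustrative choices, not the paper's):

```python
import numpy as np

def continuous_dipole_kernel(shape):
    # unit-frequency FFT grid; B0 assumed along the z (last) axis
    kx, ky, kz = np.meshgrid(*[np.fft.fftfreq(n) for n in shape],
                             indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    with np.errstate(divide="ignore", invalid="ignore"):
        D = 1.0 / 3.0 - kz**2 / k2
    D[0, 0, 0] = 0.0   # kernel is undefined at the k-space origin
    return D

D = continuous_dipole_kernel((32, 32, 32))

# forward model: field = IFFT(D * FFT(susceptibility))
chi = np.zeros((32, 32, 32))
chi[16, 16, 16] = 1.0                      # point susceptibility source
field = np.real(np.fft.ifftn(D * np.fft.fftn(chi)))
```

The paper's point is that sampling this continuous expression on a finite grid causes high-frequency aliasing, which discrete differential operators avoid.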
On reducing the complexity of matrix clocks
Matrix clocks are a generalization of the notion of vector clocks that allows
the local representation of causal precedence to reach into an asynchronous
distributed computation's past with depth x, where x is an integer.
Maintaining matrix clocks correctly in a system of n nodes requires that
every message be accompanied by O(n^x) numbers, which reflects an exponential
dependency of the complexity of matrix clocks upon the desired depth x. We
introduce a novel type of matrix clock, one that requires only nx numbers to
be attached to each message while maintaining what for many applications may be
the most significant portion of the information that the original matrix clock
carries. In order to illustrate the new clock's applicability, we demonstrate
its use in the monitoring of certain resource-sharing computations.
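The depth-1 case that the abstract generalizes, the plain vector clock, can be sketched as follows (a minimal illustration with my own function names; a depth-2 matrix clock would keep one such vector per node, i.e. n^2 numbers per message, which is the growth the paper attacks):

```python
# Standard vector-clock update rules for a system of n nodes.
def local_event(clock, i):
    clock[i] += 1                        # tick own component

def send(clock, i):
    clock[i] += 1
    return list(clock)                   # n numbers accompany the message

def receive(clock, i, msg_clock):
    for j, t in enumerate(msg_clock):    # componentwise maximum
        clock[j] = max(clock[j], t)
    clock[i] += 1

a, b = [0, 0], [0, 0]
m = send(a, 0)        # node 0 ticks and sends its clock
receive(b, 1, m)      # node 1 merges the message clock, then ticks
print(a, b)  # → [1, 0] [1, 1]
```

A depth-x matrix clock additionally tracks what each node knows about every other node's knowledge, recursively to depth x; the paper's contribution is a representation whose per-message cost grows linearly rather than exponentially in that depth.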