Intrinsically Legal-For-Trade Objects by Digital Signatures
The established techniques for legal-for-trade registration of weight values
meet the legal requirements, but in practice they show serious disadvantages. We
report on the first implementation of intrinsically legal-for-trade objects,
namely weight values signed by the scale itself, that has been accepted by the
approval authority. The strict requirements of both the approval and the
verification authorities, as well as the limitations imposed by the scale's
hardware, posed a special challenge. The presented solution fulfills all legal
requirements and eliminates the existing practical disadvantages.
Comment: 4 pages, 0 figures
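The idea of a weight value that carries its own legal-for-trade evidence can be illustrated with a minimal sketch. This is not the paper's implementation: a real scale would use an approved public-key signature scheme, whereas the HMAC below is only a self-contained stand-in, and all names (`SCALE_KEY`, `sign_weight`, the record fields) are hypothetical.

```python
import hashlib
import hmac
import json

# Hypothetical illustration: a weight record is signed at the moment of
# measurement, so the value itself (not the registration channel) carries
# the tamper evidence. HMAC stands in for a real digital signature here
# only to keep the sketch self-contained.
SCALE_KEY = b"demo-key-provisioned-at-verification"  # assumption, not real

def sign_weight(value_g: int, seq: int) -> dict:
    """Return a weight record with an attached authentication tag."""
    record = {"weight_g": value_g, "seq": seq}
    payload = json.dumps(record, sort_keys=True).encode()
    record["sig"] = hmac.new(SCALE_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_weight(record: dict) -> bool:
    """Recompute the tag over the signed fields and compare in constant time."""
    payload = json.dumps(
        {k: record[k] for k in ("weight_g", "seq")}, sort_keys=True
    ).encode()
    expected = hmac.new(SCALE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["sig"])

r = sign_weight(12500, 1)
print(verify_weight(r))   # intact record verifies
r["weight_g"] = 99999
print(verify_weight(r))   # tampered record fails verification
```

The design point is that verification needs only the record itself, not trust in the transmission or storage path.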
Properties of the phi meson at high temperatures and densities
We calculate the spectral density of the phi meson in a hot bath of nucleons
and pions using a general formalism relating self-energy to the forward
scattering amplitude (FSA). In order to describe the low energy FSA, we use
experimental data along with a background term. For the high energy FSA, a
Regge parameterization is employed. We verify the resulting FSA using
dispersion techniques. We find that the position of the peak of the spectral
density is slightly shifted from its vacuum position and that its width is
considerably increased. The width of the spectral density at a temperature of
150 MeV and at normal nuclear density is more than 90 MeV.
Comment: 4 pages, 5 figures, Poster presented at Quark Matter 200
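The self-energy/FSA relation underlying such calculations can be written schematically, at leading order in a dilute gas of scatterers (exact flux and normalization factors, and the thermal average over the bath momentum distribution, are suppressed in this sketch):

```latex
% Schematic low-density relation: phi self-energy from forward
% scattering off nucleons and pions of density rho_a in the bath.
\Pi_\phi(E) \;\approx\; -4\pi \sum_{a \in \{N,\pi\}} \rho_a \, f_{\phi a}(E)
```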
Spatially Adaptive Bayesian P-Splines with Heteroscedastic Errors
Penalized splines (P-splines), which use low-rank spline bases to make computations tractable while maintaining accuracy comparable to smoothing splines, are an increasingly popular tool for nonparametric smoothing. This paper extends penalized spline methodology by both modeling the variance function nonparametrically and using a spatially adaptive smoothing parameter. These extensions have been studied before, but never together and never in the multivariate case. This combination is needed for satisfactory inference and can be implemented effectively by Bayesian MCMC. The variance process controlling the spatially adaptive shrinkage of the mean and the variance of the heteroscedastic error process are modeled as log-penalized splines. We discuss the choice of priors and extensions of the methodology, in particular to multivariate smoothing using low-rank thin-plate splines. A fully Bayesian approach provides the joint posterior distribution of all parameters, in particular of the error standard deviation and penalty functions. In the multivariate case we produce maps of the standard deviation and penalty functions. Our methodology can be implemented using the Bayesian software WinBUGS.
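The hierarchy described above can be summarized schematically (notation is ours, not the paper's):

```latex
% Observation model with nonparametric mean and variance:
y_i = f(x_i) + \sigma(x_i)\,\varepsilon_i, \qquad \varepsilon_i \sim N(0,1)
% The log-variance of the errors and the log smoothing parameter
% controlling the spatially adaptive shrinkage of the mean are both
% modeled as penalized splines on a basis B_k:
\log \sigma^2(x) = \sum_k \beta_k B_k(x), \qquad
\log \lambda(x)  = \sum_k \gamma_k B_k(x)
```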
What does the rho-meson do? In-medium mass shift scenarios versus hadronic model calculations
The NA60 experiment has studied low-mass muon pair production in In-In
collisions with unprecedented precision. With these results there is hope that
the in-medium modifications of the vector meson spectral function can be
constrained more thoroughly than before. We investigate in particular what can
be learned about collisional broadening by a hot and dense medium and what
constraints the experimental results put on in-medium mass-shift scenarios.
The data show a clear indication of considerable in-medium broadening effects
but disfavor mass-shift scenarios in which the rho-meson mass scales with the
square root of the chiral condensate. Scaling scenarios which predict at finite
density a dropping of the rho-meson mass that is stronger than that of the
quark condensate are clearly ruled out, since they are also accompanied by a
sharpening of the spectral function.
Comment: Proceedings contribution; talk given by J. Ruppert at the Workshop for
Young Scientists on the Physics of Ultrarelativistic Nucleus-Nucleus
Collisions (Hot Quarks 2006), Villasimius, Sardinia, Italy, 15-20 May 2006.
To appear in EPJ
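The mass-shift scenarios discussed in the abstract can be written in one schematic scaling relation (the exponent notation is ours):

```latex
% In-medium rho mass scaling with the chiral condensate:
\frac{m_\rho^*}{m_\rho} =
  \left( \frac{\langle\bar{q}q\rangle^*}{\langle\bar{q}q\rangle} \right)^{\alpha}
% alpha = 1/2: mass scales with the square root of the condensate
%              (disfavored by the data);
% alpha > 1:   mass drops faster than the condensate (ruled out).
```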
Protein O-Mannosylation in the Murine Brain: Occurrence of Mono-O-Mannosyl Glycans and Identification of New Substrates
Protein O-mannosylation is a post-translational modification essential for correct development of mammals. In humans, deficient O-mannosylation results in severe congenital muscular dystrophies often associated with impaired brain and eye development. Although various O-mannosylated proteins have been identified in recent years, the distribution of O-mannosyl glycans in the mammalian brain and their target proteins are still not well defined. In the present study, rabbit monoclonal antibodies directed against the O-mannosylated peptide YAT(α1-Man)AV were generated. Detailed characterization of clone RKU-1-3-5 revealed that this monoclonal antibody recognizes O-linked mannose also in different peptide and protein contexts. Using this tool, we observed that mono-O-mannosyl glycans occur ubiquitously throughout the murine brain but are especially enriched at inhibitory GABAergic neurons and at the perineural nets. Using a mass spectrometry-based approach, we further identified glycoproteins from the murine brain that bear single O-mannose residues. Among the candidates identified are members of the cadherin and plexin superfamilies and the perineural net protein neurocan. In addition, we identified neurexin 3, a cell adhesion protein involved in synaptic plasticity, and inter-alpha-trypsin inhibitor 5, a protease inhibitor important in stabilizing the extracellular matrix, as new O-mannosylated glycoproteins.
Semiparametric Regression During 2003–2007
Semiparametric regression is a fusion between parametric regression and nonparametric regression and the title of a book that we published on the topic in early 2003. We review developments in the field during the five-year period since the book was written. We find semiparametric regression to be a vibrant field with substantial involvement and activity, continual enhancement, and widespread application.
Hard and soft probe - medium interactions in a 3D hydro+micro approach at RHIC
We utilize a 3D hybrid hydro+micro model for a comprehensive and consistent
description of soft and hard particle production in ultra-relativistic
heavy-ion collisions at RHIC. In the soft sector we focus on the dynamics of
(multi-)strange baryons, where a clear strangeness dependence of their
collision rates and freeze-out is observed. In the hard sector we study the
radiative energy loss of hard partons in a soft medium in the multiple soft
scattering approximation. While the nuclear suppression factor does
not reflect the high quality of the medium description (except in a reduced
systematic uncertainty in extracting the quenching power of the medium), the
hydrodynamical model also allows one to study different centralities and in
particular the angular variation of the suppression factor with respect to the
reaction plane, allowing for a controlled variation of the in-medium path
length.
Comment: 5 pages, 4 figures, Quark Matter 2006 proceedings, to appear in
Journal of Physics
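The nuclear suppression factor referred to above is the standard ratio of the particle yield in nucleus-nucleus collisions to the binary-collision-scaled yield in proton-proton collisions:

```latex
R_{AA}(p_T) \;=\;
  \frac{\mathrm{d}N^{AA}/\mathrm{d}p_T}
       {\langle N_{\mathrm{coll}} \rangle \, \mathrm{d}N^{pp}/\mathrm{d}p_T}
```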
Tuning the Level of Concurrency in Software Transactional Memory: An Overview of Recent Analytical, Machine Learning and Mixed Approaches
Synchronization transparency offered by Software Transactional Memory (STM) must not come at the expense of run-time efficiency, thus demanding from the STM designer the inclusion of mechanisms properly oriented to performance and other quality indices. Particularly, one core issue to cope with in STM is related to exploiting parallelism while also avoiding thrashing phenomena due to excessive transaction rollbacks, caused by excessively high levels of contention on logical resources, namely concurrently accessed data portions. A means to address run-time efficiency consists in dynamically determining the best-suited level of concurrency (number of threads) to be employed for running the application (or specific application phases) on top of the STM layer. For too low levels of concurrency, parallelism can be hampered. Conversely, over-dimensioning the concurrency level may give rise to the aforementioned thrashing phenomena caused by excessive data contention, which also reduces energy efficiency. In this chapter we provide an overview of a set of recent techniques aimed at building "application-specific" performance models that can be exploited to dynamically tune the level of concurrency to the best-suited value. Although they share some base concepts while modeling the system performance versus the degree of concurrency, these techniques rely on disparate methods, such as machine learning or analytic methods (or combinations of the two), and achieve different tradeoffs in terms of the relation between the precision of the performance model and the latency for model instantiation. Implications of the different tradeoffs in real-life scenarios are also discussed.
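The tuning loop the chapter surveys can be sketched in a few lines: probe neighboring thread counts and keep the one whose predicted committed-transaction throughput is highest. This is a minimal illustration, not any of the chapter's actual techniques; the throughput curve below is a simulated stand-in for an application-specific performance model, and all names are hypothetical.

```python
# Hypothetical sketch of dynamic concurrency tuning for an STM runtime:
# hill-climb over the thread count using a performance model that maps
# concurrency level to committed-transactions-per-second.

def throughput(threads: int) -> float:
    # Simulated application-specific model (assumption, for illustration):
    # parallelism gains up to a point, after which rollback thrashing
    # caused by data contention degrades committed throughput.
    return threads * 1000.0 / (1.0 + 0.02 * threads ** 2)

def tune_concurrency(start: int, max_threads: int = 64) -> int:
    """Move to a neighboring thread count while the model predicts a gain."""
    best = max(1, min(start, max_threads))
    improved = True
    while improved:
        improved = False
        for cand in (best - 1, best + 1):
            if 1 <= cand <= max_threads and throughput(cand) > throughput(best):
                best, improved = cand, True
    return best

print(tune_concurrency(2))   # converges to the model's sweet spot
```

In a real system the model would be instantiated from measurements (analytically, by machine learning, or a mix, as the chapter discusses), and re-evaluated as the workload phase changes.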
Electrical control of inter-dot electron tunneling in a quantum dot molecule
We employ ultrafast pump-probe spectroscopy to directly monitor electron
tunneling between discrete orbital states in a pair of spatially separated
quantum dots. Immediately after excitation, several peaks are observed in the
pump-probe spectrum due to Coulomb interactions between the photo-generated
charge carriers. By tuning the relative energy of the orbital states in the two
dots and monitoring the temporal evolution of the pump-probe spectra the
electron and hole tunneling times are separately measured, and resonant
tunneling between the two dots is shown to be mediated by both elastic and
inelastic processes. Ultrafast (< 5 ps) inter-dot tunneling is shown to occur
over a surprisingly wide bandwidth, up to ~8 meV, reflecting the spectrum of
exciton-acoustic phonon coupling in the system.
Development and assessment of an environmental DNA (eDNA) assay for a cryptic Siren (Amphibia: Sirenidae)
Environmental DNA (eDNA) assays have become a major aspect of surveys for aquatic organisms in the past decade. These methods are highly sensitive, making them well-suited for monitoring rare and cryptic species. Current efforts to study the Rio Grande Siren in southern Texas have been hampered due to the cryptic nature of these aquatic salamanders. Arid conditions further add to the difficulty in studying this species, as many water bodies they inhabit are ephemeral, sometimes constraining sampling efforts to a short window after heavy rain. Additionally, sirens are known to cease activity and reside underground when ponds begin to dry or as water temperatures increase. Conventional sampling efforts require extensive trap-hours to be effective, which is not always possible within the required sampling window. This study presents the development of a novel eDNA assay technique for this elusive species using conventional PCR and Sanger sequencing and compares eDNA sampling results with simultaneous trapping at multiple sites to assess the relative effectiveness of the procedure. Rio Grande Siren detection via eDNA sampling was significantly higher at all sites compared to trapping, confirming the utility of this assay for species detection. This methodology gives promise for future work assessing the distribution and status of the Rio Grande Siren and has potential for use on other southern Texas amphibians.