Data-matched filter
After amplification and normalization, incoming data bits are fed alternately to a pair of integrators. While one integrator is operating, the content of the other is held, sampled, and dumped. A clock derived in the bit-timing extractor times and controls the integrators; the clock frequency is one-half the data rate.
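The alternating integrate-and-dump scheme described above can be sketched in a few lines. The NRZ signalling, the samples-per-bit parameter, and all function names below are illustrative assumptions, not details from the source.

```python
import numpy as np

def pingpong_integrate_and_dump(samples, samples_per_bit):
    """Sketch of the ping-pong integrate-and-dump scheme: while one
    integrator accumulates the current bit interval, the other is in its
    hold/sample/dump phase.  Each integrator handles every other bit, so
    its control clock runs at one-half the data rate, as described above."""
    acc = [0.0, 0.0]          # the two integrators
    decisions = []
    active = 0                # index of the currently integrating unit
    for i, s in enumerate(samples):
        acc[active] += s
        if (i + 1) % samples_per_bit == 0:
            # bit boundary: sample the finished integrator...
            decisions.append(1 if acc[active] > 0 else 0)
            acc[active] = 0.0  # ...dump it...
            active ^= 1        # ...and swap roles with the other one
    return decisions

# usage: NRZ bits mapped to +1/-1, 8 samples per bit, with mild noise
rng = np.random.default_rng(0)
bits = [1, 0, 1, 1, 0]
wave = np.repeat([1.0 if b else -1.0 for b in bits], 8)
wave += 0.2 * rng.standard_normal(wave.size)
print(pingpong_integrate_and_dump(wave, 8))  # → [1, 0, 1, 1, 0]
```

The simulation samples each integrator at the bit boundary and dumps it immediately; in hardware the hold/sample/dump phase of one integrator overlaps the integration phase of the other.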
The Extraction of the Gluon Density from Jet Production in Deeply Inelastic Scattering
The prospects of a direct extraction of the proton's gluon density in
next-to-leading order via jet rates in deeply inelastic scattering are studied.
The employed method is based on the Mellin transform, and can be applied, in
principle, to all infra-red-safe observables of hadronic final states. We
investigate the dependence of the error band on the extracted gluon
distribution on the statistical and systematic errors of the data.
Comment: 5 pages (LaTeX); 2 figures are included via epsfig; contribution to the workshop ``Future Physics at HERA'' at DESY, Hamburg, 1995/96; to be published in the proceedings; compressed PostScript version also available at http://wwwcn.cern.ch/~graudenz/publications.htm
The Mellin Transform Technique for the Extraction of the Gluon Density
A new method is presented to determine the gluon density in the proton from
jet production in deeply inelastic scattering. By using the technique of Mellin
transforms not only for the solution of the scale evolution equation of the
parton densities but also for the evaluation of scattering cross sections, the
gluon density can be extracted in next-to-leading order QCD. The method
described in this paper is, however, more general, and can be used in
situations where a repeated fast numerical evaluation of scattering cross
sections for varying parton distribution functions is required.
Comment: 13 pages (LaTeX); 2 figures are included via epsfig; the corresponding PostScript files are uuencoded.
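The Mellin-space machinery referred to above can be illustrated with a toy example (not the authors' code, and requiring SciPy for the complex Gamma function): a made-up gluon-like density x^a (1-x)^b has Beta-function Mellin moments, and the density is recovered by numerical contour inversion. The exponents and contour parameters below are illustrative choices.

```python
import numpy as np
from scipy.special import gamma  # Gamma function for complex argument

# Toy "gluon-like" density g(x) = x^a (1-x)^b (illustrative exponents,
# not a fitted parametrization).  Its Mellin moments are analytic:
#   g~(N) = Int_0^1 dx x^(N-1) g(x) = B(N + a, b + 1).
a, b = -0.5, 5.0

def moment(N):
    return gamma(N + a) * gamma(b + 1.0) / gamma(N + a + b + 1.0)

def invert(x, c=1.5, t_max=60.0, n=8001):
    """Numerical inverse Mellin transform along the contour N = c + i*t:
       g(x) = (1/pi) Int_0^inf dt Re[x^(-N) g~(N)]   (trapezoidal rule).
    Once cross-section moments are precomputed, only this cheap inversion
    has to be repeated when the parton distribution is varied -- the point
    of the method described above."""
    t = np.linspace(0.0, t_max, n)
    N = c + 1j * t
    integrand = np.real(x ** (-N) * moment(N))
    dt = t[1] - t[0]
    return (0.5 * integrand[0] + integrand[1:-1].sum()
            + 0.5 * integrand[-1]) * dt / np.pi

# the inversion reproduces the input density
for x in (0.01, 0.1, 0.5):
    print(x, invert(x), x ** a * (1.0 - x) ** b)
```

The moments fall off like a high inverse power of |N| here, so a finite contour with a simple trapezoidal rule already converges well.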
Threshold logic implementation of a modular computer system design
Threshold logic implementation for LSI design of a modular computer system.
Cleaning the USNO-B Catalog through automatic detection of optical artifacts
The USNO-B Catalog contains spurious entries that are caused by diffraction
spikes and circular reflection halos around bright stars in the original
imaging data. These spurious entries appear in the Catalog as if they were real
stars; they are confusing for some scientific tasks. The spurious entries can
be identified by simple computer vision techniques because they produce
repeatable patterns on the sky. Some techniques employed here are variants of
the Hough transform, one of which is sensitive to (two-dimensional)
overdensities of faint stars in thin right-angle cross patterns centered on
bright (<13 mag) stars, and one of which is sensitive to thin annular
overdensities centered on very bright (<7 mag) stars. After enforcing
conservative statistical requirements on spurious-entry identifications, we
find that of the 1,042,618,261 entries in the USNO-B Catalog, 24,148,382 of
them (2.3%) are identified as spurious by diffraction-spike criteria
and 196,133 (0.02%) are identified as spurious by reflection-halo
criteria. The spurious entries are often detected in more than 2 bands and are
not overwhelmingly outliers in any photometric properties; they therefore
cannot be rejected easily on other grounds, i.e., without the use of computer
vision techniques. We demonstrate our method, and return to the community in
electronic form a table of spurious entries in the Catalog.
Comment: published in A
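A toy version of the cross-pattern test can be sketched as follows. The geometry, thresholds, and helper names are our own illustrative choices, not the paper's actual pipeline, which is far more conservative.

```python
import numpy as np

def cross_overdensity(bright_xy, faint_xy, half_length=0.01, half_width=5e-4):
    """Illustrative test for a thin right-angle cross of faint sources
    centred on a bright star: count faint stars in two thin orthogonal
    strips and compare with the count expected from the local mean
    surface density (rough Poisson significance)."""
    dx = faint_xy[:, 0] - bright_xy[0]
    dy = faint_xy[:, 1] - bright_xy[1]
    in_vertical = (np.abs(dx) < half_width) & (np.abs(dy) < half_length)
    in_horizontal = (np.abs(dy) < half_width) & (np.abs(dx) < half_length)
    n_cross = np.count_nonzero(in_vertical | in_horizontal)
    # background estimate from the surrounding box
    in_box = (np.abs(dx) < half_length) & (np.abs(dy) < half_length)
    density = np.count_nonzero(in_box) / (2 * half_length) ** 2
    expected = density * 2 * (2 * half_width) * (2 * half_length)
    return (n_cross - expected) / np.sqrt(max(expected, 1.0))

# usage: uniform background, then the same field with an injected "spike"
rng = np.random.default_rng(1)
bg = rng.uniform(-0.01, 0.01, size=(500, 2))
spike = np.concatenate([
    np.column_stack([rng.uniform(-0.01, 0.01, 40), rng.uniform(-3e-4, 3e-4, 40)]),
    np.column_stack([rng.uniform(-3e-4, 3e-4, 40), rng.uniform(-0.01, 0.01, 40)]),
])
print(cross_overdensity((0.0, 0.0), np.vstack([bg, spike])))  # strongly positive
print(cross_overdensity((0.0, 0.0), bg))                      # consistent with 0
```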
Considering even-order terms in stochastic nonlinear system modeling with respect to broadband data communication
As a tradeoff between efficiency and cost, modern communication systems contain a variety of components that can at least be considered weakly nonlinear. A critical element in evaluating the degree of nonlinearity of any underlying nonlinear system is the amount of undesired signal strength or signal power the system introduces outside the transmission bandwidth. This phenomenon, called spectral regrowth or spectral broadening, is subject to stringent restrictions imposed mainly by the specifications of the particular communication standard. Consequently, achieving the highest possible efficiency without exceeding the linearity requirements is one of the main tasks in system design. This challenging engineering problem creates a need for specialized tools capable of predicting the linearity and efficiency of the underlying design. Among the multitude of methods aiming at the prediction of spectral regrowth, a statistical approach to modeling and analyzing nonlinear systems offers the advantage of short processing times, owing to closed-form mathematical expressions relating input and output power spectra, and is therefore examined further throughout this article.
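To make the statistical idea concrete: for a memoryless polynomial nonlinearity driven by zero-mean Gaussian noise, Isserlis' theorem gives a closed-form output autocorrelation, and powers of the autocorrelation become spectral self-convolutions, which is exactly the mechanism of spectral regrowth, with even-order terms contributing as well. The sketch below is a generic textbook model of this kind, not the article's specific system.

```python
import numpy as np

def polynomial_output_psd(f, S_in, a1, a2, a3):
    """Output PSD of the memoryless polynomial y = a1*x + a2*x**2 + a3*x**3
    driven by zero-mean Gaussian noise with input PSD S_in on the grid f
    (use an odd number of grid points so the band stays centred).
    Isserlis' theorem gives the output autocorrelation
        R_y = (a1 + 3*a3*s2)**2 * R + 2*a2**2 * R**2 + 6*a3**2 * R**3
              + a2**2 * s2**2,      with s2 = R(0),
    and powers of R transform into spectral self-convolutions."""
    df = f[1] - f[0]
    s2 = S_in.sum() * df                   # input power R(0)
    S2 = np.convolve(S_in, S_in) * df      # transform of R**2 (2x bandwidth)
    S3 = np.convolve(S2, S_in) * df        # transform of R**3 (3x bandwidth)
    m = len(S3)                            # 3n - 2 frequency bins
    out = 6.0 * a3**2 * S3
    p2 = (m - len(S2)) // 2
    out[p2:p2 + len(S2)] += 2.0 * a2**2 * S2
    p1 = (m - len(S_in)) // 2
    out[p1:p1 + len(S_in)] += (a1 + 3.0 * a3 * s2) ** 2 * S_in
    # the even-order DC term a2**2 * s2**2 is a spectral line at f = 0,
    # omitted from the continuous part returned here
    f_out = (np.arange(m) - m // 2) * df
    return f_out, out

# flat band-limited input: regrowth appears outside |f| <= B
B, n = 1.0, 201
f = np.linspace(-B, B, n)
S_in = np.ones_like(f)
f_out, S_out = polynomial_output_psd(f, S_in, a1=1.0, a2=0.05, a3=-0.01)
oob = S_out[np.abs(f_out) > B].sum() / S_out.sum()
print(f"out-of-band power fraction: {oob:.4f}")
```

The second-order term spreads power over twice the input bandwidth and the third-order term over three times it, which is why the adjacent channels see regrowth even for small a2 and a3.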
Highly Sensitive Gamma-Spectrometers of GERDA for Material Screening: Part 2
The previous article about material screening for GERDA points out the
importance of strict material screening and selection for radioimpurities as a
key to meet the aspired background levels of the GERDA experiment. This is
directly done using low-level gamma-spectroscopy. In order to provide
sufficient selective power in the mBq/kg range and below, the employed
gamma-spectrometers themselves have to meet strict material requirements, and
make use of an elaborate shielding system. This article gives an account of the
setup of two such spectrometers. Corrado is located at a depth of 15 m w.e. at
the MPI-K in Heidelberg (Germany); GeMPI III is situated at the Gran Sasso
underground laboratory (Italy) at 3500 m w.e. The latter aims at detecting
sample activities of the order of ~0.01 mBq/kg, which is the current
state-of-the-art level. The techniques applied to meet the respective needs are
discussed and demonstrated by experimental results.
Comment: Featured in: Proceedings of the XIV International Baksan School "Particles and Cosmology", Baksan Valley, Kabardino-Balkaria, Russia, April 16-21, 2007. INR RAS, Moscow 2008. ISBN 978-5-94274-055-9, pp. 233-238 (6 pages, 4 figures).
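The sensitivity scale of such screening measurements can be estimated with Currie's standard minimum-detectable-activity formula; the numerical values below (background rate, efficiency, branching ratio, run time) are illustrative placeholders, not the Corrado or GeMPI III specifications.

```python
import math

def currie_mda_bq_per_kg(bkg_counts_per_s, eff, branching, t_s, mass_kg):
    """Currie minimum detectable activity (~95% c.l.) for a counting
    measurement: L_D = 2.71 + 4.65*sqrt(B), where B is the expected
    background count in the peak region over the run time.  Divide by
    detection efficiency, gamma branching ratio, time, and sample mass
    to express the limit as a specific activity."""
    B = bkg_counts_per_s * t_s
    L_D = 2.71 + 4.65 * math.sqrt(B)
    return L_D / (eff * branching * t_s * mass_kg)

# e.g. a 30-day run of a 1 kg sample with a very low background rate
mda = currie_mda_bq_per_kg(bkg_counts_per_s=1e-5, eff=0.03, branching=0.5,
                           t_s=30 * 86400, mass_kg=1.0)
print(f"MDA ~ {mda * 1e3:.3f} mBq/kg")
```

Because the limit scales roughly as sqrt(B)/t, reaching the ~0.01 mBq/kg level quoted above demands both a drastically reduced background (deep underground, elaborate shielding) and long counting times.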
Historic Wooden Shipwrecks Influence Dispersal of Deep-Sea Biofilms
Wood arrives on the seabed from natural and anthropogenic sources (e.g., wood falls and wooden shipwrecks, respectively) and creates seafloor habitats for macro-, meio- and microbiota. The way these habitats shape microbial communities and their biogeographic patterns in the deep sea requires study. The objective of this work was to investigate how historic wooden-hulled shipwrecks impact the dispersal of wood-colonizing microbial biofilms. The study addressed how proximity to wooden shipwrecks shapes diversity, richness, and community composition in the surrounding environment. Study sites included two historic shipwrecks in the northern Gulf of Mexico identified as wooden-hulled sailing vessels dating to the late 19th century. Two experimental microbial recruitment arrays containing pine and oak samples were deployed by remotely operated vehicle proximate (0–200 m) to each shipwreck and used to establish new wooden habitat features to be colonized by biofilms. The experiments remained in place for approximately 4 months, were subsequently recovered, and biofilms were analyzed using 16S rRNA gene amplification and sequencing for bacteria and archaea and ITS2 region amplification and sequencing for fungi to determine alpha diversity metrics and community composition. The work examined the influence of wood type, proximity to shipwrecks, and environmental context on the biofilms formed on the surfaces. Wood type was the most significant feature shaping bacterial composition, but not archaeal or fungal composition. Proximity to shipwrecks was also a significant influence on bacterial and archaeal composition and alpha diversity, but not on fungal communities. In all 3 domains, a peak in alpha diversity and richness was observed on pine and oak samples placed ~125 m from the shipwrecks. This peak may be evidence of an ecotone, or convergence zone, between the shipwreck influenced seabed and the surrounding seafloor. 
This study provides evidence that historic wooden shipwrecks influence microbial biofilm dispersal in the deep sea.
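For readers unfamiliar with the metrics: the alpha-diversity quantities mentioned above (observed richness and diversity indices such as Shannon's) can be computed from a per-sample abundance table as follows. The helper names and example counts are illustrative, not data from the study.

```python
import numpy as np

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over taxa with nonzero
    counts; a standard alpha-diversity metric for amplicon data."""
    counts = np.asarray(counts, dtype=float)
    counts = counts[counts > 0]
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

def richness(counts):
    """Observed richness: number of taxa present in the sample."""
    return int(np.count_nonzero(counts))

# e.g. hypothetical sequence-variant counts for one wood sample
sample = [120, 30, 5, 0, 45]
print(richness(sample), round(shannon_diversity(sample), 3))  # → 4 1.019
```

A peak in both quantities at the ~125 m stations, as reported above, means more taxa and a more even community there than closer to or farther from the wrecks.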
Training teachers for the multimedia age: developing teacher expertise to enhance online learner interaction and collaboration
This article considers the skills that enable teachers to foster interaction and collaboration in online language learning. Drawing on Hampel and Stickler's (2005) skills pyramid for online language learning and teaching, it presents the pre-service and in-service training programme that associate lecturers in the Department of Languages at the Open University undergo in the context of teaching languages with the help of online communication tools. Two projects are presented that shed more light on the expertise required to teach languages in virtual learning environments. The first project highlights the skills that are needed to teach in a complex online environment; the second, a teacher training study, aimed to examine distance teachers' experience of facilitating online group work, identify development needs, try out the potential of specific asynchronous and synchronous tools to support collaborative learning, and trial possible development activities. The paper concludes by describing the kind of training programme that tutors require in order to acquire the skills identified.