75,679 research outputs found
Measurement of the b-jet cross-section with associated vector boson production with the ATLAS experiment at the LHC
A measurement of the cross-section for vector boson production in association
with jets containing b-hadrons is presented using 35 pb-1 of data from the LHC
collected by the ATLAS experiment in 2010. Such processes are not only
important tests of pQCD but also large, irreducible backgrounds to searches
such as that for a low-mass Higgs boson decaying to pairs of b-quarks, where
the Higgs is produced in association with a vector boson. Theoretical predictions of the V+b
production rate have large uncertainties and previous measurements have
reported discrepancies. Cross-sections measured in the electron and muon
channels are shown, and comparisons are made to recent theoretical
predictions at next-to-leading order in alpha_s.
Comment: Presented at the 2011 Hadron Collider Physics Symposium (HCP-2011), Paris, France, November 14-18, 2011; 3 pages, 6 figures.
What medium? What message? Smoking education for teenagers
It would seem fairly safe to say that an important purpose of schools is to transmit messages; by their very nature they are in an advantageous position to do this. Schools have captive audiences, as Dreeben (1970) states, though he is careful to point out that the children may not in all cases be an audience of captives; yet most could be classed as 'victims of institutionalised education' (Gammage 1982). Many secondary teachers in particular may well feel that they have much in common with prison warders for, after all, apart from prisons, which have a selective intake, schools are the only institutions where all individuals are compulsorily incarcerated for part of their lives - an estimated 15,000 hours in the United Kingdom. The extent to which schools function in the transmission of messages in the broadest sense has been the subject of much discussion, particularly in the last two decades, and in spite of the gloomy picture that emerged from the Coleman report (1966), subsequent findings have been much more optimistic. Schools DO make a difference.
Vector quantization
During the past ten years Vector Quantization (VQ) has developed from a theoretical possibility promised by Shannon's source coding theorems into a powerful and competitive technique for speech and image coding and compression at medium to low bit rates. In this survey, the basic ideas behind the design of vector quantizers are sketched and some comments are made on the state of the art and current research efforts.
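The basic design idea the survey refers to can be illustrated with a minimal sketch (a hypothetical example, not code from the survey): a codebook of representative vectors, nearest-neighbour encoding, and a Lloyd (k-means-style) iteration to refine the codebook.

```python
# Minimal vector-quantization sketch: nearest-neighbour encoding against a
# small codebook, plus one-step Lloyd refinement. Illustrative only.
import numpy as np

def encode(vectors, codebook):
    """Return the index of the nearest codeword for each input vector."""
    # Squared Euclidean distance from every vector to every codeword.
    d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

def lloyd_update(vectors, codebook):
    """One Lloyd iteration: move each codeword to the centroid of its cell."""
    idx = encode(vectors, codebook)
    new = codebook.copy()
    for k in range(len(codebook)):
        cell = vectors[idx == k]
        if len(cell):
            new[k] = cell.mean(axis=0)
    return new

rng = np.random.default_rng(0)
data = rng.normal(size=(256, 2))
codebook = data[:4].copy()        # initialise with the first 4 data vectors
for _ in range(10):
    codebook = lloyd_update(data, codebook)
indices = encode(data, codebook)  # each 2-D vector reduced to a 2-bit index
```

The compression comes from transmitting only the codeword index (here 2 bits) instead of the vector itself; the decoder holds the same codebook and looks the vector back up.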
Strong latitudinal shear in the shallow convection zone of a rapidly rotating A-star
We have derived the mean broadening profile of the star V102 in the region of
the open cluster IC4665 from high resolution spectroscopy. At a projected
equatorial rotation velocity of v sin i = (105 +- 12) km/s we find strong
deviation from classical rotation. We discuss several scenarios, the most
plausible being strong differential rotation in latitudinal direction. For this
scenario we find a difference in angular velocity of DeltaOmega = 3.6 +- 0.8
rad/d (DeltaOmega/Omega = 0.42 +- 0.09). From the Halpha line we derive a
spectral type of A9 and support photometric measurements classifying IC4665
V102 as a non-member of IC4665. At such early spectral type this is the
strongest case of differential rotation observed so far. Together with three
similar stars, IC4665 V102 seems to form a new class of objects that exhibit
extreme latitudinal shear in a very shallow convective envelope.
Comment: accepted for A&A Letters
A New Role for Conservation in U.S. Farm Policy, Conservation Operations: USDA's Challenge to Make It Work
Agricultural and Food Policy; Land Economics/Use
Variable dimension weighted universal vector quantization and noiseless coding
A new algorithm for variable dimension weighted universal coding is introduced. Combining the multi-codebook system of weighted universal vector quantization (WUVQ), the partitioning technique of variable dimension vector quantization, and the optimal design strategy common to both, variable dimension WUVQ allows mixture sources to be effectively carved into their component subsources, each of which can then be encoded with the codebook best matched to that source. Application of variable dimension WUVQ to a sequence of medical images provides up to 4.8 dB improvement in signal-to-quantization-noise ratio over WUVQ and up to 11 dB improvement over a standard full-search vector quantizer followed by an entropy code. The optimal partitioning technique can likewise be applied with a collection of noiseless codes, as found in weighted universal noiseless coding (WUNC). The resulting algorithm for variable dimension WUNC is also described.
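The multi-codebook idea can be sketched in simplified form (fixed block dimension, distortion-only selection; the names and data here are illustrative assumptions, not the paper's method): each block is quantized against every codebook in a collection, and the codebook whose best codeword gives the lowest distortion is chosen, so blocks from different subsources naturally end up in different codebooks.

```python
# Simplified multi-codebook selection: per block, pick the codebook that
# matches it best. Illustrative sketch, not the WUVQ algorithm itself
# (which also weighs the rate cost of signalling the codebook choice).
import numpy as np

def quantize(block, codebook):
    """Nearest codeword in one codebook; return (distortion, codeword index)."""
    d = ((codebook - block) ** 2).sum(axis=1)
    k = int(d.argmin())
    return float(d[k]), k

def encode_block(block, codebooks):
    """Pick the codebook whose best codeword gives the lowest distortion."""
    best = min(
        (quantize(block, cb) + (i,) for i, cb in enumerate(codebooks)),
        key=lambda t: t[0],
    )
    _, codeword, which = best
    return which, codeword

codebooks = [
    np.array([[0.0, 0.0], [1.0, 1.0]]),     # tuned to a low-mean subsource
    np.array([[10.0, 10.0], [11.0, 11.0]]), # tuned to a high-mean subsource
]
which, codeword = encode_block(np.array([10.2, 10.1]), codebooks)
```

The encoder transmits both the codebook index and the codeword index; in the full scheme that side information cost is part of the optimization rather than ignored as it is here.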
Toxics Use Reduction: Pro and Con
With the Massachusetts Toxics Use Reduction Act as an example, important issues related to the goals and effectiveness of TUR are examined. The benefits as claimed by proponents are contrasted with shortcomings outlined by opponents in point-counterpoint style. Ultimately, the authors call for more balanced analysis.
One-pass adaptive universal vector quantization
The authors introduce a one-pass adaptive universal quantization technique for real, bounded alphabet, stationary sources. The algorithm operates online, without any prior knowledge of the statistics of the sources it might encounter, and asymptotically achieves ideal performance on all sources that it sees. The system consists of an encoder and a decoder. At increasing intervals, the encoder refines its codebook using knowledge about incoming data symbols. This codebook is then described to the decoder in the form of updates on the previous codebook. The accuracy to which the codebook is described increases as the number of symbols seen, and thus the accuracy to which the codebook is known, grows.
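The encode-then-refine loop described above can be sketched as follows (a toy illustration under assumed details: a geometric update schedule and a single Lloyd-style refit, not the paper's actual update rule or codebook-description mechanism):

```python
# Toy one-pass adaptive VQ encoder: quantise each sample with the current
# codebook, and at geometrically increasing intervals refit the codebook to
# all data seen so far. In the real scheme the refit is described to the
# decoder as an update on the previous codebook; that step is omitted here.
import numpy as np

class AdaptiveVQEncoder:
    def __init__(self, codebook):
        self.codebook = codebook.astype(float)
        self.seen = []
        self.next_update = 8          # first refit after 8 samples

    def encode(self, x):
        self.seen.append(x)
        # Nearest-codeword index with the codebook as currently known.
        idx = int(((self.codebook - x) ** 2).sum(axis=1).argmin())
        if len(self.seen) >= self.next_update:
            self._refit()
            self.next_update *= 2     # increasing intervals between updates
        return idx

    def _refit(self):
        data = np.array(self.seen)
        assign = ((data[:, None] - self.codebook[None]) ** 2).sum(-1).argmin(1)
        for k in range(len(self.codebook)):
            cell = data[assign == k]
            if len(cell):
                self.codebook[k] = cell.mean(axis=0)

enc = AdaptiveVQEncoder(np.array([[0.0], [5.0]]))
idx = [enc.encode(np.array([v])) for v in [0.1, 0.2, 5.1, 4.9] * 4]
```

After the refits, the two codewords have drifted from their initial positions (0 and 5) toward the empirical cluster means (0.15 and 5.0), which is the sense in which the codebook adapts to the source actually seen.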
Rates of convergence in adaptive universal vector quantization
We consider the problem of adaptive universal quantization. By adaptive quantization we mean quantization for which the delay associated with encoding the jth sample in a sequence of length n is bounded for all n > j. We demonstrate the existence of an adaptive universal quantization algorithm for which any weighted sum of the rate and the expected mean-square error converges almost surely and in expectation as O(√(log log n / log n)) to the corresponding weighted sum of the rate and the distortion-rate function at that rate.
A boundary element regularised Stokeslet method applied to cilia and flagella-driven flow
A boundary element implementation of the regularised Stokeslet method of
Cortez is applied to cilia and flagella-driven flows in biology.
Previously-published approaches implicitly combine the force discretisation and
the numerical quadrature used to evaluate boundary integrals. By contrast, a
boundary element method can be implemented by discretising the force using
basis functions, and calculating integrals using accurate numerical or analytic
integration. This substantially weakens the coupling of the mesh size for the
force and the regularisation parameter, and greatly reduces the number of
degrees of freedom required. When modelling a cilium or flagellum as a
one-dimensional filament, the regularisation parameter can be considered a
proxy for the body radius, as opposed to being a parameter used to minimise
numerical errors. Modelling a patch of cilia, it is found that: (1) For a fixed
number of cilia, reducing cilia spacing reduces transport. (2) For fixed patch
dimension, increasing cilia number increases the transport, up to a plateau
beyond a certain cilia number. Modelling a choanoflagellate cell, it is found that the
presence of a lorica structure significantly affects transport and flow outside
the lorica, but does not significantly alter the force experienced by the
flagellum.
Comment: 20 pages, 7 figures, postprint