1,280 research outputs found
Is there a core set of skills for visual analysis across different imaging technologies?
This research examines the challenges posed by security imaging technologies for human visual analysis of images. Imaging technologies are increasingly becoming part of an overall security strategy that incorporates a range of camera technologies, x-ray technologies, and other electromagnetic imaging such as millimetre wave and terahertz based systems. Still and video images are increasingly presented to viewers or screeners in forms that are only representative in nature and highly abstract, and the use of filters is increasing the complexity of interpretation. Despite the range of factors being considered to enhance visual analysis, the contribution of individualised image processing skills is poorly understood and recognised. The paper explores how an assessment exercise that examines visual analysis, ScanX, correlates with performance in four major studies set in different environments and using both x-ray and CCTV technologies. Correlations show strong relationships to performance despite differences in imaging technology and environmental settings, as well as in detection targets and criteria. The research supports a set of core image analysis skills that can be applied across a range of technologies by a common operator group. These skills appear to relate more to the nature of processing applicable to various forms of image than to the image content itself.
A comprehensive evaluation of colonic mucosal isolates of Sutterella wadsworthensis from inflammatory bowel disease
Single-Scale Natural SUSY
We consider the prospects for natural SUSY models consistent with current
data. Recent constraints make the standard paradigm unnatural so we consider
what could be a minimal extension consistent with what we now know. The most
promising such scenarios extend the MSSM with new tree-level Higgs interactions
that can lift its mass to at least 125 GeV and also allow for flavor-dependent
soft terms so that the third generation squarks are lighter than current bounds
on the first and second generation squarks. We argue that a common feature of
almost all such models is the need for a new scale near 10 TeV, such as a scale
of Higgsing or confinement of a new gauge group. We consider whether
such a model can naturally derive from a single mass scale associated
with supersymmetry breaking. Most such models simply postulate new scales,
leaving their proximity to the scale of MSSM soft terms a mystery. This
coincidence problem may be thought of as a mild tuning, analogous to the usual
mu problem. We find that a single mass scale origin is challenging, but suggest
that a more natural origin for such a new dynamical scale is the gravitino
mass, m_{3/2}, in theories where the MSSM soft terms are a loop factor below
m_{3/2}. As an example, we build a variant of the NMSSM where the singlet S is
composite, and the strong dynamics leading to compositeness is triggered by
masses of order m_{3/2} for some fields. Our focus is the Higgs sector, but our
model is compatible with a light stop (with the other generation squarks heavy,
or with R-parity violation or another mechanism to hide them from current
searches). All the interesting low-energy mass scales, including linear terms
for S playing a key role in EWSB, arise dynamically from the single scale
m_{3/2}. However, numerical coefficients from RG effects and wavefunction
factors in an extra dimension complicate the otherwise simple story.
Comment: 32 pages, 3 figures; version accepted by JHEP
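An illustrative aside on the hierarchy invoked above: "MSSM soft terms a loop factor below m_{3/2}" can be written schematically (a standard relation in such mediation schemes, not a formula taken from this abstract) as

```latex
% Schematic loop-factor relation between the gravitino mass and the soft terms
m_{\mathrm{soft}} \;\sim\; \frac{g^{2}}{16\pi^{2}}\, m_{3/2}
```

so that for m_{3/2} near 10 TeV the soft terms sit roughly two orders of magnitude lower, around the weak scale, while new dynamics triggered at m_{3/2} naturally lands at the 10 TeV scale the abstract discusses.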
Excess Higgs Production in Neutralino Decays
The ATLAS and CMS experiments have recently claimed discovery of a Higgs
boson-like particle at ~5 sigma confidence and are beginning to test the
Standard Model predictions for its production and decay. In a variety of
supersymmetric models, a neutralino NLSP can decay dominantly to the Higgs and
the LSP. In natural SUSY models, a light third generation squark decaying
through this chain can lead to large excess Higgs production while evading
existing BSM searches. Such models can be observed at the 8 TeV LHC in channels
exploiting the rare diphoton decays of the Higgs produced in the cascade decay.
Identifying a diphoton resonance in association with missing energy, a lepton,
or b-tagged jets is a promising search strategy for discovery of these models,
and would immediately signal new physics involving production of a Higgs boson.
We also discuss the possibility that excess Higgs production in these SUSY
decays can be responsible for enhancements of up to 50% over the SM prediction
for the observed rate in the existing inclusive diphoton searches, a scenario
which, by the end of the 8 TeV run, would likely be accompanied by excesses in
the diphoton + lepton/MET and SUSY multi-lepton/b searches and a potential
discovery in a diphoton + 2b search.
Comment: 42 pages, 19 figures
The first legal mortgagor: a consumer without adequate protection?
This article contends that the UK government’s attempt to create a well-functioning consumer credit market will be undermined if it fails to reform the private law framework relating to the first legal mortgage. Such agreements are governed by two distinct regulatory regimes that are founded upon very different conceptions of the mortgagor. The first, the regulation of financial services overseen by the Financial Conduct Authority, derives from public law and is founded upon a conception of the mortgagor as “consumer”. The other is land law, private law regulation implemented by the judiciary and underpinned by a conception of the mortgagor as “landowner”. Evidence suggests that the operation of these two regimes prevents mortgagors from receiving fair and consistent treatment. The current reform of financial services regulation will therefore change only one part of this governance regime and will leave mortgagors heavily reliant upon a regulator that has still to prove itself. This article argues that reform of the rules of private law must also be undertaken, with the aim of initiating a paradigm shift in the conception of the mortgagor from “landowner” to “consumer”. Cultural shifts of this kind take time, but the hope is that this conceptual transformation will occur in time to deter the predicted rise in mortgage possessions.
X-Ray Spectroscopy of Stars
(abridged) Non-degenerate stars of essentially all spectral classes are soft
X-ray sources. Low-mass stars on the cooler part of the main sequence and their
pre-main sequence predecessors define the dominant stellar population in the
galaxy by number. Their X-ray spectra are reminiscent, in the broadest sense,
of X-ray spectra from the solar corona. X-ray emission from cool stars is
indeed ascribed to magnetically trapped hot gas analogous to the solar coronal
plasma. Coronal structure, its thermal stratification and geometric extent can
be interpreted based on various spectral diagnostics. New features have been
identified in pre-main sequence stars; some of these may be related to
accretion shocks on the stellar surface, fluorescence on circumstellar disks
due to X-ray irradiation, or shock heating in stellar outflows. Massive, hot
stars clearly dominate the interaction with the galactic interstellar medium:
they are the main sources of ionizing radiation, mechanical energy and chemical
enrichment in galaxies. High-energy emission makes it possible to probe some
of the most important processes at work in these stars and to place constraints on their most
peculiar feature: the stellar wind. Here, we review recent advances in our
understanding of cool and hot stars through the study of X-ray spectra, in
particular high-resolution spectra now available from XMM-Newton and Chandra.
We address issues related to coronal structure, flares, the composition of
coronal plasma, X-ray production in accretion streams and outflows, X-rays from
single OB-type stars, massive binaries, magnetic hot objects and evolved WR
stars.
Comment: accepted for Astron. Astrophys. Rev., 98 journal pages, 30 figures (partly multiple); some corrections made after proof stage
Assessing the cost of global biodiversity and conservation knowledge
Knowledge products comprise assessments of authoritative information supported by standards, governance, quality control, data, tools, and capacity building mechanisms. Considerable resources are dedicated to developing and maintaining knowledge products for biodiversity conservation, and they are widely used to inform policy and advise decision makers and practitioners. However, the financial cost of delivering this information is largely undocumented. We evaluated the costs and funding sources for developing and maintaining four global biodiversity and conservation knowledge products: The IUCN Red List of Threatened Species, the IUCN Red List of Ecosystems, Protected Planet, and the World Database of Key Biodiversity Areas. These are secondary data sets, built on primary data collected by extensive networks of expert contributors worldwide. We estimate that US$116–204 million, plus 293 person-years of volunteer time (range: 278–308 person-years) valued at US$12–16 million, were invested in these four knowledge products between 1979 and 2013. More than half of this financing was provided through philanthropy, and nearly three-quarters was spent on personnel costs. The estimated annual cost of maintaining data and platforms for three of these knowledge products (excluding the IUCN Red List of Ecosystems, for which annual costs were not possible to estimate for 2013) is US$6.2–6.7 million. We estimated that an additional US$12 million would be needed. These costs are much lower than those to maintain many other, similarly important, global knowledge products. Ensuring that biodiversity and conservation knowledge products are sufficiently up to date, comprehensive and accurate is fundamental to inform decision-making for biodiversity conservation and sustainable development. Thus, the development and implementation of plans for sustainable long-term financing for them is critical.
What is the 'problem' that outreach work seeks to address and how might it be tackled? Seeking theory in a primary health prevention programme
Background: Preventive approaches to health are disproportionately accessed by the more affluent, and recent health improvement policy advocates the use of targeted preventive primary care to reduce risk factors in poorer individuals and communities. Outreach has become part of the health service response. Outreach has a long history of engaging those who do not otherwise access services. It has, however, been described as eclectic in its purpose, clientele and mode of practice, and its effectiveness is unproven. Using a primary prevention programme in the UK as a case, this paper addresses two research questions: what are the perceived problems of non-engagement that outreach aims to address; and what specific mechanisms of outreach are hypothesised to tackle these?
Methods: Drawing on a wider programme evaluation, the study undertook qualitative interviews with strategically selected health-care professionals. The analysis was thematically guided by the concept of 'candidacy', which theorises the dynamic process through which services and individuals negotiate appropriate service use.
Results: The study identified seven types of engagement 'problem' and corresponding solutions. These 'problems' lie on a continuum of complexity in terms of the challenges they present to primary care. Reasons for non-engagement are congruent with the concept of 'candidacy' but point to ways in which it can be expanded.
Conclusions: The paper draws conclusions about the role of outreach in contributing to the implementation of inequalities-focused primary prevention and identifies further research needed in the theoretical development of both outreach as an approach and candidacy as a conceptual framework.
Internet-based medical education: a realist review of what works, for whom and in what circumstances
Perception of limb orientation in the vertical plane depends on center of mass rather than inertial eigenvectors
We performed two experiments to test the hypothesis that the perception of limb orientation depends on inertial eigenvectors…