Cultivating knowledge sharing through the relationship management maturity model
Purpose - The purpose of this paper is to present the development of the relationship management maturity model (RMMM), the output of an initiative aimed at bridging the gap between business units and the IT organisation. It does this by improving and assessing knowledge sharing between business and IT staff in Finco, a large financial services organisation.
Design/methodology/approach - The objectives were achieved by undertaking ethnographic research with the relationship managers (RMs) as they carried out their activities, and by developing the RMMM to visualise the development of a community of practice (CoP) between business and IT.
Findings - The RMMM demonstrates a learning mechanism for bridging the business/IT gap through an interpretive approach to knowledge sharing: it defines knowledge sharing processes between business and IT, and defines the tasks of the relationship managers as facilitators of knowledge sharing.
Research limitations/implications - More research is necessary to determine whether the RMMM is a useful tool on which Finco can base the development of RM over the next few years.
Practical implications - The RMMM acts as a practical knowledge management tool, and will serve as a future reference for the RMs as they attempt to further develop the business/IT relationship.
Originality/value - The findings provide an initial endorsement of the knowledge sharing perspective as a way to understand the business/IT relationship. The RMMM can also be used to identify problematic issues and develop processes to address them.
Reviewing and extending the five-user assumption: A grounded procedure for interaction evaluation
© ACM, 2013. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in ACM Transactions on Computer-Human Interaction (TOCHI), Vol. 20, Iss. 5 (November 2013), http://doi.acm.org/10.1145/2506210

The debate concerning how many participants constitute a sufficient number for interaction testing is
well-established and long-running, with prominent contributions arguing that five users provide a good
benchmark when seeking to discover interaction problems. We argue that adoption of five users in this
context is often done with little understanding of the basis for, or implications of, the decision. We present
an analysis of relevant research to clarify the meaning of the five-user assumption and to examine the
way in which the original research that suggested it has been applied. This includes its blind adoption and
application in some studies, and complaints about its inadequacies in others. We argue that the five-user
assumption is often misunderstood, not only in the field of Human-Computer Interaction, but also in fields
such as medical device design, and in business and information applications. The analysis that we present
allows us to define a systematic approach for monitoring the sample discovery likelihood, in formative and
summative evaluations, and for gathering information in order to make critical decisions during the
interaction testing, while respecting the aim of the evaluation and the allotted budget. This approach, which
we call the 'Grounded Procedure', is introduced and its value argued.

The MATCH programme (EPSRC Grants: EP/F063822/1, EP/G012393/1)
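The numerical basis for the five-user benchmark is the cumulative discovery model: the probability that a problem with per-user detection probability p is seen at least once among n users is 1 - (1 - p)^n. A minimal sketch (the value p = 0.31 is the commonly cited average from early problem-discovery studies, used here purely for illustration):

```python
def discovery_likelihood(p: float, n: int) -> float:
    """Probability that a problem with per-user detection
    probability p is found at least once by n test users."""
    return 1.0 - (1.0 - p) ** n

# With the commonly cited average p ~ 0.31, five users find ~84% of problems...
print(round(discovery_likelihood(0.31, 5), 2))   # -> 0.84
# ...but for a rarer problem (p = 0.10) five users find well under half.
print(round(discovery_likelihood(0.10, 5), 2))   # -> 0.41
```

This is exactly the quantity the abstract calls the "sample discovery likelihood": it depends strongly on p, which is why a fixed n = 5 can be adequate for frequent problems yet inadequate for rare ones.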
Characterization of high purity germanium point contact detectors with low net impurity concentration
High-purity germanium point-contact detectors have low energy thresholds and excellent energy resolution over a wide energy range, and are thus widely used in nuclear and particle physics. In rare event searches, such as for neutrinoless double beta decay, the point-contact geometry is of particular importance since it allows for pulse-shape discrimination, and therefore for a significant background reduction. In this paper we investigate the pulse-shape discrimination performance of ultra-high purity germanium point-contact detectors. It is demonstrated that a minimal net impurity concentration is required to meet the pulse-shape performance requirements.
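Pulse-shape discrimination in point-contact detectors is commonly based on the A/E parameter: the maximum current amplitude A divided by the event energy E. Multi-site (background-like) events deposit charge in several separated steps, so their peak current is lower for the same total energy. A toy illustration with synthetic waveforms (not detector data):

```python
def a_over_e(waveform):
    """A/E pulse-shape parameter from a charge waveform:
    A = maximum current (derivative of charge), E = total charge."""
    current = [b - a for a, b in zip(waveform, waveform[1:])]
    E = waveform[-1]   # total collected charge (energy proxy)
    A = max(current)   # maximum current amplitude
    return A / E

# Toy single-site event: one fast charge-collection step.
single = [0.0] * 10 + [i / 5 for i in range(1, 6)] + [1.0] * 10
# Toy multi-site event: same total energy split into two separated steps.
multi = ([0.0] * 5 + [i / 10 for i in range(1, 6)] + [0.5] * 5
         + [0.5 + i / 10 for i in range(1, 6)] + [1.0] * 5)

print(a_over_e(single) > a_over_e(multi))  # -> True
```

Events with A/E below a calibrated cut are rejected as multi-site, which is the background-reduction mechanism the abstract refers to.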
The treatment of the infrared region in perturbative QCD
We discuss the contribution coming from the infrared region to NLO matrix
elements and/or coefficient functions of hard QCD processes. Strictly speaking,
this contribution is not known theoretically, since it is beyond perturbative
QCD. For DGLAP evolution all the infrared contributions are collected in the
phenomenological input parton distribution functions (PDFs), at some relatively
low scale Q_0; these input functions are obtained from a fit to the 'global' data.
However, dimensional regularization sometimes produces a non-zero result coming
from the infrared region. Instead of this conventional regularization
treatment, we argue that the proper procedure is to first subtract from the NLO
matrix element the contribution already generated at the same order in \alpha_s
by the LO DGLAP splitting function convoluted with the LO matrix element. This
prescription eliminates the logarithmic infrared divergence, giving a
well-defined result which is consistent with the original idea that everything
below Q_0 is collected in the PDF input. We quantify the difference between the
proposed treatment and the conventional approach using low-mass Drell-Yan
production and deep inelastic electron-proton scattering as examples; and
discuss the potential impact on the 'global' PDF analyses. We present arguments
to show that the difference cannot be regarded as simply the use of an
alternative factorization scheme.

Comment: 15 pages, 5 figures; title changed, text considerably modified to improve presentation, and discussion section enlarged
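Schematically (our notation, not necessarily the paper's exact expressions), the proposed prescription subtracts from the NLO coefficient function the piece already generated at order \(\alpha_s\) by LO DGLAP evolution acting on the LO term:

```latex
C^{\rm NLO}_{\rm sub}(x) \;=\; C^{\rm NLO}(x)
\;-\;\frac{\alpha_s}{2\pi}\,
\Big[P^{\rm LO}\otimes C^{\rm LO}\Big](x)\;L(Q_0),
```

where \(\otimes\) denotes the usual convolution in momentum fraction and \(L(Q_0)\) stands for the logarithm generated by transverse momenta below the input scale \(Q_0\). Since that region is, by construction, already contained in the input PDFs, the subtraction removes the double counting and the associated infrared logarithm.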
Meaningful characterisation of perturbative theoretical uncertainties
We consider the problem of assigning a meaningful degree of belief to
uncertainty estimates of perturbative series. We analyse the assumptions which
are implicit in the conventional estimates made using renormalisation scale
variations. We then formulate a Bayesian model that, given equivalent initial
hypotheses, allows one to characterise a perturbative theoretical uncertainty
in a rigorous way in terms of a credibility interval for the remainder of the
series. We compare its outcome to the conventional uncertainty estimates in the
simple case of the calculation of QCD corrections to the e+e- -> hadrons
process. We find comparable results, but with important conceptual differences.
This work represents a first step in the direction of a more comprehensive and
rigorous handling of theoretical uncertainties in perturbative calculations
used in high energy phenomenology.

Comment: 28 pages, 5 figures. Language modified in order to make it more 'Bayesian'. No change in results. Version published in JHEP
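The flavour of such a Bayesian model can be illustrated with a simplified sketch (in the spirit of the approach described, not the paper's exact formulation; the flat priors and the closed-form interval below are assumptions of this toy version): given n known series coefficients with largest magnitude c-bar, one assigns a credibility-p interval for the next, unknown coefficient.

```python
def credible_halfwidth(coeffs, p):
    """Half-width of the smallest interval believed, with degree of
    belief p, to contain the next unknown series coefficient.
    Toy model with flat priors on the coefficient magnitude."""
    n = len(coeffs)
    cbar = max(abs(c) for c in coeffs)
    if p <= n / (n + 1):
        # interval still inside [-cbar, cbar]
        return cbar * (n + 1) / n * p
    # tail region: interval grows beyond cbar as p -> 1
    return cbar * ((n + 1) * (1.0 - p)) ** (-1.0 / n)

# With three known coefficients, the degree of belief that the next
# one lies within the largest observed magnitude is n/(n+1) = 75%.
coeffs = [1.0, 0.4, 0.7]
print(credible_halfwidth(coeffs, 0.75))  # -> 1.0
```

The qualitative point matches the abstract: unlike a scale variation, the interval carries an explicit degree of belief, and it widens in a controlled way as one demands higher credibility.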
The inevitable QSAR renaissance
QSAR approaches, including recent advances in 3D-QSAR, are advantageous during the lead optimization phase of drug discovery and complementary to bioinformatics and growing data accessibility. Hints for future QSAR practitioners are also offered.
Theoretical Uncertainties in Electroweak Boson Production Cross Sections at 7, 10, and 14 TeV at the LHC
We present an updated study of the systematic errors in the measurements of
the electroweak boson cross-sections at the LHC for various experimental cuts
for a center of mass energy of 7, 10 and 14 TeV. The size of both electroweak
and NNLO QCD contributions are estimated, together with the systematic error
from the parton distributions. The effects of new versions of the MSTW, CTEQ,
and NNPDF PDFs are considered.

Comment: PDFLatex with JHEP3.cls. 22 pages, 43 figures. Version 2 adds the CT10W PDF set to the analysis and updates the final systematic error table and conclusions, plus several citations and minor wording changes. Version 3 adds some references on electroweak and mixed QED/QCD corrections. Version 4 adds more references and acknowledgements
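The two PDF-uncertainty conventions relevant here can be sketched with toy numbers (the eigenvector and replica cross sections below are hypothetical; in practice they come from the actual PDF sets): Hessian sets such as MSTW and CTEQ combine paired eigenvector variations, while NNPDF takes the standard deviation over Monte Carlo replicas.

```python
import math
import statistics

def hessian_symmetric(x_plus, x_minus):
    """Symmetric Hessian PDF uncertainty:
    dX = (1/2) * sqrt( sum_i (X_i+ - X_i-)^2 )
    over eigenvector pairs i."""
    return 0.5 * math.sqrt(sum((p - m) ** 2 for p, m in zip(x_plus, x_minus)))

def mc_replica(x_replicas):
    """NNPDF-style uncertainty: standard deviation over replicas."""
    return statistics.stdev(x_replicas)

# Hypothetical cross sections (nb) for three eigenvector pairs:
plus = [10.2, 10.1, 10.3]
minus = [9.8, 9.9, 9.7]
print(round(hessian_symmetric(plus, minus), 3))  # -> 0.374

# Hypothetical replica cross sections:
print(mc_replica([10.0, 10.2, 9.8, 10.1]))
```

The same observable evaluated with both conventions gives directly comparable error bands, which is what allows the different PDF sets to be contrasted in one systematic-error table.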
Fitting Parton Distribution Data with Multiplicative Normalization Uncertainties
We consider the generic problem of performing a global fit to many
independent data sets each with a different overall multiplicative
normalization uncertainty. We show that the methods in common use to treat
multiplicative uncertainties lead to systematic biases. We develop a method
which is unbiased, based on a self-consistent iterative procedure. We
demonstrate the use of this method by applying it to the determination of
parton distribution functions with the NNPDF methodology, which uses a Monte
Carlo method for uncertainty estimation.

Comment: 33 pages, 5 figures: published version
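The bias in question can be seen in a two-point toy fit along the lines of d'Agostini's classic example (the numbers below are illustrative): putting the multiplicative normalization uncertainty into the covariance matrix as s^2 * y_i * y_j terms pulls the naive chi-square best fit below both measurements.

```python
def naive_fit(y1, y2, s1, s2, norm):
    """Best fit of a constant to two measurements with statistical
    errors s1, s2 and a fully correlated multiplicative normalization
    uncertainty `norm`, using the naive covariance-matrix chi^2."""
    # covariance: V_ij = delta_ij * s_i^2 + norm^2 * y_i * y_j
    v11 = s1 ** 2 + norm ** 2 * y1 * y1
    v22 = s2 ** 2 + norm ** 2 * y2 * y2
    v12 = norm ** 2 * y1 * y2
    # analytic minimum of chi^2 = (y - lam)^T V^-1 (y - lam);
    # the determinant cancels in the ratio
    num = (v22 - v12) * y1 + (v11 - v12) * y2
    den = v11 + v22 - 2.0 * v12
    return num / den

# Two 2%-accurate measurements of the same quantity with a common
# 10% normalization uncertainty: the naive fit lands below BOTH.
lam = naive_fit(8.0, 8.5, 0.16, 0.17, 0.10)
print(round(lam, 2))  # -> 7.87
```

An unbiased treatment, such as the iterative procedure the abstract describes, avoids this downward pull by not letting the measured central values set the size of their own normalization error.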
Subtle changes in the flavour and texture of a drink enhance expectations of satiety
Background: The consumption of liquid calories has been implicated in the development of obesity and weight gain. Energy-containing drinks are often reported to have a weak satiety value: one explanation for this is that because of their fluid texture they are not expected to have much nutritional value. It is important to consider what features of these drinks can be manipulated to enhance their expected satiety value. Two studies investigated the perception of subtle changes in a drink’s viscosity, and the extent to which thick texture and creamy flavour contribute to the generation of satiety expectations. Participants in the first study rated the sensory characteristics of 16 fruit yogurt drinks of increasing viscosity. In study two, a new set of participants evaluated eight versions of the fruit yogurt drink, which varied in thick texture, creamy flavour and energy content, for sensory and hedonic characteristics and satiety expectations.
Results: In study one, participants were able to perceive small changes in drink viscosity that were strongly related to the actual viscosity of the drinks. In study two, the thick versions of the drink were expected to be more filling and have a greater expected satiety value, independent of the drink’s actual energy content. A creamy flavour enhanced the extent to which the drink was expected to be filling, but did not affect its expected satiety.
Conclusions: These results indicate that subtle manipulations of texture and creamy flavour can increase expectations that a fruit yogurt drink will be filling and suppress hunger, irrespective of the drink's energy content. A thicker texture enhanced expectations of satiety to a greater extent than a creamier flavour, and may be one way to improve the anticipated satiating value of energy-containing beverages.
When Anomaly Mediation is UV Sensitive
Despite its successes, such as solving the supersymmetric flavor
problem, anomaly mediated supersymmetry breaking is untenable because of its
prediction of tachyonic sleptons. An appealing solution to this problem was
proposed by Pomarol and Rattazzi where a threshold controlled by a light field
deflects the anomaly mediated supersymmetry breaking trajectory, thus evading
tachyonic sleptons. In this paper we examine an alternate class of deflection
models where the non-supersymmetric threshold is accompanied by a heavy,
instead of light, singlet. The low energy form of this model is the so-called
extended anomaly mediation proposed by Nelson and Weiner, but with potential
for a much higher deflection threshold. The existence of this high deflection
threshold implies that the space of deflected anomaly mediated supersymmetry breaking models is larger than previously thought.

Comment: 14 pages, 1 figure (version to appear in JHEP)