4,525 research outputs found

    The Impact of Interface Quality on Trust in Web Retailers

    Web retailing is expected to grow at aggressive rates in the coming years. One of the most important factors slowing this growth is potential customers' lack of trust. As transactions over the internet develop and mature, success will therefore largely depend on gaining and maintaining this trust. It has been suggested that the quality of a Web site's user interface is a determinant of the initial establishment of trust. In this article, we describe a study in which 66 subjects were asked to perform a predefined book-purchasing task on a series of sites with varying interface quality. We found a strong relationship between interface quality and trust. We also found some components of user interface quality to be more important than others, and we discuss the implications for Web site design. [French abstract, translated:] The lack of trust is one of the most significant barriers to the adoption and growth of electronic commerce. Focusing on retail e-commerce, this work presents a model for analyzing the development of consumer trust as a function of the consumer's own characteristics (their propensity to trust) and of their perception of certain characteristics of the merchant, namely the merchant's integrity, ability, and benevolence. The usability of the graphical interface was retained as the key factor in the perception of these characteristics. The role of each of the dimensions making up usability was therefore studied and related to the trust developed by the consumer.
    Keywords: user interface, laboratory experiment, trust, usability, B2C, electronic retailing, web design
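    The reported "strong relationship" is, at its simplest, a correlation between per-subject interface-quality and trust ratings. The snippet below is a minimal sketch under that assumption, with invented numbers on an invented rating scale; it is not the study's data or analysis code.

```python
# Minimal sketch: correlating interface-quality ratings with trust ratings.
# Hypothetical data layout and values; the study's real measures may differ.
from statistics import correlation  # Python 3.10+

# One usability score and one trust score per subject (illustrative only).
usability = [3.2, 4.1, 2.5, 4.8, 3.9, 2.2, 4.5]
trust     = [3.0, 4.3, 2.8, 4.6, 4.0, 2.1, 4.2]

r = correlation(usability, trust)  # Pearson's r
print(f"Pearson correlation between usability and trust: r = {r:.2f}")
```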

    Lensing and time-delay contributions to galaxy correlations

    Galaxy clustering on very large scales can be probed via the 2-point correlation function in the general case of wide and deep separations, including all the lightcone and relativistic effects. Using our recently developed formalism, we analyze the behavior of the local and integrated contributions and how these depend on redshift range, linear and angular separations, and the luminosity function. Relativistic corrections to the local part of the correlation can be non-negligible, but they remain generally sub-dominant. On the other hand, the additional correlations arising from lensing convergence and time-delay effects can become very important and even dominate the observed total correlation function. We investigate different configurations formed by the observer and the pair of galaxies, and we find that these effects are most important for near-radial large-scale separations.
    Comment: 13 pages, 11 figures; minor changes. Version accepted by GR
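    Schematically, the number-count fluctuation entering the correlation can be split into local terms and line-of-sight integrated (lensing convergence and time-delay) terms, which is what the abstract's "local and integrated contributions" refer to. The notation below is a generic sketch, not the paper's formalism:

```latex
% Generic sketch: split the observed fluctuation into local and integrated parts.
\Delta(\mathbf{n}, z) = \Delta^{\mathrm{loc}}(\mathbf{n}, z) + \Delta^{\mathrm{int}}(\mathbf{n}, z)
% The observed 2-point correlation then contains four types of terms:
\xi = \langle \Delta(\mathbf{n}_1, z_1)\, \Delta(\mathbf{n}_2, z_2) \rangle
    = \xi^{\mathrm{loc,loc}} + \xi^{\mathrm{loc,int}} + \xi^{\mathrm{int,loc}} + \xi^{\mathrm{int,int}}
% The abstract's claim is that the integrated (lensing, time-delay) terms
% can dominate for near-radial, large-scale separations.
```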

    Shared Intentions, Loose Groups and Pooled Knowledge

    We study shared intentions in what we call “loose groups”. These are groups that lack a codified organizational structure, and where the communication channels between group members are either unreliable or not completely open. We start by formulating two desiderata for shared intentions in such groups. We then argue that no existing account meets these two desiderata, because each assumes either too strong or too weak an epistemic condition, that is, a condition on what the group members know and believe about what the others intend, know, and believe. We propose an alternative, pooled knowledge, and argue that it allows us to formulate conditions on shared intentions that meet the two desiderata.
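    Pooled knowledge is reminiscent of distributed knowledge in epistemic logic, where a group counts as knowing whatever follows from combining its members' information. The sketch below illustrates that pooling idea in a toy Kripke-style model by intersecting agents' indistinguishability cells; it is an assumed stand-in, not the authors' formal definition.

```python
# Toy Kripke-style model: each agent's knowledge is a partition of possible
# worlds; pooling the agents' information intersects their partition cells.
# Distributed-knowledge-style pooling is assumed here for illustration only.

def cell(partition, world):
    """Return the block of `partition` containing `world`."""
    return next(block for block in partition if world in block)

def pooled_cell(partitions, world):
    """Worlds the group cannot rule out once all members' information is pooled."""
    worlds = cell(partitions[0], world)
    for p in partitions[1:]:
        worlds = worlds & cell(p, world)
    return worlds

def pooled_knows(partitions, world, proposition):
    """The group has pooled knowledge of `proposition` iff it holds on the pooled cell."""
    return pooled_cell(partitions, world) <= proposition

# Worlds 0..3; agent A distinguishes {0,1} from {2,3}; agent B distinguishes
# {0,2} from {1,3}. Neither alone pins down the actual world, but pooled they do.
A = [{0, 1}, {2, 3}]
B = [{0, 2}, {1, 3}]
actual = 0
print(pooled_knows([A, B], actual, {0}))  # True: the pooled cell is {0}
```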

    Observing transiting planets with JWST -- Prime targets and their synthetic spectral observations

    The James Webb Space Telescope will enable astronomers to obtain exoplanet spectra of unprecedented precision. The MIRI instrument in particular may shed light on the nature of the cloud particles obscuring planetary transmission spectra in the optical and near-infrared. We provide self-consistent atmospheric models and synthetic JWST observations for prime exoplanet targets in order to identify spectral regions of interest and estimate the number of transits needed to distinguish between model setups. We select targets that span a wide range in planetary temperature and surface gravity, ranging from super-Earths to giant planets, and that have a high expected SNR. For all targets we vary the enrichment, the C/O ratio, the presence of optical absorbers (TiO/VO), and the cloud treatment. We calculate atmospheric structures and emission and transmission spectra for all targets and use a radiometric model to obtain simulated observations. We analyze JWST's ability to distinguish between various scenarios. We find that in very cloudy planets such as GJ 1214b, fewer than 10 transits with NIRSpec may be enough to reveal molecular features. Further, the presence of small silicate grains in the atmospheres of hot Jupiters may be detectable with a single JWST MIRI transit. For a more detailed characterization of such particles, fewer than 10 transits are necessary. Finally, we find that some of the hottest hot Jupiters are well fitted by models which neglect the redistribution of the insolation and harbor inversions, and that 1-4 eclipse measurements with NIRSpec are needed to distinguish between the inversion models. We thus demonstrate the capabilities of JWST for solving some of the most intriguing puzzles in current exoplanet atmospheric research. Further, by publishing all models calculated for this study, we enable the community to carry out similar, or retrieval, analyses for all planets included in our target list.
    Comment: 24 pages, 7 figures, accepted for publication in A&A
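    The "number of transits needed" logic typically follows photon statistics: the noise on a stacked spectrum shrinks as the square root of the number of transits combined. The helper below is a hypothetical back-of-the-envelope sketch of that scaling, not the paper's radiometric model.

```python
# Back-of-the-envelope sketch: how many transits must be stacked so that the
# difference between two model spectra exceeds a chosen detection threshold?
# Assumes per-bin noise scales as sigma_1 / sqrt(n_transits); the paper's
# actual radiometric model is more detailed than this.
import math

def transits_needed(feature_amplitude_ppm: float,
                    single_transit_noise_ppm: float,
                    n_sigma: float = 5.0) -> int:
    """Smallest n with feature / (noise / sqrt(n)) >= n_sigma."""
    n = (n_sigma * single_transit_noise_ppm / feature_amplitude_ppm) ** 2
    return math.ceil(n)

# Illustrative numbers only (not measured values for any real target):
print(transits_needed(feature_amplitude_ppm=50.0,
                      single_transit_noise_ppm=60.0))  # -> 36
```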

    QTL detection for a medium density SNP panel: comparison of different LD and LA methods

    Background: New molecular technologies allow high-throughput genotyping for QTL mapping with dense genetic maps. The value of linkage analysis (LA) models relative to linkage disequilibrium (LD) models could therefore be questioned. As these two strategies are very sensitive to marker density, experimental design structure, linkage disequilibrium extent, and QTL effect, we investigate the effects of these parameters on QTL detection.
    Methods: The XIIIth QTLMAS workshop simulated dataset was analysed using three linkage disequilibrium models and a linkage analysis model. Interval mapping, multivariate, and interaction-between-QTL analyses were performed using QTLMAP.
    Results: The linkage analysis models identified 13 QTL, of which 10 mapped close to the 18 simulated QTL, while the three other positions were falsely mapped as containing a QTL. Most of the QTL identified by the interval mapping analysis were not clearly detected by any linkage disequilibrium model. In addition, QTL effects evolved over time, which was not observed with the linkage disequilibrium models.
    Conclusions: Our results show that, at such a marker density, the interval mapping strategy is still better than using linkage disequilibrium alone. While the experimental design structure gives a lot of power to both approaches, marker density and informativity clearly affect the efficiency of linkage disequilibrium for QTL detection.
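    The LD-versus-LA contrast can be made concrete with the simplest LD-style test: regress the phenotype on each marker's allele dosage and scan for the strongest association. The sketch below is a generic single-marker scan on simulated data, assumed for illustration; it is not the QTLMAP LD or LA models used in the paper.

```python
# Generic single-marker (LD-style) scan: regress phenotype on each marker's
# allele dosage and report the marker with the largest squared correlation.
# Illustrative simulation only; the paper's analyses use QTLMAP.
from statistics import correlation
import random

random.seed(1)
n_ind, n_markers, qtl = 200, 50, 17

# Simulate dosages (0/1/2) and a phenotype driven by marker `qtl` plus noise.
geno = [[random.choice([0, 1, 2]) for _ in range(n_markers)] for _ in range(n_ind)]
pheno = [g[qtl] * 0.8 + random.gauss(0, 1) for g in geno]

r2 = [correlation([g[m] for g in geno], pheno) ** 2 for m in range(n_markers)]
best = max(range(n_markers), key=r2.__getitem__)
print(f"strongest association at marker {best} (simulated QTL at {qtl})")
```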

    Comparison of analyses of the XVth QTLMAS common dataset III: Genomic Estimations of Breeding Values

    Background: The QTLMAS XVth dataset consisted of pedigree, marker genotypes, and quantitative trait performances of animals with a sib family structure. Pedigree and genotypes covered 3,000 progenies, of which 2,000 were phenotyped. The trait was regulated by 8 QTL which displayed additive, imprinting, or epistatic effects. The 1,000 unphenotyped progenies were considered candidates for selection, and their Genomic Estimated Breeding Values (GEBV) were evaluated by participants of the XVth QTLMAS workshop. This paper compares the GEBV estimation results obtained by seven participants in the workshop.
    Methods: From the known QTL genotypes of each candidate, two "true" genomic values (TV) were computed by the organizers: the genotypic value of the candidate (TGV) and the expectation of its progeny genotypic values (TBV). GEBV were computed by the participants following different statistical methods: random linear models (including BLUP and Ridge Regression), variable selection techniques (LASSO, Elastic Net), and Bayesian methods. Accuracy was evaluated by the correlation between TV (TGV or TBV) and the GEBV presented by participants. Rank correlation of the best 10% of individuals and error in predictions were also evaluated. Bias was tested by regression of TV on GEBV.
    Results: Large differences between methods were found for all criteria and both types of genetic values (TGV, TBV). In general, the criteria ranked methods belonging to the same family consistently.
    Conclusions: The Bayesian methods (BayesA < BayesB < BayesC < BayesCπ) were the most efficient whatever the criterion and the true value considered (with the notable exception of the MSEP of the TBV). The variable selection procedures (LASSO, Elastic Net, and some adaptations) performed similarly, probably at a much lower computing cost. TABLUP, which combines BayesB and GBLUP, generally did well. The simplest methods, GBLUP and Ridge Regression, and even worse, the fixed linear model, were much less efficient.
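    As a concrete example of the simplest family mentioned above, the sketch below estimates SNP effects by ridge regression and scores the result with the paper's criteria: accuracy as the correlation between true values and GEBV, and bias as the slope from regressing TV on GEBV. It is a generic illustration on simulated toy data, not the workshop dataset or any participant's pipeline.

```python
# Minimal ridge-regression (SNP-BLUP-style) sketch for GEBV, scored with
# accuracy = cor(TV, GEBV) and bias = slope of the regression of TV on GEBV.
# Simulated toy data; not the QTLMAS dataset or any participant's code.
import numpy as np

rng = np.random.default_rng(0)
n_train, n_cand, n_snp, lam = 400, 100, 200, 50.0

X = rng.integers(0, 3, size=(n_train + n_cand, n_snp)).astype(float)
beta = rng.normal(0, 0.3, n_snp) * (rng.random(n_snp) < 0.05)  # few true QTL
tv = X @ beta                                                  # true values
y = tv[:n_train] + rng.normal(0, 1.0, n_train)                 # phenotypes

# Ridge solution on the training animals: (X'X + lam*I) beta_hat = X'y.
Xt = X[:n_train]
beta_hat = np.linalg.solve(Xt.T @ Xt + lam * np.eye(n_snp), Xt.T @ y)
gebv = X[n_train:] @ beta_hat                                  # candidates

acc = np.corrcoef(tv[n_train:], gebv)[0, 1]
bias = np.polyfit(gebv, tv[n_train:], 1)[0]  # slope of TV on GEBV (1 = unbiased)
print(f"accuracy = {acc:.2f}, bias slope = {bias:.2f}")
```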

    Knowledge, belief, normality, and introspection

    We study two logics of knowledge and belief stemming from the work of Stalnaker (2006), omitting positive introspection for knowledge. The two systems are equivalent with positive introspection, but not without it. We show that while the logic of belief remains unaffected by omitting introspection for knowledge in one system, the omission brings significant changes to the other. The resulting logic of belief is non-normal, and its complete axiomatization uses an infinite hierarchy of coherence constraints. We conclude by returning to the philosophical interpretation underlying both models of belief, showing that neither is strong enough to support a probabilistic interpretation, an interpretation in terms of certainty, or an interpretation in terms of the "mental component" of knowledge.
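    For orientation, Stalnaker-style systems are often presented with belief defined from knowledge; one common rendering (an assumption for illustration here, not a quotation from the paper) is that believing amounts to not knowing that one does not know:

```latex
% A common rendering of Stalnaker-style belief in terms of knowledge:
B\varphi \;\leftrightarrow\; \neg K \neg K \varphi
% With positive introspection (K\varphi \to K K\varphi) the two systems
% discussed above are equivalent; dropping it is what makes them come apart.
```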