Phase order in superfluid helium films
Classic experimental data on helium films are transformed to estimate a
finite-size phase order parameter that measures the thermal degradation of the
condensate fraction in the two-dimensional superfluid. The order parameter is
found to evolve thermally with an exponent that is, in analogous magnetic systems, a characteristic of the
Berezinskii-Kosterlitz-Thouless (BKT) phase transition. Universal scaling near
the BKT fixed point generates a collapse of experimental data on helium and
ferromagnetic films, and implies new experiments and theoretical protocols to
explore the phase order. These results give a striking example of experimental
finite-size scaling in a critical system that is broadly relevant to
two-dimensional Bose fluids.
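For reference, the kind of data collapse described above is conventionally written in the language of finite-size scaling for the two-dimensional XY universality class; the relations below are a minimal sketch of that standard form (with L the system size, m the order parameter and ξ the BKT correlation length), not the specific parametrization used in the paper:

\[ m(T, L) \propto L^{-\eta(T)/2}, \qquad \eta(T_{\mathrm{BKT}}) = \tfrac{1}{4}, \]
\[ \xi(T) \sim \exp\!\left( \frac{b}{\sqrt{T/T_{\mathrm{BKT}} - 1}} \right), \qquad m(T, L)\, L^{\eta/2} = f\!\left( L/\xi(T) \right), \]

where f is a universal scaling function and b a non-universal constant.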
DNA, statistics and the law : a cross-disciplinary approach to forensic inference
The use of results of DNA analyses in the legal process is a highly ambivalent topic. On the one hand, scientists have never been in a better position to analyse biological matter of various natures, even in limited quantities and degraded conditions. On the other hand, the increasing amounts of scientific data that can be generated through modern analytical processes do not necessarily imply that evaluative questions that arise in the legal context are given more satisfactory answers. A fundamental question that has accompanied DNA analyses since the early days of their use in the legal process thus remains: how do we handle the challenges presented to us by the use of contemporary scientific and technological developments in the field of law? Under the general theme “DNA, statistics and the law,” the collection of articles in this Frontiers Research Topic pursues the goal of investigating this question from an interdisciplinary perspective.
Towards a Bayesian evaluation of features in questioned handwritten signatures
In this work, we propose the construction of an evaluative framework for supporting experts in questioned signature examinations. Through the use of Bayesian networks, we aim to quantify the probative value of well-defined measurements performed on questioned signatures, in a way that is both formalised and part of a coherent approach to evaluation.
At the current stage, our project is exploratory, focusing on the broad range of aspects that relate to comparative signature examinations. The goal is to identify writing features that are both highly discriminant and easy for forensic examiners to detect. We also seek a balance between case-specific features and characteristics that can be measured in the vast majority of signatures. Care is also taken to preserve interpretability at every step of the reasoning process.
This paves the way for future work, which will aim to merge the different contributions into a single probabilistic measure of the strength of evidence using Bayesian networks.
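The abstract does not spell out a concrete network, so the following base-R sketch is purely illustrative: a minimal two-node model (authorship proposition H, binary feature observation E) with hypothetical conditional probabilities, showing how the probative value of one measured feature reduces to a likelihood ratio and updates odds via Bayes' theorem.

# Hypothetical conditional probabilities for a minimal two-node network:
# H = "questioned signature is genuine" (TRUE/FALSE)
# E = "measured feature falls in the writer's typical range" (TRUE/FALSE)
p_E_given_H    <- 0.90  # feature observed when the signature is genuine (assumed)
p_E_given_notH <- 0.15  # feature observed when it is simulated (assumed)

# Probative value of observing E, expressed as a likelihood ratio
lr <- p_E_given_H / p_E_given_notH

# Posterior odds = prior odds * LR (Bayes' theorem in odds form)
prior_odds     <- 1                 # neutral prior, for illustration only
posterior_odds <- prior_odds * lr
posterior_prob <- posterior_odds / (1 + posterior_odds)
cat("LR =", lr, "; posterior probability =", round(posterior_prob, 3), "\n")

In a full network, such tables would be estimated from reference collections of genuine and simulated signatures rather than assumed.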
Prediction in forensic science: a critical examination of common understandings
In this commentary, we argue that the term 'prediction' is overused when, in fact, referring to foundational writings of de Finetti, the corresponding term should be 'inference'. In particular, we intend (i) to summarize and clarify relevant subject matter on prediction from established statistical theory, and (ii) to point out the logic of this understanding with respect to practical uses of the term prediction. Written from an interdisciplinary perspective, associating statistics and forensic science as an example, this discussion also connects to related fields such as medical diagnosis and other areas of application where reasoning based on scientific results is practiced in societally relevant contexts. This includes forensic psychology, which uses prediction as part of its vocabulary when dealing with matters that arise in the course of legal proceedings.
Estimating the quantity of transferred DNA in primary and secondary transfers.
We conducted experiments to characterize the quantity of DNA recovered on surfaces using 6 donors, with a view to helping assign probabilities to the observation of given quantities of DNA under different transfer scenarios. The donors were asked to conduct a total of 120 simulations involving primary transfer on a knife handle. With 2 selected donors, 60 associated experiments involving secondary transfer were also carried out. DNA recovered on COPAN’s FLOQSwab™ was extracted, quantified and profiled using standard commercial kits. DNA mixtures were subsequently deconvoluted using STRmix™ to obtain the proportion corresponding to the person of interest (POI). The transfer proportion between the quantity of DNA on the bare hands and the amounts recovered on the touched surfaces was also measured and studied.
For a given activity, each donor left varying amounts of DNA, giving rise to distributions that can be characterized by their means and standard deviations. The quantity of transferred DNA depends on the donor and on the type of transfer. Typically, our “best” donor left an average of 0.84 ng (SD = 1.23 ng) on a knife handle, compared to a mean of 0.07 ng (SD = 0.09 ng) for the donor least “prone to leave DNA”. For secondary transfer, we recorded a mean of 0.04 ng (SD = 0.11 ng) for the first donor and of 0.002 ng (SD = 0.01 ng, max = 0.04 ng) for the second.
Linked to the above is the observation that the transfer proportion (i.e. the ratio of the quantity of DNA on a hand to the amount of DNA recovered following a transfer) also depends on the donor and on the type of transfer. Hence the amount of DNA obtained on a given touched surface cannot simply be deduced from the quantity of DNA available on a donor’s hand.
Given these sources of variability, it is not advisable to describe a donor’s ability to leave DNA with a single, fixed label such as “good” or “bad” regardless of the circumstances. To properly evaluate the probability of finding a given quantity of DNA, the whole variation of DNA quantity should be accounted for. This can be done by empirically measuring, or otherwise specifying, the appropriate underlying distribution for that quantity. Note, however, that this distribution will be conditioned on the donor, the receiving surface and the transfer mechanism.
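To make the conditioning on a full distribution concrete, here is a base-R sketch under an assumed lognormal model for the quantity of DNA transferred by a given donor; the meanlog/sdlog values and the observed quantity q are invented for illustration and are not the study's estimates.

# Hypothetical lognormal models for the quantity (ng) of DNA transferred by one
# donor under two mechanisms; parameter values are invented for illustration.
primary   <- list(meanlog = -0.8, sdlog = 1.2)  # primary transfer (assumed)
secondary <- list(meanlog = -3.5, sdlog = 1.0)  # secondary transfer (assumed)

q <- 0.5  # observed quantity of POI DNA in ng (example value)

# Density of the observed quantity under each transfer mechanism
d_primary   <- dlnorm(q, primary$meanlog, primary$sdlog)
d_secondary <- dlnorm(q, secondary$meanlog, secondary$sdlog)

# Likelihood ratio: how much better does primary transfer explain q?
cat("LR (primary vs secondary) =", round(d_primary / d_secondary, 2), "\n")

# In practice the parameters would be estimated from calibration data,
# e.g. by maximum likelihood (MASS::fitdistr) on donor-specific measurements.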
We also explored the potential benefit of deconvoluting mixtures to better characterize the quantity of DNA left by the POI, as opposed to the total quantity of DNA measured by quantification. Our results show that such deconvolution is beneficial when low quantities of the POI’s DNA may be mixed with larger quantities, as in secondary transfer scenarios. For primary transfers on clean surfaces, the touching person dominates the recovered DNA and the deconvolution is not critical.
The decisional nature of expert conclusions in forensic science (The decisionalization of individualization)
In forensic science and adjacent branches of science, both academic researchers and practitioners continue to diverge in their perception and understanding of the term “individualization”, that is, the claim that it is possible to reduce a pool of potential donors of a forensic trace to a single source. In particular, it has been pointed out that recent moves to understand the practice of individualization as a decision amount to little more than a relabelling [1], leaving the fundamental changes in thinking and understanding still pending. Moreover, professional associations and experts shy away from adhering to the notion of decision as defined by formal decision theory, within which individualization can be contextualized, mainly because of the difficulties of dealing with measures of the (un)desirability of the consequences of decisions (for example, using utility functions). Drawing on existing research in this area, this article presents and discusses fundamental concepts of utilities and costs, with particular reference to their application to forensic individualization. The article emphasizes that a proper understanding of decision-theoretic tools not only reduces the number of individual assignments that the application of decision theory requires, but also shows how those assignments can be meaningfully related to the constituent properties of the real-world decision problem to which the theory is applied. It is argued that the “decisionalization” of individualization requires this fundamental insight in order to initiate changes in the underlying understandings of those fields, not merely at the level of their labels.
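As a concrete illustration of how utility or loss assignments drive a “decisionalized” conclusion, the following base-R sketch implements the textbook two-action decision between individualizing and withholding; the loss values are hypothetical and serve only to show how the decision threshold follows from them.

# Hypothetical losses: l_fp = loss of individualizing a non-source (false positive),
# l_fn = loss of withholding when the POI truly is the source (false negative).
l_fp <- 100; l_fn <- 1  # assumed values, for illustration only

p <- 0.98  # posterior probability that the POI is the source (example)

# Expected losses of the two actions
el_individualize <- (1 - p) * l_fp
el_withhold      <- p * l_fn

# Minimizing expected loss implies the threshold l_fp / (l_fp + l_fn)
threshold <- l_fp / (l_fp + l_fn)
decision  <- if (p > threshold) "individualize" else "withhold"
cat("threshold =", round(threshold, 4), "; decision:", decision, "\n")

With these assumed losses the threshold is about 0.99, so even a posterior probability of 0.98 would not warrant individualization; the point is that the conclusion follows from the stated losses, not from the probability alone.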
Estimating the time since discharge of spent cartridges: a logical approach for interpreting the evidence
Estimating the time since discharge of a spent cartridge or a firearm can be useful in criminal situations involving firearms. The analysis of volatile gunshot residue remaining after shooting using solid-phase microextraction (SPME) followed by gas chromatography (GC) was proposed to meet this objective. However, current interpretative models suffer from several conceptual drawbacks which render them inadequate to assess the evidential value of a given measurement. This paper aims to fill this gap by proposing a logical approach based on the assessment of likelihood ratios. A probabilistic model was thus developed and applied to a hypothetical scenario where alternative hypotheses about the discharge time of a spent cartridge found at a crime scene were put forward. In order to estimate the parameters required to implement this solution, a non-linear regression model was proposed and applied to real published data. The proposed approach proved to be a valuable method for interpreting aging-related data.
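The paper's own regression model is not reproduced here; the base-R sketch below only illustrates the likelihood-ratio logic, assuming an exponential decay of the volatile-residue signal with Gaussian measurement noise and invented parameter values.

# Assumed aging model: expected signal s(t) = a * exp(-k * t), t in hours;
# all parameter values are invented for illustration.
a <- 10; k <- 0.05; sigma <- 0.8

y <- 3.2  # measured SPME-GC signal for the questioned cartridge (example)

# Competing propositions about the discharge time
t1 <- 12  # H1: discharged about 12 hours before seizure
t2 <- 72  # H2: discharged about 72 hours before seizure

# Likelihood of the measurement under each proposition, and their ratio
lik <- function(t) dnorm(y, mean = a * exp(-k * t), sd = sigma)
cat("LR (H1 vs H2) =", round(lik(t1) / lik(t2), 2), "\n")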
Bayes Factors for Forensic Decision Analyses with R
Bayes Factors for Forensic Decision Analyses with R provides a self-contained introduction to computational Bayesian statistics using R. With its primary focus on Bayes factors supported by data sets, this book features an operational perspective, practical relevance, and applicability—keeping theoretical and philosophical justifications limited. It offers a balanced approach to three naturally interrelated topics:
– Probabilistic Inference: Relies on the core concept of Bayesian inferential statistics to help practicing forensic scientists in the logical and balanced evaluation of the weight of evidence.
– Decision Making: Shows how Bayes factors are interpreted in practical applications to help address questions of decision analysis involving the use of forensic science in the law.
– Operational Relevance: Combines inference and decision, backed up with practical examples and complete sample code in R, including sensitivity analyses and discussion on how to interpret results in context.
Over the past decades, probabilistic methods have established a firm position as a reference approach for the management of uncertainty in virtually all areas of science, including forensic science, with Bayes' theorem providing the fundamental logical tenet for assessing how new information—scientific evidence—ought to be weighed. Central to this approach is the Bayes factor, which clarifies the evidential meaning of new information by providing a measure of the change in the odds in favor of a proposition of interest when going from the prior to the posterior distribution. Bayes factors should guide the scientist's thinking about the value of scientific evidence and form the basis of logical and balanced reporting practices, thus representing essential foundations for rational decision making under uncertainty.
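In symbols, with E the evidence and H_p, H_d the competing propositions, this odds-updating role of the Bayes factor takes the standard form

\[ \mathrm{BF} = \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)}, \qquad \frac{\Pr(H_p \mid E)}{\Pr(H_d \mid E)} = \mathrm{BF} \times \frac{\Pr(H_p)}{\Pr(H_d)}, \]

so that the posterior odds equal the prior odds multiplied by the Bayes factor.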
This book would be relevant to students, practitioners, and applied statisticians interested in inference and decision analyses in the critical field of forensic science. It could be used to support practical courses on Bayesian statistics and decision theory at both undergraduate and graduate levels, and will be of equal interest to forensic scientists and to practitioners of Bayesian statistics seeking to use R in their evaluations.
Probabilistic evaluation of n traces with no putative source: A likelihood ratio based approach in an investigative framework
Analysis of marks recovered from different crime scenes can be useful to detect a linkage between criminal cases, even when a putative source for the recovered traces is not available. This particular circumstance is often encountered in the early stages of an investigation, and thus the evaluation of evidence association may provide useful information for investigators. This association is evaluated here from a probabilistic point of view: a likelihood ratio based approach is suggested in order to quantify the strength of the evidence of trace association in the light of two mutually exclusive propositions, namely that the n traces come from a common source or from an unspecified number of sources. To deal with this kind of problem, probabilistic graphical models are used, in the form of Bayesian networks and object-oriented Bayesian networks, allowing users to handle intuitively the uncertainty related to the inferential problem.
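As a deliberately simplified sketch of this likelihood-ratio computation (not the paper's object-oriented networks), consider n traces that all display one discrete characteristic with an assumed population frequency gamma, sources treated as independent draws, and the alternative proposition reduced to “all traces come from different sources”:

# Simplified feature-match model, assuming a single discrete characteristic
# with population frequency gamma, and independence across sources.
gamma <- 0.05  # assumed relative frequency of the shared characteristic
n <- 4         # number of recovered traces showing that characteristic

# H1: all n traces come from one common (unknown) source -> one draw of gamma
# H2: each trace comes from a different unknown source   -> n independent draws
lr <- gamma / gamma^n  # equivalently gamma^(1 - n)
cat("LR (common vs different sources) =", format(lr, digits = 3), "\n")

The Bayesian networks described in the paper generalize this toy calculation to richer feature sets and to alternatives involving an unspecified number of sources.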