Representing SNOMED CT Concept Evolutions using Process Profiles
Abstract. SNOMED CT is a very large biomedical terminology supported by a concept-based ontology. In recent years it has been distributed under the new release format 'RF2'. RF2 provides a more consistent and coherent mechanism for keeping track of changes over versions, even to the extent that, in theory at least, any release will contain enough information to allow reconstruction of all previous versions. In this paper, using the January 2016 release of SNOMED CT, we explore various ways to transform change-assertions in RF2 into a more uniform representation with the goal of assessing how faithful these changes are with respect to biomedical reality. Key elements in our approach are (1) recent proposals for the Information Artifact Ontology that provide a realism-based perspective on what it means for a representation to be about something, and (2) the expectation that the theory of what we call 'process profiles' can be applied not merely to quantitative information artifacts but also to other sorts of symbolic representations of processes.
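To make the RF2 change-tracking mechanism concrete, the following minimal Python sketch reconstructs per-concept change histories from a Full-release concept file, whose tab-separated columns (id, effectiveTime, active, moduleId, definitionStatusId) are standard RF2. The file name and the classification of events into created/inactivated/reactivated/modified are illustrative assumptions, not the paper's method; the sketch only exploits the Full release's property of retaining every historical state of every row.

import csv
from collections import defaultdict

def load_concept_history(path):
    """Group RF2 Full-release concept rows by component id, ordered by time."""
    history = defaultdict(list)  # concept id -> rows sorted by effectiveTime
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            history[row["id"]].append(row)
    for rows in history.values():
        rows.sort(key=lambda r: r["effectiveTime"])
    return history

def change_events(rows):
    """Turn successive RF2 states into explicit change-assertions."""
    events, previous = [], None
    for row in rows:
        if previous is None:
            events.append(("created", row["effectiveTime"]))
        elif row["active"] != previous["active"]:
            kind = "inactivated" if row["active"] == "0" else "reactivated"
            events.append((kind, row["effectiveTime"]))
        else:
            events.append(("modified", row["effectiveTime"]))
        previous = row
    return events

# File name follows the RF2 naming convention for the January 2016
# international release; adjust to the local release package.
history = load_concept_history("sct2_Concept_Full_INT_20160131.txt")
for concept_id, rows in list(history.items())[:5]:
    print(concept_id, change_events(rows))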
Symmetry without Symmetry: Numerical Simulation of Axisymmetric Systems using Cartesian Grids
We present a new technique for the numerical simulation of axisymmetric systems. This technique avoids the coordinate singularities which often arise when cylindrical or polar-spherical coordinate finite difference grids are used, particularly in simulating tensor partial differential equations like those of 3+1 numerical relativity. For a system axisymmetric about the z axis, the basic idea is to use a 3-dimensional Cartesian (x,y,z) coordinate grid which covers (say) the y=0 plane, but is only one finite-difference-molecule width thick in the y direction. The field variables in the central y=0 grid plane can be updated using normal (x,y,z)-coordinate finite differencing, while those in the y ≠ 0 grid planes can be computed from those in the central plane by using the axisymmetry assumption and interpolation. We demonstrate the effectiveness of the approach on a set of fully nonlinear test computations in 3+1 numerical general relativity, involving both black holes and collapsing gravitational waves.
Comment: 17 pages, 4 figures
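As an illustration of the thin-slab idea in the abstract, here is a minimal Python sketch for a scalar field, where axisymmetry about the z axis means f(x, y, z) = f(ρ, 0, z) with ρ = √(x² + y²): the central y = 0 plane is differenced normally in all three directions, and the two side planes are refilled by 1-D interpolation along x. The grid sizes, the scalar (rather than tensor) field, and the toy wave-equation step are placeholders; tensor fields, as in the paper's relativity computations, additionally require rotating components, which this sketch omits.

import numpy as np

# Cartesian grid only three molecule-widths thick in y: the y = 0 plane
# carries the independent data; the y = +/- dy planes are filled by symmetry.
nx, nz = 64, 64
x = np.linspace(0.0, 6.3, nx)          # x >= 0 doubles as the radial axis
z = np.linspace(-3.15, 3.15, nz)
dx, dz, dy = x[1] - x[0], z[1] - z[0], 0.1
f = np.zeros((3, nx, nz))              # planes y = -dy, 0, +dy

def fill_side_planes(f):
    """Fill y = +/- dy from y = 0 using f(x, y, z) = f(rho, 0, z)."""
    rho = np.sqrt(x**2 + dy**2)        # cylindrical radius of side-plane points
    for k in range(nz):
        side = np.interp(rho, x, f[1, :, k])   # 1-D interpolation along x
        f[0, :, k] = side
        f[2, :, k] = side
    return f

def laplacian_central(f):
    """Ordinary second-order 3-D Laplacian, evaluated on the y = 0 plane."""
    lap = np.zeros((nx, nz))
    lap[1:-1, 1:-1] = (
        (f[1, 2:, 1:-1] - 2 * f[1, 1:-1, 1:-1] + f[1, :-2, 1:-1]) / dx**2
        + (f[2, 1:-1, 1:-1] - 2 * f[1, 1:-1, 1:-1] + f[0, 1:-1, 1:-1]) / dy**2
        + (f[1, 1:-1, 2:] - 2 * f[1, 1:-1, 1:-1] + f[1, 1:-1, :-2]) / dz**2
    )
    return lap

# One explicit step of a toy wave equation d^2f/dt^2 = Laplacian(f):
f_prev, dt = f.copy(), 0.02
f[1] = 2 * f[1] - f_prev[1] + dt**2 * laplacian_central(f)
f = fill_side_planes(f)

The payoff is that no 1/ρ terms ever appear in the update, so the axis x = 0 is a perfectly regular grid point; this is how the coordinate singularity of cylindrical or polar-spherical grids is avoided.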
Masses and Mixings in a Grand Unified Toy Model
The generation of the fermion mass hierarchy in the standard model of particle physics is a long-standing puzzle. Recent discoveries from neutrino physics suggest that the mixing in the lepton sector is large compared to the quark mixings. Understanding this asymmetry between the quark and lepton mixings is an important aim for particle physics. In this regard, two promising approaches from the theoretical side are grand unified theories and family symmetries. In this note we try to understand certain general features of grand unified theories with Abelian family symmetries by taking the simplest SU(5) grand unified theory as a prototype. We construct an SU(5) toy model with family symmetry that, in a natural way, reproduces the observed mass hierarchy and mixing matrices to lowest approximation. The mass hierarchy is generated through a Froggatt-Nielsen-type mechanism. One idea that we use in the model is that the quark and charged-lepton sectors are hierarchical with small mixing angles, while the light-neutrino sector is democratic with larger mixing angles. We also discuss some of the difficulties in incorporating finer details into the model without making further assumptions or adding a large scalar sector.
Comment: 21 pages, 2 figures, RevTeX; v2: references updated and typos corrected; v3: updated top quark mass, comments on MiniBooNE result, and typos corrected
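Since the abstract invokes a Froggatt-Nielsen-type mechanism, a small numerical illustration may help: Yukawa matrix entries are suppressed by powers of a small parameter ε set by U(1) family charges, Y_ij ~ ε^(q_i + q_j), up to unknown O(1) coefficients. The charges, ε = 0.2, and the random O(1) factors below are hypothetical choices, not the paper's SU(5) assignments; the point is only that hierarchical charges yield hierarchical masses with small mixings, while vanishing ("democratic") charges yield large mixings.

import numpy as np

eps = 0.2                              # Cabibbo-sized expansion parameter
q_hier = np.array([3, 2, 0])           # hypothetical hierarchical charges
q_demo = np.array([0, 0, 0])           # hypothetical democratic charges
rng = np.random.default_rng(1)

def texture(q):
    """Froggatt-Nielsen texture: eps**(q_i + q_j) times unknown O(1) factors."""
    order_one = rng.uniform(0.5, 1.5, size=(3, 3))
    return order_one * eps ** (q[:, None] + q[None, :])

def masses_and_mixing(Y):
    # Singular values play the role of masses; the left singular vectors give
    # the unitary rotation whose mismatch between sectors builds the physical
    # mixing matrix (CKM or PMNS).
    U, s, _ = np.linalg.svd(Y)
    return s, U

m_h, U_h = masses_and_mixing(texture(q_hier))
m_d, U_d = masses_and_mixing(texture(q_demo))
print("hierarchical mass ratios:", np.round(m_h / m_h[0], 5))
print("democratic mixing magnitudes:\n", np.round(np.abs(U_d), 2))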
Guinea pig models for translation of the developmental origins of health and disease hypothesis into the clinic
Over 30 years ago Professor David Barker first proposed the theory that events in early life could explain an individual's risk of non-communicable disease in later life: the developmental origins of health and disease (DOHaD) hypothesis. During the 1990s the validity of the DOHaD hypothesis was extensively tested in a number of human populations and the mechanisms underpinning it characterised in a range of experimental animal models. Over the past decade, researchers have sought to use this mechanistic understanding of DOHaD to develop therapeutic interventions during pregnancy and early life to improve adult health. A variety of animal models have been used to develop and evaluate interventions, each with strengths and limitations. It is becoming apparent that effective translational research requires that the animal paradigm selected mirrors the tempo of human fetal growth and development as closely as possible, so that the effect of a perinatal insult and/or therapeutic intervention can be fully assessed. The guinea pig is one such animal model that over the past two decades has demonstrated itself to be a very useful platform for these important reproductive studies. This review highlights similarities in the in utero development between humans and guinea pigs, the strengths and limitations of the guinea pig as an experimental model of DOHaD, and the guinea pig's potential to enhance clinical therapeutic innovation to improve human health.
New implementation of data standards for AI research in precision oncology. Experience from EuCanImage
An unprecedented amount of personal health data, with the potential to revolutionise precision medicine, is generated at healthcare institutions worldwide. The exploitation of such data using artificial intelligence relies on the ability to combine heterogeneous, multicentric, multimodal and multiparametric data, as well as thoughtful representation of knowledge and data availability. Despite this potential, significant methodological challenges and ethico-legal constraints still impede the real-world implementation of data models. EuCanImage is an international consortium that aims to develop AI algorithms for precision medicine in oncology and to enable secondary use of the data based on the necessary ethical approvals. The use of well-defined clinical data standards to allow interoperability was a central element within the initiative. The consortium focuses on three different cancer types and addresses seven unmet clinical needs. This article synthesises our experience and procedures for healthcare data interoperability and standardisation. Funding: European Union's Horizon 2020 research and innovation programme, grant agreement No 952103.
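Purely as an illustration of the kind of field-level harmonisation the article discusses, the Python sketch below maps site-specific export columns onto a shared record with a controlled vocabulary. Every field name, mapping, and code in it is invented for the example; it does not reproduce EuCanImage's actual data model or standards.

from dataclasses import dataclass

@dataclass
class HarmonisedRecord:
    patient_id: str
    cancer_type: str      # controlled vocabulary shared by all sites
    tnm_stage: str

# Hypothetical per-site column mappings: local name -> harmonised name.
SITE_MAPPINGS = {
    "site_a": {"id": "patient_id", "diagnosis": "cancer_type", "stage": "tnm_stage"},
    "site_b": {"pid": "patient_id", "dx_code": "cancer_type", "tnm": "tnm_stage"},
}

# Hypothetical controlled vocabulary for the diagnosis field.
VOCABULARY = {"C50": "breast", "C18": "colorectal", "C64": "kidney"}

def harmonise(site, raw):
    """Rename a site's local columns and normalise coded values."""
    mapping = SITE_MAPPINGS[site]
    fields = {target: raw[source] for source, target in mapping.items()}
    fields["cancer_type"] = VOCABULARY.get(fields["cancer_type"], "unknown")
    return HarmonisedRecord(**fields)

print(harmonise("site_b", {"pid": "001", "dx_code": "C50", "tnm": "T2N0M0"}))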
Measurement of the W-boson mass in pp collisions at √s=7 TeV with the ATLAS detector
A measurement of the mass of the W boson is presented based on proton–proton collision data recorded in 2011 at a centre-of-mass energy of 7 TeV with the ATLAS detector at the LHC, corresponding to 4.6 fb⁻¹ of integrated luminosity. The selected data sample consists of 7.8 × 10⁶ candidates in the W→μν channel and 5.9 × 10⁶ candidates in the W→eν channel. The W-boson mass is obtained from template fits to the reconstructed distributions of the charged-lepton transverse momentum and of the W-boson transverse mass in the electron and muon decay channels, yielding
mW = 80370 ± 7 (stat.) ± 11 (exp. syst.) ± 14 (mod. syst.) MeV = 80370 ± 19 MeV,
where the first uncertainty is statistical, the second corresponds to the experimental systematic uncertainty, and the third to the physics-modelling systematic uncertainty. A measurement of the mass difference between the W⁺ and W⁻ bosons yields mW⁺ − mW⁻ = −29 ± 28 MeV.
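The template-fit procedure described above can be caricatured in a few lines of Python: pseudo-data for a single kinematic distribution are compared with templates generated at trial masses, and the best-fit mass is read off a parabola through the χ² scan. The toy spectrum, Gaussian smearing, and all numbers below are stand-ins; the real measurement fits fully simulated p_T(ℓ) and m_T distributions.

import numpy as np

rng = np.random.default_rng(0)
bins = np.linspace(30.0, 50.0, 41)     # GeV, toy lepton-p_T-like observable

def template(m_w, n_events=2_000_000):
    """Toy spectrum whose endpoint scales with m_w/2, smeared by resolution."""
    pt = rng.triangular(30.0, m_w / 2, m_w / 2, n_events)
    pt += rng.normal(0.0, 1.5, n_events)          # toy detector resolution
    return np.histogram(pt, bins=bins)[0].astype(float)

data = template(80.37)                            # pseudo-data at a known mass
masses = np.arange(80.25, 80.50, 0.025)
chi2 = []
for m in masses:
    t = template(m)
    t *= data.sum() / t.sum()                     # normalise template to data
    chi2.append(np.sum((data - t) ** 2 / np.maximum(t, 1.0)))

a, b, c = np.polyfit(masses, chi2, 2)             # parabola through the scan
print(f"best-fit mass: {-b / (2 * a):.3f} GeV")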