96 research outputs found

    Top-Level Categories of Constitutively Organized Material Entities - Suggestions for a Formal Top-Level Ontology

    Get PDF
    Application-oriented ontologies are important for reliably communicating and managing data in databases. Unfortunately, they often differ in the definitions they use and thus do not live up to their potential. This problem can be reduced by using a standardized and ontologically consistent template for the top-level categories from a formal top-level foundational ontology, which would support ontological consistency within application-oriented ontologies and compatibility between them. The Basic Formal Ontology (BFO) is such a foundational ontology for the biomedical domain; it has been developed following the single-inheritance policy and provides the top-level template within the Open Biological and Biomedical Ontologies Foundry. To fulfil this role, its three top-level categories of material entity (i.e., 'object', 'fiat object part', 'object aggregate') must be exhaustive: every concrete material entity must instantiate exactly one of them. By systematically evaluating all possible basic configurations of material building blocks, we show that BFO's top-level categories of material entity are not exhaustive. We provide examples from biology and everyday life that demonstrate the need for two additional categories: 'fiat object part aggregate' and 'object with fiat object part aggregate'. By distinguishing topological coherence, topological adherence, and metric proximity, we furthermore differentiate clusters and groups as two distinct subcategories for each of the three categories of material entity aggregates, resulting in six additional subcategories of material entity. We suggest extending BFO to incorporate the two additional categories of material entity as well as the two subcategories for each of the three categories of material entity aggregates. With these additions, BFO would exhaustively cover all top-level types of material entity that application-oriented ontologies may use as templates. Our result, however, depends on the premise that all material entities are organized according to a constitutive granularity.
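    For readers who think in code, the proposed extension can be summarized as a simple single-inheritance hierarchy. The sketch below is an illustrative Python encoding of the category names discussed in the abstract, not an official BFO artifact; the cluster/group split is labelled only informally in the comments.

```python
# Illustrative sketch (not an official BFO release): the existing BFO categories
# of material entity plus the two categories and six aggregate subcategories
# proposed above, encoded as a single-inheritance Python class hierarchy.

class MaterialEntity: ...

# Existing BFO top-level categories of material entity
class Object(MaterialEntity): ...
class FiatObjectPart(MaterialEntity): ...
class ObjectAggregate(MaterialEntity): ...

# Proposed additional top-level categories
class FiatObjectPartAggregate(MaterialEntity): ...
class ObjectWithFiatObjectPartAggregate(MaterialEntity): ...

# Proposed subcategories: each aggregate category splits into a 'cluster'
# (topologically coherent/adherent) and a 'group' (merely metrically proximal);
# the precise criteria are those discussed in the paper.
class ObjectCluster(ObjectAggregate): ...
class ObjectGroup(ObjectAggregate): ...
class FiatObjectPartCluster(FiatObjectPartAggregate): ...
class FiatObjectPartGroup(FiatObjectPartAggregate): ...
class ObjectWithFiatObjectPartCluster(ObjectWithFiatObjectPartAggregate): ...
class ObjectWithFiatObjectPartGroup(ObjectWithFiatObjectPartAggregate): ...
```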

    Classificatory Theory in Data-Intensive Science: The Case of Open Biomedical Ontologies

    Get PDF
    Publication status: Published. Type: Article. This is the author's version of a paper that was subsequently published in International Studies in the Philosophy of Science; please cite the published version by following the DOI link. Knowledge-making practices in biology are being strongly affected by the availability of data on an unprecedented scale, the insistence on systemic approaches, and a growing reliance on bioinformatics and digital infrastructures. What role does theory play within data-intensive science, and what does that tell us about scientific theories in general? To answer these questions, I focus on Open Biomedical Ontologies, digital classification tools that have become crucial to sharing results across research contexts in the biological and biomedical sciences, and argue that they constitute an example of classificatory theory. This form of theorizing emerges from classification practices in conjunction with experimental know-how and expresses the knowledge underpinning the analysis and interpretation of data disseminated online. Funders: Economic and Social Research Council (ESRC), The British Academy, Leverhulme Trust.

    Precise measurement of the thermal and stellar ⁵⁴Fe(n,γ)⁵⁵Fe cross sections via AMS

    Get PDF
    The detection of long-lived radionuclides through ultra-sensitive single-atom counting via accelerator mass spectrometry (AMS) offers opportunities for precise measurements of neutron capture cross sections, e.g. for nuclear astrophysics. The technique represents a truly complementary approach, completely independent of previous experimental methods. The potential of this technique is highlighted with the example of the ⁵⁴Fe(n,γ)⁵⁵Fe reaction. Following a series of irradiations with neutrons from cold and thermal to keV energies, the produced long-lived ⁵⁵Fe nuclei (t₁/₂ = 2.744(9) yr) were analyzed at the Vienna Environmental Research Accelerator (VERA). A reproducibility of about 1% could be achieved for the detection of ⁵⁵Fe, yielding cross-section uncertainties of less than 3%. Thus, the new data can serve as anchor points for time-of-flight experiments. We report significantly improved neutron capture cross sections at thermal energy (σ_th = 2.30 ± 0.07 b) as well as for a quasi-Maxwellian spectrum of kT = 25 keV (σ = 30.3 ± 1.2 mb) and for E_n = 481 ± 53 keV (σ = 6.01 ± 0.23 mb). The new experimental cross sections have been used to deduce improved Maxwellian-averaged cross sections in the temperature regime of the common s-process scenarios. The astrophysical impact is discussed using stellar models for low-mass AGB stars.
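    For context, the Maxwellian-averaged cross section (MACS) that such anchor-point measurements feed into is conventionally defined as below; this is the standard textbook expression, given for orientation rather than quoted from the paper.

```latex
% Standard definition of the Maxwellian-averaged cross section (MACS)
% at thermal energy kT (textbook form, not taken from the paper):
\langle \sigma \rangle_{kT}
  = \frac{2}{\sqrt{\pi}} \, \frac{1}{(kT)^{2}}
    \int_{0}^{\infty} \sigma(E) \, E \, e^{-E/kT} \, \mathrm{d}E
```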

    Deficient Liver Biosynthesis of Docosahexaenoic Acid Correlates with Cognitive Impairment in Alzheimer's Disease

    Get PDF
    Reduced brain levels of docosahexaenoic acid (C22:6n-3), a neurotrophic and neuroprotective fatty acid, may contribute to cognitive decline in Alzheimer's disease. Here, we investigated whether the liver enzyme system that provides docosahexaenoic acid to the brain is dysfunctional in this disease. Docosahexaenoic acid levels were reduced in the temporal cortex, mid-frontal cortex and cerebellum of subjects with Alzheimer's disease, compared to control subjects (P = 0.007). Mini-Mental State Examination (MMSE) scores positively correlated with docosahexaenoic/α-linolenic ratios in the temporal cortex (P = 0.005) and mid-frontal cortex (P = 0.018), but not the cerebellum. Similarly, liver docosahexaenoic acid content was lower in Alzheimer's disease patients than in control subjects (P = 0.011). Liver docosahexaenoic/α-linolenic ratios correlated positively with MMSE scores (r = 0.78; P < 0.0001) and negatively with global deterioration scale grades (P = 0.013). Docosahexaenoic acid precursors, including tetracosahexaenoic acid (C24:6n-3), were elevated in the liver of Alzheimer's disease patients (P = 0.041), whereas expression of peroxisomal D-bifunctional protein, which catalyzes the conversion of tetracosahexaenoic acid into docosahexaenoic acid, was reduced (P = 0.048). Other genes involved in docosahexaenoic acid metabolism were not affected. The results indicate that a deficit in D-bifunctional protein activity impairs docosahexaenoic acid biosynthesis in the liver of Alzheimer's disease patients, lessening the flux of this neuroprotective fatty acid to the brain.
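    A minimal sketch of the kind of correlation reported above (liver docosahexaenoic/α-linolenic ratio versus MMSE score) is shown below; the arrays and variable names are hypothetical placeholders, not data from the study.

```python
# Hypothetical illustration of the correlation analysis reported above:
# Pearson correlation between liver DHA/ALA ratios and MMSE scores.
import numpy as np
from scipy import stats

# Placeholder values; in the study each pair would come from one subject.
dha_ala_ratio = np.array([0.8, 1.1, 1.4, 1.9, 2.3, 2.8])   # liver C22:6n-3 / C18:3n-3
mmse_score    = np.array([14,  17,  20,  23,  26,  29])     # Mini-Mental State Examination

r, p = stats.pearsonr(dha_ala_ratio, mmse_score)
print(f"Pearson r = {r:.2f}, P = {p:.4f}")
```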

    The Roles of Standardization, Certification and Assurance Services in Global Commerce

    Get PDF
    In this article we examine the rapid emergence and expansion of standardized product and process frameworks and a private-sector compliance and enforcement infrastructure that we believe may increasingly be providing a substitute for public and legal regulatory infrastructure in global commerce. This infrastructure is provided by a proliferation of performance codes and standards, many of which define acceptable social and environmental behavior, and a rapidly growing number of privately trained and authorized inspectors and certifiers that we call the third-party assurance industry. We offer reasons for this development and evidence of its scope and scale, and then describe the phenomenon in more detail by examining supply chain arrangements in two industries, food products and apparel, where the use of third-party standards and assurance services has expanded especially rapidly. We conclude with a discussion of the implications for the make-or-buy decision at the core of the theory of the firm. We argue that as quasi-regulatory standards are developed within various industries, and as performance against these standards can be systematically evaluated using third-party inspectors and certifiers, the costs of moving production outside of vertical firm hierarchies drop. We believe this may be an important factor in accelerating the shift to outsourcing that has been observed over the last two decades.

    Health-related quality of life, angina type and coronary artery disease in patients with stable chest pain

    Get PDF
    Background: Health-related quality of life (HRQoL) is impaired in patients with stable angina, but patients often present with other forms of chest pain. The aim of this study was to compare the pre-diagnostic HRQoL in patients with suspected coronary artery disease (CAD) according to angina type, gender, and presence of obstructive CAD. Methods: From the pilot study for the European DISCHARGE trial, we analysed data from 24 sites including 1263 patients (45.9% women, 61.1 ± 11.3 years) who were clinically referred for invasive coronary angiography (ICA; 617 patients) or coronary computed tomography angiography (CTA; 646 patients). Prior to the procedures, patients completed HRQoL questionnaires: the Short Form (SF)-12v2, the EuroQoL (EQ-5D-3L) and the Hospital Anxiety and Depression Scale. Results: Fifty-five percent of ICA and 35% of CTA patients had typical angina, 23 and 33% had atypical angina, 18 and 28% had non-anginal chest discomfort, and 5 and 5% had other chest discomfort, respectively. Patients with typical angina had the poorest physical functioning compared to the other angina groups (SF-12 physical component score: 41.2 ± 8.8, 43.3 ± 9.1, 46.2 ± 9.0, 46.4 ± 11.4, respectively, all age- and gender-adjusted p < 0.01) and the highest anxiety levels (8.3 ± 4.1, 7.5 ± 4.1, 6.5 ± 4.0, 4.7 ± 4.5, respectively, all adjusted p < 0.01). On all other measures, patients with typical or atypical angina had lower HRQoL compared to the two other groups (all adjusted p < 0.05). HRQoL did not differ between patients with and without obstructive CAD, while women had worse HRQoL compared with men, irrespective of age and angina type. Conclusions: Prior to a diagnostic procedure for stable chest pain, HRQoL is associated with chest pain characteristics, but not with obstructive CAD, and is significantly lower in women. Trial registration: Clinicaltrials.gov, NCT02400229.
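    An illustrative sketch of an age- and gender-adjusted comparison of one HRQoL outcome (the SF-12 physical component score) across angina groups is shown below, assuming a tidy table with one row per patient; the file and column names are hypothetical and this is not the DISCHARGE analysis code.

```python
# Hypothetical sketch of an age- and sex-adjusted comparison of the SF-12
# physical component score (PCS) across chest-pain categories.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("discharge_pilot_hrqol.csv")   # hypothetical file, one row per patient
# Expected (hypothetical) columns: pcs, angina_type, age, sex

# Linear model with angina type as a categorical factor, adjusted for age and sex;
# the group coefficients give adjusted differences versus the reference category.
model = smf.ols("pcs ~ C(angina_type) + age + C(sex)", data=df).fit()
print(model.summary())
```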

    Control of Transonic Cavity Flow Instability by Streamwise Air Injection

    Full text link
    A time-dependent numerical model of a turbulent Mach 1.5 flow over a rectangular cavity has been developed to investigate suppression strategies for its natural self-sustained instability. This instability adversely affects the cavity form drag, produces large-amplitude pressure oscillations in the enclosure, and is a source of far-field acoustic radiation. To suppress the natural flow instability, the leading edge of the two-dimensional cavity model is fitted with a simulated air jet that discharges in the downstream direction. The jet mass flow rate and nozzle depth are adjusted to attenuate the instability while minimising the control mass flow rate. The numerical predictions indicate that, at the selected inflow conditions, the configurations with the deepest nozzle (0.75 of the cavity depth) give the most attenuation of the modelled instability, which is dominated by the cavity second mode. The jet affects both the unsteady pressure field and the vorticity distribution inside the enclosure, which are, together, key determinants of the amplitude of the leading cavity instability mode. The unsteadiness of the pressure field is reduced by the lifting of the cavity shear layer at the rear end above the trailing edge. This disrupts the formation of upstream-travelling feedback pressure waves and the generation of far-field noise. The deep nozzle also promotes a downstream bulk flow in the enclosure, running from the upstream vertical wall to the downstream one. This flow attenuates the large-scale clockwise recirculation that is present in the unsuppressed cavity flow. The same flow alters the vorticity thickness of the top shear layer and probably affects the convective growth of the cavity second mode in the shear layer.
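    The cavity modes referred to above are commonly identified with the semi-empirical Rossiter relation; the expression below is the classical form with its usual empirical constants and is given for context rather than taken from this study.

```latex
% Classical Rossiter relation for the frequency of cavity mode m
% (semi-empirical; gamma and kappa are the usual empirical constants):
f_{m} = \frac{U_{\infty}}{L} \,
        \frac{m - \gamma}{M_{\infty} + 1/\kappa},
\qquad m = 1, 2, \dots, \quad \gamma \approx 0.25,\ \kappa \approx 0.57
```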

    POD Analysis of Cavity Flow Instability

    Full text link
    A Mach 1.5 turbulent cavity flow develops large-amplitude oscillations, pressure drag and noise. This type of flow instability affects practical engineering applications, such as aircraft store bays. A simple model of the flow instability is sought towards developing a real-time, model-based active control system for simple geometries representative of open aircraft store bays. An explicit time-marching, second-order accurate finite-volume scheme has been used to generate time-dependent benchmark cavity flow data. Then, a simpler and leaner numerical predictor for the unsteady cavity pressure was developed, based on a Proper Orthogonal Decomposition of the benchmark data. The low-order predictor gives pressure oscillations in good agreement with the benchmark CFD method. This result highlights the importance of large-scale phase-coherent structures in the Mach 1.5 turbulent cavity flow. At the selected test conditions, the significant pressure ‘energy’ content of these structures enabled an effective reduced-order model of the cavity dynamic system. Directions and methods to further streamline and simplify the unsteady pressure predictor have been highlighted.
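    A minimal sketch of snapshot POD via the singular value decomposition is given below, assuming the benchmark CFD pressure fields are available as a snapshot matrix; the array names and truncation rank are illustrative, not the authors' implementation.

```python
# Minimal snapshot-POD sketch: build a reduced-order reconstruction of the
# unsteady cavity pressure from CFD snapshots (illustrative, not the authors' code).
import numpy as np

# P has shape (n_points, n_snapshots): each column is one pressure snapshot.
P = np.load("cavity_pressure_snapshots.npy")       # hypothetical benchmark data
P_mean = P.mean(axis=1, keepdims=True)
X = P - P_mean                                     # fluctuating part

# POD modes are the left singular vectors; singular values rank their 'energy' content.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

r = 4                                              # keep the few most energetic modes
P_rom = P_mean + U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]   # reduced-order reconstruction

energy = (s**2) / np.sum(s**2)
print("energy captured by first", r, "modes:", energy[:r].sum())
```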

    Curation of viral genomes: challenges, applications and the way forward

    Get PDF
    BACKGROUND: Whole-genome sequence data are a step towards generating the 'parts list' of life and understanding the underlying principles of biocomplexity. Genome sequencing initiatives for human and model organisms are targeted efforts towards understanding principles of evolution, with an envisaged application to improving human health. These efforts culminated in the development of dedicated resources. In contrast, a large number of viral genomes have been sequenced by groups or individuals with an interest in studying antigenic variation amongst strains and species. These independent efforts enabled viruses to attain the status of 'best-represented taxa' with the highest number of genomes. However, due to a lack of concerted effort, viral genomic sequences merely remained as entries in the public repositories until recently. RESULTS: VirGen is a curated resource of viral genomes and their analyses. Since its first release, it has grown both in coverage of viral families and in the development of new modules for annotation and analysis. The current release (2.0) includes data for twenty-five families with a broad host range, as against eight in the first release. The taxonomic description of viruses in VirGen is in accordance with the ICTV nomenclature. A well-characterised strain is identified as a 'representative entry' for every viral species. This non-redundant dataset is used for subsequent annotation and analyses using sequence-based bioinformatics approaches. VirGen archives precomputed data on genome and proteome comparisons. A new data module that provides structures of viral proteins available in the PDB has been incorporated recently. One of the unique features of VirGen is the prediction of conformational and sequential epitopes of known antigenic proteins using in-house developed algorithms, a step towards reverse vaccinology. CONCLUSION: Structured organization of genomic data facilitates the use of data-mining tools, which provides opportunities for knowledge discovery. One approach to achieving this goal is to carry out functional annotation using comparative genomics. VirGen, a comprehensive viral genome resource that serves as an annotation and analysis pipeline, has been developed for the curation of public-domain viral genome data. The various steps in the curation and annotation of the genomic data and the applications of the value-added derived data are substantiated with case studies.
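    A schematic sketch of the kind of curation step described above (choosing one non-redundant 'representative entry' per viral species) is shown below, assuming the genomes are available as GenBank records; the input file name and the "most features, then longest sequence" rule are illustrative placeholders, not VirGen's actual selection criteria.

```python
# Illustrative curation step: pick one representative GenBank record per viral
# species. The selection rule here (most features, then longest sequence) is a
# placeholder, not VirGen's actual criterion.
from collections import defaultdict
from Bio import SeqIO

records_by_species = defaultdict(list)
for record in SeqIO.parse("viral_genomes.gb", "genbank"):   # hypothetical input file
    species = record.annotations.get("organism", "unknown")
    records_by_species[species].append(record)

representatives = {
    species: max(recs, key=lambda r: (len(r.features), len(r.seq)))
    for species, recs in records_by_species.items()
}

for species, rep in sorted(representatives.items()):
    print(f"{species}\t{rep.id}\t{len(rep.seq)} bp")
```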