41 research outputs found

    Advanced data fusion: Random forest proximities and pseudo-sample principle towards increased prediction accuracy and variable interpretation

    Data fusion has gained much attention in the life sciences because the analysis of biological samples may require data from multiple complementary sources to characterize the samples fully. Data fusion rests on the idea that different data platforms detect different biological entities; combined, these complementary measurements can provide a comprehensive profile of, and insight into, the research question at hand. Data fusion is traditionally performed in three ways: low-level, mid-level, and high-level fusion. However, the increasing complexity and volume of generated data require the development of more sophisticated fusion approaches. In that regard, the current study presents an advanced data fusion approach (i.e. proximities stacking) based on random forest proximities coupled with the pseudo-sample principle. Four data platforms of 130 samples each (faecal microbiome, blood, blood headspace, and exhaled breath samples of patients with Crohn's disease) were used to demonstrate the classification performance of this new approach. More specifically, 104 samples were used to train and validate the models, whereas the remaining 26 samples were used to validate the models externally. Mid-level, high-level, and individual-platform classification predictions were made and compared against the proximities stacking approach. The performance of each approach was assessed by calculating the sensitivity and specificity of each model on the external test set, and visualized by performing principal component analysis on the proximity matrices of the training samples and then projecting the test samples onto that space. The implementation of pseudo-samples allowed for the identification of the most important variables per platform, the finding of relations among variables of the different data platforms, and the examination of how variables behave in the samples. The proximities stacking approach outperforms both the mid-level and high-level fusion approaches, as well as all individual-platform predictions. Concurrently, it tackles significant bottlenecks of the traditional fusion approaches and of another advanced fusion approach discussed in the paper, and it contradicts the general belief that more data always yield better results; such considerations should therefore be taken into account before any data fusion analysis is conducted. (c) 2021 Published by Elsevier B.V.
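    The proximity-plus-PCA visualization the abstract describes can be illustrated in a few lines. The following is a minimal sketch, not the authors' implementation: the synthetic data, forest size, and 104/26 split are stand-ins mirroring the abstract. It derives a random forest proximity matrix from shared leaf membership (the fraction of trees in which two samples land in the same leaf), fits PCA on the training proximities, and projects held-out samples onto that space:

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestClassifier

    # Synthetic stand-in for one data platform: 130 samples, 104 train / 26 test
    X, y = make_classification(n_samples=130, n_features=50, random_state=0)
    X_train, X_test, y_train = X[:104], X[104:], y[:104]

    rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)

    # Leaf index of every sample in every tree: shape (n_samples, n_trees)
    leaves_train = rf.apply(X_train)
    leaves_test = rf.apply(X_test)

    # Proximity of two samples = fraction of trees in which they share a leaf
    prox_train = (leaves_train[:, None, :] == leaves_train[None, :, :]).mean(axis=2)
    prox_test = (leaves_test[:, None, :] == leaves_train[None, :, :]).mean(axis=2)

    # PCA on the training proximity matrix; project the test samples onto that space
    pca = PCA(n_components=2).fit(prox_train)
    scores_train = pca.transform(prox_train)
    scores_test = pca.transform(prox_test)
    ```

    In a stacking setting, one such proximity matrix per platform would then be combined before the final classification step; the sketch above covers only the single-platform building block.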

    The effect of induced forelimb lameness on thoracolumbar kinematics during treadmill locomotion

    Reasons for performing study: Lameness has often been suggested to result in altered movement of the back, but there are no detailed studies describing such a relationship in quantitative terms. Objectives: To quantify the effect of induced subtle forelimb lameness on thoracolumbar kinematics in the horse. Methods: Kinematics of 6 riding horses were measured at walk and at trot on a treadmill before and after the induction of reversible grade 2 forelimb lameness (AAEP scale 1-5). Ground reaction forces (GRF) for the individual limbs were calculated from the kinematics. Results: The horses significantly unloaded the painful limb by 11.5% at trot, while unloading at walk was not significant. The overall flexion-extension range of back motion decreased on average by 0.2° at walk and increased by 3.3° at trot (P<0.05). Changes in angular motion patterns of vertebral joints were noted only at trot, with an increase in flexion of 0.9° at T10 (i.e. the angle between T6, T10 and T13) during the stance phase of the sound diagonal and an increase in extension of the thoracolumbar area during stance of the lame diagonal (0.7° at T13, 0.8° at T17, 0.5° at L1, 0.4° at L3 and 0.3° at L5) (P<0.05). Lameness further caused lateral bending of the cranial thoracic vertebral column towards the lame side (1.3° at T10 and 0.9° at T13) (P<0.05) during stance of the lame diagonal. Conclusions: Both the range of motion and the vertebral angular motion patterns are affected by subtle forelimb lameness. At walk the effect is minimal; at trot the horses increased the vertebral range of motion and changed the pattern of thoracolumbar motion in the sagittal and horizontal planes, presumably in an attempt to move the centre of gravity away from the lame side and reduce the force on the affected limb. Potential relevance: Subtle forelimb lameness affects thoracolumbar kinematics. Future studies should aim to elucidate whether the altered movement patterns lead to back and/or neck dysfunction in the case of chronic lameness.

    Famine food of vegetal origin consumed in the Netherlands during World War II

    Background: Periods of extreme food shortage during war force people to eat food that they normally do not consider edible. The last time that countries in Western Europe experienced severe scarcities was during World War II. The so-called Dutch famine or Hunger Winter (1944-1945) claimed at least 25,000 lives. The Dutch government took action by opening soup kitchens and providing information on wild plants and other famine food sources in "wartime cookbooks." The Dutch wartime diet has never been examined from an ethnobotanical perspective. Methods: We interviewed 78 elderly Dutch citizens to verify what they remembered of the consumption of vegetal and fungal famine food during World War II by them and their close surroundings. We asked whether they had experienced any adverse effects from consuming famine food plants and how they knew the plants were edible. We identified the plant species mentioned during interviews by their local Dutch names, using illustrated field guides and floras. We hypothesized that people living in rural areas consumed more wild species than urban people. A Welch t test was performed to verify whether the number of wild and cultivated species differed between urban and rural citizens. Results: A total of 38 emergency food species (14 cultivated and 21 wild plants, three wild fungi) were mentioned during the interviews. Sugar beets, tulip bulbs, and potato peels were most frequently consumed. Regularly eaten wild species were common nettle, blackberry, and beechnuts. Almost one third of our interviewees explicitly described having experienced extreme hunger during the war. People from rural areas listed significantly more wild species than urban people. The number of cultivated species consumed by both groups was similar. Negative effects were limited to sore throats and stomachache from the consumption of sugar beets and tulip bulbs. Knowledge of the edibility of famine food was obtained largely by oral transmission; few people remembered the written recipes in wartime cookbooks. Conclusion: This research shows that 71 years after the Second World War, knowledge of famine food species, once crucial for people's survival, is still present in Dutch society. The information on famine food sources supplied by several institutions was not distributed widely. For the necessary revival of famine food knowledge during the 1940s, people needed to consult a small group of elders. Presumed toxicity was a major reason given by our participants to explain why they did not collect wild plants or mushrooms during the war.
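    The rural-versus-urban comparison in the Methods uses a Welch t test, which, unlike the ordinary two-sample t-test, does not assume equal variances in the two groups. A minimal sketch with invented per-respondent counts (the study's actual data are not reproduced here):

    ```python
    from scipy import stats

    # Hypothetical counts of wild species mentioned per respondent
    rural = [5, 7, 6, 4, 8, 6, 5, 7]
    urban = [2, 3, 1, 4, 2, 3, 2]

    # Welch's t-test: equal_var=False drops the equal-variance assumption
    t, p = stats.ttest_ind(rural, urban, equal_var=False)
    ```

    A significant result (p below the chosen threshold) with a positive t statistic would support the hypothesis that rural respondents listed more wild species.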

    The pro region of human intestinal lactase-phlorizin hydrolase

    No full text

    Electronic health record-triggered research infrastructure combining real-world electronic health record data and patient-reported outcomes to detect benefits, risks, and impact of medication: development study.

    No full text
    Background: Real-world data from electronic health records (EHRs) represent a wealth of information for studying the benefits and risks of medical treatment. However, they are limited in scope and should be complemented by information from the patient perspective. Objective: The aim of this study is to develop an innovative research infrastructure that combines information from EHRs with patient experiences reported in questionnaires to monitor the risks and benefits of medical treatment. Methods: We focused on the treatment of overactive bladder (OAB) in general practice as a use case. To develop the Benefit, Risk, and Impact of Medication Monitor (BRIMM) infrastructure, we first performed a requirement analysis. BRIMM's starting point is routinely recorded general practice EHR data that are sent to the Dutch Nivel Primary Care Database weekly. Patients with OAB were flagged weekly on the basis of diagnoses and prescriptions and were subsequently invited for participation by their general practitioner (GP), via a trusted third party. Patients received a series of questionnaires on disease status, pharmacological and nonpharmacological treatments, adverse drug reactions, drug adherence, and quality of life. The questionnaires and a dedicated feedback portal were developed in collaboration with a patient association for pelvic-related diseases, Bekkenbodem4All. Participating patients and GPs received feedback. An expert meeting was organized to assess the strengths, weaknesses, opportunities, and threats of the new research infrastructure. Results: The BRIMM infrastructure was developed and implemented. In the Nivel Primary Care Database, 2933 patients with OAB from 27 general practices were flagged. GPs selected 1636 (55.78%) patients who were eligible for the study, of whom 295 (18.0% of eligible patients) completed the first questionnaire. A total of 288 (97.6%) patients consented to the linkage of their questionnaire data with their EHR data. According to experts, the strengths of the infrastructure were the linkage of patient-reported outcomes with EHR data, the comparison of pharmacological and nonpharmacological treatments, the flexibility of the infrastructure, and the low registration burden for GPs. Methodological issues, such as susceptibility to bias, patient selection, and low participation rates among GPs and patients, were seen as weaknesses and threats. Opportunities included usefulness for policy makers and health professionals, conditional approval of medication, linkage to other data sources, and feedback to patients. Conclusions: The BRIMM research infrastructure has the potential to assess the benefits and safety of (medical) treatment in real-life situations using a unique combination of EHRs and patient-reported outcomes. As patient involvement is an important aspect of the treatment process, generating knowledge from clinical and patient perspectives is valuable for health care providers, patients, and policy makers. The developed methodology can easily be applied to other treatments and health problems.
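    The weekly flagging step (selecting patients on the basis of diagnoses and prescriptions) can be sketched as a simple code-based query. The ICPC and ATC codes and the toy records below are illustrative assumptions, not the study's actual selection criteria:

    ```python
    import pandas as pd

    # Hypothetical EHR extracts; the codes are stand-ins for the real criteria
    diagnoses = pd.DataFrame(
        {"patient_id": [1, 2, 3], "icpc": ["U04", "K86", "U04"]}
    )
    prescriptions = pd.DataFrame(
        {"patient_id": [4, 5], "atc": ["G04BD08", "C03CA01"]}
    )

    # Flag patients with a matching diagnosis code...
    flagged_dx = set(diagnoses.loc[diagnoses["icpc"] == "U04", "patient_id"])
    # ...or a prescription in a matching ATC class (prefix match)
    flagged_rx = set(
        prescriptions.loc[prescriptions["atc"].str.startswith("G04BD"), "patient_id"]
    )

    # A patient meeting either criterion is flagged for invitation
    flagged = sorted(flagged_dx | flagged_rx)
    ```

    In the actual infrastructure this selection would run weekly against the incoming Nivel database extracts, with the resulting list passed to the trusted third party for the GP-mediated invitations.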

    Proteolytic processing of the receptor-type protein tyrosine phosphatase PTPBR7.

    No full text
    The single-copy mouse gene Ptprr gives rise to different protein tyrosine phosphatase (PTP) isoforms in neuronal cells through the use of distinct promoters, alternative splicing, and multiple translation initiation sites. Here, we examined the array of post-translational modifications imposed on the PTPRR protein isoforms PTPBR7, PTP-SL, PTPPBSγ42 and PTPPBSγ37, which have distinct N-terminal segments and localize to different parts of the cell. All isoforms were found to be short-lived, constitutively phosphorylated proteins. In addition, the transmembrane isoform, PTPBR7, was subject to N-terminal proteolytic processing, between amino acid positions 136 and 137, resulting in an additional 65-kDa transmembrane PTPRR isoform. Unlike for some other receptor-type PTPs, the proteolytically produced N-terminal ectodomain does not remain associated with this PTPRR-65. Shedding of PTPBR7-derived polypeptides at the cell surface further adds to the molecular complexity of PTPRR biology.