169 research outputs found

    Is High-frequency stiffness a measure for the number of attached cross-bridges?

    Muscle stiffness is an important property for movement control. Stiffness is a measure of the resistance to mechanical disturbances in musculoskeletal systems. In general, muscle stiffness is assumed to depend on the number of attached cross-bridges. This number cannot be measured in vivo or in vitro. In experiments, high-frequency perturbations are used to obtain a measurement of stiffness. In this paper, a simulation study is presented on the correlation between the number of attached cross-bridges and high-frequency stiffness. A model based on the sliding-filament theory was used to simulate dynamic contractions. It is concluded that these two methods of muscle stiffness determination do not yield compatible results during lengthening.
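    As a rough illustration of the high-frequency perturbation approach described above, the sketch below (Python, hypothetical signals and parameters, not the paper's cross-bridge model) estimates apparent stiffness as the ratio of force amplitude to length amplitude during a small sinusoidal length perturbation.

```python
import numpy as np

# Minimal sketch (hypothetical data, not the paper's model): estimate muscle
# stiffness from the force response to a small high-frequency length
# perturbation, as the ratio of force amplitude to length amplitude.

def estimate_stiffness(time, length, force, freq_hz):
    """Fit sinusoids at the perturbation frequency and return |F| / |L|
    as an apparent stiffness (N per m)."""
    w = 2.0 * np.pi * freq_hz
    # Design matrix for a sine/cosine (plus offset) fit at the perturbation frequency.
    X = np.column_stack([np.sin(w * time), np.cos(w * time), np.ones_like(time)])
    l_coef, *_ = np.linalg.lstsq(X, length - length.mean(), rcond=None)
    f_coef, *_ = np.linalg.lstsq(X, force - force.mean(), rcond=None)
    l_amp = np.hypot(l_coef[0], l_coef[1])
    f_amp = np.hypot(f_coef[0], f_coef[1])
    return f_amp / l_amp

# Synthetic example: 500 Hz perturbation with 50 um amplitude and an
# assumed true stiffness of 2 kN/m.
t = np.linspace(0.0, 0.02, 2000)
length = 50e-6 * np.sin(2 * np.pi * 500 * t)
force = 2.0e3 * length + 0.01 * np.random.randn(t.size)
print(estimate_stiffness(t, length, force, 500.0))  # ~2000 N/m
```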

    Segmenting Critical Factors for Enhancing the use of IT in Humanitarian Supply Chain Management

    This study intends to explore and segment the critical factors (CFs) for enhancing the use of Information Technology (IT) in the Humanitarian Supply Chain (HSC), particularly in the Indian context. Ten influencing factors have been identified through an extensive literature review and expert opinion. A structural model and a cause–effect relationship diagram were developed using the decision-making trial and evaluation laboratory (DEMATEL) method for the identification of CFs. The present study adopts a comprehensive and rigorous procedure to identify six CFs, namely top management support, government support, a feedback mechanism to facilitate learning from prior experiences, a transparent and accountable supply chain, strategic planning, and mutual learning with other commercial organizations (COs). The developed framework provides a simple, effective and efficient way to enhance the utilization of IT in the HSC and, more broadly, to improve the competencies and performance of the HSC.
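    For readers unfamiliar with DEMATEL, the sketch below (Python, a hypothetical four-factor direct-influence matrix rather than the study's ten factors) shows the core computation: normalize the direct-relation matrix, derive the total-relation matrix T = D(I - D)^-1, and use the prominence (R + C) and relation (R - C) scores to separate cause factors from effect factors.

```python
import numpy as np

# Illustrative DEMATEL sketch with a hypothetical 4-factor direct-influence
# matrix (expert ratings 0-4), not the study's actual data.
A = np.array([
    [0, 3, 2, 1],
    [1, 0, 3, 2],
    [2, 1, 0, 3],
    [1, 2, 1, 0],
], dtype=float)

s = max(A.sum(axis=1).max(), A.sum(axis=0).max())  # normalization constant
D = A / s                                          # normalized direct-relation matrix
T = D @ np.linalg.inv(np.eye(len(A)) - D)          # total-relation matrix: D(I - D)^-1

R = T.sum(axis=1)  # total influence dispatched by each factor
C = T.sum(axis=0)  # total influence received by each factor
for i, (prom, rel) in enumerate(zip(R + C, R - C)):
    role = "cause" if rel > 0 else "effect"
    print(f"factor {i}: prominence={prom:.2f}, relation={rel:+.2f} ({role})")
```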

    Combined biotic and abiotic stress resistance in tomato

    Abiotic and biotic stress factors are the major constraints on the realization of crop yield potential. As climate change progresses, the spread and intensity of abiotic as well as biotic stressors are expected to increase, raising the probability that crops are exposed to both types of stress. Shielding crops from combinatorial stress requires a better understanding of the plant’s response and its genetic architecture. In this study, we evaluated resistance to salt stress, to powdery mildew and to both stresses combined in tomato, using the Solanum habrochaites LYC4 introgression line (IL) population. The IL population segregated for both salt stress tolerance and powdery mildew resistance. Using SNP array marker data, QTLs were identified for salt tolerance as well as for Na+ and Cl- accumulation. Salt stress increased the susceptibility of the population to powdery mildew in an additive manner. Phenotypic variation for disease resistance was reduced under combined stress, as indicated by the coefficient of variation. No correlation was found between disease resistance and Na+ and Cl- accumulation under combined stress. Most genetic loci were specific for either salt stress tolerance or powdery mildew resistance. These findings increase our understanding of the genetic regulation of responses to combined abiotic and biotic stress and can provide leads for breeding tomatoes and other crops that combine a high level of disease resistance with maintained performance under abiotic stress.
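    The comparison of phenotypic variation mentioned above relies on the coefficient of variation; a minimal sketch with hypothetical disease-severity scores is given below (Python).

```python
import numpy as np

# Minimal sketch with hypothetical disease-severity scores: the coefficient
# of variation (CV = standard deviation / mean) used to compare phenotypic
# variation between single and combined stress treatments.

def coefficient_of_variation(scores):
    scores = np.asarray(scores, dtype=float)
    return scores.std(ddof=1) / scores.mean()

mildew_only = [2.1, 3.8, 1.5, 4.2, 2.9, 3.3]       # hypothetical IL scores
salt_plus_mildew = [3.0, 3.4, 2.8, 3.6, 3.1, 3.3]  # hypothetical IL scores

print(coefficient_of_variation(mildew_only))       # larger CV: more variation
print(coefficient_of_variation(salt_plus_mildew))  # smaller CV: reduced variation
```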

    Resistance gene analogues identified through the NBS-profiling method map close to major genes and QTL for disease resistance in apple

    We used a new method called nucleotide-binding site (NBS) profiling to identify and map resistance gene analogues (RGAs) in apple. This method simultaneously allows the amplification and the mapping of genetic markers anchored in the conserved NBS-encoding domain of plant disease resistance genes. Ninety-four individuals belonging to an F1 progeny derived from a cross between the apple cultivars Discovery and TN10-8 were studied. Two degenerate primers designed from the highly conserved P-loop motif within the NBS domain were used together with adapter primers. Forty-three markers generated with NBS profiling could be mapped in this progeny. After sequencing, 23 markers were identified as RGAs, based on their homologies with known resistance genes or NBS/leucine-rich-repeat-like genes. Markers were mapped on 10 of the 17 linkage groups of the apple genetic map used. Most of these markers were organized in clusters. Twenty-five markers mapped close to major genes or quantitative trait loci for resistance to scab and mildew previously identified in different apple progenies. Several markers could become efficient tools for marker-assisted selection once converted into breeder-friendly markers. This study demonstrates the efficiency of the NBS-profiling method for generating RGA markers for resistance loci in apple.
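    The degenerate primers target the conserved P-loop (Walker A) motif; as a purely illustrative sketch (Python, hypothetical sequence, not the wet-lab NBS-profiling protocol), candidate resistance-gene fragments can also be screened for this motif in silico.

```python
import re

# Illustrative sketch: scan protein sequences for the conserved P-loop
# (Walker A) motif, commonly written as GxxxxGK[ST], which corresponds to
# the region that the degenerate NBS-profiling primers target.
P_LOOP = re.compile(r"G.{4}GK[ST]")

def find_p_loop(protein_seq):
    """Return (start, matched motif) pairs for every P-loop hit."""
    return [(m.start(), m.group()) for m in P_LOOP.finditer(protein_seq)]

# Hypothetical candidate resistance-gene fragment.
candidate = "MAEVLGTRGVGKTTLAQLVFND"
print(find_p_loop(candidate))  # [(5, 'GTRGVGKT')]
```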

    Possibilities and challenges of the potato genome sequence

    This paper describes the progress that has been made since the draft genome sequence of potato was obtained and the analyses that need to be done to make further progress. Although sequencing has become less expensive and read lengths have increased, making optimal use of the information obtained is still difficult, certainly in the tetraploid potato crop. Major challenges in potato genomics are standardized genome assembly and haplotype analysis. Sequencing methods need to be improved further to achieve precision breeding. With the current new-generation sequencing technology, the focus in potato breeding will shift from phenotype improvement to genotype improvement. In this respect, it is essential to realize that different alleles of the same gene can lead to different phenotypes depending on the genetic background and that there is significant epistatic interaction between different alleles. Genome-wide association studies will gain statistical power when binary single nucleotide polymorphism (SNP) data can be replaced with multi-allelic haplotype data. Binary SNPs can be distributed across the many different alleles per locus or may be haplotype-specific, potentially tagging specific alleles that clearly differ in their contribution to a certain trait value. Assembling reads from the same linkage phase proved to allow the construction of sufficiently long haplotype tracts to ensure their uniqueness. Combining large phenotyping data sets with modern approaches to sequencing and haplotype analysis and proper software will allow the efficiency of potato breeding to increase.
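    As a minimal illustration of the haplotype idea discussed above (Python, hypothetical phased calls for one tetraploid sample, not the paper's pipeline), SNP calls from the same linkage phase can be collapsed into multi-allelic haplotype strings of the kind a haplotype-based association analysis would test.

```python
from collections import Counter

# Minimal sketch with hypothetical data: collapse phased SNP calls into
# multi-allelic haplotype strings per chromosome copy of a tetraploid sample.

# Phased alleles at three linked SNPs within one locus, one entry per copy.
phased_copies = {
    "sample1_h1": ["A", "T", "G"],
    "sample1_h2": ["A", "C", "G"],
    "sample1_h3": ["G", "T", "G"],
    "sample1_h4": ["A", "T", "A"],
}

haplotypes = {copy: "".join(alleles) for copy, alleles in phased_copies.items()}
print(haplotypes)                    # e.g. {'sample1_h1': 'ATG', ...}
print(Counter(haplotypes.values()))  # multi-allelic haplotype counts at the locus
```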

    Recognition of S100 proteins by signal inhibitory receptor on leukocytes-1 negatively regulates human neutrophils

    Signal inhibitory receptor on leukocytes-1 (SIRL-1) is an inhibitory receptor with a hitherto unknown ligand and is expressed on human monocytes and neutrophils. SIRL-1 inhibits myeloid effector functions such as reactive oxygen species (ROS) production. In this study, we identify S100 proteins as SIRL-1 ligands. S100 proteins are composed of two calcium-binding domains. Various S100 proteins are damage-associated molecular patterns (DAMPs) released from damaged cells, after which they initiate inflammation by ligating activating receptors on immune cells. We now show that the inhibitory SIRL-1 recognizes individual calcium-binding domains of all tested S100 proteins. Blocking SIRL-1 on human neutrophils enhanced S100A6-induced ROS production, showing that S100A6 suppresses neutrophil ROS production via SIRL-1. Taken together, SIRL-1 is an inhibitory receptor recognizing the S100 protein family of DAMPs. This may help limit tissue damage induced by activated neutrophils.

    Reduction in potentially inappropriate end-of-life hospital care for cancer patients during the COVID-19 pandemic: A retrospective population-based study

    Background: The COVID-19 pandemic impacted cancer diagnosis and treatment. However, little is known about end-of-life cancer care during the pandemic. Aim: To investigate potentially inappropriate end-of-life hospital care for cancer patients before and during the COVID-19 pandemic. Design: Retrospective population-based cohort study using data from the Netherlands Cancer Registry and the Dutch National Hospital Care Registration. Potentially inappropriate care in the last month of life (chemotherapy administration, >1 emergency room contact, >1 hospitalization, hospitalization >14 days, intensive care unit admission or hospital death) was compared between four COVID-19 periods and corresponding periods in 2018/2019. Participants: A total of 112,919 cancer patients (⩾18 years) who died between January 2018 and May 2021 were included. Results: Fewer patients received potentially inappropriate end-of-life care during the COVID-19 pandemic compared to previous years, especially during the first COVID-19 peak (22.4% vs 26.0%). Regression analysis showed lower odds of potentially inappropriate end-of-life care during all COVID-19 periods (between OR 0.81; 95% CI 0.74–0.88 and OR 0.92; 95% CI 0.87–0.97) after adjustment for age, sex and cancer type. For the individual indicators, fewer patients experienced multiple or long hospitalizations, intensive care unit admission or hospital death during the pandemic. Conclusions: Cancer patients received less potentially inappropriate end-of-life care during the COVID-19 pandemic. Because several factors may have contributed, it is unclear whether this reflects better quality care. However, these findings raise important questions about what pandemic-induced changes in care practices can help provide appropriate end-of-life care for future patients in the context of increasing patient numbers and limited resources.
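    As a minimal sketch of the kind of regression reported above (Python with statsmodels, fully synthetic data rather than registry data), an adjusted odds ratio for a pandemic period can be obtained from a logistic regression that controls for age, sex and cancer type.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative sketch with synthetic data (not the registry data): adjusted
# odds ratio for potentially inappropriate end-of-life care during a pandemic
# period, from a logistic regression adjusted for age, sex and cancer type.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "pandemic": rng.integers(0, 2, n),  # 0 = 2018/2019, 1 = COVID-19 period
    "age": rng.normal(72, 10, n),
    "sex": rng.choice(["m", "f"], n),
    "cancer_type": rng.choice(["lung", "colon", "breast"], n),
})
# Simulate lower odds of inappropriate care in the pandemic period (true OR ~0.85).
logit = -1.0 - 0.16 * df["pandemic"] + 0.01 * (df["age"] - 72)
df["inappropriate_care"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

fit = smf.logit(
    "inappropriate_care ~ pandemic + age + C(sex) + C(cancer_type)", data=df
).fit(disp=0)
or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table.loc["pandemic"])  # adjusted OR with 95% CI, near 0.85
```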
