Factors Influencing British Adolescents’ Intake of Whole Grains: A Pilot Feasibility Study Using SenseCam Assisted Interviews
High whole grain intake is beneficial for health. However, adolescents consume little whole grain, and the reasons underpinning this are poorly understood. Using a visual, participatory method, we carried out a pilot feasibility study to elicit in-depth accounts of young people's whole grain consumption that were sensitive to their dietary, familial and social context. Furthermore, we explored barriers and suggested facilitators to whole grain intake and assessed the feasibility of using SenseCam to engage adolescents in research. Eight British adolescents (aged 11 to 16 years) wore a SenseCam device which auto-captured images every twenty seconds for three consecutive days. Participants then completed traditional 24-hour dietary recalls followed by in-depth interviews based on day three SenseCam images. Interview data were subjected to thematic analysis. Findings revealed that low adolescent whole grain intake was often due to difficulty in identifying whole grain products and their health benefits, and to poor availability in and outside of the home. The images also captured the influence of parents and online media on adolescent daily life and choices. Low motivation to consume whole grains, a common explanation for poor diet quality, was rarely mentioned. Participants proposed that adolescent whole grain consumption could be increased by raising awareness through online media, improved sensory appeal, increased availability and variety, and tailoring of products for young people. SenseCam was effective in engaging young people in dietary research and capturing data relevant to dietary choices, which is useful for future research.
Artificial Intelligence approaches in Cyber Security
The volume of data generated every second is increasing exponentially, and because this information is stored or exchanged, directly or indirectly, over the Internet, it must travel across networks; securing that transmission therefore plays a vital role in cyber security. The speed of the processes involved and the amount of data that must be analysed in defending cyberspace cannot be handled by humans without considerable automation. However, it is difficult to develop software based on conventional fixed algorithms that defends effectively against dynamically evolving malicious attacks over the network. This situation can be addressed by applying Artificial Intelligence methods, which provide the flexibility and learning capability needed to defend the network against attacks and to trace the attackers behind them. This topic mainly emphasises how a packet can be transferred from source to destination with proper security, so that the end user receives the correct data as required.
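As a concrete, if simplified, illustration of the kind of learning-based network defence described above, the following Python sketch trains an unsupervised anomaly detector on a few per-flow traffic features. The feature set, the generated data and the contamination rate are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a learning-based network anomaly detector, assuming
# per-flow features (mean packet size, duration, total bytes) have already
# been extracted. All numbers here are invented for illustration.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Simulated "normal" traffic: [mean packet size (bytes), flow duration (s), total bytes]
normal_flows = rng.normal(loc=[800, 1.0, 50_000],
                          scale=[150, 0.3, 10_000],
                          size=(1000, 3))

# Fit an unsupervised detector on traffic assumed to be benign.
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_flows)

# Score new flows; -1 marks flows the model considers anomalous.
new_flows = np.array([
    [820, 1.1, 52_000],      # resembles normal traffic
    [40, 30.0, 5_000_000],   # tiny packets, long duration, huge volume: suspicious
])
print(detector.predict(new_flows))   # e.g. [ 1 -1 ]
```

In practice such a detector would be retrained as traffic patterns drift, which is the flexibility the abstract attributes to learning-based approaches over fixed rule sets.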
Proxy Tasks and Subjective Measures Can Be Misleading in Evaluating Explainable AI Systems
Explainable artificially intelligent (XAI) systems form part of sociotechnical systems, e.g., human+AI teams tasked with making decisions. Yet, current XAI systems are rarely evaluated by measuring the performance of human+AI teams on actual decision-making tasks. We conducted two online experiments and one in-person think-aloud study to evaluate two currently common techniques for evaluating XAI systems: (1) using proxy, artificial tasks such as how well humans predict the AI's decision from the given explanations, and (2) using subjective measures of trust and preference as predictors of actual performance. The results of our experiments demonstrate that evaluations with proxy tasks did not predict the results of the evaluations with the actual decision-making tasks. Further, the subjective measures on evaluations with actual decision-making tasks did not predict the objective performance on those same tasks. Our results suggest that by employing misleading evaluation methods, our field may be inadvertently slowing its progress toward developing human+AI teams that can reliably perform better than humans or AIs alone.
Deletion of low molecular weight protein tyrosine phosphatase (Acp1) protects against stress-induced cardiomyopathy.
The low molecular weight protein tyrosine phosphatase (LMPTP), encoded by the ACP1 gene, is a ubiquitously expressed phosphatase whose in vivo function in the heart and in cardiac diseases remains unknown. To investigate the in vivo role of LMPTP in cardiac function, we generated mice with genetic inactivation of the Acp1 locus and studied their response to long-term pressure overload. Acp1(-/-) mice develop normally and ageing mice do not show pathology in major tissues under basal conditions. However, Acp1(-/-) mice are strikingly resistant to pressure overload hypertrophy and heart failure. Lmptp expression is high in the embryonic mouse heart, decreased in the postnatal stage, and increased in the adult mouse failing heart. We also show that LMPTP expression increases in end-stage heart failure in humans. Consistent with their protected phenotype, Acp1(-/-) mice subjected to pressure overload hypertrophy have attenuated fibrosis and decreased expression of fibrotic genes. Transcriptional profiling and analysis of molecular signalling show that the resistance of Acp1(-/-) mice to pathological cardiac stress correlates with marginal re-expression of fetal cardiac genes, increased insulin receptor beta phosphorylation, as well as PKA and ephrin receptor expression, and inactivation of the CaMKIIδ pathway. Our data show that ablation of Lmptp inhibits pathological cardiac remodelling and suggest that inhibition of LMPTP may be of therapeutic relevance for the treatment of human heart failure
6-PACK programme to decrease fall injuries in acute hospitals: Cluster randomised controlled trial
Objective: To evaluate the effect of the 6-PACK programme on falls and fall injuries in acute wards. Design: Cluster randomised controlled trial. Setting: Six Australian hospitals. Participants: All patients admitted to 24 acute wards during the trial period. Interventions: Participating wards were randomly assigned to receive either the nurse led 6-PACK programme or usual care over 12 months. The 6-PACK programme included a fall risk tool and individualised use of one or more of six interventions: “falls alert” sign, supervision of patients in the bathroom, ensuring patients’ walking aids are within reach, a toileting regimen, use of a low-low bed, and use of a bed/chair alarm. Main outcome measures: The co-primary outcomes were falls and fall injuries per 1000 occupied bed days. Results: During the trial, 46 245 admissions to 16 medical and eight surgical wards occurred. As many people were admitted more than once, this represented 31 411 individual patients. Patients’ characteristics and length of stay were similar for intervention and control wards. Use of 6-PACK programme components was higher on intervention wards than on control wards (incidence rate ratio 3.05, 95% confidence interval 2.14 to 4.34; P<0.001). In all, 1831 falls and 613 fall injuries occurred, and the rates of falls (incidence rate ratio 1.04, 0.78 to 1.37; P=0.796) and fall injuries (0.96, 0.72 to 1.27; P=0.766) were similar in intervention and control wards. Conclusions: Positive changes in falls prevention practice occurred following the introduction of the 6-PACK programme. However, no difference was seen in falls or fall injuries between groups. High quality evidence showing the effectiveness of falls prevention interventions in acute wards remains absent. Novel solutions to the problem of in-hospital falls are urgently needed
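For readers unfamiliar with rates expressed per 1000 occupied bed days, the Python sketch below shows how an incidence rate ratio of this kind could be estimated with a Poisson model and an exposure offset. The ward-level counts are invented, and the trial's actual analysis additionally accounted for clustering of patients within wards.

```python
# Sketch: estimating an incidence rate ratio (falls per occupied bed day) with
# a Poisson GLM and an exposure offset. Numbers are invented; the trial itself
# also adjusted for clustering within wards, which this sketch does not.
import numpy as np
import statsmodels.api as sm

# One row per ward: fall count, occupied bed days, intervention indicator.
falls    = np.array([40, 35, 50, 42, 38, 47, 41, 36])
bed_days = np.array([9000, 8500, 9800, 9100, 8800, 9500, 9200, 8700])
treated  = np.array([1, 1, 1, 1, 0, 0, 0, 0])

X = sm.add_constant(treated)
model = sm.GLM(falls, X, family=sm.families.Poisson(), exposure=bed_days).fit()

irr = np.exp(model.params[1])          # incidence rate ratio, intervention vs control
ci  = np.exp(model.conf_int()[1])      # 95% CI on the IRR
print(f"IRR = {irr:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")
```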
Conversion from calcineurin inhibitor to belatacept-based maintenance immunosuppression in renal transplant recipients: A randomized phase 3b trial
Significance Statement: This randomized trial demonstrates the safety and efficacy of conversion from calcineurin inhibitor (CNI)- to belatacept-based maintenance immunosuppression in renal transplant recipients 6-60 months post-transplant. Patients converted to belatacept showed sustained improvement in renal function associated with an acceptable safety profile consistent with prior experience and a smaller treatment difference in acute rejection postconversion compared with that observed in earlier studies in de novo renal allograft recipients. These results favor the use of belatacept as an alternative to continued long-term CNI-based maintenance immunosuppression, which is particularly relevant for CNI-intolerant patients, including those who experience nephrotoxicity. These data help inform clinical practice guidelines regarding the conversion of such patients to an alternative immunosuppressive drug regimen. Background: Calcineurin inhibitors (CNIs) are standard of care after kidney transplantation, but they are associated with nephrotoxicity and reduced long-term graft survival. Belatacept, a selective T cell costimulation blocker, is approved for the prophylaxis of kidney transplant rejection. This phase 3 trial evaluated the efficacy and safety of conversion from CNI-based to belatacept-based maintenance immunosuppression in kidney transplant recipients. Methods: Stable adult kidney transplant recipients 6-60 months post-transplantation under CNI-based immunosuppression were randomized (1:1) to switch to belatacept or continue treatment with their established CNI. The primary end point was the percentage of patients surviving with a functioning graft at 24 months. Results: Overall, 446 renal transplant recipients were randomized to belatacept conversion (n=223) or CNI continuation (n=223). The 24-month rates of survival with graft function were 98% and 97% in the belatacept and CNI groups, respectively (adjusted difference, 0.8; 95.1% CI, -2.1 to 3.7). In the belatacept conversion versus CNI continuation groups, 8% versus 4% of patients experienced biopsy-proven acute rejection (BPAR), respectively, and 1% versus 7% developed de novo donor-specific antibodies (dnDSAs), respectively. The 24-month eGFR was higher with belatacept (55.5 versus 48.5 ml/min per 1.73 m² with CNI). Both groups had similar rates of serious adverse events, infections, and discontinuations, with no unexpected adverse events. One patient in the belatacept group had post-transplant lymphoproliferative disorder. Conclusions: Switching stable renal transplant recipients from CNI-based to belatacept-based immunosuppression was associated with a similar rate of death or graft loss, improved renal function, and a numerically higher BPAR rate but a lower incidence of dnDSA. Clinical Trial registry name and registration number: A Study in Maintenance Kidney Transplant Recipients Following Conversion to Nulojix® (Belatacept)-Based, NCT01820572.
Mortality Prediction after the First Year of Kidney Transplantation: An Observational Study on Two European Cohorts.
After the first year post transplantation, prognostic mortality scores in kidney transplant recipients can be useful for personalizing medical management. We developed a new prognostic score based on 5 parameters and computable at 1 year post transplantation. The outcome was the time between the first anniversary of the transplantation and the patient's death with a functioning graft. Afterwards, we appraised the prognostic capacities of this score by estimating time-dependent Receiver Operating Characteristic (ROC) curves from two prospective and multicentric European cohorts: the DIVAT (Données Informatisées et VAlidées en Transplantation) cohort, composed of patients transplanted between 2000 and 2012 in 6 French centers, and the STCS (Swiss Transplant Cohort Study) cohort, composed of patients transplanted between 2008 and 2012 in 6 Swiss centers. We also compared the results with those of two existing scoring systems: one from Spain (Hernandez et al.) and one from the United States (the Recipient Risk Score, RRS, Baskin-Bey et al.). In the DIVAT validation cohort, for a prognostic time of 10 years, the new prognostic score (AUC = 0.78, 95% CI = [0.69, 0.85]) seemed to present significantly higher prognostic capacities than the scoring system proposed by Hernandez et al. (p = 0.04) and tended to perform better than the initial RRS (p = 0.10). In the Swiss cohort, the RRS and the new prognostic score had comparable prognostic capacities at 4 years (AUC = 0.77 and 0.76 respectively, p = 0.31). In addition to the currently available scores related to the risk of return to dialysis, we recommend further study of the score we propose, or of the RRS, for a more efficient personalized follow-up of kidney transplant recipients.
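As a rough illustration of what a time-dependent AUC at a fixed horizon measures, here is a simplified Python sketch. It ignores censoring before the horizon, which the time-dependent ROC estimators used in the study handle properly, and all data are simulated rather than drawn from the DIVAT or STCS cohorts.

```python
# Simplified sketch of a time-dependent AUC at a fixed horizon (e.g. 10 years
# after the first transplant anniversary) for a prognostic risk score. For
# clarity it drops censoring; proper time-dependent ROC estimators do not.
import numpy as np

def auc_at_horizon(score, time, event, horizon):
    """Probability that a patient dying before `horizon` has a higher score
    than one still alive at `horizon` (higher score = higher predicted risk)."""
    case    = (time <= horizon) & (event == 1)   # died before the horizon
    control = time > horizon                     # known alive at the horizon
    cases, controls = score[case], score[control]
    if len(cases) == 0 or len(controls) == 0:
        raise ValueError("need both cases and controls at this horizon")
    # Concordance: fraction of (case, control) pairs ranked correctly.
    wins = (cases[:, None] > controls[None, :]).sum()
    ties = (cases[:, None] == controls[None, :]).sum()
    return (wins + 0.5 * ties) / (len(cases) * len(controls))

# Toy data: risk score, follow-up time (years), death indicator.
rng = np.random.default_rng(1)
score = rng.normal(size=200)
time  = rng.exponential(scale=15 / np.exp(score))   # higher score -> shorter survival
event = np.ones(200, dtype=int)
print(f"AUC at 10 years: {auc_at_horizon(score, time, event, horizon=10.0):.2f}")
```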
Sequencing and Characterisation of Complete Mitogenome DNA for Rasbora hobelmani (Cyprinidae) with Phylogenetic Consideration
The Kottelat rasbora Rasbora hobelmani is a small ray-finned fish categorized under the genus Rasbora in the Cyprinidae family. In this study, the complete mitogenome sequence of R. hobelmani was sequenced using two pairs of primers covering overlapping regions. The mitogenome is 16,541 bp in length, housing 22 transfer RNA genes, 13 protein-coding genes, two ribosomal RNA genes and one putative control region. Gene organisation is identical between this species and other members of the genus Rasbora. The heavy strand contains 28 genes while the light strand contains the remaining nine genes. Most protein-coding genes employ ATG as the start codon, except for the COI gene, which uses GTG instead. The central conserved sequence blocks (CSB-F, CSB-E and CSB-D), variable sequence blocks (CSB-3, CSB-2 and CSB-1) as well as the terminal associated sequence (TAS) are conserved in the control region. The maximum likelihood phylogenetic tree revealed the close phylogeny of R. hobelmani with R. sumatrana, R. aprotaenia, R. lateristriata and R. steineri, with bootstrap values of at least 99%. This work serves as a milestone towards future evolutionary and population genetics studies of this species as well as of the genus Rasbora.
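To make concrete how the reported start-codon usage (ATG for most protein-coding genes, GTG for COI) can be checked against an annotated mitogenome, here is a minimal Biopython sketch. The GenBank filename is a placeholder, and the sketch assumes the annotation carries standard CDS features with gene qualifiers.

```python
# Sketch: listing the start codon of each protein-coding gene in an annotated
# mitogenome, as reported in the abstract (ATG for most genes, GTG for COI).
# Requires Biopython; "R_hobelmani_mitogenome.gb" is a hypothetical filename.
from Bio import SeqIO

record = SeqIO.read("R_hobelmani_mitogenome.gb", "genbank")

for feature in record.features:
    if feature.type != "CDS":
        continue
    gene = feature.qualifiers.get("gene", ["?"])[0]
    cds  = feature.extract(record.seq)   # strand-aware extraction of the coding sequence
    print(f"{gene:6s} start codon: {cds[:3]}")
```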
Detection of astrocytic tau pathology facilitates recognition of chronic traumatic encephalopathy neuropathologic change
Traumatic brain injury (TBI) is associated with the development of a range of neurodegenerative pathologies, including chronic traumatic encephalopathy (CTE). Current consensus diagnostic criteria define the pathognomonic cortical lesion of CTE neuropathologic change (CTE-NC) as a patchy deposition of hyperphosphorylated tau in neurons, with or without glial tau in thorn-shaped astrocytes, typically towards the depths of sulci and clustered around small blood vessels. Nevertheless, although incorporated into consensus diagnostic criteria, the contribution of the individual cellular components to identification of CTE-NC has not been formally evaluated. To address this, from the Glasgow TBI Archive, cortical tissue blocks were selected from consecutive brain donations from contact sports athletes in which there was known to be either CTE-NC (n = 12) or Alzheimer’s disease neuropathologic change (n = 4). From these tissue blocks, adjacent tissue sections were stained for tau antibodies selected to reveal either solely neuronal pathology (3R tau; GT-38) or mixed neuronal and astroglial pathologies (4R tau; PHF-1). These stained sections were then randomised and independently assessed by a panel of expert neuropathologists, blind to patient clinical history and primary antibody applied to each section, who were asked to record whether CTE-NC was present. Results demonstrate that, in sections stained for either 4R tau or PHF-1, consensus recognition of CTE-NC was high. In contrast, recognition of CTE-NC in sections stained for 3R tau or GT-38 was poor; in the former no better than chance. Our observations demonstrate that the presence of both neuronal and astroglial tau pathologies facilitates detection of CTE-NC, with its detection less consistent when neuronal tau pathology alone is visible. The combination of both glial and neuronal pathologies, therefore, may be required for detection of CTE-NC
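Since the panel assessment hinges on how consistently blinded raters identify CTE-NC across staining conditions, a chance-corrected agreement statistic is one natural way to summarise such data. The Python sketch below illustrates this with Fleiss' kappa on invented ratings; the study itself reports consensus recognition and may have used a different summary, so this is purely illustrative.

```python
# Sketch: chance-corrected agreement among a panel of raters who each record
# whether CTE-NC is present (1) or absent (0) in a stained section.
# Ratings are invented and do not reproduce the study's data.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Rows = tissue sections, columns = neuropathologists; 1 = CTE-NC identified.
ratings = np.array([
    [1, 1, 1, 1],   # e.g. 4R tau / PHF-1 sections: near-unanimous recognition
    [1, 1, 1, 1],
    [1, 1, 0, 1],
    [0, 1, 0, 1],   # e.g. 3R tau / GT-38 sections: recognition close to chance
    [1, 0, 0, 1],
    [0, 0, 1, 0],
])

table, _ = aggregate_raters(ratings)   # counts of raters per category per section
print(f"Fleiss' kappa = {fleiss_kappa(table):.2f}")
```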