183 research outputs found
Health-related quality of life as a predictor of pediatric healthcare costs: A two-year prospective cohort analysis
BACKGROUND: The objective of this study was to test the primary hypothesis that parent proxy-report of pediatric health-related quality of life (HRQL) would prospectively predict pediatric healthcare costs over a two-year period. An exploratory hypothesis anticipated that a relatively small group of children would account for a disproportionately large percentage of healthcare costs. METHODS: 317 children (157 girls) ages 2 to 18 years, members of a managed care health plan with prospective payment, participated in a two-year prospective longitudinal study. At Time 1, parents reported child HRQL using the Pediatric Quality of Life Inventory™ (PedsQL™ 4.0) Generic Core Scales, and chronic health condition status. Costs, based on health plan utilization claims and encounters, were derived for 6, 12, and 24 months. RESULTS: In multiple linear regression equations, Time 1 parent proxy-reported HRQL prospectively accounted for significant variance in healthcare costs at 6, 12, and 24 months. Adjusted regression models that included both HRQL scores and chronic health condition status accounted for 10.1%, 14.4%, and 21.2% of the variance in healthcare costs at 6, 12, and 24 months, respectively. Parent proxy-reported HRQL and chronic health condition status together defined a 'high-risk' group, constituting 8.7% of the sample and accounting for 37.4%, 59.2%, and 62.0% of healthcare costs at 6, 12, and 24 months. The high-risk group's per-member-per-month healthcare costs were, on average, 12 times those of other enrollees at 24 months. CONCLUSIONS: While these findings should be further tested in a larger sample, our data suggest that parent proxy-reported HRQL can be used to prospectively predict healthcare costs. When combined with chronic health condition status, parent proxy-reported HRQL can identify an at-risk group of children as candidates for proactive care coordination.
Spatial tethering of kinases to their substrates relaxes evolutionary constraints on specificity
Signal transduction proteins are often multi-domain proteins that arose through the fusion of previously independent proteins. How such a change in the spatial arrangement of proteins impacts their evolution and the selective pressures acting on individual residues is largely unknown. We explored this problem in the context of bacterial two-component signalling pathways, which typically involve a sensor histidine kinase that specifically phosphorylates a single cognate response regulator. Although usually found as separate proteins, these proteins are sometimes fused into a so-called hybrid histidine kinase. Here, we demonstrate that the isolated kinase domains of hybrid kinases exhibit a dramatic reduction in phosphotransfer specificity in vitro relative to canonical histidine kinases. However, hybrid kinases phosphotransfer almost exclusively to their covalently attached response regulator domain, whose effective concentration exceeds that of all soluble response regulators. These findings indicate that the fused response regulator in a hybrid kinase normally prevents detrimental cross-talk between pathways. More generally, our results shed light on how the spatial properties of signalling pathways can significantly affect their evolution, with additional implications for the design of synthetic signalling systems. Funding: National Science Foundation (U.S.) CAREER Award; National Science Foundation (U.S.) Graduate Research Fellowship Program.
Brain Cells in the Avian ‘Prefrontal Cortex’ Code for Features of Slot-Machine-Like Gambling
Slot machines are the most common and addictive form of gambling. In the current study, we recorded from single neurons in the ‘prefrontal cortex’ of pigeons while they played a slot-machine-like task. We identified four categories of neurons that coded for different aspects of our slot-machine-like task. Reward-Proximity neurons showed a linear increase in activity as the opportunity for a reward drew near. I-Won neurons fired only when the fourth stimulus of a winning (four-of-a-kind) combination was displayed. I-Lost neurons changed their firing rate at the presentation of the first nonidentical stimulus, that is, when it was apparent that no reward was forthcoming. Finally, Near-Miss neurons also changed their activity the moment it was recognized that a reward was no longer available, but more importantly, the activity level was related to whether the trial contained one, two, or three identical stimuli prior to the display of the nonidentical stimulus. These findings not only add to recent neurophysiological research employing simulated gambling paradigms, but also add to research addressing the functional correspondence between the avian nidopallium caudolaterale (NCL) and the primate prefrontal cortex (PFC).
Optimization of Federated Learning's Client Selection for Non-IID Data Based on Grey Relational Analysis
Federated learning (FL) is a novel distributed learning framework designed
for applications with privacy-sensitive data. Without sharing data, FL trains
local models on individual devices and constructs the global model on the
server by performing model aggregation. However, to reduce the communication
cost, the participants in each training round are randomly selected, which
significantly decreases the training efficiency under data and device
heterogeneity. To address this issue, in this paper, we introduce a novel
approach that considers the data distribution and computational resources of
devices to select the clients for each training round. Our proposed method
performs client selection based on the Grey Relational Analysis (GRA) theory by
considering available computational resources for each client, the training
loss, and weight divergence. To examine the usability of our proposed method,
we implement our contribution on Amazon Web Services (AWS) by using the
TensorFlow library of Python. We evaluate our algorithm's performance in
different setups by varying the learning rate, network size, the number of
selected clients, and the client selection round. The evaluation results show
that our proposed algorithm significantly improves test accuracy and the
clients' average waiting time compared to the state-of-the-art methods
federated averaging (FedAvg) and Pow-d.
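The abstract above scores clients by Grey Relational Analysis over available computational resources, training loss, and weight divergence. Below is a minimal sketch of GRA ranking, assuming min-max normalization, an all-ones ideal reference sequence, and the conventional distinguishing coefficient ρ = 0.5; which criteria count as "benefit" versus "cost" is also an assumption here, not the paper's exact design:

```python
import numpy as np

def grey_relational_grades(criteria, benefit, rho=0.5):
    """Rank alternatives (rows) by Grey Relational Analysis.

    criteria : (n_clients, n_criteria) raw scores
    benefit  : boolean mask, True where larger is better
    rho      : distinguishing coefficient, conventionally 0.5
    """
    x = criteria.astype(float).copy()
    # Min-max normalize each criterion to [0, 1]; flip cost criteria
    # so that 1 is always best.
    span = x.max(axis=0) - x.min(axis=0)
    span[span == 0] = 1.0
    x = (x - x.min(axis=0)) / span
    x[:, ~benefit] = 1.0 - x[:, ~benefit]
    # Deviation from the ideal reference sequence (all ones).
    delta = np.abs(1.0 - x)
    dmin, dmax = delta.min(), delta.max()
    # Grey relational coefficients; the grade is their mean per client.
    coeff = (dmin + rho * dmax) / (delta + rho * dmax)
    return coeff.mean(axis=1)

# Toy example: 4 clients scored on (compute, loss, weight divergence);
# here only compute is treated as a benefit criterion.
scores = np.array([
    [8.0, 0.9, 0.30],
    [2.0, 0.4, 0.10],
    [6.0, 0.5, 0.15],
    [4.0, 0.8, 0.25],
])
grades = grey_relational_grades(scores, benefit=np.array([True, False, False]))
selected = np.argsort(grades)[::-1][:2]  # pick the top-2 clients
```

Whether high training loss or high weight divergence should be rewarded or penalized is a design choice; some client-selection schemes deliberately prioritize poorly fit clients, so the mask above may need inverting to match the paper's intent.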
Toward an Ontology of Collaborative Learning Healthcare Systems
Objective: To establish a basis for a domain ontology - a formal, explicit specification of a shared conceptualization - of collaborative learning healthcare systems (CLHSs) in order to facilitate measurement, explanation, and improvement. Methods: We adapted the "Methontology" approach to begin building an ontology of CLHSs. We specified the purpose of an ontology, acquired domain knowledge via literature review, conceptualized a common framework of CLHSs using a grounded approach, refined these concepts based on expert panel input, and illustrated concept application via four cases. Results: The set of concepts identified as important to include in an ontology includes goals, values, structure, actors, environment, and products. To establish this set of concepts, we gathered input from content experts in two ways. First, expert panel methods were used to elicit feedback on these concepts and to test the elicitation of terms for the vocabulary of the Values concept. Second, from these discussions we developed a mapping exercise to test the intuitiveness of the concepts, requesting that network leaders from four CLHSs complete a mapping exercise to associate characteristics of their networks with the high-level concepts, building the vocabulary for each concept in a grounded fashion. We also solicited feedback from these participants on the experience of completing the mapping exercise, finding that the exercise is acceptable and could aid in CLHS development and collaboration. Respondents identified opportunities to improve the operational definitions of each concept to ensure that corresponding vocabularies are distinct and non-overlapping. Discussion: Our results provide a foundation for developing a formal, explicit shared conceptualization of CLHSs. Once developed, such a tool can be useful for measurement, explanation, and improvement. Further work, including alignment to a top-level ontology, expanding the vocabulary, and defining relations between vocabulary terms, is required to formally build out an ontology for these uses.
Dynamic predictive probabilities to monitor rapid cystic fibrosis disease progression.
Cystic fibrosis (CF) is a progressive, genetic disease characterized by frequent, prolonged drops in lung function. Accurately predicting rapid underlying lung-function decline is essential for clinical decision support and timely intervention. Determining whether an individual is experiencing a period of rapid decline is complicated by its heterogeneous timing and extent and by the error component of measured lung function. We construct individualized predictive probabilities for "nowcasting" rapid decline. We assume each patient's true longitudinal lung function, S(t), follows a nonlinear, nonstationary stochastic process, and accommodate between-patient heterogeneity through random effects. The corresponding lung-function decline at time t is defined as the rate of change, S'(t). We predict S'(t) conditional on observed covariate and measurement history by modeling measured lung function as a noisy version of S(t). The method is applied to data on 30,879 US CF Registry patients, and results are contrasted with a currently employed decision rule using single-center data on 212 individuals. Rapid decline is identified earlier using predictive probabilities than with the center's currently employed decision rule (mean difference: 0.65 years; 95% confidence interval (CI): 0.41, 0.89). We constructed a bootstrapping algorithm to obtain CIs for the predictive probabilities and illustrate real-time implementation with R Shiny. Predictive accuracy is investigated using empirical simulations, which suggest this approach detects peak decline more accurately than a uniform threshold of rapid decline: median area under the ROC curve estimates (Q1-Q3) were 0.817 (0.814-0.822) and 0.745 (0.741-0.747), respectively, implying reasonable accuracy for both. This article demonstrates how individualized rate-of-change estimates can be coupled with probabilistic predictive inference and implementation for a useful medical-monitoring approach.
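As a toy illustration of "nowcasting" a decline probability, one can approximate the posterior of the rate of change S'(t) by a local linear fit over recent measurements and report P(S'(t) < threshold). The −1.5 units-per-year threshold, the window-based slope, and the Gaussian approximation below are all illustrative assumptions, not the paper's nonstationary stochastic-process model:

```python
import numpy as np
from math import erf, sqrt

def _phi(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def nowcast_rapid_decline(times, values, threshold=-1.5):
    """P(S'(t) < threshold): the slope of a local linear fit over a
    recent window stands in for S'(t), with a Wald-type uncertainty."""
    X = np.column_stack([np.ones_like(times), times])
    beta, res, *_ = np.linalg.lstsq(X, values, rcond=None)
    sigma2 = res[0] / (len(times) - 2)          # residual variance
    slope_sd = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    if slope_sd == 0:
        return float(beta[1] < threshold)
    return _phi((threshold - beta[1]) / slope_sd)

t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])                  # years
noise = np.array([0.2, -0.1, 0.1, -0.2, 0.0])            # fixed "measurement error"
declining = 90 - 3.0 * t + noise                         # fast decline
stable = 90 - 0.2 * t + noise                            # slow decline
p_hi = nowcast_rapid_decline(t, declining)
p_lo = nowcast_rapid_decline(t, stable)
```

The sketch deliberately collapses the paper's random-effects machinery into a single-patient window fit; it only conveys how a rate-of-change posterior turns into a predictive probability.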
Special Libraries, May-June 1932
Volume 23, Issue 5
Steady state free radical budgets and ozone photochemistry during TOPSE
A steady state model, constrained by a number of measured quantities, was used to derive peroxy radical levels for the conditions of the Tropospheric Ozone Production about the Spring Equinox (TOPSE) campaign. The analysis uses data collected aboard the NCAR/NSF C-130 aircraft from February through May 2000, at latitudes from 40° to 85°N and at altitudes from the surface to 7.6 km. HO2 + RO2 radical concentrations measured during the experiment are compared with model results over the domain of the study and show good agreement on average: average measurement/model ratios are 1.04 (σ = 0.73) and 0.96 (σ = 0.52) for the mid-latitude band (MLB) and high-latitude band (HLB), respectively. Budgets of total peroxy radical levels, as well as of individual free radical members, were constructed and reveal interesting differences compared to studies at lower latitudes. The midlatitude part of the study region is a significant net source of ozone, while the high latitudes constitute a small net sink, leading to the hypothesis that transport from the middle latitudes can explain the observed increase in ozone in the high latitudes. Radical reservoir species concentrations are modeled and compared with the observations. For most conditions, the model reproduces the formaldehyde observations well, but the peroxide observations are significantly below steady state in this study. Photostationary state (PSS) derived total peroxy radical levels and NO/NO2 ratios are compared with the measurements and the model; PSS-derived results are higher than the observations or the steady state model at low NO concentrations.
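The PSS-derived peroxy radical levels mentioned above follow from assuming the NO/NO2 photostationary state, in which NO2 photolysis balances NO oxidation by O3 and by peroxy radicals. A back-of-envelope version is sketched below; the nominal 298 K rate constants, the j(NO2) value, and the illustrative mixing ratios are assumptions, not campaign values (the campaign would use temperature- and pressure-dependent kinetics):

```python
def pss_peroxy(no, no2, o3, j_no2, k_o3_no=1.9e-14, k_po2_no=8.0e-12):
    """Total peroxy radicals [HO2 + RO2] (molec cm^-3) implied by the
    photostationary state  j(NO2)[NO2] = k1[O3][NO] + k2[PO2][NO],
    solved for [PO2]. Rate constants in cm^3 molec^-1 s^-1."""
    return (j_no2 * no2 / no - k_o3_no * o3) / k_po2_no

# Illustrative surface-level values: 0.1 ppb NO, 0.3 ppb NO2, 40 ppb O3
# converted to molec cm^-3 at an air density of ~2.5e19 cm^-3,
# with j(NO2) = 8e-3 s^-1.
po2 = pss_peroxy(no=2.5e9, no2=7.5e9, o3=1.0e12, j_no2=8.0e-3)
```

Because the O3 term nearly cancels the photolysis term at low NO, small measurement errors inflate the derived [PO2] there, which is consistent with the abstract's note that PSS-derived results exceed observations at low NO.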
OpenCog Hyperon: A Framework for AGI at the Human Level and Beyond
An introduction to the OpenCog Hyperon framework for Artificial General
Intelligence is presented. Hyperon is a new, mostly from-the-ground-up
rewrite/redesign of the OpenCog AGI framework, based on similar conceptual and
cognitive principles to the previous OpenCog version, but incorporating a
variety of new ideas at the mathematical, software architecture and
AI-algorithm level. This review lightly summarizes: 1) some of the history
behind OpenCog and Hyperon, 2) the core structures and processes underlying
Hyperon as a software system, 3) the integration of this software system with
the SingularityNET ecosystem's decentralized infrastructure, 4) the cognitive
model(s) being experimentally pursued within Hyperon on the hopeful path to
advanced AGI, 5) the prospects seen for advanced aspects like reflective
self-modification and self-improvement of the codebase, 6) the tentative
development roadmap and various challenges expected to be faced, 7) the
thinking of the Hyperon team regarding how to guide this sort of work in a
beneficial direction ... and gives links and references for readers who wish to
delve further into any of these aspects
Problematic substance use and its associated factors among street youth in Bahir Dar city, Ethiopia
Background: Problematic substance use is becoming a common problem in marginalized groups such as street youth. However, there is a dearth of studies on the prevalence and factors associated with problematic substance use among street youth in Ethiopia. Objective: The objective of this study was to determine the prevalence of problematic substance use and identify its associated factors among street youth. Methods: This community-based cross-sectional study was conducted between June and July 2020. A total of 252 participants were included, recruited via systematic random sampling. The Cut down, Annoyed, Guilty feeling, and Eye-opener questionnaire adapted to include drugs (CAGE-AID) was used to assess problematic substance use. The data were entered into EpiData and exported to SPSS version 25 for analysis. Logistic regression with a 95% confidence interval (CI) was used to show the strength of association, and a p-value < 0.05 was considered statistically significant. Results: The prevalence of problematic substance use was 55.8% (95% CI: 49-63%). Peer pressure [adjusted odds ratio (AOR) = 3.01, 95% CI: 1.38, 6.59], family conflict [AOR = 5.05, 95% CI: 1.67, 15.25], physical abuse [AOR = 2.56, 95% CI: 1.11, 5.84], and substance use in the family [AOR = 2.85, 95% CI: 1.29, 6.27] were significantly associated with problematic substance use. Conclusion: The prevalence of problematic substance use was high, and peer pressure, family conflict, substance use in the family, and physical abuse were the associated factors. Therefore, proper screening and intervention for individuals with problematic substance use are needed, and further research should be conducted among marginalized groups.
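The adjusted odds ratios above come from multivariable logistic regression. As a self-contained sketch of how AORs and 95% Wald CIs fall out of fitted coefficients, the following fits a logistic model by Newton-Raphson on simulated data; the single binary predictor and its effect size are invented for illustration and have no connection to the study's data:

```python
import numpy as np

def adjusted_odds_ratios(X, y, n_iter=25):
    """AORs with 95% Wald CIs from a logistic regression fitted by
    Newton-Raphson. X excludes the intercept (added here); returns
    one (AOR, lo, hi) triple per predictor column."""
    Xd = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(Xd.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xd @ beta))
        W = p * (1.0 - p)
        H = Xd.T @ (Xd * W[:, None])           # Fisher information
        beta += np.linalg.solve(H, Xd.T @ (y - p))
    se = np.sqrt(np.diag(np.linalg.inv(H)))    # Wald standard errors
    lo = np.exp(beta - 1.96 * se)
    hi = np.exp(beta + 1.96 * se)
    return [(np.exp(b), l, h) for b, l, h in zip(beta[1:], lo[1:], hi[1:])]

# Simulated data: binary exposure with true log-odds effect 1.1,
# i.e. a true odds ratio of about 3.0.
rng = np.random.default_rng(0)
n = 2000
x = rng.integers(0, 2, n).astype(float)
p_true = 1.0 / (1.0 + np.exp(-(-0.5 + 1.1 * x)))
y = (rng.random(n) < p_true).astype(float)
aor, lo, hi = adjusted_odds_ratios(x[:, None], y)[0]
```

In practice SPSS handles this fitting; the sketch only shows why an AOR with its CI is exp of a coefficient and its Wald interval.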