Inference as growth: Peirce’s ecstatic logic of illation
For Peirce, logic is essentially illative, a relation of inferential growth. It follows that inference and argumentation are essentially ecstatic, an asymmetrical, ampliative movement from antecedent to consequent. It also follows that logic is inherently inductive. While deduction remains an essential and irreplaceable aspect of logic, it should be seen as a more abstract expression of the illative, semiological essence of inference as such.
Hyper-arousal decreases human visual thresholds
Arousal has long been known to influence behavior and serves as an underlying component of cognition and consciousness. However, the consequences of hyper-arousal for visual perception remain unclear. The present study evaluates the impact of hyper-arousal on two aspects of visual sensitivity: visual stereoacuity and contrast thresholds. Sixty-eight participants took part in two experiments. Thirty-four participants were randomly divided into two groups in each experiment: Arousal Stimulation or Sham Control. The Arousal Stimulation group underwent a 50-second cold pressor stimulation (immersing the foot in 0–2 °C water), a technique known to increase arousal. In contrast, the Sham Control group immersed their foot in room-temperature water. Stereoacuity thresholds (Experiment 1) and contrast thresholds (Experiment 2) were measured before and after stimulation. The Arousal Stimulation groups demonstrated significantly lower stereoacuity and contrast thresholds following cold pressor stimulation, whereas the Sham Control groups showed no difference in thresholds. These results provide the first evidence that hyper-arousal from sensory stimulation can lower visual thresholds. Hyper-arousal's ability to decrease visual thresholds has important implications for survival, sports, and everyday life.
Country-level cost-effectiveness thresholds: initial estimates and the need for further research
Objectives: Cost-effectiveness analysis (CEA) can guide policymakers in resource allocation decisions. CEA assesses whether the health gains offered by an intervention are large enough relative to any additional costs to warrant adoption. Where there are constraints on the healthcare system's budget or ability to increase expenditures, additional costs imposed by interventions have an 'opportunity cost' in terms of the health foregone as other interventions cannot be provided. Cost-effectiveness thresholds (CETs) are typically used to assess whether an intervention is worthwhile and should reflect health opportunity cost. However, CETs used by some decision makers - such as the World Health Organization (WHO) suggested CETs of 1-3 times gross domestic product per capita (GDP pc) - do not. This study estimates CETs based on opportunity cost for a wide range of countries. Methods: We estimate CETs based upon recent empirical estimates of opportunity cost (from the English NHS), estimates of the relationship between country GDP pc and the value of a statistical life, and a series of explicit assumptions. Results: CETs for Malawi (the lowest-income country in the world), Cambodia (borderline low/low-middle income), El Salvador (borderline low-middle/upper-middle) and Kazakhstan (borderline high-middle/high) are estimated to be 44-518 (4-51%), 4,485-8,018 (32-59%), respectively. Conclusions: To date, opportunity-cost-based CETs for low/middle-income countries have not been available. Although uncertainty exists in the underlying assumptions, these estimates can provide a useful input to inform resource allocation decisions and suggest that routinely used CETs have been too high.
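The decision rule the abstract describes can be sketched numerically: an intervention's incremental cost-effectiveness ratio (ICER) is compared against a CET, and the choice of CET (WHO's 1-3x GDP pc band versus a lower opportunity-cost-based fraction of GDP pc) can flip the adoption decision. All figures below are hypothetical illustrations, not estimates from the study.

```python
# Sketch of CEA threshold comparison. All numbers are hypothetical.

def icer(delta_cost, delta_health):
    """Incremental cost per unit of health gained (e.g. per DALY averted)."""
    return delta_cost / delta_health

def who_cet_range(gdp_per_capita):
    """WHO-suggested CET band of 1-3x GDP per capita."""
    return (1 * gdp_per_capita, 3 * gdp_per_capita)

# Hypothetical country and intervention
gdp_pc = 1_200                                       # GDP per capita (USD)
ratio = icer(delta_cost=50_000, delta_health=100)    # $500 per DALY averted

low, high = who_cet_range(gdp_pc)
print(f"ICER: ${ratio:.0f} per DALY averted")
print(f"WHO CET band: ${low}-${high}")
print("Cost-effective under WHO band:", ratio <= high)

# An opportunity-cost-based CET is typically a fraction of GDP pc
# (the study reports ranges such as 4-51% of GDP pc); 30% is assumed here.
opp_cost_cet = 0.30 * gdp_pc
print("Cost-effective under opportunity-cost CET:", ratio <= opp_cost_cet)
```

With these assumed numbers the intervention passes the WHO band but fails the opportunity-cost-based threshold, which is the pattern behind the abstract's conclusion that routinely used CETs have been too high.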
Comparison of mortality with home hemodialysis and center hemodialysis: A national study
We sought to determine whether lower mortality rates reported with hemodialysis (HD) at home compared to hemodialysis in dialysis centers (center HD) could be explained by patient selection. Data are from the United States Renal Data System (USRDS) Special Study of Case Mix Severity, a random national sample of 4,892 patients who started renal replacement therapy in 1986 to 1987. Intent-to-treat analyses compared mortality between home HD (N = 70) and center HD patients (N = 3,102) using the Cox proportional hazards model. Home HD patients were younger and had a lower frequency of comorbid conditions. The unadjusted relative risk (RR) of death for home HD patients compared to center HD was 0.37 (P < 0.001). The RR adjusted for age, sex, race, and diabetes was 44% lower in home HD patients (RR = 0.56, P = 0.02). When additionally adjusted for comorbid conditions, this RR increased marginally (RR = 0.58, P = 0.03). A different analysis using national USRDS data from 1986/7 and without comorbid adjustment showed patients with training for self-care hemodialysis at home or in a center (N = 418) had a lower mortality risk (RR = 0.78, P = 0.001) than center HD patients (N = 43,122). Statistical adjustment for comorbid conditions in addition to age, sex, race, and diabetes explains only a small amount of the lower mortality with home HD.
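The abstract's phrasing "RR = 0.56 ... 44% lower" is the standard conversion from a Cox-model relative risk below 1 into a percent reduction in hazard. A minimal sketch, using the RR values reported in the text (the helper name is ours):

```python
# Sketch: interpreting a Cox-model relative risk (RR) as a percent
# reduction in hazard. RR values are taken from the abstract.

def percent_risk_reduction(rr):
    """Percent reduction in hazard implied by a relative risk < 1."""
    return round((1 - rr) * 100, 1)

print(percent_risk_reduction(0.56))  # adjusted RR -> 44.0 (% lower risk)
print(percent_risk_reduction(0.37))  # unadjusted RR -> 63.0 (% lower risk)
```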
Protecting healing relationships in the age of electronic health records: report from an international conference
We present findings of an international conference of diverse participants exploring the influence of electronic health records (EHRs) on the patient-practitioner relationship. Attendees united around a belief in the primacy of this relationship and the importance of undistracted attention. They explored administrative, regulatory, and financial requirements that have guided United States (US) EHR design and challenged patient-care documentation, usability, user satisfaction, interconnectivity, and data sharing. The US experience was contrasted with those of other nations, many of which have prioritized patient-care documentation rather than billing requirements and experienced high user satisfaction. Conference participants examined educational methods to teach diverse learners effective patient-centered EHR use, including alternative models of care delivery and documentation, and explored novel ways to involve patients as healthcare partners, such as health-data uploading, chart co-creation, shared practitioner notes, applications, and telehealth. Future best practices must preserve human relationships while building an effective patient-practitioner (or team)-EHR triad.
Dissipation and Extra Light in Galactic Nuclei: II. 'Cusp' Ellipticals
We study the origin and properties of 'extra' or 'excess' central light in the surface brightness profiles of cusp or power-law ellipticals. Dissipational mergers give rise to two-component profiles: an outer profile established by violent relaxation acting on stars present in the progenitors prior to the final merger, and an inner stellar population comprising the extra light, formed in a compact starburst. Combining a large set of hydrodynamical simulations with data that span a broad range of profiles and masses, we show that this picture is borne out -- cusp ellipticals are indeed 'extra light' ellipticals -- and examine how the properties of this component scale with global galaxy properties. We show how to robustly separate the 'extra' light, and demonstrate that observed cusps are reliable tracers of the degree of dissipation in the spheroid-forming merger. We show that the typical degree of dissipation is a strong function of stellar mass, tracing observed disk gas fractions at each mass. We demonstrate a correlation between extra light content and effective radius at fixed mass: systems with more dissipation are more compact. The outer shape of the light profile does not depend on mass, with a mean outer Sersic index ~2.5. We explore how this relates to shapes, kinematics, and stellar population gradients. Simulations with the gas content needed to match observed profiles also reproduce observed age, metallicity, and color gradients, and we show how these can be used as tracers of the degree of dissipation in spheroid formation.
Comment: 40 pages, 32 figures, accepted to ApJ (revised to match accepted version)
Value of the First Post-Transplant Biopsy for Predicting Long-Term Cardiac Allograft Vasculopathy (CAV) and Graft Failure in Heart Transplant Patients
BACKGROUND: Cardiac allograft vasculopathy (CAV) is the principal cause of long-term graft failure following heart transplantation. Early identification of patients at risk of CAV is essential to target invasive follow-up procedures more effectively and to establish appropriate therapies. We evaluated the prognostic value of the first heart biopsy (median: 9 days post-transplant) versus all biopsies obtained within the first three months for the prediction of CAV and graft failure due to CAV. METHODS AND FINDINGS: In a prospective cohort study, we developed multivariate regression models evaluating markers of atherothrombosis (fibrin, antithrombin and tissue plasminogen activator [tPA]) and endothelial activation (intercellular adhesion molecule-1) in serial biopsies obtained during the first three months post-transplantation from 172 patients (median follow-up = 6.3 years; min = 0.37 years, max = 16.3 years). Presence of fibrin was the dominant predictor in first-biopsy models (Odds Ratio [OR] for one- and 10-year graft failure due to CAV = 38.70, p = 0.002, 95% CI = 4.00-374.77; and 3.99, p = 0.005, 95% CI = 1.53-10.40) and loss of tPA was predominant in three-month models (OR for one- and 10-year graft failure due to CAV = 1.81, p = 0.025, 95% CI = 1.08-3.03; and 1.31, p = 0.001, 95% CI = 1.12-1.55). First-biopsy and three-month models had similar predictive and discriminative accuracy and were comparable in their capacities to correctly classify patient outcomes, with the exception of 10-year graft failure due to CAV, for which the three-month model was more predictive. Both models had particularly high negative predictive values (e.g., first-biopsy vs. three-month models: 99% vs. 100% at 1 year and 96% vs. 95% at 10 years). CONCLUSIONS: Patients with absence of fibrin in the first biopsy and persistence of normal tPA in subsequent biopsies rarely develop CAV or graft failure during the next 10 years and potentially could be monitored less invasively. Presence of early risk markers in the transplanted heart may be secondary to ischemia/reperfusion injury, a potentially modifiable factor.
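The high negative predictive values the abstract leans on come from the standard 2x2 classification of marker status against outcomes: NPV is the fraction of marker-negative patients who remain event-free. A minimal sketch with hypothetical counts (the study's actual counts are not given in the abstract):

```python
# Sketch: negative predictive value (NPV) from a 2x2 classification.
# Counts below are hypothetical; the abstract reports NPVs of ~95-100%.

def negative_predictive_value(true_neg, false_neg):
    """Fraction of marker-negative patients who stay event-free."""
    return true_neg / (true_neg + false_neg)

# Hypothetical: of 100 fibrin-negative patients, 99 remain free of graft
# failure and 1 fails despite a negative first biopsy.
npv = negative_predictive_value(true_neg=99, false_neg=1)
print(f"NPV: {npv:.0%}")
```

A high NPV is exactly what supports the conclusion that marker-negative patients "rarely develop CAV or graft failure" and could be monitored less invasively.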
Booster Vaccination Against SARS-CoV-2 Induces Potent Immune Responses in People With Human Immunodeficiency Virus
Background: People with human immunodeficiency virus (HIV) on antiretroviral therapy (ART) with good CD4 T-cell counts make effective immune responses following vaccination against severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). There are few data on longer-term responses and the impact of a booster dose. Methods: Adults with HIV were enrolled into a single-arm open-label study. Two doses of ChAdOx1 nCoV-19 were followed 12 months later by a third heterologous vaccine dose. Participants had undetectable viraemia on ART and CD4 counts >350 cells/μL. Immune responses to the ancestral strain and variants of concern were measured by anti-spike immunoglobulin G (IgG) enzyme-linked immunosorbent assay (ELISA), MesoScale Discovery (MSD) anti-spike platform, ACE-2 inhibition, activation-induced marker (AIM) assay, and T-cell proliferation. Findings: In total, 54 participants received 2 doses of ChAdOx1 nCoV-19; 43 received a third dose (42 with BNT162b2; 1 with mRNA-1273) 1 year after the first dose. After the third dose, total anti-SARS-CoV-2 spike IgG titers (MSD), ACE-2 inhibition, and IgG ELISA results were significantly higher compared to Day 182 titers (P < .0001 for all 3). SARS-CoV-2-specific CD4+ T-cell responses measured by AIM against SARS-CoV-2 S1 and S2 peptide pools were significantly increased after a third vaccine compared to 6 months after a first dose, with significant increases in proliferative CD4+ and CD8+ T-cell responses to SARS-CoV-2 S1 and S2 after boosting. Responses to Alpha, Beta, Gamma, and Delta variants were boosted, although to a lesser extent for Omicron. Conclusions: In people with HIV (PWH) receiving a third vaccine dose, there were significant increases in B- and T-cell immunity, including to known variants of concern (VOCs).