In-office Functional Nasal Surgery
Nasal airway obstruction is a common complaint encountered by the otolaryngologist. In-office nasal procedures are becoming increasingly popular and should be considered for patients who desire immediate treatment without the adverse effects of general anesthesia, operating-room costs, or scheduling delays. In this paper, we discuss factors in patient selection, room set-up, and other considerations, and review the options available for in-office treatment of nasal airway obstruction, including turbinoplasty, septoplasty, and nasal valve repair/functional rhinoplasty-type techniques described in the literature.
A Dissociation between Perception and Action in the Material‐weight Illusion
We examined what forces are applied to objects that elicit this illusion when they are lifted. We predicted that:
(1) Forces on early trials will scale to each participant’s expectations of how much a particular block will weigh ‐ excessive force will be applied to the metal block and insufficient force applied to the polystyrene block.
(2) Forces on later trials will scale to the real weight of each block ‐ identical levels of force applied to all the blocks.
(3) The MWI will persist throughout ‐ the polystyrene block will feel the heaviest and the metal block the lightest.
Targeted expression profiling reveals distinct stages of early canine fibroblast reprogramming are regulated by 2-oxoglutarate hydroxylases
Background: Ectopic expression of a defined set of transcription factors allows the reprogramming of mammalian somatic cells to pluripotency. Despite continuous progress in primate and rodent reprogramming, limited attention has been paid to cell reprogramming in domestic and companion species. Previous studies attempting to reprogram canine cells have mostly assessed a small number of presumptive canine induced pluripotent stem cell (iPSC) lines for generic pluripotency attributes. However, why canine cell reprogramming remains extremely inefficient is poorly understood. Methods: To better characterize the initial steps of pluripotency induction in canine somatic cells, we optimized an experimental system where canine fetal fibroblasts (cFFs) are transduced with the Yamanaka reprogramming factors by Sendai virus vectors. We use quantitative PCR arrays to measure the expression of 80 target genes at various stages of canine cell reprogramming. We ask how cFF reprogramming is influenced by small molecules affecting the epigenomic modification 5-hydroxymethylcytosine, specifically L-ascorbic acid and retinoic acid (AA/RA). Results: We found that the expression and catalytic output of a class of 2-oxoglutarate-dependent (2-OG) hydroxylases, known as ten-eleven translocation (TET) enzymes, can be modulated in canine cells treated with AA/RA. We further show that AA/RA treatment induces TET1 expression and facilitates early canine reprogramming, evidenced by upregulation of epithelial and pluripotency markers. Using a chemical inhibitor of 2-OG hydroxylases, we demonstrate that 2-OG hydroxylase activity regulates the expression of a subset of genes involved in mesenchymal-to-epithelial transition (MET) and pluripotency in early canine reprogramming. We identify a set of transcription factors depleted in maturing reprogramming intermediates compared to pluripotent canine embryonic stem cells. 
Conclusions: Our findings highlight that 2-OG hydroxylases have evolutionarily conserved and divergent functions in regulating the early reprogramming of canine somatic cells, and show that reprogramming conditions can be rationally optimized for the generation of maturing canine iPSCs.
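Targeted qPCR-array profiling of the kind described above is conventionally quantified with the 2^-ΔΔCt method. The abstract does not state the study's exact pipeline, so the following is a generic sketch; the Ct values, gene names (TET1, GAPDH), and conditions are illustrative assumptions, not data from the study:

```python
# Relative expression by the 2^-delta-delta-Ct method.
# All Ct values below are invented for illustration.

def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Fold change of a target gene in treated vs control cells,
    normalized to a reference (housekeeping) gene."""
    delta_ct_treated = ct_target_treated - ct_ref_treated
    delta_ct_control = ct_target_control - ct_ref_control
    delta_delta_ct = delta_ct_treated - delta_ct_control
    return 2 ** (-delta_delta_ct)

# Hypothetical example: TET1 in AA/RA-treated vs untreated cFFs,
# normalized to GAPDH. A lower Ct means more transcript.
fc = fold_change(ct_target_treated=24.0, ct_ref_treated=18.0,
                 ct_target_control=27.0, ct_ref_control=18.0)
print(fc)  # 8.0: an ~8-fold increase in the treated cells
```

A fold change above 1 indicates upregulation in the treated condition, consistent with how marker induction is typically reported in such panels.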
A cortical information bottleneck during decision-making
Decision-making emerges from distributed computations across multiple brain areas, but it is unclear why the brain distributes the computation. In deep learning, artificial neural networks use multiple areas (or layers) to form optimal representations of task inputs. These optimal representations are sufficient to perform the task well, but minimal, so that they are invariant to other, irrelevant variables. We recorded single neurons and multiunits in dorsolateral prefrontal cortex (DLPFC) and dorsal premotor cortex (PMd) in monkeys during a perceptual decision-making task. We found that while DLPFC represents the task-related inputs required to compute the choice, the downstream PMd contains a minimal sufficient, or optimal, representation of the choice. To identify a mechanism for how cortex may form these optimal representations, we trained a multi-area recurrent neural network (RNN) to perform the task. Remarkably, DLPFC- and PMd-resembling representations emerged in the early and late areas of the multi-area RNN, respectively. The DLPFC-resembling area partially orthogonalized choice information and task inputs, and this choice information was preferentially propagated to downstream areas through selective alignment with inter-area connections, while remaining task information was not. Our results suggest that cortex uses multi-area computation to form minimal sufficient representations by preferential propagation of relevant information between areas.

The brain uses multiple areas for cognition, decision-making, and action, but it is unclear why the brain distributes the computation and why cortical activity differs by brain area. Machine learning and information theory suggest that one benefit of multiple areas is that they provide an "information bottleneck" that compresses inputs into an optimal representation that is minimal and sufficient to solve the task. Combining experimental recordings from behaving animals and computational simulations, we show that later brain areas tend to form such minimal sufficient representations of task inputs through preferential propagation of task-relevant information present in earlier areas.
Our results thus provide insight into why the brain uses multiple brain areas for supporting decision-making and action.
Funding: 5R01NS122969-03 and 5R00NS092972-05 REVISED (NIH/National Institute of Neurological Disorders & Stroke); 2019-12-77 (Whitehall Foundation, Inc.); 27923 (Brain & Behavior Research Foundation).
Published version: https://elifesciences.org/reviewed-preprints/89369
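The "minimal sufficient" notion in this abstract can be made concrete with mutual information: a representation Z is sufficient for the choice C when I(Z; C) = I(X; C), and minimal when it additionally discards task-irrelevant variables. The toy variables below (a choice-relevant stimulus and an irrelevant "color") are invented for illustration and are not the study's task:

```python
import math
from collections import Counter

def mutual_info(pairs):
    """I(A; B) in bits, estimated from a list of (a, b) samples."""
    n = len(pairs)
    p_ab = Counter(pairs)
    p_a = Counter(a for a, _ in pairs)
    p_b = Counter(b for _, b in pairs)
    mi = 0.0
    for (a, b), count in p_ab.items():
        pab = count / n
        mi += pab * math.log2(pab / ((p_a[a] / n) * (p_b[b] / n)))
    return mi

# Toy task: input X = (choice-relevant stimulus s, irrelevant color k).
# An "early area" represents all of X; a "late area" keeps only s.
inputs = [(s, k) for s in (0, 1) for k in (0, 1) for _ in range(25)]
choice = [s for s, _ in inputs]   # the correct choice is determined by s
early = inputs                    # sufficient, but not minimal
late = [s for s, _ in inputs]     # minimal sufficient
color = [k for _, k in inputs]

print(mutual_info(list(zip(early, choice))))  # 1.0 bit about the choice
print(mutual_info(list(zip(late, choice))))   # 1.0 bit: still sufficient
print(mutual_info(list(zip(early, color))))   # 1.0 bit of irrelevant info
print(mutual_info(list(zip(late, color))))    # 0.0: irrelevant info discarded
```

The late representation keeps all the choice information while carrying none about the irrelevant variable, which is the information-bottleneck property the abstract attributes to PMd relative to DLPFC.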
Draft genomes of two Artocarpus plants, jackfruit (A. heterophyllus) and breadfruit (A. altilis)
Two of the most economically important plants in the Artocarpus genus are jackfruit (A. heterophyllus Lam.) and breadfruit (A. altilis (Parkinson) Fosberg). Both species are long-lived trees that have been cultivated for thousands of years in their native regions. Today they are grown throughout tropical to subtropical areas as an important source of starch and other valuable nutrients. There are hundreds of breadfruit varieties native to Oceania, of which the most commonly distributed types are seedless triploids. Jackfruit is likely native to the Western Ghats of India and produces one of the largest tree-borne fruit structures (reaching up to 45 kg). To date, there is limited genomic information for these two economically important species. Here, we generated 273 Gb and 227 Gb of raw data from jackfruit and breadfruit, respectively. The high-quality reads from jackfruit were assembled into 162,440 scaffolds totaling 982 Mb with 35,858 genes. Similarly, the breadfruit reads were assembled into 180,971 scaffolds totaling 833 Mb with 34,010 genes. A total of 2822 and 2034 expanded gene families were found in jackfruit and breadfruit, respectively, enriched in pathways including starch and sucrose metabolism, photosynthesis, and others. The copy number of several starch synthesis-related genes was found to be increased in jackfruit and breadfruit compared to closely related species, and their tissue-specific expression may underlie the sugar-rich and starch-rich characteristics of these fruits. Overall, the publication of high-quality genomes for jackfruit and breadfruit provides information about their specific composition and the underlying genes involved in sugar and starch metabolism.
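Assembly summaries like the scaffold counts and total sizes above are usually reported alongside an N50 statistic (the abstract does not give one). The computation is standard; the scaffold lengths below are toy values, not the jackfruit or breadfruit data:

```python
def n50(scaffold_lengths):
    """N50: the length L such that scaffolds of length >= L
    together cover at least half of the total assembly."""
    total = sum(scaffold_lengths)
    running = 0
    for length in sorted(scaffold_lengths, reverse=True):
        running += length
        if running * 2 >= total:
            return length

# Toy example (total 300; the two longest scaffolds reach the halfway point):
print(n50([100, 80, 60, 40, 20]))  # 80
```

A higher N50 for the same total size indicates a less fragmented assembly, which is why it is the conventional companion to scaffold counts like 162,440 and 180,971.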
Towards Prevention of Acute Lung Injury: Frequency and Outcomes of Emergency Department Patients At-Risk: A Multicenter Cohort Study
Background: Few emergency department (ED) evaluations of acute lung injury (ALI) have been carried out; hence, we sought to describe a cohort of hospitalized ED patients at risk for ALI development. Methods: Patients presenting to the ED with at least one predisposing condition for ALI were included in this study, a subgroup analysis of a multicenter observational cohort study (USCIITG-LIPS 1). Patients who met ALI criteria within 6 h of initial ED assessment, received end-of-life care, or were readmitted during the study period were excluded. The primary outcome was frequency of ALI development; secondary outcomes were ICU and hospital mortality. Results: Twenty-two hospitals enrolled 4,361 patients who were followed from the ED to hospital discharge. ALI developed in 303 (7.0 %) patients at a median onset of 2 days (IQR 2–5). Of the predisposing conditions, frequency of ALI development was highest in patients who had aortic surgery (43 %) and lowest in patients with pancreatitis (2.8 %). Compared to patients who did not develop ALI, those who did had higher ICU (24 % vs. 3.0 %, p < 0.001) and hospital (28 % vs. 4.6 %, p < 0.001) mortality, and longer hospital length of stay (16 vs. 5 days, p < 0.001). Among the 22 study sites, frequency of ALI development varied from less than 1 % to more than 12 % after adjustment for APACHE II. Conclusions: Seven percent of hospitalized ED patients with at least one predisposing condition developed ALI. The frequency of ALI development varied significantly according to predisposing conditions and across institutions. Further research is warranted to determine the factors contributing to ALI development.
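The headline frequency (303 of 4,361 patients) can be given a confidence interval; the abstract does not report one, so the following is a sketch using the Wilson score method, a standard choice for binomial proportions:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return center - half, center + half

# ALI cases among at-risk hospitalized ED patients, from the abstract:
lo, hi = wilson_ci(303, 4361)
print(f"{303/4361:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

With n in the thousands the interval is narrow (roughly one percentage point wide here), which supports quoting the frequency to a single decimal place as the abstract does.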
Association of Frailty and the Expanded Operative Stress Score with Preoperative Acute Serious Conditions, Complications, and Mortality in Males Compared to Females: A Retrospective Observational Study
OBJECTIVE: The aim of this study was to expand the Operative Stress Score (OSS), increasing procedural coverage, and to assess the association of OSS and frailty with Preoperative Acute Serious Conditions (PASC), complications, and mortality in females versus males.
SUMMARY BACKGROUND DATA: A male-dominated Veterans Affairs study showed high mortality in frail veterans even after very-low-stress surgeries (OSS1).
METHODS: Retrospective cohort study using NSQIP data (2013-2019) merged with 180-day postoperative mortality data from multiple hospitals to evaluate PASC, 30-day complications, and 30-, 90-, and 180-day mortality.
RESULTS: OSS expansion resulted in 98.2% case coverage versus 87.0% using the original. Of 82,269 patients (43.8% male), 7.9% were frail/very frail. Males had higher odds of PASC [adjusted odds ratio (aOR) = 1.31, 95% confidence interval (CI) = 1.21-1.41, P < 0.001] and severe/life-threatening Clavien-Dindo IV (CDIV) complications (aOR = 1.18, 95% CI = 1.09-1.28, P < 0.001). Although mortality rates were higher (all time-points, P < 0.001) in males versus females, mortality was similar after adjusting for frailty, OSS, and case status, primarily due to increased male frailty scores. Additional adjustments for PASC and CDIV resulted in lower odds of mortality in males (30-day, aOR = 0.81, 95% CI = 0.71-0.92, P = 0.002) that were most pronounced for males with PASC compared to females with PASC (30-day, aOR = 0.75, 95% CI = 0.56-0.99, P = 0.04).
CONCLUSIONS: As in the male-dominated Veteran population, frail patients in the private sector have a high likelihood of postoperative mortality, even after low-stress surgeries. Preoperative frailty screening should be performed regardless of the magnitude of the procedure. Despite males experiencing higher adjusted odds of PASC and CDIV complications, females with PASC had higher odds of mortality compared to males, suggesting differences in the aggressiveness of care provided to men and women.
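The aORs above come from multivariable models whose specification is not detailed in the abstract. The basic idea of adjusting an odds ratio for a confounder such as frailty can nonetheless be illustrated with a Mantel-Haenszel stratified estimate; all counts below are made up for the example:

```python
def odds_ratio(a, b, c, d):
    """OR from a 2x2 table: a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    return (a * d) / (b * c)

def mantel_haenszel_or(strata):
    """Common OR across strata of a confounder (e.g., frail / non-frail)."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# Hypothetical counts per stratum:
# (male deaths, male survivors, female deaths, female survivors)
frail = (30, 70, 10, 40)
non_frail = (10, 390, 5, 295)

crude = odds_ratio(*(sum(x) for x in zip(frail, non_frail)))
adjusted = mantel_haenszel_or([frail, non_frail])
print(round(crude, 2), round(adjusted, 2))  # 1.94 1.64
```

Because frailty in this toy data is both more common in males and associated with death, the crude male-vs-female OR (1.94) overstates the within-stratum association; the adjusted estimate (1.64) is pulled toward the stratum-specific ORs, mirroring how the abstract's mortality differences shrank after adjusting for frailty.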
Dorsal periaqueductal gray ensembles represent approach and avoidance states
Animals must balance needs to approach threats for risk assessment and to avoid danger. The dorsal periaqueductal gray (dPAG) controls defensive behaviors, but it is unknown how it represents states associated with threat approach and avoidance. We identified a dPAG threat-avoidance ensemble in mice that showed higher activity farther from threats such as the open arms of the elevated plus maze and a predator. These cells were also more active during threat-avoidance behaviors such as escape and freezing, even though these behaviors have antagonistic motor output. Conversely, the threat-approach ensemble was more active during risk assessment behaviors and near threats. Furthermore, unsupervised methods showed that avoidance/approach states were encoded with shared activity patterns across threats. Lastly, the relative number of cells in each ensemble predicted threat avoidance across mice. Thus, dPAG ensembles dynamically encode threat approach and avoidance states, providing a flexible mechanism to balance risk assessment and danger avoidance.
Independent Associations of Neighborhood Deprivation and Patient-level Social Determinants of Health with Textbook Outcomes after Inpatient Surgery.
OBJECTIVE: To assess associations of Social Determinants of Health (SDoH), measured by the Area Deprivation Index (ADI), race/ethnicity, and insurance type, with Textbook Outcomes (TO).
SUMMARY BACKGROUND DATA: Individual- and contextual-level SDoH affect health outcomes, but studies usually include only one SDoH level.
METHODS: Cohort study across three healthcare systems using National Surgical Quality Improvement Program data (2013-2019) linked with the ADI, risk-adjusted for frailty, case status, and operative stress, examining TO and its components (unplanned reoperations, complications, mortality, Emergency Department/Observation stays, and readmissions).
RESULTS: The cohort (34,251 cases) had mean age 58.3 [SD = 16.0] and was 54.8% female, 14.1% Hispanic, 11.6% Non-Hispanic Black, 21.6% with ADI>85, and 81.8% TO. Racial and ethnic minorities, non-Private insurance, and ADI>85 patients had increased odds of urgent/emergent surgeries (aORs range: 1.17-2.83). Patients with ADI>85 and non-Private insurance had lower TO odds (aORs range: 0.55-0.93); ADI>85 lost significance after including case status. Urgent/emergent versus elective cases had lower TO odds (aOR = 0.51); ADI>85 patients had higher complication and mortality odds. Estimated reduction in TO probability was 9.9% (CI = 7.2%-12.6%) for urgent/emergent cases, 7.0% (CI = 4.6%-9.3%) for Medicaid, and 1.6% (CI = 0.2%-3.0%) for non-Hispanic Black patients. The TO probability difference from lowest-risk (White-Private-ADI≤85-elective) to highest-risk (Black-Medicaid-ADI>85-urgent/emergent) was 29.8% for very frail patients.
CONCLUSION: Multi-level SDoH had independent effects on TO, predominantly affecting outcomes through increased rates/odds of urgent/emergent surgeries driving complications and worse outcomes. Lowest-risk versus highest-risk scenarios demonstrated the magnitude of intersecting SDoH variables. The combination of insurance type and ADI should be used to identify high-risk patients and redesign care pathways to improve outcomes. Risk adjustment including contextual neighborhood deprivation and patient-level SDoH could reduce unintended consequences of value-based programs.