Art Education and Disability: Re-envisioning Educational Efficiency
The value of efficiency has long been an ideal of educational policy in the United States (Guthrie, 1980). Where the education, and especially the art education, of students who are experiencing disabilities is concerned, traditional notions of efficiency (which are primarily rooted in economic standards of measure) may prove inflexible and inadequate in assessing educational outcomes. Guthrie (1980) equates efficiency in the schools with productivity. He explains that a number of factors may affect productivity, including availability of resources and students' environment and social background; likewise, students' varying (dis)abilities can be added to these factors. Indeed, traditional educational efficiency emphasizes autonomy and uniform delivery of services over responsiveness to diversity of needs and the individualized education mandated by the Individuals with Disabilities Education Act (IDEA) for (pre)K-12 students who are experiencing disabilities. Semmel, Gerber, and MacMillan (1995) question whether the actual practice of special education is aligned with the intentions of the system of education. They imply that school districts may actually resist the inclusion of students experiencing disabilities in the general classroom setting because the segregationist form of special education was designed for reasons of economy and efficiency. What, then, are the roots of educational efficiency, and what role can or should the value of efficiency play in specialized education under the IDEA? Is there an approach to efficiency in the art education of students experiencing disabilities that may still address a diversity of needs? Here, I examine the background of the value of efficiency in education and how the IDEA apparently defines this value in serving the special educational needs of students experiencing disabilities. I also investigate this value through case study findings of a high school art class as an inclusive educational setting.
Painting baryons onto N-body simulations of galaxy clusters with image-to-image deep learning
Galaxy cluster mass functions are a function of cosmology, but mass is not a
direct observable, and systematic errors abound in all its observable proxies.
Mass-free inference can bypass this challenge, but it requires large suites of
simulations spanning a range of cosmologies and models for directly observable
quantities. In this work, we devise a U-net, an image-to-image machine
learning algorithm, to "paint" the IllustrisTNG model of baryons onto
dark-matter-only simulations of galaxy clusters. Using 761 galaxy clusters with
from the TNG-300 simulation at , we
train the algorithm to read in maps of projected dark matter mass and output
maps of projected gas density, temperature, and X-ray flux. The models train in
under an hour on two GPUs, and then predict baryonic images for dark
matter maps drawn from the TNG-300 dark-matter-only (DMO) simulation in under
two minutes. Despite being trained on individual images, the model reproduces
the true scaling relation and scatter for the , as well as the
distribution functions of the cluster X-ray luminosity and gas mass. For just
one decade in cluster mass, the model reproduces three orders of magnitude in
. The model is biased slightly high when using dark matter maps from the
DMO simulation. The model performs well on inputs from TNG-300-2, whose mass
resolution is 8 times coarser; further degrading the resolution biases the
predicted luminosity function high. We conclude that U-net-based baryon
painting is a promising technique to build large simulated cluster catalogs
which can be used to improve cluster cosmology by combining existing
full-physics and large N-body simulations.
Comment: Accepted to MNRAS
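One of the validation checks the abstract describes, recovering a scaling relation and its scatter from a cluster catalog, can be sketched in a few lines. This is an illustrative mock-up only: the function name, the slope of 1.6, the 0.1 dex scatter, and the luminosity normalization are all assumptions for demonstration, not values from the paper.

```python
import numpy as np

def fit_scaling_relation(mass, lum):
    """Fit log10(L) = alpha * log10(M) + beta by least squares.
    Returns (alpha, beta, scatter), where scatter is the rms of the
    residuals in dex. Inputs are arrays in linear units."""
    log_m, log_l = np.log10(mass), np.log10(lum)
    alpha, beta = np.polyfit(log_m, log_l, 1)  # slope first, then intercept
    residuals = log_l - (alpha * log_m + beta)
    return alpha, beta, residuals.std()

# Mock catalog: one decade in mass (as in the abstract's 761 clusters),
# with a hypothetical power-law relation plus lognormal scatter.
rng = np.random.default_rng(0)
mass = 10 ** rng.uniform(14.0, 15.0, size=761)        # Msun, one decade
true_alpha, true_scatter = 1.6, 0.1                   # assumed, not from the paper
lum = 1e44 * (mass / 1e14) ** true_alpha * 10 ** rng.normal(0, true_scatter, mass.size)

alpha, beta, scatter = fit_scaling_relation(mass, lum)
```

Comparing `alpha` and `scatter` fitted from painted maps against the same quantities from the full-physics simulation is one way to quantify whether a model "reproduces the true scaling relation and scatter".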
Prediction of Sublingual Bioavailability of Buprenorphine in Newborns with Neonatal Abstinence Syndrome: a case study on physiological and developmental changes using NONMEM and SIMCYP
Poster presented at 2009 American College of Clinical Pharmacology conference in Orlando. April 24-28.
Background: About 55 to 94% of infants born to opioid-dependent mothers have neonatal abstinence syndrome (NAS). Buprenorphine (BUP) is used clinically as an analgesic, a detoxification agent, and a maintenance treatment for opioid dependence. No data, however, have been reported on sublingual administration of BUP below the age of 4 years, especially for term infants with NAS.
Objectives: Characterize the pharmacokinetics (PK) of BUP in newborn patients; evaluate the developmental changes in newborns in order to assist dosing optimization in ongoing clinical studies.
Methods: In silico predictions of PK behavior and physiological development in newborn patients were evaluated using SIMCYP. Intravenous clearance was predicted through a physiologically based simulation method in SIMCYP. Based on the sublingual clearance obtained from a one-compartment model developed previously using NONMEM, individual changes in sublingual bioavailability were evaluated against physiological development over the first one and a half months of the newborn period.
Results: Intrinsic clearance of BUP in newborns incorporated enzyme kinetic data obtained from the literature. The change in sublingual bioavailability for newborns was evaluated with bioavailability-postmenstrual age profiles. Sublingual bioavailability of BUP was estimated as 8.9% to 56.6% in the newborn patients studied during the first one and a half postnatal months.
Conclusion: Developmental considerations for the PK of BUP in newborns are important for the characterization of the dose-exposure relationship. We have evaluated this from "bottom-up" and "top-down" approaches with SIMCYP and NONMEM, respectively, and found these approaches to be complementary and valuable for clinical trial design and routine clinical care. Presumably they would facilitate rational decision making in pediatric drug development as well.
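The complementarity of the two approaches follows from a standard PK identity: a one-compartment model fit to sublingual data estimates only the apparent clearance CL/F, while a physiologically based prediction supplies the intravenous clearance CL, so bioavailability can be recovered as F = CL / (CL/F). A minimal sketch of that arithmetic, with the function name and all numeric values hypothetical (not study data):

```python
def sublingual_bioavailability(cl_iv, cl_over_f):
    """Recover bioavailability F from two clearance estimates:
    cl_iv      -- intravenous clearance predicted bottom-up (e.g. a PBPK tool)
    cl_over_f  -- apparent clearance CL/F estimated top-down from a
                  one-compartment model of sublingual dosing.
    Since CL/F = CL / F, it follows that F = CL / (CL/F)."""
    f = cl_iv / cl_over_f
    if not 0 < f <= 1:
        # F outside (0, 1] signals inconsistent clearance estimates
        raise ValueError("inconsistent clearance estimates")
    return f

# Hypothetical illustrative values (L/h), not taken from the poster
f = sublingual_bioavailability(0.9, 3.0)  # F = 0.3
```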
Strategies for Library Mergers & Centralizing Library Services
Abstract: More hospitals are merging or looking for ways to cut costs. Libraries need to know how to respond and/or look for ways to centralize services.
Format: This session will have brief presentations followed by a panel discussion and an interactive question-and-answer session
Sponsors: Hospital Library Section, NAHRS
Objectives: This session will provide success stories and lessons learned from librarians who have gone through library centralization or library integration following hospital mergers. Presentations will cover needs assessment/SWOT analysis, licensing, budgeting, technical challenges, staff, solo perspective, and other considerations.
Audience: Hospital librarians looking to centralize services or who have recently merged or will be impacted by a merger in the future.
Learning Objectives:
1) Discuss the benefits of a needs assessment/SWOT analysis when merging
2) List some of the considerations involved in unifying services
3) Study the licensing and budgeting issues faced when merging
4) Recognize technical issues involved in a merger
5) Describe staffing challenges
6) Discuss challenges faced by solos
Instructional Methods: presentations, slides, question/answer, and panel discussion
Participant Engagement: Presenters will actively solicit questions from the audience for immediate answers and will provide template materials that participants can customize for their institutions
Indeterminate and discrepant rapid HIV test results in couples' HIV testing and counselling centres in Africa
Background: Many HIV voluntary testing and counselling centres in Africa use rapid antibody tests, in parallel or in sequence, to establish same-day HIV status. The interpretation of indeterminate or discrepant results between different rapid tests on one sample poses a challenge. We investigated the use of an algorithm using three serial rapid HIV tests in cohabiting couples to resolve unclear serostatuses.
Methods: Heterosexual couples visited the Rwanda Zambia HIV Research Group testing centres in Kigali, Rwanda, and Lusaka, Zambia, to assess HIV infection status. Individuals with unclear (indeterminate) or discrepant HIV rapid antibody test results were asked to return for repeat testing to resolve HIV status. If either partner of a couple tested positive or indeterminate with the screening test, both partners were tested with a confirmatory test. Individuals with indeterminate or discrepant results were further tested with a tie-breaker and monthly retesting. HIV-RNA viral load was determined when HIV status was not resolved by follow-up rapid testing. Individuals were classified based on two of three initial tests as "Positive", "Negative" or "Other". Follow-up testing and/or HIV-RNA viral load testing determined them as "Infected", "Uninfected" or "Unresolved".
Results: Of 45,820 individuals tested as couples, 2.3% (4.1% of couples) had at least one discrepant or indeterminate rapid result. A total of 65% of those individuals had follow-up testing, and of those individuals initially classified as "Negative" by three initial rapid tests, less than 1% were resolved as "Infected". In contrast, of those individuals with at least one discrepant or indeterminate result who were initially classified as "Positive", only 46% were resolved as "Infected", while the remainder were resolved as "Uninfected" (46%) or "Unresolved" (8%). A positive HIV serostatus of one of the partners was a strong predictor of infection in the other partner, as 48% of individuals who resolved as "Infected" had an HIV-infected spouse.
Conclusions: In more than 45,000 individuals counselled and tested as couples, only 5% of individuals with indeterminate or discrepant rapid HIV test results were HIV infected. This represented only 0.1% of all individuals tested. Thus, algorithms using screening, confirmatory and tie-breaker rapid tests are reliable when two of three tests are negative, but not when two of three tests are positive. False positive antibody tests may persist. HIV-positive partner serostatus should prompt repeat testing.
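The two-of-three initial classification used in the study can be sketched as a small function. This is an illustrative simplification of the decision rule only; the real protocol applies the screening, confirmatory and tie-breaker tests sequentially rather than all at once, and the function name and result labels here are choices for the sketch.

```python
def classify_initial(results):
    """Classify an individual from three rapid HIV test results
    (screening, confirmatory, tie-breaker). Each element of
    `results` is 'pos', 'neg', or 'ind' (indeterminate).
    Two concordant results of three give a provisional status;
    anything else is 'Other' and goes to follow-up testing."""
    if results.count('pos') >= 2:
        return 'Positive'
    if results.count('neg') >= 2:
        return 'Negative'
    return 'Other'

status = classify_initial(['pos', 'neg', 'pos'])  # 'Positive'
```

The study's point is that the rule is asymmetric in practice: a two-of-three "Negative" was almost always confirmed on follow-up, while a two-of-three "Positive" was confirmed only about half the time.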
Long-Dose Intensive Therapy Is Necessary for Strong, Clinically Significant, Upper Limb Functional Gains and Retained Gains in Severe/Moderate Chronic Stroke
Background. Effective treatment methods are needed for moderate/severe impairment in chronic stroke. Objective. The questions were the following: (1) Is there a need for long-dose therapy, or is there a mid-treatment plateau? (2) Are the observed gains from the prior-studied protocol retained after treatment? Methods. Single-blind, stratified/randomized design, with 3 applied-technology treatment groups, combined with motor learning, for long-duration treatment (300 hours of treatment). Measures were Arm Motor Ability Test time and coordination-function (AMAT-T and AMAT-F, respectively), acquired pre-/posttreatment and at 3-month follow-up (3moF/U), and Fugl-Meyer (FM), acquired similarly with the addition of mid-treatment. Findings. There was no group difference in treatment response (P ≥ .16); therefore, data were combined for the remaining analyses (n = 31; except for FM pre/mid/post, n = 36). Pre-to-mid-treatment and mid-to-posttreatment gains on the FM were statistically and clinically significant (P < .0001, 4.7 points and P < .001, 5.1 points, respectively), indicating no plateau at 150 hours and benefit from the second half of treatment. From baseline to 3moF/U: (1) FM gains were twice the clinically significant benchmark, (2) AMAT-F gains were greater than the clinically significant benchmark, and (3) there was statistically significant improvement in FM (P < .0001), AMAT-F (P < .0001), and AMAT-T (P < .0001). These gains indicate retained clinically and statistically significant gains at 3moF/U. From posttreatment to 3moF/U, gains on the FM were maintained. There were statistically significant gains in AMAT-F (P = .0379) and AMAT-T (P = .003).
Knowledge Organization Systems for Systematic Chemical Assessments
BACKGROUND: Although the implementation of systematic review and evidence mapping methods stands to improve the transparency and accuracy of chemical assessments, they also accentuate the challenges that assessors face in ensuring they have located and included all the evidence that is relevant to evaluating the potential health effects an exposure might be causing. This challenge of information retrieval can be characterized in terms of "semantic" and "conceptual" factors that render chemical assessments vulnerable to the streetlight effect. OBJECTIVES: This commentary presents how controlled vocabularies, thesauruses, and ontologies contribute to overcoming the streetlight effect in information retrieval, making up the key components of Knowledge Organization Systems (KOSs) that enable more systematic access to assessment-relevant information than is currently achievable. The concept of Adverse Outcome Pathways is used to illustrate what a general KOS for use in chemical assessment could look like. DISCUSSION: Ontologies are an underexploited element of effective knowledge organization in the environmental health sciences. Agreeing on and implementing ontologies in chemical assessment is a complex but tractable process with four fundamental steps. Successful implementation of ontologies would not only make currently fragmented information about health risks from chemical exposures vastly more accessible but could ultimately enable computational methods for chemical assessment that can take advantage of the full richness of data described in natural language in primary studies. https://doi.org/10.1289/EHP6994
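The retrieval benefit of a controlled vocabulary can be made concrete with a toy example: mapping each preferred term to its entry terms (synonyms) so a search is not limited to one author's wording. The vocabulary contents, term choices, and function name below are hypothetical illustrations of the thesaurus idea, not part of any real KOS.

```python
# Hypothetical miniature controlled vocabulary: preferred term -> entry terms.
# Real systems (e.g. thesauruses used in biomedical indexing) are far larger
# and add hierarchical and associative relations on top of synonymy.
VOCAB = {
    "hepatotoxicity": ["liver toxicity", "liver injury", "hepatic damage"],
    "nephrotoxicity": ["kidney toxicity", "renal injury"],
}

def expand_query(term):
    """Expand a search term to its preferred term plus all entry terms,
    so retrieval is not limited to the exact wording used in a primary
    study -- one way a KOS mitigates the streetlight effect."""
    term = term.lower()
    for preferred, synonyms in VOCAB.items():
        if term == preferred or term in synonyms:
            return [preferred] + synonyms
    return [term]  # unknown terms pass through unchanged

terms = expand_query("liver injury")  # includes "hepatotoxicity"
```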
Medication Complications in Extracorporeal Membrane Oxygenation.
The need for extracorporeal membrane oxygenation (ECMO) therapy is a marker of disease severity for which multiple medications are required. The therapy causes physiologic changes that impact drug pharmacokinetics. These changes can lead to exposure-driven decreases in efficacy or increased incidence of side effects. The pharmacokinetic changes are drug specific and largely undefined for most drugs. We review available drug dosing data and provide guidance for use in the ECMO patient population