73 research outputs found

    Sustaining and disseminating e-learning in the Verbundstudium: an online survey as a promoter and as an instrument for involving teaching staff in development and implementation

    The Verbundstudium (combined work-study programme) of the universities of applied sciences in North Rhine-Westphalia enables more than 3,000 students to study alongside their jobs in a combination of face-to-face teaching and self-study. The Institut fĂŒr Verbundstudien coordinates and organises the cooperation processes between the universities, and through its department for higher-education didactics and distance-learning development it acts as a development and competence centre for new media and e-learning. To disseminate and sustain the digital teaching and learning offerings and to optimise the cooperation and support structures, the institute conducted an online survey of 200 lecturers on the current situation and the prospects of e-learning in the Verbundstudium. The study shows that, for the lecturers, the printed learning unit will remain the central element of teaching in the future. They see a need to supplement and enrich the programme and the learning process, and they want complementary digital elements to support teaching above all in the following areas: communication, supplements to the learning units (link lists, exercises, additional media and materials), and a cross-programme glossary. The results of the online survey formed the basis of the e-learning concept adopted by the governing bodies of the Verbundstudium. The digital elements and functions requested by the lecturers were implemented by the department for higher-education didactics and distance-learning development in the e-learning environment VS-online. At present, the individual combined-study programmes are filling the provided elements and functions with contributions and content. (DIPF/Orig.)

    Once-daily saquinavir (SAQ)/ritonavir (RTV) (2000/100 mg) with abacavir/lamivudine (600/300 mg) or tenofovir/emtricitabine (245/300 mg) in naĂŻve patients

    Poster presentation: Background In the past years, once-daily (QD) dosing of antiretroviral combination therapy has become an increasingly available treatment option for HIV-1+ patients. Methods Open-label study in which HIV-1+ patients treated with SAQ/RTV (1000/100 mg BID) and two NRTIs, with HIV-RNA-PCR < 50 copies/ml, were switched to SAQ/RTV (2000/100 mg QD) with an unchanged NRTI backbone. CD4 cells, HIV-RNA-PCR, SAQ and RTV drug levels, and metabolic parameters were compared. Summary of results 17 patients (15 male, 42 years), median CD4 456 ± 139/”l, have been included so far. The median follow-up time is 4 months. The HIV-RNA-PCR remained < 50 copies/ml for all patients. Fasting metabolic parameters remained unchanged. The SAQ AUC 0–12 h was significantly higher when given QD vs. BID (median 29,400 vs. 18,500 ng*h/ml; p = 0.009), whereas the Cmin, Cmax and AUC were lower for RTV when given QD vs. BID (7,400 vs. 11,700 ng*h/ml; p = 0.02). Conclusion In this ongoing study, SAQ/RTV (2000/100 mg QD) was well tolerated and demonstrated higher SAQ and lower RTV drug levels compared with the BID dosing schedule. (Table 1 and Figure 1.)
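The AUC values compared above are derived from concentration-time sampling. As an illustration only, a minimal sketch of how an AUC is computed from such samples with the linear trapezoidal rule; the sample times and concentrations below are hypothetical, not data from this study:

```python
def auc_trapezoid(times_h, conc_ng_ml):
    """Area under the concentration-time curve via the linear trapezoidal rule.

    times_h: sampling times in hours; conc_ng_ml: drug concentrations in ng/ml.
    Returns AUC in ng*h/ml over the sampled interval.
    """
    auc = 0.0
    for i in range(len(times_h) - 1):
        dt = times_h[i + 1] - times_h[i]
        auc += dt * (conc_ng_ml[i] + conc_ng_ml[i + 1]) / 2.0
    return auc

# Hypothetical concentration samples (ng/ml) at 0, 2, 4, 8 and 12 h post-dose
times = [0, 2, 4, 8, 12]
conc = [500, 4000, 3000, 1500, 800]
print(auc_trapezoid(times, conc))  # AUC 0-12 h in ng*h/ml
```

More elaborate schemes (log-trapezoidal for the elimination phase) exist, but the linear rule above is the basic building block of the AUC figures quoted in the abstract.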

    The ESCAPE project : Energy-efficient Scalable Algorithms for Weather Prediction at Exascale

    In the simulation of complex multi-scale flows arising in weather and climate modelling, one of the biggest challenges is to satisfy strict service requirements in terms of time to solution and to satisfy budgetary constraints in terms of energy to solution, without compromising the accuracy and stability of the application. These simulations require algorithms that minimise the energy footprint along with the time required to produce a solution, maintain the physically required level of accuracy, are numerically stable, and are resilient in case of hardware failure. The European Centre for Medium-Range Weather Forecasts (ECMWF) led the ESCAPE (Energy-efficient Scalable Algorithms for Weather Prediction at Exascale) project, funded by Horizon 2020 (H2020) under the FET-HPC (Future and Emerging Technologies in High Performance Computing) initiative. The goal of ESCAPE was to develop a sustainable strategy to evolve weather and climate prediction models to next-generation computing technologies. The project partners incorporate the expertise of leading European regional forecasting consortia, university research, experienced high-performance computing centres, and hardware vendors. This paper presents an overview of the ESCAPE strategy: (i) identify domain-specific key algorithmic motifs in weather prediction and climate models (which we term Weather & Climate Dwarfs), (ii) categorise them in terms of computational and communication patterns, (iii) adapt them to different hardware architectures with alternative programming models, (iv) analyse the challenges in optimising them, and (v) find alternative algorithms for the same scheme. 
The participating weather prediction models are the following: IFS (Integrated Forecasting System); ALARO, a combination of AROME (Application de la Recherche Ă  l'OpĂ©rationnel Ă  Meso-Echelle) and ALADIN (Aire LimitĂ©e Adaptation Dynamique DĂ©veloppement International); and COSMO-EULAG, a combination of COSMO (Consortium for Small-scale Modeling) and EULAG (Eulerian and semi-Lagrangian fluid solver). For many of the weather and climate dwarfs ESCAPE provides prototype implementations on different hardware architectures (mainly Intel Skylake CPUs, NVIDIA GPUs, Intel Xeon Phi, Optalysys optical processor) with different programming models. The spectral transform dwarf represents a detailed example of the co-design cycle of an ESCAPE dwarf. The dwarf concept has proven to be extremely useful for the rapid prototyping of alternative algorithms and their interaction with hardware; e.g. the use of a domain-specific language (DSL). Manual adaptations have led to substantial accelerations of key algorithms in numerical weather prediction (NWP) but are not a general recipe for the performance portability of complex NWP models. Existing DSLs are found to require further evolution but are promising tools for achieving the latter. Measurements of energy and time to solution suggest that a future focus needs to be on exploiting the simultaneous use of all available resources in hybrid CPU-GPU arrangements.
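To make the "dwarf" idea concrete: a dwarf isolates one algorithmic motif, such as the spectral transform, as a small standalone kernel that can be timed and ported to new hardware independently of the full model. The sketch below is a toy illustration under that interpretation, not code from ESCAPE; the filter and problem sizes are invented:

```python
import time
import numpy as np

def spectral_transform_dwarf(fields):
    """Toy spectral-transform motif: forward FFT, a scaling in spectral
    space, then an inverse FFT back to grid-point space. Isolating the
    motif like this lets it be benchmarked and ported on its own."""
    spec = np.fft.rfft(fields, axis=-1)                       # grid -> spectral
    spec *= np.exp(-0.01 * np.arange(spec.shape[-1]) ** 2)    # toy spectral filter
    return np.fft.irfft(spec, n=fields.shape[-1], axis=-1)    # spectral -> grid

# Benchmark the isolated motif, as one would for a dwarf prototype
fields = np.random.default_rng(0).standard_normal((64, 256))
t0 = time.perf_counter()
out = spectral_transform_dwarf(fields)
elapsed = time.perf_counter() - t0
print(out.shape, elapsed)
```

The real spectral transform dwarf involves spherical harmonics and distributed transposes; the point of the sketch is only the structure: a self-contained kernel with a well-defined input/output contract that different hardware ports must reproduce.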

    QMEAN server for protein model quality estimation

    Model quality estimation is an essential component of protein structure prediction, since ultimately the accuracy of a model determines its usefulness for specific applications. Usually, in the course of protein structure prediction a set of alternative models is produced, from which subsequently the most accurate model has to be selected. The QMEAN server provides access to two scoring functions successfully tested at the eighth round of the community-wide blind test experiment CASP. The user can choose between the composite scoring function QMEAN, which derives a quality estimate on the basis of the geometrical analysis of single models, and the clustering-based scoring function QMEANclust, which calculates a global and local quality estimate based on a weighted all-against-all comparison of the models from the ensemble provided by the user. The web server performs a ranking of the input models and highlights potentially problematic regions for each model. The QMEAN server is available at http://swissmodel.expasy.org/qmean.
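The clustering-based idea behind QMEANclust can be illustrated with a toy consensus score: each model in the ensemble is rated by its mean similarity to all other models, so models close to the dominant cluster score high and outliers score low. The sketch below uses a simple RMSD-based similarity as a stand-in; it is not the actual QMEANclust scoring function:

```python
import numpy as np

def consensus_scores(models):
    """Toy all-against-all consensus scoring: each model (an (N, 3)
    coordinate array) is scored by its mean similarity to every other
    model in the ensemble. Similarity is a crude RMSD-based proxy here,
    not the weighted terms used by QMEANclust."""
    scores = []
    for i in range(len(models)):
        sims = []
        for j in range(len(models)):
            if i == j:
                continue
            rmsd = np.sqrt(np.mean((models[i] - models[j]) ** 2))
            sims.append(1.0 / (1.0 + rmsd))  # map distance into (0, 1]
        scores.append(float(np.mean(sims)))
    return scores

# Hypothetical ensemble: four models near a common structure, one outlier
rng = np.random.default_rng(1)
ref = rng.standard_normal((50, 3))
ensemble = [ref + 0.1 * rng.standard_normal((50, 3)) for _ in range(4)]
ensemble.append(ref + 5.0 * rng.standard_normal((50, 3)))  # outlier model
scores = consensus_scores(ensemble)
print(scores.index(min(scores)))  # the outlier receives the lowest score
```

A real implementation would superpose structures before comparing them and weight the comparison by per-model quality, but the all-against-all structure of the computation is the same.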

    Data accuracy, consistency and completeness of the national Swiss cystic fibrosis patient registry: Lessons from an ECFSPR data quality project.

    BACKGROUND Good data quality is essential when rare disease registries are used as a data source for pharmacovigilance studies. This study investigated the data quality of the Swiss cystic fibrosis (CF) registry within the framework of a European Cystic Fibrosis Society Patient Registry (ECFSPR) project aiming to implement measures that increase data reliability for registry-based research. METHODS All 20 pediatric and adult Swiss CF centers participated in a data quality audit between 2018 and 2020, and in a re-audit in 2022. Accuracy, consistency and completeness of variables and definitions were evaluated, and missing source data and informed consents (ICs) were assessed. RESULTS The first audit included 601 of 997 Swiss people with CF (60.3 %). Data quality, defined as data correctness ≄ 95 %, was high for most of the variables. Inconsistencies in specific variables were observed because of incorrect application of the variable definition. The proportion of missing data was low (5 % of missing documents). After feedback was provided to the centers, the availability of genetic source data and ICs improved. CONCLUSIONS The data audits demonstrated an overall good data quality in the Swiss CF registry. Specific measures such as support of the participating sites, training of data managers and centralized data collection should be implemented in rare disease registries to optimize data quality and provide robust data for registry-based scientific research.
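An audit metric like the "data correctness ≄ 95 %" threshold above amounts to comparing registry entries against source documents field by field. A minimal sketch under that assumption; the field name and values below are hypothetical, not taken from the Swiss CF registry:

```python
def correctness_rate(registry_rows, source_rows, field):
    """Fraction of registry entries whose value for one variable matches
    the source-verified value (a None in the registry counts as a mismatch)."""
    matches = sum(r.get(field) == s.get(field)
                  for r, s in zip(registry_rows, source_rows))
    return matches / len(registry_rows)

# Hypothetical audit sample: registry values vs. source-verified values
registry = [{"genotype": "F508del/F508del"},
            {"genotype": "F508del/G551D"},
            {"genotype": None}]               # missing in the registry
source = [{"genotype": "F508del/F508del"},
          {"genotype": "F508del/G551D"},
          {"genotype": "F508del/R117H"}]
rate = correctness_rate(registry, source, "genotype")
print(rate, rate >= 0.95)  # per-variable rate vs. the audit threshold
```

In a real audit the same per-variable rate would be computed for every audited variable and center, and variables falling below the threshold flagged for follow-up.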


    The American Congress of Rehabilitation Medicine Diagnostic Criteria for Mild Traumatic Brain Injury

    Objective: To develop new diagnostic criteria for mild traumatic brain injury (TBI) that are appropriate for use across the lifespan and in sports, civilian trauma, and military settings. Design: Rapid evidence reviews on 12 clinical questions and Delphi method for expert consensus. Participants: The Mild Traumatic Brain Injury Task Force of the American Congress of Rehabilitation Medicine Brain Injury Special Interest Group convened a Working Group of 17 members and an external interdisciplinary expert panel of 32 clinician-scientists. Public stakeholder feedback was analyzed from 68 individuals and 23 organizations. Results: The first 2 Delphi votes asked the expert panel to rate their agreement with both the diagnostic criteria for mild TBI and the supporting evidence statements. In the first round, 10 of 12 evidence statements reached consensus agreement. Revised evidence statements underwent a second round of expert panel voting, where consensus was achieved for all. For the diagnostic criteria, the final agreement rate, after the third vote, was 90.7%. Public stakeholder feedback was incorporated into the diagnostic criteria revision prior to the third expert panel vote. A terminology question was added to the third round of Delphi voting, where 30 of 32 (93.8%) expert panel members agreed that "the diagnostic label 'concussion' may be used interchangeably with 'mild TBI' when neuroimaging is normal or not clinically indicated." Conclusions: New diagnostic criteria for mild TBI were developed through an evidence review and expert consensus process. Having unified diagnostic criteria for mild TBI can improve the quality and consistency of mild TBI research and clinical care.
    • 
