
    Essential guidelines for computational method benchmarking

    In computational biology and other sciences, researchers are frequently faced with a choice between several computational methods for performing data analyses. Benchmarking studies aim to rigorously compare the performance of different methods using well-characterized benchmark datasets, to determine the strengths of each method or to provide recommendations regarding suitable choices of methods for an analysis. However, benchmarking studies must be carefully designed and implemented to provide accurate, unbiased, and informative results. Here, we summarize key practical guidelines and recommendations for performing high-quality benchmarking analyses, based on our experiences in computational biology.
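
    As a concrete illustration of the kind of comparison such a study formalizes, the minimal sketch below scores several methods on a set of benchmark datasets with a single accuracy metric. The method names, datasets, and metric are placeholders invented for illustration, not material from the paper.

```python
# Minimal benchmarking sketch (hypothetical methods and datasets, not from the paper).
from statistics import mean

def run_benchmark(methods, datasets, metric):
    """Apply every method to every dataset and tabulate one performance metric."""
    results = {}
    for name, method in methods.items():
        scores = [metric(method(data), truth) for data, truth in datasets]
        results[name] = {"per_dataset": scores, "mean": mean(scores)}
    return results

def accuracy(predicted, truth):
    """Fraction of predictions matching the ground truth."""
    return sum(p == t for p, t in zip(predicted, truth)) / len(truth)

# Toy example: "methods" are classifiers represented as simple thresholding rules.
methods = {
    "threshold_0.5": lambda xs: [x > 0.5 for x in xs],
    "threshold_0.7": lambda xs: [x > 0.7 for x in xs],
}
datasets = [
    ([0.1, 0.6, 0.8, 0.4], [False, True, True, False]),
    ([0.9, 0.2, 0.65, 0.3], [True, False, False, False]),
]

for name, res in run_benchmark(methods, datasets, accuracy).items():
    print(f"{name}: mean accuracy {res['mean']:.2f}")
```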

    To What Extent Can Individualisation in Terms of Different Types of Mode Improve Learning Outcomes and Learner Satisfaction? A Pre-study

    With the latest technological developments and the associated new possibilities in teaching, the personalisation of learning is gaining more and more importance. It assumes that individual learning experiences and results could generally be improved when personal learning preferences are considered. To do justice to the complexity of the personalisation possibilities of teaching and learning processes, we illustrate the components of learning and teaching in the digital environment and their interdependencies in an initial model. Furthermore, in a pre-study, we investigate the relationships between the learner's ability to self-organise (digitally), the learner's prior knowledge, learning in different variants of mode, and learning outcomes as one part of this model. With this pre-study, we are taking the first step towards a holistic model of teaching and learning in digital environments.

    Estimation of Bluefin Tuna (Thunnus thynnus) mean length in sea cages by acoustical means

    [EN] This paper proposes an indirect method to estimate Bluefin Tuna (Thunnus thynnus) biomass in cages using acoustic techniques. Two Simrad EK60 echosounders working at 120 and 200 kHz and a stereo camera were used to obtain target strength (TS) to fork length (FL) relationships for both operating frequencies. The equipment was placed at the bottom of a floating cage, facing towards the surface to record the ventral aspect of the fish. The acoustic and optical recordings were automatically analysed, and the combination of acoustic and optical results provided unequivocal TS-FL assignments. Good relationships between TS and FL were obtained for both frequencies, even without discriminating data from different fish tilts and without using beam directivity compensation. Stronger correlations were obtained for compensated TS at 200 kHz when reduced tuna swimming tilt was considered. TS measurements were compared to MFS numerical predictions for a Bluefin swimbladder model, with the simulation results showing good agreement with experimental measurements. The results allow the mean tuna length to be predicted in growing or fattening cages from acoustic data, raising the possibility of improving production management and of providing a useful tool for catch control estimations made by international organisations (like ICCAT) and government bodies. (c) 2022 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). This work was supported by funding from the ACUSTUNA project ref. CTM2015-70446-R (MINECO/ERDF, EU). The authors are grateful for the support provided by Grup Balfego for their collaboration and implication in installing and maintaining the experimental setup in their tuna cages at l'Ametlla de Mar and for providing an ABFT specimen for x-ray imaging. Puig Pons, V.; Muñoz-Benavent, P.; Pérez Arjona, I.; Ladino-Velásquez, A.; Llorens-Escrich, S.; Andreu García, G.; Valiente González, JM.... (2022). Estimation of Bluefin Tuna (Thunnus thynnus) mean length in sea cages by acoustical means. Applied Acoustics. 197:1-13. https://doi.org/10.1016/j.apacoust.2022.108960
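
    The core quantitative step described here is relating acoustic target strength to fork length. The sketch below fits a TS-FL regression of the conventional form TS = a·log10(FL) + b and inverts it to estimate mean length; the measurement pairs and resulting coefficients are invented for illustration and are unrelated to the values reported in the paper.

```python
# Minimal sketch of fitting a TS-FL regression of the form TS = a*log10(FL) + b.
# The measurement pairs below are invented for illustration only.
import numpy as np

fork_length_cm = np.array([150.0, 170.0, 190.0, 210.0, 230.0])      # FL from stereo camera
target_strength_db = np.array([-22.5, -21.2, -20.4, -19.3, -18.6])  # TS from echosounder

# Least-squares fit of TS against log10(FL).
a, b = np.polyfit(np.log10(fork_length_cm), target_strength_db, deg=1)
print(f"TS = {a:.2f} * log10(FL) + {b:.2f}")

# Inverting the fit gives a mean-length estimate from an observed mean TS.
mean_ts = -20.0
estimated_fl = 10 ** ((mean_ts - b) / a)
print(f"Estimated mean fork length: {estimated_fl:.1f} cm")
```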

    Analysis and design of a capsule landing system and surface vehicle control system for Mars exploration

    Problems related to the design and control of a mobile planetary vehicle to implement a systematic plan for the exploration of Mars are reported. Problem areas include: vehicle configuration, control, dynamics, systems and propulsion; systems analysis, terrain modeling and path selection; and chemical analysis of specimens. These tasks are summarized: vehicle model design, mathematical model of vehicle dynamics, experimental vehicle dynamics, obstacle negotiation, electrochemical controls, remote control, collapsibility and deployment, construction of a wheel tester, wheel analysis, payload design, system design optimization, effect of design assumptions, accessory optimal design, on-board computer subsystem, laser range measurement, discrete obstacle detection, obstacle detection systems, terrain modeling, path selection system simulation and evaluation, gas chromatograph/mass spectrometer system concepts, and chromatograph model evaluation and improvement.

    NASA Composite Materials Development: Lessons Learned and Future Challenges

    Composite materials have emerged as the materials of choice for increasing the performance and reducing the weight and cost of military, general aviation, and transport aircraft and space launch vehicles. Major advancements have been made in the ability to design, fabricate, and analyze large complex aerospace structures. The recent efforts by Boeing and Airbus to incorporate composites into primary load-carrying structures of large commercial transports and to certify the airworthiness of these structures are evidence of the significant advancements made in the understanding and use of these materials in real-world aircraft. NASA has been engaged in research on composites since the late 1960s and has worked to address many development issues with these materials in an effort to ensure safety, improve performance, and improve the affordability of air travel for the public good. This research has ranged from the synthesis of advanced resin chemistries to the development of mathematical analysis tools to reliably predict the response of built-up structures under combined load conditions. The lessons learned from this research are highlighted with specific examples to illustrate the problems encountered and the solutions to these problems. Examples include specific technologies related to environmental effects, processing science, fabrication technologies, nondestructive inspection, damage tolerance, micromechanics, structural mechanics, and residual life prediction. The current state of the technology is reviewed and key issues requiring additional research are identified. Also, grand challenges to be solved for expanded use of composites in aerostructures are identified.

    Saturation driven flow controller for oxygen therapy

    Despite receiving oxygen therapy, many COPD patients experience extended periods of hypoxemia during routine daily activities. In others, inappropriately high oxygen flow rates can depress the hypoxic drive, leading to worsening CO2 retention. As flow rates during LTOT are fixed, oxygen delivery does not respond to patients' fluctuations in oxygen demand. This research project aimed to develop and evaluate a closed-loop control method capable of actively varying flow rates in response to the measured oxygen demand. We demonstrate how SpO2 from ambulatory or overnight pulse oximetry can be used as feedback into an automated O2 flow-rate controller. A model to mimic the patient oxygen saturation response was developed in a computer simulation to help characterize the closed-loop system. With the collaboration of the Academic Unit for Sleep and Breathing at the Royal Brompton Hospital, the controller response was also validated against patient saturation measurements recorded during overnight pulse oximetry monitoring. Preclinical computer simulations indicated an improved matching between oxygen supply and demand, maintaining SpO2 above threshold to maximize therapeutic efficacy. An investigational system capable of regulating Saturation Driven Oxygen Therapy (SDOT) was constructed. In a randomised cross-over clinical pilot study, we further evaluated the SDOT system against constant-flow LTOT during exercise. The clinical results indicate that, compared to standard oxygen therapy, SDOT produced a significant reduction in time spent with hypoxemia, decreased the extent of hypoxemia, and lowered mean heart rates during exercise. Moreover, for patients with acceptable resting oxygen levels, SDOT provided conservation benefits by reducing the rate of oxygen consumption. The study established the potential to significantly improve the efficacy and economic delivery of this gold-standard therapy.
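
    The closed-loop idea described above, measured SpO2 fed back to adjust the oxygen flow rate, can be illustrated with a toy simulation. The proportional-integral controller and the first-order patient response below are illustrative assumptions only, not the SDOT controller or the patient model developed in the thesis.

```python
# Toy closed-loop sketch: a PI controller drives O2 flow from measured SpO2.
# Gains, limits, and the first-order "patient" response are illustrative assumptions,
# not the SDOT controller described in the thesis.

TARGET_SPO2 = 92.0             # % saturation setpoint
KP, KI = 0.4, 0.05             # proportional and integral gains (arbitrary)
FLOW_MIN, FLOW_MAX = 0.0, 6.0  # deliverable flow range, L/min
DT = 1.0                       # control interval, seconds

def patient_response(spo2, flow, demand, dt=DT):
    """Crude first-order model: saturation drifts toward a flow-dependent equilibrium."""
    equilibrium = 88.0 + 1.5 * flow - demand
    return spo2 + dt * 0.1 * (equilibrium - spo2)

def simulate(steps=300):
    spo2, flow, integral = 90.0, 1.0, 0.0
    for t in range(steps):
        demand = 4.0 if 100 <= t < 200 else 0.0   # simulated exertion episode
        error = TARGET_SPO2 - spo2
        integral += error * DT
        flow = min(FLOW_MAX, max(FLOW_MIN, KP * error + KI * integral))
        spo2 = patient_response(spo2, flow, demand)
        if t % 50 == 0:
            print(f"t={t:3d}s  SpO2={spo2:5.1f}%  flow={flow:4.2f} L/min")

simulate()
```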

    Treatment Planning Automation for Rectal Cancer Radiotherapy

    Background: Rectal cancer is a common type of cancer. There is an acute health disparity across the globe, with a significant portion of the world's population lacking adequate access to radiotherapy, which is part of the standard of care for rectal cancer. Safe radiotherapy treatments require specialized planning expertise and are time-consuming and labor-intensive to produce. Purpose: To alleviate the health disparity and promote the safe, high-quality use of radiotherapy in treating rectal cancers, the entire treatment planning process needs to be automated. The purpose of this project is to develop automated solutions for the treatment planning process of rectal cancers that would produce clinically acceptable and high-quality plans. To achieve this goal, we first automated two common existing treatment techniques, 3DCRT and VMAT, for rectal cancers, and then explored an alternative method for creating a treatment plan using deep learning. Methods: To automate the 3DCRT treatment technique, we used deep learning to predict the shapes of field apertures for primary and boost fields based on the CT and the location and shapes of the GTV and involved lymph nodes. The results of the predicted apertures were evaluated by a GI radiation oncologist. We then designed an algorithm to automate the forward-planning process with the capacity to add fields to homogenize the dose at the target volumes using the field-in-field technique. The algorithm was validated on the clinical apertures, and the plans produced were scored by a radiation oncologist. The field aperture prediction and the algorithm were combined into an end-to-end process and were tested on a separate set of patients. The resulting final plans were scored by a GI radiation oncologist for their clinical acceptability. To automate the VMAT treatment technique, we used deep learning models to segment the CTV and OARs and automated the inverse planning process based on a RapidPlan model. The end-to-end process requires only the GTV contour and a CT scan as inputs. Specifically, the segmentation models could auto-segment the CTV, bowel bag, large bowel, small bowel, total bowel, femurs, bladder, bone marrow, and female and male genitalia. All the OARs were contoured under the guidance of and reviewed by a GI radiation oncologist. For auto-planning, the RapidPlan model was designed for VMAT delivery with 3 arcs and validated separately by two GI radiation oncologists. Finally, the end-to-end pipeline was evaluated on a separate set of testing patients, and the resulting plans were scored by two GI radiation oncologists. Existing inverse planning methods rely on 1D information from DVH values, 2D information from DVH lines, or 3D dose distributions using machine learning for plan optimization. The project explored the possibility of using deep learning to create 3D dose distributions directly for VMAT treatment plans. The training data consisted of patients treated with the VMAT treatment technique in the short-course fractionation scheme that uses 5 Gy per fraction for 5 fractions. Two deep learning architectures were investigated for their ability to emulate clinical dose distributions: 3D DDUNet and 2D cGAN. The top-performing model for each architecture was identified based on the differences in DVH values, DVH lines, and dose distribution between the predicted dose and the corresponding clinical plans.
Results: For 3DCRT automation, the predicted apertures were 100%, 95%, and 87.5% clinically acceptable for the posterior-anterior, lateral, and boost apertures, respectively. The forward planning algorithm created wedged plans that were 85% clinically acceptable when applied to clinical apertures. The end-to-end workflow generated 97% clinically acceptable plans for the separate test patients. For the VMAT automation, 89% of CTV contours were clinically acceptable without requiring modifications, and all the OAR contours were clinically acceptable without edits except for the large and small bowels. The RapidPlan model produced plans judged 100% and 91% clinically acceptable by the two GI radiation oncologists. In testing the end-to-end workflow, 88% and 62% of the final plans were accepted by the two GI radiation oncologists. For the evaluation of deep learning architectures, the top-performing model of the DDUNet architecture used the medium patch size and inputs of the CT, the PTV times prescription dose mask, the CTV, a 10 mm PTV expansion, and the external body structure. The model with inputs of the CT, PTV, and CTV masks performed best for the cGAN architecture. Both the DDUNet and cGAN architectures could predict 3D dose distributions with DVH values that were statistically the same as those of the clinical plans. Conclusions: We have successfully automated the clinical workflow for generating either 3DCRT or VMAT radiotherapy plans for rectal cancer at our institution. This project showed that the existing treatment planning techniques for rectal cancer can be automated to generate clinically acceptable and safe plans with minimal inputs and no human intervention for most patients. The project also showed that deep learning architectures can be used to predict dose distributions.
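
    Since plan and model quality above is judged largely through DVH values, the sketch below extracts a DVH-style metric (Dx%, the minimum dose received by the hottest x% of a structure) from a 3D dose array and compares a predicted against a clinical distribution. The array shapes, masks, and doses are invented and unrelated to the study's data or models.

```python
# Minimal sketch of a DVH-style comparison between a predicted and a clinical
# 3D dose distribution. Shapes, masks, and doses are invented for illustration.
import numpy as np

def dose_at_volume(dose, structure_mask, volume_percent):
    """D_x%: the minimum dose received by the hottest x% of the structure's voxels."""
    voxels = np.sort(dose[structure_mask])[::-1]          # structure doses, high to low
    index = int(np.ceil(volume_percent / 100.0 * voxels.size)) - 1
    return voxels[index]

rng = np.random.default_rng(0)
shape = (64, 64, 32)
clinical_dose = rng.normal(25.0, 1.0, shape)                  # Gy, toy "clinical" plan
predicted_dose = clinical_dose + rng.normal(0.0, 0.3, shape)  # toy model prediction
ptv_mask = np.zeros(shape, dtype=bool)
ptv_mask[20:40, 20:40, 10:20] = True                          # toy PTV region

for vol in (95, 50, 2):
    d_clin = dose_at_volume(clinical_dose, ptv_mask, vol)
    d_pred = dose_at_volume(predicted_dose, ptv_mask, vol)
    print(f"D{vol}%  clinical {d_clin:.2f} Gy  predicted {d_pred:.2f} Gy  diff {d_pred - d_clin:+.2f} Gy")
```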

    HEALTH LITERACY: A BIBLIOMETRIC AND CITATION ANALYSIS

    The concept of health literacy finds its origins in the field of education. In its brief history, the definition, structure, and direction of the field have changed dramatically, and it has emerged as a multidisciplinary endeavor full of discipline-specific manifestations, most recently public health literacy. Using bibliometric and citation analyses, this study investigated the field of health literacy from the first use of the term in 1974 to the present year, 2010. A range of databases from the various disciplines that have contributed to the field were searched using the keyword string, “health literacy.” The data were organized, cleaned, and parsed using EndNote X3. A free, Java-based application, CiteSpace, was used to visualize author co-citations, document co-citations, keyword co-occurrences, and document co-citation clusters. This research provides researchers, librarians, and others interested in the field with information to efficiently conduct literature searches and understand the structure of the field. In addition, it provides insight into how and where the field may be progressing in terms of multi- and interdisciplinary research.
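
    As an illustration of the co-citation counts underlying such maps, the sketch below tallies document co-citations from a few invented reference lists; it is not the CiteSpace workflow used in the study, and the reference keys are placeholders.

```python
# Toy document co-citation count: two references are "co-cited" when they appear
# together in the same citing paper's reference list. Reference lists are invented.
from collections import Counter
from itertools import combinations

reference_lists = [
    ["Nutbeam2000", "Parker1995", "AMA1999"],
    ["Nutbeam2000", "Parker1995"],
    ["Parker1995", "AMA1999", "Baker2006"],
]

co_citations = Counter()
for refs in reference_lists:
    for pair in combinations(sorted(set(refs)), 2):
        co_citations[pair] += 1

for (a, b), count in co_citations.most_common():
    print(f"{a} -- {b}: co-cited {count} time(s)")
```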