
    Consistent Point Data Assimilation in Firedrake and Icepack

    We present methods and tools that significantly improve the ability to estimate quantities and fields that are difficult to measure directly, such as the fluidity of ice, from point data sources, such as satellite altimetry. These methods work with both sparse and dense point data, and the estimated quantities and fields become more accurate as the number of measurements increases. Such quantities and fields are often used as inputs to mathematical models that are used to make predictions, so improving their accuracy is of vital importance. We demonstrate how our methods and tools can increase the accuracy of results, ensure posterior consistency, and aid discourse between modellers and experimenters. To do this, we bring point data into the finite element method ecosystem as discontinuous fields on meshes of disconnected vertices. Point evaluation can then be formulated as a finite element interpolation operation (dual-evaluation). Our new abstractions are well suited to automation. We demonstrate this by implementing them in Firedrake, which generates highly optimised code for solving PDEs with the finite element method. Our solution integrates with dolfin-adjoint/pyadjoint, which allows PDE-constrained optimisation problems, such as data assimilation, to be solved through forward- and adjoint-mode automatic differentiation. We demonstrate our new functionality through examples in the fields of groundwater hydrology and glaciology.
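    The workflow this abstract describes, a PDE-constrained least-squares fit of a model parameter to point observations, can be sketched in miniature without Firedrake. The toy below estimates a scalar diffusivity in a 1D Poisson problem from a handful of point measurements; the finite-difference solver, observation locations, and parameter bounds are all illustrative stand-ins, not the paper's actual Firedrake/pyadjoint implementation.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    # Illustrative sketch only: a 1D stand-in for PDE-constrained data
    # assimilation with point observations. Firedrake's real interface
    # (VertexOnlyMesh, pyadjoint controls, etc.) is richer than this.

    n = 101
    x = np.linspace(0.0, 1.0, n)
    h = x[1] - x[0]

    def solve_poisson(m):
        """Solve -m u'' = 1 on (0, 1) with u(0) = u(1) = 0 by finite differences."""
        A = np.zeros((n, n))
        b = np.ones(n)
        for i in range(1, n - 1):
            A[i, i - 1] = A[i, i + 1] = -m / h**2
            A[i, i] = 2 * m / h**2
        A[0, 0] = A[-1, -1] = 1.0  # Dirichlet boundary rows
        b[0] = b[-1] = 0.0
        return np.linalg.solve(A, b)

    # Synthetic "point data": evaluate the true solution at scattered points.
    m_true = 2.0
    u_true = solve_poisson(m_true)
    obs_idx = [17, 42, 63, 88]  # arbitrary observation locations
    d = u_true[obs_idx]

    def misfit(m):
        """Least-squares mismatch between the model and the point observations."""
        u = solve_poisson(m)
        return np.sum((u[obs_idx] - d) ** 2)

    result = minimize_scalar(misfit, bounds=(0.1, 10.0), method="bounded")
    print(result.x)  # recovered parameter, close to m_true = 2.0
    ```

    In Firedrake the same misfit would be assembled from a field interpolated onto a mesh of the observation points, and the minimisation gradient would come from adjoint-mode automatic differentiation rather than a bounded scalar search.
    
    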

    Evaluating a new generation of wearable high-density diffuse optical tomography (HD-DOT) technology via retinotopic mapping in the adult brain

    We investigated the performance of a novel HD-DOT system by replicating a series of classic visual stimulation paradigms. Haemodynamic response functions and cortical activation maps replicated the results obtained with larger fibre-based systems.

    ANIMATE: Wearable, flexible, and ultra-lightweight high-density diffuse optical tomography technologies for functional neuroimaging of newborns

    We have developed a series of wearable high-density diffuse optical tomography (HD-DOT) technologies specifically for neonatal applications. These systems provide an ultra-lightweight form factor, a low profile, and high mechanical flexibility. This new technology is validated using a novel, anatomically accurate dynamic phantom.

    Evaluating a new generation of wearable high-density diffuse optical tomography technology via retinotopic mapping of the adult visual cortex

    Significance: High-density diffuse optical tomography (HD-DOT) has been shown to approach the resolution and localization accuracy of blood oxygen level dependent functional magnetic resonance imaging in the adult brain by exploiting densely spaced, overlapping samples of the probed tissue volume, but the technique has to date required large and cumbersome optical fiber arrays. Aim: To evaluate a wearable HD-DOT system that provides a sampling density comparable to large, fiber-based HD-DOT systems, but with vastly improved ergonomics. Approach: We investigated the performance of this system by replicating a series of classic visual stimulation paradigms, carried out in one highly sampled participant during 15 sessions to assess imaging performance and repeatability. Results: Hemodynamic response functions and cortical activation maps replicate the results obtained with larger fiber-based systems. Our results demonstrate focal activations in both oxyhemoglobin and deoxyhemoglobin, with a high degree of repeatability observed across all sessions. A comparison with a simulated low-density array explicitly demonstrates the improvements in spatial localization, resolution, repeatability, and image contrast that can be obtained with this high-density technology. Conclusions: The system offers the possibility of minimally constrained, spatially resolved functional imaging of the human brain in almost any environment and holds particular promise for enabling neuroscience applications outside of the laboratory setting. It also opens up new opportunities to investigate populations unsuited to traditional imaging technologies. [Abstract copyright: © 2021 The Authors.]

    Design and validation of a mechanically flexible and ultra-lightweight high-density diffuse optical tomography system for functional neuroimaging of newborns

    Neonates are a highly vulnerable population: the risk of brain injury is greater during the first days and weeks after birth than at any other time of life. Functional neuroimaging that can be performed longitudinally and at the cot-side has the potential to improve our understanding of the evolution of multiple forms of neurological injury over the perinatal period. However, existing technologies make it very difficult to perform repeated and/or long-duration functional neuroimaging experiments at the cot-side. We aimed to create a modular, high-density diffuse optical tomography (HD-DOT) technology specifically for neonatal applications that is ultra-lightweight, low profile, and highly mechanically flexible. We then sought to validate this technology using an anatomically accurate dynamic phantom. An advanced 10-layer rigid-flexible printed circuit board technology was adopted as the basis for the DOT modules, which allows for a compact module design that also provides the flexibility needed to conform to the curved infant scalp. Two module layouts were implemented: dual-hexagon and triple-hexagon. Using in-built board-to-board connectors, the system can be configured to provide a vast range of possible layouts. Using epoxy resin, thermochromic dyes, and MRI-derived 3D-printed moulds, we constructed an electrically switchable, anatomically accurate dynamic phantom, which was used to quantify the imaging performance of our flexible, modular HD-DOT system. Using one particular module configuration designed to cover the infant sensorimotor system, the device provided 36 source and 48 detector positions and over 700 viable DOT channels per wavelength, with source-detector separations starting at 10 mm. The total weight of this system is only 70 g. The signal changes from the dynamic phantom, while slow, closely simulated real hemodynamic response functions. Using difference images obtained from the phantom, the measured 3D localization error provided by the system at the depth of the cortex was in the range of 3 to 6 mm, and the lateral image resolution at the depth of the neonatal cortex was estimated to be as good as 10 to 12 mm. The HD-DOT system described is ultra-lightweight, low profile, conforms to the infant scalp, and provides excellent imaging performance. It is expected that this device will make functional neuroimaging of the neonatal brain at the cot-side significantly more practical and effective. [Abstract copyright: © 2021 The Authors.]
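    As a rough aid to reading the channel counts quoted above: in HD-DOT a "channel" is usually a source-detector pair whose separation falls within a usable range, so 36 sources and 48 detectors give up to 36 × 48 = 1728 candidate pairs per wavelength, of which a geometry-dependent subset is viable. The snippet below illustrates that counting step with made-up optode positions and an assumed 10-45 mm separation window; it does not reproduce the ANIMATE system's actual layout or channel-selection criterion.

    ```python
    import numpy as np

    # Hypothetical illustration: count "viable" HD-DOT channels as the
    # source-detector pairs whose separation falls in a usable range.
    # Positions and thresholds are made up for the sketch.

    rng = np.random.default_rng(0)
    sources = rng.uniform(0, 100, size=(36, 2))    # 36 sources on a 100 mm patch
    detectors = rng.uniform(0, 100, size=(48, 2))  # 48 detectors

    # Pairwise source-detector distances (36 x 48 matrix), in mm.
    diff = sources[:, None, :] - detectors[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)

    # A channel is treated as viable if its separation permits adequate SNR
    # and depth sensitivity; 10-45 mm is an assumed rule of thumb here.
    viable = (dist >= 10) & (dist <= 45)
    print(viable.sum(), "viable channels out of", dist.size)
    ```

    The real count depends on the module geometry and the signal-quality criteria applied to each pair, which is why the paper reports "over 700 viable channels" rather than the full 1728.
    
    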

    firedrakeproject/fiat: The Finite Element Automated Tabulator

    This release is specifically created to document the version of fiat used in a particular set of experiments using Firedrake. Please do not cite this as a general source for Firedrake or any of its dependencies. Instead, refer to https://www.firedrakeproject.org/citing.html

    A practical approach to the older patient with cancer


    Guidelines for the use and interpretation of assays for monitoring autophagy

    In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process vs. those that measure flux through the autophagy pathway (i.e., the complete process); thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from stimuli that result in increased autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.

    Guidelines for the use and interpretation of assays for monitoring autophagy


    Guidelines for the use and interpretation of assays for monitoring autophagy
