
    OPTIMAL DESIGN OF A CONVECTION-ALLOWING ENSEMBLE FROM A MODEL ERROR PERSPECTIVE

    As computer technology continues to improve, resources are becoming increasingly available for running an ensemble of NWP simulations with sufficient resolution that convection parameterization is no longer needed and individual storms can be captured on the grid scale. Because of past limitations on technology, research into the optimal design of convection-allowing ensembles has remained limited. The purpose of ensembles is to account for forecast uncertainty that arises from errors in the forecast process. There are errors in the initial conditions resulting from incomplete spatiotemporal sampling of the atmosphere and from measurement error. There are errors in the lateral boundary conditions that drive limited-area models (as is the case for current experimental convection-allowing ensembles). There are also errors associated with the model formulation, caused by numerical error from discretization of the model grid, truncation of numerical schemes, and inadequate subgrid-scale physics parameterizations that provide source and sink terms for the resolved-scale variables. This dissertation focuses on methods of accounting for uncertainty in assorted model physics components. Three research projects were performed using the WRF model with 4 km grid spacing. One investigated methods of accounting for microphysics uncertainty. Another investigated the impact of adding a stochastic model error representation scheme. The third investigated methods of accounting for uncertainty in the land-surface model component. Two methods of perturbing the microphysics were compared: perturbing fixed parameters within a single scheme and using multiple schemes. The latter, termed mixed microphysics, was found to generate somewhat more skillful and more reliable probabilistic forecasts of precipitation, although the former, termed perturbed-parameter microphysics, also performed well. A stochastic model error scheme that was originally created to improve large-scale forecasts in the ECMWF model was converted for use in the WRF model. The stochastic kinetic energy backscatter (SKEB) scheme injects kinetic energy at all scales using a forced power spectrum. Its utility in a convection-allowing ensemble was analyzed; the scheme was found to add significant ensemble spread while also reducing ensemble mean error. The SKEB scheme does not perturb any moisture variables, but it nevertheless produced slight improvements in precipitation forecasts. The final portion of this dissertation features an exploratory study to determine a perturbation strategy for the land-surface model component. A literature review revealed many uncertainties in current land-surface models that have not been accounted for in prior experimental convection-allowing ensembles. A set of perturbations reflecting uncertainty in the calculation of sensible and latent heat flux was determined and applied to a small set of cases. The sensitivity of convection forecasts to these perturbations was assessed and compared to that from other physics perturbations (microphysics and PBL physics). Adding land-surface model perturbations to the other existing physics perturbations was found to further increase ensemble diversity and improve probabilistic forecasts of precipitation. Overall, it is encouraging that the WRF provides a wealth of resources for accounting for model physics uncertainty in convection-allowing NWP forecasts.
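
    To make the precipitation verification ideas above concrete, here is a minimal sketch of scoring ensemble exceedance probabilities with a Brier score and a simple reliability table. The array shapes, the 1 mm threshold, and the synthetic gamma-distributed data are illustrative assumptions; none of it is taken from the dissertation's actual verification setup.

        # Illustrative sketch only: thresholds, data, and array shapes are assumed.
        import numpy as np

        def exceedance_probability(ens_precip, threshold=1.0):
            """Fraction of ensemble members exceeding `threshold` (mm) at each point."""
            return (ens_precip > threshold).mean(axis=0)

        def brier_score(prob, obs_precip, threshold=1.0):
            """Brier score of exceedance probabilities against observed events."""
            event = (obs_precip > threshold).astype(float)
            return np.mean((prob - event) ** 2)

        def reliability_table(prob, obs_precip, threshold=1.0, n_bins=10):
            """Mean forecast probability vs. observed frequency within probability bins."""
            event = (obs_precip > threshold).astype(float)
            edges = np.linspace(0.0, 1.0, n_bins + 1)
            bins = np.clip(np.digitize(prob, edges) - 1, 0, n_bins - 1)
            rows = []
            for b in range(n_bins):
                mask = bins == b
                if mask.any():
                    rows.append((prob[mask].mean(), event[mask].mean(), int(mask.sum())))
            return rows

        # Synthetic example: 10 members and one "observed" precipitation field.
        rng = np.random.default_rng(0)
        ens = rng.gamma(shape=0.5, scale=2.0, size=(10, 5000))   # mm of precipitation
        obs = rng.gamma(shape=0.5, scale=2.0, size=5000)
        prob = exceedance_probability(ens, threshold=1.0)
        print("Brier score:", brier_score(prob, obs, threshold=1.0))
        for p_fcst, p_obs, count in reliability_table(prob, obs):
            print(f"forecast {p_fcst:.2f}  observed {p_obs:.2f}  n={count}")

    Lower Brier scores indicate more skillful probabilistic forecasts, and a reliability table whose forecast and observed columns agree indicates a well-calibrated ensemble.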

    Reproducibility of graph metrics of human brain structural networks

    Recent interest in human brain connectivity has led to the application of graph theoretical analysis to human brain structural networks, in particular white matter connectivity inferred from diffusion imaging and fiber tractography. While these methods have been used to study a variety of patient populations, there has been less examination of the reproducibility of these methods. A number of tractography algorithms exist, and many of these are known to be sensitive to user-selected parameters. The methods used to derive a connectivity matrix from fiber tractography output may also influence the resulting graph metrics. Here we examine how these algorithm and parameter choices influence the reproducibility of proposed graph metrics on a publicly available test-retest dataset consisting of 21 healthy adults. The Dice coefficient is used to examine topological similarity of constant-density subgraphs both within and between subjects. Seven graph metrics are examined here: mean clustering coefficient, characteristic path length, largest connected component size, assortativity, global efficiency, local efficiency, and rich club coefficient. The reproducibility of these network summary measures is examined using the intraclass correlation coefficient (ICC). Graph curves are created by treating the graph metrics as functions of a parameter such as graph density. Functional data analysis techniques are used to examine differences in graph measures that result from the choice of fiber tracking algorithm. The graph metrics consistently showed good levels of reproducibility as measured with ICC, with the exception of some instability at low graph density levels. The global and local efficiency measures were the most robust to the choice of fiber tracking algorithm.
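
    As a rough illustration of the workflow described above, the sketch below thresholds a weighted connectivity matrix to a fixed edge density, computes a few of the listed graph metrics, and summarizes test-retest agreement with an intraclass correlation. The ICC(2,1) variant, the 90-node matrices, the 0.2 density, and the synthetic data are assumptions for illustration only; the paper's tractography and parcellation choices are not reproduced here.

        # Illustrative sketch only: matrix sizes, density, and ICC variant are assumed.
        import numpy as np
        import networkx as nx

        def threshold_to_density(W, density):
            """Keep the strongest edges of symmetric weighted W to reach the requested density."""
            n = W.shape[0]
            iu = np.triu_indices(n, k=1)
            weights = W[iu]
            n_keep = int(round(density * weights.size))
            if n_keep == 0:
                return nx.empty_graph(n)
            cutoff = np.sort(weights)[-n_keep]
            A = np.zeros_like(W)
            A[iu] = weights >= cutoff
            return nx.from_numpy_array(A + A.T)

        def metrics_at_density(W, density):
            G = threshold_to_density(W, density)
            largest = max(nx.connected_components(G), key=len)
            return {
                "clustering": nx.average_clustering(G),
                "largest_component": len(largest) / G.number_of_nodes(),
                "global_efficiency": nx.global_efficiency(G),
            }

        def icc_2_1(scores):
            """ICC(2,1): rows are subjects, columns are repeated sessions."""
            n, k = scores.shape
            grand = scores.mean()
            ms_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            ms_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum() / (k - 1)
            sse = ((scores - scores.mean(axis=1, keepdims=True)
                    - scores.mean(axis=0, keepdims=True) + grand) ** 2).sum()
            ms_err = sse / ((n - 1) * (k - 1))
            return (ms_rows - ms_err) / (
                ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

        # Synthetic test-retest data: 21 subjects, 2 sessions, 90-node matrices.
        rng = np.random.default_rng(1)
        scores = np.empty((21, 2))
        for s in range(21):
            base = rng.random((90, 90))
            for sess in range(2):
                W = base + 0.05 * rng.random((90, 90))
                W = (W + W.T) / 2
                np.fill_diagonal(W, 0)
                scores[s, sess] = metrics_at_density(W, 0.2)["global_efficiency"]
        print("ICC(2,1) for global efficiency at density 0.2:", icc_2_1(scores))

    In practice the same ICC computation would be repeated across a range of densities, producing the ICC-versus-density curves that expose the low-density instability noted above.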

    LOCOMOOR : a LOw-COst MOORing for the measurement of internal solitary waves

    Presented at the ONR/MTS Buoy Workshop, May 9-11, 2000, Clark Laboratory, Woods Hole Oceanographic Institution, Woods Hole, MA.
    In order to supplement the ASIAEX field effort to measure the temporal and spatial structure of the internal solitary wave field in relation to acoustic propagation and scattering studies, an array of low-cost temperature moorings (LOCOMOOR) has been developed. The basic concept is to provide spatial coverage as opposed to dense vertical resolution in temperature. Three temperature sensors on each mooring will adequately measure the time of passage of the internal solitary waves. A horizontal array of 20 of these moorings deployed for about three weeks will allow the internal solitary wave front geometry (curvature) and velocity to be measured as the waves propagate through the experiment region. The arrival time of each pulse within the packet of internal waves will be easily resolved, but the wave amplitude will be less exactly estimated. However, the amplitude will be very well measured by the velocity and density observations on the more heavily instrumented environmental moorings associated with the acoustic experiment.
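
    As a sketch of how front velocity could be recovered from such an array, the following least-squares fit treats the front as locally planar, so that arrival time varies linearly with position; curvature could be estimated analogously by adding quadratic terms. The mooring positions, arrival times, and noise level below are invented for illustration and are not LOCOMOOR data.

        # Illustrative sketch only: positions, times, and noise are synthetic.
        import numpy as np

        def plane_wave_fit(xy, t):
            """Least-squares fit of t ~ t0 + s . x for the slowness vector s.

            xy : (n, 2) mooring positions in meters (local east/north coordinates).
            t  : (n,) arrival times in seconds.
            Returns (speed in m/s, propagation azimuth in degrees CCW from east, t0).
            """
            A = np.column_stack([np.ones(len(t)), xy])      # columns: [1, x, y]
            (t0, sx, sy), *_ = np.linalg.lstsq(A, t, rcond=None)
            speed = 1.0 / np.hypot(sx, sy)
            azimuth = np.degrees(np.arctan2(sy, sx))
            return speed, azimuth, t0

        # Synthetic example: a front moving east at 0.8 m/s across a 5 km array.
        rng = np.random.default_rng(2)
        xy = rng.uniform(0, 5000, size=(20, 2))
        true_slowness = np.array([1 / 0.8, 0.0])            # s/m, eastward
        t = xy @ true_slowness + 100.0 + rng.normal(0, 5, 20)   # 5 s timing noise
        speed, azimuth, _ = plane_wave_fit(xy, t)
        print(f"estimated speed {speed:.2f} m/s, azimuth {azimuth:.1f} deg")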

    Multivariate Normalization with Symmetric Diffeomorphisms for Multivariate Studies

    Current clinical and research neuroimaging protocols acquire images using multiple modalities, for instance T1, T2, diffusion tensor, and cerebral blood flow magnetic resonance images (MRI). These multivariate datasets provide unique and often complementary anatomical and physiological information about the subject of interest. We present a method that uses fused multi-modality (scalar and tensor) datasets to perform intersubject spatial normalization. Our multivariate approach has the potential to eliminate inconsistencies that occur when normalization is performed on each modality separately. Furthermore, the multivariate approach uses a much richer anatomical and physiological image signature to infer image correspondences and perform multivariate statistical tests. In this initial study, we develop the theory for Multivariate Symmetric Normalization (MVSyN), establish its feasibility, and discuss preliminary results from a multivariate statistical study of 22q deletion syndrome.
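
    To make the idea of a fused multi-modality image signature concrete, the sketch below combines per-channel mismatch terms into a single weighted objective, which is the kind of quantity a multivariate registration would drive toward zero. This is a conceptual illustration only, not the MVSyN algorithm or any particular package's API; the channel names and weights are assumptions.

        # Conceptual sketch only: channels, weights, and data are assumed, not MVSyN.
        import numpy as np

        def multichannel_ssd(fixed, warped, weights):
            """Weighted sum-of-squared-differences over co-registered channels.

            fixed, warped : dicts mapping channel name -> ndarray of identical shape.
            weights       : dict mapping channel name -> relative weight.
            """
            total = 0.0
            for name, w in weights.items():
                diff = fixed[name] - warped[name]
                total += w * np.mean(diff ** 2)
            return total

        # Toy example: a T1 channel plus a fractional-anisotropy channel derived
        # from the diffusion tensor data, combined with 70/30 weights.
        rng = np.random.default_rng(3)
        shape = (64, 64, 64)
        fixed  = {"t1": rng.random(shape), "fa": rng.random(shape)}
        warped = {"t1": fixed["t1"] + 0.01 * rng.random(shape),
                  "fa": fixed["fa"] + 0.02 * rng.random(shape)}
        print(multichannel_ssd(fixed, warped, {"t1": 0.7, "fa": 0.3}))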

    Childhood socioeconomic status and childhood maltreatment: Distinct associations with brain structure

    The present study examined the relationship between childhood socioeconomic status (SES), childhood maltreatment, and the volumes of the hippocampus and amygdala measured between the ages of 25 and 36 years. Previous work has linked both low SES and maltreatment with reduced hippocampal volume in childhood, an effect attributed to childhood stress. In 46 adult subjects, only childhood maltreatment, and not childhood SES, predicted hippocampal volume in regression analyses, with greater maltreatment associated with lower volume. Neither factor was related to amygdala volume. When current SES and recent interpersonal stressful events were also considered, recent interpersonal stressful events predicted smaller hippocampal volumes over and above childhood maltreatment. Finally, exploratory analyses revealed a significant sex by childhood SES interaction, with women’s childhood SES showing a significantly more positive (less negative) relation with hippocampal volume than men’s. The overall effect of childhood maltreatment, but not SES, and the sex-specific effect of childhood SES indicate that different forms of stressful childhood adversity affect brain development differently.
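
    A minimal sketch of the kind of regression reported above, using hypothetical variable names and simulated data, might look like the following; the sex-by-childhood-SES interaction is expressed with a model formula.

        # Illustrative sketch only: variable names and data are simulated, not study data.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(4)
        n = 46
        df = pd.DataFrame({
            "maltreatment": rng.normal(size=n),        # standardized maltreatment score
            "childhood_ses": rng.normal(size=n),
            "recent_stress": rng.poisson(2, size=n),   # count of recent stressful events
            "sex": rng.choice(["female", "male"], size=n),
        })
        # Simulated outcome: maltreatment lowers volume; the SES slope differs by sex.
        df["hippocampal_volume"] = (
            4000
            - 80 * df["maltreatment"]
            - 30 * df["recent_stress"]
            + 40 * df["childhood_ses"] * (df["sex"] == "female")
            + rng.normal(0, 100, size=n)
        )

        model = smf.ols(
            "hippocampal_volume ~ maltreatment + recent_stress + childhood_ses * sex",
            data=df,
        ).fit()
        print(model.summary())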

    A Digital Atlas of the Dog Brain

    There is a long history of, and a growing interest in, the canine as a subject of study in neuroscience research and in translational neurology. In the last few years, anatomical and functional magnetic resonance imaging (MRI) studies of awake and anesthetized dogs have been reported. Such efforts can be enhanced by a population atlas of canine brain anatomy to implement group analyses. Here we present a canine brain atlas derived as the diffeomorphic average of a population of fifteen mesaticephalic dogs. The atlas includes: 1) A brain template derived from in-vivo, T1-weighted imaging at 1 mm isotropic resolution at 3 Tesla (with and without the soft tissues of the head); 2) A co-registered, high-resolution (0.33 mm isotropic) template created from imaging of ex-vivo brains at 7 Tesla; 3) A surface representation of the gray matter/white matter boundary of the high-resolution atlas (including labeling of gyral and sulcal features). The properties of the atlas are considered in relation to historical nomenclature and the evolutionary taxonomy of the Canini tribe. The atlas is available for download (https://cfn.upenn.edu/aguirre/wiki/public:data_plosone_2012_datta).

    Lessons from Elwha Ecosystem Restoration: Integrating science, policy, and management

    After decades of debate, planning, and environmental impact studies, the largest planned dam removal project in history was conducted on the Elwha River in Washington State, USA, from 2011 to 2014. Because this was such a unique and unprecedented project, the planning, implementation, and monitoring have occurred over 30 years and required diverse expertise across multiple stakeholders. This knowledge portfolio required that individuals and organizations with different missions collaborate effectively to ensure a successful project. The story of this river, with rugged headwaters, protected wilderness, legendary and culturally important salmon runs, and two hydroelectric dams whose placement marshaled wholesale socioeconomic and ecological changes, has become an iconic saga of change, perseverance, and renewal. The two dams blocked fish migrations and interrupted sediment transport for a century, disrupting the ecological structure and function of the Elwha River. Removal of the 64-m and 32-m tall dams, and the release of a large portion of the 21 million cubic meters of stored reservoir sediment, have provided a living laboratory to study the patterns, processes, and outcomes of dam removal across freshwater, estuarine, and marine ecosystem boundaries from multiple scientific perspectives. This collaborative, multidisciplinary approach has allowed us to improve our understanding of ecosystem responses and linkages while making the Elwha River one of the best-studied dam removal projects.