
    Total energy density as an interpretative tool

    We present an unambiguous formulation for the total energy density within density-functional theory. We propose that it be used as a tool for the interpretation of computed energy and electronic structure changes during structural transformations and chemical reactions, augmenting the present use of electron density changes and changes in the Kohn-Sham local density of states and Kohn-Sham energy density. Comment: 5 pages, 3 embedded figures, submitted to J. Chem. Phys.
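For context (this is the standard Kohn-Sham decomposition, not the paper's specific energy-density construction), the total energy in density-functional theory is typically written as

```latex
E[n] = T_s[n] + \int v_{\mathrm{ext}}(\mathbf{r})\, n(\mathbf{r})\, \mathrm{d}^3 r
     + \frac{1}{2}\iint \frac{n(\mathbf{r})\, n(\mathbf{r}')}{|\mathbf{r}-\mathbf{r}'|}\, \mathrm{d}^3 r\, \mathrm{d}^3 r'
     + E_{\mathrm{xc}}[n]
```

Any function $e(\mathbf{r})$ with $E = \int e(\mathbf{r})\,\mathrm{d}^3 r$ qualifies as an energy density; different choices differ by terms that integrate to zero, which is precisely the ambiguity an unambiguous formulation must resolve.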

    Alcoholic adaptation: a preliminary investigation of the transactional analysis viewpoint, with application to delta and gamma alcoholics

    Two delta and two gamma alcoholics were assessed by case study according to the theoretical formulations of Transactional Analysis. Scripts and Games were elicited through the Thematic Apperception Test, Laddering Procedure, and Life History. Analyses demonstrated, firstly, that parallels were present between the parent-child relationship and present adult transactions; secondly, that needs, fears, and control mechanisms were traceable to early parental injunctions; thirdly, that among these subjects, delta alcoholics tend to play the alcoholic game "Lush", and gamma alcoholics tend to play the alcoholic game "Drunk and Proud". It was concluded that the script and existential position appear to play an important role in the maintenance of the drinking pattern. Alcoholic Loss of Control appears to be influenced by the degree to which aggression is suppressed

    Dysbindin-1 in dorsolateral prefrontal cortex of schizophrenia cases is reduced in an isoform-specific manner unrelated to altered dysbindin-1 gene expression

    DTNBP1 (dystrobrevin binding protein 1) remains one of the top candidate genes in schizophrenia. Reduced expression of this gene and the protein it encodes, dysbindin-1, has been reported in the dorsolateral prefrontal cortex (DLPFC) of schizophrenia cases. It has not been established, however, if all dysbindin-1 isoforms are reduced in the DLPFC or if the reduction is associated with reduced DTNBP1 gene expression. Using Western blotting of whole-tissue lysates of the DLPFC with antibodies differentially sensitive to the three major isoforms of this protein (dysbindin-1A, -1B, and -1C), we found no significant differences between our schizophrenia cases and matched controls in dysbindin-1A or -1B, but did find a mean 46% reduction in dysbindin-1C in 71% of 28 case-control pairs (p = 0.022). This occurred in the absence of the one DTNBP1 risk haplotype for schizophrenia reported in the US and without alteration in levels of dysbindin-1C transcripts. Conversely, the absence of changes in the dysbindin-1A and -1B isoforms was accompanied by increased levels of their transcripts. We thus found no correspondence between alterations in dysbindin-1 gene and protein expression, the latter of which might be due to posttranslational modifications such as ubiquitination. Reduced DLPFC dysbindin-1C in schizophrenia probably occurs in postsynaptic densities (PSDs), where we find dysbindin-1C to be heavily concentrated in the human brain. Given known postsynaptic effects of dysbindin-1 reductions in the rodent homolog of the prefrontal cortex, these findings suggest that reduced dysbindin-1C in the DLPFC may contribute to cognitive deficits of schizophrenia by promoting NMDA receptor hypofunction.

    New QCD Sum Rules for Nucleons in Nuclear Matter

    Two new QCD sum rules for nucleons in nuclear matter are obtained from a mixed correlator of spin-1/2 and spin-3/2 interpolating fields. These new sum rules, which are insensitive to the poorly known four-quark condensates, provide additional information on the nucleon scalar self-energy. They are analyzed along with previous spin-1/2 interpolator-based sum rules that are also insensitive to the four-quark condensates. The analysis indicates consistency with the expectations of relativistic nuclear phenomenology at nuclear matter saturation density, although a weaker density dependence near saturation is suggested. Using previous estimates of in-medium condensate uncertainties, we find M^* = 0.64^{+0.13}_{-0.09} GeV and \Sigma_v = 0.29^{+0.06}_{-0.10} GeV at nuclear matter saturation density. Comment: 10-page RevTeX manuscript with embedded figures. Revised manuscript accepted for publication. This and related papers may also be obtained from http://www.phys.washington.edu/~derek/Publications.htm

    The Floodwater Depth Estimation Tool (FwDET v2.0) for improved remote sensing analysis of coastal flooding

    Remote sensing analysis is routinely used to map flooding extent either retrospectively or in near-real time. For flood emergency response, remote-sensing-based flood mapping is highly valuable as it can offer continued observational information about the flood extent over large geographical domains. Information about the floodwater depth across the inundated domain is important for damage assessment, rescue, and prioritizing of relief resource allocation, but cannot be readily estimated from remote sensing analysis. The Floodwater Depth Estimation Tool (FwDET) was developed to augment remote sensing analysis by calculating water depth based solely on an inundation map with an associated digital elevation model (DEM). The tool was shown to be accurate and was used in flood response activations by the Global Flood Partnership. Here we present a new version of the tool, FwDET v2.0, which enables water depth estimation for coastal flooding. FwDET v2.0 features a new flood boundary identification scheme which accounts for the lack of confinement of coastal flood domains at the shoreline. A new algorithm is used to calculate the local floodwater elevation for each cell, which improves the tool's runtime by a factor of 15 and alleviates inaccurate local boundary assignment across permanent water bodies. FwDET v2.0 is evaluated against physically based hydrodynamic simulations in both riverine and coastal case studies. The results show good correspondence, with an average difference of 0.18 and 0.31 m for the coastal (using a 1 m DEM) and riverine (using a 10 m DEM) case studies, respectively. An application of FwDET v2.0 to remote-sensing-derived flood maps is presented for three case studies. These case studies showcase FwDET v2.0's ability to efficiently provide a synoptic assessment of floodwater. Limitations include challenges in obtaining high-resolution DEMs and increases in uncertainty when applied to highly fragmented flood inundation domains.
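The core idea of depth estimation from an inundation map plus a DEM can be sketched as follows. This is a toy illustration of the general approach, not the published FwDET code: for each flooded cell, a local water-surface elevation is taken from the nearest flood-boundary cell, and depth is that elevation minus the cell's own DEM height. The function name and brute-force nearest-neighbour search are illustrative choices.

```python
import numpy as np

def estimate_depth(dem, flooded):
    """Toy FwDET-style depth estimation (illustrative, not the published tool).

    dem: 2-D array of ground elevations; flooded: boolean mask of inundation.
    Depth = elevation of nearest flood-boundary cell - cell's DEM height.
    """
    rows, cols = dem.shape
    # Boundary cells: flooded cells with at least one dry 4-neighbour.
    boundary = []
    for r in range(rows):
        for c in range(cols):
            if not flooded[r, c]:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and not flooded[rr, cc]:
                    boundary.append((r, c))
                    break
    depth = np.zeros_like(dem, dtype=float)
    for r in range(rows):
        for c in range(cols):
            if not flooded[r, c]:
                continue
            # Nearest boundary cell supplies the local water-surface elevation.
            br, bc = min(boundary, key=lambda p: (p[0] - r) ** 2 + (p[1] - c) ** 2)
            depth[r, c] = max(dem[br, bc] - dem[r, c], 0.0)
    return depth
```

On a one-row valley DEM such as `[3, 2, 1, 2, 3]` with cells at or below 2 m flooded, the boundary cells sit at 2 m and the valley floor receives a depth of 1 m. A production version would replace the brute-force search with a distance transform, which is where the runtime improvements described above come from.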

    Robot introspection through learned hidden Markov models

    In this paper we describe a machine learning approach for acquiring a model of a robot behaviour from raw sensor data. We are interested in automating the acquisition of behavioural models to provide a robot with an introspective capability. We assume that the behaviour of a robot in achieving a task can be modelled as a finite stochastic state transition system. Beginning with data recorded by a robot in the execution of a task, we use unsupervised learning techniques to estimate a hidden Markov model (HMM) that can be used both for predicting and explaining the behaviour of the robot in subsequent executions of the task. We demonstrate that it is feasible to automate the entire process of learning a high-quality HMM from the data recorded by the robot during execution of its task. The learned HMM can be used both for monitoring and controlling the behaviour of the robot. The ultimate purpose of our work is to learn models for the full set of tasks associated with a given problem domain, and to integrate these models with a generative task planner. We want to show that these models can be used successfully in controlling the execution of a plan. However, this paper does not develop the planning and control aspects of our work, focussing instead on the learning methodology and the evaluation of a learned model. The essential property of the models we seek to construct is that the most probable trajectory through a model, given the observations made by the robot, accurately diagnoses, or explains, the behaviour that the robot actually performed when making these observations. In the work reported here we consider a navigation task. We explain the learning process, the experimental setup and the structure of the resulting learned behavioural models. We then evaluate the extent to which explanations proposed by the learned models accord with a human observer's interpretation of the behaviour exhibited by the robot in its execution of the task.
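The "most probable trajectory through a model, given the observations" is what the Viterbi algorithm computes for an HMM. A minimal log-domain sketch (the two-state robot example and all probabilities below are hypothetical, not taken from the paper):

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most probable hidden-state path for a discrete-observation HMM.

    obs: sequence of observation symbol indices
    pi:  initial state probabilities, shape (n_states,)
    A:   transition matrix, A[i, j] = P(state j | state i)
    B:   emission matrix, B[i, k] = P(symbol k | state i)
    """
    logA, logB = np.log(A), np.log(B)
    delta = np.log(pi) + logB[:, obs[0]]  # best log-prob of ending in each state
    back = []                             # backpointers, one array per step
    for o in obs[1:]:
        scores = delta[:, None] + logA    # scores[i, j]: come from i, land in j
        back.append(scores.argmax(axis=0))
        delta = scores.max(axis=0) + logB[:, o]
    # Trace the best path backwards from the most likely final state.
    path = [int(delta.argmax())]
    for bp in reversed(back):
        path.append(int(bp[path[-1]]))
    return path[::-1]
```

For example, with hidden states "moving"/"stopped" and a noisy binary odometry reading, an observation run of low-then-high readings decodes into the state path that best explains it, which is exactly the diagnostic use described above.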

    NASA's Mid-Atlantic Communities and Areas at Intensive Risk Demonstration: Translating Compounding Hazards to Societal Risk

    Remote sensing provides a unique perspective on our dynamic planet, tracking changes and revealing the course of complex interactions. Long-term monitoring and targeted observation combine with modeling and mapping to provide increased awareness of hydro-meteorological and geological hazards. Disasters often follow hazards, and the goal of NASA's Disasters Program is to look at the earth as a highly coupled system to reduce risk and enable resilience. Remote sensing and geospatial science are used as tools to help answer critical questions that inform decisions. Data is not the same as information, nor does understanding of processes necessarily translate into decision support for disaster preparedness, response and recovery. Accordingly, NASA is engaging the scientific and decision-support communities to apply remote sensing, modeling, and related applications in Communities and Areas at Intensive Risk (CAIR). In 2017, NASA's Applied Sciences Disasters Program hosted a regional workshop to explore these issues with particular focus on coastal Virginia and North Carolina. The workshop brought together partners in academia, emergency management, and scientists from NASA and partnering federal agencies to explore capabilities among the team that could improve understanding of the physical processes related to these hazards and their potential impact on changing communities, and to identify methodologies for supporting emergency response and risk mitigation. The resulting initiative, the mid-Atlantic CAIR project, demonstrates the ability to integrate satellite-derived earth observations and physical models into actionable, trusted knowledge. Severe storms and associated storm surge, sea level rise, and land subsidence, coupled with increasing populations and densely populated, aging critical infrastructure, often leave coastal regions and their communities extremely vulnerable.
The integration of observations and models allows for a comprehensive understanding of the compounding risk experienced in coastal regions and enables individuals in all positions to make risk-informed decisions. This initiative uses a representative storm surge case as a baseline to produce flood inundation maps. These maps predict building-level impacts at the current day and under future sea level rise (SLR) and subsidence scenarios in order to inform critical decisions at both the tactical and strategic levels. To accomplish this analysis, the mid-Atlantic CAIR project brings together Federal research activities with academia to examine coastal hazards in multiple ways: 1) reanalysis of impacts from 2011 Hurricane Irene, using numerical weather modeling in combination with coastal surge, hydrodynamic, and urban inundation modeling to evaluate combined impact scenarios considering SLR and subsidence, 2) remote sensing of flood extent from available optical imagery, 3) adding value to remotely sensed flood maps through depth predictions, and 4) examining coastal subsidence as measured through time-series analysis of synthetic aperture radar observations. Efforts and results are published via ArcGIS story maps to communicate which neighborhoods and infrastructure are most vulnerable to changing conditions. Story map features enable time-aware flood mapping using hydrodynamic models, photographic comparison of flooding following Hurricane Irene, as well as visualization of heightened risk in the future due to SLR and land subsidence.

    Optimisation of sample thickness for THz-TDS measurements

    How thick should the sample be for a transmission THz-TDS measurement? Should the sample be as thick as possible? The answer is `no'. Although a thicker sample allows T-rays to interact with more bulk material, SNR rolls off with thickness due to signal attenuation. Then, should the sample be extremely thin? Again, the answer is `no'. A sample that is too thin renders itself nearly invisible to T-rays, in such a way that the system can hardly sense the difference between the sample and a free-space path. So, where is the optimal boundary between `too thick' and `too thin'? The trade-off is analysed and revealed in this paper, where our approach is to find the optimal thickness that results in the minimal variance of measured optical constants. Comment: 13 pages, 11 figures
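The trade-off can be illustrated with a simple noise model (a hedged toy calculation, not the paper's full variance analysis). If the field amplitude through a sample of thickness d is E(d) = E0·exp(-αd/2) and the amplitude carries additive noise σ, then the extracted absorption coefficient α̂ = -(2/d)·ln(E/E0) has standard deviation roughly (2σ/(d·E0))·exp(αd/2): the 1/d factor penalises thin samples, the exponential penalises thick ones, and the minimum falls at d = 2/α. The parameter values below are hypothetical.

```python
import numpy as np

alpha = 50.0          # power absorption coefficient, 1/cm (hypothetical)
sigma_over_E0 = 1e-3  # relative amplitude noise (hypothetical)

# Uncertainty of the extracted absorption coefficient vs sample thickness:
# too thin -> the 1/d factor blows up; too thick -> exp(alpha*d/2) blows up.
d = np.linspace(1e-4, 0.2, 20000)                       # thickness grid, cm
sigma_alpha = (2 * sigma_over_E0 / d) * np.exp(alpha * d / 2)

d_opt_numeric = d[np.argmin(sigma_alpha)]
d_opt_analytic = 2 / alpha                              # both come out near 0.04 cm
```

Setting the derivative of exp(αd/2)/d to zero gives (α/2)d = 1, i.e. d_opt = 2/α, consistent with the intuition in the abstract that the optimum sits strictly between "too thick" and "too thin".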

    Chiral Corrections to Lattice Calculations of Charge Radii

    Logarithmic divergences in pion and proton charge radii associated with chiral loops are investigated to assess systematic uncertainties in current lattice determinations of charge radii. The chiral corrections offer a possible solution to the long-standing problem of why present lattice calculations yield proton and pion radii which are similar in size. Comment: PostScript file only. Ten pages. Figures included. U. of MD Preprint #92-19