
    Overview of the Field Phase of the NASA Tropical Cloud Systems and Processes (TCSP) Experiment

    The Tropical Cloud Systems and Processes (TCSP) experiment is sponsored by the National Aeronautics and Space Administration (NASA) to investigate characteristics of tropical cyclone genesis, rapid intensification, and rainfall using a three-pronged approach that emphasizes satellite information, suborbital observations, and numerical model simulations. Research goals include demonstration and assessment of new technology, improvements to numerical model parameterizations, and advancements in data assimilation techniques. The field phase of the experiment was based in Costa Rica during July 2005. A fully instrumented NASA ER-2 high-altitude airplane was deployed with Doppler radar, passive microwave instrumentation, lightning and electric field sensors, and an airborne simulator of visible and infrared satellite sensors. Other assets brought to TCSP were a low-flying uninhabited aerial vehicle and a surface-based radiosonde network. In partnership with the Intensity Forecasting Experiment of the National Oceanic and Atmospheric Administration (NOAA) Hurricane Research Division, two NOAA P-3 aircraft instrumented with radar, passive microwave, microphysical, and dropsonde instrumentation were also deployed to Costa Rica. The field phase of TCSP was conducted in Costa Rica to take advantage of the geographically compact tropical cyclone genesis region of the Eastern Pacific Ocean near Central America. However, the unusual 2005 hurricane season provided numerous opportunities to sample tropical cyclone development and intensification in the Caribbean Sea and Gulf of Mexico as well. The development of Hurricane Dennis and of Tropical Storm Gert was investigated over several days each, as was Hurricane Emily as it approached Saffir-Simpson Category 5 intensity. An overview of the characteristics of these storms, along with the pregenesis environment of Tropical Storm Eugene in the Eastern Pacific, is presented.

    Analysis of the first gigantic jet recorded over continental North America

    Two low-light cameras near Marfa, Texas, recorded a gigantic jet over northern Mexico on 13 May 2005 at approximately 0423:50 UTC. Assuming that the farther of two candidate storm systems was its source, the bright lower channel ended in a fork at around 50–59 km height, with very dim upper branches extending to 69–80 km altitude. During the time window containing the jet, extremely low frequency magnetic field recordings show that there was no fast charge moment change larger than 50 coulomb times kilometers (C km), but there was a larger and slower charge moment change of 520 C km over 70 ms. The likely parent thunderstorm was a high-precipitation supercell cluster containing a persistent mesocyclone, with radar echo tops of at least 17 km. However, photogrammetric analysis suggests that the gigantic jet occurred over the forward flank downdraft region, with echo tops of 14 km. This part of the supercell may have had an inverted-polarity charge configuration, as evidenced by positive cloud-to-ground lightning flashes (+CG) dominating over negative flashes (-CG), while -CGs occurred under the downwind anvil. Four minutes before the gigantic jet, -CG activity practically ceased in this area, while +CG rates increased, culminating during the 20 s leading up to the gigantic jet with four National Lightning Detection Network–detected +CGs. A relative lull in lightning activity of both polarities was observed for up to 1.5 min after the gigantic jet. The maturing storm subsequently produced 30 sprites between 0454 and 0820 UTC, some associated with extremely large impulse charge moment change values.

    Beyond Hebb: Exclusive-OR and Biological Learning

    A learning algorithm for multilayer neural networks based on biologically plausible mechanisms is studied. Motivated by findings in experimental neurobiology, we consider synaptic averaging in the induction of plasticity changes, which happen on a slower time scale than firing dynamics. This mechanism is shown to enable learning of the exclusive-OR (XOR) problem without the aid of error back-propagation, as well as to increase robustness of learning in the presence of noise.
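    Why XOR needs more than a single neuron can be seen with a hand-wired two-layer threshold network. The sketch below is illustrative only: it is not the biologically plausible learning rule studied in the paper, and the weights and thresholds are chosen by hand, not learned.

```python
# A hand-wired two-layer threshold network that computes XOR, illustrating
# why a hidden layer is needed (XOR is not linearly separable).
# Weights/thresholds are hand-picked for illustration, NOT learned.

def step(x):
    """Threshold (Heaviside) activation."""
    return 1 if x > 0 else 0

def xor_net(a, b):
    # Hidden unit 1 fires for "a OR b"; hidden unit 2 fires for "a AND b".
    h1 = step(a + b - 0.5)
    h2 = step(a + b - 1.5)
    # Output fires for "OR but not AND", i.e. exactly one input active.
    return step(h1 - h2 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_net(a, b))
```

    No single threshold unit can realize this truth table, which is why the paper's single-mechanism (Hebbian) baseline fails on XOR and a multilayer mechanism is required.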

    Proteome-based plasma biomarkers for Alzheimer's disease

    Alzheimer's disease is a common and devastating disease for which there is no readily available biomarker to aid diagnosis or to monitor disease progression. Biomarkers have been sought in CSF, but no previous study has used two-dimensional gel electrophoresis coupled with mass spectrometry to seek biomarkers in peripheral tissue. We performed a case-control study of plasma using this proteomics approach to identify proteins that differ in the disease state relative to aged controls. For discovery-phase proteomics analysis, 50 people with Alzheimer's dementia were recruited through secondary services and 50 normal elderly controls through primary care. For validation purposes, a total of 511 subjects with Alzheimer's disease and other neurodegenerative diseases and normal elderly controls were examined. Image analysis of the protein distribution of the gels alone identifies disease cases with 56% sensitivity and 80% specificity. Mass spectrometric analysis of the changes observed in two-dimensional electrophoresis identified a number of proteins previously implicated in the disease pathology, including complement factor H (CFH) precursor and α-2-macroglobulin (α-2M). Using semi-quantitative immunoblotting, the elevation of CFH and α-2M was shown to be specific for Alzheimer's disease and to correlate with disease severity, although alternative assays would be necessary to improve sensitivity and specificity. These findings suggest that blood may be a rich source for biomarkers of Alzheimer's disease and that CFH, together with other proteins such as α-2M, may be a specific marker of this illness. © 2006 The Author(s).
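    For readers unfamiliar with the diagnostic terms, sensitivity and specificity are simple ratios over the confusion matrix. A minimal sketch follows; the counts used are hypothetical values chosen only to reproduce the reported 56%/80% percentages with the study's 50-case/50-control discovery cohort, not actual confusion-matrix values from the paper.

```python
# Minimal sketch of diagnostic sensitivity and specificity.
# The counts below are HYPOTHETICAL, chosen to match the reported
# percentages given 50 cases and 50 controls; they are not the
# study's actual confusion-matrix values.

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# 28 of 50 cases detected, 40 of 50 controls correctly ruled out:
sens, spec = sensitivity_specificity(tp=28, fn=22, tn=40, fp=10)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```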

    The electromagnetic Christodoulou memory effect and its application to neutron star binary mergers

    Gravitational waves are predicted by the general theory of relativity. It has been shown that gravitational waves have a nonlinear memory, displacing test masses permanently. This is called the Christodoulou memory. We proved that the electromagnetic field contributes at highest order to the nonlinear memory effect of gravitational waves, enlarging the permanent displacement of test masses. In experiments like LISA or LIGO, which measure distances of test masses, the Christodoulou memory will manifest itself as a permanent displacement of these objects. It has also been suggested that the Christodoulou memory effect could be detected with radio telescopes, by searching for small changes in pulsars' pulse arrival times; the latter experiments are based on present-day technology and measure changes in frequency. In the present paper, we study the electromagnetic Christodoulou memory effect and compute it for binary neutron star mergers. These are typical sources of gravitational radiation. During these processes, not only mass and momenta are radiated away in the form of gravitational waves, but also very strong magnetic fields are produced and radiated away. Moreover, a large portion of the energy is carried away by neutrinos. We give constraints on the conditions under which the energy transported by electromagnetic radiation is of similar or slightly higher order than the energy radiated in gravitational waves or in the form of neutrinos. We find that for coalescing neutron stars, large magnetic fields magnify the Christodoulou memory as long as the gaseous environment is sufficiently rarefied. Thus the observed effect on the test masses of a laser interferometer gravitational wave detector will be enlarged by the contribution of the electromagnetic field, making the present results important for the planned experiments. Looking at the null asymptotics of spacetimes that are solutions of the Einstein–Maxwell equations, we derive the electromagnetic Christodoulou memory effect. We obtain an exact solution of the full nonlinear problem; no approximations were used. Moreover, our results allow us to answer astrophysical questions: knowing the amount of energy radiated away in a neutron star binary merger yields information about the source of the gravitational waves.

    Attentive Learning of Sequential Handwriting Movements: A Neural Network Model

    Defense Advanced Research Projects Agency and the Office of Naval Research (N00014-95-1-0409, N00014-92-J-1309); National Science Foundation (IRI-97-20333); National Institutes of Health (I-R29-DC02952-01)

    Toward shared decision-making in degenerative cervical myelopathy: Protocol for a mixed methods study

    Health care decisions are a critical determinant in the evolution of chronic illness. In shared decision-making (SDM), patients and clinicians work collaboratively to reach evidence-based health decisions that align with individual circumstances, values, and preferences. This personalized approach to clinical care likely has substantial benefits in the oversight of degenerative cervical myelopathy (DCM), a type of nontraumatic spinal cord injury. Its chronicity, heterogeneous clinical presentation, complex management, and variable disease course engenders an imperative for a patient-centric approach that accounts for each patient's unique needs and priorities. Inadequate patient knowledge about the condition and an incomplete understanding of the critical decision points that arise during the course of care currently hinder the fruitful participation of health care providers and patients in SDM. This study protocol presents the rationale for deploying SDM for DCM and delineates the groundwork required to achieve this. The study's primary outcome is the development of a comprehensive checklist to be implemented upon diagnosis that provides patients with essential information necessary to support their informed decision-making. This is known as a core information set (CIS). The secondary outcome is the creation of a detailed process map that provides a diagrammatic representation of the global care workflows and cognitive processes involved in DCM care. Characterizing the critical decision points along a patient's journey will allow for an effective exploration of SDM tools for routine clinical practice to enhance patient-centered care and improve clinical outcomes. Both CISs and process maps are coproduced iteratively through a collaborative process involving the input and consensus of key stakeholders. 
    This will be facilitated by Myelopathy.org, a global DCM charity, through its Research Objectives and Common Data Elements for Degenerative Cervical Myelopathy community. To develop the CIS, a 3-round, web-based Delphi process will be used, starting with a baseline list of information items derived from a recent scoping review of educational materials in DCM, patient interviews, and a qualitative survey of professionals. A priori criteria for achieving consensus are specified. The process map will be developed iteratively using semistructured interviews with patients and professionals and validated by key stakeholders. Recruitment for the Delphi consensus study began in April 2023. Pilot testing of the process map interviews started simultaneously, with the formulation of an initial baseline map underway. This protocol marks the first attempt to provide a starting point for investigating SDM in DCM. The primary work centers on developing an educational tool for use at diagnosis to enable enhanced onward decision-making. The wider objective is to aid stakeholders in developing SDM tools by identifying critical decision junctures in DCM care. Through these approaches, we aim to provide an exhaustive launchpad for formulating SDM tools in the wider DCM community. DERR1-10.2196/46809. [Abstract copyright: ©Irina Sangeorzan, Grazia Antonacci, Anne Martin, Ben Grodzinski, Carl M Zipser, Rory K J Murphy, Panoraia Andriopoulou, Chad E Cook, David B Anderson, James Guest, Julio C Furlan, Mark R N Kotter, Timothy F Boerger, Iwan Sadler, Elizabeth A Roberts, Helen Wood, Christine Fraser, Michael G Fehlings, Vishal Kumar, Josephine Jung, James Milligan, Aria Nouri, Allan R Martin, Tammy Blizzard, Luiz Roberto Vialle, Lindsay Tetreault, Sukhvinder Kalsi-Ryan, Anna MacDowall, Esther Martin-Moore, Martin Burwood, Lianne Wood, Abdul Lalkhen, Manabu Ito, Nicky Wilson, Caroline Treanor, Sheila Dugan, Benjamin M Davies.
Originally published in JMIR Research Protocols (https://www.researchprotocols.org), 09.10.2023.

    The Influence of Markov Decision Process Structure on the Possible Strategic Use of Working Memory and Episodic Memory

    Researchers use a variety of behavioral tasks to analyze the effect of biological manipulations on memory function. This research will benefit from a systematic mathematical method for analyzing memory demands in behavioral tasks. In the framework of reinforcement learning theory, these tasks can be mathematically described as partially observable Markov decision processes. While a wealth of evidence collected over the past 15 years relates the basal ganglia to the reinforcement learning framework, only recently has much attention been paid to including psychological concepts such as working memory or episodic memory in these models. This paper presents an analysis that provides a quantitative description of memory states sufficient for correct choices at specific decision points. Using information from the mathematical structure of the task descriptions, we derive measures that indicate whether working memory (for one or more cues) or episodic memory can provide strategically useful information to an agent. In particular, the analysis determines which observed states must be maintained in or retrieved from memory to perform these specific tasks. We demonstrate the analysis on three simplified tasks as well as eight more complex memory tasks drawn from the animal and human literature (two alternation tasks, two sequence disambiguation tasks, two non-matching tasks, the 2-back task, and the 1-2-AX task). The results of these analyses agree with results from quantitative simulations of the tasks reported in previous publications and provide simple indications of the memory demands of the tasks, which can require far less computation than a full simulation of the task. This may provide a basis for a quantitative behavioral stoichiometry of memory tasks.
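    The core idea, that a task demands memory whenever the current observation alone does not determine the correct action, can be sketched for a spatial alternation task. The encoding below is a simplified assumption for illustration, not the paper's exact formalism or measures.

```python
# Toy version of the memory-demand analysis: for a spatial alternation
# task, check whether the current observation alone determines the
# correct action, or whether a remembered previous choice is needed.
# Task encoding here is a simplified ASSUMPTION, not the paper's formalism.

# Trials as (observation, previous_choice, correct_action):
# at the maze stem ("S") the correct arm is opposite the previous choice.
trials = [
    ("S", "L", "R"),
    ("S", "R", "L"),
]

def needs_memory(trials):
    """True if one observation maps to conflicting correct actions,
    i.e. the task is not solvable from the current observation alone."""
    seen = {}
    for obs, prev, action in trials:
        if obs in seen and seen[obs] != action:
            return True
        seen[obs] = action
    return False

print(needs_memory(trials))
```

    Here the stem observation "S" requires "L" on some trials and "R" on others, so a purely observation-driven (memoryless) policy cannot solve the task; maintaining the previous choice in working memory resolves the ambiguity.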

    Theorems on existence and global dynamics for the Einstein equations

    This article is a guide to theorems on existence and global dynamics of solutions of the Einstein equations. It draws attention to open questions in the field. The local-in-time Cauchy problem, which is relatively well understood, is surveyed. Global results for solutions with various types of symmetry are discussed. A selection of results from Newtonian theory and special relativity that offer useful comparisons is presented. Treatments of global results in the case of small data and results on constructing spacetimes with prescribed singularity structure or late-time asymptotics are given. A conjectural picture of the asymptotic behaviour of general cosmological solutions of the Einstein equations is built up. Some miscellaneous topics connected with the main theme are collected in a separate section. (Submitted to Living Reviews in Relativity; a major update of Living Rev. Rel. 5 (2002).)