
    Gravitational Collapse and Fragmentation in Molecular Clouds with Adaptive Mesh Refinement

    We describe a powerful methodology for numerical solution of 3-D self-gravitational hydrodynamics problems with extremely high resolution. Our method utilizes the technique of local adaptive mesh refinement (AMR), employing multiple grids at multiple levels of resolution. These grids are automatically and dynamically added and removed as necessary to maintain adequate resolution. This technology allows for the solution of problems in a manner that is both more efficient and more versatile than other fixed and variable resolution methods. The application of AMR to simulate the collapse and fragmentation of a molecular cloud, a key step in star formation, is discussed. Such simulations involve many orders of magnitude of variation in length scale as fragments form. In this paper we briefly describe the methodology and present an illustrative application for nonisothermal cloud collapse. We describe the numerical Jeans condition, a criterion for stability of self-gravitational hydrodynamics problems. We show the first well-resolved nonisothermal evolutionary sequence beginning with a perturbed dense molecular cloud core that leads to the formation of a binary system consisting of protostellar cores surrounded by distinct protostellar disks. The scale of the disks, of order 100 AU, is consistent with observations of gaseous disks surrounding single T-Tauri stars and debris disks surrounding systems such as β Pictoris. Comment: 10 pages, 6 figures (color postscript). To appear in the proceedings of Numerical Astrophysics 1998, Tokyo, March 10-13, 1998.
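The numerical Jeans condition mentioned in the abstract bounds the cell width by the local Jeans length so that gravitational fragmentation is not an artifact of the grid. A minimal sketch of such a refinement check, assuming the commonly used "resolve the Jeans length by at least four cells" criterion and illustrative dense-core values (none of these numbers are taken from the paper):

```python
import math

G = 6.674e-8  # gravitational constant in cgs units [cm^3 g^-1 s^-2]

def jeans_length(c_s, rho):
    """Jeans length lambda_J = sqrt(pi * c_s^2 / (G * rho)), in cm."""
    return math.sqrt(math.pi * c_s**2 / (G * rho))

def violates_jeans_condition(dx, c_s, rho, n_cells=4):
    """Numerical Jeans condition: a cell of width dx must resolve the
    local Jeans length by at least n_cells cells (dx <= lambda_J / n_cells).
    Returns True when the cell should be refined further."""
    return dx > jeans_length(c_s, rho) / n_cells

# Illustrative dense-core values (assumptions, not from the paper):
c_s = 2.0e4    # isothermal sound speed ~0.2 km/s, in cm/s
rho = 1.0e-18  # mass density in g/cm^3
lam = jeans_length(c_s, rho)   # ~1.4e17 cm, roughly 0.04 pc
coarse_cell = lam              # badly under-resolved: refine
fine_cell = lam / 10.0         # adequately resolved: leave as-is
```

In an AMR driver this test would be evaluated per cell each timestep, triggering grid insertion wherever the local density has grown enough to shrink the Jeans length below the resolution threshold.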

    The effect of time constraint on anticipation, decision making, and option generation in complex and dynamic environments

    Researchers interested in performance in complex and dynamic situations have focused on how individuals predict their opponent(s) potential courses of action (i.e., during assessment) and generate potential options about how to respond (i.e., during intervention). When generating predictive options, previous research supports the use of cognitive mechanisms that are consistent with long-term working memory (LTWM) theory (Ericsson and Kintsch in Psychol Rev 102(2):211–245, 1995; Ward et al. in J Cogn Eng Decis Mak 7:231–254, 2013). However, when generating options about how to respond, the extant research supports the use of the take-the-first (TTF) heuristic (Johnson and Raab in Organ Behav Hum Decis Process 91:215–229, 2003). While these models provide possible explanations about how options are generated in situ, often under time pressure, few researchers have tested the claims of these models experimentally by explicitly manipulating time pressure. The current research investigates the effect of time constraint on option-generation behavior during the assessment and intervention phases of decision making by employing a modified version of an established option-generation task in soccer. The results provide additional support for the use of LTWM mechanisms during assessment across both time conditions. During the intervention phase, option-generation behavior appeared consistent with TTF, but only in the non-time-constrained condition. Counter to our expectations, the implementation of time constraint resulted in a shift toward the use of LTWM-type mechanisms during the intervention phase. Modifications to the cognitive-process level descriptions of decision making during intervention are proposed, and implications for training during both phases of decision making are discussed.

    The contribution of the anaesthetist to risk-adjusted mortality after cardiac surgery

    It is widely accepted that the performance of the operating surgeon affects outcomes, and this has led to the publication of surgical results in the public domain. However, the effect of other members of the multidisciplinary team is unknown. We studied the effect of the anaesthetist on mortality after cardiac surgery by analysing data collected prospectively over ten years of consecutive cardiac surgical cases from ten UK centres. Casemix-adjusted outcomes were analysed in models that included random effects for centre, surgeon and anaesthetist. All cardiac surgical operations for which the EuroSCORE model is appropriate were included, and the primary outcome was in-hospital death up to three months postoperatively. A total of 110 769 cardiac surgical procedures conducted between April 2002 and March 2012 were studied, which included 127 consultant surgeons and 190 consultant anaesthetists. The overwhelming factor associated with outcome was patient risk, accounting for 95.75% of the variation in in-hospital mortality. The impact of the surgeon was moderate (intra-class correlation coefficient 4.00% for mortality), and the impact of the anaesthetist was negligible (0.25%). There was no significant effect of anaesthetist volume above ten cases per year. We conclude that mortality after cardiac surgery is primarily determined by the patient, with small but significant differences between surgeons. Anaesthetists did not appear to affect mortality. These findings do not support public disclosure of cardiac anaesthetists' results, but substantially validate current UK cardiac anaesthetic training and practice. Further research is required to establish the potential effects of very low anaesthetic caseloads and the effect of cardiac anaesthetists on patient morbidity.
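The intra-class correlation coefficients quoted in the abstract are shares of total outcome variance attributable to each random-effect component of the model. A minimal sketch of that final computation, assuming variance components have already been estimated (the numbers below are hypothetical values chosen only to reproduce the quoted percentages, not the study's fitted components):

```python
def icc(component, variance_components):
    """Intra-class correlation coefficient: the share of total variance
    attributable to a single random-effect component."""
    return variance_components[component] / sum(variance_components.values())

# Hypothetical variance components (arbitrary scale; only ratios matter):
components = {"patient": 95.75, "surgeon": 4.00, "anaesthetist": 0.25}

patient_share = icc("patient", components)            # 0.9575
surgeon_share = icc("surgeon", components)            # 0.0400
anaesthetist_share = icc("anaesthetist", components)  # 0.0025
```

The substantive conclusion follows directly from these ratios: once patient risk is accounted for, the anaesthetist component is two orders of magnitude smaller than the surgeon component.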

    Cancer cells resist antibody-mediated destruction by neutrophils through activation of the exocyst complex

    Cytotoxicity; Immunity; Immunotherapy. Background: Neutrophils kill antibody-opsonized tumor cells using trogocytosis, a unique mechanism of destruction of the target cell's plasma membrane. This previously unknown cytotoxic process of neutrophils is dependent on antibody opsonization, Fcγ receptors and CD11b/CD18 integrins. Here, we demonstrate that tumor cells can escape neutrophil-mediated cytotoxicity by calcium (Ca2+)-dependent and exocyst complex-dependent plasma membrane repair. Methods: We knocked down EXOC7 or EXOC4, two exocyst components, to evaluate their involvement in tumor cell membrane repair after neutrophil-induced trogocytosis. We used live-cell microscopy and flow cytometry for visualization of the host and tumor cell interaction and tumor cell membrane repair. Last, we report the mRNA levels of exocyst components in breast cancer tumors in correlation to the response in trastuzumab-treated patients. Results: We found that tumor cells can evade neutrophil antibody-dependent cellular cytotoxicity (ADCC) by Ca2+-dependent cell membrane repair, a process induced upon neutrophil trogocytosis. Absence of the exocyst components EXOC7 or EXOC4 rendered tumor cells vulnerable to neutrophil-mediated ADCC (but not natural killer cell-mediated killing), while neutrophil trogocytosis remained unaltered. Finally, mRNA levels of exocyst components in trastuzumab-treated patients were inversely correlated with complete response to therapy. Conclusions: Our results support that neutrophil attack towards antibody-opsonized cancer cells by trogocytosis induces an active repair process by the exocyst complex in vitro. Our findings provide insight into the possible contribution of neutrophils to current antibody therapies and the tolerance mechanism of tumor cells, and support further studies on the potential use of exocyst components as clinical biomarkers. This work was supported by the Dutch Cancer Society (grant numbers 10300 and 11537, awarded to TKvdB and HLM, respectively).

    Do logarithmic proximity measures outperform plain ones in graph clustering?

    We consider a number of graph kernels and proximity measures including the commute time kernel, regularized Laplacian kernel, heat kernel, exponential diffusion kernel (also called "communicability"), etc., and the corresponding distances as applied to clustering nodes in random graphs and several well-known datasets. The model of generating random graphs involves edge probabilities for the pairs of nodes that belong to the same class or different predefined classes of nodes. It turns out that in most cases, logarithmic measures (i.e., measures resulting after taking the logarithm of the proximities) perform better at distinguishing the underlying classes than the "plain" measures. A comparison in terms of reject curves of inter-class and intra-class distances confirms this conclusion. A similar conclusion can be made for several well-known datasets. A possible origin of this effect is that most kernels have a multiplicative nature, while the nature of distances used in clustering algorithms is an additive one (cf. the triangle inequality). The logarithmic transformation is a tool to transform the first nature to the second one. Moreover, some distances corresponding to the logarithmic measures possess a meaningful cutpoint additivity property. In our experiments, the leader is usually the logarithmic Communicability measure. However, we indicate some more complicated cases in which other measures, typically Communicability and plain Walk, can be the winners. Comment: 11 pages, 5 tables, 9 figures. Accepted for publication in the Proceedings of the 6th International Conference on Network Analysis, May 26-28, 2016, Nizhny Novgorod, Russia.
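The plain-versus-logarithmic comparison can be made concrete on a toy graph: build the heat kernel, take its elementwise logarithm, and convert both proximity matrices to distances via the standard kernel-induced formula. The two-triangle graph and the diffusion time t = 1 below are illustrative assumptions, not the paper's experimental setup:

```python
import numpy as np

def heat_kernel(A, t=1.0):
    """Heat kernel K = exp(-t * L) with L = D - A, computed via
    eigendecomposition (L is symmetric for an undirected graph)."""
    L = np.diag(A.sum(axis=1)) - A
    w, V = np.linalg.eigh(L)
    return V @ np.diag(np.exp(-t * w)) @ V.T

def log_measure(K, eps=1e-12):
    """Elementwise logarithm of a proximity matrix; for a connected graph
    and t > 0 all heat-kernel entries are strictly positive, eps is a
    numerical guard only."""
    return np.log(np.maximum(K, eps))

def kernel_to_distance(K):
    """Kernel-induced distance matrix d(i, j) = K_ii + K_jj - 2 K_ij."""
    d = np.diag(K)
    return d[:, None] + d[None, :] - 2.0 * K

# Toy graph: two triangles (classes {0,1,2} and {3,4,5}) joined by one edge
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0

K = heat_kernel(A)
D_plain = kernel_to_distance(K)             # distances from plain proximities
D_log = kernel_to_distance(log_measure(K))  # distances from log proximities
```

Both distance matrices separate the two triangles (intra-class distances smaller than inter-class ones); the paper's point is that on harder class structures the logarithmic variant typically separates them better.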

    Heroes and villains of world history across cultures

    © 2015 Hanke et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Emergent properties of global political culture were examined using data from the World History Survey (WHS) involving 6,902 university students in 37 countries evaluating 40 figures from world history. Multidimensional scaling and factor analysis techniques found only limited forms of universality in evaluations across Western, Catholic/Orthodox, Muslim, and Asian country clusters. The highest consensus across cultures involved scientific innovators, with Einstein having the most positive evaluation overall. Peaceful humanitarians like Mother Teresa and Gandhi followed. There was much less cross-cultural consistency in the evaluation of negative figures, led by Hitler, Osama bin Laden, and Saddam Hussein. After more traditional empirical methods (e.g., factor analysis) failed to identify meaningful cross-cultural patterns, Latent Profile Analysis (LPA) was used to identify four global representational profiles: Secular and Religious Idealists were overwhelmingly prevalent in Christian countries, and Political Realists were common in Muslim and Asian countries. We discuss possible consequences and interpretations of these different representational profiles. This research was supported by grant RG016-P-10 from the Chiang Ching-Kuo Foundation for International Scholarly Exchange (http://www.cckf.org.tw/).

    Decision, Sensation, and Habituation: A Multi-Layer Dynamic Field Model for Inhibition of Return

    Inhibition of Return (IOR) is one of the most consistent and widely studied effects in experimental psychology. The effect refers to a delayed response to visual stimuli in a cued location after initial priming at that location. This article presents a dynamic field model for IOR. The model describes the evolution of three coupled activation fields. The decision field, inspired by the intermediate layer of the superior colliculus, receives endogenous input and input from a sensory field. The sensory field, inspired by earlier sensory processing, receives exogenous input. Habituation of the sensory field is implemented by a reciprocal coupling with a third field, the habituation field. The model generates IOR because, due to the habituation of the sensory field, the decision field receives a reduced target-induced input in cue-target-compatible situations. The model is consistent with single-unit recordings of neurons of monkeys that perform IOR tasks. Such recordings have revealed that IOR phenomena parallel the activity of neurons in the intermediate layer of the superior colliculus and that neurons in this layer receive reduced input in cue-target-compatible situations. The model is also consistent with behavioral data concerning temporal expectancy effects. In the discussion, the multi-layer dynamic field account of IOR is used to illustrate the broader view that behavior consists of a tuning of the organism to the environment that continuously and concurrently takes place at different spatiotemporal scales.
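The habituation mechanism can be illustrated with a zero-dimensional caricature of the three coupled fields: a fast sensory unit driven by exogenous input, a slow habituation unit fed by the sensory unit, and a decision unit reading out the (habituated) sensory signal. All time constants, coupling strengths, and the rectified nonlinearity below are illustrative assumptions; the actual model uses spatially extended fields, not scalar units:

```python
TAU_SEN, TAU_HAB, TAU_DEC = 1.0, 10.0, 1.0  # sensory fast, habituation slow
C_HAB, C_SEN, C_IN = 4.0, 4.0, 1.0          # coupling strengths (assumed)

def g(u):
    """Rectified output nonlinearity (a stand-in for the model's sigmoid)."""
    return max(u, 0.0)

def step(u_dec, u_sen, u_hab, exo, endo, dt=0.01):
    """One Euler step of the three coupled units."""
    du_sen = (-u_sen + exo - C_HAB * g(u_hab)) / TAU_SEN   # suppressed by habituation
    du_hab = (-u_hab + C_SEN * g(u_sen)) / TAU_HAB         # driven by sensory output
    du_dec = (-u_dec + endo + C_IN * g(u_sen)) / TAU_DEC   # reads out sensory signal
    return (u_dec + dt * du_dec, u_sen + dt * du_sen, u_hab + dt * du_hab)

# Sustained exogenous input at the cued location: the sensory response
# peaks early, then the slowly building habituation suppresses it, so the
# decision unit receives reduced drive -- the IOR-like effect.
u_dec = u_sen = u_hab = 0.0
sensory_trace = []
for _ in range(3000):  # 30 time units at dt = 0.01
    u_dec, u_sen, u_hab = step(u_dec, u_sen, u_hab, exo=2.0, endo=0.0)
    sensory_trace.append(u_sen)
```

The trace shows the signature the abstract describes: an initial sensory transient followed by a much lower habituated plateau, which is why a target at the cued location drives the decision field more weakly.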

    Evidence for Anthropogenic Surface Loading as Trigger Mechanism of the 2008 Wenchuan Earthquake

    Two and a half years prior to China's M7.9 Wenchuan earthquake of May 2008, at least 300 million metric tons of water accumulated, with additional seasonal water level changes, in the Minjiang River Valley at the eastern margin of the Longmen Shan. This article shows that static surface loading in the Zipingpu water reservoir induced Coulomb failure stresses on the nearby Beichuan thrust fault system at <17 km depth. Triggering stresses exceeded levels of daily lunar and solar tides and perturbed a fault area measuring 416 ± 96 km². These stress perturbations, in turn, likely advanced the clock of the mainshock and directed the initial rupture propagation upward towards the reservoir on the "Coulomb-like" Beichuan fault with rate-and-state dependent frictional behavior. Static triggering perturbations produced up to 60 years (0.6%) of equivalent tectonic loading, and show strong correlations to the coseismic slip. Moreover, correlations between clock advancement and coseismic slip, observed during the mainshock beneath the reservoir, are strongest for a longer seismic cycle (10 kyr) of M>7 earthquakes. Finally, the daily event rate of the micro-seismicity (M>0.5) correlates well with the static stress perturbations, indicating destabilization. Comment: 22 pages, 4 figures, 3 tables.
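The Coulomb failure stress calculation behind this argument combines the shear and normal stress changes resolved onto the receiver fault. A minimal sketch of the standard formula (the effective friction coefficient, load depth, and resolved stress values below are illustrative assumptions, not the paper's results):

```python
RHO_WATER = 1000.0  # density of water, kg/m^3
G_ACC = 9.81        # gravitational acceleration, m/s^2

def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """Coulomb failure stress change on a receiver fault:
    dCFS = d_tau + mu' * d_sigma_n, with shear change d_tau positive in
    the slip direction and normal change d_sigma_n positive for
    unclamping; mu' is the effective (pore-pressure-adjusted) friction."""
    return d_shear + mu_eff * d_normal

def water_column_load(depth_m):
    """Vertical surface load (Pa) beneath a water column of given depth."""
    return RHO_WATER * G_ACC * depth_m

# Illustrative numbers: a 100 m water column applies ~0.98 MPa at the
# surface; the stresses resolved onto a fault at depth are much smaller,
# here assumed to be 0.01 MPa shear and 0.005 MPa unclamping.
load_pa = water_column_load(100.0)                     # 9.81e5 Pa
dcfs_mpa = coulomb_stress_change(0.01, 0.005)          # 0.012 MPa
```

A positive dCFS moves the fault toward failure, which is the sense in which the reservoir load is argued to have advanced the clock of the mainshock.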

    Chronic depression: development and evaluation of the luebeck questionnaire for recording preoperational thinking (LQPT)

    Background: A standardized instrument for recording the specific cognitive psychopathology of chronically depressed patients has not yet been developed. Up until now, preoperational thinking of chronically depressed patients has only been described in case studies, or through the external observations of therapists. The aim of this study was to develop and evaluate a standardized self-assessment instrument for measuring preoperational thinking that sufficiently conforms to the quality criteria of test theory. Methods: The "Luebeck Questionnaire for Recording Preoperational Thinking (LQPT)" was developed and evaluated using a German sample consisting of 30 episodically depressed, 30 chronically depressed and 30 healthy volunteers. As an initial step the questionnaire was subjected to an item analysis and a final test form was compiled. In a second step, reliability and validity tests were performed. Results: Overall, the results of this study showed that the LQPT is a useful, reliable and valid instrument. The reliability (split-half reliability 0.885; internal consistency 0.901) and the correlations with other instruments for measuring related constructs (control beliefs, interpersonal problems, stress management) proved to be satisfactory. Chronically depressed patients, episodically depressed patients and healthy volunteers could be distinguished from one another in a statistically significant manner (p < 0.001). Conclusion: The questionnaire fulfilled the classical test quality criteria. With the LQPT there is an opportunity to test the theory underlying the CBASP model.
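The two reliability figures quoted above (split-half reliability and internal consistency) are standard psychometric statistics. A minimal sketch of how both are computed from a respondents-by-items score matrix, using synthetic data with a single latent trait (the sample size, item count, and noise level are illustrative assumptions, not the study's data):

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach's alpha (internal consistency) for a respondents-by-items
    score matrix X."""
    k = X.shape[1]
    item_var_sum = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var_sum / total_var)

def split_half_reliability(X):
    """Odd-even split-half reliability with the Spearman-Brown step-up:
    correlate the two half scores, then correct for halved test length."""
    odd = X[:, 0::2].sum(axis=1)
    even = X[:, 1::2].sum(axis=1)
    r = np.corrcoef(odd, even)[0, 1]
    return 2.0 * r / (1.0 + r)

# Synthetic questionnaire: 200 respondents, 8 items sharing one latent trait
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
X = latent + 0.5 * rng.normal(size=(200, 8))  # item score = trait + noise

alpha = cronbach_alpha(X)       # high, since all items share the trait
sh = split_half_reliability(X)  # similar magnitude to alpha
```

Both statistics approach 1 as the items measure the same construct more cleanly, which is how values such as 0.885 and 0.901 support the questionnaire's reliability.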