
    Managing trade-offs in landscape restoration and revegetation projects

    Landscape restoration projects often have multiple and disparate conservation, resource-enhancement and sometimes economic objectives, since projects that seek to meet more than one objective tend to be viewed more positively by funding agencies and the community. The degree to which there are trade-offs among desired objectives is an important variable for decision-makers, yet it is rarely considered explicitly. In particular, the existence of ecological thresholds has important implications for decision-making at both the project and the regional level. We develop a model of the possibilities and choices open to an agency seeking to achieve two environmental objectives in a region through revegetation of a number of sites. A graphical model of the production possibility sets for a single revegetation project is developed, and different trade-off relationships are discussed and illustrated. The model is then used to demonstrate the possibilities for managing all such projects within a region. We show that where there are thresholds in the trade-off relationship between two objectives, specialization (single- or dominant-objective projects) should be considered. This is illustrated using a case study in which revegetation is used to meet avian biodiversity and salinity mitigation objectives. We conclude that where sufficient scientific data exist, explicit consideration of different types of trade-offs can assist in making decisions about the most efficient mix and type of projects to better achieve a range of objectives within a region.
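    The specialization argument can be sketched numerically. Below is a toy model (not the authors'; the frontier shapes and the two-site portfolio are illustrative assumptions) contrasting a smooth concave trade-off frontier with a threshold-like convex one for two revegetation sites:

```python
import numpy as np

# Hypothetical production-possibility frontiers for one site: biodiversity
# benefit b(s) achievable when salinity-mitigation effort is s in [0, 1].
def concave(s):
    return np.sqrt(1.0 - s**2)       # smooth trade-off: mixing objectives is cheap

def convex(s):
    return (1.0 - np.sqrt(s))**2     # threshold-like trade-off: mixing is costly

def portfolio(frontier, s1, s2):
    """Regional totals (salinity, biodiversity) from two sites at efforts s1, s2."""
    return s1 + s2, frontier(s1) + frontier(s2)

# With a concave frontier, splitting effort beats specialization...
mixed = portfolio(concave, 0.5, 0.5)          # (1.0, ~1.73)
specialized = portfolio(concave, 1.0, 0.0)    # (1.0, 1.0)

# ...but with a convex (thresholded) frontier the ranking reverses.
mixed_cx = portfolio(convex, 0.5, 0.5)        # (1.0, ~0.17)
specialized_cx = portfolio(convex, 1.0, 0.0)  # (1.0, 1.0)
```

    At equal total salinity benefit, the concave frontier rewards mixed-objective projects while the convex one rewards single-objective projects, which is the paper's case for considering specialization when thresholds exist.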

    Binocular functional architecture for detection of contrast-modulated gratings

    Combination of signals from the two eyes is the gateway to stereo vision. To gain insight into binocular signal processing, we studied binocular summation for luminance-modulated gratings (L or LM) and contrast-modulated gratings (CM). We measured 2AFC detection thresholds for a signal grating (0.75 c/deg, 216 ms) shown to one eye, both eyes, or both eyes out of phase. For LM and CM, the carrier noise was in both eyes, even when the signal was monocular. Mean binocular thresholds for luminance gratings (L) were 5.4 dB better than monocular thresholds - close to perfect linear summation (6 dB). For LM and CM the binocular advantage was again 5-6 dB, even when the carrier noise was uncorrelated, anti-correlated, or at orthogonal orientations in the two eyes. Binocular combination for CM probably arises from summation of envelope responses, and not from summation of these conflicting carrier patterns. Antiphase signals produced no binocular advantage, but thresholds were about 1-3 dB higher than monocular ones. This is not consistent with simple linear summation, which should give complete cancellation and unmeasurably high thresholds. We propose a three-channel model in which noisy monocular responses to the envelope are binocularly combined in a contrast-weighted sum, but also remain separately available to perception via a max operator. Vision selects the largest of the three responses. With in-phase gratings the binocular channel dominates, but antiphase gratings cancel in the binocular channel and the monocular channels mediate detection. The small antiphase disadvantage might be explained by a subtle influence of background responses on binocular and monocular detection.
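    The dB figures follow directly from threshold ratios; a minimal sketch of the two limiting predictions (generic numbers, not the study's data):

```python
import math

def db_advantage(monocular_threshold, binocular_threshold):
    """Binocular advantage in dB, computed from the threshold ratio."""
    return 20.0 * math.log10(monocular_threshold / binocular_threshold)

# Perfect linear summation: equal monocular responses add, halving the
# binocular threshold -> 20*log10(2), i.e. about 6.02 dB.
linear_prediction = db_advantage(1.0, 0.5)

# A pure max operator takes the larger monocular response, so the binocular
# threshold equals the monocular one -> 0 dB advantage.
max_prediction = db_advantage(1.0, 1.0)
```

    The observed ~5.4 dB in-phase advantage therefore sits close to the linear-summation limit, which is why the antiphase result (no advantage, no cancellation) forces the hybrid three-channel account.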

    Coincidence and coherent data analysis methods for gravitational wave bursts in a network of interferometric detectors

    Network data analysis methods are the only way to properly separate real gravitational wave (GW) transient events from detector noise. They can be divided into two generic classes: coincidence methods and coherent analyses. The former uses lists of selected events provided by each interferometer belonging to the network and tries to correlate them in time to identify a physical signal. Instead of this binary treatment of detector outputs (signal present or absent), the latter first merges the interferometer data and looks for a common pattern, consistent with an assumed GW waveform and a given source location in the sky; thresholds are only applied later, to validate or reject the hypothesis made. As coherent algorithms use more complete information than coincidence methods, they are expected to provide better detection performance, but at a higher computational cost. An efficient filter must yield a good compromise between a low false alarm rate (hence triggering on data at a manageable rate) and a high detection efficiency. Therefore, the comparison of the two approaches is carried out using so-called Receiver Operating Characteristic (ROC) curves, giving the relationship between the false alarm rate and the detection efficiency for a given method. This paper investigates this question via Monte Carlo simulations, using the network model developed in a previous article. Comment: spelling mistake corrected in one author's name.
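    How a ROC curve arises can be illustrated with a toy Monte Carlo (the Gaussian detection statistic and the SNR value are assumptions for illustration, not the paper's network model):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Toy detection statistic: standard normal under noise alone,
# shifted by an assumed signal-to-noise ratio of 2 when a signal is present.
noise_only = rng.normal(0.0, 1.0, n)
with_signal = rng.normal(2.0, 1.0, n)

# Sweeping the detection threshold traces out the ROC curve.
thresholds = np.linspace(-2.0, 6.0, 200)
false_alarm_rate = np.array([(noise_only > t).mean() for t in thresholds])
detection_eff = np.array([(with_signal > t).mean() for t in thresholds])
# A more sensitive statistic (e.g. a coherent one) shifts this curve toward
# higher efficiency at the same false alarm rate.
```

    Comparing two methods then amounts to asking which curve lies higher at the false alarm rate one can afford to run at.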

    NETASA: neural network based prediction of solvent accessibility

    Motivation: Prediction of the tertiary structure of a protein from its amino acid sequence is one of the most important problems in molecular biology. Successful prediction of solvent accessibility will be very helpful in achieving this goal. In the present work, we have implemented a server, NETASA, for predicting solvent accessibility of amino acids using our newly optimized neural network algorithm. Several new features in the neural network architecture and training method have been introduced, and the network learns faster to provide accuracy values which are comparable to or better than other methods of ASA prediction. Results: Predictions in two- and three-state classification systems with several thresholds are provided. Our prediction method achieved accuracy of up to 90% for training and 88% for test data sets. Three-state prediction provides a maximum of 65% accuracy for training and 63% for the test data. The applicability of neural networks for ASA prediction has been confirmed with a larger data set and a wider range of state thresholds. Salient differences between a linear and an exponential network for ASA prediction have been analysed.
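    The two- and three-state classification systems mentioned here are cutoffs on relative solvent accessibility; a sketch with illustrative thresholds (the cutoff values are assumptions, not necessarily NETASA's settings):

```python
def asa_states(rel_asa, two_state_cutoff=0.25, three_state_cutoffs=(0.09, 0.36)):
    """Assign buried/exposed labels from relative ASA (a value in 0..1).
    All cutoff values here are illustrative assumptions."""
    two_state = "exposed" if rel_asa >= two_state_cutoff else "buried"
    low, high = three_state_cutoffs
    if rel_asa < low:
        three_state = "buried"
    elif rel_asa < high:
        three_state = "intermediate"
    else:
        three_state = "exposed"
    return two_state, three_state
```

    Moving the cutoffs changes class balance and hence the reported accuracies, which is why results are quoted at several thresholds.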

    Prognostic value of test(s) for O6-methylguanine–DNA methyltransferase (MGMT) promoter methylation for predicting overall survival in people with glioblastoma treated with temozolomide

    BACKGROUND: Glioblastoma is an aggressive form of brain cancer. Approximately five in 100 people with glioblastoma survive for five years past diagnosis. Glioblastomas that have a particular modification to their DNA (called methylation) in a particular region (the O(6)‐methylguanine–DNA methyltransferase (MGMT) promoter) respond better to treatment with chemotherapy using a drug called temozolomide. OBJECTIVES: To determine which method for assessing MGMT methylation status best predicts overall survival in people diagnosed with glioblastoma who are treated with temozolomide. SEARCH METHODS: We searched MEDLINE, Embase, BIOSIS, Web of Science Conference Proceedings Citation Index to December 2018, and examined reference lists. For economic evaluation studies, we additionally searched NHS Economic Evaluation Database (EED) up to December 2014. SELECTION CRITERIA: Eligible studies were longitudinal (cohort) studies of adults with diagnosed glioblastoma treated with temozolomide with/without radiotherapy/surgery. Studies had to have related MGMT status in tumour tissue (assessed by one or more method) with overall survival and presented results as hazard ratios or with sufficient information (e.g. Kaplan‐Meier curves) for us to estimate hazard ratios. We focused mainly on studies comparing two or more methods, and listed brief details of articles that examined a single method of measuring MGMT promoter methylation. We also sought economic evaluations conducted alongside trials, modelling studies and cost analysis. DATA COLLECTION AND ANALYSIS: Two review authors independently undertook all steps of the identification and data extraction process for multiple‐method studies. We assessed risk of bias and applicability using our own modified and extended version of the QUality In Prognosis Studies (QUIPS) tool. 
We compared different techniques, exact promoter regions (5'‐cytosine‐phosphate‐guanine‐3' (CpG) sites) and thresholds for interpretation within studies by examining hazard ratios. We performed meta‐analyses for comparisons of the three most commonly examined methods (immunohistochemistry (IHC), methylation‐specific polymerase chain reaction (MSP) and pyrosequencing (PSQ)), with ratios of hazard ratios (RHR), using an imputed value of the correlation between results based on the same individuals. MAIN RESULTS: We included 32 independent cohorts involving 3474 people that compared two or more methods. We found evidence that MSP (CpG sites 76 to 80 and 84 to 87) is more prognostic than IHC for MGMT protein at varying thresholds (RHR 1.31, 95% confidence interval (CI) 1.01 to 1.71). We also found evidence that PSQ is more prognostic than IHC for MGMT protein at various thresholds (RHR 1.36, 95% CI 1.01 to 1.84). The data suggest that PSQ (mainly at CpG sites 74 to 78, using various thresholds) is slightly more prognostic than MSP at sites 76 to 80 and 84 to 87 (RHR 1.14, 95% CI 0.87 to 1.48). Many variants of PSQ have been compared, although we did not see any strong and consistent messages from the results. Targeting multiple CpG sites is likely to be more prognostic than targeting just one. In addition, we identified and summarised 190 articles describing a single method for measuring MGMT promoter methylation status. AUTHORS' CONCLUSIONS: PSQ and MSP appear more prognostic for overall survival than IHC. Strong evidence is not available to draw conclusions with confidence about the best CpG sites or thresholds for quantitative methods. MSP has been studied mainly for CpG sites 76 to 80 and 84 to 87 and PSQ at CpG sites ranging from 72 to 95. A threshold of 9% for CpG sites 74 to 78 performed better than higher thresholds of 28% or 29% in two of three good‐quality studies making such comparisons.
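    The ratio-of-hazard-ratios comparison can be sketched as follows. The correlation term is needed because both methods were applied to the same patients; the input values and the correlation of 0.5 below are illustrative assumptions, not data from the review:

```python
import math

def ratio_of_hazard_ratios(hr1, se1, hr2, se2, rho):
    """RHR = hr1/hr2 with a 95% CI computed on the log scale.
    rho is an imputed correlation between the two log hazard ratios,
    since both are estimated from the same individuals."""
    log_rhr = math.log(hr1) - math.log(hr2)
    se = math.sqrt(se1**2 + se2**2 - 2.0 * rho * se1 * se2)
    lo = math.exp(log_rhr - 1.96 * se)
    hi = math.exp(log_rhr + 1.96 * se)
    return math.exp(log_rhr), (lo, hi)

# Illustrative inputs only (log-scale standard errors assumed):
rhr, ci = ratio_of_hazard_ratios(hr1=1.5, se1=0.15, hr2=1.1, se2=0.12, rho=0.5)
```

    An RHR above 1 with a CI excluding 1 indicates the first method is the more prognostic of the two, which is how the MSP-versus-IHC and PSQ-versus-IHC results above should be read.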

    Programming of subthalamic nucleus deep brain stimulation with hyperdirect pathway and corticospinal tract-guided parameter suggestions.

    Deep brain stimulation (DBS) of the subthalamic nucleus (STN) is an effective treatment for advanced Parkinson's disease. Stimulation of the hyperdirect pathway (HDP) may mediate the beneficial effects, whereas stimulation of the corticospinal tract (CST) mediates capsular side effects. The study's objective was to suggest stimulation parameters based on the activation of the HDP and CST. This retrospective study included 20 Parkinson's disease patients with bilateral STN DBS. Patient-specific whole-brain probabilistic tractography was performed to extract the HDP and CST. Stimulation parameters from monopolar reviews were used to estimate volumes of tissue activated and to determine the streamlines of the pathways inside these volumes. The activated streamlines were related to the clinical observations. Two models were computed, one for the HDP to estimate effect thresholds and one for the CST to estimate capsular side effect thresholds. In a leave-one-subject-out cross-validation, the models were used to suggest stimulation parameters. The models indicated an activation of 50% of the HDP at effect threshold, and 4% of the CST at capsular side effect threshold. The suggestions for best and worst levels were significantly better than random suggestions. Finally, we compared the suggested stimulation thresholds with those from the monopolar reviews. The median suggestion errors for the effect threshold and side effect threshold were 1 and 1.5 mA, respectively. Our stimulation models of the HDP and CST suggested STN DBS settings. Prospective clinical studies are warranted to optimize tract-guided DBS programming. Together with other modalities, these may allow for assisted STN DBS programming.
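    The pathway-activation percentages amount to counting streamlines that intersect the volume of tissue activated (VTA); a toy geometric sketch (the spherical VTA is a simplifying assumption; real VTAs come from patient-specific electric-field models):

```python
import numpy as np

def fraction_activated(streamlines, vta_center, vta_radius):
    """Fraction of streamlines with at least one point inside a spherical VTA.
    Each streamline is an (n_points, 3) array of coordinates in mm."""
    center = np.asarray(vta_center, dtype=float)
    hits = sum(
        1 for pts in streamlines
        if (np.linalg.norm(np.asarray(pts) - center, axis=1) <= vta_radius).any()
    )
    return hits / len(streamlines)

# Two synthetic straight streamlines: one passing through the VTA, one far away.
line_a = np.column_stack([np.linspace(-5, 5, 50), np.zeros(50), np.zeros(50)])
line_b = line_a + np.array([0.0, 10.0, 0.0])
frac = fraction_activated([line_a, line_b], vta_center=(0, 0, 0), vta_radius=1.0)
```

    In this framing, a parameter suggestion is the amplitude whose VTA reaches roughly 50% HDP activation while staying below 4% CST activation.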

    From linguistic innovation in blogs to language learning in adults: what do interaction networks tell us?

    Social networks have been found to play an increasing role in human behaviour and even in individual attainment. We present the results of two projects applying social network analysis (SNA) to language phenomena. One explores the social propagation of neologisms on a social software platform (a microblogging service); the other investigates the impact of social network structure and peer interaction dynamics on second-language learning outcomes in the setting of naturally occurring face-to-face interaction. From local, low-level interactions between agents verbally communicating with one another, we aim to describe the processes underlying the emergence of more global systemic order and dynamics, using the latest methods of complexity science. In the former study, we demonstrate 1) the emergence of a linguistic norm, 2) that the general lexical innovativeness of Internet users follows not a power law but a unimodal distribution, 3) that the exposure thresholds necessary for a user to adopt new lexemes from his/her neighbours concentrate at low values, suggesting that - at least in low-stakes scenarios - people are more susceptible to social influence than previously expected, and 4) that, contrary to common expectations, the most popular tags are characterised by high adoption thresholds. In the latter, we find 1) that the best predictor of performance is reciprocal interactions between individuals in the language being acquired, 2) that outgoing interactions in the acquired language are a better predictor than incoming interactions, and 3) not surprisingly, a clear negative relationship between performance and the intensity of interactions with same-native-language speakers. We also compare models where social interactions are weighted by homophily with those that treat them as orthogonal to each other.
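    The exposure-threshold mechanism in the blog study can be sketched as a fractional-threshold adoption rule (the threshold value and the rule itself are illustrative, not the study's fitted model):

```python
def adopts(adopting_neighbours, total_neighbours, threshold=0.1):
    """A user adopts a new lexeme once the fraction of neighbours already
    using it reaches the threshold. The 0.1 default is an illustrative
    low value, matching the finding that thresholds concentrate low."""
    return adopting_neighbours / total_neighbours >= threshold

# With a low threshold, one adopting contact out of ten already suffices;
# a high-threshold tag needs much broader neighbourhood uptake.
early_adopter = adopts(1, 10, threshold=0.1)
high_threshold = adopts(1, 10, threshold=0.5)
```

    Low thresholds make cascades easy to start, which is why the high thresholds observed for the most popular tags are the counter-intuitive result here.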

    Core Mass Function: The Role of Gravity

    We analyze the mass distribution of cores formed in an isothermal, magnetized, turbulent, and self-gravitating, nearly critical molecular cloud model. Cores are identified at two density threshold levels. Our main result is that the presence of self-gravity modifies the slope of the core mass function (CMF) at the high-mass end. At low thresholds, the slope is shallower than the one predicted by pure turbulent fragmentation; this shallowness is due to the effects of core coalescence and gas accretion. Most importantly, the slope of the CMF at the high-mass end steepens when cores are selected at higher density thresholds, or alternatively, if the CMF is fitted with a log-normal function, the width of the log-normal distribution decreases with increasing threshold. This is because gravity plays a more important role in denser structures selected at higher density thresholds, and it leads to the conclusion that the role of gravity is essential in generating a CMF that bears more resemblance to the IMF when cores are selected with an increasing density threshold in the observations. Comment: 13 pages, 4 figures, accepted to ApJ Letters. An additional simulation has been added. The sign of the beta values in Fig. 1 changed to fit their definition in the text. The main conclusions, however, are unchanged and better clarified.
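    Measuring how the high-mass slope changes with density threshold can be done with a maximum-likelihood power-law fit above a mass cut; a generic sketch (this is the standard continuous-power-law estimator, not necessarily the paper's fitting procedure):

```python
import numpy as np

def power_law_index(masses, m_min):
    """Maximum-likelihood index alpha for dN/dM proportional to M^-alpha,
    fitted to all cores with mass >= m_min."""
    m = np.asarray(masses, dtype=float)
    m = m[m >= m_min]
    return 1.0 + len(m) / np.log(m / m_min).sum()

# Self-check on synthetic cores drawn from a known power law (alpha = 2.35),
# sampled by inverse-CDF transform above m_min = 1.
rng = np.random.default_rng(1)
u = rng.uniform(size=50_000)
masses = 1.0 * u ** (-1.0 / (2.35 - 1.0))
alpha_hat = power_law_index(masses, m_min=1.0)
```

    Repeating such a fit on core samples extracted at successively higher density thresholds is one way to quantify the steepening reported above.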

    The Concentration of Agricultural Production and Growth of Agricultural Holdings

    Between the 1998 and 2000 agricultural censuses, the number of agricultural holdings fell from one million to 664 000. This fall resulted in a slight increase in the relative concentration of agricultural production, with the smallest holdings decreasing in size and the largest holdings becoming larger. Two explanatory variables today have a greater influence on holding size than in the past: the age of the manager of the agricultural holding, with younger managers heading increasingly larger holdings, and the legal form of the holding. Starting size has little influence on the growth of holdings: the concentration of production occurs more through a rise in economic size thresholds than through the cornering of the market by the largest holdings. The slight movement towards concentration observed over the last 15 years is essentially linked to the development of corporate farming, which is better suited to larger holdings than the individual farmer. Keywords: Size Growth, Demography, Agricultural Holdings.
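    "Relative concentration" of production is conventionally summarized with a Gini-type index over holding sizes; a minimal sketch (the Gini coefficient is a standard measure, not necessarily the exact statistic used in the census analysis):

```python
import numpy as np

def gini(sizes):
    """Gini coefficient of holding sizes: 0 = perfectly equal,
    values approaching 1 = production concentrated in a few holdings."""
    x = np.sort(np.asarray(sizes, dtype=float))
    n = len(x)
    ranks = np.arange(1, n + 1)
    return (2.0 * (ranks * x).sum() / (n * x.sum())) - (n + 1.0) / n

equal = gini([10, 10, 10, 10])   # all holdings the same size
skewed = gini([1, 2, 5, 40])     # one large holding dominates
```

    A rise in this index between two censuses is what "a slight increase in the relative concentration of production" expresses quantitatively.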