
    Patterns of creation and usage of wikipedia content

    Wikipedia is the largest online service storing user-generated content. Its pages are open to anyone for addition, deletion and modification, and the effort of contributors is recorded and can be tracked over time. Although Wikipedia's web content could in principle exhibit unbounded growth, it is not yet clear whether the effort of contributors and the output they generate actually follow patterns of continuous growth. Nor is it clear how users access this content, or whether recurring, detectable patterns of usage show how Wikipedia content is typically viewed by interested readers. Using Wikipedia categories as macro-agglomerates, this study reveals that categories face a decreasing growth trend over time, after an initial, exponential phase of development. On the other hand, the study demonstrates that the number of views of the pages within the categories follows a linear, unbounded growth. The link between software usefulness and the need for software maintenance over time has been established by Lehman and others; the link between Wikipedia usage and changes to its content, unlike software, appears to follow a two-phase evolution of production followed by consumption. This study is partly funded by the University of East London.
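    The two regimes described here can be made concrete with a small curve-fitting exercise. The sketch below is illustrative, not the paper's code: it fits a saturating (logistic) model to synthetic per-category edit counts and a linear model to synthetic view counts; all data and parameter values are made up.

```python
# Illustrative sketch (not the paper's analysis): fit the two growth
# patterns described above to synthetic per-category time series.
import numpy as np
from scipy.optimize import curve_fit

months = np.arange(1, 61)                                   # 5 years, monthly
edits = 1000 / (1 + np.exp(-0.2 * (months - 20)))           # synthetic edit counts
views = 50 * months + np.random.normal(0, 30, months.size)  # synthetic view counts

def logistic(t, K, r, t0):
    """Saturating growth: fast early rise, then a plateau near K."""
    return K / (1 + np.exp(-r * (t - t0)))

def linear(t, a, b):
    """Unbounded linear growth."""
    return a * t + b

(K, r, t0), _ = curve_fit(logistic, months, edits, p0=[edits.max(), 0.1, 30])
(a, b), _ = curve_fit(linear, months, views)
print(f"edits saturate near K={K:.0f}; views grow by ~{a:.1f} per month")
```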

    Majorana neutrino decay in an Effective Approach

    The search strategy for, or the discovery of new effects from, heavy neutrinos often relies on their different decay channels to detectable particles. In this work we study the decay of a Majorana neutrino with interactions obtained from a general effective theory modeling new physics at the scale Λ. The results obtained are general because they are based on an effective theory rather than on specific models. We are interested in relatively light heavy Majorana neutrinos, with masses below the W mass (m_N < m_W). This mass range simplifies the study by reducing the possible decay modes. Moreover, we find that for Λ ∼ 1 TeV the neutrino-plus-photon channel could explain different observations: we analyze the potential of the studied interactions to explain neutrino-related problems such as the MiniBooNE and SHALON anomalies. We show in different figures the dominant branching ratios and the decay length of the Majorana neutrino in this approach. This kind of heavy neutral lepton could be searched for at the LHC using displaced-vertex techniques. Comment: 15 pages, 5 figures
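    For orientation, the decay length used in displaced-vertex searches follows from the total width via L = βγcτ, with τ = ħ/Γ. The sketch below computes this for placeholder values of the mass, energy and width; none of the numbers are results from the paper.

```python
# Rough sketch: lab-frame decay length of a heavy neutrino from its
# total width, L = beta * gamma * c * tau. All numbers are placeholders.
import math

HBAR_GEV_S = 6.582e-25   # hbar in GeV*s
C_M_PER_S = 2.998e8      # speed of light in m/s

def decay_length_m(m_N_gev, E_gev, total_width_gev):
    """Lab-frame decay length in metres for mass m_N and energy E."""
    tau = HBAR_GEV_S / total_width_gev           # proper lifetime in s
    gamma = E_gev / m_N_gev                      # Lorentz boost
    beta = math.sqrt(max(0.0, 1 - 1 / gamma**2))
    return beta * gamma * C_M_PER_S * tau

# e.g. a 20 GeV neutrino produced at 100 GeV with a 1e-14 GeV total width
print(f"{decay_length_m(20.0, 100.0, 1e-14):.2f} m")
```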

    A screening method for ranking chemicals by their fate and behaviour in the environment and potential toxic effects in humans following non-occupational exposure

    A large number of chemicals are released intentionally or unintentionally into the environment each year. These include thousands of substances that are currently listed worldwide and several hundred new substances added annually (Mücke et al., 1986). When these compounds are used, they can reach microorganisms, plants, animals and man, either in their original state or in the form of reaction and degradation products, via air, water, soil or foodstuffs. Hence environmental chemicals can occur in practically all environmental compartments and ecosystems. It is not feasible to conduct assessments of human exposure and possible associated health effects for all chemicals. Even if the necessary resources were available, reliable data for a quantitative evaluation are likely to be absent in most cases. This has led to the development of schemes for prioritising compounds likely to be of environmental significance. Such schemes can be used to direct future research efforts towards the prioritised compounds. This study was commissioned by the Department of Health (DH) as part of a broader research activity that aims to identify key priority chemicals of concern to human health at routine levels of environmental exposure. The main pathways of human exposure are shown in Figure 1.1. A review of the principal prioritisation schemes used by different organisations to assess the significance of chemical release into the environment has been conducted by the MRC Institute for Environment and Health (IEH, 2003). This review showed that the approaches used by different organisations vary widely, depending on the initial reasons for which the schemes were developed. The basic information presented in the review was used to develop a simple screening method for ranking chemicals. The model used in this prioritisation scheme is outlined in Figure 1.2. The main purpose in developing the prioritisation scheme for DH was to develop a dedicated priority-setting method capable of identifying chemicals in air, water, soil and foodstuffs that might pose a significant risk to human health following low-level environmental exposure. The methodology was developed in order to identify compounds that required further assessment and those that had data gaps. More detailed risk assessments were conducted at a later stage on those compounds prioritised as being of high importance^a. The screening methodology was developed for 'existing chemicals', as these are of greatest concern because data on their toxicity and/or fate and behaviour are often unknown^b. The production of a priority list was designed to highlight compounds that required further regulatory measures to reduce exposure of the general population and for which an in-depth risk characterisation would be necessary to assist in the evaluation and implementation of activities for reducing environmental risks. This might include an assessment of the costs of such risks to human health and the costs of reduction measures. As the scheme also aimed to identify data gaps that might warrant further investigation, the application of default categories for chemicals with no data was also considered. The overall aim was to develop a screening methodology that is quick, clear and simple to use and that can easily be revised to take into account new information on compounds as and when it becomes available.
    This report describes how physicochemical properties and toxicological data were incorporated into a screening model to assess the potential fate and transfer of chemicals between different environmental compartments and to predict the potential human exposure to toxic chemicals through the inhalation of contaminated air and the ingestion of water and food. It must be stressed, however, that the method devised is a simple screening process and that a more detailed assessment is necessary to determine the potential transfer of a chemical through the food chain and the full extent of any adverse health effects. Sections 2 and 4 present the physicochemical properties, toxicological data and algorithms used to screen the compounds. Section 3 summarises the groups of chemicals that were included in the screening process. The results of the prioritisation scheme, and comments on their limitations and constraints, are presented in Section 5.
    a Benzene (IEH Report on Benzene in the Environment, R12); 4,6-dichlorocresol, hexachloro-1,3-butadiene, tetrachlorobenzene, 2,4,6-trichlorophenol (reports to DH; available from the MRC Institute for Environment and Health).
    b 'Existing Substances' are those that were placed on the European Union (EU) market before 1981. Prior to 1981, regulatory requirements related to products intended for certain uses (e.g. veterinary medicines) and did not require assessment of the hazardous properties of a substance before it was released onto the market. For substances placed on the market after 1981 (classified as 'New Substances') there is a legal requirement to conduct such assessments. Regulatory agencies require the collection of extensive safety documentation before a chemical can be used, for example, in foods or commercial products.
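    As a rough illustration of how a screening scheme of this kind can work, the sketch below ranks chemicals by summing banded scores for persistence, bioaccumulation and toxicity, substituting a conservative default where data are missing. The attribute names, score bands and default value are hypothetical; this is not the IEH/DH algorithm.

```python
# Minimal sketch of a priority-ranking scheme of the kind described
# above. Score bands and the conservative default for data gaps are
# hypothetical, not the IEH/DH methodology.
from dataclasses import dataclass
from typing import Optional

DEFAULT_SCORE = 3  # conservative default applied when no data are available

@dataclass
class Chemical:
    name: str
    persistence: Optional[int] = None      # banded score, 1 (low) .. 5 (high)
    bioaccumulation: Optional[int] = None  # 1 .. 5
    toxicity: Optional[int] = None         # 1 .. 5

def priority_score(c: Chemical) -> int:
    """Sum the three banded scores, substituting the default for gaps."""
    parts = [c.persistence, c.bioaccumulation, c.toxicity]
    return sum(DEFAULT_SCORE if p is None else p for p in parts)

chemicals = [
    Chemical("benzene", persistence=2, bioaccumulation=1, toxicity=5),
    Chemical("hexachloro-1,3-butadiene", persistence=4, toxicity=4),  # data gap
]
for c in sorted(chemicals, key=priority_score, reverse=True):
    print(c.name, priority_score(c))
```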

    Optimal diffusion in ecological dynamics with Allee effect in a metapopulation

    How does diffusion impact ecological dynamics under the Allee effect and spatial constraints? That is the question we address. Employing a microscopic minimal model in a metapopulation (without imposing nonlinear birth and death rates), we evince, both numerically and analytically, the emergence of an optimal diffusion that maximises the survival probability. Even though at first this result seems counter-intuitive, it has empirical support from recent experiments with engineered bacteria. Moreover, we show that this optimal diffusion disappears for loose spatial constraints. Comment: 16 pages; 6 figures
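    The numerical side of such a study can be sketched as a simple Monte Carlo: simulate a small stochastic metapopulation with an Allee-type birth rate and scan the hop probability, estimating the survival frequency at each value. The toy below uses made-up rates and only two patches, so whether (and where) an optimum appears depends entirely on those choices; it shows the skeleton of the experiment, not the paper's model.

```python
# Toy Monte Carlo (not the paper's model): two patches, an Allee-type
# birth rate that rises with local density, constant death, and a hop
# probability D that is scanned. All rates are arbitrary placeholders.
import random

def survives(D, steps=150, cap=50):
    n = [5, 0]                          # initial individuals per patch
    for _ in range(steps):
        new = [0, 0]
        for i in (0, 1):
            for _ in range(n[i]):
                if random.random() < 0.1:                  # death
                    continue
                j = 1 - i if random.random() < D else i    # diffusion hop
                new[j] += 1
                # Allee effect: per-capita birth grows with origin density
                if random.random() < 0.15 * n[i] / (n[i] + 4):
                    new[j] += 1
        n = [min(x, cap) for x in new]  # spatial constraint per patch
        if sum(n) == 0:
            return False                # extinction
    return True

for D in (0.0, 0.05, 0.2, 0.5):
    p = sum(survives(D) for _ in range(200)) / 200
    print(f"D={D:.2f}  survival ~ {p:.2f}")
```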

    The long memory story of real interest rates. Can it be supported?

    This paper finds evidence of fractional integration for a number of monthly ex post real interest rate series, using the GPH semiparametric estimator on data from fourteen European countries and the US. However, we pose empirical questions about certain time-series requirements that emerge from fractional integration, and we find that they do not hold, pointing to "spurious" long memory and casting doubt on the theoretical origins of long memory in our sample. Common stochastic trends expressed as the sum of stationary past errors do not seem appropriate as an explanation of real interest rate covariation. From an economic perspective, our results suggest that most European countries show a higher speed of real interest rate equalization with Germany than with the US.
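    The GPH estimator itself is a log-periodogram regression: regress the log of the periodogram at the first m Fourier frequencies on log(4 sin²(λ/2)); the negative of the slope estimates the fractional integration order d. A minimal sketch on a synthetic series, using the common (but not unique) bandwidth choice m = n^0.5:

```python
# Sketch of the GPH log-periodogram estimator of the fractional
# integration order d, applied here to synthetic white noise.
import numpy as np

def gph_d(x, power=0.5):
    x = np.asarray(x, dtype=float)
    n = x.size
    m = int(n ** power)                  # number of low frequencies used
    j = np.arange(1, m + 1)
    lam = 2 * np.pi * j / n              # Fourier frequencies
    dft = np.fft.fft(x - x.mean())
    I = (np.abs(dft[1:m + 1]) ** 2) / (2 * np.pi * n)   # periodogram
    X = np.log(4 * np.sin(lam / 2) ** 2)
    slope = np.polyfit(X, np.log(I), 1)[0]
    return -slope                        # d is minus the regression slope

rng = np.random.default_rng(0)
print(f"white noise: d_hat = {gph_d(rng.normal(size=2000)):.2f}")  # close to 0
```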

    Filamentary fragmentation in a turbulent medium

    We present the results of smoothed particle hydrodynamic simulations investigating the evolution and fragmentation of filaments that are accreting from a turbulent medium. We show that the presence of turbulence, and the resulting inhomogeneities in the accretion flow, play a significant role in the fragmentation process. Filaments which experience a weakly turbulent accretion flow fragment in a two-tier hierarchical fashion, similar to the fragmentation pattern seen in the Orion Integral Shaped Filament. Increasing the energy in the turbulent velocity field results in more sub-structure within the filaments, and one sees a shift from gravity-dominated fragmentation to turbulence-dominated fragmentation. The sub-structure formed in the filaments is elongated and roughly parallel to the longitudinal axis of the filament, similar to the fibres seen in observations of Taurus, and suggests that the "fray and fragment" scenario is a possible mechanism for the production of fibres. We show that the formation of these fibre-like structures is linked to the vorticity of the velocity field inside the filament and the filament's accretion from an inhomogeneous medium. Moreover, we find that accretion is able to drive and sustain roughly sonic levels of turbulence inside the filaments, but is not able to prevent radial collapse once the filaments become supercritical. However, the supercritical filaments which contain fibre-like structures do not collapse radially, suggesting that fibrous filaments may not necessarily become radially unstable once they reach the critical line-density. Comment: Accepted for publication in MNRAS
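    The critical line-density referred to here is, for an isothermal filament, the standard result (M/L)_crit = 2c_s²/G (Ostriker 1964). A quick check of its magnitude for typical molecular-cloud conditions; the 10 K temperature below is just an example value:

```python
# Critical line mass of an isothermal filament, (M/L)_crit = 2 c_s^2 / G,
# evaluated for molecular gas at an example temperature of 10 K.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
K_B = 1.381e-23      # Boltzmann constant, J/K
M_H = 1.674e-27      # hydrogen mass, kg
MU = 2.33            # mean molecular weight of molecular gas
M_SUN = 1.989e30     # solar mass, kg
PC = 3.086e16        # parsec, m

def critical_line_density(T_kelvin):
    """Critical line mass in M_sun/pc for an isothermal filament."""
    cs2 = K_B * T_kelvin / (MU * M_H)   # isothermal sound speed squared
    return (2 * cs2 / G) / (M_SUN / PC)

print(f"T = 10 K: {critical_line_density(10.0):.1f} M_sun/pc")  # roughly 16
```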

    On the relationship between inflation persistence and temporal aggregation

    This paper examines the impact of temporal aggregation on alternative definitions of inflation persistence. Using the CPI and the core PCE deflator of the US, our results show that temporal aggregation from the monthly to the quarterly to the annual frequency induces persistence in the inflation series.
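    One well-known channel for this effect is Working's (1960) result: when the lower-frequency index is the time average of monthly index levels, differencing the averaged series induces positive serial correlation even if monthly inflation is white noise. The sketch below demonstrates the mechanism on a synthetic random-walk log price level; it is an illustration, not a replication of the paper's CPI/PCE results.

```python
# Working (1960) aggregation effect on a synthetic series: averaging the
# index level before differencing induces autocorrelation in inflation.
import numpy as np

rng = np.random.default_rng(42)
log_price = np.cumsum(rng.normal(size=120_000))   # monthly log price level

def lag1_autocorr(x):
    x = x - x.mean()
    return float(np.dot(x[1:], x[:-1]) / np.dot(x, x))

for name, m in [("monthly", 1), ("quarterly", 3), ("annual", 12)]:
    level = log_price.reshape(-1, m).mean(axis=1)  # time-averaged index
    inflation = np.diff(level)
    print(f"{name:9s} lag-1 autocorr of inflation = {lag1_autocorr(inflation):.2f}")
# Expect roughly 0.00 monthly, ~0.21 quarterly, ~0.25 annual.
```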