
    "This is the way 'I' create my passwords ...": does the endowment effect deter people from changing the way they create their passwords?

    The endowment effect is the term used to describe a phenomenon that manifests as a reluctance to relinquish owned artifacts, even when a viable or better substitute is offered. It has been confirmed by multiple studies of the ownership of physical artifacts. If computer users also "own", and are attached to, their personal security routines, such feelings could conceivably activate the same endowment effect. This would, in turn, lead to their over-estimating the "value" of their existing routines, in terms of the protection they afford and the risks they mitigate. They might well, as a consequence, not countenance any efforts to persuade them to adopt a more secure routine, because their comparison of the pre-existing and proposed new routines is skewed by the activation of the endowment effect. In this paper, we report on an investigation into the possibility that the endowment effect activates when people adopt personal password creation routines. We did indeed find evidence that the endowment effect is likely to be triggered in this context. This constitutes one explanation for the failure of many security awareness drives to improve password strength. We conclude by suggesting directions for future research to confirm our findings, and to investigate the activation of the effect for other security routines.

    Models of protein production along the cell cycle: an investigation of possible sources of noise

    In this article, we quantitatively study, through stochastic models, the effects of several intracellular phenomena, such as cell volume growth, cell division, and gene replication, as well as fluctuations of available RNA polymerases and ribosomes. These phenomena are indeed rarely considered in classic models of protein production, and no relative quantitative comparison among them has been performed. The parameters for a large and representative class of proteins are determined using experimental measurements. The most important and surprising conclusion of our study is that, despite the significant fluctuations of free RNA polymerases and free ribosomes, they bring little variability to protein production, contrary to what has been previously proposed in the literature. After verifying the robustness of this quite counter-intuitive result, we discuss its possible origin from a theoretical viewpoint, and interpret it as the result of a mean-field effect.
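    The kind of model the article extends can be sketched as a classic two-stage (transcription + translation) birth–death process, simulated exactly with Gillespie's algorithm. The sketch below is a minimal illustration; all rate constants are placeholder assumptions, not the parameters fitted in the paper, and it omits the volume-growth, division, and replication effects the article studies.

```python
import random

# Two-stage gene expression (transcription + translation), simulated
# exactly with Gillespie's algorithm. Rates are illustrative
# assumptions, not the paper's fitted parameters.
K_TX, G_M = 2.0, 0.2    # mRNA birth / decay rates
K_TL, G_P = 1.0, 0.1    # protein birth (per mRNA) / decay rates

def gillespie(t_end, seed=0):
    rng = random.Random(seed)
    t, m, p = 0.0, 0, 0
    while t < t_end:
        rates = [K_TX, G_M * m, K_TL * m, G_P * p]
        total = sum(rates)              # K_TX > 0, so total > 0
        t += rng.expovariate(total)
        u, acc, hit = rng.random() * total, 0.0, 0
        for i, r in enumerate(rates):   # pick one reaction
            acc += r
            if u < acc:
                hit = i
                break
        if hit == 0:   m += 1           # transcription
        elif hit == 1: m -= 1           # mRNA degradation
        elif hit == 2: p += 1           # translation
        else:          p -= 1           # protein degradation
    return m, p

print(gillespie(500.0))  # stationary means: m ~ K_TX/G_M = 10, p ~ 100
```

    With these placeholder rates the stationary mean mRNA count is K_TX/G_M = 10 and the mean protein count is 10 x K_TL/G_P = 100; extra noise sources such as fluctuating polymerase and ribosome pools would enter as additional reaction channels.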

    A Stochastic Analysis of Autoregulation of Gene Expression

    This paper analyzes, in the context of a prokaryotic cell, the stochastic variability of the number of proteins when there is a control of gene expression by an autoregulation scheme. The goal of this work is to estimate the efficiency of the regulation in limiting the fluctuations of the number of copies of a given protein. The autoregulation considered in this paper relies mainly on a negative feedback: the proteins are repressors of their own gene expression. The efficiency of a production process without feedback control is compared to that of a production process with autoregulation of the gene expression, assuming that both produce the same average number of proteins. The main characteristic used for the comparison is the standard deviation of the number of proteins at equilibrium. With a Markovian representation and a simple model of repression, we prove that, under a scaling regime, the repression mechanism follows a Hill repression scheme with a hyperbolic control. An explicit asymptotic expression for the variance of the number of proteins under this regulation mechanism is obtained. Simulations are used to study other aspects of autoregulation, such as the rate of convergence to equilibrium of the production process and the case where the control of the production process is achieved via the inhibition of mRNAs.
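    The comparison described in the abstract can be sketched with a one-species birth–death simulation: a Hill (hyperbolic, h = 1) repressed birth rate versus a constant birth rate tuned to the same mean, compared via the Fano factor (variance/mean). All rates below are illustrative assumptions, not the paper's model or its asymptotic formula.

```python
import random

# Birth-death sketch: Hill-repressed protein production vs. an
# unregulated process with the same stationary mean (~100 copies).
# Rates are illustrative assumptions, not the paper's parameters.
def stationary_moments(birth_rate, death=0.1, t_end=5000.0, seed=1):
    rng = random.Random(seed)
    t, p = 0.0, 100                 # start near the mean
    w = s1 = s2 = 0.0
    while t < t_end:
        b = birth_rate(p)
        total = b + death * p       # at p = 0 only births remain
        dt = rng.expovariate(total)
        if t > 500.0:               # time-weighted moments after burn-in
            w += dt; s1 += p * dt; s2 += p * p * dt
        t += dt
        p += 1 if rng.random() * total < b else -1
    mean = s1 / w
    return mean, s2 / w - mean * mean

hill = lambda p: 30.0 / (1.0 + p / 50.0)   # hyperbolic repression
flat = lambda p: 10.0                      # no feedback, same mean
m1, v1 = stationary_moments(hill)
m0, v0 = stationary_moments(flat)
print(v1 / m1, v0 / m0)   # feedback should shrink the Fano factor
```

    Both processes have fixed point p = 100; the unregulated one is Poisson (Fano factor 1), while the negative feedback pushes the Fano factor below 1, which is the noise-reduction effect the paper quantifies.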

    A Primer for Black Hole Quantum Physics

    The mechanisms which give rise to Hawking radiation are revealed by analyzing in detail pair production in the presence of horizons. In preparation for the black hole problem, three preliminary problems are dealt with at length: pair production in an external electric field, thermalization of a uniformly accelerated detector, and accelerated mirrors. In the light of these examples, the black hole evaporation problem is then presented. The leitmotif is the singular behavior of modes on the horizon, which gives rise to a steady rate of production. Special emphasis is put on how each produced particle contributes to the mean, albeit arising from a particular vacuum fluctuation. It is the mean which drives the semiclassical back reaction. This aspect is analyzed in more detail than heretofore, and in particular its drawbacks are emphasized. It is the semiclassical theory which gives rise to Hawking's famous equation for the loss of mass of the black hole due to evaporation, dM/dt \simeq -1/M^2. Black hole thermodynamics is derived from the evaporation process, whereupon the reservoir character of the black hole is manifest. The relation to the thermodynamics of the eternal black hole through the Hartle--Hawking vacuum and the Killing identity is displayed. It is through the analysis of the fluctuations of the field configurations which give rise to a particular Hawking photon that the dubious character of the semiclassical theory becomes manifest. The present frontier of research revolves around this problem, and is principally concerned with the fact that one calls upon energy scales greater than Planckian, and possibly with non-unitary evolution as well. These last subjects are presented in qualitative fashion only, so that this review stops at the threshold of quantum gravity.
    Comment: an old review article on black hole evaporation and black hole thermodynamics, put on the archive following popular demand; 178 pages, 21 figures. (This text differs slightly from the published version.)
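    Integrating the mass-loss law quoted in the abstract gives the well-known cubic lifetime scaling (order-one constants suppressed, natural units):

```latex
\frac{dM}{dt} \simeq -\frac{1}{M^{2}}
\quad\Longrightarrow\quad
M^{2}\,dM \simeq -\,dt
\quad\Longrightarrow\quad
M(t)^{3} \simeq M_{0}^{3} - 3t ,
```

    so a hole of initial mass M_0 evaporates completely on a timescale t_evap ~ M_0^3/3 in these units.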

    Analysis and modeling of green wood milling: Chip production by slabber

    During the primary transformation of wood, logs are faced with slabber heads. The chips produced are the raw material for the pulp, paper, and particleboard industries. The efficiency of these industries depends partly on particle size distribution. Controlling this distribution is no easy matter because of its strong dependence on cutting conditions and the variability of the material. This study aimed at a better understanding and prediction of chip fragmentation. It starts with a detailed description of the cutting kinematics and of the interaction between knife and log. This leads to the numerical development of a generic slabber head. Chip fragmentation phenomena were then studied through experiments in dynamic conditions. These experiments were carried out with a pendulum (Vc = 400 m/min) instrumented with piezoelectric force sensors and a high-speed camera. The results obtained agreed very well with previous quasi-static experiments.

    Sharing resources for performance and energy optimization of concurrent streaming applications

    We aim to find optimal mappings for concurrent streaming applications. Each application consists of a linear chain with several stages, and processes successive data sets in pipeline mode. The objective is to minimize the energy consumption of the whole platform, while satisfying given performance-related bounds on the period and latency of each application. The problem is to decide which processors to enroll, at which speed (or mode) to use them, and which stages they should execute. Processors can be identical (with the same modes) or heterogeneous. We also distinguish two mapping categories: interval mappings and general mappings. For interval mappings, a processor is assigned a set of consecutive stages of the same application, so there is no resource sharing across applications. By contrast, the assignment is fully arbitrary for general mappings, so a processor can be reused for several applications. On the theoretical side, we establish complexity results for this tri-criteria mapping problem (energy, period, latency), classifying polynomial versus NP-complete instances. Furthermore, we derive an integer linear program that provides the optimal solution in the most general case. On the experimental side, we design polynomial-time heuristics, and assess their absolute performance against the linear program. One main goal is to assess the impact of processor sharing on the quality of the solution.
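    As a toy illustration of the interval-mapping variant, the sketch below enumerates the consecutive splits of one five-stage chain over three processors, keeps only splits whose interval execution time meets a period bound, and picks the cheapest split under an assumed cubic speed/power model. The stage loads, available speeds, and the energy model are our own illustrative assumptions, not the paper's platform model.

```python
from itertools import combinations

# Interval mapping of ONE pipelined chain: each processor gets one
# interval of consecutive stages. Numbers and the cubic power model
# are illustrative assumptions.
STAGES = [4.0, 2.0, 7.0, 3.0, 5.0]   # work per data set, per stage
SPEEDS = [1.0, 2.0, 3.0]             # available processor modes

def best_mapping(period_bound, n_procs=3):
    n, best = len(STAGES), None
    # choose n_procs - 1 cut points -> consecutive intervals
    for cuts in combinations(range(1, n), n_procs - 1):
        bounds = [0, *cuts, n]
        energy = 0.0
        for lo, hi in zip(bounds, bounds[1:]):
            load = sum(STAGES[lo:hi])
            # slowest mode meeting the period bound for this interval
            ok = [s for s in SPEEDS if load / s <= period_bound]
            if not ok:
                break                # this split is infeasible
            s = min(ok)
            energy += s ** 3 * (load / s)  # power ~ s^3, time = load/s
        else:
            if best is None or energy < best[0]:
                best = (energy, bounds)
    return best

print(best_mapping(period_bound=5.0))  # -> (69.0, [0, 2, 4, 5])
```

    Running each interval at the slowest feasible mode is what makes the cubic power model pay off: energy per interval is speed^2 x load, so higher modes are only worth enrolling when the period bound forces them. General mappings would additionally let a processor serve stages of several chains.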

    The dimensions of alienation: a survey of organized industrial workers.

    Dept. of Sociology and Anthropology. Paper copy at Leddy Library: Theses & Major Papers - Basement, West Bldg. / Call Number: Thesis1972 .R35. Source: Masters Abstracts International, Volume: 40-07, page: . Thesis (M.A.)--University of Windsor (Canada), 1972