The long-delayed solution of the Bukhvostov-Lipatov model
In this paper I complete the solution of the Bukhvostov-Lipatov model by computing the physical excitations and their factorized S-matrix. I also explain the paradoxes which led in recent years to the suspicion that the model may not be integrable.
Comment: 9 pages
L'absentéisme : importance, nature et remèdes (Absenteeism: its extent, nature, and remedies)
In this article, we deal with absenteeism from three different yet complementary perspectives. First, the extent and prevalence of absenteeism in the workplace is discussed, with special attention to its attendant costs. Next, we present a brief synopsis of the models that figure most prominently in explanations of absenteeism, taking care to identify the main limitations of each model with a view toward developing an integrated and more appropriate explanatory framework. Finally, drawing on this framework, we briefly outline several of the strategies available to management for reducing absenteeism and minimizing its adverse effects on work organizations.

Extent of the Problem and its Related Costs
A conservative estimate of the incidence of absenteeism is that on any given day, an average of 3.5 to 4% of Canadian workers do not show up for work (the frequency of absenteeism is about 2.6 times per year per worker, for a yearly average of 8 to 9 days per employee). Further, in terms of the distribution of absenteeism, 25% of all employees seldom skip work (say, one day per year), whereas 15% of workers are responsible for 40% of the absences. The importance of this finding is especially evident when we consider the costs of absenteeism to the work organization. For example, let us assume that this relatively conservative estimate of 9 days per employee per year is a reasonable one. Let us further assume an average salary; multiplying the days lost by the daily salary and an overhead factor then puts the yearly cost of absenteeism at 878.50 per employee.

Causes and Correlates of Absenteeism
A thorough review of the literature strongly suggests that the causes and correlates are so diverse as to argue against the search for a single remedy for absenteeism, one that could be optimally effective against all causes in a given situation. Instead, we are persuaded that effective control of absenteeism can come only from an in-depth analysis of the problem on a case-by-case basis; for example, absenteeism among white-collar clerical employees and blue-collar production workers in the same company may represent behavioural symptoms whose underlying causes differ and whose solutions are necessarily different also. In other words, it appears doubtful that there exists a universally applicable treatment for absenteeism, or one capable of eliminating it altogether. Despite the diversity of causes and correlates cited, it is nonetheless possible to integrate these variables by focusing on the underlying process leading to absenteeism.

In this context, presence at work or absenteeism may be viewed as the result of the following process: first, the individual has to believe that, if he wants to, he can show up for work; second, he has to believe that the consequences of being present at work are more favourable to him than the consequences of being absent; third, he has to be effectively able to get to work; and finally, the effects of organizational policies and procedures have to be perceived as more compelling towards presence than absence.

Intervention Strategies and their Effects
Strategies for reducing absenteeism may be classified as either "participative" or "hierarchical". Strategies termed "participative" are commonly based on a conception of human behaviour akin to the principles of Douglas McGregor's Theory Y, which assumes that employees are by nature highly motivated, diligent, autonomous, and so forth. For proponents of "participative" solutions, then, countering absenteeism becomes essentially a matter of providing the circumstances necessary for the realization of the employee's full potential. Concrete examples of such strategies include building semi-autonomous work teams, establishing flexible work schedules, adjusting levels of remuneration according to the technicality of the job or the efficiency of the employee, and so forth. In contrast, strategies labelled "hierarchical" emphasize the resumption by management of control over the attendance behaviour and output of employees. Typically, management seeks to exert control through an array of incentives and disincentives designed to increase the attractiveness of staying on the job and/or to increase the costs of absenteeism to the employee.

Unlike the participative approach, which encourages worker involvement in the solution to the problem, hierarchical strategies imply that the initiatives, the incentives, and the control over problem-solving processes originate with management. (In most cases, the "nerve center" of the control system is the immediate supervisor, who is called upon to communicate policies, arrange and preside over meetings, apply sanctions, and so on.) One might well wonder whether one type of strategy has been shown to be consistently superior to the other. At present, however, the state of empirical evidence simply does not permit us to state an unequivocal preference for either participative or hierarchical strategies. Nevertheless, our comprehensive review of theory and research on all facets of absenteeism, its diagnosis, and its treatment prompts the following tentative conclusions. First, we estimate that an intervention strategy of the "hierarchical" variety leading to a decrease of, say, 20% in the absenteeism rate should result in a savings of 79 per employee per year, and this only if the intervention succeeded in improving the quality of working life (as perceived by the employees) by 40%. What is more, the costs involved in a participative problem-solving orientation are doubtless higher than those of a hierarchical solution. All the same, we must not forget that an organization could well have valid reasons, other than a desire to reduce absenteeism, for wanting to improve the quality of working life. However, one of the built-in problems of this whole approach is that it mistakenly attributes absenteeism to all employees and proposes that all such problems be treated at the same time and in the same way, as if there existed a single cause and a single, universally applicable remedy. Our final conclusion, then, is that future research and intervention efforts should be more sharply focused than has been the case up to now.

At the very least, this change in orientation implies (1) identifying those individuals (or categories of individuals) most prone to absence, and (2) involving chronically absent employees, their supervisors, and union representatives in the analysis of causes and the exploration of avenues of solution.
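The cost arithmetic discussed above (days absent x daily wage x an overhead factor for benefits and replacement costs) can be sketched in a few lines. Only the 9 days/year figure comes from the article's conservative estimate; the wage and overhead values below are hypothetical assumptions for illustration.

```python
# Back-of-envelope yearly cost of absenteeism per employee.
# DAILY_WAGE and OVERHEAD are hypothetical assumptions, not the article's figures.

def yearly_cost_per_employee(days_absent: float,
                             daily_wage: float,
                             overhead: float) -> float:
    """Direct yearly cost of absenteeism for one employee."""
    return days_absent * daily_wage * overhead

DAYS_ABSENT = 9      # conservative estimate cited above (days/year)
DAILY_WAGE = 100.0   # assumed average daily wage (hypothetical)
OVERHEAD = 1.75      # assumed loading for benefits and replacement costs

cost = yearly_cost_per_employee(DAYS_ABSENT, DAILY_WAGE, OVERHEAD)
print(f"Estimated yearly cost per employee: {cost:,.2f}")  # 1,575.00
```

A 20% reduction in absenteeism would then save 0.2 x cost per employee per year, which is the kind of figure an organization would weigh against the cost of the intervention itself.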
Boundary interaction changing operators and dynamical correlations in quantum impurity problems
Recent developments have made possible the computation of equilibrium dynamical correlators in quantum impurity problems. In many situations, however, one is interested instead in correlators subject to a non-equilibrium initial preparation; this is the case, for instance, for the occupation probability in the double-well problem of dissipative quantum mechanics (DQM). We show in this paper how to handle this situation in the framework of integrable quantum field theories by introducing "boundary interaction changing operators". We determine the properties of these operators by using an axiomatic approach similar in spirit to what is done for form factors. This allows us to obtain new exact results; for instance, we find that at large times (or small ), the leading behaviour for g < 1/2 is , with the universal ratio.
Comment: 4 pages, revtex
Elliptical body fossils from the Fortunian (Early Cambrian) of Normandy (NW France)
Body fossils have been discovered in the Fortunian deposits of the Rozel Cape, in Normandy (NW France). The material consists of about 80 specimens preserved on a shale surface, recently observed at the base of a cliff at Cap Rozel, in the Cotentin region. The fossils, centimetric in size, have an elliptical outline with a peripheral bulge, generally without other conspicuous ornamentation, but sometimes showing concentric or radial lines, possibly of taphonomic origin. These body fossils are preserved parallel to the bedding plane, which is locally rich in horizontal trace fossils (e.g. Archaeonassa Fenton & Fenton, 1937, Helminthoidichnites Fitch, 1850, Helminthopsis Heer, 1877) and in complex treptichnid burrows (e.g. Treptichnus pedum (Seilacher, 1955)), sometimes associated with microbial mats. The sedimentological characteristics of these deposits (ripple marks, syneresis cracks) correspond to a shallow marine shelf environment with variable hydrodynamic energy in the intertidal zone: low for surfaces showing elliptical fossils and syneresis cracks, higher for surfaces with ripple marks. These new discoveries reveal the potential of the Fortunian strata of Normandy and provide new information about early Cambrian biocenoses.
Correlation functions of disorder operators in massive ghost theories
The two-dimensional ghost systems with negative integral central charge
received much attention in recent years for their role in a number of
applications and in connection with logarithmic conformal field theory. We
consider the free massive bosonic and fermionic ghost systems and concentrate
on the non-trivial sectors containing the disorder operators. A unified
analysis of the correlation functions of such operators can be performed for
ghosts and ordinary complex bosons and fermions. It turns out that these
correlators depend only on the statistics although the scaling dimensions of
the disorder operators change when going from the ordinary to the ghost case.
As known from the study of the ordinary case, the bosonic and fermionic
correlation functions are the inverse of each other and are exactly expressible
through the solution of a non-linear differential equation.
Comment: 8 pages, latex
Cerebellar rTMS disrupts predictive language processing
The human cerebellum plays an important role in language, amongst other cognitive and motor functions [1], but a unifying theoretical framework about cerebellar language function is lacking. In an established model of motor control, the cerebellum is seen as a predictive machine, making short-term estimations about the outcome of motor commands. This allows for flexible control, on-line correction, and coordination of movements [2]. The homogeneous cytoarchitecture of the cerebellar cortex suggests that similar computations occur throughout the structure, operating on different input signals and with different output targets [3]. Several authors have therefore argued that this 'motor' model may extend to cerebellar nonmotor functions [3], [4] and [5], and that the cerebellum may support prediction in language processing [6]. However, this hypothesis has never been directly tested. Here, we used the 'Visual World' paradigm [7], in which on-line processing of spoken sentence content can be assessed by recording the latencies of listeners' eye movements towards the objects mentioned. Repetitive transcranial magnetic stimulation (rTMS) was used to disrupt function in the right cerebellum, a region implicated in language [8]. After cerebellar rTMS, listeners showed delayed eye fixations to target objects predicted by sentence content, while there was no effect on eye fixations in sentences without predictable content. The prediction deficit was absent in two control groups. Our findings support the hypothesis that computational operations performed by the cerebellum may support prediction during both motor control and language processing.
Quantification of magma ascent rate through rockfall monitoring at the growing/collapsing lava dome of Volcán de Colima, Mexico
The most recent eruptive phase of Volcán de Colima, Mexico, started in 1998 and was characterized by dome growth at a variable effusion rate, interrupted intermittently by explosive eruptions. Between November 2009 and June 2011, activity at the dome was mostly limited to a lobe on the western side, where the dome had previously started overflowing the crater rim, generating rockfall events. As a consequence, no significant increase in dome volume was perceivable, and the rate of magma ascent, a crucial parameter for volcano monitoring and hazard assessment, could no longer be quantified by measuring the dome's dimensions. Here, we present alternative approaches to quantifying the magma ascent rate. We estimate the volume of individual rockfalls through detailed analysis of sets of photographs taken before and after individual rockfall events. The relationships between these volumes and both infrared images of the freshly exposed dome surface and the seismic signals produced by the rockfalls were then investigated. For larger rockfall events, the estimated volume correlates with the maximum surface temperature of the freshly exposed lava dome, with the mean temperature of the rockfall mass distributed over the slope, and with a proxy for seismic energy. It was therefore possible to calibrate the seismic signals against the volumes estimated from photographs, and the count of rockfalls over a given period was used to estimate the magma extrusion flux for the period investigated. Over the course of the measurement period, significant changes were observed in the number of rockfalls, the rockfall volumes, and hence the averaged extrusion rate.

The extrusion rate was not constant: it increased from 0.008±0.003 to 0.02±0.007 m3 s−1 during 2010 and dropped back to 0.008±0.003 m3 s−1 in March 2011. By June 2011, magma extrusion had come to a halt. The methodology presented represents a reliable tool for constraining the growth rate of domes that are repeatedly affected by partial collapses. Given the good correlation between thermal and seismic energies and rockfall volume, the seismic records associated with rockfalls (a continuous monitoring tool) can be calibrated to improve monitoring at volcanoes with active dome growth.
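The calibration step described above can be illustrated with a minimal sketch: fit a power law between photograph-derived rockfall volumes and a seismic-energy proxy by linear regression in log-log space, then invert it to convert seismic records into volumes. All numbers below are synthetic illustrations, not data from the study.

```python
import numpy as np

# Synthetic illustration of the calibration described above: rockfall
# volumes from photo analysis vs. a seismic-energy proxy (all values invented).
energy_proxy = np.array([1e4, 5e4, 2e5, 8e5, 3e6])    # seismic-energy proxy (arb. units)
volume_m3    = np.array([12., 35., 80., 210., 520.])  # photograph-derived volumes (m^3)

# Fit V = a * E^b by linear regression in log-log space.
b, log_a = np.polyfit(np.log10(energy_proxy), np.log10(volume_m3), 1)

def volume_from_energy(e):
    """Estimate rockfall volume (m^3) from the seismic-energy proxy."""
    return 10.0**log_a * e**b

# Once calibrated, the total volume of seismically recorded rockfalls over a
# period, divided by its duration, gives an average extrusion rate.
total_m3 = volume_from_energy(energy_proxy).sum()
rate_m3_per_s = total_m3 / (30 * 86400.0)  # e.g. averaged over a 30-day period
```

The log-log fit is the standard way to estimate power-law parameters from data spanning several orders of magnitude, which is why it suits a calibration between seismic energy and volume.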
Exact Friedel oscillations in the g=1/2 Luttinger liquid
A single impurity in the 1D Luttinger model creates a local modification of the charge density analogous to the Friedel oscillations. In this paper, we present an exact solution of the g = 1/2 case (the equivalent of the Toulouse point) at any temperature and impurity coupling, expressing the charge density in terms of a hypergeometric function. We find in particular that at , the oscillatory part of the density goes as at small distance and at large distance.
Comment: 1 reference added. 13 pages, harvmac
Monthly forecasting of French GDP: A revised version of the OPTIM model.
This paper presents a revised version of the OPTIM model, proposed by Irac and Sédillot (2002) and used at the Banque de France to predict the quarterly growth rate of French GDP for the current and next quarters. The model is designed to be used on a monthly basis, integrating monthly economic information through bridge models for both the supply and demand sides of GDP. For each GDP component, bridge equations are specified using a general-to-specific approach, implemented in the automated way proposed by Hoover and Perez (1999) and improved by Krolzig and Hendry (2001). This approach makes it possible to select explanatory variables from a large data set of hard and soft data. The final choice of equations relies on a recursive forecast study, which also helps to assess the forecasting performance of the revised OPTIM model in predicting aggregated GDP. This study is based on pseudo real-time forecasts taking publication lags into account. It turns out that the model outperforms benchmark models.
Keywords: GDP forecasting; bridge models; general-to-specific approach
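As a minimal sketch of the bridge-model idea (not the OPTIM specification itself), the following regresses quarterly GDP growth on monthly indicators aggregated to quarterly frequency. The indicator names and data are invented, and the general-to-specific variable search is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
months = 36                              # 12 quarters of monthly observations
survey = rng.normal(size=months)         # a "soft" monthly indicator (e.g. business survey)
output = rng.normal(size=months)         # a "hard" monthly indicator (e.g. industrial output)

# Step 1: aggregate the monthly indicators to quarterly averages.
survey_q = survey.reshape(-1, 3).mean(axis=1)
output_q = output.reshape(-1, 3).mean(axis=1)

# Step 2: a synthetic quarterly GDP growth series, then the bridge equation by OLS.
gdp_q = 0.4 + 0.5 * survey_q + 0.3 * output_q + 0.05 * rng.normal(size=12)
X = np.column_stack([np.ones_like(survey_q), survey_q, output_q])
beta, *_ = np.linalg.lstsq(X, gdp_q, rcond=None)

# Step 3: nowcast the latest quarter from its monthly readings.
nowcast = beta @ np.array([1.0, survey_q[-1], output_q[-1]])
```

In practice each GDP component would get its own bridge equation, and publication lags determine which monthly values are actually available when the forecast is made.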