
    Collective Bargaining, Mutuality, and Workers Participation in Management: An International Analysis

    Efforts to develop workers' participation in management have followed two main paths. One, exemplified by collective bargaining, is adversarial; the other, represented by autonomous work groups, producer cooperation and (within limits) co-determination, is mutualistic or consensual in nature. In the United States and Great Britain the adversarial model generally prevails in unionized plants and industries, whereas the mutualistic model has been adopted in Yugoslavia's self-management system, Israel's kibbutzim and the producer cooperatives of a number of countries. The Federal Republic of Germany, Sweden and Japan, however, offer examples of systems in which the two approaches exist side by side within the same industry or firm, and some arrangements combine adversarial and mutualistic conceptions.

    Although the United States collective bargaining system is adversarial in foundation, some employers and unions have moved toward the mutualistic approach by creating joint cooperation committees or programmes, usually with well-defined tasks. The results of such cooperation on productivity, cost reduction and waste control have been modest. Most American experiments with job enrichment and job enlargement, as well as with semi-autonomous work groups, have taken place in non-union establishments. In principle the adversarial approach can deal with nearly all of the problems and issues handled by the mutualistic approach, but American unions have confined their efforts to tight organization and to regulating the terms of employment. The experience of the Federal Republic of Germany, Sweden and several other European countries shows that the scope of collective bargaining can extend beyond the traditional questions of wages and conditions of employment while being linked, at the same time, to mutualistic programmes at the workplace and on the supervisory board. Goals and attitudes play a decisive role in these cases. It is significant that mutualism is often the result of political action or of a redistribution of political power across society; but will has also contributed decisively to shaping both adversarial and mutualistic industrial relations.

    Since the end of World War II, worker participation in management has expanded to varying degrees, in different forms, and at different levels. In Western Europe both collective bargaining and mutualism have expanded dramatically, and workers' participation in management seems destined to advance. In Britain and North America the adversarial system of collective bargaining has predominated and mutualistic schemes have remained a small minority; the attitudinal climate has not been conducive to consensus thinking in industrial relations.

    Sensitivity And Out-Of-Sample Error in Continuous Time Data Assimilation

    Data assimilation refers to the problem of finding trajectories of a prescribed dynamical model in such a way that the output of the model (usually some function of the model states) follows a given time series of observations. Typically, though, these two requirements cannot both be met at the same time: tracking the observations is not possible without the trajectory deviating from the proposed model equations, while adherence to the model requires deviations from the observations. Thus, data assimilation faces a trade-off. In this contribution, the sensitivity of the data assimilation with respect to perturbations in the observations is identified as the parameter which controls the trade-off. A relation between the sensitivity and the out-of-sample error is established which makes it possible to calculate the latter under operational conditions. A minimum out-of-sample error is proposed as a criterion for setting an appropriate sensitivity and thus settling the trade-off. Two approaches to data assimilation are considered, namely variational data assimilation and Newtonian nudging, also known as synchronisation. Numerical examples demonstrate the feasibility of the approach.
    Comment: submitted to Quarterly Journal of the Royal Meteorological Society
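
    A minimal sketch of Newtonian nudging (synchronisation) may make the trade-off concrete. The Lorenz-63 model, the Euler step, the gain value and the choice of observing only the first component below are illustrative assumptions, not the configuration studied in the paper.

        import numpy as np

        def lorenz63(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            """Tendency of the Lorenz-63 model (used here only as a toy system)."""
            return np.array([sigma * (x[1] - x[0]),
                             x[0] * (rho - x[2]) - x[1],
                             x[0] * x[1] - beta * x[2]])

        def run(x0, n, dt, obs=None, gain=0.0):
            """Euler integration; if observations of x[0] are given, relax towards them."""
            xs = [np.array(x0, dtype=float)]
            for k in range(n):
                dx = lorenz63(xs[-1])
                if obs is not None:
                    dx[0] += gain * (obs[k] - xs[-1][0])   # nudging (synchronisation) term
                xs.append(xs[-1] + dt * dx)
            return np.array(xs)

        dt, n = 0.01, 2000
        rng = np.random.default_rng(0)
        truth = run([1.0, 1.0, 1.05], n, dt)                      # reference trajectory
        obs = truth[1:, 0] + 0.5 * rng.standard_normal(n)         # noisy observations of x only
        assim = run([5.0, -5.0, 20.0], n, dt, obs=obs, gain=5.0)  # assimilating run

        # A larger gain makes the trajectory track the (noisy) observations more
        # closely but lets more observation noise into the state and increases the
        # departure from the pure model dynamics; a smaller gain does the reverse.
        obs_misfit = np.mean((assim[1:, 0] - obs) ** 2)
        state_err = np.mean((assim - truth) ** 2)
        print(f"mean-square obs misfit: {obs_misfit:.3f}, mean-square state error: {state_err:.3f}")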

    On Variational Data Assimilation in Continuous Time

    Variational data assimilation in continuous time is revisited. The central techniques applied in this paper are in part adopted from the theory of optimal nonlinear control. Alternatively, the investigated approach can be considered as a continuous-time generalisation of what is known as weak-constraint four-dimensional variational assimilation (WC-4DVAR) in the geosciences. The technique makes it possible to assimilate trajectories in the case of partial observations and in the presence of model error. Several mathematical aspects of the approach are studied. Computationally, it amounts to solving a two-point boundary value problem. For imperfect models, the trade-off between small dynamical error (i.e. the trajectory obeys the model dynamics) and small observational error (i.e. the trajectory closely follows the observations) is investigated. For (nearly) perfect models, this trade-off turns out to be (nearly) trivial in some sense, yet allowing for some dynamical error is shown to have positive effects even in this situation. The presented formalism is dynamical in character; no assumptions need to be made about the presence (or absence) of dynamical or observational noise, let alone about their statistics.
    Comment: 28 pages, 12 figures
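
    As a sketch only (the notation and the weights Q and R are generic, not necessarily those of the paper), the continuous-time weak-constraint cost functional behind this kind of approach can be written as

        \[
          J[x] = \int_0^T \Big( \tfrac{1}{2}\, \| \dot{x}(t) - f(x(t)) \|_{Q^{-1}}^2
                 + \tfrac{1}{2}\, \| y(t) - h(x(t)) \|_{R^{-1}}^2 \Big)\, dt ,
        \]

    where f is the model vector field, h the observation operator and y the observations; the first term penalises dynamical (model) error and the second observational error. Requiring J to be stationary under variations of the trajectory x(t) yields Euler-Lagrange equations with natural boundary conditions at t = 0 and t = T, which is the two-point boundary value problem referred to in the abstract.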

    Data assimilation in the low noise regime with application to the Kuroshio

    On-line data assimilation techniques such as ensemble Kalman filters and particle filters lose accuracy dramatically when presented with an unlikely observation. Such an observation may be caused by an unusually large measurement error or reflect a rare fluctuation in the dynamics of the system. Over a long enough span of time it becomes likely that one or several of these events will occur. Often they are signatures of the most interesting features of the underlying system, and their prediction becomes the primary focus of the data assimilation procedure. The Kuroshio or Black Current that runs along the eastern coast of Japan is an example of such a system. It undergoes infrequent but dramatic changes of state between a small meander, during which the current remains close to the coast of Japan, and a large meander, during which it bulges away from the coast. Because of the important role that the Kuroshio plays in distributing heat and salinity in the surrounding region, prediction of these transitions is of acute interest. Here we focus on a regime in which both the stochastic forcing on the system and the observational noise are small. In this setting, large deviation theory can be used to understand why standard filtering methods fail and to guide the design of more effective data assimilation techniques. Motivated by our analysis, we propose several data assimilation strategies capable of efficiently handling rare events such as the transitions of the Kuroshio. These techniques are tested on a model of the Kuroshio and shown to perform much better than standard filtering methods.
    Comment: 43 pages, 12 figures
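
    A toy illustration (not the authors' method) of why a standard bootstrap particle filter struggles in the low noise regime: when an unlikely observation arrives, almost all of the weight collapses onto a single particle. The scalar random-walk model, noise levels and the outlier value below are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(1)
        n_particles, dyn_std, obs_std = 500, 0.05, 0.05    # "low noise" regime

        particles = rng.normal(0.0, 0.1, n_particles)      # initial ensemble near the truth
        observations = [0.00, 0.02, 0.01, 1.50]            # last value: a rare, large excursion

        for y in observations:
            # propagate each particle under the (random walk) model
            particles = particles + rng.normal(0.0, dyn_std, n_particles)
            # weight by the Gaussian observation likelihood
            logw = -0.5 * ((y - particles) / obs_std) ** 2
            w = np.exp(logw - logw.max())
            w /= w.sum()
            ess = 1.0 / np.sum(w ** 2)                      # effective sample size
            print(f"obs = {y:+.2f}   effective sample size = {ess:6.1f} / {n_particles}")
            # multinomial resampling
            particles = particles[rng.choice(n_particles, n_particles, p=w)]

    For the routine observations the effective sample size stays reasonable, but the outlier reduces it to essentially one particle, which is the failure mode that the large-deviation-based strategies in the paper are designed to avoid.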

    Improving Incremental Balance in the GSI 3DVAR Analysis System

    The Gridpoint Statistical Interpolation (GSI) analysis system is a unified global/regional 3DVAR analysis code that has been under development for several years at the National Centers for Environmental Prediction (NCEP)/Environmental Modeling Center. It has recently been placed into operations at NCEP in both the global and North American data assimilation systems (GDAS and NDAS). An important aspect of this development has been improving the balance of the analysis produced by GSI. The improved balance between variables has been achieved through the inclusion of a Tangent Linear Normal Mode Constraint (TLNMC). The TLNMC has proven to be robust and effective; as part of the global GSI system it has resulted in substantial improvements in data assimilation both at NCEP and at the NASA Global Modeling and Assimilation Office (GMAO).
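
    As a purely generic illustration of how a balance constraint enters a variational analysis (the actual TLNMC in GSI is a specific tangent-linear normal-mode procedure and need not take exactly this form), such a constraint is often written as an additional term in the incremental 3DVAR cost function,

        \[
          J(\delta x) = \tfrac{1}{2}\, \delta x^{\mathsf T} B^{-1} \delta x
                      + \tfrac{1}{2}\, (H \delta x - d)^{\mathsf T} R^{-1} (H \delta x - d)
                      + J_c(\delta x),
        \]

    where \delta x is the analysis increment, B and R are the background and observation error covariances, H is the (linearised) observation operator, d is the innovation, and J_c penalises the fast (gravity-wave) normal-mode tendencies of the increment as computed with a tangent-linear model, thereby filtering unbalanced structures from the analysis.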

    National service: the sixties meet the nineties

    City Year is a community service program in Boston, MA, envisioned to help empower citizens to solve their own problems in the area. The program's citizen participation component reincarnates an ideology upheld by the youth in the 1960s.

    What is the correct cost functional for variational data assimilation?

    Variational approaches to data assimilation, and weak-constraint four-dimensional variational assimilation (WC-4DVar) in particular, are important in the geosciences but also in other communities (often under different names). The cost functions and the resulting optimal trajectories may have a probabilistic interpretation, for instance by linking data assimilation with maximum a posteriori (MAP) estimation. This is possible in particular if the unknown trajectory is modelled as the solution of a stochastic differential equation (SDE), as is increasingly the case in weather forecasting and climate modelling. In this situation, the MAP estimator (or “most probable path” of the SDE) is obtained by minimising the Onsager–Machlup functional. Although this fact is well known, there seems to be some confusion in the literature, with the energy (or “least squares”) functional sometimes being claimed to yield the most probable path. The first aim of this paper is to address this confusion and show that the energy functional does not, in general, provide the most probable path. The second aim is to discuss the implications in practice. Although the mentioned results pertain to stochastic models in continuous time, they do have consequences in practice, where SDEs are approximated by discrete-time schemes. It turns out that using an approximation to the SDE and calculating its most probable path does not necessarily yield a good approximation to the most probable path of the SDE proper. This suggests that even in discrete time a version of the Onsager–Machlup functional should be used, rather than the energy functional, at least if the solution is to be interpreted as a MAP estimator.
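
    For concreteness, in the simplest setting of an SDE dX_t = f(X_t) dt + dW_t with additive unit-covariance noise (a sketch only; the paper's setting may be more general), the two functionals in question are

        \[
          I_{\mathrm{energy}}[x] = \int_0^T \tfrac{1}{2}\, \| \dot{x}(t) - f(x(t)) \|^2 \, dt ,
          \qquad
          I_{\mathrm{OM}}[x] = \int_0^T \Big( \tfrac{1}{2}\, \| \dot{x}(t) - f(x(t)) \|^2
                 + \tfrac{1}{2}\, \nabla \cdot f(x(t)) \Big)\, dt .
        \]

    It is the Onsager–Machlup functional I_OM, with its additional divergence term, that characterises the most probable path; since the divergence term is generally not constant along trajectories, minimising the energy functional can single out a different path.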

    Self energies of the pion and the delta isobar from the ^3He(e,e'pi^+)^3H reaction

    In a kinematically complete experiment at the Mainz microtron MAMI, pion angular distributions of the ^3He(e,e'π^+)^3H reaction have been measured in the excitation region of the Δ resonance to determine the longitudinal (L), transverse (T), and LT interference parts of the differential cross section. The data are described only after introducing self-energy modifications of the pion and Δ-isobar propagators. Using Chiral Perturbation Theory (ChPT) to extrapolate the pion self-energy as inferred from the measurement on the mass shell, we deduce a reduction of the π^+ mass of Δm_{π^+} = (-1.7^{+1.7}_{-2.1}) MeV/c^2 in the neutron-rich nuclear medium at a density of ρ = (0.057^{+0.085}_{-0.057}) fm^{-3}. Our data are consistent with the Δ self-energy determined from measurements of π^0 photoproduction from ^4He and heavier nuclei.
    Comment: Elsart, 12 pages and 4 figures, Correspondent: Professor Dr. Dr. h.c. mult. Achim Richter, [email protected], submitted to Phys. Rev. Lett.