59 research outputs found

    Crisis and Governance: Valuation of Skills as a Factor of Organizational Cohesion

    Purpose: The growth of international exchanges and the liberalization of many economic sectors have destabilized the employer–employee relationship. At any moment it may be broken to let the company meet economic or financial requirements. In such a context, how can an organization that may rapidly exclude its employees keep them motivated? The aim of this analysis is to evaluate to what extent the recognition of skills offers a possible solution: a worker invests himself in his work because doing so reinforces his employability, which in turn increases the probability of finding new employment if he is suddenly laid off.
    Methodology and approach: This reflection rests on two hypotheses: (1) any organization needs motivated employees in order to gain a competitive advantage; (2) employee motivation can be obtained even when the objectives of the employee and those of the company diverge. The homogeneity of the methods used and the heterogeneity of the goals constitute the new challenge organizations face in ensuring their durability. Given these hypotheses and the aim of the analysis, a cross-disciplinary study spanning sociology, management, and labor law was conducted to highlight a possible adhesion of employees to the objectives of an organization evolving in an unstable environment.
    Findings: Skills recognition and the reinforcement of employability appear to answer the problems posed. However, the implementation of such a process depends on various elements, among them a fair definition of the criteria that give employees the feeling of being justly recognized.
    Implications for further research: The study's results could be validated through a comparative study of two similar organizations, one implementing a process of skills recognition and the other preserving a more traditional approach to human resources management. For a better evaluation of the impact of the difference between the two approaches, the companies tested would be selected within the service industry.

    On the Application of Formal Techniques for Dependable Concurrent Systems

    The pervasiveness of computer systems in virtually every aspect of daily life entails a growing dependence on them. These systems have become integral parts of our societies as we continue to use and rely on them on a daily basis. This trend of digitalization is set to carry on, bringing forth the question of how dependable these systems are. Our dependence on these systems is in acute need of justification based on rigorous and systematic methods, as recommended by internationally recognized safety standards. Ensuring that the systems we depend on meet these recommendations is further complicated by the increasingly widespread use of concurrent systems, which are notoriously hard to analyze due to the substantial increase in complexity that the interactions between different processing entities engender. In this thesis, we introduce improvements on existing formal analysis techniques to aid in the development of dependable concurrent systems. Applying formal analysis techniques can help us avoid incidents with catastrophic consequences by uncovering their triggering causes well in advance. This work focuses on three types of analyses: data-flow analysis, model checking, and error propagation analysis. Data-flow analysis is a general static analysis technique aimed at predicting the values that variables can take at various points in a program. Model checking is a well-established formal analysis technique that verifies whether a program satisfies its specification. Error propagation analysis (EPA) is a dynamic analysis whose purpose is to assess a program's ability to withstand unexpected behaviors of external components. We leverage data-flow analysis to assist in the design of highly available distributed applications. Given an application, our analysis infers rules to distribute its workload across multiple machines, improving the availability of the overall system. Furthermore, we propose improvements to both explicit and bounded model checking techniques by exploiting the structure of the specification under consideration. The core idea behind these improvements lies in the ability to abstract away aspects of the program that are not relevant to the specification, effectively shortening the verification time. Finally, we present a novel approach to EPA based on symbolic modeling of execution traces. The symbolic scheme uses a dynamic sanitizing algorithm to eliminate effects of non-determinism in the execution traces of multi-threaded programs. The proposed approach is the first to achieve a 0% rate of false positives for multi-threaded programs. The work in this thesis constitutes an improvement over existing formal analysis techniques that can aid in the development of dependable concurrent systems, particularly with respect to availability and safety.
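
    The abstract does not detail the analyses themselves. As a minimal sketch of what the data-flow analysis it mentions does (predicting the values variables can take at program points), the following toy constant-propagation example is illustrative only; the instruction encoding and the helper names (join, transfer, TOP) are invented here and are not from the thesis:

        # Toy constant propagation: abstract values are either a known
        # int constant or TOP (unknown / conflicting across paths).
        TOP = object()

        def join(a, b):
            """Merge facts from two control-flow paths: equal constants survive."""
            if a is None: return b          # None = path not yet analyzed
            if b is None: return a
            return a if a == b else TOP

        def transfer(env, instr):
            """Apply one instruction (dest, op, args) to the abstract state."""
            env = dict(env)
            dest, op, args = instr
            if op == "const":
                env[dest] = args[0]
            elif op == "add":
                x, y = (env.get(a, TOP) for a in args)
                env[dest] = x + y if TOP not in (x, y) else TOP
            return env

        # Two branches assign the same constant, so 'x' stays known after
        # the join; a conflicting pair would have become TOP instead.
        then_env = transfer({}, ("x", "const", [1]))
        else_env = transfer({}, ("x", "const", [1]))
        merged = {"x": join(then_env.get("x"), else_env.get("x"))}
        print(transfer(merged, ("y", "add", ["x", "x"])))  # {'x': 1, 'y': 2}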

    Study of the 25 °C Isotherm of the Quasi-Quaternary System H2O - Zn(NO3)2·6H2O - Cu(NO3)2·3H2O - NH4NO3. II. Isopleths: 41 mass % Cu(NO3)2, [mass of NH4NO3] = -10/9 [mass of Zn(NO3)2] + 100, [mass of H2O] = 0.5702 [mass of Zn(NO3)2] + 0.2879 [mass of

    The solid-liquid equilibria of the quasi-quaternary system H2O - Zn(NO3)2·6H2O - Cu(NO3)2·3H2O - NH4NO3 were studied at 25 °C using a synthetic method based on conductivity measurements. Three isoplethic sections have been established at 25 °C, and the stable solid phases that appear are NH4NO3 (IV), Zn(NO3)2·6H2O, Cu(NO3)2·3H2O, and the metastable Cu(NO3)2·2.5H2O. Neither double salts nor mixed crystals are observed in this temperature and composition range.
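
    The isoplethic sections named in the title are planes of fixed linear mass relations in the composition space. Reconstructed from the (truncated) title, with m(X) denoting the mass of component X, they read approximately as follows; the final term of the third relation is cut off in the record and is left unspecified:

        % Isopleth constraints as given in the truncated title
        \begin{align*}
          m(\mathrm{Cu(NO_3)_2}) &= 41\ \text{mass \%},\\
          m(\mathrm{NH_4NO_3})   &= -\tfrac{10}{9}\, m(\mathrm{Zn(NO_3)_2}) + 100,\\
          m(\mathrm{H_2O})       &= 0.5702\, m(\mathrm{Zn(NO_3)_2}) + 0.2879\, m(\,\cdot\,)
          \quad \text{(truncated in the record)}
        \end{align*}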

    Influence of renal replacement modalities on amikacin population pharmacokinetics in critically ill patients on continuous renal replacement therapy

    The objective of this study was to describe amikacin pharmacokinetics (PK) in critically ill patients receiving equal doses (30 ml/kg of body weight/h) of continuous venovenous hemofiltration (CVVH) and continuous venovenous hemodiafiltration (CVVHDF). Patients receiving amikacin and undergoing CVVH or CVVHDF were eligible. Population pharmacokinetic analysis and Monte Carlo simulation were undertaken using the Pmetrics software package for R. Sixteen patients (9 undergoing CVVH, 11 undergoing CVVHDF) and 20 sampling intervals were analyzed. A two-compartment linear model best described the data. Patient weight was the only covariate associated with drug clearance. The mean ± standard deviation parameter estimates were 25.2 ± 17.3 liters for the central volume, 0.89 ± 1.17 h⁻¹ for the rate constant for drug distribution from the central to the peripheral compartment, 2.38 ± 6.60 h⁻¹ for the rate constant for drug distribution from the peripheral to the central compartment, 4.45 ± 2.35 liters/h for hemodiafiltration clearance, and 4.69 ± 2.42 liters/h for hemofiltration clearance. Dosing simulations for amikacin supported the use of high dosing regimens (≥25 mg/kg) and extended intervals (36 to 48 h) for most patients when considering PK/pharmacodynamic (PD) targets of a maximum concentration in plasma (Cmax)/MIC ratio of ≥8 and a minimal concentration o
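
    As a hedged illustration of what a dosing simulation over this two-compartment model involves (a minimal sketch using the mean parameter estimates above, not the Pmetrics model: the 70-kg patient, the 25 mg/kg instantaneous IV bolus, the MIC of 4 mg/liter, and the use of the hemofiltration clearance as the sole elimination route are all simplifying assumptions made here):

        # Two-compartment IV-bolus sketch using the mean estimates above
        # (illustrative only; the study used Pmetrics in R with infusion
        # dosing and patient-specific covariates).
        V1   = 25.2   # central volume (liters)
        k_cp = 0.89   # central -> peripheral rate constant (1/h)
        k_pc = 2.38   # peripheral -> central rate constant (1/h)
        CL   = 4.69   # hemofiltration clearance (liters/h), assumed sole elimination

        dose_mg = 25 * 70                # assumption: 25 mg/kg bolus, 70-kg patient
        a_c, a_p = float(dose_mg), 0.0   # drug amounts (mg) in central/peripheral
        dt, t_end, t = 0.001, 48.0, 0.0  # forward-Euler step and horizon (hours)

        c_max = 0.0
        while t < t_end:
            c_max = max(c_max, a_c / V1)  # central concentration (mg/liter)
            da_c = (-(CL / V1) * a_c - k_cp * a_c + k_pc * a_p) * dt
            da_p = (k_cp * a_c - k_pc * a_p) * dt
            a_c, a_p, t = a_c + da_c, a_p + da_p, t + dt
        print(f"Cmax ~ {c_max:.1f} mg/liter; Cmax/MIC ~ {c_max / 4:.1f} at an assumed MIC of 4 mg/liter")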

    TraceSanitizer: Eliminating the Effects of Non-Determinism on Error Propagation Analysis

    Modern computing systems typically relax execution determinism, for instance by allowing the CPU scheduler to interleave the execution of several threads. While beneficial for performance, execution non-determinism affects programs' execution traces and hampers the comparability of repeated executions. We present TraceSanitizer, a novel approach for execution trace comparison in Error Propagation Analyses (EPA) of multi-threaded programs. TraceSanitizer can identify and compensate for non-determinisms caused either by dynamic memory allocation or by non-deterministic scheduling. We formulate a condition under which TraceSanitizer is guaranteed to achieve a 0% false positive rate, and automate its verification using Satisfiability Modulo Theory (SMT) solving techniques. TraceSanitizer is comprehensively evaluated using execution traces from the PARSEC and Phoenix benchmarks. In contrast with other approaches, TraceSanitizer eliminates false positives without increasing the false negative rate (for a specific class of programs), with reasonable performance overheads.
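
    To make the dynamic-memory case concrete, here is a minimal sketch of one idea behind such sanitizing: raw heap addresses vary across runs, so renaming each distinct address to a stable symbolic name in order of first appearance makes two traces of the same program comparable event-by-event. This is an invented simplification, not the actual TraceSanitizer algorithm, which also handles scheduling non-determinism; the trace format and names below are assumptions:

        # Canonicalize heap addresses in textual execution traces so that
        # allocation-address non-determinism no longer causes spurious diffs.
        import re

        ADDR = re.compile(r"0x[0-9a-fA-F]+")

        def sanitize(trace):
            """Rewrite each distinct address to ptrN in order of first appearance."""
            names = {}
            def rename(match):
                addr = match.group(0)
                if addr not in names:
                    names[addr] = f"ptr{len(names)}"
                return names[addr]
            return [ADDR.sub(rename, event) for event in trace]

        # Same program, two runs, different heap layouts: after sanitizing,
        # the traces compare equal and no false positive is reported.
        run1 = ["malloc -> 0x7f3a10", "write 0x7f3a10 = 42", "free 0x7f3a10"]
        run2 = ["malloc -> 0x55e2c8", "write 0x55e2c8 = 42", "free 0x55e2c8"]
        assert sanitize(run1) == sanitize(run2)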

    Internal-friction study of directional order in the concentrated Au-Ni alloy

    SIGLE record, available from INIST-CNRS (Institut de l'Information Scientifique et Technique), CNRS T Bordereau, France