
    Business information systems, between pure technique and management tool: a historical perspective through the discourses of IT managers from 1970 to 2000

    This research aims at understanding how the contribution of information technology within organizations is conceived. It is based on an analysis of the discourses of information technology managers between 1970 and 2000, confronting the archives of a professional association, the Club Informatique des Grandes Entreprises Françaises (CIGREF), with period publications and retrospective interviews. It traces the evolution of the representations of information technology in their historical context. Beyond the historical periods, it highlights two different conceptions of technology that guide the management of the IT function. On one side, IT systems are conceived as a pure technique that inherently carries virtues for business management: real-time technology, for example, should be adopted because it increases the speed of information flows, considered a key element of companies' competitiveness. The IT function then develops the technical potential available to the company. On the other side, IT is seen as a management tool that embeds methods, organizations and processes within a technical framework and thus drives organizational change. The IT function must then understand the business issues in order to propose a relevant technical arrangement. Finally, the research shows that the second conception is promoted in order to strengthen the legitimacy of the technology and of the IT function within organizations.

    Quantifying uncertainty in an industrial approach: an emerging consensus in an old epistemological debate

    Uncertainty is ubiquitous in modern decision-making supported by quantitative modeling. While uncertainty treatment was initially developed largely in risk or environmental assessment, it is gaining widespread interest in many industrial fields, generating knowledge and practices that go beyond the classical risk versus uncertainty or epistemic versus aleatory debates. On the basis of years of applied research in different sectors at the European scale, this paper discusses the emergence of a methodological consensus across a number of fields of engineering and applied science, such as metrology, safety and reliability, protection against natural risk, manufacturing statistics, numerical design and scientific computing. In relation to the applicable regulation and standards and to a quantity of interest relevant for decision-making, this approach involves the proper identification of key steps: the quantification (or modeling) of the sources of uncertainty, possibly involving an inverse approach; their propagation through a pre-existing physical-industrial model; the ranking of importance or sensitivity analysis; and sometimes a subsequent optimisation step. It aims at giving a consistent and industrially realistic framework for practical mathematical modeling, admittedly restricted to quantitative and quantifiable uncertainty, and illustrated on three typical examples. Axes of further research that prove critical for environmental or industrial issues are outlined: the information challenges posed by uncertainty modeling in the context of data scarcity, and the corresponding calibration and inverse probabilistic techniques that must be developed to make the best use of industrial or environmental monitoring and data acquisition systems under uncertainty; the numerical challenges, entailing considerable development of high-performance computing in the field; and the acceptability challenges in the context of the precautionary principle.
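    The generic workflow sketched in this abstract (quantify the input uncertainties, propagate them through a pre-existing model, then rank their importance for a decision-relevant quantity of interest) can be illustrated with a minimal Monte Carlo sketch in Python. The model, input distributions and parameter names below are illustrative placeholders, not taken from the paper.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)

# Hypothetical pre-existing physical-industrial model: a cheap placeholder
# standing in for an expensive simulator.
def physical_model(x1, x2, x3):
    return x1 ** 2 + 3.0 * x2 + 0.1 * x3 ** 3

# Step 1: quantify the sources of uncertainty as probability distributions
# (chosen arbitrarily here for illustration).
n = 10_000
x1 = rng.normal(loc=1.0, scale=0.2, size=n)      # e.g. a material property
x2 = rng.uniform(low=0.5, high=1.5, size=n)      # e.g. a load factor
x3 = rng.lognormal(mean=0.0, sigma=0.3, size=n)  # e.g. an environmental variable

# Step 2: propagate the uncertainty through the model by Monte Carlo sampling.
y = physical_model(x1, x2, x3)

# Quantity of interest for decision-making, e.g. a 95% quantile of the output.
print("mean =", y.mean(), " q95 =", np.quantile(y, 0.95))

# Step 3: rank importance with a simple sensitivity measure
# (Spearman rank correlation between each input and the output).
for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
    rho, _ = spearmanr(x, y)
    print(f"rank correlation of {name} with output: {rho:.2f}")
```

    A rank-correlation measure is only one of many possible importance rankings; the consensus described in the paper concerns the sequence of steps rather than any particular sensitivity index.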

    Uncertainty and sensitivity analysis in quantitative pest risk assessments: practical rules for risk assessors

    Quantitative models have several advantages over qualitative methods for pest risk assessments (PRA). Quantitative models do not require the definition of categorical ratings and can be used to compute numerical probabilities of entry and establishment, and to quantify spread and impact. These models are powerful tools, but they include several sources of uncertainty that need to be taken into account by risk assessors and communicated to decision makers. Uncertainty analysis (UA) and sensitivity analysis (SA) are useful for analyzing uncertainty in models used in PRA and are becoming more popular. However, these techniques should be applied with caution because several factors may influence their results. In this paper, a brief overview of UA and SA methods is given, and a series of practical rules is defined that risk assessors can follow to improve the reliability of UA and SA results. These rules are illustrated in a case study based on the infection model of Magarey et al. (2005), where the results of UA and SA are shown to be highly dependent on the assumptions made about the probability distributions of the model inputs.
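    The sensitivity of SA results to distributional assumptions, which the abstract emphasises, can be illustrated with a small Python sketch. The toy risk model, input names and distributions below are hypothetical placeholders and are not the Magarey et al. (2005) model; the point is only that the importance ranking of inputs can change when their assumed distributions change.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical establishment-risk model (illustrative logistic form only).
def risk_model(temperature, wetness_duration):
    z = 0.3 * (temperature - 15.0) + 0.1 * (wetness_duration - 10.0)
    return 1.0 / (1.0 + np.exp(-z))

def src_indices(t, w, y):
    """Standardised regression coefficients as a simple sensitivity measure."""
    X = np.column_stack([(t - t.mean()) / t.std(), (w - w.mean()) / w.std()])
    coef, *_ = np.linalg.lstsq(X, (y - y.mean()) / y.std(), rcond=None)
    return dict(zip(["temperature", "wetness"], np.round(coef, 2)))

n = 5_000

# Scenario A: narrow uncertainty on temperature, wide on wetness duration.
t_a = rng.normal(18.0, 1.0, n)
w_a = rng.uniform(0.0, 24.0, n)
print("Scenario A:", src_indices(t_a, w_a, risk_model(t_a, w_a)))

# Scenario B: wide uncertainty on temperature, narrow on wetness duration.
t_b = rng.uniform(5.0, 30.0, n)
w_b = rng.normal(12.0, 1.0, n)
print("Scenario B:", src_indices(t_b, w_b, risk_model(t_b, w_b)))
# The ranking of the two inputs typically flips between the scenarios,
# which is the kind of dependence the practical rules warn about.
```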

    Probing dynamics of HIV-1 nucleocapsid protein/target hexanucleotide complexes by 2-aminopurine

    The nucleocapsid protein (NC) plays an important role in HIV-1, mainly through interactions with the genomic RNA and its DNA copies. Though the structures of several complexes of NC with oligonucleotides (ODNs) are known, detailed information on the ODN dynamics in the complexes is missing. To address this, we investigated the steady-state and time-resolved fluorescence properties of 2-aminopurine (2Ap), a fluorescent adenine analog introduced at positions 2 and 5 of AACGCC and AATGCC sequences. In the absence of NC, 2Ap fluorescence was strongly quenched in the flexible ODNs, mainly through picosecond-to-nanosecond dynamic quenching by its neighboring bases. NC strongly restricted the ODN flexibility and 2Ap local mobility, impeding the collisions of 2Ap with its neighbors and thus reducing its dynamic quenching. Phe16→Ala and Trp37→Leu mutations largely decreased the ability of NC to affect the local dynamics of 2Ap at positions 2 and 5, respectively, while a fingerless NC was totally ineffective. The restriction of 2Ap local mobility was thus associated with the NC hydrophobic platform at the top of the folded fingers. Since this platform supports the NC chaperone properties, the restriction of the local mobility of the bases is likely a mechanistic component of these properties.

    Multiple Functions and Disordered Nature of Nucleocapsid Proteins of Retroviruses and Hepadnaviruses

    This chapter aims at presenting small viral proteins that orchestrate replication of the human immunodeficiency virus type 1 (HIV-1) and the human hepatitis B virus (HBV), two canonical examples of small human pathogens. The HIV-1 nucleocapsid protein (NC) and the C-terminal domain (CTD) of the HBV core protein (HBc) are essential structural components of the virus capsid ensuring protection of the viral genome; they also chaperone replication of the HIV-1 genomic RNA and the HBV DNA by a reverse-transcription mode, and later, these proteins kick-start virus morphogenesis. HIV-1 NC and HBV CTD belong to the family of intrinsically disordered proteins (IDP), a characteristic that makes a large number of molecular interactions possible. Although these viral proteins share little sequence homology, they are both rich in basic amino acids and endowed with RNA-binding and chaperoning activities. Similar viral RNA-binding proteins (vRBP) are also encoded by other virus families, notably flaviviruses, hantaviruses, and coronaviruses. We discuss how these vRBPs function in light of the abundant RBP family, which plays key physiological roles via multiple interactions with non-coding RNAs regulating immune defenses and cell stress. Moreover, these RBPs are flexible molecules allowing dynamic interactions with many RNA and protein partners in a semi-solid milieu favoring biochemical reactions.

    Specific implications of the HIV-1 nucleocapsid zinc fingers in the annealing of the primer binding site complementary sequences during the obligatory plus strand transfer

    Synthesis of the HIV-1 viral DNA by reverse transcriptase involves two obligatory strand transfer reactions. The second strand transfer corresponds to the annealing of the (−) and (+) DNA copies of the primer binding site (PBS) sequence, which is chaperoned by the nucleocapsid protein (NCp7). NCp7 modifies the (+)/(−)PBS annealing mechanism by activating a loop–loop kissing pathway that is negligible without NCp7. To characterize in depth the dynamics of the loop in the NCp7/PBS nucleoprotein complexes, we investigated the time-resolved fluorescence parameters of a (−)PBS derivative containing the fluorescent nucleoside analogue 2-aminopurine at positions 6, 8 or 10. The NCp7-directed switch of (+)/(−)PBS annealing towards the loop pathway was associated with a drastic restriction of the local DNA dynamics, indicating that NCp7 can ‘freeze’ PBS conformations competent for annealing via the loops. Moreover, the modifications of the PBS loop structure and dynamics that govern the annealing reaction were found to be strictly dependent on the integrity of the zinc finger hydrophobic platform. Our data suggest that the two NCp7 zinc fingers are required to ensure the specificity and fidelity of the second strand transfer, further underlining the pivotal role played by NCp7 in controlling the faithful synthesis of viral HIV-1 DNA.

    Identifying intrinsic variability in multivariate systems through linearised inverse methods

    A growing number of industrial risk studies include some form of treatment of the numerous sources of uncertainty affecting their conclusions; in the uncertainty treatment framework considered in this paper, the intrinsic variability of the uncertainty sources is modelled by a multivariate probability distribution. A key difficulty traditionally encountered at this stage is the highly limited sampling information directly available on the uncertain input variables. A possible solution lies in the integration of indirect information, such as data on other, more easily observable parameters linked to the parameters of interest through a well-known physical model. This leads to a probabilistic inverse problem: the objective is to identify a probability distribution whose dispersion is independent of the sample size, since intrinsic variability is at stake. To keep the number of (usually CPU-intensive) physical model runs inside the inverse algorithms at a reasonable level, a linear approximation in a Gaussian framework is investigated in this paper. First, a simple criterion is exhibited to ensure the identifiability of the model (i.e. the existence and uniqueness of a solution to the inverse problem). Then, the solution is computed via EM-type algorithms that take advantage of the missing-data structure of the estimation problem. The presentation includes a so-called ECME algorithm that can be used to overcome the slow convergence that may affect the standard EM algorithm. Numerical experiments on simulated and real data sets highlight the good performance of these algorithms, as well as some precautions to be taken when using this approach.
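    A minimal sketch of the kind of linearised Gaussian inverse problem described above is given below in Python: the unobserved inputs x are assumed to follow an unknown Gaussian distribution, observations are y = Hx + noise with a known linearised model H and known noise covariance, and an EM iteration estimates the mean and covariance of x. All matrices, dimensions and numbers are illustrative assumptions; the ECME variant discussed in the paper is not shown.

```python
import numpy as np

rng = np.random.default_rng(1)

# Dimensions (illustrative): 2 uncertain physical inputs, 3 observed quantities.
p, d, n = 2, 3, 500

# Linearised physical model y = H @ x + noise; H is assumed known
# (e.g. obtained by finite differences around a nominal point).
H = np.array([[1.0, 0.5],
              [0.2, 1.3],
              [0.7, 0.1]])
R = 0.05 * np.eye(d)  # known measurement-error covariance

# Synthetic "true" intrinsic variability, used only to generate test data.
mu_true = np.array([2.0, -1.0])
Sigma_true = np.array([[0.5, 0.1], [0.1, 0.3]])
X = rng.multivariate_normal(mu_true, Sigma_true, size=n)
Y = X @ H.T + rng.multivariate_normal(np.zeros(d), R, size=n)

# EM iterations: estimate the distribution of the unobserved inputs X
# from the indirect observations Y.
mu, Sigma = np.zeros(p), np.eye(p)
for _ in range(200):
    # E-step: Gaussian posterior of each x_i given y_i.
    S = H @ Sigma @ H.T + R
    K = Sigma @ H.T @ np.linalg.inv(S)
    M = mu + (Y - mu @ H.T) @ K.T          # posterior means, shape (n, p)
    V = Sigma - K @ H @ Sigma              # common posterior covariance
    # M-step: update the population mean and covariance.
    mu = M.mean(axis=0)
    Dev = M - mu
    Sigma = V + Dev.T @ Dev / n

print("estimated mu:", mu)
print("estimated Sigma:\n", Sigma)
```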

    Robustness of structural reliability estimation: a new algorithm exploiting monotonicity, illustrated on an industrial example

    As part of the work to justify the in-service integrity of the vessel of a pressurized water reactor, probabilistic studies of the risk of brittle fracture of the vessel subjected to operating and accidental transients have been carried out for many years in support of, and in addition to, the regulatory deterministic analyses performed by EDF. These studies, which couple a physical model (for the load on and the resistance of the component) with a probabilistic model of the uncertainties, are intended to estimate the reliability of the structure; they face the problem of extremely small probabilities, and the classical challenge of optimizing the trade-off between computation time and the robustness of the result. A new algorithm, fully exploiting the monotonicity of the failure function, is proposed and illustrated on a simplified case study linked to an industrial issue, where it makes it possible to robustly check FORM-SORM type reliability methods.
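    How monotonicity of the failure function can be exploited can be sketched with a small Monte Carlo illustration in Python. The toy failure function, dimensions and sample sizes below are assumptions made for illustration only, and this is not the algorithm proposed in the paper: it merely shows that, when the failure function is componentwise monotone, many samples can be classified as safe or failed by dominance against previously evaluated points, without running the expensive model.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical monotone failure function in standard normal space:
# g decreases in each input, failure when g(u) <= 0.
# (Illustrative stand-in for an expensive fracture-mechanics code.)
def g(u):
    return 2.5 - u[0] - 0.8 * u[1]

n = 5_000
U = rng.standard_normal((n, 2))

safe_pts, fail_pts = [], []
n_fail, n_calls = 0, 0

for u in U:
    # Monotonicity: a point componentwise below a known safe point is safe;
    # a point componentwise above a known failed point has failed.
    if any(np.all(u <= s) for s in safe_pts):
        continue
    if any(np.all(u >= f) for f in fail_pts):
        n_fail += 1
        continue
    # Otherwise the (expensive) model must actually be run.
    n_calls += 1
    if g(u) <= 0:
        n_fail += 1
        fail_pts.append(u)
    else:
        # Keep only non-dominated safe points to speed up the dominance test.
        safe_pts = [s for s in safe_pts if not np.all(s <= u)] + [u]

print(f"estimated failure probability: {n_fail / n:.2e}")
print(f"model calls: {n_calls} out of {n} samples")
```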