
    An N-dimensional Stochastic Control Algorithm for Electricity Asset Management on PC cluster and Blue Gene Supercomputer

    Management of French electricity production to control cost while satisfying demand leads to a stochastic optimization problem whose main sources of uncertainty are the demand load, electricity and fuel market prices, hydraulicity (water inflows), and the availability of the thermal production assets. Stochastic dynamic programming is an attractive solution method, but it is both CPU- and memory-intensive: parallelization is required to achieve speedup and scale-up, and to handle a large number of stocks (N) and a large number of uncertainty factors. This paper introduces a distribution of an N-dimensional stochastic dynamic programming application on PC clusters and the IBM Blue Gene/L supercomputer. This required parallelizing input and output file accesses from thousands of processors, load-balancing an N-dimensional cube of data and computation that evolves at each time step, and computing Monte Carlo simulations that require data spread across many separate files managed by different processors. Finally, a successful experiment on a 7-stock problem using up to 8192 processors validates this distribution strategy.
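
    At its core, such a method runs a backward dynamic-programming recursion over a discretized stock grid, taking an expectation over uncertainty scenarios at each time step. The following Python sketch illustrates the idea for a single stock with a toy price model; the grid sizes, the price scenarios, and all names are illustrative assumptions, not the paper's actual N-dimensional parallel implementation.

        import numpy as np

        # Toy backward stochastic-dynamic-programming sketch for one storage
        # asset. Assumed, not from the paper: 50 stock levels, 10 time steps,
        # equiprobable price scenarios, bounded injection/withdrawal per step.
        T, n_levels = 10, 50
        levels = np.linspace(0.0, 100.0, n_levels)   # discretized stock grid
        prices = np.array([20.0, 30.0, 40.0])        # equiprobable price scenarios
        max_move = 10.0                              # max stock change per step

        value = np.zeros(n_levels)                   # terminal condition V_T = 0
        for t in reversed(range(T)):
            new_value = np.empty(n_levels)
            for i, s in enumerate(levels):
                best = -np.inf
                for j, s_next in enumerate(levels):  # candidate next levels
                    move = s_next - s                # >0 buy/inject, <0 sell/withdraw
                    if abs(move) > max_move:
                        continue
                    # expected immediate cash flow plus value-to-go
                    best = max(best, np.mean(-move * prices) + value[j])
                new_value[i] = best
            value = new_value

        print("value of an empty stock at t=0:", value[0])

    In the paper's setting the grid is N-dimensional and far too large for one machine, which is what motivates distributing the cube of stock levels and its per-time-step updates over thousands of processors.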

    Regulation through ISO standards

    From electrical equipment through accounting statements to information and communication technologies, few economic sectors escape so-called technical standards. Technical standards, of which the ISO standards are certainly the best known (notably ISO 9000 and 14000), are production standards intended to improve the quality, safety, or compatibility of goods and services. Although these standards often pass for an obscure, complex matter engaging only a few initiates in the fields concerned, they nonetheless constitute a genuine process of economic regulation on a global scale. The term ISO is polysemous: it designates both the international organization (International Organization for Standardization) within which the standards are adopted and the standards themselves. Created in 1946 as the successor to the International Federation of the National Standardizing Associations, ISO has a federative structure that brings together the national agencies of 148 countries. The impressive number of published standards (13,700 since 1947) attests to the importance of these provisions for the economic sectors concerned. At a time of debates over globalization, it is worth examining an enterprise of international regulation which, far from reviving classic oppositions (economic liberalism versus regulation, or private interests versus public authority), seems on the contrary to achieve an original syncretism. International standardization policies bring together a multitude of public and private actors: international organizations, national administrations, agencies, research centers, firms, associations, and so on. This diversity of actors does not, however, exhaust the notion of public policy; rather, it invites us to reconceive it in a way that brings out both the originality of its forms and its actual reach in the cases of interest here. In the admittedly rather dense contribution presented here, we aim both to question the classic analytical approaches to these policies and to propose a somewhat different framework, based on the idea of "format" borrowed from Rémi Barbier and on the monopolization of formats. This framework makes it possible to grasp both how prescriptions are drawn up and how those prescriptions are put to the test in the work situations that operationalize them.

    A new preconditioner update strategy for the solution of sequences of linear systems in structural mechanics: application to saddle point problems in elasticity

    Many applications in structural mechanics require the numerical solution of sequences of linear systems, typically arising from a finite element discretization of the governing equations on fine meshes. The method of Lagrange multipliers is often used to take mechanical constraints into account. The resulting matrices then exhibit a saddle point structure, and the iterative solution of such preconditioned linear systems is considered challenging. A popular strategy is to combine preconditioning and deflation to obtain an efficient method. We propose an alternative that is applicable to the general case, not only to matrices with a saddle point structure. In this approach, we update an existing algebraic or application-based preconditioner, exploiting available information such as knowledge of an approximate invariant subspace or of matrix-vector products. The resulting preconditioner has the form of a limited memory quasi-Newton matrix and requires a small number of linearly independent vectors. Numerical experiments performed on three large-scale applications in elasticity highlight the relevance of the new approach. We show that the proposed method outperforms the deflation method when considering sequences of linear systems with varying matrices.
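
    For context, the classical limited memory preconditioner (LMP) form from the literature, for a symmetric positive definite matrix A and a matrix S whose k columns are linearly independent, is reproduced below in LaTeX; it shows the quasi-Newton structure the abstract refers to, but it is the standard formula from the LMP literature, not necessarily the exact update proposed in this paper.

        H = \left(I_n - S\,(S^\top A S)^{-1} S^\top A\right)
            \left(I_n - A\,S\,(S^\top A S)^{-1} S^\top\right)
            + S\,(S^\top A S)^{-1} S^\top

    Preconditioning with H leaves the directions in span(S) solved exactly (H A S = S), which is why a few well-chosen vectors, for example approximate eigenvectors, can substantially accelerate Krylov solvers on nearby systems.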

    Limited memory preconditioners for nonsymmetric systems

    This paper presents a class of limited memory preconditioners (LMPs) for solving linear systems of equations with multiple nonsymmetric matrices and multiple right-hand sides. These preconditioners, based on limited memory quasi-Newton formulas, require a small number k of linearly independent vectors. They may be used to improve an existing first-level preconditioner and are especially worth considering when solving a sequence of linear systems with slowly varying left-hand sides.
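
    To make the two-level idea concrete, here is a minimal Python sketch that wraps a low-rank, deflation-style correction built from k vectors around a first-level Jacobi preconditioner and passes it to GMRES. The correction used, P = M1(I - AQ) + Q with Q = S(SᵀAS)⁻¹Sᵀ, is a standard two-level form from the deflation literature, not the LMP formula of this paper, and the test matrix, vectors, and names are illustrative.

        import numpy as np
        from scipy.sparse.linalg import LinearOperator, gmres

        rng = np.random.default_rng(0)
        n, k = 200, 5
        A = np.eye(n) + 0.1 * rng.standard_normal((n, n))  # nonsymmetric test matrix
        b = rng.standard_normal(n)

        M1 = np.diag(1.0 / np.diag(A))       # first-level preconditioner (Jacobi)
        S = rng.standard_normal((n, k))      # k linearly independent vectors
        AS = A @ S
        G = np.linalg.inv(S.T @ AS)          # small k-by-k Galerkin matrix

        def apply_prec(r):
            # two-level action: P r = M1 (I - A Q) r + Q r, with Q = S G S^T
            return M1 @ (r - AS @ (G @ (S.T @ r))) + S @ (G @ (S.T @ r))

        M = LinearOperator((n, n), matvec=apply_prec)
        x, info = gmres(A, b, M=M)
        print("info:", info, "residual:", np.linalg.norm(b - A @ x))

    In a sequence of slowly varying systems, S would typically be recycled from Krylov vectors or approximate eigenvectors of a previous matrix rather than drawn at random.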

    The individual, the monastery, and the church: representations of spiritual progression in the Monodiae of Guibert de Nogent in the twelfth century

    Guibert de Nogent's "autobiographical" work, the Monodiae, is embedded in the history of his monastery and in the account of the troubles that shook the region of Laon at the beginning of the twelfth century. The three books composing the work relate nested spaces to one another. The life of the individual is represented as an introspective movement toward salvation, made possible by confession and conversion. The monastery stands out as a privileged path toward that goal: this place, distinguished from others by its proximity to the divine, proves an ideal setting of transition. Finally, Guibert de Nogent transposes his reflection on spiritual progression onto the society of Laon. The sins of the bishops and the breach of the sacredness of the church, associated with the soul of the community, drag the population into turmoil.

    Arthroscopic Bristow-Latarjet Combined With Bankart Repair Restores Shoulder Stability in Patients With Glenoid Bone Loss

    BACKGROUND: Arthroscopic Bankart repair alone cannot restore shoulder stability in patients with glenoid bone loss involving more than 20% of the glenoid surface. Coracoid transposition to prevent recurrent shoulder dislocation according to Bristow-Latarjet is an efficient but controversial procedure. QUESTIONS/PURPOSES: We determined whether an arthroscopic Bristow-Latarjet procedure with concomitant Bankart repair (1) restored shoulder stability in this selected subgroup of patients, (2) without decreasing mobility, and (3) allowed patients to return to sports at preinjury level. We also evaluated (4) bone block positioning, healing, and arthritis and (5) risk factors for nonunion and coracoid screw pullout. METHODS: Between July 2007 and August 2010, 79 patients with recurrent anterior instability and bone loss of more than 20% of the glenoid underwent arthroscopic Bristow-Latarjet-Bankart repair; nine patients (11%) were either lost before 2-year follow-up or had incomplete data, leaving 70 patients available at a mean follow-up of 35 months. Postoperative radiographs and CT scans were evaluated for bone block positioning, healing, and arthritis. Any postoperative dislocation or any subjective complaint of occasional to frequent subluxation was considered a failure. Physical examination included ROM in both shoulders to enable comparison, and instability signs (apprehension and relocation tests). Rowe and Walch-Duplay scores were obtained at each review. Patients were asked whether they were able to return to sports at the same level and practice forced overhead sports. Potential risk factors for nonhealing were assessed. RESULTS: At latest follow-up, 69 of 70 patients (98%) had a stable shoulder, external rotation with the arm at the side was 9° less than on the nonoperated side, and 58 (83%) had returned to sports at preinjury level. On the latest radiographs, 64 (91%) had no osteoarthritis, and bone block positioning was accurate, with 63 (90%) below the equator and 65 (93%) flush to the glenoid surface. The coracoid graft healed in 51 patients (73%), failed to unite in 14 (20%), and showed osteolysis in five (7%). Bone block nonunion/migration did not compromise shoulder stability but was associated with persistent apprehension and a lower rate of return to sports. Use of screws that were too short or overangulated, smoking, and age over 35 years were risk factors for nonunion. CONCLUSIONS: The arthroscopic Bristow-Latarjet procedure combined with Bankart repair for anterior instability with severe glenoid bone loss restored shoulder stability, maintained ROM, allowed return to sports at preinjury level, and had a low likelihood of arthritis. Adequate healing of the transferred coracoid process to the glenoid neck is an important factor for avoiding persistent anterior apprehension. LEVEL OF EVIDENCE: Level IV, therapeutic study. See Instructions for Authors for a complete description of levels of evidence.

    Stochastic control optimization & simulation applied to energy management: From 1-D to N-D problem distributions, on clusters, supercomputers and Grids

    Management of electricity production to control cost while satisfying demand leads to a stochastic optimization problem whose main sources of uncertainty are the demand load, electricity and fuel market prices, hydraulicity (water inflows), and the availability of the thermal production assets. Stochastic dynamic programming is an attractive method for non-convex optimization, but it is both CPU- and memory-intensive: parallelization is required to achieve speedup and scale-up, and to handle a large number of stocks (N) and a large number of uncertainty factors.

    This talk introduces a collaboration between EDF (a French electricity producer) and SUPELEC (a French engineering school and research laboratory) that aimed to distribute N-dimensional stochastic dynamic programming applications on large distributed architectures such as PC clusters and IBM Blue Gene supercomputers. The collaboration was initiated in a French ANR project on distributed and Grid computing applied to financial mathematics problems (the "GCPMF" ANR project). From an applicative point of view, the goal of this research was to handle at least three or four uncertainty factors and at least six or seven stocks in optimization, while being able to use the computed commands efficiently in simulation. The simulations are run after optimization to produce gain estimates over different periods and to assess the associated risks. The methodology developed in this project will provide reference calculations from which simplified versions can be derived for use in production.

    From a computer science point of view, three parallelization strategies were carried out: accessing input and output files from thousands of processors, distributing the N-dimensional cube of data used at each time step of the optimization algorithm, and computing independent simulations that require data spread across many separate files managed by different processors. All the parallel algorithms designed were tested on a 7-stock (7-dimensional) problem on different parallel architectures. We successfully used up to 256 processors of a PC cluster and up to 8192 processors of a Blue Gene/L supercomputer, achieving scalability with a regular decrease of the execution time. We started by distributing a 1-dimensional stochastic control algorithm (applied to gas storage valuation) in February 2007, and extended the distribution to an N-dimensional algorithm in 2008 (applied to electricity production management).

    In the coming months this industrial, large-scale distributed application will be used:
    – by EDF, to study and optimize its energy stock management and electricity production, using its new Blue Gene/P supercomputer with up to 32000 processors;
    – by SUPELEC (IMS group) and INRIA (AlGorille and Reso teams), to run large experiments on Grid'5000, analyze communications and performance, and optimize task distribution when using several Grid'5000 sites.
    A global collaboration between EDF, SUPELEC and INRIA will allow comparing the performance of this real, not embarrassingly parallel application on supercomputers, on different large PC clusters, and on a multi-site Grid.
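
    The central distribution problem mentioned above is splitting the N-dimensional cube of stock levels across processors at each time step. A minimal sketch of one such decomposition, assuming mpi4py and a block split along the first grid axis (the grid shape, the splitting rule, and all names are illustrative, not the actual EDF/SUPELEC code):

        import numpy as np
        from mpi4py import MPI

        # Block-distribute the first axis of an N-dimensional value grid
        # across MPI ranks; each rank stores and updates only its own slab.
        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        shape = (64, 8, 8, 8)   # illustrative 4-stock grid, 64 levels on axis 0
        counts = [shape[0] // size + (1 if r < shape[0] % size else 0)
                  for r in range(size)]          # near-equal block sizes
        start = sum(counts[:rank])               # first slice owned by this rank
        my_slab = np.zeros((counts[rank],) + shape[1:])

        # A backward time loop would update my_slab here, exchanging boundary
        # slices with neighboring ranks whenever the Bellman update needs
        # stock levels owned by another processor.

        total = comm.allreduce(my_slab.size, op=MPI.SUM)
        if rank == 0:
            print("grid points distributed:", total, "of", int(np.prod(shape)))

    Because the cube of data and computation evolves at each time step, the real application also has to rebalance these blocks over time, which is one of the three parallelization strategies described above.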

    High-speed Al-PLIF image analysis for the study of aluminum droplet combustion in solid-propellant flames

    The combustion of aluminum particles is a key factor for solid-propellant propulsion in terms of performance and stability. An automatic detection of droplets in Al-PLIF images was developed. The "Maximally Stable Extremal Regions" (MSER) detection method is evaluated on two image sets previously obtained from aluminized-propellant combustion at 1.0 and 1.5 MPa. Compared with a set of ground-truth images, the method shows good detection performance at both pressure levels. When applied to 3000-image series, more than 35000 objects were detected in the LIF images. These first automated-detection results are very promising for future statistical analysis of aluminum droplet combustion based on the Al-PLIF diagnostic.
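
    MSER detection itself is available in standard computer-vision libraries. A minimal OpenCV sketch follows; the file name, the area thresholds, and the preprocessing are illustrative assumptions, not the authors' processing chain.

        import cv2

        # Minimal MSER droplet-detection sketch (hypothetical frame and thresholds).
        img = cv2.imread("plif_frame.png", cv2.IMREAD_GRAYSCALE)
        if img is None:
            raise SystemExit("replace plif_frame.png with a real Al-PLIF frame")

        mser = cv2.MSER_create()
        mser.setMinArea(10)      # assumed: ignore tiny noise blobs
        mser.setMaxArea(5000)    # assumed: ignore regions larger than a droplet

        regions, bboxes = mser.detectRegions(img)
        print(len(regions), "stable regions detected")

        # Draw bounding boxes for visual inspection of the detections.
        vis = cv2.cvtColor(img, cv2.COLOR_GRAY2BGR)
        for x, y, w, h in bboxes:
            cv2.rectangle(vis, (x, y), (x + w, y + h), (0, 255, 0), 1)
        cv2.imwrite("plif_frame_detections.png", vis)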