
    Oblivious Routing: Static Routing Prepared Against Network Traffic and Link Failures

    Network routing considers the problem of finding one or more paths to transfer packets from their source to their destination, ideally making the best use of the available resources (for instance, by minimising congestion in the network). Oblivious routing is a technique that generates static routing schemes that are independent of the traffic, yet still come with strong theoretical guarantees on their performance (for instance, measured by link congestion). This work presents a numerical study of oblivious routing on both synthetic and realistic networks. It also contains a novel extension to link failures, against which the routing should be immunised.
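    As a hedged illustration of the kind of model involved (not the formulation studied in this work, and with an invented toy network), the JuMP sketch below minimises link congestion for a single, known traffic matrix; an oblivious scheme would instead fix one routing and bound the congestion over a whole set of traffic matrices.

```julia
# Toy congestion-minimisation model (invented data, not the paper's formulation).
using JuMP, HiGHS

arcs = [(1, 2), (2, 3), (1, 3), (3, 4), (2, 4), (4, 1)]   # small directed network
capacity = Dict(a => 10.0 for a in arcs)
demands = [(1, 4, 6.0), (2, 3, 4.0)]                      # (source, sink, volume)
nodes = 1:4

model = Model(HiGHS.Optimizer)
@variable(model, f[a in arcs, k in 1:length(demands)] >= 0)  # flow of demand k on arc a
@variable(model, congestion >= 0)

# Flow conservation for each demand at each node.
for (k, (s, t, d)) in enumerate(demands), v in nodes
    rhs = v == s ? d : (v == t ? -d : 0.0)
    @constraint(model,
        sum(f[a, k] for a in arcs if a[1] == v) -
        sum(f[a, k] for a in arcs if a[2] == v) == rhs)
end

# Total flow on each link may not exceed `congestion` times its capacity.
for a in arcs
    @constraint(model, sum(f[a, k] for k in 1:length(demands)) <= congestion * capacity[a])
end

@objective(model, Min, congestion)
optimize!(model)
```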

    Structure, temporal evolution, and heat flux estimates from the Lucky Strike deep-sea hydrothermal field derived from seafloor image mosaics

    Author Posting. © American Geophysical Union, 2012. This article is posted here by permission of American Geophysical Union for personal use, not for redistribution. The definitive version was published in Geochemistry Geophysics Geosystems 13 (2012): Q04007, doi:10.1029/2011GC003990.

    Here we demonstrate with a study of the Lucky Strike hydrothermal field that image mosaicing over large seafloor areas is feasible with new image processing techniques, and that repeated surveys allow temporal studies of active processes. Lucky Strike mosaics, generated from >56,000 images acquired in 1996, 2006, 2008 and 2009, reveal the distribution and types of diffuse outflow throughout the field, and their association with high-temperature vents. In detail, the zones of outflow are largely controlled by faults, and we suggest that the spatial clustering of active zones likely reflects the geometry of the underlying plumbing system. Imagery also provides constraints on temporal variability at two time scales. First, based upon changes in individual outflow features identified in mosaics acquired in different years, we document a general decline of diffuse outflow throughout the vent field over time scales of up to 13 years. Second, the image mosaics reveal broad patches of seafloor that we interpret as fossil outflow zones, owing to their association with extinct chimneys and hydrothermal deposits. These areas encompass the entire region of present-day hydrothermal activity, suggesting that the plumbing system has persisted over long periods of time, loosely constrained to hundreds to thousands of years. The coupling of mosaic interpretation and available field measurements allows us to independently estimate the heat flux of the Lucky Strike system at ~200 to 1000 MW, with 75% to >90% of this flux taken up by diffuse hydrothermal outflow. Based on these heat flux estimates, we propose that the temporal decline of the system at short and long time scales may be explained by the progressive cooling of the axial magma chamber (AMC), without replenishment. The results at Lucky Strike demonstrate that repeated image surveys can be routinely performed to characterize and study the temporal variability of a broad range of vent sites hosting active processes (e.g., cold seeps, hydrothermal fields, gas outflows, etc.), allowing a better understanding of fluid flow dynamics from the sub-seafloor, and a quantification of fluxes.

    This project was funded by CNRS/IFREMER through the 2006, 2008, 2009 and 2010 cruises within the MoMAR program (France), by ANR (France) Mothseim Project NT05-3 42213 to J. Escartín, and by grant CTM2010-15216/MAR from the Spanish Ministry of Science to R. Garcia and J. Escartín. T. Barreyre was supported by University Paris Diderot (Paris 7, France) and Institut de Physique du Globe de Paris (IPGP, France). E. Mittelstaedt was supported by the International Research Fellowship Program of the U.S. National Science Foundation (OISE-0757920).

    Characterising the flexibility of industrial sites with reservoir models

    Electro-intensive industrial sites depend heavily on electricity prices to remain competitive. Nevertheless, they can often tune their processes to decrease their electricity consumption during the most critical periods, for example with decision-support systems based on mathematical models of their processes. Our goal is to estimate the flexibility potential of a complete site, not to tune each process very precisely. To this end, we propose a generic paradigm to help conceive such models: reservoirs are the basic building blocks, which allow for great expressiveness while staying close to the physics. More specifically, we do not need very precise models for our purposes, but ones that can be efficiently embedded in optimisation models. Our first results show that the obtained reservoir models give sufficiently good approximations for metallurgical and other processes.
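    As a hedged illustration of the reservoir paradigm (not one of the models built in this work; all data are invented), the sketch below shifts the controllable inflow of a single reservoir towards cheap hours while respecting level bounds.

```julia
# Toy reservoir model: a storage level fed by a controllable, electricity-consuming
# process and emptied by a fixed downstream demand; consumption moves to cheap hours.
using JuMP, HiGHS

T = 1:24
price = [(t in 8:20) ? 80.0 : 30.0 for t in T]   # invented day/night prices
demand = 2.0                                      # constant downstream draw per hour

model = Model(HiGHS.Optimizer)
@variable(model, 0 <= inflow[T] <= 5)             # process throughput (controllable)
@variable(model, 0 <= level[T] <= 20)             # reservoir state (material or energy)

@constraint(model, level[1] == 10 + inflow[1] - demand)
@constraint(model, [t in 2:24], level[t] == level[t-1] + inflow[t] - demand)
@constraint(model, level[24] >= 10)               # end the day where we started

@objective(model, Min, sum(price[t] * inflow[t] for t in T))
optimize!(model)
```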

    Discovering Julia!

    In the bestiary of programming languages, newcomers appear regularly, often with very short-lived success. The first preview of Julia was released in February 2012, after an incubation in MIT's laboratories. Its goal is fairly simple: deliver excellent runtime performance (approaching C) while keeping the ease of writing of more dynamic languages such as Python or MATLAB. More than ten years later, the language and its community have grown, performance is often very close to C (the fastest CSV file reader is in fact written in Julia), and the language's core mechanisms have enabled large-scale composability across the whole ecosystem (features from several packages that were never designed to work together can be combined). For functionality not available natively in Julia, it is very easy to call code written in other languages: C and Fortran, but also Python, R or MATLAB. This workshop will present the basics of Julia's syntax and show its relevance for data-science projects. In particular, it will briefly describe DataFrames.jl, which provides an interface very similar to Pandas in Python or data frames in R, as well as Graphs.jl, a very efficient implementation of generic graph algorithms that is as simple to use as NetworkX in Python but much faster.
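    As a hedged taste of the two packages mentioned above (invented data, not workshop material), the snippet below shows a Pandas-like aggregation with DataFrames.jl and a shortest-path computation with Graphs.jl.

```julia
using DataFrames, Graphs, Statistics

# Tabular data: group and aggregate, much like Pandas' groupby/agg.
df = DataFrame(city = ["Liège", "Liège", "Paris"], temp = [14.0, 16.0, 18.0])
println(combine(groupby(df, :city), :temp => mean => :mean_temp))

# Graphs: build a small graph and compute shortest paths from vertex 1.
g = path_graph(5)            # 1 - 2 - 3 - 4 - 5
add_edge!(g, 1, 5)           # close the loop
println(dijkstra_shortest_paths(g, 1).dists)
```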

    Embedding reservoirs in industrial models to exploit their flexibility

    In the context of the energy transition, industrial plants that rely heavily on electricity face ever more price volatility. To keep operating in these conditions, plant managers are increasingly willing to improve their flexibility, i.e. their ability to react to price fluctuations. This work proposes an intuitive methodology to mathematically model electro-intensive processes in order to assess their flexibility potential. To this end, we introduce the notion of reservoir, a storage of either material or energy, which allows models based on this paradigm to have interpretations close to the physics of the processes. The design of the reservoir methodology has three distinct goals: (1) to be easy and quick to build by an energy-sector consultant; (2) to be effortlessly converted into mixed-integer linear or nonlinear programs; (3) to be straightforward to understand by nontechnical people, thanks to the models' graphic nature. We apply this methodology to two industrial case studies, namely an induction furnace (linear model) and an industrial cooling installation (nonlinear model), where we achieve significant cost savings. In both cases, the models can be quickly written using our method and solved with appropriate solver technologies.
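    As a hedged illustration of how goal (2) plays out (this toy is not the induction-furnace or cooling model from the paper; all data are invented), adding an on/off decision to a reservoir directly yields a small mixed-integer linear program.

```julia
# Toy on/off machine feeding a material reservoir; the model is a small MILP.
using JuMP, HiGHS

T = 1:24
price = [(t in 8:20) ? 80.0 : 30.0 for t in T]  # invented electricity prices
rate, power, demand = 4.0, 2.5, 3.0             # tonnes/h produced, MW drawn, tonnes/h shipped

model = Model(HiGHS.Optimizer)
@variable(model, on[T], Bin)                    # machine runs or not in hour t
@variable(model, 0 <= stock[T] <= 30)           # material reservoir (tonnes)

@constraint(model, stock[1] == 15 + rate * on[1] - demand)
@constraint(model, [t in 2:24], stock[t] == stock[t-1] + rate * on[t] - demand)

@objective(model, Min, sum(price[t] * power * on[t] for t in T))
optimize!(model)
```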

    ConstraintProgrammingExtensions.jl: An MOI/JuMP extension for constraint programming

    Constraint programming is a modelling paradigm that has proved extremely useful in many real-world scenarios, such as computing optimal schedules or vehicle routes. It is often viewed as either a complementary or a competing technology to mathematical programming, trading off modelling ease against computational efficiency. Both approaches have seen many developments in terms of modelling languages and solvers alike, including in Julia. Even though several constraint-programming solvers are available (or entirely written) in Julia, JuMP and MathOptInterface (its solver abstraction layer) do not give access to them in the same, unified way as mathematical programming, even though the latest versions of JuMP have been designed to provide great flexibility. ConstraintProgrammingExtensions is currently a one-man project bringing constraint programming to JuMP. Its main part is a large series of sets that aims to provide a common interface for constraint-programming solvers. It also consists of a series of bridges that define relationships between those sets (including between high-level constraints, such as knapsacks, and mathematical-programming formulations) and of a FlatZinc reader-writer to import and export models in that common format, which is already supported by dozens of solvers. As a side effect, ConstraintProgrammingExtensions is also becoming a way to ease modelling for mathematical programming, as high-level constraints can be used with traditional mathematical-programming solvers. This presentation details the current state of ConstraintProgrammingExtensions, some of its design decisions, and future developments for the cases where JuMP and MathOptInterface do not provide sufficient versatility: for instance, several constraint-programming solvers allow graphs as first-class decision variables; also, constraint programming is not restricted by the linearity or convexity of mathematical expressions, unlike many mathematical-programming solvers.
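    The sketch below illustrates the intended modelling style; the set name CP.AllDifferent and its constructor are assumptions about the package's API and may differ from the released version.

```julia
# Hedged sketch: a CP constraint expressed as an MOI vector set from JuMP.
# The exact set name and constructor are assumed; check the package documentation.
using JuMP
import ConstraintProgrammingExtensions as CP

model = Model()                          # no optimizer attached in this sketch
@variable(model, 1 <= x[1:3] <= 3, Int)

# All-different as a single vector-in-set constraint, in the usual MOI style.
@constraint(model, x in CP.AllDifferent(3))
```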

    Implementation and comparison of stochastic and robust programming

    Traditional optimisation tools focus on deterministic problems: scheduling airline flight crews (with as few employees as possible while still meeting legal constraints, such as maximum working time), finding the shortest path in a graph (used by navigation systems to give directions, usually based on GPS signals), etc. However, this deterministic hypothesis sometimes yields useless solutions: actual parameters cannot always be known to full precision, one reason being their randomness. For example, when scheduling trucks for freight transportation, if there is unexpected congestion on the roads, the deadlines might not be met, and the company might have to financially compensate for this delay, but also for the following deliveries that could not be made on schedule. Two main approaches are developed in the literature to take this uncertainty into account: making decisions based on probability distributions of the uncertain parameters (stochastic programming) or assuming that they lie in a given set (robust programming). In general, the first leads to a large increase in the size of the problems to solve (and thus requires algorithms to work around this curse of dimensionality), while the second is more conservative but tends to change the nature of the programs (which can impose a new solver technology). Some authors claim that these two mindsets are equivalent, meaning that the solutions they provide are equivalent when faced with the same uncertainty. The goal of this thesis is to explore this question: for various problems, implement both approaches and compare them. Is one solution better insulated from variations of the uncertain parameters? Does it bring benefits over a deterministic approach? Is one cheaper than the other to compute?
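    As a hedged, standalone illustration (not a problem from the thesis; all data are invented), the deterministic equivalent of a tiny two-stage newsvendor shows how scenarios enter a stochastic program.

```julia
# Toy two-stage stochastic program written as its deterministic equivalent.
using JuMP, HiGHS

demand = [80.0, 100.0, 120.0]            # three equally likely scenarios
p = fill(1 / 3, 3)                       # scenario probabilities
cost, sell = 5.0, 8.0                    # unit purchase cost and selling price

model = Model(HiGHS.Optimizer)
@variable(model, 0 <= order <= 200)      # first-stage decision, taken before demand is known
@variable(model, sold[1:3] >= 0)         # second-stage recourse, one per scenario
@constraint(model, [s in 1:3], sold[s] <= order)
@constraint(model, [s in 1:3], sold[s] <= demand[s])
@objective(model, Max, sum(p[s] * sell * sold[s] for s in 1:3) - cost * order)
optimize!(model)
```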

    Optimisation under uncertainty: a comparison of stochastic and robust programming

    Traditional optimisation tools focus on deterministic problems: scheduling airline flight crews (with as few employees as possible while still meeting legal constraints, such as maximum working time), finding the shortest path in a graph (used by navigation systems to give directions), etc. However, this deterministic hypothesis sometimes provides useless solutions: actual parameters cannot always be known to full precision, one reason being their randomness. For example, when scheduling trucks for freight transportation, if there is unexpected congestion on the roads, the deadlines might not be met, and the company might have to financially compensate for this delay, but also for the following deliveries that could not be made on schedule. Two main approaches are developed in the literature to take this uncertainty into account: making decisions based on probability distributions of the uncertain parameters (stochastic programming) or assuming that they lie in a so-called 'uncertainty set' (robust programming). In general, the first leads to a large increase in the size of the problems to solve (and thus requires algorithms to work around this curse of dimensionality), while the second is more conservative but tends to change the nature of the programs (which can impose a new solver technology).

    This talk compares the two approaches on three different cases: facility location, unit commitment, and reservoir management. On the implementation side, several specific algorithms have been implemented to solve the stochastic programs and compare their relative performance: Benders' decomposition, progressive hedging, and the deterministic equivalent. When comparing stochastic and robust programming, many differences appear in many respects, even though the literature about them is very scarce. (Furthermore, the two approaches are not incompatible: both can be used in the same optimisation model to take into account different parts of the uncertainty.)

    Concerning solving time, stochastic programming quickly gives rise to intractable problems, which requires the development of more specific algorithms just to solve them to an acceptable accuracy in decent time. What is more, the stochastic description of the uncertain values (with a discretisation of the probability distribution through scenarios) must cover all the possible uncertainty; otherwise, the solution risks overfitting those scenarios and is likely to perform poorly on close but different scenarios that may happen in practice, which imposes a large number of scenarios and yields very large (and hardly tractable) optimisation programs. On the other hand, by using specific uncertainty sets, robust programming yields programs that are only slightly harder to solve, with an objective function very close to that of stochastic programming, but with totally different robustness properties: by using an uncertainty set computed from the scenarios, rather than the scenarios themselves, it is able to withstand much higher uncertainty than stochastic programming. However, when facing other types of uncertainty, this conclusion might no longer hold, with robust programming unable to cope with them and to bring interesting solutions to the table.
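    As a hedged, standalone illustration (not one of the talk's three case studies; all data are invented), the classic Bertsimas-Sim budgeted-uncertainty counterpart of a knapsack constraint shows why robust counterparts often remain only slightly harder to solve than their nominal versions.

```julia
# Toy robust knapsack: each weight may deviate by up to wdev[i] from its nominal
# value, but at most Γ weights deviate at once; the robust counterpart stays a MILP.
using JuMP, HiGHS

value  = [10.0, 7.0, 5.0, 9.0]
wnom   = [4.0, 3.0, 2.0, 5.0]            # nominal weights
wdev   = [1.0, 1.0, 0.5, 2.0]            # maximum deviations
budget = 10.0                            # capacity
Γ      = 2                               # at most 2 weights deviate simultaneously

model = Model(HiGHS.Optimizer)
@variable(model, x[1:4], Bin)
@variable(model, z >= 0)                 # dual of the budget constraint
@variable(model, p[1:4] >= 0)            # duals of the per-item deviation bounds

# Robust counterpart of  w'x <= budget  for every w in the budgeted uncertainty set.
@constraint(model, sum(wnom[i] * x[i] for i in 1:4) + Γ * z + sum(p) <= budget)
@constraint(model, [i in 1:4], z + p[i] >= wdev[i] * x[i])

@objective(model, Max, sum(value[i] * x[i] for i in 1:4))
optimize!(model)
```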

    A journey to the land of Julia: a fast and dynamic programming language

    Usually, dynamic programming languages (like Python, R, or MATLAB) are quite slow at execution time, which causes performance problems in many applications. Julia is a blossoming language, both dynamic and fast, fully open source, with a syntax very similar to MATLAB's. This talk briefly presents Julia and its ecosystem.
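    As a hedged taste of the language (not material from the talk), plain Julia loops compile to fast native code; the second timed call below excludes compilation.

```julia
# A hand-written sum: MATLAB-like syntax, compiled to native code on first call.
function mysum(v)
    s = 0.0
    for x in v
        s += x
    end
    return s
end

v = rand(10^7)
@time mysum(v)   # first call includes JIT compilation
@time mysum(v)   # subsequent calls run at native speed
```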