
    On The Date of the Copper Age in the United States

    During the mid-19th century, some scholars believed that the chevron beads found in early Indian graves had been brought to North America by globe-trotting Phoenicians or representatives of some other higher European civilization. A paper on the subject published in 1862 by one of the theory's proponents is reproduced here, along with contemporary descriptions and illustrations of the beads under discussion.

    Optimizing the date of an upgrading investment in a data network

    Due to the introduction of new services, the volume of data transferred in mobile networks is growing rapidly, and operators periodically face the need to upgrade their network. Such upgrades allow them to increase capacity and provide adequate Quality of Service (QoS). In this paper we propose a general framework for deriving the optimal date for a network upgrade. We show that this date is the result of a compromise between the decrease of the upgrade investment cost with time and the loss of profit generated by insufficient capacity. The upgrade should hence be performed when the loss of profit, derived using analytical capacity expressions, exceeds the expected discount. The model presented herein accounts for the randomness of the demand and of the upgrading cost functions, and results are given for an HSDPA network.
    Keywords: data flows, Brownian motion, quality of service, customer satisfaction
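    A minimal numerical sketch of the trade-off described above may help: it compares a hypothetical equipment-cost curve that decays with time against the profit assumed to be lost once demand exceeds capacity, and picks the date that minimizes their sum. All functions and numbers below are assumptions chosen for illustration; the paper's actual model (Brownian demand, analytical HSDPA capacity expressions) is not reproduced.

        import numpy as np

        # Toy illustration of the upgrade-date trade-off; every curve below is an assumption.
        T = np.arange(0, 60)                      # candidate upgrade months
        upgrade_cost = 100.0 * np.exp(-0.02 * T)  # equipment price assumed to decay with time
        demand = 50.0 * np.exp(0.03 * T)          # assumed demand growth (deterministic proxy)
        capacity = 80.0                           # network capacity before the upgrade

        # Profit assumed lost each month in proportion to the capacity shortfall,
        # accumulated up to each candidate upgrade date.
        monthly_loss = np.maximum(demand - capacity, 0.0)
        cumulative_loss = np.cumsum(monthly_loss)

        total_cost = upgrade_cost + cumulative_loss
        print("Optimal upgrade month under these toy assumptions:", int(T[np.argmin(total_cost)]))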

    Risk-hedging using options for an upgrading investment in a data network

    In this paper, we illustrate how a mobile data network operator can plan an upgrading investment in anticipation of sharp increases in demand, taking into account the expected profit and customer satisfaction. The former grows with demand, whereas the latter sinks if demand is too high, since throughput may collapse. As the equipment price decreases with time, it may be worthwhile to wait rather than to invest at once. We then propose a real-option strategy to hedge against the risk that the investment has to take place earlier than expected. Finally, we price this option with a backward dynamic programming approach, using recent improvements based on least-squares estimations.
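    The least-squares pricing step mentioned above can be sketched with a Longstaff-Schwartz style backward induction on simulated paths. The payoff, the geometric Brownian dynamics and every parameter below are assumptions chosen for illustration, not values taken from the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy parameters (all assumed): geometric Brownian value of the project if the upgrade
        # is made now, a fixed strike standing for the investment cost, and monthly exercise dates.
        S0, K, r, sigma, T, n_steps, n_paths = 100.0, 95.0, 0.05, 0.25, 2.0, 24, 20_000
        dt = T / n_steps
        disc = np.exp(-r * dt)

        # Simulate paths of the underlying project value.
        z = rng.standard_normal((n_paths, n_steps))
        S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1))
        S = np.hstack([np.full((n_paths, 1), S0), S])

        payoff = lambda s: np.maximum(s - K, 0.0)    # value of exercising (investing) now

        # Backward dynamic programming with least-squares regression of continuation values.
        cashflow = payoff(S[:, -1])
        for t in range(n_steps - 1, 0, -1):
            cashflow *= disc                         # discount future cash flows back one step
            itm = payoff(S[:, t]) > 0
            if itm.any():
                X = np.vander(S[itm, t], 3)          # simple polynomial basis in S_t
                beta, *_ = np.linalg.lstsq(X, cashflow[itm], rcond=None)
                continuation = X @ beta
                exercise = payoff(S[itm, t])
                better_now = exercise > continuation
                cashflow[np.where(itm)[0][better_now]] = exercise[better_now]

        print(f"Estimated option value under these toy assumptions: {disc * cashflow.mean():.2f}")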

    California in the greenhouse: regional climate policies and the global environment

    This thesis explores how climate policy is developing at sub-national or “regional” scales of decision-making. It considers local-global connections on both the science and the politics of climate change by investigating four main research questions as they pertain to regional climate action: What triggers regional policy action on global climate change? What arguments and lines of evidence underlie the policy discourse? How do “winning” arguments gain salience? How does regional action make a difference to broader-scale climate policy? The research is conducted through one in-depth case study in California. It shows that action on climate change mitigation in California is enabled in part by past action in the related policy arenas of air pollution control and energy policy, within a multilevel, social-practice environmental governance framework. More recently, the emergence of a comprehensive policy framework was triggered by a unique policy window in which a change in California’s leadership capitalised on the void in federal policy to reframe arguments for state-level action on climate change. The case study identifies two dominant policy frames leading to a third master frame or meta-narrative in the period 2004-6: i) climate change as a problem of regional environmental risk; ii) mitigation policy as a “win-win” for the local economy and the environment; iii) climate change as a regional policy issue. This period represents a paradigm shift from a previous dominant framing that characterised climate change as predominantly a national rather than a state policy issue. The case study shows that today’s dominant policy frames rely upon a process of co-construction that combines insights from expert and local knowledge, thus intertwining “facts” and “values” in the policy process. “Winning” arguments or policy frames gain salience through a relatively open policy process, which permits an array of non-governmental actors -- including social movement organisations, business organisations and experts -- to operate in the outer periphery of the policy process and generate ideas in a timely way to influence policy decisions. The research underscores the power of localising problems of global environmental change and their solutions, of taking up climate change as a regional policy issue where solutions can be tailored to reflect regional contexts and norms. It shows that there is relatively larger scope for experimentation and social and technical innovation at the regional scale, compared to broader scales of action, which can open the way for cross-scale learning and influence to emerge.

    Kinetic modeling of polyurethane pyrolysis using non-isothermal thermogravimetric analysis

    The pyrolysis of polyurethane was studied by dynamic thermogravimetric analysis (TGA). The studied polyurethane is used as an organic binder in the casting process to make sand cores and molds. A semi-empirical model is presented that can be used to describe the polyurethane pyrolysis occurring during TGA experiments. This model assumes that the polyurethane is pyrolysed by several parallel independent reactions. The kinetic parameters of polyurethane pyrolysis were evaluated by fitting the model to the experimental data obtained by TGA over a wide variety of heating rates. A nonlinear least-squares optimization method is employed in the fitting procedure. A hybrid objective based simultaneously on the mass (TG) and mass loss rate (DTG) curves has been used in the least-squares method. The values of the activation energy obtained by the nonlinear fitting were then recalculated by the methods of Kissinger and Friedman. Furthermore, the parameters obtained in the present paper were compared with those reported in the literature.
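    The fitting procedure described above can be sketched as follows: a small number of parallel independent reactions, each with Arrhenius kinetics, is fitted to TG and DTG curves with a hybrid least-squares objective. Since the paper's TGA data are not available here, the sketch fits synthetic data generated from assumed kinetic parameters; the mass fractions, heating rate and rate law are likewise assumptions, not the paper's values.

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import least_squares

        R = 8.314  # gas constant, J/(mol K)

        def dalpha_dT(temp, alpha, A, Ea, n, beta):
            # Conversion rate of one pseudo-component per unit temperature (heating rate beta in K/s).
            return A * np.exp(-Ea / (R * temp)) * np.maximum(1.0 - alpha, 0.0) ** n / beta

        def mass_curve(T, params, fractions, beta):
            # Residual mass for several parallel, independent reactions (assumed rate law).
            m = np.ones_like(T)
            for (A, Ea, n), w in zip(params, fractions):
                sol = solve_ivp(dalpha_dT, (T[0], T[-1]), [0.0], t_eval=T,
                                args=(A, Ea, n, beta), method="LSODA")
                m -= w * sol.y[0]
            return m

        def hybrid_residuals(x, T, tg_exp, dtg_exp, fractions, beta):
            # Hybrid objective: residuals on both the TG and the DTG curve, each normalised.
            tg = mass_curve(T, x.reshape(-1, 3), fractions, beta)
            dtg = np.gradient(tg, T)
            return np.concatenate([(tg - tg_exp) / np.ptp(tg_exp),
                                   (dtg - dtg_exp) / np.ptp(dtg_exp)])

        # Synthetic "experimental" data standing in for a TGA run at 10 K/min (assumed values).
        beta = 10.0 / 60.0                      # heating rate, K/s
        T = np.linspace(400.0, 900.0, 200)      # temperature grid, K
        true_params = [(1e8, 1.2e5, 1.0), (1e10, 1.8e5, 1.5)]   # (A, Ea, n) per pseudo-reaction
        fractions = [0.4, 0.5]                  # mass fraction lost by each pseudo-reaction
        tg_exp = mass_curve(T, true_params, fractions, beta)
        dtg_exp = np.gradient(tg_exp, T)

        x0 = np.array([5e7, 1.0e5, 1.0, 5e9, 1.6e5, 1.2])       # initial guesses
        fit = least_squares(hybrid_residuals, x0, x_scale=np.abs(x0),
                            args=(T, tg_exp, dtg_exp, fractions, beta))
        print(fit.x.reshape(-1, 3))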

    Membrane Fission: A Computational Complexity Perspective

    Membrane fission is a process by which a biological membrane is split into two new ones in such a way that the content of the initial membrane is separated and distributed between the new membranes. Inspired by this biological phenomenon, membrane separation rules were considered in membrane computing. In this work, we investigate cell-like P systems with symport/antiport rules and membrane separation rules from a computational complexity perspective. Specifically, we establish a limit on the efficiency of such P systems when they use communication rules of length at most two, and we prove the computational efficiency of this kind of model when using communication rules of length at most three. Hence, a sharp borderline between tractability and NP-hardness is provided in terms of the length of communication rules.
    Ministerio de Economía y Competitividad TIN2012-3743

    Taking stock of progress under the Clean Development Mechanism (CDM)

    The Kyoto Protocol’s Clean Development Mechanism (CDM) was established in 1997 with the dual purposes of assisting non-Annex I Parties in achieving sustainable development and assisting Annex I Parties in achieving compliance with their quantified greenhouse gas (GHG) emission commitments. This paper looks at the achievements of the CDM to date in the context of wider private and public flows of investment into developing countries. Market demand for GHG credits from CDM projects comes from Annex I countries’ emission commitments. Annex I countries can meet those commitments through domestic as well as international emission mitigation activities, including the CDM. The CDM can be an attractive compliance option, as it can help meet Annex I GHG commitments more cost-effectively through project-based activities that are consistent with host countries’ sustainable development priorities. The extent of the demand for CDM credits depends on the stringency of emission commitments, the “gap” between countries’ emission commitments and actual emissions, and the relative use of the CDM and other means of meeting emission commitments.

    Dynamic management of the stage-discharge relationships of hydrometric stations and computation of the associated uncertainties: an indicator for the management, quality and monitoring of measurement points

    As concession holder or owner-operator of electricity production facilities, EDF is responsible for operating them in safe conditions and for respecting the limits imposed by regulation. The knowledge of water resources is thus one of EDF's main concerns, since the company remains attentive to the proper use of its facilities. Knowledge of streamflow is one of its priorities in order to better respond to three key issues: plant safety, compliance with regulatory requirements, and optimization of the means of production. To meet these needs, EDF-DTG (Division Technique Générale) operates an observation network that includes climatic parameters such as air temperature, precipitation and snow cover, as well as streamflow. The data collected allow real-time monitoring of rivers, as well as hydrological studies and the sizing of structures. Ensuring the quality of the streamflow data is therefore a priority. It is not yet possible to measure the flow of a river continuously, since direct measurements of discharge are time-consuming and expensive. In most cases the flow of a river is deduced from continuous measurements of water level. Punctual measurements of discharge, called gaugings, are used to build a stage-discharge relationship named the rating curve. The equipment permanently installed on rivers for measuring water levels is called a hydrometric station. The whole process thus constitutes an indirect way of estimating the discharge of rivers, whose associated uncertainties need to be described. Quantifying confidence intervals is, however, not the only difficulty facing the hydrometrist. Fast changes in the stage-discharge relationship often make real-time streamflow monitoring difficult, while the need for continuous, highly reliable data is obvious. The historical method of producing the rating curve, based on a sufficient number of gaugings that are chronologically contiguous and well distributed over the widest possible range of discharge, remains poorly adapted to fast or cyclical changes of the stage-discharge relationship. The classical method does not take sufficiently into account erosion and sedimentation processes or seasonal vegetation growth, and the capacity of management teams to perform gaugings generally remains quite limited. To obtain the most accurate streamflow data and improve their reliability, this thesis explores an original dynamic method for computing rating curves from the historical gaugings of a hydrometric station, while calculating the associated uncertainties. First, a dynamic rating-curve assessment is created in order to compute a rating curve for each gauging of the considered hydrometric station. After the tracing, a model of uncertainty is built around each computed rating curve. It takes into account the uncertainty of the gaugings, the uncertainty in the measurement of the water height, the sensitivity of the stage-discharge relationship and the quality of the tracing. A variographic analysis is used to age the gaugings and the rating curves and to obtain a final confidence interval that increases with time and is updated at each new gauging, since each new gauging gives rise to a new rating curve that is more reliable, because more recent, for predicting future discharge. Chronological series of streamflow data are then obtained homogeneously and with a confidence interval that takes the aging of the rating curves into consideration.
    By taking better into account the variability of flow conditions and the life of the hydrometric stations, the method and its uncertainty model provide tools for managing and optimizing the operation of the measurement points. They answer recurrent questions in hydrometry such as: “How many gaugings per year must be carried out in order to produce streamflow data with an average uncertainty of X%?” and “When, and over which range of discharge, should those gaugings be performed?”
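    The basic calibration step underlying this work, fitting a power-law rating curve Q = a(h - h0)^b to gaugings and propagating the parameter uncertainty to a predicted discharge, can be sketched as follows. The gaugings, the functional form and all numbers are assumptions for illustration; the thesis's dynamic per-gauging retracing and the variographic aging of the curves are not reproduced here.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical gaugings: stage h (m) and measured discharge Q (m3/s).
        h_gauged = np.array([0.42, 0.55, 0.71, 0.90, 1.12, 1.35, 1.60])
        q_gauged = np.array([1.8, 3.1, 5.2, 8.4, 12.9, 18.7, 26.0])

        def rating_curve(h, a, h0, b):
            # Common power-law form of a stage-discharge relationship, Q = a * (h - h0)^b.
            return a * np.clip(h - h0, 1e-6, None) ** b

        popt, pcov = curve_fit(rating_curve, h_gauged, q_gauged, p0=[10.0, 0.1, 1.7],
                               bounds=([0.1, -0.5, 0.5], [100.0, 0.4, 3.0]))

        # Approximate 95% interval on the discharge predicted at one stage value,
        # propagating the parameter covariance with a finite-difference Jacobian.
        h_new, eps = 1.00, 1e-6
        J = np.array([(rating_curve(h_new, *(popt + eps * e)) -
                       rating_curve(h_new, *(popt - eps * e))) / (2 * eps) for e in np.eye(3)])
        q_pred = rating_curve(h_new, *popt)
        q_std = float(np.sqrt(J @ pcov @ J))
        print(f"Q({h_new} m) = {q_pred:.1f} m3/s, 95% interval +/- {1.96 * q_std:.1f}")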