
    Distributed eventual leader election in the crash-recovery and general omission failure models.

    Distributed applications are present in many aspects of everyday life; banking, healthcare and transportation are examples of such applications. These applications are built on top of distributed systems. Roughly speaking, a distributed system is composed of a set of processes that collaborate with one another to achieve a common goal. When building such systems, designers have to cope with several issues, such as different synchrony assumptions and the occurrence of failures, and distributed systems must ensure that the delivered service is trustworthy. Agreement problems form a fundamental class of problems in distributed systems. All agreement problems follow the same pattern: all processes must agree on some common decision. Most agreement problems can be considered particular instances of the Consensus problem, and hence can be solved by reduction to consensus. However, a fundamental impossibility result, known as FLP, states that in an asynchronous distributed system it is impossible to achieve consensus deterministically when at least one process may fail. A way to circumvent this obstacle is to use unreliable failure detectors. A failure detector encapsulates the synchrony assumptions of the system, providing (possibly incorrect) information about process failures. A particular failure detector, called Omega, has been shown to be the weakest failure detector for solving consensus with a majority of correct processes. Informally, Omega provides an eventual leader election mechanism.
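    As an illustration of the Omega abstraction described above, the sketch below shows one common way an eventual leader election module can be built in a partially synchronous system: processes exchange heartbeats, adaptively increase timeouts after false suspicions, and trust the smallest-id process they currently consider alive. The class, its names, and the heartbeat/timeout scheme are illustrative assumptions, not the algorithms studied in the thesis.

```python
import time

class OmegaDetector:
    """Minimal sketch of an Omega-style eventual leader election module.

    Assumptions (not from the thesis): processes have unique integer ids,
    exchange periodic heartbeats, and the rule "trust the alive process
    with the smallest id" is used to converge on a single leader.
    """

    def __init__(self, my_id, all_ids, timeout=2.0):
        self.my_id = my_id
        self.all_ids = sorted(all_ids)
        self.timeout = {p: timeout for p in all_ids}        # per-process timeout
        self.last_heartbeat = {p: time.time() for p in all_ids}

    def on_heartbeat(self, sender_id):
        """Record a heartbeat; if the sender was wrongly suspected, relax its timeout."""
        now = time.time()
        if now - self.last_heartbeat[sender_id] > self.timeout[sender_id]:
            self.timeout[sender_id] += 1.0                  # adapt to late messages
        self.last_heartbeat[sender_id] = now

    def alive(self):
        """Processes not currently suspected (heartbeat seen within their timeout)."""
        now = time.time()
        return [p for p in self.all_ids
                if p == self.my_id or now - self.last_heartbeat[p] <= self.timeout[p]]

    def leader(self):
        """Omega's output: eventually all correct processes return the same correct
        process here (the smallest-id process currently deemed alive)."""
        return min(self.alive())
```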

    The application of indigenous laws in Chile during the Republic (1866-1930): the work of the press and the oversight role of the National Congress

    With the adoption of the principles of constitutionalism in Chile, the legal privileges that the indigenous people had held under the Spanish monarchy were abolished. This abolition left the indigenous population in a vulnerable situation that harmed them, especially with regard to the conservation of the ownership of their lands. The Republic recognised this mistake and, starting in 1866 and gradually, restored some legal benefits to the indigenous people, but it did not return to a system of effective protection such as the one created by the Spanish monarchy. The difficulties in implementing these republican laws were described in detail by the press of the time and were the subject of the oversight role of the National Congress, especially from the early twentieth century onward.

    Functional Size Measurement and Model Verification for Software Model-Driven Developments: A COSMIC-based Approach

    Historically, software production methods and tools have had a single goal: to produce high-quality software. Since the goal of Model-Driven Development (MDD) methods is no different, MDD methods have emerged to take advantage of the benefits of using conceptual models to produce high-quality software. In such MDD contexts, conceptual models are used as input to automatically generate the final applications. Thus, we advocate that there is a relation between the quality of the final software product and the quality of the models used to generate it. The quality of conceptual models can be influenced by many factors. In this thesis, we focus on the accuracy of the techniques used to predict the characteristics of the development process and of the generated products. Regarding prediction techniques for software development processes, it is widely accepted that knowing the functional size of applications is essential in order to successfully apply effort and budget models. To evaluate the quality of the generated applications, defect detection is considered to be the most suitable technique. The research goal of this thesis is to provide an accurate measurement procedure based on COSMIC for the automatic sizing of object-oriented OO-Method MDD applications. To achieve this goal, it is necessary to accurately measure the conceptual models used in the generation of object-oriented applications. It is also very important for these models to be free of defects, so that the applications to be measured are correctly represented. In this thesis, we present the OOmCFP (OO-Method COSMIC Function Points) measurement procedure. This procedure makes a twofold contribution: the accurate measurement of object-oriented applications generated in MDD environments from the conceptual models involved, and the verification of those conceptual models to allow the complete generation of correct final applications from them. The OOmCFP procedure has been systematically designed, applied, and automated. It has been validated for conformance to the ISO 14143 standard and to the metrology concepts defined in the ISO VIM, and the accuracy of the measurements obtained has been evaluated according to ISO 5725. The procedure has also been validated through empirical studies, whose results demonstrate that OOmCFP can obtain accurate measures of the functional size of applications generated in MDD environments from the corresponding conceptual models. Marín Campusano, BM. (2011). Functional Size Measurement and Model Verification for Software Model-Driven Developments: A COSMIC-based Approach [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/11237
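    For context on the measurement side, COSMIC assigns one COSMIC Function Point (CFP) to each data movement (Entry, Exit, Read, Write) of a functional process, and the size of an application is the sum over its functional processes. The sketch below illustrates only this counting rule with hypothetical model objects; deriving the data movements from OO-Method conceptual models, which is the core of OOmCFP, is not reproduced here.

```python
from dataclasses import dataclass, field
from typing import List

# COSMIC data-movement types: Entry, eXit, Read, Write (1 CFP each).
MOVEMENT_TYPES = {"E", "X", "R", "W"}

@dataclass
class FunctionalProcess:
    """A functional process with its identified data movements.
    Hypothetical objects for illustration only."""
    name: str
    movements: List[str] = field(default_factory=list)   # e.g. ["E", "R", "X"]

    def cfp(self) -> int:
        """COSMIC rule: functional size = number of data movements."""
        assert all(m in MOVEMENT_TYPES for m in self.movements)
        return len(self.movements)

def application_size(processes: List[FunctionalProcess]) -> int:
    """Application size is the sum over its functional processes."""
    return sum(p.cfp() for p in processes)

# Example: a "create order" process with one Entry, two Writes and one Exit.
create_order = FunctionalProcess("create order", ["E", "W", "W", "X"])
print(create_order.cfp())                 # -> 4 CFP
print(application_size([create_order]))   # -> 4 CFP
```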

    Judgment of the ICJ in the Pulp Mills on the River Uruguay case


    Provision of soil information for biophysical modelling

    This thesis is concerned with the generation of a framework for addressing soil data needs, specifically for biophysical modelling. The soil system is an important ecosystem actor, supporting most of the world's food production and holding the major terrestrial carbon stocks, so information about it is crucial for management and policy making. To provide this information, it is important to deliver information of the highest possible quality; hence the need to define guidelines that standardise not only the methodologies but also the minimum requirements that the information must meet. In this project, providing soil data is addressed in two ways. The first scenario investigates the use of existing soil information to predict other soil properties, using pedotransfer functions (PTFs). In the second scenario, it is assumed that the end-user has no additional information about the soil properties at a specific location; in this case, the use of existing soil maps is the traditional solution, so a framework for generating maps at national/continental scale using digital soil mapping (DSM) techniques is proposed.
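    A pedotransfer function of the kind mentioned above predicts a hard-to-measure soil property from cheaper, routinely measured ones. The sketch below fits a simple linear-regression PTF for bulk density from clay, sand and organic carbon contents; the data, variables and model form are illustrative assumptions rather than a PTF developed in the thesis.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical training data: easily measured properties (clay %, sand %,
# organic carbon %) paired with a lab-measured target (bulk density, g/cm3).
X_train = np.array([
    [22.0, 45.0, 1.8],
    [35.0, 30.0, 2.5],
    [15.0, 60.0, 0.9],
    [28.0, 40.0, 1.4],
])
y_train = np.array([1.32, 1.18, 1.45, 1.29])

# Fit the PTF: a multiple linear regression, the classic form used for
# many published pedotransfer functions.
ptf = LinearRegression().fit(X_train, y_train)

# Apply it where only the cheap properties are available.
new_sites = np.array([[25.0, 50.0, 1.2]])
print(ptf.predict(new_sites))   # estimated bulk density for the new site
```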

    Machine learning to generate soil information

    This thesis is concerned with the novel use of machine learning (ML) methods in soil science research. ML adoption in soil science has increased considerably, especially in pedometrics (the use of quantitative methods to study the variation of soils). In parallel, the size of soil datasets has also increased, thanks to projects of global impact that aim to rescue legacy data and to new large-extent surveys that collect new information. While we have big datasets and global projects, modelling is currently based mostly on "traditional" ML approaches that do not take full advantage of these large data compilations. The compilation of these global datasets is severely limited by privacy concerns, and no solution has yet been implemented to facilitate the process. Considering the performance differences derived from the generality of global models versus the specificity of local models, there is still a debate on which approach is better. Whether global or local, most digital soil mapping (DSM) applications are static. Even with the large soil datasets available to date, there is not enough soil data to perform fully empirical space-time modelling. Considering these knowledge gaps, this thesis aims to introduce advanced ML algorithms and training techniques, specifically deep neural networks, for modelling large datasets at a global scale and providing new soil information. The research presented here has successfully applied the latest advances in ML to improve upon some of the current approaches to soil modelling with large datasets. It has also created opportunities to utilise information, such as descriptive data, that has generally been disregarded. ML methods have been embraced by the soil community and their adoption is increasing. In the particular case of neural networks, their flexibility in terms of structure and training makes them a good candidate for improving on current soil modelling approaches.
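    As a minimal, concrete counterpart to the neural-network modelling discussed above, the sketch below trains a small multilayer perceptron to predict topsoil organic carbon from environmental covariates. The covariates, the synthetic data and the network size are illustrative assumptions; the thesis works with much larger soil datasets and deeper architectures.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical covariates per site: elevation (m), mean annual temperature
# (deg C), mean annual rainfall (mm), NDVI. Target: topsoil organic carbon (%).
# Values are synthetic placeholders, not thesis data.
rng = np.random.default_rng(0)
X = rng.uniform([0, 5, 200, 0.1], [2000, 30, 2500, 0.9], size=(200, 4))
y = 0.002 * X[:, 2] + X[:, 3] - 0.05 * X[:, 1] + rng.normal(0, 0.3, 200)

# A small multilayer perceptron standing in for the deeper architectures
# discussed in the thesis; scaling the inputs matters for neural networks.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0),
)
model.fit(X, y)
print(model.predict(X[:3]))   # predicted organic carbon for the first sites
```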