
    Non-parametric Estimation of Stochastic Differential Equations with Sparse Gaussian Processes

    The application of Stochastic Differential Equations (SDEs) to the analysis of temporal data has attracted increasing attention, due to their ability to describe complex dynamics with physically interpretable equations. In this paper, we introduce a non-parametric method for estimating the drift and diffusion terms of SDEs from a densely observed discrete time series. The use of Gaussian processes as priors permits working in a function-space view, and thus the inference takes place directly in this space. To cope with the computational complexity that the use of Gaussian processes entails, a sparse Gaussian process approximation is provided. This approximation permits the efficient computation of predictions for the drift and diffusion terms by using a distribution over a small subset of pseudo-samples. The proposed method has been validated using both simulated data and real data from economics and paleoclimatology. The application of the method to real data demonstrates its ability to capture the behaviour of complex systems.
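    A minimal sketch of the estimation idea on a toy process: Euler-Maruyama increments divided by the step size act as noisy observations of the drift, which are then regressed with a sparse (projected-process) Gaussian process over a small grid of pseudo-inputs. This is a simplified stand-in for the paper's method, not its exact scheme; the Ornstein-Uhlenbeck test process and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an Ornstein-Uhlenbeck process: dx = -theta*x dt + sigma dW
theta, sigma, dt, n = 2.0, 1.0, 0.01, 20000
x = np.empty(n)
x[0] = 0.0
for i in range(n - 1):
    x[i + 1] = x[i] - theta * x[i] * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# Euler-Maruyama view: increments / dt are noisy observations of the drift
X, y = x[:-1], np.diff(x) / dt

# Sparse (projected-process) GP with m pseudo-inputs on a grid
m = 30
Z = np.linspace(X.min(), X.max(), m)

def rbf(a, b, ell=0.5):
    # Squared-exponential kernel between two 1-D input sets
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

noise = sigma ** 2 / dt                 # variance of increments / dt
Kzx = rbf(Z, X)
A = Kzx @ Kzx.T + noise * rbf(Z, Z)     # projected-process system matrix
w = np.linalg.solve(A, Kzx @ y)         # weights over the pseudo-inputs

x_test = np.array([-0.5, 0.0, 0.5])
drift_hat = rbf(x_test, Z) @ w          # should be roughly -theta * x_test
print(drift_hat)
```

The prediction cost is governed by the m pseudo-inputs rather than the full series, which is the computational point of the sparse approximation.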

    Cosmological Bianchi Class A models in Sáez-Ballester theory

    We use the Sáez-Ballester (SB) theory on anisotropic Bianchi Class A cosmological models, with a barotropic fluid and a cosmological constant, using the Hamiltonian or Hamilton-Jacobi approach. Contrary to claims in the specialized literature, it is shown that the Sáez-Ballester theory cannot provide a realistic solution to the dark matter problem of cosmology for the dust epoch without fine tuning, because the contribution of the scalar field in this theory is equivalent to a stiff fluid (as can be seen from the energy-momentum tensor of the scalar field), which evolves differently from the dust component. For the scalar and dust components to contribute comparably today, their past values must have been fine tuned. We therefore reinterpret this null result as an indication that dark matter plays a central role in structure formation and galaxy evolution, with measurable effects in the cosmic microwave background radiation, and that this formalism pushes such effects back to a primordial epoch. We mention that this formalism was recently used in the so-called K-essence theory, applied to the dark energy problem rather than the dark matter problem. We also include a quantization procedure for the theory, which can be simplified by reinterpreting the theory in the Einstein frame, where the scalar field can be treated as part of the matter content; exact solutions to the Wheeler-DeWitt equation are found for the Bianchi Class A cosmological models. Comment: 24 pages; ISBN: 978-953-307-626-3, InTech
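    The fine-tuning argument rests on the standard scaling rho ∝ a^(-3(1+w)) for a barotropic fluid: the stiff component (w = 1) dilutes as a^-6 while dust (w = 0) dilutes as a^-3, so comparable densities today imply a large hierarchy in the past. A toy numerical check (values illustrative):

```python
# Energy density scaling: rho ∝ a^(-3(1+w)); dust has w = 0,
# a stiff fluid (like the SB scalar field) has w = 1.
def rho(a, w, rho0=1.0):
    return rho0 * a ** (-3.0 * (1.0 + w))

# If the two densities are equal today (a = 1), then at a = 1e-3 the
# stiff component was larger by a factor of a^-3 = 1e9.
a_early = 1e-3
ratio = rho(a_early, w=1) / rho(a_early, w=0)
print(ratio)  # ~1e9: equal contributions today require a fine-tuned past
```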

    A visual analytics framework for cluster analysis of DNA microarray data

    Cluster analysis of DNA microarray data is an important but difficult task in knowledge discovery processes. Many clustering methods are applied to the analysis of gene expression data, but none of them is able to deal in an absolute way with the challenges that this technology raises. Because of this, many applications have been developed for visually representing clustering algorithm results on DNA microarray data, usually providing dendrogram and heat map visualizations. Most of these applications focus only on the above visualizations, and do not offer further visualization components to validate the clustering methods or to cross-validate one another. This paper proposes using a visual analytics framework in cluster analysis of gene expression data. Additionally, it presents a new method for finding cluster boundaries based on properties of metric spaces. Our approach presents a set of visualization components able to interact with each other; namely, parallel coordinates, cluster boundary genes, 3D cluster surfaces and DNA microarray visualizations as heat maps. Experimental results have shown that our framework can be very useful in the process of more fully understanding DNA microarray data. The software has been implemented in Java, and the framework is publicly available at http://www.analiticavisual.com/jcastellanos/3DVisualCluster/3D-VisualCluster. This work has been partially funded by the Spanish Ministry of Science and Innovation, the Plan E from the Spanish Government, and the European Union through the ERDF (TIN2009-14057-C03-02).
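    The abstract does not spell out the metric-space boundary criterion, but one plausible nearest-neighbour reading can be sketched as follows: flag as "boundary genes" the points whose nearest neighbour in a foreign cluster is almost as close as their nearest neighbour in their own cluster. The function name, the ratio threshold, and the toy data are hypothetical illustrations, not the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two toy "expression profile" clusters in 5-D
A = rng.normal(0.0, 1.0, size=(50, 5))
B = rng.normal(4.0, 1.0, size=(50, 5))
X = np.vstack([A, B])
labels = np.array([0] * 50 + [1] * 50)

def boundary_points(X, labels, ratio=1.5):
    """Flag points whose nearest foreign-cluster neighbour is at most
    `ratio` times farther away than their nearest same-cluster neighbour."""
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    np.fill_diagonal(D, np.inf)                  # ignore self-distances
    same = labels[:, None] == labels[None, :]
    d_own = np.where(same, D, np.inf).min(1)     # nearest same-cluster point
    d_foreign = np.where(~same, D, np.inf).min(1)  # nearest foreign point
    return d_foreign <= ratio * d_own

mask = boundary_points(X, labels)
print(mask.sum(), "boundary genes out of", len(X))
```

Such a mask could then drive the interactive views (e.g. highlighting boundary genes in the parallel-coordinates and heat-map components).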

    Intersymbol and Intercarrier Interference in OFDM Systems: Unified Formulation and Analysis

    A unified matrix formulation is presented for the analysis of intersymbol and intercarrier interference in orthogonal frequency-division multiplexing (OFDM) systems. The proposed formulation relies on six parameters and allows studying various schemes, including those with windowing in the transmitter and/or in the receiver (called windowed OFDM systems), which may add a cyclic suffix and/or cyclic prefix (CP), besides the conventional CP-OFDM. The proposed framework encompasses seven different OFDM systems. It considers the overlap-and-add procedure performed in the transmitter of windowed OFDM systems, which is formulated jointly with the channel convolution. The intersymbol and intercarrier interference, caused when the order of the channel impulse response is higher than the number of CP samples, is characterized. A new equivalent channel matrix that is useful for calculating both the received signal and the interference power is defined and characterized. Unlike previous works, this new channel matrix has no restrictions on the length of the channel impulse response, which means that the study is not constrained to the particular case of two or three data blocks interfering in the received signal. Theoretical expressions for the powers of three different kinds of interference are derived. These expressions allow calculating the signal-to-interference-plus-noise ratio, which is useful for computing the data rate of each OFDM system. The proposed formulation is applied to realistic examples, showing its effectiveness through comparisons based on numerical performance assessments of the considered OFDM systems.
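    The baseline fact behind the interference analysis can be checked numerically: when the cyclic prefix is at least as long as the channel order, the linear channel convolution becomes circular over each block, the DFT diagonalizes it, and no intersymbol or intercarrier interference appears. A minimal CP-OFDM sketch (parameters illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
N, cp = 64, 16                          # subcarriers, cyclic-prefix length
h = np.array([1.0, 0.5, 0.25])          # channel impulse response, order 2 <= cp

# One CP-OFDM symbol: IFFT, then prepend the last `cp` time samples
X = rng.choice([-1.0, 1.0], size=N)     # BPSK data, one symbol per subcarrier
x = np.fft.ifft(X)
tx = np.concatenate([x[-cp:], x])

# Linear convolution with the channel, then CP removal at the receiver
rx = np.convolve(tx, h)[cp:cp + N]

# With channel order <= cp, the CP turns linear into circular convolution,
# so the DFT diagonalizes the channel: Y[k] = H[k] X[k]
H = np.fft.fft(h, N)
X_hat = np.fft.fft(rx) / H              # one-tap per-subcarrier equalizer
err = np.max(np.abs(X_hat - X))
print(err)                              # numerically zero: no ISI/ICI
```

Raising the channel order above `cp` breaks this diagonalization, which is exactly the regime the paper's equivalent channel matrix is built to quantify.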

    Influence of Activation Parameters on the Mechanical and Microstructure Properties of an Alkali-Activated BOF Steel Slag

    ABSTRACT: Steel slag (SS) is a secondary material from steelmaking production with little commercial value. Its volumetric expansion and low reactivity limit the use of SS in Portland cement (PC)-based materials. This study investigated the potential use of basic oxygen furnace (BOF) slag as a single precursor in alkali-activated matrices (AAMs). Six AAM pastes were assessed by changing the silica modulus (0.75, 1.50 and 2.22) and the sodium concentration (4% or 6% Na2O by weight of SS). The early hydration was assessed using isothermal calorimetry (IC), followed by the assessment of the mechanical performance (compressive strength), apparent porosity, and structure and microstructure characterization (X-ray diffraction, thermogravimetric analysis and scanning electron microscopy). The results indicated that although the BOF slag may be considered a low-reactivity material, the alkaline environment effectively dissolved important crystalline phases to produce hydrates (reaction products). An optimized combination of activator sources was achieved with 4% Na2O and a silica modulus of 1.50–2.22, with a compressive strength of up to 20 MPa, a significant amount of reaction products (C-S-H/C-A-S-H gels), and low initial and cumulative heat release. These properties will help to promote SS recycling in future engineering projects that do not require high-strength materials. This study was financed in part by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior—Brasil (CAPES)—finance code 001, grant PPM-00709-18 (FAPEMIG) and grant 316882/2021-6 (CNPq).

    Comparison of the healthcare system of Chile and Brazil: strengths, inefficiencies, and expenditures

    Background: Governments in Latin America constantly face the problem of managing scarce resources to satisfy alternative needs, such as housing, education, food, and healthcare security. Those needs, combined with increasing crime levels, require financial resources to be solved. Objective: The objective of this review was to characterize the health system and health expenditure of a large country (Brazil) and a small country (Chile) and to identify some of the challenges these two countries face in improving the health services of their populations. Methods: A literature review was conducted by searching journals, databases, and other electronic resources to identify articles and research publications describing the health systems of Brazil and Chile. Results: The review showed that economic restrictions and the economic cycle have an impact on the funding of the public health system. This was true for the Brazilian health system after 2016, despite the change to a unified health system one decade earlier. In the case of Chile, there are different positions on which is the better health system: a dual public-private system or a purely public one. As a result, a new constitution incorporating a unified health system was rejected in a referendum on September 4, 2022. At the same time, the Government ended copayment in the public health system in September 2022, excluding illnesses referred to the private sector. Another issue detected was the fragility of public and private sector coverage due to the lack of funding. Conclusions: The healthcare systems of Chile and Brazil have improved in the last decades. However, the public healthcare systems still need additional funding and efficiency improvements to respond to the growing health needs of the population. © 2022, The Author(s).

    Positive and Negative Evidence Accumulation Clustering for Sensor Fusion: An Application to Heartbeat Clustering

    In this work, a new clustering algorithm especially geared towards merging data arising from multiple sensors is presented. The algorithm, called PN-EAC, is based on the ensemble clustering paradigm and introduces the novel concept of negative evidence. PN-EAC combines both positive evidence, to gather information about the elements that should be grouped together in the final partition, and negative evidence, which carries information about the elements that should not be grouped together. The algorithm has been validated in the electrocardiographic domain for heartbeat clustering, extracting positive evidence from the heartbeat morphology and negative evidence from the distances between heartbeats. The best result obtained on the MIT-BIH Arrhythmia database yielded an error of 1.44%. In the St. Petersburg Institute of Cardiological Technics 12-Lead Arrhythmia Database (INCARTDB), an error of 0.601% was obtained when using two electrocardiogram (ECG) leads. When increasing the number of leads to 4, 6, 8, 10 and 12, the algorithm obtains better results (statistically significant) than with the previous number of leads, reaching an error of 0.338%. To the best of our knowledge, this is the first clustering algorithm that is able to process simultaneously any number of ECG leads. Our results support the use of PN-EAC to combine different sources of information and the value of the negative evidence. This research was funded by the Ministry of Science, Innovation and Universities of Spain, and the European Regional Development Fund of the European Commission, Grant Nos. RTI2018-095324-B-I00, RTI2018-097122-A-I00, and RTI2018-099646-B-I00S
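    The positive/negative evidence idea can be sketched on toy data: many base partitions vote into a co-association matrix (positive evidence), a second source supplies cannot-link pairs whose votes are subtracted (negative evidence), and the final partition is cut from the combined matrix with hierarchical clustering. This is a schematic reconstruction with illustrative weights and toy "sensors", not the authors' exact PN-EAC.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(2)

# Two toy "morphology" clusters in 2-D
a = rng.normal([0, 0], 0.3, size=(20, 2))
b = rng.normal([3, 3], 0.3, size=(20, 2))
X = np.vstack([a, b])
n = len(X)

def kmeans_labels(X, k, rng):
    # Tiny Lloyd's algorithm, enough for this sketch
    C = X[rng.choice(len(X), k, replace=False)]
    for _ in range(20):
        lab = ((X[:, None] - C[None]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (lab == j).any():
                C[j] = X[lab == j].mean(0)
    return lab

# Positive evidence: co-association across many k-means partitions
runs = 30
E = np.zeros((n, n))
for _ in range(runs):
    lab = kmeans_labels(X, k=int(rng.integers(2, 6)), rng=rng)
    E += lab[:, None] == lab[None, :]
E /= runs

# Negative evidence from a hypothetical second sensor: cannot-link pairs
# push their co-association down (0.5 is an illustrative weight)
neg = np.zeros((n, n))
neg[:20, 20:] = neg[20:, :20] = 1.0
E_comb = np.clip(E - 0.5 * neg, 0.0, 1.0)

# Final partition: hierarchical cut of the combined evidence matrix
D = squareform(1.0 - E_comb, checks=False)
final = fcluster(linkage(D, method="average"), t=2, criterion="maxclust")
print(final[:20], final[20:])
```

In the paper's setting, the base partitions would come from per-lead heartbeat morphology and the cannot-link evidence from inter-beat distances; here both sources are synthetic.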