27 research outputs found

    What are the most important factors that influence the changes in London Real Estate Prices? How to quantify them?

    Get PDF
Abstract. In recent years, the real estate industry has captured government and public attention around the world. The factors influencing real estate prices are diverse and complex, and existing studies, limited by the narrowness of their respective perspectives, have not provided a sufficient theoretical basis for explaining house-price fluctuations and their drivers. The purpose of this paper is to build a housing price model that gives a scientific and objective analysis of London's real estate market trends from 1996 to 2016 and to propose countermeasures for keeping house prices under reasonable control. Specifically, the paper analyzes eight factors that affect house prices from the two aspects of housing supply and demand and identifies the factor that matters most for the increase in housing price per square meter. The high degree of multicollinearity among these factors is addressed using principal component analysis.
    Keywords: Real estate market, Real estate price. JEL: L85, R30, R33
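    The paper's own model and data are not reproduced here. As a minimal sketch of the multicollinearity step described in the abstract, the snippet below (with made-up data and hypothetical factor names) standardizes eight candidate supply/demand variables, projects them onto principal components, and regresses price per square meter on the retained component scores.

```python
# Minimal sketch (not the authors' code): principal component regression to
# handle multicollinearity among hypothetical housing supply/demand factors.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 252  # placeholder number of observations, e.g. monthly data 1996-2016

# Hypothetical factor matrix: columns stand in for the paper's eight
# supply/demand variables (interest rate, population, housing starts, ...).
X = rng.normal(size=(n, 8))
X[:, 1] = 0.9 * X[:, 0] + 0.1 * rng.normal(size=n)   # deliberately collinear pair
y = 3000 + 400 * X[:, 0] + 150 * X[:, 2] + 50 * rng.normal(size=n)  # price per m^2

X_std = StandardScaler().fit_transform(X)
pca = PCA(n_components=0.95)        # keep components explaining 95% of variance
Z = pca.fit_transform(X_std)        # orthogonal scores: no multicollinearity left

model = LinearRegression().fit(Z, y)
print("components kept:", pca.n_components_)
print("explained variance ratios:", pca.explained_variance_ratio_.round(3))
print("R^2 on component scores:", round(model.score(Z, y), 3))
```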

    Graph ODE with Factorized Prototypes for Modeling Complicated Interacting Dynamics

    Full text link
This paper studies the problem of modeling interacting dynamical systems, which is critical for understanding physical dynamics and biological processes. Recent research predominantly uses geometric graphs to represent these interactions, which are then captured by powerful graph neural networks (GNNs). However, predicting interacting dynamics in challenging scenarios such as out-of-distribution shift and complicated underlying rules remains unsolved. In this paper, we propose a new approach named Graph ODE with factorized prototypes (GOAT) to address the problem. The core of GOAT is to incorporate factorized prototypes from contextual knowledge into a continuous graph ODE framework. Specifically, GOAT employs representation disentanglement and system parameters to extract both object-level and system-level contexts from historical trajectories, which allows us to explicitly model their independent influence and thus enhances the generalization capability under system changes. Then, we integrate these disentangled latent representations into a graph ODE model, which determines a combination of various interacting prototypes for enhanced model expressivity. The entire model is optimized using an end-to-end variational inference framework to maximize the likelihood. Extensive experiments in both in-distribution and out-of-distribution settings validate the superiority of GOAT.
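    GOAT's architecture (prototypes, disentangled contexts, variational inference) is not reproduced here. As a minimal sketch of the continuous graph ODE idea the abstract builds on, the snippet below (assuming PyTorch and the torchdiffeq package) lets node states evolve under a learned, neighbor-aggregating derivative and integrates them with an off-the-shelf ODE solver.

```python
# Minimal sketch of a continuous graph ODE (not the GOAT architecture itself):
# dz/dt for each node is a learned function of its own state and the mean of
# its neighbors' states, integrated over time with torchdiffeq.
import torch
import torch.nn as nn
from torchdiffeq import odeint  # assumes `pip install torchdiffeq`


class GraphODEFunc(nn.Module):
    def __init__(self, adj: torch.Tensor, dim: int):
        super().__init__()
        # Row-normalized adjacency for simple mean aggregation.
        self.register_buffer("adj", adj / adj.sum(dim=1, keepdim=True).clamp(min=1))
        self.net = nn.Sequential(nn.Linear(2 * dim, 64), nn.Tanh(), nn.Linear(64, dim))

    def forward(self, t, z):                  # z: (num_nodes, dim)
        neighbor_msg = self.adj @ z           # aggregate neighbor states
        return self.net(torch.cat([z, neighbor_msg], dim=-1))


num_nodes, dim = 5, 8
adj = (torch.rand(num_nodes, num_nodes) > 0.5).float()   # toy interaction graph
func = GraphODEFunc(adj, dim)
z0 = torch.randn(num_nodes, dim)                          # latent states at t = 0
t = torch.linspace(0.0, 1.0, steps=20)
trajectory = odeint(func, z0, t)                          # (20, num_nodes, dim)
print(trajectory.shape)
```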

    A Comprehensive Survey on Deep Graph Representation Learning

    Full text link
Graph representation learning aims to effectively encode high-dimensional sparse graph-structured data into low-dimensional dense vectors, a fundamental task that has been widely studied in a range of fields, including machine learning and data mining. Classic graph embedding methods follow the basic idea that the embedding vectors of interconnected nodes should remain relatively close, thereby preserving the structural information between the nodes in the graph. However, this is sub-optimal because: (i) traditional methods have limited model capacity, which limits learning performance; (ii) existing techniques typically rely on unsupervised learning strategies and fail to couple with the latest learning paradigms; and (iii) representation learning and downstream tasks are interdependent and should be jointly enhanced. With the remarkable success of deep learning, deep graph representation learning has shown great potential and advantages over shallow (traditional) methods, and a large number of deep graph representation learning techniques, especially graph neural networks, have been proposed in the past decade. In this work, we conduct a comprehensive survey of current deep graph representation learning algorithms by proposing a new taxonomy of the existing state-of-the-art literature. Specifically, we systematically summarize the essential components of graph representation learning and categorize existing approaches according to their graph neural network architectures and the most recent advanced learning paradigms. Moreover, this survey also covers the practical and promising applications of deep graph representation learning. Last but not least, we state new perspectives and suggest challenging directions that deserve further investigation.
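    As a toy illustration of the classic idea stated above, that interconnected nodes should stay close in embedding space, the sketch below (not taken from any surveyed method) trains free node embeddings with a LINE-style first-order proximity objective: edge endpoints are pulled together and randomly sampled non-edges are pushed apart.

```python
# Toy illustration of the classic graph embedding idea: preserve proximity of
# connected nodes. Edge endpoints get high dot-product scores, random
# non-edges get low scores (a LINE-style first-order objective).
import torch
import torch.nn as nn
import torch.nn.functional as F

edges = torch.tensor([[0, 1], [1, 2], [2, 3], [3, 0]])   # small example graph
num_nodes, dim = 4, 16
emb = nn.Embedding(num_nodes, dim)
opt = torch.optim.Adam(emb.parameters(), lr=0.05)

for step in range(200):
    u, v = edges[:, 0], edges[:, 1]
    neg = torch.randint(0, num_nodes, (edges.size(0),))   # negative samples
    pos_score = (emb(u) * emb(v)).sum(-1)
    neg_score = (emb(u) * emb(neg)).sum(-1)
    # Logistic loss: pull edge endpoints together, push non-edges apart.
    loss = -(F.logsigmoid(pos_score).mean() + F.logsigmoid(-neg_score).mean())
    opt.zero_grad()
    loss.backward()
    opt.step()

print("embedding of node 0 (first 4 dims):", emb.weight[0].detach()[:4])
```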

    Improved Deadbeat Predictive Control Based Current Harmonic Suppression Strategy for IPMSM

    No full text
When an interior permanent magnet synchronous motor (IPMSM) is running, the stator current contains abundant harmonics. To suppress these current harmonics, this paper studies a current harmonic extraction method and a current harmonic controller. Firstly, a simple and accurate method for extracting current harmonics is proposed by means of a multiple synchronous rotating frame transformation (MSRFT). Secondly, an improved deadbeat predictive control (IDPC) based current harmonic controller is designed after analyzing the advantages and disadvantages of traditional current harmonic controllers. Thirdly, an IDPC-based current harmonic suppression strategy is proposed by combining the proposed extraction method and controller. The strategy remains effective at low, medium, and high speeds, even when the controller parameters are mismatched with the motor parameters. Finally, the feasibility and effectiveness of the proposed strategy are verified by simulation and experiments.
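    The paper's IDPC controller and MSRFT extraction are not reproduced here. The sketch below shows only the textbook deadbeat step that IDPC refines: from a discretized d-q current model of the machine, solve for the voltage command that nominally drives the current to its reference within one sampling period. All machine parameters are placeholders, not values from the paper.

```python
# Minimal deadbeat predictive current control step for a PMSM in the d-q frame
# (textbook form, not the paper's IDPC controller). Discretized model per axis:
#   i[k+1] = i[k] + Ts/L * (v[k] - R*i[k] - e[k])
# Setting i[k+1] equal to the reference and solving for v[k] gives the
# "deadbeat" voltage command.
import numpy as np

# Placeholder machine parameters (assumptions, not from the paper).
Rs = 0.05                  # stator resistance [ohm]
Ld, Lq = 1.2e-3, 2.0e-3    # d/q-axis inductances [H]
psi_f = 0.1                # permanent magnet flux linkage [Wb]
Ts = 1e-4                  # sampling period [s]

def deadbeat_voltage(i_dq, i_ref, omega_e):
    """Voltage command that nominally reaches i_ref in one sample."""
    id_, iq = i_dq
    # Cross-coupling / back-EMF terms of the d-q model.
    e_d = -omega_e * Lq * iq
    e_q = omega_e * (Ld * id_ + psi_f)
    vd = Ld * (i_ref[0] - id_) / Ts + Rs * id_ + e_d
    vq = Lq * (i_ref[1] - iq) / Ts + Rs * iq + e_q
    return np.array([vd, vq])

v_cmd = deadbeat_voltage(i_dq=(0.0, 5.0), i_ref=(0.0, 6.0), omega_e=2 * np.pi * 100)
print("deadbeat voltage command [vd, vq]:", v_cmd.round(2))
```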

    Influence of fishery management on trophic interactions and biomass fluxes in Lake Taihu based on a trophic mass-balance model exercise on a long-term data series

    No full text
With increasing anthropogenic activities, freshwater ecosystems around the world are becoming increasingly affected by various pressures, including eutrophication, overfishing, and irrational stocking, which may have a negative impact on the food web structure. Despite the extensive research on and proposed management measures for eutrophic lakes, there are only a few analyses of long-term monitoring data on fishery resources, and there is a lack of evaluation and prediction of the effectiveness of current fish management policies. To remedy this, we analyzed long-term monitoring data from Lake Taihu, China, a severely eutrophic lake with a skewed fish size structure dominated by small individuals. We first constructed 14 Ecopath models to investigate how trophic interactions and biomass fluxes changed from 2007 to 2020. Subsequently, the Ecosim model was used to predict how the biomass of fish and the ecosystem network respond to the recently initiated 10-year fishing ban. Our results demonstrate long-term changes in fish biomass and ecosystem stability. The analyses revealed that 1) the biomass development of different feeding types of fish is controlled by human activities (mainly catches and stocking) and trophic interactions, and 2) the rate of decline in ecosystem network stability slows down during the fishing ban. The primary focus of this study was to fill the gap in long-term serial studies of fish monitoring data and ecosystem stability in the lake and, for the first time, to predict the outcome of the fishing ban from an ecosystem perspective using the Ecosim model. Overall, our results emphasize the importance of rational stocking and fishing policies and provide a better understanding of the changes in the ecological dynamics of Lake Taihu that are relevant for the management and restoration of the lake.
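    Ecopath with Ecosim is an established modeling package, and the authors' 14 Taihu models are not reproduced here. As a rough illustration of the mass balance the abstract refers to, the snippet below evaluates the standard Ecopath master equation, production used = fishery catch + predation by all consumers, for a made-up three-group food web.

```python
# Rough illustration (made-up numbers, not the paper's Taihu models) of the
# Ecopath mass balance checked for each functional group i:
#   B_i * (P/B)_i * EE_i = Y_i + sum_j B_j * (Q/B)_j * DC_ji
# i.e. production that is "used" = fishery catch + predation by all consumers.
import numpy as np

groups = ["phytoplankton", "zooplankton", "planktivorous fish"]
B  = np.array([50.0, 8.0, 2.0])     # biomass (t/km^2)
PB = np.array([80.0, 25.0, 1.2])    # production/biomass (1/yr)
QB = np.array([0.0, 80.0, 6.0])     # consumption/biomass (1/yr); producers consume nothing
Y  = np.array([0.0, 0.0, 0.8])      # fishery catch (t/km^2/yr)
# Diet composition DC[j, i]: fraction of predator j's diet made up of prey i.
DC = np.array([
    [0.0, 0.0, 0.0],   # phytoplankton eats nothing
    [1.0, 0.0, 0.0],   # zooplankton eats phytoplankton
    [0.0, 1.0, 0.0],   # planktivorous fish eats zooplankton
])

predation_on = (B * QB) @ DC         # total consumption of each prey group
EE = (Y + predation_on) / (B * PB)   # ecotrophic efficiency, should be <= 1
for g, ee in zip(groups, EE):
    print(f"{g}: EE = {ee:.2f}")
```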

    Potential of Staphylea holocarpa Wood for Renewable Bioenergy

    No full text
Energy is indispensable to human life and social development, but growing demand has led to the overconsumption of non-renewable energy, and sustainable energy is needed to maintain the global energy balance. Lignocellulose from agriculture or forestry is often discarded or directly incinerated, yet it is abundantly available and worth studying as a biomass energy source. This research therefore uses Staphylea holocarpa wood as a feedstock to evaluate its potential as an energy source. We characterized Staphylea holocarpa wood using FT–IR, GC–MS, TGA, Py/GC–MS and NMR. The results showed that Staphylea holocarpa wood contains a large amount of oxygenated volatiles, indicating its potential as a biomass energy source compatible with green chemistry and sustainable development.
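    No data from the paper are reproduced here. As a hypothetical illustration of how a TGA curve of the kind mentioned above is typically summarized, the snippet below reports the mass lost in conventional temperature windows (moisture, volatiles, residual char) for a synthetic thermogram.

```python
# Hypothetical illustration (synthetic curve, no data from the paper): summarize
# a TGA run by the mass lost in conventional temperature windows, a common way
# to compare moisture, volatile, and residual char fractions of feedstocks.
import numpy as np

temps = np.linspace(30, 800, 200)                       # temperature [deg C]
mass = 100 - 75 / (1 + np.exp(-(temps - 330) / 40))     # synthetic mass curve [%]

def loss_between(t_lo, t_hi):
    """Percent of initial mass lost between two temperatures."""
    m_lo, m_hi = np.interp([t_lo, t_hi], temps, mass)
    return m_lo - m_hi

print("moisture  (< 150 C):    %.1f %%" % loss_between(30, 150))
print("volatiles (150-500 C):  %.1f %%" % loss_between(150, 500))
print("residue at 800 C:       %.1f %%" % np.interp(800, temps, mass))
```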

Characterization of atmospheric organic carbon and elemental carbon

    No full text
ABSTRACT Concentrations of organic carbon (OC) and elemental carbon (EC) in atmospheric particles were measured in Tianjin during January, April, July and October 2008. The 24-h PM2.5 (particles with aerodynamic diameters less than 2.5 µm) and PM10 (particles with aerodynamic diameters less than 10 µm) samples were collected simultaneously every day during the sampling periods. These samples were analyzed for OC/EC by thermal/optical reflectance (TOR) following the Interagency Monitoring of Protected Visual Environments (IMPROVE) protocol. The annual average concentrations were 109.8 ± 48.5 µg/m³ in PM2.5 and 196.2 ± 86.1 µg/m³ in PM10. The average PM2.5/PM10 ratio was 57.9%, indicating that PM2.5 was one of the main pollutants affecting urban atmospheric environmental quality in Tianjin. The concentrations of OC and EC in PM2.5 and PM10 were all relatively higher in winter and fall and lower in summer and spring. This seasonal variation could be attributed to the combined effects of changes in emission rates and seasonal meteorological conditions. The annual average concentration of the estimated secondary organic carbon (SOC) was 14.9 µg/m³, accounting for 61.7% of the total OC in PM2.5, while the corresponding values in PM10 were 23.4 µg/m³ and 61.2%, respectively, indicating that SOC was an important contributor to organic aerosol in Tianjin. The distribution of eight carbon fractions (OC1, OC2, OC3, OC4, EC1, EC2, EC3 and OP) was also reported, and the results show that biomass burning, coal combustion and motor-vehicle exhaust all contributed to the carbonaceous particles in Tianjin.
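    The abstract does not state how SOC was estimated, so the sketch below is an illustration only: the EC-tracer method, SOC = OC − EC × (OC/EC)min, is one common way to split measured OC into primary and secondary fractions, shown here on synthetic data rather than the Tianjin measurements.

```python
# Illustration only: the EC-tracer method for estimating secondary organic
# carbon, SOC = OC - EC * (OC/EC)_min, applied to synthetic daily data
# (not the Tianjin measurements, and not necessarily the paper's method).
import numpy as np

rng = np.random.default_rng(1)
EC = rng.uniform(2, 10, size=30)                # synthetic EC, ug/m^3
OC = 2.0 * EC + rng.uniform(0, 15, size=30)     # primary OC ~ 2*EC plus secondary OC

ratio_min = np.min(OC / EC)                     # proxy for the primary OC/EC ratio
POC = EC * ratio_min                            # estimated primary organic carbon
SOC = OC - POC                                  # estimated secondary organic carbon

print(f"(OC/EC)_min = {ratio_min:.2f}")
print(f"mean SOC = {SOC.mean():.1f} ug/m^3, "
      f"{100 * SOC.sum() / OC.sum():.0f}% of total OC")
```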

    GLCC: A General Framework for Graph-Level Clustering

    No full text
This paper studies the problem of graph-level clustering, a novel yet challenging task. This problem is critical in a variety of real-world applications such as protein clustering and genome analysis in bioinformatics. Recent years have witnessed the success of deep clustering coupled with graph neural networks (GNNs). However, existing methods focus on clustering the nodes of a single graph, while clustering across multiple graphs remains under-explored. In this paper, we propose a general graph-level clustering framework named Graph-Level Contrastive Clustering (GLCC) for multiple graphs. Specifically, GLCC first constructs an adaptive affinity graph to explore instance- and cluster-level contrastive learning (CL). Instance-level CL leverages a graph Laplacian based contrastive loss to learn clustering-friendly representations, while cluster-level CL captures discriminative cluster representations by incorporating neighbor information of each sample. Moreover, we utilize neighbor-aware pseudo-labels to reinforce the optimization of representation learning. The two steps can be alternately trained to collaborate and benefit each other. Experiments on a range of well-known datasets demonstrate the superiority of our proposed GLCC over competitive baselines.
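    GLCC's exact objectives are not reproduced here. As a generic sketch of the instance-level contrastive learning building block the abstract mentions, the snippet below computes an NT-Xent loss over graph-level embeddings of two augmented views: matching views are positives and all other graphs in the batch are negatives.

```python
# Generic NT-Xent instance-level contrastive loss over graph-level embeddings
# (a standard building block, not GLCC's exact objective). z1[i] and z2[i] are
# embeddings of two augmented views of graph i.
import torch
import torch.nn.functional as F

def nt_xent(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5):
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)    # (2n, d), unit norm
    sim = z @ z.t() / temperature                         # cosine similarities
    sim.fill_diagonal_(float("-inf"))                     # exclude self-pairs
    # Positive for row i is row (i + n) mod 2n: the other view of the same graph.
    targets = torch.arange(2 * n, device=z.device).roll(n)
    return F.cross_entropy(sim, targets)

z1 = torch.randn(8, 32)   # e.g. GNN readout of 8 graphs, view 1
z2 = torch.randn(8, 32)   # same graphs, view 2
print("contrastive loss:", nt_xent(z1, z2).item())
```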