28 research outputs found

    Bifurcation analysis of Leslie-Gower predator-prey system with harvesting and fear effect

    In this paper, a Leslie-Gower predator-prey system with harvesting and fear effect is considered. The existence and stability of all possible equilibrium points are analyzed. The bifurcation behavior at key equilibrium points is investigated to explore the intrinsic driving mechanisms of population interaction modes. It is shown that the system undergoes various bifurcations, including transcritical, saddle-node, Hopf, and Bogdanov-Takens bifurcations. Numerical simulations show that harvesting and the fear effect can strongly affect the dynamic evolution trend and the mode of coexistence. Furthermore, it is particularly worth pointing out that harvesting not only drives changes in the population coexistence mode, but also acts with a certain degree of delay. Finally, it is anticipated that these results will benefit the further development of predator-prey theory.
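    The abstract does not give the paper's exact equations; as a hedged illustration, one commonly used form of a Leslie-Gower model with a fear factor on prey growth and linear prey harvesting can be simulated as below. All parameter values and the specific functional forms are assumptions for demonstration, not the paper's model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative Leslie-Gower system with fear effect and prey harvesting:
#   dx/dt = r*x/(1 + k*y) - a*x^2 - b*x*y - h*x   (fear factor 1/(1+k*y), harvesting rate h)
#   dy/dt = s*y*(1 - y/(n*x + c))                 (Leslie-Gower predator growth)
r, k, a, b, h = 1.0, 0.5, 0.1, 0.2, 0.1
s, n, c = 0.5, 0.8, 0.1

def rhs(t, z):
    x, y = z
    dx = r * x / (1 + k * y) - a * x**2 - b * x * y - h * x
    dy = s * y * (1 - y / (n * x + c))
    return [dx, dy]

sol = solve_ivp(rhs, (0.0, 200.0), [1.0, 0.5])
x_end, y_end = sol.y[:, -1]
print(f"long-run prey ~ {x_end:.3f}, predator ~ {y_end:.3f}")
```

    Sweeping `h` or `k` in such a sketch is how one would numerically observe the coexistence-mode changes the abstract describes.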

    General Collaborative Filtering for Web Service QoS Prediction

    To avoid expensive and time-consuming evaluation, collaborative filtering (CF) methods have been widely studied for web service QoS prediction in recent years. Among the various CF techniques, matrix factorization is the most popular, and much effort has been devoted to improving matrix factorization collaborative filtering. The key idea of matrix factorization is to assume that the rating matrix is low rank, project users and services into a shared low-dimensional latent space, and make a prediction using the dot product of a user latent vector and a service latent vector. Unfortunately, unlike ratings in recommender systems, QoS usually takes continuous values over a very wide range, and the low-rank assumption may incur high bias. Furthermore, when the QoS matrix is extremely sparse, the low-rank assumption also incurs high variance. To reduce the bias, we must use more expressive model assumptions; to reduce the variance, we can adopt stronger regularization techniques. In this paper, we propose a neural-network-based framework, named GCF (general collaborative filtering), with dropout regularization, to model user-service interactions. We conduct our experiments on a large real-world dataset whose QoS values were obtained from 339 users on 5825 web services. Comprehensive experimental studies show that our approach offers higher prediction accuracy than traditional collaborative filtering approaches.
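    The matrix-factorization baseline the abstract contrasts with is easy to sketch: predictions are dot products of latent vectors. The sizes and random factors below are illustrative (the real matrix is 339 users by 5825 services), not the paper's learned model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_services, d = 6, 8, 3   # toy sizes; d is the latent dimension

# Low-rank assumption: QoS matrix ~ U @ S.T
U = rng.normal(size=(n_users, d))      # user latent vectors
S = rng.normal(size=(n_services, d))   # service latent vectors

def predict(u, s):
    """MF prediction: dot product of a user and a service latent vector."""
    return U[u] @ S[s]

Q_hat = U @ S.T   # full predicted QoS matrix
print(Q_hat.shape)
```

    GCF replaces this fixed dot product with a neural network over the user/service representations, trained with dropout, which relaxes the low-rank bias the abstract criticizes.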

    Multi-Level Bayesian Safety Analysis With Unprocessed Automatic Vehicle Identification Data For An Urban Expressway

    In traffic safety studies, crash frequency modeling of total crashes is the cornerstone before proceeding to more detailed safety evaluation. The relationship between crash occurrence and factors such as traffic flow and roadway geometric characteristics has been extensively explored for a better understanding of crash mechanisms. In this study, a multi-level Bayesian framework was developed to identify crash contributing factors on an urban expressway in the Central Florida area. Two types of traffic data from the Automatic Vehicle Identification system, the processed data capped at the speed limit and the unprocessed data retaining the original speeds, were incorporated in the analysis along with road geometric information. The model framework was proposed to account for the hierarchical data structure and the heterogeneity among the traffic and roadway geometric data. Multi-level and random parameters models were constructed and compared with the Negative Binomial model under the Bayesian inference framework. Results showed that the unprocessed traffic data was superior. Both the multi-level models and the random parameters models outperformed the Negative Binomial model, and the models with random parameters achieved the best model fit. The identified contributing factors imply that on the urban expressway lower speed and higher speed variation could significantly increase crash likelihood. Other significant geometric factors included auxiliary lanes and horizontal curvature.
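    The Negative Binomial baseline mentioned here has a simple likelihood. The sketch below evaluates it for made-up segment counts and illustrative coefficients; actual Bayesian estimation would use a tool such as PyMC or statsmodels, and none of these numbers come from the study.

```python
import numpy as np
from scipy.stats import nbinom

# Schematic NB crash-frequency model: crashes_i ~ NB(mean mu_i, dispersion alpha),
# with mu_i = exp(b0 + b1 * log(AADT_i)). All data and coefficients are illustrative.
crashes = np.array([3, 1, 4, 0, 2])
log_aadt = np.log(np.array([21000, 15000, 30000, 9000, 18000]))

def nb_loglik(params):
    b0, b1, alpha = params
    mu = np.exp(b0 + b1 * log_aadt)
    # scipy parameterization: n = 1/alpha, p = n / (n + mu)
    n = 1.0 / alpha
    p = n / (n + mu)
    return nbinom.logpmf(crashes, n, p).sum()

ll = nb_loglik([-8.0, 0.85, 0.5])
print(f"log-likelihood at illustrative parameters: {ll:.2f}")
```

    The multi-level and random-parameters extensions in the study let `b0`/`b1` vary across segments or corridors instead of being fixed constants.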

    Feasibility Of Incorporating Reliability Analysis In Traffic Safety Investigation

    The feasibility of using reliability analysis methods in traffic safety analyses is investigated. The reliability analysis approach, frequently used to evaluate the probability of failure of a specific structural system, has two main outcomes: the reliability index and the design points. Two approaches to using these two outcomes in traffic safety analyses are presented. Data from a mountainous freeway in Colorado are used. The reliability index is used to evaluate hazardous freeway segments by incorporating the traffic flow parameters provided by radar detectors. The design points are employed to evaluate real-time crash occurrence risk at the disaggregate level with weather parameters. Finally, results of the reliability analysis approach are compared with the results of traditional methods. The reliability analysis method shows promise for application in traffic safety studies. Using the reliability indexes, the three most hazardous freeway segments are identified. Moreover, with the design points, the accuracy rate of predicting crash occurrence is improved by 10.53% as compared with the logistic regression method.
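    For intuition on the reliability index: with a linear limit-state function of independent normal variables, it has a closed form. The numbers and the choice of variables below are invented purely to show the computation, not taken from the Colorado data.

```python
import numpy as np
from scipy.stats import norm

# Limit state g(X) = c0 + c . X; "failure" (a crash-prone condition) when g < 0.
# For independent normal X_i, beta = mean(g) / std(g), and P(failure) = Phi(-beta).
c0 = 5.0
c = np.array([-0.8, -1.2])      # illustrative sensitivities (e.g. speed, speed variation)
mu = np.array([1.0, 0.5])       # means of the standardized traffic variables
sigma = np.array([0.6, 0.4])    # their standard deviations

g_mean = c0 + c @ mu
g_std = np.sqrt(np.sum((c * sigma) ** 2))
beta = g_mean / g_std           # reliability index: larger beta = safer segment
pf = norm.cdf(-beta)            # probability of the crash-prone condition
print(f"beta = {beta:.2f}, failure probability = {pf:.4f}")
```

    Ranking segments by `beta` is the spirit of the hazardous-segment screening described above; the design point is the most likely failure combination of the variables, found by constrained optimization in nonlinear cases.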

    Assessing The Impact Of Reduced Visibility On Traffic Crash Risk Using Microscopic Data And Surrogate Safety Measures

    Due to the difficulty of obtaining accurate real-time visibility and vehicle-based traffic data at the same time, only a few research studies have addressed the impact of reduced visibility on traffic crash risk. This research was conducted based on a new visibility detection system that mounts visibility sensor arrays combined with adaptive learning modules to provide more accurate visibility detections. A vehicle-based detector, the Wavetronix SmartSensor HD, was installed at the same location to collect traffic data. Cases of reduced visibility due to fog were selected and analyzed by comparing them with clear cases to identify differences in several surrogate measures of safety under different visibility classes. Moreover, vehicles were divided into different types, and vehicles in different lanes were compared, in order to identify whether the impact of reduced visibility due to fog on traffic crash risk varies by vehicle type and lane. Log-Inverse Gaussian regression modeling was then applied to explore the relationship between time to collision and visibility together with other traffic parameters. Based on the accurate visibility and traffic data collected by the new detection system, it was concluded that reduced visibility significantly increases traffic crash risk, especially for rear-end crashes, and that the impact on crash risk differs across vehicle types and lanes. The results should help in understanding the change in traffic crash risk and crash contributing factors under fog conditions. We suggest implementing the algorithms in real time and augmenting them with ITS measures such as VSL and DMS to reduce crash risk.
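    Time to collision, the surrogate safety measure modeled above, has a standard definition: the gap to the leader divided by the closing speed, defined only when the follower is faster. A minimal helper (the fog scenario numbers are invented for illustration):

```python
def time_to_collision(gap_m, v_follow, v_lead):
    """TTC in seconds: spacing divided by closing speed; infinite when the
    following vehicle is not closing on the leader."""
    closing = v_follow - v_lead
    if closing <= 0:
        return float("inf")   # no collision course
    return gap_m / closing

# Example: 30 m gap, follower at 25 m/s, leader slowed to 15 m/s in fog.
print(time_to_collision(30.0, 25.0, 15.0))  # 3.0
```

    Lower TTC values indicate higher rear-end crash risk, which is why distributions of TTC under fog versus clear conditions can be compared directly.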

    Modeling and verifying of CPS component services based on hybrid automata

    In recent years, the modeling and verification of Cyber-Physical Systems (CPS) has become an important aspect of CPS research. Because of the complex architecture of CPS, model checking of CPS models may suffer from the state-space explosion problem. Therefore, we offer a method that models CPS with component services. The method treats CPS components as service providers and models their component services to further simplify the system's state space. We verify the correctness of this model and solve the synchronous/asynchronous communication problems. © 2014 SERSC.

    Decomposition of xylene in strong ionization non-thermal plasma at atmospheric pressure

    A large amount of volatile organic compounds (VOCs) produced by industry has caused serious environmental pollution. In this paper, the removal of simulated xylene by strong ionization dielectric barrier discharge (DBD) plasma at atmospheric pressure, together with its degradation mechanism and pathway, was studied. The effects of gas residence time and initial xylene concentration were examined. The results showed that higher voltage caused an increase in discharge power, and with increasing voltage, the concentrations of ozone and nitrogen oxides in the reactor increased. The degradation efficiency decreased from 98.1% to 80.2% when the xylene concentration increased from 50 ppm to 550 ppm at 4 kV, and with the increase of residence time from 0.301 s to 1 s, the degradation efficiency increased from 78.5% to 98.6%. According to GC-MS analysis, the degradation products were ethyl acetate and n-hexylmethylamine at 4 kV, and the main intermediates were 2,4-di-tert-butylphenol, 2-aminopentane, 2-methyl-5-(2-aminopropyl)phenol, and propionamide at 1.5 kV.
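    The reported degradation efficiencies follow the usual inlet/outlet definition. A trivial helper, with an outlet value back-calculated from the abstract's 50 ppm case as a sanity check:

```python
def removal_efficiency(c_in_ppm, c_out_ppm):
    """Degradation efficiency (%) from inlet and outlet concentrations."""
    return 100.0 * (c_in_ppm - c_out_ppm) / c_in_ppm

# 98.1% efficiency at a 50 ppm inlet implies roughly 0.95 ppm at the outlet.
print(round(removal_efficiency(50.0, 0.95), 1))  # 98.1
```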

    Identification of potential biomarkers and pathways for sepsis using RNA sequencing technology and bioinformatic analysis

    Long non-coding RNAs (lncRNAs) have been shown by many studies to play a crucial part in the process of sepsis. To obtain a better understanding of sepsis, its associated molecular biomarkers, and its possible pathogenesis, we obtained data from RNA-sequencing analysis using serum from three sepsis patients and three healthy controls (HCs). Using edgeR (a Bioconductor software package), we identified 1118 differentially expressed mRNAs (DEmRNAs) and 1394 differentially expressed long non-coding RNAs (DElncRNAs) between sepsis patients and HCs. We characterized the biological functions of these dysregulated genes using Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) signaling pathway analyses. The GO analysis showed that homophilic cell adhesion via plasma membrane adhesion molecules was the most significantly enriched category. The KEGG analysis indicated that the differentially expressed genes (DEGs) were most significantly enriched in retrograde endocannabinoid signaling. Using STRING, a protein-protein interaction network was also created, and Cytohubba was used to determine the top 10 hub genes. To examine the relationship between the hub genes and sepsis, we examined three sepsis-relevant datasets from the Gene Expression Omnibus (GEO) database. PTEN and HIST2H2BE were recognized as hub genes in all three datasets (GSE4607, GSE26378, and GSE9692). Receiver operating characteristic (ROC) curves indicate that PTEN and HIST2H2BE have good diagnostic value for sepsis. In conclusion, these two hub genes may be biomarkers for the early diagnosis of sepsis, and our findings should deepen the understanding of its pathogenesis.
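    edgeR is an R package that fits negative-binomial GLMs to read counts; as a schematic Python analogue of the differential-expression step, the toy example below applies the common |log2FC| > 1 and p < 0.05 call to invented log-scale expression values. The data, thresholds as applied here, and test choice are illustrative, not the study's pipeline.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Toy expression matrix: 5 genes x (3 sepsis, 3 healthy) samples, log2 scale.
sepsis = rng.normal(loc=[[8], [5], [6], [7], [4]], scale=0.3, size=(5, 3))
healthy = rng.normal(loc=[[5], [5], [6], [4], [4]], scale=0.3, size=(5, 3))

log2fc = sepsis.mean(axis=1) - healthy.mean(axis=1)
pvals = stats.ttest_ind(sepsis, healthy, axis=1, equal_var=False).pvalue

# Simple DE call; edgeR instead fits a negative-binomial GLM with
# empirical-Bayes dispersion estimates, which suits n = 3 count data better.
de_mask = (np.abs(log2fc) > 1) & (pvals < 0.05)
print("differentially expressed genes:", np.flatnonzero(de_mask))
```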

    A Trajectory Tracking Control Based on a Terminal Sliding Mode for a Compliant Robot with Nonlinear Stiffness Joints

    A nonlinear stiffness actuator (NSA) can achieve high torque/force resolution in the low-stiffness range and high bandwidth in the high-stiffness range. However, due to imperfections of the elastic mechanical component, such as friction, hysteresis, and the unmeasurable energy consumption these cause, it is more difficult for an NSA to achieve accurate position control than for a rigid actuator. Moreover, for a compliant robot with multiple degrees of freedom (DOFs) driven by NSAs, the influence of each NSA on the trajectory of the end effector is different and may even be coupled. Therefore, it is a challenge to implement precise trajectory control on a robot driven by such NSAs. In this paper, a control algorithm based on the Terminal Sliding Mode (TSM) approach is proposed to control the end-effector trajectory of a compliant multi-DOF robot driven by NSAs. This control algorithm reduces the coupling of the driving torques and mitigates the influence of parametric variation. The closed-loop system's finite-time convergence and stability are established mathematically via Lyapunov stability theory. Moreover, under the same experimental conditions, the algorithm's efficacy is verified on the developed compliant robot by comparing the TSM controller against a Proportional-Derivative (PD) controller. The results show that trajectory tracking is more accurate with the TSM controller than with the PD controller.
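    The terminal-sliding-mode idea can be illustrated on a scalar double integrator rather than the paper's robot dynamics: the sliding surface includes a fractional power of the error, which yields finite-time rather than asymptotic convergence. Gains, the plant, and the simple switching law below are illustrative assumptions.

```python
import numpy as np

# TSM on a double integrator e'' = u (tracking-error dynamics):
# surface s = de + k * sign(e) * |e|**(q/p), with 0 < q/p < 1 for
# finite-time convergence of the error e once s = 0 is reached.
k, q, p = 2.0, 3, 5
eta = 5.0                       # reaching-law gain
dt, T = 1e-3, 5.0

e, de = 1.0, 0.0                # initial tracking error and its rate
for _ in range(int(T / dt)):
    s = de + k * np.sign(e) * abs(e) ** (q / p)
    # control: cancel the surface's internal dynamics, then drive s to zero
    if abs(e) > 1e-9:
        u = -k * (q / p) * abs(e) ** (q / p - 1) * de - eta * np.sign(s)
    else:
        u = -eta * np.sign(s)
    de += u * dt
    e += de * dt

print(f"tracking error after {T:.0f} s: {e:.4f}")
```

    The discontinuous `sign(s)` term is what grants robustness to parametric variation, at the cost of chattering; practical controllers often smooth it with a saturation function.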