23,237 research outputs found

    Pathophysiology, histopathology and therapeutic of SARS-CoV-2

    The rapid worldwide transmission of SARS-CoV-2 has driven scientists to understand the histopathology of the virus and to find an effective drug. However, many points associated with the virus's pathogenicity are still unknown and need further study. In this chapter the pathophysiology, histopathology and therapeutics of SARS-CoV-2 are reviewed. The pathogenicity of SARS-CoV-2 appears to be tied to its viral genome structure, which acts by blocking the host's innate immune response. Chloroquine and hydroxychloroquine have a similar structure and mechanism of action, and they are among the most effective antivirals for treating patients with SARS-CoV-2. Chloroquine works by inhibiting the intracellular organism by increasing the pH

    MaaSim: A Liveability Simulation for Improving the Quality of Life in Cities

    Urbanism is no longer planned on paper thanks to powerful models and 3D simulation platforms. However, current work is not open to the public and lacks an optimisation agent that could help in decision making. This paper describes the creation of an open-source simulation based on an existing Dutch liveability score with a built-in AI module. Features are selected using feature engineering and Random Forests. Then, a modified scoring function is built based on the former liveability classes. The score is predicted using Random Forest for regression and achieved a recall of 0.83 with 10-fold cross-validation. Afterwards, Exploratory Factor Analysis is applied to select the actions present in the model. The resulting indicators are divided into 5 groups, and 12 actions are generated. The performance of four optimisation algorithms is compared, namely NSGA-II, PAES, SPEA2 and eps-MOEA, on three established criteria of quality (cardinality, the spread of the solutions, and spacing), as well as on the resulting score and number of turns. Although all four algorithms show different strengths, eps-MOEA is selected as the most suitable for this problem. Ultimately, the simulation incorporates the model and the selected AI module in a GUI written in the Kivy framework for Python. Tests performed on users show positive responses and encourage further initiatives towards joining technology and public applications. Comment: 16 page
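
    As a rough illustration of the score-prediction step described above, the sketch below fits a Random Forest regressor with 10-fold cross-validation on synthetic data; the feature matrix, target and parameters are placeholders, not the Dutch liveability dataset or the paper's actual pipeline.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)
    X = rng.normal(size=(500, 12))                                  # 12 engineered neighbourhood features (synthetic)
    y = X @ rng.normal(size=12) + rng.normal(scale=0.1, size=500)   # synthetic liveability score

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    scores = cross_val_score(model, X, y, cv=10, scoring="r2")
    print(f"10-fold CV R^2: {scores.mean():.3f} +/- {scores.std():.3f}")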

    A Process to Implement an Artificial Neural Network and Association Rules Techniques to Improve Asset Performance and Energy Efficiency

    In this paper, we address the problem of asset performance monitoring, with the intention of both detecting any potential reliability problem and predicting any loss of energy consumption efficiency. This is an important concern for many industries and utilities with very intensive capitalization in very long-lasting assets. To overcome this problem, in this paper we propose an approach to combine an Artificial Neural Network (ANN) with Data Mining (DM) tools, specifically with Association Rule (AR) Mining. The combination of these two techniques can now be done using software which can handle large volumes of data (big data), but the process still needs to ensure that the required amount of data will be available during the assets’ life cycle and that its quality is acceptable. The combination of these two techniques in the proposed sequence differs from previous works found in the literature, giving researchers new options to face the problem. Practical implementation of the proposed approach may lead to novel predictive maintenance models (emerging predictive analytics) that may detect with unprecedented precision any asset’s lack of performance and help manage assets’ O&M accordingly. The approach is illustrated using specific examples where asset performance monitoring is rather complex under normal operational conditions. Ministerio de Economía y Competitividad DPI2015-70842-
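
    A minimal sketch of the kind of two-step combination described above, assuming a neural network first flags under-performing records and simple association rules are then mined over the flagged records; all column names, thresholds and data are hypothetical, and this is not the paper's implementation.

    import numpy as np
    import pandas as pd
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    sensors = pd.DataFrame(rng.normal(size=(1000, 4)),
                           columns=["temp", "vibration", "load", "power"])
    fault = (sensors["temp"] + sensors["vibration"] > 1.5).astype(int)   # synthetic performance label

    # Step 1: the ANN learns to flag records with degraded performance.
    ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
    ann.fit(sensors, fault)
    flagged = sensors[ann.predict(sensors) == 1]

    # Step 2: mine simple pairwise association rules over the flagged records.
    items = (flagged > sensors.median()).add_prefix("high_")
    for a in items.columns:
        for b in items.columns:
            if a == b:
                continue
            support = (items[a] & items[b]).mean()
            confidence = (items[a] & items[b]).sum() / max(items[a].sum(), 1)
            if support >= 0.3 and confidence >= 0.7:
                print(f"{a} -> {b}: support={support:.2f}, confidence={confidence:.2f}")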

    Regional population expenditure for foodstuffs in the Russian Federation: componential and cluster analyses

    The article addresses the problem of conducting component and cluster analyses of population expenditure on food as one of the most important components of the standard of living. The purpose of the analysis is to develop regional clusters of the Russian Federation that vary in the structure of household expenditure on foodstuffs. The foodstuffs are presented in absolute units, taking integral account of the standard of living index. Methods of intellectual analysis, namely component and cluster analyses, are applied as the research methods. A procedure for intellectual data analysis based on the interconnected performance of component and cluster analyses is proposed. The procedure considers the interrelation between the results obtained by the different methods, as well as the possibility of returning to the previous method and repeating the analysis in order to consistently refine the composition of the clusters. A few clusters of wealthy regions characterized by high and average levels of expenditure on foodstuffs are revealed, as well as quite a few clusters of less wealthy and not wealthy regions characterized by a low level of expenditure on foodstuffs. It is shown that growth in the standard of living, characterized by the size of gross regional product per capita, is accompanied by growth of the Gini coefficient, which indicates both inequality of income distribution and a reduction in expenditure on low-value foodstuffs. The results of the analysis can be applied to the development of a decision-making support system intended for analysing scenarios of macroeconomic regulation in the field of income policy for the purpose of raising the standard of living of the population. The analysis of population expenditure on foodstuffs has made it possible to reveal the cluster structure of the regions of the Russian Federation, to present it according to generalized indicators, and to formulate the specific characteristics of the regional clusters and the related management decisions
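
    A minimal sketch of the interconnected two-stage procedure described above, assuming principal component analysis as the component step and k-means as the clustering step; the expenditure matrix and the number of clusters are invented for illustration and do not reproduce the article's data.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    expenditure = rng.gamma(shape=2.0, scale=50.0, size=(80, 10))   # 80 regions x 10 food groups (synthetic)

    # Component analysis: reduce correlated expenditure items to a few components.
    components = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(expenditure))

    # Cluster analysis: group regions by their component scores; one can return to the
    # previous step and re-run with different settings if the composition looks inconsistent.
    labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(components)
    for k in range(5):
        print(f"cluster {k}: {np.sum(labels == k)} regions")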

    Evaluation of the NDICEA model

    Within the N-Toolbox project the NDICEA nitrogen model, one of the key tools in the virtual Toolbox, has been improved and tested in England, Denmark and Spain. Model performance was evaluated on datasets from these three countries by means of visual inspection and the RMSE and RSR of the soil nitrogen dynamics. In England the scenarios with organic fertilizer performed better than those with artificial fertilizer, leading to the suggestion that the calculated nitrogen release from fertilizer could be improved. The timing of soil sampling for soil inorganic nitrogen is important for a good model evaluation; two samples only, before sowing and after harvest, are not enough. When soil mineral nitrogen samples were taken during crop growth, model calculations and measured values sometimes showed large differences. It is suggested to improve the plant nitrogen uptake sub-model. In the Danish dataset the soil mineral N of the topsoil was well described, but that of the subsoil was not. This might be caused by the depth of the subsoil, which was up to 2.5 meters. The model performance could be improved by introducing a multi-layer soil sub-model instead of the current two-layer soil sub-model. Spain, with its different climatic and soil conditions, needed an adaptation of the evapotranspiration calculation and a calibration of the scenarios to reach acceptable model performance. If more Spanish datasets were studied, the NDICEA model could be enriched with standard Spanish soils and evapotranspiration data. To improve the model further, equations from the EU-ROTATE_N model are used to describe root growth and nitrogen uptake in more detail
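
    A worked sketch of the two evaluation statistics mentioned above, using their common definitions: RMSE, and RSR as the RMSE divided by the standard deviation of the observations. The measured and simulated values below are placeholders, not data from the evaluation.

    import numpy as np

    observed = np.array([42.0, 55.0, 61.0, 38.0, 47.0])    # measured soil mineral N (kg/ha), placeholder
    simulated = np.array([40.0, 58.0, 57.0, 41.0, 50.0])   # model output, placeholder

    rmse = np.sqrt(np.mean((observed - simulated) ** 2))
    rsr = rmse / np.std(observed, ddof=1)                  # RMSE-observations standard deviation ratio
    print(f"RMSE = {rmse:.2f} kg N/ha, RSR = {rsr:.2f}")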

    VM-MAD: a cloud/cluster software for service-oriented academic environments

    The availability of powerful computing hardware in IaaS clouds makes cloud computing attractive also for computational workloads that were until now run almost exclusively on HPC clusters. In this paper we present the VM-MAD Orchestrator software: an open-source framework for cloudbursting Linux-based HPC clusters into IaaS clouds and computational grids. The Orchestrator is completely modular, allowing flexible configurations of cloudbursting policies. It can be used with any batch system or cloud infrastructure, dynamically extending the cluster when needed. A distinctive feature of our framework is that the policies can be tested and tuned in a simulation mode based on historical or synthetic cluster accounting data. In the paper we also describe how the VM-MAD Orchestrator was used in a production environment at the FGCZ to speed up the analysis of mass spectrometry-based protein data by cloudbursting to Amazon EC2. The advantages of this hybrid system are shown with a large evaluation run using about one hundred large EC2 nodes. Comment: 16 pages, 5 figures. Accepted at the International Supercomputing Conference ISC13, June 17--20 Leipzig, German
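
    To convey the idea of testing a cloudbursting policy in simulation mode, the toy sketch below replays a synthetic trace of pending-job counts and starts or stops cloud nodes against fixed thresholds; the thresholds, trace and policy interface are invented and do not correspond to VM-MAD's actual API.

    # Synthetic trace of pending jobs per accounting interval.
    queue_trace = [2, 5, 9, 14, 11, 7, 3, 1, 0, 0]
    START_THRESHOLD, STOP_THRESHOLD = 8, 2
    cloud_nodes = 0

    for t, pending in enumerate(queue_trace):
        if pending > START_THRESHOLD:
            cloud_nodes += 1            # burst: request one more IaaS instance
        elif pending < STOP_THRESHOLD and cloud_nodes > 0:
            cloud_nodes -= 1            # shrink: release an idle instance
        print(f"t={t}: pending={pending}, cloud nodes={cloud_nodes}")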

    An evolutionary behavioral model for decision making

    For autonomous agents the problem of deciding what to do next becomes increasingly complex when acting in unpredictable and dynamic environments while pursuing multiple and possibly conflicting goals. One of the most relevant behavior-based models that tries to deal with this problem is the one proposed by Maes, the Behavior Network model. This model proposes a set of behaviors as purposive perception-action units which are linked in a non-hierarchical network, and whose behavior selection process is orchestrated by spreading activation dynamics. In spite of being an adaptive model (in the sense of self-regulating its own behavior selection process), and despite the fact that several extensions have been proposed in order to improve the original model's adaptability, there is not yet a robust model that can adaptively self-modify both the topological structure and the functional purpose of the network as a result of the interaction between the agent and its environment. Thus, this work proffers an innovative hybrid model driven by gene expression programming, which makes two main contributions: (1) given an initial set of meaningless and unconnected units, the evolutionary mechanism is able to build well-defined and robust behavior networks which are adapted and specialized to the agent's concrete internal needs and goals; and (2) the same evolutionary mechanism is able to assemble quite complex structures such as deliberative plans (which operate in the long term) and problem-solving strategies
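
    A toy sketch in the spirit of the spreading-activation selection described above: goals inject activation, links between behaviours propagate it, and the most activated executable behaviour is selected at each step. The network, weights, decay and normalization are illustrative simplifications, not Maes' full model or the paper's extension.

    import numpy as np

    behaviors = ["explore", "approach_food", "eat"]
    # links[i, j]: activation flowing from behaviour i to behaviour j
    links = np.array([[0.0, 0.5, 0.0],
                      [0.0, 0.0, 0.6],
                      [0.0, 0.0, 0.0]])
    activation = np.array([0.2, 0.1, 0.0])
    goal_input = np.array([0.0, 0.0, 1.0])        # the goal feeds activation into "eat"
    executable = np.array([True, True, False])    # "eat" is not executable until food is reached

    for step in range(5):
        activation = 0.9 * activation + goal_input + links.T @ activation
        activation /= activation.sum()            # keep total activation bounded
        candidates = np.where(executable, activation, -np.inf)
        print(f"step {step}: selected = {behaviors[int(np.argmax(candidates))]}")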

    A FIELD-PROGRAMMABLE GATE ARRAY IMPLEMENTATION OF A COGNITIVE RADAR TARGET RECOGNITION SYSTEM

    The objective of this study is to design a field-programmable gate array (FPGA) implementation of a cognitive radar (CRr) target recognition system for electronic warfare applications. This thesis expands on the closed-loop adaptive matched waveform transmission technique called probability of weighted energy (PWE). This work also investigates the feasibility of applying the PWE technique in a functional digital hardware realization. Initially, a PWE Monte Carlo simulation model is developed in the Verilog hardware description language that is simulated in the Xilinx Vivado environment. The Verilog module components developed in the Monte Carlo model are then incorporated into a CRr target recognition system experiment utilizing the Xilinx VCU118 Evaluation Board. The VCU118 features the Virtex UltraScale+ high-performance FPGA to accomplish CRr adaptive waveform generation and transmission, digital signal processing requirements, and target classification. The Rohde & Schwarz SMW200A Vector Signal Generator and FSW Signal & Spectrum Analyzer function as the radar system transmitter and receiver, respectively, while the FPGA implementation enables the closed feedback loop used by the CRr. Lieutenant Commander, United States Navy. Approved for public release; distribution is unlimited
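
    Purely as a software analogy (in Python, not Verilog) to the Monte Carlo simulation step described above, the sketch below estimates the probability of correctly classifying which of several known target signatures produced a noisy echo, using maximum correlation; the signatures, noise level and classifier are invented placeholders and are not the PWE technique from the thesis.

    import numpy as np

    rng = np.random.default_rng(7)
    n, n_targets, trials, noise_std = 64, 3, 2000, 0.8
    targets = rng.normal(size=(n_targets, n))                      # candidate target signatures (synthetic)
    targets /= np.linalg.norm(targets, axis=1, keepdims=True)

    correct = 0
    for _ in range(trials):
        true_id = rng.integers(n_targets)
        echo = targets[true_id] + noise_std * rng.normal(size=n)   # noisy return from the true target
        scores = targets @ echo                                    # correlate against each known signature
        correct += int(np.argmax(scores) == true_id)

    print(f"estimated P(correct classification) = {correct / trials:.3f}")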