66 research outputs found

    Neural hypernetwork approach for pulmonary embolism diagnosis

    Background: Hypernetworks are based on topological simplicial complexes and generalize the concept of a two-body relation to a many-body relation. They thereby provide a significant generalization of network theory, enabling the integration of relational structure, logic, and analytic dynamics. A pulmonary embolism is a blockage of the main artery of the lung or one of its branches, and is frequently fatal.

    Results: Our study uses data on 28 diagnostic features of 1427 people considered at risk of pulmonary embolism, enrolled in the Department of Internal and Subintensive Medicine of the Italian national hospital “Ospedali Riuniti di Ancona”. Patients arrived in the department after a first screening performed in the emergency room. The resulting neural hypernetwork correctly recognized 94% of those developing pulmonary embolism, improving on previous results obtained with other methods (statistical selection of features, partial least squares regression, topological data analysis in a metric space).

    Conclusion: In this work we derived a new integrative approach for the analysis of partial and incomplete datasets, based on Q-analysis combined with machine learning. The new approach, called Neural Hypernetwork, has been applied to a case study of pulmonary embolism diagnosis. The novelty of this method is that it does not use clinical parameters extracted from imaging analysis.
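    The Q-analysis underlying this approach relates simplices (here, patients described by sets of diagnostic features) through shared faces. A minimal sketch of the q-nearness measure, with hypothetical feature names not taken from the study:

```python
def q_nearness(simplex_a, simplex_b):
    """Two simplices are q-near if they share a face of dimension
    q = |shared vertices| - 1; -1 means no shared face at all."""
    shared = set(simplex_a) & set(simplex_b)
    return len(shared) - 1

# Illustrative patients as feature sets (names are hypothetical)
patient_1 = {"d_dimer", "tachycardia", "hypoxemia"}
patient_2 = {"d_dimer", "hypoxemia", "chest_pain"}
print(q_nearness(patient_1, patient_2))  # shared face of dimension 1
```

    Chains of such q-near simplices give the connectivity structure that the neural hypernetwork builds on.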

    Review on computational methods for Lyapunov functions

    Lyapunov functions are an essential tool in the stability analysis of dynamical systems, both in theory and in applications. They provide sufficient conditions for the stability of equilibria or more general invariant sets, as well as for their basins of attraction. The necessity, i.e. the existence of Lyapunov functions, has been established in converse theorems; however, these do not provide a general method to compute them. Because of their importance in stability analysis, numerous computational construction methods have been developed within the engineering, informatics, and mathematics communities. They cover different types of systems, such as ordinary differential equations, switched systems, non-smooth systems, and discrete-time systems, and employ different methods, such as series expansion, linear programming, linear matrix inequalities, collocation methods, algebraic methods, set-theoretic methods, and many others. This review brings these different methods together. First, the different types of systems in which Lyapunov functions are used are briefly discussed. In the main part, the computational methods are presented, ordered by the type of method used to construct a Lyapunov function.
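    The sufficient conditions mentioned above can be illustrated on a small example: for a linear system x' = Ax, a candidate V(x) certifies asymptotic stability if V(x) > 0 and dV/dt < 0 away from the origin. A minimal numerical sketch, with A and V chosen purely for illustration:

```python
# Checking the Lyapunov conditions V(x) > 0 and dV/dt < 0 on sample points
# for V(x) = x1^2 + x2^2 and the (illustrative) stable linear system x' = A x.
A = [[-1.0, 0.5],
     [0.0, -2.0]]

def V(x):
    return x[0] ** 2 + x[1] ** 2

def V_dot(x):
    # Along trajectories, dV/dt = 2 x . (A x)
    Ax = [A[0][0] * x[0] + A[0][1] * x[1],
          A[1][0] * x[0] + A[1][1] * x[1]]
    return 2.0 * (x[0] * Ax[0] + x[1] * Ax[1])

# Sample a grid around (but excluding) the origin
samples = [(a / 4, b / 4) for a in range(-8, 9) for b in range(-8, 9)
           if (a, b) != (0, 0)]
assert all(V(x) > 0 for x in samples)
assert all(V_dot(x) < 0 for x in samples)
```

    Sampling only checks the conditions pointwise; the constructive methods surveyed in the review (linear programming, linear matrix inequalities, etc.) instead certify them over whole regions.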

    Hopfield Network as Static Optimizer: Learning the Weights and Eliminating the Guesswork

    This article presents a simulation study validating an adaptation methodology for learning the weights of a Hopfield neural network configured as a static optimizer. The quadratic Lyapunov function associated with the Hopfield network dynamics is leveraged to map the set of constraints associated with a static optimization problem. This approach leads to a set of constraint-specific penalty or weighting coefficients whose values need to be defined. The methodology uses a learning-based approach to set the values of the constraint weighting coefficients through adaptation; these values are in turn used to compute the network weights, effectively eliminating the guesswork in defining weight values for a given static optimization problem, which has been a long-standing challenge in artificial neural networks. The simulation study is performed on the Traveling Salesman Problem from the domain of combinatorial optimization. Simulation results indicate that the adaptation procedure is able to guide the Hopfield network towards solutions of the problem starting from random values for the weights and constraint weighting coefficients. At the conclusion of the adaptation phase, the Hopfield network acquires weight values that readily position the network to search for local minimum solutions. The demonstrated application of the adaptation procedure eliminates the need to guess or predetermine values for the weights of the Hopfield network.
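    In this penalty formulation, each constraint of the optimization problem contributes a quadratic term to the network's energy, scaled by a weighting coefficient. A minimal sketch for TSP-style permutation constraints (each city visited once, one city per tour position); the coefficients A and B stand in for the constraint weighting terms the article adapts, and the values here are illustrative, not those learned in the study:

```python
# Quadratic constraint penalty of a Hopfield-style energy for a tour matrix v,
# where v[i][j] = 1 means city i occupies tour position j.
def constraint_energy(v, A=1.0, B=1.0):
    """Penalty is zero exactly when every row and every column sums to 1,
    i.e. when v is a valid permutation matrix."""
    n = len(v)
    row_pen = sum((sum(v[i]) - 1) ** 2 for i in range(n))
    col_pen = sum((sum(v[i][j] for i in range(n)) - 1) ** 2 for j in range(n))
    return 0.5 * (A * row_pen + B * col_pen)

valid = [[1, 0], [0, 1]]    # a permutation matrix: zero penalty
invalid = [[1, 1], [0, 0]]  # city 0 visited twice, city 1 never
print(constraint_energy(valid), constraint_energy(invalid))
```

    The network dynamics descend this energy, so larger A and B push the state harder toward feasible tours; choosing such coefficients well is precisely the guesswork the article's adaptation procedure aims to eliminate.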