402 research outputs found

    Advanced Methods For Small Signal Stability Analysis And Control In Modern Power Systems

    This thesis explores issues in power system security analysis, particularly those arising under an open-access, deregulated environment. Numerical methods and computational algorithms are proposed for locating critical security condition points and visualizing the security hyper-plane in the parameter space. The power industry is undergoing changes leading to restructuring and privatization in many countries. This restructuring consists of changing the power industry from a regulated and vertically integrated form into regional, competitive and functionally separate entities, with the prospect of increasing efficiency through better management and better use of existing equipment, and of lowering the price of electricity to all types of customers while maintaining a reliable system. As a result of deregulation and restructuring, power suppliers will increasingly try to deliver more energy to customers using existing system facilities, thereby putting the system under heavy stress. Accordingly, many technical and economic issues have arisen; for example, transient instability, aperiodic and oscillatory instability, insufficient reactive power supply, and even voltage collapse may coexist. This situation creates a need for comprehensive analytical tools to assess system security conditions and to provide optimal control strategies to overcome these problems. Computational techniques exist for assessing critical stability conditions in given loading directions, but a few critical points in the parameter space are not enough to formulate an optimal control that avoids insecurity. A boundary or hyper-plane containing all such critical and subcritical security condition points provides a comprehensive picture of the power system's operational situation and can therefore be used to derive a globally optimal control action to enhance system security. With the security boundary or hyper-plane available, system operators can keep the power system inside the security boundaries, away from instability, and enhance its security in an optimal way. Based on proper power system modelling, a general method is proposed to locate the power system small signal stability characteristic points, which include load flow feasibility points, aperiodic and oscillatory stability points, and minimum and maximum damping points. Numerical methods for tracing the power system bifurcation boundaries are proposed to overcome nonconvexity and provide an efficient parameter continuation approach for tracing stability boundaries of interest. A Delta-plane method for visualizing the power system load flow feasibility and bifurcation boundaries is also proposed. The optimization problem of assessing the minimal distance from an operating point to the boundaries is then considered, with emphasis on computing all locally minimal distances and the global minimum distance. Due to the complexity of any power system, traditional optimization techniques sometimes fail to locate the global optimal solutions that are essential to power system security analysis; genetic algorithms, due to their robustness and loose problem prerequisites, are shown to fulfill the task satisfactorily. Finally, a toolbox that incorporates all the proposed techniques, and which is being developed for power system stability assessment and enhancement analysis, is described.
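    The closing idea, using an evolutionary optimiser to find the globally minimal distance from an operating point to a possibly nonconvex stability boundary, can be illustrated with a minimal Python sketch. Everything below is hypothetical: the boundary function g, the operating point p0 and the penalty weight stand in for the real power system model, and SciPy's differential evolution is used merely as a readily available evolutionary optimiser in the spirit of the genetic algorithm described in the abstract.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical stand-in for a small-signal stability boundary g(p) = 0 in a
# 2-D parameter space; a real boundary would come from the power system model.
def g(p):
    p1, p2 = p
    return p1**2 + 0.5 * p2**2 + 0.3 * np.sin(3.0 * p1) - 4.0

p0 = np.array([0.5, 0.5])  # assumed current operating point (inside the boundary)

# Penalised objective: distance to p0 plus a term that drives the candidate
# point onto the boundary g(p) = 0.
def objective(p, penalty=50.0):
    return np.linalg.norm(p - p0) + penalty * abs(g(p))

# Global evolutionary search over a box in parameter space.
result = differential_evolution(objective, bounds=[(-4.0, 4.0), (-4.0, 4.0)],
                                seed=0, tol=1e-8)

print("closest boundary point:", result.x)
print("minimum distance      :", np.linalg.norm(result.x - p0))
```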

    Appearing Out of Nowhere: The Emergence of Spacetime in Quantum Gravity

    Quantum gravity is understood as a theory that, in some sense, unifies general relativity (GR) and quantum theory, and is supposed to replace GR at extremely small distances (high energies). It may be that quantum gravity represents the breakdown of the spacetime geometry described by GR. The relationship between quantum gravity and spacetime has been deemed "emergence", and the aim of this thesis is to investigate and explicate this relation. After finding traditional philosophical accounts of emergence to be inappropriate, I develop a new conception of emergence by considering physical case studies including condensed matter physics, hydrodynamics, critical phenomena and quantum field theory understood as effective field theory. This new conception of emergence is independent of reduction and derivation. Instead, a low-energy theory is understood as emergent from a high-energy theory if it is novel and autonomous compared to the high-energy theory, and the low-energy physics is dependent (in a particular, minimal sense) on the high-energy physics; this dependence is revealed by the techniques of effective field theory and the renormalisation group. These ideas are important in exploring the relationship between quantum gravity and GR, where GR is understood as an effective, low-energy theory of quantum gravity. Without experimental data or a theory of quantum gravity, we rely on principles and techniques from other areas of physics to guide the way. As well as considering the idea of emergence appropriate to treating GR as an effective field theory, I investigate the emergence of spacetime (and other aspects of GR) in several concrete approaches to quantum gravity, including examples of the condensed matter approaches, the "discrete approaches" (causal set theory, causal dynamical triangulations, quantum causal histories and quantum graphity) and loop quantum gravity. (PhD thesis submitted to the University of Sydney.)

    Complex networks: Structure and dynamics


    Towards an Efficient Gas Exchange Monitoring with Electrical Impedance Tomography - Optimization and validation of methods to investigate and understand pulmonary blood flow with indicator dilution

    In many patients suffering from severely impaired pulmonary gas exchange, regional lung ventilation and perfusion are not matched. Highly heterogeneous spatial distributions of ventilation and perfusion are observed especially in patients with acute lung failure. These patients must be mechanically ventilated and monitored in the intensive care unit to ensure sufficient gas exchange. In severe lung injury, it is difficult to find an optimal balance between recruiting collapsed regions through high ventilation pressures and volumes and, at the same time, protecting the lung from further damage caused by the externally applied pressures. Interest in bedside measurement and visualization of the regional ventilation and perfusion distribution for use in the intensive care unit has grown strongly in recent years, in order to enable lung-protective ventilation and to simplify clinical diagnosis. Electrical impedance tomography (EIT) is a non-invasive, radiation-free and highly mobile system. It offers high temporal sampling and a functional spatial resolution that make it possible to visualize and monitor dynamic (patho-)physiological processes. Medical research on EIT has mainly focused on estimating regional ventilation, and commercially available systems have shown that EIT is a valuable decision-support tool during mechanical ventilation. The estimation of pulmonary perfusion with EIT, however, is not yet established; it could be the missing link for enabling bedside analysis of pulmonary gas exchange. Although several publications have demonstrated the principal feasibility of indicator-enhanced EIT for estimating the spatial distribution of pulmonary blood flow, these methods must be optimized and validated against gold-standard lung perfusion monitoring. Furthermore, additional research is needed to understand which physiological information underlies the EIT perfusion estimate. The present work addresses the question of whether, in the clinical application of EIT, spatial information on pulmonary blood flow can be estimated in addition to regional ventilation, thereby potentially allowing pulmonary gas exchange to be assessed at the bedside. The spatial distribution of perfusion was estimated by bolus injection of a conductive saline solution as an indicator, tracking the distribution of the indicator during its passage through the pulmonary vasculature. Different dynamic EIT reconstruction methods and perfusion parameter estimation methods were developed and compared in order to assess pulmonary blood flow robustly. The estimated regional EIT perfusion distributions were validated against gold-standard measures of lung perfusion. A first validation was performed on data from an animal study in which multidetector computed tomography served as the comparative lung perfusion measurement. In addition, a comprehensive preclinical animal study was carried out as part of this work to investigate lung perfusion with indicator-enhanced EIT and positron emission tomography under several different experimental conditions.
    Besides a thorough comparison of methods, the clinical applicability of indicator-enhanced EIT perfusion measurement was investigated, above all by analysing the minimal indicator concentration that still allows a robust perfusion estimate while imposing the least burden on the patient. In addition to the experimental validation studies, two in-silico investigations were carried out: first, to evaluate the sensitivity of EIT to the passage of a conductive indicator through the lungs against a strongly heterogeneous pulmonary background; and second, to examine the physiological influences contributing to the reconstructed EIT perfusion images, in order to better understand the limitations of the method. The analyses showed that lung perfusion estimation based on indicator-enhanced EIT has great potential for clinical practice, as it could be validated against two gold-standard perfusion measurement techniques. Moreover, valuable conclusions could be drawn about the physiological influences on the estimated EIT perfusion distributions.
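    The abstract gives no algorithmic detail, but the indicator-dilution principle described above can be sketched briefly. The following Python/NumPy snippet is a rough illustration under stated assumptions, not the authors' method: it takes hypothetical baseline-corrected pixel impedance curves recorded after a saline bolus (array pixel_curves, frame rate fs) and derives a relative regional perfusion map with a simple maximum-slope estimate.

```python
import numpy as np

def relative_perfusion_max_slope(pixel_curves, fs):
    """Relative regional perfusion from indicator-dilution curves.

    pixel_curves : array of shape (n_pixels, n_samples) holding the
        baseline-corrected impedance change of each lung pixel after a
        conductive saline bolus (more negative change = more indicator).
    fs : EIT frame rate in Hz.
    """
    # Sign-flip so that indicator arrival appears as a positive deflection.
    curves = -np.asarray(pixel_curves, dtype=float)

    # Light moving-average smoothing before differentiation.
    kernel = np.ones(5) / 5.0
    smoothed = np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="same"), 1, curves)

    # Maximum upslope of the first-pass curve per pixel: the peak rate of
    # indicator arrival is taken as proportional to regional blood flow.
    slopes = np.gradient(smoothed, 1.0 / fs, axis=1)
    max_slope = np.clip(slopes.max(axis=1), 0.0, None)

    # Normalise so the regional values sum to 100 % of total lung perfusion.
    return 100.0 * max_slope / max_slope.sum()

# Hypothetical usage: 30 s of data from 1024 pixels at 50 frames/s.
# perfusion_map = relative_perfusion_max_slope(pixel_curves, fs=50.0)
```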

    Accurate Real-Time Framework for Complex Pre-defined Cuts in Finite Element Modeling

    Achieving detailed pre-defined cuts on deformable materials is pivotal for many commercial applications, such as cutting scenes in games and vandalism effects in films. In these applications, the majority of resources are allocated to achieving high-fidelity representations of the materials and the virtual environment, so with limited computing resources it is challenging to achieve a convincing cutting effect. A considerable amount of research has been carried out that sacrifices either realism or computational cost, but no solution has yet been identified that satisfies both requirements. This doctoral dissertation is dedicated to developing a unique framework for representing pre-defined cuts of deformable surface models that achieves real-time, detailed cutting while maintaining realistic physical behaviour. To achieve this goal, we explore the problem from both geometric and numerical perspectives. From a geometric perspective, we propose a robust subdivision mechanism that allows users to make arbitrary predetermined cuts on elastic surface models based on the finite element method (FEM). Specifically, after the user separates the elements in an arbitrary manner (e.g., linear or non-linear) on the topological mesh, we optimise the resulting mesh by regenerating the triangulation within each element according to the Delaunay triangulation principle. This optimisation of the regenerated triangles, as a process of refining ill-shaped elements with poor aspect ratio, greatly improves the realism of the physical behaviour while keeping the refinement process balanced against real-time requirements. The subdivision mechanism improves the visual quality of cutting, but it neglects the fact that elements cannot be cut perfectly along arbitrary pre-defined trajectories. The number of ill-shaped elements generated has a significant impact on the optimisation time: a large number of ill-shaped elements renders the cutting slow or even causes it to fail, and vice versa. Our idea is based on the core observation that the impact of ill-shaped elements is largely reflected in the condition number of the global stiffness matrix. In practice, a large condition number means that the stiffness matrix is nearly singular, so computing its inverse or solving the associated system of linear equations is prone to large numerical errors and is time-consuming. This motivates us to alleviate the impact of the condition number of the global stiffness matrix from the numerical side. Specifically, we address the issue in a novel manner by converting the global stiffness matrix into the form of a covariance matrix, whose condition number can be reduced by applying an appropriate normalisation to its eigenvalues. We investigate the efficiency of two variants: an exact square-root normalisation and an approximation based on the Newton-Schulz iteration. Experimental tests of the proposed framework demonstrate that it reproduces visuals of detailed pre-defined cuts competitive with the state-of-the-art method (Manteaux et al. 2015) while obtaining a significant improvement in frame rate, reaching up to 46.49 FPS and 21.93 FPS during and after the cuts, respectively.
    The new refinement method also stably keeps the average aspect ratio of the mesh after cutting below 3 and the average area ratio around 3%. In addition, the two proposed matrix normalisation strategies, ES-CGM and AS-CGM, show superior time efficiency compared with the baseline method (Xin et al. 2018); specifically, ES-CGM and AS-CGM run 5 FPS and 10 FPS faster than the baseline, respectively. These experimental results strongly support our conclusion that the new framework provides significant benefits for achieving detailed pre-defined cuts at real-time rates.
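    The abstract names the Newton-Schulz iteration as the basis of the approximate square-root normalisation. As a sketch of that generic building block (not the ES-CGM/AS-CGM implementation, whose details are not given here), the following Python/NumPy code computes an approximate square root and inverse square root of a symmetric positive-definite matrix with the coupled Newton-Schulz iteration, then uses the result to normalise a small random stand-in for a stiffness matrix.

```python
import numpy as np

def newton_schulz_sqrt(A, n_iters=15):
    """Approximate square root and inverse square root of an SPD matrix A
    via the coupled Newton-Schulz iteration (no eigendecomposition)."""
    A = np.asarray(A, dtype=float)
    I = np.eye(A.shape[0])

    # Pre-scale so the eigenvalues lie in (0, 1]; required for convergence.
    norm = np.linalg.norm(A, "fro")
    Y, Z = A / norm, I.copy()

    for _ in range(n_iters):
        T = 0.5 * (3.0 * I - Z @ Y)
        Y = Y @ T        # Y -> (A / norm)^{1/2}
        Z = T @ Z        # Z -> (A / norm)^{-1/2}

    return np.sqrt(norm) * Y, Z / np.sqrt(norm)

# Hypothetical usage on a small SPD stand-in for a global stiffness matrix:
# normalising K by its approximate inverse square root drives its condition
# number towards 1, which is the effect the square-root normalisation targets.
rng = np.random.default_rng(0)
B = rng.standard_normal((6, 6))
K = B @ B.T + 6.0 * np.eye(6)
sqrt_K, inv_sqrt_K = newton_schulz_sqrt(K)
K_norm = inv_sqrt_K @ K @ inv_sqrt_K
print(np.linalg.cond(K), "->", np.linalg.cond(K_norm))
```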

    5th EUROMECH nonlinear dynamics conference, August 7-12, 2005, Eindhoven: book of abstracts
