
    Satellite swarms for auroral plasma science

    With the growing accessibility of space, this thesis work sets out to explore space-based swarms for multipoint magnetometer measurements of current systems embedded within the Aurora Borealis, as an initial foray into concepts for space physics applications using swarms of small spacecraft. As a pathfinder, ANDESITE---a 6U CubeSat with eight deployable picosatellites---was built as part of this research. The mission will fly a local network of magnetometers above the Northern Lights. With the spacecraft due to launch on an upcoming ELaNa mission, we discuss the science motivation, the mathematical framework for current field reconstruction, the hardware implementation selected, the calibration procedures, and the pragmatic management needed to realize the spacecraft. After describing ANDESITE and defining its capability, we propose a follow-on that uses propulsive nodes in a swarm, allowing measurements that can adaptively change to capture the physical phenomena of interest. To do this, a flock of satellites needs to fall into the desired formation and maintain it for the duration of the science mission. A simple optimal controller is developed to model the deployment of the satellites. Using a Monte Carlo approach over the uncertain initial conditions, we bound the fuel cost of the mission and test the feasibility of the concept. To illustrate the system analysis needed to effectively design such swarms, this thesis also develops a framework that characterizes the spatial frequency response of the kilometer-scale filter created by the swarm as it flies through various current density structures in the ionospheric plasma. We then subject a nominal ANDESITE formation and the controlled swarm to the same analysis framework. The choice of sampling scheme and rigorous basic mathematical analysis are essential in the development of a multipoint-measurement mission. We then turn to a novel capability exploiting current trends in the commercial industry. Magnetometers deployed on the largest constellation to date are leveraged as a space-based magnetometer network. The constellation, operated by Planet Labs Inc., consists of nearly 200 satellites in two polar sun-synchronous orbits, with median spacecraft separations on the order of 375 km and occasional opportunities for much closer spacing. Each spacecraft contains a magneto-inductive magnetometer able to sample the ambient magnetic field at 0.1 Hz to 10 Hz with <200 nT sensitivity. A feasibility study is presented wherein seven satellites from the Planet constellation were used to investigate space-time patterns in the current systems overlying an active auroral arc over a 10-minute interval. Throughout this work, the advantages, limitations, and caveats of exploiting networks of lower-quality magnetometers are discussed, pointing out the path forward to creating a global network that can monitor the space environment.
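    As a rough illustration of the fuel-cost bounding step described above, the sketch below (a hypothetical example, not the thesis code or the ANDESITE flight software) samples uncertain dispenser tip-off velocities, computes a two-impulse correction for each node under Hill-Clohessy-Wiltshire relative dynamics, and reports an upper percentile of the total delta-v. The orbit altitude, transfer time, slot spacing, and dispersion values are all assumptions made for the example.

```python
# Hypothetical sketch: Monte Carlo bound on the delta-v needed to push dispensed
# picosatellites into assumed along-track slots, using two-impulse targeting under
# Hill-Clohessy-Wiltshire (CW) relative dynamics.
import numpy as np

MU = 3.986004418e14                      # Earth's gravitational parameter [m^3/s^2]
ALT = 500e3                              # assumed circular orbit altitude [m]
N = np.sqrt(MU / (6378e3 + ALT) ** 3)    # mean motion [rad/s]

def cw_blocks(t, n=N):
    """State-transition blocks of the CW equations (x radial, y along-track, z cross-track)."""
    s, c = np.sin(n * t), np.cos(n * t)
    phi_rr = np.array([[4 - 3 * c, 0, 0],
                       [6 * (s - n * t), 1, 0],
                       [0, 0, c]])
    phi_rv = np.array([[s / n, 2 * (1 - c) / n, 0],
                       [2 * (c - 1) / n, (4 * s - 3 * n * t) / n, 0],
                       [0, 0, s / n]])
    phi_vr = np.array([[3 * n * s, 0, 0],
                       [6 * n * (c - 1), 0, 0],
                       [0, 0, -n * s]])
    phi_vv = np.array([[c, 2 * s, 0],
                       [-2 * s, 4 * c - 3, 0],
                       [0, 0, c]])
    return phi_rr, phi_rv, phi_vr, phi_vv

def two_impulse_dv(r0, v0, r_target, tof):
    """Total delta-v of a two-impulse transfer arriving at rest in the rotating frame."""
    phi_rr, phi_rv, phi_vr, phi_vv = cw_blocks(tof)
    v0_req = np.linalg.solve(phi_rv, r_target - phi_rr @ r0)   # first burn sets up the drift
    v_arrival = phi_vr @ r0 + phi_vv @ v0_req                  # second burn nulls arrival velocity
    return np.linalg.norm(v0_req - v0) + np.linalg.norm(v_arrival)

rng = np.random.default_rng(0)
n_trials, tof = 5000, 3 * 3600.0                              # assumed 3 h transfer per node
slots = [np.array([0.0, d, 0.0]) for d in (2e3, 4e3, 6e3)]    # assumed along-track slots [m]

totals = []
for _ in range(n_trials):
    # Dispenser tip-off uncertainty: assumed 5 cm/s (1-sigma) per axis.
    total = sum(two_impulse_dv(np.zeros(3), rng.normal(0, 0.05, 3), slot, tof) for slot in slots)
    totals.append(total)

print(f"95th-percentile total delta-v: {np.percentile(totals, 95):.2f} m/s")
```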

    Development of a modelling framework for integrated catchment flood risk management

    Flooding is one of the most significant issues facing the UK and Europe. New approaches are being sought to mitigate its impacts, and distributed, catchment-based techniques are becoming increasingly popular. These employ a range of measures, often working with the catchment’s natural processes, in order to improve flood resilience. There remains a lack of conclusive evidence, however, for the impacts of these approaches on storm runoff, leading to considerable uncertainty in their effectiveness in mitigating flood risk. A new modelling framework for design, assessment, and uncertainty estimation of such distributed, nature-based schemes is developed. An implementation of a semidistributed runoff model demonstrates robustness to spatio-temporal discretisation. Alongside a new hydraulic routing scheme, the model is used to evaluate the impacts on flood risk of in-channel measures applied within a 29 km2 agricultural catchment. A maximum of 70,000 m3 of additional channel storage is achieved, with a corresponding reduction of 11% in peak flows. This, however, would not have been sufficient to prevent flooding in the event considered. Further modifications allow simulation of the impacts of wider measures employing natural processes. This is applied within an uncertainty estimation framework across the headwaters of three mixed-use catchments, ranging in size from 57 km2 to 200 km2, over a series of extreme storm events. A novel surface routing algorithm allows simulation of large arrays of distributed features that intercept and store fast runoff. The effect of the measures can be seen across even the most extreme events, with a reduction of up to 15% in the largest peak, although this large impact was associated with a low confidence level. The methodology can reflect the uncertainty in applying natural flood risk management with a poor or incomplete evidence base. The modelling results demonstrate the importance of antecedent conditions and of the timings and magnitudes of a series of storm events. The results show the benefit of maximising features’ storage utilisation by allowing a degree of “leakiness” to enable drain-down between storms. An unanticipated result was that some configurations of measures could synchronise previously asynchronous subcatchment flood waves and have a detrimental effect on flood risk. The framework shows its utility both in modelling and evaluation of catchment-based flood risk management and in wider applications where computational efficiency and uncertainty estimation are important.
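    The “leakiness” trade-off mentioned above can be illustrated with a minimal sketch (hypothetical, not the thesis framework): a single runoff-attenuation feature of fixed capacity intercepts fast runoff and drains down at a constant rate between storms. A sealed feature that stays full after the first event passes later peaks through unattenuated, while a leaky one recovers storage; the capacity, leak rate, and storm series below are invented for the example.

```python
# Hypothetical sketch: a single "leaky" runoff-attenuation feature of fixed capacity,
# filled by fast runoff and drained at a constant rate, run against a series of storm
# hydrographs to show why drain-down between events matters.
import numpy as np

def route_through_feature(inflow, capacity, leak_rate, dt=3600.0):
    """Return (outflow, storage) series for a feature that stores inflow up to `capacity`
    [m^3] and releases `leak_rate` [m^3/s]; inflow the full feature cannot take spills past."""
    outflow, storage, stored = np.zeros_like(inflow), np.zeros_like(inflow), 0.0
    for i, q in enumerate(inflow):
        spill = max(0.0, stored + q * dt - capacity) / dt   # excess passes straight downstream
        stored = min(capacity, stored + q * dt)
        release = min(leak_rate, stored / dt)               # controlled drain-down between storms
        stored -= release * dt
        outflow[i], storage[i] = spill + release, stored
    return outflow, storage

# Three synthetic storm peaks separated by dry spells (hourly steps, flows in m^3/s).
t = np.arange(0, 72 * 3600, 3600.0)
inflow = sum(5.0 * np.exp(-((t - c) / 7200.0) ** 2) for c in (6 * 3600, 30 * 3600, 54 * 3600))

for leak in (0.0, 0.5):   # sealed feature vs. leaky feature
    outflow, _ = route_through_feature(inflow, capacity=50_000.0, leak_rate=leak)
    print(f"leak_rate={leak} m^3/s -> largest downstream peak {outflow.max():.2f} m^3/s")
```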

    Path Integrals in the Sky: Classical and Quantum Problems with Minimal Assumptions

    Cosmology has, after the formulation of general relativity, been transformed from a branch of philosophy into an active field in physics. Notwithstanding the significant improvements in our understanding of our Universe, there are still many open questions on both its early and late time evolution. In this thesis, we investigate a range of problems in classical and quantum cosmology, using advanced mathematical tools, and making only minimal assumptions. In particular, we apply Picard-Lefschetz theory, catastrophe theory, infinite dimensional measure theory, and weak-value theory. To study the beginning of the Universe in quantum cosmology, we apply Picard-Lefschetz theory to the Lorentzian path integral for gravity. We analyze both the Hartle-Hawking no-boundary proposal and Vilenkin's tunneling proposal, and demonstrate that the Lorentzian path integral corresponding to the mini-superspace formulation of the two proposals is well-defined. However, when including fluctuations, we show that the path integral predicts the existence of large fluctuations. This indicates that the Universe cannot have had a smooth beginning in Euclidean de Sitter space. In response to these conclusions, the scientific community has made a series of adapted formulations of the no-boundary and tunneling proposals. We show that these new proposals suffer from similar issues. Second, we generalize the weak-value interpretation of quantum mechanics to relativistic systems. We apply this formalism to a relativistic quantum particle in a constant electric field. We analyze the evolution of the relativistic particle in both the classical and the quantum regime and evaluate the back-reaction of the Schwinger effect on the electric field in 1+1-dimensional spacetime, using analytical methods. In addition, we develop a numerical method to evaluate both the wavefunction and the corresponding weak-values in more general electric and magnetic fields. We conclude the quantum part of this thesis with a chapter on Lorentzian path integrals. We propose a new definition of the real-time path integral in terms of Brownian motion on the Lefschetz thimble of the theory. We prove the existence of a $\sigma$-measure for the path integral of the non-relativistic free particle, the (inverted) harmonic oscillator, and the relativistic particle in a range of potentials. We also describe how this proposal extends to more general path integrals. In the classical part of this thesis, we analyze two problems in late-time cosmology. Multi-dimensional oscillatory integrals are prevalent in physics, but notoriously difficult to evaluate. We develop a new numerical method, based on multi-dimensional Picard-Lefschetz theory, for the evaluation of these integrals. The virtue of this method is that its efficiency increases when integrals become more oscillatory. The method is applied to interference patterns of lensed images near caustics described by catastrophe theory. This analysis can help us understand the lensing of astrophysical sources by plasma lenses, which is especially relevant in light of the proposed lensing mechanism for fast radio bursts. Finally, we analyze large-scale structure formation in terms of catastrophe theory. We show that the geometric structure of the three-dimensional cosmic-web is determined by both the eigenvalue and the eigenvector fields of the deformation tensor. We formulate caustic conditions, classifying caustics using properties of these fields. When applied to the Zel'dovich approximation of structure formation, the caustic conditions enable us to construct a caustic skeleton of the three-dimensional cosmic-web in terms of the initial conditions.
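    For orientation, the textbook Picard-Lefschetz rewriting of a conditionally convergent oscillatory integral (shown schematically here, not as derived in the thesis) deforms the real integration contour into a sum over steepest-descent thimbles attached to the saddle points of the exponent, on which the integrand decays and each term converges absolutely:

```latex
% Schematic Picard-Lefschetz decomposition (standard form, not thesis-specific):
% the real contour is deformed into a sum over steepest-descent thimbles J_sigma,
% one per relevant saddle point x_sigma of S(x), with integer intersection numbers n_sigma.
\[
  \int_{\mathbb{R}} e^{\,i S(x)/\hbar}\,\mathrm{d}x
  \;=\; \sum_{\sigma} n_{\sigma} \int_{\mathcal{J}_{\sigma}} e^{\,i S(x)/\hbar}\,\mathrm{d}x .
\]
% Along each thimble the phase Re[S] is constant while Im[S] grows away from the saddle,
% so |e^{iS/\hbar}| = e^{-Im[S]/\hbar} decays and each term is dominated by the
% semiclassical contribution e^{i S(x_sigma)/\hbar}.
```

    The same construction carries over to multi-dimensional integrals, which is the setting exploited by the numerical method described in the abstract above.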

    Scalable Performance Analysis of Massively Parallel Stochastic Systems

    The accurate performance analysis of large-scale computer and communication systems is directly inhibited by an exponential growth in the state-space of the underlying Markovian performance model. This is particularly true when considering massively-parallel architectures such as cloud or grid computing infrastructures. Nevertheless, an ability to extract quantitative performance measures such as passage-time distributions from performance models of these systems is critical for providers of these services. Indeed, without such an ability, they remain unable to offer realistic end-to-end service level agreements (SLAs) which they can have any confidence of honouring. Additionally, this must be possible in a short enough period of time to allow many different parameter combinations in a complex system to be tested. If we can achieve this rapid performance analysis goal, it will enable service providers and engineers to determine the cost-optimal behaviour which satisfies the SLAs. In this thesis, we develop a scalable performance analysis framework for the grouped PEPA stochastic process algebra. Our approach is based on the approximation of key model quantities such as means and variances by tractable systems of ordinary differential equations (ODEs). Crucially, the size of these systems of ODEs is independent of the number of interacting entities within the model, making these analysis techniques extremely scalable. The reliability of our approach is directly supported by convergence results and, in some cases, explicit error bounds. We focus on extracting passage-time measures from performance models since these are very commonly the terms in which a service level agreement is phrased. We design scalable analysis techniques which can handle passages defined both in terms of entire component populations and in terms of individual or tagged members of a large population. A precise and straightforward specification of a passage-time service level agreement is as important to the performance engineering process as its evaluation. This is especially true of large and complex models of industrial-scale systems. To address this, we introduce the unified stochastic probe framework. Unified stochastic probes are used to generate a model augmentation which exposes explicitly the SLA measure of interest to the analysis toolkit. In this thesis, we deploy these probes to define many detailed and derived performance measures that can be automatically and directly analysed using rapid ODE techniques. In this way, we tackle applicable problems at many levels of the performance engineering process: from specification and model representation to efficient and scalable analysis.
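    As a toy illustration of the fluid-approximation idea (a hypothetical sketch, not the grouped PEPA toolchain): for a client-server model in which a population of clients synchronises with a population of servers on request and serve actions, the expected sub-populations can be approximated by a system of four ODEs whose size stays fixed no matter how many clients and servers the model contains. The rates and population sizes below are assumptions made for the example.

```python
# Hypothetical sketch: a fluid/mean-field ODE approximation of a simple client-server
# population model in the spirit of grouped PEPA. Clients cycle thinking -> waiting,
# servers cycle idle -> busy, and the shared request/serve actions proceed at the rate
# of the slower participating population (the usual minimum semantics of fluid models).
import numpy as np
from scipy.integrate import odeint

R_REQUEST, R_SERVE = 1.0, 0.5   # assumed action rates [1/s]

def fluid(state, _t):
    c_think, c_wait, s_idle, s_busy = state
    req = min(R_REQUEST * c_think, R_REQUEST * s_idle)   # thinking client meets idle server
    srv = min(R_SERVE * c_wait, R_SERVE * s_busy)        # busy server completes a waiting client
    return [srv - req,    # thinking clients: gained on serve, lost on request
            req - srv,    # waiting clients
            srv - req,    # idle servers
            req - srv]    # busy servers

t = np.linspace(0, 60, 601)
# 100 clients all thinking, 20 servers all idle: the ODE system stays 4-dimensional
# no matter how large these populations become.
trajectory = odeint(fluid, [100.0, 0.0, 20.0, 0.0], t)
print("steady-state estimate (think, wait, idle, busy):", np.round(trajectory[-1], 2))
```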

    Managing Intellectual Property to Foster Agricultural Development

    Over the past decades, consideration of IPRs has become increasingly important in many areas of agricultural development, including foreign direct investment, technology transfer, trade, investment in innovation, access to genetic resources, and the protection of traditional knowledge. The widening role of IPRs in governing the ownership of, and access to, innovation, information, and knowledge makes them particularly critical in ensuring that developing countries benefit from the introduction of new technologies that could radically alter the welfare of the poor. Failing to improve IPR policies and practices to support the needs of developing countries will eliminate significant development opportunities. The discussion in this note moves away from policy prescriptions to focus on investments to improve how IPRs are used in practice in agricultural development. These investments must be seen as complementary to other investments in agricultural development. IPRs are woven into the context of innovation and R&D. They can enable entrepreneurship and allow the leveraging of private resources for resolving the problems of poverty. Conversely, IPR issues can delay important scientific advancements, deter investment in products for the poor, and impose crippling transaction costs on organizations if the wrong tools are used or tools are badly applied. The central benefit of pursuing the investments outlined in this note is to build into the system a more robust capacity for strategic and flexible use of IPRs tailored to development goals.

    Control Algorithms for Distributed Networked Industrial Systems
