
    A critical evaluation of deterministic methods in size optimisation of reliable and cost effective standalone Hybrid renewable energy systems

    Reliability of a hybrid renewable energy system (HRES) depends strongly on the various uncertainties affecting the amount of power produced by the system. In the design of systems subject to uncertainties, both deterministic and nondeterministic design approaches can be adopted. In a deterministic design approach, the designer acknowledges the presence of uncertainties and incorporates them indirectly into the design by applying safety factors, on the assumption that employing suitable safety factors and considering worst-case scenarios yields reliable systems. In effect, the multi-objective optimisation problem with the two objectives of reliability and cost is reduced to a single-objective optimisation problem with cost as the only objective. In this paper, the competence of deterministic design methods in size optimisation of reliable standalone wind-PV-battery, wind-PV-diesel and wind-PV-battery-diesel configurations is examined. For each configuration, the optimal size of the system components that minimises the system cost is first found deterministically using different values of the safety factors. Then, for each case, the effect of the safety factors on reliability and cost is investigated using a Monte Carlo simulation. The reliability analysis considers several reliability measures, namely unmet load, blackout durations (total, maximum and average) and mean time between failures. It is shown that the traditional ways of accounting for uncertainties in deterministic designs, such as designing for an autonomy period and employing safety factors, have either little or unpredictable impact on the actual reliability of the designed wind-PV-battery configuration. For the wind-PV-diesel and wind-PV-battery-diesel configurations it is shown that, while using a sufficiently high margin of safety in sizing the diesel generator leads to reliable systems, the optimum value of this margin that yields a cost-effective system cannot be quantified without probabilistic methods of analysis. It is also shown that deterministic cost analysis yields inaccurate results for all of the investigated configurations.
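
    The following minimal sketch illustrates the kind of Monte Carlo reliability check the abstract contrasts with deterministic sizing: a deterministically chosen wind-PV-battery configuration is simulated over many stochastic years, and the unmet load and blackout hours are tallied. All component sizes, resource models and load distributions below are illustrative assumptions, not the paper's models.

        # Hedged sketch: Monte Carlo reliability check of a deterministically sized
        # wind-PV-battery system. Sizes, resource and load models are assumptions.
        import numpy as np

        rng = np.random.default_rng(0)
        HOURS, N_TRIALS = 8760, 200                     # one year per trial, 200 trials
        P_WIND, P_PV, BATT_KWH = 30.0, 20.0, 120.0      # deterministically chosen sizes (assumed)

        def simulate_year():
            """Return (unmet-load fraction, blackout hours) for one stochastic year."""
            wind = P_WIND * rng.beta(2.0, 3.0, HOURS)                       # uncertain wind power
            pv = P_PV * np.clip(rng.normal(0.3, 0.15, HOURS), 0.0, 1.0)     # uncertain PV power
            load = 25.0 * (1.0 + 0.1 * rng.standard_normal(HOURS))          # uncertain demand
            soc, unmet, blackout_h = 0.5 * BATT_KWH, 0.0, 0
            for t in range(HOURS):
                balance = wind[t] + pv[t] - load[t]
                if balance >= 0.0:                        # surplus charges the battery
                    soc = min(BATT_KWH, soc + balance)
                else:                                     # deficit discharges the battery
                    draw = min(soc, -balance)
                    soc -= draw
                    shortfall = -balance - draw
                    if shortfall > 1e-9:                  # load not served this hour
                        unmet += shortfall
                        blackout_h += 1
            return unmet / load.sum(), blackout_h

        results = np.array([simulate_year() for _ in range(N_TRIALS)])
        print(f"mean unmet load fraction: {results[:, 0].mean():.3%}")
        print(f"mean blackout hours/year: {results[:, 1].mean():.0f}")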

    Cost-effective aperture arrays for SKA Phase 1: single or dual-band?

    An important design decision for the first phase of the Square Kilometre Array is whether the low-frequency component (SKA1-low) should be implemented as a single- or dual-band aperture array; that is, using one or two antenna element designs to observe the 70-450 MHz frequency band. This memo uses an elementary parametric analysis to make a quantitative, first-order cost comparison of representative implementations of a single- and a dual-band system, chosen for comparable performance characteristics. A direct comparison of the SKA1-low station costs reveals that they are similar, although the uncertainties are high. The cost impact on the broader telescope system varies: deployment and site preparation costs are higher for the dual-band array, whereas digital signal processing costs are higher for the single-band array. The parametric analysis also shows that a first stage of analogue tile beamforming, as opposed to only station-level, all-digital beamforming, has the potential to significantly reduce the cost of the SKA1-low stations. However, tile beamforming can limit flexibility and performance, principally by reducing the accessible field of view. We examine the cost impacts in the context of scientific performance, for which the spacing and intra-station layout of the antenna elements are important derived parameters. We discuss the implications of the many possible intra-station signal transport and processing architectures and consider areas where future work could improve the accuracy of SKA1-low costing. Comment: 64 pages, 23 figures, submitted to the SKA Memo series.
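
    As an illustration of the kind of elementary parametric costing the memo describes, the hedged sketch below compares per-station costs for a single-band, a dual-band, and an analogue-tile-beamformed single-band implementation. All element counts, unit costs and the tile size are placeholder assumptions, not the memo's cost database.

        # Hedged sketch: first-order parametric station cost comparison.
        # Every number below is an illustrative placeholder, not the memo's costing.
        def station_cost(n_elements, cost_per_element, tile_size=None,
                         cost_per_tile_bf=0.0, cost_per_digital_chain=1000.0):
            """Element hardware + optional analogue tile beamforming + digital chains."""
            elements = n_elements * cost_per_element
            if tile_size:                                  # analogue tiles reduce digital chains
                n_chains = n_elements // tile_size
                tiles = n_chains * cost_per_tile_bf
            else:                                          # all-digital: one chain per element
                n_chains, tiles = n_elements, 0.0
            return elements + tiles + n_chains * cost_per_digital_chain

        single_band = station_cost(11200, 120.0)                           # one wideband element design
        dual_band = station_cost(4800, 90.0) + station_cost(6400, 70.0)    # low-band + high-band arrays
        single_tiled = station_cost(11200, 120.0, tile_size=16, cost_per_tile_bf=400.0)

        for name, cost in [("single-band", single_band), ("dual-band", dual_band),
                           ("single-band + analogue tiles", single_tiled)]:
            print(f"{name:30s} per-station cost ~ {cost / 1e6:5.2f} M (arbitrary currency)")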

    Simulations of partially coherent focal plane imaging arrays: Fisher matrix approach to performance evaluation

    Focal plane arrays of bolometers are increasingly employed in astronomy at far-infrared to millimetre wavelengths. The focal plane fields and the detectors are both partially coherent in these systems, but no account has previously been taken of the effect of partial coherence on array performance. In this paper, we use our recently developed coupled-mode theory of detection together with Fisher information matrix techniques from signal processing to characterize the behaviour of partially coherent imaging arrays. We investigate the effects of the size and coherence length of both the source and the detectors, and the packing density of the array, on the amount of information that can be extracted from observations with such arrays. Comment: 14 pages, 7 figures, submitted to MNRAS 7th March 200
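
    A hedged sketch of the Fisher-matrix machinery mentioned above: for a toy array of detectors observing a Gaussian source under Gaussian noise, the Fisher matrix and the resulting Cramer-Rao bounds on the source parameters are computed numerically. The beam model is a stand-in for the paper's coupled-mode theory; only the Fisher-matrix bookkeeping is generic.

        # Hedged sketch: Fisher matrix and Cramer-Rao bounds for a toy detector array.
        # The Gaussian beam model is a stand-in for the paper's coupled-mode theory.
        import numpy as np

        det_pos = np.linspace(-3.0, 3.0, 7)     # detector centres (arbitrary units)
        SIGMA_NOISE = 0.05                      # per-detector noise rms (assumed)

        def mean_powers(theta):
            """Expected detector powers for source parameters (position, width, amplitude)."""
            x0, width, amp = theta
            return amp * np.exp(-0.5 * ((det_pos - x0) / width) ** 2)

        def fisher_matrix(theta, eps=1e-6):
            """F_ij = sum_k (dmu_k/dtheta_i)(dmu_k/dtheta_j) / sigma^2 for Gaussian noise."""
            jac = np.empty((det_pos.size, len(theta)))
            for i in range(len(theta)):
                step = np.zeros(len(theta))
                step[i] = eps
                jac[:, i] = (mean_powers(theta + step) - mean_powers(theta - step)) / (2 * eps)
            return jac.T @ jac / SIGMA_NOISE ** 2

        theta0 = np.array([0.3, 1.2, 1.0])                  # fiducial source parameters (assumed)
        crlb = np.sqrt(np.diag(np.linalg.inv(fisher_matrix(theta0))))
        print("CRLB on (position, width, amplitude):", np.round(crlb, 4))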

    Managing uncertainty in sound based control for an autonomous helicopter

    In this paper we present our ongoing research using a multi-purpose, small and low-cost autonomous helicopter platform (Flyper). We build on previously achieved stable control using evolutionary tuning. We propose a sound-based supervised method to localise the indoor helicopter and to extract meaningful information that enables the helicopter to further stabilise its flight and correct its flightpath. Owing to the high amount of uncertainty in the data, we propose the use of fuzzy logic in the signal processing of the sound signature. We discuss the benefits and difficulties of using type-1 and type-2 fuzzy logic in this real-time system and give an overview of our proposed system.
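
    As a hedged illustration of the type-1 fuzzy processing discussed above, the sketch below maps a noisy inter-microphone level difference to a lateral correction command using triangular memberships, Mamdani min-max inference and centroid defuzzification. The membership shapes, rule base and signal interpretation are assumptions for illustration, not the Flyper implementation.

        # Hedged sketch: type-1 fuzzy mapping from a noisy inter-microphone level
        # difference (dB) to a lateral correction. Memberships and rules are illustrative.
        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership with feet a, c and peak b."""
            return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

        def correction(level_diff_db):
            """Mamdani min-max inference with centroid defuzzification."""
            # Antecedents: where does the measured level difference sit?
            left = tri(level_diff_db, -12.0, -6.0, 0.0)
            centre = tri(level_diff_db, -3.0, 0.0, 3.0)
            right = tri(level_diff_db, 0.0, 6.0, 12.0)
            # Consequents: fuzzy sets over the correction command (arbitrary units)
            u = np.linspace(-1.0, 1.0, 201)
            move_left = tri(u, -1.0, -0.5, 0.0)
            hold = tri(u, -0.2, 0.0, 0.2)
            move_right = tri(u, 0.0, 0.5, 1.0)
            # Rules: louder on one side -> correct towards that side; balanced -> hold
            agg = np.maximum.reduce([np.minimum(left, move_left),
                                     np.minimum(centre, hold),
                                     np.minimum(right, move_right)])
            return float((u * agg).sum() / (agg.sum() + 1e-12))

        print(correction(4.0))   # louder on the right gives a positive correction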

    Multi-objective design optimisation of standalone hybrid wind-PV-diesel systems under uncertainties

    Optimal design of a standalone wind-PV-diesel hybrid system is a multi-objective optimisation problem with the conflicting objectives of cost and reliability. Uncertainties in renewable resources, demand load and power modelling mean that deterministic methods of multi-objective optimisation fall short in the optimal design of standalone hybrid renewable energy systems (HRES). Firstly, deterministic methods of analysis, even in the absence of uncertainties in cost modelling, do not predict the levelised cost of energy accurately. Secondly, since these methods ignore the random variations in parameters, they cannot be used to quantify the second objective, the reliability of the system in supplying power. It is shown that, for a given site and uncertainty profile, there exists an optimum margin of safety, applicable to the peak load, which can be used to size the diesel generator towards a cost-effective and reliable design. However, this optimum value is problem dependent and cannot be obtained deterministically. For two design scenarios, namely finding the most reliable system subject to a constraint on cost and finding the most cost-effective system subject to constraints on reliability measures, two algorithms are proposed to find the optimum margin of safety. The robustness of the proposed design methodology is demonstrated through two design case studies.
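
    The sketch below illustrates, under stated assumptions, the second design scenario: sweep candidate margins of safety for the diesel generator, estimate the unmet-load fraction by Monte Carlo for each, and keep the cheapest margin that satisfies the reliability constraint. The demand, renewable and cost models are crude placeholders, not the paper's algorithms.

        # Hedged sketch: find the smallest diesel-generator margin of safety that meets a
        # reliability constraint. Demand, renewable and cost models are crude placeholders.
        import numpy as np

        rng = np.random.default_rng(1)
        PEAK_LOAD = 40.0        # kW design peak (assumed)
        UNMET_LIMIT = 0.01      # tolerated unmet-load fraction (assumed constraint)

        def monte_carlo_unmet(margin, n_trials=2000):
            """Unmet-load fraction when the generator is sized to margin * peak load."""
            gen = margin * PEAK_LOAD
            load = PEAK_LOAD * (0.6 + 0.5 * rng.random(n_trials))      # uncertain demand
            renewables = 15.0 * rng.beta(2.0, 2.5, n_trials)           # uncertain wind + PV
            shortfall = np.clip(load - renewables - gen, 0.0, None)
            return shortfall.sum() / load.sum()

        def annualised_cost(margin):
            """Toy cost model: capital grows with installed generator capacity."""
            return 1.0e4 + 900.0 * margin * PEAK_LOAD

        best = None
        for margin in np.arange(0.8, 1.61, 0.05):          # candidate margins of safety
            if monte_carlo_unmet(margin) <= UNMET_LIMIT:
                best = (round(float(margin), 2), annualised_cost(margin))
                break                      # cost rises with margin, so the first feasible one is cheapest
        print("optimum margin of safety and cost:", best)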

    Atmospheric monitoring and array calibration in CTA using the Cherenkov Transparency Coefficient

    The Cherenkov Telescope Array (CTA) will be the next-generation observatory employing different types of Cherenkov telescopes for the detection of particle showers initiated by very-high-energy gamma rays. A good knowledge of the Earth's atmosphere, which acts as a calorimeter in this detection technique, will be crucial for calibration in CTA. Variations in the atmosphere's transparency to Cherenkov light and incorrect calibration of individual telescopes in the array result in large systematic uncertainties on the energy scale. The Cherenkov Transparency Coefficient (CTC), developed within the H.E.S.S. experiment, quantifies the mean atmospheric transparency ascertained from data taken by Cherenkov telescopes during scientific observations. Provided that atmospheric conditions over the array are uniform, transparency values obtained per telescope can also be used to calibrate individual telescope responses. The application of the CTC in CTA presents a challenge due to the greater complexity of the observatory and the variety of telescope cameras compared with currently operating experiments such as H.E.S.S. We present here the first results of a feasibility study for extending the CTC concept to CTA for the purposes of inter-calibration of the telescopes in the array and monitoring of the atmosphere. Comment: All CTA contributions at arXiv:1709.0348
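
    A hedged sketch of the idea behind the CTC: per-telescope trigger rates are corrected for each telescope's optical (muon) efficiency and gain, averaged over the array, and compared with a clear-sky reference. This simplified normalisation is an assumption for illustration and differs in detail from the exact H.E.S.S./CTA definition.

        # Hedged sketch: a simplified relative transparency coefficient. Per-telescope
        # trigger rates are corrected for muon efficiency and gain, averaged over the
        # array and compared with a clear-sky reference run. Not the exact CTC formula.
        import numpy as np

        def transparency_coefficient(trigger_rates, muon_effs, gains):
            """Array-averaged trigger rate corrected for per-telescope response."""
            corrected = np.asarray(trigger_rates) / (np.asarray(muon_effs) * np.asarray(gains))
            return corrected.mean()

        # Clear-sky reference run vs. a monitored run (illustrative numbers)
        ctc_ref = transparency_coefficient([820, 790, 840, 805],
                                           [0.100, 0.098, 0.102, 0.099],
                                           [1.00, 1.01, 0.99, 1.00])
        ctc_run = transparency_coefficient([700, 675, 715, 690],
                                           [0.099, 0.097, 0.101, 0.098],
                                           [1.00, 1.02, 0.98, 1.00])
        print(f"relative atmospheric transparency: {ctc_run / ctc_ref:.2f}")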

    Reconstructing the cosmic-ray energy from the radio signal measured in one single station

    Short radio pulses can be measured from showers initiated by both high-energy cosmic rays and neutrinos. While several antenna stations are commonly needed to reconstruct the energy of an air shower, we describe a novel method that relies on the radio signal measured in one antenna station only. Exploiting a broad frequency bandwidth of 80-300 MHz, we obtain a statistical energy resolution of better than 15% on a realistic Monte Carlo set. This method is both a step towards energy reconstruction from the radio signal of neutrino-induced showers and a promising tool for cosmic-ray radio arrays. Especially for hybrid arrays, where the air shower geometry is provided by an independent detector, this method provides a precise handle on the energy of the shower even with a sparse array.
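
    The sketch below shows, under assumed parametrisation constants, how a single-station energy estimate can work: given the shower geometry from an independent detector, the measured energy fluence is inverted through a toy lateral-distribution model that scales quadratically with energy. The model and its constants are illustrative, not the paper's fitted parametrisation.

        # Hedged sketch: single-station energy estimate from the measured radio energy
        # fluence, given the shower geometry from an external detector. The lateral
        # parametrisation and its constants are illustrative, not the paper's fit.
        import numpy as np

        def expected_fluence(energy_eV, r_axis_m, r0=120.0, f_ref=200.0):
            """Toy model: fluence (eV/m^2) scales quadratically with energy and falls
            off with distance r_axis_m from the shower axis."""
            return f_ref * (energy_eV / 1e18) ** 2 * np.exp(-((r_axis_m / r0) ** 2))

        def reconstruct_energy(measured_fluence, r_axis_m):
            """Invert the parametrisation for the shower energy."""
            f_at_1eeV = expected_fluence(1e18, r_axis_m)       # prediction at 1 EeV
            return 1e18 * np.sqrt(measured_fluence / f_at_1eeV)

        # Example: 85 eV/m^2 measured 90 m from the axis (geometry from the particle array)
        print(f"reconstructed energy ~ {reconstruct_energy(85.0, 90.0):.2e} eV")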

    Architecture of Environmental Risk Modelling: for a faster and more robust response to natural disasters

    Demands on the disaster response capacity of the European Union are likely to increase as the impacts of disasters continue to grow both in size and frequency. This has resulted in intensive research on issues concerning spatially explicit information and modelling and their multiple sources of uncertainty. Geospatial support is one of the forms of assistance frequently required by emergency response centres, along with hazard forecasting and event management assessment. Robust modelling of natural hazards requires dynamic simulations under an array of multiple inputs from different sources. Uncertainty is associated with the meteorological forecast and the calibration of the model parameters. Software uncertainty also derives from the data transformation models (D-TM) needed for predicting hazard behaviour and its consequences. On the other hand, social contributions have recently been recognized as valuable in raw-data collection and mapping efforts traditionally dominated by professional organizations. Here an architecture overview is proposed for adaptive and robust modelling of natural hazards, following the Semantic Array Programming paradigm so as to also include the distributed array of social contributors, called Citizen Sensor, in a semantically enhanced strategy for D-TM modelling. The modelling architecture proposes a multicriteria approach for assessing the array of potential impacts with qualitative rapid assessment methods based on a Partial Open Loop Feedback Control (POLFC) schema, complementing more traditional and accurate a posteriori assessment. We discuss the computational aspects of environmental risk modelling using array-based parallel paradigms on High Performance Computing (HPC) platforms, in order to introduce the implications of urgency into the systems (Urgent-HPC). Comment: 12 pages, 1 figure, 1 text box, presented at the 3rd Conference of Computational Interdisciplinary Sciences (CCIS 2014), Asuncion, Paraguay.
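
    As a hedged illustration of the array-based, semantically checked processing style referred to above, the sketch below aggregates per-cell impact criteria over an ensemble of hazard simulations after minimal 'semantic' input checks. Criteria, weights and the aggregation rule are placeholder assumptions, not the proposed architecture's D-TM models.

        # Hedged sketch: array-based rapid multicriteria impact assessment over an
        # ensemble of hazard simulations, with minimal 'semantic' checks on the inputs.
        # Criteria, weights and the aggregation rule are illustrative placeholders.
        import numpy as np

        def check_nonneg_2d(a, name):
            """Minimal semantic constraint on an input array: non-negative, ensemble x cells."""
            a = np.asarray(a, dtype=float)
            if a.ndim != 2 or (a < 0).any():
                raise ValueError(f"{name}: expected a non-negative 2-D array (ensemble x cells)")
            return a

        def rapid_impact(criteria, weights):
            """Weighted aggregation of per-cell criteria, averaged over the forecast ensemble."""
            weights = np.asarray(weights, dtype=float)
            stacked = np.stack([check_nonneg_2d(c, f"criterion {i}") for i, c in enumerate(criteria)])
            per_member = np.tensordot(weights / weights.sum(), stacked, axes=1)   # ensemble x cells
            return per_member.mean(axis=0)                                        # expected impact per cell

        rng = np.random.default_rng(2)
        exposure = rng.random((10, 5))        # 10 ensemble members, 5 map cells (synthetic)
        burnt_area = rng.random((10, 5))
        print(rapid_impact([exposure, burnt_area], weights=[0.4, 0.6]).round(3))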